Super cute line following robot prototype


Tridolph is a line following robot. After taking 10-ish years off from hobby robotics, he is the first mobile robot that I've made. Technically, he is the first robot of my own creation: 10 long years ago I made a few follow-the-instructions bots, but flopped when I tried to create my own masterpiece.

Tridolph is named for his 3 red noses that let me know what he thinks of the brightness of a line under his left, mid, and right eyes. Up top, he’s a frizzy mess of jumper wires.

As a proof of concept, he works perfectly. The concept he proves is that I can start with something simple, get some fun and lots of learning out of it, and then use it to launch on to larger or higher quality projects.

bottom view with itemized components


Line Sensor Sled

The sled contains 4 features:

  1. 3 individually controllable LEDs to indicate whether the program considers the corresponding sensor to be over a line or not.
  2. 3 photoresistors to sense surface brightness, to be interpreted by the microcontroller.
  3. 2 constant-on blue LEDs to illuminate the surface and reflect back to the eyes.
  4. A jumble of point-to-point wires.

Point-to-point wiring with embedded resistors was tedious and reminded me of why I learned to make PCBs all those years ago.

top view of sensor sled and indicator LEDs

I started the project with the line sensor sled. I was getting frustrated trying to choose some vaguely matching photoresistors out of my ancient junk box of robot doodads. The solution was to create a small sled that holds some LEDs and photoresistors in a fixed position, so I can get more repeatable voltage measurements under varying light conditions and surfaces. Lego is the perfect way to quickly prototype that sort of system. A pin vice drilled holes to pass the leads through the plates, then the photoresistors press fit into the holes under the plate. Bent wires keep the floor illumination LEDs kinda pointed straight down, and a 10-year-old tube of Liquid Nails keeps the nose LEDs in place.

The sled slides along the ground which works well on a smooth surface like a carefully prepared line following course floor tile. The friction between the sled and ground is tough on a thick pile carpet, sometimes causing Tridolph to get stuck or be unable to turn fast enough when carpet fibers grab the sled too hard.


rear view with itemized components

Rear view of the body showing:

  1. Battery platform. Very unstable, but a 9v NiMH battery pressure-fits between the two green zip ties and survives moderate activity.

  2. Motors positioned between Lego guides and zip tied into position.

  3. Arduino UNO R3 board with proto-shield on top makes up the top of the body.

  4. Cardboard from a tissue box provides insulation/protection for the conductive pins poking out the bottom of the Arduino.

  5. Jumpers galore form the hair.

  6. Side-view of IR Remote sensor.

Zip tie construction is somewhat feeble, but easy to make and sturdy enough for basic line following in carefully selected environments (i.e. my room).

top view of tridolph body

User Interface


Infrared Remote

As part of my return to robots, I got an Arduino starter kit that came with an assortment of doodads. One of them was a Vishay TL1838 IR receiver. Google landed me with an Arduino library for interpreting the incoming remote control data.

At first, I simply hooked ground to Arduino ground, Vcc to the Arduino's regulated 5v, and the output pin to a digital input on the Arduino (pin 11). Sitting on my desk, the prototype worked like a charm, and I smiled as I watched my serial monitor repeat back my IR remote key presses. Then I wired it into the robot and programmed it to stop and start movement. Press start and he starts moving. Happy robot builder. Press stop and he ignores me completely and drives off under the desk as I scramble.

I threw a bunch of bypass capacitors of various values and chemistries between Vcc and GND to remove signal noise. Substantial improvement, but he still occasionally ignores me. If I wanted a real solution, I would follow the sample circuit on the datasheet and move the sensor away from the motors (it sits directly above a motor in the current build), but for now it's good enough and doesn't bother me too much, since I get to test my reflexes and jump to grab him when he's being difficult.


LED Line Indicators

As mentioned up in the sled section, there are 3 LEDs to indicate what the program has determined each line sensor is looking at, the two choices being line and not-line. These work even when motor control is disabled, which lets me manually move the robot over various surfaces to see what the program thinks.

Lesson learned: check all your LEDs before cutting/gluing/drilling/soldering/heat shrinking/clipping. All three came out of the same bag of 25 identical LEDs, and all three use the same size current limiting resistor, but only one was tested on the breadboard before being committed to the sled. Once everything was up and running, two looked great and the third clearly did not match: a manufacturing or packaging error from 10 years ago. So one of the eyes looks funny, and now I test all my LEDs before I solder.

Drive Train


Returning to robots after 10 years, I dug through the junk box and found two GM2 gear motors ready to play with. I grabbed one and insta-failed: the contacts must have been exposed to moisture over the years and had rusted nearly through. One fell off entirely. A tiny bit of decent metal was still visible in the terminal, so I soldered on some new wire. It works, but just barely. I quickly zip tied the wires securely to the body so that any future force on the wires is transmitted through the zip tie into the motor's body rather than into the delicate terminals.

I'm not sure how powerful a new set of GM2s is, but these are quite weak. It might be the disintegrating contacts, or the age and exposure, or maybe they're just weak motors, which is why they were so cheap back in the day.

Motor Mount

As a first project, I wanted to see some progress and play with a moving object rather than a breadboard sitting on my desk, so the motors are mounted in the least sophisticated way I could think of, namely zip ties and a Lego plate.

Wheels and Coupler

In the same disintegrating bag of old robot parts as the motors, I found a matching pair of Solarbotics wheels that mate directly with the output shaft. They are held on with a little screw that cuts its own threads into the plastic motor output shaft. Rubber bands stretched around the circumference of the wheels provide added traction.

Motor Driver

An incomplete project from 10 years ago provided a SN754410NE quadruple half-H driver (two full H-bridges). The microcontroller uses 3 pins to operate each motor, allowing enable/disable plus full directional control. PWM on the enable pin gives speed control, and high/low settings on the directional pins allow for forward, reverse, coast, and regenerative brake.
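The 3-pin scheme boils down to a small truth table. Here is a minimal plain-C++ sketch of it (the struct, enum, and function names are my own, not from Tridolph's actual sketch; on the Arduino, `en` would feed `analogWrite` and the two booleans would feed `digitalWrite`):

```cpp
#include <cstdint>

// Logical state of the three SN754410 control pins for one motor.
// en carries the PWM duty (0-255); in1/in2 pick the direction.
struct MotorPins {
    uint8_t en;
    bool in1;
    bool in2;
};

enum class Drive { Forward, Reverse, Coast, Brake };

// Translate a drive command plus a PWM speed into pin states.
MotorPins setMotor(Drive d, uint8_t speed) {
    switch (d) {
        case Drive::Forward: return {speed, true,  false};
        case Drive::Reverse: return {speed, false, true};
        case Drive::Brake:   return {speed, false, false}; // inputs equal: windings shorted, motor brakes
        case Drive::Coast:
        default:             return {0,     false, false}; // enable low: outputs float, motor coasts
    }
}
```

With both direction inputs at the same level and the enable high, the driver shorts the motor terminals together for braking; pulling the enable low disconnects the outputs entirely for coasting.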

Control and Logic


Tridolph uses an Arduino UNO R3 programmed with the Arduino IDE/compiler for control.


When commanded via IR Remote, Tridolph runs a calibration routine. For a fixed interval of time, one motor powers forward and one powers backward, pivoting the robot around its center. Up to 100 samples are taken for each of the 3 line sensors. I was worried I'd use up the RAM available on the microcontroller, so sampling stops after 100, though it never gets there because the loop timing completes the rotation interval before 100 samples are taken.

The following calculations are done for each sensor independently:

  • Let “mean overall” equal the average of all samples
  • Let “mean above” equal the average of all values above the mean overall
  • Let “mean below” equal the average of all values below the mean overall
  • Let “line threshold” equal the average of mean above and mean below

Once this is done for each of the 3, choose one (it doesn't matter, but my program uses the right sensor) and see whether the mean overall is closer to the mean above or the mean below. The way he's wired, high numbers are bright numbers, so if the mean overall is closer to the mean above, then calibration saw more bright samples than dark ones, implying the background is bright and the line is dark. Otherwise, there were more dark samples than bright ones, so the background is dark and the line is bright.

Using this technique, the only slightly-matching photoresistors are calibrated to the current surface/line/ambient light, and the program can independently analyse each sensor and decide whether it is over a line or not. It saves me from using trim potentiometers, which seem to be all the rage, but I'm lazy and cheap and personally find complex software easier to comprehend than complex electronics and hardware.

For the record, the “fixed interval of time” technique is garbage. I calculated the duration by testing on one specific surface, where the bot does a nice 360 degree turn with a full battery. With falling battery voltage or a different surface texture, the robot either under-spins or over-spins. A better way would be to note the line at the start, rotate until the line reappears at roughly 180 degrees, then continue another 180 until back home over the line. But this is a beginner's prototype robot; I'll save that fanciness for the production model.

Line Following Algorithm

Three basic conditions:

  • Middle sees the line and both sides are the same - full speed ahead on both motors. If both sides are on the line, then we’re centered over a wide line. If both sides are off the line, then we’re centered over a narrow line.

  • Middle plus only one side see the line - half speed on the side that sees the line, full speed on the side that does not. This provides a gentle nudge for a slight drift off of center.

  • Only one side sees the line - stop on the side that sees the line, full speed on the side that does not. This provides a dramatic nudge for a larger drift from the line.

  • Technically there is a 4th situation - no sensors see the line, full stop on both motors. This helps the bot stop in the event he loses the line (and ignores my IR Remote command to stop).
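The whole decision table fits in one small function. A plain-C++ sketch of the conditions above (the speed constants and function name are mine, not the actual sketch; any sensor combination the list doesn't cover falls through to a full stop):

```cpp
#include <utility>

const int FULL = 255, HALF = 128, STOP = 0;

// Map the three line-sensor readings (true = over the line) to
// (left, right) motor PWM duties, per the conditions listed above.
std::pair<int, int> steer(bool left, bool mid, bool right) {
    if (mid && left == right) return {FULL, FULL}; // centered over a wide or narrow line
    if (mid && left)          return {HALF, FULL}; // slight drift right: gentle nudge left
    if (mid && right)         return {FULL, HALF}; // slight drift left: gentle nudge right
    if (left && !right)       return {STOP, FULL}; // big drift: pivot hard left
    if (right && !left)       return {FULL, STOP}; // big drift: pivot hard right
    return {STOP, STOP};                           // lost the line (or an unlisted combo)
}
```

Slowing the motor on the side that sees the line steers the robot back toward it, since the faster wheel swings the body around the slower one.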

Other Notable Behaviors

  • About Face - Closed loop of tape on the floor. He works in one direction; does he work in the other? Stop him with the IR remote, get off my butt, turn him around by hand, press play on the remote. Way too much work. Instead, press an IR remote button and he rotates for half the duration of the calibration spin, giving an about face or 180 degree turn. Not perfect, but it works well enough for my entertainment.

  • Bouncy - Not intended, but his weight distribution and balance versus his motor power is right on the cusp of flipping backwards. If he hits enough lumpy carpet, he'll start bouncing like a car with hydraulic shocks. Fun to watch whether he'll recover or eventually fall backwards. A real/production robot would be designed to eliminate this, but for a prototype it's a blast.