Hedge Court Robots
Tridolph is a line-following robot, the first mobile robot I built after taking roughly ten years off from hobby robotics. Technically, he is also the first robot of my own creation: ten long years ago I made a few follow-the-instructions bots, but flopped when attempting my own masterpiece.
Tridolph is named for his 3 red noses that let me know what he thinks of the brightness of a line under his left, mid, and right eyes. Up top, he's a frizzy mess of jumper wires.
As a proof of concept, he works perfectly. The concept he proves is that I can start with something simple, get some fun and lots of learning out of it, and then use it to launch into larger or higher quality projects.
Line Sensor Sled
Point-to-point wiring with embedded resistors was tedious and reminded me of why I learned to make PCBs all those years ago.
I started the project with the line sensor sled. I was getting frustrated trying to choose some vaguely matching photoresistors out of my ancient junk box of robot doodads. The solution was a small sled that holds the LEDs and photoresistors in fixed positions, giving me repeatable voltage measurements across varying light conditions and surfaces. Lego is the perfect way to quickly prototype that sort of system: a pin vise drilled holes to pass the leads through the plates, and the photoresistors press-fit into the holes under the plate. Bent wires keep the floor illumination LEDs pointed more or less straight down, and a 10-year-old tube of Liquid Nails keeps the nose LEDs in place.
The sled slides along the ground, which works well on a smooth surface like a carefully prepared line-following course floor tile. On thick pile carpet, though, the friction between sled and ground is rough, sometimes causing Tridolph to get stuck or turn too slowly when the carpet fibers grab the sled too hard.
Zip tie construction is somewhat feeble, but it is easy to build and sturdy enough for basic line following in carefully selected environments (i.e. my room).
IR Receiver
As part of my return to robots, I got an Arduino starter kit that came with an assortment of doodads. One of them was a Vishay TL1838 IR receiver. Google landed me on an Arduino library for interpreting the incoming remote control data.
At first, I simply hooked ground to Arduino ground, Vcc to the Arduino's regulated 5V, and the output pin to a digital input on the Arduino (pin 11). Sitting on my desk, the prototype worked like a charm, and I smiled as I watched my serial monitor repeat back my IR remote key presses. Then I wired it into the robot and programmed it to stop and start movement. Press start and he starts moving. Happy robot builder. Press stop and he ignores me completely and drives off under the desk as I scramble.
I threw a bunch of bypass capacitors of various values and chemistries between Vcc and GND to remove signal noise. Substantial improvement, but he still occasionally ignores me. If I wanted a real solution, I would follow the sample circuit on the datasheet and move the sensor away from the motors (it sits directly above a motor in the current build), but for now it's good enough and doesn't bother me too much, since I get to test my reflexes and jump to grab him when he's being difficult.
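The flaky-reception problem above is mostly a hardware issue; the software side is simple. The IR library decodes each burst into a code, and the sketch only has to map known codes onto a start/stop state while ignoring everything else. A minimal sketch of that dispatch logic, with made-up key codes (the real values depend on the remote, and the function name is mine, not from the original build):

```cpp
#include <cstdint>

// Hypothetical remote key codes -- the actual values depend on the remote.
const uint32_t KEY_START = 0xFFA25D;
const uint32_t KEY_STOP  = 0xFF629D;

// Given the current "motors enabled" state and a freshly decoded IR code,
// return the new state. Noise and unknown codes leave the state unchanged.
bool handleIrCode(bool motorsEnabled, uint32_t code) {
    if (code == KEY_START) return true;
    if (code == KEY_STOP)  return false;
    return motorsEnabled;  // ignore anything we don't recognize
}
```

On the Arduino, something like this would be called from loop() whenever the IR library reports a decoded value.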
LED Line Indicators
As mentioned up in the sled section, there are 3 LEDs to indicate what the line sensors have determined they are looking at, the two choices being line and not-line. These work even when motor control is disabled, which lets me manually move the robot over various surfaces and see what the program thinks.
Lesson learned: check all your LEDs before cutting/gluing/drilling/soldering/heat shrinking/clipping. All three came out of the same bag of 25 identical LEDs, and all three use the same size current limiting resistor. Only one was tested on the breadboard before being committed to the sled. Once he's up and running, two look great and the third clearly does not match, a manufacturing or packaging error from 10 years ago. So one of the eyes looks funny, and now I test all my LEDs before I solder.
Motors
Returning to robots after 10 years, I dug through the junk box and found two GM2 gear motors ready to play with. I grabbed one and insta-failed: the contacts must have been exposed to moisture over the years and had rusted nearly through and through. One fell off. A tiny bit of decent metal was still visible in the terminal, so I soldered on some new wire. It works, but just barely. I quickly zip tied the wires securely to the body so that any future force on the wires is transmitted through the zip tie into the motor's body rather than onto the delicate terminals.
I'm not sure how powerful a new set of GM2s is, but these are quite weak. It might be the disintegrating contacts, it might be the age and exposure, or maybe they're just weak motors, which is why they were so cheap back in the day.
As a first project, I wanted to see some progress and play with a moving object rather than a breadboard sitting on my desk, so the motors are mounted in the least sophisticated way I could think of, namely zip ties and a Lego plate.
Wheels and Coupler
In the same disintegrating bag of old robot parts as the motors, I found a matching pair of Solarbotics wheels that mate directly with the output shaft. They are held on with a little screw that cuts its own threads into the plastic motor output shaft. Rubber bands stretched around the circumference of the wheels provide added traction.
An incomplete project from 10 years ago provided a SN754410NE quad half-H driver (effectively a dual H-bridge). The microcontroller uses 3 pins to operate each motor, allowing enable/disable plus full directional control. PWM on the enable pin gives speed control, and high/low settings on the directional pins allow for forward, reverse, coast, and regenerative brake.
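The per-motor control scheme described above amounts to a small truth table. A sketch of what that might look like in code (the structure, names, and brake convention are mine, not taken from the original sketch):

```cpp
// Pin states for one motor channel of the SN754410: an enable pin
// (PWM-capable) plus two direction inputs.
struct MotorPins { int enablePwm; bool in1; bool in2; };

enum MotorMode { FORWARD, REVERSE, COAST, BRAKE };

// speed: 0-255 PWM duty cycle applied to the enable pin.
MotorPins driveMotor(MotorMode mode, int speed) {
    switch (mode) {
        case FORWARD: return { speed, true,  false };
        case REVERSE: return { speed, false, true  };
        case BRAKE:   return { speed, false, false }; // both inputs equal: brake
        case COAST:
        default:      return { 0,     false, false }; // enable low: outputs float
    }
}
```

With the enable pin PWMed, the same direction settings give proportional speed control in either direction.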
Control and Logic
Tridolph uses an Arduino UNO R3 programmed with the Arduino IDE/compiler for control.
When commanded via the IR remote, Tridolph runs a calibration routine. For a fixed interval of time, one motor powers forward and one powers backward, pivoting the robot around its center. Up to 100 samples are taken for each of the 3 line sensors. I was worried about using up the RAM available on the microcontroller, so sampling stops after 100, although it never hits the cap because I built the loop timing so the rotation interval completes before 100 samples are collected.
The following calculations are done for each sensor independently: the mean of all samples (the mean overall), the mean of just the samples above that overall mean (the mean above), and the mean of just the samples below it (the mean below).
Once this is done for each of the 3, choose one (it doesn't matter which, but my program uses the right sensor) and see whether its mean overall is closer to its mean above or its mean below. The way he's wired, high numbers are bright numbers, so if the mean overall is closer to the mean above, then calibration saw more bright samples than dark ones, implying the background is bright and the line is dark. Otherwise, there were more dark samples than bright ones, so the background is dark and the line is bright.
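Put together, the calibration math from the last two paragraphs can be sketched in plain C++ (the structure and names are mine; the actual Arduino code surely differs):

```cpp
#include <vector>

// Calibration stats for one sensor: mean of all samples, mean of the
// samples above that overall mean, and mean of the samples below it.
struct Calibration { double overall, above, below; };

Calibration calibrate(const std::vector<int>& samples) {
    if (samples.empty()) return { 0, 0, 0 };

    double sum = 0;
    for (int s : samples) sum += s;
    double overall = sum / samples.size();

    double aSum = 0, bSum = 0;
    int aCount = 0, bCount = 0;
    for (int s : samples) {
        if (s > overall) { aSum += s; ++aCount; }
        else             { bSum += s; ++bCount; }
    }
    return { overall,
             aCount ? aSum / aCount : overall,
             bCount ? bSum / bCount : overall };
}

// High readings are bright. If the overall mean sits closer to the bright
// ("above") mean, most samples were bright, so the background is bright
// and the line is dark.
bool lineIsDark(const Calibration& c) {
    return (c.above - c.overall) < (c.overall - c.below);
}
```

For example, mostly-bright samples with a few dark dips (a dark line on a bright floor) pull the overall mean up near the bright cluster, so lineIsDark() reports true.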
Using this technique, the only-vaguely-matching photoresistors are calibrated to the current surface, line, and ambient light, and the program can independently analyse each sensor and decide whether it is over a line or not. It saves me from using trim potentiometers, which seem to be all the rage, but I'm lazy and cheap and personally find complex software easier to comprehend than complex electronics and hardware.
For the record, the "fixed interval of time" technique is garbage. I calculated the duration by testing on one specific surface, and the bot does a nice 360 degree turn with a full battery. With falling battery voltage or a different surface texture, the robot either under-spins or over-spins. A better way would be to watch for the line at the start, keep rotating until it reappears at the 180 degree mark, then continue until it appears again back home at 360. But this is a beginner's prototype robot; I'll save that fanciness for the production model.
Line Following Algorithm
Three basic conditions:
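The list of conditions didn't survive the page conversion, but for a three-sensor follower they are presumably: line under the left sensor (steer left), line under the right sensor (steer right), line centered or ambiguous (drive straight). A sketch under that assumption, with names of my own choosing:

```cpp
enum Steer { STRAIGHT, TURN_LEFT, TURN_RIGHT };

// Hypothetical reconstruction of the three basic conditions: steer toward
// whichever outer sensor sees the line; otherwise keep driving straight.
Steer decide(bool leftOnLine, bool midOnLine, bool rightOnLine) {
    if (leftOnLine && !rightOnLine) return TURN_LEFT;   // line drifting left
    if (rightOnLine && !leftOnLine) return TURN_RIGHT;  // line drifting right
    return STRAIGHT;  // centered (or ambiguous): hold course
}
```

Each Steer value would then map to a pair of motor commands, e.g. TURN_LEFT slows or reverses the left wheel while the right wheel drives forward.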
Other Notable Behaviors