It had been a while since I’d worked on the robot, and I wanted to try out some movement algorithms. I’ve done some AI work lately on a separate project, and thought it would help with the automated movement task. Unfortunately, the robot had a little accident, namely falling out of the loft while I was bringing it down. It was long overdue for the removal of some excess hardware, and also needed some bugfixes that I now had no choice but to perform.
I’ll bullet point the changes as there are a few:
- Robot now has two charging contacts that need to be specifically enabled via controllable relay
- Excess acrylic was removed
- Robot’s LCD, Speakers, Cam & Battery are also all on relays and can be enabled or disabled via software.
- The Robot has a clap-activated switch, so programs can be triggered by clapping
- The current meter was hooked directly to the battery output
- The Robot has additional IR distance sensors: now 2 front, 2 left, 2 right and 1 back
- Wheels are now much better aligned
- The Robot now has a more compact mic for better voice recognition
- The speaker relay has a capacitor across it to smooth the initial surge when powered on.
- The Carnetix voltage regulator had failed, so it was replaced.
- The software has been rewritten: C to interface directly with the hardware, and Python to provide a TCP server exposing all the necessary functions. I use PHP for the ‘client applications’ that talk to the Python server. Ideally I would use Python for the client apps too, but I haven’t got around to learning Python properly, and this wasn’t the time to do it.
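To give a feel for the C-under-Python-under-PHP split, here’s a rough sketch of what the Python TCP layer could look like. The port number, command format and relay names are all illustrative assumptions, not the robot’s actual protocol, and `set_relay` is a placeholder for what would really be a call into the C hardware layer:

```python
import socketserver

def set_relay(name, state):
    # Placeholder: in the real robot this would call into the C layer
    # that toggles the named relay (lcd, speakers, cam, battery, charge).
    return "OK %s=%s" % (name, state)

# Map the first word of each incoming line to a handler function.
COMMANDS = {"relay": set_relay}

def dispatch(line):
    """Parse a text command like 'relay speakers on' and run its handler."""
    parts = line.strip().split()
    if not parts or parts[0] not in COMMANDS:
        return "ERR unknown command"
    return COMMANDS[parts[0]](*parts[1:])

class RobotHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # One newline-terminated command per line, one reply per command.
        for raw in self.rfile:
            reply = dispatch(raw.decode())
            self.wfile.write((reply + "\n").encode())

if __name__ == "__main__":
    # The PHP client apps would connect here over TCP.
    with socketserver.TCPServer(("0.0.0.0", 9000), RobotHandler) as srv:
        srv.serve_forever()
```

Keeping the protocol to plain newline-terminated text is what makes the PHP side trivial: any language that can open a socket can drive the robot.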
The new hardware and software address a lot of the problems we had before, and the platform is more stable and reliable. I’ve written a set of functions that allow the robot to move around automatically. I’ve mapped the flat into 5 different areas and written a set of rules for handling each area. Each ruleset gets the robot from the start of one section to the end, ready to hand over to the next ruleset. We rely only on the position of the robot relative to known, mapped objects — i.e. ‘while right_sensor_1 and right_sensor_2 are both sufficiently close to the wall and the front sensor is sufficiently far from the door, keep moving forward, while lining up straight with the right wall’ type logic. The downside of this approach is unexpected objects in the robot’s path, doors left open, etc. This could be overcome with more sophisticated coding.
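A single ruleset of the kind described above could be sketched like this. The thresholds, sensor names and drive commands are made up for illustration; the real values would come from mapping out the flat:

```python
WALL_NEAR = 30   # cm: both right sensors should stay within this of the wall
DOOR_FAR = 50    # cm: the rule ends once the front sensor drops below this

def hallway_rule(read_sensors, drive):
    """One ruleset: hug the right wall, moving forward while both right
    sensors see the wall and the front sensor is still far from the door.
    Returns when the end condition is met, ready for the next ruleset."""
    while True:
        s = read_sensors()
        if not (s["right_1"] < WALL_NEAR and s["right_2"] < WALL_NEAR
                and s["front_1"] > DOOR_FAR):
            return
        # A difference between the two right readings means we're skewed,
        # so steer to straighten up against the wall.
        skew = s["right_1"] - s["right_2"]
        if skew > 2:
            drive("veer_right")
        elif skew < -2:
            drive("veer_left")
        else:
            drive("forward")
```

Chaining rulesets then just means calling them in order, each one handing the robot over at a known position for the next.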
Here’s a video of the robot navigating automatically from the bedroom through to the lounge:
In the next article, I’ll show how the new hardware and software allows for greater control over movement in general when being controlled manually.