Robot Line Follower

October 12th, 2012

So I’ve had my eyes on the Pololu robot for some time. Here’s a video of it automatically following a track!

Having done that, I decided to add line-following ability to the larger robot. It's quite a simple circuit and fits in nicely with my Arduino experimentation. I have a strip of high-power LEDs with a pot to trim the brightness. Underneath that, I have 5 light-sensitive resistors, reasonably well spaced apart and separated by card. The theory here is that the light-sensitive resistor that is over the black line will have more resistance than the others (the black line reflects less light back at it), and using that, we can work out whether we need to move more to the left or the right.
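To make that concrete, here's a rough sketch of the sensing side; the pin mapping and update rate are assumptions rather than details from the build:

    // A minimal Arduino sketch, assuming the five LDR dividers are wired
    // to analog pins A0-A4 (the pin mapping is an assumption).
    const int SENSOR_PINS[5] = {A0, A1, A2, A3, A4};

    void setup() {
      Serial.begin(9600);                // USB serial back to the main board
    }

    void loop() {
      // Send one comma-separated line of raw readings per update.
      for (int i = 0; i < 5; i++) {
        Serial.print(analogRead(SENSOR_PINS[i]));
        Serial.print(i < 4 ? "," : "\n");
      }
      delay(20);                         // roughly 50 updates per second
    }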


Pololu with XBee Manual Control

The 5 analog inputs are fed to the Arduino, which converts them to digital values and passes them to the board over USB serial. From there we can write a simple algorithm to control movement.
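A minimal version of that algorithm might look like the following sketch; it assumes one comma-separated line of five readings per update, and that the reading is highest for the sensor over the black line (which depends on how the dividers are wired), so treat it as an illustration rather than the actual code:

    /* Pick the darkest sensor and steer toward it. */
    #include <stdio.h>

    int main(void) {
        int s[5];
        while (scanf("%d,%d,%d,%d,%d", &s[0], &s[1], &s[2], &s[3], &s[4]) == 5) {
            int line = 0;                        /* index of the darkest sensor */
            for (int i = 1; i < 5; i++)
                if (s[i] > s[line]) line = i;

            if (line < 2)      puts("steer left");
            else if (line > 2) puts("steer right");
            else               puts("straight on");
        }
        return 0;
    }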

I then decided to go ahead and test the new XBee modules for wireless communication. They really are plug and play. Here's another video:

Driving the robot with the power of the mind

May 3rd, 2011

In this video I'll be showing how I used the Emotiv headset to drive the robot. Please excuse the editing, and in fact the video itself; I will perhaps aim for a better one in future. Hopefully it's sufficiently clear what's happening.

Emotiv EPOC Headset

May 3rd, 2011

Having an interest in Brain Computer Interface (BCI) hardware, and with the release of the Emotiv EPOC headset, I decided to invest in the Research Edition. The research edition comes bundled with the research version of the SDK. The benefit of this version of the SDK is that we also have access to the raw EEG outputs from each channel, as well as a piece of software called TestBench that allows for saving and replaying sessions.

The headset itself comes almost ready to go, just attach the felt pads and wet them with saline. Affixing the headset to the head is a bit difficult the first few times and requires a bit of practice. Once you’ve done it a few times, it becomes a lot easier. The key is getting each of the felt pads absolutely soaking wet with saline before putting it on the head.

I'm going to be using the headset for a number of different purposes. As a BCI, I intend to use it to manipulate the robot, and in future probably other more useful tasks. Separately, I plan to use it to monitor sleep and meditation sessions, a separate area of interest of mine.

In a previous post, I showed how it was possible to control the robot using voice commands. Here, I've modified the voice control script to handle simple keystrokes instead. The Emotiv control panel comes with a tool called 'EmoKey' that can issue defined keystrokes when certain actions are detected within the control panel. Here's a video:

The purpose is really to show how simple it is to interface with any regular program or script using EmoKey. There are other more exciting possibilities, such as directing music based on the Affectiv suite, which deals in mind states such as frustration, alertness, meditation, etc.
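For anyone curious, the receiving side involves almost nothing. EmoKey injects ordinary keystrokes, so a small program only has to read unbuffered characters and forward them to the existing movement commands. The sketch below is a guess at the approach, with a stub standing in for the real movement interface:

    #include <stdio.h>
    #include <termios.h>
    #include <unistd.h>

    /* Hypothetical stand-in for the real movement hook. */
    static void send_command(char c) {
        printf("command: %c\n", c);
    }

    int main(void) {
        struct termios t;
        tcgetattr(STDIN_FILENO, &t);
        t.c_lflag &= ~(ICANON | ECHO);   /* unbuffered, no echo */
        tcsetattr(STDIN_FILENO, TCSANOW, &t);

        int c;
        while ((c = getchar()) != EOF && c != 'q')
            send_command((char)c);       /* e.g. 'w' forward, 'a' left */
        return 0;
    }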

Robot voice control

May 2nd, 2011

I've set up a new mic and used cvoicecontrol (with some bug fixes) to perform voice control. I've integrated cvoicecontrol into my C HAL layer. Each voice model needs training and saving; however, once that's done, they can be reused in the code. For example, if (listen("yesno")) { ... } is all that's required to listen for a yes or a no, assuming that "yesno.cvc" has been trained in advance. I've also integrated the clap switch across one of the Phidgets digital inputs. The software requires two toggles within 8 seconds of each other, and the hardware configuration requires two claps to generate one output toggle. This seems the best way to stop other noise from triggering it. The result is that two sets of two claps are required to activate the voice control. Here's an example:
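In outline, the arming logic looks something like the sketch below. The helper names are hypothetical (the real code polls the Phidgets digital input through the HAL); the point is the timing rule, two toggles no more than 8 seconds apart:

    #include <time.h>

    extern int toggle_seen(void);          /* hypothetical: 1 when the clap input flips */
    extern int listen(const char *model);  /* the HAL call described above */

    void clap_activation_loop(void) {
        time_t last_toggle = 0;
        for (;;) {
            if (!toggle_seen())
                continue;
            time_t now = time(NULL);
            if (last_toggle && now - last_toggle <= 8) {
                last_toggle = 0;           /* two toggles within 8 s: armed */
                if (listen("yesno")) {
                    /* act on a recognised "yes" */
                }
            } else {
                last_toggle = now;         /* first toggle of a possible pair */
            }
        }
    }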

Robot vacuum? Who needs a Roomba

May 2nd, 2011

The Roomba’s a great invention. Who needs one when you’ve got one of these though? </joke> For over 4 times the price and nowhere near the simplicity or ease of use, I present:

Linux robot automatically charging

May 2nd, 2011

Since the robot's rebuild, I've finally tackled the automatic charging situation. There are a number of ways to get the device to autocharge. If it always has line of sight to its charger, it can spin until it finds it using infrared, then follow the beam; this, however, doesn't work without line of sight. It could use a compass, although there are too many magnetic fields around, and this requires advance knowledge of positioning. The simplest method would be to always start on charge and just store movement history, reversing it when it's necessary to charge. The problem here is that even with good wheel alignment AND accelerometers, after even a few movements, simply reversing them is often not good enough to get the robot even close to its original position.
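For illustration, the 'reverse the history' idea amounts to something like this (hypothetical types and helpers; and as noted above, in practice the errors accumulate too quickly for it to work well):

    typedef struct {
        int left;            /* signed speed, left side  */
        int right;           /* signed speed, right side */
        int duration_ms;
    } move_t;

    extern void set_motors(int left, int right);   /* hypothetical */
    extern void sleep_ms(int ms);                  /* hypothetical */

    void retrace_to_charger(const move_t *history, int count) {
        /* Undo each recorded move, newest first, with the speeds negated. */
        for (int i = count - 1; i >= 0; i--) {
            set_motors(-history[i].left, -history[i].right);
            sleep_ms(history[i].duration_ms);
        }
        set_motors(0, 0);    /* and hope we ended up near the charger */
    }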

Linux Robot – Manual Movement

May 1st, 2011

The robot's new design also allows for better manual movement. Here's a video of the robot being manually controlled. This is all done via the TCP server and can be hooked up to any TCP-speaking application.
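As a taste of how simple a client can be, here's a minimal sketch in C; the IP address, port, and single-byte commands are assumptions (the single-character commands are listed in the November 2008 post, and the real protocol may differ):

    #include <arpa/inet.h>
    #include <stdio.h>
    #include <sys/socket.h>
    #include <unistd.h>

    static int send_command(const char *ip, int port, char cmd) {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0)
            return -1;

        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(port);
        inet_pton(AF_INET, ip, &addr.sin_addr);

        if (connect(fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
            close(fd);
            return -1;
        }
        write(fd, &cmd, 1);              /* e.g. 'w' for forward */
        close(fd);
        return 0;
    }

    int main(void) {
        /* Address and port are placeholders. */
        return send_command("192.168.1.50", 4444, 'w');
    }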

Rebuilding the Robot

May 1st, 2011

It had been a while since I'd worked on the robot, and I wanted to work on some movement algorithms. I've done some AI work lately on a separate project and thought that this would help with the automated movement task. Unfortunately, the robot had a little accident, namely falling out of the loft whilst I was bringing it down. It had long been overdue the removal of some excess hardware, and it also needed some bug fixes that I now had no choice but to perform.

Using the Phidget Interface Kit under Linux

August 5th, 2009

I thought that it might be a good idea to write a quick high-level overview of getting the USB Phidget Interface Kit working under Linux. In my case I am of course using 32-bit Debian; however, these instructions should mostly be portable to any other Linux-based OS.


The Robot: Hardware working and ready to go, a few minor glitches

November 26th, 2008
Robot

Progress has, as always, been good lately. The robot boots up quickly and appears on the wireless LAN with OpenSSH running. The internal Atheros miniPCI wasn't doing the trick and wireless performance was shaky at best. I'm using an Alfa Networks USB adapter (r8187) and an 8 dBi gain antenna now, so this has some range!

I was also getting frustrated with the lagginess of the board while VLC was running for streaming audio and video, so I decided on an IP camera (Edimax), which is connected directly to the LAN port on the Alix board (I don't have any reason to use that port for anything else).

The motor control script works well and the device is responsive. At this point I can drive the device around relatively easily and accurately, and stream video and audio back to my laptop, which again is connected wirelessly.

Using 'espeak' you can easily generate a synthesised voice for simple text-to-speech:

echo "I am a robot" | espeak

Everything is working great and I'm pleased so far. The only reason there isn't a video up yet is that I haven't had the time! There will be one up shortly.

The Robot: Independent, moving, talking, and controlled via WiFi

November 22nd, 2008

Robot

I've made some excellent progress over the last week! The robot is now independent, and it moves freely. I've written a simple shell script to take the following characters as control:
a – left
s – stop
d – right
w – forward
x – back
q – hard stop
k – turn anticlockwise
l – turn clockwise

This sends a single byte to the serial port. I am using two USB-to-TTL converters, which show up as /dev/ttyUSB0 and /dev/ttyUSB1. Each serial port controls two motors through the Sabertooth controller. As we control two motors with only a byte, each motor has 7-bit resolution from full reverse to full forward. For motor 1, 0 is stop, 1 is full forward, 64 is stop, and 127 is full reverse. Motor 2 starts at 128 for full forward, with 192 for stop and 255 for full reverse. Despite the 7 bits of resolution, speed changes only seem to occur at intervals of roughly 4, so we really have about 30 distinct speeds: 14 forward, 2 stop, and 14 reverse. We're only using 3 speeds though, as I can't see the benefit in programming for any more right now.
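For reference, sending those bytes from Linux is only a few lines. This is a sketch rather than the actual control script (the baud rate, for instance, is an assumption):

    #include <fcntl.h>
    #include <stdio.h>
    #include <termios.h>
    #include <unistd.h>

    static int open_controller(const char *dev) {  /* e.g. "/dev/ttyUSB0" */
        int fd = open(dev, O_RDWR | O_NOCTTY);
        if (fd < 0)
            return -1;
        struct termios tio;
        tcgetattr(fd, &tio);
        cfmakeraw(&tio);
        cfsetspeed(&tio, B9600);                   /* baud rate assumed */
        tcsetattr(fd, TCSANOW, &tio);
        return fd;
    }

    /* One byte per command: values 0-127 address motor 1, 128-255 motor 2. */
    static void set_motor(int fd, unsigned char value) {
        write(fd, &value, 1);
    }

    int main(void) {
        int fd = open_controller("/dev/ttyUSB0");
        if (fd < 0) { perror("open"); return 1; }
        set_motor(fd, 1);     /* motor 1 full forward, per the mapping above */
        sleep(1);
        set_motor(fd, 64);    /* motor 1 stop */
        close(fd);
        return 0;
    }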

The movement now seems to be working well: smooth, controlled, and straight, which is something of a miracle 😉

Robot

The battery is a 12 V/7.2 Ah sealed lead-acid battery. With USB devices active, the board running, the processor 100% busy, the peripheral fancy LEDs on, digital outputs high, wifi active, etc., it draws about 800 mA at 12 V.

With all four motors moving at full speed, it draws about 6 A at 12 V. Seeing as the motors will be in action for short periods only, I would expect 6 h+ of battery life.
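As a rough sanity check on those figures: 7.2 Ah at the 0.8 A idle draw is about nine hours, while continuous driving at 6 A would give barely over an hour, so 6 h+ with the motors only running in short bursts seems plausible.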

I have tested the sensors, and they are all working and reporting data, except for the top back one, which I'm going to have to investigate.


The Robot: Base, Wheels, Motors, and Sabertooth Motor Controllers

November 16th, 2008


After attaching the 4 motors and brackets to the acrylic square, I found that it started to dip slightly due to the weight. As I'd planned to put a 1.5 kg lead-acid battery in the centre, I realised that this needed to be addressed. Rather than another visit to Homebase for some steel reinforcement, I just stuck (melted) two pieces firmly together with polycarbonate acid glue and then trimmed the edges with an electric saw.

Base

Here is the base; the insulation tape all over the place is to hold down the connectors that I won't be needing. The motors all contain encoders, which I didn't want to just rip out, so I've preserved the connectors for future use and simply cut the - and + cables in a way that they can easily be reconnected to the connector if I ever want to. They were expensive motors, so I didn't want to ruin them!

If anyone is wondering why I didn't attach standoff cylinders to the controller's super-large heat sink rather than attaching it directly to the acrylic base [which would normally be a bad idea], it's because I didn't have any standoffs left, and the controllers are capable of 25 A per channel. I will never drive them at higher than 4 A, and running the motors at 4 A for 30 minutes, or at 2 A for 2 hours solidly as a test, didn't generate any noticeable heat on the heat sink at all. At first I had also planned on a fan to suck air in from the base, but I'm not sure it will be necessary, as nothing seems to get remotely hot so far.

I've also slightly indented the 4 points where the acrylic cylinders will be glued, just for extra stability. The motors are all wired to the two motor controllers, which now have a junction box waiting for 12 V. The picoPSU should arrive some time this week, so hopefully I can get on with it.

Omnidirectional Wheel

The wheels are omnidirectional, as they contain rollers. It's a clever design and it seems to work well. In fact, I'm pleased with the way the motors and wheels ended up. Instead of having to work with two wheels and spend time calculating angles for servo motors and turning radii, I can just attach 4 motors in the configuration that I have, using omnidirectional wheels. The motors will pull a lot of weight, and I only have to concern myself with backward and forward for each motor, which in combination will allow it to move in any direction. Hey, I'm not saying that I 'invented' this ingenious combination, just taking the credit for a smart move in implementing it! I have connected a power source directly across each of the motors to test. They are straight, and when I turn them all in the same direction, the board rotates around a 'very almost perfect' fixed axis, which is great. I had in mind when positioning these that I didn't want to spend a ton of time in software compensating for wheels that aren't straight.
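For what it's worth, the usual mixing scheme for a four-wheel omnidirectional layout is a signed sum per wheel, something like the sketch below; the signs depend entirely on how the wheels are mounted, so this is an illustration rather than the mapping used here:

    /* One common mixing scheme for four omnidirectional wheels. Each wheel
       speed is a signed sum of the desired forward, sideways, and rotation
       components; the signs depend on the actual wheel arrangement. */
    void mix(int forward, int sideways, int rotate, int out[4]) {
        out[0] = forward + sideways + rotate;   /* front left  */
        out[1] = forward - sideways - rotate;   /* front right */
        out[2] = forward - sideways + rotate;   /* back left   */
        out[3] = forward + sideways - rotate;   /* back right  */
    }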

The Robot: A body, Sensors and Good Progress

November 14th, 2008


Progress is going really well and I'm happy so far. I didn't really want to show the body yet, as it is so far from finished, but as I haven't posted an update in a while, I decided to just go with it.

Front

Head

The body is ever so slightly lopsided, by a few mm here and there, which is a shame; however, from a short distance you wouldn't notice. It stands up straight and the weight distribution is even throughout the base plate, so I'm happy with it. OK, 'professionally' the body's a mess, but given my zero experience in this kind of work, I'm reasonably happy.

This is the front of it: at the top is a mounted webcam, to the left of that is a Phidgets temperature sensor, and at the top right is a Phidgets light sensor. I am waiting to add 8 coloured status LEDs around a small flat-panel 5 V stereo speaker as a 'mouth' (I got it from a Nokia phone bundle).

The Robot: Phidgets USB Interface Board Kit works

November 7th, 2008

The Phidget interface kit arrived and so did a few of the analog sensors that I ordered. I can’t believe just how simple they are to use and just how friendly and comprehensive their SDK is!

Here are some pictures:

Phidget Interface Kit

This is the interface kit itself. It's a regular USB device and draws minimal power. Along the top of the board are the analog sensor inputs; each is connected via a simple 3-pin wire: ground, data, and +V. Along the right-hand side are 8 simple digital on/off inputs, and along the left-hand side are 8 equally simple digital on/off outputs. In this case, I have connected the Phidgets analog light sensor, which you can just about see on the left of the picture. Download the Phidgets Linux SDK from their site, compile it, and run the examples. The range on the light sensor is fantastic. It advertises a 0 to 500 range and does indeed live up to the promise. In pitch black the sensor reads < 5, and pushed up close against a 400 W light it reads > 480. In normal light conditions it reads between 30 and 180 – very, very useful.
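For a flavour of the API, a minimal C program in the spirit of the SDK's own samples looks like this (check the SDK examples for the authoritative version; sensor index 0 is assumed to be the light sensor):

    #include <stdio.h>
    #include <phidget21.h>

    int main(void) {
        CPhidgetInterfaceKitHandle kit = 0;
        CPhidgetInterfaceKit_create(&kit);
        CPhidget_open((CPhidgetHandle)kit, -1);    /* -1 matches any serial number */

        if (CPhidget_waitForAttachment((CPhidgetHandle)kit, 10000)) {
            fprintf(stderr, "no InterfaceKit found\n");
            return 1;
        }

        int value = 0;
        CPhidgetInterfaceKit_getSensorValue(kit, 0, &value);   /* analog input 0 */
        printf("light sensor reads: %d\n", value);

        CPhidget_close((CPhidgetHandle)kit);
        CPhidget_delete((CPhidgetHandle)kit);
        return 0;
    }

Compile with -lphidget21.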

The SDK comes with plenty of examples and is incredibly user-friendly! I would recommend these all day long; it really is plug and play.

IR Distance Sensor

And here's a distance sensor. It's a simple IR mechanism that ranges from about 10 cm to 1 m. There are also sensors available covering 5 mm to 10 cm. Again, it works great and is really reliable.

Now that these work, I've ordered some more and they're on their way: one temperature sensor, two voltage sensors, some sonar sensors, and more IR sensors. Fantastic products.

In the meantime, I've ordered a load of clear acrylic and plan to start putting a body together shortly.

I'm still having a little trouble talking to the motor controller, so if anyone has any I2C knowledge, please, please let me know. I don't want to buy a prebuilt base; I think it's cheating.

The Robot: Successful installation of Debian onto the Alix 3c2 board

November 2nd, 2008

Some hardware has arrived!

Mess

So my working space is a little bit of a mess at the moment. There’s no better way of getting started than just getting straight to the point.

The Alix 3c2 main board arrived in good health and works well. On the underside are a 512 MB CF card and an Atheros MiniPCI WiFi card. I've soldered single-core wire to the I2C bus pinout: GND, CLK, Data, and +3 V.

I've also soldered bell wire across the power input. It accepts a wide input range, so I've decided on 12 V.

This is my prototype "power distribution board". Currently it consists of two 12 V/2 A regulators, some resistors, and a 1000 µF/30 V smoothing capacitor. It provides 12 V to the Alix board and 12 V to the motor controller. If both motors stall, they can draw up to 6 A, so whilst this is fine for testing the controller board, I'm going to have to replace one of the regulators with a transformer system to provide the necessary power to the motors.