28 November 2008

Arduino pet robot head taking shape...

My ideas for an Arduino-based pet robot are slowly taking shape in spare minutes. I have a basic movement-based vision system up and running.

At the moment, it has basic (LDR-based) stereo light input, which feeds a weird real-time event integrator algorithm and its associated data structures - one per sensor. These feed a further algorithm which calculates the position data sent to a servo on which the two LDRs are mounted (revolving left and right about the Y axis). The act of turning this "head" servo to indicate captured attention itself creates new input data, leading to complex feedback.
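None of the code below is from the actual build - it's just a minimal sketch of how a differential light-tracker like this might look, assuming two LDRs in voltage dividers on A0/A1 and the head servo on pin 9, with a much cruder integrator than the one described: each sensor keeps a leaky baseline, deviations from that baseline count as "events", and the difference between the two sides steers the head.

```cpp
#include <Servo.h>

// Hypothetical pin assignments - the post doesn't give any.
const int LEFT_LDR_PIN  = A0;   // left light-dependent resistor
const int RIGHT_LDR_PIN = A1;   // right LDR
const int PAN_SERVO_PIN = 9;    // "head" servo, pans left/right

Servo panServo;

// Slowly-adapting baseline per sensor, so we react to *changes*
// in light (movement) rather than absolute brightness.
float leftBaseline  = 0;
float rightBaseline = 0;

float headPos = 90;             // servo angle 0..180, start centred

void setup() {
  panServo.attach(PAN_SERVO_PIN);
  panServo.write((int)headPos);
  leftBaseline  = analogRead(LEFT_LDR_PIN);
  rightBaseline = analogRead(RIGHT_LDR_PIN);
}

void loop() {
  int left  = analogRead(LEFT_LDR_PIN);
  int right = analogRead(RIGHT_LDR_PIN);

  // Deviation from the baseline = "event" strength for that sensor.
  float leftEvent  = left  - leftBaseline;
  float rightEvent = right - rightBaseline;

  // Leaky integration: baselines drift toward the current readings,
  // so a stationary scene fades out of the event signal.
  leftBaseline  += 0.02 * (left  - leftBaseline);
  rightBaseline += 0.02 * (right - rightBaseline);

  // Steer toward whichever side saw the bigger change
  // (sign depends on how the servo is mounted).
  float steer = rightEvent - leftEvent;
  headPos = constrain(headPos + 0.05 * steer, 0, 180);
  panServo.write((int)headPos);

  delay(20);                    // ~50 Hz update
}
```

The leaky baselines are one plausible way to get the "losing interest" behaviour described below: a target that stops moving is gradually absorbed into the baseline and stops generating events.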

The whole thing is designed to produce nothing more controlled than emergent behaviour. For example, I never told it that if I sweep my hand in front of the sensors from left to right, the head should turn to follow it. Nor did I tell it that if I sweep my hand the other way the head should follow. For a laugh, I mounted the device in front of the computer's monitor, set the wallpaper to black and moved a small white window around. The head followed it, slowly losing interest when it stopped. I placed the device in front of a TV and let it track moving objects. It's weird to watch the head suddenly take an interest in things. At some point, I'll post a video.

The next phase is to add more LDRs to resolve movement in more detail, followed by a second servo to control the head's X axis (so that it can look up and down), but already this project is getting very interesting.
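As a rough illustration of where that could go (again entirely hypothetical, with invented pins), the same differencing generalises to a 2x2 LDR grid driving pan and tilt: horizontal imbalance steers one servo, vertical imbalance the other. For brevity this version tracks raw brightness imbalance; the per-sensor baseline trick from the sketch above would apply in exactly the same way.

```cpp
#include <Servo.h>

// Hypothetical 2x2 LDR grid: top-left, top-right, bottom-left, bottom-right.
const int TL = A0, TR = A1, BL = A2, BR = A3;

Servo panServo, tiltServo;
float panPos = 90, tiltPos = 90;   // both start centred

void setup() {
  panServo.attach(9);              // left/right (Y axis)
  tiltServo.attach(10);            // up/down (X axis)
}

void loop() {
  int tl = analogRead(TL), tr = analogRead(TR);
  int bl = analogRead(BL), br = analogRead(BR);

  // Right-minus-left drives pan; top-minus-bottom drives tilt.
  float panEvent  = (tr + br) - (tl + bl);
  float tiltEvent = (tl + tr) - (bl + br);

  panPos  = constrain(panPos  + 0.01 * panEvent,  0, 180);
  tiltPos = constrain(tiltPos + 0.01 * tiltEvent, 0, 180);
  panServo.write((int)panPos);
  tiltServo.write((int)tiltPos);

  delay(20);
}
```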
