Ok, so I’ve got tons of projects on my plate to do and to finish. And tons more that are still very experimental, or haven’t even been started. Plenty to keep me busy, with the summer vacation period nearing fast.
Yet here’s something different. The timeline for this project was imposed by external factors: my daughter Myra doing a project for the last quarter of her second year at the Design Academy in Eindhoven.
She wanted to do something that triggers human interaction, and she wanted to try something with Physical Computing. “Terrific, I’ll help!” – I shouted, completely ignoring all the pending work on my own plate…
So here’s the start of a few articles about the “TwitLEDs” project we’ve been working on recently. All the basic ingredients work as I write this, but we have yet to finish the final setup and go through a last rehearsal.
What is it?
It’s a mix between a matrix printer and a persistence-of-vision (POV) display.
It’s called the TwitLEDs robot. And it’s hooked up to the internet.
Get it? No? Ok, then let me try again. I think this picture tells it best:
The idea is to have a little autonomous robot driving around, leaving messages behind on a floor covered with glow-in-the-dark paint. The messages are collected from Twitter on a laptop, using a configurable search term, and then sent to the robot wirelessly.
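To make the flow concrete, here is a minimal sketch of the server side of that pipeline. It is purely illustrative – the real server runs under JeeMon, and the function name, the simple substring match, and the length cap are all my own assumptions – but it shows the essential step: filter incoming tweets by the search term and strip them down to plain ASCII, since a simple 7-LED writing head can't render accents or emoji.

```python
# Illustrative sketch only (the actual server is a JeeMon process):
# filter a batch of tweets by a search term and sanitize them for a
# 7-LED persistence-of-vision writing head.

def prepare_messages(tweets, search_term, max_len=100):
    """Keep tweets matching search_term, reduced to printable ASCII."""
    out = []
    for text in tweets:
        # Case-insensitive substring match stands in for Twitter's search.
        if search_term.lower() not in text.lower():
            continue
        # Drop anything outside printable ASCII; the POV font is plain text.
        clean = "".join(ch for ch in text if 32 <= ord(ch) < 127)
        out.append(clean[:max_len].upper())
    return out
```

Each surviving message would then be queued and handed to the JeeLink for transmission, one at a time, as the robot finishes writing the previous one.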
There were several fairly non-trivial problems to solve here, with some experimentation needed to find a workable mix – as well as some time constraints. A few days of work would be the most I could set aside for this. Luckily, I didn’t get lost in too many blind alleys, so it worked out nicely.
Here are the pieces we used:
- A low-cost robot kit called Asuro – based on an ATmega8, so I had all the software ready for it. In fact, I played around a bit with it a while back – as reported here.
- A JeeNode for wireless connectivity.
- An Output Plug to drive some LEDs.
- Seven blue LEDs. I picked a bright one with a very focused beam (C503B-BAN-CY0C0461).
- Glow-in-the-dark paint. Green stuff. Three coats.
- Some panels to create a floor. Covered with the green stuff.
- Some cardboard to create an “arena” on the floor to contain the robot.
- JeeMon running on a MacBook, with a JeeLink to send out the messages.
- A fairly dark room. This just won’t work with the lights on, unfortunately.
As with every project, the first part is the hardest and the most critical success factor: figuring out what to do, what not to do, and finding solutions within the many constraints we had to operate under. I’ll spare you the ideas that didn’t make it, and the (really neat) ideas we simply didn’t have time for.
Being the sole programmer on the team, I got to deal with all the software. Yummy! :)
The most important insight for me was that we could implement this project with three completely independent subsystems:
- The LED blinker, driving 7 LEDs in the proper pattern, basically a POV unit (plus receiver).
- The robot, moving around while continuously trying to stay out of trouble.
- The server process running on the laptop, connecting to Twitter and sending messages into the air.
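The first subsystem is really just a matrix-printer font turned sideways: each character becomes a handful of 7-bit column patterns, one bit per LED, flashed in sequence as the robot drives forward. Here is a hedged sketch of that idea – the two-entry font table is hypothetical and only for illustration (a real 5x7 font covers the whole printable ASCII range), and the firmware on the actual blinker is not shown here.

```python
# Sketch of the POV principle: expand text into 7-bit LED column
# patterns. One bit per LED, LSB = bottom LED. The font table below is
# a hypothetical two-character excerpt of a standard 5x7 font.

FONT_5x7 = {
    "H": [0x7F, 0x08, 0x08, 0x08, 0x7F],  # 5 columns per character
    "I": [0x00, 0x41, 0x7F, 0x41, 0x00],
}

def message_to_columns(text):
    """Return the LED column patterns to flash, in writing order."""
    cols = []
    for ch in text:
        cols.extend(FONT_5x7.get(ch, [0x00] * 5))  # unknown char = blank
        cols.append(0x00)  # one blank column between characters
    return cols
```

The nice property is that the blinker needs no notion of speed or position: it just steps through the columns at a fixed rate, and the robot’s motion spreads them out across the glowing floor.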
We started off with the LED blinker because it was a major component with few unknowns, i.e. we picked the low-hanging fruit first. Here’s a picture of it, still under construction:
More on the LED blinker tomorrow…