Computing stuff tied to the physical world

Posts Tagged ‘JeeBot’

TwitLEDs finale

In AVR, Hardware, Software on Jun 29, 2010 at 00:01

The TwitLEDs project by Myra and Jean-Claude Wippler (daughter and father) has come to a conclusion. It was great. I’ll just summarize by pointing to the six posts on the Jee Labs weblog describing the technical details, and some pictures and videos, showing the results.

Here you can see the finished TwitLEDs robot:

Dsc 1741

This is the, ehm… print head?

Dsc 1749

And last but not least, some videos. First a trial run for the print mechanism:

Next, a trial run for the vehicle, trying to stay out of trouble:

And here the result!

And one more:

Ok, that’s it. Myra and I both had oodles of fun building this and trying things out – hope you did too :)

Onwards!

(If you can’t view the videos on vimeo, see 1, 2, 3, and 4)

TwitLEDs robot, part 4

In Hardware on Jun 28, 2010 at 00:01

Today’s episode continues the description of a little robot car which “prints” Twitter messages on a glow-in-the-dark surface.

The last missing piece was the Twitter interface. I implemented it in Tcl/Tk using JeeMon.

Here’s the setup we’re using:

Dsc 0002

The idea is to search Twitter every 30 seconds for a specific search term, and then send the message into the air using a JeeLink.

This is the main code, i.e. application.tcl:

Screen Shot 2010 06 27 at 13.02.06

It uses a custom rig (i.e. module) called twitter.tcl to do the generic part, i.e. firing off HTTP GET requests with a query, and reporting results on screen as well as calling a user-specified call-back proc:

Screen Shot 2010 06 23 at 22.58.32

Here’s an example of the output, when left running for a little while – note that only messages containing the text “design” will be picked up:

Screen Shot 2010 06 23 at 23.10.39

This implementation is a bit simplistic. It will truncate each tweet and only send out the first 60 characters (to fit into a single RF12 packet).
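To make that concrete, the truncation step boils down to something like this plain C sketch – the function name and buffer handling are made up for illustration; only the 60-character limit comes from the text:

```c
#include <string.h>

#define MAX_PAYLOAD 60  /* first 60 characters fit in one RF12 packet */

/* Copy at most MAX_PAYLOAD bytes of the tweet into buf and return the
   length. Note that this blindly cuts through multi-byte UTF-8
   sequences - simplistic indeed, but fine for a proof-of-concept. */
static int prepare_packet(const char *tweet, char *buf) {
    int len = (int) strlen(tweet);
    if (len > MAX_PAYLOAD)
        len = MAX_PAYLOAD;
    memcpy(buf, tweet, len);
    return len;
}
```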

But hey… this is proof-of-concept, after all!

Update – the arena wall is ready. Time to try it out!

Dsc 0017

Here’s a video of the TwitLEDs robot, driving around while staying out of trouble… as you can see, it really manages to do so – mostly ;)

That almost concludes this project. Tomorrow, the last installment will be posted, with final photos and videos.

TwitLEDs robot – action shots

In AVR, Hardware, Software on Jun 25, 2010 at 00:01

As promised in the previous post, some pictures and movies of the TwitLEDs robot in action.

First another view of the robot, with the different pieces:

Dsc 1747

We don’t have the proper “arena” yet, i.e. a big fenced-off floor area covered with glow-in-the-dark paint. Instead, these first trials used two pieces of foam board, taped together. It’s not perfect because the robot keeps running off track, and because the hump between the two pieces is quite high. Foam board isn’t really suited for this: it curls up too much from the moisture in the paint. We probably should have waited a bit longer for everything to dry completely…

The initial test code just printed out its name (povGlow) and the compilation date. In the first tests, one of the LEDs wasn’t working (a software bug, not a hardware one), so the text isn’t quite right:

Dsc 2039

You can clearly see the fading of the letters over time. This happens very quickly, but those faded letters then remain visible for quite a while. That’s why you can still see several trial runs “printed out” on the foam board.

So the basic idea of printing with light works, as you can see!

Here is a video (sorry, not inline), showing how the robot veers to the right as I put my hand in front of it to prevent it from running off the track. Note that I’m not touching the robot, I just briefly trigger the distance sensor. The on-board LED lights up in red when the correction takes place. It’s not very smooth, but it works.

Another picture, showing the decay of the printed text brightness:

Dsc 2067

Maybe 7 pulsed blue lasers (from DVD writers, perhaps?) could be used to create even more intense blue dots. As it is, with these blue LEDs the writing is very clear – but only in a relatively dark room. With the lights on, the text becomes virtually invisible. Even though the LEDs are bright enough to be painful to look at:

Dsc 1749

The quality of this “printer” is actually pretty good, considering how simple its technology is:

Dsc 2056

Finally, a run where the robot happened to stay a bit longer on track, allowing it to display its brief message a few times. Again as video – you can see the wobbling foam board, and this thing driving like a drunken duck, leaving its trail of fading messages behind.

All that remains, is to try and get some tweets into it in real time. Stay tuned…

Several days of testing have now drained the four alkaline AA cells to the point where the robot drives noticeably slower. Looks like we’re getting no more than an hour or two out of these batteries. Which is not really surprising: the DC motors must be eating quite a bit of current, and the LEDs probably draw up to 200 milliamps or so. Self-powered autonomous motion is really hard!

TwitLEDs robot, part 3

In AVR, Hardware, Software on Jun 24, 2010 at 00:01

To continue on yesterday’s post, here is how we got a little autonomous robot going.

The main part was solved by picking the Asuro robot kit. It’s really low-end, but it has just enough functionality for this project, and at €50, it’s very affordable. In fact, Myra built a spare one because the first unit broke down. In the end, I was still able to fix it: a burnt-out transistor (both H-bridges are done with discrete components!). So now we have two Asuro’s:

Dsc 1750

The nice bit about the Asuro is that it has an odometer on each wheel. IOW, there are IR LEDs + sensors to count the number of steps (8 per rev) made by the wheel, and the C library code includes logic to adjust the speed of the motor. It’s a bit crude, but because of this the Asuro can drive fairly straight. As we found out later, it has a bit more trouble doing so while driving slowly, so it’s still a bit wiggly.
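With 8 ticks per revolution, the odometry arithmetic is simple enough to sketch in a few lines of C – the measurement window and the 1/4 correction gain below are illustrative; only the 8-ticks figure is the Asuro’s:

```c
/* Odometry arithmetic for the Asuro's wheel encoders: 8 ticks per
   wheel revolution. Given a tick count over a measurement window,
   derive revolutions per minute. */
#define TICKS_PER_REV 8

static long ticks_to_rpm(long ticks, long window_ms) {
    /* revs = ticks / 8; rpm = revs * (60000 / window_ms) */
    return ticks * 60000L / (TICKS_PER_REV * window_ms);
}

/* Crude proportional speed trim, in the spirit of the C library's
   correction logic (the gain of 1/4 is an assumption). */
static int speed_trim(int target_rpm, int actual_rpm) {
    return (target_rpm - actual_rpm) / 4;
}
```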

But it works. Two small DC motors, some simple gears, motor axles soldered to the PCB (what a great low-cost solution), and room for an extension board. To give you an idea of how crude this thing really is: there is no on-board voltage regulator. When used with 4 alkaline AA batteries, you have to remove a jumper so the extra voltage drop over a diode gets the supply voltage down to under 5.5V …

The Asuro is full of such nifty cost-cutting tricks. It even includes a bidirectional IR link, over which new code can be uploaded. The IR link is very short-range, so it would have been insufficient for our purposes – but for quick code tweaks, the IR link works fine.

Speaking of code, here is the main avoider.c logic running on the ATmega8:

Screen Shot 2010 06 18 at 23.57.08

This is not an Arduino sketch, but I made it look a bit like one by using the same setup() and loop() functions.

This code has to be compiled with avr-gcc. The Asuro comes with several examples with Makefiles. I simply copied one and started extending it. A little optional USB-IR adapter is used to connect to the Asuro (an RS232 one is included with the Asuro kit, but I don’t have any RS232 ports). The whole setup works pretty well from the command line, especially considering that it’s a completely different setup than the Arduino IDE.

Anyway, back to the design of the TwitLEDs robot.

The challenge was to find a simple way to make this robot drive around without bumping into things. The Asuro has a row of switches at the front, but these are not very reliable, and besides: bumping and stopping and turning would mess up the LED strip being “printed” on the glow-in-the-dark paint.

So instead, we mounted a Sharp 10..80 cm distance sensor on top. It’s very easy to read out, since it produces an analog voltage, inversely related to the distance of objects in front of it.

The logic for collision avoidance is crude, but sufficiently effective: turn more and more to the right as you get nearer to an obstacle. The on-board LED turns from off to green to yellow to red as the distance decreases, so it’s easy to see what the robot is doing. I’ve tweaked it so that when it drives straight toward an obstacle, the turn is quick enough that it never bumps into it.
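In pseudo-real C, that rule could look like this – the ADC thresholds and scaling are illustrative placeholders, not the values used on the robot; the Sharp sensor’s output voltage rises as an obstacle gets closer, so a higher reading means a stronger right turn:

```c
/* "Turn harder as you get closer", sketched as host-testable logic.
   Thresholds of 300/500/700 on a 10-bit ADC are illustrative only. */
enum Warn { LED_OFF, LED_GREEN, LED_YELLOW, LED_RED };

static enum Warn warn_level(int adc) {
    if (adc < 300) return LED_OFF;
    if (adc < 500) return LED_GREEN;
    if (adc < 700) return LED_YELLOW;
    return LED_RED;
}

/* Steering bias 0..255, e.g. subtracted from the right motor's speed. */
static int turn_bias(int adc) {
    if (adc < 300) return 0;
    int bias = (adc - 300) * 255 / (1023 - 300);
    return bias > 255 ? 255 : bias;
}
```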

Here’s the completed unit, with the Sharp proximity sensor on a little expansion board at the front, and the LED blinker glued to the side of the battery pack – the LEDs hover 1 .. 2 mm above the ground:

Dsc 1741

Myra’s idea was to have this thing drive inside a more or less circular “fence” made of corrugated cardboard. The whole area inside the fence will be made of some panels, covered with 3 layers of glow-in-the-dark paint.

There is a flaw in the current design, in that the robot can only evade an obstacle by turning to the right. If it approaches a wall obliquely from the right, it will do the wrong thing and drive straight into the wall. But we’re hoping that, as long as it doesn’t over-correct, sending it off at a tangent will keep it adjusting slightly to the right as it drives around and around in “sort of” circles.

More tests and tweaks will no doubt be needed.

But as it is, this thing really seems to stay out of trouble. We can let it loose in the room and it’ll veer to the right whenever it comes near an obstacle. Sometimes it veers too far, though. Again, I hope a more controlled “arena” will be sufficiently simple to keep it going.

When the robot does bump into its front switches, it turns both motors off, blinks for 10 seconds, and then starts off again. Time enough to pick it up and aim it in a different direction.

So that’s it. An autonomous unit with two independent computers, IR distance sensing, a short range IR link, and the JeeNode with a longer-range wireless RF link. Even including a JeeLink on the laptop side, the cost of all this was only slightly over €100.

Tomorrow, some pictures + movies. Gives me time to finish the Twitter link, and then on to the grand finale!

(Reminder: one week left for the June special in the Jee Labs shop!)

TwitLEDs robot, part 2

In AVR, Hardware, Software on Jun 23, 2010 at 00:01

Yesterday’s post introduced the robot Myra and I have been working on. Here’s the first part we built:

Dsc 1725

It’s basically a backplane for the LED blinker component. Or to put it differently: a simple persistence-of-vision (POV) unit, using a JeeNode and an Output Plug to drive a few LEDs. Only the output plug was soldered in permanently. The removable JeeNode allows it to be easily programmed and re-used, and the removable LEDs allow trying out different units. This turned out to be important, because I only had a few green LEDs when starting out, and had no idea yet what sort of LEDs would give the best POV results later on.

Myra did all the soldering. Here are the two LED mounts we ended up with:

Dsc 1745

The one on the left is the super-duper LED concoction we built as final version. The one on the right was great for initial testing.

Everything is held together with rubber bands, zip ties, tape, and ample amounts of hot glue (once verified to work!) – hacking at its best, clearly:

Dsc 1742

Here’s the LED blinker with the final LED strip, side view:

Dsc 1743

Side view close-up – with the foam board cover:

Dsc 1746

Bottom view:

Dsc 1744

Seven blue LEDs, ready to shine very brightly and controlled by the JeeNode.

The software started out very simple, of course. Things like this, just to make sure it all works:

Screen Shot 2010 06 18 at 23.01.21

This is the main part of what is more or less the final twitLEDs.pde sketch:

Screen Shot 2010 06 18 at 23.03.06

I found a suitable font table by googling around a bit. This is needed to go from ASCII characters to dots-in-a-readable-pattern. No room for Unicode (don’t laugh: some tweets are in Japanese and Chinese, and they won’t show properly).
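Here’s the gist of such a font table, as a C fragment: each character becomes five byte-wide columns, one bit per LED, flashed in sequence as the robot drives along. The two glyphs below use the common 5x7 layout (bit 0 = top LED); the actual table in the sketch came from that web search and may well differ:

```c
#include <stdint.h>

/* Two sample glyphs in 5x7 column format, bit 0 = top of the 7 LEDs. */
static const uint8_t font5x7_H[5] = { 0x7F, 0x08, 0x08, 0x08, 0x7F };
static const uint8_t font5x7_I[5] = { 0x00, 0x41, 0x7F, 0x41, 0x00 };

/* Count how many of the 7 LEDs are lit for a given column. */
static int lit_leds(uint8_t column) {
    int n = 0;
    while (column) { n += column & 1; column >>= 1; }
    return n;
}
```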

The amazing bit is that everything worked essentially on first go. It blinked! But does it blink in the proper pattern? Our first test consisted of Myra taking a long-exposure shot, as I waved this thing around in the air – with the lights off. Liesbeth tracked progress through all the shrieks and laughs… but from a safe distance :)

Dsc 1793

Yippie. It really works!

Tomorrow: driving around without bumping into things.

Something different…

In AVR, Hardware, Software on Jun 22, 2010 at 00:01

Ok, so I’ve got tons of projects on my plate to do and to finish. And tons more that are still very experimental, or haven’t even been started. Plenty to keep me busy, with the summer vacation period nearing fast.

Yet here’s something different. The timeline for this project was imposed by external factors: my daughter Myra doing a project for the last quarter of her second year at the Design Academy in Eindhoven.

She wanted to do something which triggers human interaction, and she wanted to try something with Physical Computing. “Terrific, I’ll help!” – I shouted, completely ignoring all the pending work on my own plate…

So here’s the start of a few articles about the “TwitLEDs” project we’ve been working on recently. All the basic ingredients work as I write this, but we have yet to finish the final setup and go through a last rehearsal.

What is it?

It’s a mix between a matrix printer and a persistence-of-vision (POV) display.
It’s called the TwitLEDs robot. And it’s hooked up to the internet.

Get it? No? Ok, then let me try again. I think this picture tells it best:

Dsc 2069

The idea is to have a little autonomous robot driving around, leaving messages behind on a floor covered with glow-in-the-dark paint. The messages are collected off Twitter using a configurable search term. This is done from a laptop and then sent to the robot by wireless.

There were several fairly non-trivial problems to solve here, with some experimentation needed to find a workable mix – as well as some time constraints: a few days of work would be the most I could set aside for this. Luckily, I didn’t get lost in too many blind alleys, so it worked out nicely.

Here are the pieces we used:

  • A low-cost robot kit called Asuro – based on an ATmega8, so I had all the software ready for it. In fact, I played around a bit with it a while back – as reported here.
  • A JeeNode for wireless connectivity.
  • An Output Plug to drive some LEDs.
  • Seven blue LEDs. I picked a bright one with a very focused beam (C503B-BAN-CY0C0461).
  • Glow-in-the-dark paint. Green stuff. Three coatings.
  • Some panels to create a floor. Covered with the green stuff.
  • Some cardboard to create an “arena” on the floor to contain the robot.
  • JeeMon running on a MacBook, with a JeeLink to send out the messages.
  • A fairly dark room. This just won’t work with the lights on, unfortunately.

As with every project, the first part is the hardest and the most critical success factor: figuring out what to do, what not to do, and finding solutions within the many constraints we had to operate under. I’ll spare you the ideas that didn’t make it, and the (really neat) ideas we simply didn’t have time for.

Being the sole programmer on the team, I got to deal with all the software. Yummie! :)

The most important insight for me was that we could implement this project with three completely independent subsystems:

  • The LED blinker, driving 7 LEDs in the proper pattern, basically a POV unit (plus receiver).
  • The robot, moving around while continuously trying to stay out of trouble.
  • The server process running on the laptop, connecting to Twitter and sending messages into the air.

We started off with the LED blinker because it was a major component with few unknowns, i.e. we picked the low-hanging fruit first. Here’s a picture of it, still under construction:

Dsc 1726

More on the LED blinker tomorrow…

Bad things happen

In Hardware on Jun 13, 2009 at 00:01

Looks like one of the sensors broke:

Broken sensor

That’s an IMU (Inertial Measurement Unit) with a dual accelerometer and a gyroscope. The gyro is the “big” one on the right. It senses rotation around the Z axis, perpendicular to the board. The accelerometers are working fine, but the gyro appears to be toast. No matter what I try, it produces a noisy but fixed analog voltage. Connecting either of the self-test pins doesn’t do anything.

Ouch, the gyro is also the most expensive part (in the €25..50 range). And this was a rather nice and sensitive one. Oh, well…

(that’s an Arduino with a battery pack by LiquidWare underneath, btw – so I can move this thing around and hook up the scope without having to attach power)

A tale of stand-alone builds

In AVR, Software on Jun 10, 2009 at 00:01

The JeeBot Duo runs off a Baby Orangutan controller by Pololu, which isn’t entirely compatible with the Arduino IDE. Well, it can be made to work with it, but in this case using avr-gcc and avrdude directly from a Makefile looked just as simple.

Boy, was I wrong.

I’ve spent nearly two days trying to understand why this thing wasn’t working. Or rather, it was. Sometimes. But with resets all the time, and the watchdog being tripped even though I never enabled it.

One problem is that the BOC has no on-board signaling other than a single LED. So the first thing was to set up serial communication. Well, that worked. But then it didn’t. And then things got really, totally erratic…

The problem turned out to be in the highlighted portion of this Makefile:

Picture 2.png

Without the $(CFLAGS) on the linker line, everything   s o r t   o f   works.

Until you start using RAM for data, such as statics/externals and strings. Or C++ classes inside the Pololu libraries for example.

Then the system goes berserk. Because the linker wrongly assumes that RAM starts at 0x0060. So strings and data variables end up being written into the extended I/O area. Want to know what’s at address 0x0060? The watchdog control register.

Anyway. It took me many hours of complete despair to figure this one out. Salvation came from “avr-objdump -h”, which clearly showed that the “.data” segment was in the wrong place compared to an Arduino sketch. With many thanks to Ben @ Pololu for nudging me towards the solution and for patiently responding to my emails, some of which must have been pretty weird…

I think that what baffled me most of all was the fact that the simplest bits of code worked just fine. You can get a lot of little tests going without ever using strings or allocating static variables. And the failures being random resets (from the watchdog) kept me looking for hardware problems – silly me.

At one point, I had figured out the incorrect starting RAM address and added this linker option: “-Wl,-Tdata,0x800100”. I was unsure why it had to be specified, but it got past the most erratic behavior.

But then interrupts wouldn’t work. Disassembly (with “avr-objdump -D”) showed that the generated code was using an incorrect set of vectors. And then it dawned on me: the $(CFLAGS) includes “-mmcu=atmega328p” which the linker needs to put things in the right place and to use the right vectors.

Doh. Totally obvious in hindsight.
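For reference, here’s roughly what the fixed link rule looks like – file names and extra flags are illustrative, the essential bit being that $(CFLAGS), and thus -mmcu, reaches the linker:

```make
# Sketch of the relevant Makefile lines. The key point: -mmcu must
# reach the *linker* too, not just the compiler, so the .data segment
# lands at the right RAM address and the correct vector table is used.
MCU    = atmega328p
CFLAGS = -mmcu=$(MCU) -Os -Wall

main.elf: main.o
	avr-gcc $(CFLAGS) -o $@ $^   # $(CFLAGS) here is what was missing
```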

Now I can finally generate and upload code without having to go through the Arduino IDE.

JeeBot Duo schematic

In AVR, Hardware on Jun 8, 2009 at 00:01

Heh – my most “sophisticated” robot so far. Here’s the plan for the electrical hookup of the JeeBot Duo:

Picture 1.png

There’s quite a bit of functionality in there, all the way to an on-board LiPo battery charger. Other things to note: the Baby Orangutan controller (BOC) can shut down all power (to avoid running down the battery too far) and the JeeNode can reset the BOC if need be.

The two processors communicate with each other via an I2C bus. The on-board FTDI header connects to the BOC serial port and supplies power to the LiPo charger. The FTDI header on the JeeNode can be used for monitoring, control, and uploading, as usual.

JeeBot Duo – in progress

In AVR, Hardware on Jun 7, 2009 at 00:01

To continue this bot-craze, here’s another design, the JeeBot Duo.

More JeeBots in progress

This design is considerably more advanced. From left to right:

  • two 50:1 micro motors with wheels and encoders (Pololu)
  • a 6-pin FTDI header
  • the Baby Orangutan 328 motor controller (Pololu)
  • underneath: a low-voltage boost regulator (Pololu)
  • at a 45° angle: IMU board with 2-axis accelerometer and gyro (SparkFun)
  • a JeeNode v2 with on-board RFM12B 868 MHz radio module
  • underneath: an 860 mAh LiPo battery

It’s called a JeeBot Duo because there are two processors on board. The plan is to put the self-balancing smarts in the Baby Orangutan – which has all the hardware needed to connect to all sensors and motors. The JeeNode is available for telemetry, remote control, and perhaps to connect to further gimmicks later on.

Initial tests show that the motors have plenty of speed and torque, and draw 40 .. 300 mA each, depending on load. The boost regulator can produce a constant 5.4 V for the motors, while the battery level gradually decreases. There’s a tiny bit of slack in the gears, which translates to about 1 mm of slack/travel on the wheels.

The remaining space is for a compass module and an on-off switch. That makes this a relatively complete autonomous unit, able to sense its vertical position and change, as well as the distance traveled and the compass heading. With the proper software, it should one day be able to respond to commands such as “go 2 meter north”, or “drive around in a 1-meter circle”.

But that’s a long way off. Lots of tuning and software development will be needed to get there…

JeeBot Mini – in progress

In AVR, Hardware on Jun 6, 2009 at 00:01

Here’s a second design, called the JeeBot Mini – no less:

More JeeBots in progress

The first JeeBot used servos, but that’s not really such a great fit for self-balancing. This one uses miniature (and pretty weak) motors, directly driven by the ATmega’s I/O pins. The small socket is for the same Memsic 2125 accelerometer as on the original JeeBot, but placed at a 45° angle to use both the X and Y axes together.

On the right is a tiny 100 mAh LiPo battery – plenty for simple trials. No gyro, no regulator, and no resonator (so far). No JeeNode either – just a serial FTDI interface and a bare ATmega168 chip running the LilyPad bootloader @ 8 MHz, straight off 3.7 V from the battery.

I’m waiting for some wheels for this thing. I’m not sure the wheels will turn fast enough to regain balance once an imbalance is detected. If not, maybe raising the center of gravity by extending the “chassis” upwards will help. Or bigger wheels. Or both. Or maybe the design choices are hopelessly wrong already… we’ll see.

Telemetry

In AVR, Software on Jun 4, 2009 at 00:01

The JeeBot now has telemetry functionality, continuously sending real-time process data by wireless for analysis and display on a desktop PC.

It was quite simple to add this: send a buffer with the latest data every 200 msec, using the RF12 driver. No acks, no checking – the receiver grabs as much as it can off the air, and simply ignores invalid and lost packets.
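The packet itself can be pictured as a small fixed-size struct, copied byte-for-byte into the RF12 payload. The field list below is invented for illustration – the post doesn’t spell out the actual layout – but it shows the no-frills approach:

```c
#include <stdint.h>
#include <string.h>

/* One possible shape for the telemetry packet sent every 200 msec.
   Fields are hypothetical; the point is a small fixed-size struct,
   sent with no acks or retries. */
typedef struct {
    uint16_t angle;      /* filtered tilt estimate */
    int16_t  correction; /* control output driving the servos */
    uint16_t battery;    /* raw ADC battery reading */
    uint16_t seq;        /* lets the receiver spot lost packets */
} Telemetry;

/* Serialize into the (pre-sized) radio buffer; returns byte count. */
static int pack_telemetry(const Telemetry *t, uint8_t *buf) {
    memcpy(buf, t, sizeof *t);
    return (int) sizeof *t;
}
```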

Here is the essence of the code added to the JeeBot loop, which runs every 20 ms – in lock-step with the pulses it generates to control the servos:

Picture 2.png

A second JeeNode acts as receiver to pick up these data packets and transfer them to the serial port – here’s the complete sketch:

Picture 3.png

Sample output:

Picture 1.png

These values show what the JeeBot is doing while running on its own internal batteries … i.e. wireless telemetry in practice!

The next step will be to analyze and visualize this data to help me properly tune the PID control parameters. That’s more work.

Minimal motion

In AVR, Hardware on Jun 3, 2009 at 00:01

Now that I’ve been bitten by the robot bug, it’s hard to stop. Wanted to experiment with a minimal setup for self-balancing:

Minimal motion

These are 1.5 .. 3V motors with 1:96 gears from Conrad. What’s special about these is that they only need some 20..30 mA current, which is sort of within the direct-drive capability of an ATmega168. That means 2 I/O pins per motor can be used to create an H-bridge without any further electronics.
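The resulting drive logic is just a truth table: the motor hangs directly between two I/O pins, so driving one high and the other low pushes current one way, swapping them reverses it, and equal levels stop it. Sketched as host-testable C (names are mine, not from the sketch):

```c
/* Direct-drive "H-bridge" with two I/O pins per motor: the pin
   levels alone determine the current direction through the motor. */
typedef enum { M_STOP, M_FORWARD, M_REVERSE } MotorCmd;

typedef struct { int pin_a; int pin_b; } PinLevels;

static PinLevels motor_pins(MotorCmd cmd) {
    PinLevels p = { 0, 0 };            /* both low: motor stopped */
    if (cmd == M_FORWARD) p.pin_a = 1; /* current flows A -> B */
    else if (cmd == M_REVERSE) p.pin_b = 1; /* current flows B -> A */
    return p;
}
```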

Here’s a small sketch to drive these from a JeeNode:

Picture 1.png

A couple of details show that this really is pushing the limit: without the capacitors, timing becomes erratic. Also, the delays between reversals seem to be needed to bring the motors to a stop. I suspect that there’s still some very bad stuff happening due to inductive kickback, so diodes to ground and VCC are probably required to avoid damaging the MPU chip.

But it does run. It’s all driven off a tiny 100 mAh LiPo battery in that picture above, and it kept going longer on a charge than I was willing to wait for. Probably over an hour – total power draw is in the 50 .. 60 mA range.

Anyway, this is even more playful tinkering than the JeeBot. Just wanted to find out whether these components might be used for a little baby balancer one day. This would also require wheels and at least an accelerometer.

Vertical JeeBot

In AVR, Hardware on Jun 2, 2009 at 00:01

Here’s a first example of the JeeBot standing upright:

Vertical JeeBot

When held upright on startup, it actually stands on its own two wheels very calmly, compensating immediately for slight imbalances for a couple of seconds. But beyond a minor error, it fails to regain its balance – so my hands are right outside the picture, ready to catch it :)

Have added a Memsic 2125 accelerometer to provide a sense of what’s up. It’s connected to port 1, with the X and Y axes each producing pulses at around 100 Hz. The gyro kicks in when it’s falling either way. Having both near the center of gravity helps, as the counter-force of the servo won’t disturb the accelerometer as much.
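Decoding the Memsic 2125 is mostly arithmetic: per its datasheet, a 50% duty cycle means 0 g and each g shifts the duty cycle by 12.5%. A sketch, with times in microseconds and the result in milli-g:

```c
/* Memsic 2125 duty cycle -> acceleration.
   g = (t_high / t_period - 0.5) / 0.125, i.e. milli-g = 8000*d - 4000.
   Integer-only, so it can run comfortably on the ATmega too. */
static long duty_to_millig(long t_high_us, long t_period_us) {
    return (t_high_us * 8000L / t_period_us) - 4000L;
}
```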

Here’s the main code. It has a crude low-pass filter and PID control loop:

Picture 1.png

The parameters were set using (much) trial and (much) error. I’ll clearly need to learn more about PID control to make this thing properly self-balancing. The loop runs 49 times per second on average, which also happens to be the servo pulse rate – maybe there’s some accidental interaction with the SoftwareServo library.
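For reference, the low-pass + PID pattern looks roughly like this in C – the gains and filter constant below are placeholders, since the real ones came from all that trial and error:

```c
/* Minimal version of a "crude low-pass filter and PID control loop".
   The 0.1 filter constant and all gains are illustrative. */
typedef struct {
    float kp, ki, kd;
    float integral, prev_err, filtered;
} Pid;

static float pid_step(Pid *s, float raw_angle, float dt) {
    /* first-order low-pass: mix 10% of the new reading in */
    s->filtered += 0.1f * (raw_angle - s->filtered);
    float err = -s->filtered;          /* setpoint: upright = 0 */
    s->integral += err * dt;
    float deriv = (err - s->prev_err) / dt;
    s->prev_err = err;
    return s->kp * err + s->ki * s->integral + s->kd * deriv;
}
```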

But still, a lot better than I had expected after one day!

Horizontal JeeBot

In AVR, Hardware on Jun 1, 2009 at 00:01

The JeeBot has made its first explorations, using a Nunchuk controller:

JeeBot baby steps

It drives on two wheels plus a coaster ball to support the batteries – although not very fast and veering slightly to the right. It will also turn, by rotating the servos in opposite directions. The batteries push down hard on the coaster ball, making the wheels slip when turning if the surface is not hard enough. The control mechanism is somewhat proportional, although it seems to work best by just pushing the joystick to its limits.

Here’s the current primitive control loop:

Picture 1.png

Some more close-ups:

JeeBot baby steps

From left to right: the Wii adapter, batteries with coaster ball underneath, on-off switch, JeeNode, and the edge of the servos.

Not much wiring needed to get it going:

JeeBot baby steps

The two resistors divide the battery voltage in half, so it can be measured by an ADC pin. Note the use of wire to “bundle up” the Nunchuk I2C “bus”.
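Recovering the battery voltage from that ADC reading is then a one-liner – assuming a 10-bit ADC referenced to the JeeNode’s 3.3 V supply, and two equal divider resistors:

```c
/* The divider halves the battery voltage, so the ADC sees vbatt/2.
   Reversing that gives the battery voltage in millivolts. */
static long adc_to_battery_mv(int adc) {
    return (long) adc * 3300 * 2 / 1023;
}
```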

Port layout so far: 1 = unused, 2 = left servo + battery level, 3 = right servo, 4 = Nunchuk via I2C.

I’ve also mounted the gyroscope (not connected yet) in the proper orientation to sense rotations around the wheel axis:

JeeBot baby steps

Will need to add an accelerometer as well, no doubt.

The center of gravity is just about on the power switch, between the battery pack and the JeeNode. That’s quite low when this thing is placed upright – I’m not sure the servos + control loop will respond fast / accurately enough for balancing.

Current consumption is around 30 mA when idle, 160 mA with wheels turning freely, and up to some 300 mA under load – these 4 NiMH batteries give about 5.2V under no load when full, but I’ve seen it drop to 3.7 V under load after a bit of use – not so good…

Meet the JeeBot

In AVR, Hardware on May 27, 2009 at 00:01

Here’s a fun new project, the JeeBot:

Meet the JeeBot

A little platform with two continuous-rotation servos, a JeeNode, and a (4x AA) battery pack. It’s all mounted and fixed together with what I had lying around: strong epoxy-based perf-board and lots of fasteners.

The electronics isn’t “quite finished” as you can see:

Meet the JeeBot

The JeeNode can be removed. There’s a small slide switch on the side to turn on power.

First task would be to get this thing rolling and steering – adding a small coaster of some sort on the right side under the batteries, to keep this thing off the floor.

The more ambitious plan is to make this thing stand upright and self-balance… who knows, one day, maybe it’ll be able to walk – er, roll – on two feet – make that wheels?

Gyro sensor

In AVR, Hardware, Software on May 25, 2009 at 00:01

Here’s the LISY300AL gyro sensor, mounted on a Sparkfun breakout board:

Gyro sensor

The sensor is not extremely sensitive, but works at 3.3 V. It has a single analog output, so readout is trivial:

Picture 1.png

Sample output:

Picture 2.png

The readout changes as this setup is rotated in the horizontal plane. Looks like the self-test connection (pin 5) to ground can be omitted, btw.
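Converting that reading to a rotation rate is straightforward arithmetic. Per the LISY300AL datasheet, the zero-rate output sits near half the 3.3 V supply and the sensitivity is about 3.3 mV per °/s – a sketch, assuming a 10-bit ADC referenced to 3.3 V:

```c
/* LISY300AL readout: ADC counts -> degrees per second.
   Zero-rate output ~1650 mV, sensitivity ~3.3 mV per deg/s. */
static long adc_to_deg_per_sec(int adc) {
    long mv = (long) adc * 3300 / 1023;   /* ADC counts -> millivolts */
    return (mv - 1650) * 10 / 33;         /* 3.3 mV per deg/s */
}
```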

One (tiny!) step closer to a self-balancing bot…

Update – the Segwii website is a fantastic resource on how to accomplish self-balancing in a fairly similar setup.

Servo fun

In AVR, Hardware on May 24, 2009 at 00:01

Here’s some fun with a servo:

Servo fun

A slider drives the two servos in opposite directions, so you can make this thing turn on the spot.

Here’s the sketch:

Picture 1.png

Had to use SoftwareServo instead of the Servo library that comes with Arduino IDE 0015, which didn’t work for me.

Turns out that only pulses from 1.4 to 1.6 msec actually have any effect on the speed of the servos; the rest of the 1.0 .. 2.0 msec range just makes them run at full speed. These are Parallax (Futaba) Continuous Rotation Servos.
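So rather than mapping a speed setting across the full 1.0 .. 2.0 msec range, it makes sense to scale it into the narrow band that actually matters. A sketch – the -100..100 speed scale is arbitrary; only the 1.4/1.5/1.6 msec figures come from the observation above:

```c
/* Map a -100..100 speed setting onto the 1.4-1.6 msec window that
   these continuous-rotation servos actually respond to (1.5 = stop). */
static int speed_to_pulse_us(int speed) {
    if (speed > 100) speed = 100;
    if (speed < -100) speed = -100;
    return 1500 + speed;   /* 100 steps of 1 us on each side */
}
```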

Note that the servos are driven from the PWR line, i.e. the full 5V – but the servo pulses are 3.3V, like the rest of the JeeNode.

It’d be nice to use a Wii Nunchuk controller as input. Even more fun would be an accelerometer / gyro combo to turn this thing into a self-balancing bot …