Pets, Plants, and Computer Vision

DRAGON BOT IS GO!

July 1st, 2013 | Posted by admin in audio | demo | Detroit | Electronics | FIRST | Fun! | Maker Faire | pics or it didn't happen | RaspberryPi | robots - (Comments Off on DRAGON BOT IS GO!)
Dragon Bot Scale Model

FRC 830 has been collaborating with FRC 3322 to build a giant dragon robot for Maker Faire Detroit. I just got back from my trip and had a chance to check in with the kids. The goal is a giant robot that plays sounds, shoots smoke rings, drives, lights up, and has animatronic eyes and eyebrows. The students have prototyped an eye assembly using servos controlled by the PWM ports on the cRIO's digital sidecar, with the eyes driven by the analog joysticks on the gamepad. After a little bit of debugging we were able to get the animatronic eye assembly running this afternoon.
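For the curious, the eye mapping is just a rescale: the gamepad axes run from -1 to 1, while a hobby servo wants a 0-to-1 position. Here is a minimal sketch of the idea in RobotPy-style Python (the students' actual code runs on the cRIO, and the channel numbers here are made up):

import wpilib

class DragonEyes(wpilib.TimedRobot):
    def robotInit(self):
        # Eye servos on hypothetical PWM channels 0 and 1.
        self.eye_pan = wpilib.Servo(0)
        self.eye_tilt = wpilib.Servo(1)
        self.stick = wpilib.Joystick(0)

    def teleopPeriodic(self):
        # Joystick axes are -1..1; Servo.set() expects 0..1.
        self.eye_pan.set((self.stick.getX() + 1) / 2)
        self.eye_tilt.set((self.stick.getY() + 1) / 2)

if __name__ == "__main__":
    wpilib.run(DragonEyes)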

Another one of the students was able to build a small GPIO-driven relay system to control the smoke machine, which we plan to power using a second battery and a car inverter. In my spare time this week I was able to cook up a client-server system using RabbitMQ and get it running on the RaspberryPi. The only real trick was setting up the RabbitMQ conf file to run on the space-constrained RaspberryPi. This is a little bit outside the scope of the kids' abilities, but now that I have a sketch working they should be able to take over. The hope is that we can use PyGame and ServoBlaster to control the lights and sounds on the robot. I want to roll a GUI front end for this using pyGTK. The result looks like this (I now have the GTK GUI running).
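The consumer side of the sketch looks roughly like this, written against the current pika API for RabbitMQ; the queue name and sound files are made up for illustration:

import pika
import pygame

pygame.mixer.init()
sounds = {"roar": pygame.mixer.Sound("roar.wav"),
          "growl": pygame.mixer.Sound("growl.wav")}

def on_message(channel, method, properties, body):
    # Each message names a sound cue to fire on the robot.
    cue = body.decode()
    if cue in sounds:
        sounds[cue].play()

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="dragon")
channel.basic_consume(queue="dragon", on_message_callback=on_message,
                      auto_ack=True)
channel.start_consuming()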


Mwaaaaahahahha. by @kscottz

Controlling an LED Light Strip with a RaspberryPi.

April 4th, 2013 | Posted by admin in demo | Electronics | Fun! | Open Source | pics or it didn't happen | RaspberryPi - (Comments Off on Controlling an LED Light Strip with a RaspberryPi.)


I got a RaspberryPi at PyCon and I had an LED light strip lying around. I wanted to see if I could make the two play nice together. The GPIO pins on the Pi can talk pulse width modulation (PWM), and so can the LED light strip I had, so I figured I could get the two to talk. I found this really awesome tutorial that walked me through the process. The gist of the tutorial is that you use low-current signals from the RaspberryPi to control the high-current supply coming from an AC adapter (in this case a 12V/1A switchable AC adapter I picked up at Radio Shack). You do this using a Darlington transistor, specifically a TIP120, which you can pick up at your local Radio Shack for $1.69. The transistor is basically a current amplifier: the tiny current from a GPIO pin switches the much larger current flowing through the strip. To drive the transistors you use the ServoBlaster C library, which generates the PWM and sets a particular pin high or low. You control ServoBlaster from the command line; the author of the original tutorial uses a swell little python script to repeatedly make command-line calls and then sleep.
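ServoBlaster also exposes a device file, so the same trick works without shelling out: writing "channel=duty%" to /dev/servoblaster sets that channel's PWM duty cycle. A quick sketch (the channel number is hypothetical and maps to whichever GPIO pin drives a color):

import time

def set_channel(channel, percent):
    # ServoBlaster parses lines of the form "<channel>=<value>%".
    with open("/dev/servoblaster", "w") as dev:
        dev.write("%d=%d%%\n" % (channel, percent))

# Fade one color channel up and back down forever.
while True:
    for duty in list(range(0, 101, 5)) + list(range(100, -1, -5)):
        set_channel(0, duty)
        time.sleep(0.05)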

You can see the results below. The track playing in the background is Marijuana by Chrome Sparks.

The next step is to use SimpleCV to acquire images from a USB camera and peg the LED colors to the average image color. I would also like to use a few buttons to start and stop the lights as well as some MP3s (think instant dance party).
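The rough plan for that, sketched with SimpleCV (meanColor() returns the per-channel average of a frame; the channel-to-wire mapping below is hypothetical and may need reordering to match the strip's wiring):

from SimpleCV import Camera
import time

def set_channel(channel, percent):
    with open("/dev/servoblaster", "w") as dev:
        dev.write("%d=%d%%\n" % (channel, percent))

cam = Camera()
while True:
    # meanColor() gives the average value of each color plane, 0-255;
    # rescale that to a 0-100% duty cycle on the matching LED channel.
    for channel, value in enumerate(cam.getImage().meanColor()):
        set_channel(channel, int(value / 255.0 * 100))
    time.sleep(0.1)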

If you are curious, I am using a $25 LCD for the screen of the RaspberryPi. It didn't come with an AC adapter or a composite video cable, so I had to buy those separately. I also picked up a USB wireless card and a powered USB hub, which come in really handy. I was really pleased when the wireless card worked out of the box with the Raspbian Wheezy release. My mini Panavise is great for holding the Pi. I also picked up a set of male/female jumper cables that made wiring everything a snap.

Simple Steganography

February 16th, 2013 | Posted by admin in code | computer vision | demo | Fun! | pics or it didn't happen | steganography - (Comments Off on Simple Steganography)

I had a moment to play with steganography while I was watching TV tonight. Steganography is a way of encoding text messages inside image files such that they don't visibly alter the original image content. I came across this nice little tutorial using the stepic python library. I was able to get stepic hooked into SimpleCV with only a little bit of massaging (I needed to turn the SimpleCV Image into a PIL Image and back again). Here is the actual commit.
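The massaging in question is roughly this, since stepic only speaks PIL (file names here are placeholders):

import stepic
from SimpleCV import Image

# Encode: SimpleCV Image -> PIL Image -> stepic -> back to SimpleCV.
img = Image("dinosaur.png")
encoded = Image(stepic.encode(img.getPIL(), "secret message"))
encoded.save("encoded.png")

# Decoding goes the other way around.
message = stepic.decode(Image("encoded.png").getPIL())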

The Source Image


The encoded image. This dinosaur has an encoded message; can you find it?

I wrote a little bit of code to test my work and to see if I could tease out the algorithm. Basically all I did was encode a message, in this case the Wikipedia entry on stegosaurs, into an image and then subtract that image from the original image to create a diff. To the naked eye both the source image and the encoded image look the same. The diff also looks as it should, that is to say, all black. To dig a bit deeper I applied a histogram equalization function to stretch out the image's dynamic range. Bingo, the encoding scheme is clearly visible.

The pixelwise difference between the source and encoded image.

The difference between the source and encoded image after an equalize operation.
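In SimpleCV the whole experiment is only a couple of lines; a sketch with placeholder file names:

from SimpleCV import Image

source = Image("dinosaur.png")
encoded = Image("encoded.png")

diff = source - encoded  # looks uniformly black to the naked eye
diff.equalize().save("diff_equalized.png")  # stretch out the tiny differences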

The next step is to look at the individual color channels to see if they are holding on to any information. I could look at the actual algorithm in the source, but that would be no fun. It would be interesting to see if I could build a heuristic for determining which images have encoded data. It would also be useful to add AES encryption to the encode/decode calls in SimpleCV.
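Peeking at the channels should be about this much work; a sketch assuming SimpleCV's splitChannels(), which hands back the three color planes as grayscale images:

from SimpleCV import Image

diff = Image("dinosaur.png") - Image("encoded.png")  # placeholder names
for name, chan in zip(("r", "g", "b"), diff.splitChannels()):
    chan.equalize().save("diff_%s.png" % name)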

As a side note, I tried to decode my image using this online utility but had no luck. My guess is that there are incompatibilities between the stable 0.3 release of the stepic library and the development 0.4 release that the utility might be running.

Crawl, Walk, Drive

February 14th, 2013 | Posted by admin in Ann Arbor | automation | Automation Alley | C++ | code | demo | FIRST | Maker Works | pics or it didn't happen | robots - (Comments Off on Crawl, Walk, Drive)

We finally got the FIRST Team 830 drive train up and running. We still have a lot of hardware to attach, but this is a good start. Currently the control system and battery are just loosely attached with cable ties. This will change once we get the final build-out of the pickup and firing mechanisms. We have yet to test the pneumatic gear shifters or a PID controller, so in these videos the robot is in low gear and capped at seventy percent power.
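The power cap is nothing fancy, just a scale factor on the joystick input before it reaches the drive. A minimal RobotPy-style sketch (our real code targets the cRIO; the motor controllers and ports here are hypothetical):

import wpilib
import wpilib.drive

class DriveBot(wpilib.TimedRobot):
    def robotInit(self):
        self.drive = wpilib.drive.DifferentialDrive(
            wpilib.Talon(0), wpilib.Talon(1))
        self.stick = wpilib.Joystick(0)

    def teleopPeriodic(self):
        cap = 0.7  # keep things tame until the PID controller is tuned
        self.drive.arcadeDrive(cap * self.stick.getY(),
                               cap * self.stick.getX())

if __name__ == "__main__":
    wpilib.run(DriveBot)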

This is the first run of the robot.

Once we got used to the controls, I slipped my cell phone into a spare cRIO slot and got some video.

I have a few screen captures of our current design that give the programming team a better idea of what other subsystems we need to code and design controls for. Unfortunately I don't have a nice overview image of the final CAD design.

Slight Overview

Disc Shooter

Pickup

Drive Train

Elevator


We are down to the final week before bag and tag. Hopefully I will have more cool videos soon.