Pets, Plants, and Computer Vision

ROS Industrial – Industrial Grade Awesome.

March 12th, 2014 | Posted by admin in automation | code | computer vision | demo | industrial computing | Industrial Internet | manufacturing | robots | Uncategorized - (Comments Off on ROS Industrial – Industrial Grade Awesome.)


Last week I had the pleasure of going to Southwest Research Institute (SwRI) to attend a ROS Industrial training session. I’ve been insanely busy for the past few months writing computer vision and other code for a fairly substantial Robot Operating System (ROS) project. I’ve been converted over to the dark side of ROS from OpenCV, as ROS’s message bus, modular nature, and build tools are absolutely phenomenal. Hopefully a lot of that code will go back to the community once my employer signs the contributor agreement. I’ve gotten to know a lot about the sensor side of ROS, but I wanted to round out my knowledge of the actuator side of things. This ROS-Industrial session seemed like a good place to do just that, and also to get acquainted with more people working in manufacturing.

SwRI has always had a mythical place in my mind, mainly because all the cool kids got to go there while I got left behind at the lab. When I was in undergrad the RHex robot went there for testing, and while I was at Cybernet our DARPA Urban Challenge crew got to go there while I got to stay home and man the fort. A few months ago SwRI reached out to me about helping with ROS Industrial. I’ve been trying to get some code and documentation done for them, but I’ve been so busy I haven’t made as much progress as I would have liked. SwRI is currently the maintainer of ROS Industrial, and along with the OSRF they are making great strides toward improving the usability of ROS in industrial settings.



The tutorials published and presented by SwRI were excellent and very well polished. They made a conscious effort to have the tutorials go from high-level overviews for decision makers all the way to nuts-and-bolts code introductions for programmers and integrators. I really hope they publish more of the tutorials, as they were exceptionally well put together, relevant, and well thought out. To be certain, what SwRI is trying to accomplish with ROS Industrial is no small feat, as you can see from their 2013 Automate demo video at the top of the page. At the ROS Industrial training session SwRI walked us through the high-level architecture of this system (all of the components are FOSS software!) at a level where I think I could probably recreate it given a few weeks of coding. For a single-day session I thought they covered a lot of ground, and the demos they had of ROS Industrial were incredible. In addition to the 2013 Automate demo they had another robot arm doing an exceptionally complex deburring maneuver around a complex bent puzzle piece. Another demo showed on-line object tracking and path planning for robot finishing of automotive parts. I captured a few of the demos in a short video.

In addition to the tutorials I picked up a few new and interesting libraries from the other attendees. One that stood out was MTConnect, a free and open XML/HTTP standard for robots and CNC software to communicate their state and status. It looks pretty cool, and there are already some open libraries out on GitHub. Another cool suite of tools is the EPICS PLC communication package put out by the Paul Scherrer Institute in Switzerland. There also seems to be a mirror of it by Argonne National Laboratory. Apparently CERN uses a lot of PLCs, and they were insistent that all PLCs used at CERN have open Linux drivers. EPICS stands for Experimental Physics and Industrial Control System. I haven’t looked too deeply into the package, but it seems like it could be handy.

Tapsterbot Mark I

July 10th, 2013 | Posted by admin in automation | Automation Alley | demo | Fun! | pics or it didn't happen | python | robots - (Comments Off on Tapsterbot Mark I)

I’ve been working on creating a clone of Jason Huggins’ Tapsterbot, a parallel delta-style robot, in my spare time. I wanted a friendly desktop robot that I could play with to prototype some computer vision applications. Jason was kind enough to open source the code, the BOM, and all of the design files in a handy GitHub repo. To build the robot I got a membership to the All Hands Active hackerspace here in Ann Arbor so I could fab the parts. All I really needed to build the robot was a 3D printer and a laser cutter. The robot has a really simple design that only requires a few nuts and bolts, three $8 servos, and an Arduino for the controller. Once I got the parts it took me a little over a day to build the thing. I had a few slip-ups along the way, so I wanted to collect all my knowledge in a blog post. Jason provided me with a ton of awesome photos of the robot in action so I could figure out how to piece it together. One critical component was how to correctly mount the robot’s arms onto the servos. Jason has provided an awesome video that shows you how to do just that. I now have everything assembled correctly, and I plan to take it all apart and provide step-by-step instructions on how to put everything together. Currently the robot runs using node.js, and I am making a Python port using PyFirmata. With any luck I should have that work done within the next week and be able to show some more impressive demos. The first thing I want to do is build a path planning algorithm so I can prevent the Tapsterbot from accidentally crushing its own arms or swinging into the legs that support it (I’ve already broken an arm). I’ve been reading up on the robot’s inverse kinematics, but I am not sure it lends itself to a closed-form solution.
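For what it’s worth, the standard rotary-delta geometry does have a well-known closed-form inverse-kinematics solution, which should carry over to a Tapsterbot-style robot. Here is a sketch in Python; all of the link dimensions below are made-up illustrative numbers (the real geometry lives in Jason’s repo), so treat this as a starting point rather than a drop-in solution.

```python
import math

# Rotary-delta inverse kinematics, following the widely used closed-form
# derivation for delta robots. All dimensions are made up for illustration.
F = 60.0    # side of the fixed (base) equilateral triangle, mm
E = 30.0    # side of the end-effector triangle, mm
RF = 50.0   # upper (servo) arm length, mm
RE = 80.0   # lower (parallelogram link) arm length, mm

TAN30 = math.tan(math.radians(30.0))

def _arm_angle(x0, y0, z0):
    """Servo angle in degrees for one arm, effector target at (x0, y0, z0), z0 < 0."""
    y1 = -0.5 * TAN30 * F        # base servo joint, projected into this arm's YZ plane
    y0 = y0 - 0.5 * TAN30 * E    # shift target from effector center to its arm joint
    a = (x0 * x0 + y0 * y0 + z0 * z0 + RF * RF - RE * RE - y1 * y1) / (2.0 * z0)
    b = (y1 - y0) / z0
    d = -(a + b * y1) ** 2 + RF * (b * b * RF + RF)   # circle-intersection discriminant
    if d < 0:
        raise ValueError("target point is unreachable")
    yj = (y1 - a * b - math.sqrt(d)) / (b * b + 1.0)  # elbow joint position
    zj = a + b * yj
    return math.degrees(math.atan2(-zj, y1 - yj))

def inverse_kinematics(x, y, z):
    """Angles for all three servos; arms 2 and 3 see the target rotated +/-120 degrees."""
    c, s = math.cos(math.radians(120.0)), math.sin(math.radians(120.0))
    return (
        _arm_angle(x, y, z),
        _arm_angle(x * c + y * s, y * c - x * s, z),
        _arm_angle(x * c - y * s, y * c + x * s, z),
    )

print(inverse_kinematics(0.0, 0.0, -100.0))  # all three angles equal by symmetry
```

A reachability check like the `d < 0` test above is also the natural hook for the path planning I mentioned: reject any waypoint whose discriminant goes negative before commanding the servos.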

DRAGON BOT IS GO!

July 1st, 2013 | Posted by admin in audio | demo | Detroit | Electronics | FIRST | Fun! | Maker Faire | pics or it didn't happen | RaspberryPi | robots - (Comments Off on DRAGON BOT IS GO!)
Dragon Bot Scale Model

FRC 830 has been collaborating with FRC 3322 to build a giant dragon robot for Maker Faire Detroit. I just got back from my trip and got a chance to check in with the kids. The goal is to have a giant robot that plays sounds, shoots smoke rings, drives, lights up, and has animatronic eyes and eyebrows. The students have prototyped an eye assembly using some servos controlled by the PWM ports on the cRIO sidecar. The eyes are controlled using the analog joysticks on the gamepad. After a little bit of debugging we were able to get the animatronic eye assembly running this afternoon.

Another one of the students was able to build a small GPIO-driven relay system to control the smoke machine, which we plan to power using a second battery and a car inverter. In my spare time this week I was able to cook up a client-server system using RabbitMQ and get it running on the Raspberry Pi. The only real trick was setting up the RabbitMQ conf file to run on the space-constrained Raspberry Pi. This is a little bit outside the scope of the kids’ ability, but now that I have a sketch working they should be able to take over. The hope is that we can use PyGame and ServoBlaster to control the lights and sounds on the robot. I want to roll a GUI front end for this using PyGTK. The result looks like this (I now have the GTK GUI running).
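Taming the broker on a low-memory Pi mostly comes down to lowering RabbitMQ’s resource thresholds. A classic-format `rabbitmq.config` along these lines is one plausible approach; both values here are illustrative guesses, not the actual settings from the robot:

```erlang
%% Illustrative rabbitmq.config for a memory-constrained Raspberry Pi.
[
  {rabbit, [
    {vm_memory_high_watermark, 0.1},   %% throttle publishers at ~10% of system RAM
    {disk_free_limit, 50000000}        %% block publishers below ~50 MB of free disk
  ]}
].
```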


Mwaaaaahahahha. by @kscottz

FIRST 830 Vortex Cannon

May 31st, 2013 | Posted by admin in Ann Arbor | FIRST | Maker Works | RaspberryPi | robots - (Comments Off on FIRST 830 Vortex Cannon)
Commercial vortex cannon from last year’s robot.

FRC Team 830, along with FRC Team 3322, is working on a show robot for Maker Faire Detroit. Maker Works was even cool enough to give us a small grant to fab the robot! The working concept is a life-size animatronic dragon that shoots smoke rings. With my ample free time I’ve been figuring out how to make it happen using a combination of the standard cRIO FIRST platform and Raspberry Pis. Last year’s robot used a vortex cannon to knock over pac-man ghosts, but this year we wanted to go bigger. We are using the drive train from last year’s robot for the new dragon bot, so we should have a lot more time to build cool animatronic stuff.

Hole in the bottom of the garbage can.


Vortex cannon diaphragm using 5 mil sheeting.


I wanted to see how big of a vortex cannon we could build. Following some examples I saw on the internet, I picked up a garbage can at Recycle Reuse and grabbed some 5 mil plastic sheeting I was using for the garden. This weekend we fabbed a prototype smoke cannon. The design is really simple: you just saw a circle out of the bottom of the garbage can and add a plastic diaphragm on the open side of the can (I secured it with some large rubber bands and duct tape). The trick is to leave a good amount of slack in the diaphragm.

The smoke machine makes the vortex rings visible. This weekend I tried to use some dry ice “smoke” to visualize the rings. The results weren’t that impressive, but playing with five pounds of dry ice was really fun (hint: if you put a quarter on dry ice it “squeals”). Today we tried using a professional fog machine, and the results were much more impressive, as you can see for yourself.

As for the rest of the animatronics, we want the dragon’s eyes and eyebrows to move and the dragon to play sounds. If we have enough time we want the dragon to have some “ground effects” LED lights. Right now the plan is to drive all of this off the GPIO pins on two Raspberry Pis. We plan to drive the Pis off a separate router system connected to the cRIO’s filtered 5V power supply (with a step-up converter). The smoke ring mechanism and smoke generator will probably run off the cRIO. The plan is to have each of the Pis run a Python script that provides access to PyGame (for sound), pi-blaster, and ServoBlaster. Tentatively I think we can use the Pika RabbitMQ library to move messages between a client control application and servers running on the Raspberry Pis. Both Pis will be dispatched by a client Python app that also uses PyGame to grab input from the keyboard and a joystick. Right now we have a skeleton GitHub repo for the project that should get filled out over the course of the summer. I must also add that I am really impressed with Adafruit’s Occidentalis Raspbian distro. I am going to be gone for the next month, so I can point the kids to the tutorials and let them go to town.
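The RabbitMQ transport itself needs a live broker, so here is a sketch of just the piece the kids will mostly touch: a hypothetical JSON command format and the dispatch logic a Pi-side server might run. Everything here (the command names, the `DragonServer` class, the field layout) is invented for illustration; in the real system Pika would deliver these message bodies from a queue, and the handlers would call PyGame and write to ServoBlaster instead of logging.

```python
import json

def encode(cmd, **kwargs):
    """Pack a command name and its arguments into a JSON message body."""
    return json.dumps({"cmd": cmd, "args": kwargs})

class DragonServer:
    """Stand-in for a Pi-side server; the log replaces real hardware calls."""

    def __init__(self):
        self.log = []

    def play_sound(self, name):
        self.log.append(("sound", name))              # would use pygame.mixer here

    def set_servo(self, channel, position):
        self.log.append(("servo", channel, position)) # would write to /dev/servoblaster

    def on_message(self, body):
        """Shaped like a Pika consumer callback: decode the body and dispatch."""
        msg = json.loads(body)
        handler = {"sound": self.play_sound, "servo": self.set_servo}[msg["cmd"]]
        handler(**msg["args"])

server = DragonServer()
server.on_message(encode("sound", name="roar.wav"))
server.on_message(encode("servo", channel=3, position=120))
print(server.log)  # [('sound', 'roar.wav'), ('servo', 3, 120)]
```

Keeping the hardware calls behind small methods like this means the client app can be tested on a laptop with the same messages before anything touches the robot.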