What is Pixy?
Have you ever thought about connecting a camera to your Arduino? (I guess many people have already thought about implementing vision in their robots.) Actually, connecting a camera directly to an Arduino is not so simple (or not even possible), and even if it were, the image processing would overwhelm the Arduino.
Pixy was created to meet this need. Thanks to its own powerful processor, Pixy does the heavy image processing itself, so the work left for the microcontroller is easy and fast.
Pixy processes the image (looking for a blue object, for example) and sends the object’s coordinates to the microcontroller. With this data you can, for example, make a robotic arm grab the desired object, or drive DC motors to follow it; there are many possibilities.
A great feature is that you can “teach” objects to Pixy. Actually, it is the color of the object that gets stored. To teach Pixy an object you can use the “teaching button” or the PixyMon application.
PixyMon
PixyMon is a great application “to see what your robot sees”. You can also set the signatures, define some settings, run the Pan/tilt Demo, and perform some other tasks.
Pixy + Arduino
Pixy can work with several microcontrollers. I have tested Pixy only with Arduino (since it’s the only microcontroller I have, and it’s my main focus).
And it couldn’t be easier! A cable comes with Pixy to connect it directly to Arduino boards through the ICSP pins.
The picture below shows the Arduino IDE Serial Monitor with the coordinates of the detected object/signature while the Arduino runs the hello_world example.
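For reference, here is a minimal sketch along the lines of the hello_world example that ships with the Pixy Arduino library (check the bundled examples for the definitive version):

```cpp
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;  // talks to Pixy over the ICSP/SPI cable

void setup()
{
  Serial.begin(9600);
  pixy.init();
}

void loop()
{
  static int frame = 0;
  // getBlocks() returns the number of detected objects ("blocks")
  uint16_t blocks = pixy.getBlocks();

  if (blocks)
  {
    frame++;
    // Print only every 50th frame so the Serial Monitor isn't flooded
    if (frame % 50 == 0)
    {
      Serial.print("Detected ");
      Serial.println(blocks);
      for (uint16_t i = 0; i < blocks; i++)
        pixy.blocks[i].print();  // signature, x, y, width, height
    }
  }
}
```

Upload it, open the Serial Monitor, and hold a taught object in front of Pixy to watch the block data stream in.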
MDi #4 + Pixy
As I proposed in my entry for the “Call for Reviewers”, I’ll be using Pixy in my robot MDi #4. In the second video (from top to bottom) you can see my robot with Pixy running the Pan/tilt Demo. I ran the demo directly from PixyMon (without using Arduino), with the servo motors connected directly to the Pixy. Since Pixy was powered by the USB cable and I was using MG995 servos, an external power supply fed the servo motors, because the USB port has limited current capability.
Tip: while running the Pan/tilt Demo, start with low values for the “gain” and increase them until you reach the ideal value.
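The same “gain” idea applies if you drive the pan/tilt from the Arduino instead of from PixyMon. Below is a rough proportional-tracking sketch, much simpler than the library’s own pan/tilt example (which uses a full servo loop); the GAIN value is just a starting point to tune, and the sign of the corrections depends on how your servos are mounted:

```cpp
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

// Start low and increase, as suggested in the tip above
const float GAIN = 0.4;

int panPos  = PIXY_RCS_CENTER_POS;  // servo positions, range 0..1000
int tiltPos = PIXY_RCS_CENTER_POS;

void setup()
{
  pixy.init();
}

void loop()
{
  if (pixy.getBlocks())
  {
    // Error = how far the first block is from the image center (320x200)
    int panError  = 160 - pixy.blocks[0].x;
    int tiltError = pixy.blocks[0].y - 100;

    // Proportional correction: higher GAIN reacts faster but can oscillate.
    // Flip the signs if your servos move away from the object.
    panPos  = constrain(panPos  + GAIN * panError,  0, 1000);
    tiltPos = constrain(tiltPos + GAIN * tiltError, 0, 1000);

    pixy.setServos(panPos, tiltPos);
  }
}
```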
Makeblock mBot + Pixy
I saw some videos of robots chasing objects, and it looked like a lot of fun. I thought the mBot would be ideal to try something similar. Since its brain (the mCore board) uses an ATmega328 (like the Arduino UNO) and also has ICSP pins, I just needed to make a bracket (out of high-impact polystyrene) to attach Pixy to the mBot, and I could immediately start working on code.
The last two videos show some tests with the mBot chasing a green LEGO bucket. The code is working well (it just needs some fine adjustments), and when it’s done I’ll share it here.
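In the meantime, the basic chasing logic looks something like the sketch below: steer toward the block’s x position, turning harder the farther the object is from the image center. The MeDCMotor class and the M1/M2 ports come from the Makeblock library; BASE_SPEED and STEER_GAIN are made-up tuning values, so treat this as an illustration of the idea rather than the code I’ll publish:

```cpp
#include <SPI.h>
#include <Pixy.h>
#include <MeMCore.h>   // Makeblock mCore library

Pixy pixy;
MeDCMotor motorL(M1);  // left motor on port M1
MeDCMotor motorR(M2);  // right motor on port M2 (negate one side if
                       // your wiring makes the robot spin in place)

const int BASE_SPEED   = 100;  // forward speed, -255..255 (tuning value)
const float STEER_GAIN = 0.8;  // how hard to turn toward the object (tuning value)

void setup()
{
  pixy.init();
}

void loop()
{
  if (pixy.getBlocks())
  {
    // Steer based on how far the object is from the image center (x: 0..319)
    int error = pixy.blocks[0].x - 160;
    int steer = STEER_GAIN * error;

    motorL.run(constrain(BASE_SPEED + steer, -255, 255));
    motorR.run(constrain(BASE_SPEED - steer, -255, 255));
  }
  else
  {
    // No object in sight: stop
    motorL.stop();
    motorR.stop();
  }
}
```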
Issues: the mCore has built-in RGB LEDs that flicker randomly while running Pixy’s code. It would be nice if the LED color matched the desired object… but as it is, the flickering is very annoying. I simply covered the LEDs with electrical tape to shoot the videos. I’m not sure of the cause; my guess is some conflict between the mBot libraries and the way mCore drives its ports (it may be relevant that the mCore’s RGB LEDs hang off pin 13, which is also the SPI clock used by the ICSP cable).
Conclusion and final considerations
Pixy may seem limited because it works only with colors. But this limit can be overcome with “color codes”, which are combinations of two or more color signatures placed together to form a code. It is a must-have sensor if you want to make smarter robots.
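In code, a color code arrives as a single block whose signature combines the individual signatures as an octal number; in the version of the Arduino library I have, color code blocks also report an angle. A small sketch, assuming a color code made of signatures 1 and 2 (octal 12):

```cpp
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

void setup()
{
  Serial.begin(9600);
  pixy.init();
}

void loop()
{
  uint16_t blocks = pixy.getBlocks();
  for (uint16_t i = 0; i < blocks; i++)
  {
    // A color code of signatures 1 and 2 arrives as the octal number 12
    if (pixy.blocks[i].signature == 012)
    {
      Serial.print("Color code 12 at x=");
      Serial.print(pixy.blocks[i].x);
      Serial.print(", angle=");
      Serial.println(pixy.blocks[i].angle);  // rotation of the code
    }
  }
}
```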
I intend to keep exploring this sensor in some future projects and share them here.
Something to look forward to is the face detection capability. Can’t wait for it!
The original link: http://letsmakerobots.com/node/50144