Posts tagged “community projects” (Page 3)
You are currently viewing a selection of posts from the Pololu Blog. You can also view all the posts.
Julien de la Bruère-Terreault (also known as DrGFreeman on the Pololu Forum, and creator of the Custom Mini Sumo robot and the Romi and Raspberry Pi robot shared on this blog) made “SharpDistSensor”, an Arduino library for analog Sharp distance sensors. If you are running a recent version of the Arduino IDE, you can install it with the Library Manager. The library reads the sensor’s analog voltage output, filters the data, and converts it to a distance measurement. By default it is calibrated for the Sharp GP2Y0A60SZLF analog distance sensor (10–150 cm, 5 V), but you can calibrate it for other analog Sharp distance sensors if you can fit a power function or a fifth-order polynomial to your sensor’s voltage-versus-distance response. Pull requests that add support for other Sharp distance sensor models are welcome!
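To give a sense of what that calibration step involves, here is a minimal Python sketch of fitting a power function to voltage-versus-distance data. The sample points below are invented for illustration, not real GP2Y0A60SZLF measurements, and the helper name is ours:

```python
import numpy as np

# Hypothetical calibration points: sensor output voltage (V) vs. measured
# distance (mm). With a real sensor you would record these pairs yourself.
voltage = np.array([2.30, 1.65, 1.20, 0.90, 0.72, 0.60])   # falls with distance
distance = np.array([100, 150, 200, 250, 300, 350])          # mm

# Fit a power function d = a * v^b by linearizing: ln(d) = ln(a) + b * ln(v)
b, ln_a = np.polyfit(np.log(voltage), np.log(distance), 1)
a = np.exp(ln_a)

def volts_to_mm(v):
    """Estimate distance (mm) from a sensor voltage using the fitted curve."""
    return a * v ** b
```

The same data could instead be fed to `np.polyfit(voltage, distance, 5)` for the fifth-order polynomial option; the power fit needs fewer calibration points but assumes the response really follows a power law.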
The readme, library code, and example sketches are available in the GitHub repository.
One of our customers used our custom laser cutting service to cut the birch plywood panels for his retro-gaming TV system that he sells on Etsy. The birch panels are stained with shellac. The system runs on a Raspberry Pi 3 Model B, to which you can connect (not-included) controllers with Bluetooth or USB. The Raspberry Pi’s Ethernet port, SD card slot, and 4 USB ports are accessible in the back.
Customer Lujing Cen’s team built two semi-autonomous 14-DOF quadrupeds using Mini Maestros for their high school class project at the California Academy of Mathematics and Science. The project syllabus includes making two separate robot designs and three total robots that work together. These quadrupeds were the “pack robots” described in the syllabus. An 18-channel Mini Maestro USB servo controller controls the twelve leg and two head servos. The project code includes a rewrite of Pololu’s Maestro C# USB SDK library in Python. The robot uses a camera and an RFID scanner to track its targets.
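A Python port of the Maestro SDK ultimately has to emit the Maestro’s serial commands. As a minimal sketch, the compact-protocol “Set Target” command (0x84) packs a target position, in quarter-microsecond units, into two 7-bit data bytes; the helper name is ours, and in a real program you would write the returned frame to the Maestro’s serial port:

```python
def set_target_frame(channel: int, target_us: float) -> bytes:
    """Build a Maestro compact-protocol Set Target frame.

    The Maestro expects the target in 0.25 us units, split into a low byte
    (bits 0-6) and a high byte (bits 7-13) after the command and channel bytes.
    """
    quarter_us = int(round(target_us * 4))
    return bytes([
        0x84,                       # Set Target command byte
        channel & 0x7F,             # servo channel number
        quarter_us & 0x7F,          # low 7 bits of the target
        (quarter_us >> 7) & 0x7F,   # high 7 bits of the target
    ])
```

For example, centering channel 0 at 1500 µs produces the four bytes `84 00 70 2E`.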
Forum user DrGFreeman has been busy making robots. I wrote earlier about his Custom Mini Sumo robot; now here is his Romi Chassis and Raspberry Pi robot. It solves mazes using a webcam for line tracking and intersection identification. This robot is a great example of how a variety of Pololu robot parts can combine into an attractive and functional robot. A black Romi chassis kit provides the base for the robot, to which DrGFreeman added an encoder pair kit and a ball caster kit. An A-Star 32U4 Robot Controller SV with Raspberry Pi Bridge drives the motors and monitors four Sharp GP2Y0A60SZLF analog distance sensors and the Romi encoders. The robot controller and the Raspberry Pi communicate via I²C and are elevated above the chassis on a narrow Pololu RP5/Rover 5 expansion plate.
Forum user coyotlgw made this teleoperated Raspberry Pi robot. The robot is controlled remotely over SSH via the Raspberry Pi’s WiFi connection, and snapshots of the webcam feed are available via a Motion web server. The motors of the Dagu Wild Thumper 4WD chassis are driven by a Pololu Dual MC33926 Motor Driver for Raspberry Pi connected to a Raspberry Pi 2 Model B. A Pololu A-Star 32U4 Mini LV interfaces with and records readings from temperature, pressure, humidity, and UV/IR/visible light sensors.
It wasn’t available when they built the robot, but coyotlgw points out that the A-Star 32U4 Robot Controller with Raspberry Pi Bridge is an option to consider for similar builds. This robot connects the Raspberry Pi and A-Star Mini over USB; the robot controller would make I²C communication easier. You would still need external motor drivers, because the robot controller’s MAX14870 drivers are not appropriate for the Wild Thumper motors.
For more pictures, details, and a discussion of the issues encountered during the build, see the forum post.
Customer Elise Pham made a bionic hand: a two-fingered gripper triggered with biofeedback. A Pololu Maestro servo controller monitors the trigger source and signals the servo to close the gripper. In this video, she uses a mechanical sensor for biofeedback, and she is exploring using a MyoWare Muscle Sensor as a future enhancement. Her earlier video shows using a MyoWare Muscle Sensor to control a servo like in our demonstration video.
April 10 update: Elise’s project won 1st Award for the 2017 Synopsys Science Fair and was also nominated to advance and compete in the 2017 National Broadcom MASTERS. Additionally, Elise was selected by the Santa Clara County Office of Education to participate in Steve Wozniak’s Silicon Valley Comic Con Science Fair.
Customer Thomas Broughton made a line-following robot controlled by a Raspberry Pi that connects directly to a Pololu QTR-8RC reflectance sensor array. A Raspberry Pi is typically not well suited to timing-sensitive applications because it runs a general-purpose operating system, so it is nice to see that Thomas was able to get it to read the sensor array reliably. The robot also uses four Pololu 42×19mm wheels, a Pololu 5V Step-Up/Step-Down Voltage Regulator S18V20F5, and two Sharp distance sensors.
His Python code and more discussion are in his blog post.
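The timing-sensitive part is the QTR-8RC’s RC read cycle: charge the sensor’s capacitor, float the line, and time how long it takes to discharge (darker surfaces discharge more slowly). A minimal sketch of one channel, where the `gpio` object and its methods are hypothetical stand-ins for whatever GPIO library is used (e.g. RPi.GPIO):

```python
import time

def read_rc_sensor(gpio, pin, timeout_us=3000):
    """Measure the discharge time of one QTR-8RC channel in microseconds.

    Longer times indicate a darker (less reflective) surface under the sensor.
    """
    gpio.set_output(pin, True)   # drive the line high to charge the capacitor
    time.sleep(10e-6)            # allow at least ~10 us of charge time
    gpio.set_input(pin)          # float the line so the capacitor can discharge
    start = time.monotonic()
    while gpio.read(pin):        # wait for the line to fall low
        if (time.monotonic() - start) * 1e6 > timeout_us:
            break                # surface too dark (or sensor off the surface)
    return (time.monotonic() - start) * 1e6
```

Thresholding or comparing these times across the eight channels then gives the line position; jitter from OS scheduling is exactly why this is tricky on a Raspberry Pi.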
Two Bit Circus is building a “micro-amusement park” in Los Angeles, and this robotic bartender will be one of the exhibits. It uses peristaltic pumps to load libations into hand-held shakers, mixes the drinks, and dispenses them. The animatronic motions of the robot are orchestrated by a Maestro servo controller and a Raspberry Pi.
More pictures and details can be found in the Make magazine article featuring the robot.
Customer Carlos Ambrozak developed an “Introduction to Robotics” course that includes a lab where students work on visual object tracking. The example project is two Zumo 32U4 robots playing cat and mouse. One Zumo has a large blue ball on it and drives around avoiding obstacles. The other has a CMUcam5 Pixy on a pan-tilt mount that looks for the blue ball and follows the other robot. The Zumo 32U4 controls the camera via I²C. The lesson’s provided source code is available on GitHub.
Customer Mike McGurrin made this animatronic talking skull that uses Amazon Alexa for interactive voice control. The central part of the project is a Lindberg 3-axis animatronic skull and an audio servo controller, which synchronizes the jaw movements with the voice audio. In this project, the nod, turn, tilt, and eye movements of the skull are controlled by a 12-channel Maestro servo controller running a custom Maestro script that uses one of the channels as an input triggered by the Raspberry Pi. The Amazon Alexa integration is handled by AlexaPi.
More details, including a parts list and the Maestro script, are available on the project page.