RPB-202, a beginner's robot based on the Romi chassis and Raspberry Pi

Here is my first robot project, RPB-202 (for Raspberry Pi Bot; 202 being the last three digits of its IP address on my network). I started this project about a month ago.

The robot is built on the Romi chassis and uses a Raspberry Pi to handle high-level tasks and an A-Star 32U4 SV controller to interface with the hardware. I added a Rover 5 expansion plate on top of the chassis so that I could mount the Raspberry Pi high enough to have access to all USB ports as well as the HDMI port (although I don’t use it, since I remotely program and control the Raspberry Pi over an SSH connection). Sensors include two Sharp IR proximity sensors, two Sharp IR distance sensors, and one MaxBotix XL-MaxSonar-EZ4 sonar range finder. Wheel encoders are installed but not yet wired to the A-Star.

So far I have used the rpi-slave library as-is. I started by using the cwiid Python library to remote control the robot with a Wii remote. I then developed basic obstacle avoidance that I used both in parallel with the Wii remote and in autonomous exploration routines.
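
In outline, the remote-control loop looks something like the sketch below. It assumes the AStar class from the rpi-slave library's Python example; the button-to-speed mapping is illustrative, not my exact code:

```python
import time
import cwiid
from a_star import AStar  # Python example shipped with the rpi-slave library

a_star = AStar()
print("Press 1+2 on the Wii remote to make it discoverable...")
wm = cwiid.Wiimote()           # blocks until the remote pairs over Bluetooth
wm.rpt_mode = cwiid.RPT_BTN    # ask the remote to report button state

while True:
    buttons = wm.state['buttons']
    if buttons & cwiid.BTN_UP:
        a_star.motors(200, 200)      # forward
    elif buttons & cwiid.BTN_DOWN:
        a_star.motors(-200, -200)    # reverse
    elif buttons & cwiid.BTN_LEFT:
        a_star.motors(-100, 100)     # spin left
    elif buttons & cwiid.BTN_RIGHT:
        a_star.motors(100, -100)     # spin right
    else:
        a_star.motors(0, 0)          # stop when nothing is pressed
    time.sleep(0.05)
```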

The next step was to add an old webcam that I had lying around for years. I used the SimpleCV Python library to track different objects and developed routines to follow lines and recognize intersections using the camera. This allowed me to do simple maze solving, as shown in the following video:
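
The tracking routines boil down to the classic SimpleCV pattern: grab a frame, isolate the target color, find the largest blob, and steer toward its centroid. A minimal sketch (the color, threshold, and gain values are illustrative):

```python
from SimpleCV import Camera, Color

cam = Camera()   # first available USB webcam
while True:
    img = cam.getImage()
    # colorDistance: pixels close to the target color come out dark
    green = img.colorDistance(Color.GREEN).binarize(50)
    blobs = green.findBlobs(minsize=100)
    if blobs:
        target = blobs.sortArea()[-1]                # largest blob
        error = target.centroid()[0] - img.width / 2.0
        steer = -0.5 * error                         # proportional steering term
        print("steering command:", steer)
```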

Robot solving simple maze using computer vision on Raspberry Pi

I developed a few basic Python classes to provide high-level access to the robot’s components.
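
The idea is simply to wrap the raw AStar I2C calls behind friendlier interfaces. A minimal sketch (not my exact code) of such a hypothetical wrapper:

```python
class Motors:
    """Thin wrapper exposing clamped drive commands on top of the A-Star link."""
    MAX_COMMAND = 400   # A-Star motor commands range from -400 to 400

    def __init__(self, a_star):
        self._a_star = a_star

    def drive(self, left, right):
        clamp = lambda v: max(-self.MAX_COMMAND, min(self.MAX_COMMAND, int(v)))
        self._a_star.motors(clamp(left), clamp(right))

    def stop(self):
        self.drive(0, 0)
```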

The next big step for me will be to add the wheel encoders to the robot so I can do odometry and more complex tasks such as autonomous motion planning, as well as solving complex mazes where knowledge of the robot’s position is required. I have started reading up on encoders and believe I will figure out how to handle them on the A-Star. The biggest challenge for me will be to learn and understand how to get the data back and forth between the A-Star and the Pi via the I2C interface.

Hello.

We’re pretty excited about our new Romi chassis, so it’s great to see how you have customized yours! Thanks for sharing. We’d also love to see any updates when you get the encoders working!

By the way, if you have any specific feedback about working with the Romi (ease of assembly, what it was like to find ways to mount your electronics to it, etc.), we would be happy to hear it.

-Jon

Hello Jon, thank you for the feedback!

So far I like the Romi chassis a lot. Assembly has been very easy; the molding quality and the fit of the components are excellent, with very little play. The only thing I would recommend is to assemble the wheels on the motor shafts prior to installing the motors on the chassis (instead of after, as per the instructions). Since the fit of the wheels on the shafts is very tight (good once installed), this lets you react the installation force against the opposite end of the shaft rather than against the gearbox bushings.

I will add the optional front ball caster to maintain better balance. When doing object tracking with the camera, sudden changes in speed can make the front of the chassis pitch down, which makes it more difficult to maintain accurate object tracking. I adjusted the gains of the motor PID controllers to keep this from happening but had to trade away some responsiveness in doing so. The front ball caster will certainly help limit this forward pitching.

The holes around the rim make it easy to install the Sharp distance sensors. Note that some spacers (I used 1/4 in stand-offs) are required to mount the smaller proximity sensors, to keep the pins protruding from the back of the carrier from touching the chassis.

I will also be adding a small servo-controlled gimbal for the camera. I plan on controlling the servos directly from the Raspberry Pi’s GPIOs. I like the size of the Romi chassis, as it provides enough space for such additional equipment while keeping a nice-looking, stable platform.
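
For the servos, something like the following should do: software PWM from the RPi.GPIO library at the standard 50 Hz servo frame rate. The pin number and pulse-width mapping are assumptions (and software PWM can jitter, so the pigpio library is a common alternative):

```python
import time
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)
GPIO.setup(18, GPIO.OUT)   # assumed signal pin
pwm = GPIO.PWM(18, 50)     # standard 50 Hz servo frame
pwm.start(7.5)             # ~1.5 ms pulse: roughly centered

def set_angle(deg):
    """Map 0-180 degrees onto a ~1-2 ms pulse (5-10% duty at 50 Hz)."""
    pwm.ChangeDutyCycle(5.0 + (deg / 180.0) * 5.0)

set_angle(90)
time.sleep(1)
GPIO.cleanup()
```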

I now have the encoders working (thanks to @AmandaS’s helpful guidance in this thread). Interfacing through I2C turned out to be much easier than I thought. With the Python struct module documentation, I was able to understand how the example file provided with the rpi-slave library works, and from there to modify it and the Arduino sketch to add the encoder data.
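
On the Pi side this follows the read_unpack pattern from the library's example: write the buffer offset, read the bytes back, and let struct decode them. The offset (39 here) depends on where the encoder fields land in the shared struct on the Arduino side, so treat it as an assumption:

```python
import struct
import time
import smbus

class AStar:
    def __init__(self):
        self.bus = smbus.SMBus(1)

    def read_unpack(self, address, size, fmt):
        # Tell the A-Star (I2C address 20) where to read from, then pull
        # the bytes one at a time and let struct decode them.
        self.bus.write_byte(20, address)
        time.sleep(0.0001)
        byte_list = [self.bus.read_byte(20) for _ in range(size)]
        return struct.unpack(fmt, bytes(bytearray(byte_list)))

    def read_encoders(self):
        # Two little-endian signed 16-bit counts appended to the shared
        # struct in the Arduino sketch (offset 39 is my assumed layout).
        return self.read_unpack(39, 4, '<hh')
```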

I am now working on rewriting my Motors class on the Raspberry Pi side to control the motors on the basis of speed rather than raw motor command. I use a PID controller with feedback from the encoders on each motor. I am also developing an Odometer class and a MotionController class to allow more advanced high-level motion control methods such as “Go to Goal”, “Follow Path”, etc.
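
The per-wheel speed loop is a standard PID: compare the target speed to the measured speed (encoder count delta divided by the sampling interval) and output a motor command. The gains here are illustrative, not my tuned values:

```python
class SpeedPID:
    """One per wheel: turns a target speed (counts/s) into a motor command."""

    def __init__(self, kp=0.5, ki=2.0, kd=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured, dt):
        error = target - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        command = (self.kp * error + self.ki * self.integral
                   + self.kd * derivative)
        return max(-400, min(400, command))   # clamp to the A-Star motor range
```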

I will also install and interface with the MinIMU-9 v5 IMU and see how I can use the compass and gyro to supplement the encoders for better odometry accuracy.
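
One common way to blend the two is a complementary filter: trust the gyro for short-term heading changes and softly pull toward a slower reference heading (from the encoders or the compass) over the long term. A sketch, with an illustrative blending factor:

```python
ALPHA = 0.98   # how much to trust the gyro over one time step (illustrative)

def fused_heading(prev_heading, gyro_z_rate, dt, reference_heading):
    """Complementary filter: gyro integration corrected by a slow reference."""
    gyro_heading = prev_heading + gyro_z_rate * dt   # short term: gyro
    return ALPHA * gyro_heading + (1.0 - ALPHA) * reference_heading
```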

To be continued…

Thanks for your feedback!

You might have already read about this in the user’s guide, but if you add in the optional front ball caster and are still not happy with your Romi’s ability to mitigate the pitching motion you described, you can add a rubber band to stiffen the damping.

Feel free to continue adding to this thread; we’d love to hear how your servo gimbal and IMU-supplemented odometer turn out!

-Jon

Hi!

My RPB-202 robot has received several upgrades since my last post:

  1. Fully functional encoders
  2. Two additional Sharp GP2Y0A60SZLF Analog Distance Sensors (4 total) for obstacle avoidance
  3. A front ball caster (with rubber band, as mentioned by @JonathanKosh)
  4. A Raspberry Pi V2 Camera on a pan-tilt support replacing my old USB camera
  5. Cleaner wiring (no more breadboard)

My Python code library has evolved and now includes an Odometer class that uses the encoder counts to calculate the robot’s x and y coordinates, heading angle phi, speed, and turn rate. I also developed a MotionController class that provides simple-to-use, high-level functions to control the movements of the robot.
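
The core of the Odometer class is the standard differential-drive dead-reckoning update. The constants below reflect my understanding of the Romi's geometry (12 CPR encoders through the ~120:1 gearbox, 70 mm wheels); the track width in particular should be measured on the actual chassis:

```python
import math

COUNTS_PER_REV = 1440.0          # 12 CPR encoder x ~120:1 gearbox
WHEEL_DIAMETER = 0.070           # m (70 mm Romi wheels)
TRACK_WIDTH = 0.14               # m between wheel centers; measure your chassis
M_PER_COUNT = math.pi * WHEEL_DIAMETER / COUNTS_PER_REV

def update_pose(x, y, phi, d_left_counts, d_right_counts):
    """One dead-reckoning step from the encoder count deltas."""
    d_left = d_left_counts * M_PER_COUNT
    d_right = d_right_counts * M_PER_COUNT
    d_center = (d_left + d_right) / 2.0        # distance traveled by the center
    d_phi = (d_right - d_left) / TRACK_WIDTH   # change in heading
    # midpoint approximation of the heading over the interval
    x += d_center * math.cos(phi + d_phi / 2.0)
    y += d_center * math.sin(phi + d_phi / 2.0)
    return x, y, phi + d_phi
```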

My son (14 years old) used the library to implement his own version of simple maze solving using vision (video below) for his science fair project. His vision algorithm emulates a five-wide array of IR reflectance sensors by cropping the image to keep the lower 25% and then splitting this band into five images. He runs thresholding on hue values to isolate the green masking tape color, then runs blob detection to determine whether the line is visible in each of the images. The remainder of the code is a “typical” left-hand simple maze solver. A cleaned-up and commented version of his code is available here.
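
In SimpleCV terms, the emulated sensor array comes down to something like this (the threshold and blob-size values are illustrative):

```python
from SimpleCV import Color

def read_virtual_sensors(img):
    """Emulate a five-wide reflectance array on one camera frame."""
    # keep only the bottom 25% of the frame
    band = img.crop(0, int(img.height * 0.75), img.width, int(img.height * 0.25))
    readings = []
    slice_w = band.width // 5
    for i in range(5):
        cell = band.crop(i * slice_w, 0, slice_w, band.height)
        # hue threshold toward the green tape, then blob detection
        blobs = cell.hueDistance(Color.GREEN).binarize(50).findBlobs(minsize=30)
        readings.append(blobs is not None)   # True where the line is seen
    return readings   # e.g. [False, False, True, True, False]
```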

I am now working on finishing my maze solver for mazes with loops using the NetworkX Python graph library, as well as testing the dead-reckoning accuracy of my encoder/Odometer classes.
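
For the loop solver, the idea is to record each intersection as a node and each traveled segment as a weighted edge, then let NetworkX find the shortest route once the goal has been reached. The node names and weights below are illustrative:

```python
import networkx as nx

maze = nx.Graph()
maze.add_edge("start", "A", weight=1.2)   # weight = measured segment length (m)
maze.add_edge("A", "B", weight=0.8)
maze.add_edge("A", "C", weight=0.5)
maze.add_edge("B", "goal", weight=1.0)
maze.add_edge("C", "goal", weight=2.0)

path = nx.shortest_path(maze, "start", "goal", weight="weight")
print(path)   # ['start', 'A', 'B', 'goal']
```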

My next big project will be getting into SLAM. For the mapping, I will be using three VL53L0X Time-of-Flight Distance Sensors installed at 120° from each other on top of a “mast” that I will rotate with a servo to get a 360° scan of the area around the robot. I left an empty space between the Raspberry Pi and the pan-tilt system to install the servo. Thanks to this forum post, I have been able to easily connect the three sensors to the A-Star and set their individual I2C addresses using the XSHUT pin on two of the sensors.
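
On the geometry side, a 120° servo sweep is enough for a full scan: each reading’s bearing is the servo angle plus the sensor’s offset around the mast. A minimal sketch of that conversion (robot-frame coordinates, angles in radians, distances in meters):

```python
import math

# the three sensors sit at 120-degree offsets around the mast
SENSOR_OFFSETS = [0.0, 2.0 * math.pi / 3.0, 4.0 * math.pi / 3.0]

def scan_point(servo_angle, sensor_index, distance):
    """Convert one range reading into robot-frame x, y coordinates."""
    bearing = servo_angle + SENSOR_OFFSETS[sensor_index]
    return distance * math.cos(bearing), distance * math.sin(bearing)
```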

For those interested in SLAM and other AI topics applied to robotics, I found this excellent free course (and others) on Udacity.

That’s great! The wiring cleanup and the new camera + pan-tilt mechanism make RPB-202 look a little more sleek and professional than before. It’s also pretty cool that your son is able to contribute!

-Jon

Work kept me quite busy over the last month, but I did manage to make progress on the mast that I’m going to use to support and rotate the array of VL53L0X sensors for my SLAM project. Below is a picture of the assembly mounted on a servo with the sensor carriers installed. The two end adapters are 3D printed, while the center shaft is a carbon fiber arrow shaft. The 3D files of the supports are available on Thingiverse.

That mast is looking pretty good! Can’t wait to see how well the sensing works!

-Jon

This project just keeps getting more and more impressive… thank you for sharing it!

Hi.

We posted about this robot on the Pololu Blog! Thanks for sharing.

Ryan

Great! Thanks @ryantm!!!