Turning Characteristics Of The RP5 Chassis Using Qik2s9v1

As part of developing an obstacle avoidance strategy for my robot project, which uses the RP5 Robot Chassis and the Qik2s9v1 dual serial motor controller interfaced to a Netduino microcontroller and some sensors, I thought it would be worthwhile to understand the behavior of the RP5 when turning. The idea is to develop a table that maps a range of motor speeds, as defined in the Qik2s9v1 command set, against the amount of time the RP5 should execute a turn in order to complete turns of 90, 180 and 270 degrees. The goal is to create a set of user callable functions to execute each of these turns where the only parameter passed is the motor speed, e.g. TurnRight90(50) - turn right 90 degrees with a motor speed of 50 - and the appropriate elapsed time for the turn is taken from the table developed here. It should be mentioned that I turn the RP5 by moving one tread in the forward direction and the other in reverse.

I have already created a class for the Qik2s9v1 that contains the basic commands for moving forward, moving backward, turning right and left, and stopping. Below are a few relevant lines from my “DualSerialMotorController” class.

        // Each 4-byte packet holds two Qik compact-protocol commands: [motor command, speed, motor command, speed]
        static byte[] stopCommand = new byte[4] { 0x8A, 0, 0x8C, 0 };   // Both speeds zero
        static byte[] rightCommand = new byte[4] { 0x8A, 0, 0x8E, 0 };  // Right speed passed by caller
        static byte[] leftCommand = new byte[4] { 0x88, 0, 0x8C, 0 };   // Left speed passed by caller

        // Stop Robot
        public void Stop()
        {
            serialPort.Write(stopCommand, 0, 4);
        }

        // Turn Right
        public void TurnRight(int speed)
        {
            rightCommand[1] = (byte)speed;
            rightCommand[3] = rightCommand[1];
            serialPort.Write(rightCommand, 0, 4);
        }

        // Turn Left
        public void TurnLeft(int speed)
        {
            leftCommand[1] = (byte)speed;
            leftCommand[3] = leftCommand[1];
            serialPort.Write(leftCommand, 0, 4);
        }

To determine the appropriate times for each of the three types of turns, I used the “TurnRight(int speed)” and “Stop()” methods in the “DualSerialMotorController” class above and invoked them with a simple bit of code, an example of which appears below.

motorController.TurnRight(70); Thread.Sleep(3540); motorController.Stop();

For each speed I tested, I simply adjusted the time of the “Thread.Sleep” call until the desired turn was completed. The testing was done on a hard wooden floor with a fresh set of batteries for the RP5 motors. Once a time was derived for a turn, it was tested for repeatability. This was by no means a completely rigorous scientific experiment - the RP5 can experience tread slippage, for instance - but the data gathered is proving useful and I thought it worth sharing. I tend to use the RP5 over a speed range of 30 to 70. Below is the table with the results of my testing.

Speed    90 Degrees (secs)    180 Degrees (secs)    270 Degrees (secs)
30       4.00                 8.00                  12.00
40       2.50                 5.00                  7.30
50       1.80                 3.60                  5.50
60       1.35                 2.90                  4.25
70       1.20                 2.30                  3.54
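
To put the table to use, the plan is to bake it into the “DualSerialMotorController” class as a simple lookup. Here is a rough sketch of what TurnRight90 might end up looking like (the array names are just placeholders, Thread.Sleep needs System.Threading, and speeds that fall between the tested values would need rounding or interpolation):

        // Calibration data from the table above (times in milliseconds)
        static int[] testedSpeeds = new int[] { 30, 40, 50, 60, 70 };
        static int[] rightTimes90 = new int[] { 4000, 2500, 1800, 1350, 1200 };

        // Turn right 90 degrees using the calibrated time for the given speed
        public void TurnRight90(int speed)
        {
            for (int i = 0; i < testedSpeeds.Length; i++)
            {
                if (testedSpeeds[i] == speed)
                {
                    TurnRight(speed);
                    Thread.Sleep(rightTimes90[i]);
                    Stop();
                    return;
                }
            }
            // Speed not in the table - either round to the nearest tested speed or interpolate
        }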

Hello.

Thank you for sharing your test results with us! Have you thought about how you are going to compensate for the effects of changing battery voltage?

- Ben

Hi Ben,

I was very aware of the fact that as the batteries are used the motors will run slower and the timings will change. That is why I mentioned that the tests were done with fresh AA alkaline batteries.

In developing a strategy for collision avoidance there are many factors and approaches. I did a survey of the current academic literature on the subject, and it would seem that one of the most successful strategies is the “Vector Field Histogram (VFH) Method – Fast Obstacle Avoidance for Mobile Robots” algorithm. This approach involves placing an array of sensors on the robot to map the obstacles around it and create a frame of reference that is constantly updated by refreshing the sensor readings and converting them to a “map” that the robot can consult as it moves - a bit of a simplification, but it gives the essential idea. One experiment I read about using this approach had an array of 22 acoustical sensors mounted around the robot. I don’t think I want to spend the money on 22 sensors.

One thought I had was to rotate the robot through fixed angles and record the distances detected into a table to build a rudimentary map of what is around the robot. That is the reason for the tests I performed. I then realized that a far better approach would be to get two more ultrasonic sensors to place on the sides of the robot at 90 degree angles to the front facing one. By daisy chaining the sensors I could get continuous feedback about what is in front of and to the sides of the robot. However, until I order two more MaxBotix sensors from Pololu in a week or so and wait for their arrival, physically turning the robot to take readings will let me develop the algorithms I would use with three sensors at 90 degree angles. By the way, the 270 degree rotation was just to see if there was a linear time relationship between 90, 180 and 270 degrees. At lower speeds there almost is. Clearly, rotating 90 degrees to the left is the same as rotating 270 degrees to the right, and it takes less time, drains the batteries less and causes less tread slippage.
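
For what it is worth, the rotate-and-record idea boils down to something like the sketch below, where ReadDistanceCm() is just a stand-in for however the MaxBotix reading actually gets taken and TurnRight90 is the calibrated wrapper described earlier:

int[] map = new int[4];                 // Distances at headings of 0, 90, 180 and 270 degrees
for (int heading = 0; heading < 4; heading++)
{
    map[heading] = ReadDistanceCm();    // Placeholder for the front sonar reading
    motorController.TurnRight90(50);    // Rotate to the next heading at speed 50
}
// After four 90 degree turns the robot is back at its original heading
// and map[] holds a very rough picture of the surroundings.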

A whole other related project I am also interested in is mounting a single axis gyroscope on the RP5 to measure yaw, along with a time source, so that I can integrate angular velocity over time and actually get a measure of the angle the robot rotates through.
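
The integration itself is simple in principle - something along these lines, where ReadYawRateDegPerSec() is a placeholder for the actual gyro read and gyro bias and drift are ignored:

// Rough sketch: integrate gyro yaw rate over time to measure the turn angle
double yawDegrees = 0.0;
DateTime lastSample = DateTime.Now;
motorController.TurnRight(50);
for (int i = 0; i < 180; i++)           // Sample for roughly 1.8 s, the 90 degree time at speed 50
{
    Thread.Sleep(10);
    DateTime now = DateTime.Now;
    double dt = (now - lastSample).Ticks / (double)TimeSpan.TicksPerSecond;
    yawDegrees += ReadYawRateDegPerSec() * dt;   // angle is the integral of angular velocity
    lastSample = now;
}
motorController.Stop();
// If the timing table is accurate, yawDegrees should come out close to 90.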

Fun stuff.