
Pololu Forum

Line sensor for Balboa balancing robot

#1

I have decided to develop an application for my balancing Balboa robot using the latest release of the line sensor made for the Balboa. I have found that developing code for this robot is challenging.
Step one was to modify the driveAround routine to drive in a race-track pattern. The outcome of this effort was a timing state machine that supplies drive values to the balance drive routine. I can successfully turn 90 degrees clockwise in one second, drive a straight line for one and a half seconds,
turn 90 degrees counterclockwise in another second, then drive straight for another second and a half before repeating the process. This exercise let me determine timing limits and verify that I can monitor progress on the LCD or remotely with the XBee transceiver.
Step two is to read the line sensor. Using the line sensor library, as recommended, I determined that the best approach was to collect raw values from the sensor. Then, using the read line function as an example, I created a routine that provides numbers from -20 to 20 in increments of 5 as the sensor passes over a black line on white paper.
Step three is currently ongoing. I am working to combine parts of step one with step two to reliably follow a line on my test track, previously used to verify the line-following routines on my Arduino Zumo and Zumo 32U4 robots.
I will post more later.

#2

Hello.

This sounds like a really neat project! We would love to see some video of it in action once you get it up and running!

-Dan

#3

This is a modification of the driveAround routine provided in the initial release of your product. The main purpose of this change was to find timing that adversely affects the balancing algorithm. I was also able to change the encoder counts to control behavior. The end result was determining how much time is required to rotate 90 degrees and drive straight a predictable distance, and ultimately creating behavior patterns. Note that variables not shown are globals, allowing use by other functions.

void driveAround () {
  unsigned long currentMillis = millis ();

  if (Chk == 0 && (currentMillis - lastTime >= 1900)) {
    Chk = 1;
    lastTime = currentMillis;
    M1spd = 20;
    M2spd = 20;
  }
  else if (Chk == 1 && (currentMillis - lastTime >= 4096)) {
    Chk = 2;
    lastTime = currentMillis;
    M1spd = 15;
    M2spd = 25;
  }
  else if (Chk == 2 && (currentMillis - lastTime >= 1900)) {
    Chk = 3;
    lastTime = currentMillis;
    M1spd = 20;
    M2spd = 20;
  }
  else if (Chk == 3 && (currentMillis - lastTime >= 4096)) {
    Chk = 0;
    lastTime = currentMillis;
    M1spd = 15;
    M2spd = 25;
  }
  balanceDrive (M1spd, M2spd);
}

With step one completed, I am in the process of collecting information from the line sensor to control the robot behavior needed to follow a black line on a white surface. This effort is still ongoing.

#4

The following excerpts illustrate steps toward developing code that uses the line sensor to detect a line and direct the robot to drive over it, following the line around a course.

With this procedure, I was able to predict the robot's behavior and record
the behavior for display / transmission while maintaining balance.

// Enable / disable means of
// visualizing parameters.
#define TEST true
// true for LCD, false for Serial1.
#define LCD  false

// Stand up, turn 90 degrees CW, drive
// forward 40 centimeters, turn 90
// degrees CCW and forward 40 CM again.
// Like mowing the lawn.
// Use this routine to report robot
// condition while balancing.
// A means to verify troubleshooting
// tools and estimate control parameters.
void Test_Track () {
  unsigned long currentMillis = millis ();

  // After 500 milliseconds; assign the left
  // encoder 10 ticks clockwise and right
  // encoder 10 ticks counter clockwise.
  if (Chk == 0 && (currentMillis - lastTime >= 500)) {
    Chk = 1;
    lastTime = currentMillis;
    M1spd = 10;
    M2spd = -10;
    #if TEST
    #if LCD
    lcd.clear ();
    lcd.print (M1spd);
    lcd.gotoXY (4, 0);
    lcd.print (M2spd);
    #else
    Serial1.print (M1spd);
    Serial1.print (", ");
    Serial1.println (M2spd);
    #endif
    #endif
  }
  // After one second the robot has turned 90 degrees.
  // Set the left and right encoders to 10 ticks clockwise.
  else if (Chk == 1 && (currentMillis - lastTime >= 1000)) {
    Chk = 2;
    lastTime = currentMillis;
    M1spd = 10;
    M2spd = 10;
    #if TEST
    #if LCD
    lcd.clear ();
    lcd.print (M1spd);
    lcd.gotoXY (4, 0);
    lcd.print (M2spd);
    #else
    Serial1.print (M1spd);
    Serial1.print (", ");
    Serial1.println (M2spd);
    #endif
    #endif
  }
  // Move forward for two and a half seconds.
  else if (Chk == 2 && (currentMillis - lastTime >= 2500)) {
    Chk = 3;
    lastTime = currentMillis;
    M1spd = 10;
    M2spd = 10;
  }
  // After three seconds turn 90 degrees counter clockwise.
  else if (Chk == 3 && (currentMillis - lastTime >= 500)) {
    Chk = 4;
    lastTime = currentMillis;
    M1spd = -10;
    M2spd = 10;
    #if TEST
    #if LCD
    lcd.clear ();
    lcd.print (M1spd);
    lcd.gotoXY (4, 0);
    lcd.print (M2spd);
    #else
    Serial1.print (M1spd);
    Serial1.print (", ");
    Serial1.println (M2spd);
    #endif
    #endif
  }
  // After one second drive straight.
  else if (Chk == 4 && (currentMillis - lastTime >= 1000)) {
    Chk = 5;
    lastTime = currentMillis;
    M1spd = 10;
    M2spd = 10;
    #if TEST
    #if LCD
    lcd.clear ();
    lcd.print (M1spd);
    lcd.gotoXY (4, 0);
    lcd.print (M2spd);
    #else
    Serial1.print (M1spd);
    Serial1.print (", ");
    Serial1.println (M2spd);
    #endif
    #endif
  }
  // After three seconds repeat the instructions.
  else if (Chk == 5 && (currentMillis - lastTime >= 2500)) {
    Chk = 0;
    lastTime = currentMillis;
    M1spd = 10;
    M2spd = 10;
  }
  // Inject the encoder ticks into the
  // balance algorithm.
  balanceDrive (M1spd, M2spd);
}

This procedure is a step toward driving the robot above
a black line.

// Line sensor parameters.
// total number of sensors.
#define NUM_SENSORS 5
// Sensor timeout  value.
#define TIMEOUT     2500
// Sensor enable pin assignment.
#define EMITTER_PIN 12
// Weight value for each sensor.
// sensor0*0, sensor1*10..sensor4*40.
#define STEP 10

// Input assignment for line sensor.
const unsigned char sensorPins [NUM_SENSORS] = {A0, A2, A3, A4, 5};
// Configure the sensor library.
QTRDimmableRC qtrrc(sensorPins, NUM_SENSORS, TIMEOUT, EMITTER_PIN);

// Enable the LCD library.
#if LCD
Balboa32U4LCD lcd;
#endif

This is in the setup routine.
  // Configure the line sensor library.
  qtrrc.setDimmingLevel (0);
  qtrrc.resetCalibration ();

// Line sensor program weighs each sensor
// for robot position on the line.
// Allow 4 milliseconds to read the line.
int Read_LineSensor () {
  // number of sensors on the line.
  char S;
  // collect sensor weight value.
  int lnpos;

  // read the line sensor.
  qtrrc.read (sensorValues);
  // calculate the sensor weight and
  // record number of contributing
  // sensors.
  lnpos = 0;
  S = 0;
  for (int i = 0; i < NUM_SENSORS; i++) {
    if (sensorValues [i] > 1000) {
      lnpos += i * STEP;
      S++;
    }
  }
  // define sensor position.
  // White surface, all sensors report
  // less than 1000.
  state = 'W';
  // Detected a black surface, all
  // sensors report greater than
  // 1000.
  if (S == 5) state = 'B';
  // Line sensor has detected a
  // black and white surface.
  else if (S > 0) {
    // sensor weight by number
    // of contributing sensors.
    lnpos /= S;
    // adjust the sensor value; right (+)
    // or left (-); based on the sensor array
    // center.
    lnpos -= STEP * ((NUM_SENSORS - 1) / 2);
    // positive sensor values near right wheel.
    if (lnpos > 0) state = 'R';
    // negative sensor values near left wheel.
    else if (lnpos < 0) state = 'L';
    // no sensor value when robot centered above
    // the line.
    else state = 'S';
  }
  // Report program results.
  return lnpos;  
}
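The weighting arithmetic in Read_LineSensor can be checked in isolation. The following is a minimal, self-contained C++ sketch of the same calculation (the helper name `weightedLinePosition` is mine, not from the Balboa libraries; the threshold of 1000 and STEP of 10 come from the code above):

```cpp
// Same constants as the routine above.
const int NUM_SENSORS = 5;
const int STEP = 10;

// Weighted line position: average of (index * STEP) over sensors
// reading > 1000, re-centered on the middle sensor. Returns 0 when
// no sensor (or every sensor) sees the line; the caller distinguishes
// those cases by the count written to *count.
int weightedLinePosition(const unsigned values[NUM_SENSORS], int *count) {
  int pos = 0, s = 0;
  for (int i = 0; i < NUM_SENSORS; i++) {
    if (values[i] > 1000) { pos += i * STEP; s++; }
  }
  *count = s;
  if (s == 0 || s == NUM_SENSORS) return 0;
  pos /= s;                                // average the contributing weights
  pos -= STEP * ((NUM_SENSORS - 1) / 2);   // re-center: middle sensor = 0
  return pos;
}
```

For example, with only the two rightmost sensors (indexes 3 and 4) over the line, the position is (30 + 40) / 2 - 20 = 15, i.e. the line is toward the right side of the array.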

Now that I can control and display robot movements, can I follow a line using the line sensor?

Not so far!

#5

After several attempts to control the Balboa robot, I made some changes to the program. I discovered that making the encoder drive parameters, M1spd and M2spd, local static variables works best for having the robot run a pattern. The state machines in these routines are one-shot; by using static values, the robot keeps using the last values until the next state machine change. I feel that my goal is a little closer after my testing so far. With the encoder values retained (that is, static), the robot does not run off as much. So far my attempts to follow the line seem a little less aggressive and show a little progress. I will keep in touch.
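As an illustration of the one-shot behavior described here, the sketch below is a plain C++ reduction of such a state machine (not the actual Balboa code; `stepStateMachine` is a hypothetical name, and the clock is passed in explicitly instead of calling millis()):

```cpp
// One-shot state machine: speeds are written only on a state change.
// Declaring them static keeps the last commanded values across calls,
// so the motors hold their setpoints between transitions.
void stepStateMachine(unsigned long now, int *left, int *right) {
  static unsigned long lastTime = 0;
  static int chk = 0;
  static int m1 = 0, m2 = 0;   // static: retain last drive values

  if (chk == 0 && now - lastTime >= 1000) {        // turn phase done
    chk = 1; lastTime = now; m1 = 20; m2 = 20;     // drive straight
  } else if (chk == 1 && now - lastTime >= 1500) { // straight phase done
    chk = 0; lastTime = now; m1 = 15; m2 = 25;     // turn again
  }
  *left = m1; *right = m2;     // last values persist between changes
}
```

Between transitions the function keeps returning the last assigned speeds; with plain (non-static) locals, they would reset every call and the robot would lose its setpoint.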

#6

Still working on the algorithm to allow the Balboa line sensor to navigate a black line on a white surface.

My attempts to navigate a line are as follows:

// Assign sensor condition to motor
// drive routine to run above a line and
// keep the line within the middle
// sensor range.
// When using display the state time
// minimum is 20 milliseconds.
void Linefollower () {
  // Adjust sensor readings to
  // align / position the robot
  // over the line.
  int error, lineDeviation;
  // variable to collect the
  // condition of the line sensor
  // to compare with the next
  // measurement.
  static int lastError;
  // variable to control motor
  // speed using encoder counts.
  static int Lspd, Rspd;
  // Read the time in milliseconds.
  unsigned long currentMillis = millis ();

  if (currentMillis - lastTime >= 80) {
    lastTime = currentMillis;
    error = Read_LineSensor ();
    if (state == 'W' || state == 'B') {
      Lspd = 0;
      Rspd = 0;
      error = 0;
      lastError = 0;
      lineDeviation = 0;
      lastState = state;
    }
    else {
//      lineDeviation = error / 4 + 6 * (error - lastError);
      lineDeviation = error / 4;
      lastError = error;
      if (state == 'L') {
        Lspd = 5;
        Rspd = -2;
      }
      else if (state == 'R') {
        Lspd = -2;
        Rspd = 5;
      }
      else {
        Lspd = 5 - lineDeviation;
        Rspd = 5 + lineDeviation;
      }
      lastState = state;
      Lspd = constrain (Lspd, -10, 10);
      Rspd = constrain (Rspd, -10, 10);
      #if TEST
      #if LCD
      lcd.clear ();
      lcd.print (error);
      lcd.gotoXY (3, 0);
      lcd.print (lineDeviation);
      lcd.gotoXY (0, 1);
      lcd.print (Lspd);
      lcd.gotoXY (4, 1);
      lcd.print (Rspd);
      #else
      Serial1.print (error);
      Serial1.print (",");
      Serial1.print (lineDeviation);
      Serial1.print (",");
      Serial1.print (Lspd);
      Serial1.print (",");
      Serial1.println (Rspd);
      #endif
      #endif
    }
  }
}

This snippet of code provides visibility by broadcasting parameters to my XBee components while the robot runs. This lets me use the information as a graphic plot or a collection of data points for analysis on my host computer. The difficulty is matching the sensor readings to drive values over time. My robot tends to overcorrect and run out of range of the line pattern. Clearly I am able to get good readings from the line sensor and control the motor drive to move and balance. I discovered that timing values less than fifty milliseconds encourage instability; with more time, the robot pauses before it runs amok. Drive values nearing 20 counts speed up the event of running off and out of range of the line pattern. Is there a way to characterize the drive values for better control? The three variables (time, sensor values, and drive counts) are difficult to predict while the robot is moving around.

I have successfully used the Zumo product line to solve line and wall maze configurations. I guess four wheels and tracks are more stable than the Balboa balance algorithm. Any advice for successfully completing this project is much appreciated.

#7

When building a line-following Balboa here, we found that it works better if the sensor is placed a little bit forward of the wheels (in the configuration called “edge-aligned” on the line sensor product page) and if the maximum speed of the robot is limited so that it moves fairly slowly. This lets it react to the position of the line more gradually with less erratic oscillation. Otherwise, if the sensors are between the wheel’s points of contact, the robot might still be facing in the wrong direction if the line is in the center of the sensor array (since the turning axis is in the same location), and a higher speed makes it shoot off the line quickly and then abruptly try to (over)correct.
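One way to apply the speed-limiting suggestion is to clamp the commanded encoder counts before they reach balanceDrive(). This is only a sketch of the idea; the limit of 10 counts used in the example is arbitrary, not a recommended value:

```cpp
// Clamp a commanded speed to +/- maxSpeed, equivalent to
// Arduino's constrain(value, -maxSpeed, maxSpeed).
int limitSpeed(int value, int maxSpeed) {
  if (value > maxSpeed) return maxSpeed;
  if (value < -maxSpeed) return -maxSpeed;
  return value;
}
```

Limiting both wheels this way caps how quickly the robot can shoot off the line while still allowing full-range corrections within the limit.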

If you post pictures of your robot or a video that shows the behavior, we might be able to see other issues.

-Nathan

#8

Good tip on the sensor placement. I haven't developed far enough to see this condition, though I have had my share of runaway conditions. During a review of my robot's performance, I discovered my right motor acted as if it had no power when compared to the left motor under identical conditions. I am waiting for my replacement motor to arrive before I continue my quest; meanwhile I will update my sensor position while I wait for parts. I suspect the motor condition is related to a tight fit during initial assembly of the kit, which I discovered during an update to add the line sensor and headers for the XBee and I2C connections. While doing that update, I tried some different gear ratios to see if I could improve on the noticeable wobble while balancing. I am using 75:1 gearmotors with the 41:25 gear ratio, which works best; however, I may revisit this finding after the replacement motor is installed. This is a great product and I enjoy working with it.

#9

While waiting for my order of replacement motors to arrive, I did some troubleshooting to determine the root cause of the anomaly. When I replaced the right motor (the prime suspect) with the left one (the good guy), the right side worked fine. Inserting the prime suspect into the left side for fault verification led to the discovery that I had made an error in judgement: both motors now perform as needed. Assuming a cold solder connection as the most likely cause of the anomaly, I am moving on to developing the line-following algorithm. I have updated my program to allow for the change from the "center" configuration (the initial condition) to the "edge-aligned" configuration of the sensor bar. I appreciate your advice. Thank you.

#10

I am glad to hear you got that motor working. The Adafruit Guide to Excellent Soldering has helpful tips and techniques for making reliable soldered connections and identifying problematic joints.

-Nathan

#11

Still plugging along.

I recently replaced the 80 mm wheels with a set with hub inserts, which removed the slop that all the trial-and-error coding had put into the original wheels (i.e., the D-hole had worn out).

With all the "two steps forward, one step back" development, I have settled on code for the line sensor that I am comfortable with. I noticed I was getting timeout indications when the sensor algorithm ran outside of the balance algorithm, so I decided to insert it into the balancing algorithm. The timeout condition exists when you aggressively change the timed loop for the balance drive: the robot would treat the line as a slingshot, i.e., detect the line and then run off. To verify the condition, I wrote a routine that uses the balance drive function alone, without the timed event. Troubleshooting is also frustrating over the serial port as well as the LCD, so that process got its own timed event. With experience of the limits of the balance drive routine and the line sensor values returned from the balance update routine, I can return to the line follower routine.

Test routines in balance.ino file.

void Drive_Test () {
//  balanceDrive (-25, 25);
//  balanceDrive (25, -25);
  balanceDrive (10, -10);
//  balanceDrive (10, 10);
//  balanceDrive (-10, -10);
}

// Assign sensor condition to motor
// drive routine to run above a line and
// keep the line within the middle
// sensor range.
// When using display the state time
// minimum is 20 milliseconds.
void Linefollower () {
  // Adjust sensor readings to
  // align the robot over the
  // line.
  int16_t error, lineDeviation;
  // collect line sensor reading
  // to compare with the next
  // measurement.
  static int16_t lastError;
  // motor speed variable
  // using encoder counts.
  static int16_t Lspd, Rspd;
  // Read the time in milliseconds.
  unsigned long currentMillis = millis ();

  
  error = lnpos;
  if (state == 'B' || state == 'W') {
    Rspd = 0;
    Lspd = 0;
  }
  else {
    lineDeviation = (2 * error) / 5;
//    lineDeviation = error;
    Rspd = 10 - lineDeviation;
    Lspd = 10 + lineDeviation;
//    Rspd = 5;
//    Lspd = -5;
  }
  if (currentMillis - lastTime >= 2000) {
    lastTime = currentMillis;
    #if PLOT
    Serial1.print ("!STAT State is ");
    Serial1.println (state);
    Serial1.print (error);
    Serial1.print (",");
    Serial1.print (lineDeviation);
    Serial1.print (",");
    Serial1.print (Lspd);
    Serial1.print (",");
    Serial1.println (Rspd);
    #elif SERIAL
    Serial1.print (error);
    Serial1.print (",");
    Serial1.print (lineDeviation);
    Serial1.print (",");
    Serial1.print (Lspd);
    Serial1.print (",");
    Serial1.println (Rspd);
    #elif LCD
    lcd.clear ();
    lcd.print (error);
    lcd.gotoXY (3, 0);
    lcd.print (lineDeviation);
    lcd.gotoXY (0, 1);
    lcd.print (Lspd);
    lcd.gotoXY (4, 1);
    lcd.print (Rspd);
    #endif
  }
  // Inject the encoder ticks into the
  // balance algorithm.
  balanceDrive (Lspd, Rspd);  
}

Inserted into balance.h file.

#define LINE true
#if LINE
// Line sensor parameters.
// total number of sensors.
#define NUM_SENSORS 5
// Sensor timeout  value.
#define TIMEOUT     2500
// Sensor enable pin assignment.
#define EMITTER_PIN 5
// Weight value for each sensor.
// sensor0*0, sensor1*10..sensor4*40.
#define STEP 10
#endif

Added to the balance.cpp file:

#if LINE
// collect sensor weight value.
int16_t lnpos;
// Global variables
// collect sensor position.
// 'L', 'R', 'S', 'B', 'W'
uint8_t state;
// collect the sensor condition.
// near zero for white, 2500 for
// black line.
uint16_t sensorValues [NUM_SENSORS];
// Input assignment for line sensor.
const uint8_t sensorPins [NUM_SENSORS] = {A4, A3, A2, A0, 12};
// Configure the sensor library.
QTRDimmableRC qtrrc(sensorPins, NUM_SENSORS, TIMEOUT, EMITTER_PIN);
#endif

#if LINE
// Line sensor program weighs each sensor
// for robot position on the line.
// Allow 4 milliseconds to read the line.
void Read_LineSensor () {
  // number of sensors on the line.
  uint8_t S;
  // read the line sensor.
  qtrrc.read (sensorValues);
  // calculate the sensor weight and
  // record number of contributing
  // sensors.
  lnpos = 0;
  S = 0;
  for (int16_t i = 0; i < NUM_SENSORS; i++) {
    if (sensorValues [i] > 1000) {
      lnpos += i * STEP;
      S++;
    }
  }
  // Detected a black surface, all
  // sensors report greater than
  // 1000.
  if (S == 5) {
    state = 'B';
    lnpos = 25;
  }
  // Line sensor has detected a
  // black and white surface.
  else if (S > 0) {
    // sensor weight by number
    // of contributing sensors.
    lnpos /= S;
    // adjust the sensor value; right (+)
    // or left (-); based on the sensor array
    // center.
    lnpos -= STEP * ((NUM_SENSORS - 1) / 2);
    // positive sensor values near right wheel.
    if (lnpos > 0) {
      state = 'r';
      if (lnpos >= 15) state = 'R';
    }
    // negative sensor values near left wheel.
    else if (lnpos < 0) {
      state = 'l';
      if (lnpos <= -15) state = 'L';
    }
    // no sensor value when robot centered above
    // the line.
    else state = 'S';
  }
  // define sensor position.
  // White surface, all sensors report
  // less than 1000.
  else {
    state = 'W';
    lnpos = -25;
  }
}
#endif

void balanceUpdateSensors () {
  imu.read ();
  integrateGyro ();
  integrateEncoders ();
  #if LINE
  Read_LineSensor ();
  #endif
}

Any helpful comments would be appreciated. This is a great project and product.

#13

Still working on it. My development environment consists of a five-foot-diameter disc of half-inch plywood painted white with 12 mm black lines drawn on it. This is what I used to develop and demonstrate the Zumo robots when I wrote and verified similar code designs, and it works great. To date, the techniques used to train the Balboa robot have ended with the robot running to the disc edge in one quick hurry. The Balboa behaves as expected when I use a 10 mm strip of black construction paper on a hardwood floor, but my goal is not to chase the robot around the floor with a strip of paper. I have discovered values that respond well to the paper strip and drive the robot using sensor readings. Using timed events demonstrates some control: the robot moves along the line as expected, then jiggles out of range of the line. Is there a time constraint that I have to consider? Please advise.

#14

Working out the timing of how to read various sensor inputs and how your two control loops (balancing and line following) interact is one of the tricky parts of building a robot like this. The balancing algorithm in our example code is meant to run at 100 Hz (every 10 ms). In general, it is helpful for this timing to be fairly consistent and for the code execution to be much shorter than the timing period, to keep the variation small.

Each of the QTR-RC line sensors in the array takes some time to read a reflectance value from the surface, and the more light the surface reflects to the sensor, the quicker the measurement occurs. The timeout value specified when initializing the array with our library is 2500 µs (2.5 ms) by default, and it appears your code is still using that TIMEOUT value. In an array of 5 sensors, that might mean a worst-case read time for the array of about 12.5 ms. You might characterize the amount of time it takes the sensors to change state with the QTRRCRawValuesExample sketch included with our qtr-sensors-arduino library. You should take care to account for variations due to the position of the sensor and general lighting variations. You might then be able to decrease the TIMEOUT value.

-Nathan

Update: Our library actually reads the sensors in the array in parallel, so the worst case execution time for that statement should be closer to 2.5ms than 12.5ms. However, the length of time required to read the array is dependent on the values it is reading and that variation could be interfering with the timing of your balancing loop, so tuning the TIMEOUT value could still help.
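The numbers in this post reduce to a quick calculation: with the default 2500 µs TIMEOUT, a sequential read of 5 sensors could take 12.5 ms, which overruns a 10 ms (100 Hz) balance period, while a parallel read is bounded by a single timeout. A small self-contained C++ check of that arithmetic:

```cpp
// Worst-case line sensor read time vs. the 100 Hz balance budget.
const int timeoutUs = 2500;           // library default TIMEOUT
const int numSensors = 5;
const int balancePeriodUs = 10000;    // 100 Hz balance loop

int sequentialWorstUs() { return timeoutUs * numSensors; }  // one sensor at a time
int parallelWorstUs()   { return timeoutUs; }               // all sensors together
bool fitsInBudget(int readUs) { return readUs < balancePeriodUs; }
```

So a sequential worst case (12.5 ms) cannot fit inside one balance period, but a parallel worst case (2.5 ms) can, which is why tuning TIMEOUT still matters for loop-timing jitter.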

#15

Nathan, thank you for your response. Yes, I agree with everything you wrote. I noticed this condition when I tried to read the line sensors without timing; specifically, the red LED would light up. To avoid this, I found that increasing the reading interval to greater than 30 milliseconds seemed to work well. Then I tried inserting the sensor read routine into the balance timing, and this also worked well, i.e., no red LED lighting up. My robot balances just fine regardless of the timing abuse.

My sensor array routine does take advantage of the raw values as you described. The conversion to a sensor weight (the QTRRC readLine() routine) did not seem to fit my application using the QTRDimmableRC routines; however, I was able to create a routine that weights each sensor in a similar manner, ranging from -25 to 25 as the sensor detects surface variations from black to white. This works very well whether I nest it in the balancing routine or in a fixed timing loop.

My difficulty is using sensor values to control the balance drive routine. I have successfully created routines that control the robot's movements: rotate 180 degrees, go forward, and repeat, like mowing the lawn with timed events. I can also spin the robot like a top while balancing, using drive values of -25 to 25 counts, as shown in earlier posts. I am able to control the robot with 14 inches of half-inch-wide electrical tape in a straight line: I chase it onto the tape with a 10 mm wide strip of black construction paper, and it runs to the end of the tape and stops, as expected. It just refuses to follow a line that is not straight.

I have discovered the line sensor has a dead zone (the space between detectors is greater than 10 mm) that can easily be detected and accounted for in my sensor routine. The only thing left to do is match the amount of drive to the line deviation detected by the sensors. It just loves to take a hike, balancing all the way, of course. When I use timed events, it seems to detect the line and then take advantage of the dead time to jiggle off the line, as if to say "see ya later." This product is awesome.

#16

Still at it. I was having trouble with the spacing of the line sensors: the weight values would indicate white when the line was between the inner and outer sensors. So I split the detection code, because reading the line sensor and assigning a value to the result together affected the balance algorithm.

The following is a modified version of my earlier post's balance.cpp code,
making the number of sensors (S) and the value (lnpos) external and global.

// Line sensor program weighs each sensor
// for robot position on the line.
// Allow 4 milliseconds to read the line.
void LineSensor () {
  // read the line sensor.
  qtrrc.read (sensorValues);
  // calculate the sensor weight and
  // record number of contributing
  // sensors.
  lnpos = 0;
  S = 0;
  for (int16_t i = 0; i < NUM_SENSORS; i++) {
    ledBar [i] = false;
    if (sensorValues [i] > 1000) {
      ledBar [i] = true;
      lnpos += i * STEP;
      S++;
    }
  }
  // Detected a black surface, all
  // sensors report greater than
  // 1000.
  if (S == 5) lnpos = 25;
  // Line sensor has detected a
  // black and white surface.
  else if (S > 0) {
    // sensor weight by number
    // of contributing sensors.
    lnpos /= S;
    // adjust the sensor value; right (+)
    // or left (-); based on the sensor array
    // center.
    lnpos -= STEP * ((NUM_SENSORS - 1) / 2);
  }
  // define sensor position.
  // White surface, all sensors report
  // less than 1000.
  else lnpos = -25;
}

void balanceUpdateSensors () {
  imu.read ();
  integrateGyro ();
  integrateEncoders ();
  LineSensor ();
}

Then, after several attempts, I settled on this code, inserted into the balance.ino code.
Using values from balance.cpp, I used global variables to evaluate the detection values.
The main point was to retain the last sensor value while the line was between the inner and outer sensors. For convenience, I assign a letter to each range of the sensor, which is handy for developing the drive values needed to stay on the line.

// Evaluate the sensor weight for robot position on the line.
void Read_LineSensor () {
  // number of sensors on the line.
  // Detected a black surface, all
  // sensors report greater than
  // 1000.
  if (S == 5) {
    DZ = false;
    state = 'N';
    lastState = state;
    lastPos = lnpos;
  }
  // Line sensor has detected a
  // black and white surface.
  else if (S > 0) {
    if (lnpos > 0) {
      if (lastState == 'l' && DZ == true) DZ = false;
      if (S > 2) {
        state = 'E';
        lnpos += 7;
      }
      else if (lnpos > 15) state = 'R';
      else state = 'r';
      if (state == 'R' && lastState == 'S') DZ = !DZ;
      else if (state == 'r' && DZ == true) DZ = false;
      lastState = state;
      lastPos = lnpos;
    }
    // negative sensor values near left wheel.
    else if (lnpos < 0) {
      if (lastState == 'r' && DZ == true) DZ = false;
      if (S > 2) {
        state = 'W';
        lnpos -= 7;
      }
      else if (lnpos < -15) state = 'L';
      else state = 'l';
      if (state == 'L' && lastState == 'S') DZ = !DZ;
      else if (state == 'l' && DZ == true) DZ = false;
      lastState = state;
      lastPos = lnpos;
    }
    // no sensor value when robot centered above
    // the line.
    else {
      DZ = false;
      state = 'F';
      lastState = state;
      lastPos = lnpos;
    }
  }
  // define sensor position.
  // White surface, all sensors report
  // less than 1000.
  else {
    if (lastState == 'r' && DZ == false) {
      DZ = true;
      lnpos = lastPos;
      lastState = 'S';
    }
    else if (lastState == 'l' && DZ == false) {
      DZ = true;
      lnpos = lastPos;
      lastState = 'S';
    }
    else if (DZ) {
      lnpos = lastPos;
      lastState = 'S';
    }
    else {
      state = 'S';
      lastState = state;
      lastPos = lnpos;
    }
  }
}

Now to get back to the original plan…

#17

Still working on the line following program… Nice weather outside is my excuse for the drawn-out development of this project. Yes, there has been a lot of trial and error, and code segments that worked well both in and out of the balance loop. I have decided to weigh the line outside of the balance update process, primarily because the line sensor program would process the sensor gap between the central and outlying detectors before the main loop could see it. Since the time needed to maintain balance varied with the number of encoder counts, limiting counts to twenty and using a timed event process seems adequate. I am working on a proportional, integral, and derivative algorithm within the event timing loop, adjusting / driving the balance drive function with line sensor readings.
Because I chose sensor values within plus / minus twenty-five, the range of PID parameters can be kept within integer numbers. I believe this consideration will mitigate any timing issues.
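The integer-only PID reasoning can be sanity-checked off-robot. The sketch below uses the proportional scaling that appears in the code later in this post (Kp = 8 with a divisor of 10, i.e. an effective gain of 0.8) and shows the output stays small for errors in ±25; it is an illustration, not the exact Balboa loop:

```cpp
// Proportional term with integer scaling: gain Kp = 8, divisor 10,
// so the effective gain is 0.8. With error in [-25, 25] the result
// stays in [-20, 20], comfortably inside int16_t range.
const int Kp = 8;

int proportionalTerm(int error) {
  return (error * Kp) / 10;
}
```

At the extremes, proportionalTerm(25) is 20 and proportionalTerm(-25) is -20, so commanded speeds of 5 ± lineDeviation stay within the ±25 drive range used in earlier posts.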

My balance code is taken from the latest release on GitHub.com without change,
so this code is inserted into the updated program loop for push button C.

/*
* Line sensor program weighs each sensor
* for robot position on the line.
* The central detectors(S2,S3&S4) are
* spaced close enough to detect the
* line(i.e 0.375"[9.5mm]). 
* Since the detection gap of the outside
* (S5,S1) and inner detectors(S2,S3&S4) is
* 0.75inch(19mm); track width based on
* electrical tape; ~0.5inch(13mm); can
* evade detection.
*/
void Read_LineSensor () {
  static int16_t lastln;
  // read the line sensor.
  lineSensors.read (sensorValues);
  // calculate the sensor weight and
  // record number of contributing
  // sensors.
  lnpos = 0;
  S = 0;
  for (int16_t i = 0; i < SensorCount; i++) {
    ledBar [i] = false;
    if (sensorValues [i] > 1000) {
      ledBar [i] = true;
      lnpos += i * 10;
      S++;
    }
  }
  // Detected a black surface, all
  // sensors report greater than
  // 1000.
  if (S > 3) {
    // Oops! Twisted onto the black line.
    if (lastln != 25) {
      lnpos = lastln;
      lastln = 25;
    }
    lnpos = 25;
  }
  // Line sensor has detected a
  // black and white surface.
  else if (S > 0) {
    // sensor weight by number
    // of contributing sensors.
    lnpos /= S;
    // adjust the sensor value; right (+)
    // or left (-); based on the sensor array
    // center.
    lnpos -= 10 * ((SensorCount - 1) / 2);
    lastln = lnpos;
  }
  // define sensor position.
  // White surface, all sensors report
  // less than 1000.
  // Adjust for sensor gap.
  else {
    // Where did the black line go?
    if (lastln != -25) {
      lnpos = lastln;
      lastln = -25;
    }
    lnpos = -25;
  }
  if (lnpos == 25) state = 'B';
  else if (lnpos == -25) state = 'W';
  else if (lnpos > 5 && lnpos < 15) state = 'l';
  else if (lnpos > 10) state = 'L';
  else if (lnpos < -5 && lnpos > -15) state = 'r';
  else if (lnpos < -10) state = 'R';
  else state = 'F';
}

/*
* Map the sensor condition to the motor
* drive routine so the robot runs above a
* line, keeping the line within the middle
* sensor range.
* When using the display, the minimum state
* time is 20 milliseconds.
* Drive is adjusted based on the last
* detected position.
*/

void Linefollower () {
  // Adjust sensor readings to
  // align the robot over the
  // line.
  const int16_t Kp = 8;
  const int16_t Ki = 0;
  const int16_t Kd = 0;
  static bool Spin;
  int16_t proportional, derivative;
  // The integral term must persist
  // between calls to accumulate.
  static int16_t integral;
  int16_t error, lineDeviation;
  static int16_t lasterror;
  // Motor speed variables,
  // in encoder counts.
  static int16_t Lspd, Rspd;
  // Read the time in milliseconds.
  uint16_t currentMillis = millis ();

  #if TEST
  if (currentMillis - lastTime >= 1500) {
    lastTime = currentMillis;
    Read_LineSensor ();

    if (Spin) {
      Rspd = 5;
      Lspd = -5;
      Spin = false;
    }
    else {
      Rspd = -5;
      Lspd = 5;
      Spin = true;
    }
    Serial1.println (lnpos);
  }
  #else
  if (Chk == 0 && currentMillis - lastTime >= 55) {
    Chk = 1;
    lastTime = currentMillis;
    Read_LineSensor ();
  }
  if (Chk == 1 && currentMillis - lastTime >= 15) {
//    Chk = 2;
    Chk = 0;
    lastTime = currentMillis;
//    if (lnpos == 25) {
    if (lnpos > 15) {
      Rspd = 0;
      Lspd = 0;
      error = lasterror = 0;
      lineDeviation = 0;
    }
//    else if (lnpos == -25) {
    else if (lnpos < -15) {
      Rspd = 0;
      Lspd = 0;
      error = lasterror = 0;
      lineDeviation = 0;
    }
    else {
      error = lnpos;
      proportional = (error * Kp) / 10;
      if (abs(error) < 10 && abs(error) != 0) {
        integral = ((integral + error) * Ki) / 100;
        integral = constrain (integral, -10, 10);
      }
      else integral = 0;
      if (error == 0) {
        derivative = 0;  
      }
      else {
        derivative = ((error - lasterror) * Kd) / 100;
      }
      lineDeviation = proportional + integral + derivative;
      lasterror = error;
      Lspd = 5 + lineDeviation;
      Rspd = 5 - lineDeviation;
    }
  }
  if (Chk == 2 && currentMillis - lastTime >= 30) {
    Chk = 0;
    lastTime = currentMillis;
    // Use serial plotter.
//    Serial1.print (30);
//    Serial1.print (",");
//    Serial1.print (-30);
//    Serial1.print (",");
//    Serial1.print (lnpos);
//    Serial1.print (",");
    Serial1.print (error);
    Serial1.print (",");
    Serial1.print (lineDeviation);
    Serial1.print (",");
    Serial1.print (Rspd);
    Serial1.print (",");
    Serial1.println (Lspd);
  }
  #endif
  balanceDrive (Lspd, Rspd);
}
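With Ki and Kd at zero, the correction above reduces to a proportional term. A minimal desktop sketch (plain C++, hypothetical name `driveForPosition`, assuming Kp = 8 and the base speed of 5) shows how the line position maps to wheel speeds:

```cpp
#include <cassert>
#include <cstdint>

// Proportional part of the correction in Linefollower(),
// assuming Kp = 8, Ki = Kd = 0, and a base speed of 5.
// A positive position (line to the robot's right) speeds up
// the left wheel and slows the right, steering back toward
// the line; integer division truncates toward zero as on AVR.
void driveForPosition(int16_t lnpos, int16_t &Lspd, int16_t &Rspd) {
  const int16_t Kp = 8;
  int16_t deviation = (lnpos * Kp) / 10;
  Lspd = 5 + deviation;
  Rspd = 5 - deviation;
}
```

At the extreme reading of +20 the correction is 16 counts, enough to reverse the right wheel while the left wheel drives forward.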

Moving forward and enjoying it.

1 Like
#18

I have decided that an autonomous approach to following a line with the balancing robot is difficult due to the unpredictable state of the robot once it regains its balance. The balancing algorithm is so effective that the robot can recover after careening off a wall or leaving the tile floor for the carpet. Predicting this behavior well enough to guide the robot along a line on the floor, or to detect a wall, is difficult to do autonomously. By design, the Balboa robot provides interface inputs for I2C, serial, and discrete sensors (the IR line sensor). I discovered that timing events are crucial when introducing sensor input to the balancing algorithm. Drawing on experience with “RobotBasic”, introduced with the publication of “Robot Programmer’s Bonanza”, I submit the following solution for using sensors to control the behavior of the balancing robot.
Because the line sensor is time sensitive, I integrated the interface instructions into the released balancing code as follows:

Inserted line sensor code into balance.h --
#include <stdint.h>
#include <LSM6.h>
#include <Balboa32U4.h>


// Line sensor parameters.
// total number of sensors.
const uint8_t SensorCount = 5;

// This code was developed for a Balboa unit using 75:1 motors
// and 41:25 plastic gears, for an overall gear ratio of 124.
// Adjust the ratio below to scale various constants in the
// balancing algorithm to match your robot.
const int16_t GEAR_RATIO = 125;

extern uint8_t lnLeds;

Inserted this segment into balance.cpp --
#include <Wire.h>
#include "Balance.h"

Balboa32U4LineSensors lineSensor;

uint8_t lnLeds;

void Read_LineSensor () {
  uint8_t S;
  uint16_t sensorValues [SensorCount];
  
  // read the line sensor.
  lineSensor.read (sensorValues);
  // record number of contributing
  // sensors.
  S = 0;
  lnLeds = 0;
  for (int16_t i = 0; i < SensorCount; i++) {
    if (i > 0) lnLeds = lnLeds << 1;
    if (sensorValues [i] > 1000) {
      lnLeds += 1;
      S++;
    }
  }
}
  balanceUpdateSensors ();
  balanceDoDriveTicks ();

  if (isBalancingStatus)
  {
    Read_LineSensor ();
    balance();
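The lnLeds packing in Read_LineSensor() above can be verified on the desktop. A minimal plain-C++ sketch (hypothetical name `packLeds`, assuming five sensors and the 1000-count threshold):

```cpp
#include <cassert>
#include <cstdint>

// Desktop sketch of the lnLeds bitmask built above: the mask
// is shifted left once per sensor, so sensor 0 (leftmost,
// since it contributes the most negative position weight)
// lands in bit 4 and sensor 4 (rightmost) in bit 0.
uint8_t packLeds(const uint16_t values[5]) {
  uint8_t leds = 0;
  for (int16_t i = 0; i < 5; i++) {
    if (i > 0) leds <<= 1;            // make room for this sensor
    if (values[i] > 1000) leds |= 1;  // sensor sees the line
  }
  return leds;
}
```

This bit order matters later: Read_ThreeBytes() tests bit 4 (0x10) and bit 0 (0x01) for the two outer sensors.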

And lastly this is inserted into balance.ino --
    // Stop trying to balance if we have been farther from
    // vertical than STOP_BALANCING_ANGLE for 5 counts.
    if (abs(angle) > STOP_BALANCING_ANGLE)
  balanceUpdate ();
  if (isBalancing ()) {
    // Execute the robot routines.
    if (enableSong) {
      playSong ();
    }
    if (enableDrive) {
      Figure_Eight ();
//      Test_Track ();
//      driveAround ();
    }
    if (enableLine) {
      Serial_Command ();
//      Line_Detect ();
    }
  }
  else {
    buzzer.stopPlaying ();
    balanceDrive (0, 0);
    // Use robot button to execute
    // the stand up routine.
    if (buttonA.getSingleDebouncedPress ()) {
      enableSong = true;
      enableDrive = false;
      enableLine = false;
      standUp ();
    }
    else if (buttonB.getSingleDebouncedPress ()) {
      enableSong = false;
      enableDrive = true;
      enableLine = false;
      standUp ();
    }
    else if (buttonC.getSingleDebouncedPress ()) {
      enableSong = false;
      enableDrive = false;
      enableLine = true;
//      standUp ();
    }
  }
  // Display the balance status of the robot. 
void Serial_Command () {
  static uint8_t Chk;
  static uint8_t index;
  static int8_t received_bytecount;
  static uint16_t lastTime;
  static int16_t Lspd, Rspd;
  uint16_t currentMillis = millis ();

  if (Chk == 0 && (currentMillis - lastTime >= 25)) {
    Lspd = Rspd = 0;
    if (Serial1.available () >= 2) {
      received_bytecount = Serial1.readBytes (command, 2);
    }
    if (received_bytecount == 2) {
      index = 0;
      switch (command[0]) {
        // stop
        case 0:
          Lspd = Rspd = 0;
          break;
        // forward
        case 6:
          Lspd = Rspd = 5 * command[1];
          break;
        // backward
        case 7:
          Lspd = Rspd = -5 * command[1];
          break;
        // right wheel forward
        case 8:
          Lspd = 0;
          Rspd = 3 * command[1];
          break;
        // right wheel backward
        case 9:
          Lspd = 0;
          Rspd = -3 * command[1];
          break;
        // left wheel forward
        case 10:
          Lspd = 3 * command[1];
          Rspd = 0;
          break;
        // left wheel backward  
        case 11:
          Lspd = -3 * command[1];
          Rspd = 0;
          break;
        // spin clockwise
        case 12:
          Lspd = 3 * command[1];
          Rspd = -3 * command[1];
          break;
        // spin counter-clockwise
        case 13:
          Lspd = -3 * command[1];
          Rspd = 3 * command[1];
          break;
        // collect sensor data
        case 16:
          index = 5;
          sendbuffer[5] = 0;
          sendbuffer[6] = 0;
          sendbuffer[7] = lnLeds; 
          sendbuffer[8] = 0;
          sendbuffer[9] = 0;
          Lspd = Rspd = 0;
          break;
        default:
          Lspd = Rspd = 0;
      }
      Read_ThreeBytes ();
      sendbuffer[3] = Lspd;
      sendbuffer[4] = Rspd;
      Serial1.write ((unsigned char *)sendbuffer + index, 5);
      command[0] = command[1] = 0;
      received_bytecount = 0;
    }
    lastTime = millis ();
    Chk = 1;
  }
  else if (Chk == 1 && (currentMillis - lastTime >= 250)) {
    lastTime = currentMillis;
    Chk = 0;
    Lspd = Rspd = 0;
  }
  balanceDrive (Lspd, Rspd);
}
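For reference, the two-byte frames consumed by Serial_Command() can also be decoded off the robot. A minimal plain-C++ sketch (hypothetical name `expectedSpeeds`, mirroring a few of the drive cases in the switch above):

```cpp
#include <cassert>
#include <cstdint>

// Expected wheel speeds for a two-byte command frame, per the
// switch in Serial_Command(): byte 0 is the opcode, byte 1 the
// speed argument. Forward/backward scale by 5; single-wheel
// and spin commands scale by 3; anything else stops.
void expectedSpeeds(const uint8_t frame[2],
                    int16_t &Lspd, int16_t &Rspd) {
  int16_t arg = frame[1];
  switch (frame[0]) {
    case 6:  Lspd = Rspd = 5 * arg;           break; // forward
    case 7:  Lspd = Rspd = -5 * arg;          break; // backward
    case 8:  Lspd = 0;        Rspd = 3 * arg; break; // right wheel fwd
    case 12: Lspd = 3 * arg;  Rspd = -3 * arg; break; // spin clockwise
    case 13: Lspd = -3 * arg; Rspd = 3 * arg;  break; // spin ccw
    default: Lspd = Rspd = 0;                 break; // stop / unknown
  }
}
```

Any unrecognized opcode falls through to a stop, matching the default case on the robot.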

void Read_ThreeBytes () {
  int8_t snsr;
  
  if ((lnLeds & 0x10) && !(lnLeds & 0x0F)) snsr = 0x01;
  else if ((lnLeds & 0x01) && !(lnLeds & 0x1E)) snsr = 0x04;
  else if ((lnLeds & 0x0E) && !(lnLeds & 0x11)) snsr = 0x02;
  else snsr = 0;
  // rbumper value 0-15
  // wall sensors
  // VL6180 ToF distance sensor.
  sendbuffer[0] = 0;
  // rfeel value 0-31
  // line sensor array
  sendbuffer[1] = lnLeds;
  // rsense value 0-7
  sendbuffer[2] = snsr;
}
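The rsense classification above collapses the five-bit mask into three coarse, mutually exclusive cases. A quick desktop check (plain C++, hypothetical name `classify`, assuming as in the packing routine that the leftmost sensor occupies bit 4):

```cpp
#include <cassert>
#include <cstdint>

// Same tests as Read_ThreeBytes(): with the leftmost sensor
// in bit 4 and the rightmost in bit 0, report 0x01 for
// left-only, 0x04 for right-only, 0x02 for center-only, and
// 0 for any mixed or empty pattern.
int8_t classify(uint8_t lnLeds) {
  if ((lnLeds & 0x10) && !(lnLeds & 0x0F)) return 0x01; // left only
  if ((lnLeds & 0x01) && !(lnLeds & 0x1E)) return 0x04; // right only
  if ((lnLeds & 0x0E) && !(lnLeds & 0x11)) return 0x02; // center only
  return 0;
}
```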

I have found this to work well for remotely controlling the behavior of the Balboa robot.

Best Regards.

1 Like