I typed in some simple code to get my robot moving along the wall using my left Sharp IR sensor. Then I was going to come back to it and start adding things. Here's the code:
#include <pololu/3pi.h>

#define TOO_CLOSE 1
#define TOO_FAR   2
int TOO_CLOSE_THRESHOLD = 120;
int TOO_FAR_THRESHOLD = 80;

// get reading on IR sensor (ADC channel 7, averaged over 20 samples)
int wall_dist_prox()
{
    int prox_value = analog_read_average(7, 20);
    if (prox_value > TOO_CLOSE_THRESHOLD) return TOO_CLOSE;
    if (prox_value < TOO_FAR_THRESHOLD)   return TOO_FAR;
    return 0;
}

int main()
{
    print("Press B");           // printing "Press B" to the LCD
    wait_for_button(BUTTON_B);  // waits for button B to be pressed and released

    while (1)
    {
        int state = wall_dist_prox();
        if (state == TOO_CLOSE)     set_motors(50, 40);  // veer away from the wall
        else if (state == TOO_FAR)  set_motors(40, 50);  // veer toward the wall
        else                        set_motors(50, 50);
    }
}
The first time I programmed it, it went down the wall like it was supposed to. Then I thought I would increase the wheel speed on the faster wheel so it would respond quicker. Compiled, saved, reloaded the hex file, set the robot down, pressed "B" and . . . perpetual right circle . . . every time!
I then loaded a program showing sensor values and it was receiving the right numbers.
I can't figure out why it would work the first time, then, with no real changes made, not work again.
How do you have your sensors mounted? Can you post a picture of your robot?
Does your 3pi actually drive straight when you set the motor values to the same speed? It is unlikely for the two motors to be perfectly matched, so I expect your robot will still veer slightly to the left or right when you set both speeds to 50. I think your first step should be to find a left and right motor speed near 50 that make the 3pi drive straight (e.g. left = 47, right = 52).
Also, to get good results, you might want to use a PID algorithm rather than the simple left-straight-right approach. PID will cause your motor response to vary proportionally to the sensor readings so that your response is weak when things are going well and strong when they need to be (e.g. when you encounter a corner). Have you seen our sample 3pi project for making a wall-following robot?
Here are two links: one shows it with the sensors parallel, and the other with the left sensor angled.
As I stated before, this program roughly ran the robot down the wall and around the corner. But, with no program changes, all the other runs do what is in the videos.
I know I don’t have anything fancy in my code, but as written, it shouldn’t be doing this. And it worked the first time.
I’m a little confused. In your first post you said it worked, then you made a minor change and it stopped working. Now you’re saying it stopped working with no change to the code. Even if you think the change was minor, it could still break things. Are you sure the code you’re running now is identical to the case where it worked (and the hardware is mounted the same way, too)?
A statement like this is not really very helpful. It makes it sound like the robot isn’t doing what the code is telling it to do when the real problem is that there is likely something you’re missing in your understanding of the code and how it affects the hardware. Your focus now should be understanding as best you can the behavior you’re actually seeing so that you can figure out how your code or hardware needs to be changed.
Did you try my suggestion of figuring out what speeds you need to make your robot drive straight? I really do think this is going to make a significant difference for you, and I strongly suggest you modify your code based on what you discover. Did you look at our wall-following sample project at all? You definitely should have your sensor mounted at a 45° angle like in the second video, as looking straight sideways will probably not work (whenever the robot rotates to get close to the wall, the sensor will angle in a way that makes the wall seem farther away even though the bot is getting closer to the wall).
Your problem is either in the identification of the situation or the execution of the response. To figure out which, you can try printing to the LCD what the sensor is seeing and what the bot wants to do in response. Then, instead of putting the wheels on the ground, hold it in various positions and orientations relative to the wall. This will let you evaluate whether the robot is correctly identifying the situation. If you’re happy with how the robot performs in this test, your problem is likely the execution, and you can tweak things like the motor speeds you use based on your sensor readings.
You are correct. When I thought about the minor change I made, it was a wheel speed change intended to increase the veering toward and away from the wall.
I reread your post and started thinking about one motor pulling the robot slightly off course, enough to knock it away from the wall and lock it in a spin.
I set both motors at 50 and it pulled to the right. So I gradually increased the right wheel speed until I found the right number: 5.
Then, and I had done this before with erratic results, I loaded the Pololu wall-follower PD software, and where the PD output is applied to the wheels, I added 5 to the right wheel. And it worked!
I’m out of time for my project, but this is a huge breakthrough!
When I bought my sensors I didn’t pay too close attention to the range values and how they would impact the sensing ability at close quarters. I bought longer range sensors, but am only operating in a small area.
And, actually, one of my sub-projects was printing something to the screen, which happened to be the three sensor readings on the LCD. I hand-carried the robot near the wall and away from it.
Then I adjusted the setpoint value to bring the robot in a little closer.
Thank you for pointing out that I need to really analyze what is going on and not default to throwing my hands in the air!
I’m very glad to hear that you were able to get things working!