Two VL53L0X sensors produce different values to same target

I have two VL53L0X sensors mounted on the front of my robot rover. They are mechanically mounted identically. I have one on the left front and one on the right front of the rover, roughly 4.5 cm apart. They are powered by the same supply (a power bank). They are controlled by an A-Star Micro; they are configured identically, except the left sensor is at the default I2C address and the right sensor is at a different address.

I’ve had them working for months, but only cared about distances of tens of centimeters. Recently I thought about leveraging the reported accuracy and started to do some “calibration testing”. Placing the rover so that both sensors are 20 cm (to the best of my ability to measure) from a white painted surface, and reading with a timing budget of 100,000 µs, with the left sensor I get ranges of 19.5 - 19.7 cm; with the right sensor the ranges are 18.9 - 19.1 cm. That means the left sensor error is < 2.5% and the right is < 5.5%. For a distance of 120 cm, I get 119.9 - 120.5 cm and 118.2 - 118.6 cm respectively; that is an error of < 1% and < 1.5%.

My (admittedly casual) reading of the documentation leads me to believe the devices are calibrated at the factory. Further, tables 12 and 13 in the data sheet suggest I should see no more than 3% error. Thus I am somewhat surprised by the large error at small distances with the right sensor. I am also surprised at the average error difference between the two sensors (0.2 - 0.4 cm for the left and 1.0 - 1.6 cm for the right).

Am I missing something? I excerpted the relevant code:

void setUpTOFs(void) {
  // make the GPIO an output pin so that it will drag XSHUT low for left sensor;
  // this leaves the left sensor at the default address (41)

  // set the address of the right sensor

  // now make the GPIO an input pin so that XSHUT can float high for left sensor;
  // this allows communication with the left sensor at address 41
  pinMode(XSHUT_PIN_L, INPUT);
  delay(100); // for power-up procedure, t-boot max 1.2 ms ("Datasheet: 2.9 Power sequence")

  // initialize the sensors
}

int readTOFs() {
  if (DEBUG) Serial.println("Sample TOFs ");

  // set up for accurate readings

  // read sensors
  int dR = sensorR.readRangeSingleMillimeters();
  int dL = sensorL.readRangeSingleMillimeters();

  // if a timeout occurred, make the reading negative
  if (sensorR.timeoutOccurred()) {
    dR = dR * -1;
  }
  if (sensorL.timeoutOccurred()) {
    dL = dL * -1;
  }

  return -1;
}

We are not really sure why you seem to be getting a higher % error at shorter distances, but it suggests that there could be something like a fixed offset that is not being accounted for. So, you might try calibrating for that by adding an offset in your code to see if that helps. Also, if you are able, you might try disconnecting and swapping the sensors in your rover to see if the issue follows the sensor. Additionally, if you have not already done so, it would be good to verify that your sensors are pointing perpendicular to the surface being measured.
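For example, a fixed-offset correction could look something like this sketch (the offset constants here are placeholders, not measured values; you would substitute numbers derived from your own calibration runs):

```cpp
#include <cassert>

// Hypothetical per-sensor offsets in mm. These particular values are
// placeholders; derive your own from repeated readings at a known distance.
const int OFFSET_L_MM = 2;   // e.g. left sensor reads ~2 mm low
const int OFFSET_R_MM = 8;   // e.g. right sensor reads ~8 mm low

// Apply a fixed offset to a raw range reading (mm in, mm out).
int correctedRange(int rawMm, int offsetMm) {
  return rawMm + offsetMm;
}
```

A constant offset is the simplest model; if the error turns out to vary with distance, a distance-dependent correction would be the next step.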


Thanks for the quick response and advice.

Addressing your thoughts: Mechanically, the sensors are not offset from the test surface, at least by no more than a mm. It is possible that they are not mounted precisely perpendicularly, however. That said, I cannot see how a small difference in the mounting angle could cause a large difference in the detected range at 20 cm, though it would seem feasible at 120 cm. Even though it is a real pain, I agree that it is an excellent idea to switch the sensors, and I will do so and make additional tests.

After my post I decided to do more experiments, in effect to find the “electronic offset” that I can use for correcting reported range. I decided to collect 400 readings of both sensors as fast as possible and then calculate the mean (M) and standard deviation (D). At 1200 mm for the left sensor M=1205, D=4; for the right sensor M=1193.7, D=1. At 200 mm for the left sensor M=198, D=1; for the right sensor M=192.4, D=1.06.
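For reference, M and D here are the simple mean and (population) standard deviation of the samples. A minimal sketch of that computation, in plain C++ rather than the Arduino sketch:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Mean (M) and population standard deviation (D) of a set of range samples.
struct Stats {
  double mean;
  double stddev;
};

Stats computeStats(const std::vector<int>& samples) {
  double sum = 0.0;
  for (int s : samples) sum += s;
  double mean = sum / samples.size();

  double sqSum = 0.0;
  for (int s : samples) sqSum += (s - mean) * (s - mean);
  return { mean, std::sqrt(sqSum / samples.size()) };
}
```

Doing the arithmetic in code (or rechecking the spreadsheet against it) helps catch transcription errors of the kind noted below.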

I assert (without any statistical credentials) that the average error at 1200 mm for the left sensor is about +0.4%, and for the right sensor -0.5%. That seems much better than the 3% the data sheet suggests. Similarly, the average error at 200 mm for the left sensor is about -1%, and for the right sensor -3.8%. The most curious thing about this is that the error is larger at smaller distances. Another curious thing is that the left sensor has a positive error at long range and a negative error at short range.

I am not sure this work sheds any light on the problem, other than to suggest whatever I am seeing is real and not imagined.

I will do sampling at additional distances with the hope of deriving a correction approach. Per your suggestion, I’ll also switch the sensors and report the results.

First, I sadly report that the spreadsheet I used to calculate statistics did not behave well “automatically”. After forcing real numbers everywhere, at 1200 mm for the left sensor M=1205.3, D=4.4; for the right sensor M=1193.7, D=4.1 [also a transcription error here]. At 200 mm for the left sensor M=197.6, D=0.9; for the right sensor M=191.4 [also a transcription error here], D=1.06. Thus my error assertions become: the average error at 1200 mm for the left sensor is about +0.4%, and for the right sensor -0.5%; the average error at 200 mm for the left sensor is about -1.2%, and for the right sensor -4.3%.

Per my promise, I switched the sensors so the one that was on the left is on the right, and the one that was on the right is on the left. Once again I collected 400 samples at 200 mm and produced M and D. For the “new left” M=193.1, D=1.0, and for the “new right” M=196.7, D=0.9. So the average left error is -3.5% and for the right it is -1.7%.

So, the two sensors definitely behave differently for a reason that escapes me. The old left/new right is more accurate than the old right/new left. The old right/new left seems to operate outside of the specifications in the data sheet at short ranges. The precision of the two sensors appears roughly the same at all ranges.

Bottom line, I don’t know enough to say that one of the sensors is bad and the other is good, but one certainly seems to be outside of the specification.

One thing I wonder is whether it is possible that the face of the sensor can be “dirty” or something. The rover is only used indoors, but sat uncovered in a room for months. Is there a safe way to try to clean the sensor?

In the meantime, I will continue gathering data to try to “calibrate” the sensors so that I can use them to make somewhat reliable range measurements.

Thanks for doing those tests and looking into it more. Since it looks like the issue follows the sensor, it does seem like the performance of that particular unit is starting to push beyond the boundaries of the nominal specifications that ST claims. However, I am optimistic that you can close the performance gap between those sensors with the measurements you are taking and using to find an offset. (You might even try using an equation to determine the offset based on the approximate distance that you expect to measure.)
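As a sketch of that idea, a two-point linear correction fits actual = a·measured + b from the mean readings at two known distances (the helper names here are just illustrative; the example numbers are the right sensor’s means reported above):

```cpp
#include <cassert>
#include <cmath>

// Linear calibration: actual = a * measured + b.
struct LinearCal {
  double a;
  double b;
};

// Fit a line through two (measured, actual) calibration points,
// e.g. the mean readings taken at 200 mm and 1200 mm.
LinearCal fitTwoPoint(double meas1, double actual1,
                      double meas2, double actual2) {
  double a = (actual2 - actual1) / (meas2 - meas1);
  double b = actual1 - a * meas1;
  return { a, b };
}

// Apply the calibration to a raw reading (mm in, mm out).
double applyCal(const LinearCal& cal, double measured) {
  return cal.a * measured + cal.b;
}
```

With more calibration distances you could check whether a line is a good model at all, or whether a piecewise or higher-order correction is needed.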

As for cleaning, you might try using compressed air to blow any dirt off the lens, though we have not tried it and are not sure if that is a good solution. By the way, ST suggests adding your own protective cover glass. Note that if you decide to do that, you would typically need to recalibrate, which is something we have not really tried.