After purchasing some VL53L0X sensors, I have been testing them for accuracy. While the sensors are quite consistent in the values they return for any given distance, the values themselves are wrong, and not simply by a constant offset. If I calculate the ratio of mm/inches based on my data, I get approximately 27.3 mm/in, which is somewhat problematic, as the ratio should be 25.4.
I have tested two chips, with equivalent results.
I have performed the tests using the "high accuracy" profile. (This is not shown in the code below.)
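For context, the `readRangeSingleMillimeters()`/`setTimeout()` calls in the code below suggest the Pololu VL53L0X Arduino library is in use; in that library's examples, "high accuracy" means raising the measurement timing budget. A minimal sketch of that setup, assuming that library (hardware-dependent, so not directly testable here):

```cpp
#include <Wire.h>
#include <VL53L0X.h>

VL53L0X sensor;

void setup() {
    Wire.begin();
    sensor.init();
    sensor.setTimeout(50);
    // "High accuracy" per the library's examples: a 200 ms measurement
    // timing budget instead of the ~33 ms default.
    sensor.setMeasurementTimingBudget(200000); // microseconds
}
```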
Since the readings are very consistent, we could determine a reference point and adjustment ratio, and then apply a conversion after each reading to give us valid data, but that seems rather cumbersome, and I doubt it is what the chip manufacturer had in mind. I have to think that I've misunderstood or missed something?
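For what it's worth, the conversion workaround described above is only a couple of lines. A sketch using the ~27.3 mm/in slope and the ~32 mm offset implied by my raw data (both constants are my own fit to the table below, not anything from the datasheet):

```cpp
#include <cstdint>

// Hypothetical correction constants, fitted from the raw data:
// readings grow by ~27.3 mm per inch instead of 25.4, with a roughly
// constant ~32 mm offset at zero distance.
const float MEASURED_MM_PER_INCH = 27.3f;
const float RAW_OFFSET_MM = 32.0f;

// Map a raw reading (mm) onto the corrected distance (mm).
uint16_t correct_reading(uint16_t raw_mm) {
    float inches = (raw_mm - RAW_OFFSET_MM) / MEASURED_MM_PER_INCH;
    return (uint16_t)(inches * 25.4f + 0.5f); // round to nearest mm
}
```

This assumes the error really is linear over the whole range, which the data so far seems to support.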
// This is the pertinent code
// Yes, we are using multiple sensors, addressed one at a time through
// muxSelect(), so a single VL53L0X object is shared between them
VL53L0X sensor;

// SENSOR_TIMEOUT is 50
void initialize_sensors() {
    for (int i = 0; i < NUMBER_OF_SENSORS; i++) {
        muxSelect(i);
        delay(5);
        sensor.init();
        sensor.setTimeout(SENSOR_TIMEOUT);
    }
}
// NUMBER_OF_READINGS is 5
// We take the average of several readings to smooth the data out a bit
uint16_t take_sensor_reading(int index) {
    muxSelect(index);
    delay(5);
    // uint32_t avoids overflow on 16-bit-int platforms (e.g. AVR),
    // since several readings near the sensor's 8190 mm maximum would
    // exceed 32767
    uint32_t sum = 0;
    int count = 0;
    uint16_t value;
    while (count < NUMBER_OF_READINGS) {
        value = sensor.readRangeSingleMillimeters();
        if (sensor.timeoutOccurred()) {
            if (count == 0) {
                return 0;
            }
            return sum / count;
        }
        sum += value;
        count++;
    }
    return sum / count;
}
Raw data (readings in mm; S1 = sensor 1, S2 = sensor 2):

  S1     S2    inches
  141    131    4
  169    163    5
  222    214    7
  271    271    9
  332    325    11
  342    342    11.5
  360    354    12
  442    431    15