2022 10 23 Puck Status

I’ve been working hard this last week on making my time-of-flight sensors usable. These are VL53L0X chips.

There are 8 of them on Puck, a pair at each corner of the robot. Each is supposed to detect objects within 2 meters of itself and return the distance in millimeters to the closest object it sees. Here are some of the problems:

  • If there isn’t an object, such as a wall or table leg, within 2 meters, the sensor returns a very noisy result: sometimes 0 mm, sometimes 8,192 mm.
  • The sensor really is only good for, at best, about 1.2 meters, not 2 meters.
  • Depending on the surface of the obstacle, such as color and texture, the sensor may or may not see it.
  • Even if the robot is at a dead stop, meaning the obstacle is at a fixed distance, the distance reported can vary by 10 mm or more.
  • Each sensor reports a different distance when the obstacle is at, say, 100 mm away. This might be an offset error, but I need more data to characterize it (see the sketch after this list).
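
I don’t have the data to characterize that yet, but once I do, a per-sensor least-squares fit should separate an offset error from a gain error. A minimal sketch in Python, with made-up placeholder readings:

```python
import numpy as np

# Hypothetical calibration data for one sensor: true distances (mm) measured
# by hand, and the sensor's mean reported distance at each position. These
# numbers are placeholders, not real Puck data.
true_mm     = np.array([100.0, 200.0, 400.0, 800.0, 1200.0])
reported_mm = np.array([113.0, 214.0, 411.0, 808.0, 1215.0])

# Fit reported = gain * true + offset. A pure offset error shows up as
# gain close to 1.0 with a nonzero offset.
gain, offset = np.polyfit(true_mm, reported_mm, 1)
print(f"gain={gain:.3f}  offset={offset:.1f} mm")

def correct(raw_mm: float) -> float:
    """Map a raw reading back to an estimated true distance."""
    return (raw_mm - offset) / gain
```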

I’ve completely reprogrammed how the sensors are read by my custom computer, and I now get about 30 readings per second from each sensor. That is the maximum the device can produce, given that I want to detect objects that are over 1 meter away. This is working a treat, and I can start to harden that code, document it, and maybe even write an article about the tricks I use to do this (as the manufacturer’s default software won’t work this fast, sort of).
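
For the curious, the shape of the new read loop is roughly the round-robin poll below. This is only a sketch: `read_range_mm` is a hypothetical stand-in for the real I2C register read on my custom processor, and the 33 ms frame period is just 1/30 of a second.

```python
import random
import time

SENSOR_COUNT = 8
FRAME_PERIOD_S = 1.0 / 30.0  # ~33 ms per frame -> ~30 readings/s per sensor

def read_range_mm(sensor_id: int) -> int:
    # Hypothetical stand-in for the real I2C read of the sensor's result
    # register; returns fake data so the sketch runs on its own.
    return random.randint(30, 1300)

def poll_loop(on_frame, max_frames=None):
    """Collect one reading from every sensor each frame and hand the
    8-element frame to on_frame(), pacing the loop to the frame period."""
    frames = 0
    while max_frames is None or frames < max_frames:
        start = time.monotonic()
        on_frame([read_range_mm(s) for s in range(SENSOR_COUNT)])
        frames += 1
        # Sleep off the remainder of the frame so we don't ask the sensors
        # to range faster than their timing budget allows.
        time.sleep(max(0.0, FRAME_PERIOD_S - (time.monotonic() - start)))

poll_loop(print, max_frames=3)  # quick demo: prints three 8-sensor frames
```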

Now my problem is getting past the unreliability of the readings. The first level of cure is simple statistics: I’ll throw out readings that are obviously bogus, then average the remaining readings, or maybe compute a median value.
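
A minimal sketch of that first level in Python; the validity cutoffs (30 mm and 1,200 mm) and the window size are assumptions I’d tune against real data:

```python
from statistics import median

MIN_VALID_MM = 30     # readings at or near 0 are bogus
MAX_VALID_MM = 1200   # the sensor is only trustworthy to about 1.2 m anyway

def filtered_reading(raw_mm, window, window_size=5):
    """Drop obviously bogus readings, keep a short window of valid ones,
    and return the median of the window (None until it fills)."""
    if MIN_VALID_MM <= raw_mm <= MAX_VALID_MM:
        window.append(raw_mm)
        if len(window) > window_size:
            window.pop(0)
    return median(window) if len(window) == window_size else None
```

A median is attractive here because one stray 8,192 mm reading barely moves it, where it would wreck a plain average.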

To see whether this is working in a useful way, I’ve also spent days producing a visualizer of the data, written in Python. This code reads the 240 frames of data per second (8 sensors times 30 frames per second) and creates a window with 8 plots (one per sensor), currently showing a histogram of the last 20 readings for each sensor. So far, I’ve spent a huge amount of time just tweaking the plots: line widths, label font sizes, tick mark sizes, how many subdivisions of the sensor range to use for the histogram, how often to update the plots on the screen, and so on.
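
The heart of the visualizer boils down to something like the matplotlib sketch below: eight subplots, each redrawn from a 20-reading history. The bin count, the redraw interval, and the random numbers standing in for the real sensor feed are all placeholders.

```python
from collections import deque
import random

import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

SENSORS, WINDOW, BINS = 8, 20, 24   # 20-reading history; bin count is a guess

history = [deque(maxlen=WINDOW) for _ in range(SENSORS)]
fig, axes = plt.subplots(2, 4, figsize=(12, 6))

def update(_frame):
    for i, ax in enumerate(axes.flat):
        # Stand-in for the live 30 Hz feed from the robot.
        history[i].append(random.gauss(700, 15))
        ax.clear()
        ax.hist(history[i], bins=BINS, range=(0, 1300))
        ax.set_title(f"sensor {i}", fontsize=8)
        ax.tick_params(labelsize=6)

# Redraw a few times per second; redrawing all 8 plots at the full 30 Hz
# is more work than the eye needs.
anim = FuncAnimation(fig, update, interval=250, cache_frame_data=False)
plt.show()
```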

This mostly works now, but could still use some tweaking. Then I need to produce some secondary statistics, such as the variance in readings per sensor at different distances. I need to know, if a sensor says an obstacle is, say, 0.73 meters away, how accurate that is likely to be. And I need to know how many false readings each sensor is producing (usually zeros and 8,193 mm).
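
Here’s a sketch of those secondary statistics: bin the valid readings by distance, report the per-bin spread, and count the false readings. The 8,000 mm cutoff is my assumption for catching the 8,19x out-of-range codes:

```python
import numpy as np

OUT_OF_RANGE_MM = 8000  # assumed cutoff that catches the 8,19x codes

def sensor_stats(readings_mm, bin_width_mm=100):
    """For one sensor's raw readings: per-distance-bin standard deviation
    (mm) plus the fraction of readings that were false (zeros/out-of-range)."""
    r = np.asarray(readings_mm, dtype=float)
    false_mask = (r <= 0) | (r >= OUT_OF_RANGE_MM)
    valid = r[~false_mask]
    spread_by_bin = {}
    for lo in range(0, 1300, bin_width_mm):
        in_bin = valid[(valid >= lo) & (valid < lo + bin_width_mm)]
        if len(in_bin) >= 2:
            spread_by_bin[lo] = float(in_bin.std(ddof=1))
    return spread_by_bin, float(false_mask.mean())
```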

Later, the whole robot code base needs to do object tracking. So, if, say, a table leg shows up on the LIDAR at 3 meters away, as the robot moves I need to know that the reflected signal is still from that same table leg and not, say, from a person’s leg that momentarily crossed between the robot and the table leg. Then, as the robot gets closer to the table leg, I should see it get picked up by the SONAR or the time-of-flight sensors.
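
Data association is a deep topic, but the simplest scheme I have in mind is nearest-neighbor matching with a distance gate, sketched below. The 150 mm gate is a guess, and the (x, y) positions assume each range reading has already been converted into world-frame coordinates:

```python
import math

GATE_MM = 150  # largest plausible frame-to-frame movement; needs tuning

def associate(tracks, detections):
    """Greedily match new (x, y) detections to existing tracks by nearest
    neighbor. Returns (track_id, detection) pairs plus the leftover
    detections, which would seed new tracks."""
    pairs, unmatched = [], list(detections)
    for track_id, (tx, ty) in tracks.items():
        if not unmatched:
            break
        nearest = min(unmatched, key=lambda d: math.hypot(d[0] - tx, d[1] - ty))
        if math.hypot(nearest[0] - tx, nearest[1] - ty) <= GATE_MM:
            pairs.append((track_id, nearest))
            unmatched.remove(nearest)
    return pairs, unmatched
```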

Also, the robot needs to know if an object it sees is fixed to the ground, or is moving relative to the robot. And, if it is moving, the robot needs to predict its trajectory.
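
The simplest version of that is probably a constant-velocity model over a tracked object’s recent world-frame positions. A sketch, where the 50 mm/s stationary threshold is a number I’d have to tune:

```python
import math

STATIONARY_MM_S = 50.0  # assumed speed below which an object counts as fixed

def classify_and_predict(positions, dt_s, horizon_s=1.0):
    """positions: a tracked object's recent world-frame (x, y) fixes in mm,
    oldest first, one per frame of dt_s seconds (needs at least two).
    Estimates velocity from the endpoints, classifies fixed vs. moving,
    and extrapolates the position horizon_s seconds ahead."""
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    span_s = dt_s * (len(positions) - 1)
    vx, vy = (x1 - x0) / span_s, (y1 - y0) / span_s
    moving = math.hypot(vx, vy) > STATIONARY_MM_S
    predicted = (x1 + vx * horizon_s, y1 + vy * horizon_s)
    return moving, predicted
```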

Before I get object tracking to work, I may need some tricky programming to throw out the noise seen when there is no object within the 2 meter maximum detection range. This can probably be done by simply looking for “n” readings in a row that cluster around a reasonable value, compensating for the robot’s motion. This is another reason why I moved the code that moves the robot onto my custom processor rather than keeping it in the main processor: the custom processor now knows what the robot’s trajectory is, and it can adjust the time-of-flight sensor readings from one frame to the next according to the robot’s movement, to understand whether two consecutive readings from a sensor are reporting a distance to the same obstacle.
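
A sketch of that n-in-a-row test, assuming odometry can tell me roughly how much the robot closed on a sensor’s heading between frames; the run length and cluster tolerance are guesses to tune:

```python
N_IN_A_ROW = 3     # consecutive consistent readings before trusting a sensor
CLUSTER_MM = 40    # how tightly consecutive readings must agree; a guess

def is_real_obstacle(readings_mm, advance_mm_per_frame):
    """readings_mm: a sensor's recent raw readings, oldest first.
    advance_mm_per_frame: odometry's estimate of how much the robot closed
    on that sensor's heading each frame. A fixed obstacle's reading should
    shrink by about that much from one frame to the next."""
    if len(readings_mm) < N_IN_A_ROW:
        return False
    recent = readings_mm[-N_IN_A_ROW:]
    for prev, cur in zip(recent, recent[1:]):
        expected = prev - advance_mm_per_frame
        if abs(cur - expected) > CLUSTER_MM:
            return False
    return True
```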

That is, each sensor that is detecting distances to things around the robot needs to understand what it is that it is seeing.
