S Goals

S is the name of my current robot. It is an evolutionary step towards building a device that will act as a personal assistive device as I age.

There is unlikely to be an affordable service to provide for my care needs as I age, such as doing household chores and personal chores. So my best bet seems to be to invest in trying to invent a device that will do as much as I can make it do with the time and money I have available. I don’t expect a robot to be able to do everything I would like, but enumerating some possible, valuable tasks provides a vision to guide my invention. Tasks that would be useful include:

  • The usual kind of smart table device, able to transport objects from one place to me or from me to some other place.
  • The addition of some sort of gripper mechanism would enable the robot to actually pick something up and place it on the smart table, or remove something from the smart table and put it somewhere else. Sample scenarios might include:
    • Fetch me a beer.
    • Get my mail.
    • Put my shoes away.
    • Clean the table and put the dishes into the dishwasher.
  • Really ambitious gripper-based goals might be:
    • Take out the trash.
    • Take the clothes from the dryer, fold them, and put them in the dresser.
  • Patrol my house interior and report to me issues I should know about. Sample scenarios might include:
    • Look for objects strewn on the floor.
    • Look for objects that have moved from their usual location.
    • Look for unlocked or open doors and windows.
    • Look for water on the floor.
    • Look for smoke.
    • Look for unexpected people.
    • Look for unexpected temperatures.
  • Stretch goals would include:
    • Clean the toilet.
    • Clean the floor.
    • Dust the objects on shelves.
    • Change a lightbulb.
    • Notice that I am not responsive and call for help.

To perform any of those goals, I envision a slow evolution of capabilities for the robot. I have built several robots in the last few years, each one being built upon what I learned from previous robots and each one having a name starting with the next letter of the alphabet after the previous robot. This robot is called S and the previous robot is called Raven, giving you an idea of its evolutionary heritage. I will rename S at some point with a better name.

At the least, I expect S to be able to reliably move from where it is to an indicated spot in the house. Adding a gripper may involve significant mechanical redesign after required object weights and torques are determined. Designing a robot that only needs to fetch a beer is quite different from designing a robot that can load a dishwasher, which is different from designing a robot that can pick me up from a chair. Sensor requirements will evolve as needs evolve. It is expected that mechanical needs will drive each rebuild of the robot more than software needs. Requirements like faster networks or better network coverage throughout the house are generally solved with minor changes to the robot. Requirements to move a weight at the end of an arm without the robot tipping over are likely to be the dominant mechanical concerns.

Of equal concern is the safety and trustworthiness of the robot. The predecessor to S, namely Raven, was largely an experiment in providing some safety features and providing some sense of trust. My robot will be useless to me if I cannot trust it to avoid smashing into my rather expensive glass walls, damaging my one piece of expensive furniture, or dropping some hot liquid on me.

Some safety and trustworthiness issues for the robot include:

  • The robot should be able to reliably boot up and shutdown.
  • The robot should be able to reliably detect obstacles to its movement and avoid bumping into furniture, walls, people and so on.
  • The robot should be able to reliably detect that power is going to be depleted and perform an action to either recharge itself or put itself into a safe state so that it can be recharged.
  • The robot should fail safe in most common failure situations, including:
    • Any sensor fails.
    • Any communication path fails, such as a signal wire, the WiFi network or the Ethernet network.
    • The battery voltage nears some low discharge voltage.
    • The battery voltage is above some safety margin.
    • The wheels get entangled with wires or other debris.
    • One or more of the power converters fail.
    • A motor draws too much current.
    • A motor rotates unexpectedly fast or slow.
    • Any part of the robot overheats.
    • The robot’s intended path is blocked with no easy way to get unblocked.
    • The robot ends up in a pose that prevents normal operation, such as falling over, or a wheel falling off.
    • One or more of the control devices no longer communicates as expected, such as a CPU crash or a device getting fried.
    • An independent (not the main CPU) analysis of the proximity sensors detects an imminent collision of the robot with some object, so the robot gets an emergency stop signal until some person can intervene.

At a minimum, S is expected to be significantly improved over its predecessor in its reliability (it fails seldom), robustness (it can fail safely) and safety (it can be trusted not to do bad things), and be able to autonomously move to any place in the house that doesn’t require pushing the boundaries in an unreasonable way, such as needing to pass through an impossibly narrow passage or pass over an obstacle that would raise one of the wheels so that it could no longer contact the floor.

Presenting S

Here is S as it exists as of today.

The frame is made from 2020 aluminum extrusions and aluminum plates. The open frame design allows me to somewhat easily get at the internal wiring, which still changes from time to time. There are a pair of 8 inch wheels (200 mm) mounted under the centerline of the bottom plate. A pair of caster wheels underneath prevent the robot from tilting too far forward or backward. 

A large LiFePO4 battery is placed at the back of the bottom plate to weight the robot so that it sits on the back caster, tilting the robot slightly. The front caster provides just enough clearance to pass over floor rugs in the house without causing either drive wheel to lift off the ground. The use of two casters is intended to help stabilize the robot when a gripper is attached.

The power plate you see above, with the terminal barrier strip, holds nearly all of the DC to DC power converters and provides good ventilation, though the power converters never really warm up.

Those gray rectangles at the corners of the bottom frame hold time of flight sensors. Those black rectangles in the center of the crossbars in the middle of the frame hold SONAR sensors.

The Bottom Plate

The robot consists of a bottom plate holding power components and sensors, and a top plate holding computation and communication components. Eventually another plate could be attached above the current top plate to hold an arm and a gripper.

To the right you can see the thick, vertical aluminum plate holding three DC to DC power converters and a terminal barrier strip (two of the converters can be seen from a different angle in a previous picture). The thick, vertical plate in the center holds two custom PC boards I made which are Teensy Monitor boards, and the RoboClaw motor controller. To the left you see the top of the power panel which has the on/emergency-off power switch, a power meter, a touch screen and a few other components. The blue rectangle in the back is the battery that powers the robot.

The Top Plate

In this view of the top plate, you can see the pair of WiFi antennas to the left, then a USB hub to the right of that and then a plate holding an HDMI and USB breakout from the main computer. The plate allows me to hook up a monitor and keyboard/mouse to more easily program the main computer, which is an AMD 3700X 8-core computer on a Mini-ITX motherboard with 32GB of RAM and 512GB of NVME disk. Next to the HDMI/USB breakout are PC panel breakouts, such as the reset switch and power/disk activity lights.

Above the WiFi antennas, showing red and black wire connections, are a pair of Anderson Powerpole breakout boxes that distribute 5 volt and 12 volt power from the bottom plate to the top plate. The black cube in the upper right is an LD06 LIDAR. I previously used a pair of LIDARs, mounted at two corners of the robot at different heights, but currently I’m only using a single LIDAR mounted higher up.

The picture below gives a better view of the main computer and also the NVIDIA GeForce GT 1030 graphics card. To the right you see a pair of OAK-D stereo RGB and depth camera modules mounted on a plate that positions them 90 degrees apart, giving the robot a horizontal field of view of nearly 180 degrees.

Below the OAK-D cameras but not visible in the picture is an Intel Realsense T265 visual odometry camera. Currently I’ve given up trying to keep Intel’s software working on the Ubuntu 20.04 operating system. Intel’s software seems to frequently break when I accept software updates. Eventually, I may take the time to fix their software again, but it’s not needed for now.

The picture below shows the other side of the top plate. The rat’s nest of white cables provides the needed power to the Mini-ITX motherboard. Obscured under the white wires is a small silver box which is the final DC to DC power converter that converts 12 volts to 3.3 volts for the motherboard.

The Power Control Panel

The panel includes the on/emergency-off power switch, a 30 amp fuse and a meter showing the battery voltage, current draw and power draw as well as the battery charge. The battery temperature sensor is not hooked up to the battery at this time.

The top left holds a touch screen which is talked about elsewhere. Below that is a power cutoff switch for the RoboClaw motor controller, allowing me to run experiments, such as path planning, without any chance of the robot actually moving. The two red push-button switches below the motor power switch can be used to reset either of the two custom Teensy Monitor boards, so I don’t have to reach my hand behind this panel if either of the Teensy 4.1 microcomputers fails to program correctly. Finally, there is a panel mount for a pair of Anderson Powerpole connectors to provide the 42 volts used to charge the battery. Eventually the 42 volts will be supplied via some sort of docking connector.

The Motors

The motors and wheel encoders are mounted underneath the bottom plate. The motors drive a right angle gearbox, allowing a better mounting position for the motors and offloading the weight of the robot from the motor shaft. Glued to each motor housing is an analog temperature sensor so the Teensy Monitor can check for motor overheating.

Cabling

Where I remembered, I used shrink-wrap tubular markers to label the various cables as I built them. Where I didn’t remember, I added wraparound labels to the cables after the connectors were added to the ends of the cables. I used Lever Nuts to connect high current wires. An example is the gray connector with the orange lever sticking out. Since there are 4 SONAR cables and 8 time of flight cables (as well as a few other non-power cables), I also crudely colored the cables with felt tip markers so they stand out more easily when I’m trying to find where a cable is routed through a bundle of cables.

All of the power distribution cabling is documented using the WireViz software. An example of the documentation is in another paper. I try to keep the documentation up to date as I make changes. It lets me see exactly where each cable end attaches, and even lets me include pictures of the connectors or other devices.
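For readers unfamiliar with WireViz, its input is a plain YAML file describing connectors, cables, and the connections between them, from which it renders the diagram. The following is only a rough, hypothetical sketch of that format, not one of my actual harness files, and attribute names can vary between WireViz versions:

connectors:
  BATTERY:
    type: Anderson Powerpole
    pincount: 2
    pinlabels: [VBAT+, VBAT-]
  POWER_PANEL:
    type: Terminal barrier strip
    pincount: 2
    pinlabels: [VBAT+, VBAT-]

cables:
  W1:
    wirecount: 2
    gauge: 12 AWG
    colors: [RD, BK]

connections:
  -
    - BATTERY: [1-2]
    - W1: [1-2]
    - POWER_PANEL: [1-2]

Running WireViz on a file like this produces the kind of diagram shown in the Wiring Diagrams section at the end of this document.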

Long cables are generally attached to the frame with self adhesive cable zip tie mounts and zip ties, similar to the following.

Custom PC Board for the RoboClaw

I’ve had many failures with the Dupont style connectors for signal cables to the RoboClaw itself. These connectors don’t offer much friction and fall off rather easily on a vibrating robot. In order to provide a more reliable connection, I designed a simple PC board that connects to all of the terminal pins of the RoboClaw board, providing more friction because of the number of pins simultaneously connected to a single, rigid PC board. It also gives me locking connectors for the signals from the two motor encoders and breaks out test points for various signals so I can test for electrical noise and provide shielding as needed. I still need to provide an even more robust connection in the future, but this works well enough for now.

Robot Footprints in Nav2 for ROS 2

Don’t forget to include a footprint in your Navigation 2 (Nav2) configuration file so that the local and global cost maps can properly account for the outline of the robot when it is attempting to avoid collisions. You need a footprint for both the local and global cost maps. My cost map parameters are stored in a file called “nav2_params.yaml”, and within it there is a section for “local_costmap” and one for “global_costmap”. Here is a snippet from the configuration file showing the footprint line for the local cost map for my robot.

local_costmap:
  local_costmap:
    ros__parameters:
      footprint: "[[-0.450, -0.215], [-0.450, 0.215], [0.235, 0.215], [0.235, -0.215]]"

And there is a similar “footprint” line in the global_costmap section.

The footprint value must be enclosed in quotes. Inside the outer pair of square brackets is a list of coordinate pairs; each pair is an x and a y value separated by a comma and enclosed in its own square brackets, and the bracketed pairs are separated by commas. Each x and y coordinate is a point in the base_link coordinate frame of the robot.

You don’t need to add a final coordinate pair to close the outline; the closing segment is assumed. So in the above example, a line segment from the last pair [0.235, -0.215] back to the first pair [-0.450, -0.215] is assumed.

If you’d rather, you can declare a circular outline for your robot instead using, e.g.,

robot_radius: 0.25 # The outline is a circle with radius 0.25 meters

You don’t have to use the same footprint list or radius for both the local and global cost maps. You might, for instance, use a footprint list for the local cost map but a radius for the global cost map. The article referenced at the end of this post has a discussion about this.
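For example, a hypothetical mix of the two styles might look like the following (the footprint values are my robot’s from above; the radius is just an illustrative number):

local_costmap:
  local_costmap:
    ros__parameters:
      footprint: "[[-0.450, -0.215], [-0.450, 0.215], [0.235, 0.215], [0.235, -0.215]]"

global_costmap:
  global_costmap:
    ros__parameters:
      robot_radius: 0.25 # circle approximation for the global cost map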

Here is what the above footprint outline looks like in rviz2 when visualizing my robot (which is also showing LIDAR, SONAR and time-of-flight sensors and the local cost map). The lime-green square box in the center is the footprint.

See Setting Up The Robot’s Footprint for more information.

Adding SONAR-like sensors to Nav2

A typical Navigation 2 (Nav2) setup under ROS 2 uses just LIDAR for generating cost maps. It is also possible to include SONAR-like sensors, that is, sensors that publish a Range type of message, in your cost maps. The process is easy.

Edit your YAML parameter configuration file for Nav2 as follows. In the “local_costmap” section, you will find a line that lists the plugins to be used to generate the local costmap. Add a “range_layer” to the list. Following the plugins line, you should see, e.g., sections describing the obstacle_layer and inflation_layer — that is, sections corresponding to the names in the plugins list. You will need to add a new section describing the range_layer.

Note that the names obstacle_layer, inflation_layer, range_layer, etc. in the plugins list are arbitrary; they don’t actually make a layer become, say, an obstacle layer. It’s the sub-parameters you supply in each section that make the layer do what it does.

local_costmap:
  local_costmap:
    ros__parameters:
      plugins: ["obstacle_layer", "inflation_layer", "range_layer"]
      range_layer:
        plugin: "nav2_costmap_2d::RangeSensorLayer"
        topics:
          [
            "/sonar0Sensor",
            "/sonar1Sensor",
            "/sonar2Sensor",
            "/sonar3Sensor", 
            "/tof0Sensor",
            "/tof1Sensor",
            "/tof2Sensor",
            "/tof3Sensor",
            "/tof4Sensor",
            "/tof5Sensor",
            "/tof6Sensor",
            "/tof7Sensor",
          ]
      inflation_layer:
        .......rest of local_costmap YAML.....

Just like the section describing, say, the inflation_layer, you will add a new range_layer section by repeating the name given in the plugins list, followed by a colon. The first entry should describe the class name that will handle the data type. In the case of Range type values, the line should say:

plugin: "nav2_costmap_2d::RangeSensorLayer"

Then add a topics entry: the word topics followed by a colon, then either a single quoted topic name or a list of quoted topic names separated by commas and enclosed in square brackets.

That’s it. If you bring up the nav stack, you should now see that when you wave your hand in front of a SONAR sensor, a new obstacle shows up in the local costmap.

There are a couple of caveats to consider. The published topic containing the Range message needs to have a valid frame_id in the header. And that frame_id needs to be described in the URDF for the robot, which, of course, needs to be published so that the transform system in ROS knows how to relate the Range value to the coordinate frames of the robot.

And the time stamp in the header needs to be accurate and timely. This could especially be a problem if you are, say, publishing the data from an Arduino microcontroller, which may not have a notion of the time used by the rest of the ROS components. If you are publishing using microROS, there are ways to synchronize your Arduino with the main ROS system.
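To make those two caveats concrete, here is a minimal, hypothetical rclcpp publisher sketch showing where the frame_id and time stamp go in a Range message. The topic and frame names match my robot’s conventions, and the fixed distance is obviously a placeholder for a real sensor reading.

#include <chrono>
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/range.hpp>

// Minimal sketch: publish a Range reading with a valid frame_id and a timely
// stamp so Nav2's RangeSensorLayer can transform and trust it.
class RangePublisher : public rclcpp::Node {
 public:
  RangePublisher() : Node("range_publisher_sketch") {
    pub_ = create_publisher<sensor_msgs::msg::Range>("/sonar0Sensor", 10);
    timer_ = create_wall_timer(std::chrono::milliseconds(33),
                               [this]() { publishReading(0.42f); });
  }

 private:
  void publishReading(float range_m) {
    sensor_msgs::msg::Range msg;
    msg.header.stamp = now();         // must be current ROS time
    msg.header.frame_id = "sonar_0";  // must exist in the published URDF/TF tree
    msg.radiation_type = sensor_msgs::msg::Range::ULTRASOUND;
    msg.field_of_view = 0.26f;        // radians, sensor dependent
    msg.min_range = 0.02f;            // meters
    msg.max_range = 2.0f;             // meters
    msg.range = range_m;              // placeholder for a real measurement
    pub_->publish(msg);
  }

  rclcpp::Publisher<sensor_msgs::msg::Range>::SharedPtr pub_;
  rclcpp::TimerBase::SharedPtr timer_;
};

int main(int argc, char* argv[]) {
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<RangePublisher>());
  rclcpp::shutdown();
  return 0;
}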

There are a few additional parameters besides plugin and topics that you can provide for the range_layer. They are described here.

2022 10 23 Puck Status

I’ve been working hard this last week on making my time-of-flight sensors usable. These are VL53L0X chips.

There are 8 of them on Puck, a pair at each corner of the robot. Each is supposed to detect objects within 2 meters of itself and return the distance in millimeters to the closest object it sees. Here are some of the problems:

  • If there isn’t an object, such as a wall or table leg, within 2 meters, the sensor returns a very noisy result. Sometimes 0 mm, sometimes 8,192 mm.
  • The sensor really is only good for, at best, about 1.2 meters, not 2 meters.
  • Depending on the surface of the obstacle, such as color and texture, the sensor may or may not see it.
  • Even if the robot is at a dead stop, meaning the obstacle is at a fixed distance, the distance reported can vary by 10mm or more.
  • Each sensor reports a different distance when the obstacle is at, say, 100mm away. This might be an offset error, but I need more data to characterize it.

I’ve completely reprogrammed how the sensors are read by my custom computer, and I now get about 30 readings per second for each sensor, which is the maximum the device can produce given that I want to detect objects that are over 1 meter away. This is working a treat and I can start to harden that code, document it, and maybe even write an article about the tricks I use to do this (as the manufacturer’s default software won’t work this fast, sort of).
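This is not my actual Teensy code, but as a sketch of the basic timing trade-off, here is how continuous ranging at roughly 30 readings per second can be configured with the Pololu VL53L0X Arduino library, assuming a single sensor. Running eight sensors on one bus additionally requires giving each one a unique I2C address, which is omitted here.

#include <Wire.h>
#include <VL53L0X.h>  // Pololu VL53L0X Arduino library

VL53L0X sensor;

void setup() {
  Serial.begin(115200);
  Wire.begin();

  sensor.setTimeout(100);
  if (!sensor.init()) {
    Serial.println("VL53L0X init failed");
    while (true) {}
  }

  // A ~33 ms measurement timing budget yields roughly 30 readings per second.
  sensor.setMeasurementTimingBudget(33000);

  // Back-to-back continuous ranging: a new measurement starts as soon as the
  // previous one completes.
  sensor.startContinuous();
}

void loop() {
  uint16_t mm = sensor.readRangeContinuousMillimeters();
  if (sensor.timeoutOccurred()) {
    Serial.println("timeout");
  } else {
    Serial.println(mm);  // distance in millimeters; out-of-range reads are noisy
  }
}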

Now my problem is getting past the unreliability of the readings. The first level of cure is simple statistics: I’ll throw out readings that are obviously bogus, then do some averaging of the remaining readings, or maybe compute a median value.
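As a sketch of that first level of filtering, here is a small median-over-a-window filter that discards the obviously bogus sentinel readings. The window size and the 0/8,19x mm sentinels come from the symptoms described above; the class itself is hypothetical, not my production code.

#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <deque>
#include <optional>
#include <vector>

// Keep a short window of recent readings for one sensor, discard obviously
// bogus values, and report the median of what remains.
class RangeFilter {
 public:
  explicit RangeFilter(std::size_t window = 20) : window_(window) {}

  // Returns the filtered distance in mm, or nothing if too few valid readings.
  std::optional<uint16_t> update(uint16_t raw_mm) {
    const bool bogus = (raw_mm == 0) || (raw_mm >= 8190);  // sentinel readings
    if (!bogus) {
      readings_.push_back(raw_mm);
      if (readings_.size() > window_) readings_.pop_front();
    }
    if (readings_.size() < window_ / 2) return std::nullopt;

    std::vector<uint16_t> sorted(readings_.begin(), readings_.end());
    std::nth_element(sorted.begin(), sorted.begin() + sorted.size() / 2,
                     sorted.end());
    return sorted[sorted.size() / 2];
  }

 private:
  std::size_t window_;
  std::deque<uint16_t> readings_;
};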

To see if this is working in a useful way, I’ve also spent days working on producing a visualizer of the data, written in Python. This code reads the 240 frames of data per second (8 sensors times 30 frames per second) and creates a window with 8 plots of data (one for each sensor), currently showing the histogram of the last 20 readings per sensor. So far, I’ve spent a huge amount of time just tweaking the plots, such as line widths, label font sizes, tick mark sizes, how many subdivisions of the sensor range to use for the histogram, how often to update the plots on the screen, and so on.

This mostly works now, but could still use some tweaking. Then I need to provide some secondary statistics, such as computing the variance in readings per sensor at different distances. I need to know, if a sensor says an obstacle is, say, 0.73 meters away, how accurate that is likely to be. And I need to know how many false readings each sensor is producing (usually zeros and 8,193 mm).

Later, the whole robot code base needs to do object tracking. So, if, say, a table leg shows up on the LIDAR at 3 meters away, as the robot moves I need to know that the reflected signal is still from that same table leg and not, say, from a person’s leg that momentarily crossed between the robot and the table leg. Then, as the robot gets closer to the table leg, I should see it get picked up by the SONAR or the time-of-flight sensors.

Also, the robot needs to know if an object it sees is fixed to the ground, or is moving relative to the robot. And, if it is moving, the robot needs to predict its trajectory.

Before I get object tracking to work, I may need some tricky programming to throw out the noise seen when there is no object within the 2 meter maximum detection range. This can probably be done by simply looking to see if there are “n” readings in a row that cluster around a reasonable value, compensating for the robot’s motion; a sketch of such a check appears after the next paragraph. This is another reason why I moved the code that moves the robot onto my custom processor rather than keeping it in the main processor. The custom processor now knows what the robot’s trajectory is, so it can adjust the time-of-flight sensor readings from one frame to the next according to the robot’s movement to understand whether two consecutive readings from a sensor are reporting a distance to the same obstacle.

That is, each sensor that is detecting distances to things around the robot needs to understand what it is seeing.
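As a rough sketch of the consecutive-readings check described above, here is one way to accept a reading only when the last n readings, shifted by the robot’s motion between frames, cluster together. The class, the tolerance, and the motion-compensation convention are all my own assumptions, not the code running on the Teensy.

#include <algorithm>
#include <cstddef>
#include <deque>

// Accept a reading only if the last n readings, adjusted for how far the robot
// moved along the sensor axis between frames, agree within a tolerance.
class ConsistencyCheck {
 public:
  ConsistencyCheck(std::size_t n, float tolerance_mm)
      : n_(n), tolerance_mm_(tolerance_mm) {}

  // raw_mm: latest reading; delta_mm: robot motion toward the obstacle since
  // the previous reading (negative when moving away).
  bool accept(float raw_mm, float delta_mm) {
    // Shift the older readings into the current frame of reference.
    for (auto& r : history_) r -= delta_mm;
    history_.push_back(raw_mm);
    if (history_.size() > n_) history_.pop_front();
    if (history_.size() < n_) return false;

    float lo = history_.front();
    float hi = history_.front();
    for (float r : history_) {
      lo = std::min(lo, r);
      hi = std::max(hi, r);
    }
    return (hi - lo) <= tolerance_mm_;
  }

 private:
  std::size_t n_;
  float tolerance_mm_;
  std::deque<float> history_;
};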

Thinking of Behavior Trees v2

This is a complete replacement of a previous blog post. At the very end of this article you will find the small bit of XML code which defines the behavior tree described in this article.

I’m trying to develop a skill in using behavior trees as the logic to move Puck. As part of that learning, I have developed a behavior tree that attempts to move Puck so that one of its edges is within 6 inches (0.1254 meters) of either a left or right wall in my hallway. The basic strategy will be, if Puck is not already near a wall, to turn left and drive up to some “maximum distance” looking for a wall ahead, then turn right so that it is heading in the original direction. If a wall wasn’t found to the left, Puck will then turn right and search for up to twice that maximum distance for a wall and then turn left to be, again, facing in the original direction. The reason to travel up to twice the original maximum distance this second time is that the robot needs to move “maximum distance” just to get back to where it was when it first began searching to the left.

Here is my first attempt at such a behavior tree.

FindWall behavior tree

Assuming you are not well-versed in behavior trees, let me try to explain what the tree represents.

First, behavior trees consist of a tree of nodes. Interpretation (execution) of the behavior tree begins by “ticking” the top node in the tree and, depending on the kind of node, that node may “tick” one or more of the child nodes, recursively. A ticked node returns a status of either RUNNING, SUCCESS or FAILURE. The ticked status of a child node is used to decide what a particular node does next. I hope this will be clearer as I describe the above tree. Also, look at the link at the beginning for a more complete description of how behavior trees work.

Behavior tree nodes can communicate with each other via a “blackboard”, which is a simple key/value store. There are various ways to write the value of a key name into the blackboard, and nodes can get the value of any key name. Not delved into here, there is a way of remapping key/values between nodes so that nodes can be written as more generic pieces of code, taking parameters.

The robot will begin execution at the Root node, which will then attempt to tick the IfThenElse node. This node begins by ticking the first child node and, if its status is SUCCESS, it executes the second child node; otherwise it executes the third child node. IfThenElse returns as its status the status of either the second or third child, whichever one was chosen.

The first child of the IfThenElse node is a FoundLeftOrRightWall node, a custom node I crafted for Puck. FoundLeftOrRightWall works by reading the various sensors of Puck to find the nearest distance from the edge of the robot to the left and right walls near the robot. FoundLeftOrRightWall takes a parameter called “within_distance”, which was given a hard-coded value of 0.1254 meters (6 inches) for this node as shown. If either the left or right wall is less than or equal to 0.1254 meters away from the edge of the robot, FoundLeftOrRightWall returns SUCCESS; otherwise it returns FAILURE.
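For readers curious how such a custom node is written, here is a minimal sketch of what a node like FoundLeftOrRightWall could look like using the BehaviorTree.CPP (v3) library; the sensor query is a hypothetical stand-in, not Puck’s actual implementation.

#include <behaviortree_cpp_v3/behavior_tree.h>

// Sketch of a custom node with a "within_distance" input port, as used in the
// tree above. The actual sensor query on Puck is more involved.
class FoundLeftOrRightWall : public BT::SyncActionNode {
 public:
  FoundLeftOrRightWall(const std::string& name, const BT::NodeConfiguration& config)
      : BT::SyncActionNode(name, config) {}

  // Declares the parameter so it can be supplied from the XML.
  static BT::PortsList providedPorts() {
    return {BT::InputPort<double>("within_distance")};
  }

  BT::NodeStatus tick() override {
    double within_distance = 0.0;
    if (!getInput("within_distance", within_distance)) {
      return BT::NodeStatus::FAILURE;  // parameter missing in the XML
    }
    // Hypothetical stand-in for reading Puck's proximity sensors.
    const double nearest_wall = nearestLeftOrRightWallDistance();
    return (nearest_wall <= within_distance) ? BT::NodeStatus::SUCCESS
                                             : BT::NodeStatus::FAILURE;
  }

 private:
  double nearestLeftOrRightWallDistance();  // hypothetical sensor query
};

The node is then registered with the BehaviorTreeFactory under the same name used in the XML (for example, factory.registerNodeType<FoundLeftOrRightWall>("FoundLeftOrRightWall")) so that the XML element resolves to this class.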

If Puck is already within 6 inches of the left or right wall, the second child, AlwaysSuccess, is executed, which, as you might guess, always returns SUCCESS. That status is propagated to the IfThenElse node, which itself propagates SUCCESS to the root node, and we’re done. This whole behavior tree is attempting to position the robot within 6 inches of a left or right wall; if Puck is already there, nothing additional needs to be done.

If Puck is not within 6 inches of a wall to start, it needs to get there, so instead of the AlwaysSuccess being ticked, Sequence will be ticked.

A Sequence can have any number of child nodes; here they are RetryUntilSuccesful and IfThenElse. A Sequence node begins by putting itself into the RUNNING state. It then ticks the first child node. If that node returns FAILURE, the Sequence stops and returns FAILURE itself, which is propagated to the parent node (IfThenElse in this case). If the child returns RUNNING, then Sequence will just keep ticking that child until either SUCCESS or FAILURE is returned. If SUCCESS is returned, Sequence goes on to the next child node and continues until there are no more child nodes, at which point Sequence itself returns SUCCESS, because all of the child nodes were successful.

Sequence, above, will then begin by ticking the RetryUntilSuccesful node. In the picture above, “FindWallToLeft” is just a name I gave to the node so I have an idea of what the node is trying to accomplish.

The way a RetryUntilSuccesful node works is it will attempt to execute the child node (an IfThenElse node here) up to “num_attempts” (hard coded as 3 attempts here) times. If the child node succeeds during any of those 3 attempts, RetryUntilSuccesful will stop ticking the child node and return SUCCESS to the parent node (Sequence in this case).

So the RetryUntilSuccesful is going to make up to 3 attempts to find the left wall. It makes an attempt by ticking the IfThenElse node which tests to see if FoundLeftOrRightWall sees a wall within 0.1254 meters. If so, AlwaysSuccess is returned else a subtree will be executed called FindWallInDirection. That subtree will be described below but basically it attempts to find a wall in the given direction by turning in the indicated direction, moving up to some maximum distance looking for a wall, and then turning back to the original direction.

You should see, then, that the RetryUntilSuccesful node is going to make up to 3 attempts to find a wall to the left of the original orientation of Puck. If it finds a wall, it returns SUCCESS to Sequence, which will then go on to its second child, another IfThenElse.

And I can see now that this is wrong. Remember that the way Sequence works is that it will sequentially execute each child node as long as each child node returns SUCCESS. If RetryUntilSuccesful returns FAILURE because it didn’t find a wall to the left, then Sequence stops ticking the child nodes and itself returns FAILURE.

I won’t correct the tree diagram in this post, but it shows how using the graphical representation of a tree, and trying to explain it to someone, can uncover errors in the behavior tree. The fix is to insert a node between Sequence and RetryUntilSuccesful that ignores the status of RetryUntilSuccesful and always returns SUCCESS.

Going on to the second child of Sequence, it is going to use IfThenElse to see if the first child found a wall to the left and, if not, will use a similar RetryUntilSuccesful child node to search to the right. Note that this second RetryUntilSuccesful node, which I named “FindWallToRight”, will make up to 6 attempts to find a wall to the right whereas the first RetryUntilSuccesful only made up to 3 attempts. That is because we may need 3 attempts just to get back to the original position of Puck when we started looking for a wall, and then we make up to 3 more attempts to find a wall to the right.

There is no error here by not including a forced success parent node for the second RetryUntilSuccesful node. If we fail to find a wall to the right, the whole tree needs to return FAILURE.

Now I’ll describe the FindWallInDirection subtree which is reused in two places via the parent behavior tree. This allows function-like behavior without having to code a common bit of tree behavior in more than one place.

FindWallInDirection subtree

FindWallInDirection is going to execute a sequence of two child nodes. There is an error in this subtree, which I’ll explain as we go along.

The first child node of Sequence is a Switch2 node. This node examines a blackboard variable, “rotate_first_in_direction”, and will choose one of the child nodes depending on which of the “case_1” or “case_2” values matches the “rotate_first_in_direction” value. If the value is “left”, the first child will be ticked, which is a custom node I created called Turn that will turn 90 degrees, a turn to the left. If the value is “right”, the second child will be ticked, which will Turn -90 degrees, a turn to the right. If the value is neither “left” nor “right”, the third child, AlwaysSuccess, is ticked, so no turn is made at the start.

The purpose of FindWallInDirection is to make an initial turn, attempt to find a wall ahead of the robot within some maximum distance, and then reverse the turn so the robot is pointing back in the original orientation. What is missing in this subtree is the action that makes that restoring turn. Again, this shows the value of creating a graphical representation of the behavior tree and then trying to explain it to someone. I will not attempt to fix the tree in this blog post.

After the initial turn is made, the Sequence node executes the second child, a RetryUntilSuccesful node which is going to make up to 6 attempts to move forward a bit and see if a wall is found ahead of the robot. It does so by executing an IfThenElse node which starts by testing whether a wall is found ahead of the robot via the custom node I created, FrontWallFound, which takes a “within_distance” parameter. If the front wall is near, rather than just returning SUCCESS, I chose to tick the SaySomething node to emit a message to the log, which also returns SUCCESS as a side effect. If a wall isn’t found ahead of the robot, the custom node I wrote, MoveForward, is ticked to move the robot ahead 0.0762 meters (3 inches).

Here is the XML code file used in this article:

<root main_tree_to_execute="FindWall">
  <BehaviorTree ID="FindWallInDirection">
    <Sequence>
      <Switch2 variable="{rotate_first_in_direction}" case_1="left" case_2="right">
        <Turn degrees="90" />
        <Turn degrees="-90" />
        <AlwaysSuccess />
      </Switch2>
      <RetryUntilSuccesful num_attempts="6" name="MoveForward3Inches">
        <IfThenElse>
          <FrontWallFound within_distance="0.1524" />
          <SaySomething message="MoveUntilWallFound success" />
          <MoveForward distance="0.0762" />
        </IfThenElse>
      </RetryUntilSuccesful>
    </Sequence>
  </BehaviorTree>

  <BehaviorTree ID="FindWall">
    <IfThenElse>
      <FoundLeftOrRightWall within_distance="0.1254" />
      <AlwaysSuccess />
      <Sequence>
        <RetryUntilSuccesful num_attempts="3" name="FindWallToLeft">
          <IfThenElse>
            <FoundLeftOrRightWall within_distance="0.1254" />
            <AlwaysSuccess />
            <SubTree ID="FindWallInDirection" rotate_first_in_direction="left" />
          </IfThenElse>
        </RetryUntilSuccesful>
        <IfThenElse>
          <FoundLeftOrRightWall within_distance="0.1254" />
          <AlwaysSuccess />
          <RetryUntilSuccesful num_attempts="6" name="FindWallToRight">
            <IfThenElse>
              <FoundLeftOrRightWall within_distance="0.1254" />
              <AlwaysSuccess />
              <SubTree ID="FindWallInDirection" rotate_first_in_direction="right" />
            </IfThenElse>
          </RetryUntilSuccesful>
        </IfThenElse>
      </Sequence>
    </IfThenElse>
  </BehaviorTree>

</root>

Here is a proposed new XML with fixes for the two errors:

<?xml version="1.0"?>
<root main_tree_to_execute="FindWall">
    <!-- ////////// -->
    <BehaviorTree ID="FindWall">
        <IfThenElse>
            <Action ID="FoundLeftOrRightWall" within_distance="0.1254"/>
            <AlwaysSuccess name="AlreadyNearAWall"/>
            <Sequence name="SearchForWallLeftThenRight">
                <ForceSuccess>
                    <RetryUntilSuccesful name="SearchWallLeft" num_attempts="3">
                        <IfThenElse>
                            <Action ID="FoundLeftOrRightWall" within_distance="0.1254"/>
                            <AlwaysSuccess name="FoundAWallLeft"/>
                            <SubTree ID="FindWallInDirection" name="MoveLeftLookingForWall" direction="left"/>
                        </IfThenElse>
                    </RetryUntilSuccesful>
                </ForceSuccess>
                <IfThenElse>
                    <Action ID="FoundLeftOrRightWall" within_distance="0.1254"/>
                    <AlwaysSuccess/>
                    <RetryUntilSuccesful name="FindWallToRight" num_attempts="6">
                        <IfThenElse>
                            <Action ID="FoundLeftOrRightWall" within_distance="0.1254"/>
                            <AlwaysSuccess/>
                            <SubTree ID="FindWallInDirection" direction="right"/>
                        </IfThenElse>
                    </RetryUntilSuccesful>
                </IfThenElse>
            </Sequence>
        </IfThenElse>
    </BehaviorTree>
    <!-- ////////// -->
    <BehaviorTree ID="FindWallInDirection">
        <Sequence>
            <Switch2 case_1="left" case_2="right" name="MakeInitialTurn" variable="{direction}">
                <Action ID="Turn" degrees="90"/>
                <Action ID="Turn" degrees="-90"/>
                <AlwaysSuccess/>
            </Switch2>
            <ForceSuccess>
                <RetryUntilSuccesful name="MoveForwards3Inches" num_attempts="6">
                    <IfThenElse>
                        <Action ID="FoundFrontWall" within_distance="0.1524"/>
                        <AlwaysSuccess name="MoveForwards3InchesSuccess"/>
                        <ForceFailure>
                            <Action ID="MoveForwards" distance="0.0762"/>
                        </ForceFailure>
                    </IfThenElse>
                </RetryUntilSuccesful>
            </ForceSuccess>
            <Switch2 case_1="left" case_2="right" name="RestoreToInitialDirection" variable="{direction}">
                <Action ID="Turn" degrees="-90"/>
                <Action ID="Turn" degrees="+90"/>
                <AlwaysSuccess/>
            </Switch2>
            <Action ID="FoundFrontWall" name="ForceFinalSubtreeStatus" within_distance="0.1524"/>
        </Sequence>
    </BehaviorTree>
    <!-- ////////// -->
    <TreeNodesModel>
        <SubTree ID="FindWallInDirection">
            <inout_port name="direction"/>
        </SubTree>
        <Action ID="FoundFrontWall">
            <input_port name="within_distance"/>
        </Action>
        <Action ID="FoundLeftOrRightWall">
            <input_port name="within_distance"/>
        </Action>
        <Action ID="MoveForwards">
            <input_port name="distance"/>
        </Action>
        <Action ID="SaySomething">
            <input_port name="message"/>
        </Action>
        <Action ID="Turn">
            <input_port name="degrees"/>
        </Action>
    </TreeNodesModel>
    <!-- ////////// -->
</root>

2021 10 12 Motor Mount Problems

Everything about robots is hard. For example.

Puck motor twists away from frame

I have been doing experiments with Puck while it was sitting on the top of a desk. To help prevent it accidentally rolling about, I enabled the wheel locks for the two caster wheels in back. I was ready to try an experiment with an obstacle avoidance block of code I have been developing, so I put Puck on the floor in a hallway and brought up the software stack.

As is often the case, I also brought up a keyboard teleoperation node which allows me to drive the robot manually and remotely. I have a data visualizer on my desktop computer (rviz2) that lets me look at the LIDAR scans, the proximity sensor data, any of the camera videos, a mockup of the physical robot placed in the house, and so on, and the robot can be far away from me.

As I began to move the robot, it didn’t seem to be steering as I expected. It kept pulling to the right and I heard some unusual sounds coming from the hallway. But my mind was on the new feedback in the display which was showing me an idea of potential hazards along the way along with a recommendation of how to avoid the hazards. It was all fun until it wasn’t. Eventually there was a grinding sound coming from the hallway.

I went out to look and the picture shows what I saw. The right wheel and motor were twisted around and the wheel was grinding against the frame of the robot. I could barely imagine how this was even possible. My first thought was that there was a collision and the four bolts holding the motor to the frame had sheared off.

I put the robot back on the desktop and took off the top plate with the control panel and LIDAR, then the middle plate with the two computers and the camera array, and then I could look at the motor mounts. What had actually happened is that three of the four screws holding the motor had come out of the holes. How is that possible, you may ask?

The screws are 6-32 screws, and after the initial robot construction I added a spacer between the motor and the chassis, as the rear of the motor body is slightly wider than the front of the motor body and the right angle gearbox. The screws fit into that gearbox. The spacers give room for the motor to lie completely parallel with the plane of the frame. When I added those spacers (just washers), it made it so the screws no longer reached completely through the gearbox housing. In fact, and I didn’t measure this when I did it, the screws barely screwed into the motor housing at all. And the torque from one locked caster wheel on the wheel-coupler-gearbox assembly was enough to twist the motor so hard that three of those screws popped out.

The fix was easy enough: I replaced the screws with longer screws. But I still don’t have, for example, a sensor to detect whether the caster wheels are locked before driving. Guarding against that kind of human error is still an open problem for the robot. I could remove the wheel locks, but I don’t have a specialized frame to hold the robot off the floor when I run experiments where I want the motors to turn but the robot to not actually go anywhere; without such a frame, the wheel locks are my safety feature to prevent the robot from rolling off the desk.

Eventually, everything goes wrong.

Puck status for October 10, 2021

It’s been a long slog trying to get reliable recordings of the important sensors and other bits of data from the robot. 

I need these recordings so I can, say, drive around the house and record the proximity sensors, odometry and LIDAR as, together, they reflect what is around the robot as it moves. Then I can work on my software that tries to determine where the robot is in the house and where it is trying to get to, and that computes a plan for getting there as the world changes around it.

It’s not practical to try to debug complex software by repeatedly driving the robot around the house on the same path; there are too many variables that will never be quite the same from run to run. And the battery discharges over time, and there are only so many recharges I can make to the expensive battery. But by playing back the gathered data from one run over and over, I can pause where I need to and look at why my software isn’t quite doing what I expected.

I’ve mentioned in the past that ROS2’s recording system (rosbag2) is unreliable, at least so far, and lacks features I want. I’m pretty sure that ROS2 will eventually be “good enough”, but that’s months or years in the future. Meanwhile, as has been true for my whole life, I’m limited by my mortality.

Unfortunately, the state of hobby robotics today, right now, is that you have to build everything yourself. Oh, you can start further up the ladder and spend money to buy a better base, better sensors, faster computers, a better starting position. But the problems are many and the existing solutions are few, specialized and brittle.

The way I tackle any complex problem is to try to reduce the problem to a sub problem that is just out of reach of what I can do today. Then I build what I know and understand, and then work on that little bit more that gets me a solution to that “just out of reach” problem. Then I go on to the next sub problem that is just out of reach but on the path towards being the real problem I need to solve.

Puck is intended to transport a robot gripper around the house, to find and fetch something, and move that something to a different place. Many people working on hobby robots call this the “fetch a can of beer from the refrigerator” problem, which is a fair goal for a hobby robot. Mind, for me it’s sufficient that the robot start from some place reasonably far away from that “can of beer” (which might actually be a colored block to begin with, maybe sitting on a table rather than in a refrigerator), recognize the can, grab it, pick it up, and move it to some other predefined place.

I don’t have a gripper yet. I’ve got ideas about how to build one, and friends who have offered to help me get something working. But gripping things isn’t the next “just out of reach problem” I need to solve just yet. Baby steps.

Over the past few years, as I’ve tried to learn robotics and build a real robot, I have been building ever more complex robots that get ever nearer that can-grabbing robot. But everything about robots is hard.

For example, my robot, as of today, pumps out 400 SONAR samples a second and 800 laser time-of-flight samples a second. Getting that information from the computer that reads those sensors to the computer that reacts to the data has been a real problem.

I was at the point where I was getting all that data successfully transferred, then I added “just one more” thing to the mix and only half the data was being transferred. Why? I don’t know yet. And when I tried to add in “just one more sensor” to the mix, the whole thing blew up. Why? I don’t know yet.

But I can capture some data and play it back. And it sort of works. Something is missing in the “transform tree” yet so the whole magic isn’t working yet, but I’ll get there. Slowly.

But it’s a long slog, because everything about robots is hard.

ROS2 foxy and the failures of rosbag2

I used rosbag2 to attempt to record sensor data from Puck to be replayed later so I could debug code without having to run the actual robot around the house. Besides the missing features of rosbag2, especially the inability to step through data one time increment at a time and the inability to control playback (such as pausing, looping, and backing up), my current experiment shows that the recording ability of rosbag2 is insufficient.

I need to do further testing, but initially it looks like a LOT of data was simply not recorded.

There is also the whole problem of how to write code that can read the bag data. I know I’m slow at times, but I found the code base to be excessively obscure. Examples are almost nonexistent, and the examples I found were inconsistent in how they used the API. Still, I managed to cobble together some code that was able to read the bag file and do some analysis of my various proximity sensor reports. Note that I still have not found a way to read a compressed bag file; the runtime error reports are virtually meaningless.

The Teensy monitor in the robot reads eight time-of-flight sensors and four SONAR sensors, gathers all of those readings and a few more, and publishes them as a single message via Ethernet. Then there is a ROS node I wrote which polls that data and converts it into a set of Range messages, one for each reported sensor reading. When I read the bag file, I find that instead of the same count of messages for all twelve sensors, there is a big variation in how many messages were captured per sensor.

The exercise is still useful in that I did get some useful analysis regarding how well the sensors were operating. In particular, the time-of-flight sensors are housed in a case I designed and printed and, as a guess, it appears that I don’t have sufficient clearance for the laser beam and return signal. That is, for several of the sensors, if there is a reflecting surface within about two meters, a strong signal is detected and a good distance is reported. But if there isn’t a close reflecting surface, it seems that a weak signal is detected and a bogus distance is reported. When no reflecting surface is nearby, the sensors report a distance of infinity. It appears that I should replace that infinity value with a zero value, which will play nicer with rviz, which doesn’t appear to like infinity values.

Here is the output of running the code I show later below, with lots of lines elided. It begins with showing some metadata from the bag files. Then it shows the timestamp, frame_id and distance data for each database record, along with the running minimum and maximum distance detected for the sensor. At the end, I show the number of messages received for each sensor, the number of messages reporting a value of infinity for the distance, and the detected minimum and maximum distance for the whole recording. The “*” after the infinity count is just flagging that all the reported distances were infinity for that sensor.


Opened database '/home/ros/rosbag2_2021_10_04-09_04_44/rosbag2_2021_10_04-09_04_44_0.db3' for READ_ONLY.
meta name: /t265/gyro/imu_info
meta type: realsense2_camera_msgs/msg/IMUInfo
meta serialization_format: cdr
meta name: /cmd_vel
meta type: geometry_msgs/msg/Twist
meta serialization_format: cdr
meta name: /tofSensor
meta type: sensor_msgs/msg/Range
meta serialization_format: cdr
...
START READ
[0/0] 1633363456.724302976 tof_0 inf 1000000.000000, 0.000000
[1/3] 1633363456.724393408 tof_3 0.200 0.200000, 0.200000
[2/6] 1633363456.724437120 tof_6 inf 1000000.000000, 0.000000
...
[806/3] 1633363456.339151744 sonar_3 0.944 0.935000, 0.957000
END READ, found: 525

TOF message count: 525
tof_0 count:  73, inf_count:  73*, min: 1e+06, max:   0
tof_1 count:  55, inf_count:   1 , min: 0.033, max: 0.112
tof_2 count:  46, inf_count:   2 , min: 0.937, max: 0.995
tof_3 count:  48, inf_count:  30 , min: 0.026, max: 0.583
tof_4 count:  63, inf_count:  56 , min: 0.548, max: 0.999
tof_5 count:  41, inf_count:  41*, min: 1e+06, max:   0
tof_6 count:  54, inf_count:  50 , min: 0.903, max: 0.972
tof_7 count: 145, inf_count:  23 , min: 0.917, max: 0.999

SONAR message count: 282
sonar_0 count:  46, inf_count:  46*, min: 1e+06, max:   0
sonar_1 count:  37, inf_count:   0 , min: 0.731, max: 0.736
sonar_2 count:  54, inf_count:   0 , min: 0.831, max: 0.993
sonar_3 count: 145, inf_count:   0 , min: 0.935, max: 0.957

Here is the code, warts and all. You can probably remove the stuff that is specific to my robot and use it as a basis for writing your own code to read a bag file. At the top, I also include a few web references to materials that helped me create this code.


// Invoke, e.g.: clear;ros2 run bag_utils playback_proximity_data_node
// --ros-args -p bagpath:="/home/ros/rosbag2_2021_10_04-09_04_44"

// http://ftp-osl.osuosl.org/pub/ros/ros_docs_mirror/en/foxy/Tutorials/Ros2bag/Recording-A-Bag-From-Your-Own-Node.html
// https://github.com/Kyungpyo-Kim/ROS2BagFileParsing/blob/master/dev_ws/src/dev_cpp_pkg/src/dev_cpp_node.cpp
// https://answers.ros.org/question/366301/rosbag2-sequential-reading-linker-error/

#include <rclcpp/rclcpp.hpp>
#include <rosbag2_cpp/converter_interfaces/serialization_format_converter.hpp>
#include <rosbag2_cpp/readers/sequential_reader.hpp>
#include <rosbag2_cpp/typesupport_helpers.hpp>

using rosbag2_cpp::converter_interfaces::SerializationFormatConverter;

using std::placeholders::_1;

#include <boost/filesystem.hpp>
namespace fs = boost::filesystem;

#include <iomanip>
#include <limits>
#include <sensor_msgs/msg/range.hpp>

#include "rclcpp/serialization.hpp"
#include "rclcpp/serialized_message.hpp"

class BagReader : public rclcpp::Node {
 public:
  BagReader() : Node("playback_proximity_data_node") {
    this->declare_parameter("bagpath");
    rclcpp::Parameter bag_path_parameter = this->get_parameter("bagpath");
    std::string bag_path = bag_path_parameter.as_string();
    std::cout << "CURRENT_DIRECTORY: " << fs::current_path() << "\n";
    rosbag2_cpp::readers::SequentialReader reader;
    rosbag2_cpp::StorageOptions storage_options{};

    // storage_options.uri = "/home/ros/rosbag2_2021_10_01-10_28_19";
    // storage_options.uri = "src/rosbag2_test_data";
    storage_options.uri = bag_path;  //###"rosbag2_2021_10_04-09_04_44";
    storage_options.storage_id = "sqlite3";

    rosbag2_cpp::ConverterOptions converter_options{};
    converter_options.input_serialization_format = "cdr";
    converter_options.output_serialization_format = "cdr";
    reader.open(storage_options, converter_options);

    auto topics = reader.get_all_topics_and_types();

    // about metadata
    for (auto t : topics) {
      std::cout << "meta name: " << t.name << std::endl;
      std::cout << "meta type: " << t.type << std::endl;
      std::cout << "meta serialization_format: " << t.serialization_format
                << std::endl;
    }

    // read and deserialize "serialized data"
    int tof_i = 0;
    int tof_counted_messages = 0;
    int tof_counts[8] = {0, 0, 0, 0, 0, 0, 0, 0};
    int tof_inf_counts[8] = {0, 0, 0, 0, 0, 0, 0, 0};
    float tof_mins[8] = {1e6, 1e6, 1e6, 1e6, 1e6, 1e6, 1e6, 1e6};
    float tof_maxs[8] = {0, 0, 0, 0, 0, 0, 0, 0};

    int sonar_i = 0;
    int sonar_counted_messages = 0;
    int sonar_counts[4] = {0, 0, 0, 0};
    int sonar_inf_counts[4] = {0, 0, 0, 0};
    float sonar_mins[4] = {1e6, 1e6, 1e6, 1e6};
    float sonar_maxs[4] = {0, 0, 0, 0};

    std::cout << "START READ" << std::endl;
    while (reader.has_next()) {
      // serialized data
      auto serialized_message = reader.read_next();
      if (serialized_message->topic_name == "/tofSensor") {
        sensor_msgs::msg::Range msg;
        rclcpp::Serialization<sensor_msgs::msg::Range> serialization;
        rclcpp::SerializedMessage extracted_serialized_msg(
            *serialized_message->serialized_data);
        serialization.deserialize_message(&extracted_serialized_msg, &msg);
        int32_t sec = (float)msg.header.stamp.sec;
        uint32_t ns = (float)msg.header.stamp.nanosec;
        int frame_id = tof_frameId(msg.header.frame_id);
        float r = msg.range;
        tof_counts[frame_id]++;
        if (r == -std::numeric_limits<float>::infinity())
          std::cout << "NEG_INFINITY" << std::endl;
        bool is_infinity = (r == std::numeric_limits<float>::infinity()) ||
                           (r == -std::numeric_limits<float>::infinity());
        if (is_infinity) tof_inf_counts[frame_id]++;
        if (r < tof_mins[frame_id] &&
            (r != -std::numeric_limits<float>::infinity()))
          tof_mins[frame_id] = r;
        if ((r > tof_maxs[frame_id]) &&
            (r != std::numeric_limits<float>::infinity()))
          tof_maxs[frame_id] = r;
        char rline[256];
        sprintf(rline, "[%d/%d] %10d.%-9d %5s %1.3f %f, %f", tof_i++, frame_id,
                sec, ns, msg.header.frame_id.c_str(), r, tof_mins[frame_id],
                tof_maxs[frame_id]);
        std::cout << rline << std::endl;
        tof_counted_messages++;
      } else if (serialized_message->topic_name == "/sonarSensor") {
        sensor_msgs::msg::Range msg;
        rclcpp::Serialization<sensor_msgs::msg::Range> serialization;
        rclcpp::SerializedMessage extracted_serialized_msg(
            *serialized_message->serialized_data);
        serialization.deserialize_message(&extracted_serialized_msg, &msg);
        int32_t sec = (float)msg.header.stamp.sec;
        uint32_t ns = (float)msg.header.stamp.nanosec;
        int frame_id = sonar_frameId(msg.header.frame_id);
        float r = msg.range;
        sonar_counts[frame_id]++;
        if (r == -std::numeric_limits<float>::infinity())
          std::cout << "NEG_INFINITY" << std::endl;
        bool is_infinity = (r == std::numeric_limits<float>::infinity()) ||
                           (r == -std::numeric_limits<float>::infinity());
        if (is_infinity) sonar_inf_counts[frame_id]++;
        if (r < sonar_mins[frame_id] &&
            (r != -std::numeric_limits<float>::infinity()))
          sonar_mins[frame_id] = r;
        if ((r > sonar_maxs[frame_id]) &&
            (r != std::numeric_limits<float>::infinity()))
          sonar_maxs[frame_id] = r;
        char rline[256];
        sprintf(rline, "[%d/%d] %10d.%-9d %5s %1.3f %f, %f", tof_i++, frame_id,
                sec, ns, msg.header.frame_id.c_str(), r, sonar_mins[frame_id],
                sonar_maxs[frame_id]);
        std::cout << rline << std::endl;
        sonar_counted_messages++;
      } else {
        // std::cout << "Skipping topic: " << serialized_message->topic_name <<
        // std::endl;
      }
    }

    std::cout << "END READ, found: " << tof_counted_messages << std::endl;

    std::cout << std::endl;
    std::cout << "TOF message count: " << tof_counted_messages << std::endl;
    for (int i = 0; i < 8; i++) {
      std::cout << "tof_" << i;
      std::cout << " count: " << std::setw(3) << tof_counts[i];
      std::cout << ", inf_count: " << std::setw(3) << tof_inf_counts[i];
      std::cout << (tof_counts[i] == tof_inf_counts[i] ? "*" : " ");
      std::cout << ", min: " << std::setw(3) << std::setprecision(3)
                << tof_mins[i];
      std::cout << ", max: " << std::setw(3) << std::setprecision(3)
                << tof_maxs[i];
      std::cout << std::endl;
    }

    std::cout << std::endl;
    std::cout << "SONAR message count: " << sonar_counted_messages << std::endl;
    for (int i = 0; i < 4; i++) {
      std::cout << "sonar_" << i;
      std::cout << " count: " << std::setw(3) << sonar_counts[i];
      std::cout << ", inf_count: " << std::setw(3) << sonar_inf_counts[i];
      std::cout << (sonar_counts[i] == sonar_inf_counts[i] ? "*" : " ");
      std::cout << ", min: " << std::setw(3) << std::setprecision(3)
                << sonar_mins[i];
      std::cout << ", max: " << std::setw(3) << std::setprecision(3)
                << sonar_maxs[i];
      std::cout << std::endl;
    }
  }

 private:
  std::unique_ptr<rosbag2_cpp::readers::SequentialReader> reader_;

  int sonar_frameId(std::string frame_id) {
    if (frame_id == "sonar_0") return 0;
    if (frame_id == "sonar_1") return 1;
    if (frame_id == "sonar_2") return 2;
    if (frame_id == "sonar_3")
      return 3;
    else {
      std::cout << "BAD SONAR FRAME ID: " << frame_id << std::endl;
      return -0;
    }
  }

  int tof_frameId(std::string frame_id) {
    if (frame_id == "tof_0") return 0;
    if (frame_id == "tof_1") return 1;
    if (frame_id == "tof_2") return 2;
    if (frame_id == "tof_3") return 3;
    if (frame_id == "tof_4") return 4;
    if (frame_id == "tof_5") return 5;
    if (frame_id == "tof_6") return 6;
    if (frame_id == "tof_7")
      return 7;
    else {
      std::cout << "BAD TOF FRAME ID: " << frame_id << std::endl;
      return -0;
    }
  }
};

int main(int argc, char* argv[]) {
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<BagReader>());
  rclcpp::shutdown();
  return 0;
}

Wiring Diagrams

Using the open-source software at https://github.com/formatc1702/WireViz, I created the following power distribution wiring diagram for Puck.

Puck power distribution wiring diagram (open in a new window and remove the resize parameter from the URL to see detail)

Here is the Roboclaw wiring.

Puck Roboclaw wiring diagram

Here is the monitor board wiring.

Puck monitor wiring diagram