Designing an Advanced Autonomous Robot: Goose

Reading Time: 10 minutes

Update: This article was featured on Hackaday.com and on SDP/SI’s “Featured People and Organizations” page.

Goose is a mobile autonomous robot I designed and built over 6 months in my spare time for a robotics competition. This was a fully custom and challenging build that tested my competence in electrical engineering, mechanical engineering, control systems, and computer science.

Instead of focusing heavily on the competition, the goal of this article is to briefly go through the system design process. I’ll touch on my various design choices and discuss how I chose to address some of the common issues in designing an autonomous robot. This is not a tutorial (those will be coming later) but more of a case study.

Competition and Results

The basic premise of the competition was to have an autonomous robot navigate a square arena (enclosed by four walls) and collect and sort multi-colored cubes and balls. It had to avoid fixed obstacles within a 3-minute time limit. The robot was also required to “orbit” the center of the arena counterclockwise while performing these tasks.

Goose was in second place after the first round, then suffered a catastrophic reverse-polarity incident at my hands (thanks, Murphy) prior to the second run. I made two critical mistakes:

  1. I daisy-chained the power rail going to my two motor controllers instead of tying each directly to my power bus. This set me up for the upcoming cascading failure.
  2. After being awake for over 24 hours, I somehow reverse-polarized the power rail going to my motor controllers, which destroyed them both.

Even without a second round run, Goose did well enough in the first round to finish fourth overall in the Open competition.

Hardware Design

The underside of the machined base with wiring harnesses connected

The first hardware decision I made was selecting the type of drive system the robot would employ. The robot was constrained to a 9″ x 9″ x 11″ box, and once I realized that I was just building a glorified vacuum cleaner, my design ideas converged towards a very compact Wall-E type chassis.

Based on experience, I chose a tracked differential drive system, i.e., non-holonomic like a tank. As a general rule you want to avoid using omnidirectional wheels on an autonomous robot unless you have a localization system that gives you an accurate pose. Using a two-motor differential drive system allowed me to achieve a compact design by placing the geared motors in the rear and directly driving the tracks.

SDP/SI was gracious enough to provide me with the timing belts and pulleys I used in my custom drivebase (shown in the image above) so thanks to them! I also would not have been able to build this robot without the amazing engineering/manufacturing facilities at Georgia Southern University that have state of the art equipment (such as the waterjet and laser cutter I used) available to students.

Prototyping and Base Build

After getting a rough idea of the physical design of the robot, I started prototyping using SolidWorks, my 3D printer, and a laser cutter. My prototyping workflow involved first sketching parts on paper, creating them in SolidWorks, then 3D printing or laser cutting them as needed. In total I ended up creating 232 SolidWorks parts by the end of the build.

Rapid prototyping robot base
Rapid prototyping with wood and acrylic

The biggest issue I ran into was figuring out how to remove play from the shaft in the bearing. The inner race of the bearing would move when under tension which caused alignment issues with the belt. The shaft was anchored at one end only because there was no room to add a second bearing (even a sleeve bearing) on the outer plate end due to the competition size constraints.

I was able to overcome this issue by designing a sturdy bearing mount that I mounted on the inside of the bracket then added a smaller acrylic mount with a sleeve bearing on the outside. This, along with shortening the shaft, was able to remove any play from the shaft under tension. I also added a very small lip to the pulleys to prevent the belt from slipping. The final base used aluminum which was machined using a MAXIEM 1515 waterjet cutter.

Testing the right drive assembly at full power

The remaining parts were created using wood with the help of a laser cutter.

Electronics

The electronics subsystem of Goose utilized an architecture that I have been tweaking for my autonomous robot designs over many years. The main idea is to minimize the cost of failure by isolating as many systems as possible and using generalized interfaces to connect the systems, i.e., modularization. All the supply rails are also individually fused with on/off switches and indicators.

Control panel

Some core safety features of my architecture include:

  • Proper fusing for overcurrent protection.
  • Crowbar circuits for overvoltage protection. Crowbar circuits require very careful design in order to avoid false triggers due to noise or a narrow operating band.
  • High-current ideal diodes for reverse polarity protection. I did not include this in Goose and of course suffered a competition-ending reverse-polarity condition, so I won’t skip it again!

All these concepts are nothing new in electronics but striking the right balance between complexity and functionality for any one particular application can be tricky, especially when creating an autonomous robot.

Even though this system did not contain any sensitive analog circuitry, I still isolated the noisy motor power rail from the rest of the circuitry, eventually tying everything together at a single point at the battery in the “star ground” configuration. It’s always important to have a good idea of the major current paths in your circuits so you can spot any potential issues like ground loops — this does not require advanced knowledge of mesh analysis!

Mainboard

Custom circuit with various sensor interfaces and drivers.

I have made it a habit to always use soldered PCBs for my projects. While breadboards are useful in the very initial stages of prototyping, they should not be used for anything past initial testing and brainstorming — they are too unreliable and can be difficult to troubleshoot due to issues like intermittent connections. Depending on your level of experience, most analog circuits can be designed with a SPICE simulator and some basic math.

I designed the board for Goose using a stripboard. Stripboards are easier to use than regular solderable PCBs for more complicated circuits because they have connected copper strips (thus the name) that you cut to break the circuit at a desired point. This means you don’t have to solder a lot of holes in order to make a power bus or run a signal to multiple pins. Just don’t forget to cut the traces or you’ll have a pretty bad day!

The bottom of the stripboard before I cut the traces

The board contained the 5V and 3.3V power rails rated up to 2.5A, two PWM MOSFET driver circuits, a logic level converter, a 9 DoF IMU, and the Teensy microcontroller, along with a few other assorted supporting components.

Unpopulated stripboard

Cabling and Wiring Harnesses

One of the most important and necessary skills in robotics is being able to make cables and wiring harnesses that are custom to the robot. While the process can be tedious, it results in a more reliable and professional end product. I typically use 2.54mm JST-XH connectors and “Dupont” connectors for my wiring when working with prototype boards. There’s quite a lot behind the names of connectors so I encourage you to read this article which does a great job summarizing most of what you’ll need to know about crimpers and connectors. For high current applications I used regular spade connectors and Wago 221 connectors for creating power busses. You can get generic versions of these on eBay and similar sites.

When creating long runs on this project I cannibalized a CAT5 cable and crimped my own connectors onto the ends. This was ideal because the twisted pairs meant I could effectively run differential signals if needed.

The NASA Standard for Crimping, Interconnecting Cables, Harnesses, And Wiring is a great resource for learning from the pros. While most of the crimping tools they use cost in the hundreds of dollars on the low end, the information is still relevant and quite useful!

Processing Subsystem

I used two processing cores for Goose: a Teensy 3.2 (32 bit ARM Cortex-M4) microcontroller to handle all the deterministic logic, and a Debian-based Raspberry Pi 3B+ for the more heavy computational work dealing with mapping and image processing. The Raspberry Pi was running the Robot Operating System (ROS) and the Teensy communicated to the Pi via USB serial.

The two “brains” of Goose connected together.

Running at 96 MHz, the Teensy was fast enough to process data from two quadrature encoders and perform real-time odometry calculations; run an AHRS fusion algorithm on the 9 DoF IMU data at about 2.1 kHz; run multiple PID loops; monitor two ultrasonic sensors at a high refresh rate; and send all this data to the Pi at a baud rate of 500,000. I had several ROS nodes running on the Teensy which communicated with the ROS Master on the Pi. As an example, I used the handler below to process velocity messages from the Pi and command the setpoints of the PID controllers:

// Convert an incoming ROS Twist message into left/right wheel RPM
// setpoints for the two PID controllers (differential drive kinematics).
void process_velocities(const geometry_msgs::Twist& cmd_msg)
{
	double x = cmd_msg.linear.x;	// forward velocity (m/s)
	double z = cmd_msg.angular.z;	// rotational velocity (rad/s, CCW positive)

	// Wheel surface speeds: v_left = x - z * track_width / 2,
	// v_right = x + z * track_width / 2, then convert m/s to RPM
	// via 60 / (PI * wheel_diameter). This single expression also
	// covers the pure-straight (z == 0) and turn-in-place (x == 0) cases.
	left_motor_setpoint = (x - z * track_width / 2) * 60 / (PI * wheel_diameter);
	right_motor_setpoint = (x + z * track_width / 2) * 60 / (PI * wheel_diameter);
} // process_velocities()

Handler used to process velocity messages from the Pi and command the PID setpoints
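Going the other way, the wheel odometry the Teensy computed comes from integrating encoder ticks into a pose estimate. Here is a minimal sketch of a standard differential-drive odometry update — the wheel diameter, track width, and ticks-per-revolution values are placeholders for illustration, not Goose's actual parameters:

```cpp
#include <cmath>
#include <cassert>

// Placeholder robot parameters (illustrative, not Goose's actual values)
const double PI_CONST       = 3.14159265358979;
const double WHEEL_DIAMETER = 0.045;   // meters
const double TRACK_WIDTH    = 0.14;    // meters
const double TICKS_PER_REV  = 1200.0;  // encoder counts per wheel revolution

struct Pose { double x, y, theta; };   // meters, meters, radians

// Integrate one odometry step from the change in left/right encoder counts.
void update_odometry(Pose& pose, long d_left_ticks, long d_right_ticks)
{
    const double m_per_tick = PI_CONST * WHEEL_DIAMETER / TICKS_PER_REV;
    double d_left  = d_left_ticks  * m_per_tick;
    double d_right = d_right_ticks * m_per_tick;

    double d_center = (d_left + d_right) / 2.0;          // distance traveled by robot center
    double d_theta  = (d_right - d_left) / TRACK_WIDTH;  // change in heading

    // Use the midpoint heading for a slightly better arc approximation
    pose.x     += d_center * std::cos(pose.theta + d_theta / 2.0);
    pose.y     += d_center * std::sin(pose.theta + d_theta / 2.0);
    pose.theta += d_theta;
}
```

Run at a high enough rate (the Teensy had plenty of headroom at 96 MHz), the per-step arc approximation error is negligible; the real enemies are slip and track scrub, which is exactly why this estimate gets fused with the IMU and LiDAR downstream.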

I initially considered and prototyped using a BeagleBone Black since it contains two deterministic 32-bit PRU processing cores that can be accessed from the Linux system but I found the development overhead not worth it for one person with limited time. I wrote a few articles based on that experience here, here, and here.
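Circling back to the encoder handling mentioned above: decoding quadrature is exactly the kind of deterministic job a microcontroller excels at. Here is a sketch of the standard lookup-table approach — pin reading and interrupt wiring are omitted, and this is an illustration, not Goose's actual firmware:

```cpp
// Lookup-table quadrature decoder. Index = (previous AB state << 2) | current
// AB state; entries are +1/-1 for valid single-step transitions, 0 for no
// change or an invalid (skipped) state.
const int QDEC_TABLE[16] = { 0, +1, -1,  0,
                            -1,  0,  0, +1,
                            +1,  0,  0, -1,
                             0, -1, +1,  0 };

struct QuadratureDecoder {
    int prev_state = 0;   // last sampled 2-bit AB state
    long count = 0;       // accumulated encoder ticks

    // Feed the current 2-bit AB state (bit 1 = channel A, bit 0 = channel B),
    // typically from a pin-change interrupt.
    void sample(int ab_state) {
        count += QDEC_TABLE[(prev_state << 2) | (ab_state & 0x3)];
        prev_state = ab_state & 0x3;
    }
};
```

The table-driven form is branchless and constant-time, which keeps interrupt latency predictable even with two encoders ticking at high speed.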

Localization and Navigation Subsystem

The competition rules required the robot to move in large circles for the duration of the run, which meant I couldn’t rely on encoder-based dead reckoning alone; turning introduces errors due to drift and slippage, which would accumulate to unacceptable levels over the course of the 3-minute run.

To counter this issue, I took advantage of the fact that the arena was enclosed by four walls in a square to develop a navigation system that used encoder odometry, IMU feedback, and LiDAR point cloud data to effectively pinpoint the location of the robot at all times.

As shown in the figure above, the basic idea was to use a pre-made map of the environment, then combine the point cloud data from the LiDAR with the fused pose estimate to localize using a particle filter algorithm. I created the map in Adobe Illustrator and then converted it to a PGM file for use in ROS.

Map used to generate an occupancy grid for navigation in ROS
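For those unfamiliar with the ROS side: loading a map like this just takes the PGM image plus a small YAML descriptor for the map_server node. A sketch of what that descriptor looks like — the filename and values here are illustrative, not the actual arena map's:

```yaml
# Illustrative map_server descriptor (filename/resolution/origin are made up)
image: arena_map.pgm        # occupancy image exported from the Illustrator drawing
resolution: 0.005           # meters per pixel
origin: [0.0, 0.0, 0.0]     # [x, y, yaw] of the lower-left pixel in the map frame
occupied_thresh: 0.65       # pixels darker than this are treated as obstacles
free_thresh: 0.196          # pixels lighter than this are treated as free space
negate: 0                   # 0 = white is free, black is occupied
```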

Localization inside a square would be difficult using LiDAR alone (all four corners look identical, so corner matching would fail), so I incorporated the pose estimate generated by fusing the encoder odometry data and the 9 DoF IMU using an unscented Kalman filter. I chose the unscented Kalman filter (as opposed to the extended Kalman filter) for handling the nonlinear data because it is generally more accurate for nonlinear systems and avoids having to derive Jacobian matrices.
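In ROS, this fusion step is mostly configuration. As a rough sketch of what a robot_localization-style UKF setup looks like — the topic names and the choice of which fields to fuse are assumptions for illustration, not Goose's actual configuration:

```yaml
# Illustrative ukf_localization_node parameters (topic names and fused
# fields are assumptions, not the robot's actual configuration)
frequency: 50
two_d_mode: true             # planar robot: ignore z, roll, pitch

odom0: /wheel_odometry       # encoder-based odometry from the Teensy
# Boolean matrix: [x, y, z, roll, pitch, yaw,
#                  vx, vy, vz, vroll, vpitch, vyaw,
#                  ax, ay, az]
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]

imu0: /imu/data              # AHRS output from the 9 DoF IMU
imu0_config: [false, false, false,
              false, false, true,
              false, false, false,
              false, false, true,
              true,  false, false]
```

As noted further down in the comments, getting the covariances right is where most of the tuning effort goes.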

Testing the tuning of the two motor PID loops
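The two motor loops being tuned above are standard discrete PID controllers. A minimal sketch of that kind of update step — the gains, clamp limits, and anti-windup scheme here are illustrative, not Goose's tuned values:

```cpp
#include <cmath>
#include <cassert>

// Minimal discrete PID controller; gains and output limits are illustrative.
struct PID {
    double kp, ki, kd;
    double out_min, out_max;   // output clamp (e.g. PWM duty limits)
    double integral;
    double prev_error;

    PID(double p, double i, double d, double lo, double hi)
        : kp(p), ki(i), kd(d), out_min(lo), out_max(hi),
          integral(0.0), prev_error(0.0) {}

    // One control step: dt is the loop period in seconds.
    double update(double setpoint, double measurement, double dt) {
        double error = setpoint - measurement;
        integral += error * dt;
        double derivative = (error - prev_error) / dt;
        prev_error = error;

        double out = kp * error + ki * integral + kd * derivative;

        // Clamp the output and back the integrator off when saturated
        // (a simple anti-windup scheme).
        if (out > out_max)      { integral -= error * dt; out = out_max; }
        else if (out < out_min) { integral -= error * dt; out = out_min; }
        return out;
    }
};
```

The clamp plus integrator back-off keeps the loop from winding up when the motors saturate, which matters when the setpoint jumps (as it does every time a new Twist message arrives).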

Side note:

I have written a paper that includes some experimentation on developing a computationally efficient 2D navigation algorithm (for a microcontroller-based autonomous robot), which I might revise and release when I have the time. The algorithm I developed had a time complexity of O(n). Below is a page from the paper showing part of my experimental setup and a map/vector output generated by the algorithm. I used a TFmini short-range LiDAR module for these experiments.

Final Thoughts

This project was certainly fun and a lot of work! There’s so much more to this robot I haven’t even touched on in this article but for the sake of keeping it “brief” I will have to end it here.

If time permits I may write a separate article on each system of the robot along with any associated source code and hardware info; possibly a series on building an advanced autonomous robot. I would like to go into more details on the software side and the workflow I came up with for working easily with ROS on a Raspberry Pi and a Teensy microcontroller.

Let me know your thoughts on this build or if you’d like to see any tutorials based on what’s in this build or robotics in general!

19 thoughts on “Designing an Advanced Autonomous Robot: Goose”

    1. Since I used ROS to implement the filtering I’ll most likely do a tutorial on setting up ROS on a Pi first so the Kalman filter post will most likely be after that one. I’ll eventually cover everything!

  1. I’ve been having a good time with the ODroid-C2, btw, as a more powerful ROS-capable alternative to the RPi. Even though I have an Arduino, I am driving ROS DiffDrive nodes directly on the ODroid. I’ll have to rethink what putting nodes on the Arduino might look like… didn’t even really consider that. Are you sharing your code? Mine (currently non-functional, I stalled integrating IMU) is at https://github.com/payneio/baybot.

    1. Hmm I might give the ODroid board a try — looking at their website their newest board has some pretty impressive specs!

      I would recommend the Teensy boards over Arduino if you want to run ROS nodes on a microcontroller — they outperform the Arduino boards in practically every metric and most Arduino libraries will work on the Teensy. I almost always opt for using a microcontroller when dealing with encoders, high frequency control loops, or anything that has “realtime” requirements.

      I’m going to do a post soon on running ROS nodes on a microcontroller and a SBC like a Pi — looks like a lot of people are interested in that. It’s really pretty straightforward and rather formulaic but it does require more than surface-level knowledge of C++.

  2. Thanks for the writeup! I’m looking forward to the more detailed articles.

    Your localization block diagram shows wheel odometry and IMU data going into a Kalman filter, and you mentioned using a fusion algorithm for the IMU data. Was that AHRS fusion algorithm separate from the Kalman filter? Assuming it was, did you feed the position data from the Kalman filter back into the fusion algorithm? Or was the fusion algorithm only producing rates, which were used to update the position estimate in the Kalman filter?

    I’m imagining that if the AHRS fusion and odometry algorithms both maintained an estimated pose without any feedback, they would diverge from each other and become less and less useful.

    1. I modified Kris Winer’s implementation of Madgwick’s AHRS to do the 9 DoF sensor fusion. Since I was using ROS I used a filter from the robot_localization package. Most of the work was figuring out the proper covariances.

      The point cloud data from the LiDAR is the important piece since I wasn’t technically navigating based on the odometry data in the traditional sense. In such a small space the LiDAR would know precisely where it was as long as it knew which of the four quadrants it was in, which is what was provided by the odometry. Since I knew it would take significantly longer than the competition run time for any significant error to accumulate in the odometry, I could get away without closing the feedback loop.

      1. Interesting, thanks for the info. When you’re talking about odometry providing the quadrant, was that the odometry + IMU? Or was IMU used for something else?
