Aditi Chandrashekar, under the mentorship of Professor Soon-Jo Chung, John Lathrop, and Dr. Ben Rivière
This work was completed as part of an Aerospace Corporation SURF Fellowship in the ARCL Lab at Caltech.
The INDY Autonomous Challenge is a global competition aimed at advancing autonomous vehicle technology through the design of high-speed self-driving race cars. Testing planning and control algorithms on real race cars carries significant financial risk, creating a need for a fully functional autonomous testbed on an RC car. To this end, a Traxxas X-Maxx RC car was modified to carry an NVIDIA Jetson Orin for onboard computing and a ZED 2 AI camera for perception, object detection, and odometry. This hardware stack was tested with path planning and control algorithms, including the A* algorithm. The testbed will enable the development of more sophisticated motion planning algorithms in the future.
Autonomous vehicles have garnered significant attention recently, leading to the release of increasingly capable technologies by various companies. Nevertheless, there are several challenges to the widespread commercialization of fully autonomous vehicles; these include concerns related to reliability, cost, and public perception.
In the past, open competitions such as NASA's Big Idea Challenge and DARPA's Grand Challenge have addressed such challenges, driving substantial technological advancements in their respective domains. The INDY Autonomous Challenge follows suit by inspiring university teams to create self-driving race cars capable of exceeding 200 mph. As part of this competition, we aim to design control and planning algorithms for an autonomous racing car.
An integral step in this design process is testing the algorithms outside of simulation, yet testing on full-sized, professional race cars is financially infeasible. A precedent for filling this gap exists in aerial robotics: Foehn et al. developed a hardware and software framework for autonomous, agile quadrotor flight [1], which serves as a versatile, standardized platform for further research on perception, control, and planning for autonomous quadrotors.
This problem calls for a similar framework that enables the development of safer, higher-performing learning algorithms for cars. Existing frameworks for testing such algorithms, such as those built for PackBots, are largely oriented toward different specifications. While planning algorithms for RC cars and PackBots share several common goals, the specifications for cars are distinct: cars offer more precise navigation, higher speeds, and increased mobility due to independently maneuverable wheels. Thus, there is a need for an autonomous testbed tailored to RC cars.
Our objective was to produce a working testbed for control and planning algorithms on an RC car and thereby facilitate the long-term goal of integrating perception to make the car fully autonomous. This will ultimately aid the development of better algorithms for the INDY Autonomous Challenge.
We have created a framework capable of transmitting algorithms to the onboard computer and monitoring real-time data during execution. The project's long-term objective is the integration of advanced control and planning algorithms, including full incorporation of the perception stack to enable closed-loop autonomy.
This milestone will be reached when the vehicle can autonomously observe and adapt to its surroundings, execute predefined plans, and employ its controller to respond to deviations. The controller corrects deviations from the planned position of the vehicle, taking instructions from the motion planner, and the perception stack is fully developed. Next steps include further testing of the perception stack and its integration with the existing software framework in the Chung Lab. Closing the loop will create a fully functional testbed representative of a fully autonomous race car.
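As a concrete illustration of this feedback structure, the following is a minimal sketch of a proportional steering controller that drives toward the next waypoint supplied by the planner. It assumes planar motion and ROS 1 (rospy); the topic names, message types, and gain are hypothetical placeholders, not the lab's actual interface.

```python
# Minimal sketch: proportional steering toward a planner waypoint.
# Topic names, message types, and the gain K_P are illustrative
# assumptions, not the actual lab interface.
import math
import rospy
from geometry_msgs.msg import PoseStamped, Twist

K_P = 1.5  # proportional steering gain (hypothetical)

class WaypointController:
    def __init__(self):
        self.goal = None
        rospy.Subscriber("/planner/waypoint", PoseStamped, self.set_goal)
        rospy.Subscriber("/zed2/odom_pose", PoseStamped, self.on_pose)
        self.cmd_pub = rospy.Publisher("/car/cmd", Twist, queue_size=1)

    def set_goal(self, msg):
        self.goal = msg

    def on_pose(self, msg):
        if self.goal is None:
            return
        # Bearing from the current position to the planned waypoint.
        dx = self.goal.pose.position.x - msg.pose.position.x
        dy = self.goal.pose.position.y - msg.pose.position.y
        bearing = math.atan2(dy, dx)
        # Yaw from the quaternion, assuming rotation about z only.
        yaw = 2.0 * math.atan2(msg.pose.orientation.z, msg.pose.orientation.w)
        # Wrap the heading error to [-pi, pi].
        error = math.atan2(math.sin(bearing - yaw), math.cos(bearing - yaw))
        cmd = Twist()
        cmd.linear.x = 0.5           # constant test speed (m/s)
        cmd.angular.z = K_P * error  # proportional steering correction
        self.cmd_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("waypoint_controller")
    WaypointController()
    rospy.spin()
```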
The vehicle is a modified Traxxas X-Maxx featuring an NVIDIA Jetson Orin for onboard computation and a ZED 2 AI camera for perception. The Traxxas X-Maxx was chosen for its versatility and its top speed of 60 mph, which make it relatively representative of a race car. Retaining the original steering servo and Electronic Speed Controller (ESC), we replaced the transmitter with an RP2040 microcontroller, which responds to ROS topic commands to control the steering and drive functions.
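As one possible realization of this interface, the sketch below shows a bridge node that subscribes to a command topic and forwards drive and steering values to the RP2040 over a serial link. The port, baud rate, topic name, and line protocol are assumptions for illustration; the actual firmware interface may differ.

```python
# Sketch of a bridge node forwarding ROS commands to the RP2040.
# The serial port, baud rate, topic name, and ASCII line protocol
# are illustrative assumptions.
import rospy
import serial
from geometry_msgs.msg import Twist

class RP2040Bridge:
    def __init__(self, port="/dev/ttyACM0", baud=115200):
        self.ser = serial.Serial(port, baud, timeout=0.1)
        rospy.Subscriber("/car/cmd", Twist, self.on_cmd)

    def on_cmd(self, msg):
        # Encode drive speed (m/s) and steering (rad) as one text line.
        line = "D{:.3f} S{:.3f}\n".format(msg.linear.x, msg.angular.z)
        self.ser.write(line.encode("ascii"))

if __name__ == "__main__":
    rospy.init_node("rp2040_bridge")
    RP2040Bridge()
    rospy.spin()
```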
The onboard Jetson Orin runs Ubuntu 20.04 and ROS 1. The microcontroller interfaces with the ESC through 5V PWM pulses and with the steering servo through 3V PWM pulses, generating signals centered around a 1500 µs pulse width. These ranges were calibrated to match the vehicle's steering and drive capabilities, and the ESC range was narrowed for controlled testing within the laboratory. The X-Maxx has four-wheel drive, with the front two wheels responsible for steering. The front wheels' steering range spans 60 degrees in total, affording the vehicle a minimum turning radius of approximately 48 centimeters.
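To make the calibration concrete, the sketch below maps a normalized command in [-1, 1] to a pulse width centered at 1500 µs, with a deliberately narrowed drive span for indoor testing. The endpoint values are illustrative assumptions, not the calibrated ranges used on the vehicle.

```python
# Sketch of the command-to-PWM mapping described above. Pulses are
# centered at 1500 us; the half-span values below are illustrative
# assumptions, not the vehicle's calibrated ranges.
PWM_CENTER_US = 1500

STEER_HALF_SPAN_US = 400  # assumed full steering range: 1100-1900 us
DRIVE_HALF_SPAN_US = 50   # assumed narrowed drive range: 1450-1550 us

def command_to_pulse(cmd, half_span_us):
    """Map a normalized command in [-1, 1] to a pulse width in us."""
    cmd = max(-1.0, min(1.0, cmd))  # clamp to the valid range
    return PWM_CENTER_US + half_span_us * cmd

# Example: full left steer and 20% forward throttle.
print(command_to_pulse(-1.0, STEER_HALF_SPAN_US))  # 1100.0
print(command_to_pulse(0.2, DRIVE_HALF_SPAN_US))   # 1510.0
```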
The ZED 2 AI camera is a stereo camera system developed by Stereolabs. It combines two cameras with depth-sensing capabilities to capture 3D imagery and enable depth perception similar to human vision. The camera uses pretrained vision models to compute depth maps from the images captured by its two sensors, and it can also estimate position, orientation, velocity, and acceleration in a variety of reference frames. To implement feedback for the controller, we use pose and position data relative to the odometry frame; for motion planning, we use additional capabilities such as depth maps and obstacle detection.
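For reference, the sketch below reads odometry-frame pose estimates with the Stereolabs Python SDK (pyzed); the parameter choices are illustrative, and the actual perception stack may configure the camera differently.

```python
# Minimal sketch of reading pose feedback from the ZED 2 with the
# Stereolabs Python SDK (pyzed). Parameter choices are illustrative.
import pyzed.sl as sl

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.coordinate_units = sl.UNIT.METER
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("failed to open ZED 2 camera")

# Positional tracking must be enabled before pose estimates are available.
zed.enable_positional_tracking(sl.PositionalTrackingParameters())

pose = sl.Pose()
runtime = sl.RuntimeParameters()
while zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
    # Camera pose relative to the fixed world/odometry origin.
    state = zed.get_position(pose, sl.REFERENCE_FRAME.WORLD)
    if state == sl.POSITIONAL_TRACKING_STATE.OK:
        t = pose.get_translation(sl.Translation()).get()
        print("x={:.2f} y={:.2f} z={:.2f}".format(t[0], t[1], t[2]))
```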