Autonomous Robotics Engineer
Building Autonomous Racing Intelligence.
A fully self-built autonomous racing car featuring stereo visual SLAM, multi-sensor EKF fusion (wheel encoder, IMU, steering input), real-time path planning, and Ackermann steering control — running entirely on a Raspberry Pi 5.
System Architecture
How the Robot Thinks and Moves
A tightly integrated stack — from raw sensor data to wheel actuation — running entirely on commodity hardware.
Perception
- ELP Global Shutter Stereo Camera
- STM LSM6DSOX IMU
- Wheel Encoder
- Steering Angle Input
- SGBM Stereo Depth
- ORB-SLAM3 / RTAB-Map
- Kalibr Calibration
Planning
- RRT / A* Global Planner
- Gap Following
- Pure Pursuit
- Racing Line Optimization
Control
- Arduino Nano RP2040 Connect
- Ackermann Steering (Servo)
- Brushless Motor + ESC
Mapping and Localization
Perception Stack
Two independent SLAM systems run on the robot, each suited to different operating conditions and levels of sensor fusion.
Camera-IMU Calibration
Before any SLAM algorithm runs, the stereo camera and IMU must be spatially and temporally calibrated. Kalibr estimates the extrinsic transform between sensors and aligns their timestamps — a prerequisite for accurate visual-inertial odometry.
Camera-IMU Calibration (Kalibr)
Spatial and temporal calibration of the ELP stereo camera with the STM LSM6DSOX IMU using Kalibr. Required for accurate visual-inertial odometry.
ORB-SLAM3
A stereo visual-inertial SLAM system that relies purely on the camera and IMU. ORB feature extraction produces a sparse 3D map with real-time loop closure. The camera and IMU are tightly calibrated with Kalibr to enable accurate visual-inertial odometry.
ORB-SLAM3 Corridor Mapping
Real-time stereo visual SLAM in an indoor corridor. Sparse point cloud with loop closure and relocalization.

ORB-SLAM3 Map Output
Zoomed-out view of the sparse 3D map built by ORB-SLAM3, showing the reconstructed point cloud and navigable corridors.
RTAB-Map
Unlike ORB-SLAM3, RTAB-Map fuses a richer set of proprioceptive sensors before building the map. Wheel encoder odometry, IMU measurements, and the commanded steering angle are fed into an Extended Kalman Filter (EKF) node, which produces a fused odometry estimate. This is then combined with the camera's visual odometry inside RTAB-Map, giving the system a far more robust pose prior — especially in low-texture areas where pure visual methods drift.
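The predict/update cycle at the heart of that EKF node can be sketched in miniature. This is a deliberately simplified one-state filter on heading only (the real node fuses a full pose state); the bicycle model, noise values, and wheelbase below are illustrative assumptions, not the robot's actual tuning.

```cpp
#include <cmath>

// Minimal 1-state EKF sketch: fuse a bicycle-model heading prediction
// (wheel-encoder speed + commanded steering angle) with an IMU heading
// measurement. Illustrative only; the robot's EKF node carries a full
// pose state, and all constants here are placeholders.
struct HeadingEkf {
    double yaw = 0.0;   // heading estimate (rad)
    double P   = 1.0;   // estimate variance
    double Q   = 0.01;  // process noise added each prediction step
    double R   = 0.05;  // IMU measurement noise

    // Predict: kinematic bicycle model, yaw_rate = v / L * tan(steer)
    void predict(double v, double steer, double wheelbase, double dt) {
        yaw += v / wheelbase * std::tan(steer) * dt;
        P += Q;  // uncertainty grows while dead-reckoning
    }

    // Update: correct the prediction with an IMU-derived heading
    void update(double imu_yaw) {
        double K = P / (P + R);      // Kalman gain
        yaw += K * (imu_yaw - yaw);  // blend prediction and measurement
        P *= (1.0 - K);              // variance shrinks after the update
    }
};
```

The same structure generalizes to the full pose state: prediction integrates proprioceptive inputs, and each exteroceptive measurement shrinks the covariance.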
Sensor Fusion Pipeline
RTAB-Map Dense Reconstruction
Dense RGB-D SLAM generating a rich occupancy grid for navigation. Appearance-based loop closure handles revisited environments.

RTAB-Map Output Visualization
2D occupancy grid output from RTAB-Map — the primary map representation consumed by the path planner.
Stereo Depth Estimation
The stereo camera pair feeds a Semi-Global Block Matching (SGBM) pipeline running at 15 FPS entirely on the Raspberry Pi 5 CPU. The resulting dense depth map is published as a ROS 2 topic, leaving enough CPU headroom for SLAM, planning, and other algorithms to consume it simultaneously — no dedicated GPU needed.
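Once SGBM has produced a disparity map, each pixel's metric depth follows from the pinhole relation Z = f * B / d. A minimal sketch of that reprojection step, with placeholder values for focal length and baseline rather than the ELP camera's calibrated parameters:

```cpp
#include <cmath>

// Disparity-to-depth reprojection used after stereo matching.
// SGBM itself runs via a standard library implementation on the robot;
// this only shows the pinhole relation Z = f * B / d. The focal length
// and baseline are illustrative, not the calibrated values.
double disparityToDepth(double disparity_px, double focal_px, double baseline_m) {
    if (disparity_px <= 0.0) return INFINITY;    // no match / point at infinity
    return focal_px * baseline_m / disparity_px; // depth in meters
}
```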
SGBM Stereo Depth
Semi-Global Block Matching running at 15 FPS on the Raspberry Pi 5 CPU, leaving sufficient headroom for other algorithms to consume the depth map concurrently — no GPU required.
Path Planning
Where to Go and How to Get There
A layered planning stack — global route computation paired with reactive local obstacle avoidance — for safe, efficient trajectories.
RRT: A probabilistic sampling-based planner that efficiently explores high-dimensional configuration spaces to find feasible paths around obstacles.
A*: Optimal grid-based path search using an admissible heuristic, used for global route planning on the occupancy map generated by RTAB-Map.
Gap Following: A reactive local planner that identifies the widest obstacle-free gap in the depth data and steers toward it, well suited to tight corridors at speed.
Pure Pursuit: A geometric path tracker that computes the steering angle required to intercept a lookahead point on the planned trajectory.
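The pure-pursuit steering law itself is compact enough to sketch: given the angle alpha between the vehicle's heading and a lookahead point at distance ld, the Ackermann steering angle that places the rear axle on the intercepting arc is delta = atan(2 * L * sin(alpha) / ld). Wheelbase and lookahead values here are illustrative.

```cpp
#include <cmath>

// Pure-pursuit steering sketch. alpha: angle to the lookahead point in
// the vehicle frame (rad); wheelbase and lookahead in meters. Returns
// the commanded Ackermann steering angle (rad).
double purePursuitSteer(double alpha, double wheelbase, double lookahead) {
    return std::atan2(2.0 * wheelbase * std::sin(alpha), lookahead);
}
```

A shorter lookahead tracks the path more aggressively; a longer one smooths the steering at the cost of corner-cutting.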
Trajectory Generation
Raw waypoints are smoothed into kinematically feasible trajectories respecting the Ackermann geometry. Velocity profiles are computed to maximize speed within safe lateral acceleration limits.
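The velocity-profile step can be sketched as a curvature-limited speed cap: lateral acceleration is v^2 * kappa, so the fastest safe speed at a waypoint is v = sqrt(a_lat_max / kappa), clamped to the straight-line top speed. The limits below are illustrative, not the robot's tuned values.

```cpp
#include <algorithm>
#include <cmath>

// Curvature-limited speed for one waypoint. curvature in 1/m,
// a_lat_max in m/s^2, v_max in m/s. Constants are placeholders.
double cornerSpeed(double curvature, double a_lat_max, double v_max) {
    if (curvature <= 1e-9) return v_max;               // straight segment
    return std::min(v_max, std::sqrt(a_lat_max / curvature));
}
```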
Racing Line Optimization
For time-trial scenarios, an optimization pass computes the minimum-curvature racing line through waypoints — enabling the robot to carry maximum speed through corners.
Low-Level Execution
Control Architecture
Translating planned trajectories into precise wheel commands at millisecond timescales.
Ackermann Steering Model
The robot uses rear-wheel-drive Ackermann steering geometry: during a turn, each front wheel is angled so that its arc shares a common turn center. This minimizes tire scrub and maintains stability at speed.
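The geometry reduces to a simple relation: for a bicycle-model steering angle delta, the inner wheel must turn more sharply than the outer so both arcs share one center. A sketch, with wheelbase and track width as illustrative placeholders:

```cpp
#include <cmath>

// Ackermann wheel angles from a single bicycle-model steering angle.
// L: wheelbase (m), track: distance between the front wheels (m).
// Values used with this are placeholders, not the chassis dimensions.
struct WheelAngles { double inner, outer; };

WheelAngles ackermannAngles(double delta, double L, double track) {
    double t = std::tan(delta);
    return {
        std::atan(L * t / (L - 0.5 * track * t)),  // inner wheel: tighter arc
        std::atan(L * t / (L + 0.5 * track * t))   // outer wheel: wider arc
    };
}
```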
PID Controller
Proportional-Integral-Derivative control loops regulate both longitudinal speed and lateral steering angle, continuously minimizing tracking error against the planned trajectory.
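A minimal sketch of that loop structure, with the three classic terms on the tracking error; the gains shown are illustrative, not the robot's tuned values:

```cpp
// Textbook PID step. Gains are placeholders; the robot's loops run at a
// fixed rate inside ROS 2 nodes, so dt is the loop period.
struct Pid {
    double kp, ki, kd;
    double integral = 0.0, prev_error = 0.0;

    double step(double error, double dt) {
        integral += error * dt;                       // accumulate error
        double derivative = (error - prev_error) / dt; // error trend
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;
    }
};
```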
Model Predictive Control
MPC solves a finite-horizon optimization at each timestep, predicting future states and computing control inputs that minimize a cost function over the prediction window.
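The predict-then-cost structure can be illustrated with a deliberately crude sampling version: enumerate a handful of candidate steering angles, roll each out over the horizon with a kinematic bicycle heading model, and keep the one minimizing a quadratic tracking-plus-effort cost. The real MPC solves a proper constrained optimization; everything below, including the candidate set and weights, is an illustrative assumption.

```cpp
#include <cmath>
#include <limits>

// Sampling-based receding-horizon sketch over heading only.
// yaw/yaw_ref in rad, v in m/s, L = wheelbase (m), dt = step (s).
double mpcSteerSample(double yaw, double yaw_ref, double v, double L,
                      double dt, int horizon) {
    const double candidates[] = {-0.3, -0.15, 0.0, 0.15, 0.3};
    double best_u = 0.0;
    double best_cost = std::numeric_limits<double>::infinity();
    for (double u : candidates) {
        double y = yaw, cost = 0.0;
        for (int k = 0; k < horizon; ++k) {
            y += v / L * std::tan(u) * dt;   // predicted heading
            double e = y - yaw_ref;
            cost += e * e + 0.1 * u * u;     // tracking + effort penalty
        }
        if (cost < best_cost) { best_cost = cost; best_u = u; }
    }
    return best_u;  // first input of the best rollout, applied then re-solved
}
```

Only the first input is applied; the optimization repeats at the next timestep with updated state, which is what makes the horizon "receding".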
Arduino RP2040 Low-Level Interface
The Arduino Nano RP2040 Connect translates high-level velocity and steering commands from ROS 2 into PWM signals for the brushless motor ESC and steering servo, providing deterministic real-time actuation.
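The command-to-PWM translation is essentially a saturated linear map. Sketched here with the common RC servo/ESC convention of 1000 to 2000 microsecond pulses around a 1500 us neutral; the robot's actual endpoint calibration may differ.

```cpp
#include <algorithm>

// Map a normalized command in [-1, 1] to a hobby-servo/ESC pulse width
// in microseconds. Endpoints follow the usual RC convention and are an
// assumption, not the platform's measured calibration.
int commandToPulseUs(double cmd) {
    cmd = std::clamp(cmd, -1.0, 1.0);             // saturate the command
    return static_cast<int>(1500.0 + 500.0 * cmd); // 1000..2000 us range
}
```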
Control Loop
Odometry and state estimates feed back into the controller, closing the loop.
The Platform
The Robot
A custom-built autonomous racing platform designed from the ground up.
Hardware Components
Tools and Technologies
Built With Precision
C++ — Dominant Codebase Language
C++ is the primary language across the robot's software stack. Its zero-overhead abstractions, deterministic memory management, and direct hardware access make it the right choice for real-time ROS 2 nodes where every microsecond of latency matters.
FreeRTOS on Arduino — Lowest Jitter, Highest Determinism
The Arduino Nano RP2040 Connect runs FreeRTOS to deterministically sample the IMU, read servo position, and capture motor PWM feedback — then stream that data to the Raspberry Pi at a high, consistent rate. Preemptive task scheduling and priority-based execution eliminate the jitter inherent in bare loop() polling, ensuring sensor readings arrive at the Pi with minimal and predictable latency.
Platform Comparison
How It Compares
A detailed comparison of sensor capability, algorithm coverage, real-time engineering, and industry readiness across the leading small-scale autonomous racing platforms.
| Category | F1TENTH (1/10-scale research) | Donkey Car (DIY beginner RC) | RoboRacer (1/10-scale competition) | This Platform (custom Ackermann) |
|---|---|---|---|---|
| Price | ~$4,000 | ~$250 – $400 | ~$1,400 | ~$500 |
| Primary Sensor | 2D LiDAR (RPLiDAR) | Single monocular camera | Camera or LiDAR | Stereo camera + IMU (full VIO) |
| Localization | LiDAR SLAM | None (end-to-end only) | Basic odometry or simulation | Stereo VIO + RTAB-Map 3D SLAM |
| Mapping | 2D occupancy grid | None | None or simulation | Full 3D dense mapping |
| Depth Estimation | No (LiDAR only, no stereo) | None | None | Stereo depth + 3D reprojection |
| IMU Fusion | Basic (VESC IMU) | None | None | Full camera-IMU calibration (Kalibr) + EKF sensor fusion |
| Racing Algorithms | Gap Follow, Pure Pursuit, MPC | Behavioral cloning only | Usually one algorithm | Gap Follow, Pure Pursuit, MPC + behavioral cloning |
| Behavioral Cloning | Not covered | Basic TF/Keras | Rarely | Production imitation learning on real hardware |
| Real-time Engineering | Partial (ROS 1 often) | No real-time concerns | Rarely taught | Real-time ROS 2, timestamp correction, hardware-aware coding |
| Camera Calibration | Basic | None | None | Full stereo + camera-IMU calibration with Kalibr |
| ROS Version | ROS 1 or ROS 2 | No ROS | Varies | ROS 2 (production standard) |
| Scale and Speed | 1/10 scale, fast | 1/10 scale, slow | Slow / simulation | 1/12 scale, very high speed |
| Industry Readiness | High (but inaccessible) | Low | Low | High — same stack used in industry |
Prices are approximate and vary by region and configuration.
Get In Touch
Contact
Interested in collaboration, research, or just talking robotics? Reach out.