Autonomous Delivery Vehicle


This project was inspired by the fact that students had to pick up their packages and take-away food at the main gate of the campus, far from most of the dormitories. We therefore set out to design an autonomous vehicle that could deliver these items.

Globally, the vehicle was localized by RTK-GPS, which provided roughly 10 cm accuracy, and was guided through a pre-built waypoint map. A ROS localization package fused wheel odometry, IMU, and GPS measurements with an extended Kalman filter to produce an optimal estimate of the robot's position. The A* algorithm then computed the shortest path from the closest waypoint to the given destination.
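The A* search over the waypoint map can be sketched as below. The graph, coordinates, and node names are illustrative assumptions, not the project's actual map; the heuristic is the straight-line distance between waypoints.

```python
import heapq
import math

# Hypothetical waypoint map: node -> (x, y) position in metres.
WAYPOINTS = {
    "A": (0.0, 0.0),
    "B": (10.0, 0.0),
    "C": (10.0, 10.0),
    "D": (0.0, 10.0),
    "E": (20.0, 10.0),
}

# Undirected edges between directly traversable waypoints (assumed).
EDGES = {
    "A": ["B", "D"],
    "B": ["A", "C"],
    "C": ["B", "D", "E"],
    "D": ["A", "C"],
    "E": ["C"],
}

def dist(a, b):
    """Euclidean distance between two waypoints."""
    (x1, y1), (x2, y2) = WAYPOINTS[a], WAYPOINTS[b]
    return math.hypot(x2 - x1, y2 - y1)

def a_star(start, goal):
    """Shortest waypoint path from start to goal, straight-line heuristic."""
    # Priority queue of (f = g + h, g, node, path so far).
    open_set = [(dist(start, goal), 0.0, start, [start])]
    best_g = {start: 0.0}
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        for nbr in EDGES[node]:
            g2 = g + dist(node, nbr)
            if g2 < best_g.get(nbr, float("inf")):
                best_g[nbr] = g2
                f2 = g2 + dist(nbr, goal)
                heapq.heappush(open_set, (f2, g2, nbr, path + [nbr]))
    return None  # goal unreachable from start

print(a_star("A", "E"))  # -> ['A', 'B', 'C', 'E']
```

In the real system the start node would be the waypoint closest to the RTK-GPS fix, and the goal the waypoint nearest the requested destination.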

Collecting waypoints
For perception, we mainly relied on a forward-facing RGB-D camera and a radar. The output of the RGB-D camera was fed into a neural network to obtain a segmentation of the collision-free space.
Collision-free space segmentation result
The segmentation mask drove the steering decision: the vehicle steered toward the side with more collision-free space.
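This steer-toward-free-space rule can be sketched as follows. The function, the lower-half region of interest, and the output convention (a command in [-1, 1], negative meaning left) are illustrative assumptions, not the project's actual controller.

```python
import numpy as np

def steering_from_mask(free_mask, gain=1.0):
    """Map a binary collision-free mask (H x W, 1 = free) to a steering
    command in [-1, 1]; negative steers left, positive steers right.

    Sketch: compare free pixels in the left and right halves of the
    lower part of the image (the ground ahead) and steer toward the
    side with more free space.
    """
    h, w = free_mask.shape
    roi = free_mask[h // 2:, :]              # lower half of the image
    left = roi[:, : w // 2].sum()
    right = roi[:, w // 2:].sum()
    total = left + right
    if total == 0:
        return 0.0                           # no free space visible
    return gain * (right - left) / total
```

For example, a mask whose free pixels all lie on the right side yields a command of +1.0, steering the vehicle hard right toward the open space.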
The radar acted as a backup for the visual system: if an object was detected at short range, the vehicle performed an emergency stop.
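The radar safety check amounts to a simple threshold over the range readings, as in this sketch; the function name and the 0.5 m stop distance are assumptions for illustration, not the project's actual tuning.

```python
def emergency_stop(radar_ranges_m, stop_distance_m=0.5):
    """Return True if any radar return is closer than the stop distance.

    radar_ranges_m: iterable of range readings in metres. The default
    threshold is an illustrative value, not the vehicle's real setting.
    """
    return any(r < stop_distance_m for r in radar_ranges_m)

print(emergency_stop([2.1, 0.3, 4.0]))  # -> True
print(emergency_stop([2.1, 1.8, 4.0]))  # -> False
```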
A touch screen was mounted on top of the vehicle for user-robot interaction.