Bravobot: Robotic Tours

During admission season, Oliners give tours around campus to visiting prospective students and their families.

However, the number of tours is limited by student guide availability.

Bravobot is intended to make self-guided tours interactive.


I developed Bravobot over the course of one year with a team of 7 fellow students.

Hardware

I designed Bravobot in SolidWorks and manufactured the chassis on a 3-axis wood mill from 1/4 in. plywood. Metal hardware was ordered as needed from McMaster-Carr, and plastic parts were 3D printed on a Stratasys Origin.

Bravobot uses a sliding electronics tray and an opening hatch to simplify debugging.

It runs on a fused 12 V battery and uses a RoboClaw 2x7A motor controller, an Arduino Mega, and an ODROID for robot control.
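
As a rough illustration of this control chain, the sketch below assumes the ODROID streams wheel-speed commands to the Arduino Mega over USB serial and the Arduino translates them into RoboClaw commands. The serial port, baud rate, and "L… R…" message format are hypothetical placeholders, not Bravobot's actual protocol.

```python
# Minimal sketch, assuming an ODROID -> Arduino -> RoboClaw command path.
# The port, baud rate, and message format below are hypothetical.
import serial

arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)  # port is an assumption

def send_wheel_speeds(left: int, right: int) -> None:
    """Send signed wheel speeds (e.g. -100..100) to the Arduino."""
    arduino.write(f"L{left} R{right}\n".encode("ascii"))

# Example: gentle forward motion
send_wheel_speeds(40, 40)
```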

Bravobot is equipped with the following sensors:

Software

Bravobot uses a three-tier software stack:

Leg Detection

Leg detection combines LIDAR and camera input through a superpixel clustering algorithm. During calibration, the camera image is converted to a lighting-invariant color space, and a large contiguous color contour that is noticeably closer than the background is isolated and assigned a centroid.
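
The calibration step could look roughly like the sketch below, which assumes an OpenCV BGR frame and a per-pixel depth image projected from the LIDAR scan, and collapses the contiguous-contour step to picking the single largest qualifying superpixel. The HSV color space, thresholds, and function names are illustrative assumptions, not Bravobot's actual implementation.

```python
# Rough sketch of the calibration step; names and thresholds are illustrative.
import cv2
import numpy as np
from skimage.segmentation import slic

def calibrate_leg_target(frame_bgr, depth, depth_margin=0.5, n_segments=200):
    # HSV stands in here for a lighting-invariant color space.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

    # Cluster the image into superpixels.
    labels = slic(frame_bgr, n_segments=n_segments)

    background_depth = np.median(depth)
    best_label, best_size = None, 0
    for label in np.unique(labels):
        mask = labels == label
        # Keep superpixels noticeably closer than the background.
        if np.median(depth[mask]) < background_depth - depth_margin and mask.sum() > best_size:
            best_label, best_size = label, mask.sum()

    if best_label is None:
        return None
    ys, xs = np.nonzero(labels == best_label)
    centroid = (int(xs.mean()), int(ys.mean()))
    # Remember the target's color signature for tracking in later frames.
    signature = hsv[labels == best_label].mean(axis=0)
    return centroid, signature
```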

As the visitor walks around the Academic Center, Bravobot will turn to keep their legs centered in the field of view.
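
A minimal sketch of that tracking behavior, assuming a proportional controller on the centroid's horizontal offset; the image width, gain, deadband, and the send_wheel_speeds stand-in are placeholder assumptions.

```python
# Minimal sketch: turn so the tracked centroid stays near the image center.
IMAGE_WIDTH = 640   # assumed camera resolution
TURN_GAIN = 0.3     # placeholder proportional gain
DEADBAND_PX = 20    # ignore small offsets to avoid twitching
BASE_SPEED = 40     # placeholder forward speed

def send_wheel_speeds(left: int, right: int) -> None:
    # Stand-in for the serial command in the earlier control sketch.
    print(f"wheel speeds: L={left} R={right}")

def follow_step(centroid_x: int) -> None:
    error = centroid_x - IMAGE_WIDTH // 2
    turn = 0 if abs(error) < DEADBAND_PX else int(TURN_GAIN * error)
    # Differential drive: bias wheel speeds to rotate toward the target.
    send_wheel_speeds(BASE_SPEED + turn, BASE_SPEED - turn)
```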

Simultaneous Localization and Mapping (SLAM)

Bravobot uses the gmapping ROS package to generate a map of the Academic Center. By localizing itself within the map, which is annotated with a color-coded key of interest points, Bravobot is able to deliver a customized recording to the visitor.
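
As an example of how a localized pose could be matched against such a key, the sketch below looks up the nearest interest point within a trigger radius. The POI coordinates, radius, and recording filenames are hypothetical placeholders, not the actual Academic Center annotations.

```python
# Sketch only: the POI table and trigger radius below are hypothetical.
import math

POINTS_OF_INTEREST = {
    "library": ((12.4, 3.1), "library.wav"),
    "dining_hall": ((25.0, -7.8), "dining_hall.wav"),
}
TRIGGER_RADIUS_M = 2.0

def recording_for_pose(x: float, y: float):
    """Return the recording for the nearest POI within the trigger radius, if any."""
    best, best_dist = None, TRIGGER_RADIUS_M
    for name, ((px, py), recording) in POINTS_OF_INTEREST.items():
        dist = math.hypot(px - x, py - y)
        if dist < best_dist:
            best, best_dist = recording, dist
    return best
```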

Public Project Artifacts

YouTube Project Summary

YouTube Human-safe Obstacle Avoidance