Presents concepts, principles, and algorithmic foundations for robots and autonomous vehicles operating in the physical world. Topics include sensing, kinematics and dynamics, state estimation, computer vision, perception, learning, control, motion planning, and embedded system development. Students design and implement advanced algorithms on complex robotic platforms capable of agile autonomous navigation and real-time interaction with the physical world. Students engage in extensive written and oral communication exercises.
Throughout the semester my team and I (the Bad Bananas) worked on a variety of projects that we first implemented on a simulated platform before deploying on the real robotic platform shown below. Our labs covered Wall Following control algorithms, Visual Servoing, Localization, and Motion Planning and Trajectory Following, using the Hokuyo LiDAR sensor and the ZED camera. More details on each lab can be found on our team website linked above!
For our final project we implemented navigation using Deep Neural Nets for Gate Detection and Imitation Learning. My focus on the team was the Imitation Learning side of the project. For this we deployed an imitation learning system, PilotNet, which spanned low- and high-level perception and control to drive a car around the entire Stata (MIT building) basement loop based solely on human-driven examples. We also used a different car for this final project, one that relied solely on 3 basic webcams and a more powerful onboard computer, as seen below.
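The core idea behind this kind of imitation learning is behavioral cloning: record (camera features, human steering) pairs while a person drives, then fit a model that regresses steering from the features. PilotNet itself is a convolutional network; the sketch below is a deliberately tiny, hypothetical stand-in that uses a linear model and plain gradient descent, just to illustrate the training loop. The feature values and weights here are synthetic, not from our actual system.

```python
# Toy behavioral-cloning sketch (hypothetical; PilotNet proper is a CNN).
# Illustrates the core idea: learn to map camera-derived features to a
# steering command using only human-driven examples.
import random

random.seed(0)

def predict(weights, features):
    """Steering command = weighted sum of image features."""
    return sum(w * f for w, f in zip(weights, features))

def train(examples, n_features, lr=0.05, epochs=200):
    """Fit weights by stochastic gradient descent on squared steering error."""
    weights = [0.0] * n_features
    for _ in range(epochs):
        for features, steer in examples:
            err = predict(weights, features) - steer
            weights = [w - lr * err * f for w, f in zip(weights, features)]
    return weights

# Synthetic "human demonstrations": features might be e.g. lane-offset
# statistics extracted from the webcams; steering is in [-1, 1].
true_w = [0.8, -0.3]  # hypothetical ground-truth driving policy
demos = []
for _ in range(100):
    feats = [random.uniform(-1, 1), random.uniform(-1, 1)]
    demos.append((feats, predict(true_w, feats)))

learned = train(demos, n_features=2)
print([round(w, 2) for w in learned])  # should recover roughly [0.8, -0.3]
```

In the real system the linear regressor is replaced by a deep network and the features by raw camera frames, but the train-on-demonstrations loop has the same shape.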
Our final project video can be seen here!