This competition required designing a robot that could avoid obstacles and use GPS coordinates to find orange traffic cones. Consumer GPS is not accurate enough to pinpoint a cone's exact position, so a camera had to locate the cone in the general vicinity of the coordinates and guide the robot to touch it. I used a 6-wheeled platform so the rover could negotiate the rough terrain.
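Waypoint navigation of this kind usually reduces to computing the bearing from the rover's current GPS fix to the next target and steering toward it. A minimal sketch in Python (the actual firmware was Spin/assembly on the Propeller, so this is only an illustration of the math):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from fix 1 to fix 2, in degrees.

    0 = north, 90 = east. Inputs are decimal-degree lat/lon.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
```

The rover would compare this bearing against its compass heading and turn to close the difference; at the short ranges involved, a flat-earth approximation would work just as well.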
We were allowed to survey the area before the run, so I used that time to record intermediate GPS waypoints, ensuring the obstacle-avoidance subsystem engaged only when necessary. The obstacles were primarily bushes, which are tricky to detect with sonar.

For the machine-vision system, I used a Pixy camera, which finds pixels matching the cone's bright-orange color, draws a bounding box around them, and estimates a confidence level via color segmentation. The position of the bounding box in the frame gave the relative coordinates of the cone. The camera was mounted on a servo that panned through 180 degrees, and the rover drove toward wherever the camera saw the cone.
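Turning the bounding box into a steering direction is straightforward: the box's horizontal offset from the frame center maps to an angle within the camera's field of view, which is then added to the pan-servo angle. A hedged sketch, assuming a Pixy-style 320-pixel-wide frame and roughly a 75-degree horizontal field of view (the frame width, field of view, and function names here are illustrative, not the original code):

```python
FRAME_W = 320    # assumed horizontal resolution in pixels
FOV_DEG = 75.0   # assumed horizontal field of view

def cone_bearing(servo_deg, block_x):
    """Bearing of the cone relative to the rover's heading.

    servo_deg: pan-servo angle (0 = straight ahead, positive = right).
    block_x:   x-center of the detected bounding box, in pixels.
    """
    pixel_offset = block_x - FRAME_W / 2.0
    return servo_deg + (pixel_offset / FRAME_W) * FOV_DEG
```

The rover then simply steers to drive this combined bearing toward zero, which is what "drove toward wherever the camera saw the cone" amounts to in practice.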
I shied away from a kernel-based system such as ROS: it was unnecessarily complex for the application, and its many hidden variables thoroughly confuse debugging. The difficulty was illustrated by the other teams on competition day, when various expletives were thrown around over unforeseen errors emanating from ROS. Instead, I used the Parallax Propeller, a 32-bit microcontroller with 8 cores. It was programmed from scratch in the Spin language and later optimized with raw assembly to improve execution speed. This low-level, no-frills approach allowed me to place 4th among teams from 17 countries.
- Project Started: July 2014
- Project Completed: April 2015
- Prototypes: 2
- Estimated Total Build Time: 100hrs
- Current Status: Project Archived