Overview

The rover's base functionality is object avoidance: it steers clear of barriers and navigates through open space. Once the user activates the motors, the vehicle is completely autonomous, choosing its direction of movement based on the open space it detects. Additional functionality includes recognizing obstacles by color. Upon reaching a barrier, the rover associates the barrier's color with the series of movements required to navigate around it; colors are detected by a PIXY camera mounted on the front of the vehicle. This basic learning mechanism lets the rover maneuver around objects based on its past encounters.

How We Built It

The chassis of the rover comes from a DFRobot 4WD Mobile Robot Platform kit. Mounted on the top plate are two Arduino Uno microcontrollers, several small breadboards, and a custom press-fit acrylic mount that houses five ultrasonic sensors. The rover's functionality is divided between the two Arduinos: one controls the sensors, the other the DC motors. During movement, the top Arduino retrieves readings from the sensors, converts them into distance values, and sends them to the bottom Arduino over an I2C bus. The bottom Arduino parses the distance values and sends the appropriate signals to the DC motors to dictate movement.

Challenges Encountered

The initial design of the rover included an IMU sensor, which was used to calibrate turns: the vehicle would turn left or right by the angle specified. However, the sensor's inaccuracy indoors often translated into imperfect turns; at times it could not calibrate a turn at all, and the rover would spin continuously in circles. We resolved this by removing the IMU and using wheel encoders instead. Counting wheel rotations provided a more accurate system of navigation.

We encountered another dilemma when working with the ultrasonic sensors. Each sensor returned a pulse whose width was proportional to the detected distance. These widths would normally be measured with the Arduino's input capture pins, one per sensor, but the Arduino Uno exposes only one input capture pin. We resolved this by writing our own subroutine that mimics the microcontroller's input capture: it continuously polls each sensor for voltage transitions (high to low, low to high) and records the timer count at each one. This yielded relatively accurate pulse widths, and in turn acceptable distance values: our custom input capture subroutine measured distances with an error of less than half a centimeter.

What's next?

In future iterations, we would like to improve the rover's navigation. The ultrasonic sensors mostly provided adequate distance readings, but the readings fluctuated drastically when a sensor faced a barrier at an oblique angle. We may replace the ultrasonic sensors with a LiDAR sensor, which is far more accurate.

Additionally, we would like to add more functionality to the rover. For instance, the user could specify the start and end points of a route, leaving the intermediate journey up to the rover.
