What it does

The drone hovers over the user's head in a completely unobtrusive manner. In addition to satisfying your desire for revenge on the copious amounts of rain it collects, it is incredibly safe to take with you wherever you go. Furthermore, since the motors are waterproof, you can have peace of mind knowing that at least half of the umbrella is waterproof.

The system uses a flight controller, housing accelerometers and a compute module, which stabilizes the drone and applies PID control loops to fly it. This is connected to a companion Raspberry Pi 4B, which computes the location of the person underneath using the ROS 2 framework, combining AprilTag detection with a pair of PID controllers for movement. For indoor navigation, we had to add a couple of extra sensors, namely a LiDAR rangefinder and an optical flow sensor, for altitude and positioning. The drone was built and developed over a 24-hour hacking period.
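Our actual controller code is not included in this writeup, so purely as an illustration, here is a minimal sketch of how a PID controller can turn a tag's pixel offset into a horizontal velocity command. The gains, image resolution, and sign conventions below are assumptions, not values from our drone:

```python
class PID:
    """Minimal PID controller: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# One controller per horizontal axis: drive the tag's pixel offset
# from the image center toward zero.
pid_x = PID(kp=0.004, ki=0.0, kd=0.001)
pid_y = PID(kp=0.004, ki=0.0, kd=0.001)

IMAGE_W, IMAGE_H = 640, 480      # assumed camera resolution
tag_cx, tag_cy = 500.0, 300.0    # detected tag center (pixels)

# Positive error means the tag sits right of / below the image center;
# the output becomes a velocity setpoint toward the tag.
vel_x = pid_x.update(tag_cx - IMAGE_W / 2, dt=0.05)
vel_y = pid_y.update(tag_cy - IMAGE_H / 2, dt=0.05)
```

In a setup like ours, setpoints of this kind would be forwarded to the flight controller over MAVROS at a fixed rate.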

Challenges we ran into

  • Motor directions mixing up and switching. When building and assembling a drone, each motor has to be matched with its respective electronic speed controller so that the flight controller's PID loops command the correct motor at the correct time. However, the way we soldered the connections flipped the spin direction of some motors and scrambled the motor numbering relative to the AUX outputs, which caused a lot of hassle.

Normally, to fix a brushless DC motor spinning in the wrong direction, two of its three wires have to be swapped and resoldered. Because we were on a time crunch and everything was already mounted, this was not an option. So we did some research and switched the ESC protocol from PWM (pulse-width modulation) to DShot300, a digital protocol that supports reversing motors in software. We still had to do extensive testing to figure out the correct output numbers for each motor.
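We did not record our exact parameter file in this writeup, but on ArduCopter, switching to DShot and reversing motors in software is generally done through parameters along these lines (the values shown are illustrative assumptions, not our tested configuration):

```
MOT_PWM_TYPE      5   # 5 = DShot300 (digital ESC protocol)
SERVO_BLH_AUTO    1   # enable BLHeli_32/BLHeli_S passthrough support
SERVO_BLH_RVMASK  5   # bitmask of motors to reverse in software (e.g. motors 1 and 3)
```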

  • Flying indoors

    - To test the drone, our only option was to fly indoors, since outdoor flight in downtown Toronto is strictly prohibited (and there is no wind indoors). Flying indoors is extremely difficult because the primary source of velocity, GPS, cannot lock onto enough satellites. Multiple other sensors are needed to determine the drone's velocity and position. The Pixhawk flight controller we used runs an Extended Kalman Filter, which fuses the different sensors, weighted by their uncertainties, to estimate the state as accurately as possible. The difficulties of fusing these sensors are described in detail in the sections below.
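The actual EKF runs inside the Pixhawk firmware, but the core idea of uncertainty-weighted fusion can be shown with a one-dimensional Kalman update. All numbers below are made up for illustration:

```python
def kalman_1d(est, var, meas, meas_var):
    """Fuse a prior estimate (est, var) with a measurement (meas, meas_var).

    The Kalman gain weights the measurement by its relative certainty:
    a noisy sensor moves the estimate less than a precise one, and the
    fused variance is smaller than either input variance.
    """
    gain = var / (var + meas_var)
    new_est = est + gain * (meas - est)
    new_var = (1 - gain) * var
    return new_est, new_var


# Predicted altitude from the barometer: 1.50 m, fairly uncertain.
alt, alt_var = 1.50, 0.20
# LiDAR rangefinder reading: 1.42 m, much more precise indoors.
alt, alt_var = kalman_1d(alt, alt_var, meas=1.42, meas_var=0.01)
# The fused estimate lands close to the LiDAR value, with a variance
# smaller than either source alone.
```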

    • Fusing LiDAR for altitude
      • To fly indoors, we needed an accurate altitude measurement to keep the drone fixed on the z-axis, both for testing purposes and for safety (so it would not hit the ceiling). We ran into multiple issues while preparing for flight; for example, the Pixhawk would not arm (spin up the motors in preparation for takeoff) unless the LiDAR read a minimum distance off the ground.
    • Fusing optical flow
      • Since the GPS could not get a fix, it could not serve as the primary velocity source for the drone. To compensate, we used an optical flow sensor, which proved unreliable in several respects. During overnight testing we found that it performs poorly in dimly lit environments, drifting in all directions and forcing us to execute a few emergency stops. Furthermore, the ArduCopter documentation on calibrating the sensor was not very clear, and we had to piece the procedure together bit by bit.
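Our exact parameter file is not included here, but on ArduCopter, pointing the EKF at the rangefinder and optical flow instead of GPS typically involves EKF source-selection parameters along these lines (values are illustrative assumptions, not our tested configuration):

```
EK3_SRC1_POSXY  0    # no horizontal position source (no GPS indoors)
EK3_SRC1_VELXY  5    # 5 = optical flow as the horizontal velocity source
EK3_SRC1_POSZ   2    # 2 = rangefinder (LiDAR) as the altitude source
FLOW_TYPE       ...  # set to match the optical flow sensor in use
RNGFND1_TYPE    ...  # set to match the LiDAR rangefinder in use
```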

Accomplishments that we're proud of

  • A flying, functional large-sized drone fully built in under 24 hours.

    • In 24 hours we were able to assemble a large drone that follows an AprilTag, for a very, very useful application.
  • We did not get injured during the test flights.

    • We are happy that we are still in three pieces. We mean: one piece each. We took safety very seriously and took many precautions to reduce the risks to both ourselves and others around us.
  • Learning the complex (and painful) ROS 2 framework

    • We were able to learn and apply the powerful (and painful) ROS 2 framework, laying the groundwork for future work in drone and robot development for each of us.
  • Overcoming the many limitations and challenges of achieving stable GPS-free indoor flight.

    • We are very satisfied that our drone can fly indoors autonomously, without a GPS lock, and hold its position very well in ArduCopter's LOITER mode.

What we learned

AprilTags are a solid and robust alternative to machine learning, especially when you don't have datasets of the tops of people's heads. As long as the tag is in frame, the detector consistently finds the center of the tag with near-100% accuracy, and it can also estimate the full pose (position and orientation) of the tagged object. More information about the tags can be found in this paper: https://april.eecs.umich.edu/media/pdfs/olson2011tags.pdf.
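As a worked example of why the tag center alone is enough for loitering: under a pinhole camera model, a pixel offset from the image center converts to a metric ground offset given the altitude. The focal length and principal point below are assumed values, not calibration data from our rig:

```python
# Pinhole model: u - cx = f * X / Z, so X = (u - cx) * Z / f.
FX = FY = 600.0          # assumed focal length in pixels
CX, CY = 320.0, 240.0    # assumed principal point (image center)


def pixel_offset_to_metres(u, v, altitude_m):
    """Convert a tag center (u, v) in pixels to a ground-plane offset
    in metres, assuming the camera looks straight down from altitude_m."""
    dx = (u - CX) * altitude_m / FX
    dy = (v - CY) * altitude_m / FY
    return dx, dy


# A tag detected 120 px right of center while hovering at 2 m
# corresponds to a 0.4 m horizontal offset to correct.
dx, dy = pixel_offset_to_metres(440.0, 240.0, altitude_m=2.0)
```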

The ROS 2 framework is a very efficient and organized tool for managing large robotics projects such as drones. In the past we were biased toward pure Python or C scripts, which would do the job but read like spaghetti code. ROS 2 served us well organizationally: we could define which nodes perform which tasks and what data is transferred between those nodes. We have also uploaded a picture of the node graph we tested on this drone.
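Our real nodes use rclpy, but the organizational idea (small single-purpose nodes exchanging messages over named topics) can be sketched with a toy publish/subscribe bus in plain Python. This is a stand-in for illustration, not ROS 2 code, and the topic names are invented:

```python
from collections import defaultdict


class Bus:
    """Toy stand-in for the ROS 2 topic graph: topics map to callbacks."""

    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self.subs[topic]:
            cb(msg)


bus = Bus()
log = []

# "Detector node": turns camera frames into tag-center messages.
bus.subscribe("/camera/image", lambda frame: bus.publish("/tag/center", frame["tag"]))
# "Controller node": turns tag centers into velocity setpoints
# (here just the raw pixel offset from an assumed 640x480 center).
bus.subscribe("/tag/center", lambda c: log.append(("cmd_vel", c[0] - 320, c[1] - 240)))

bus.publish("/camera/image", {"tag": (440, 300)})
# log now holds one velocity setpoint derived from the tag offset.
```

The payoff is the same as with ROS 2 proper: each node can be tested, replaced, or inspected in isolation because the only coupling is the topic names.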

What's next for Anti-Umbrella-Dynamic-Loiter

Although this project may seem utterly useless, the central problem of accurately loitering above a moving target is nontrivial, and we have only scratched the surface. We used a simple PID controller due to the hackathon's time constraints, but we are interested in investigating more advanced forms of control, such as Model Predictive Control (MPC), which would allow much more consistent, robust, and efficient target following.

Built With

  • apriltag
  • arducopter
  • mavros
  • pixhawk
  • ros2