🚨 Cyclops: AI-Powered Backpack Sentry
Inspiration
With the rapid advancement of computer vision and hardware capabilities, we wanted to tackle a problem that directly impacts personal safety. We noticed a common issue among travelers and everyday pedestrians who carry backpacks: a lack of awareness of what is happening behind them.
There are numerous threats this system could help guard against, such as stalking, pickpocketing, or unwanted proximity. Our goal was to build an awareness system that leverages AI to enhance safety and peace of mind in everyday movement.
What It Does
Cyclops is an AI-powered awareness system that mounts a rear-facing camera onto a backpack to monitor the user’s blind spot.
We developed algorithms and criteria to identify “suspicious” behavior, such as:
- A person remaining in frame for an extended period
- Someone moving closer over time
- Repeated reappearances of the same individual
If the system deems a person suspicious, it alerts the user’s phone via push notification, providing real-time situational awareness.
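The criteria above can be sketched as simple rule checks over per-track statistics. This is an illustrative sketch, not our exact production logic; the `TrackStats` name and the threshold values are placeholders:

```python
from dataclasses import dataclass

# Illustrative per-person statistics aggregated from the tracker.
# Threshold defaults below are placeholders, not tuned values.
@dataclass
class TrackStats:
    frames_in_view: int   # consecutive frames the person was visible
    area_growth: float    # relative bbox-area growth (proxy for approaching)
    reappearances: int    # times the same identity re-entered the frame

def is_suspicious(s: TrackStats,
                  max_dwell_frames: int = 300,    # ~10 s at 30 fps
                  approach_threshold: float = 0.5,
                  max_reappearances: int = 2) -> bool:
    """Flag a person if any of the three heuristics fires."""
    lingering = s.frames_in_view > max_dwell_frames
    approaching = s.area_growth > approach_threshold
    recurring = s.reappearances > max_reappearances
    return lingering or approaching or recurring
```

Any one of the three signals is enough to trigger an alert, which keeps the logic easy to reason about and easy to tune per criterion.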
How We Built It
We used YOLOv11 for object detection and DeepSORT for multi-object tracking across the video stream.
Alongside these models, Python scripts extracted tracking data and generated per-person statistics, which were then fed into GenAI models and our own algorithms to determine whether a person was suspicious.
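One of the statistics we can derive from the tracker is an "approach" signal: if a person's bounding box grows over time, they are likely getting closer to the camera. A hedged sketch of that idea, assuming `(x1, y1, x2, y2)` boxes and using a least-squares fit (our illustrative choice):

```python
import numpy as np

def bbox_area(box):
    """Area of an (x1, y1, x2, y2) bounding box."""
    x1, y1, x2, y2 = box
    return max(0.0, x2 - x1) * max(0.0, y2 - y1)

def approach_rate(boxes):
    """Slope of a least-squares line fit to bbox areas over frames.

    A positive slope means the box is growing over time, i.e. the
    person is likely moving closer to the rear-facing camera.
    """
    areas = np.array([bbox_area(b) for b in boxes], dtype=float)
    if len(areas) < 2:
        return 0.0  # not enough history to estimate a trend
    t = np.arange(len(areas), dtype=float)
    slope, _intercept = np.polyfit(t, areas, 1)
    return float(slope)
```

A positive `approach_rate` over a recent window of frames can then feed the "moving closer over time" criterion.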
Finally, we integrated Firebase and Android Studio to create a real-time mobile notification system that connects the CV pipeline to the user’s device.
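When a track is flagged, the pipeline hands an alert to the notification layer. A minimal sketch of such an alert payload; the field names here are illustrative, and actual delivery goes through Firebase and the Android app:

```python
import json
import time

def build_alert(track_id: int, reason: str) -> str:
    """Serialize a suspicion alert for the mobile client.

    Field names are illustrative; the real payload is whatever
    schema the Android app reads from Firebase.
    """
    payload = {
        "type": "cyclops_alert",
        "track_id": track_id,
        "reason": reason,  # e.g. "lingering", "approaching", "reappeared"
        "timestamp": int(time.time()),
    }
    return json.dumps(payload)
```

Keeping the payload small and flat makes it cheap to push in real time and simple for the app to parse.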
Challenges We Ran Into
Computer vision is a notoriously complex field. Our biggest challenge was the re-identification (ReID) problem — determining whether a person leaving and re-entering the frame is the same individual or a new one.
This problem is complicated by factors such as:
- Image resolution affecting embedding quality
- Varying lighting conditions
- Changes in pose or orientation
- Distance and occlusion
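At its core, ReID compares appearance embeddings: a person re-entering the frame is matched against known identities by cosine similarity, which is roughly what DeepSORT's appearance branch does. A sketch of that matching step, with an illustrative (not tuned) threshold:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two appearance embeddings."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def match_identity(new_emb, known_embs, threshold=0.7):
    """Match a new embedding against known identities.

    Returns the index of the best match above `threshold`, or None
    if the person looks new. The threshold is illustrative; in
    practice it must be tuned against resolution, lighting, and
    pose variation, which is exactly what makes ReID hard.
    """
    if not known_embs:
        return None
    sims = [cosine_similarity(new_emb, e) for e in known_embs]
    best = int(np.argmax(sims))
    return best if sims[best] >= threshold else None
```

The factors listed above all degrade embedding quality, which pushes similarity scores for the same person below the threshold and causes identity switches.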
Although our model isn’t perfect, it serves as a proof of concept that demonstrates the potential of our full system.
Accomplishments We’re Proud Of
We’re proud of building something unique and interdisciplinary. Successfully merging multiple technologies was a huge accomplishment for our team.
Our tech stack includes:
- Computer Vision (YOLOv11, DeepSORT)
- Arduino hardware, ultrasonic sensor, webcam
- Python scripting and GenAI integration
- Firebase database
- Android mobile app development
Bringing all these together into a cohesive prototype was no small feat.
What We Learned
We learned a lot — both from our successes and our limitations.
Beyond becoming proficient with the tools in our tech stack, we learned about project scoping, integration challenges, and adaptability.
Originally, we attempted to create a fully autonomous agentic system using Google’s ADK. Although we pivoted away from that approach due to hardware limitations, we gained valuable insights from the process.
What’s Next for Cyclops
Looking ahead, we plan to:
- Improve computer vision performance using newer models and optimized quantization techniques
- Enhance the mobile app with more safety features and user customization
- Streamline the workflow for better efficiency and maintainability
- Explore on-device inference for low-latency and offline operation
Built With
- ai
- android-studio
- arduino
- deepsort
- firebase
- python
- sensor
- yolo
