Workplace violence in healthcare is a serious but often overlooked issue, with studies showing that 1 in 5 healthcare workers experience physical violence or verbal abuse. Our AI-powered Hospital Violence Detection System aims to detect and prevent violent incidents in real time using video and audio analysis, ensuring a safer work environment for medical staff and patients.
- Real-time Detection: Identifies signs of verbal aggression and physical violence.
- AI-Powered Analysis: Uses computer vision and deep learning to distinguish between normal and aggressive behavior.
- Automated Alerts: Notifies security personnel instantly when violence is detected.
- Privacy-Focused: No personal identification; the system analyzes actions, not individuals.
- User-Friendly Dashboard: Displays alerts and logs for monitoring and review.
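The real-time detection and automated-alert features above typically need a debouncing step: rather than alerting on a single suspicious frame, the system waits until an aggression score stays high for several consecutive frames. The sketch below illustrates that idea only — the `AlertDebouncer` class, threshold, and window size are assumptions for illustration, not the project's actual implementation.

```python
from collections import deque

class AlertDebouncer:
    """Raise an alert only when the per-frame aggression score stays
    at or above `threshold` for `window` consecutive frames.
    (Illustrative sketch -- not the actual MedSafe AI logic.)"""

    def __init__(self, threshold: float = 0.8, window: int = 5):
        self.threshold = threshold
        self.scores = deque(maxlen=window)  # rolling window of recent scores

    def update(self, score: float) -> bool:
        self.scores.append(score)
        # Alert only once the window is full and every score clears the bar.
        return (len(self.scores) == self.scores.maxlen
                and all(s >= self.threshold for s in self.scores))

debouncer = AlertDebouncer(threshold=0.8, window=3)
print(debouncer.update(0.9))   # False: only one high-score frame so far
print(debouncer.update(0.95))  # False: two frames, window not yet full
print(debouncer.update(0.85))  # True: three consecutive high-score frames
```

A model producing per-frame scores (e.g. from the computer-vision pipeline) would feed `update()` once per frame; a single noisy frame then cannot trigger a false alarm.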
- Healthcare Workers: Provides a safer work environment by reducing incidents of violence.
- Hospitals & Clinics: Enhances security without additional manpower.
- Patients: Ensures a calm and safe healthcare experience.
- Bias Mitigation: Trained on diverse datasets to avoid discrimination.
- Privacy & Security: All data is encrypted, and no personal identifiers are stored.
- Transparency: AI decisions are explainable and auditable.
- Programming Language: Python
- Frameworks & Libraries: OpenCV, TensorFlow
HackHive (MedSafe AI) is an AI-powered hospital security system that detects aggressive behavior and alerts security personnel in real time. Follow the steps below to install, configure, and run the system.
Before installing, ensure your system has the following dependencies installed:
- Python 3.8+
- pip (Python package manager)
- Docker & Docker Compose (For containerized deployment)
- Git (To clone the repository)
- FFmpeg (For audio processing)
- CUDA Toolkit (If using GPU acceleration)
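You can quickly confirm the command-line tools above are on your PATH before installing. This is a sketch assuming a typical Linux/macOS shell; adjust the command names (e.g. `python` vs. `python3`) for your platform:

```shell
# Print ok/missing for each required tool (illustrative check)
for cmd in python3 pip git ffmpeg docker; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "ok: $cmd"
  else
    echo "missing: $cmd"
  fi
done
```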
```shell
git clone https://github.com/AbdulMustaf/HackHive.git
cd HackHive
pip install -r requirements.txt
```

Note: If using a GPU, install `torch` and `torchvision` with CUDA support.
Create a .env file in the root directory and add the following:
```
DATABASE_URL=postgresql://user:password@localhost:5432/medsafe
SECRET_KEY=your_secret_key
CUDA_ENABLED=True
```

Then start the detection pipeline:

```shell
python main.py
```

- This will launch the MedSafe AI detection pipeline.
- The system will start monitoring video, audio, and speech for signs of violence.
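For completeness, here is one way the `.env` variables might be read at startup. Production code would typically use a library such as `python-dotenv`; this stdlib-only sketch (the `load_env` helper is illustrative, not part of the repository) shows the idea:

```python
import os

def load_env(path: str = ".env") -> None:
    """Read KEY=VALUE lines from a .env file into os.environ.
    Minimal illustrative parser: skips blank lines and comments,
    and does not overwrite variables already set in the environment."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# Usage (after creating the .env shown above):
# load_env()
# cuda_enabled = os.environ.get("CUDA_ENABLED") == "True"
```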
```shell
docker-compose up --build
```

- This will set up the entire application and database inside containers.
- Open your browser and navigate to http://localhost:8000.
- View real-time alerts and detection logs on the dashboard.
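The `docker-compose up --build` step above implies a `docker-compose.yml` along these lines. This is a hedged sketch, not the repository's actual file: the service names and Postgres image tag are assumptions, while the port (8000) and database credentials mirror the dashboard URL and `DATABASE_URL` used elsewhere in this guide.

```yaml
version: "3.8"
services:
  app:
    build: .                 # builds the MedSafe AI image from this repo
    ports:
      - "8000:8000"          # dashboard exposed at http://localhost:8000
    env_file: .env
    depends_on:
      - db
  db:
    image: postgres:15       # assumed version; pin to whatever you test against
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: medsafe
```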
For production deployment:
- Use Kubernetes or AWS Lambda for scalable hosting.
- Store logs in PostgreSQL or Cloud Storage for compliance.
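For the compliance logging mentioned above, an alert-log table might look like the following. The table and column names are assumptions for illustration; the sketch uses SQLite from Python's standard library so it runs anywhere, but a production deployment would apply the same schema to PostgreSQL as described above.

```python
import sqlite3

# Illustrative alert-log schema (assumed names, not the project's actual schema)
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE alerts (
        id          INTEGER PRIMARY KEY,
        detected_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP,
        camera_id   TEXT NOT NULL,
        alert_type  TEXT NOT NULL CHECK (alert_type IN ('verbal', 'physical')),
        confidence  REAL NOT NULL
    )
""")
conn.execute(
    "INSERT INTO alerts (camera_id, alert_type, confidence) VALUES (?, ?, ?)",
    ("ward-3-cam-1", "physical", 0.92),
)
for row in conn.execute("SELECT camera_id, alert_type, confidence FROM alerts"):
    print(row)  # ('ward-3-cam-1', 'physical', 0.92)
```

Note that, consistent with the privacy principles above, the schema records cameras and actions but no personal identifiers.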
We welcome contributions! 🚀 If you’d like to improve MedSafe AI:
- Fork the repository.
- Create a feature branch.
- Submit a Pull Request.
For issues or suggestions, open an Issue or contact the maintainer.
Happy coding! 🔥👨‍💻