Autonomous food bank built at TreeHacks 2026! xArm 1S sorts donated food, Claude agents coordinate with donors + shelters.
- Hiwonder xArm 1S over USB (connected to laptop)
- Laptop webcam for vision classification
```
python -m venv venv
.\venv\Scripts\Activate.ps1
pip install -r requirements.txt
```

Set your API key before running vision features:

```
$env:ANTHROPIC_API_KEY="YOUR_API_KEY"
```

Physically move the arm to each position and record servo values:

```
python calibrate.py
```

Copy the output into positions.py.

Start the API server:

```
python api.py
```

Runs on http://localhost:5000.
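The summary stats the API serves can be derived straight from the donation log. A minimal sketch of that aggregation, assuming donation records carry `category`, `weight_g`, and `donor` fields (the field names here are assumptions, not taken from the actual api.py):

```python
# Hypothetical donation-record shape and /stats-style aggregation;
# field names are illustrative, not from the real code.
from collections import Counter

def summarize(donations):
    """Aggregate donation records into a dashboard-style summary."""
    return {
        "items": len(donations),
        "total_weight_g": sum(d.get("weight_g", 0) for d in donations),
        "donors": len({d["donor"] for d in donations if "donor" in d}),
        "by_category": dict(Counter(d["category"] for d in donations)),
    }

demo = [
    {"item": "apple", "category": "fruit", "weight_g": 180, "donor": "alice"},
    {"item": "chips", "category": "snack", "weight_g": 60, "donor": "bob"},
]
print(summarize(demo))
```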
Open dashboard.html in your browser. It will show:
- Live stats (items, weight, donors)
- Category breakdown (fruit/snack/drink)
- Real-time donation feed with Claude's detailed classifications
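For the category breakdown, Claude's free-text classification has to collapse into the three bins above. One way that could look, as a hypothetical normalizer (the real pipeline may instead prompt Claude for the category directly):

```python
# Hypothetical label-to-bin mapping; word lists are illustrative only.
FRUIT = {"apple", "banana", "orange", "pear"}
DRINK = {"water", "juice", "soda", "milk"}

def categorize(label: str) -> str:
    """Collapse a free-text item label into one of three sorting bins."""
    word = label.strip().lower()
    if word in FRUIT:
        return "fruit"
    if word in DRINK:
        return "drink"
    return "snack"  # default bin for everything else

print(categorize("Apple"), categorize("juice"), categorize("granola bar"))
# prints: fruit drink snack
```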
Test mode (manual capture):

```
python test_pipeline.py
```

Press SPACE to capture and classify.

Auto mode (motion detection):

```
python main.py
```

Watches for items placed in front of the camera, then auto-classifies and sorts them.
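The motion detection can be as simple as comparing consecutive grayscale frames. A minimal sketch of that idea, using plain NumPy frame differencing with a guessed threshold (main.py's actual method and tuning may differ):

```python
import numpy as np

MOTION_THRESHOLD = 12.0  # mean per-pixel difference; tuning value is a guess

def motion_score(prev_gray, curr_gray):
    """Mean absolute per-pixel difference between two grayscale frames."""
    diff = curr_gray.astype(np.int16) - prev_gray.astype(np.int16)
    return float(np.mean(np.abs(diff)))

def item_placed(prev_gray, curr_gray, threshold=MOTION_THRESHOLD):
    """True when the frames differ enough to suggest a new item appeared."""
    return motion_score(prev_gray, curr_gray) > threshold

# Synthetic frames: identical frames -> no motion; a bright patch -> motion.
still = np.zeros((48, 64), dtype=np.uint8)
moved = still.copy()
moved[10:40, 20:50] = 200
print(item_placed(still, still), item_placed(still, moved))
# prints: False True
```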
Vision-only (no arm):

```
python main.py --no-arm
```

Test arm:

```
python test_arm.py
```

Test camera + vision:

```
python test_vision.py
```

Test camera demo:
```
python camera_demo.py --camera 0
```

- Donations logged to donations.json
- Captured images saved to images/
- API endpoints:
  - GET /donations - all records
  - GET /donations/recent?limit=10 - latest N
  - GET /stats - summary stats
  - GET
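These endpoints can be exercised from Python with only the standard library. The sketch below stands up a stub server in place of api.py so the example runs on its own; against the real app you would point `base` at http://localhost:5000 instead:

```python
# Self-contained demo of calling the donation API with urllib. A stub
# HTTP server stands in for api.py here; its response body is made up.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Fake /stats payload; the real server's fields may differ.
        body = json.dumps({"items": 2, "donors": 2}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), StubHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

base = f"http://127.0.0.1:{server.server_port}"
with urlopen(f"{base}/stats") as resp:
    stats = json.load(resp)
server.shutdown()
print(stats)
```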