PranavViswanath/project-lend


Project Lend

Autonomous food bank built at TreeHacks 2026! An xArm 1S sorts donated food while Claude agents coordinate with donors and shelters.

Hardware

  • Hiwonder xArm 1S connected to the laptop over USB
  • Laptop webcam for vision classification

Setup (Laptop)

python -m venv venv
.\venv\Scripts\Activate.ps1
pip install -r requirements.txt

Set your API key before running vision features:

$env:ANTHROPIC_API_KEY="YOUR_API_KEY"

Calibrate Arm Positions

Physically move the arm to each position and record servo values:

python calibrate.py

Copy the output into positions.py.
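The repo doesn't show the schema of positions.py here; a minimal sketch of what the copied-in calibration output might look like (the names, servo counts, and values below are assumptions, not this project's actual file):

```python
# positions.py -- hypothetical layout; names and values are placeholders.
# Each pose is six servo values for the xArm 1S (500 is roughly center
# for Hiwonder serial bus servos, which typically accept 0-1000).

# Neutral resting pose.
HOME = [500, 500, 500, 500, 500, 500]

# Pose above the drop zone where items are picked up.
PICKUP = [500, 650, 420, 380, 500, 200]

# One drop-off pose per sorting category.
DROP = {
    "fruit": [300, 600, 450, 400, 500, 200],
    "snack": [500, 600, 450, 400, 500, 200],
    "drink": [700, 600, 450, 400, 500, 200],
}
```

Re-run calibrate.py and paste fresh values whenever the bins or camera are moved.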

Run the Full System

1. Start the API (terminal 1)

python api.py

Runs on http://localhost:5000

2. Open the Dashboard

Open dashboard.html in your browser. It will show:

  • Live stats (items, weight, donors)
  • Category breakdown (fruit/snack/drink)
  • Real-time donation feed with Claude's detailed classifications

3. Run the Pipeline (terminal 2)

Test mode (manual capture):

python test_pipeline.py

Press SPACE to capture and classify.
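Classification presumably sends the captured frame to Claude's vision API. A rough sketch of that call, assuming a base64-encoded JPEG and a three-way category prompt (the model name, prompt wording, and fallback behavior are guesses, not this repo's actual code):

```python
"""Hedged sketch of frame classification via the Anthropic Messages API."""
import base64

CATEGORIES = ("fruit", "snack", "drink")


def parse_category(reply: str) -> str:
    """Pick the first known category mentioned in the model's reply."""
    lowered = reply.lower()
    for cat in CATEGORIES:
        if cat in lowered:
            return cat
    return "snack"  # arbitrary fallback; the repo's real default is unknown


def classify(jpeg_bytes: bytes) -> str:
    """Send one captured frame to Claude and return a category string."""
    import anthropic  # pip install anthropic; reads ANTHROPIC_API_KEY

    client = anthropic.Anthropic()
    msg = client.messages.create(
        model="claude-sonnet-4-5",  # assumed; pin whichever model you use
        max_tokens=100,
        messages=[{
            "role": "user",
            "content": [
                {"type": "image",
                 "source": {"type": "base64",
                            "media_type": "image/jpeg",
                            "data": base64.b64encode(jpeg_bytes).decode()}},
                {"type": "text",
                 "text": "Classify this donated item as fruit, snack, or drink."},
            ],
        }],
    )
    return parse_category(msg.content[0].text)
```

The arm then looks up the returned category in its drop-off position table.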

Auto mode (motion detection):

python main.py

Watches for items placed in front of the camera, then auto-classifies and sorts them.
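The motion trigger can be as simple as frame differencing. A minimal sketch of the idea (the threshold, smoothing, and debouncing in this repo's main.py may well differ):

```python
"""Hedged sketch of a frame-differencing motion trigger."""
import numpy as np


def motion_score(prev_gray: np.ndarray, cur_gray: np.ndarray) -> float:
    """Mean absolute pixel difference between two grayscale frames."""
    diff = prev_gray.astype(np.int16) - cur_gray.astype(np.int16)
    return float(np.mean(np.abs(diff)))


def item_placed(prev_gray, cur_gray, threshold: float = 8.0) -> bool:
    """True when the scene changed enough to suggest a new item."""
    return motion_score(prev_gray, cur_gray) > threshold


if __name__ == "__main__":
    import cv2  # only needed for the live webcam loop

    cap = cv2.VideoCapture(0)
    _, frame = cap.read()
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cur = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if item_placed(prev, cur):
            print("motion detected -- capture and classify here")
        prev = cur
```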

Vision-only (no arm):

python main.py --no-arm

Quick Tests

Test arm:

python test_arm.py

Test camera + vision:

python test_vision.py

Test camera demo:

python camera_demo.py --camera 0

Data

  • Donations logged to: donations.json
  • Captured images saved to: images/
  • API endpoints:
    • GET /donations - all records
    • GET /donations/recent?limit=10 - latest N
    • GET /stats - summary stats
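The endpoints above can be polled from any HTTP client. A small sketch using only the standard library (the JSON field names returned by api.py are not shown in this README, so only the paths here come from it):

```python
"""Hedged sketch of polling the donation API from another script."""
import json
from urllib.request import urlopen

BASE = "http://localhost:5000"


def recent_url(limit: int = 10) -> str:
    """Build the recent-donations URL for a given limit."""
    return f"{BASE}/donations/recent?limit={limit}"


def fetch(path: str):
    """GET a JSON payload from the running api.py server."""
    with urlopen(BASE + path) as resp:  # requires api.py to be running
        return json.load(resp)


if __name__ == "__main__":
    print(fetch("/stats"))
    print(fetch("/donations/recent?limit=10"))
```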
