Spatial fashion capture and taste intelligence MVP for Meta Ray-Ban smart glasses.
Detailed setup: docs/setup.md
- `apps/mobile-capture` — Flutter companion app with DAT abstraction, frame preprocessing, upload.
- `apps/ui-aesthetica` — Next.js landing page + dashboard (App Router).
- `apps/dashboard-web` — Legacy React + Vite dashboard (kept for reference).
- `services/api` — FastAPI API: auth, capture ingestion, DB models, endpoints.
- `services/worker` — Celery worker for the async inference pipeline.
- `services/ml` — Shared ML pipeline code: segmentation, embeddings, attributes, FAISS, radar.
- `infra` — Docker Compose for the local dev stack.
- `docs` — Architecture and API notes.
- `data` — Demo catalog, FAISS artifacts, uploads.
- Docker + Docker Compose
- Python 3.11+ (optional, for local non-docker runs)
- Node 20+ and pnpm/npm (dashboard local run)
- Flutter 3.24+ (mobile app)
- Copy env:

  ```shell
  cp .env.example .env
  ```

- Start infra + services:

  ```shell
  make dev
  ```

- Run DB migrations and seed demo data:

  ```shell
  make migrate
  make seed
  ```

- Build product embeddings + FAISS indexes:

  ```shell
  make embed-products
  ```

API: http://localhost:8000
Dashboard: http://localhost:5173
Catalog API: http://localhost:8001
```shell
docker compose -f infra/docker-compose.yml up --build
```

To run the UI locally:

```shell
make ui
```

This serves:
- Landing: http://localhost:5173/
- Dashboard: http://localhost:5173/dashboard
```shell
cd apps/mobile-capture
flutter pub get
flutter run
```

Dev notes for DAT integration:
- The app uses a `DatService` abstraction (MethodChannel) so native DAT SDK integration can be plugged in on iOS/Android.
- The iOS bridge is now implemented in `apps/mobile-capture/ios/Runner/AppDelegate.swift` with:
  - DAT provider path (`MWDATCore` + `MWDATCamera`, when the SPM dependency is installed)
  - AVFoundation fallback stream path (phone camera)
- Hardware camera-button capture is wired through the DAT `photoDataPublisher` to the auto-upload flow.
- Capture trigger supports an in-app button; extendable for physical-button callbacks and a volume-button shortcut.
- Source CSV: `data/products.csv`
- Embedding script: `services/ml/scripts/embed_products.py`
- FAISS output: `data/faiss/*.index` and `data/faiss/*_mapping.json`
- Open-web match fallback: SerpAPI Google Shopping
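For illustration, the lookup that the FAISS index plus its `*_mapping.json` file enables can be sketched with a brute-force numpy search standing in for FAISS (`IndexFlatIP` over normalized vectors is equivalent cosine search at scale); the mapping keys, product ids, and 2-D embeddings below are made up:

```python
import json
import numpy as np

# Hypothetical mapping mirroring data/faiss/*_mapping.json: index row -> product id.
mapping = {"0": "prod-001", "1": "prod-002", "2": "prod-003"}

# Stand-in for the stored product embeddings (FAISS would hold these rows).
embeddings = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]], dtype=np.float32)
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

def top_k(query: np.ndarray, k: int = 2) -> list[tuple[str, float]]:
    """Brute-force cosine search; FAISS IndexFlatIP does the same at scale."""
    q = query / np.linalg.norm(query)
    scores = embeddings @ q           # cosine similarity against every product
    order = np.argsort(-scores)[:k]   # best-scoring rows first
    return [(mapping[str(i)], float(scores[i])) for i in order]

print(top_k(np.array([0.9, 0.1], dtype=np.float32)))
```

The mapping file exists because FAISS only returns row positions; translating them back to catalog ids happens in application code.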
See `.env.example`. Key ones:
- `DATABASE_URL`
- `REDIS_URL`
- `FAISS_DIR`
- `PRODUCT_CSV_PATH`
- `OPENCLIP_MODEL_NAME`
- `OPENCLIP_PRETRAINED`
- `POKE_API_KEY`
- `SERPAPI_API_KEY` (for live online-shop search beyond the local catalog)
- OpenAPI UI: http://localhost:8000/docs
- Human-readable spec: `docs/api.md`
```shell
make test
```

Includes:
- Unit tests for embedding/radar math.
- Integration test for capture pipeline (mock model providers).
- Stores only the cropped/blurred capture image.
- Runs a backend safety face-blur pass before persistence.
- No full scene frame is persisted.
- Basic rate limit on the capture endpoint.
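As a rough illustration of the face-blur idea (not the service's actual implementation; face detection itself is out of scope here, so the region box is hard-coded), blurring one region of a frame in place might look like:

```python
import numpy as np

def blur_region(img: np.ndarray, box: tuple[int, int, int, int], k: int = 5) -> np.ndarray:
    """Box-blur one (x, y, w, h) region of an H x W x C uint8 image in place.

    A real pipeline would get the box from a face detector and likely use a
    stronger Gaussian blur; this just shows the region-masking pattern.
    """
    x, y, w, h = box
    region = img[y:y + h, x:x + w].astype(np.float32)
    kernel = np.ones(k, dtype=np.float32) / k
    # Separable box blur: 1-D convolution along rows, then along columns.
    for axis in (0, 1):
        region = np.apply_along_axis(
            lambda m: np.convolve(m, kernel, mode="same"), axis, region)
    img[y:y + h, x:x + w] = region.astype(np.uint8)
    return img

frame = np.zeros((32, 32, 3), dtype=np.uint8)
frame[8:16, 8:16] = 255            # bright "face" patch
blur_region(frame, (4, 4, 16, 16)) # smear its edges into the background
```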
- Health: `/healthz`, `/readyz`
- JSON structured logs with request/capture correlation IDs.
- Celery async jobs with retry support.
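The correlation-ID logging pattern can be sketched with the stdlib alone; the field names below are assumptions for illustration, not the services' actual log schema:

```python
import json
import logging
import uuid
from contextvars import ContextVar

# Set once per request (e.g. by middleware); readable from any handler/task.
request_id_var: ContextVar[str] = ContextVar("request_id", default="-")

class JsonFormatter(logging.Formatter):
    """Emit one JSON object per log line, tagged with the current request id."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "msg": record.getMessage(),
            "request_id": request_id_var.get(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("capture")
log.addHandler(handler)
log.setLevel(logging.INFO)

request_id_var.set(str(uuid.uuid4()))  # middleware would do this per request
log.info("capture accepted")
```

`ContextVar` (rather than a global) keeps ids isolated per request even under async handlers.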
Dedicated endpoint for image -> OpenAI -> Serp -> DB write:

```
POST http://localhost:8001/v1/catalog/from-image
```

- No auth
- Accepts a multipart `image` upload or a raw `image/jpeg` body
- Uploads the input image to the Supabase Storage bucket `captures` (best effort) when `SUPABASE_URL` and `SUPABASE_SERVICE_ROLE_KEY` are set
- Runs the additional style-recommendation flow immediately:
  - OpenAI call #1: style description + 5 scores (0-100) into `style_scores`
  - Aggregate the last 5 score rows/descriptions
  - OpenAI call #2: recommendation rationale + search query
  - One Serp shopping call -> top 5 rows into `style_recommendations`
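The aggregation step in the flow above can be sketched as follows — a minimal sketch assuming each `style_scores` row carries five 0-100 axes; the axis names here are hypothetical:

```python
from statistics import mean

# Hypothetical rows mirroring style_scores: five 0-100 axes per capture.
rows = [
    {"minimal": 80, "street": 40, "formal": 20, "vintage": 55, "sporty": 30},
    {"minimal": 70, "street": 50, "formal": 25, "vintage": 60, "sporty": 35},
    {"minimal": 90, "street": 30, "formal": 10, "vintage": 50, "sporty": 20},
]

def aggregate(last_rows: list[dict[str, int]]) -> dict[str, float]:
    """Average each axis across the most recent rows (the flow uses the last 5)."""
    axes = last_rows[0].keys()
    return {axis: mean(r[axis] for r in last_rows) for axis in axes}

# Averaged profile that OpenAI call #2 would reason over.
profile = aggregate(rows[-5:])
print(profile)
```

Averaging over the last few captures smooths single-outfit noise before the recommendation call.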
Run (from repo root):

```shell
cp .env.example .env
# set these for storage upload of API input images
# SUPABASE_URL=https://<project-ref>.supabase.co
# SUPABASE_SERVICE_ROLE_KEY=<service-role-key>
docker compose -f infra/docker-compose.yml up -d --build postgres redis api catalog-api
make migrate
```

Test with any local image file (the example below uses one that exists in this repo):
```shell
curl -sS -X POST "http://localhost:8001/v1/catalog/from-image" \
  -F "image=@apps/ui-aesthetica/public/images/outfit-1.png"
```

Verify the latest writes in the configured database:
```shell
docker compose -f infra/docker-compose.yml run --rm api sh -lc 'python - <<PY
from sqlalchemy import create_engine, text
from app.core.config import settings

engine = create_engine(settings.database_url)
with engine.connect() as c:
    print("catalog_requests:", c.execute(text("select count(*) from catalog_requests")).scalar())
    print("style_scores:", c.execute(text("select count(*) from style_scores")).scalar())
    print("style_recommendations:", c.execute(text("select count(*) from style_recommendations")).scalar())
PY'
```

Quick health checks:
```shell
curl -s http://localhost:8001/healthz
curl -s http://localhost:8001/readyz
```

This script runs a real image through:
`POST /v1/catalog/from-image`
- Supabase capture upload URL check
- OpenAI image analysis for clothing description + brand/color/style cues
- OpenAI query builder for shopping retrieval
- Serp Google Shopping using the OpenAI-built query
Run from repo root:
```shell
docker compose -f infra/docker-compose.yml run --rm api \
  python services/api/app/scripts/test_openai_shopping_pipeline.py \
  --image apps/ui-aesthetica/public/images/outfit-1.png \
  --api-base http://catalog-api:8000
```

Or use Make:
```shell
make test-openai-shopping
# custom image:
make test-openai-shopping OPENAI_TEST_IMAGE=apps/ui-aesthetica/public/images/outfit-9.png
```

The script prints:
- `request_id`
- `capture_blob_url` + HTTP status
- `query_used` (final OpenAI-built query used for shopping)
- top Shopping results returned by the API