University friend-matching app with 3D vector visualization, HNSW search, and self-optimizing index tuning.
- Node.js 18+ installed
- OpenAI API key (get one at https://platform.openai.com/api-keys)
# 1. Set up environment
# Create .env file in project root with your OpenAI key:
echo "openaikey=your-api-key-here" > .env
# 2. Install dependencies
npm install
# 3. Start the development server
npm run dev
# App runs on http://localhost:3000

# Test API health
curl http://localhost:3000/api/health
# Should return: {"status":"ok","timestamp":...}
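The health endpoint can be sketched as a minimal App Router route handler (a hypothetical sketch — the actual handler in `app/api/` may differ):

```typescript
// app/api/health/route.ts — hypothetical sketch of the health-check handler.
// Response.json is available in Node 18+ and Next.js route handlers.
export async function GET(): Promise<Response> {
  return Response.json({ status: "ok", timestamp: Date.now() });
}
```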
# Test chat endpoint
curl -X POST http://localhost:3000/api/chat \
  -H "Content-Type: application/json" \
  -d '{"userId":"test","message":"Hello!","history":[]}'

takoa/
├── .env ← Your OpenAI API key (create this)
├── app/ ← Next.js App Router
│ ├── api/ ← API routes (/api/chat, /api/graph, etc.)
│ └── page.tsx ← Main page component
├── components/ ← React components (SocialGraph, ChatInterface)
├── src/ ← Shared TypeScript services
│ ├── services/ ← Business logic (HNSW, UMAP, LLM)
│ └── data/ ← Seed data and user management
├── lib/ ← Client-side utilities (API client)
├── scripts/ralph/ ← Ralph automation scripts
└── prd.json ← Product requirements document
npm run dev # Start Next.js dev server (port 3000)
npm run build # Build for production
npm start # Run production build
npm run lint     # Run ESLint

All endpoints are under /api/:
- GET /api/health - Health check
- POST /api/chat - Chat with onboarding bot (with RAG); retrieves similar users from the vector store for context-aware responses. Request body:

  { "userId": "string", "message": "string", "history": [{"role": "user|assistant", "content": "string"}] }

- GET /api/graph - Get social graph data
- GET /api/graph/match/:id1/:id2 - Get match explanation
- GET /api/tuner/benchmark - Get index tuner results
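A minimal typed wrapper for the chat request can make the body shape explicit on the client side (a hypothetical helper — `buildChatRequest` is not part of the codebase; the field names follow the request body documented above):

```typescript
// Hypothetical typed helper for POST /api/chat.
type ChatMessage = { role: "user" | "assistant"; content: string };

interface ChatRequest {
  userId: string;
  message: string;
  history: ChatMessage[];
}

// Builds the fetch options for the chat endpoint; the caller supplies the URL.
export function buildChatRequest(body: ChatRequest): RequestInit {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  };
}

// Usage (assumes the dev server from the Quick Start is running):
// const res = await fetch("http://localhost:3000/api/chat",
//   buildChatRequest({ userId: "test", message: "Hello!", history: [] }));
```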
Create a .env file in the project root:
openaikey=sk-your-openai-api-key-here

Get your API key from: https://platform.openai.com/api-keys
Note: The key is named openaikey (not OPENAI_API_KEY) to match your setup.
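Because the variable is named `openaikey` rather than the SDK's default `OPENAI_API_KEY`, the key has to be passed to the client explicitly. A small resolver sketch (hypothetical helper, assuming the official OpenAI Node SDK, e.g. `new OpenAI({ apiKey: resolveOpenAIKey() })`):

```typescript
// Hypothetical helper: resolve the API key from the non-standard env var name.
// The OpenAI SDK defaults to OPENAI_API_KEY, so this value must be passed
// explicitly when constructing the client.
export function resolveOpenAIKey(
  env: Record<string, string | undefined> = process.env
): string {
  const key = env.openaikey;
  if (!key) {
    throw new Error("Missing openaikey — add it to .env in the project root");
  }
  return key;
}
```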
- Chat Onboarding - LLM extracts conversational signals with RAG (Retrieval-Augmented Generation)
  - Retrieves similar users from vector store for context-aware responses
  - LLM understands community interests and personalities
- 3D Social Graph - react-force-graph-3d visualization
- Match Explainer - Explains why two users were matched
- Embedding Toggle - Force view ↔ UMAP view
- Index Tuner - Self-optimizing HNSW parameters
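The self-optimizing tuner idea can be sketched as a recall benchmark over candidate `ef` values: measure recall@k against an exact brute-force search, then pick the cheapest setting that meets a target. This is a hypothetical sketch of the concept — the app itself uses hnswlib-node, and `measureRecall` here stands in for a real benchmark over that index:

```typescript
type Vec = number[];

// Euclidean distance between two vectors of equal length.
const dist = (a: Vec, b: Vec): number =>
  Math.sqrt(a.reduce((s, x, i) => s + (x - b[i]) ** 2, 0));

// Brute-force ground truth: indices of the k nearest vectors to q.
export function exactKnn(data: Vec[], q: Vec, k: number): number[] {
  return data
    .map((v, i) => ({ i, d: dist(v, q) }))
    .sort((a, b) => a.d - b.d)
    .slice(0, k)
    .map((x) => x.i);
}

// Recall@k: fraction of true neighbors the approximate search returned.
export function recallAtK(approx: number[], exact: number[]): number {
  const truth = new Set(exact);
  return approx.filter((i) => truth.has(i)).length / exact.length;
}

// Pick the smallest ef whose measured recall meets the target; fall back
// to the largest candidate if none do.
export function tuneEf(
  efCandidates: number[],
  measureRecall: (ef: number) => number,
  target = 0.95
): number {
  for (const ef of [...efCandidates].sort((a, b) => a - b)) {
    if (measureRecall(ef) >= target) return ef;
  }
  return Math.max(...efCandidates);
}
```

Larger `ef` improves recall at the cost of query latency, so choosing the smallest passing value is the usual trade-off an HNSW tuner makes.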
- Framework: Next.js 14 (App Router) with API Routes
- Frontend: shadcn/ui, react-force-graph-3d, recharts
- Backend: Next.js API Routes, TypeScript, hnswlib-node, density-clustering, umap-js
- LLM: OpenAI GPT-4 with RAG (Retrieval-Augmented Generation)
  - Retrieves similar users from vector store before generating responses
  - Uses HNSW vector search for fast similarity matching
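The retrieval step described above can be sketched as context assembly: neighbors returned by the vector search are folded into the system prompt before the chat completion call. This is a hypothetical sketch — the `RetrievedUser` shape and prompt wording are illustrative, not the app's actual service code:

```typescript
// Hypothetical shape for a user record returned by the HNSW search.
interface RetrievedUser {
  name: string;
  interests: string[];
  similarity: number; // similarity score from the vector search
}

// Folds retrieved neighbors into a system prompt for the onboarding bot.
export function buildRagSystemPrompt(neighbors: RetrievedUser[]): string {
  const context = neighbors
    .map(
      (u) =>
        `- ${u.name} (similarity ${u.similarity.toFixed(2)}): ${u.interests.join(", ")}`
    )
    .join("\n");
  return [
    "You are an onboarding assistant for a university friend-matching app.",
    "Similar users retrieved from the vector store:",
    context,
    "Use this context to give context-aware, relevant responses.",
  ].join("\n");
}
```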
The app is ready for deployment on Vercel:
- Push to GitHub
- Import to Vercel
- Add the openaikey environment variable in the Vercel dashboard
- Deploy!
All API routes are automatically handled by Next.js API Routes.
Ralph is set up for autonomous development of the graph visualization features.
Quick Start:
# Run Ralph to implement PRD stories automatically
./scripts/ralph/ralph.sh [max_iterations]

See RALPH_SETUP.md for detailed instructions.
Current PRD: prd.json contains 6 user stories for:
- 3D graph visualization with ForceGraph3D
- Hover tooltips (name, age, uni)
- Click interactions (highlight edges, show top matches)
- Match explanation panel (similarity scores + top dimensions)
- Chat integration (graph refresh on updates)