
MemexLLM Backend

MemexLLM Logo

FastAPI Backend for AI-Powered Document Intelligence

Python FastAPI PostgreSQL License: MIT

A FastAPI application combining Supabase, LangChain, and multiple AI providers to power the MemexLLM AI research assistant.

Frontend Repo · Features · Tech Stack · Quick Start · Deployment


🎯 Overview

The MemexLLM backend provides a powerful API for document processing, AI-powered chat, and content generation. It leverages modern AI frameworks to transform documents into interactive knowledge bases.

Built for production with async processing, streaming responses, and enterprise-grade security.


🏛️ Architecture

MemexLLM Architecture

The backend follows a service-oriented architecture with a focus on asynchronous processing and reliable document ingestion.


✨ Features

  • Document Processing: PDF, TXT, DOCX, audio, YouTube, and web content extraction
  • AI Chat: Contextual Q&A with source citations and streaming responses
  • Content Generation: Create podcasts, quizzes, flashcards, mindmaps, and more
  • Vector Search: Semantic search using Pinecone embeddings
  • Authentication: Supabase Auth with JWT tokens
  • Real-time: Server-Sent Events (SSE) for streaming responses
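The streaming responses above arrive as Server-Sent Events. As a minimal sketch of what a client has to do with such a stream (this parser is illustrative and stdlib-only; it is not taken from the MemexLLM codebase):

```python
def parse_sse(stream):
    """Parse an iterable of SSE lines into event payloads.

    Yields the accumulated `data:` content for each event;
    events are separated by blank lines, per the SSE format.
    """
    data_lines = []
    for line in stream:
        line = line.rstrip("\n")
        if line == "":            # a blank line terminates the current event
            if data_lines:
                yield "\n".join(data_lines)
                data_lines = []
        elif line.startswith("data:"):
            data_lines.append(line[5:].lstrip())
        # other SSE fields (event:, id:, retry:) are ignored in this sketch
    if data_lines:                # flush a trailing event with no final blank line
        yield "\n".join(data_lines)

# Example: two events, the second spanning two data lines
raw = ["data: Hello", "", "data: wor", "data: ld", ""]
print(list(parse_sse(raw)))  # ['Hello', 'wor\nld']
```

Multi-line `data:` fields are joined with newlines because that is how the SSE format reassembles a payload split across lines.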

🛠️ Tech Stack

| Category | Technology |
| --- | --- |
| Framework | FastAPI |
| Language | Python 3.11+ |
| Database | PostgreSQL + Supabase |
| ORM | SQLAlchemy + Alembic |
| AI | LangChain, OpenAI, Anthropic, Google Gemini |
| Embeddings | OpenAI, Google |
| Vector DB | Pinecone |
| TTS | Google Cloud TTS, ElevenLabs |
| YouTube | yt-dlp |
| Deployment | Docker, Coolify |
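Supabase Auth issues JWTs, which by default are HS256-signed with the project's JWT secret. The backend's token handling lives in src/core/security.py and presumably uses a JWT library, but as a sketch of what the signature check involves, HS256 verification can be done with the standard library alone:

```python
import base64
import hashlib
import hmac
import json


def b64url_decode(segment: str) -> bytes:
    # JWT segments use unpadded base64url; restore padding before decoding
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))


def verify_hs256(token: str, secret: str) -> dict:
    """Verify an HS256 JWT signature and return the decoded payload.

    Raises ValueError on a malformed token or a bad signature.
    Sketch only: real verification must also check exp, aud, and issuer.
    """
    try:
        header_b64, payload_b64, sig_b64 = token.split(".")
    except ValueError:
        raise ValueError("malformed token")
    signing_input = f"{header_b64}.{payload_b64}".encode()
    expected = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        raise ValueError("bad signature")
    return json.loads(b64url_decode(payload_b64))
```

`hmac.compare_digest` is used rather than `==` so the comparison runs in constant time, avoiding a timing side channel.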

🚀 Quick Start

Prerequisites

  • Python 3.11+
  • PostgreSQL database (local or Supabase)
  • API keys for AI providers

Installation

cd backend
uv sync

Environment Setup

Copy .env.example to .env and configure:

# Supabase
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_KEY=your-service-role-key

# Database
DATABASE_URL=postgresql://user:pass@localhost:5432/memexllm

# AI Providers
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=...

# Pinecone
PINECONE_API_KEY=...
PINECONE_INDEX=memexllm
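A minimal sketch of how these variables might be validated at startup (the project's actual settings live in src/core/config.py; this stdlib-only version just mirrors a few of the .env keys above and fails fast when a required one is missing):

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class Settings:
    supabase_url: str
    supabase_key: str
    database_url: str
    pinecone_index: str


def load_settings() -> Settings:
    """Read required settings from the environment, raising if any is missing."""

    def require(name: str) -> str:
        value = os.environ.get(name)
        if not value:
            raise RuntimeError(f"missing required environment variable: {name}")
        return value

    return Settings(
        supabase_url=require("SUPABASE_URL"),
        supabase_key=require("SUPABASE_KEY"),
        database_url=require("DATABASE_URL"),
        # optional, with the default shown in the .env example
        pinecone_index=os.environ.get("PINECONE_INDEX", "memexllm"),
    )
```

Failing fast here surfaces a misconfigured deployment at boot instead of on the first request that needs the missing key.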

Development

uv run uvicorn src.main:app --reload

API docs available at http://localhost:8000/docs


📁 Project Structure

backend/
├── src/
│   ├── api/              # API routes
│   │   └── v1/
│   │       ├── notebooks.py
│   │       ├── chat.py
│   │       ├── sources.py
│   │       └── ...
│   ├── core/             # Core config
│   │   ├── config.py
│   │   ├── security.py
│   │   └── supabase.py
│   ├── models/           # SQLAlchemy models
│   ├── services/         # Business logic
│   │   ├── ai/
│   │   ├── chat/
│   │   ├── documents/
│   │   └── storage/
│   └── main.py           # App entry point
├── migrations/            # Alembic migrations
├── tests/                # Test files
└── alembic.ini           # Alembic config

🐳 Docker

docker-compose up -d

🚢 Deployment

Coolify

See docs/COOLIFY_DEPLOYMENT.md for detailed deployment instructions.

Docker Compose

docker-compose -f docker-compose.prod.yml up -d

📡 API Endpoints

| Endpoint | Description |
| --- | --- |
| POST /api/v1/notebooks | Create notebook |
| GET /api/v1/notebooks/{id} | Get notebook |
| POST /api/v1/sources/upload | Upload document |
| POST /api/v1/chat/message | Send chat message |
| GET /api/v1/studio/{type} | Generate content |

See http://localhost:8000/docs for full API documentation.
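These endpoints expect an authenticated JSON request. A sketch of assembling one for the chat endpoint (the payload field names `notebook_id` and `message` are assumptions, not confirmed by this README; check /docs for the real schema):

```python
import json


def build_chat_request(base_url: str, jwt: str, notebook_id: str, message: str):
    """Assemble URL, headers, and body for POST /api/v1/chat/message.

    Returns a (url, headers, body_bytes) tuple usable with any HTTP client.
    """
    url = f"{base_url.rstrip('/')}/api/v1/chat/message"
    headers = {
        "Authorization": f"Bearer {jwt}",   # Supabase-issued JWT
        "Content-Type": "application/json",
        "Accept": "text/event-stream",      # opt in to SSE streaming
    }
    body = json.dumps({"notebook_id": notebook_id, "message": message}).encode()
    return url, headers, body


url, headers, body = build_chat_request(
    "http://localhost:8000/", "your-jwt", "nb-1", "Summarize my sources"
)
```

Separating request construction from the HTTP client keeps the auth and payload logic testable without a running server.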


🤝 Contributing

  1. Follow the existing code style (ruff)
  2. Write tests for new features
  3. Update documentation

📜 License

MIT License
