Pundit Protocol is a multi-agent debate app that turns a topic into a fast, opinionated panel discussion. A FastAPI + uAgents backend orchestrates the debate, and a Next.js frontend streams each turn in real time.
- Accepts a topic from the UI.
- Fetches fresh context (NewsAPI) for grounding.
- Runs a moderator plus three pundit agents:
  - The Contrarian
  - The Hype Man
  - The Materialist
- Streams events over WebSocket (`overview` -> `turn` -> `summary`).
- Uses LLM generation for pundit turns and moderator synthesis, with safe fallback text if a provider fails.
- Frontend: Next.js 16, React 19, TypeScript, Tailwind CSS
- Backend: FastAPI, uAgents, Pydantic
- LLM Providers: Gemini (default) or OpenAI
- Data Source: NewsAPI
```
backend/
  main.py              # FastAPI bridge + WS endpoint on :8080
  requirements.txt
  .env.example
  agents/
    moderator.py
    pundit.py
    messages.py
    personas.py
  services/
    llm.py
    briefing.py
    news_fetcher.py
  config.py
  events.py
  debate_engine.py
frontend/
  app/page.tsx         # Main debate UI + WebSocket client
  package.json
```
- Python 3.11+
- Node.js 20+
- npm
```bash
git clone https://github.com/wizanyx/pundit-protocol.git
cd pundit-protocol
python -m venv .venv
source .venv/bin/activate
pip install -r backend/requirements.txt
cd frontend
npm install
cd ..
cp backend/.env.example backend/.env
```

Edit `backend/.env` and set real keys:

- `NEWSAPI_KEY`
- `GOOGLE_API_KEY` (for Gemini)
- `OPENAI_API_KEY` (optional unless using the OpenAI provider)
- `LLM_PROVIDER=gemini` or `LLM_PROVIDER=openai`
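The provider/key pairing above can be sanity-checked with a small sketch; `pick_provider` is an illustrative helper, not a function from this repo:

```python
import os

def pick_provider() -> str:
    """Resolve the LLM provider from the environment (illustrative sketch,
    not the project's actual code). Mirrors the documented pairing:
    LLM_PROVIDER=gemini needs GOOGLE_API_KEY, LLM_PROVIDER=openai
    needs OPENAI_API_KEY."""
    provider = os.environ.get("LLM_PROVIDER", "gemini").lower()
    key_var = {"gemini": "GOOGLE_API_KEY", "openai": "OPENAI_API_KEY"}.get(provider)
    if key_var is None:
        raise ValueError(f"Unknown LLM_PROVIDER: {provider!r}")
    if not os.environ.get(key_var):
        raise RuntimeError(f"{key_var} must be set when LLM_PROVIDER={provider}")
    return provider
```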
Backend (from the repo root):

```bash
cd /home/wizanyx/Documents/dev/pundit-protocol
source .venv/bin/activate
python -m uvicorn backend.main:app --host 127.0.0.1 --port 8080
```

Frontend (in a second terminal):

```bash
cd /home/wizanyx/Documents/dev/pundit-protocol/frontend
npm run dev -- --hostname 127.0.0.1 --port 3000
```

Open:

- Frontend: http://127.0.0.1:3000
- Backend docs: http://127.0.0.1:8080/docs
- Frontend opens `ws://localhost:8080/ws/debate`.
- Backend creates a `DebateBrief` with the topic + article context.
- Moderator emits the initial `overview` event.
- Pundits generate per-round arguments.
- Backend streams `turn` events to the frontend.
- Moderator emits the final `summary` event.
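Client-side, the stream reduces to folding those events into a transcript. A minimal Python sketch (payload field names are assumptions based on the event descriptions, not the exact wire format):

```python
import json

def consume(raw_events):
    """Fold a stream of JSON event strings (overview -> turn* -> summary)
    into a (speaker, text) transcript. Field names are assumed, not taken
    from the actual backend payloads."""
    transcript = []
    for raw in raw_events:
        event = json.loads(raw)
        kind = event["type"]
        if kind == "overview":
            transcript.append(("moderator", event.get("brief", "")))
        elif kind == "turn":
            transcript.append((event["speaker"], event["text"]))
        elif kind == "summary":
            transcript.append(("moderator", event.get("text", "")))
        elif kind == "error":
            raise RuntimeError(event.get("message", "debate failed"))
    return transcript
```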
- `overview`: includes the debate brief and sources.
- `turn`: includes `speaker`, `text`, and `round`.
- `summary`: includes the final synthesis and optional sources.
- `error`: includes a descriptive backend error payload.
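Since the backend uses Pydantic, each event likely maps to a small model. A stdlib-dataclass sketch of the same shapes (only the field names listed above come from the docs; everything else is assumed):

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Shapes are assumptions inferred from the event list; only speaker,
# text, round, and sources appear in the documented events.
@dataclass
class OverviewEvent:
    brief: str
    sources: List[str] = field(default_factory=list)

@dataclass
class TurnEvent:
    speaker: str
    text: str
    round: int

@dataclass
class SummaryEvent:
    text: str
    sources: Optional[List[str]] = None

@dataclass
class ErrorEvent:
    message: str
```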
- Enter a current-events topic in the UI.
- Show live streamed pundit turns arriving one by one.
- Toggle persona mode/source mode to demonstrate behavior shifts.
- End with moderator synthesis and cited source list.
If pundit turns fall back to canned text or a run fails, likely causes:

- Gemini quota exhausted or key invalid.
- OpenAI key missing/invalid when a fallback to OpenAI is attempted.
- `LLM_PROVIDER` set to a provider without a valid key.

Checks:

- Verify that `backend/.env` keys are real values (not placeholders).
- Watch backend logs during a run for provider/auth/quota failures.
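A quick preflight can flag unset or placeholder keys before a run; `check_env` and the placeholder strings are illustrative, not part of the repo:

```python
import os

# Illustrative placeholder values; adjust to whatever .env.example ships with.
PLACEHOLDERS = {"", "your-key-here", "changeme", "xxx"}

def check_env(required=("NEWSAPI_KEY", "GOOGLE_API_KEY")):
    """Return names of required variables that are unset or look like
    placeholders. Hypothetical helper, not project code."""
    return [name for name in required
            if os.environ.get(name, "").strip().lower() in PLACEHOLDERS]
```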
Run the backend from the repo root as a package module:

```bash
python -m uvicorn backend.main:app --host 127.0.0.1 --port 8080
```

Do not run `uvicorn main:app` from `backend/` unless local (non-package) import mode is specifically desired.
Built for BeachHacks as a real-time AI debate experience with a multi-agent architecture and live streaming UI.