Inspiration

In probability theory, the posterior is a central object of Bayesian inference, representing our most informed beliefs given all the information available to us. We believe that Posterity helps traders and researchers calculate their own posteriors and become better, more informed forecasters.

While decentralized prediction market platforms like Polymarket hold valuable event-driven data, there is no unified analytics UI comparable to a trading dashboard. In traditional finance, traders correlate assets (e.g., oil vs. USD strength); in prediction markets, we can do the same with “event contracts.” Posterity helps traders find correlated markets using both price correlations and LLM-powered semantic similarity. Moreover, the correlations between prices reveal much richer information than the market's instantaneous probability estimates alone.
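The semantic-similarity side of this can be sketched as scoring how closely two market questions' embedding vectors point in the same direction. This is a minimal illustration, not Posterity's actual implementation; the function name is ours, and in practice the vectors would come from an embedding model rather than being hand-written.

```typescript
// Cosine similarity between two embedding vectors: 1 means the
// questions point in the same semantic direction, 0 means unrelated.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Markets whose question embeddings score above some threshold can then be surfaced as candidates for the price-correlation analysis below.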

What it does

Posterity is the “Bloomberg Terminal of Polymarket,” letting traders visualize cross-event sentiment and implied probabilities in real time. The application combines real-time market data with cutting-edge statistical analysis and AI-powered insights to help users identify relationships between different prediction markets.

How we built it

Frontend

  • Framework: Next.js 14 with App Router
  • Language: TypeScript for type safety
  • Styling: TailwindCSS with custom components
  • UI Components: Custom pixel animations, terminal effects, glassmorphism design
  • Charts: TradingView Lightweight Charts for professional market visualization

Backend & APIs

  • API Routes: Next.js API routes for server-side processing
  • Data Sources: Polymarket Gamma API (live market data), Polymarket CLOB API (historical price data)
  • AI Integration: OpenAI API (text-embedding-3-large, GPT-4)

Statistical Computing

  • Correlation Analysis: Custom Pearson correlation implementation
  • Time Series: Advanced alignment and normalization algorithms
  • Probability Calculations: Conditional and joint probability estimation
  • Linear Regression: Custom implementation for market relationship modeling
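The Pearson correlation at the heart of the correlation analysis can be sketched as follows. This is a minimal version for illustration; the project's actual implementation may handle alignment and missing data differently.

```typescript
// Pearson correlation coefficient between two equal-length price series:
// covariance of the series divided by the product of their standard deviations.
function pearson(xs: number[], ys: number[]): number {
  const n = xs.length;
  const meanX = xs.reduce((s, v) => s + v, 0) / n;
  const meanY = ys.reduce((s, v) => s + v, 0) / n;
  let cov = 0;
  let varX = 0;
  let varY = 0;
  for (let i = 0; i < n; i++) {
    const dx = xs[i] - meanX;
    const dy = ys[i] - meanY;
    cov += dx * dy;
    varX += dx * dx;
    varY += dy * dy;
  }
  return cov / Math.sqrt(varX * varY);
}
```

The result lies in [−1, 1]: markets that move together score near 1, markets that move oppositely score near −1.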

The Math

Suppose \(A\) and \(B\) are events. Suppose that \(P(A)\), \(P(B)\), and related quantities such as \(P(A | B)\) are unknown constants that we would like to estimate. We have access to Polymarket's price histories \(P(A)_t\), \(P(B)_t\), which we assume are unbiased estimators for \(P(A)\), \(P(B)\).

We have \(P(A) = P(A | B) P(B) + P(A | \neg B) P(\neg B)\). Using \(P(\neg B) = 1 - P(B)\) we obtain \(P(A) = [ P(A | B) - P(A | \neg B) ] P(B) + P(A | \neg B)\)

If \(P(A)_t\), \(P(B)_t\) are good estimators, then \(P(A)_t \approx [ P(A | B) - P(A | \neg B) ] P(B)_t + P(A | \neg B).\)

This means that, to first order, \(P(A)_t\) should be linear in \(P(B)_t\). By classical regression theory, the least-squares slope is \(m = P(A | B) - P(A | \neg B) = r \cdot \sigma_A / \sigma_B\), where \(r\) is the correlation coefficient and \(\sigma_A\), \(\sigma_B\) are the standard deviations of \(P(A)_t\) and \(P(B)_t\), and the y-intercept is \(b = P(A | \neg B) = E(P(A)_t - m P(B)_t)\).
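The least-squares fit is a short computation: the slope is \(\mathrm{cov}(P(A)_t, P(B)_t) / \mathrm{var}(P(B)_t)\), which equals \(r \cdot \sigma_A / \sigma_B\), and the intercept is \(E[P(A)_t] - m\,E[P(B)_t]\). A minimal sketch (names are ours, not the project's API):

```typescript
// Least-squares regression of the series pA on the series pB.
// Per the derivation: m estimates P(A|B) − P(A|¬B), b estimates P(A|¬B).
function linearFit(pA: number[], pB: number[]): { m: number; b: number } {
  const n = pA.length;
  const meanA = pA.reduce((s, v) => s + v, 0) / n;
  const meanB = pB.reduce((s, v) => s + v, 0) / n;
  let cov = 0;
  let varB = 0;
  for (let i = 0; i < n; i++) {
    cov += (pA[i] - meanA) * (pB[i] - meanB);
    varB += (pB[i] - meanB) ** 2;
  }
  const m = cov / varB;       // slope: cov(A,B)/var(B) = r·σ_A/σ_B
  const b = meanA - m * meanB; // intercept: E[P(A)_t − m·P(B)_t]
  return { m, b };
}
```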

Since \(m\) and \(b\) are observable, this allows us to estimate \(P(A | B)\), \(P(A | \neg B)\), and by reversing our variables, \(P(B | A)\) and \(P(B | \neg A)\). Additionally, we can estimate \(P(A \cap B)\) as both \(P(A | B) P(B)_T\) and \(P(B | A) P(A)_T\) (where \(T\) is the maximum of our time interval) which we average to obtain a closer estimate.
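Putting the derivation together, the conditional and joint probabilities fall out of two regressions, one in each direction. The sketch below is self-contained and illustrative (names are ours); note that with noisy or short samples the recovered values can fall outside \([0, 1]\) and may need clipping in practice.

```typescript
// Recover conditional and joint probability estimates from price series,
// following the identities m = P(A|B) − P(A|¬B) and b = P(A|¬B).
function estimateProbabilities(pA: number[], pB: number[]) {
  // Least-squares fit of y on x (nested helper for self-containment).
  const fit = (y: number[], x: number[]) => {
    const n = y.length;
    const my = y.reduce((s, v) => s + v, 0) / n;
    const mx = x.reduce((s, v) => s + v, 0) / n;
    let cov = 0;
    let varX = 0;
    for (let i = 0; i < n; i++) {
      cov += (y[i] - my) * (x[i] - mx);
      varX += (x[i] - mx) ** 2;
    }
    const m = cov / varX;
    return { m, b: my - m * mx };
  };

  const ab = fit(pA, pB); // P(A)_t ≈ m·P(B)_t + b
  const pAgivenNotB = ab.b;
  const pAgivenB = ab.m + ab.b;

  const ba = fit(pB, pA); // reversed roles give P(B|A), P(B|¬A)
  const pBgivenNotA = ba.b;
  const pBgivenA = ba.m + ba.b;

  // Joint probability at the latest time T, averaging the two equivalent
  // expressions P(A|B)·P(B)_T and P(B|A)·P(A)_T.
  const pBT = pB[pB.length - 1];
  const pAT = pA[pA.length - 1];
  const pJoint = (pAgivenB * pBT + pBgivenA * pAT) / 2;

  return { pAgivenB, pAgivenNotB, pBgivenA, pBgivenNotA, pJoint };
}
```

For example, if \(P(A)_t = 0.6\,P(B)_t + 0.4\) exactly, the fit recovers \(P(A | \neg B) = 0.4\) and \(P(A | B) = 1\).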

Challenges we ran into

The main challenges we encountered were visualizing Polymarket's data in an intuitive fashion and estimating joint and conditional probabilities using only point estimates of isolated probabilities.

What's next for Posterity

  • Higher-order fitting, and perhaps even high-dimensional, gradient-descent-driven fitting

Built With

  • next.js
  • openai-api
  • polymarket-clob
  • polymarket-gamma
  • tailwindcss
  • typescript