Inspiration

I was drowning in screenshots of the day's gym routine, Apple Notes, and voice memos about workouts, all sitting useless in different places. Vincent had built himself a custom app that used regex to parse those screenshots and update his database, but it was brittle and limited. We figured AI could handle the messy, unstructured inputs far better, capturing more data with less effort, and that an agent could then proactively surface insights and reports and act on them.

What it does

The Quantified Self movement had the right idea but not ideal execution. People wore GoPros 24/7, logged every meal in spreadsheets, tracked 47 different biomarkers daily. They understood that comprehensive self-data could unlock insights, but the movement largely failed because of a two-sided bottleneck: recording everything was exhausting (who wants to log "3 almonds, 2:47pm" into a spreadsheet?) and even when people collected mountains of data, no one could actually process it meaningfully. You'd end up with gigabytes of life data sitting in folders, completely useless.

Why Now Works: We can finally fix both sides. Modern AI eliminates the recording friction AND can actually analyze the data to find patterns.

What We Built: Quantified Self MCP turns messy health inputs into structured data and AI insights with zero spreadsheet touching. Take a photo of your CrossFit whiteboard, voice memo about your workout, or Apple Watch screenshot - the MCP server extracts structured data and stores it in clean PostgreSQL tables.
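A minimal sketch of the handoff this implies: the model extracts a structured record from a photo or memo, and the server checks it against the target table's columns before inserting. The column names and types here are hypothetical, not the actual schema.

```python
# Hypothetical columns for a workouts table; the real schema lives in PostgreSQL.
EXPECTED_COLUMNS = {"date": str, "exercise": str, "sets": int, "reps": int, "weight_lbs": float}

def validate_record(record: dict) -> dict:
    """Keep known columns with the right types; set aside anything unknown."""
    clean, unfit = {}, {}
    for key, value in record.items():
        expected = EXPECTED_COLUMNS.get(key)
        if expected is not None and isinstance(value, expected):
            clean[key] = value
        else:
            unfit[key] = value  # candidate for a new column
    return {"row": clean, "unfit": unfit}

# A record the model might extract from a CrossFit whiteboard photo:
parsed = {"date": "2024-06-01", "exercise": "back squat", "sets": 5, "reps": 5, "weight_lbs": 185.0}
result = validate_record(parsed)
```

Anything that lands in `unfit` is exactly the case the dynamic-schema step below handles.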

The system receives whatever data you give it through ChatGPT or Claude and tries to fit it into existing schemas. If the data doesn't match existing tables, it creates new tables with you. The schema is dynamic based on what you're trying to record. If you decide you want to start tracking how you feel after eating meals, it updates the food table with a new "mood_after_eating" column and tracks that from that point forward.
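The schema-evolution idea can be sketched in a few lines. The production system uses PostgreSQL; sqlite3 stands in here so the example is self-contained, and the `food` table and `mood_after_eating` field mirror the example above.

```python
import sqlite3

# In-memory stand-in for the real PostgreSQL database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE food (meal TEXT, calories INTEGER)")

def insert_with_evolution(conn, table, record):
    """If a record carries a field the table lacks, add the column, then insert."""
    existing = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
    for field in record:
        if field not in existing:
            # e.g. the user starts tracking mood after meals
            conn.execute(f"ALTER TABLE {table} ADD COLUMN {field} TEXT")
    cols = ", ".join(record)
    placeholders = ", ".join("?" for _ in record)
    conn.execute(f"INSERT INTO {table} ({cols}) VALUES ({placeholders})",
                 list(record.values()))

insert_with_evolution(conn, "food", {"meal": "oatmeal", "calories": 350,
                                     "mood_after_eating": "energized"})
row = conn.execute("SELECT meal, mood_after_eating FROM food").fetchone()
```

From that point forward, every food record can carry the new column.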

For deep research, we have a specialized agent inside an E2B container that does exploratory data analysis over your PostgreSQL tables. It creates matplotlib graphs and identifies trends based on what you're focused on, with access to web search through Exa.ai for additional context. We track how the agent uses its different tools through Wandb logging.

Real Example: Vincent felt terrible after a run and didn't know why - the system analyzed his data and discovered he hadn't eaten enough carbs beforehand. This kind of insight becomes routine when you have comprehensive data and AI that can process it.

The Bigger Vision: We're using MCP because it meets people where they already are (ChatGPT/Claude), but this extends to AI glasses or any interface. The core insight is that protocols that reduce recording friction will unlock massive datasets for personal AI assistants to actually help us. The Quantified Self vision was right; the movement was just 15 years too early.

Challenges we ran into

We had a hard time hosting the MCP server, so instead we hacked together a tunnel with ngrok to expose the locally running server.
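The workaround is a one-liner; port 8000 is hypothetical here, substitute whatever your local server listens on.

```shell
# Expose the locally running MCP server through an ngrok tunnel:
ngrok http 8000
# ngrok prints a public https URL that forwards to localhost:8000;
# register that URL as the MCP server endpoint in the client.
```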

Accomplishments that we're proud of

We built our first MCP servers.

What's next for Quantified Self MCP

We want an iPhone widget to make it easy to input data. I use MacroFactor for calorie counting, and its widget is really convenient.
