# Palate — Project Story
## Inspiration
It started with Ratatouille.
There is a scene where Remy takes a single bite of food and the world explodes into color — flavor becomes visible, synesthetic, alive. As a kid that scene felt like magic. As an adult studying the actual neuroscience of taste, I realized it was just biology we had never been given the tools to see.
The real inspiration was learning that up to 80% of what humans perceive as "taste" has nothing to do with the tongue. It happens retronasally — aromatic molecules traveling upward through the nasopharynx as you chew, registering in the olfactory cortex. A completely distinct sensory pathway, operating below conscious awareness during every meal you have ever eaten. Nobody talks about it. Nobody has ever built a tool to surface it.
Palate is that tool.
## What I Learned
The deeper I went into the research, the more I realized how radically under-explored this space is. Gordon Shepherd's Neurogastronomy (2012) established that flavor is primarily a brain construction — not a tongue sensation. Volatile organic compound (VOC) sensing technology, already validated in clinical breath analysis, is now small enough for consumer hardware. And an estimated 100 million people globally are living with post-COVID olfactory disruption with no clinical retraining tool.
The problem was real. The technology existed. Nobody had connected them.
## How I Built It
Palate has two components:
The Hardware — a smart fork with four embedded sensors:
- Micro-biosensor detecting fat, sugar, and acid on contact
- Thermal tip sensor reading surface temperature to ±0.1°C
- 200Hz pressure array capturing texture and structural composition
- Bluetooth LE for continuous passive data transmission
The App — a mobile interface that:
- Receives real-time sensor data during a meal
- Generates a Flavor Fingerprint after eating — a visual breakdown of what drove the experience across retronasal, textural, and chemical dimensions
- Surfaces one actionable insight per meal
- Builds a longitudinal Taste Galaxy across months of data
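The continuous sensor stream described above can be sketched as a fixed-size Bluetooth LE frame that the app decodes on arrival. The 24-byte layout, field order, and names below are illustrative assumptions, not the actual fork firmware protocol:

```python
import struct
from dataclasses import dataclass


@dataclass
class SensorFrame:
    """One BLE frame from the fork (hypothetical 24-byte layout)."""
    timestamp_ms: int   # milliseconds since the start of the meal
    fat: float          # biosensor channels, normalized 0..1
    sugar: float
    acid: float
    temp_c: float       # thermal tip reading, ±0.1 °C resolution
    pressure: float     # instantaneous sample from the 200 Hz array


# Little-endian: uint32 timestamp followed by five float32 channels.
FRAME_FORMAT = "<Ifffff"
FRAME_SIZE = struct.calcsize(FRAME_FORMAT)  # 24 bytes


def parse_frame(payload: bytes) -> SensorFrame:
    """Decode one raw BLE payload into a structured SensorFrame."""
    if len(payload) != FRAME_SIZE:
        raise ValueError(f"expected {FRAME_SIZE} bytes, got {len(payload)}")
    return SensorFrame(*struct.unpack(FRAME_FORMAT, payload))
```

At 200 Hz this is roughly 4.8 KB of data per second per fork, which is why the app aggregates frames into a post-meal Flavor Fingerprint rather than rendering them live.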
The interface was designed and prototyped in Figma Make. The 3D fork interaction was built in Spline with cursor-reactive physics. The landing page was built in HTML/CSS with an embedded YouTube background pulling from the Ratatouille kitchen scene — because the visual language of the film is, it turns out, just accurate neuroscience.
## Challenges
The hardest design problem was information overload. The fork captures continuous data across four sensor dimensions during an entire meal. The temptation is to show all of it. The right answer is to show almost none of it — filter everything, surface one insight, and trust the user to ask for more if they want it.
The second challenge was the eating disorder consideration. A tool that makes people more conscious of every bite carries real risk for vulnerable users. Palate screens for eating disorder history at onboarding and suppresses granular per-bite data for flagged users. The wellness goal is interoceptive awareness — not obsession.
The third was framing. Retronasal olfaction is not in the popular vocabulary. The entire product has to teach the science while making the experience feel effortless. That tension — between depth and simplicity — shaped every design decision.
## The Math
The flavor construction model Palate is built on can be expressed as:
$$F = \sum_{i=1}^{n} w_i \cdot S_i$$
Where $F$ is the perceived flavor experience, $S_i$ represents each sensory input stream (retronasal VOC profile, texture, temperature, taste receptor activation, emotional state, memory association), and $w_i$ is the weighted contribution of each stream — with retronasal olfaction carrying $w \approx 0.80$ for most individuals under normal conditions.
Palate measures the $S_i$ values directly. For the first time, you can see the equation being solved.
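The weighted sum above is a one-liner in code. The stream names and all weights except the retronasal ~0.80 stated above are illustrative placeholders, not measured values:

```python
# Flavor construction model: F = sum(w_i * S_i).
# Only the retronasal weight (~0.80) comes from the write-up;
# the remaining weights are illustrative and sum to 1.0 with it.
WEIGHTS = {
    "retronasal": 0.80,
    "taste_receptors": 0.08,
    "texture": 0.05,
    "temperature": 0.03,
    "emotion": 0.02,
    "memory": 0.02,
}


def perceived_flavor(streams: dict[str, float]) -> float:
    """Combine normalized 0..1 sensory streams S_i into a single F score."""
    return sum(WEIGHTS[name] * value for name, value in streams.items())
```

With every stream saturated at 1.0 the model returns F = 1.0, and dropping the retronasal stream alone removes ~80% of the score, which is the core claim of the product.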