Inspiration

Many people in underserved regions lack access to eye care specialists or diagnostic equipment, yet most own a smartphone. We built OptiVision (originally VisionCheck, later rebranded) to turn that phone into a basic vision-screening tool.

What it does

OptiVision walks users through 8 screenings: Visual Acuity, Color Vision, Astigmatism, Contrast Sensitivity, Near Vision, Amsler Grid, Peripheral Vision, and a Symptoms Review. Afterward, it outputs an urgency classification (Routine → Follow-Up → Clinical Evaluation → Seek Immediate Care), a Gemini AI-generated plain-language summary in the user's language, and options to share results or find nearby clinics.

How we built it

OptiVision is a React 19 PWA with no app store, no backend, and no database. Each test is a self-contained component with SVG rendering and pointer-event interactions. Scoring logic maps raw responses to pass/warn/fail tiers; overall urgency is the highest tier across all 8 tests. Gemini 2.5 Flash generates the AI summary; offline fallback and voice guidance use built-in browser APIs.
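The tier-aggregation idea above can be sketched in a few lines of TypeScript. This is an illustrative reconstruction, not OptiVision's actual code: the names (`Tier`, `overallUrgency`, the `redFlag` escalation) and the exact tier-to-urgency mapping are assumptions.

```typescript
// Illustrative sketch of "overall urgency = highest tier across all tests".
// All names and mappings here are assumptions, not OptiVision's real code.
type Tier = "pass" | "warn" | "fail";

type Urgency =
  | "Routine"
  | "Follow-Up"
  | "Clinical Evaluation"
  | "Seek Immediate Care";

// Higher rank = more urgent.
const tierRank: Record<Tier, number> = { pass: 0, warn: 1, fail: 2 };

// Hypothetical mapping from the worst observed tier to an urgency label.
const urgencyByRank: Urgency[] = ["Routine", "Follow-Up", "Clinical Evaluation"];

function overallUrgency(results: Tier[], redFlag = false): Urgency {
  // A real app might escalate on specific symptom flags, e.g. sudden vision loss.
  if (redFlag) return "Seek Immediate Care";
  const worst = results.reduce((max, t) => Math.max(max, tierRank[t]), 0);
  return urgencyByRank[worst];
}

console.log(overallUrgency(["pass", "warn", "pass"])); // "Follow-Up"
```

A single "highest tier wins" reduction keeps the logic auditable: no test can lower the urgency another test raised.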

Challenges

Our two biggest challenges were calibrating vision tests without dedicated hardware (screen sizes and pixel densities vary wildly) and extending an existing codebase (ClearSight) under hackathon time pressure.
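To show why screen calibration matters, here is one common approach to the problem, sketched as an assumption rather than what OptiVision actually ships: the user resizes an on-screen box until it matches a standard credit card (ISO/IEC 7810 ID-1, 85.60 mm wide), which yields a pixels-per-millimetre factor; optotype sizes then follow from the fact that a 20/20 letter subtends 5 arcminutes at the viewing distance. The function names are hypothetical.

```typescript
// Illustrative calibration math only; OptiVision's actual method may differ.
// Assumption: the user matched an on-screen box to a credit card's width.
const CARD_WIDTH_MM = 85.6; // ISO/IEC 7810 ID-1 card width

// Derive screen density from the matched box width in CSS pixels.
function pxPerMm(matchedBoxWidthPx: number): number {
  return matchedBoxWidthPx / CARD_WIDTH_MM;
}

// A 20/20 optotype subtends 5 arcminutes at the viewing distance, so its
// physical height is distance * tan(5 arcmin); scale > 1 draws larger
// (worse-acuity) lines of an eye chart.
function optotypeHeightPx(viewDistanceMm: number, scale: number, ppmm: number): number {
  const fiveArcMinRad = (5 / 60) * (Math.PI / 180);
  return viewDistanceMm * Math.tan(fiveArcMinRad) * scale * ppmm;
}
```

Without this step, the same SVG letter can span anywhere from ~3 mm on a dense phone screen to ~10 mm on a cheap tablet, which is the crux of the hardware-free calibration problem.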

What we're proud of

A fully offline 8-test suite that gives someone with zero access to eye care a genuine sense of what might be wrong — and how urgently to act on it.

What's next

Full UI translation across all supported languages, camera-based cataract detection via TensorFlow.js, PDF export, a Community Health Worker batch mode, and expanded African language support.
