# Percepta

Developed during HackPrinceton Fall '25, Percepta is an iOS camera application that transforms your view of the world through different knowledge-based "lenses": seeing everyday scenes through the eyes of a mathematician, physicist, biologist, or artist.

Percepta combines your device's camera with AI-powered object detection to provide unique perspectives on the world around you. Select a lens mode, point your camera, and receive interpretations and insights tailored to that viewpoint.
## Lenses

- **Mathematician**: Translate scenes into formulas, ratios, and elegant proofs hidden in plain sight.
- **Physicist**: Surface forces, trajectories, and thought experiments that govern every motion.
- **Biologist**: Reveal living systems, evolutionary quirks, and ecological stories in the environment.
- **Artist**: Highlight palettes, composition tricks, and creative insights to inspire your next masterpiece.
## Tech Stack

- Frontend: SwiftUI (iOS)
- Language: Swift
- Architecture: MVVM with modern Swift concurrency (async/await)
- Backend Integration: RESTful API communication with Flask backend
- Camera: AVFoundation framework
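The MVVM + async/await combination can be sketched as follows. This is a minimal illustration only: `LensViewModel` and `APIService.interpret(imageData:lensMode:)` are assumed names for this sketch, not necessarily the types and methods in the actual codebase.

```swift
import SwiftUI

// Illustrative view model: UI state lives on the main actor,
// while the network request runs asynchronously.
@MainActor
final class LensViewModel: ObservableObject {
    @Published var insight: String = ""
    @Published var isLoading = false

    private let api = APIService()  // hypothetical service type

    // The view calls this from a button tap; published updates
    // stay on the main actor, so SwiftUI redraws safely.
    func analyze(imageData: Data, lensMode: String) {
        isLoading = true
        Task {
            do {
                insight = try await api.interpret(imageData: imageData,
                                                  lensMode: lensMode)
            } catch {
                insight = "Analysis failed: \(error.localizedDescription)"
            }
            isLoading = false
        }
    }
}
```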
## Project Structure

```
Percepta/
├── PerceptaApp.swift              # App entry point
├── ContentView.swift              # Navigation root
├── Camera/
│   ├── CameraModel.swift          # Camera logic and capture handling
│   └── CameraPreview.swift        # Camera preview UI component
├── Components/
│   ├── CameraView.swift           # Main camera interface
│   ├── LenSelectorView.swift      # Lens mode selector
│   ├── PermissionDeniedView.swift
│   └── ShutterButton.swift        # Camera capture button
├── Extensions/
│   └── Color+Hex.swift            # Hex color support
├── Models/
│   └── Lens.swift                 # Lens mode data models
├── Screens/
│   ├── HomeScreen.swift           # Lens selection screen
│   └── CameraScreen.swift         # Camera capture screen
└── Services/
    └── APIService.swift           # Backend API communication
```
## Requirements

- Xcode 14.0 or later
- iOS 15.0 or later
- Swift 5.7 or later
- An iOS device or simulator with camera support
## Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/han-hangoc-le/percepta.git
   cd percepta
   ```

2. Open the project in Xcode:

   ```bash
   open Percepta.xcodeproj
   ```

3. Configure the backend URL in `APIService.swift` if needed (default: `http://127.0.0.1:5000/api`).

4. Build and run the project on your device or simulator.
## Backend API

The app expects a Flask backend running with the following endpoints.

### Health Check

Verifies backend connectivity.

Response:

```json
{
  "status": "ok"
}
```

### Object Detection

Processes captured camera frames.

Request:

```json
{
  "imageBase64": "base64_encoded_image_data",
  "lensMode": "mathematician|physicist|biologist|artist"
}
```

Response:

```json
{
  "objects": [
    {
      "id": "unique-id",
      "label": "Object Name",
      "confidence": 0.95,
      "boundingBox": {
        "x": 0.1,
        "y": 0.2,
        "width": 0.3,
        "height": 0.4
      }
    }
  ],
  "lensMode": "mathematician",
  "message": "Interpretation message from the lens perspective"
}
```

## Usage

- Launch the app: you'll see the WorldLens home screen
- Select a lens mode: choose from Mathematician, Physicist, Biologist, or Artist
- Open Camera: tap the "Open Camera" button
- Grant permissions: allow camera access when prompted
- Capture frames: tap the shutter button to analyze what you're seeing
- View insights: receive AI-powered interpretations based on your selected lens
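The detection response shown earlier maps naturally onto `Codable` models. The following is a sketch under that assumption; the type names are illustrative, not necessarily the ones defined in `Lens.swift`:

```swift
import Foundation

// Normalized (0–1) coordinates, matching the backend's boundingBox shape.
struct BoundingBox: Codable {
    let x, y, width, height: Double
}

struct DetectedObject: Codable {
    let id: String
    let label: String
    let confidence: Double
    let boundingBox: BoundingBox
}

struct DetectionResponse: Codable {
    let objects: [DetectedObject]
    let lensMode: String
    let message: String
}

// Decoding a backend payload:
// let response = try JSONDecoder().decode(DetectionResponse.self, from: data)
```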
## Features

- Offline Mock Mode: Automatically falls back to mock data if the backend is unavailable
- Real-time Camera: Native camera integration with AVFoundation
- Dark Mode UI: Sleek, modern interface optimized for low-light viewing
- Async/Await: Modern Swift concurrency for smooth performance
- Smart Error Handling: User-friendly error messages and automatic retry logic
- Flexible Backend: Configurable API endpoints with multiple fallback options
## Error Handling

The app includes robust error handling for:
- Network connectivity issues
- Backend timeouts
- Invalid API responses
- Camera permission denials
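One way these failure cases could be modeled is sketched below. The error enum and retry helper are illustrative assumptions, not Percepta's actual types:

```swift
import Foundation

// Hypothetical error type covering the cases listed above.
enum APIError: LocalizedError {
    case network(underlying: Error)
    case timeout
    case invalidResponse
    case cameraPermissionDenied

    var errorDescription: String? {
        switch self {
        case .network:                return "Could not reach the backend."
        case .timeout:                return "The backend took too long to respond."
        case .invalidResponse:        return "The backend returned unexpected data."
        case .cameraPermissionDenied: return "Camera access was denied."
        }
    }
}

// Retry an async request a few times, then fall back to a default
// value (e.g. mock data), matching the offline mock-mode behavior.
func withRetry<T>(attempts: Int = 3,
                  fallback: T,
                  operation: () async throws -> T) async -> T {
    for _ in 0..<attempts {
        if let value = try? await operation() { return value }
    }
    return fallback
}
```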
## Testing

```bash
# Unit Tests
xcodebuild test -scheme Percepta -destination 'platform=iOS Simulator,name=iPhone 14'

# UI Tests
xcodebuild test -scheme PerceptaUITests -destination 'platform=iOS Simulator,name=iPhone 14'
```

## Deployment

1. Select your development team in the Xcode project settings
2. Update the bundle identifier if needed
3. Archive the project: Product → Archive
4. Distribute through App Store Connect or ad-hoc distribution
## Configuration

Modify `APIService.swift` to configure backend endpoints:

```swift
private let defaultPort = "5000"
private let defaultPath = "/api"
private let defaultTimeout: TimeInterval = 10
```

Adjust camera settings in `CameraModel.swift` for different quality or performance requirements.
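For the camera side, tuning usually comes down to the capture session preset. A minimal sketch of what such an adjustment might look like in `CameraModel.swift` (the function itself is illustrative):

```swift
import AVFoundation

// Illustrative helper: trade still-image quality against per-frame cost.
func configureSession(_ session: AVCaptureSession) {
    session.beginConfiguration()
    // .photo favors still-image quality; .high or .hd1280x720 reduce
    // processing cost if detection latency matters more than resolution.
    if session.canSetSessionPreset(.photo) {
        session.sessionPreset = .photo
    }
    session.commitConfiguration()
}
```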
## License

This project is available under the MIT License.
## Team

- Thu Nguyen: app development and CV phase
- Alex Tran: GenAI overlay and CV/AR phase
- Ethan Do: LLM and GenAI overlay
- Han Le: app development and CV/AR phase
## Acknowledgments

- Built with SwiftUI and modern iOS development best practices
- Inspired by the desire to see the world through different perspectives
- Thanks to the HackPrinceton organizers and mentors
