Inspiration
In classrooms today, students often struggle to truly visualize complex scientific diagrams, engineering mechanisms, biological processes, or 3D structures. Textbooks and chalkboard diagrams simply cannot convey depth or interactivity. We wanted to bridge this gap by making learning immersive, intuitive, and fun. The idea came from observing how quickly students understand a concept when they can see it in action. That sparked our goal: Bring concepts to life using AR and VR so students can explore knowledge, not just read it.
What it does
Meta Edu App is an interactive AR/VR learning platform built with Unity that allows students to explore complex diagrams and processes through highly detailed 3D models.
Key capabilities:
AR Mode – View 3D models overlaid on the real world through your phone's camera
VR Mode – Enter an immersive virtual space and walk around large-scale models
Interactive 3D Visualizations – Rotate, zoom, inspect layers, and interact with objects (a gesture-handling sketch follows at the end of this section)
Process Animations – View step-by-step breakdowns of scientific, anatomical, or mechanical processes
Intuitive Navigation – Smooth controls, gestures, and UI built for students of all ages
It transforms passive reading into active exploration, helping students understand complex topics faster.
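To give a flavour of the gesture handling behind the interactive visualizations, here is a minimal sketch of touch controls of the kind we use: a one-finger drag rotates the model and a two-finger pinch scales it. The component name, speeds, and scale limits are illustrative, not our exact production values.

```csharp
using UnityEngine;

// Hypothetical sketch: one-finger drag rotates the model, two-finger pinch scales it.
public class ModelGestureController : MonoBehaviour
{
    [SerializeField] float rotateSpeed = 0.2f;   // degrees per pixel of drag (illustrative)
    [SerializeField] float minScale = 0.2f;
    [SerializeField] float maxScale = 3f;

    void Update()
    {
        if (Input.touchCount == 1)
        {
            Touch t = Input.GetTouch(0);
            if (t.phase == TouchPhase.Moved)
            {
                // Rotate around the vertical axis so students can view all sides.
                transform.Rotate(Vector3.up, -t.deltaPosition.x * rotateSpeed, Space.World);
            }
        }
        else if (Input.touchCount == 2)
        {
            // Pinch: compare finger distance this frame with the previous frame.
            Touch a = Input.GetTouch(0);
            Touch b = Input.GetTouch(1);
            float prevDist = ((a.position - a.deltaPosition) - (b.position - b.deltaPosition)).magnitude;
            float currDist = (a.position - b.position).magnitude;
            float factor = currDist / Mathf.Max(prevDist, 1f);
            float newScale = Mathf.Clamp(transform.localScale.x * factor, minScale, maxScale);
            transform.localScale = Vector3.one * newScale;
        }
    }
}
```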
How we built it
Unity – Primary engine for rendering AR/VR scenes and building the Android application
AR Foundation / ARCore – For augmented reality tracking, plane detection, and placing 3D models in the camera view (see the placement sketch after this list)
Unity XR Interaction Toolkit – For VR interactions, locomotion, and the immersive environment
Blender – Used to design, rig, and optimize 3D models and animations
C# – Implemented logic for interactions, process animations, UI, and controls
Android Build Pipeline – Exported the final app for Android mobile devices
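To show how the AR Foundation pieces fit together, below is a minimal tap-to-place sketch: it raycasts the touch position against detected planes via ARRaycastManager and spawns (or moves) a model at the hit pose. The component and prefab names are hypothetical; the API calls are the stock AR Foundation ones.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch: tap a detected plane to place (or reposition) a model.
public class TapToPlaceModel : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // assumed to sit on the AR Session Origin
    [SerializeField] GameObject modelPrefab;          // e.g. a heart or engine model exported from Blender

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    GameObject spawned;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast the screen touch against detected planes only.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            if (spawned == null)
                spawned = Instantiate(modelPrefab, pose.position, pose.rotation);
            else
                spawned.transform.SetPositionAndRotation(pose.position, pose.rotation);
        }
    }
}
```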
We created models in Blender, imported them into Unity, optimized them for mobile, and added interactive layers such as highlighting, step-by-step animations, and voice-over explanations.
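Those interactive layers are driven by small C# controllers. The sketch below shows one plausible shape for the step-by-step animation layer: each step names an Animator state, highlights one part of the model, and plays a narration clip. The field names, the state-per-step convention, and the colour-swap highlight are assumptions for illustration, not our exact implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: advance a process animation one labelled step at a time.
public class ProcessStepController : MonoBehaviour
{
    [System.Serializable]
    public class Step
    {
        public string title;            // also used as the Animator state name (assumption)
        public Renderer highlightPart;  // the part of the model to highlight for this step
        public AudioClip voiceOver;     // narration clip for this step
    }

    [SerializeField] Animator animator;
    [SerializeField] AudioSource audioSource;
    [SerializeField] Color highlightColor = Color.yellow;
    [SerializeField] Step[] steps;

    int current = -1;
    Color originalColor;

    // Hooked up to a UI "Next" button.
    public void NextStep()
    {
        if (steps.Length == 0) return;
        ClearHighlight();
        current = (current + 1) % steps.Length;

        Step step = steps[current];
        animator.Play(step.title);                          // assumes one state named after each step
        originalColor = step.highlightPart.material.color;  // remember the colour to restore later
        step.highlightPart.material.color = highlightColor;
        if (step.voiceOver != null) audioSource.PlayOneShot(step.voiceOver);
    }

    void ClearHighlight()
    {
        if (current >= 0 && steps[current].highlightPart != null)
            steps[current].highlightPart.material.color = originalColor;
    }
}
```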
Challenges we ran into
Implementing VR on Android – VR required heavy optimization to maintain stable frame rates, especially on mid-range phones (a render-scaling sketch follows this list).
Model optimization – High-quality 3D models hurt performance, so we had to decimate and simplify meshes while retaining detail.
AR plane detection inconsistencies – Ensuring smooth AR placement across different lighting conditions and devices was tricky.
Interaction complexity – Designing intuitive controls for both AR and VR without confusing the user required iterative testing.
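For the frame-rate problem in particular, one technique worth illustrating is dynamic render scaling. The sketch below lowers XRSettings.renderViewportScale when smoothed frame times exceed the budget and raises it again when there is headroom; the thresholds and intervals are illustrative, and the property's effect can vary between XR providers, so treat this as a sketch rather than our exact tuning.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: adapt the XR render scale to keep frame times near a 60 FPS budget.
public class AdaptiveRenderScale : MonoBehaviour
{
    const float targetFrameTime = 1f / 60f;   // assumed frame-time budget
    const float checkInterval = 2f;           // seconds between adjustments
    float timer;
    float smoothedDelta;

    void Update()
    {
        // Exponentially smooth the frame time so single spikes don't trigger changes.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.05f);
        timer += Time.unscaledDeltaTime;
        if (timer < checkInterval) return;
        timer = 0f;

        float scale = XRSettings.renderViewportScale;
        if (smoothedDelta > targetFrameTime * 1.2f)
            scale = Mathf.Max(0.6f, scale - 0.1f);   // over budget: render at lower resolution
        else if (smoothedDelta < targetFrameTime * 0.9f)
            scale = Mathf.Min(1f, scale + 0.05f);    // headroom: claw resolution back
        XRSettings.renderViewportScale = scale;
    }
}
```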
Accomplishments that we're proud of
Successfully built a cross-platform AR + VR app in a single Unity project
Achieved smooth performance on Android after multiple optimizations
Delivered realistic, visually appealing 3D models that enhance learning
Created an app that teachers and students can actually use to break down complex topics
Built an educational tool that is both fun and technically sophisticated
What we learned
Deep understanding of Unity XR, AR Foundation, and VR interaction systems
How to optimize 3D assets for performance-sensitive environments
Improved UI/UX knowledge for immersive educational tools
Learned to structure multi-mode (AR+VR) navigation within a single mobile application
Gained experience in real-world usage of ARCore tracking and environment detection
What's next for Meta Edu App
Adding voice-guided tutorials and AI-generated explanations
Introducing quiz modules linked to each 3D model
Multi-user AR sessions for collaborative learning
Cloud-based model library so teachers can upload their own topics
Support for iOS (ARKit)
Haptics & gesture-based interaction for an even richer experience
Expanding the library with more subjects, including biology, astronomy, physics, and mechanical engineering