Inspiration

Our team was inspired by Augmented Reality's ability to bring the virtual world into our surroundings. Not everyone has full visibility into environmental crises, yet they are happening all around us and affect each individual's life.

What it does

The AR app places a virtual window on walls or physical windows, through which scenes of multiple natural disasters are presented. It gives people a stronger sense of crisis and a more immersive experience, raising awareness of the everyday changes we can make for our environment's future.

How we built it

We created the scene in Unity3D. Using the AR Foundation toolkit, the app detects features such as planes, windows, or a designated image and spawns virtual objects where we want them. With game logic and animations, the scene is interactive and flexible enough to accommodate added features.
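The image-tracking flow can be sketched roughly as follows; the class, prefab, and field names here are illustrative, not taken from the project:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: spawn the portal prefab when AR Foundation
// reports that the designated reference image has been detected.
public class PortalSpawner : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager trackedImageManager;
    [SerializeField] GameObject portalPrefab; // the "window" into the scene

    void OnEnable()  => trackedImageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var trackedImage in args.added)
        {
            // Parent the portal to the tracked image so it follows its pose.
            Instantiate(portalPrefab, trackedImage.transform);
        }
    }
}
```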

Challenges we ran into

The first challenge was AR Foundation's limitation in its transform system, since the orientation of spawned objects is restricted. To create the portal effect, we had to use shaders to modify the underlying rendering pipeline and present the "illusion" of seeing the scene only through a restricted window. The need to render large scenes behind the portal added further difficulty due to performance problems.
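A common way to build such a portal is a stencil mask: the window quad writes a value into the stencil buffer, and the hidden scene renders only where that value matches. A minimal ShaderLab sketch of the idea (an illustration of the technique, not the project's exact shader):

```shaderlab
// Portal mask: drawn where the "window" quad is. It writes a stencil bit
// and outputs no color, so the real world stays visible around it.
Shader "Custom/PortalMask"
{
    SubShader
    {
        Tags { "Queue" = "Geometry-1" }  // render before the hidden scene
        ColorMask 0                      // draw nothing visible
        ZWrite Off
        Stencil
        {
            Ref 1
            Pass Replace                 // mark these pixels with 1
        }
        Pass { }
    }
}

// Materials of the scene behind the portal would then add:
//     Stencil { Ref 1 Comp Equal }
// so those objects are visible only through the window.
```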

Accomplishments that we're proud of

We came up with creative solutions, such as nesting multiple parent and child transforms to work around the restrictions on rotation and translation. We dynamically changed material properties and selectively edited shader code to achieve the desired "buffer" effect while retaining the assets' original appearance. We also achieved accurate animation with Unity's animation APIs.
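Changing material properties at runtime to opt the hidden scene into the stencil test might look like the sketch below; the `_StencilRef` and `_StencilComp` property names are assumptions about what the shader exposes, not confirmed by the project:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical sketch: configure every renderer under this object so it
// only draws where the portal mask wrote stencil value 1. Assumes the
// shaders expose "_StencilRef" and "_StencilComp" as properties.
public class PortalSceneSetup : MonoBehaviour
{
    void Start()
    {
        foreach (var renderer in GetComponentsInChildren<Renderer>())
        {
            foreach (var material in renderer.materials)
            {
                material.SetInt("_StencilRef", 1);
                material.SetInt("_StencilComp", (int)CompareFunction.Equal);
            }
        }
    }
}
```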

What we learned

We learned about shaders and AR Foundation image tracking.

What's next for Window2Future

Improve its narrative features; add audio and a more interactive UI.

Built With

  • arfoundation
  • c#
  • cg
  • image-tracking
  • shader
  • unity