Inspiration
Kanye West's music has always been larger than life — from the gothic darkness of Yeezus to the futuristic soundscapes of Graduation and Donda. We wanted to take that cinematic energy and put fans inside it. Traditional concert streams feel flat, and even in-person shows only give one perspective. VR gave us the chance to build something impossible in the real world: a concert where the stage itself transforms around you, where you can stand in a burning hellscape one moment and float through a neon cyber-future the next. YeVirtualConcert was born from the idea that Ye's artistry deserves an environment as bold as the music itself.
What it does
YeVirtualConcert is an immersive VR concert experience featuring a virtual Kanye West performing across two radically different worlds. The Hellfire Stage is a dark, gothic cathedral engulfed in flames, inspired by the raw, distorted aesthetic of Yeezus and Donda — lava cracks across the floor, demonic silhouettes rise behind the stage, and the lighting reacts to the bass in real time. The Cyber Cosmos Stage is a hyper-futuristic sci-fi arena floating in deep space, with neon grids, holographic visualizers, and zero-gravity particle effects synchronized to the music. Users put on a VR headset, pick a scene (or let the show transition between them), and experience a full performance with spatial audio, reactive visuals, and a 360° view that puts them front row — or on stage if they want.
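The bass-reactive lighting described above boils down to one idea: per audio frame, measure how much of the spectrum's energy sits in the low band and use that as a 0–1 intensity. Our actual implementation lives in Unity (C#); this is a minimal, language-agnostic sketch of the math, and every name in it is illustrative rather than our shipped code.

```python
# Hypothetical sketch: map low-frequency (bass) energy of one audio frame
# to a normalized 0..1 light-intensity value, the way the Hellfire Stage
# lighting reacts to the bass. Names and thresholds are illustrative.
import numpy as np

def bass_intensity(samples: np.ndarray, sample_rate: int = 48000,
                   bass_hz: float = 150.0) -> float:
    """Return the fraction of spectral energy at or below bass_hz, clamped to 0..1."""
    windowed = samples * np.hanning(len(samples))      # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    bass = spectrum[freqs <= bass_hz].sum()
    total = spectrum.sum() + 1e-9                      # avoid divide-by-zero on silence
    return float(np.clip(bass / total, 0.0, 1.0))

# A pure 60 Hz tone is almost entirely bass energy:
frame = np.sin(2 * np.pi * 60 * np.arange(1024) / 48000)
print(bass_intensity(frame))  # close to 1.0
```

In-engine, the resulting value would drive a light's intensity or an emissive shader parameter each frame, usually with a little smoothing so flames pulse rather than strobe.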
How we built it
We built YeVirtualConcert in Unity using the XR Interaction Toolkit, targeting Meta Quest as our primary platform. The two environments were modeled and textured in Blender, with additional assets and shaders made in Substance Painter and Unity's Shader Graph. For the hell scene, we leaned heavily on VFX Graph for volumetric fire, ember particles, and heat distortion. For the sci-fi scene, we used HDRP-style post-processing, bloom, and custom neon shaders to sell the cyber aesthetic. A rigged 3D Kanye-inspired avatar was animated with mocap data and lip-synced to the audio track. Spatial audio was handled through Unity's built-in spatializer so the sound reacts to where you turn your head.
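The simplest version of the lip-sync pass mentioned above is an envelope follower: measure the loudness (RMS) of each audio window, smooth it with separate attack and release rates, and feed the result to a "jaw open" blendshape. Our real pipeline used mocap and per-track tuning in Unity (C#); the sketch below is just the core idea, with hypothetical names and constants chosen for illustration.

```python
# Hypothetical envelope-follower sketch for audio-driven lip-sync:
# RMS loudness per window -> smoothed 0..1 "jaw open" blendshape weight.
# Attack is faster than release so the mouth opens quickly and closes softly.
import math

def jaw_open(prev_weight: float, window: list[float],
             attack: float = 0.6, release: float = 0.15) -> float:
    """Advance the jaw-open weight by one audio window (all values in 0..1)."""
    rms = math.sqrt(sum(s * s for s in window) / len(window))
    target = min(1.0, rms * 4.0)                 # gain factor tuned by ear
    rate = attack if target > prev_weight else release
    return prev_weight + (target - prev_weight) * rate

w = 0.0
for _ in range(10):                              # sustained loud audio: mouth opens
    w = jaw_open(w, [0.5] * 256)
print(round(w, 2))  # → 1.0
```

This alone looks robotic; blending it with viseme animation (or, in our case, mocap) is what keeps it out of the uncanny valley.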
Challenges we ran into
Performance was our biggest enemy. Standalone VR demands a sustained 72+ FPS while rendering a separate view for each eye, and our hell scene's fire and particle effects tanked the framerate on standalone headsets — we had to rebuild the VFX system using GPU-instanced particles and baked lightmaps to claw the performance back. Lip-syncing the virtual Kanye to real audio without it looking uncanny took many iterations. We also fought with scene transitions: going from "hell" to "cyber cosmos" without breaking immersion or causing motion sickness required a custom shader-based dissolve and carefully paced camera logic. Licensing limitations also forced us to use instrumental/royalty-safe tracks that emulate the vibe of Ye's catalogue rather than the originals.
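The pacing trick behind the dissolve transition is worth spelling out: instead of driving the shader's dissolve parameter with raw elapsed time, we ease it, so the transition starts and ends with zero velocity rather than snapping. A minimal sketch of that curve (the classic smoothstep polynomial; function names here are illustrative, not our Unity shader code):

```python
# Hypothetical sketch of the dissolve pacing: smoothstep easing
# (3t^2 - 2t^3) has zero slope at both ends, so the scene swap
# eases in and out instead of cutting -- gentler in a VR headset.
def smoothstep(t: float) -> float:
    """Clamp t to [0, 1], then apply 3t^2 - 2t^3 easing."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def dissolve_amount(elapsed: float, duration: float = 3.0) -> float:
    """Dissolve progress (0 = hell scene, 1 = cyber cosmos) after `elapsed` seconds."""
    return smoothstep(elapsed / duration)

print(dissolve_amount(0.0))   # 0.0 -- transition not started
print(dissolve_amount(1.5))   # 0.5 -- midpoint of a 3 s dissolve
print(dissolve_amount(3.0))   # 1.0 -- fully in the new scene
```

In the shader, the same 0–1 value would threshold a noise texture so the old scene burns away pixel by pixel; keeping the camera stationary during the dissolve is what protects against motion sickness.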
Accomplishments that we're proud of
We're proud that both scenes feel genuinely distinct — one oppressive and cinematic, the other euphoric and weightless — yet they share the same performer and musical through-line. Hitting a stable framerate on a standalone Quest while still running volumetric fire and real-time reactive lighting felt like a small miracle. We're also proud of the music-reactive visual system: every flame flicker, neon pulse, and particle burst is actually driven by the audio spectrum, not pre-baked animation. And honestly, the first time we put the headset on and saw "Kanye" walk out onto a stage we built from scratch — that was the moment we knew we had something.
What we learned
We learned that VR is a completely different design discipline from flat-screen games — comfort, scale, and sightlines matter as much as visuals. We got hands-on experience with performance budgeting for mobile VR, audio-reactive shaders, character rigging and lip-sync, and the weird art of directing a concert when the "audience" can literally look anywhere. More broadly, we learned how powerful VR can be as a medium for music and storytelling — not just games.
What's next for YeVirtualConcert
We want to expand the experience in three directions. First, more scenes — a sunken Atlantis stage, a desert stage inspired by Jesus Is King, and a minimalist white-void stage for 808s & Heartbreak. Second, a multiplayer mode that lets friends attend the concert together as avatars, react in real time, and take virtual photos. Third, creator tools that open up our reactive-stage system so other artists (not just a Ye concept) can drop in their own tracks and host their own VR shows. Long-term, we see YeVirtualConcert as a proof-of-concept for a new kind of live music platform: concerts that aren't limited by physics, geography, or ticket prices — just imagination.