360° MR Concert

Experience music like never before—explore, interact, and immerse yourself in a cyberpunk concert

This mixed reality concert brings Kenshi Yonezu’s "Lemon" to life in a 360° interactive environment, featuring a live performance by four traditional Chinese instrumentalists playing the Erhu (二胡), Pipa (琵琶), Guzheng (古筝), and Dizi (笛子).


Users step into a futuristic world where they can become the DJ, controlling the mix in real-time—soloing, muting, adding reverb, and adjusting volume for each instrument through an intuitive hand-tracking interface. Beyond the music, they can explore the concert’s visual storytelling, uncovering the meaning behind the song, its lyrics, and cultural inspirations.

ROLE

Solo Experience Designer - Concept Design, Interaction Design, Audio Design

TEAM

Yolanda Yu - Product Designer

Clayton Izuka - Audio Engineer

Sin-Yu Deng - Video Editor

Thomas Giberson - Developer

TOOLS

Unity 3D, Figma, Blender, Adobe Premiere Pro, Ableton Live, C#

XR DEVICE

Oculus Quest

TIMELINE

2 weeks - Jan 2023

Final Design

Turn on your volume 🔊

CHALLENGE

Designing the environment itself as an instrument: something new and seemingly "impossible".

Visual:

Implemented a panoramic skybox shader to display 360° video materials in VR.

Designed an interactive layer that allows users to manipulate their concert experience.

Integrated mixed reality assets synchronized with the music to seamlessly blend real and virtual elements.
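In Unity, displaying 360° footage as the environment can be done with the built-in "Skybox/Panoramic" shader. The sketch below is a minimal illustration of that approach; the class and field names are hypothetical, not taken from the project.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch: routes a 360° video into Unity's built-in
// "Skybox/Panoramic" shader so the footage wraps the whole scene.
public class PanoramicSkybox : MonoBehaviour
{
    public VideoPlayer videoPlayer;      // plays the edited 360° concert footage
    public RenderTexture videoTexture;   // target the VideoPlayer renders into

    void Start()
    {
        // Build a skybox material from the panoramic shader and
        // feed it the video's render texture.
        var skyboxMat = new Material(Shader.Find("Skybox/Panoramic"));
        skyboxMat.SetTexture("_MainTex", videoTexture);

        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = videoTexture;
        videoPlayer.Play();

        RenderSettings.skybox = skyboxMat;
    }
}
```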

Audio:

Utilized Resonance Audio to create an immersive 3D spatial sound experience, dynamically positioning individual musician sources based on their placement in the video.
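One way to position each musician's stem in 3D is to place a spatialized AudioSource on a circle around the listener at the yaw where that performer appears in the footage. This is a hedged sketch of that idea (names and angles are ours); it assumes Resonance Audio is selected as the spatializer plugin in Unity's audio settings.

```csharp
using UnityEngine;

// Hypothetical sketch: one spatialized AudioSource per musician,
// positioned on a circle around the listener to match where each
// performer appears in the 360° video.
public class MusicianPlacement : MonoBehaviour
{
    public AudioSource[] musicians;   // Erhu, Pipa, Guzheng, Dizi stems
    public float[] anglesDeg;         // yaw of each musician in the video
    public float radius = 3f;         // distance from the listener

    void Start()
    {
        for (int i = 0; i < musicians.Length; i++)
        {
            float rad = anglesDeg[i] * Mathf.Deg2Rad;
            // Place the source where the musician appears in the footage.
            musicians[i].transform.position = new Vector3(
                Mathf.Sin(rad) * radius, 0f, Mathf.Cos(rad) * radius);

            // Full 3D spatialization, handled by the configured
            // spatializer plugin (Resonance Audio in our case).
            musicians[i].spatialBlend = 1f;
            musicians[i].spatialize = true;
            musicians[i].Play();
        }
    }
}
```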

CONTRIBUTION

What was I responsible for?

As the product designer, I initiated and developed the core concept with my team, shaping the interaction design and UI in Unity 3D. I also implemented spatial audio using Resonance Audio, ensuring an immersive listening experience. Throughout testing, I played a key role in debugging UI control scripts to refine the user experience.

Environment Design

How did we generate this concept?

We discovered that the original raw video used in the panoramic skybox lacked user comfort due to its monochromatic color, light noise, and classroom setting. To improve the experience, we edited the 360° video in Blender and created a cyberpunk-style environment inspired by New York's Times Square. The immersive world featured Japanese-style billboards and posters related to the song "Lemon" by Kenshi Yonezu. Moreover, we used LED-like rendering, added sci-fi elements, and overlaid instrument names in the video so they correspond clearly with the interactive interface.

Figure 1. Editing the raw video in Premiere Pro

Figure 2. The new 360° environment

Final Looks

Before

The original raw video used in the panoramic skybox lacked user comfort due to its monochromatic color, light noise, and classroom setting.

After

We transformed the space by enhancing colors, adding dynamic cyberpunk visuals, and integrating interactive elements—resulting in a vivid, engaging, and story-driven concert experience.

UI Design

How did I design this interactive control panel?

We aimed for a minimalistic, modern UI that would remain legible in the dark environment. Inspired by digital music players, I created a local control panel for each musician that blends into the environment. I also designed a global panel resembling a mixer to control all instruments. Both designs worked well, but due to technical implementation limitations, the team decided to use the global panel solely as a hand menu. The hand menu opens when the user presses the X button on the left controller and closes with another press. Users interact with the menu via a laser pointer by holding the right controller's index trigger.
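With the Oculus integration for Unity, the open/close behavior described above can be sketched in a few lines; this is a hypothetical illustration, not the project's actual script (in the Oculus SDK, `OVRInput.Button.Three` maps to the X button on the left Touch controller).

```csharp
using UnityEngine;

// Hypothetical sketch of the hand-menu toggle: pressing X on the
// left Touch controller opens the global mixer panel, and pressing
// it again closes it.
public class HandMenuToggle : MonoBehaviour
{
    public GameObject globalPanel;   // the mixer-style hand menu

    void Update()
    {
        // OVRInput.Button.Three is the X button on the left controller.
        if (OVRInput.GetDown(OVRInput.Button.Three))
            globalPanel.SetActive(!globalPanel.activeSelf);
    }
}
```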

Figure 3. Global UI Panel Design

Figure 4. Local Panel Design

Oculus Quest 2 Gameplay Interaction


REFLECTION

How did we overcome technical challenges?

For development, we were excited about creating a mixer-like experience with features such as solo, mute, and volume sliders, and we explored ways to emulate the user experience of a DAW mixer. While different DAWs have varying workflows, we found that aligning with Ableton Live's solo-button behavior worked best for our project. This decision allowed us to focus on delivering a high-quality experience without the complexity of accommodating multiple scenarios. We remain committed to improving our coding skills to further refine the project.
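The Ableton-style behavior we aligned with can be summarized as: if any track is soloed, every non-soloed track is silenced, and a track's own mute always wins. A hedged sketch of that logic (class and method names are ours, not from the project):

```csharp
using UnityEngine;
using System.Collections.Generic;

// Hypothetical sketch of Ableton-style solo/mute logic: when any
// track is soloed, every non-soloed track is silenced; a track's
// own mute always takes priority.
public class InstrumentMixer : MonoBehaviour
{
    public AudioSource[] tracks;                 // one stem per instrument
    private readonly HashSet<int> soloed = new HashSet<int>();
    private readonly HashSet<int> muted = new HashSet<int>();

    public void ToggleSolo(int i)
    {
        if (!soloed.Remove(i)) soloed.Add(i);
        Apply();
    }

    public void ToggleMute(int i)
    {
        if (!muted.Remove(i)) muted.Add(i);
        Apply();
    }

    public void SetVolume(int i, float v)   // hooked to the panel's slider
    {
        tracks[i].volume = Mathf.Clamp01(v);
    }

    private void Apply()
    {
        for (int i = 0; i < tracks.Length; i++)
        {
            bool audible = !muted.Contains(i)
                && (soloed.Count == 0 || soloed.Contains(i));
            tracks[i].mute = !audible;
        }
    }
}
```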