VR Gesture Instrument
A year-long capstone exploring what VR uniquely enables for musical expression through gesture-driven sound and immersive audiovisual feedback.
Project Snapshot
Project Type
Research-led capstone + interactive VR instrument
Primary Goal
Create a musical experience that is not possible in the physical world
Core Stack
Unity, FMOD, Blender, XR Interaction Toolkit
Current Stage
Actively prototyping and validating interaction/audio mappings
Key Contributions
- Reframed the project from a novelty concept into a research-driven question about VR in music
- Led team coordination and cross-discipline decision-making across code, audio, and 3D design
- Designed the FMOD event architecture for responsive gesture-to-audio mapping (see the sketch after this list)
- Implemented and iterated audio reactivity patterns tied to user movement and interaction context
- Contributed to Unity-side interaction tuning for expressive and readable user feedback loops
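As a flavor of the gesture-to-audio work, here is a minimal Unity C# sketch of one mapping pattern: hand speed from the XR input system drives a continuous FMOD parameter while the event instance follows the hand for spatialized feedback. The event path, parameter name, and 3 m/s normalization are illustrative assumptions rather than the project's actual layout, and the snippet assumes the FMOD Studio 2.x C# API.

```csharp
using UnityEngine;
using UnityEngine.XR;
using FMODUnity;
using FMOD.Studio;

// Illustrative sketch only: "event:/Instrument/Pad" and "Intensity"
// are hypothetical names, not the project's real event architecture.
public class GestureToAudio : MonoBehaviour
{
    EventInstance pad;

    void Start()
    {
        pad = RuntimeManager.CreateInstance("event:/Instrument/Pad");
        pad.start();
    }

    void Update()
    {
        var hand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (hand.TryGetFeatureValue(CommonUsages.deviceVelocity, out Vector3 velocity))
        {
            // Normalize hand speed into 0..1 (~3 m/s treated as a fast sweep).
            float intensity = Mathf.Clamp01(velocity.magnitude / 3f);
            pad.setParameterByName("Intensity", intensity);
        }

        // Keep the emitter at the hand so the sound stays spatially anchored.
        pad.set3DAttributes(RuntimeUtils.To3DAttributes(transform));
    }

    void OnDestroy()
    {
        pad.stop(STOP_MODE.ALLOWFADEOUT);
        pad.release();
    }
}
```

Driving a continuous parameter every frame, rather than firing one-shot events per gesture, is the kind of pattern being iterated on to keep the response feeling instrument-like.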
Research Focus
- Which forms of musical control feel native in 3D space compared with flat-screen or physical instruments?
- How does spatialized audio feedback affect perceived control, immersion, and creative flow?
- What gesture complexity remains expressive without introducing fatigue or confusion for first-time users?
- How can visuals and audio co-respond fast enough to feel instrument-like rather than game-like? (one coupling approach is sketched after this list)
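One coupling approach for that last question, again under hypothetical names, is to derive a single normalized gesture signal each frame and fan it out to both the FMOD parameter and the renderer, so the audio and visual responses cannot drift apart:

```csharp
using UnityEngine;
using FMODUnity;
using FMOD.Studio;

// Illustrative sketch: one gesture-derived signal drives sound and light in
// the same frame. "event:/Instrument/Voice" and "Energy" are hypothetical.
public class AudioVisualCoupler : MonoBehaviour
{
    public Renderer visual; // material must have emission enabled
    static readonly int EmissionId = Shader.PropertyToID("_EmissionColor");
    EventInstance voice;

    void Start()
    {
        voice = RuntimeManager.CreateInstance("event:/Instrument/Voice");
        voice.start();
    }

    // Call once per frame with a normalized (0..1) gesture signal.
    public void Drive(float energy)
    {
        voice.setParameterByName("Energy", energy);                 // audio responds
        visual.material.SetColor(EmissionId, Color.white * energy); // visuals respond same frame
    }

    void OnDestroy()
    {
        voice.stop(STOP_MODE.ALLOWFADEOUT);
        voice.release();
    }
}
```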
Active Build Tracks
- Gesture interaction prototypes and control vocabulary definition
- FMOD system design for event, parameter, and mix behavior (a snapshot-based mix sketch follows this list)
- Blender-to-Unity environment and asset integration
- User testing passes focused on comfort, onboarding, and expressiveness
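For the mix-behavior track, one candidate pattern is snapshot-driven mixing. The sketch below assumes a hypothetical "snapshot:/PerformanceFocus" snapshot and drives FMOD's built-in snapshot Intensity parameter (0..100) to blend the mix toward a performance state as player activity rises.

```csharp
using UnityEngine;
using FMODUnity;
using FMOD.Studio;

// Illustrative sketch: "snapshot:/PerformanceFocus" is a hypothetical
// snapshot path, not the project's actual mixer setup.
public class MixFocus : MonoBehaviour
{
    EventInstance focusSnapshot;

    void Start()
    {
        // Snapshots are created and started like events in the Studio API.
        focusSnapshot = RuntimeManager.CreateInstance("snapshot:/PerformanceFocus");
        focusSnapshot.start();
    }

    // activity: 0 = idle ambience, 1 = full performance mix.
    public void SetActivity(float activity)
    {
        // Snapshots expose a built-in "Intensity" parameter ranging 0..100.
        focusSnapshot.setParameterByName("Intensity", Mathf.Clamp01(activity) * 100f);
    }

    void OnDestroy()
    {
        focusSnapshot.stop(STOP_MODE.ALLOWFADEOUT);
        focusSnapshot.release();
    }
}
```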

This project is in active production; full documentation, additional screenshots, and a demo video will be added as development progresses.
Overview
This capstone started as a challenge to design a VR musical instrument and evolved into a broader research question: what does VR make possible for music creation that the real world cannot? The project combines gesture-based performance, real-time reactive audio, and immersive visual feedback to investigate new forms of musical interaction.
My Role
I serve as Group Lead and Audio Director. I shape project scope, guide team alignment, and own core audio system strategy and implementation in FMOD while collaborating on Unity integration and interaction tuning.
Outcome
The project is still in active development. Current progress has established a viable prototype direction, a reusable audio interaction architecture, and a clear research framework for evaluating VR-native musical affordances through ongoing iteration and user testing.