Dispatch is a VR project built around a problem I found genuinely frustrating in VR: movement often feels unnatural and uncomfortable. The original goal was to explore how more natural locomotion could reduce motion sickness and make VR movement feel closer to how people actually move in real life.
The system combines software-based locomotion with motion sensing and omnidirectional movement. While the final application focused on first-responder training, the core of the project stayed centered on locomotion and how software and hardware interact with human perception.
This ended up being one of my favorite projects I have worked on.
The guiding question behind Dispatch was simple:
How can locomotion in VR feel more natural while reducing motion sickness?
Rather than treating this as only a software problem or only a hardware problem, we approached it as something shaped by movement logic, physical sensing, and how people process motion and space. That framing is what initially pulled me into the project and kept my interest throughout development.
We applied the system to first-responder training, specifically scenarios like navigating a burning building[1]. This context made sense because it involves high stress, movement, and spatial awareness, and it benefits from realistic but accessible training tools.
The idea was to enable cost-effective VR training that could be used at home or onsite, improving accessibility for firefighters, EMTs, and rescue teams[2]. While this project does not claim training outcomes, it explores how better locomotion design could support more realistic and usable VR training environments[3].
My focus was on research direction, VR software, and interaction design. I worked primarily in Unity on locomotion logic, player movement, and how the system would feel to use. I also helped with design decisions related to how the player would move through the environment and how the system should be applied in practice.
I collaborated closely with one other teammate who was focused on similar areas, while two teammates focused more heavily on hardware and sensor integration. I was also involved in shaping the overall feel of the project and helping put together project materials.
My work touched on several areas:

- Software-based locomotion and movement logic in Unity
- Omnidirectional movement and spatial navigation
- Awareness of motion sensors and hardware constraints
- Accessibility and usability as design considerations
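None of the project's Unity code appears here, but the core idea behind the movement logic, turning a noisy motion-sensor reading into smooth in-game velocity, can be sketched in a few lines. Everything below is an illustrative assumption (the dead-zone threshold, smoothing constant, and speed limit are hypothetical), not the project's actual implementation:

```python
# Illustrative sketch: mapping a noisy motion-sensor intensity (0..1)
# to a smooth locomotion speed. The dead zone ignores small jitters,
# which otherwise cause nauseating micro-movements, and exponential
# smoothing avoids abrupt acceleration -- both are common VR comfort
# techniques. All constants are hypothetical, not from Dispatch itself.

DEAD_ZONE = 0.15   # readings below this count as standing still
SMOOTHING = 0.2    # 0..1: higher = faster response, lower = smoother
MAX_SPEED = 3.0    # meters per second at full sensor intensity

def step_speed(current_speed: float, sensor_intensity: float) -> float:
    """Advance the player's speed one frame toward the sensed target."""
    # Clamp the raw reading, then apply the dead zone.
    intensity = min(max(sensor_intensity, 0.0), 1.0)
    if intensity < DEAD_ZONE:
        intensity = 0.0
    target = intensity * MAX_SPEED
    # Exponential smoothing toward the target speed.
    return current_speed + SMOOTHING * (target - current_speed)

# Simulate a few frames of a player starting to walk.
speed = 0.0
for reading in [0.05, 0.4, 0.8, 0.8, 0.8]:
    speed = step_speed(speed, reading)
```

In a Unity project this kind of logic would typically live in a per-frame update on the player controller; the point of the sketch is only the shape of the mapping, not the engine integration.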
What made this project especially interesting to me was how naturally it raised neuro-adjacent questions. Working on locomotion and motion sensing made me think more about how the brain processes balance, movement, and spatial orientation in VR.
One idea I kept coming back to was how neuro-adaptive approaches might further reduce motion sickness or make movement feel more natural.
These questions are a big part of why this project stood out to me and why I am interested in continuing to explore sensing, neurotechnology, and interactive systems.