VR Mech cockpit designed to reduce motion sickness

The Magic Edge simulator circa 1996


Piloting giant equipment from inside a cockpit, be it a jet fighter or a giant mech, is just plain cool. Magic Edge went bankrupt in the '90s, but not before capturing the hearts of Silicon Valley nerds; my friends and I still talk fondly of its physical cockpits. Implementing this in virtual reality is a challenging problem, however, because the virtual vehicles need to move around, and locomotion in VR conflicts with our natural senses: moving inside a vehicle can trigger motion sickness if we don't feel our chairs moving along with what our eyes see. Thankfully, our senses are a bit blurry and fallible. By reducing the visual cues that imply virtual acceleration forces, coupled with an additional sensor that measures hip orientation, full VR locomotion can be achieved with nothing more than a smartphone and an office chair.

It should be noted that this blog post covers design decisions in the context of a walking robot and may not be the best solution for every situation. Also, if you're interested in streaming orientation data over WebSockets in your own projects, I've uploaded an example project to get you started.


Fixed velocities + dampened impacts

 
"Acceleration (linear or angular, in any direction) conveyed visually but not to the vestibular organs constitutes a sensory conflict that can cause discomfort. An instantaneous burst of acceleration is more comfortable than an extended, gradual acceleration to the same movement velocity." - Oculus Best Practices: Simulator Sickness
 

One of the guidelines for reducing motion sickness can be paraphrased as: only set constant velocities. Unlike Nintendo's Super Mario, where Mario accelerates the longer "forward" is held down, setting constant velocities removes the implication of an acceleration force. Since it would take infinite acceleration to change velocities in zero time, our brains kindly ignore the missing force (see Newton's second law, force = mass x acceleration). Moving in VR at various speeds can still be done with analog control inputs: the Xbox controller's triggers and joysticks work great for mapping velocity values across their range of travel. As long as the user has a 1-to-1 expectation of their velocity, motion sickness is greatly reduced. This control scheme works well for forward-back and strafing left-right, but it isn't the full story in VR. Missing is what happens in a world with accelerations, such as gravity. Fast decelerations from collisions carry forces our brains have learned to expect, and any acceleration left unaccounted for is a cause of motion sickness.
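
As a minimal sketch of this idea in Unity (assuming a Rigidbody-based cockpit and Unity's legacy input axes; the class name and speed value are placeholders), the analog input maps directly to a velocity rather than to a force:

```csharp
using UnityEngine;

// Maps analog input directly to a constant velocity (no gradual acceleration).
// Hypothetical sketch: assumes a Rigidbody on the cockpit and the legacy Input axes.
public class CockpitMotion : MonoBehaviour
{
    public float maxSpeed = 5f;   // meters per second at full stick deflection
    private Rigidbody body;

    void Start()
    {
        body = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        // 1-to-1 mapping: the stick position *is* the velocity, so there is
        // never a visually implied acceleration ramp.
        Vector3 input = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
        Vector3 horizontalVelocity = transform.TransformDirection(input) * maxSpeed;

        // Preserve gravity's contribution on the vertical axis.
        body.velocity = new Vector3(horizontalVelocity.x, body.velocity.y, horizontalVelocity.z);
    }
}
```

Because the velocity is set directly, releasing the stick stops the cockpit instantly, with no visually implied deceleration ramp.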

Note: even with 1-to-1 velocity control, motion sickness can still occur if the user button-mashes to create jerky motion for themselves. Since I'm implementing slow-moving robots, rate limiting the inputs solves this problem, but keep it in mind for your own implementation of 1-to-1 velocity controls.
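
One plausible way to rate limit, sketched here as an assumption rather than the exact approach used in the project: clamp how fast the commanded velocity is allowed to change between physics steps.

```csharp
using UnityEngine;

// Hypothetical slew-rate limiter: the commanded velocity may change by at most
// maxDelta per physics step, so rapid button mashing can't produce jerky motion.
public static class InputRateLimiter
{
    public static Vector3 Limit(Vector3 previous, Vector3 requested, float maxDelta)
    {
        return Vector3.MoveTowards(previous, requested, maxDelta);
    }
}
```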

The solution to the missing force in VR is found in games like Lucky's Tale, which slowly move the player's view without causing motion sickness. The camera ramps up slowly enough that it's plausible for our brains not to feel an acceleration force; the end result is a mostly imperceptible camera move. This, along with a healthy dose of attention redirection through gameplay, sights, and sounds, works to mask the discomfort entirely. After a couple of attempts, my best implementation used Unity's Spring Joint to connect the camera and cockpit to each other while keeping them separate physics objects. The camera is not affected by gravity but has a large drag force to dampen the spring force pulling it along behind the cockpit. The camera-spring motion is a good visual indicator of the direction and scale of an impact force for the VR player. I have no evidence beyond anecdotes and watching a couple of people, but rubber-banding the player's camera on impacts is much better than not doing it.
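
A minimal sketch of that camera rig, assuming the camera and cockpit each have their own Rigidbody; the spring, damper, and drag values below are placeholder guesses to be tuned:

```csharp
using UnityEngine;

// Connects a camera rig to the cockpit with a SpringJoint so impacts reach
// the player's view as a soft rubber-band motion instead of a hard snap.
// Sketch only: spring, damper, and drag values are placeholders to tune.
public class CameraSpringRig : MonoBehaviour
{
    public Rigidbody cockpitBody;   // the cockpit the camera follows

    void Start()
    {
        Rigidbody cameraBody = GetComponent<Rigidbody>();
        cameraBody.useGravity = false;  // camera ignores gravity...
        cameraBody.drag = 8f;           // ...and heavy drag damps the spring pull

        SpringJoint joint = gameObject.AddComponent<SpringJoint>();
        joint.connectedBody = cockpitBody;
        joint.spring = 50f;             // how hard the cockpit tugs the camera
        joint.damper = 5f;
        joint.autoConfigureConnectedAnchor = true;
    }
}
```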

A major goal of this mech simulator was that the user would "feel" the giant mech taking steps rather than just floating around. Syncing the footstep sound with visual motion cues is a really simple way to fool the brain into believing what it's seeing. My implementation of a step simply shifts the collision box upwards and triggers the audio clip of a single footstep. After the collision box teleports upward, gravity brings the cockpit back to the ground smoothly. Finally, before the audio clip finishes playing, the collision box transitions ("lerp" function in Unity) back into place, ready for the next step. Thanks to the Spring Joint, the VR player's camera is isolated from the motion, but the motion is still visible enough to convey the sense of a step. In the third animated GIF above, the view is attached to the camera, which closely follows the cockpit, so the cockpit appears stationary and the passing cube seems to bounce. In reality it's the cockpit that is bouncing while the camera smoothly trails behind it.
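
Here's a sketch of that step cycle as a Unity coroutine, under the assumption that the collider lives on its own child object; the timing and height values are placeholders:

```csharp
using System.Collections;
using UnityEngine;

// One footstep: pop the collision box up, let gravity settle the cockpit,
// then lerp the box back down before the step sound finishes.
// Sketch only: stepHeight and the timing values are placeholder guesses.
public class MechFootstep : MonoBehaviour
{
    public Transform collisionBox;   // child object carrying the cockpit's collider
    public AudioSource stepAudio;    // single footstep clip
    public float stepHeight = 0.5f;
    public float settleTime = 0.4f;  // time to let gravity bring the cockpit down
    public float lerpTime = 0.3f;    // time to slide the box back into place

    public IEnumerator TakeStep()
    {
        Vector3 restPosition = collisionBox.localPosition;

        // Teleport the collision box up and play the footstep sound.
        collisionBox.localPosition = restPosition + Vector3.up * stepHeight;
        stepAudio.Play();

        // Gravity pulls the cockpit down onto the raised box.
        yield return new WaitForSeconds(settleTime);

        // Lerp the box back to its rest position before the clip ends.
        for (float t = 0f; t < 1f; t += Time.deltaTime / lerpTime)
        {
            collisionBox.localPosition = Vector3.Lerp(
                restPosition + Vector3.up * stepHeight, restPosition, t);
            yield return null;
        }
        collisionBox.localPosition = restPosition;
    }
}
```

A controller script would call `StartCoroutine(TakeStep())` once per footfall, rate limited as discussed above.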

Using a smartphone to track the head and hips separately

When it comes to things not to do to VR players, rolling the camera is probably number one; I can't think of anything that more closely simulates a hangover. Close behind rolling the camera is rotating it left-right, also called yaw. This is problematic because we want to turn the mech to go where we want while still being able to look around the cockpit. In other words, our heads and bodies need to be tracked independently. I had plans to attach a rotary encoder to my office chair to always know where it was pointing, but then I realized my smartphone already did this.

Instead of requiring user input to control heading, the iPhone in my pocket reports its orientation to Unity, and therefore the direction of my hips. This leaves the head free to look around while preserving the associated angular acceleration forces in view, since the chair is actually rotating. The data is transmitted to Unity through a WebSocket connection served from an HTML page that accesses the iPhone's orientation API; no app install needed, it runs in the default Safari web browser. A smoothed integration method is used to keep the response fast. For this initial application the smartphone's gyro drift error is ignored, since the user is in VR and has no visual reference points that would reveal the drift. Correcting it with a Kalman filter or similar is the obvious next step for improvement.
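
On the Unity side, once yaw values arrive over the WebSocket (transport omitted here; see the example project), applying them could look like the sketch below. The exponential smoothing via `Mathf.LerpAngle` stands in for the "smoothed integration method" and is an assumption, not the exact implementation:

```csharp
using UnityEngine;

// Applies phone-reported hip yaw to the cockpit, smoothing between updates.
// Sketch only: the WebSocket transport is omitted, and the smoothing factor
// is a placeholder to tune.
public class HipYawTracker : MonoBehaviour
{
    public float smoothing = 10f;    // higher = snappier response
    private float targetYaw;         // latest yaw from the phone, in degrees

    // Called by the WebSocket layer whenever the phone sends an update.
    public void OnPhoneYaw(float yawDegrees)
    {
        targetYaw = yawDegrees;
    }

    void Update()
    {
        // LerpAngle handles the 359 -> 0 degree wraparound correctly.
        float yaw = Mathf.LerpAngle(
            transform.eulerAngles.y, targetYaw, smoothing * Time.deltaTime);
        transform.rotation = Quaternion.Euler(0f, yaw, 0f);
    }
}
```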

Stationary visual reference point

The final requirement for a mech VR experience is separating what happens outside the cockpit from what happens inside. The brain needs a static visual cue to separate what it sees from what it feels. For example, flying in an airplane is quite disorienting when the plane banks into a turn: passengers on one side see nothing but sky, the other side sees only ground, yet everyone feels stuck to their seats. Which way is up? To avoid motion sickness we have to trick the brain into thinking it's not moving relative to the mech, even if the mech is moving relative to the world. If the objects filling our vision aren't moving, and we aren't moving relative to them, the brain concludes it isn't moving either and expects no forces. This is accomplished by filling the player's view with static geometry representing the cockpit. The static cockpit grounds the brain in the local reference frame even when the mech is jumping or walking backwards. The tradeoff is visibility: if the player is fully enclosed, it's impossible to tell they're moving, but then they can't see anything either.

Next steps

I have many ideas for details I'd like to implement, such as vibrating and tilting the chair, dynamically changing the field of view, or adding surface details to the VR windows to feel enclosed and protected. Even adding a nose:

 
"Researchers claim the addition of the nose allowed people to play one of the games tested, a VR stroll around a Tuscan villa, for an average of 94.2 seconds LONGER than those without. " - www.express.co.uk/
 

The ergonomics of VR comfort may seem purely technical, but in practice they overlap heavily with design aesthetics and subjective taste. To make something feel good in VR, it also has to follow good design rules; the same rules of human manipulation and interaction apply to VR as much as to reality. I'm really looking forward to fusing disciplines with diverse minds and new perspectives, and to exploring more of the possibilities VR/AR affords us.