The better technology gets, the more likely it is to give you a headache or make you throw up. The trend is inescapable: Whether it’s videogames, Apple’s latest mobile operating system, 3D movies and television, or Google Glass, a portion of the population—basically, anyone predisposed to motion sickness—is going to spend their sunset years, when this kind of technology is ubiquitous, in serious discomfort.
And if you think you can escape it simply by avoiding sophisticated but optional entertainments, think again—the latest example is people experiencing motion sickness as a result of Apple’s new iOS 7, which uses a parallax effect to make its interface look 3D. (If you haven’t experienced iOS 7 yourself, this video is a good illustration.)
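The parallax effect works by shifting interface layers slightly in the opposite direction of device tilt, with "deeper" layers moving more, which tricks the eye into seeing depth. A minimal sketch of that idea (the function and parameter names here are illustrative, not Apple's actual API):

```python
def parallax_offset(tilt_deg, layer_depth, max_offset=10.0):
    """Illustrative parallax: shift a UI layer opposite to device tilt.

    layer_depth is 0.0 (foreground, no shift) to 1.0 (background,
    full shift); max_offset is the largest shift in points.
    These are assumed values for the sketch, not iOS 7's real ones.
    """
    # Normalize tilt to the range -1..1, clamping at +/- 45 degrees
    t = max(-1.0, min(1.0, tilt_deg / 45.0))
    # Deeper layers move farther, opposite the tilt direction
    return -t * layer_depth * max_offset

# Tilting the device 45 degrees shifts a background layer 10 points left:
print(parallax_offset(45, 1.0))  # -10.0
```

It is exactly this continuous, motion-coupled drift of the whole interface that some users' vestibular systems object to.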
While people encountering these effects for the first time compare them to motion sickness, what they’re experiencing has a more specific name—simulation sickness. The US Army has known about the problem for decades, since it often uses simulators to train soldiers. Motion sickness arises when our inner ear senses movement but our eyes don’t perceive any, whereas simulation sickness is the inverse: We see motion that should indicate we’re moving when we’re not. The exact incidence of these disorders is hard to pin down: motion sickness occurs in between 25% and 40% of the population, depending on the mode of transit, and simulation sickness occurs in between 13% and 90% of the population, depending on how immersive and convincing the virtual environment is. (pdf)
We get sick, goes the classic hypothesis about the origins of motion sickness, because in our evolutionary history, a disconnect between our equilibrium and our visual cues indicated we’d probably ingested something poisonous, and that the best thing to do next was vomit. (Other explanations exist—for example, that some of us are just worse at adapting to confusing sensory cues.)
Pioneers in virtual reality wondered as early as 1992 whether simulation sickness would limit adoption of the technology, not realizing, perhaps, that eventually just about every interface humans might use would take on elements of virtual reality.
As the technology to generate and display 3D environments and effects has become less expensive, it has made its way into pretty much everything with an interface. Some implementations are more egregious than others. Oculus Rift, which only sounds like the clinical term for the disorientation it inspires, is a virtual reality headset that one reviewer said “is amazing until it makes you want to hurl.” The company that makes it, Oculus, says one issue is a subtle lag between users’ head movements and what they see in their headsets. Oculus is working on the problem, but the company says it may never go away.
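The reason even a subtle lag matters becomes clear with a little arithmetic: during the delay between a head movement and the display updating (the motion-to-photon latency), the rendered world trails the real one by an angle proportional to how fast the head is turning. The numbers below are illustrative, not Oculus's measured figures:

```python
def angular_lag(head_velocity_deg_s, latency_s):
    """Degrees by which the rendered view trails the head during one
    latency window. Both inputs are assumed, illustrative values."""
    return head_velocity_deg_s * latency_s

# A brisk 120 deg/s head turn with a hypothetical 50 ms latency:
print(angular_lag(120, 0.050))  # 6.0 degrees of visual lag
```

A six-degree mismatch between where your inner ear says you are looking and what your eyes are shown, repeated on every head movement, is precisely the sensory conflict that triggers simulation sickness.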
Three-dimensional displays have their own issues, for a different reason. Normally, our eyes must each aim slightly inward at an object coming toward us, in a process called convergence. Meanwhile, the lenses of our eyes bend to maintain focus on an object, which is called accommodation. But 3D displays force our eyes to converge without accommodating (because the object is still displayed on an unmoving, flat screen), which gives some people headaches.
And yet this technology is cropping up everywhere—in 3D hand-held gaming systems like Nintendo’s 3DS, 3D movies, 3D televisions and, eventually, in smartphones made by Amazon. Pioneers of wearable computing have warned that Google Glass may induce headaches for similar reasons, and I experienced it myself after only a few minutes of using Glass.
A 3D smartphone with a 3D interface might induce headaches and motion sickness, or at least heighten both. Wearable computers like Google Glass will only improve in their ability to project high-resolution interfaces directly into our field of view, making nausea-inducing interfaces physically unavoidable. I get headaches at 3D movies and motion sick at the slightest provocation, so it’s no surprise I found Google Glass unpleasant, and I expect that a few minutes with Oculus Rift would have me writhing on the floor. The thought of combining the two into a hypothetical “augmented reality” gives me hives. If your constitution is as delicate as mine, the 21st century is going to be one you’ll want to spend hiding from just about every kind of innovation in human-computer interfaces.