UX & VR - Designing for Virtual Reality Technology

The idea of a Virtual Reality (VR) system has been in development for many years; the first prototypes were built in the 1960s. In 1995, Nintendo launched a VR system known as the ‘Virtual Boy.’ It was pulled from the market in 1996 – reportedly it caused headaches and ghosting red afterimages that lingered long after the headset had been removed.

The aim of VR is to provide sensory inputs that replace the real environment with an imagined simulation, allowing the user to interact with that simulation. VR can create an environment that incorporates vision and hearing and, in some circumstances, touch and smell.

In the last 20 years, there have been many developments to improve the user experience. Today, several manufacturers have launched VR systems, including Google Cardboard, HTC Vive, Oculus, Project Morpheus, and Samsung Gear VR.

As the range of platforms grows, content is developing alongside it, and developers are realizing that interaction with a flat screen is conceptually very different from interaction in VR. There are many issues developers need to address in the VR interface to provide an experience that meets user needs for comfort while understanding and working within a user’s field of view.

Screen resolution is one such issue. People are accustomed to high-resolution HD flat screens, whether a TV or a computer monitor. In a VR headset, the screen is both smaller and much closer to the user’s eyes, so whichever system people are using, the effective resolution is much lower than on 2D displays. This affects how menus, text, and buttons are incorporated into the platform, and how developers position content in a 3D environment.
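Because of that lower effective resolution, VR text is often sized by the visual angle it subtends rather than by pixels. A minimal sketch of that calculation, assuming a hypothetical 20-arcminute comfort threshold (an illustrative figure, not a fixed standard):

```python
import math

def min_text_height(distance_m: float, arcminutes: float = 20.0) -> float:
    """Return the minimum physical text height (in metres) for text
    rendered at `distance_m` to subtend `arcminutes` of visual angle.
    The 20-arcminute default is an assumed comfort value for this
    sketch, not a published specification."""
    angle_rad = math.radians(arcminutes / 60.0)
    return 2.0 * distance_m * math.tan(angle_rad / 2.0)

# A menu panel placed 2 m away in the virtual scene:
print(round(min_text_height(2.0) * 100, 2), "cm")  # ≈ 1.16 cm tall
```

The same formula explains why UI placed farther away in the scene must be scaled up proportionally to remain readable.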


Comfort is a major factor in the user experience. People’s comfort in the real world relies on the brain’s ability to focus on fixed points in space. In VR, the view moves in response to head movement, which can create an uncomfortable, dizzying experience that leads to motion sickness. To address this, developers are introducing options such as ‘blink mode’: when VR users select it, the simulation updates in discrete steps rather than continuously, which can minimize motion sickness. Developers are also including small, translucent boxes or patterns within the simulation; these remain fixed, giving the brain a point of reference that further reduces motion sickness.
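The step-by-step update behind blink mode can be sketched as a snap-rotation routine: continuous turning input is accumulated and released only in discrete jumps, so the scene never sweeps smoothly across the user’s view. The class, step size, and per-frame input values here are illustrative assumptions, not any vendor’s API:

```python
class SnapTurner:
    """Minimal sketch of a 'blink'-style comfort rotation.
    Input is a hypothetical turn delta in degrees per frame."""

    def __init__(self, step_deg: float = 30.0):
        self.step_deg = step_deg
        self.pending = 0.0   # accumulated, not-yet-applied rotation
        self.yaw = 0.0       # orientation actually shown to the user

    def update(self, input_delta_deg: float) -> float:
        self.pending += input_delta_deg
        # Release rotation only in whole steps; the view 'blinks'
        # to the new orientation instead of animating toward it.
        while abs(self.pending) >= self.step_deg:
            step = self.step_deg if self.pending > 0 else -self.step_deg
            self.yaw += step
            self.pending -= step
        return self.yaw

turner = SnapTurner()
turner.update(10.0)          # yaw stays 0.0 – input is still pending
turner.update(10.0)          # yaw stays 0.0
print(turner.update(10.0))   # 30.0 – a single discrete jump
```

Because the displayed orientation only ever changes in instantaneous steps, there is no smooth optical flow for the vestibular system to disagree with.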

In a desktop environment, sound doesn’t need to be location-specific. In VR, it is important that a sound can be associated with its visual source – if you see something to your left, in the distance, the associated sound should also come from the left, in the distance. As you ‘move’ towards the object, the sound should change in a natural manner. Programming for this enhances the user experience.
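The distance and direction cues described here can be sketched with a toy gain-and-pan calculation. Real VR audio engines use head-related transfer functions (HRTFs) and richer distance models; the function below is an assumed simplification for illustration only:

```python
import math

def spatialize(listener_pos, source_pos, ref_dist=1.0):
    """Toy positional-audio sketch: inverse-distance gain plus a
    left/right pan from the source's horizontal offset. Positions
    are (x, z) pairs in metres, with the listener at the origin
    facing +z. Returns (left_gain, right_gain)."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dz)
    gain = ref_dist / max(dist, ref_dist)            # quieter with distance
    pan = max(-1.0, min(1.0, dx / max(dist, 1e-6)))  # -1 = left, +1 = right
    left = gain * (1.0 - pan) / 2.0
    right = gain * (1.0 + pan) / 2.0
    return left, right

# A source 2 m to the listener's left: loud in the left channel only.
print(spatialize((0.0, 0.0), (-2.0, 0.0)))  # (0.5, 0.0)
```

As the user ‘moves’ towards the object, `dist` shrinks and the gain rises smoothly, which is exactly the natural behaviour the paragraph above calls for.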

Users need to be able to interact with the virtual world. Currently, interaction is based on controllers (such as the Xbox One controller used with the Oculus Rift CV1). Oculus Touch and the HTC Vive controllers are being developed specifically for VR, again enhancing the experience. Controller-free interaction is also in development, allowing the VR system to track hand movement so users can ‘touch’ items and ‘push buttons.’
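At its simplest, ‘pushing’ a virtual button from tracked hand data can be sketched as a fingertip-proximity test. The function and threshold below are hypothetical; real hand-tracking SDKs expose far richer joint and gesture data:

```python
import math

def is_pressing(fingertip, button_center, press_radius=0.02):
    """Hypothetical hand-tracking check: treat a virtual button as
    pressed when the tracked fingertip position (x, y, z in metres)
    comes within `press_radius` of the button's centre. The 2 cm
    radius is an assumed tolerance for this sketch."""
    return math.dist(fingertip, button_center) <= press_radius

# Fingertip 1 cm from the button: pressed. Half a metre away: not.
print(is_pressing((0.0, 0.0, 0.0), (0.0, 0.0, 0.01)))  # True
print(is_pressing((0.0, 0.0, 0.0), (0.0, 0.0, 0.5)))   # False
```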

As VR technology improves, the user experience will become increasingly natural, and comfort levels and the associated sense of reality will continue to improve.
