Virtual reality offers almost limitless capabilities for simulation and creativity, and more and more companies are turning to it to test products and environments, train staff, and optimize performance – but how do you know which elements of your environment to focus on and which to forgo?
In a previous blog post we discussed the role that VR plays in contextual research and why the integration with Tobii Pro VR Analytics facilitates insights that are difficult to attain through traditional methods. Of course, every new methodology brings new lessons in how to get the best out of it: we've made the analytics easy to generate, but contextual research and training need to create that context in a convincing way, one that allows it to influence behavior and decision making.
It's perhaps easiest to understand this through an example. One application of eye tracking in VR is staff training: it not only provides a way to simulate an environment for tasks such as preparing a hotel room for a guest, it also provides the evidence for feedback – for example, when a minibar hasn't been checked even though the housekeeper looked at the empty beer bottle in the bin. But here's the thing: if the room you're training in doesn't feel credible and doesn't support the natural behaviors of the housekeeper, the training will not achieve its potential impact – if the housekeeper can't open the minibar in the virtual environment, they won't learn to look inside it.
Similarly, if a shopper cannot interact with products in a market research study, or a traveler isn't able to take every possible route through a space in a way-finding study, the observed behavior will be adapted to the context you've presented rather than the natural behavior you wish to test. In this post I'm going to discuss some questions which are good to consider when you design an eye-tracking study for VR.
Tobii Pro VR Analytics provides organizations and researchers with an easy-to-use tool for collecting, analyzing, and visualizing attention. Everything you need is included, from calibration of the eye tracker and a live view of the participant's gaze, to heat maps and automatic measures like time to first fixation and total fixation duration. All you need to add is the Unity 3D environment, which can even be one that was previously developed for screen-based tests.
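To make measures like these concrete, here is a minimal Python sketch of how time to first fixation and total fixation duration could be computed from a gaze log. The log format, object names, and durations are all invented for illustration – this is not the Tobii Pro VR Analytics API, which calculates these measures for you automatically.

```python
# Hypothetical fixation log: (timestamp_ms, duration_ms, object_hit_by_gaze).
# Invented for illustration only -- not the Tobii Pro VR Analytics format.
fixations = [
    (0, 300, "shelf"),
    (400, 300, "poster"),
    (1100, 300, "minibar"),
    (1500, 200, "minibar"),
    (2000, 300, "shelf"),
]

def time_to_first_fixation(log, target):
    """Milliseconds from recording start to the first fixation on target."""
    return next((t for t, _, obj in log if obj == target), None)

def total_fixation_duration(log, target):
    """Summed fixation time (ms) on target across the whole recording."""
    return sum(d for _, d, obj in log if obj == target)

print(time_to_first_fixation(fixations, "minibar"))   # 1100
print(total_fixation_duration(fixations, "minibar"))  # 500
```

The point of the sketch is simply that once every gaze sample is tied to a named 3D object in the scene, attention measures reduce to straightforward queries over the log.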
One of the most exciting aspects of VR eye tracking research is the almost limitless range of questions that can be asked, but this immediately prompts even more questions about how to set up and run a study. Like all good research, you're going to want to plan and test your study before you start collecting data, so here are a few things to consider when designing your study:
Virtual Reality environments can range from highly stylized cartoon-like scenes to photo-realistic, beautifully rendered scenes that would do Hollywood proud. As great as that second option sounds, remember the computer needs to generate all those images, and the more complex the scene, the harder it is to maintain a smooth refresh rate of the screen in the headset.
My advice would be to keep the scene as simple as you can without sacrificing too much of what tricks the brain into believing it's real: use 3D objects rather than flat 2D ones, which lack proper depth, and, if possible, get the lighting right, because shadows and reflections are important contextual drivers of attention.
One of the best things about VR is that 3D objects trigger the brain to interact with them. These affordances are important, as anyone who's ever tried to open a push door with a protruding handle will know. These cues bring realism to shopper research, training, and all types of interaction studies, and allow you to explore automatically calculated measures like the time it takes a shopper to pick up a product after first seeing it, or workplace hazards that have gone unnoticed on an oil rig, for example.
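That see-to-pick-up measure can be sketched in the same spirit. The event stream and names below are hypothetical, purely to show how such a latency is derived once gaze fixations and interaction events share a common timeline:

```python
# Hypothetical event stream from a shopper study: (timestamp_ms, event, object).
# Event and object names are invented for illustration only.
events = [
    (200, "fixation", "cereal_box"),
    (900, "fixation", "juice_carton"),
    (1500, "pickup", "cereal_box"),
]

def see_to_pickup_latency(log, product):
    """Milliseconds between first fixating a product and picking it up."""
    first_seen = next((t for t, kind, obj in log
                       if kind == "fixation" and obj == product), None)
    picked_up = next((t for t, kind, obj in log
                      if kind == "pickup" and obj == product), None)
    if first_seen is None or picked_up is None:
        return None  # product was never seen, or seen but never picked up
    return picked_up - first_seen

print(see_to_pickup_latency(events, "cereal_box"))  # 1300
```

A `None` result is itself informative here: a product that was fixated but never picked up tells a different story from one that was never noticed at all.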
Of course, your VR can also be too realistic. In Unity, implementing realistic physics will cause objects to behave as they would in the real world: they can fall over, knock other things over, and drop to the floor. As much fun as this sounds, think about what you are trying to study. In the shopper example, you might want participants to explore an aisle and select a product, but if your question ends there, don't enable something that makes them nervous about interaction because they keep knocking things over.
It's all too easy to focus on just the visual environment when conducting studies on visual attention, but of course all of the senses contribute to both our attention and perception of a scene. This means that whilst the brain will happily immerse itself in a VR environment, it will still be processing, and therefore influenced by, other sensory inputs such as background noise in the research environment or an incongruous scent.
In the VR world we talk a lot about "presence" which describes the degree of immersion – the more present someone is in the virtual environment, the more they have disconnected from the real world around them. True presence can only be attained when all sensory inputs have been addressed, but it can be enhanced by something as simple as a soundtrack for your research. We recently conducted some VR research to explore the effects of different types of navigation in an airport terminal, and used background airport noise to increase presence, and flight announcements to act as naturalistic instructions to the participants.
It's not just sensory input that increases presence; the ability to act naturally does this too. This includes being able to interact with 3D objects and move freely through an environment. In VR, this frequently requires the use of controllers, many of which were designed with gaming in mind, so it is important to consider how naturally other types of participants will behave. After all, if the method of interaction does not feel natural, then presence will be broken and your research confounded.
For this reason, it's important to allow participants an opportunity to familiarize themselves with controllers and adapt to the VR headset. One way to do this is to create a "sandbox" environment with objects that can be picked up and a few features to navigate around. If you're using the room-scale mapping function of the HTC Vive, it's also a good idea to encourage participants to move their body to get closer to objects, for example by placing objects on the ground.
Perhaps one of the most common questions around research in VR is about the cost of creating the environment, because at first glance it seems to be an extra cost that can be avoided by using screen projections or physical environments. This is true, but it's easy to overlook the costs associated with creating a realistic physical environment. For example, when testing package designs on supermarket fixtures, there is a significant cost associated with stocking the fixtures for tests. Similarly, for training and performance development, creating a physical simulation of an operating theatre or a control bridge can be prohibitively expensive.
VR tests offer some further clear cost benefits. Testing multiple concepts or layouts is easier to do in VR since the environment can be changed or reset at the touch of a button, enabling faster testing of participants. Studies can be replicated in almost any location, reducing testing facility costs and facilitating in-home/multiple location tests with precise replication. Analysis of collected data using Tobii Pro VR Analytics is instantaneous and therefore cheaper and faster, enabling more agile testing.
Of course, there are also the invisible costs of NOT using immersive VR, which brings us back to where I started this blog. VR can be used to create fully immersive environments where a participant feels present and behaves naturally. This means that controlled research in a contextually relevant environment is genuinely possible, and the costs of not considering this methodology are potentially the success of your brand, the safety of your employees, or the success of your company.
Learn more about the benefits and features of VR and eye tracking here.
Dr Tim Holmes is a visual neuroscientist and the Director of Research for Tobii Pro Insight