Analyzing recordings made with the Mobile Device Stand and Tobii Studio

Tobii Pro Studio Mobile Device Stand

Learn how to analyze eye tracking data and videos of mobile devices such as tablets and mobile phones collected with Tobii Studio and the Mobile Device Stand.

Gaze replay

The most effective way to analyze eye tracking sessions on mobile devices is to watch a slow-motion gaze replay and perform a qualitative analysis of the collected data. Because gaze replays visualize the gaze data superimposed onto a video recording of the device, the evaluator can see which parts of an interface were looked at, even if the participant scrolls, zooms, or rotates the device. Furthermore, gaze replays are the only way to analyze dynamic design elements. Keep in mind, however, that gaze replays make it difficult to recognize gaze patterns, which are often more meaningful than single fixations or saccades. When studying gaze patterns, a gaze plot is a more suitable visualization.
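To illustrate what a gaze replay contains, the sketch below overlays exported gaze samples onto a scene camera video outside of Tobii Studio, using Python with OpenCV and pandas, and writes the result at half speed. The file name and column names (gaze_export.csv, timestamp_ms, gaze_x, gaze_y) are assumptions made for the example and do not correspond to a specific Tobii Studio export format; Tobii Studio renders its gaze replays internally.

import cv2
import pandas as pd

# Hypothetical export: one gaze sample per row with a timestamp in ms and
# gaze coordinates normalized to the scene camera frame (0..1).
gaze = pd.read_csv("gaze_export.csv")  # columns: timestamp_ms, gaze_x, gaze_y (assumed)

cap = cv2.VideoCapture("scene_camera.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

# Write the replay at half speed so fixations are easier to follow.
out = cv2.VideoWriter("gaze_replay_slow.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"),
                      fps / 2, (width, height))

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    t_ms = frame_idx / fps * 1000.0
    # Pick the gaze sample closest in time to this frame.
    nearest = (gaze["timestamp_ms"] - t_ms).abs().idxmin()
    x = int(gaze.loc[nearest, "gaze_x"] * width)
    y = int(gaze.loc[nearest, "gaze_y"] * height)
    cv2.circle(frame, (x, y), 20, (0, 0, 255), 3)  # red gaze cursor
    out.write(frame)
    frame_idx += 1

cap.release()
out.release()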

Heat map

Heat maps use a color scheme to depict either the number or the duration of fixations that different parts of a stimulus received. Warm colors like red and yellow indicate areas that were looked at longer or that attracted more fixations. Heat maps aggregate the viewing behavior of one or more participants over a certain time period and display it on a static image. They can be used as a starting point to get a general idea of which design elements attracted the participants' attention and which items were not looked at at all. They can also help illustrate findings in the final usability report. For a detailed qualitative analysis of individual gaze patterns, however, other tools such as gaze plots or gaze replays are more suitable.

Because heat maps are still images, they cannot be used to analyze stimuli that contain dynamic elements: if the displayed content changes, the mapping of the gaze data onto the image is no longer accurate. Furthermore, heat maps do not display the sequence of fixations made by a participant; they do not show what participants looked at first and what they looked at last. Therefore, heat maps cannot be used to analyze processes such as visual search. When using relative duration heat maps, the recordings from all participants have the same weight in the visualization, independent of how long each participant spent viewing the stimulus. Relative duration heat maps are therefore recommended for usability studies, since participants typically work at different speeds and spend different amounts of time on each stimulus.
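The idea behind a relative duration heat map can be sketched in a few lines of Python outside of Tobii Studio: each participant's fixation durations are normalized by that participant's total fixation time before being accumulated, so fast and slow participants contribute equally. The file name, column names, stimulus size, and grid cell size below are assumptions for the example; Tobii Studio's own heat map rendering is more sophisticated and works on its own data formats.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical fixation export: one row per fixation with participant id,
# fixation position in image pixels, and fixation duration in ms.
fix = pd.read_csv("fixations.csv")  # columns: participant, x, y, duration_ms (assumed)

img_w, img_h, cell = 1080, 1920, 40          # stimulus size and grid cell in px (assumed)
grid = np.zeros((img_h // cell, img_w // cell))

for pid, group in fix.groupby("participant"):
    # Relative duration: normalize by each participant's total fixation time
    # so every participant has the same weight in the visualization.
    weights = group["duration_ms"] / group["duration_ms"].sum()
    for (_, row), w in zip(group.iterrows(), weights):
        gx, gy = int(row["x"]) // cell, int(row["y"]) // cell
        if 0 <= gy < grid.shape[0] and 0 <= gx < grid.shape[1]:
            grid[gy, gx] += w

plt.imshow(grid, cmap="hot", interpolation="bilinear")  # warm colors = more relative viewing time
plt.axis("off")
plt.savefig("relative_duration_heatmap.png", dpi=200)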

Gaze plots

Gaze plots, or scan paths, are static images that visualize a participant's gaze pattern through a series of dots indicating fixations and thin lines indicating saccades. The size of a dot represents the duration of the fixation: short fixations are indicated by small dots, while longer fixations are indicated by larger dots. Typically, color coding is used to distinguish between the gaze patterns of individual participants. Gaze plots cannot be used to analyze interfaces that contain dynamic content, because the still image does not show whether a non-static design element moved during the eye tracking session and attracted the participant's fixation. It is therefore important to check the gaze video recording for any dynamic content before generating gaze plots. Unlike heat maps, gaze plots reveal the order of fixations, indicated by numbers in the dots. They enable researchers to analyze how a participant perceived an interface: which elements were looked at first and which were looked at last. Gaze patterns become visible, which can be used to reconstruct the participants' thought processes and examine which search strategies they employed while looking for control structures or information on an interface. The interpretation of eye tracking data is difficult in itself without additional qualitative data, e.g., a verbal explanation by the test participant.
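The sketch below shows, in Python with matplotlib, how the elements of a gaze plot fit together: dots scaled by fixation duration, thin connecting lines for saccades, per-participant colors, and numbers indicating the fixation order. The file names and column names are assumptions for the example, not Tobii Studio exports; Tobii Studio generates gaze plots from scenes internally.

import pandas as pd
import matplotlib.pyplot as plt
import matplotlib.image as mpimg

# Hypothetical fixation export for one scene (static background image).
fix = pd.read_csv("fixations.csv")   # columns: participant, order, x, y, duration_ms (assumed)
background = mpimg.imread("scene_background.png")

fig, ax = plt.subplots()
ax.imshow(background)

colors = plt.cm.tab10.colors
for i, (pid, g) in enumerate(fix.groupby("participant")):
    g = g.sort_values("order")
    color = colors[i % len(colors)]
    # Thin lines for saccades, dots scaled by fixation duration.
    ax.plot(g["x"], g["y"], "-", color=color, linewidth=0.8, alpha=0.7)
    ax.scatter(g["x"], g["y"], s=g["duration_ms"] / 2, color=color, alpha=0.5)
    # Number each fixation to show the viewing order.
    for _, row in g.iterrows():
        ax.annotate(str(int(row["order"])), (row["x"], row["y"]),
                    ha="center", va="center", fontsize=6)

ax.axis("off")
fig.savefig("gaze_plot.png", dpi=200)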

When using eye tracking to study websites on a desktop system, the analysis software, Tobii Studio, is able to superimpose the collected gaze data onto a screen capture of the whole website. Thus, participants can scroll up and down on the page just as they do when browsing the web at home. Unfortunately, this is not possible when working with scene camera recordings, as is the case with the Mobile Device Stand.

In this case, recordings need to be divided into “scenes” in Tobii Studio before visualizations can be created. A scene contains two elements: a static image that is used as the background for the visualizations, and the gaze data from the section of the recording from which the image was extracted. A visualization can only be created as long as the background image does not change. Therefore, the evaluator has to review the recordings and check that the participant did not scroll, change the screen orientation, or zoom into the mobile interface before creating a scene. Unfortunately, during interaction with a mobile device, participants tend to do all of these things frequently. As the small display significantly limits the amount of information that can be shown at once, most mobile websites and applications require users to scroll and navigate between multiple pages. Reviewing the recordings and manually creating scenes is time consuming, and it may even be impossible to create aggregated gaze plots and heat maps because some participants used the device in landscape mode and others in portrait mode. Thus, standard visualizations such as heat maps and gaze plots are not always suitable for eye tracking studies conducted on mobile devices. Conducting a qualitative analysis based on gaze replay is a more suitable method when analyzing and interpreting data collected with the Mobile Device Stand.

To generate eye tracking metrics and do a quantitative analysis of the collected data, Areas of Interest (AOIs) have to be created in Tobii Studio. AOIs can either be static, drawn on top of a still image such as a scene, or dynamic, drawn on top of a video such as the scene camera video. Thus, AOIs can also be used when analyzing dynamic elements.
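Conceptually, AOI-based metrics come from hit-testing fixations against regions of the stimulus. The sketch below shows this idea in Python for static, rectangular AOIs, computing fixation count, total fixation duration, and time to first fixation per AOI. The AOI names, coordinates, file name, and column names are all assumptions for the example; Tobii Studio computes its AOI metrics internally from the AOIs drawn in the software.

import pandas as pd

# Hypothetical static AOIs for one scene, in image pixel coordinates.
aois = {
    "search_field": (0, 150, 1080, 300),    # (x_min, y_min, x_max, y_max), assumed
    "menu_button":  (950, 0, 1080, 140),
}

# Hypothetical fixation export: start time relative to the scene, position, duration.
fix = pd.read_csv("fixations.csv")  # columns: start_ms, x, y, duration_ms (assumed)

def in_aoi(row, box):
    x_min, y_min, x_max, y_max = box
    return x_min <= row["x"] <= x_max and y_min <= row["y"] <= y_max

for name, box in aois.items():
    hits = fix[fix.apply(in_aoi, axis=1, box=box)]
    print(f"{name}:")
    print(f"  fixation count        : {len(hits)}")
    print(f"  total fixation time   : {hits['duration_ms'].sum()} ms")
    first = hits["start_ms"].min() if not hits.empty else None
    print(f"  time to first fixation: {first} ms")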

Learn more about mobile device testing by reading the whitepaper and the user manual.
