The scene camera of the glasses records a video of everything in front of the participant, and this video is used to interpret what the participant is looking at. If the quality of the video image is compromised, it becomes much harder to see what your participant is looking at.
One challenge for wearable eye trackers is that illumination can change drastically across environments. In its default setting, the scene camera automatically adjusts the exposure of the image so that it is neither too bright nor too dark, and this works in most cases. However, this assumes that everything in the scene camera's view is equally important, which may not be true for your particular study. For example, if the participant looks at her mobile phone in an otherwise dimly lit area, the phone will be much brighter than the rest of the scene. The scene camera strives to provide a balanced exposure throughout the entire image, but the light intensity of the phone screen is so much higher that it will be overexposed and simply show as almost, if not completely, white. As a result, the researcher cannot see which notification or application is active on the phone. If our research requires us to see which apps are accessed on the phone, we must adjust the exposure based on the intensity of this area rather than the image as a whole.
This problem of unequal intensity levels in the scene camera image can be solved by activating the Gaze Spot Meter function in the Liveview tab of the Tobii Pro Glasses Controller. This function sets the exposure level based on where the participant is looking, allowing the researcher to clearly see what the participant is looking at. The trade-off is that the exposure level will not be optimal for the areas the participant is not looking at, unless their brightness happens to match that of the object or location being looked at.
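The difference between the two metering strategies can be illustrated with a minimal sketch. This is not Tobii's implementation, just a toy model: average metering derives an exposure target from the whole frame, while spot metering derives it from a small window around the gaze point, so a bright phone screen in a dim scene drives the gain down instead of being blown out. The function names, the window size, and the mid-grey target of 128 are all illustrative assumptions.

```python
import numpy as np

def metering_target(frame, gaze=None, window=40):
    """Mean brightness used as the exposure target.

    frame:  2-D array of pixel intensities (0-255).
    gaze:   (row, col) of the gaze point, or None for whole-frame
            (average) metering, analogous to the default behaviour.
    window: half-size of the square spot-metering region around
            the gaze point (illustrative choice).
    """
    if gaze is None:
        return frame.mean()  # average metering over the entire image
    r, c = gaze
    region = frame[max(0, r - window):r + window,
                   max(0, c - window):c + window]
    return region.mean()

def exposure_gain(frame, gaze=None, midtone=128.0):
    """Gain that would bring the metered region to a mid-grey level."""
    return midtone / max(metering_target(frame, gaze), 1.0)

# A dim scene (intensity 20) containing a bright "phone screen" patch:
scene = np.full((480, 640), 20.0)
scene[200:280, 300:380] = 240.0

# Average metering sees a mostly dark frame and picks a high gain,
# which would clip the already-bright screen to white. Spot metering
# on the gaze point (over the screen) picks a much lower gain.
g_avg = exposure_gain(scene)                    # driven by the dark background
g_spot = exposure_gain(scene, gaze=(240, 340))  # driven by the bright screen
```

With average metering the gain is chosen for the dark background, so the screen region ends up far above the clipping level and reads as white; spot metering on the gaze point lowers the gain until the screen itself sits near mid-grey, at the cost of the background going very dark, mirroring the trade-off described above.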