Gaze Interaction Research

In gaze interaction research, the point of gaze is recorded and used in real time as an input to user–computer interaction. Researchers in this field develop more efficient and novel human–computer interfaces to support users with and without disabilities.

Eye tracking as a direct control medium

A person's point of gaze can be used in a variety of ways to control user interfaces, alone or in combination with other input modalities, such as a mouse, keyboard, sensors, or other devices.
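
One widely used direct-control technique is dwell-time selection: a target counts as "clicked" when gaze stays inside it long enough. The sketch below is illustrative only; the 0.8 s dwell threshold and the `Target` geometry are assumed example values, not parameters of any particular product.

```python
from dataclasses import dataclass

@dataclass
class Target:
    x: float  # left edge of the target, in screen pixels
    y: float  # top edge
    w: float  # width
    h: float  # height

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

def dwell_select(samples, target, dwell_time=0.8):
    """Return the timestamp at which a dwell selection completes, or None.

    samples -- iterable of (timestamp_seconds, gaze_x, gaze_y) tuples
    """
    dwell_start = None
    for t, gx, gy in samples:
        if target.contains(gx, gy):
            if dwell_start is None:
                dwell_start = t           # gaze entered the target
            elif t - dwell_start >= dwell_time:
                return t                  # dwell threshold reached: select
        else:
            dwell_start = None            # gaze left the target: reset timer
    return None
```

For example, with gaze samples at 60 Hz that stay inside a 100 × 100 px button, the function returns the time at which the 0.8 s dwell completes; if gaze never enters the button, it returns `None`.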

A major strand of gaze interaction research seeks more efficient and novel ways to facilitate human–computer interaction for users with disabilities who can use only their eyes for input.

Other gaze interaction research focuses on the more general use of real-time eye tracking data in HCI to improve user–computer interaction and explore novel user interfaces.

Examples of gaze interaction application areas:

  • Using eye tracking to control user interfaces
  • Using eye tracking to control a mouse cursor
  • Controlling computer games by eye gaze
  • Gaze interaction in virtual worlds
  • Gaze visualizations in 3D environments
  • Combining gaze control and speech to speed up typing
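
Gaze-driven cursor control, one of the application areas above, typically has to cope with noisy gaze estimates. A common remedy is to smooth the raw samples before they drive the cursor; the minimal sketch below uses a moving-average window, with the window size chosen purely for illustration.

```python
from collections import deque

class GazeCursor:
    """Smooths raw gaze samples with a moving average before cursor use."""

    def __init__(self, window=5):
        self.xs = deque(maxlen=window)  # most recent x samples
        self.ys = deque(maxlen=window)  # most recent y samples

    def update(self, gaze_x, gaze_y):
        """Feed one raw gaze sample; return the smoothed cursor position."""
        self.xs.append(gaze_x)
        self.ys.append(gaze_y)
        return (sum(self.xs) / len(self.xs), sum(self.ys) / len(self.ys))
```

A larger window gives a steadier cursor at the cost of lag; real systems often use more sophisticated filters, but the trade-off is the same.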

Adaptive user interfaces

Eye tracking can help assess what a user is doing: task performance (e.g. reading), attention patterns, interest in various interface elements, fatigue, and problem-solving strategies. This data can be used to adapt the interface automatically in real time, for example by scrolling, zooming centered on the point of gaze, or highlighting important or missed events and information.
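
As a concrete sketch of one such adaptation, the function below scrolls a view when the reader's gaze lingers near the bottom of the viewport. The 20 % bottom region, 0.5 s linger time, and 40 px scroll step are assumed example values, not taken from any real system.

```python
def auto_scroll_offset(samples, viewport_height, scroll_step=40,
                       bottom_fraction=0.2, linger_time=0.5):
    """Accumulate a scroll offset from timestamped gaze-y samples.

    samples -- iterable of (timestamp_seconds, gaze_y_pixels) tuples
    """
    boundary = viewport_height * (1 - bottom_fraction)
    offset = 0
    linger_start = None
    for t, gy in samples:
        if gy >= boundary:
            if linger_start is None:
                linger_start = t           # gaze entered the bottom region
            elif t - linger_start >= linger_time:
                offset += scroll_step      # gaze lingered: scroll down a step
                linger_start = t           # restart the linger timer
        else:
            linger_start = None            # gaze moved back up: reset
    return offset
```

The same linger-then-trigger pattern extends to the other adaptations mentioned, such as zooming on a fixated region or highlighting missed information.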


Examples of application areas:

  • assessing student meta-cognitive behavior during interaction with intelligent learning environments
  • selecting window placement in a mobile phone conferencing system to give the illusion of eye contact between phone conference participants
  • controlling cameras used for remote repair of large machinery


University of Birmingham

The University of Birmingham used eye tracking to examine how different operators perform road traffic control tasks and make decisions based on noisy data streams from various sensing technologies and historical data.

Products and services

Tobii Pro offers eye tracking solutions for usability testing in the lab or in the real world. For many projects, a controlled environment, such as a testing facility, offers a consistent setting and the ability to monitor the experience. Other studies, where the context of use is critical, benefit from being conducted in authentic environments.

We offer hardware and software, along with a portfolio of training and support for customers who wish to develop in-house testing capabilities and own their eye tracking systems.

For researchers who would rather purchase this methodology as a service, we have a team to conduct eye tracking-centered usability studies, from design and recruitment to execution and analysis.
