How can I synchronize the video and the gaze data from Tobii Pro Glasses 2?

You can synchronize the live video and the live gaze data with the Tobii Pro Glasses 2 API, or you can post-synchronize the recorded video and the recorded gaze data.

Presentation timestamps (PTS) are a concept common to both approaches. The presentation timestamp packages are used as a reference by the Tobii Pro Glasses 2 to synchronize the gaze data with the video. They can be accessed:

  • from the JSON packages received through UDP sockets, when synchronizing live
  • from the livedata.json file on the SD card, when post-synchronizing

The presentation timestamp packages look like this: {"ts":488682903,"s":0,"vts":559844} or {"ts":489322743,"s":0,"pts":24837344,"pv":1468396373}.
Here ts is the local timestamp from the Pro Glasses 2, vts is the presentation timestamp used for post-synchronization, pts is the presentation timestamp used for live synchronization, and pv is a debugging timestamp used by the Tobii Pro Glasses Controller.
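
As a minimal sketch (in Python, using the two package strings shown above as stand-ins for packages received live or read from livedata.json), the two variants can be told apart by whether they carry a vts or a pts key:

    import json

    # Example packages, copied from above; in practice they arrive over UDP
    # (live) or are read line by line from livedata.json (post-synchronizing).
    raw_packets = [
        '{"ts":488682903,"s":0,"vts":559844}',
        '{"ts":489322743,"s":0,"pts":24837344,"pv":1468396373}',
    ]

    vts_to_ts = {}  # video timestamp -> gaze timestamp, for post-synchronizing
    pts_to_ts = {}  # video timestamp -> gaze timestamp, for live synchronizing

    for raw in raw_packets:
        pkt = json.loads(raw)
        if "vts" in pkt:
            vts_to_ts[pkt["vts"]] = pkt["ts"]
        elif "pts" in pkt:
            pts_to_ts[pkt["pts"]] = pkt["ts"]

    print(vts_to_ts)  # {559844: 488682903}
    print(pts_to_ts)  # {24837344: 489322743}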

To synchronize the live gaze data with the live video:
Before synchronizing live, you need to start a recording and keep the video and live data streams going. See our FAQ "How do you calibrate and record with the Tobii Pro Glasses 2 API?" for how to do that.

You are now receiving the presentation timestamp packages in the form {"ts":489322743,"s":0,"pts":24837344,"pv":1468396373}, and you are decoding the video stream. The video decoder gives you the video presentation timestamp associated with each frame (we call it pts_video). You can synchronize the gaze data timestamps (ts) with the video timestamps (pts_video) in such a way that the frame with pts_video = pts is aligned with the gaze data of timestamp ts.
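
A minimal sketch of the live case, assuming the UDP keep-alive mechanism described in the calibration/recording FAQ; the address, port, and key below are placeholders, so substitute the values your own application already uses:

    import json
    import socket

    # Hypothetical address of the glasses' live data port; substitute the
    # values from the calibration/recording FAQ for your setup.
    GLASSES_DATA_ADDRESS = ("192.168.71.50", 49152)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Keep-alive request for the live data stream; the key is an arbitrary
    # client-chosen string.
    keep_alive = b'{"type": "live.data.unicast", "key": "anykey", "op": "start"}'
    sock.sendto(keep_alive, GLASSES_DATA_ADDRESS)

    sync_pairs = []    # (pts, ts) from the synchronization packages
    gaze_samples = []  # (ts, payload) from gaze packages ("gp" = 2D gaze point)

    for _ in range(1000):  # bounded for the sketch; a real client loops forever
        data, _ = sock.recvfrom(1024)
        pkt = json.loads(data)
        if "pts" in pkt:
            sync_pairs.append((pkt["pts"], pkt["ts"]))
        elif "gp" in pkt:
            gaze_samples.append((pkt["ts"], pkt["gp"]))

    # A video decoder running in parallel reports pts_video for each frame of
    # the live video stream. The frame whose pts_video equals a sync pair's
    # pts lines up with the gaze samples whose ts is near that pair's ts.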

To post-synchronize the gaze data with the video:
Before post-synchronizing, you need to make a recording with your own application or with the Pro Glasses Controller. Once the recording is done, you can get the following from the SD card:

  • the video (located at \projects\<projectID>\recordings\<recordingID>\segments\<segmentNumber>\fullstream.mp4)
  • the gaze data (located at \projects\<projectID>\recordings\<recordingID>\segments\<segmentNumber>\livedata.json.gz) in a compressed GZip format

After decompressing livedata.json.gz, you have the file livedata.json, which contains the same data that was streamed live during the recording. You now have access to the presentation timestamp packages in the form {"ts":488682903,"s":0,"vts":559844}, and you can decode the fullstream.mp4 video. The video decoder will give you the video presentation timestamp associated with each frame (we call it pts_video). You can synchronize the gaze data timestamps (ts) with the video timestamps (pts_video) in such a way that the frame with pts_video = vts is aligned with the gaze data of timestamp ts.
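
A minimal sketch of the post-synchronization case, assuming livedata.json holds one JSON object per line (as the package examples above suggest), that vts and ts advance in the same unit (microseconds), and using the third-party PyAV library to decode fullstream.mp4; the segment path is hypothetical:

    import gzip
    import json

    import av  # PyAV, used here to decode fullstream.mp4

    SEGMENT = r"\projects\myproject\recordings\myrecording\segments\1"

    # Collect the (ts, vts) sync pairs and the gaze samples from the SD card.
    sync_pairs = []
    gaze_samples = []
    with gzip.open(SEGMENT + r"\livedata.json.gz", "rt") as f:
        for line in f:
            pkt = json.loads(line)
            if "vts" in pkt:
                sync_pairs.append((pkt["ts"], pkt["vts"]))
            elif "gp" in pkt:
                gaze_samples.append((pkt["ts"], pkt["gp"]))

    def ts_for_frame(pts_video, pairs):
        """Map a frame's presentation timestamp to a gaze-data timestamp,
        using the most recent sync pair with vts <= pts_video. Assumes at
        least one such pair exists and that ts and vts share a unit."""
        ts_ref, vts_ref = max((p for p in pairs if p[1] <= pts_video),
                              key=lambda p: p[1])
        return ts_ref + (pts_video - vts_ref)

    container = av.open(SEGMENT + r"\fullstream.mp4")
    stream = container.streams.video[0]
    for frame in container.decode(stream):
        if frame.pts is None:
            continue
        # Convert the decoder's pts to the assumed unit of vts (microseconds).
        pts_video = int(frame.pts * stream.time_base * 1_000_000)
        frame_ts = ts_for_frame(pts_video, sync_pairs)
        # ...pick the gaze samples whose ts is closest to frame_ts...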