Introduction

In this article, we will describe all the data available from the FOVE Eye Tracker. There are two general ways to collect the data described here:

- the Unity Gaze Recorder, which writes the data to a CSV file; and
- the FOVE SDK (including the Unity plugin), which exposes the data programmatically.

The two are equivalent in terms of the data reported, but since the Unity Gaze Recorder is integrated with the Unity Game Engine, it includes some extras. Since it outputs a CSV, the Unity Gaze Recorder can't capture eye images (though the Unity plugin or SDK can). The underlying FOVE SDK is engine agnostic.

The Gaze Recorder writes to a CSV automatically, without the need for any programming. For this reason, it's recommended for researchers and anyone else who wants to easily record data.

The FOVE runtime service manages multiple client applications simultaneously and handles multiplexing of data. This means that you can have one program rendering content and one or more other programs recording data. You can also have a single program that both renders VR content and records data. It is recommended to multithread such a program, since the VR content runs at the screen’s frame rate (70Hz for FOVE 0) while the eye tracking data updates at the cameras’ frame rate (120Hz for FOVE 0).

All eye tracker data is uploaded as soon as the camera image comes in and the eye tracker has completed processing. When using the SDK, you can sync your thread's main loop to the cameras by using the Headset::waitAndFetchNextEyeTrackingData API call.
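As a rough illustration, here is a minimal sketch of a dedicated recording thread synced to the cameras. It assumes a Fove::Headset instance (here called headset) has already been created elsewhere with eye tracking enabled; only waitAndFetchNextEyeTrackingData is taken from the description above, the header name is assumed, and the storage step is a placeholder for your own code.

```cpp
#include <atomic>

#include "FoveAPI.h" // assumed SDK header name; adjust to your installation

// Hypothetical recording thread: blocks on each new camera frame (~120Hz on
// FOVE 0) while a separate rendering thread runs at the display's frame rate.
void recordingThread(Fove::Headset& headset, std::atomic<bool>& running)
{
    while (running) {
        // Wait until the eye tracker has processed the next camera image,
        // then fetch the corresponding eye tracking data for this thread.
        headset.waitAndFetchNextEyeTrackingData();

        // ... read the gaze data here and append it to your own log/CSV ...
    }
}
```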

You can see the output from an actual run on this public Google Sheet.

Coordinate Systems

In the Gaze Recorder, you can choose between three coordinate systems for gaze ray calculation: World, Local and HMD. The SDK always uses HMD coordinates.

The choice of coordinate system affects the following fields of the Gaze Recorder output:

- Combined Gaze Ray
- Eye Ray Left / Right

The "Eye Ray" for an eye consists of the coordinate of the eyeball center and the direction of the gaze from there. The "Combined Gaze Ray" is a certain average of the left/right "Eye Rays."

The systems all use Cartesian coordinates in meters (except where noted below), but differ in the choice of a reference point/origin.

HMD coordinate system: The gaze ray vectors are given relative to the headset. They depend only on the position and orientation of the user's eyes relative to the headset, and do not include the head rotation or body motion of the user. The HMD coordinate system is useful for studies where the stimulus is fixed with respect to the user.

In this system, +X is to the right, +Y is up, and +Z is forward (all in reference to the user's face). Thus, a direction of (0, 0, 1) indicates directly forward with no vertical or horizontal component.

The origin of this system is the midpoint between the rotation centers of the user's eyes. Thus, (0.0315, 0, 0) is the position of the right eye if the user's IOD is 63mm (IOD is explained below).
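As a quick sketch of these conventions (the helper names and the Vec3 type are purely illustrative, not SDK definitions), the eye origins in HMD space follow directly from the IOD:

```cpp
struct Vec3 { float x, y, z; };

// HMD space: origin at the midpoint between the eye rotation centers,
// +X right, +Y up, +Z forward. For an IOD of 0.063 m, the right eye sits
// at (0.0315, 0, 0) and the left eye at (-0.0315, 0, 0).
Vec3 rightEyeOriginHmd(float iodMeters) { return {  iodMeters / 2.0f, 0.0f, 0.0f }; }
Vec3 leftEyeOriginHmd (float iodMeters) { return { -iodMeters / 2.0f, 0.0f, 0.0f }; }
```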

Local coordinate system: The gaze vectors are given relative to a fixed real-world position. They take into account the headset’s rotation and translation (translation is available only if position tracking is enabled). You can set the origin of the local coordinate system to the current headset position and orientation using the “Tare Position” and “Tare Orientation” buttons in the FOVE Debug Tool, or programmatically via Headset::tareOrientationSensor and Headset::tarePositionSensor. The local coordinate system is useful for studies that aim to relate eye movements to head or body motion.
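Programmatically, re-centering the local coordinate system comes down to the two tare calls mentioned above. A minimal sketch, assuming an already-created Fove::Headset instance (the wrapper function itself is hypothetical):

```cpp
// Sketch: reset the local coordinate system's origin to the current headset pose.
void tareLocalOrigin(Fove::Headset& headset)
{
    headset.tareOrientationSensor(); // current heading becomes the local forward direction
    headset.tarePositionSensor();    // current position becomes the local origin
                                     // (only meaningful when position tracking is enabled)
}
```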

World coordinate system: The gaze vectors are given relative to the origin of the Unity virtual space. Besides the headset’s rotation and translation, the gaze vectors incorporate the transformation of the Unity object that is the parent of the Fove Interface (such as the Fove Rig). The world coordinate system is useful for studies that relate the user’s gaze to virtual objects.

In short, given that the point p⃗ in the HMD coordinate system is p⃗_hmd, the headset’s orientation/translation matrix is H, and the local-to-world matrix is U, then p⃗_local = H * p⃗_hmd, and p⃗_world = U * H * p⃗_hmd.
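Here is a small code sketch of that chain, using plain row-major 4×4 matrices rather than any SDK or Unity type. Points are transformed with w = 1 so they pick up translation; if you transform a gaze direction instead, use w = 0 so only the rotation part applies.

```cpp
#include <array>

// Illustrative math only; the FOVE SDK and Unity have their own matrix types.
using Mat4 = std::array<float, 16>; // row-major 4x4 matrix
struct Vec4 { float x, y, z, w; };

Vec4 transform(const Mat4& m, const Vec4& v)
{
    return {
        m[0]  * v.x + m[1]  * v.y + m[2]  * v.z + m[3]  * v.w,
        m[4]  * v.x + m[5]  * v.y + m[6]  * v.z + m[7]  * v.w,
        m[8]  * v.x + m[9]  * v.y + m[10] * v.z + m[11] * v.w,
        m[12] * v.x + m[13] * v.y + m[14] * v.z + m[15] * v.w,
    };
}

// p_local = H * p_hmd
Vec4 hmdToLocal(const Mat4& H, const Vec4& pHmd) { return transform(H, pHmd); }

// p_world = U * H * p_hmd
Vec4 hmdToWorld(const Mat4& U, const Mat4& H, const Vec4& pHmd)
{
    return transform(U, transform(H, pHmd));
}
```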