Fig. 1 | Journal of NeuroEngineering and Rehabilitation


From: A geometric method for computing ocular kinematics and classifying gaze events using monocular remote eye tracking in a robotic environment


The experimental setup and theoretical paradigm used to compute ocular kinematics for the current study. a The KINARM Endpoint robot with a seated participant. The visual stimuli are reflected from the monitor onto the display. The remote gaze-tracker is housed at the back of the workspace (yellow rectangle). Participants rest their head against a support in front of the monitor and grasp the robotic manipulanda just below the display to interact with the visual stimuli. b Along the line of sight, the visual angle (β) spans diameters ‘a’ and ‘c’ at two different distances. In psychophysics studies, visual stimuli are often presented in a frontal plane and converted to degrees to eliminate distance as a confounding variable. c A schematic of an arbitrary gaze point-of-regard (POR), first transformed from the robot-based (XY) to an eye-based (X’Y’Z’) Cartesian coordinate system. Note that the Z-axis (pointing downward, robot frame) is shown to illustrate how Eq. 2 transforms the gaze POR data. This is followed by a transformation to a spherical coordinate system, also fixed to the eye. Ocular kinematics are then obtained from the spherical gaze POR data (see inset) using Eqs. 6–8. The yellow arrows indicate the sequence in which this transformation is performed
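The two geometric ideas in panels b and c can be sketched in code. The snippet below is an illustrative sketch only: it uses the standard visual-angle formula (β = 2·atan(d / 2D)) for panel b, and one common Cartesian-to-spherical convention for panel c. The paper's actual Eqs. 2 and 6–8 are not reproduced in this caption, so the axis conventions and function names here are assumptions, not the authors' implementation.

```python
import numpy as np

def visual_angle_deg(diameter, distance):
    """Standard visual-angle formula (panel b): beta = 2*atan(d / (2*D)).

    A stimulus of diameter `diameter` viewed at `distance` along the line
    of sight subtends the same angle as a proportionally larger stimulus
    at a proportionally larger distance, which is why degrees remove
    distance as a confound.
    """
    return np.degrees(2.0 * np.arctan2(diameter, 2.0 * distance))

def cartesian_to_spherical(x, y, z):
    """One possible eye-fixed Cartesian -> spherical conversion (panel c).

    Returns (r, azimuth_deg, elevation_deg). The axis and sign conventions
    are illustrative; the paper's own transformation (Eqs. 2, 6-8) may
    differ.
    """
    r = np.sqrt(x**2 + y**2 + z**2)
    azimuth = np.degrees(np.arctan2(y, x))
    elevation = np.degrees(np.arcsin(z / r))
    return r, azimuth, elevation
```

For example, a 1-unit stimulus viewed at a distance of 1 unit subtends about 53.1°, and a gaze POR on the +Y axis maps to a 90° azimuth with zero elevation under this convention.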
