   A geometric method for computing ocular kinematics and classifying gaze events using monocular remote eye tracking in a robotic environment  
   
Authors: Singh T., Perry C.M., Herter T.M.
Source: Journal of NeuroEngineering and Rehabilitation - 2016 - Volume: 13 - Issue: 1
Abstract: Background: Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. A key feature of these systems is that visual stimuli are often presented within the same workspace as the hands (i.e., peripersonal space). Integrating video-based remote eye tracking with robotic and virtual-reality systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity. However, remote eye tracking systems typically compute ocular kinematics by assuming eye movements are made in a plane with constant depth (e.g., frontal plane). When visual stimuli are presented at variable depths (e.g., transverse plane), eye movements have a vergence component that may influence reliable detection of gaze events (fixations, smooth pursuits, and saccades). To our knowledge, there are no available methods to classify gaze events in the transverse plane for monocular remote eye tracking systems. Here we present a geometrical method to compute ocular kinematics from a monocular remote eye tracking system when visual stimuli are presented in the transverse plane. We then use the obtained kinematics to compute velocity-based thresholds that allow us to accurately identify onsets and offsets of fixations, saccades, and smooth pursuits. Finally, we validate our algorithm by comparing the gaze events computed by the algorithm with those obtained from the eye-tracking software and manual digitization. Results: Within the transverse plane, our algorithm reliably differentiates saccades from fixations (static visual stimuli) and smooth pursuits from saccades and fixations when visual stimuli are dynamic. Conclusions: The proposed methods provide advancements for examining eye movements in robotic and virtual-reality systems. Our methods can also be used with other video-based or tablet-based systems in which eye movements are performed in a peripersonal plane with variable depth. © 2015 Singh et al.
Keywords: Eye tracking; Eye-hand coordination; Fixations; Robotics; Saccades; Smooth pursuits; Upper limb
Address: Department of Exercise Science, Arnold School of Public Health, University of South Carolina, 921 Assembly Street, Columbia, SC 29208, United States
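The record does not include the authors' implementation, but the general idea of velocity-based gaze classification in a variable-depth (transverse) plane can be sketched as follows: convert 2-D gaze points on the tabletop plane into 3-D gaze vectors relative to an assumed eye position, compute angular speed between successive samples, and label intervals with velocity thresholds. This is a minimal illustration only; the eye position, sampling rate, threshold values, and all function names are assumptions, not the authors' published parameters or code.

```python
import numpy as np

# Illustrative velocity thresholds (deg/s); the paper derives its own
# velocity-based thresholds, so these numbers are placeholders.
SACCADE_VEL = 100.0    # at or above this angular speed -> saccade
FIXATION_VEL = 30.0    # at or below this angular speed -> fixation


def gaze_vectors(gaze_xy, eye_xyz):
    """Unit gaze vectors from the eye to gaze points on the transverse
    plane (z = 0). Coordinates in metres; generic geometry, not the
    authors' exact formulation."""
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    pts = np.column_stack([gaze_xy, np.zeros(len(gaze_xy))])
    vecs = pts - np.asarray(eye_xyz, dtype=float)
    return vecs / np.linalg.norm(vecs, axis=1, keepdims=True)


def angular_speed(vecs, t_s):
    """Angular speed (deg/s) of the gaze vector between successive samples."""
    dots = np.clip(np.sum(vecs[:-1] * vecs[1:], axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(dots)) / np.diff(t_s)


def classify_gaze(speed_deg_s):
    """Label each inter-sample interval with simple fixed thresholds."""
    return np.where(speed_deg_s >= SACCADE_VEL, "saccade",
           np.where(speed_deg_s <= FIXATION_VEL, "fixation", "smooth_pursuit"))


# Example: gaze sweeping across a tabletop workspace, eye ~0.6 m above it.
t = np.arange(0, 1.0, 1 / 60)                       # 60 Hz samples
gaze = np.column_stack([np.linspace(-0.2, 0.2, t.size),
                        np.linspace(0.3, 0.5, t.size)])
labels = classify_gaze(angular_speed(gaze_vectors(gaze, (0.0, 0.0, 0.6)), t))
```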
 
     
   
  
 
 
