Eye Tracking

The Experts below are selected from a list of 56,745 Experts worldwide, ranked by the ideXlab platform.

Daniel Weiskopf - One of the best experts on this subject based on the ideXlab platform.

  • Visualization of Eye Tracking Data: A Taxonomy and Survey
    Computer Graphics Forum, 2017
    Co-Authors: Tanja Blascheck, Kuno Kurzhals, Daniel Weiskopf, Michael Raschke, Michael Burch, Thomas Ertl
    Abstract:

    This survey provides an introduction to Eye Tracking visualization with an overview of existing techniques. Eye Tracking is important for evaluating user behaviour. Analysing Eye Tracking data is typically done quantitatively by applying statistical methods. However, in recent years, researchers have been increasingly using qualitative and exploratory analysis methods based on visualization techniques. For this state-of-the-art report, we investigated about 110 research papers presenting visualization techniques for Eye Tracking data. We classified these visualization techniques and identified two main categories: point-based methods and methods based on areas of interest. Additionally, we conducted an expert review asking leading Eye Tracking experts how they apply visualization techniques in their analysis of Eye Tracking data. Based on the experts' feedback, we identified challenges that have to be tackled in the future so that visualizations will become even more widely applied in Eye Tracking research.
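
To make the survey's two main categories concrete, here is a minimal Python sketch (synthetic gaze coordinates and made-up AOI names, not data from the paper) contrasting a point-based aggregation of raw gaze samples with an AOI-based aggregation:

```python
import numpy as np

# Synthetic gaze samples (x, y) in pixels on a 1280x720 stimulus -- illustrative only.
rng = np.random.default_rng(0)
gaze = rng.normal(loc=(640, 360), scale=(120, 80), size=(500, 2))

# Point-based view: aggregate raw gaze points into a coarse attention map (heatmap grid).
heatmap, _, _ = np.histogram2d(gaze[:, 0], gaze[:, 1],
                               bins=(16, 9), range=[[0, 1280], [0, 720]])

# AOI-based view: count samples falling inside named areas of interest.
aois = {"headline": (0, 0, 1280, 200), "image": (200, 200, 1000, 700)}  # x0, y0, x1, y1
for name, (x0, y0, x1, y1) in aois.items():
    inside = ((gaze[:, 0] >= x0) & (gaze[:, 0] < x1) &
              (gaze[:, 1] >= y0) & (gaze[:, 1] < y1))
    print(f"AOI '{name}': {inside.sum()} samples")

print("densest heatmap cell holds", int(heatmap.max()), "samples")
```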

  • Visual Analytics for Mobile Eye Tracking
    IEEE Transactions on Visualization and Computer Graphics, 2017
    Co-Authors: Kuno Kurzhals, Marcel Hlawatsch, Christof Seeger, Daniel Weiskopf
    Abstract:

    The analysis of Eye Tracking data often requires the annotation of areas of interest (AOIs) to derive semantic interpretations of human viewing behavior during experiments. This annotation is typically the most time-consuming step of the analysis process. Especially for data from wearable Eye Tracking glasses, every independently recorded video has to be annotated individually and corresponding AOIs between videos have to be identified. We provide a novel visual analytics approach to ease this annotation process by image-based, automatic clustering of Eye Tracking data integrated into an interactive labeling and analysis system. The annotation and analysis are tightly coupled by multiple linked views that allow for a direct interpretation of the labeled data in the context of the recorded video stimuli. The components of our analytics environment were developed with a user-centered design approach in close cooperation with an Eye Tracking expert. We demonstrate our approach with Eye Tracking data from a real experiment and compare it to an analysis of the data by manual annotation of dynamic AOIs. Furthermore, we conducted an expert user study with six external Eye Tracking researchers to collect feedback and identify analysis strategies they used while working with our application.
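
A highly simplified sketch of the general idea of clustering visually similar fixations so they can be labeled in bulk (this is not the authors' system; the feature vectors below merely stand in for descriptors of gaze-centered image patches and are randomly generated):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical stand-in for gaze-centered image patches cut from a mobile
# eye-tracking video: here just random "feature vectors", one per fixation.
rng = np.random.default_rng(1)
features = np.vstack([rng.normal(m, 0.3, size=(40, 8)) for m in (0.0, 2.0, 4.0)])

# Cluster visually similar fixations so an analyst can label whole clusters
# (candidate AOIs) at once instead of annotating every video frame by hand.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k)} fixations can be labeled in one step")
```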

  • Visual Data Cleansing of Low-Level Eye Tracking Data
    Workshop on Eye Tracking and Visualization, 2015
    Co-Authors: Christoph Schulz, Michael Burch, Fabian Beck, Daniel Weiskopf
    Abstract:

    Analysis and visualization of Eye movement data from Eye-Tracking studies typically take into account gazes, fixations, and saccades of both Eyes filtered and fused into a combined Eye. Although this is a valid strategy, we argue that it is also worth investigating low-level Eye-Tracking data prior to high-level analysis, because today’s Eye-Tracking systems measure and infer data from both Eyes separately. In this work, we present an approach that supports visual analysis and cleansing of low-level time-varying data for Eye-Tracking experiments. The visualization helps researchers get insights into the quality of the data in terms of its uncertainty, or reliability. We discuss uncertainty originating from Eye Tracking, and how to reveal it for visualization, using a comparative approach for disagreement between plots, and a density-based approach for accuracy in volume rendering. Finally, we illustrate the usefulness of our approach by applying it to Eye movement data recorded with two state-of-the-art Eye trackers.
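
As an illustration of one uncertainty signal of the kind discussed above, here is a small sketch (entirely synthetic binocular samples, assumed 500 Hz sampling rate) that flags samples where the separately recorded left and right eyes disagree strongly and thus deserve inspection or cleansing:

```python
import numpy as np

# Synthetic low-level binocular samples in degrees of visual angle (illustrative only).
rng = np.random.default_rng(2)
t = np.arange(0.0, 2.0, 1.0 / 500.0)                 # 2 s recorded at 500 Hz
left = np.column_stack((np.sin(t), np.cos(t))) + rng.normal(0, 0.05, (t.size, 2))
right = left + rng.normal(0, 0.05, (t.size, 2))
right[300:320] += 3.0                                # simulated tracking glitch in one eye

# Disagreement between the eyes as a simple per-sample uncertainty signal.
disagreement = np.linalg.norm(left - right, axis=1)
suspect = disagreement > disagreement.mean() + 3 * disagreement.std()
print(f"{suspect.sum()} of {t.size} samples flagged for visual inspection/cleansing")
```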

Austin Roorda - One of the best experts on this subject based on the ideXlab platform.

  • High-Resolution Eye Tracking Using Scanning Laser Ophthalmoscopy
    Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 2019
    Co-Authors: Norick R Bowers, Agostino Gibaldi, Emma Alexander, Martin S Banks, Austin Roorda
    Abstract:

    Current Eye-Tracking techniques rely primarily on video-based Tracking of components of the anterior surfaces of the Eye. However, these trackers have several limitations. Their limited resolution precludes study of small fixational Eye motion. Furthermore, many of these trackers rely on calibration procedures that do not offer a way to validate their Eye motion traces. By comparison, retinal-image-based trackers can track the motion of the retinal image directly, at frequencies greater than 1 kHz and with subarcminute accuracy. The retinal image provides a way to validate the Eye position at any point in time, offering an unambiguous record of Eye motion as a reference for the Eye trace. The benefits of using scanning retinal imaging systems as Eye trackers, however, come at the price of different problems that are not present in video-based systems and that need to be solved to obtain robust Eye traces. The current abstract provides an overview of retinal-image-based Eye Tracking methods, provides preliminary Eye-Tracking results from a Tracking scanning-laser ophthalmoscope (TSLO), and proposes a new binocular line-scanning Eye-Tracking system.
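
For readers unfamiliar with retinal-image-based tracking, the core operation is registering strips of each incoming frame against a reference retinal image; the displacement of the correlation peak gives the eye motion. The following is a toy sketch of that idea only (random synthetic images and circular shifts), not the TSLO pipeline itself, which additionally has to deal with intra-frame distortion and other complications:

```python
import numpy as np

# Toy illustration of strip-based retinal image tracking: recover the shift of an
# image strip relative to a reference frame from the peak of a cross-correlation.
rng = np.random.default_rng(3)
reference = rng.random((256, 256))                   # stand-in for a retinal reference frame

true_shift = (4, -7)                                 # simulated eye motion (rows, cols)
shifted = np.roll(reference, true_shift, axis=(0, 1))
strip, ref_strip = shifted[:32, :], reference[:32, :]

# FFT-based circular cross-correlation; the peak location is the displacement estimate.
corr = np.fft.ifft2(np.conj(np.fft.fft2(ref_strip)) * np.fft.fft2(strip)).real
peak = np.unravel_index(np.argmax(corr), corr.shape)
estimate = [int(p) if p <= s // 2 else int(p) - s for p, s in zip(peak, corr.shape)]
print("estimated displacement (rows, cols):", estimate)   # expected: [4, -7]
```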

  • Active Eye Tracking for an Adaptive Optics Scanning Laser Ophthalmoscope
    Biomedical Optics Express, 2015
    Co-Authors: Christy K Sheehy, Pavan Tiruveedhula, Ramkumar Sabesan, Austin Roorda
    Abstract:

    We demonstrate a system that combines a Tracking scanning laser ophthalmoscope (TSLO) and an adaptive optics scanning laser ophthalmoscope (AOSLO) system resulting in both optical (hardware) and digital (software) Eye-Tracking capabilities. The hybrid system employs the TSLO for active Eye-Tracking at a rate up to 960 Hz for real-time stabilization of the AOSLO system. AOSLO videos with active Eye-Tracking signals showed, at most, an amplitude of motion of 0.20 arcminutes for horizontal motion and 0.14 arcminutes for vertical motion. Subsequent real-time digital stabilization limited residual motion to an average of only 0.06 arcminutes (a 95% reduction). By correcting for high amplitude, low frequency drifts of the Eye, the active TSLO Eye-Tracking system enabled the AOSLO system to capture high-resolution retinal images over a larger range of motion than previously possible with just the AOSLO imaging system alone.
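
The percentage figure quoted above is a ratio of residual to raw motion amplitude. A tiny illustrative calculation (entirely synthetic numbers, not the study's data) of how such a reduction can be computed from motion traces:

```python
import numpy as np

# Illustrative arithmetic only: computing a "percent reduction in residual motion"
# figure from a raw eye-motion trace and the trace left after stabilization.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 960)                            # ~960 Hz tracking rate
raw_motion = 5.0 * t + 0.2 * rng.standard_normal(t.size)  # slow drift + tremor, arcmin

tracker_estimate = 5.0 * t                                # what the optical tracker reports
residual = raw_motion - tracker_estimate                  # motion left after stabilization

def rms(x):
    """Root-mean-square amplitude about the mean position."""
    return np.sqrt(np.mean((x - x.mean()) ** 2))

reduction = 1.0 - rms(residual) / rms(raw_motion)
print(f"residual motion reduced by {100 * reduction:.0f}%")
```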

Ara Darzi - One of the best experts on this subject based on the ideXlab platform.

  • Eye Tracking for Skills Assessment and Training: A Systematic Review
    Journal of Surgical Research, 2014
    Co-Authors: Tony Tien, Philip H Pucher, Mikael H Sodergren, Kumuthan Sriskandarajah, Guangzhong Yang, Ara Darzi
    Abstract:

    Background: The development of quantitative, objective tools is critical to the assessment of surgeon skill. Eye Tracking is a novel tool that has been proposed as a potential source of suitable metrics for this task. The aim of this study was to review current evidence for the use of Eye Tracking in training and assessment. Methods: A systematic literature review was conducted in line with PRISMA guidelines. A search of the EMBASE, OVID MEDLINE, Maternity and Infant Care, PsycINFO, and Transport databases was conducted, covering publications up to March 2013. Studies describing the use of Eye Tracking in the execution, training, or assessment of a task, or for skill acquisition, were included in the review. Results: The initial search returned 12,051 results. Twenty-four studies were included in the final qualitative synthesis. Sixteen studies were based on Eye Tracking in assessment and eight studies on Eye Tracking in training. These demonstrated feasibility and validity in the use of Eye Tracking metrics and gaze Tracking to differentiate between subjects of varying skill levels. Several training methods using gaze training and pattern recognition were also described. Conclusions: Current literature demonstrates the ability of Eye Tracking to provide reliable quantitative data as an objective assessment tool, with potential applications to surgical training to improve performance. Eye Tracking remains a promising area of research with the possibility of future implementation into surgical skill assessment.
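
A hypothetical example of the kind of comparison such studies perform: a simple gaze metric (proportion of dwell time on a task-critical area) compared between expert and novice groups. The metric values and group sizes below are invented purely for illustration:

```python
import numpy as np
from scipy.stats import ttest_ind

# Hypothetical per-participant gaze metric: fraction of viewing time spent on the
# task-critical area of interest ("target-locking") during a surgical task.
rng = np.random.default_rng(5)
experts = rng.normal(0.70, 0.08, size=12)
novices = rng.normal(0.45, 0.10, size=12)

# Simple two-sample comparison of the metric between skill groups.
t_stat, p_value = ttest_ind(experts, novices)
print(f"experts {experts.mean():.2f} vs novices {novices.mean():.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.4f}")
```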

Martin Meisner - One of the best experts on this subject based on the ideXlab platform.

  • Combining Virtual Reality and Mobile Eye Tracking to Provide a Naturalistic Experimental Environment for Shopper Research
    Journal of Business Research, 2017
    Co-Authors: Martin Meisner, Jella Pfeiffer, Thies Pfeiffer, Harmen Oppewal
    Abstract:

    Technological advances in Eye Tracking methodology have made it possible to unobtrusively measure consumer visual attention during the shopping process. Mobile Eye Tracking in field settings, however, has several limitations, including a highly cumbersome data coding process. In addition, field settings allow only limited control of important interfering variables. The present paper argues that virtual reality can provide an alternative setting that combines the benefits of mobile Eye Tracking with the flexibility and control provided by lab experiments. The paper first reviews key advantages of the different Eye Tracking technologies available for desktop, natural, and virtual environments. It then explains how combining virtual reality settings with Eye Tracking provides a unique opportunity for shopper research, in particular regarding the use of augmented reality to provide shopper assistance.
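
One reason VR eases the coding burden described above is that product geometry is known in advance, so gaze can be mapped to products automatically instead of hand-coding video frames. A rough sketch of that idea (all product names, positions, and gaze values below are made up for illustration):

```python
import numpy as np

def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: does the gaze ray intersect the product's axis-aligned bounding box?"""
    direction = np.where(direction == 0, 1e-9, direction)  # avoid division by zero
    t1 = (box_min - origin) / direction
    t2 = (box_max - origin) / direction
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    return t_far >= max(t_near, 0.0)

# Hypothetical virtual shelf: product name -> bounding box (min corner, max corner), meters.
shelf = {"cereal_A": (np.array([1.0, 1.2, 2.0]), np.array([1.3, 1.6, 2.2])),
         "cereal_B": (np.array([1.4, 1.2, 2.0]), np.array([1.7, 1.6, 2.2]))}

eye = np.array([0.0, 1.5, 0.0])                      # eye position from the VR headset
gaze_dir = np.array([1.2, 0.0, 2.1])                 # gaze direction from the eye tracker

looked_at = [name for name, (lo, hi) in shelf.items() if ray_hits_box(eye, gaze_dir, lo, hi)]
print("fixated products this frame:", looked_at)
```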

J N Gowdy - One of the best experts on this subject based on the ideXlab platform.

  • Real-Time Eye Tracking for Human-Computer Interfaces
    International Conference on Multimedia and Expo, 2003
    Co-Authors: S Amarnag, R S Kumaran, J N Gowdy
    Abstract:

    In recent years, considerable interest has developed in real-time Eye Tracking for various applications. An approach that has received a lot of attention is the use of infrared technology for Eye Tracking. In this paper, we propose a technique that does not rely on the use of infrared devices for Eye Tracking. Instead, our Eye tracker makes use of a binary classifier with a dynamic training strategy and an unsupervised clustering stage in order to efficiently track the pupil (Eyeball) in real time. The dynamic training strategy makes the algorithm subject (speaker) and lighting-condition invariant. Our algorithm does not make any assumption regarding the position of the speaker's face in the field of view of the camera, nor does it restrict the 'natural' motion of the speaker in the field of view of the camera. Experimental results from a real-time implementation show that this algorithm is robust and able to detect the pupils under various illumination conditions.
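
As a much-simplified sketch of the pipeline the abstract outlines (a per-pixel binary decision followed by localization of the pupil region), the toy example below uses a fixed intensity threshold in place of the paper's dynamically trained classifier and a simple centroid in place of its clustering stage:

```python
import numpy as np

# Toy frame: bright synthetic background with a dark disc standing in for the pupil.
rng = np.random.default_rng(6)
frame = rng.uniform(120, 255, size=(240, 320))
yy, xx = np.mgrid[0:240, 0:320]
frame[(yy - 100) ** 2 + (xx - 180) ** 2 < 15 ** 2] = 20.0

# Per-pixel binary decision (fixed threshold here, a trained classifier in the paper),
# then localize the pupil as the centroid of the candidate-pixel cluster.
candidate = frame < 60
ys, xs = np.nonzero(candidate)
print(f"estimated pupil center: ({xs.mean():.1f}, {ys.mean():.1f})")
```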