T08: Eye Tracking: Applications, Recording, Analytics, Interaction

Monday, 24 July 2023, 08:30 - 12:30 CEST (Copenhagen)

Andrew T. Duchowski

Clemson University, United States


Objective:

The tutorial is organized around four objectives: a survey of the field, followed by hands-on examples of gaze recording, analytics, and gaze interaction. The goals are for attendees to learn about classic eye-tracking research as well as current directions and open problems, and then to learn basic and advanced eye movement analysis and interaction techniques.

For 2023, the tutorial will include new PsychoPy demonstration code with which attendees will be able to record their own data, provided they have access to an eye tracker supported by PsychoPy (i.e., Tobii, Gazepoint, or SR Research), or to simulate gaze data with the mouse (also an available PsychoPy option).


Content and Benefits:

The tutorial starts with an overview of eye-tracking applications, distinguishing eye movement analysis from synthesis in virtual reality, games, and other venues, including mobile eye tracking. The focus is on five forms of applications: diagnostic (off-line measurement), active (selection, look-to-shoot), passive (foveated rendering, a.k.a. gaze-contingent displays), assistive (translation), and expressive (gaze synthesis).

The tutorial will then cover details of a Python-based gaze recording and analytics pipeline developed and used by Prof. Duchowski and others. The gaze analytics pipeline consists of PsychoPy and Python scripts for extraction of raw eye movement data, analysis and event detection via velocity-based filtering, and collation of events for statistical evaluation, with analysis and visualization of results carried out in R.

The tutorial covers basic eye movement analytics, e.g., fixation count and dwell time within areas of interest (AOIs), as well as advanced analysis using ambient/focal attention modeling and gaze transition entropy. Newer analytical tools and techniques, such as microsaccade detection and measurement of pupillary activity, will be covered, time permitting. The tutorial concludes with an overview and demo of how to build an interactive Python application.
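To give a flavor of the velocity-based filtering and AOI dwell-time metrics described above, the following sketch detects fixations with a simple I-VT-style velocity threshold and totals fixation durations per rectangular AOI. The function names, threshold, and sampling rate here are illustrative assumptions, not the instructor's actual pipeline code.

```python
import math

VELOCITY_THRESHOLD = 30.0   # deg/s; a common I-VT default (assumption)
SAMPLING_RATE = 60.0        # Hz; depends on the tracker

def detect_fixations(samples, threshold=VELOCITY_THRESHOLD, hz=SAMPLING_RATE):
    """Group consecutive sub-threshold samples into fixations.

    samples: list of (x, y) gaze positions, in degrees of visual angle.
    Returns a list of (start_idx, end_idx, duration_s, centroid).
    """
    dt = 1.0 / hz
    fixations = []

    def flush(start, end):
        if end - start >= 2:  # require at least a few samples per fixation
            xs = [s[0] for s in samples[start:end]]
            ys = [s[1] for s in samples[start:end]]
            fixations.append((start, end, (end - start) * dt,
                              (sum(xs) / len(xs), sum(ys) / len(ys))))

    start = 0
    for i in range(1, len(samples)):
        # point-to-point velocity between successive samples
        v = math.dist(samples[i], samples[i - 1]) / dt
        if v > threshold:      # saccade sample: close any open fixation
            flush(start, i)
            start = i
    flush(start, len(samples))
    return fixations

def dwell_times(fixations, aois):
    """Total fixation duration per named rectangular AOI (x, y, w, h)."""
    totals = {name: 0.0 for name in aois}
    for _, _, dur, (cx, cy) in fixations:
        for name, (x, y, w, h) in aois.items():
            if x <= cx <= x + w and y <= cy <= y + h:
                totals[name] += dur
    return totals
```

For example, twenty samples split between two stable gaze locations yield two fixations, whose durations can then be totaled per AOI.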

Attendee benefits are four-fold: (a) attendees will learn of a large and growing number of eye-tracking applications, potentially inspiring fruitful research directions; (b) attendees will learn basic (and open-source) methods of gaze data recording with PsychoPy; (c) attendees will learn basic and advanced methods of eye movement analysis; and (d) attendees will gain an understanding of what is involved in constructing an interactive application from the ground up.
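As one illustration of the advanced analyses mentioned above, gaze transition entropy can be computed from a sequence of AOI labels. The sketch below follows one common formulation (conditional transition entropy weighted by each AOI's empirical stationary probability); it is an assumption for illustration, not the tutorial's actual scripts.

```python
from collections import Counter, defaultdict
from math import log2

def transition_entropy(aoi_sequence):
    """H_t = -sum_i p(i) * sum_j p(j|i) * log2 p(j|i), estimated empirically.

    aoi_sequence: ordered list of AOI labels, one per fixation.
    """
    transitions = list(zip(aoi_sequence, aoi_sequence[1:]))
    if not transitions:
        return 0.0
    out_counts = defaultdict(Counter)   # source AOI -> destination counts
    for src, dst in transitions:
        out_counts[src][dst] += 1
    n = len(transitions)
    h = 0.0
    for src, counter in out_counts.items():
        total = sum(counter.values())
        p_src = total / n               # empirical stationary weight
        h -= p_src * sum((c / total) * log2(c / total)
                         for c in counter.values())
    return h
```

A strictly alternating scanpath (A, B, A, B, ...) is perfectly predictable and yields zero entropy, while mixed transitions from an AOI raise the value.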


Target Audience:

The intended audience is largely composed of novice attendees who may have heard something about eye tracking and would like to learn more about what is involved. Seasoned veterans are also welcome; they may benefit from the section on experimental design.

Finally, time permitting, anyone interested in developing a basic eye-tracking application will benefit from an overview of a Python application designed to obtain gaze information in real time from a commodity eye tracker.


Additional platform or tool to be used by the tutor:

PsychoPy, Python and R scripts will be demonstrated to show how typical gaze data can be recorded and processed via an analytics pipeline developed by the instructor. Example code will be sent out to registrants so they can install the proper tools and then try the example code before the tutorial.


List of materials or devices required by the participants:

Generally, installation of Python 3 and R would be helpful, as would installation of, and familiarity with, PsychoPy 3. Open-source software installation instructions (e.g., for additional software such as HDFView and Scribus) will be provided by the instructor ahead of the tutorial. All software used is generally available for all operating systems, and demonstration code is provided for Windows and Unix-like platforms (including macOS).


Relevant links and software will be provided here: http://andrewd.ces.clemson.edu/hcii23

Bio Sketch of Presenter:

Dr. Duchowski is a professor of Computer Science at Clemson University. He received his baccalaureate (1990) from Simon Fraser University, Burnaby, Canada, and doctorate (1997) from Texas A&M University, College Station, TX, both in Computer Science. His research and teaching interests include visual attention and perception, computer vision, and computer graphics. He is a noted research leader in the field of eye tracking, having produced a corpus of related papers and a monograph on eye tracking methodology, and has delivered courses and seminars on the subject at international conferences. He maintains Clemson's eye-tracking laboratory and teaches a regular course on eye tracking methodology that attracts students from a variety of disciplines across campus.