Dr. Thomas Kübler

University of Tübingen
Department of Computer Science
Human-Computer Interaction
Sand 14
72076 Tübingen
Germany

Telephone: +49 (0) 7071 29-78996
Telefax: +49 (0) 7071 29-5062
E-Mail: thomas.kuebler@uni-tuebingen.de
Office: Sand 14, C205a
Office hours: by appointment

My research concerns all aspects of eye tracking, from the design of the physical recording device, through the necessary image processing steps based on classical computer vision as well as DNNs, to the high-level interpretation of recorded gaze sequences through machine learning. I work on data quality assessment, visualization, and gaze analysis tools for applications in the medical, educational, automotive, and art history domains.

With my spin-off Look!, we develop eye-tracking solutions for automotive applications as well as in-vehicle head-mounted devices for driving schools and driving instruction.

Our algorithms for the registration and analysis of gaze data are well suited to infer cognitive processes such as attention and vigilance solely from the movements of the eyes, which makes them an excellent complement to traditional measures such as PERCLOS, head movements, or gaze-on-road. This adds to the robustness and sensitivity of driver monitoring systems by enabling the vehicle to sense the driver's current attentional state.
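
For context, PERCLOS is commonly defined as the fraction of time within a window during which the eyes are at least 80% closed. The minimal sketch below (an illustration over assumed per-frame signals, not code from Look! or the group) computes PERCLOS and a gaze-on-road ratio that such a system could combine with gaze-based attention estimates:

```python
import numpy as np

def perclos(aperture, closed_threshold=0.2):
    """Fraction of samples in which the eye is at least 80% closed.
    `aperture` holds normalized eyelid openings in [0, 1] (1.0 = fully open),
    e.g. sampled over a 60-second window."""
    aperture = np.asarray(aperture, dtype=float)
    return float(np.mean(aperture < closed_threshold))

def gaze_on_road_ratio(on_road_flags):
    """Share of frames in which the estimated gaze falls on the road area."""
    return float(np.mean(np.asarray(on_road_flags, dtype=bool)))

# Hypothetical window: half of the frames show a nearly closed eye.
print(perclos([1.0, 0.9, 0.1, 0.05, 0.8, 0.15]))      # -> 0.5
print(gaze_on_road_ratio([True, True, False, True]))  # -> 0.75
```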

Our head-mounted eye-tracking devices enable driving instructors to see the street through the eyes of their students. That way they can give efficient feedback and speed up the learning process by making students aware of the importance of correctly exploring their visual surroundings.

Research Interests

  • Gaze-based driver assistance and monitoring systems
  • Computational models of human gaze behavior
  • Algorithms for the comparison of exploratory gaze sequences
  • Eye-tracking data quality in real-world applications
  • Algorithms and tools for the analysis and visualization of eye-tracking data
  • Head-mounted eye-tracking hardware

Publications

Low Power Scanned Laser Eye Tracking for Retinal Projection AR Glasses

2020

by Johannes Meyer, Thomas Schlebusch, Thomas C Kübler, and Enkelejda Kasneci

BIB

The Perception Engineer’s Toolkit for Eye-Tracking data analysis

2020

by Thomas C Kübler

PDF BIB

Deep semantic gaze embedding and scanpath comparison for expertise classification during OPT viewing

2020

by Nora Castner, Thomas C Kübler, Juliane Richter, Therese Eder, Fabian Huettig, Constanze Keutel, and Enkelejda Kasneci

PDF BIB

Improving Real-Time CNN-Based Pupil Detection Through Domain-Specific Data Augmentation

jun, 2019

by S. Eivazi, T. Santini, A. Keshavarzi, T. C. Kübler, and A. Mazzei

PDF BIB

Ferns for area of interest free scanpath classification

jun, 2019

by W. Fuhl, N. Castner, T. C. Kübler, A. Lotz, W. Rosenstiel, and E. Kasneci

PDF BIB

Eye-Hand Behavior in Human-Robot Shared Manipulation

mar, 2018

by R. M., T. Santini, T. C. Kübler, E. Kasneci, S. Srinivasa, and H. Admoni

BIB

An Inconspicuous and Modular Head-Mounted Eye Tracker

jun, 2018

by S. Eivazi, T. Kübler, T. Santini, and E. Kasneci

BIB

Automatic generation of saliency-based areas of interest

sep, 2018

by W. Fuhl, T. Kübler, T. Santini, and E. Kasneci

PDF BIB

Region of interest generation algorithms for eye tracking data

jun, 2018

by W. Fuhl, T. C. Kübler, H. Brinkmann, R. Rosenberg, W. Rosenstiel, and E. Kasneci

PDF BIB

Scanpath comparison in medical image reading skills of dental students

jun, 2018

by N. Castner, E. Kasneci, T. C. Kübler, K. Scheiter, and J. Richter

PDF BIB

Eye movement simulation and detector creation to reduce laborious parameter adjustments

2018

by W. Fuhl, T. Santini, T. Kuebler, N. Castner, W. Rosenstiel, and E. Kasneci

PDF BIB

EyeLad: Remote Eye Tracking Image Labeling Tool

feb, 2017

by W. Fuhl, T. Santini, D. Geisler, T. C. Kübler, and E. Kasneci

PDF BIB

Algorithms for the comparison of visual scan patterns

feb, 2017

by T. C. Kübler

BIB

Aggregating physiological and eye tracking signals to predict perception in the absence of ground truth

mar, 2017

by E. Kasneci, T. C. Kübler, K. Broelemann, and G. Kasneci

BIB

Ways of improving the precision of eye tracking data: Controlling the influence of dirt and dust on pupil detection

may, 2017

by W. Fuhl, T. C. Kübler, D. Hospach, O. Bringmann, W. Rosenstiel, and E. Kasneci

PDF BIB

Automatic Mapping of Remote Crowd Gaze to Stimuli in the Classroom

sep, 2017

by T. Santini, T. C. Kübler, L. Draghetti, P. Gerjets, W. Wagner, U. Trautwein, and E. Kasneci

BIB

Monitoring Response Quality During Campimetry Via Eye-Tracking

mar, 2017

by G. Dambros, J. Ungewiss, T. C. Kübler, E. Kasneci, and M. Spüler

BIB

EyeRec: An Open-source Data Acquisition Software for Head-mounted Eye-tracking

feb, 2016

by T. Santini, W. Fuhl, T. C. Kübler, and E. Kasneci

PDF BIB

Rendering refraction and reflection of eyeglasses for synthetic eye tracker images

mar, 2016

by T. C. Kübler, T. Rittig, J. Ungewiss, C. Krauss, and E. Kasneci

BIB

ElSe: Ellipse Selection for Robust Pupil Detection in Real-World Environments

mar, 2016

by W. Fuhl, T. Santini, T. C. Kübler, and E. Kasneci

PDF BIB

Bayesian Identification of Fixations, Saccades, and Smooth Pursuits

mar, 2016

by T. Santini, W. Fuhl, T. C. Kübler, and E. Kasneci

PDF BIB

SubsMatch 2.0: Scanpath comparison and classification based on subsequence frequencies

jul, 2016

by T. C. Kübler, C. Rothe, U. Schiefer, W. Rosenstiel, and E. Kasneci

PDF BIB

Eyes Wide Open? Eyelid Location and Eye Aperture Estimation for Pervasive Eye Tracking in Real-World Scenarios

sep, 2016

by W. Fuhl, T. Santini, D. Geisler, T. C. Kübler, W. Rosenstiel, and E. Kasneci

PDF BIB

Novel methods for analysis and visualization of saccade trajectories

oct, 2016

by T. C. Kübler, W. Fuhl, R. Rosenberg, W. Rosenstiel, and E. Kasneci

PDF BIB

Towards Automated Scan Pattern Analysis for Dynamic Scenes

2015

by J. Ungewiss, T. C. Kübler, D.R. Bukenberger, E. Kasneci, and U. Schiefer

BIB

Online Recognition of Fixations, Saccades, and Smooth Pursuits for Automated Analysis of Traffic Hazard Perception

2015

by E. Kasneci, G. Kasneci, T. C. Kübler, and W. Rosenstiel

BIB

Analysis of eye movements with Eyetrace

2015

by T. C. Kübler, K. Sippel, W. Fuhl, G. Schievelbein, J. Aufreiter, R. Rosenberg, W. Rosenstiel, and E. Kasneci

PDF BIB

Eyetrace2014: Eyetracking Data Analysis Tool

jan, 2015

by K. Sippel, T. C. Kübler, W. Fuhl, G. Schievelbein, R. Rosenberg, and W. Rosenstiel

PDF BIB

Exploiting the potential of eye movements analysis in the driving context

mar, 2015

by E. Kasneci, T. C. Kübler, C. Braunagel, W. Fuhl, W. Stolzmann, and W. Rosenstiel

PDF BIB

ExCuSe: Robust Pupil Detection in Real-World Scenarios

sep, 2015

by W. Fuhl, T. C. Kübler, K. Sippel, W. Rosenstiel, and E. Kasneci

PDF BIB

Automated Comparison of Scanpaths in Dynamic Scenes

sep, 2015

by T. C. Kübler and E. Kasneci

BIB

Automated Visual Scanpath Analysis Reveals the Expertise Level of Micro-neurosurgeons

oct, 2015

by T. C. Kübler, S. Eivazi, and E. Kasneci

PDF BIB

Driving with Glaucoma: Task Performance and Gaze Movements

nov, 2015

by T. C. Kübler, E. Kasneci, K. Aehling, M. Heister, W. Rosenstiel, U. Schiefer, and E. Papageorgiou

BIB

Driving with Homonymous Visual Field Defects: Driving Performance and Compensatory Gaze Movements

dec, 2015

by T. C. Kübler, E. Kasneci, W. Rosenstiel, K. Aehling, M. Heister, K. Nagel, U. Schiefer, and E. Papageorgiou

BIB

Arbitrarily shaped areas of interest based on gaze density gradient

aug, 2015

by W. Fuhl, T. C. Kübler, K. Sippel, W. Rosenstiel, and E. Kasneci

PDF BIB

The Applicability of Probabilistic Methods to the Online Recognition of Fixations and Saccades in Dynamic Scenes

mar, 2014

by E. Kasneci, G. Kasneci, T. C. Kübler, and W. Rosenstiel

BIB

SubsMatch: Scanpath Similarity in Dynamic Scenes based on Subsequence Frequencies

mar, 2014

by T. C. Kübler, E. Kasneci, and W. Rosenstiel

BIB

Rule-based classification of visual field defects

mar, 2014

by E. Kasneci, G. Kasneci, U. Schiefer, and W. Rosenstiel

BIB

Gaze guidance for the visually impaired

mar, 2014

by T. C. Kübler, E. Kasneci, and W. Rosenstiel

BIB

Stress-indicators and exploratory gaze for the analysis of hazard perception in patients with visual field loss

may, 2014

by T. C. Kübler, E. Kasneci, W. Rosenstiel, U. Schiefer, K. Nagel, and E. Papageorgiou

BIB

Towards automated comparison of eye-tracking recordings in dynamic scenes

dec, 2014

by T. C. Kübler, D. R., J. Ungewiss, A. Wörner, C. Rothe, U. Schiefer, W. Rosenstiel, and E. Kasneci

BIB

Online Classification of Eye Tracking Data for Automated Analysis of Traffic Hazard Perception

sep, 2013

by E. Tafaj, T. C. Kübler, G. Kasneci, W. Rosenstiel, and M. Bogdan

BIB

Auswirkungen des visuellen Explorationsverhaltens von Patienten mit binokularen Gesichtsfelddefekten auf alltagsrelevante Tätigkeiten - Ergebnisse der TUTOR-Studie

sep, 2013

by U. Schiefer, T. C. Kübler, M. Heister, K. Aehling, K. Sippel, E. Papageorgiou, W. Rosenstiel, and E. Tafaj

BIB

Vishnoo - An Open-Source Software for Vision Research

jun, 2011

by E. Tafaj, T. C. Kübler, J. Peter, U. Schiefer, and M. Bogdan

BIB

Research

Eyetrace

Eyetrace is a tool for the analysis of eye-tracking data. It bundles a variety of evaluation methods and supports a large share of available eye trackers, with the aim of supporting scientific work and medical diagnosis. To make Eyetrace compatible with different eye trackers, an additional tool called Eyetrace Butler is used. The Eyetrace Butler performs data preprocessing and conversion for analysis with Eyetrace: it provides plugins for different eye trackers and converts their data into a format that can be imported and used by Eyetrace.
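
As a rough illustration of this plugin idea (the actual Eyetrace Butler interface is not shown here; the class, field, and column names below are hypothetical), a converter plugin essentially maps a tracker-specific log format onto a common sample structure:

```python
import csv
from dataclasses import dataclass

@dataclass
class GazeSample:
    timestamp: float  # seconds since recording start
    x: float          # horizontal gaze position in pixels
    y: float          # vertical gaze position in pixels

class CsvTrackerPlugin:
    """Hypothetical converter plugin: reads a vendor-specific CSV export
    and yields samples in a common format an analysis tool could import."""

    def convert(self, path):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                yield GazeSample(
                    timestamp=float(row["time_ms"]) / 1000.0,
                    x=float(row["gaze_x"]),
                    y=float(row["gaze_y"]),
                )
```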

Learn More

Scanpath Comparison

Our eye movements are driven by a continuous trade-off between the need for a detailed examination of objects of interest and the necessity to keep an overview of our surroundings. As a consequence, behavioral patterns that are characteristic of our actions and their planning typically manifest in the way we move our eyes to interact with our environment. Identifying such patterns from individual eye movement measurements is, however, highly challenging.
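
One family of methods we work with compares scanpaths via subsequence frequencies (see SubsMatch in the publication list). The following minimal sketch illustrates that general idea on scanpaths that have already been discretized, e.g. by binning fixation positions into grid cells; it is an illustration, not the published SubsMatch implementation:

```python
from collections import Counter

def subsequence_histogram(symbols, n=3):
    """Relative frequencies of all length-n subsequences (n-grams) of a
    scanpath that has already been discretized into symbols, e.g. by
    binning fixation coordinates into grid cells."""
    grams = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

def histogram_distance(h1, h2):
    """Sum of absolute frequency differences over all observed n-grams;
    smaller values indicate more similar scanpaths."""
    keys = set(h1) | set(h2)
    return sum(abs(h1.get(k, 0.0) - h2.get(k, 0.0)) for k in keys)

# Two hypothetical scanpaths over grid cells labeled A-D
a = subsequence_histogram(list("ABABCDAB"))
b = subsequence_histogram(list("ABCDABCD"))
print(histogram_distance(a, b))
```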

Learn More

Vishnoo - A Visual Search Examination Tool

Vishnoo (Visual Search Examination Tool) is an integrated framework that combines configurable search tasks with gaze tracking capabilities, thus enabling the analysis of both the visual field and visual attention.

Learn More

Teaching

Course
Advanced Topics in Perception Engineering
Computational Models of Visual Attention
Technische Anwendungen der Informatik: Hard- und Software aktueller Eye-Tracking-Systeme
Technische Anwendungen der Informatik: Hard- und Software aktueller Eye-Tracking-Systeme
Technische Anwendungen der Informatik: Hard- und Software aktueller Eye-Tracking-Systeme
Eye Tracking in Mobile Computing and Virtual Reality

Open Thesis Topics

10.07.2019

Saccade Bundles

This project is about applying (and probably adapting) a linear-time clustering algorithm originally developed for brain fiber tracts so that it works with saccadic trajectories. The result will be a clustering of saccades into saccade bundles.
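
A minimal sketch of the underlying idea is shown below: trajectories are resampled to a fixed number of points and greedily assigned to the closest bundle centroid, or they open a new bundle if no centroid is close enough. The threshold and point count are arbitrary illustration values, and this is not the specific fiber clustering algorithm the project refers to.

```python
import numpy as np

def resample(traj, n=12):
    """Resample a 2-D saccade trajectory to n equally spaced points."""
    traj = np.asarray(traj, dtype=float)
    d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(traj, axis=0), axis=1))]
    t = np.linspace(0.0, d[-1], n)
    return np.column_stack([np.interp(t, d, traj[:, i]) for i in range(2)])

def bundle_saccades(trajectories, threshold=50.0, n=12):
    """Greedy clustering: assign each trajectory to the closest bundle
    centroid (mean point-wise distance) or open a new bundle. Roughly
    linear in the number of trajectories for a small number of bundles."""
    centroids, members = [], []
    for traj in trajectories:
        r = resample(traj, n)
        if centroids:
            dists = [np.mean(np.linalg.norm(r - c, axis=1)) for c in centroids]
            k = int(np.argmin(dists))
            if dists[k] < threshold:
                members[k].append(r)
                centroids[k] = np.mean(members[k], axis=0)
                continue
        centroids.append(r)
        members.append([r])
    return centroids, members
```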

Read more …

10.07.2019

Gaze Counter

While success monitoring for a given ad in the online advertising industry is relatively easy, e.g. via the number of clicks or a click-to-purchase rate, this is not always as simple for offline media: How often is a particular poster looked at per day? Does one design attract more attention than another? Is a higher price for a particular poster location justified? Gaze behavior is a common indicator in the advertising industry, but eye-tracking studies are expensive and laborious to evaluate.

Read more …

10.07.2019

Crowd attention tracking

We want to be able to track the eye and head movements of a crowd of people, such as a whole classroom. From such data we can infer measures of the current attentional focus, e.g. whether students pay attention to the lecture material and when attention decreases. To this end, we will record high-resolution videos of multiple persons in a naturalistic setting, covering a variety of head poses and eye positions.

Read more …

10.07.2019

Automated Scanpath Comparison

Humans can direct their attention deliberately. One way of recording this allocation of attention is eye tracking: objects that are currently of interest are fixated with the eyes. Often it is interesting not only where individual persons direct their attention, but also how this compares between individuals or across points in time. For this purpose, recordings are reduced to simple elements: fixations and saccades (rapid eye movements). The temporal order and spatial position of fixations and saccades is called a scanpath. Current comparison methods are based on two different approaches.
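
The reduction of raw gaze samples to fixations mentioned above can, for instance, be done with a dispersion-based detector. The sketch below is a minimal I-DT-style illustration; the thresholds are placeholder values, not tuned settings from our tools:

```python
import numpy as np

def idt_fixations(x, y, t, dispersion_px=30.0, min_duration_s=0.1):
    """Dispersion-based (I-DT-style) fixation detection.
    Returns (start_time, end_time, centroid_x, centroid_y) tuples."""
    x, y, t = (np.asarray(a, dtype=float) for a in (x, y, t))
    fixations, i, n = [], 0, len(t)
    while i < n:
        j = i
        # grow the window until it spans at least the minimum duration
        while j < n and t[j] - t[i] < min_duration_s:
            j += 1
        if j >= n:
            break
        w = slice(i, j + 1)
        disp = (x[w].max() - x[w].min()) + (y[w].max() - y[w].min())
        if disp <= dispersion_px:
            # extend the window while the dispersion stays below the threshold
            while j + 1 < n:
                xs, ys = x[i:j + 2], y[i:j + 2]
                if (xs.max() - xs.min()) + (ys.max() - ys.min()) > dispersion_px:
                    break
                j += 1
            fixations.append((t[i], t[j],
                              float(x[i:j + 1].mean()), float(y[i:j + 1].mean())))
            i = j + 1
        else:
            i += 1
    return fixations
```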

Read more …

Finished Thesis Topics

10.07.2019

Vein extraction and eye rotation determination

The first step is setting up a recording environment with a fixed subject position. This environment is used for data acquisition with predefined head rotations of the subjects. Based on these data, an algorithm has to be developed that measures the eyeball rotation of the subject. The resulting angle is then compared with and validated against the head rotation.

Read more …

10.07.2019

EyeTrace CUDA extension

EyeTrace is a software for gaze data visualization and analysis. Due to the increasing amount of data, these visualizations require more and more computation time. In this thesis, the existing visualizations should be implemented using CUDA for GPU computation. This also includes a data storage model that makes it possible to shift data between the GPU and the host computer. Since not all computers have a CUDA-capable graphics card, the module should also allow CPU computation and should choose between the two automatically.
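
The EyeTrace extension itself would be C++/CUDA; purely as an illustration of the automatic GPU/CPU selection idea, the Python sketch below falls back to NumPy when a CuPy/CUDA backend is not available (library choice and function names are my own example, not part of EyeTrace):

```python
try:
    import cupy as xp   # GPU arrays, used when a CUDA-capable device is available
    BACKEND = "GPU (CuPy)"
except ImportError:
    import numpy as xp  # transparent CPU fallback
    BACKEND = "CPU (NumPy)"

def coarse_gaze_grid(gaze_x, gaze_y, width, height, cell=20):
    """Naive gaze-density grid; the identical code runs on either backend."""
    gx = xp.clip(xp.asarray(gaze_x) // cell, 0, width // cell - 1).astype(int)
    gy = xp.clip(xp.asarray(gaze_y) // cell, 0, height // cell - 1).astype(int)
    grid = xp.zeros((height // cell, width // cell))
    for yi, xi in zip(gy.tolist(), gx.tolist()):
        grid[yi, xi] += 1
    return grid

print("Running on:", BACKEND)
```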

Read more …

10.07.2019

A Dashboard for Eyetrace

The Eyetrace software offers a wide range of visualization options for eye-tracking data. In this project, a graphically appealing overview of the data currently loaded into the program is to be created.

Read more …

10.07.2019

3D Eyeball generation based on vein motion

The first step is robust feature extraction; this can be done using SURF, SIFT, BRISK, or MSER features if they prove sufficient. These features then have to be matched to features found in consecutive images. Based on the displacement, a 3D model has to be computed. This model is afterwards used for gaze position estimation.
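
As a minimal sketch of the feature extraction and matching steps, using OpenCV's BRISK detector as one of the candidates named above (file names are placeholders, and the 3D model fitting itself is not shown):

```python
import cv2

# Two consecutive eye images (placeholder file names)
img1 = cv2.imread("eye_frame_000.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("eye_frame_001.png", cv2.IMREAD_GRAYSCALE)

# Detect keypoints and compute binary BRISK descriptors in both frames
detector = cv2.BRISK_create()
kp1, des1 = detector.detectAndCompute(img1, None)
kp2, des2 = detector.detectAndCompute(img2, None)

# Hamming distance suits binary descriptors; cross-checking removes weak matches
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Per-feature displacement between the frames; these displacements would feed
# the 3D eyeball model and rotation estimate described above
displacements = [
    (kp2[m.trainIdx].pt[0] - kp1[m.queryIdx].pt[0],
     kp2[m.trainIdx].pt[1] - kp1[m.queryIdx].pt[1])
    for m in matches
]
```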

Read more …

News

19.03.2020

Eight papers accepted at the 2020 Eye Tracking Research and Applications Conference

The Human-Computer Interaction group (formerly Perception Engineering) is represented with eight papers at the 2020 Eye Tracking Research and Applications Conference (ETRA).

Read more ...