Prof. Dr. Enkelejda Kasneci


University of Tübingen
Department of Computer Science
Human-Computer Interaction
Sand 14
72076 Tübingen
Germany

Telephone
+49 - (0) 70 71 - 29 - 74015
Telefax
+49 - (0) 70 71 - 29 - 50 62
E-Mail
enkelejda.kasneci@uni-tuebingen.de
Office
Sand 14, C221
Office hours
by appointment

Publications on Google Scholar

Enkelejda Kasneci is a Professor of Computer Science at the University of Tübingen, Germany, where she leads the Human-Computer Interaction Lab. As a BOSCH scholar, she received her M.Sc. degree in Computer Science from the University of Stuttgart in 2007. In 2013, she received her PhD in Computer Science from the University of Tübingen. For her PhD research, she was awarded the research prize of the Federation Südwestmetall in 2014. From 2013 to 2015, she was a postdoctoral researcher and a Margarete-von-Wrangell Fellow at the University of Tübingen. Her research revolves around the application of machine learning for intelligent and perceptual human-computer interaction. She has served as an academic editor for PLOS ONE and as a TPC member and reviewer for several major conferences and journals.

Research Interests

  • Eye-tracking methods and applications, especially eye tracking in the wild
  • Applied machine learning
  • Eye movements and driving
  • Autonomous driving and Driver Observation Technology
  • Eye movements and VR/AR

Invited Talks

  • 2020: ECCV 2020 OpenEyes: Eye Gaze in AR, VR and in the wild
  • 2020: Closing Keynote at Augmented Human, AH 2020
  • 2019: Machine Learning in Education, FernUni Hagen
  • 2019: Keynote at the ACM Symposium on Eye Tracking Research and Applications, ETRA 2019
  • 2019: Opening Keynote European Conference on Eye Movements, ECEM 2019
  • 2018: Keynote on Cognitive Interfaces, Ada-Lovelace-Festival, Berlin
  • 2018: Global Female Leaders Summit, Berlin
  • 2017: Hub.Berlin
  • 2017: Ada-Lovelace-Festival, Berlin
  • 2017: Eye-Tracking während des Fahrens, OCG Jahresopening 2017, Vienna
  • 2016: It’s in your eyes – How eye tracking will shape our future, Ada-Lovelace-Festival, Berlin
  • 2016: Maschinelles Lernen und Eye-Tracking-Technologie zur Erforschung der Mechanismen der visuellen Wahrnehmung, INFORMATIK2016, Klagenfurt
  • 2015: Eye tracking in natural settings - Challenges and opportunities, Institut für Neuro- und Bioinformatik, Universität zu Lübeck
  • 2015: Arbitrarily shaped areas of interest based on gaze density gradient, European Conference on Eye Movements, ECEM 2015, Vienna
  • 2015: Exploiting the potential of eye movements analysis in the driving context, Perceptual User Interfaces Group, Max Planck Institute for Informatics, Saarbrücken
  • 2015: Eye movements and driving, Volkswagen Research Center, Wolfsburg
  • 2014: Online Eye-Tracking Data Analysis, SMI Vision, Research Center Berlin
  • 2013: Towards the Automated Recognition of Assistance Need for Drivers with Impaired Visual Field, Mercedes-Benz Technology Center Sindelfingen

Scholarships, Awards and Administrative Functions

Publications

Distilling Location Proposals of Unknown Objects Through Gaze Information for Human-Robot Interaction

2020

by Daniel Weber, Thiago Santini, Andreas Zell, and Enkelejda Kasneci

PDF BIB

RemoteEye: An open-source high-speed remote eye tracker

mar, 2020

by Benedikt Hosp, Shahram Eivazi, Maximilian Maurer, Wolfgang Fuhl, David Geisler, and Enkelejda Kasneci

PDF BIB

Deep semantic gaze embedding and scanpath comparison for expertise classification during OPT viewing

2020

by Nora Castner, Thomas C Kübler, Juliane Richter, Therese Eder, Fabian Huettig, Constanze Keutel, and Enkelejda Kasneci

PDF BIB

Privacy Preserving Gaze Estimation using Synthetic Images via a Randomized Encoding Based Framework

2020

by Efe Bozkir, Ali B. Uenal, Mete Akguen, Enkelejda Kasneci, and Nico Pfeifer

BIB

A Novel Eye-Tracking Sensor for AR Glasses Based on Laser Self-Mixing Showing Exceptional Robustness Against Illumination

2020

by Johannes Meyer, Thomas Schlebusch, Hans Spruit, Jochen Hellmig, and Enkelejda Kasneci

BIB

Exploiting the GBVS for Saliency aware Gaze Heatmaps

2020

by David Geisler, Daniel Weber, Nora Castner, and Enkelejda Kasneci

PDF BIB

A MinHash approach for fast scanpath classification

2020

by David Geisler, Nora Castner, Gjergji Kasneci, and Enkelejda Kasneci

PDF BIB

Training Decision Trees as Replacement for Convolution Layers

feb, 2020

by W. Fuhl, G. Kasneci, W. Rosenstiel, and E. Kasneci

PDF BIB

Tiny convolution, decision tree, and binary neuronal networks for robust and real time pupil outline estimation

jan, 2020

by W. Fuhl, H. Gao, and E. Kasneci

PDF BIB

Neural networks for optical vector and eye ball parameter estimation

jan, 2020

by W. Fuhl, H. Gao, and E. Kasneci

PDF BIB

A Novel Camera-Free Eye Tracking Sensor for Augmented Reality based on Laser Scanning

jul, 2020

by Johannes Meyer, Thomas Schlebusch, Wolfgang Fuhl, and Enkelejda Kasneci

BIB

Fully Convolutional Neural Networks for Raw Eye Tracking Data Segmentation, Generation, and Reconstruction

2020

by Wolfgang Fuhl, Yao Rong, and Enkelejda Kasneci

PDF BIB

Explainable Online Validation of Machine Learning Models for Practical Applications

aug, 2020

by Wolfgang Fuhl and Enkelejda Kasneci

PDF BIB

Multi Layer Neural Networks as Replacement for Pooling Operations

aug, 2020

by Wolfgang Fuhl, Yao Rong, Thomas Motz, Michael Scheidt, Andreas Hartel, Andreas Koch, and Enkelejda Kasneci

PDF BIB

Reinforcement learning for the privacy preservation and manipulation of eye tracking data

aug, 2020

by Wolfgang Fuhl, Efe Bozkir, and Enkelejda Kasneci

PDF BIB

Weight and Gradient Centralization in Deep Neural Networks

aug, 2020

by Wolfgang Fuhl and Enkelejda Kasneci

PDF BIB

Rotated Ring, Radial and Depth Wise Separable Radial Convolutions

aug, 2020

by Wolfgang Fuhl and Enkelejda Kasneci

PDF BIB

Pupil diameter differentiates expertise in dental radiography visual search

may, 2020

by Nora Castner, Tobias Appel, Thérése Eder, Juliane Richter, Katharina Scheiter, Constanze Keutel, Fabian Hüttig, Andrew Duchowski, and Enkelejda Kasneci

PDF BIB

Driver Intention Anticipation Based on In-Cabin and Driving Scene Monitoring

2020

by Yao Rong, Zeynep Akata, and Enkelejda Kasneci

BIB

The display makes a difference: A mobile eye tracking study on the perception of art before and after a museum’s rearrangement

jun, 2020

by L. Reitstätter, H. Brinkmann, T. Santini, E. Specker, Z. Dare, F. Bakondi, A. Miscená, E. Kasneci, H. Leder, and R. Rosenberg

PDF BIB

Camera-Free Eye Tracking Sensor for Augmented Reality based on Laser Scanning

jul, 2020

by J. Meyer, T. Schlebusch, W. Fuhl, and E. Kasneci

PDF BIB

Encodji: Encoding Gaze Data Into Emoji Space for an Amusing Scanpath Classification Approach ;)

2019

by Wolfgang Fuhl, Efe Bozkir, Benedikt Hosp, Nora Castner, David Geisler, Thiago C., and Enkelejda Kasneci

PDF BIB

Person Independent, Privacy Preserving, and Real Time Assessment of Cognitive Load using Eye Tracking in a Virtual Reality Setup

mar, 2019

by E. Bozkir, D. Geisler, and E. Kasneci

PDF BIB

Improving Real-Time CNN-Based Pupil Detection Through Domain-Specific Data Augmentation

jun, 2019

by S. Eivazi, T. Santini, A. Keshavarzi, T. C. Kübler, and A. Mazzei

PDF BIB

Get a Grip: Slippage-Robust and Glint-Free Gaze Estimation for Real-Time Pervasive Head-Mounted Eye Tracking

jun, 2019

by T. Santini, D. Niehorster, and E. Kasneci

BIB

Ferns for area of interest free scanpath classification

jun, 2019

by W. Fuhl, N. Castner, T. C. Kübler, A. Lotz, W. Rosenstiel, and E. Kasneci

PDF BIB

500,000 images closer to eyelid and pupil segmentation

nov, 2019

by W. Fuhl, W. Rosenstiel, and E. Kasneci

PDF BIB

RemoteEye: An Open Source remote Eye Tracker

dec, 2019

by B. Hosp, S. Eivazi, M. Maurer, W. Fuhl, and E. Kasneci

BIB

The applicability of Cycle GANs for pupil and eyelid segmentation, data generation and image refinement

nov, 2019

by W. Fuhl, D. Geisler, W. Rosenstiel, and E. Kasneci

PDF BIB

Learning to validate the quality of detected landmarks

nov, 2019

by W. Fuhl and E. Kasneci

PDF BIB

Assessment of Driver Attention During a Safety Critical Situation in VR to Generate VR-based Training

sep, 2019

by E. Bozkir, D. Geisler, and E. Kasneci

PDF BIB

Eye-Hand Behavior in Human-Robot Shared Manipulation

mar, 2018

by R. M., T. Santini, T. C. Kübler, E. Kasneci, S. Srinivasa, and H. Admoni

BIB

Real-time 3D Glint Detection in Remote Eye Tracking Based on Bayesian Inference

may, 2018

by David Geisler, Dieter Fox, and Enkelejda Kasneci

PDF BIB

PuRe: Robust Pupil Detection for Real-Time Pervasive Eye Tracking

may, 2018

by T. Santini, W. Fuhl, and E. Kasneci

PDF BIB

The Art of Pervasive Eye Tracking: Unconstrained Eye Tracking in the Austrian Gallery Belvedere

jun, 2018

by T. Santini, H. Brinkmann, L. Reitstätter, H. Leder, R. Rosenberg, W. Rosenstiel, and E. Kasneci

BIB

Teachers’ Perception in the Classroom

jun, 2018

by Ö. Sümer, P. Goldberg, K. Stürmer, T. Seidel, P. Gerjets, U. Trautwein, and E. Kasneci

BIB

PuReST: Robust Pupil Tracking for Real-Time Pervasive Eye Tracking

jun, 2018

by T. Santini, W. Fuhl, and E. Kasneci

PDF BIB

Overlooking: The nature of gaze behavior and anomaly detection in expert dentists

oct, 2018

by Nora Castner, Solveig Klepper, Lena Kopnarski, Fabian Hüttig, Constanze Keutel, Katharina Scheiter, Juliane Richter, T. Eder, and Enkelejda Kasneci

PDF BIB

CBF: Circular binary features for robust and real-time pupil center detection

jun, 2018

by W. Fuhl, D. Geisler, T. Santini, T. Appel, W. Rosenstiel, and E. Kasneci

PDF BIB

Development and Evaluation of a Gaze Feedback System Integrated into EyeTrace

jun, 2018

by K. Otto, N. Castner, D. Geisler, and E. Kasneci

PDF BIB

An Inconspicuous and Modular Head-Mounted Eye Tracker

jun, 2018

by S. Eivazi, T. Kübler, T. Santini, and E. Kasneci

BIB

Automatic generation of saliency-based areas of interest

sep, 2018

by W. Fuhl, T. Kübler, T. Santini, and E. Kasneci

PDF BIB

Region of interest generation algorithms for eye tracking data

jun, 2018

by W. Fuhl, T. C. Kübler, H. Brinkmann, R. Rosenberg, W. Rosenstiel, and E. Kasneci

PDF BIB

Scanpath comparison in medical image reading skills of dental students

jun, 2018

by N. Castner, E. Kasneci, T. C. Kübler, K. Scheiter, and J. Richter

PDF BIB

MAM: Transfer learning for fully automatic video annotation and specialized detector creation

2018

by W. Fuhl, N. Castner, L. Zhuang, M. Holzer, W. Rosenstiel, and E. Kasneci

PDF BIB

Eye movement velocity and gaze data generator for evaluation, robustness testing and assess of eye tracking software and visualization tools

2018

by W. Fuhl and E. Kasneci

PDF BIB

BORE: Boosted-oriented edge optimization for robust, real time remote pupil center detection

2018

by W. Fuhl, S. Eivazi, B. Hosp, A. Eivazi, W. Rosenstiel, and E. Kasneci

PDF BIB

Rule based learning for eye movement type detection

2018

by W. Fuhl, N. Castner, and E. Kasneci

PDF BIB

Histogram of oriented velocities for eye movement detection

2018

by W. Fuhl, N. Castner, and E. Kasneci

PDF BIB

Eye movement simulation and detector creation to reduce laborious parameter adjustments

2018

by W. Fuhl, T. Santini, T. Kuebler, N. Castner, W. Rosenstiel, and E. Kasneci

PDF BIB

Online recognition of driver-activity based on visual scanpath classification

2017

by Christian Braunagel, David Geisler, Wolfgang Rosenstiel, and Enkelejda Kasneci

BIB

Eye tracking as a tool to evaluate functional ability in everyday tasks in glaucoma

2017

by E. Kasneci, A. A., and J. M.

BIB

Saliency Sandbox: Bottom-Up Saliency Framework

feb, 2017

by D. Geisler, W. Fuhl, T. Santini, and E. Kasneci

PDF BIB

EyeRecToo: Open-Source Software for Real-Time Pervasive Head-Mounted Eye-Tracking

feb, 2017

by T. Santini, W. Fuhl, D. Geisler, and E. Kasneci

PDF BIB

EyeLad: Remote Eye Tracking Image Labeling Tool

feb, 2017

by W. Fuhl, T. Santini, D. Geisler, T. C. Kübler, and E. Kasneci

PDF BIB

Fast and Robust Eyelid Outline and Aperture Detection in Real-World Scenarios

mar, 2017

by W. Fuhl, T. Santini, and E. Kasneci

PDF BIB

Aggregating physiological and eye tracking signals to predict perception in the absence of ground truth

mar, 2017

by E. Kasneci, T. C. Kübler, K. Broelemann, and G. Kasneci

BIB

Ways of improving the precision of eye tracking data: Controlling the influence of dirt and dust on pupil detection

may, 2017

by W. Fuhl, T. C. Kübler, D. Hospach, O. Bringmann, W. Rosenstiel, and E. Kasneci

PDF BIB

CalibMe: Fast and Unsupervised Eye Tracker Calibration for Gaze-Based Pervasive Human-Computer Interaction

may, 2017

by T. Santini, W. Fuhl, and E. Kasneci

PDF BIB

Towards pervasive eye tracking

aug, 2017

by E. Kasneci

BIB

Using Eye Tracking to Evaluate and Develop Innovative Teaching Strategies for Fostering Image Reading Skills of Novices in Medical Training

sep, 2017

by N. Castner, S. Eivazi, K. Scheiter, and E. Kasneci

PDF BIB

Automatic Mapping of Remote Crowd Gaze to Stimuli in the Classroom

sep, 2017

by T. Santini, T. C. Kübler, L. Draghetti, P. Gerjets, W. Wagner, U. Trautwein, and E. Kasneci

BIB

Towards Intelligent Surgical Microscopes: Surgeons Gaze and Instrument Tracking

mar, 2017

by Shahram Eivazi, Wolfgang Fuhl, and Enkelejda Kasneci

PDF BIB

Towards automatic skill evaluation in microsurgery

mar, 2017

by Shahram Eivazi, Michael Slupina, Wolfgang Fuhl, Hoorieh Afkari, Ahmad Hafez, and Enkelejda Kasneci

PDF BIB

Monitoring Response Quality During Campimetry Via Eye-Tracking

mar, 2017

by G. Dambros, J. Ungewiss, T. C. Kübler, E. Kasneci, and M. Spüler

BIB

PupilNet v2.0: Convolutional Neural Networks for Robust Pupil Detection

2017

by W. Fuhl, T. Santini, G. Kasneci, and E. Kasneci

PDF BIB

Fast camera focus estimation for gaze-based focus control

2017

by W. Fuhl, T. Santini, and E. Kasneci

PDF BIB

Optimal eye movement strategies: a comparison of neurosurgeons gaze patterns when using a surgical microscope

2017

by S. Eivazi, A. Hafez, W. Fuhl, H. Afkari, E. Kasneci, M. Lehecka, and R. Bednarik

PDF BIB

EyeRec: An Open-source Data Acquisition Software for Head-mounted Eye-tracking

feb, 2016

by T. Santini, W. Fuhl, T. C. Kübler, and E. Kasneci

PDF BIB

3D Gaze Estimation Using Eye Vergence

feb, 2016

by E.G. Mlot, H. Bahmani, S. Wahl, and E. Kasneci

BIB

Rendering refraction and reflection of eyeglasses for synthetic eye tracker images

mar, 2016

by T. C. Kübler, T. Rittig, J. Ungewiss, C. Krauss, and E. Kasneci

BIB

On the necessity of adaptive eye movement classification in conditionally automated driving scenarios

mar, 2016

by C. Braunagel, D. Geisler, W. Stolzmann, W. Rosenstiel, and E. Kasneci

PDF BIB

ElSe: Ellipse Selection for Robust Pupil Detection in Real-World Environments

mar, 2016

by W. Fuhl, T. Santini, T. C. Kübler, and E. Kasneci

PDF BIB

Bayesian Identification of Fixations, Saccades, and Smooth Pursuits

mar, 2016

by T. Santini, W. Fuhl, T. C. Kübler, and E. Kasneci

PDF BIB

Pupil detection for head-mounted eye tracking in the wild: An evaluation of the state of the art

jun, 2016

by Wolfgang Fuhl, Marc Tonsen, Andreas Bulling, and Enkelejda Kasneci

PDF BIB

SubsMatch 2.0: Scanpath comparison and classification based on subsequence frequencies

jul, 2016

by T. C. Kübler, C. Rothe, U. Schiefer, W. Rosenstiel, and E. Kasneci

PDF BIB

Eyes Wide Open? Eyelid Location and Eye Aperture Estimation for Pervasive Eye Tracking in Real-World Scenarios

sep, 2016

by W. Fuhl, T. Santini, D. Geisler, T. C. Kübler, W. Rosenstiel, and E. Kasneci

PDF BIB

Brightness- and Motion-Based Blink Detection for Head-Mounted Eye Trackers

sep, 2016

by T. Appel, T. Santini, and E. Kasneci

PDF BIB

Novel methods for analysis and visualization of saccade trajectories

oct, 2016

by T. C. Kübler, W. Fuhl, R. Rosenberg, W. Rosenstiel, and E. Kasneci

PDF BIB

Non-Intrusive Practitioner Pupil Detection for Unmodified Microscope Oculars

dec, 2016

by W. Fuhl, T. Santini, C. Reichert, D. Claus, A. Herkommer, H. Bahmani, K. Rifai, S. Wahl, and E. Kasneci

PDF BIB

Evaluation of State-of-the-Art Pupil Detection Algorithms on Remote Eye Images

sep, 2016

by W. Fuhl, D. Geisler, T. Santini, and E. Kasneci

PDF BIB

Feature-based attentional influences on the accommodation response

2016

by H. Bahmani, W. Fuhl, E. Gutierrez, G. Kasneci, E. Kasneci, and S. Wahl

BIB

PupilNet: Convolutional Neural Networks for Robust Pupil Detection

2016

by W. Fuhl, T. Santini, G. Kasneci, and E. Kasneci

PDF BIB

Towards Automated Scan Pattern Analysis for Dynamic Scenes

2015

by J. Ungewiss, T. C. Kübler, D.R. Bukenberger, E. Kasneci, and U. Schiefer

BIB

Online Recognition of Fixations, Saccades, and Smooth Pursuits for Automated Analysis of Traffic Hazard Perception

2015

by E. Kasneci, G. Kasneci, T. C. Kübler, and W. Rosenstiel

BIB

Analysis of eye movements with Eyetrace

2015

by T. C. Kübler, K. Sippel, W. Fuhl, G. Schievelbein, J. Aufreiter, R. Rosenberg, W. Rosenstiel, and E. Kasneci

PDF BIB

Exploiting the potential of eye movements analysis in the driving context

mar, 2015

by E. Kasneci, T. C. Kübler, C. Braunagel, W. Fuhl, W. Stolzmann, and W. Rosenstiel

PDF BIB

ExCuSe: Robust Pupil Detection in Real-World Scenarios

sep, 2015

by W. Fuhl, T. C. Kübler, K. Sippel, W. Rosenstiel, and E. Kasneci

PDF BIB

Driver-Activity Recognition in the Context of Conditionally Autonomous Driving

sep, 2015

by C. Braunagel, E. Kasneci, W. Stolzmann, and W. Rosenstiel

BIB

Automated Comparison of Scanpaths in Dynamic Scenes

sep, 2015

by T. C. Kübler and E. Kasneci

BIB

Automated Visual Scanpath Analysis Reveals the Expertise Level of Micro-neurosurgeons

oct, 2015

by T. C. Kübler, S. Eivazi, and E. Kasneci

PDF BIB

Driving with Glaucoma: Task Performance and Gaze Movements

nov, 2015

by T. C. Kübler, E. Kasneci, K. Aehling, M. Heister, W. Rosenstiel, U. Schiefer, and E. Papageorgiou

BIB

Driving with Homonymous Visual Field Defects: Driving Performance and Compensatory Gaze Movements

dec, 2015

by T. C. Kübler, E. Kasneci, W. Rosenstiel, K. Aehling, M. Heister, K. Nagel, U. Schiefer, and E. Papageorgiou

BIB

Arbitrarily shaped areas of interest based on gaze density gradient

aug, 2015

by W. Fuhl, T. C. Kübler, K. Sippel, W. Rosenstiel, and E. Kasneci

PDF BIB

Driving with Binocular Visual Field Loss? A Study on a Supervised On-road Parcours with Simultaneous Eye and Head Tracking

feb, 2014

by E. Kasneci, K. Sippel, K. Aehling, M. Heister, W. Rosenstiel, U. Schiefer, and E. Papageorgiou

BIB

The Applicability of Probabilistic Methods to the Online Recognition of Fixations and Saccades in Dynamic Scenes

mar, 2014

by E. Kasneci, G. Kasneci, T. C. Kübler, and W. Rosenstiel

BIB

SubsMatch: Scanpath Similarity in Dynamic Scenes based on Subsequence Frequencies

mar, 2014

by T. C. Kübler, E. Kasneci, and W. Rosenstiel

BIB

Rule-based classification of visual field defects

mar, 2014

by E. Kasneci, G. Kasneci, U. Schiefer, and W. Rosenstiel

BIB

Gaze guidance for the visually impaired

mar, 2014

by T. C. Kübler, E. Kasneci, and W. Rosenstiel

BIB

Stress-indicators and exploratory gaze for the analysis of hazard perception in patients with visual field loss

may, 2014

by T. C. Kübler, E. Kasneci, W. Rosenstiel, U. Schiefer, K. Nagel, and E. Papageorgiou

BIB

Binocular Glaucomatous Visual Field Loss and Its Impact on Visual Exploration - A Supermarket Study

aug, 2014

by K. Sippel, E. Kasneci, K. Aehling, M. Heister, W. Rosenstiel, U. Schiefer, and E. Papageorgiou

BIB

Homonymous Visual Field Loss and its Impact on Visual Exploration - A Supermarket Study

sep, 2014

by E. Kasneci, K. Sippel, K. Aehling, M. Heister, W. Rosenstiel, U. Schiefer, and E. Papageorgiou

BIB

Towards automated comparison of eye-tracking recordings in dynamic scenes

dec, 2014

by T. C. Kübler, D. R., J. Ungewiss, A. Wörner, C. Rothe, U. Schiefer, W. Rosenstiel, and E. Kasneci

BIB

A New Method for Assessing the Exploratory Field of View (EFOV)

feb, 2013

by E. Tafaj, S. Hempel, M. Heister, K. Aehling, J. Dietzsch, F. Schaeffel, W. Rosenstiel, and U. Schiefer

BIB

Online Classification of Eye Tracking Data for Automated Analysis of Traffic Hazard Perception

sep, 2013

by E. Tafaj, T. C. Kübler, G. Kasneci, W. Rosenstiel, and M. Bogdan

BIB

Auswirkungen des visuellen Explorationsverhaltens von Patienten mit binokularen Gesichtsfelddefekten auf alltagsrelevante Tätigkeiten - Ergebnisse der TUTOR-Studie

sep, 2013

by U. Schiefer, T. C. Kübler, M. Heister, K. Aehling, K. Sippel, E. Papageorgiou, W. Rosenstiel, and E. Tafaj

BIB

Towards the Automated Recognition of Assistance Need for Drivers with Impaired Visual Field

oct, 2013

by E. Kasneci

BIB

Erste Ergebnisse der TUTOR-Pilotstudie: Binokulare Gesichtsfeldausfälle und deren Auswirkungen auf die visuelle Exploration

2012

by U. Schiefer, K. Sippel, M. Heister, K. Aehling, C. Heine, K. Januschowski, E. Papageorgiou, W. Rosenstiel, and E. Tafaj

BIB

Bayesian Online Clustering of Eye-Tracking Data

2012

by E. Tafaj, G. Kasneci, W. Rosenstiel, and M. Bogdan

PDF BIB

Reliable Classification of visual field defects in automated perimetry using clustering

feb, 2011

by E. Tafaj, J. Dietzsch, U. Schiefer, W. Rosenstiel, and M. Bogdan

BIB

Methode zur Messung der physiologischen Blendung im Fahrsimulator

mar, 2011

by V. Melcher, M. Aust, O. Stefani, E. Lösch, H. Wilhelm, and E. Tafaj

BIB

Fast extraction of neuron morphologies from large-scale electron-microscopic image stacks

mar, 2011

by S. Lang, P. Drouvelis, E. Tafaj, P. Bastian, and B. Sakmann

BIB

Zukünftige Fahrzeuge adaptieren sich auf den Fahrer: Identifikation charakteristischer Verhaltensmerkmale von Fahrzeugführern unter demographischen Gesichtspunkten

apr, 2011

by E. Tafaj, P. Rumbolz, M. Bogdan, J. Wiedemann, and W. Rosenstiel

BIB

Vishnoo - An Open-Source Software for Vision Research

jun, 2011

by E. Tafaj, T. C. Kübler, J. Peter, U. Schiefer, and M. Bogdan

BIB

Mobile and fast detection of visual field defects for elderly drivers as a necessary input into driver assistance systems for mobility maintenance

2010

by E. Tafaj, C. Uebber, J. Dietzsch, U. Schiefer, M. Bogdan, and W. Rosenstiel

BIB

Introduction of a Portable Campimeter Based on a Laptop/Tablet PC

2010

by E. Tafaj, C. Uebber, J. Dietzsch, U. Schiefer, M. Bogdan, and W. Rosenstiel

BIB

Fahrzeugentwicklung für eine Gesellschaft im demografischen Wandel. H. Häfner, K. Beyreuther, W. Schlicht (Hrsg.). Altern gestalten, Medizin, Technik, Umwelt

2010

by J. Wiedemann, M. Horn, W. Rosenstiel, and E. Tafaj

BIB

Research

Blink Detection

Blinks are an indicator of fatigue or drowsiness and can assist in the diagnosis of mental disorders such as schizophrenia. In addition, a blink that obstructs the pupil impairs the performance of other eye-tracking algorithms, such as pupil detection, and often introduces noise into the gaze estimation. The algorithm presented here is tailored towards head-mounted eye trackers and is robust to calibration-based variations such as translation or rotation of the eye. The proposed approach reaches 96.35% accuracy on a realistic and challenging data set and runs in real time even on low-end devices, making it well suited for pervasive eye tracking.

Learn More
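
To make the idea concrete, here is a minimal, illustrative sketch of brightness-based blink detection on eye-camera frames. It is a toy example with assumed threshold values, not the published algorithm:

    import cv2
    import numpy as np

    def detect_blinks(video_path, deviation_fraction=0.25):
        """Toy brightness-based blink detector: flags frames whose mean
        intensity deviates strongly from a running open-eye baseline,
        as happens when the eyelid covers the dark pupil region."""
        cap = cv2.VideoCapture(video_path)
        baseline, blinks, frame_idx = None, [], 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            brightness = float(np.mean(gray))
            if baseline is None:
                baseline = brightness
            if abs(brightness - baseline) > deviation_fraction * baseline:
                blinks.append(frame_idx)          # frame classified as "closed"
            else:
                # slowly update the open-eye baseline (exponential smoothing)
                baseline = 0.95 * baseline + 0.05 * brightness
            frame_idx += 1
        cap.release()
        return blinks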

Eye labeling tool

Ground-truth data is an important prerequisite for the development and evaluation of many algorithms in the area of computer vision, especially those based on convolutional neural networks or other machine learning approaches that unfold their power mostly through supervised learning. Such learning relies on ground-truth data, which is laborious, tedious, and error-prone for humans to generate. In this paper, we contribute a labeling tool (EyeLad) specifically designed for remote eye-tracking data to enable researchers to leverage machine-learning-based approaches in this field, which is of great interest for automotive, medical, and human-computer interaction applications. The tool is multi-platform and supports a variety of state-of-the-art detection and tracking algorithms, including eye detection, pupil detection, and coarse eyelid positioning.

Learn More
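
As a rough illustration of the kind of ground truth such a tool produces, the snippet below writes one hypothetical per-frame annotation record; the field names and file layout are assumptions, not EyeLad's actual format:

    import json

    # Hypothetical per-frame label record (field names are illustrative only).
    annotation = {
        "frame": 1042,
        "eye_bbox": [312, 180, 96, 64],      # x, y, width, height of the detected eye
        "pupil_ellipse": {"cx": 356.2, "cy": 210.8,
                          "major": 22.5, "minor": 19.1, "angle": 14.0},
        "eyelid_points": [[320, 195], [350, 188], [380, 196]],
        "annotator": "labeler_01",
    }

    with open("frame_1042.json", "w") as f:
        json.dump(annotation, f, indent=2)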

Eye Movements Identification

In this paper, we use fully convolutional neural networks for the semantic segmentation of eye-tracking data. We also use these networks for reconstruction and, in conjunction with a variational autoencoder, to generate eye movement data. The first improvement of our approach is that, thanks to the fully convolutional architecture, no fixed input window is necessary, so input of any size can be processed directly. The second improvement is that the data used and generated is raw eye-tracking data (position x, y and time) without preprocessing. This is achieved by pre-initializing the filters in the first layer and by building the input tensor along the z-axis. We evaluated our approach on three publicly available datasets and compared the results to the state of the art.

Learn More
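
A minimal sketch of such a 1D fully convolutional segmenter in PyTorch is shown below; the layer sizes and the three event classes are assumptions for illustration, not the architecture from the paper:

    import torch
    import torch.nn as nn

    class GazeFCN(nn.Module):
        """Toy fully convolutional segmenter: maps a raw gaze sequence of
        arbitrary length T to one label per sample (e.g. fixation, saccade,
        smooth pursuit). Without dense layers, T does not need to be fixed."""
        def __init__(self, in_channels=3, num_classes=3):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv1d(in_channels, 32, kernel_size=9, padding=4), nn.ReLU(),
                nn.Conv1d(32, 64, kernel_size=9, padding=4), nn.ReLU(),
                nn.Conv1d(64, num_classes, kernel_size=1),   # per-sample logits
            )

        def forward(self, x):      # x: (batch, 3, T) with x, y, t channels
            return self.net(x)     # (batch, num_classes, T)

    # Recordings of different lengths are processed without any windowing.
    model = GazeFCN()
    logits = model(torch.randn(1, 3, 5000))    # one recording, 5000 raw samples
    labels = logits.argmax(dim=1)              # per-sample event class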

EyeRecToo

Head-mounted eye tracking offers remarkable opportunities for research and applications regarding pervasive health monitoring, mental state inference, and human-computer interaction in dynamic scenarios. Although a plethora of software for the acquisition of eye-tracking data exists, it often exhibits critical issues when pervasive eye tracking is considered, e.g., closed source code, costly eye-tracker hardware dependencies, and the need for a human supervisor during calibration. In this paper, we introduce EyeRecToo, an open-source software for real-time pervasive head-mounted eye tracking. Out of the box, EyeRecToo offers multiple real-time state-of-the-art pupil detection and gaze estimation methods, which can easily be replaced by user-implemented algorithms if desired. A novel calibration method that allows users to calibrate the system without the assistance of a human supervisor is also integrated. Moreover, the software supports multiple head-mounted eye-tracking devices, records eye and scene videos, and stores pupil and gaze information, which are also available as a real-time stream. Thus, EyeRecToo serves as a framework to quickly enable pervasive eye-tracking research and applications.

Learn More
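
Conceptually, such a framework composes interchangeable pupil-detection and gaze-estimation stages. The Python sketch below illustrates that plug-in idea only; EyeRecToo itself is a native application, and these interfaces are illustrative assumptions rather than its actual API:

    from typing import Protocol, Tuple
    import numpy as np

    class PupilDetector(Protocol):
        def detect(self, eye_image: np.ndarray) -> Tuple[float, float]: ...

    class GazeEstimator(Protocol):
        def estimate(self, pupil_center: Tuple[float, float]) -> Tuple[float, float]: ...

    class EyeTrackingPipeline:
        """Illustrative pluggable pipeline: any detector/estimator pair that
        satisfies the protocols above can be swapped in without other changes."""
        def __init__(self, detector: PupilDetector, estimator: GazeEstimator):
            self.detector = detector
            self.estimator = estimator

        def process(self, eye_image: np.ndarray) -> Tuple[float, float]:
            pupil = self.detector.detect(eye_image)   # pupil center in the eye image
            return self.estimator.estimate(pupil)     # gaze point in the scene image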

Eyetrace

Eyetrace is a tool for the analysis of eye-tracking data. It bundles a variety of evaluation methods and supports a large share of available eye trackers, with the aim of supporting scientific work and medical diagnosis. To make Eyetrace compatible with different eye trackers, an additional tool called Eyetrace Butler is used. The Eyetrace Butler performs data preprocessing and conversion for analysis with Eyetrace: it provides plugins for different eye trackers and converts their data into a format that can be imported and used by Eyetrace.

Learn More
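
The Butler's conversion step can be pictured roughly as below; the vendor column names and the unified x, y, t target format are assumptions made for illustration:

    import csv

    # Hypothetical mapping from one vendor's CSV columns to a common schema.
    COLUMN_MAP = {"GazePointX": "x", "GazePointY": "y", "RecordingTimestamp": "t"}

    def convert_recording(src_path, dst_path):
        """Toy converter plugin: reads a vendor-specific CSV and writes the
        unified x, y, t format expected by the analysis tool."""
        with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=["x", "y", "t"])
            writer.writeheader()
            for row in reader:
                writer.writerow({common: row[vendor]
                                 for vendor, common in COLUMN_MAP.items()})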

Intelligent Surgical Microscope

This project investigates surgeons' gaze behavior and instrument tracking at the surgical microscope, with the goal of automatic skill evaluation in microsurgery.

Learn More

Robust Pupil Detection and Gaze Estimation

The reliable estimation of the pupil position in eye images is perhaps the most important prerequisite in gaze-based HMI applications. While many approaches enable accurate pupil tracking under laboratory conditions, tracking the pupil in real-world images is highly challenging due to changes in illumination, reflections on glasses or on the eyeball, off-axis camera positions, contact lenses, and many other factors.

Learn More
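
For contrast with the robust detectors developed in the group (e.g. ExCuSe, ElSe, PuRe, PuReST), the naive baseline below illustrates why simple thresholding plus ellipse fitting breaks down outside the lab; the threshold value is an assumption, and this is not one of the published algorithms:

    import cv2

    def naive_pupil_detection(eye_image_gray, threshold=40):
        """Toy baseline: threshold the dark pupil region and fit an ellipse to
        the largest blob. Adequate for clean lab images, but it fails under
        reflections, make-up, off-axis views, or changing illumination."""
        _, binary = cv2.threshold(eye_image_gray, threshold, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)  # OpenCV 4
        if not contours:
            return None
        largest = max(contours, key=cv2.contourArea)
        if len(largest) < 5:                    # cv2.fitEllipse needs >= 5 points
            return None
        (cx, cy), (major, minor), angle = cv2.fitEllipse(largest)
        return cx, cy, major, minor, angle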

Scanpath Comparison

Our eye movements are driven by a continuous trade-off between the need for detailed examination of objects of interest and the necessity to keep an overview of our surroundings. In consequence, behavioral patterns that are characteristic of our actions and their planning typically manifest in the way we move our eyes to interact with our environment. Identifying such patterns from individual eye movement measurements is, however, highly challenging.

Learn More
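
One family of methods, in the spirit of the SubsMatch work listed above, encodes a scanpath as a string over areas of interest and compares subsequence (n-gram) frequency histograms. The sketch below is a simplified illustration of that idea, not the published implementation:

    from collections import Counter

    def ngram_histogram(aoi_sequence, n=2):
        """Relative frequencies of the length-n subsequences of an AOI string,
        e.g. 'ABACA' -> {'AB': 0.25, 'BA': 0.25, 'AC': 0.25, 'CA': 0.25}."""
        grams = [aoi_sequence[i:i + n] for i in range(len(aoi_sequence) - n + 1)]
        counts = Counter(grams)
        total = sum(counts.values())
        return {g: c / total for g, c in counts.items()}

    def scanpath_distance(seq_a, seq_b, n=2):
        """L1 distance between two n-gram histograms (0 means identical mix)."""
        ha, hb = ngram_histogram(seq_a, n), ngram_histogram(seq_b, n)
        return sum(abs(ha.get(k, 0.0) - hb.get(k, 0.0)) for k in set(ha) | set(hb))

    # Two viewers' fixation sequences over areas of interest A-D:
    print(scanpath_distance("ABACADAB", "ABABACAD"))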

Vishnoo - A Visual Search Examination Tool

Vishnoo (Visual Search Examination Tool) is an integrated framework that combines configurable search tasks with gaze-tracking capabilities, thus enabling the analysis of both the visual field and visual attention.

Learn More
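
A configurable search task might be described by a small configuration such as the one below; the fields are purely illustrative assumptions, not Vishnoo's actual configuration format:

    # Hypothetical description of one visual search task with gaze recording.
    search_task = {
        "name": "find_the_target_letter",
        "stimulus": {"grid_size": [8, 6], "distractor": "O", "target": "Q"},
        "trials": 20,
        "max_trial_duration_s": 10,
        "record_gaze": True,            # gaze samples are logged with responses
        "gaze_sampling_rate_hz": 60,
    }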

Teaching

Course
Forschungsprojekt AB E. Kasneci
Multimodale Mensch-Maschine Interaktion
Proseminar Grundlagen Mensch-Maschine Interaktion
Seminar Advanced Topics in Perception Engineering
Advanced Topics in Perception Engineering
Computational Models of Visual Attention
Eye Movements and Visual Perception
Eye Movements and Visual Perception
Eye Movements and Visual Perception
Grundlagen der Multimediatechnik
Praktikum zu Grundlagen der Multimediatechnik
Programmieren mobiler eingebetteter Systeme
Programmieren mobiler eingebetteter Systeme
Programmieren mobiler eingebetteter Systeme
Programmieren mobiler eingebetteter Systeme
Technische Anwendungen der Informatik: Hard- und Software aktueller Eye-Tracking-Systeme
Technische Anwendungen der Informatik: Hard- und Software aktueller Eye-Tracking-Systeme
Technische Anwendungen der Informatik: Hard- und Software aktueller Eye-Tracking-Systeme
Seminar: Advanced Topics in Perception Engineering
Eye Tracking in Mobile Computing and Virtual Reality

Open Thesis Topics

10.07.2019

Automated Scanpath Comparison

Humans can direct their attention deliberately. One way to record this allocation of attention is eye tracking: objects that are currently of interest are fixated with the eyes. Often it is not only of interest where individual viewers direct their attention, but also how this compares across viewers or across different points in time. For this purpose, the recordings are reduced to simple elements: fixations and saccades (rapid eye movements). The temporal sequence and spatial position of fixations and saccades is called a scanpath. Current comparison methods are based on two different approaches:

Read more …
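
Before scanpaths can be compared, raw gaze samples must first be reduced to fixations and saccades, as described above. The sketch below shows a simplified dispersion-based fixation detector (in the spirit of the classic I-DT scheme) with assumed threshold values; it is only an illustration, not a method prescribed for the thesis:

    def detect_fixations(samples, max_dispersion_px=30, min_duration_ms=100):
        """Toy dispersion-based fixation detector. `samples` is a list of
        (x, y, t_ms) tuples; returns (start_ms, end_ms, centroid) per fixation."""
        fixations, window = [], []
        for s in samples:
            window.append(s)
            xs = [p[0] for p in window]
            ys = [p[1] for p in window]
            dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
            if dispersion > max_dispersion_px:
                # Window is no longer compact: close the previous fixation
                # if it lasted long enough, then start a new window.
                done = window[:-1]
                if done and done[-1][2] - done[0][2] >= min_duration_ms:
                    cx = sum(p[0] for p in done) / len(done)
                    cy = sum(p[1] for p in done) / len(done)
                    fixations.append((done[0][2], done[-1][2], (cx, cy)))
                window = [s]
        return fixations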

News

08.07.2020

Liebherr-Elektronik GmbH funds research project at the University of Tübingen

Liebherr-Elektronik GmbH and the University of Tübingen are researching efficient machine learning methods for use in real-time critical automation tasks. The cooperation, which is funded by Liebherr-Elektronik, is headed by Prof. Dr. Enkelejda Kasneci, holder of the chair for human-machine interaction.

Read more ...

24.06.2020

Paper accepted at the International Conference on Pattern Recognition 2020 (ICPR)

The paper “Fully Convolutional Neural Networks for Raw Eye Tracking Data Segmentation, Generation, and Reconstruction” was accepted at the International Conference on Pattern Recognition 2020.

Read more ...

06.06.2020

Best paper award at the 2020 Eye Tracking Research and Applications (ETRA) conference

The paper “A MinHash approach for fast scanpath classification” received the best paper award at the 2020 Eye Tracking Research and Applications conference (ETRA).

Read more ...