Human Computer Interaction

Dr. Wolfgang Fuhl

University of Tübingen
Dpt. of Computer Science
Human-Computer Interaction
Sand 14
72076 Tübingen
Germany

Telephone
+49 - (0) 70 71 - 29 - 70492
Fax
+49 - (0) 70 71 - 29 - 50 62
E-Mail
wolfgang.fuhl@uni-tuebingen.de
Office
Sand 14, C206
Office hours
by appointment

Research Interests:

  • Computer Vision (Classical image processing, Rule-based algorithms, Shape estimation, 3D pose estimation, Detection, Classification, Segmentation, 3D reconstruction, Image generation, Modern image features (HOG, MSER, SIFT), Real-time systems)
  • Machine Learning (Deep Neural Networks (Residual, Inception, Combinations, Recurrent), Tiny Convolutional Neural Networks, Real-time Neural Networks (XOR, Binary, Tree), Unsupervised learning (Autoencoders, PCA), Support vector machines, Optimization, Decision trees, KNN, GMM, Transfer learning, Probabilistic models (Naive Bayes, HMM, CRF, Graphical models), Bagging, Boosting, Rule learning, PCA networks, Scattering networks, Clustering, Curve/Function fitting, Evolutionary algorithms)
  • Eye Tracking (Real-time feature extraction algorithms, Feature-based gaze estimation, Appearance-based gaze estimation, AOI generation, Scan path classification, Eye movement detection, Usability of eye tracking software, Data visualization)
  • Visualization (3D rendering, Data reduction, Interactive visualizations, Splines)
  • Hardware (FPGAs, Raspberry Pi, Mobile phones, NPU, GPU, CPU)

Assigned Thesis Topics:

  • Arduino Cloud for Mobile Phones, Benedikt Hosp (M.Sc.)
  • Implementation And Evaluation Of Methods For Object Recognition, Sebastian Lutz (M.Sc.)
  • EyeTrace - Saliency AOI Generation, Ying Meng (M.Sc.)
  • Comparison of Attention Models Based on Dynamic Driving Scenes, Erik Lemke (B.Sc.)
  • Extraction of the Blood Vessels of the Eye from Close-up Images, Sotirios Pavlidis (B.Sc.)
  • Evaluation of Machine Learning Algorithms for Assessing Programming Proficiency, Christian Hackenbeck (B.Sc.)
    PDF
  • Evaluation and Implementation of Established Real-time Image Processing Features for Use in Web Browsers, Amr Abdellatif (B.Sc.)
    PDF
  • Evaluation and Implementation of Real-time-Capable Machine Learning Algorithms for Use in Web Browsers, Hao Liu (B.Sc.)
    PDF
  • Creation and Evaluation of a Web-Browser-Based Framework for Data Acquisition in Studies, Roufayda Salaheddine (B.Sc. in progress)
  • Creation and Evaluation of a Web-Platform-Based Data Management and Evaluation Software for Studies, Oliwia Oles (B.Sc. in progress)
  • Improve Browser Watermarking with Eye Tracking, Nikolai Iraj Sanamrad (B.Sc.)
    PDF
  • Scan path Classification in dynamic scenes, Fadi Al-kayid (B.Sc. in progress)
  • Gaze based tessellation of objects in 3D scenes, Eric Goofers (M.Sc. in progress)

Publications

2020

RemoteEye: An open-source high-speed remote eye tracker

Benedikt Hosp, Shahram Eivazi, Maximilian Maurer, Wolfgang Fuhl, David Geisler, and Enkelejda Kasneci. Behavior Research Methods, pages 1–15. Springer, 2020.

PDF BIB

Training Decision Trees as Replacement for Convolution Layers

W. Fuhl, G. Kasneci, W. Rosenstiel, and E. Kasneci. Conference on Artificial Intelligence, AAAI, 2020.

PDF BIB

Tiny convolution, decision tree, and binary neuronal networks for robust and real time pupil outline estimation

W. Fuhl, H. Gao, and E. Kasneci. ACM Symposium on Eye Tracking Research & Applications, ETRA 2020. ACM, 2020.

PDF BIB

Neural networks for optical vector and eye ball parameter estimation

W. Fuhl, H. Gao, and E. Kasneci. ACM Symposium on Eye Tracking Research & Applications, ETRA 2020. ACM, 2020.

PDF BIB

A Novel Camera-Free Eye Tracking Sensor for Augmented Reality based on Laser Scanning

Johannes Meyer, Thomas Schlebusch, Wolfgang Fuhl, and Enkelejda Kasneci. IEEE Sensors Journal, pages 1–1, 2020.

PDF BIB

Fully Convolutional Neural Networks for Raw Eye Tracking Data Segmentation, Generation, and Reconstruction

Wolfgang Fuhl, Yao Rong, and Enkelejda Kasneci. Proceedings of the International Conference on Pattern Recognition, pages 0–0, 2020.

PDF BIB

Explainable Online Validation of Machine Learning Models for Practical Applications

Wolfgang Fuhl, Yao Rong, Thomas Motz, Michael Scheidt, Andreas Hartel, Andreas Koch, and Enkelejda Kasneci. Proceedings of the International Conference on Pattern Recognition, pages 0–0, 2020.

PDF BIB

Multi Layer Neural Networks as Replacement for Pooling Operations

Wolfgang Fuhl and Enkelejda Kasneci. arXiv preprint arXiv:2006.06969. CoRR, 2020.

PDF BIB

Reinforcement learning for the privacy preservation and manipulation of eye tracking data

Wolfgang Fuhl, Efe Bozkir, and Enkelejda Kasneci. arXiv preprint arXiv:2002.06806. CoRR, 2020.

BIB

Differential Privacy for Eye Tracking with Temporal Correlations

Efe Bozkir, Onur Günlü, Wolfgang Fuhl, Rafael F. Schaefer, and Enkelejda Kasneci. arXiv preprint arXiv:2002.08972. CoRR, 2020.

BIB

Weight and Gradient Centralization in Deep Neural Networks

Wolfgang Fuhl and Enkelejda Kasneci. arXiv. CoRR, 2020.

PDF BIB

Rotated Ring, Radial and Depth Wise Separable Radial Convolutions

Wolfgang Fuhl and Enkelejda Kasneci. arXiv. CoRR, 2020.

PDF BIB Supplementary Material

2019

Encodji: Encoding Gaze Data Into Emoji Space for an Amusing Scanpath Classification Approach ;)

Wolfgang Fuhl, Efe Bozkir, Benedikt Hosp, Nora Castner, David Geisler, Thiago C., and Enkelejda Kasneci. Eye Tracking Research and Applications, 2019.

PDF BIB

Ferns for area of interest free scanpath classification

W. Fuhl, N. Castner, T. C. Kübler, A. Lotz, W. Rosenstiel, and E. Kasneci. Proceedings of the 2019 ACM Symposium on Eye Tracking Research & Applications (ETRA), 2019.

PDF BIB

Image-based extraction of eye features for robust eye tracking

W. Fuhl. PhD thesis. University of Tübingen, 2019.

PDF BIB

500,000 images closer to eyelid and pupil segmentation

W. Fuhl, W. Rosenstiel, and E. Kasneci. Computer Analysis of Images and Patterns, CAIP, 2019.

PDF BIB

The applicability of Cycle GANs for pupil and eyelid segmentation, data generation and image refinement

W. Fuhl, D. Geisler, W. Rosenstiel, and E. Kasneci. International Conference on Computer Vision Workshops, ICCVW, 2019.

PDF BIB

Learning to validate the quality of detected landmarks

W. Fuhl and E. Kasneci. International Conference on Machine Vision, ICMV, 2019.

PDF BIB

2018

PuRe: Robust Pupil Detection for Real-Time Pervasive Eye Tracking

T. Santini, W. Fuhl, and E. Kasneci. Elsevier Computer Vision and Image Understanding, to appear, 2018.

PDF BIB

PuReST: Robust Pupil Tracking for Real-Time Pervasive Eye Tracking

T. Santini, W. Fuhl, and E. Kasneci. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA), 2018.

PDF BIB

CBF: Circular binary features for robust and real-time pupil center detection

W. Fuhl, D. Geisler, T. Santini, T. Appel, W. Rosenstiel, and E. Kasneci. ACM Symposium on Eye Tracking Research & Applications, 2018.

PDF BIB

Automatic generation of saliency-based areas of interest

W. Fuhl, T. Kübler, T. Santini, and E. Kasneci. Symposium on Vision, Modeling and Visualization (VMV), 2018.

PDF BIB

Region of interest generation algorithms for eye tracking data

W. Fuhl, T. C. Kübler, H. Brinkmann, R. Rosenberg, W. Rosenstiel, and E. Kasneci. Third Workshop on Eye Tracking and Visualization (ETVIS), in conjunction with ACM ETRA, 2018.

PDF BIB

MAM: Transfer learning for fully automatic video annotation and specialized detector creation

W. Fuhl, N. Castner, L. Zhuang, M. Holzer, W. Rosenstiel, and E. Kasneci. International Conference on Computer Vision Workshops, ICCVW, 2018.

PDF BIB

Eye movement velocity and gaze data generator for evaluation, robustness testing and assess of eye tracking software and visualization tools

W. Fuhl and E. Kasneci. Poster at Egocentric Perception, Interaction and Computing, EPIC, 2018.

PDF BIB

BORE: Boosted-oriented edge optimization for robust, real time remote pupil center detection

W. Fuhl, S. Eivazi, B. Hosp, A. Eivazi, W. Rosenstiel, and E. Kasneci. Eye Tracking Research and Applications, ETRA, 2018.

PDF BIB

Rule based learning for eye movement type detection

W. Fuhl, N. Castner, and E. Kasneci. International Conference on Multimodal Interaction Workshops, ICMIW, 2018.

PDF BIB

Histogram of oriented velocities for eye movement detection

W. Fuhl, N. Castner, and E. Kasneci. International Conference on Multimodal Interaction Workshops, ICMIW, 2018.

PDF BIB

Eye movement simulation and detector creation to reduce laborious parameter adjustments

W. Fuhl, T. Santini, T. Kuebler, N. Castner, W. Rosenstiel, and E. Kasneci. arXiv preprint arXiv:1804.00970, 2018.

PDF BIB

2017

Saliency Sandbox: Bottom-Up Saliency Framework

D. Geisler, W. Fuhl, T. Santini, and E. Kasneci. 12th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017), 2017.

PDF BIB

EyeRecToo: Open-Source Software for Real-Time Pervasive Head-Mounted Eye-Tracking

T. Santini, W. Fuhl, D. Geisler, and E. Kasneci. 12th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017), 2017.

PDF BIB

EyeLad: Remote Eye Tracking Image Labeling Tool

W. Fuhl, T. Santini, D. Geisler, T. C. Kübler, and E. Kasneci. 12th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017), 2017.

PDF BIB

Fast and Robust Eyelid Outline and Aperture Detection in Real-World Scenarios

W. Fuhl, T. Santini, and E. Kasneci. IEEE Winter Conference on Applications of Computer Vision (WACV 2017), 2017.

PDF BIB

Ways of improving the precision of eye tracking data: Controlling the influence of dirt and dust on pupil detection

W. Fuhl, T. C. Kübler, D. Hospach, O. Bringmann, W. Rosenstiel, and E. Kasneci. Journal of Eye Movement Research 10(3), 2017.

PDF BIB

CalibMe: Fast and Unsupervised Eye Tracker Calibration for Gaze-Based Pervasive Human-Computer Interaction

T. Santini, W. Fuhl, and E. Kasneci. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 2017.

PDF BIB

Towards Intelligent Surgical Microscopes: Surgeons Gaze and Instrument Tracking

Shahram Eivazi, Wolfgang Fuhl, and Enkelejda Kasneci. Proceedings of the 22nd International Conference on Intelligent User Interfaces, IUI 2017. ACM, 2017.

PDF BIB

Towards automatic skill evaluation in microsurgery

Shahram Eivazi, Michael Slupina, Wolfgang Fuhl, Hoorieh Afkari, Ahmad Hafez, and Enkelejda Kasneci. Proceedings of the 22nd International Conference on Intelligent User Interfaces, IUI 2017. ACM, 2017.

PDF BIB

PupilNet v2.0: Convolutional Neural Networks for Robust Pupil Detection

W. Fuhl, T. Santini, G. Kasneci, and E. Kasneci. CoRR, 2017.

PDF BIB

Fast camera focus estimation for gaze-based focus control

W. Fuhl, T. Santini, and E. Kasneci. CoRR, 2017.

PDF BIB

Optimal eye movement strategies: a comparison of neurosurgeons' gaze patterns when using a surgical microscope

S. Eivazi, A. Hafez, W. Fuhl, H. Afkari, E. Kasneci, M. Lehecka, and R. Bednarik. Acta Neurochirurgica, 2017.

PDF BIB

2016

EyeRec: An Open-source Data Acquisition Software for Head-mounted Eye-tracking

T. Santini, W. Fuhl, T. C. Kübler, and E. Kasneci. Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP), Volume 3: VISAPP, pages 386–391, 2016.

PDF BIB

ElSe: Ellipse Selection for Robust Pupil Detection in Real-World Environments

W. Fuhl, T. Santini, T. C. Kübler, and E. Kasneci. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (ETRA), pages 123–130, 2016.

PDF BIB

Bayesian Identification of Fixations, Saccades, and Smooth Pursuits

T. Santini, W. Fuhl, T. C. Kübler, and E. Kasneci. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (ETRA), pages 163–170, 2016.

PDF BIB

Pupil detection for head-mounted eye tracking in the wild: An evaluation of the state of the art

Wolfgang Fuhl, Marc Tonsen, Andreas Bulling, and Enkelejda Kasneci. Machine Vision and Applications, pages 1-14, 2016.

PDF BIB

Eyes Wide Open? Eyelid Location and Eye Aperture Estimation for Pervasive Eye Tracking in Real-World Scenarios

W. Fuhl, T. Santini, D. Geisler, T. C. Kübler, W. Rosenstiel, and E. Kasneci. ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct publication – PETMEI 2016, 2016.

PDF BIB

Novel methods for analysis and visualization of saccade trajectories

T. C. Kübler, W. Fuhl, R. Rosenberg, W. Rosenstiel, and E. Kasneci. 3rd ECCV Workshop VISART 2016, 2016.

PDF BIB

Non-Intrusive Practitioner Pupil Detection for Unmodified Microscope Oculars

W. Fuhl, T. Santini, C. Reichert, D. Claus, A. Herkommer, H. Bahmani, K. Rifai, S. Wahl, and E. Kasneci. Elsevier Computers in Biology and Medicine 79: 36-44, 2016.

PDF BIB

Evaluation of State-of-the-Art Pupil Detection Algorithms on Remote Eye Images

W. Fuhl, D. Geisler, T. Santini, and E. Kasneci. ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct publication – PETMEI 2016, 2016.

PDF BIB

Feature-based attentional influences on the accommodation response

H. Bahmani, W. Fuhl, E. Gutierrez, G. Kasneci, E. Kasneci, and S. Wahl. Vision Sciences Society Annual Meeting Abstract, 2016.

BIB

PupilNet: Convolutional Neural Networks for Robust Pupil Detection

W. Fuhl, T. Santini, G. Kasneci, and E. Kasneci. CoRR, 2016.

PDF BIB

2015

Analysis of eye movements with Eyetrace

T. C. Kübler, K. Sippel, W. Fuhl, G. Schievelbein, J. Aufreiter, R. Rosenberg, W. Rosenstiel, and E. Kasneci. Biomedical Engineering Systems and Technologies, Communications in Computer and Information Science (CCIS) 574: 458–471. Springer International Publishing, 2015.

PDF BIB

Eyetrace2014: Eyetracking Data Analysis Tool

K. Sippel, T. C. Kübler, W. Fuhl, G. Schievelbein, R. Rosenberg, and W. Rosenstiel. 8th International Conference on Health Informatics, Healthinf 2015, 2015.

PDF BIB

Exploiting the potential of eye movements analysis in the driving context

E. Kasneci, T. C. Kübler, C. Braunagel, W. Fuhl, W. Stolzmann, and W. Rosenstiel. 15. Internationales Stuttgarter Symposium Automobil- und Motorentechnik. Springer Fachmedien Wiesbaden, 2015.

PDF BIB

ExCuSe: Robust Pupil Detection in Real-World Scenarios

W. Fuhl, T. C. Kübler, K. Sippel, W. Rosenstiel, and E. Kasneci. 16th International Conference on Computer Analysis of Images and Patterns (CAIP 2015), 2015.

PDF BIB

Arbitrarily shaped areas of interest based on gaze density gradient

W. Fuhl, T. C. Kübler, K. Sippel, W. Rosenstiel, and E. Kasneci. European Conference on Eye Movements, ECEM 2015, 2015.

PDF BIB

Teaching

Course
Programming in C/C++
Seminar: Advanced Topics in Human-Computer Interaction
Team project: Machine Learning in the Field of Eye Tracking
Team project: Remote Eye Tracking
User Experience
Advanced Topics in Perception Engineering
Programming Mobile Embedded Systems
Programming Mobile Embedded Systems
Programming Mobile Embedded Systems
Programming Mobile Embedded Systems
Technical Applications of Computer Science: Hardware and Software of Current Eye-Tracking Systems
Technical Applications of Computer Science: Hardware and Software of Current Eye-Tracking Systems
Seminar: Advanced Topics in Perception Engineering

Research

500,000 images closer to eyelid and pupil segmentation

We propose a fully convolutional neural network for pupil and eyelid segmentation as well as eyelid landmark and pupil ellipse regression. The network is jointly trained using the log loss for segmentation and the L1 loss for landmark and ellipse regression. The intended application of the proposed network is the offline processing and creation of datasets, which can then be used to train resource-saving and real-time machine learning algorithms such as random forests. In addition, we provide the world's largest eye image dataset with more than 500,000 images.

Learn More
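As a rough illustration of the joint training objective described above, the following PyTorch sketch combines a log (cross-entropy) loss for the segmentation map with an L1 loss for the landmark and ellipse parameters. The tensor shapes, weighting, and class layout are assumptions for illustration, not the published configuration.

    import torch
    import torch.nn as nn

    class JointSegRegLoss(nn.Module):
        """Log loss for segmentation plus L1 loss for landmark/ellipse regression."""
        def __init__(self, seg_weight=1.0, reg_weight=1.0):
            super().__init__()
            self.seg_loss = nn.CrossEntropyLoss()  # log loss over segmentation classes
            self.reg_loss = nn.L1Loss()            # L1 over landmarks and ellipse parameters
            self.seg_weight = seg_weight
            self.reg_weight = reg_weight

        def forward(self, seg_logits, seg_target, reg_pred, reg_target):
            # seg_logits: (B, C, H, W) network output, seg_target: (B, H, W) class indices
            # reg_pred / reg_target: (B, K) stacked landmark coordinates and ellipse parameters
            return (self.seg_weight * self.seg_loss(seg_logits, seg_target)
                    + self.reg_weight * self.reg_loss(reg_pred, reg_target))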

The applicability of Cycle GANs for pupil and eyelid segmentation, data generation and image refinement

We evaluated Generative Adversarial Networks (GANs) for eyelid and pupil area segmentation, data generation, and image refinement. While the segmentation GAN performs the desired task, the others serve as supportive networks. The trained data generation GAN does not require simulated data to increase the dataset; it simply uses existing data and creates subsets. The purpose of the refinement GAN, in contrast, is to simplify manual annotation by removing noise and occlusion in an image without changing the eye structure and pupil position. In addition, 100,000 pupil and eyelid segmentations are made publicly available for images from the labeled pupils in the wild dataset.

Learn More
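For readers unfamiliar with Cycle GANs, the sketch below shows only the cycle-consistency term that gives them their name, assuming two generator networks G (domain A to B) and F (B to A) defined elsewhere. The names and the weight are illustrative assumptions, not the values used in the paper.

    import torch.nn as nn

    l1 = nn.L1Loss()

    def cycle_consistency_loss(G, F, real_a, real_b, weight=10.0):
        """G maps domain A -> B (e.g. eye image -> segmentation map), F maps B -> A."""
        rec_a = F(G(real_a))  # A -> B -> A should reconstruct the original image
        rec_b = G(F(real_b))  # B -> A -> B should reconstruct the original map
        return weight * (l1(rec_a, real_a) + l1(rec_b, real_b))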

Neural networks for optical vector and eye ball parameter estimation

In this work we evaluate neural networks, support vector machines and decision trees for the regression of the center of the eyeball and the optical vector based on the pupil ellipse. In the evaluation we analyze single ellipses as well as window-based approaches as input. Comparisons are made regarding accuracy and runtime. The evaluation gives an overview of the general expected accuracy with different models and amounts of input ellipses. A simulator was implemented for the generation of the training and evaluation data. For a visual evaluation and to push the state of the art in optical vector estimation, the best model was applied to real data. This real data came from public data sets in which the ellipse is already annotated by an algorithm. The optical vectors on real data and the generator are made publicly available.

Learn More
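A minimal sketch of the window-based regression idea: a window of consecutive pupil-ellipse parameters is flattened into one feature vector and mapped to the eyeball centre and optical vector. The window size, feature layout, and model size are illustrative assumptions, and random arrays stand in for the simulator output.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    WINDOW = 5       # consecutive ellipses per sample (assumed)
    ELLIPSE_DIM = 5  # centre x, centre y, major axis, minor axis, angle

    # X: flattened ellipse windows, y: eyeball centre (x, y, z) + optical vector (x, y, z)
    X = np.random.rand(1000, WINDOW * ELLIPSE_DIM)  # placeholder for simulator data
    y = np.random.rand(1000, 6)

    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
    model.fit(X, y)
    print(model.predict(X[:1]))  # estimated centre and optical vector for one window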

Eye labeling tool

Ground truth data is an important prerequisite for the development and evaluation of many algorithms in the area of computer vision, especially when these are based on convolutional neural networks or other machine learning approaches that unfold their power mostly through supervised learning. This learning relies on ground truth data, which is laborious, tedious, and error-prone for humans to generate. In this paper, we contribute a labeling tool (EyeLad) specifically designed for remote eye-tracking data to enable researchers to leverage machine-learning-based approaches in this field, which is of great interest for automotive, medical, and human-computer interaction applications. The tool is multi-platform and supports a variety of state-of-the-art detection and tracking algorithms, including eye detection, pupil detection, and coarse eyelid positioning.

Learn More

Eye Movements Identification

Methods for the segmentation and synthesis of eye-tracking data using different neural networks and machine learning approaches.

Learn More

Eyetrace

Eyetrace is a tool for the analysis of eye-tracking data. It bundles a variety of evaluation methods for a large share of eye trackers, supporting scientific work and medical diagnosis. To make EyeTrace compatible with different eye trackers, an additional tool called Eyetrace Butler is used. The Eyetrace Butler performs data preprocessing and conversion for analysis with Eyetrace. It provides plugins for different eye trackers and converts their data into a format that can be imported and used by Eyetrace.

Learn More
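The plugin idea behind the Eyetrace Butler can be pictured with the hypothetical Python sketch below: each tracker-specific plugin converts its native log into one common sample format that the analysis tool can import. The class names, fields, and CSV columns are invented for illustration and do not reflect the actual Eyetrace Butler interface.

    import csv
    from dataclasses import dataclass
    from typing import Iterable

    @dataclass
    class GazeSample:
        timestamp_ms: float
        x: float
        y: float
        pupil_diameter: float

    class TrackerPlugin:
        """Interface a concrete eye-tracker plugin would implement."""
        def convert(self, path: str) -> Iterable[GazeSample]:
            raise NotImplementedError

    class CsvTrackerPlugin(TrackerPlugin):
        """Example plugin for a tracker that logs comma-separated samples."""
        def convert(self, path: str) -> Iterable[GazeSample]:
            with open(path, newline="") as f:
                for row in csv.DictReader(f):
                    yield GazeSample(float(row["time"]), float(row["gaze_x"]),
                                     float(row["gaze_y"]), float(row["pupil"]))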

Intelligent Surgical Microscope

Head-mounted eye tracking offers remarkable opportunities for research and applications regarding pervasive health monitoring, mental state inference, and human-computer interaction in dynamic scenarios. Although a plethora of software for the acquisition of eye-tracking data exists, such software often exhibits critical issues when pervasive eye tracking is considered, e.g., closed source code, costly eye-tracker hardware dependencies, and the need for a human supervisor during calibration. In this paper, we introduce EyeRecToo, an open-source software for real-time pervasive head-mounted eye tracking. Out of the box, EyeRecToo offers multiple real-time state-of-the-art pupil detection and gaze estimation methods, which can easily be replaced by user-implemented algorithms if desired. A novel calibration method that allows users to calibrate the system without the assistance of a human supervisor is also integrated. Moreover, the software supports multiple head-mounted eye-tracking devices, records eye and scene videos, and stores pupil and gaze information, which are also available as a real-time stream. Thus, EyeRecToo serves as a framework to quickly enable pervasive eye-tracking research and applications.

Learn More
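As a loose illustration of what a gaze calibration involves (this is not the CalibMe method shipped with EyeRecToo), the sketch below fits a second-order polynomial mapping from detected pupil positions to scene coordinates by least squares; all names and the choice of polynomial are assumptions.

    import numpy as np

    def poly_features(p):
        """Second-order polynomial features of 2D pupil positions, shape (n, 2) -> (n, 6)."""
        x, y = p[:, 0], p[:, 1]
        return np.stack([np.ones_like(x), x, y, x * y, x**2, y**2], axis=1)

    def fit_gaze_mapping(pupil_xy, target_xy):
        """Least-squares fit from pupil positions to scene (marker) positions, both (n, 2)."""
        coeffs, *_ = np.linalg.lstsq(poly_features(pupil_xy), target_xy, rcond=None)
        return coeffs  # (6, 2) mapping coefficients

    def map_gaze(coeffs, pupil_xy):
        return poly_features(pupil_xy) @ coeffs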

Robust Pupil Detection and Gaze Estimation

The reliable estimation of the pupil position in eye images is perhaps the most important prerequisite in gaze-based HMI applications. While there are many approaches that enable accurate pupil tracking under laboratory conditions, tracking the pupil in real-world images is highly challenging due to changes in illumination, reflections on glasses or on the eyeball, off-axis camera position, contact lenses, and many more.

Learn More

Finished Thesis Topics

10.07.2019

Vein extraction and eye rotation determination

The first step is setting up a recording environment with a fixed subject position. This environment is used for data acquisition with predefined head rotations of the subjects. Based on these data, an algorithm has to be developed that measures the eyeball rotation of the subject. The resulting angle is then compared against the known head rotation and validated.

Read more …
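The validation step can be pictured with the small sketch below, which compares estimated eyeball-rotation angles against the predefined head rotations of the recording protocol; the angle values are placeholders, not measured data.

    import numpy as np

    predefined_head_rotation = np.array([0.0, 5.0, 10.0, 15.0, 20.0])  # degrees, from the protocol
    estimated_eye_rotation = np.array([0.3, 4.6, 10.4, 14.2, 20.9])    # degrees, algorithm output

    errors = estimated_eye_rotation - predefined_head_rotation
    print("mean absolute error: %.2f deg" % np.abs(errors).mean())
    print("max absolute error:  %.2f deg" % np.abs(errors).max())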

10.07.2019

EyeTrace CUDA extension

EyeTrace is a software for gaze data visualization and analysis. Due to the increasing amount of data, these visualizations require more and more computation time. In this thesis, existing visualizations are to be implemented using CUDA for GPU computation. This also includes a data storage model that makes it possible to move the data between the GPU and the host computer. Since not all computers have a CUDA-capable card, the module should also allow CPU computation; the appropriate backend should be selected automatically by the module.

Read more …
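The automatic GPU/CPU selection described above could look like the Python sketch below, which uses CuPy when a CUDA device is present and falls back to NumPy otherwise. The actual EyeTrace module is a C++/CUDA component, so this only illustrates the fallback idea, and the density-map function is a made-up example workload.

    import numpy as np

    try:
        import cupy as cp                 # GPU backend if CuPy and a CUDA device are available
        cp.cuda.runtime.getDeviceCount()  # raises when no CUDA device is present
        xp = cp
    except Exception:
        xp = np                           # CPU fallback

    def gaze_density_map(gaze_xy, width, height, sigma=15.0):
        """Sum-of-Gaussians fixation density map, computed on the selected backend."""
        ys, xs = xp.meshgrid(xp.arange(height), xp.arange(width), indexing="ij")
        dx = xs[None, :, :] - gaze_xy[:, 0, None, None]
        dy = ys[None, :, :] - gaze_xy[:, 1, None, None]
        return xp.exp(-(dx**2 + dy**2) / (2.0 * sigma**2)).sum(axis=0)

    gaze = xp.asarray([[100.0, 80.0], [120.0, 90.0]])
    density = gaze_density_map(gaze, width=320, height=240)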

10.07.2019

3D Eyeball generation based on vein motion

The first step is robust feature extraction. This can be done using SURF, SIFT, BRISK, or MSER features if these prove sufficient. The extracted features have to be matched to features found in consecutive images. Based on the displacement, a 3D model has to be computed. This model is afterwards used for gaze position estimation.

Read more …
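A hedged sketch of the matching step, using OpenCV's ORB detector purely for illustration (the thesis names SURF, SIFT, BRISK, or MSER as candidates); the matched point pairs would then feed the 3D eyeball model estimation.

    import cv2

    def match_features(frame_prev, frame_next, max_matches=100):
        """Detect features in two consecutive grayscale eye images and return matched point pairs."""
        orb = cv2.ORB_create(nfeatures=1000)
        kp1, des1 = orb.detectAndCompute(frame_prev, None)
        kp2, des2 = orb.detectAndCompute(frame_next, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:max_matches]
        pts_prev = [kp1[m.queryIdx].pt for m in matches]
        pts_next = [kp2[m.trainIdx].pt for m in matches]
        return pts_prev, pts_next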