Dr. Nora Castner

University of Tübingen
Department of Computer Science
Human-Computer Interaction
Sand 14
72076 Tübingen
Germany

Telephone: +49 (0) 7071 29-70492
Telefax: +49 (0) 7071 29-5062
E-Mail: nora.castner@zeiss.com
Office: Maria-von-Linden-Str. 6
Office hours: by appointment

Profile on LinkedIn

Research Interests

  • Eye tracking
  • Scanpath analysis
  • Expert/novice task-related eye movement behavior
  • Machine learning and deep learning approaches to modeling expert gaze behavior
  • Gaze feedback in learning interventions
  • Pupil detection in rodents for neuroscience research

Network

  • Former PhD Associate at the LEAD Research Network

Publications

2023

Watch out for those bananas! Gaze Based Mario Kart Performance Classification

Wolfgang Fuhl, Björn Severitt, Nora Castner, Babette Bühler, Johannes Meyer, Daniel Weber, Regine Lendway, Ruikun Hou, and Enkelejda Kasneci. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, pages 1–2, 2023.

PDF BIB

Exploring the Effects of Scanpath Feature Engineering for Supervised Image Classification Models

Sean Anthony Byrne, Virmarie Maquiling, Adam Peter Frederick Reynolds, Luca Polonio, Nora Castner, and Enkelejda Kasneci. Proceedings of the ACM on Human-Computer Interaction 7(ETRA): 1–18. ACM New York, NY, USA, 2023.

PDF BIB

Leveraging Eye Tracking in Digital Classrooms: A Step Towards Multimodal Model for Learning Assistance

Sean Anthony Byrne, Nora Castner, Ard Kastrati, Martyna Plomecka, William Schaefer, Enkelejda Kasneci, and Zoya Bylinskii. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, pages 1–6, 2023.

PDF BIB

Old or Modern? A Computational Model for Classifying Poem Comprehension using Microsaccades

Patrizia Lenhart, Enkeleda Thaqi, Nora Castner, and Enkelejda Kasneci. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, pages 1–2, 2023.

PDF BIB

Gaze Patterns of Dentists while Evaluating Bitewing Radiographs

Lubaina T Arsiwala-Scheppach, Nora Castner, Csaba Rohrer, Sarah Mertens, Enkelejda Kasneci, Jose Eduardo Cejudo Grano de Oro, Joachim Krois, and Falk Schwendicke. Journal of Dentistry, article 104585. Elsevier, 2023.

PDF BIB

2022

LSTMs can distinguish dental expert saccade behavior with high “plaque-urracy”

Nora Castner, Jonas Frankemölle, Constanze Keutel, Fabian Huettig, and Enkelejda Kasneci. 2022 Symposium on Eye Tracking Research and Applications, pages 1–7, 2022.

PDF BIB

A gaze-based study design to explore how competency evolves during a photo manipulation task

Nora Castner, Bela Umlauf, Ard Kastrati, Martyna Beata Płomecka, William Schaefer, Enkelejda Kasneci, and Zoya Bylinskii. 2022 Symposium on Eye Tracking Research and Applications, pages 1–3, 2022.

PDF BIB

2020

Gaze and visual scanpath features for data-driven expertise recognition in medical image inspection

N.J. Castner. PhD thesis. University of Tübingen, 2020.

PDF BIB

Deep semantic gaze embedding and scanpath comparison for expertise classification during OPT viewing

Nora Castner, Thomas C Kübler, Juliane Richter, Therese Eder, Fabian Huettig, Constanze Keutel, and Enkelejda Kasneci. Eye Tracking Research and Applications. ACM, 2020.

PDF BIB

Exploiting the GBVS for Saliency aware Gaze Heatmaps

David Geisler, Daniel Weber, Nora Castner, and Enkelejda Kasneci. ACM Symposium on Eye Tracking Research and Applications. ACM, 2020.

PDF BIB

A MinHash approach for fast scanpath classification

David Geisler, Nora Castner, Gjergji Kasneci, and Enkelejda Kasneci. Eye Tracking Research and Applications. ACM, 2020.

PDF BIB

Pupil diameter differentiates expertise in dental radiography visual search

Nora Castner, Tobias Appel, Thérése Eder, Juliane Richter, Katharina Scheiter, Constanze Keutel, Fabian Hüttig, Andrew Duchowski, and Enkelejda Kasneci. PLOS ONE 15(5): 1-19. Public Library of Science, 2020.

PDF BIB

Towards expert gaze modeling and recognition of a user’s attention in realtime

Nora Castner, Lea Geßler, David Geisler, Fabian Hüttig, and Enkelejda Kasneci. Procedia Computer Science 176. Elsevier, 2020.

PDF BIB

2019

Encodji: Encoding Gaze Data Into Emoji Space for an Amusing Scanpath Classification Approach ;)

Wolfgang Fuhl, Efe Bozkir, Benedikt Hosp, Nora Castner, David Geisler, Thiago C. Santini, and Enkelejda Kasneci. Eye Tracking Research and Applications, 2019.

PDF BIB Supplementary Material

Ferns for area of interest free scanpath classification

W. Fuhl, N. Castner, T. C. Kübler, A. Lotz, W. Rosenstiel, and E. Kasneci. Proceedings of the 2019 ACM Symposium on Eye Tracking Research & Applications (ETRA), 2019.

PDF BIB Supplementary Material

2018

Overlooking: The nature of gaze behavior and anomaly detection in expert dentists

Nora Castner, Solveig Klepper, Lena Kopnarski, Fabian Hüttig, Constanze Keutel, Katharina Scheiter, Juliane Richter, T. Eder, and Enkelejda Kasneci. Workshop on Modeling Cognitive Processes from Multimodal Data (MCPMD’18), 2018.

PDF BIB

Development and Evaluation of a Gaze Feedback System Integrated into EyeTrace

K. Otto, N. Castner, D. Geisler, and E. Kasneci. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA), 2018.

PDF BIB

Scanpath comparison in medical image reading skills of dental students

N. Castner, E. Kasneci, T. C. Kübler, K. Scheiter, and J. Richter. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA), 2018.

PDF BIB

MAM: Transfer learning for fully automatic video annotation and specialized detector creation

W. Fuhl, N. Castner, L. Zhuang, M. Holzer, W. Rosenstiel, and E. Kasneci. International Conference on Computer Vision Workshops, ICCVW, 2018.

PDF BIB Supplementary Material

Rule based learning for eye movement type detection

W. Fuhl, N. Castner, and E. Kasneci. International Conference on Multimodal Interaction Workshops, ICMIW, 2018.

PDF BIB Supplementary Material

Histogram of oriented velocities for eye movement detection

W. Fuhl, N. Castner, and E. Kasneci. International Conference on Multimodal Interaction Workshops, ICMIW, 2018.

PDF BIB Supplementary Material

Eye movement simulation and detector creation to reduce laborious parameter adjustments

W. Fuhl, T. Santini, T. Kuebler, N. Castner, W. Rosenstiel, and E. Kasneci. arXiv preprint arXiv:1804.00970, 2018.

PDF BIB Supplementary Material

2017

Selective suppression of local circuits during movement preparation in the mouse motor cortex

Masashi Hasegawa, Kei Majima, Takahide Itokazu, Takakuni Maki, Urban-Raphael Albrecht, Nora Castner, Mariko Izumo, Kazuhiro Sohya, Tatsuo K Sato, Yukiyasu Kamitani, and others. Cell Reports 18(11): 2676–2686. Elsevier, 2017.

PDF BIB

Using Eye Tracking to Evaluate and Develop Innovative Teaching Strategies for Fostering Image Reading Skills of Novices in Medical Training

N. Castner, S. Eivazi, K. Scheiter, and E. Kasneci. Eye Tracking Enhanced Learning (ETEL2017), 2017.

PDF BIB

Teaching

  • Eye-based Human-Computer Interaction
  • Eye-based Human-Computer Interaction
  • User Experience
  • User Experience
  • Eye Movements and Visual Perception

Open Thesis Topics

Finished Thesis Topics

01.09.2020

A Deep Learning Approach for Expertise Classification using Saccade Behavior

Eye movements reflect the cognitive advantage of experts over novices in domain-specific tasks. The current literature focuses on fixations but largely leaves out saccades. This research investigates the gaze behavior of dentistry students and expert dentists viewing orthopantomograms (OPTs). All proposed Long Short-Term Memory (LSTM) models distinguished expert from novice gaze behavior based on saccade features above chance level, with the best-performing feature reaching an accuracy of 77.1% (an illustrative classifier sketch follows below). The results provide further evidence for the holistic model of image perception, which proposes that experts first analyze an image globally and then proceed with a focal analysis. Furthermore, the results show that saccade features are important for understanding expert gaze behavior and should therefore be integrated into current theories of expertise.

Read more …
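
For illustration only, a minimal sketch of how such a classifier could look, assuming per-saccade feature sequences (e.g., amplitude, peak velocity, duration) fed to a small PyTorch LSTM; the feature set, dimensions, and toy data are assumptions and not taken from the thesis.

    # Minimal sketch, not the thesis implementation: an LSTM that labels a
    # sequence of saccade features as expert (1) or novice (0).
    import torch
    import torch.nn as nn

    class SaccadeLSTM(nn.Module):
        def __init__(self, n_features=3, hidden=64):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 2)

        def forward(self, x):                  # x: (batch, saccades, features)
            _, (h_n, _) = self.lstm(x)
            return self.head(h_n[-1])          # logits from the last hidden state

    model = SaccadeLSTM()
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Toy stand-in data: 8 trials, 50 saccades each, 3 features per saccade
    # (e.g., amplitude, peak velocity, duration).
    x = torch.randn(8, 50, 3)
    y = torch.randint(0, 2, (8,))              # 0 = novice, 1 = expert
    for _ in range(5):
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()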

01.01.2020

Differences in attention in near vs. far hand conditions during propaganda viewing

The propaganda images produced during the active years of the Nationalsozialistische Deutsche Arbeiterpartei (NSDAP) were among the most influential images used to promote an ideology. In modern times, these images seem outdated. However, do they still influence our perception of Adolf Hitler and his ideology? Using eye tracking, we analyzed the attention distribution of people touching these pictures to look for influences of the “near-hand phenomenon”, which describes a more sympathetic affect towards what we touch. The near-hand viewing condition led to shorter viewing durations and a higher number of fixations, indicating a tendency to devote more attention to the images when touching them compared to the control group (a brief analysis sketch follows below).

Read more …
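
For illustration only, a minimal sketch of the kind of group comparison described above, using toy fixation counts and an independent-samples t-test; the numbers and group sizes are assumptions, not study data.

    # Minimal sketch with toy data, not the original analysis: compare
    # per-participant fixation counts between the near-hand and control groups.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    near_hand = rng.poisson(lam=120, size=20)   # hypothetical fixation counts
    control = rng.poisson(lam=100, size=20)

    t, p = stats.ttest_ind(near_hand, control)  # independent-samples t-test
    print(f"fixation count: t = {t:.2f}, p = {p:.3f}")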