Dr. Nora Castner

University of Tübingen
Dpt. of Computer Science
Human-Computer Interaction
Sand 14
72076 Tübingen
Germany
- Telephone: +49 (0)7071 29-70492
- Telefax: +49 (0)7071 29-5062
- Email: nora.castner@uni-tuebingen.de
- Office: Sand 14, C206
- Office hours: by appointment
Research Interests
- Eye Tracking
- Scanpath Analysis
- Expert/novice task-related eye movement behavior
- Machine learning and deep learning approaches to expert gaze behavior modeling
- Gaze Feedback in learning interventions
- Pupil Detection in rodents for neuroscience research
Network
- Former PhD Associate at LEAD Research Network
Publications
2020
Gaze and visual scanpath features for data-driven expertise recognition in medical image inspection
N.J. Castner. PhD thesis. University of Tübingen, 2020.
Deep semantic gaze embedding and scanpath comparison for expertise classification during OPT viewing
Nora Castner, Thomas C Kübler, Juliane Richter, Therese Eder, Fabian Huettig, Constanze Keutel, and Enkelejda Kasneci. Eye Tracking Research and Applications. ACM, 2020.
Exploiting the GBVS for Saliency aware Gaze Heatmaps
David Geisler, Daniel Weber, Nora Castner, and Enkelejda Kasneci. Eye Tracking Research and Applications. ACM, 2020.
A MinHash approach for fast scanpath classification
David Geisler, Nora Castner, Gjergji Kasneci, and Enkelejda Kasneci. Eye Tracking Research and Applications. ACM, 2020.
Pupil diameter differentiates expertise in dental radiography visual search
Nora Castner, Tobias Appel, Thérése Eder, Juliane Richter, Katharina Scheiter, Constanze Keutel, Fabian Hüttig, Andrew Duchowski, and Enkelejda Kasneci. PLOS ONE 15(5): 1-19. Public Library of Science, 2020.
Towards expert gaze modeling and recognition of a user’s attention in realtime
Nora Castner, Lea Geßler, David Geisler, Fabian Hüttig, and Enkelejda Kasneci. Procedia Computer Science 176. Elsevier, 2020.
2019
Encodji: Encoding Gaze Data Into Emoji Space for an Amusing Scanpath Classification Approach ;)
Wolfgang Fuhl, Efe Bozkir, Benedikt Hosp, Nora Castner, David Geisler, Thiago C., and Enkelejda Kasneci. Eye Tracking Research and Applications, 2019.
Ferns for area of interest free scanpath classification
W. Fuhl, N. Castner, T. C. Kübler, A. Lotz, W. Rosenstiel, and E. Kasneci. Proceedings of the 2019 ACM Symposium on Eye Tracking Research & Applications (ETRA), 2019.
2018
Overlooking: The nature of gaze behavior and anomaly detection in expert dentists
Nora Castner, Solveig Klepper, Lena Kopnarski, Fabian Hüttig, Constanze Keutel, Katharina Scheiter, Juliane Richter, Therese Eder, and Enkelejda Kasneci. Workshop on Modeling Cognitive Processes from Multimodal Data (MCPMD’18), 2018.
Development and Evaluation of a Gaze Feedback System Integrated into EyeTrace
K. Otto, N. Castner, D. Geisler, and E. Kasneci. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA), 2018.
Scanpath comparison in medical image reading skills of dental students
N. Castner, E. Kasneci, T. C. Kübler, K. Scheiter, and J. Richter. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA), 2018.
MAM: Transfer learning for fully automatic video annotation and specialized detector creation
W. Fuhl, N. Castner, L. Zhuang, M. Holzer, W. Rosenstiel, and E. Kasneci. International Conference on Computer Vision Workshops, ICCVW, 2018.
Rule based learning for eye movement type detection
W. Fuhl, N. Castner, and E. Kasneci. International Conference on Multimodal Interaction Workshops, ICMIW, 2018.
Histogram of oriented velocities for eye movement detection
W. Fuhl, N. Castner, and E. Kasneci. International Conference on Multimodal Interaction Workshops, ICMIW, 2018.
Eye movement simulation and detector creation to reduce laborious parameter adjustments
W. Fuhl, T. Santini, T. Kuebler, N. Castner, W. Rosenstiel, and E. Kasneci. arXiv preprint arXiv:1804.00970, 2018.
2017
Selective suppression of local circuits during movement preparation in the mouse motor cortex
Masashi Hasegawa, Kei Majima, Takahide Itokazu, Takakuni Maki, Urban-Raphael Albrecht, Nora Castner, Mariko Izumo, Kazuhiro Sohya, Tatsuo K Sato, Yukiyasu Kamitani, et al. Cell Reports 18(11): 2676–2686. Elsevier, 2017.
Using Eye Tracking to Evaluate and Develop Innovative Teaching Strategies for Fostering Image Reading Skills of Novices in Medical Training
N. Castner, S. Eivazi, K. Scheiter, and E. Kasneci. Eye Tracking Enhanced Learning (ETEL2017), 2017.
Teaching
| Course | Term |
|---|---|
| Eye-based Human-Computer Interaction | |
| Eye-based Human-Computer Interaction | |
| Eye Movements and Visual Perception | |
Open Thesis Topics
Finished Thesis Topics
01.09.2020
A Deep Learning Approach for Expertise Classification using Saccade Behavior
- Eye movements reflect the cognitive advantage of experts over novices in domain-specific tasks. The current literature focuses on fixations and largely leaves out saccades. This research investigates the gaze behavior of dentistry students and expert dentists viewing orthopantomograms (OPTs). All proposed Long Short-Term Memory (LSTM) models distinguished expert from novice gaze behavior based on saccade features above chance level, with the best-performing feature reaching an accuracy of 77.1%. The results provide further evidence for the holistic model of image perception, which proposes that experts first analyze an image globally and then proceed with a focal analysis. Moreover, the results show that saccade features are important for understanding expert gaze behavior and should therefore be integrated into current theories of expertise. A minimal sketch of this kind of sequence classifier is given below.
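The following is a minimal, hypothetical sketch of the kind of sequence classifier described above: an LSTM over per-saccade feature vectors (amplitude, duration, and peak velocity are assumed here purely for illustration) with an expert/novice output head. It is not the thesis code and does not reproduce the actual feature set, architecture, or training setup.

```python
# Hypothetical sketch: binary LSTM classifier over per-saccade feature sequences.
import torch
import torch.nn as nn

class SaccadeLSTMClassifier(nn.Module):
    def __init__(self, n_features: int = 3, hidden_size: int = 64):
        super().__init__()
        # One LSTM layer reads the saccade sequence; a linear head classifies.
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 2)  # expert vs. novice

    def forward(self, x):            # x: (batch, seq_len, n_features)
        _, (h_n, _) = self.lstm(x)   # h_n: (num_layers, batch, hidden_size)
        return self.head(h_n[-1])    # logits: (batch, 2)

# Toy usage with random data standing in for saccade feature sequences
# (assumed features: amplitude, duration, peak velocity).
model = SaccadeLSTMClassifier()
x = torch.randn(8, 30, 3)            # 8 scanpaths, 30 saccades, 3 features each
logits = model(x)
pred = logits.argmax(dim=1)          # 0 = novice, 1 = expert (convention here only)
```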
01.01.2020
Differences in attention in near vs. far hand conditions during propaganda viewing
- The propaganda images produced during the era of the Nationalsozialistische Deutsche Arbeiterpartei (NSDAP) were among the most influential images used to promote an ideology. In modern times these images may seem outdated; however, do they still influence our perception of Adolf Hitler and his ideology? Using eye tracking, we analyzed the attention distribution of people touching these pictures to look for effects of the “near-hand phenomenon”, which describes a more sympathetic affect toward what we touch. The near-hand viewing condition led to lower viewing durations and a higher number of fixations, indicating a tendency to devote more attention to the images when touching them compared to the control group. A sketch of how such fixation metrics can be aggregated and compared is given below.
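As a purely illustrative sketch of the kind of analysis described above, the following aggregates fixation records into per-participant viewing duration and fixation count and compares the two conditions. The column names, toy data, and the use of an independent-samples t-test are assumptions for illustration, not the study's actual pipeline.

```python
# Illustrative sketch: aggregating fixation data per participant and comparing
# a hypothetical near-hand condition against a control condition.
import pandas as pd
from scipy import stats

# Assumed columns: participant, condition, fixation_duration_ms
# (one row per fixation on an image); values below are toy data.
fixations = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3, 4, 4],
    "condition": ["near_hand"] * 4 + ["control"] * 4,
    "fixation_duration_ms": [220, 180, 250, 210, 300, 320, 280, 310],
})

# Total viewing duration and number of fixations per participant and condition.
per_participant = (fixations
                   .groupby(["participant", "condition"])
                   .agg(viewing_duration_ms=("fixation_duration_ms", "sum"),
                        n_fixations=("fixation_duration_ms", "count"))
                   .reset_index())

near = per_participant[per_participant.condition == "near_hand"]
ctrl = per_participant[per_participant.condition == "control"]

# Independent-samples t-test on total viewing duration (toy data only).
t, p = stats.ttest_ind(near.viewing_duration_ms, ctrl.viewing_duration_ms)
print(per_participant)
print(f"t = {t:.2f}, p = {p:.3f}")
```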