Human-Computer Interaction

Efe Bozkir


University of Tübingen
Dpt. of Computer Science
Human-Computer Interaction
Sand 14
72076 Tübingen
Germany

Telephone: +49 (0)7071 29-70494
Telefax: +49 (0)7071 29-5062
E-Mail: efe.bozkir@uni-tuebingen.de
Office: Sand 14, C220
Office hours: by appointment

If you are a student looking for a project in the field of VR/AR/ML and the topics below interest you, please contact me.

Research Interests

  • Virtual/Augmented/Mixed reality
  • Machine learning
  • Information privacy
  • Eye movements and gaze-based interaction

Education and Academic Degrees

  • Doctoral studies in Computer Science, Eberhard Karls University of Tübingen, Germany, July 2018 - Present
  • M.Sc. in Computer Science, Technical University of Munich, Germany, September 2016
  • B.Sc. in Computer Engineering, Istanbul Technical University, Turkey, February 2014

Professional Experience

  • IT Consultant, Netlight Consulting, Munich & Hamburg, Germany, May 2017 - July 2018
  • Software Developer, Texas Instruments, Freising, Germany, May 2015 - September 2016

Publications

2021

Digital Transformations of Classrooms in Virtual Reality

Hong Gao, Efe Bozkir, Lisa Hasenbein, Jens-Uwe Hahn, Richard Göllner, and Enkelejda Kasneci. CHI Conference on Human Factors in Computing Systems (CHI ’21). ACM, 2021.


FakeNewsPerception: An eye movement dataset on the perceived believability of news stories

Ömer Sümer, Efe Bozkir, Thomas Kübler, Sven Grüner, Sonja Utz, and Enkelejda Kasneci. Data in Brief 35, 2021.


Exploiting Object-of-Interest Information to Understand Attention in VR Classrooms

Efe Bozkir, Philipp Stark, Hong Gao, Lisa Hasenbein, Jens-Uwe Hahn, Enkelejda Kasneci, and Richard Göllner. 2021 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2021.


2020

Privacy Preserving Gaze Estimation using Synthetic Images via a Randomized Encoding Based Framework

Efe Bozkir, Ali B. Ünal, Mete Akgün, Enkelejda Kasneci, and Nico Pfeifer. Eye Tracking Research and Applications. ACM, 2020.


Reinforcement learning for the privacy preservation and manipulation of eye tracking data

Wolfgang Fuhl, Efe Bozkir, and Enkelejda Kasneci. arXiv preprint arXiv:2002.06806. CoRR, 2020.


Eye Tracking Data Collection Protocol for VR for Remotely Located Subjects using Blockchain and Smart Contracts

Efe Bozkir, Shahram Eivazi, Mete Akgün, and Enkelejda Kasneci. IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), 2020.


Differential Privacy for Eye Tracking with Temporal Correlations

Efe Bozkir, Onur Günlü, Wolfgang Fuhl, Rafael F. Schaefer, and Enkelejda Kasneci. arXiv preprint arXiv:2002.08972. CoRR, 2020.


2019

Encodji: Encoding Gaze Data Into Emoji Space for an Amusing Scanpath Classification Approach ;)

Wolfgang Fuhl, Efe Bozkir, Benedikt Hosp, Nora Castner, David Geisler, Thiago C., and Enkelejda Kasneci. Eye Tracking Research and Applications, 2019.


Person Independent, Privacy Preserving, and Real Time Assessment of Cognitive Load using Eye Tracking in a Virtual Reality Setup

Efe Bozkir, David Geisler, and Enkelejda Kasneci. The IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Workshops, 2019.


Assessment of Driver Attention During a Safety Critical Situation in VR to Generate VR-based Training

Efe Bozkir, David Geisler, and Enkelejda Kasneci. ACM Symposium on Applied Perception (SAP), 2019.


Open Thesis Topics

Assigned Thesis Topics

Finished Thesis Topics

12.03.2021

Towards avatar interaction and teleportation in virtual environments

This work focuses on the design and implementation of a virtual environment intended to investigate avatar interaction and teleportation in virtual reality. Two teleportation methods are implemented: gaze teleportation and automatic teleportation. With gaze teleportation, the environment can also serve to explore gaze-based interaction. The setting is a gallery in which a virtual guide provides information about the paintings; the user interacts with the guide via speech or controller commands to request this information. Furthermore, the environment can be customized in terms of its paintings and the available teleportation targets. Overall, the environment serves as an extensible platform for research projects studying different topics in virtual reality, and using it as a template can save time in the required software development.


23.10.2020

Towards understanding attention in virtual reality - Analysing visual attention in a VR-Classroom experiment

Attention is a key aspect of learning. Most of children's everyday learning takes place in a classroom, but investigating children's attention and learning in a real-world classroom is difficult. We therefore used an immersive virtual reality classroom to investigate children's attention during a 14-minute virtual lesson. We collected information about the objects children looked at. Using this gazed-object information, we analysed the total time spent on specific objects of interest (peer learners, teacher, screen) and investigated children's visual attention behaviour with scanpath analysis (ScanMatch, SubsMatch). The study used a between-subjects design with three classroom manipulations: the participants' seating position, the avatar style of the peer learners, and the peer learners' hand-raising behaviour. We found significant differences in children's visual attention depending on their seating position in the classroom and on the visual appearance of the peer learners. Additionally, the effects of the hand-raising condition on children's visual attention indicate that children also process social information in the virtual classroom. These findings are a first step towards understanding children's visual attention in an immersive virtual reality classroom.


21.06.2019

Effectiveness of augmented reality for human performance in assembly

Augmented reality (AR) has evolved rapidly in recent years and has emerged as one of the most promising technologies for assisting human operators with assembly, which is in high demand in today's manufacturing environments. However, empirical studies investigating the effectiveness of AR for human performance in assembly tasks are still scarce. To address this gap, an empirical study with 20 participants was conducted, comparing printed instructions with AR as the instructional medium. Three models of a fischertechnik kit were assembled, using the same instructional medium for each assembly. Assembly times, error rates, cognitive load, and usability aspects were measured to compare the two media. Even though task completion time was not significantly improved with AR, reading the instructions and finding the required storage boxes in particular benefited from it. Moreover, error rates for multiple error types were significantly decreased with AR instructions. The major limitations of this AR-aided assembly application arose from a lack of alignment between virtual and physical objects, along with the limited field of view. Transferring similar applications to industrial environments can be considered in the near future.
