Dr. Efe Bozkir

University of Tübingen
Department of Computer Science
Human-Computer Interaction
Sand 14
72076 Tübingen
Germany

Telephone: +49 (0) 7071 29-70494
Telefax: +49 (0) 7071 29-5062
E-Mail: efe.bozkir@uni-tuebingen.de
Office: Sand 14, C220
Office hours: by appointment

Publications on Google Scholar

Profile on LinkedIn

This page is slightly outdated. Please refer to my personal page for more info.

Research Interests

  • Virtual/Augmented/Mixed reality
  • Machine learning
  • Information privacy
  • Eye movements and gaze-based interaction

Education and Academic Degrees

  • Dr. rer. nat. in Computer Science, Eberhard Karls University of Tübingen, Germany, July 2018 - March 2022
  • M.Sc. in Computer Science, Technical University of Munich, Germany, April 2014 - September 2016
  • B.Sc. in Computer Engineering, Istanbul Technical University, Turkey, September 2008 - February 2014

Previous Professional Experience

  • IT Consultant, Netlight Consulting, Munich & Hamburg, Germany, May 2017 - June 2018
  • Software Developer, Texas Instruments, Freising, Germany, May 2015 - September 2016

Publications

2022

Exploring Gender Differences in Computational Thinking Learning in a VR Classroom: Developing Machine Learning Models Using Eye-Tracking Data and Explaining the Models

Hong Gao, Lisa Hasenbein, Efe Bozkir, Richard Göllner, and Enkelejda Kasneci. International Journal of Artificial Intelligence in Education, pages 1–26. Springer, 2022.

PDF BIB

Evaluating the Effects of Virtual Human Animation on Students in an Immersive VR Classroom Using Eye Movements

Hong Gao, Lisa Hasenbein, Efe Bozkir, Richard Göllner, and Enkelejda Kasneci. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology (VRST ’22). ACM, 2022.

PDF BIB

Towards Everyday Virtual Reality through Eye Tracking

Efe Bozkir. PhD thesis. University of Tübingen, 2022.

PDF BIB

2021

Reinforcement learning for the privacy preservation and manipulation of eye tracking data

Wolfgang Fuhl, Efe Bozkir, and Enkelejda Kasneci. Proceedings of the International Conference on Artificial Neural Networks, 2021.

BIB

Differential Privacy for Eye Tracking with Temporal Correlations

Efe Bozkir, Onur Günlü, Wolfgang Fuhl, Rafael F. Schaefer, and Enkelejda Kasneci. PLoS ONE, 2021.

PDF BIB

Digital Transformations of Classrooms in Virtual Reality

Hong Gao, Efe Bozkir, Lisa Hasenbein, Jens-Uwe Hahn, Richard Göllner, and Enkelejda Kasneci. CHI Conference on Human Factors in Computing Systems (CHI ’21). ACM, 2021.

PDF BIB

FakeNewsPerception: An eye movement dataset on the perceived believability of news stories

Ömer Sümer, Efe Bozkir, Thomas Kübler, Sven Grüner, Sonja Utz, and Enkelejda Kasneci. Data in Brief 35, 2021.

PDF BIB

Exploiting Object-of-Interest Information to Understand Attention in VR Classrooms

Efe Bozkir, Philipp Stark, Hong Gao, Lisa Hasenbein, Jens-Uwe Hahn, Enkelejda Kasneci, and Richard Göllner. 2021 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2021.

PDF BIB

2020

Privacy Preserving Gaze Estimation using Synthetic Images via a Randomized Encoding Based Framework

Efe Bozkir, Ali Burak Ünal, Mete Akgün, Enkelejda Kasneci, and Nico Pfeifer. Eye Tracking Research and Applications. ACM, 2020.

PDF BIB

Eye Tracking Data Collection Protocol for VR for Remotely Located Subjects using Blockchain and Smart Contracts

Efe Bozkir, Shahram Eivazi, Mete Akgün, and Enkelejda Kasneci. IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), 2020.

PDF BIB

2019

Encodji: Encoding Gaze Data Into Emoji Space for an Amusing Scanpath Classification Approach ;)

Wolfgang Fuhl, Efe Bozkir, Benedikt Hosp, Nora Castner, David Geisler, Thiago C., and Enkelejda Kasneci. Eye Tracking Research and Applications, 2019.

PDF BIB Supplementary Material

Person Independent, Privacy Preserving, and Real Time Assessment of Cognitive Load using Eye Tracking in a Virtual Reality Setup

Efe Bozkir, David Geisler, and Enkelejda Kasneci. The IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Workshops, 2019.

PDF BIB

Assessment of Driver Attention During a Safety Critical Situation in VR to Generate VR-based Training

Efe Bozkir, David Geisler, and Enkelejda Kasneci. ACM Symposium on Applied Perception, 2019.

PDF BIB

Open Thesis Topics

Finished Thesis Topics

04.02.2022

Universalizing the VR Experience - A Web Approach

VR technology has grown immensely in the last few years, yet the great majority of people have not experienced it. One reason is that the software is too specialized and the hardware is prohibitively costly. This Bachelor's thesis investigates the feasibility of web-based VR applications in combination with accessible smartphone VR devices, contributing to the question of how universally accessible VR applications can be. A simple chess game was chosen as the subject of the web-based VR application; it includes a single-player (AI) mode and a multiplayer mode, and individual pieces can be controlled either by voice commands or with a gaze pointer. Two low-tech head-mounted devices (HMDs), the Zeiss VR One Plus and the Google Cardboard V2, are tested as smartphone viewers. These HMDs were combined with two different smartphones, the Google Pixel 6 (2021) and the Samsung Galaxy A5 (2016). This broad array of devices covers old and modern smartphones as well as cheap and more expensive HMDs, and thus approximates the equipment of the broader public. First, the thesis describes the game development process in detail, as the application was written from scratch. Second, the game performance is assessed, measured by loading times, video quality, and the influence of the hardware and of different 3D resources on these factors. Finally, it is evaluated how much impact the hardware combination has on such novel web-based VR approaches. Overall, newer smartphones such as the Google Pixel 6 perform best on these quality measures, and provided the meshes in the 3D scene are not too complex, the VR browser chess offers a satisfying user experience, especially in multiplayer mode. However, weaker devices such as the Samsung Galaxy A5 lack the computational power required for the demanding rendering. Web-based VR is therefore a promising technology branch that, in theory, everybody could experience at little cost, although some smartphones remain excluded from an immersive experience. Based on the insights gained in this thesis, the user experience of web-based VR applications combined with smartphone HMDs can be investigated in studies with actual participants. Additionally, the application can be extended to use the camera for eye-tracking studies or hand tracking. These features would allow users to interact physically with the VR application and thus realize a novel, truly low-tech web-based XR application.

28.10.2021

Data augmentations in mixed reality machine learning applications

Machine learning models have accomplished a great deal in recent years. Nevertheless, they rely on large datasets to be of practical relevance. Since it is not always possible to obtain masses of data, augmenting already available data has become an appealing alternative. In this thesis, a data augmentation system is proposed that uses a mixed reality environment to create augmented image data for a classification task. To achieve this, ArUco markers are tracked in an image and used to insert an arbitrary virtual object onto the marker via a homography. Finally, the augmentations are evaluated by training a neural network with the augmented and the real data as input datasets. The proposed system produces augmentations that can partly substitute real data in a machine learning application. This indicates that a mixed reality approach can generate augmented images that expand or substitute existing datasets for an image classification task.
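
The augmentation step described above (tracking an ArUco marker and warping a virtual object onto it via a homography) can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration rather than the thesis implementation; it assumes an OpenCV build with the classic cv2.aruco.detectMarkers API (newer releases expose cv2.aruco.ArucoDetector instead), and the image file names are placeholders.

```python
import cv2
import numpy as np

# Illustrative sketch (not the thesis code): detect an ArUco marker in an
# image and warp a virtual texture onto the marker plane via a homography.
ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def augment_frame(frame, virtual_img):
    """Overlay virtual_img onto the first ArUco marker detected in frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, ARUCO_DICT)
    if ids is None:
        return frame  # no marker found, return the image unchanged

    h, w = virtual_img.shape[:2]
    src = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
    dst = corners[0].reshape(4, 2).astype(np.float32)

    # Homography from the virtual image corners to the detected marker corners.
    H, _ = cv2.findHomography(src, dst)
    warped = cv2.warpPerspective(virtual_img, H, (frame.shape[1], frame.shape[0]))

    # Paste the warped virtual object over the marker region.
    mask = np.zeros(frame.shape[:2], dtype=np.uint8)
    cv2.fillConvexPoly(mask, dst.astype(np.int32), 255)
    out = frame.copy()
    out[mask == 255] = warped[mask == 255]
    return out

if __name__ == "__main__":
    scene = cv2.imread("scene.jpg")             # placeholder input photograph
    virtual = cv2.imread("virtual_object.png")  # placeholder texture to insert
    cv2.imwrite("augmented.jpg", augment_frame(scene, virtual))
```

A plain homography suffices here because the marker defines a planar target; images produced this way could then be mixed with real photographs to form the classifier's training set, as evaluated in the thesis.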

12.03.2021

Towards avatar interaction and teleportation in virtual environments

This work focuses on the design and implementation of a virtual environment intended for investigating avatar interaction as well as teleportation in virtual reality. Two teleportation methods are implemented: gaze teleportation and automatic teleportation. With gaze teleportation, the environment can additionally serve to explore gaze-based interaction. A gallery serves as the setting, in which a virtual guide provides information about paintings. Interaction with the guide consists of speech or controller commands requesting information about the paintings. Furthermore, the environment can be customized in terms of its paintings and the positions to which the user can be teleported. Overall, the environment serves as an extensible platform for research projects studying different topics in virtual reality, and using it as a template can save time in the required software development.
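
As an engine-agnostic illustration of the gaze-teleportation idea (not the thesis code), the sketch below intersects the user's gaze ray with the floor plane and snaps the hit point to the nearest of the configurable teleport positions mentioned above; the function name, coordinate conventions, and snapping threshold are assumptions.

```python
import numpy as np

def gaze_teleport_target(head_pos, gaze_dir, anchors, floor_y=0.0, max_snap=1.5):
    """Return the teleport anchor the user is gazing at, or None.

    head_pos: (3,) head/eye position in world coordinates (y is up)
    gaze_dir: (3,) normalized gaze direction
    anchors:  list of (3,) allowed teleport positions on the floor
    """
    head_pos = np.asarray(head_pos, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    if gaze_dir[1] >= 0.0:
        return None  # gaze does not point towards the floor
    # Ray-plane intersection with the horizontal floor plane y = floor_y.
    t = (floor_y - head_pos[1]) / gaze_dir[1]
    hit = head_pos + t * gaze_dir
    # Snap to the closest predefined teleport position, if it is near enough.
    dists = np.linalg.norm(np.asarray(anchors, dtype=float) - hit, axis=1)
    i = int(np.argmin(dists))
    return anchors[i] if dists[i] <= max_snap else None

# Hypothetical usage: trigger the teleport once the gazed anchor stays stable
# for a dwell period (the dwell logic is omitted here).
print(gaze_teleport_target(head_pos=[0.0, 1.7, 0.0],
                           gaze_dir=[0.0, -0.5, 0.866],
                           anchors=[[0.0, 0.0, 3.0], [2.0, 0.0, 1.0]]))
```

In the automatic variant, the target position would presumably come from the application logic rather than from the gaze ray.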

23.10.2020

Towards understanding attention in virtual reality - Analysing visual attention in a VR-Classroom experiment

Attention can be seen as a key aspect of learning. Most of children's everyday learning takes place in a classroom, but investigating children's attention and learning in a real-world classroom can be difficult. Therefore, we used an immersive virtual reality classroom to investigate children's attention during a 14-minute virtual lesson. We collected information about the objects children looked at. Using this gazed-object information, we analysed the total time spent on specific objects of interest (peer learners, teacher, screen) and investigated children's visual attention behaviour with scanpath analysis (ScanMatch, SubsMatch). The study used a between-subjects design with three classroom manipulations concerning the participants' seating position, the avatar style of the peer learners, and their hand-raising behaviour. We found significant differences in children's visual attention depending on where they were seated in the classroom and on the visual appearance of the peer learners. Additionally, effects of the hand-raising condition on children's visual attention indicate that children also process social information in the virtual classroom. These findings can be seen as a first step towards understanding children's visual attention in an immersive virtual reality classroom.
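
As a minimal sketch of the dwell-time part of such an analysis (not the study's actual pipeline), the snippet below aggregates the total gaze time per object of interest from a sample-based gaze log; the column names and the 30 Hz example recording are assumptions.

```python
import pandas as pd

def dwell_time_per_object(gaze_log: pd.DataFrame) -> pd.Series:
    """Total gaze time per object, from a log with one row per gaze sample.

    Assumed columns: 'timestamp' (seconds) and 'gazed_object' (label per sample).
    """
    # Each sample lasts until the next one; approximate the last duration.
    durations = gaze_log["timestamp"].diff().shift(-1)
    durations.iloc[-1] = durations.median()
    return (durations.groupby(gaze_log["gazed_object"])
                     .sum()
                     .sort_values(ascending=False))

# Hypothetical 30 Hz example: screen, teacher, and a peer learner are gazed at.
log = pd.DataFrame({
    "timestamp": [i / 30 for i in range(6)],
    "gazed_object": ["screen", "screen", "teacher", "teacher", "peer", "screen"],
})
print(dwell_time_per_object(log))
```

Scanpath measures such as ScanMatch or SubsMatch would instead operate on the ordered sequence of gazed objects rather than on these aggregated durations.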

21.06.2019

Effectiveness of augmented reality for human performance in assembly

Augmented reality (AR) has evolved rapidly in recent years and has emerged as one of the most promising technologies for assisting human operators in assembly, which is in high demand in today's manufacturing environments. However, there is still a lack of empirical studies investigating the effectiveness of AR for human performance in assembly tasks. To address this gap, an empirical study with 20 participants was conducted, comparing printed instructions with AR as the instructional medium. Three models of a fischertechnik kit were assembled, using the same instructional medium for each assembly. Assembly times, error rates, cognitive load, and usability aspects were measured to compare both media. Even though task completion time was not significantly improved with AR, reading the instructions and finding the required storage boxes in particular benefited from its use. Moreover, error rates for multiple types of error decreased significantly with AR instructions. Major limitations of this AR-aided assembly application arose from a lack of alignment between virtual and physical objects, along with the limited field of view. Transferring similar applications to industrial environments can be considered in the near future.
