Human Computer Interaction

Projects

Our research is concerned with intelligent computing systems that capture and infer users' cognitive state (e.g. cognitive load, engagement), task-related expertise, actions, and intentions from multimodal data. On this basis, information for adapting digital media, user interfaces, assistive technologies, and assistance systems is made available across many activities of daily life, enabling individualized human-machine interaction.

The research work ranges from methods for head-worn technologies including eye tracking (such as those used in miniaturized devices and VR/AR), to AI methods for the efficient processing of multimodal data (eye-movement data, physiological sensor data, and images), to holistic, real-time-capable, user-centered adaptive solutions. A further important focus of our research is the development of privacy-preserving AI methods for multimodal human-machine interaction.

Many of our datasets and algorithms can be found here.