Get an overview of our latest research projects

Internet of Agriculture

IoA - Internet of Agriculture: Wireless Solutions for Digital Farming is a project within the framework of the strategic initiative "Agrobusiness & Food". In it, we apply our technical knowledge to farms and assist farmers. (more...)


AVIKOM

In the AVIKOM project, the ADAMAAS glasses are extended with an acoustic system (Bielefeld University of Applied Sciences) for industrial use and integrated into the existing work environments of four medium-sized partner companies and Bethel. The goal is to offer employees coordinated, adaptive audiovisual action support in the context of the digitization of working environments. (more...)

© CITEC/Thomas Schack, Uni Bielefeld


VISTA

As part of the VISTA project, an innovative assistance system is being developed to support drivers in the truck loading dock process. In field studies, and also using a VR simulator developed in-house, multi-modal data from the drivers are recorded, combined using cross-modal learning, and finally used to model the typical behavior of an experienced truck driver with recurrent neural networks (LSTM, Echo State Networks). (more...)
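The Echo State Network approach mentioned above can be illustrated with a minimal sketch: a fixed random reservoir whose only trained part is a linear readout. The signal dimensions and the toy one-step prediction task below are illustrative assumptions, not VISTA's actual data or setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 4 input channels (stand-ins for multi-modal
# driver signals) and a 100-unit reservoir.
n_in, n_res = 4, 100

# Input and recurrent weights stay fixed; only the readout is trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(inputs):
    """Drive the reservoir with a (T, n_in) sequence; return (T, n_res) states."""
    x = np.zeros(n_res)
    states = np.empty((len(inputs), n_res))
    for t, u in enumerate(inputs):
        x = np.tanh(W_in @ u + W @ x)
        states[t] = x
    return states

# Toy task: predict the next sample of a periodic signal one step ahead.
t = np.arange(600)
U = np.stack([np.sin(0.1 * t), np.cos(0.1 * t),
              np.sin(0.05 * t), np.cos(0.05 * t)], axis=1)
y = np.sin(0.1 * (t[:-1] + 1))        # next-step target for channel 0
X = run_reservoir(U)[:-1]

# Ridge-regression readout, the only trained component of an ESN.
ridge = 1e-3
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

mse = np.mean((X @ W_out - y) ** 2)
print(f"one-step prediction MSE: {mse:.5f}")
```

Because only the readout is fitted (here by ridge regression), training is a single linear solve, which is what makes ESNs attractive for modeling behavioral time series.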

Gärtners Grüner Daumen

Gardener's Green Thumb: In this project, a system is being built to assist gardeners. From observations of the conditions under which gardeners perform which kinds of actions, the technical system learns rules describing how the experts of a specific gardening company behave, the philosophy behind their actions, etc. This knowledge can then be used to support inexperienced staff in that company during their daily routines. (more...)


RUBYDemenz

In the RUBYDemenz project, the effectiveness and optimization of a personalized human-robot interaction (RUBY) is being investigated. As a contribution to the promotion of “good care”, RUBY provides flexible, situation-adapted, supplementary support, stabilization and relief in the home care of people with dementia (MmD) for their caring relatives (pfA). As an overall intervention, RUBY comprises a robotic system and psycho-social support for the users by specially trained attendants. (more ...)


ADAMAAS

ADAMAAS combines techniques from human memory research, eye tracking and physiological measurements (such as breathing rate or heart rate), object and action recognition (computer vision), Augmented Reality (AR), and advanced cognitive assessment and intervention techniques. The system automatically detects the current state of an assembly task and provides individualized feedback at an appropriate time. (more ...)

© CITEC/Thomas Schack, Uni Bielefeld


OurPuppet

The main research objective of the OurPuppet project is to show that a novel form of human-machine interaction can support informal caregivers, relieving their worries and uncertainties about the well-being of their loved ones.
Furthermore, the communication between caregivers and care recipients should be fostered in order to maintain a certain level of communication continuity.
The technical system is introduced, motivated and accompanied by specially qualified puppet guides. (more...)

M3S (Modern Human-Machine Interface)

In the M3S project, a hybrid, non-invasive brain-machine interface is being developed for controlling computers and external devices. A new brain-machine interface based on fixation-related potentials (FRPs) was developed. The system was trained using Fisher’s Linear Discriminant Analysis and could classify intra-subject, single-trial event-related potentials (ERPs) with high accuracy. (more ...)
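As a sketch of the classification step, the following applies Fisher's Linear Discriminant Analysis to simulated two-class, single-trial feature vectors. The feature dimensions and class distributions are invented for the example; they are not M3S data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated single-trial feature vectors (e.g. ERP amplitudes at a few
# electrodes/time windows); two classes such as target vs. non-target.
n_feat = 8
A = rng.normal(0.0, 1.0, (200, n_feat))          # class 0
B = rng.normal(1.0, 1.0, (200, n_feat))          # class 1, shifted mean

# Fisher's discriminant direction: w is proportional to S_w^-1 (m1 - m0).
m0, m1 = A.mean(axis=0), B.mean(axis=0)
S_w = np.cov(A, rowvar=False) + np.cov(B, rowvar=False)
w = np.linalg.solve(S_w, m1 - m0)
threshold = w @ (m0 + m1) / 2                    # midpoint decision boundary

def classify(x):
    """Project a trial onto w and threshold the score: returns 0 or 1."""
    return int(w @ x > threshold)

preds_a = [classify(x) for x in A]
preds_b = [classify(x) for x in B]
acc = (preds_a.count(0) + preds_b.count(1)) / 400
print(f"training accuracy: {acc:.3f}")
```

The discriminant reduces each trial to a single projected score, so classification at test time is one dot product and a comparison, which suits the real-time demands of a brain-machine interface.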

Keep in touch

The internet is an increasingly important part of our daily lives and is changing the concept of communication. In the 21st century, people use social media to stay connected with friends and family, to share pictures and videos, and for many other reasons. Unfortunately, available social media platforms do not provide built-in accessibility features for people with cognitive disabilities. Understanding the importance of communication, we came up with the solution of creating a tool that is fully accessible and easy to handle for people with disabilities. (more ...)

Mobile - Mobil im Leben

The goal of ‘Mobile - Mobil im Leben’ is to develop a way for people with cognitive or physical disabilities to easily and effectively navigate and participate in public transportation. The result is a smartphone-based navigation system using an embodied virtual agent that can be custom-tailored to each user’s individual needs and adapts to changing traffic conditions in real time. (more...)

HIDE - HmI for Driving Enhancement

The goal of the HIDE project is to develop new forms of Human-Machine Interfaces for modern cars that present all relevant driving information (i.e. from the internal car sensors as well as from vehicle-to-infrastructure communication (V2I)) in such a way that the driver is not distracted from the original driving task. (more...)

EIDETIC - drivEr trainIng anD assEssmenT; a dIgital approaCh

In this project, a novel approach is being developed that uses multiple data sources, such as in-vehicle sensors/data, eye tracking and cameras monitoring the surroundings, as well as AI techniques, to provide understandable and supportive feedback for novice drivers. (more...)