Project duration

01.02.2020 - 31.01.2023

Funded by

Project website

Short description

In the RUBYDemenz project, the effectiveness and optimization of a personalized human-robot interaction (RUBY) is investigated. As a contribution to the promotion of “good care”, RUBY provides flexible, situation-adapted, supplementary support, stabilization and relief in the home care of people with dementia (MmD), for the benefit of caring relatives (pfA). As an overall intervention, RUBY comprises a robotic system and psycho-social support for the users by specially trained attendants.

Our part

Perception of liveliness while increasing the effectiveness of the RUBYDemenz robotic system

The Rhein-Waal University of Applied Sciences focuses its main activities in the project on the conception and implementation of the robot's control component, which decides which action (including verbal dialogue) should be carried out in a given context. The overall robot system should appear “alive” and at the same time be effective in terms of “good care”. For effective communication, the system should adapt to the user's communication style.
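The idea of a control component that maps a context to the next action can be sketched as follows. This is a minimal illustration only: the names (`Context`, `Action`, `Controller`) and the precondition-matching rule are assumptions for the sketch, not the project's actual API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Context:
    """A hypothetical snapshot of the interaction situation."""
    time_of_day: str            # e.g. "morning", "evening"
    user_mood: str              # e.g. "calm", "agitated"
    last_action: Optional[str]  # what the robot did most recently

@dataclass
class Action:
    name: str
    preconditions: dict  # context attribute values under which the action fits

class Controller:
    """Chooses the next action (dialogue, movement, ...) for the current context."""

    def __init__(self, actions):
        self.actions = actions

    def select(self, ctx: Context) -> Optional[Action]:
        candidates = [
            a for a in self.actions
            if all(getattr(ctx, k) == v for k, v in a.preconditions.items())
            and a.name != ctx.last_action  # avoid repeating the same action
        ]
        return candidates[0] if candidates else None

# Illustrative action repertoire
actions = [
    Action("morning_greeting_dialogue", {"time_of_day": "morning"}),
    Action("calming_music", {"user_mood": "agitated"}),
]
ctx = Context(time_of_day="morning", user_mood="calm", last_action=None)
chosen = Controller(actions).select(ctx)
```

A real controller would of course score candidates rather than take the first match, but the shape of the decision (context in, action out) is the same.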



To help people suffering from dementia and their whole family

Our view on the project: The basis of the RUBYDemenz system is the technical ELISA system, which was developed in the “OurPuppet” project. ELISA provides a set of static programs. These static programs define sequences of basic actions (predefined dialogues, switching devices, movements of the head, etc.) that the ELISA system can carry out. Depending on the answers the user gives during a dialogue, different sequences can be defined for the same situation.

A communication partner who “always says or does the same things” will, over time, be perceived as boring, uninteresting and not “alive”. Several variants that take user-specific characteristics into account are therefore necessary. However, these characteristics are usually difficult to configure, partly because the users themselves are often not aware of them, and they tend to intensify as the disease progresses. The system must therefore learn from the experiences it gathers in different interaction situations, take the learned knowledge into account in future interactions, and adjust to its communication partner like a living, intelligent being.

Verbal interaction in the ELISA system is supported by a low level of non-verbal communication skills: the underlying system can look at the user, open its eyes, and move its mouth and the corners of its mouth to express friendliness and attention through a smile. The project partners want to increase the expressiveness of the robot, and these new expressive abilities must then be incorporated into the interaction. For this purpose, an extended communication model is planned, which is to be implemented as a multidimensional state model supporting non-verbal communication.
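The two requirements above, avoiding repetition and learning from the interaction, can be combined in a simple variant-selection sketch. Everything here is a hypothetical illustration: the class, the recency penalty, and the multiplicative feedback rule are assumptions for the sketch, not the project's actual learning mechanism.

```python
import random

class DialogueVariants:
    """Keeps several variants per situation, downweights the variant used last
    (so the system does not "always say the same thing"), and reinforces
    variants that got a positive user reaction."""

    def __init__(self, variants, recency_penalty=0.2, seed=None):
        self.weights = {v: 1.0 for v in variants}  # learned preferences
        self.recency_penalty = recency_penalty
        self.last = None
        self.rng = random.Random(seed)

    def pick(self):
        # Temporarily penalize the most recently used variant.
        w = {v: (self.recency_penalty if v == self.last else wt)
             for v, wt in self.weights.items()}
        choice = self.rng.choices(list(w), weights=list(w.values()))[0]
        self.last = choice
        return choice

    def feedback(self, variant, positive):
        # Learn from the interaction: strengthen well-received variants,
        # weaken poorly received ones.
        self.weights[variant] *= 1.2 if positive else 0.8

# Illustrative usage with made-up greeting variants
greetings = DialogueVariants(
    ["Good morning!", "Did you sleep well?", "Lovely day today, isn't it?"],
    seed=1,
)
first = greetings.pick()
greetings.feedback(first, positive=True)
second = greetings.pick()  # the last-used variant is now much less likely
```

A multidimensional state model would track more than one recency/preference axis (mood, time of day, disease progression), but the core loop, select, observe the reaction, update, stays the same.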

Impressions & Documents

This is the robot we are using, which we developed in the OurPuppet project.
For the music, credits go to (music by)