ANR muDialBot Project

MUlti-party perceptually-active situated DIALog for human-roBOT interaction

In muDialBot, our ambition is to proactively incorporate human-like behavioral traits into human-robot spoken communication. We aim to reach a new stage in harnessing the rich information carried by the audio and visual data streams coming from humans. In particular, extracting verbal and non-verbal events should improve a robot's decision-making abilities, allowing it to manage turn-taking more naturally and to switch between group interactions and face-to-face dialogues as the situation requires.

There has been growing interest recently in companion robots capable of assisting people in their daily lives and communicating with them effectively. These robots are perceived as social entities, and studies have highlighted their relevance to health and psychological well-being. Patients, their families, and healthcare professionals will better appreciate the potential of these robots once certain limitations are overcome, in particular their restricted abilities to move, see, and listen, so that they can communicate naturally with humans, beyond what touchscreen displays and voice commands already enable.

The scientific and technological outcomes of the project will be implemented on a commercial social robot and tested and validated across multiple use cases in a day-hospital unit. Large-scale data collection will complement the in-situ tests to fuel future research.

List of Partners:

Project Coordinator: LIA

Scientific Manager for LIA: Fabrice LEFEVRE

Start Date: 01/01/2021

End Date: 31/12/2024
