Designing robots with socio-emotional skills is a challenging task. Robots need to provide not only physical but also social support to their human users, and to engage in interactions with them across a variety of applications, including healthcare, education, entertainment, and guidance in museums and shopping malls. The availability of commercial robots and ongoing developments in academia give us a positive outlook; however, the capabilities of current social robots remain limited.
The main challenge lies in understanding humans' intentions and behaviours, and in modelling these to design naturalistic, human-inspired behaviours for robots. Addressing this challenge successfully requires understanding the components of social interaction, including nonverbal behaviours such as gaze, silence, fast speech, interpersonal distance, posture, hand and head gestures, and facial and bodily expressions. To create socio-emotionally intelligent robots, these nonverbal cues need to be interpreted in terms of higher-level phenomena such as first impressions, personality, and emotions, and, in turn, optimal behaviours need to be defined to express these phenomena through robotic platforms in an appropriate and timely manner. Achieving this requires the fields of psychology, computer vision, signal processing, machine learning, affective computing, and human-robot interaction to interact constantly with one another. This talk will focus on the recognition of affect and perceived personality, and will present an overview of the recent research my team has conducted in these areas in the context of human-computer-robot interactions.