Workshop
This workshop is part of the 23rd ACM International Conference on Multimodal Interaction (ICMI), to be held in Montreal, Canada. It is a sequel to the workshops held at ICMI 2018 (“Modelling Cognitive Processes from Multimodal Signals”) and ICMI 2020 (“Workshop on modelling socio-emotional and cognitive processes from multimodal data in the wild”).
Multimodal signal processing in HRI and HCI is entering an increasingly applied stage, with systems now providing engaging interaction experiences in everyday-life contexts. However, behaviors that have been adequately understood and trained in one context may perform rather poorly when deployed in the wild.
Multimodal signal processing is essential for the design of more intelligent, adaptive, and even empathic applications in the wild. However, important issues remain largely unresolved, ranging from the low-level processing and integration of noisy data streams, through theoretical pitfalls, to increasingly pressing ethical questions about what artificial systems and machine learning can and should do.
In this workshop, we provide a forum for discussing the state of the art in modeling user states from multimodal signals in the wild. A particular focus is on adaptive human-robot systems with live feedback from body dynamics and physiological sensing. We welcome contributions that combine measures of socio-emotional engagement, mental effort, stress, and the dynamics of bodily signals with measures of cognitive load to develop more robust and predictive models.
Topics of interest
- Studies bridging multimodal research between the laboratory and the wild.
- Cognition-adaptive human-computer interfaces.
- Body dynamics analysis for load identification.
- Modelling of emotions and social actions.
- Eye tracking and attention detection.
- Multimodal engagement, attention, stress, memory and workload estimation.
- Modeling and response estimation with biological signals (e.g., EEG, EDA, EMG, HR).
- Experiment design for cognitive analysis.
- Interdisciplinary collaborations to understand the underpinnings of multimodal data.
Keynote Speaker
Valeria Villani
A Framework for Affect-Based Natural Human-Robot Interaction
Valeria Villani has been an Assistant Professor at the Department of Sciences and Methods for Engineering of the University of Modena and Reggio Emilia since 2017. Her research focuses on the design of human-centred user interfaces for efficient cooperation between humans and industrial machines or robots, and on biomedical signal processing for robot control and affective human-robot interaction. She received her B.Sc. and M.Sc. in Biomedical Engineering from the University Campus Bio-Medico of Rome in 2006 and 2009, respectively, and her Ph.D. in Biomedical Engineering from the same university in 2013, focusing on biomedical signal processing with an emphasis on ECG signals. She received the Best Paper Award at ISABEL 2011 and the Mortara Fellowship at CinC 2014. She has served as an Associate Editor for IEEE ICRA since 2018 and as Guest Editor for the Special Issue on Human-Robot Collaboration in Industrial Applications of Mechatronics (Elsevier) in 2018. She was General Chair of the 12th International Workshop on Human-Friendly Robotics (HFR 2019) and co-organized the workshop “WORKMATE 2018: the WORKplace is better with intelligent, collaborative, robot MATEs” at IEEE ICRA 2018.
Important Dates
- August 6, 2021 (extended): Submission deadline
- August 31, 2021: Notification of acceptance
- September 18, 2021: Camera-ready versions
- October 22, 2021: Workshop date
Submission
Submissions to this workshop must use the ACM SIG templates and fit one of the following formats:
- Full paper: 8 page limit, excluding references (+ optional auxiliary materials).
- Short paper: 4 page limit, with 1 extra page for references and appendices only.
- Poster abstract: 3 page limit, with 1 extra page for references only.
Links to the ACM SIG templates are available on the ACM website (please use the “sample-sigconf.tex” template, in single column). An Overleaf template for all three submission formats is directly available here. Word authors can find the ACM interim layout template here. Papers should be submitted via the Microsoft Conference Management Toolkit.
All accepted papers will be archived in the ACM ICMI proceedings, available online.
Workshop schedule
- 8:00am - 8:10am: Opening remarks by workshop organizers
- 8:10am - 9:00am: Keynote talk by Valeria Villani (Recording)
- 9:00am - 9:20am: Tempsett Neal, Khadija Zanna and Shaun Canavan, “Clustering of Physiological Signals by Emotional State, Race and Sex” (Recording)
- 9:20am - 9:30am: Sayeed Kizuk and Pascal Fortin, “Mindscape: Transforming Multimodal Physiological Signals into an Application Specific Reference Frame” (Recording)
- 9:30am - 9:50am: Hendrik Voss, Heiko Wersing and Stefan Kopp, “Addressing data scarcity in multimodal user state recognition by combining semi-supervised and supervised learning” (Recording)
- 9:50am - 10:00am: Ehsan Sobhani, Kian Jalaleddini, Nerea Urrestilla Anguiozar and David St-Onge, “Neuromuscular Performance and Injury Risk Assessment Using Fusion of Multimodal Biophysical and Cognitive Data” (Recording)
- 10:00am - 10:20am: Coffee break
- 10:20am - 10:40am: Kana Miyamoto, Hiroki Tanaka and Satoshi Nakamura, “Meta-Learning for Emotion Prediction from EEG while Listening to Music” (Recording)
- 10:40am - 10:50am: Marcel Kaufmann, Katherine Sheridan and Giovanni Beltrame, “Towards Human-in-the-Loop Autonomous Multi-Robot Operations” (Recording)
- 10:50am - 11:10am: Andreas Foltyn and Jessica Deuschel, “Towards Reliable Multimodal Stress Detection under Distribution Shift” (Recording)
- 11:10am: Closing remarks
Organizers
Dennis Küster
University of Bremen (Germany)
Is a senior researcher and science manager at the interdisciplinary high-profile area “Minds, Media, Machines” at the University of Bremen, Germany. He was awarded his PhD degree on the role of social context for facial expressions at Jacobs University Bremen, Germany, in 2008. His research interests focus on multimodal sensing of emotional states, social signaling, attention, and engagement in interaction with cognitive systems and social robots. He studies emotions, empathy, and attention from a multi-level appraisal perspective based on biological, subjective, and behavioral measures.
Felix Putze
University of Bremen (Germany)
Is a research group leader at the Cognitive Systems Lab at the University of Bremen, Germany. His research interests lie in cognitive adaptive interaction systems, on which he wrote his PhD thesis at the Karlsruhe Institute of Technology in 2014. Central information sources for such systems are multimodal interfaces that estimate user states such as workload, attention, or confusion. In his research project DINCO, he works on the detection of individual interaction obstacles and competencies. He was the lead organizer of the above-mentioned 2018 ICMI workshop.
David St-Onge
École de Technologie Supérieure (Canada)
Is an associate professor in the Department of Mechanical Engineering at ÉTS and director of the Lab of Intuitive and Natural Interaction for Teleoperated (INIT) Robots. He holds a PhD in mechanical engineering (space robotics) from Université Laval (Canada), where he also completed his master’s degree on aircraft control. His research bridges many disciplines, strengthened by his networks: REPARTI (human-centered robotics) and Hexagram (art-science-technology). He is currently conducting research on robotic swarm control through expressive motions (human and robot) and on cognitive ergonomics in robot team operations.
Tanja Schultz
University of Bremen (Germany)
Has been a professor in the Faculty of Mathematics and Computer Science at the University of Bremen since 2015, where she heads the Cognitive Systems Lab. Her research concentrates on cognitive technical systems for human-machine interaction based on speech and biosignals. She connects machine learning with innovations in biosignal processing, such as “silent speech communication” and “brain-to-text” systems. Tanja Schultz is a spokesperson of the “Minds, Media, Machines” high-profile area at the University of Bremen.
Pascal E. Fortin
McGill University (Canada)
Is a PhD candidate at the Shared Reality Lab (SRL) under the supervision of Prof. Jeremy Cooperstock. His research focuses on the design and evaluation of interfaces enabling closed-loop interactions with mobile computing systems such as smartphones. He is particularly interested in how the perception of system-triggered interactions impacts users’ psychophysiological states and how these changes can be harnessed to adapt the system’s information presentation strategy. Aside from his core research, Pascal is actively involved in a number of haptics projects lying at the intersection of social and physiological computing.
Nerea Urrestilla
École de Technologie Supérieure (Canada)
Is a PhD candidate at the Lab of INIT Robots under the supervision of Prof. David St-Onge. She holds a bachelor’s and a master’s degree in biomedical engineering from the University of Navarra (Spain), and completed an internship at Aalborg University (Denmark) on transferring human motion knowledge to robotic systems. Her main focus is on estimating the user’s cognitive load from body dynamics and adapting robot behaviour to the user’s workload.