Current project
Context
My research is part of GRACE (GRoups' Analysis for automated Cohesion Estimation), a fundamental research project. GRACE is a three-year project funded by the French National Research Agency (ANR) under its JCJC program (project ANR-18-CE33-0003). It started in April 2019 and will be completed by June 2022.
Scope of the project
The project aims to develop a computational model of cohesion among humans that integrates the Task and Social dimensions of cohesion and accounts for their relationship and development over time. The model will be fed with multimodal nonverbal descriptors of cohesion computed at the individual and group levels.
The project encompasses the following scientific, technological, and community-building objectives:
Scientific objectives: to gain a deeper understanding of cohesion and, in particular, of the structural and temporal relationship between its major components, that is, the task and social components. Building on existing research in the Social Sciences, GRACE will provide novel quantitative evidence to enhance our scientific knowledge of cohesion and lay solid foundations for developing algorithms for its automated estimation.
Technological objectives: to investigate suitable technological solutions for collecting multimodal data from small groups, and to develop software modules for automatically estimating cohesion and its components while also accounting for the temporal dimension. GRACE will explore several sensing platform settings, drawing on recent improvements in wearable motion-capture technology and computer-vision-based algorithms for multiparty detection and tracking. This will open new opportunities to develop, in the future, applications that enhance interactions among humans as well as between humans and machines (e.g., virtual agents and robots).
Community-building objectives: to improve scientific exchanges among researchers, with the final aim of contributing to an interdisciplinary scientific community working on emergent states and sharing the same research questions and methodological workflows. This could significantly increase the number and quality of scientific collaborations and provide a basis for new scientific projects at the national and European levels.
The impact of GRACE is expected in terms of:
- Requirements for a new generation of software applications capable of providing feedback on group processes (e.g., in meetings or during surgery)
- Endowing artificial agents (e.g., virtual agents, robots) with skills to monitor and trigger cooperative behaviors in both everyday activities and specialized tasks. This will open new market opportunities and increase the competitiveness of companies in the area of Social Signal Processing.
Achievements
Collection of the GAME-ON multimodal dataset
In October 2019, we collected a multimodal dataset at Casa Paganini - InfoMus, Genoa, Italy. The dataset comprises more than 30 hours of video, audio, and motion capture data (i.e., rotations and translations of 17 points on each participant's body). More details about the dataset are available in the open access paper describing the data collection (see paper).
If you are interested in learning more or in working with this new multimodal dataset dedicated to the study of cohesion and other constructs such as leadership, please feel free to reach out to me!
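For a concrete idea of how individual-level nonverbal descriptors can be derived from the motion-capture stream, here is a minimal Python sketch. It assumes the joint translations have already been loaded into a NumPy array; the frame rate and the simple movement-energy descriptor are illustrative assumptions, not the project's actual feature pipeline or the dataset's file format.

```python
import numpy as np

N_JOINTS = 17  # the dataset tracks 17 points on each participant's body

def movement_energy(positions: np.ndarray, fps: float = 50.0) -> np.ndarray:
    """Frame-wise movement energy for one participant.

    positions: array of shape (n_frames, N_JOINTS, 3) holding joint
    translations (the dataset also provides rotations, ignored here).
    fps: assumed capture frame rate (illustrative value).
    Returns an array of shape (n_frames - 1,) with the mean joint speed
    between consecutive frames, a simple individual-level descriptor.
    """
    velocities = np.diff(positions, axis=0) * fps   # (n_frames-1, N_JOINTS, 3)
    speeds = np.linalg.norm(velocities, axis=-1)    # (n_frames-1, N_JOINTS)
    return speeds.mean(axis=-1)

# Toy usage with random data standing in for a real recording.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_positions = rng.normal(size=(500, N_JOINTS, 3))
    energy = movement_energy(fake_positions)
    print(energy.shape, float(energy.mean()))
```

A group-level descriptor could then be obtained, for example, by aggregating or correlating such individual signals across participants.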
ICMI 2021 Best Paper Award
At the 23rd ACM International Conference on Multimodal Interaction (ICMI 2021) in Montreal, Canada, our paper entitled "Exploiting the Interplay between Social and Task Dimensions of Cohesion to Predict its Dynamics Leveraging Social Sciences" received the Best Paper Award out of the 93 accepted papers!
Details about the paper can be found in the ICMI'21 proceedings (abstract and paper), while a French abstract is available in the Télécom Paris Newsroom.
The author version of the paper can also be downloaded HERE.
Latest updates
Make sure you follow GRACE on its website and social media (Twitter and Instagram).