[iva] [CfP] Deadline Extension - GENEA Workshop @ICMI 2022
Carla De Oliveira Viegas
cviegas at andrew.cmu.edu
Mon Jul 18 10:54:29 CEST 2022
Dear colleagues,
We have extended the submission deadline for the 3rd Workshop on Generation
and Evaluation of Non-verbal Behaviour for Embodied Agents (GENEA). *Abstracts
are now due on July 26 and papers on July 29.* The workshop will be
co-located with the International Conference on Multimodal Interaction
(ICMI 2022), to be held in Bangalore and online. Accepted submissions will
be included in the adjunct ACM ICMI proceedings.
More information can be found in the call for papers text below. Apologies
if you receive this message more than once.
Thank you,
Carla Viegas
on behalf of the GENEA Workshop 2022 organisers
/********************************************************************
Call for papers: GENEA (Generation and Evaluation of Non-verbal Behaviour
for Embodied Agents) Workshop 2022 at ICMI 2022 (hybrid)
Date: 7 or 11 November 2022 (exact date to be confirmed)
Website: https://genea-workshop.github.io/2022/workshop
********************************************************************/
Important dates
*********************
July 26, 2022 – Abstract deadline
July 29, 2022 – Submission deadline (extended from July 20)
Aug. 10, 2022 – Notification of acceptance
Aug. 17, 2022 – Deadline for camera-ready papers
Nov. 7 or 11, 2022 – Workshop (at ICMI)
Overview
*********************
GENEA 2022 is the third GENEA workshop and an official workshop of ACM
ICMI’22, which will take place both in Bangalore, India, and online. Accepted
submissions will be included in the adjunct ACM ICMI proceedings.
Generating non-verbal behaviours, such as gesticulation, facial expressions
and gaze, is of great importance for natural interaction with embodied
agents such as virtual agents and social robots. At present, behaviour
generation is typically powered by rule-based systems, data-driven
approaches, and their hybrids. For evaluation, both objective and
subjective methods exist, but their application and validity are frequently
a point of contention.
This workshop asks, “What will be the behaviour-generation methods of the
future? And how can we evaluate these methods using meaningful objective
and subjective metrics?” The aim of the workshop is to bring together
researchers working on the generation and evaluation of non-verbal
behaviours for embodied agents to discuss the future of this field. To
kickstart these discussions, we invite all interested researchers to submit
a paper for presentation at the workshop.
Paper topics include (but are not limited to) the following:
********************
- Automated synthesis of facial expressions, gestures, and gaze movements
- Audio- and music-driven nonverbal behaviour synthesis
- Closed-loop nonverbal behaviour generation (from perception to action)
- Nonverbal behaviour synthesis in two-party and group interactions
- Emotion-driven and stylistic nonverbal behaviour synthesis
- New datasets related to nonverbal behaviour
- Believable nonverbal behaviour synthesis using motion-capture and 4D scan data
- Multi-modal nonverbal behaviour synthesis
- Interactive/autonomous nonverbal behaviour generation
- Subjective and objective evaluation methods for nonverbal behaviour synthesis
- Guidelines for nonverbal behaviours in human-agent interaction
Submission guidelines
*********************
We accept long (8 pages) and short (4 pages) paper submissions for
double-blind review, all in the double-column ACM conference format. Pages
containing only references do not count toward the page limit for any of
the paper types. Submissions should be made in PDF format through
OpenReview.
To encourage authors to make their work reproducible and reward the effort
that this requires, we have introduced the GENEA Workshop Reproducibility
Award <https://genea-workshop.github.io/2021/#reproducibility-award>.
Keynote speakers
********************************
Judith Holler <https://www.ru.nl/english/people/holler-j/>, Donders
Institute for Brain, Cognition and Behaviour, Radboud University,
Netherlands
Judith is an Associate Professor and PI at the Donders Institute for Brain,
Cognition and Behaviour (Radboud University) and leader of the research
group Communication in Social Interaction at the Max Planck Institute for
Psycholinguistics. Judith has been a Marie Curie Fellow and currently holds
an ERC Consolidator grant. Her work focuses on the interplay of speech and
visual bodily signals from the hands, head, face, and eye gaze in
communicating meaning in interaction. In her scientific approach, she
combines analyses of natural language corpora with experimental testing,
and methods from a wide range of fields, including gesture studies,
linguistics, psycholinguistics, and neuroscience. In her most recent
projects, she also combines these methods with cutting-edge tools and
techniques, such as virtual reality, mobile eye-tracking, and dual-EEG, to
further our insight into multimodal communication and coordination in
social interaction.
Carlos Ishi <https://dil.atr.jp/~carlos/>, RIKEN, Guardian Robot Project;
ATR, Hiroshi Ishiguro Lab., Japan
Carlos T. Ishi received his Ph.D. degree in engineering from The University
of Tokyo, Japan. He joined ATR Intelligent Robotics and Communication Labs
in 2005 and became the group leader of the Dept. of Sound Environment
Intelligence in 2013. He joined the Guardian Robot Project, RIKEN, in 2020.
His research topics include the analysis and processing of dialogue speech
and non-verbal behaviours applied to human-robot interaction.
Organisers
********************************
Pieter Wolfert, Ghent University, Belgium
Taras Kucherenko, SEED, Electronic Arts, Sweden
Zerrin Yumak <http://www.zerrinyumak.com/>, Utrecht University, The Netherlands
Gustav Eje Henter, KTH Royal Institute of Technology, Sweden
Youngwoo Yoon, ETRI, South Korea
Carla Viegas, CMU and NOVA University Lisbon, Portugal