[iva] *New Issue* ACM Transactions on Human-Robot Interaction 12(4) December 2023
young at cs.umanitoba.ca
Wed Jan 3 16:23:27 CET 2024
============================================
ACM Transactions on Human-Robot Interaction
============================================
We are pleased to announce the publication of Volume 12, Issue 4, December 2023.
https://dl.acm.org/toc/thri/2023/12/4
NEWS: ACM THRI has received its first Impact Factor: the 2022 IF is 5.1 (Clarivate Analytics Journal Citation Reports).
============================================
Perspectives Articles:
============================================
Affective Corners as a Problematic for Design Interactions
Katherine Harrison, Ericka Johnson
Abstract: Domestic robots are already commonplace in many homes, while humanoid companion robots like Pepper are increasingly becoming part of different kinds of care work. Drawing on fieldwork at a robotics lab, as well as our personal encounters with domestic ...
DOI: https://doi.org/10.1145/3596452.
============================================
Regular Papers:
============================================
It Takes Two: Using Co-creation to Facilitate Child-Robot Co-regulation
Mike E. U. Ligthart, Mark A. Neerincx, Koen V. Hindriks
Abstract: While interacting with a social robot, children have a need to express themselves and have their expressions acknowledged by the robot—a need that is often unaddressed by the robot, due to its limitations in understanding the expressions of children. To ...
DOI: https://doi.org/10.1145/3593812.
Virtual, Augmented, and Mixed Reality for Human-robot Interaction: A Survey and Virtual Design Element Taxonomy
Michael Walker, Thao Phung, Tathagata Chakraborti, Tom Williams, Daniel Szafir
Abstract: Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI) has been gaining considerable attention in HRI research in recent years. However, the HRI community lacks a set of shared terminology and framework for characterizing aspects of ...
DOI: https://doi.org/10.1145/3597623.
The Power of Robot-mediated Play: Forming Friendships and Expressing Identity
Verónica Ahumada-Newhart, Margaret Schneider, Laurel D. Riek
Abstract: Tele-operated collaborative robots are used by many children for academic learning. However, as child-directed play is important for social-emotional learning, it is also important to understand how robots can facilitate play. In this article, we present ...
DOI: https://doi.org/10.1145/3611656.
Introduction to the Special Issue on Sound in Human-Robot Interaction
Frederic Robinson, Hannah Pelikan, Katsumi Watanabe, Luisa Damiano, Oliver Bown, Mari Velonaki
DOI: https://doi.org/10.1145/3632185.
Nonverbal Sound in Human-Robot Interaction: A Systematic Review
Brian J. Zhang, Naomi T. Fitter
Abstract: Nonverbal sound offers great potential to enhance robots’ interactions with humans, and a growing body of research has begun to explore nonverbal sound for tasks such as sound source localization, explicit communication, and improving sociability. However,...
DOI: https://doi.org/10.1145/3583743.
Is Someone There or Is That the TV? Detecting Social Presence Using Sound
Nicholas C. Georgiou, Rebecca Ramnauth, Emmanuel Adeniran, Michael Lee, Lila Selin, Brian Scassellati
Abstract: Social robots in the home will need to solve audio identification problems to better interact with their users. This article focuses on the classification between (a) natural conversation that includes at least one co-located user and (b) media that is ...
DOI: https://doi.org/10.1145/3611658.
“Who Said That?” Applying the Situation Awareness Global Assessment Technique to Social Telepresence
Adam K. Coyne, Keshav Sapkota, Conor McGinn
Abstract: As with all remotely controlled robots, successful teleoperation of social and telepresence robots relies greatly on operator situation awareness; however, existing situation awareness measurements, most being originally created for military purposes, are ...
DOI: https://doi.org/10.1145/3592801.
Sounding Robots: Design and Evaluation of Auditory Displays for Unintentional Human-robot Interaction
Bastian Orthmann, Iolanda Leite, Roberto Bresin, Ilaria Torre
Abstract: Non-verbal communication is important in HRI, particularly when humans and robots do not need to actively engage in a task together, but rather they co-exist in a shared space. Robots might still need to communicate states such as urgency or availability, ...
DOI: https://doi.org/10.1145/3611655.
The Effects of Natural Sounds and Proxemic Distances on the Perception of a Noisy Domestic Flying Robot
Ziming Wang, Ziyi Hu, Björn Rohles, Sara Ljungblad, Vincent Koenig, Morten Fjeld
Abstract: When flying robots are used in close-range interaction with humans, the noise they generate, also called consequential sound, is a critical parameter for user acceptance. We conjecture that there is a benefit in adding natural sounds to noisy domestic ...
DOI: https://doi.org/10.1145/3579859.
New Design Potentials of Non-mimetic Sonification in Human–Robot Interaction
Elias Naphausen, Andreas Muxel, Jan Willmann
Abstract: With the increasing use and complexity of robotic devices, the requirements for the design of human–robot interfaces are rapidly changing and call for new means of interaction and information transfer. On that scope, the discussed project—being developed ...
DOI: https://doi.org/10.1145/3611646.
Probing Aesthetics Strategies for Robot Sound: Complexity and Materiality in Movement Sonification
Adrian B. Latupeirissa, Claudio Panariello, Roberto Bresin
Abstract: This article presents three studies where we probe aesthetics strategies of sound produced by movement sonification of a Pepper robot by mapping its movements to sound models. We developed two sets of sound models. The first set was made by two sound ...
DOI: https://doi.org/10.1145/3585277.
The Sound of Swarm. Auditory Description of Swarm Robotic Movements
Maria Mannone, Valeria Seidita, Antonio Chella
Abstract: Movements of robots in a swarm can be mapped to sounds, highlighting the group behavior through the coordinated and simultaneous variations of musical parameters across time. The vice versa is also possible: Sound parameters can be mapped to robotic ...
DOI: https://doi.org/10.1145/3596203.
Robots’ “Woohoo” and “Argh” Can Enhance Users’ Emotional and Social Perceptions: An Exploratory Study on Non-lexical Vocalizations and Non-linguistic Sounds
Xiaozhen Liu, Jiayuan Dong, Myounghoon Jeon
Abstract: As robots have become more pervasive in our everyday life, social aspects of robots have attracted researchers’ attention. Because emotions play a crucial role in social interactions, research has been conducted on conveying emotions via speech. Our study ...
DOI: https://doi.org/10.1145/3626185.
Which Voice for which Robot? Designing Robot Voices that Indicate Robot Size
Kerstin Fischer, Oliver Niebuhr
Abstract: Many social robots will have the capacity to interact via speech in the future, and thus they will have to have a voice. However, so far it is unclear how we can create voices that fit their robotic speakers. In this article, we explore how robot voices ...
DOI: https://doi.org/10.1145/3632124.
============================================
ACM THRI welcomes contributions from across HRI and Robotics. For details on the journal, information for authors, and upcoming Special Issues, please visit the ACM THRI website: http://thri.acm.org
Odest Chadwicke Jenkins
Selma Sabanovic
ACM THRI Editors-in-Chief
James Young, University of Manitoba
ACM THRI Managing Editor