The workshop is organized as part of the Mensch und Computer (MuC) 2023 conference.

Mensch und Computer 2023
3 – 6 September 2023, Rapperswil am Zürichsee, Switzerland

Mensch und Computer 2023 (mensch-und-computer.de)

Workshop Schedule

9:00 – 10:00 – Opening and introductions
10:00 – 10:30 – Paper session 1
10:30 – 11:00 – Coffee
11:00 – 12:00 – Paper sessions 2 & 3
12:00 – 12:30 – Interactivity 1
12:30 – 14:00 – Lunch
14:00 – 15:30 – Interactivity 2
15:30 – 16:00 – Coffee and demos
16:00 – 16:30 – Discussion and closing
16:30 – End of the workshop

The field of robotics has evolved to include social interaction with humans. Research has investigated features such as gaze, head movements, nodding, and body orientation to make robot behavior appear more natural in social situations. Whether robots are adopted and accepted in social contexts depends on technical capabilities, design, contextual requirements, and other social factors. Previous research has shown that clearly signaling intent improves trust, acceptance, and situational awareness in human-robot interaction and collaboration. This workshop examines how intent communication affects the user experience and social interaction with robots in public spaces and other social settings. We hope to engage both researchers and practitioners interested in this subfield of human-robot interaction.

This is the second edition of the in-person RoboX workshop at the Mensch und Computer conference, addressing interaction with robots, this time focusing especially on intent communication and social interaction in public spaces. The workshop aims to bring together researchers, designers, and practitioners working in human-robot interaction and neighboring disciplines to explore intent communication, user experience, and interaction with social robots.

Participants will discuss new possibilities and engage in collaborative dialogue. We aim for the workshop to stimulate discussion about the current state of research, serve as a platform for expanding the social robotics research network, and help build a roadmap for future research.

The workshop invites submissions of case studies, applications, methodological notes, and position papers on topics including, but not limited to:

  • verbal and non-verbal communication with robots
  • multimodal communication with robots
  • interaction with social robots (e.g., in public spaces)
  • interaction with robots in cross-cultural contexts
  • accessible interaction with robots and inclusive HRI design
  • conducting field studies with robots (e.g., in public spaces)
  • transfer of learnings from other domains (e.g., automated driving, mobility)
  • simulation of human-robot interaction
  • service concepts integrating robots as part of the service experience
  • user experience with robots in different contexts (e.g., healthcare)
  • robots in education, rescue or other professional contexts
  • robots and HRI in public spaces or mobility contexts
  • telepresence or robots as avatars
  • emotions and affect with robots
  • social robots as mediators of human-human interactions

Important dates

Submission: 16.06.2023

Notification: 03.07.2023

Final submission: 14.07.2023

Workshop: 03.09.2023

Submission instructions

Submit your workshop papers via EasyChair through this link.
Workshop papers should follow the ACM template, be 2–4 pages long, and not be anonymized. Submissions will be reviewed by the workshop organizers and other experts in the field.
Templates:
Overleaf & LaTeX, using \documentclass[sigconf,screen]{acmart} (see the minimal skeleton below)
Word template
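
For orientation, a minimal LaTeX skeleton for the ACM sigconf class might look like the sketch below; the title, author, affiliation, and email are placeholders rather than required wording:

  \documentclass[sigconf,screen]{acmart}

  \begin{document}

  % Placeholder metadata -- replace with your paper's details
  \title{Your Workshop Paper Title}
  \author{First Author}
  \affiliation{%
    \institution{Your Institution}
    \country{Your Country}}
  \email{first.author@example.org}

  \maketitle

  \section{Introduction}
  % 2--4 pages in total, not anonymized
  Your text goes here.

  \end{document}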

List of Accepted Papers:

Will be updated

Workshop Organizers:

Jonna Häkkilä, University of Lapland, Finland
Khaled Kassem, TU Wien, Austria
Emma Kirjavainen, University of Lapland, Finland
Johannes Kraus, Ulm University, Germany
Florian Michahelles, TU Wien, Austria
Heiko Müller, University of Oldenburg, Germany
Bastian Pfleging, TU Bergakademie Freiberg, Germany
Norman Seyffer, TU Bergakademie Freiberg, Germany
Kai Erik Trost, HdM Stuttgart, Germany