
FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments

  • This paper introduces FaceHaptics, a novel haptic display based on a robot arm attached to a head-mounted virtual reality display. It provides localized, multi-directional, and movable haptic cues, in the form of wind, warmth, moving and single-point touch events, and water spray, to dedicated parts of the face not covered by the head-mounted display. The easily extensible system can, in principle, mount any type of compact haptic actuator or object. User study 1 showed that users appreciate the directional resolution of the cues and can judge wind direction well, especially when they move their head and the wind direction is adjusted dynamically to compensate for head rotations. Study 2 showed that adding FaceHaptics cues to a VR walkthrough can significantly improve user experience, presence, and emotional responses.
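The dynamic compensation mentioned in study 1 can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a yaw-only simplification in which the system reads the head yaw from the HMD tracker and re-aims the arm so the wind appears to come from a fixed direction in the world:

```python
def compensate_wind_direction(world_wind_deg: float, head_yaw_deg: float) -> float:
    """Return the wind source direction in head-local coordinates (degrees).

    world_wind_deg: direction the wind comes from, fixed in world coordinates.
    head_yaw_deg:   current head yaw reported by the HMD tracker.

    Subtracting the head yaw from the world-fixed wind direction keeps the
    perceived wind source stationary in the world as the user turns their head;
    the result would drive the robot arm's target angle around the face.
    """
    return (world_wind_deg - head_yaw_deg) % 360.0
```

For example, if the wind source sits at 30 degrees in the world and the user turns their head 30 degrees toward it, the arm is commanded to 0 degrees, i.e. straight ahead of the face.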

Metadata
Document Type: Conference Object
Language: English
Authors: Alexander Wilberz, Dominik Leschtschow, Christina Trepkowski, Jens Maiero, Ernst Kruijff, Bernhard Riecke
Parent Title (English): CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, April 25–30, 2020, Honolulu, HI, USA
First Page: 1
Last Page: 14
ISBN: 978-1-4503-6708-0
DOI: https://doi.org/10.1145/3313831.3376481
Publisher: ACM
Date of first publication: 2020/04/25
Note:
© 2020 Association for Computing Machinery. Abstracting with credit is permitted.
Departments, institutes and facilities: Fachbereich Informatik
Institute of Visual Computing (IVC)
Dewey Decimal Classification (DDC): 0 Computer science, information & general works / 00 Computer science, knowledge & systems / 006 Special computer methods
Entry in this database: 2020/04/29