H-BRS Bibliography
The aim of this study is to investigate the effects of user experience (UX) on shopping mall customers’ intention to use a social robot. We therefore used a Wizard of Oz approach that enabled data collection in situ. Quantitative data was obtained from a questionnaire completed by shopping mall customers who interacted with a social robot. The data was used in a regression analysis in which user experience factors served as predictors of robot use in retail. The regression model explains up to 23.2% of the variance in customers’ intention to use a social robot. In addition, we collected qualitative data on human-robot interactions and used it to complement the interpretation of the statistical results. Our findings suggest that only hedonic qualities contribute significantly to the prediction of customers’ intention, that shopping mall customers are reluctant to grant pragmatic qualities to social robots, and that UX evaluation in HRI requires additional predictors.
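The analysis described above fits a regression model in which UX ratings predict intention to use, and reports the variance explained (R²). A minimal sketch of that idea, reduced to a single hedonic-quality predictor with entirely hypothetical Likert-scale data (the study's actual questionnaire items and dataset are not shown here):

```python
def ols_r_squared(x, y):
    """Fit y = a + b*x by ordinary least squares and return (a, b, R^2)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # slope of the predictor
    a = my - b * mx                    # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot   # R^2 = explained variance

# Hypothetical 7-point ratings: hedonic quality vs. intention to use
hedonic = [2, 3, 3, 4, 5, 5, 6, 7]
intention = [1, 3, 2, 4, 4, 6, 5, 7]
a, b, r2 = ols_r_squared(hedonic, intention)
print(b, r2)
```

The study's full model used several UX factors as predictors (multiple regression); this one-predictor version only illustrates how a single factor's contribution to R² would be computed.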
Large, high-resolution displays are well suited to creating digital environments for co-located collaborative task solving. Yet placing multiple users in a shared environment may increase the risk of interferences, causing mental discomfort and decreasing the team's efficiency. Coordination strategies and techniques have been introduced to mitigate such interferences. However, in mixed-focus collaboration scenarios users repeatedly switch between loose and tight collaboration, so different coordination techniques may be required depending on the team members' current collaboration state. To support this, systems must be able to recognize collaboration states as well as the transitions between them in order to adjust the coordination strategy appropriately. Previous studies on group behavior during collaboration in front of large displays investigated only collaborative coupling states, though, not the transitions between them. To address this gap, we conducted a study with 12 participant dyads in front of a tiled display and had them solve two tasks in two different conditions (focus and overview). We examined group dynamics and categorized transitions by changes in proximity, verbal communication, visual attention, visual interface, and gestures. The findings can be valuable for user interface design and the development of group behavior models.
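The abstract argues that a system must recognize coupling states and the transitions between them. A hedged sketch of that idea (not the study's actual system): two of the cues named above, proximity and shared visual attention, are mapped to a "tight" or "loose" coupling label, and state changes are reported as transitions. The thresholds and the input format are illustrative assumptions.

```python
def coupling_state(distance_m, shared_attention):
    """Label a moment as 'tight' or 'loose' coupling (assumed heuristic)."""
    return "tight" if distance_m < 1.5 and shared_attention else "loose"

def transitions(samples):
    """Return (index, old_state, new_state) for each state change."""
    out = []
    prev = None
    for i, (dist, att) in enumerate(samples):
        state = coupling_state(dist, att)
        if prev is not None and state != prev:
            out.append((i, prev, state))
        prev = state
    return out

# (distance between partners in metres, gazing at the same display region?)
log = [(0.8, True), (0.9, True), (2.4, False), (2.6, False), (1.0, True)]
print(transitions(log))  # two transitions: tight->loose, then loose->tight
```

A real recognizer would also weigh verbal communication, gestures, and the visual interface, as the study did, rather than two signals with fixed thresholds.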