In this article, we report on a user study investigating the effects of multisensory cues on triggering emotional responses in immersive games. Isolating the effect of a specific sensory cue on the emotional state is a difficult feat. The experiment is the first in a series that aims to produce usable guidelines for reproducing similar emotional responses, as well as methods to measure their effects. As such, we are interested in methodologies both to design effective stimuli and to assess their quality and effect. We start by identifying the main challenges and the methodology we followed. Thereafter, we closely analyze the study results to address some of the challenges, and identify potential for improving the induced stimuli (cause) and their effect, as well as the analytical methods used to pinpoint the extent of that effect.
In this article, we report on challenges and potential methodologies to support the design and validation of multisensory techniques. Such techniques can be used to enhance engagement in immersive systems. However, designing effective techniques requires careful analysis of the effect of different cues on user engagement. The level of engagement spans the general level of presence in an environment, as well as the specific emotional response to a set trigger. Measuring and analyzing the actual effect of cues is hard, as it spans numerous interconnected issues. We identify the different challenges and potential validation methodologies that affect the analysis of multisensory cues on user engagement. In doing so, we provide an overview of issues and potential validation directions as an entry point for further research. The various challenges are supported by lessons learned from a pilot study, which focused on reflecting on the initial validation methodology by analyzing the effect of different stimuli on user engagement.
When navigating larger virtual environments and computer games, natural walking is often unfeasible. Here, we investigate how alternatives such as joystick- or leaning-based locomotion interfaces ("human joystick") can be enhanced by adding walking-related cues following a sensory substitution approach. Using a custom-designed foot haptics system and evaluating it in a multi-part study, we show that adding walking-related auditory cues (footstep sounds), visual cues (simulating bobbing head motions from walking), and vibrotactile cues (via vibrotactile transducers and bass shakers under participants' feet) could all enhance participants' sensation of self-motion (vection) and involvement/presence. These benefits occurred similarly for seated joystick and standing leaning locomotion. Footstep sounds and vibrotactile cues also enhanced participants' self-reported ability to judge self-motion velocities and distances traveled. Compared to seated joystick control, standing leaning enhanced self-motion sensations. Combining standing leaning with a minimal walking-in-place procedure, however, showed no benefits and reduced usability. Together, the results highlight the potential of incorporating walking-related auditory, visual, and vibrotactile cues for improving user experience and self-motion perception in applications such as virtual reality, gaming, and telepresence.
Comparing Non-Visual and Visual Guidance Methods for Narrow Field of View Augmented Reality Displays
(2020)
In the presence of conflicting or ambiguous visual cues in complex scenes, performing 3D selection and manipulation tasks can be challenging. To improve motor planning and coordination, we explore audio-tactile cues that inform the user about the presence of objects in hand proximity, e.g., to avoid unwanted object penetrations. We do so through a novel glove-based tactile interface, enhanced by audio cues. Through two user studies, we illustrate that proximity guidance cues improve spatial awareness, hand motions, and collision avoidance behaviors, and show how proximity cues in combination with collision and friction cues can significantly improve performance.
We present a novel forearm-and-glove tactile interface that can enhance 3D interaction by guiding hand motor planning and coordination. In particular, we aim to improve hand motion and pose actions related to selection and manipulation tasks. Through our user studies, we illustrate how tactile patterns can guide the user by triggering hand pose and motion changes, for example to grasp (select) and manipulate (move) an object. We discuss the potential and limitations of the interface and outline future work.