H-BRS Bibliography
Comparing Non-Visual and Visual Guidance Methods for Narrow Field of View Augmented Reality Displays (2020)
Selection Performance and Reliability of Eye and Head Gaze Tracking Under Varying Light Conditions (2024)
We present a novel forearm-and-glove tactile interface that can enhance 3D interaction by guiding hand motor planning and coordination. In particular, we aim to improve hand motion and pose actions related to selection and manipulation tasks. Through our user studies, we illustrate how tactile patterns can guide the user, by triggering hand pose and motion changes, for example to grasp (select) and manipulate (move) an object. We discuss the potential and limitations of the interface, and outline future work.
In the presence of conflicting or ambiguous visual cues in complex scenes, performing 3D selection and manipulation tasks can be challenging. To improve motor planning and coordination, we explore audio-tactile cues that inform the user about the presence of objects in hand proximity, e.g., to avoid unwanted object penetrations. We do so through a novel glove-based tactile interface, enhanced by audio cues. Through two user studies, we illustrate that proximity guidance cues improve spatial awareness, hand motions, and collision avoidance behaviors, and show how proximity cues in combination with collision and friction cues can significantly improve performance.
This research investigates the efficacy of multisensory cues for locating targets in Augmented Reality (AR). Sensory constraints can impair perception and attention in AR, leading to reduced performance due to factors such as conflicting visual cues or a restricted field of view. To address these limitations, the thesis proposes head-based multisensory guidance methods that leverage audio-tactile cues to direct users' attention towards target locations. The findings demonstrate that this approach can effectively reduce the influence of sensory constraints, resulting in improved search performance in AR. Additionally, the thesis discusses the limitations of the proposed methods and provides recommendations for future research.