3D User Interfaces (2005)
Recent studies have shown that a careful combination of multiple sensory channels can produce so-called multisensory binding effects, which are beneficial for collision-detection and texture-recognition feedback. During the design of a new pen-input device called the Tactylus, we focused on exploring multisensory effects of audiotactile cues to create a novel and effective way of interacting in virtual environments, with the aim of overcoming several problems observed in current devices.
In this paper, we report on four generations of display-sensor platforms for handheld augmented reality. The paper is organized as a compendium of requirements that guided the design and construction of each generation of the handheld platforms. The first generation, reported in [17], was the result of various studies on ergonomics and human factors. Each subsequent iteration in the design-production process was guided by experiences and evaluations that yielded new guidelines for future versions. We describe the evolution of hardware for handheld augmented reality, along with the requirements and guidelines that motivated its construction.
Environment monitoring with multiple observation cameras is increasingly popular. Different techniques exist to visualize the incoming video streams, but few evaluations are available to identify the most suitable one for a given task and context. This article compares three techniques for browsing video feeds from cameras located around the user in an unstructured manner. The techniques allow mobile users to gain extra information about the surroundings, objects, and actors in the environment by observing a site from different perspectives. They relate local and remote cameras topologically, via a tunnel, or via a bird's-eye viewpoint. Their common goal is to enhance the viewer's spatial awareness without relying on a model or prior knowledge of the environment. We introduce several factors of spatial awareness inherent to multi-camera systems and present a comparative evaluation of the proposed techniques with respect to spatial understanding and workload.
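To illustrate the kind of geometry a model-free bird's-eye overview relies on, the sketch below places a remote camera on an egocentric map as a distance and a bearing relative to the user's heading. This is not code from the article; the function name, coordinate convention (2D positions, heading in degrees counterclockwise from the +x axis), and sign convention (positive bearing to the user's left) are assumptions made for illustration.

```python
import math

def birds_eye_placement(user_pos, user_heading_deg, camera_pos):
    """Place a remote camera on a bird's-eye map centred on the user.

    Returns (distance, relative_bearing_deg): how far away the camera is
    and at what angle relative to the user's current heading, so an
    overview display can draw a camera icon without any environment model.
    """
    dx = camera_pos[0] - user_pos[0]
    dy = camera_pos[1] - user_pos[1]
    distance = math.hypot(dx, dy)
    # Absolute bearing of the camera, then made relative to the user's heading
    # and normalised into [-180, 180).
    bearing = math.degrees(math.atan2(dy, dx)) - user_heading_deg
    bearing = (bearing + 180.0) % 360.0 - 180.0
    return distance, bearing

# A camera 10 m east of a user who faces north (heading 90 deg) shows up
# 10 m away at -90 deg, i.e. to the user's right.
d, b = birds_eye_placement((0.0, 0.0), 90.0, (10.0, 0.0))
```

Because the placement is purely relative to the user's pose, the overview stays valid as the user moves, which matches the stated goal of enhancing spatial awareness without a prior model of the environment.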