H-BRS Bibliography
Departments, institutes and facilities: Institute of Visual Computing (IVC) (208)
Document Type: Conference Object (208)
Keywords
- FPGA (10)
- Virtual Reality (6)
- 3D user interface (4)
- Education (4)
- Hyperspectral image (3)
- Image Processing (3)
- virtual reality (3)
- Algorithms (2)
- Augmented reality (2)
- CUDA (2)
- Computer Graphics (2)
- Content Module (2)
- Distributed rendering (2)
- Eye Tracking (2)
- Intelligent virtual agents (2)
- Large, high-resolution displays (2)
- Original Story (2)
- Raman microscopy (2)
- Remote laboratory (2)
- Serious Games (2)
- digital design (2)
- education (2)
- edutainment (2)
- haptics (2)
- human factors (2)
- hypermedia (2)
- image fusion (2)
- interface design (2)
- low power (2)
- machine vision (2)
- microcontroller (2)
- navigation (2)
- pansharpening (2)
- remote lab (2)
- serious games (2)
- virtual environments (2)
- 2D Level Design (1)
- 3D User Interface (1)
- 3D Visualization (1)
- 3D gaming (1)
- 3D interfaces (1)
- 3D shape (1)
- 3D user interfaces (1)
- Active locomotion (1)
- Augmented Reality (1)
- BLOB Detection (1)
- Bicycle Simulator (1)
- Blob Detection (1)
- Bound Volume Hierarchy (1)
- Bounding Box (1)
- Cell Processor (1)
- Cell/B.E. (1)
- Center-of-Mass (1)
- Clusters (1)
- Co-located Collaboration (1)
- Co-located work (1)
- Computer graphics (1)
- Computer-supported Cooperative Work (1)
- Container Structure (1)
- Containerization (1)
- Correlative Microscopy (1)
- Current measurement (1)
- Datalog (1)
- Digital Storytelling (1)
- Docker (1)
- EEG (1)
- Real-time (1)
- Educational institutions (1)
- Evaluation board (1)
- Fixed spatial data (1)
- Force field (1)
- Fuzzy logic (1)
- Game Engine (1)
- Games and Simulations for Learning (1)
- Garbage collection (1)
- Gaze Behavior (1)
- Gaze Depth Estimation (1)
- Gender Issues in Computer Science Education (1)
- Global Illumination (1)
- Grailog (1)
- Group Behavior (1)
- Group behavior (1)
- Groupware (1)
- HCI (1)
- Hand Guidance (1)
- Hand Tracking (1)
- Head-mounted Display (1)
- Heat shrink tubing (1)
- High-performance computing (1)
- Higher education (1)
- Human Factors (1)
- Image-based rendering (1)
- Immersive Visualization Environment (1)
- Information interaction (1)
- Interaction (1)
- Internet (1)
- Interoperability (1)
- Inventory (1)
- Java virtual machine (1)
- JavaScript (1)
- Laboratories (1)
- Lighting simulation (1)
- Low-power design (1)
- Low-power digital design (1)
- Low-power education (1)
- Main Memory (1)
- Management (1)
- Measurement (1)
- Modular software packages (1)
- Molecular modeling (1)
- Motion Capture (1)
- Multilayer interaction (1)
- Multimodal Microspectroscopy (1)
- Multiuser (1)
- Musical Performance (1)
- NVIDIA Tesla (1)
- Narration Module (1)
- Navigation (1)
- Navigation interface (1)
- Numerical optimization (1)
- OER (1)
- Parallel Processing (1)
- Parallelization (1)
- Perception (1)
- Performance (1)
- Physical exercising game platform (1)
- Pointing (1)
- Pointing devices (1)
- Pose Estimation (1)
- Power dissipation (1)
- Power measurement (1)
- Programming (1)
- Radix Sort (1)
- Ray Casting (1)
- Ray Tracing (1)
- Ray tracing (1)
- Reasoning (1)
- Registration Refinement (1)
- Remote lab (1)
- Render Cache (1)
- Reversible Logic Synthesis (1)
- RuleML (1)
- S3D Video (1)
- S3D video (1)
- SVG (1)
- Scalable Vector Graphic (1)
- Second Life (1)
- Social Virtual Reality (1)
- Split Axis (1)
- Standards (1)
- Star Trek (1)
- Stereoscopic Rendering (1)
- Stereoscopic rendering (1)
- Story Element (1)
- Survey (1)
- Swim Stroke Analysis (1)
- Switches (1)
- Synthetic perception (1)
- SystemVerilog (1)
- Tactile Feedback (1)
- Tactile feedback (1)
- Three-dimensional displays (1)
- Tiled displays (1)
- Tiled-display walls (1)
- Touchscreen interaction (1)
- Traffic Simulations (1)
- UI design (1)
- Unity (1)
- Usability (1)
- User Roles (1)
- User Study (1)
- User-Centered Approach (1)
- VR (1)
- VR system design (1)
- Verilog (1)
- Virtual Agents (1)
- Virtual Environments (1)
- Virtual attention (1)
- Virtual environments (1)
- Visualization (1)
- Volume rendering (1)
- XML (1)
- XSLT (1)
- adaptive agents (1)
- adaptive filters (1)
- affective computing (1)
- analysis (1)
- audio-tactile feedback (1)
- augmented, and virtual realities (1)
- authoring tools (1)
- automation (1)
- bass-shaker (1)
- bicycle (1)
- body-centric cues (1)
- brain computer interfaces (1)
- brightfield microscopy (1)
- bus load (1)
- can bus (1)
- chemical sensors (1)
- co-located collaboration (1)
- collaboration (1)
- colorimetry (1)
- compensation (1)
- component analyses (1)
- computational logic (1)
- computer-supported collaborative work (1)
- cooperative path planning (1)
- data logging (1)
- depth perception (1)
- digital storytelling (1)
- directed hypergraphs (1)
- disabled people (1)
- e-learning (1)
- educational methods (1)
- electrical devices (1)
- electrical engineering education (1)
- electronics (1)
- embodied interfaces (1)
- embroidery machine (1)
- emotion computing (1)
- energy awareness (1)
- engineering education (1)
- evaluation board development (1)
- eye-tracking (1)
- field programmable gate arrays (1)
- foveated rendering (1)
- fpga (1)
- full-body interface (1)
- fuzzy logic (1)
- game engine (1)
- gaming (1)
- graphs (1)
- guidance (1)
- hand guidance (1)
- hands-on lab (1)
- hands-on labs (1)
- haptic feedback (1)
- heat shrink tubes (1)
- human-centric lighting (1)
- image processing (1)
- immersion (1)
- immersive systems (1)
- information display methods (1)
- interaction (1)
- interaction techniques (1)
- leaning (1)
- leaning, self-motion perception (1)
- life-long learning (1)
- linguistic variable (1)
- linguistic variables (1)
- low-power design (1)
- measurement (1)
- medical training (1)
- mesoscopic agents (1)
- microcontroller education (1)
- microcontrollers (1)
- mobile projection (1)
- momentary frequency (1)
- monitoring (1)
- motion cueing (1)
- motion platform (1)
- multi-layer display (1)
- multi-user VR (1)
- multiresolution analysis (1)
- multisensory cues (1)
- multisensory interface (1)
- natural user interface (1)
- neural networks (1)
- pen interaction (1)
- performance optimizations (1)
- peripheral vision (1)
- peripheral visual field (1)
- photometry (1)
- physical model immersive (1)
- programming (1)
- project management (1)
- proxemics (1)
- remote-lab (1)
- robotics (1)
- rules (1)
- scene element representation (1)
- security (1)
- see-through display (1)
- see-through head-mounted displays (1)
- semantic image segmentation (1)
- short-term memory (1)
- simulator (1)
- software engineering (1)
- spectral rendering (1)
- stereoscopic vision (1)
- story authoring (1)
- submillimeter precision (1)
- surface textures (1)
- surface topography (1)
- technological literacy (1)
- telepresence (1)
- territoriality (1)
- tiled displays (1)
- time series processing (1)
- unmanned aerial vehicle (1)
- unmanned ground vehicle (1)
- user acceptance (1)
- user engagement (1)
- user study (1)
- vection (1)
- vibration (1)
- video lectures (1)
- virtual environment framework (1)
- virtual locomotion (1)
- visual quality control (1)
- visualisation (1)
- visuohaptic feedback (1)
- whole-body interface (1)
- workspace awareness (1)
- zooming interface (1)
Selection Performance and Reliability of Eye and Head Gaze Tracking Under Varying Light Conditions
(2024)
Gone But Not Forgotten: Evaluating Performance and Scalability of Real-Time Mesoscopic Agents
(2020)
Telepresence robots allow people to participate in remote spaces, yet they can be difficult to manoeuvre with people and obstacles around. We designed a haptic-feedback system called "FeetBack", in which users place their feet when driving a telepresence robot. When the robot approaches people or obstacles, haptic proximity and collision feedback are provided on the respective sides of the feet, informing users about events that are hard to notice through the robot's camera views. We conducted two studies: one exploring the use of FeetBack in virtual environments, the other focused on real environments. We found that FeetBack can increase spatial presence in simple virtual environments. Users valued the feedback for adjusting their behaviour in both types of environments, though over time it was sometimes too frequent or unneeded in certain situations. These results point to the value of foot-based haptic feedback for telepresence robot systems, while also highlighting the need to design context-sensitive haptic feedback.
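The proximity-and-collision feedback described above can be sketched as a simple mapping from per-side distance readings to actuator intensities. This is my own illustration, not the authors' implementation; the threshold values and side labels are assumptions.

```python
PROXIMITY_RANGE = 1.5   # assumed: start signalling below this distance (m)
COLLISION_RANGE = 0.2   # assumed: treat anything closer as a collision

def haptic_intensity(distance_m: float) -> float:
    """Return a vibration intensity in [0, 1] for one distance reading."""
    if distance_m <= COLLISION_RANGE:
        return 1.0                      # collision: full-strength cue
    if distance_m >= PROXIMITY_RANGE:
        return 0.0                      # nothing nearby: stay silent
    # Linear ramp from faint (far) to strong (near) within the range.
    span = PROXIMITY_RANGE - COLLISION_RANGE
    return (PROXIMITY_RANGE - distance_m) / span

def feedback_frame(readings: dict) -> dict:
    """Map side-labelled distances to per-side intensities,
    so each actuator fires on the side where the event occurs."""
    return {side: haptic_intensity(d) for side, d in readings.items()}
```

A ramp like this keeps proximity cues unobtrusive at the edge of the range while making imminent collisions unambiguous; the paper's observation that feedback was "sometimes too frequent" suggests such thresholds would need context-sensitive tuning.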
Evaluation of a Multi-Layer 2.5D display in comparison to conventional 3D stereoscopic glasses
(2020)
In this paper we propose and evaluate a custom-built, projection-based multi-layer 2.5D display consisting of three image layers, and compare its performance to a stereoscopic 3D display. Stereoscopic vision can increase involvement and enhance the game experience, but may induce side effects such as motion sickness and simulator sickness. To overcome the disadvantage of multiple discrete depths, our system uses perspective rendering and head tracking. A study with 20 participants playing custom-designed games was performed to evaluate the display. The results indicated that the multi-layer display caused fewer side effects than the stereoscopic display and provided good usability. Participants also reported equal or better spatial perception, while cognitive load stayed the same.
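The head-coupled perspective rendering mentioned above can be sketched as follows: a virtual point is drawn on the physical layer nearest to its virtual depth, at the spot where the ray from the tracked head position to the point crosses that layer. This is an illustrative sketch, not the authors' code; the layer depths and coordinate frame (z increasing away from the viewer) are assumptions.

```python
LAYER_DEPTHS = [0.5, 0.75, 1.0]   # assumed physical layer distances (m)

def nearest_layer(z_virtual: float) -> float:
    """Pick the physical layer closest to the point's virtual depth."""
    return min(LAYER_DEPTHS, key=lambda z: abs(z - z_virtual))

def draw_position(head, point):
    """Intersect the head->point ray with the nearest layer plane.

    head, point: (x, y, z) in metres, z increasing away from the viewer.
    Returns (x, y, layer_z): where to draw the point on that layer.
    """
    hx, hy, hz = head
    px, py, pz = point
    layer_z = nearest_layer(pz)
    t = (layer_z - hz) / (pz - hz)     # ray parameter at the layer plane
    return (hx + t * (px - hx), hy + t * (py - hy), layer_z)
```

Re-running this per frame as the head moves yields motion parallax across the three layers, which is one way the discrete depths of such a display can be smoothed into a continuous depth impression.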
This paper introduces FaceHaptics, a novel haptic display based on a robot arm attached to a head-mounted virtual reality display. It provides localized, multi-directional and movable haptic cues, in the form of wind, warmth, moving and single-point touch events, and water spray, to dedicated parts of the face not covered by the head-mounted display. The easily extensible system can, in principle, mount any type of compact haptic actuator or object. User study 1 showed that users appreciate the directional resolution of the cues and can judge wind direction well, especially when they move their head and the wind direction is adjusted dynamically to compensate for head rotations. Study 2 showed that adding FaceHaptics cues to a VR walkthrough can significantly improve user experience, presence, and emotional responses.
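The head-rotation compensation described for study 1 amounts to retargeting the arm-mounted actuator so that the wind keeps arriving from a fixed world direction: subtract the current head yaw from the desired world angle and wrap the result. A minimal sketch, assuming a simple yaw-only model (not the paper's implementation):

```python
def fan_angle(world_wind_deg: float, head_yaw_deg: float) -> float:
    """Actuator direction in head-relative degrees, wrapped to [-180, 180).

    world_wind_deg: desired wind direction in the virtual world
    head_yaw_deg:   current tracked head yaw
    """
    rel = world_wind_deg - head_yaw_deg
    return (rel + 180.0) % 360.0 - 180.0
```

Updating this each frame keeps the perceived wind direction stable in world space even while the user turns their head, which is what made dynamic adjustment improve direction judgements in the study.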