Self-motion perception is a multi-sensory process that involves visual, vestibular, and other cues. When perception of self-motion is induced using only visual motion, vestibular cues indicate that the body remains stationary, which may bias an observer's perception. When the precision of the vestibular cue is lowered, for example by lying down or by adapting to microgravity, these biases may decrease, accompanied by a decrease in precision. To test this hypothesis, we used a move-to-target task in virtual reality. Astronauts and Earth-based controls were shown a target at a range of simulated distances. After the target disappeared, forward self-motion was induced by optic flow. Participants indicated when they thought they had arrived at the target's previously seen location. Astronauts completed the task on Earth (supine and sitting upright) prior to space travel, early and late in space, and early and late after landing. Controls completed the experiment on Earth using a similar regime, with a supine posture used to simulate being in space. While variability was similar across all conditions, the supine posture led to significantly higher gains (target distance / perceived travel distance) than the sitting posture for the astronauts pre-flight and early post-flight, but not late post-flight. No difference was detected between the astronauts' performance on Earth and onboard the ISS, indicating that judgments of traveled distance were largely unaffected by long-term exposure to microgravity. Overall, this constitutes mixed evidence as to whether non-visual cues to travel distance are integrated with relevant visual cues when self-motion is simulated using optic flow alone.
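
To make the gain measure concrete: the abstract defines gain as the ratio of the simulated target distance to the travel distance perceived by the participant, with the distance traveled at the moment of response taken as the point where perceived travel distance matched the target. The following is a minimal sketch in Python; the field names and numbers are hypothetical illustrations, not data from the study.

    # Hypothetical sketch of the gain measure described in the abstract:
    # gain = target distance / traveled distance at the moment the participant
    # indicated arrival (taken as the point where the perceived travel
    # distance matched the target). All field names and values are
    # illustrative only.

    trials = [
        {"target_m": 6.0, "traveled_at_response_m": 5.2},
        {"target_m": 12.0, "traveled_at_response_m": 10.5},
        {"target_m": 18.0, "traveled_at_response_m": 16.8},
    ]

    gains = [t["target_m"] / t["traveled_at_response_m"] for t in trials]
    mean_gain = sum(gains) / len(gains)

    # Under this definition, a gain above 1.0 means the participant stopped
    # short of the target: the optic flow was perceived as covering more
    # distance than it actually simulated.
    print("per-trial gains:", [round(g, 2) for g in gains])
    print("mean gain: %.2f" % mean_gain)

Comparing such mean gains across postures (supine vs. sitting) and test sessions is, in outline, the comparison the abstract reports.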