H-BRS Bibliography
The goal of the research project described here was the development of a prototype bicycle riding simulator for use in traffic education and traffic safety training. The prototype is intended to be mobile and as universally usable as possible across different age groups and applications.
The objective of the FIVIS project is to develop a bicycle simulator that can reproduce real-life riding situations as a virtual scenario within an immersive environment. A sample test bicycle is mounted on a motion platform to enable a close-to-reality simulation of turns and balance situations. The rider's visual field is enveloped by a multi-screen visualisation environment that provides visual data relative to the motion and activity of the test bicycle. The rider therefore has to pedal and steer as on an ordinary bicycle, while the motion is recorded and processed to control the simulation. Furthermore, the platform is fed with real forces and accelerations logged by a mobile data acquisition system during real bicycle test rides. A feedback system thus makes the movements of the platform match the virtual environment and the rider's reactions (e.g. steering angle, pedalling rate).
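The abstract describes a feedback loop from rider input (steering angle, pedalling rate) to platform motion. The following is a minimal illustrative sketch of one such update step; the function name, the direct-drive assumption, and the simple bicycle-model mapping are assumptions, not the actual FIVIS implementation.

```python
import math

def platform_update(steering_angle_deg, cadence_rpm,
                    wheel_circumference_m=2.1, wheelbase_m=1.1):
    """One hypothetical feedback step: map rider input to simulator state.

    Illustrative only -- the real FIVIS system also replays logged forces
    and accelerations; the names and mapping here are assumptions.
    """
    # Forward speed from pedalling cadence (assume direct drive for simplicity).
    speed_mps = cadence_rpm / 60.0 * wheel_circumference_m
    # Yaw rate from steering angle via a simple kinematic bicycle model.
    yaw_rate = speed_mps * math.tan(math.radians(steering_angle_deg)) / wheelbase_m
    # Roll target for the motion platform: lean into the turn.
    roll_deg = math.degrees(math.atan(speed_mps * yaw_rate / 9.81))
    return {"speed_mps": speed_mps, "yaw_rate": yaw_rate, "roll_deg": roll_deg}
```

Riding straight ahead (zero steering angle) yields zero yaw rate and a level platform, while any steering input at speed produces a lean command.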
3D tracking using multiple Nintendo Wii Remotes: a simple consumer hardware tracking approach
(2009)
An easy-to-build and cost-effective 3D tracking solution is presented, using Nintendo Wii Remotes as cameras. Since the hardware differs from typical tracking cameras, the calibration and tracking process has to be adapted accordingly. The tracking approach described could be used to track a user's motions in video games based on physical activity (sports, fighting, or dancing games), allowing the player to interact with the game more intuitively than by just pressing buttons.
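The core of any two-camera setup like this is triangulation: each Wii Remote reports a 2D blob that, after calibration, defines a ray in space, and the 3D position is recovered from the two rays. A minimal sketch of the standard midpoint method is shown below; the calibration that produces the camera origins and ray directions is assumed to exist already, and the function name is illustrative.

```python
def triangulate_midpoint(p1, d1, p2, d2):
    """Closest-point ('midpoint') triangulation of two camera rays.

    p1, p2: camera origins; d1, d2: ray directions (3-tuples).
    Sketch of the core step such a Wii-Remote setup needs; not the
    paper's actual pipeline.
    """
    def dot(a, b):   return sum(x * y for x, y in zip(a, b))
    def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
    def add(a, b):   return tuple(x + y for x, y in zip(a, b))
    def scale(a, s): return tuple(x * s for x in a)

    # Solve for s, t minimising |(p1 + s*d1) - (p2 + t*d2)|.
    w = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b          # near zero => rays (almost) parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    q1 = add(p1, scale(d1, s))     # closest point on ray 1
    q2 = add(p2, scale(d2, t))     # closest point on ray 2
    return scale(add(q1, q2), 0.5) # midpoint of the two closest points
```

With noisy real-world blobs the two rays rarely intersect exactly, which is why the midpoint of the closest approach is used rather than a true intersection.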
The perceived distance of self-motion induced in a stationary observer by optic flow is overestimated (Redlick et al., Vis Res. 2001, 41: 213). Here we assessed how different components of translational optic flow contribute to perceived distance traveled. Subjects sat on a stationary bicycle in front of a virtual reality display that extended beyond 90° on each side. They monocularly viewed a target presented in a virtual hallway wallpapered with stripes that changed colour to prevent tracking of individual stripes. Subjects then looked centrally or 30°, 60° or 90° eccentrically while their view was restricted to an ellipse with faded edges (25° × 42°) centered on their fixation. Subjects judged when they had reached the target's remembered position. Perceptual gain (perceived/actual distance traveled) was highest when subjects were looking in a direction that depended on the simulated speed of motion. Results were modeled as the sum of separate mechanisms sensitive to radial and laminar optic flow. In our display, distances were perceived as compressed. However, there was no correlation between perceptual compression and perceived speed of motion. These results suggest that visually induced self-motion in virtual displays can be subject to large but predictable errors.
This contribution presents an easy-to-implement 3D tracking approach that works with a single standard webcam. We describe the algorithm and show that it is well suited for use as an intuitive interaction method in 3D video games. The algorithm can detect and distinguish multiple objects in real time and obtain their orientation and position relative to the camera. The trackable objects are equipped with planar patterns of five visual markers. By tracking (stereo) glasses worn by the user and adjusting the in-game camera's viewing frustum accordingly, the well-known immersive "screen as a window" effect can be achieved, even without any special tracking equipment.
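The "screen as a window" effect mentioned above is typically achieved with an asymmetric (off-axis) projection frustum computed from the tracked eye position. The sketch below illustrates that computation under simplifying assumptions (eye position given relative to the screen centre, screen lying in the z = 0 plane); the function and parameter names are illustrative, not taken from the paper.

```python
def window_frustum(head, screen_w, screen_h, near):
    """Asymmetric ('off-axis') frustum for the screen-as-a-window effect.

    head = (x, y, z): tracked eye position relative to the screen centre,
    with z > 0 in front of the screen.  Returns glFrustum-style
    (left, right, bottom, top) extents at the near plane.  Illustrative
    sketch only; real engines wrap this in their own projection API.
    """
    hx, hy, hz = head
    # Screen edges relative to the eye, scaled onto the near plane.
    left   = (-screen_w / 2 - hx) * near / hz
    right  = ( screen_w / 2 - hx) * near / hz
    bottom = (-screen_h / 2 - hy) * near / hz
    top    = ( screen_h / 2 - hy) * near / hz
    return left, right, bottom, top
```

When the head is centred the frustum is symmetric; moving the head sideways skews it, so on-screen geometry appears fixed in space behind the "window".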
This contribution describes an optical laser-based user interaction system designed for virtual reality (VR) environments. The project's objective is to realize a 6-DoF user input device for interaction with VR applications running in CAVE-type visualization environments with flat projection walls. In a back-projection VR system, in contrast to optical tracking systems, no camera has to be placed within the visualization environment. Instead, cameras observe patterns of laser beam projections from behind the screens. These patterns are emitted by a hand-held input device. The system is robust with respect to partial occlusion of the laser pattern. An inertial measurement unit is integrated into the device to improve robustness and precision.
The relative contributions of radial and laminar optic flow to the perception of linear self-motion
(2012)
When illusory self-motion is induced in a stationary observer by optic flow, the perceived distance traveled is generally overestimated relative to the distance of a remembered target (Redlick, Harris, & Jenkin, 2001): subjects feel they have gone further than the simulated distance and indicate that they have arrived at a target's previously seen location too early. In this article we assess how the radial and laminar components of translational optic flow contribute to the perceived distance traveled. Subjects monocularly viewed a target presented in a virtual hallway wallpapered with stripes that periodically changed color to prevent tracking. The target was then extinguished and the visible area of the hallway shrunk to an oval region 40° (h) × 24° (v). Subjects either continued to look centrally or shifted their gaze eccentrically, thus varying the relative amounts of radial and laminar flow visible. They were then presented with visual motion compatible with moving down the hallway toward the target and pressed a button when they perceived that they had reached the target's remembered position. Data were modeled by the output of a leaky spatial integrator (Lappe, Jenkin, & Harris, 2007). The sensory gain varied systematically with viewing eccentricity while the leak constant was independent of viewing eccentricity. Results were modeled as the linear sum of separate mechanisms sensitive to radial and laminar optic flow. Results are compatible with independent channels for processing the radial and laminar flow components of optic flow that add linearly to produce large but predictable errors in perceived distance traveled.
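The leaky spatial integrator of Lappe, Jenkin, and Harris (2007) cited above can be illustrated with a few lines of code. The sketch below uses the distance-travelled form of the model, where perceived distance p grows with sensory gain and decays with a leak term, dp/dx = gain − leak·p; the Euler step and parameter values are illustrative assumptions, not fitted values from the article.

```python
def leaky_integrator_travel(target_d, gain, leak, dx=0.01):
    """Actual distance travelled when a leaky integrator of optic flow
    first reports the remembered target distance.

    Sketch of the distance-travelled variant of the Lappe, Jenkin &
    Harris (2007) model: dp/dx = gain - leak * p, Euler-integrated.
    """
    p, x = 0.0, 0.0                      # perceived / actual distance
    while p < target_d:
        p += (gain - leak * p) * dx      # leaky accumulation of optic flow
        x += dx
        if x > 1e4:                      # leak dominates: target never reached
            return float("inf")
    return x
```

With gain > 1 the integrator reaches the remembered target distance early, reproducing the overestimation of distance travelled: subjects indicate arrival before actually covering the simulated distance.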