Prof. Dr. André Hinkenjann
The Render Cache [1,2] allows the interactive display of very large scenes, rendered with complex global illumination models, by decoupling camera movement from the costly scene sampling process. In this paper, the distributed execution of the individual components of the Render Cache on a PC cluster is shown to be a viable alternative to the shared memory implementation. As the processing power of an entire node can be dedicated to a single component, more advanced algorithms may be examined. Modular functional units also lead to increased flexibility, useful in research as well as in industrial applications. We introduce a new strategy for view-driven scene sampling, as well as support for multiple camera viewpoints generated from the same cache. Stereo display and a CAVE multi-camera setup have been implemented. The use of the highly portable and interoperable CORBA networking API simplifies the integration of most existing pixel-based renderers. So far, three renderers (C++ and Java) have been adapted to function within our framework.
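The decoupling described above can be illustrated with a minimal Python sketch of cache reprojection. This is an illustrative assumption, not the Render Cache implementation: cached world-space samples (already shaded by the slow renderer) are re-projected into the current camera's image plane with a simple pinhole model, so camera motion never blocks on scene sampling. The names `project` and `splat` are hypothetical.

```python
def project(point, cam_pos, focal, width, height):
    """Pinhole projection of a world-space point for a camera at cam_pos
    looking down the negative z axis (hypothetical simplified model)."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z >= 0:                      # sample is behind the camera
        return None
    u = int(width / 2 + focal * x / -z)
    v = int(height / 2 + focal * y / -z)
    if 0 <= u < width and 0 <= v < height:
        return u, v
    return None

def splat(cache, cam_pos, focal=100.0, width=64, height=64):
    """Reproject every cached (point, color) sample into the current view,
    keeping the nearest sample per pixel (a per-pixel depth test)."""
    frame = {}                      # (u, v) -> (depth, color)
    for point, color in cache:
        uv = project(point, cam_pos, focal, width, height)
        if uv is None:
            continue
        depth = cam_pos[2] - point[2]
        if uv not in frame or depth < frame[uv][0]:
            frame[uv] = (depth, color)
    return frame
```

A new camera position only requires another `splat` pass over the cache; freshly rendered samples are merged into the cache asynchronously.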
This paper describes the work done at our lab to improve the visual and other qualities of Virtual Environments. To achieve better quality, we built a new Virtual Environments framework called basho. basho is a renderer-independent VE framework. Although renderers are not limited to graphics renderers, we first concentrated on improving visual quality. Independence is gained by designing basho with a small kernel and several plug-ins.
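The small-kernel-plus-plug-ins design can be sketched in a few lines of Python. This is a hypothetical illustration of the pattern, not basho's actual API; the names `Kernel` and `AsciiRenderer` are invented for the example.

```python
class Kernel:
    """Minimal kernel: knows nothing about rendering itself; it only
    registers plug-ins and delegates work to the active one."""
    def __init__(self):
        self._plugins = {}
        self._active = None

    def register(self, name, plugin):
        self._plugins[name] = plugin

    def activate(self, name):
        self._active = self._plugins[name]

    def render(self, scene):
        return self._active.render(scene)

class AsciiRenderer:
    """Toy graphics plug-in; a sound or haptics renderer could implement
    the same render(scene) interface and be swapped in at runtime."""
    def render(self, scene):
        return f"ascii:{scene}"
```

Because the kernel depends only on the plug-in interface, swapping the graphics renderer (or adding a non-graphics one) leaves the kernel untouched.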
Interactive rendering of complex models has many applications in the Virtual Reality Continuum. The oil & gas industry uses interactive visualizations of huge seismic data sets to evaluate and plan drilling operations. The automotive industry evaluates designs based on very detailed models. Unfortunately, many of these very complex geometric models cannot be displayed at interactive frame rates on graphics workstations, due to the limited scalability of their graphics performance. Recently, there has been a trend towards using networked standard PCs to solve this problem. Care must be taken, however, because clustered PCs have no shared memory: all data and commands have to be sent across the network. It turns out that removing this network bottleneck is a challenging problem in this context. In this article we present several approaches to network-aware parallel rendering on commodity hardware, comprising technological as well as algorithmic solutions.
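One common way to attack the network bottleneck in clustered rendering is to batch many small commands into few large messages, trading per-command latency for far fewer round trips. The sketch below is an illustrative assumption, not the specific strategies of the article; `send` is a stand-in for an actual network send.

```python
import json

def send(channel, payload):
    """Stand-in for a network send; records message sizes on `channel`
    instead of touching a real socket."""
    channel.append(len(payload))

def send_batched(channel, commands, batch_size=64):
    """Serialize render commands in groups of batch_size, so 200 small
    commands become a handful of larger messages."""
    for i in range(0, len(commands), batch_size):
        send(channel, json.dumps(commands[i:i + batch_size]).encode())
```

With a batch size of 64, 200 commands travel in 4 messages instead of 200, which matters when per-message overhead (headers, round trips) dominates payload size.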
A New Approach of Using Two Wireless Tracking Systems in Mobile Augmented Reality Applications
(2003)
Clusters of commodity PCs are widely considered the way to go to improve rendering performance and quality in many real-time rendering applications. We describe the design and implementation of our parallel rendering system for real-time rendering applications. The major design objectives for our system are: the use of commodity hardware for all system components, ease of integration into existing Virtual Environments software, and flexibility in applying different rendering techniques, e.g. using ray tracing to render distinct objects with particularly high quality.
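The basic work-distribution idea behind such a system can be sketched as follows: the frame is split into tiles that are farmed out to workers. This is a minimal, hypothetical sketch, not the described system's API; threads stand in for the cluster nodes that would render tiles in the real setting.

```python
from concurrent.futures import ThreadPoolExecutor

def render_tile(tile):
    """Stand-in for a per-node renderer: instead of producing pixels,
    it just reports the tile and how many pixels it covers."""
    x0, y0, x1, y1 = tile
    return tile, (x1 - x0) * (y1 - y0)

def render_frame(width, height, tile_size, workers=4):
    """Split the frame into tiles and distribute them to workers
    (cluster nodes in a real system; threads here)."""
    tiles = [(x, y, min(x + tile_size, width), min(y + tile_size, height))
             for y in range(0, height, tile_size)
             for x in range(0, width, tile_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = dict(pool.map(render_tile, tiles))
    return results
```

Assigning whole tiles (rather than single pixels) keeps network messages large and infrequent, and per-tile renderers can differ, e.g. a ray tracer for selected high-quality objects.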