Refine
H-BRS Bibliography: yes (32)
Departments, institutes and facilities: Fachbereich Informatik (32)
Document Type: Conference Object (32)
Year of publication: 2018 (32)
Keywords
- Virtual Reality (2)
- 3D User Interface (1)
- 3D user interface (1)
- Alternatives (1)
- Co-located work (1)
- Code similarity analysis (1)
- Dataflow Programming (1)
- Delphi Study (1)
- Difference Visualization (1)
- Embedded software (1)
- Eye Tracking (1)
- Fixed spatial data (1)
- Free-Space Loss (FSL) (1)
- Future (1)
- Future Studies (1)
- Future of Robotics (1)
- Gaze Depth Estimation (1)
- Generation R (1)
- Generative Design (1)
- Group behavior (1)
- Hand Guidance (1)
- IEEE 802.11 (1)
- IP protection (1)
- Information interaction (1)
- Innovation (1)
- LAA (1)
- LTE-U (1)
- LoRa (1)
- LoRa receiver accuracy (1)
- Longley-Rice Irregular Terrain Model (ITM) (1)
- Megatrends (1)
- Multiple Displays (1)
- NEAT (1)
- Navigation (1)
- Neuroevolution (1)
- Path loss model (1)
- Pose Estimation (1)
- Random number generator (1)
- Robotic Governance (1)
- Robotic Natives (1)
- Robotic Revolutions (1)
- Side-channel watermarking (1)
- Similarity matrix (1)
- Software reverse engineering (1)
- Spectrum occupancy (1)
- Surrogate Modeling (1)
- Swim Stroke Analysis (1)
- Tactile Feedback (1)
- Tactile feedback (1)
- Tiled displays (1)
- U-NII band (1)
- User Study (1)
- client-side component model (1)
- co-located collaboration (1)
- cognitive radio (1)
- dependable robots (1)
- dimensionality reduction (1)
- embedded collaborative learning (1)
- hand guidance (1)
- holography (1)
- human factors (1)
- human-robot collaboration (1)
- ideation (1)
- immersive systems (1)
- interface design (1)
- machine learning (1)
- medical training (1)
- mobile web (1)
- modular web (1)
- multi-user VR (1)
- prototype theory (1)
- pseudo-random number generator (1)
- quality-diversity (1)
- remote diagnosis (1)
- robot component monitoring (1)
- robotic black box (1)
- serious games (1)
- service robots (1)
- spectrum sensing (1)
- speech recognition (1)
- speech understanding (1)
- tiled displays (1)
- true random number generator (1)
- ultrasonic sensor (1)
- user study (1)
- web components (1)
- web technology (1)
- xorshift-generator (1)
This work discusses how to use OSM for robotic applications and aims to start a discussion between the OSM and robotics communities. OSM contains much topological and semantic information that can be used directly in robotics and offers several advantages: 1) a standardized format with existing tooling; 2) a graph structure that allows OSM models to be composed with domain-specific semantics by adding custom nodes, relations, and key-value pairs; 3) because OSM is driven by a community effort, information about many places is already available and can be used by robots.
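The second advantage above — layering domain-specific semantics onto OSM's key-value tag model — can be sketched as follows. This is an illustrative example, not from the paper; the `OsmNode` class and all `robot:`-prefixed keys are hypothetical.

```python
# Minimal sketch of OSM's data model: elements carry free-form key-value
# tags, so robot-specific keys can be layered on without breaking standard
# OSM tooling, which simply ignores unknown tags. All names are hypothetical.

class OsmNode:
    def __init__(self, node_id, lat, lon, tags=None):
        self.id = node_id
        self.lat = lat
        self.lon = lon
        self.tags = dict(tags or {})  # OSM-style key-value pairs

# A regular OSM node for a building entrance, as a mapper might tag it.
door = OsmNode(42, 50.7374, 7.0982, {"door": "yes", "entrance": "main"})

# Domain-specific semantics added by a robot software stack.
door.tags["robot:traversable"] = "yes"
door.tags["robot:opening_force_N"] = "30"

print(door.tags["door"], door.tags["robot:traversable"])  # prints "yes yes"
```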
This paper introduces a random number generator (RNG) based on the avalanche noise of two diodes. A true random number generator (TRNG) derives random bits from the electronic noise produced by two avalanche diodes: the amplified outputs of the diodes are sampled and digitized, and the difference between the two concurrently sampled outputs is used to select a seed for, and to drive, a pseudo-random number generator (PRNG). The PRNG is an xorshift generator that produces 1024 bits in each cycle. Each 1024-bit sequence is moderately modified before being output. The TRNG then delivers the next seed and the next cycle begins. The statistical behavior of the generator is analyzed and presented.
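To illustrate the PRNG stage, here is a classic Marsaglia xorshift64 generator seeded by a single value (which, in the paper's design, would come from the TRNG). This is a minimal sketch; the paper's generator emits 1024 bits per cycle and its exact shift constants and output modification are not reproduced here.

```python
def xorshift64(seed):
    """Marsaglia-style xorshift64 generator (illustrative sketch only;
    not the paper's exact variant). Yields 64-bit pseudo-random values."""
    state = seed & 0xFFFFFFFFFFFFFFFF
    assert state != 0, "xorshift state must be nonzero"
    while True:
        # The three xor-shift steps scramble the state; masking keeps it
        # within 64 bits, mimicking fixed-width hardware registers.
        state ^= (state << 13) & 0xFFFFFFFFFFFFFFFF
        state ^= state >> 7
        state ^= (state << 17) & 0xFFFFFFFFFFFFFFFF
        yield state

gen = xorshift64(0x9E3779B97F4A7C15)     # seed would come from the TRNG
block = [next(gen) for _ in range(16)]   # 16 x 64 bits = 1024 bits per cycle
```

Because xorshift is deterministic given its seed, reseeding from the TRNG every cycle is what restores unpredictability across cycles.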
In the presence of conflicting or ambiguous visual cues in complex scenes, performing 3D selection and manipulation tasks can be challenging. To improve motor planning and coordination, we explore audio-tactile cues that inform the user about the presence of objects in hand proximity, e.g., to avoid unwanted object penetrations. We do so through a novel glove-based tactile interface, enhanced by audio cues. Through two user studies, we illustrate that proximity guidance cues improve spatial awareness, hand motions, and collision avoidance behaviors, and show how proximity cues combined with collision and friction cues can significantly improve performance.
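One simple way such a proximity cue could be driven is a linear ramp from hand-object distance to cue intensity. The function and both thresholds below are hypothetical illustrations, not values from the study.

```python
def proximity_cue_intensity(distance_m, near=0.02, far=0.20):
    """Map hand-object distance (meters) to a tactile cue intensity in [0, 1].
    Illustrative sketch: full intensity inside `near`, silent beyond `far`,
    linear ramp in between. Thresholds are hypothetical, not from the paper."""
    if distance_m <= near:
        return 1.0
    if distance_m >= far:
        return 0.0
    return (far - distance_m) / (far - near)
```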
We present a novel forearm-and-glove tactile interface that can enhance 3D interaction by guiding hand motor planning and coordination. In particular, we aim to improve hand motion and pose actions related to selection and manipulation tasks. Through our user studies, we illustrate how tactile patterns can guide the user, by triggering hand pose and motion changes, for example to grasp (select) and manipulate (move) an object. We discuss the potential and limitations of the interface, and outline future work.
Entering the work envelope of an industrial robot can lead to severe injury from collisions with moving parts of the system. Conventional safety mechanisms therefore mostly restrict access to the robot using physical barriers such as walls and fences, or non-contact protective devices such as light curtains and laser scanners. As none of these mechanisms applies to human-robot collaboration (HRC), a concept in which human and machine complement one another by working hand in hand, there is a rising need for safe and reliable detection of human body parts amidst background clutter. Camera-based systems are typically well suited to this task. Still, safety concerns remain, owing to possible detection failures caused by environmental occlusion, extraneous light, or other adverse imaging conditions. While ultrasonic proximity sensing can provide physical diversity to the system, it does not yet allow relevant objects to be reliably distinguished from background objects. This work investigates a new approach to detecting relevant objects and human body parts based on acoustic holography. The approach is experimentally validated using a low-cost, application-specific ultrasonic sensor system built from micro-electromechanical systems (MEMS). The presented results show that this system far outperforms conventional proximity sensors in terms of lateral imaging resolution and thus allows for more intelligent muting processes without compromising the safety of people working close to the robot. Building on this work, a next step could be the development of a multimodal sensor system to safeguard workers who collaborate with robots using the described ultrasonic sensor system.
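The core idea of imaging with a transducer array can be sketched with delay-and-sum beamforming: signals from each element are time-shifted by the propagation delay from a focal point and summed, so echoes from that point add coherently. This is an illustrative sketch of the general technique, not the paper's holographic reconstruction; all names and parameters are hypothetical.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

def delay_and_sum(signals, mic_x, focus_x, focus_z, fs):
    """Delay-and-sum beamforming for a linear array (illustrative sketch,
    not the paper's acoustic-holography method). signals[i] is the sampled
    waveform of element i at position (mic_x[i], 0); returns the array
    response focused at point (focus_x, focus_z), sample rate fs in Hz."""
    # Propagation delay from the focal point to each element, in samples.
    delays = [
        round(math.hypot(focus_x - x, focus_z) / SPEED_OF_SOUND * fs)
        for x in mic_x
    ]
    ref = min(delays)  # align everything to the earliest arrival
    n = len(signals[0])
    out = [0.0] * n
    for sig, d in zip(signals, delays):
        shift = d - ref
        for t in range(n - shift):
            out[t] += sig[t + shift]  # advance later arrivals into alignment
    return out
```

Scanning the focal point over a grid and recording the response energy at each point yields a lateral image; echoes from off-focus positions add incoherently and are suppressed.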
Almost unnoticed by the e-learning community, the underlying technology of the WWW is currently undergoing massive changes on all levels. In this paper we draw attention to these emerging game changers and discuss the consequences for online learning. In our e-learning project "Work & Study", funded by the German Federal Ministry of Education and Research, we have experimented with several new technological approaches such as Mobile First, Responsive Design, Mobile Apps, Web Components, Client-side Components, Progressive Web Apps, Course Apps, e-books, and web sockets for real-time collaboration, and we report on the results and the consequences for online learning practice. A modular web is emerging in which e-learning units are composed from, and delivered by, universally embeddable web components.
In this paper we propose an architecture that integrates classical planning with real autonomous mobile robots. We start with a high-level description of all components necessary to set goals, generate plans, execute them on real robots, and monitor the outcomes of their actions. At the core of our method, and to deal with execution issues, we encode agent actions as automata. We demonstrate the flexibility of the system by testing it in two different domains in the context of the international RoboCup competition: industrial (Basic Transportation Test) and domestic (General Purpose Service Robot). Additionally, we benchmark the scalability of the planning system in both domains on a set of planning problems of increasing complexity. The proposed framework is open source and can easily be extended.
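Encoding an action as an automaton, as described above, can be sketched with a small finite-state machine that a monitor drives with execution events. The states, events, and transitions below are hypothetical illustrations; the paper's automata may be structured differently.

```python
# Illustrative finite-state automaton for executing and monitoring one
# planned action. All state and event names are hypothetical.

TRANSITIONS = {
    ("idle", "dispatch"):  "running",
    ("running", "success"): "done",
    ("running", "failure"): "failed",
    ("failed", "retry"):    "running",  # recovery from an execution issue
}

class ActionAutomaton:
    def __init__(self):
        self.state = "idle"

    def on_event(self, event):
        key = (self.state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"no transition for {key}")
        self.state = TRANSITIONS[key]
        return self.state

a = ActionAutomaton()
a.on_event("dispatch")
a.on_event("failure")   # monitor detects an execution issue
a.on_event("retry")     # the automaton allows a retry instead of replanning
a.on_event("success")
print(a.state)  # -> done
```

The benefit of this encoding is that execution issues become explicit transitions the executive can react to, rather than silent plan failures.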