The documentation requirements for data published in long-term archives have grown significantly over the last decade. At WDCC, the data publishing process is assisted by “Atarrabi”, a web-based workflow system in which data authors and the publication agent review and edit metadata. The system ensures high metadata quality for long-term use of the data with persistent identifiers (DOI/URN). These well-defined references (DOIs) allow credit to be given properly to the data producers in any publication.
Software testing in a web services environment faces different challenges than testing in traditional software environments. Regression testing activities are triggered by software changes or evolution. In web services, evolution is not a choice for service clients: they must always use the current, updated version of the software. In addition, test execution and invocation are expensive in web services, so algorithms that optimize test case generation and execution are vital. In this environment, we proposed several approaches for test case selection in web service regression testing. Testing in this new environment should evolve to become part of the service contract: service providers should supply data or usage sessions that help service clients reduce testing expenses by optimizing the selected and executed test cases.
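The selection approaches themselves are not spelled out in this abstract; as a generic illustration of coverage-driven test case selection, the following Python sketch greedily picks tests that exercise changed service operations (all names and data are hypothetical, not the authors' algorithm):

```python
# Hypothetical sketch: greedy coverage-based regression test selection.
# Pick test cases until every changed service operation is covered.

def select_tests(tests: dict[str, set[str]], changed_ops: set[str]) -> list[str]:
    """tests maps a test-case id to the service operations it invokes."""
    selected: list[str] = []
    uncovered = set(changed_ops)
    while uncovered:
        # Choose the test covering the most still-uncovered operations.
        best = max(tests, key=lambda t: len(tests[t] & uncovered))
        if not tests[best] & uncovered:
            break  # remaining operations are not exercised by any test
        selected.append(best)
        uncovered -= tests[best]
    return selected

if __name__ == "__main__":
    tests = {
        "t1": {"getQuote", "placeOrder"},
        "t2": {"placeOrder"},
        "t3": {"cancelOrder", "getQuote"},
    }
    print(select_tests(tests, changed_ops={"placeOrder", "cancelOrder"}))
    # -> ['t1', 't3'] (both changed operations covered with two tests)
```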
Approximate clone detection is the process of identifying similar process fragments in business process model collections. The tool presented in this paper can efficiently cluster approximate clones in large process model repositories. Once a repository is clustered, users can filter and browse the clusters using different filtering parameters. Our tool can also visualize clusters in the 2D space, allowing a better understanding of clusters and their member fragments. This demonstration will be useful for researchers and practitioners working on large process model repositories, where process standardization is a critical task for increasing the consistency and reducing the complexity of the repository.
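As a rough illustration of the clustering step (not the tool's actual algorithm or similarity measure), the following sketch groups fragments greedily under a placeholder similarity function, here a toy Jaccard similarity over fragment node sets:

```python
# Illustrative sketch of threshold-based clustering of process fragments.
# `similarity` stands in for whatever fragment metric a real tool uses.

def cluster_fragments(fragments, similarity, threshold=0.8):
    """Greedy single-pass clustering: each fragment joins the first
    cluster whose representative is similar enough, else starts a new one."""
    clusters: list[list] = []
    for frag in fragments:
        for cluster in clusters:
            if similarity(cluster[0], frag) >= threshold:
                cluster.append(frag)
                break
        else:
            clusters.append([frag])
    return clusters

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)

frags = [{"A", "B", "C"}, {"A", "B", "C", "D"}, {"X", "Y"}]
print(cluster_fragments(frags, jaccard, threshold=0.7))
# -> the two near-identical fragments cluster together; {'X','Y'} stands alone
```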
In a research project funded by the German Research Foundation, meteorologists, data publication experts, and computer scientists optimised the publication process of meteorological data and developed software that supports metadata review. The project group placed particular emphasis on scientific and technical quality assurance of primary data and metadata. At the end, the software automatically registers a Digital Object Identifier at DataCite. The software has been successfully integrated into the infrastructure of the World Data Center for Climate, but a key goal was to make the results applicable to data publication processes in other sciences as well.
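The project's own registration code is not shown here; as a hedged sketch of what automated DOI registration can look like, the following uses DataCite's present-day REST API (which postdates the 2012 project and may differ from the interface actually used), with placeholder credentials and a reserved test prefix:

```python
# Hedged sketch: registering a dataset DOI via DataCite's REST API.
# Credentials, prefix, and metadata values are all placeholders.
import requests

payload = {
    "data": {
        "type": "dois",
        "attributes": {
            "doi": "10.5072/example-dataset",  # 10.5072 is reserved for testing
            "event": "publish",
            "url": "https://example.org/dataset/42",
            "creators": [{"name": "Example, Author"}],
            "titles": [{"title": "Example climate dataset"}],
            "publisher": "World Data Center for Climate",
            "publicationYear": 2012,
            "types": {"resourceTypeGeneral": "Dataset"},
        },
    }
}

resp = requests.post(
    "https://api.test.datacite.org/dois",       # DataCite test endpoint
    json=payload,
    headers={"Content-Type": "application/vnd.api+json"},
    auth=("REPOSITORY_ID", "PASSWORD"),          # placeholder credentials
)
resp.raise_for_status()
print(resp.json()["data"]["id"])                 # the registered DOI
```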
YAWL User Group (2012)
This project investigated the viability of using the Microsoft Kinect to obtain reliable Red-Green-Blue-Depth (RGBD) information. It explored the usability of the Kinect in a variety of environments as well as its ability to detect different classes of materials and objects. This was facilitated through the implementation of Random Sample Consensus (RANSAC) based algorithms and highly parallelized workflows to provide time-sensitive results. We found that the Kinect provides detailed and reliable information in a time-sensitive manner. Furthermore, the project results recommend usability and operational parameters for the use of the Kinect as a scientific research tool.
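The project's implementation is not reproduced here; the following is a minimal sketch of RANSAC plane fitting on a point cloud of the kind a Kinect produces, with illustrative iteration count and inlier tolerance:

```python
# Minimal RANSAC plane-fitting sketch for an RGBD point cloud.
import numpy as np

def ransac_plane(points: np.ndarray, iters=200, tol=0.01, seed=None):
    """points: (N, 3) array in metres. Returns (normal, d, inlier mask)
    for the plane n.x + d = 0 with the most inliers."""
    rng = np.random.default_rng(seed)
    best = (None, None, np.zeros(len(points), dtype=bool))
    for _ in range(iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        n /= norm
        d = -n @ p1
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best[2].sum():
            best = (n, d, inliers)
    return best

# Toy usage: a noisy ground plane plus scattered outliers.
rng = np.random.default_rng(0)
ground = np.column_stack([rng.uniform(-1, 1, (500, 2)),
                          rng.normal(0, 0.003, 500)])
noise = rng.uniform(-1, 1, (100, 3))
n, d, mask = ransac_plane(np.vstack([ground, noise]), seed=1)
print(n, mask.sum())  # normal close to (0, 0, +/-1), roughly 500 inliers
```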
Traffic simulations for virtual environments are concerned with the behavior of individual traffic participants. The behavior in these simulations is often kept rather simple to abide by the constraints of processing resources. In sophisticated traffic simulations, the behavior of individual traffic participants is also modeled, but the focus lies on the overall behavior of the entire system, e.g. to identify possible bottlenecks of traffic flow [8].
At previous SIAS conferences, we presented a novel opto-electronic safety sensor system for skin detection at circular saws, jointly developed with the Institute for Occupational Safety and Health of the German Social Accident Insurance (IFA). This work now presents the development results of our subsequent research on a prototype of a sensor system for more general production machine applications, including robot workplaces. The system uses off-the-shelf LEDs and photodiodes in combination with dedicated optics and a microcontroller system to implement a so-called spectral light curtain.
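The published sensor's signal processing is not reproduced here; as a purely hypothetical sketch of the decision step in such a spectral detector, one can threshold a normalized difference of reflectance readings under two LED wavelengths (the wavelengths, calibration, and threshold below are assumptions, not the system's values):

```python
# Hypothetical decision step of a spectral skin detector: compare
# reflectance under two LED wavelengths via a normalized difference.

def normalized_difference(r_a: float, r_b: float) -> float:
    """Normalized difference of two reflectance readings (range -1..1)."""
    return (r_a - r_b) / (r_a + r_b)

def is_skin(r_short_nir: float, r_long_nir: float, threshold: float = 0.3) -> bool:
    # Illustrative assumption: skin reflects comparatively strongly at
    # the shorter NIR wavelength and weakly at the longer one.
    return normalized_difference(r_short_nir, r_long_nir) > threshold

print(is_skin(0.6, 0.2))   # True: strongly asymmetric spectrum
print(is_skin(0.5, 0.45))  # False: nearly flat spectrum (e.g. wood)
```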
Traffic simulations are typically concerned with modeling human behavior as closely as possible to create realistic results. In conventional traffic simulations used for road planning or traffic jam prediction only the overall behavior of an entire system is of interest. In virtual environments, like digital games, simulated traffic participants are merely a backdrop to the player’s experience and only need to be “sufficiently realistic”. Additionally, restricted computational resources, typical for virtual environment applications, usually limit the complexity of simulated behavior in this field. More importantly, two integral aspects of real-world traffic are not considered in current traffic simulations from both fields: misbehavior and risk taking of traffic participants. However, for certain applications like the FIVIS bicycle simulator, these aspects are essential.
Traditionally, traffic simulations are used to predict traffic jams, plan new roads or highways, and estimate road safety. They are also used in computer games and virtual environments. There are two general concepts of modeling traffic: macroscopic and microscopic modeling. Macroscopic traffic models take vehicle collectives into account and do not consider individual vehicles. Parameters like average velocity and density are used to model the flow of traffic. In contrast, microscopic traffic models consider each vehicle individually. Therefore, vehicle-specific parameters are of importance, e.g. current velocity, desired velocity, velocity difference to the lead vehicle, and individual time gap.
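The abstract does not commit to a specific model, but the well-known Intelligent Driver Model (IDM) is a standard microscopic car-following model built from exactly these parameters; a minimal sketch for illustration:

```python
import math

# Intelligent Driver Model (IDM): acceleration from current velocity,
# desired velocity, gap to the lead vehicle, and velocity difference.

def idm_acceleration(v, v_desired, gap, dv,
                     a_max=1.5, b=2.0, s0=2.0, T=1.5, delta=4):
    """v: current velocity [m/s], v_desired: desired velocity [m/s],
    gap: bumper-to-bumper distance to the lead vehicle [m],
    dv: velocity difference to the lead vehicle (v - v_lead) [m/s],
    T: desired time gap [s], s0: minimum standstill gap [m]."""
    s_star = s0 + max(0.0, v * T + v * dv / (2 * math.sqrt(a_max * b)))
    return a_max * (1 - (v / v_desired) ** delta - (s_star / gap) ** 2)

# Free road: large gap, no approach -> accelerates towards v_desired.
print(idm_acceleration(v=20, v_desired=30, gap=500, dv=0))
# Closing in fast on a slow leader -> strong braking (negative value).
print(idm_acceleration(v=20, v_desired=30, gap=15, dv=10))
```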
The work presented in this paper focuses on the comparison of well-known and new techniques for designing robust fault diagnosis schemes in the robot domain. The main challenge for fault diagnosis is to allow the robot to effectively cope not only with internal hardware and software faults but with external disturbances and errors from dynamic and complex environments as well.
In the realm of service robots, recovery from faults is indispensable to foster user acceptance. Here, fault is to be understood not in the sense of robot-internal faults, but rather as interaction faults occurring while the robot is situated in and interacting with an environment (aka external faults). We reason along the most frequent failures in typical scenarios which we observed during real-world demonstrations and competitions using our Care-O-bot III robot. They take place in an apartment-like environment, which is known as a closed world. We suggest four different, for now ad hoc, fault categories caused by disturbances, imperfect perception, inadequate planning, or chaining of action sequences. The faults are categorized and then mapped to a handful of partly known, partly extended fault handling techniques. Among them, we applied qualitative reasoning, use of simulation as an oracle, learning for planning (aka enhancement of plan operators), or, in future, case-based reasoning. Having laid out this frame, we mainly ask open questions related to the applicability of the presented approach, amongst them: how to find new categories, how to extend them, how to assure disjointness, and how to identify old and label new faults on the fly.
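As an illustration only, a category-to-technique mapping of the kind described above can be organized as a simple dispatch table; the specific pairing and all function names below are hypothetical, not the paper's mapping:

```python
# Hypothetical dispatch from fault categories to handling techniques.
# The four categories and the technique names come from the abstract;
# the one-to-one pairing shown here is invented for illustration.
from typing import Callable

def qualitative_reasoning(fault): ...
def simulation_as_oracle(fault): ...
def learn_plan_operators(fault): ...
def case_based_reasoning(fault): ...   # listed as future work

HANDLERS: dict[str, Callable] = {
    "disturbance": qualitative_reasoning,
    "imperfect_perception": simulation_as_oracle,
    "inadequate_planning": learn_plan_operators,
    "action_chaining": case_based_reasoning,
}

def handle_fault(category: str, fault) -> None:
    try:
        HANDLERS[category](fault)
    except KeyError:
        # Open question from the paper: how to identify and label
        # previously unseen fault categories on the fly.
        raise ValueError(f"unknown fault category: {category}")
```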
In this thesis, a method was developed for analysing molecules on the basis of their molecular surface and local values of physico-chemical and topographic properties. The fuzzy controller developed as the core component of the analysis combines molecular properties and selects the surface features relevant for interactions. The results of the fuzzy controller are used to compute 3D descriptors and to visualize the identified domains on the surface. Two kinds of descriptors are computed: descriptors representing the surface areas and memberships of the domains with respect to the specified binding features, and descriptors describing the spatial arrangement of the domains relative to one another. The surface processed by the fuzzy controller is provided in VRML format for visualization and further processing. The computed descriptors are used for similarity analysis of ligands and for searching for complementary regions at the binding site of a receptor. MTX in protonated form and DHF, which bind to the enzyme DHF reductase, as well as the inhibitors Sildenafil, Tadalafil and Vardenafil of the enzyme PDE-5A, were analysed under similarity aspects. When determining complementary binding features, the search starts from the binding features of a ligand and looks for complementary regions in the binding pocket of the receptor. As application examples, the binding sites of the enzyme DHF reductase from the complexes with MTX and DHF, and of the enzyme PDE-5A from the complexes with Sildenafil, Vardenafil and Tadalafil, are considered. Overall, the application examples showed that the presented fuzzy controller identifies binding features on the molecular surface, and that the rotation- and translation-invariant descriptors based on them can be applied to similarity analysis and to the search for complementary regions.
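As a minimal illustration of the fuzzy combination step (the membership functions, property names, and thresholds below are invented, not the thesis' actual rule base), consider fuzzifying two local surface properties per point and combining them with a fuzzy AND:

```python
# Minimal fuzzy-inference sketch in the spirit of the described
# controller: per surface point, fuzzify two local properties and
# combine them with the min t-norm as fuzzy AND.
import numpy as np

def ramp(x, lo, hi):
    """Piecewise-linear membership: 0 below lo, 1 above hi."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

def hydrophobic_patch_degree(lipophilicity, convexity):
    """Degree to which a surface point belongs to a convex lipophilic
    patch: mu_lipophilic AND mu_convex."""
    mu_lip = ramp(lipophilicity, 0.2, 0.8)
    mu_conv = ramp(convexity, 0.0, 0.5)
    return np.minimum(mu_lip, mu_conv)

# Three sample points: lipophilic+convex, polar, lipophilic+concave.
lip = np.array([0.9, 0.1, 0.9])
conv = np.array([0.6, 0.4, -0.3])
print(hydrophobic_patch_degree(lip, conv))  # [1.0, 0.0, 0.0]
```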