Document Type: Conference Object (64)
Year of publication: 2012 (64)
Has Fulltext: no (64)
Since 2012, the development of safety-related control functions must comply with the standards EN ISO 13849-1 or EN 62061, which specify requirements for both the hardware and the software. Until a few years ago the software requirements played hardly any role, since safety functions were preferably implemented in hardware. Today, however, it is very common to implement safety functions on a suitable programmable safety PLC. Besides the quantification of the hardware failure rates of safety functions, the new standards on the safe control of machinery also demand a management of the safety functions. This includes managing the software development for safety functions in order to minimize systematic faults. This software development management is essentially represented by the V-model. For the machine-building industry this management process must not be too elaborate, otherwise it will hardly be adopted in practice. One possible way of working through the V-model is presented. This approach is probably still too elaborate for industry.
This paper describes the development of a Pedelec controller whose performance level (PL) conforms to the European standard on safety of machinery [9] and whose software is verified to conform to the EPAC standard [6] by means of a software verification technique called model checking. In compliance with the standard [9], the hardware needs to implement the required properties corresponding to categories "C" and "D". The latter is used if the brakes are not able to bring the velomobile with a broken motor controller to a full stop. Therefore the controller needs to implement a test unit, which verifies the functionality of the components and, in case of an emergency, shuts the whole hardware down to prevent injuries to the cyclist. The MTTFd can be measured through a failure graph, which is the result of an FMEA analysis, and can be used to prove that the Pedelec controller meets the regulations of the system specification. The analysis of the system in compliance with [9] usually treats the software as a black box, thus ignoring its inner workings and validating its correctness by means of testing. In this paper we present a temporal logic specification according to [6], based on which the software for the Pedelec controller is implemented, and verify instead of only test its functionality. By means of model checking [1] we prove that the software fulfills all requirements which are regulated by its specification.
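The core idea of model checking, as opposed to testing, can be sketched with an explicit-state reachability check over a small controller state machine. The states and the safety property below are illustrative assumptions, not the abstract's actual specification, which is written in temporal logic and checked with a dedicated model checker.

```python
from collections import deque

# Hypothetical transition system for a motor-controller state machine.
# States and transitions are illustrative, not taken from the paper.
TRANSITIONS = {
    "idle":           ["running"],
    "running":        ["idle", "fault_detected"],
    "fault_detected": ["shutdown"],   # test unit must force a shutdown
    "shutdown":       ["shutdown"],   # absorbing safe state
}

def check_invariant(initial, transitions, prop):
    """Explicit-state reachability check: return a counterexample path
    to a state violating `prop`, or None if the property holds in every
    reachable state."""
    queue = deque([[initial]])
    seen = {initial}
    while queue:
        path = queue.popleft()
        state = path[-1]
        if not prop(state, transitions):
            return path  # counterexample trace
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Safety property: once a fault is detected, the only successor is shutdown.
def fault_implies_shutdown(state, transitions):
    if state == "fault_detected":
        return transitions[state] == ["shutdown"]
    return True
```

Unlike a test suite, which samples runs, this check exhaustively visits every reachable state, and a violation comes with a concrete trace leading to it.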
Traditionally traffic simulations are used to predict traffic jams, plan new roads or highways, and estimate road safety. They are also used in computer games and virtual environments. There are two general concepts of modeling traffic: macroscopic and microscopic modeling. Macroscopic traffic models take vehicle collectives into account and do not consider individual vehicles. Parameters like average velocity and density are used to model the flow of traffic. In contrast, microscopic traffic models consider each vehicle individually. Therefore, vehicle specific parameters are of importance, e.g. current velocity, desired velocity, velocity difference to the lead vehicle, individual time gap.
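A minimal microscopic update step can be sketched as follows. The Intelligent Driver Model (IDM) used here is one common car-following model built from exactly the listed parameters (current velocity, desired velocity, velocity difference, time gap); it is an assumption for illustration, not necessarily the model used in the paper, and the parameter values are likewise illustrative.

```python
import math

def idm_acceleration(v, v_desired, gap, dv,
                     a_max=1.0, b=1.5, s0=2.0, T=1.5, delta=4):
    """Intelligent Driver Model: acceleration of a vehicle given its
    current velocity v, desired velocity v_desired, bumper-to-bumper
    gap to the lead vehicle, and velocity difference dv = v - v_lead."""
    s_star = s0 + max(0.0, v * T + v * dv / (2 * math.sqrt(a_max * b)))
    return a_max * (1 - (v / v_desired) ** delta - (s_star / gap) ** 2)

def step(v, x, x_lead, v_lead, v_desired, dt=0.1):
    """One explicit Euler step for a single following vehicle."""
    gap = x_lead - x
    a = idm_acceleration(v, v_desired, gap, v - v_lead)
    v_new = max(0.0, v + a * dt)
    return v_new, x + v_new * dt
```

Iterating `step` over all vehicles, sorted by position, already yields a simple single-lane microscopic simulation; a macroscopic model would instead evolve aggregate density and mean-velocity fields.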
Approximate clone detection is the process of identifying similar process fragments in business process model collections. The tool presented in this paper can efficiently cluster approximate clones in large process model repositories. Once a repository is clustered, users can filter and browse the clusters using different filtering parameters. Our tool can also visualize clusters in the 2D space, allowing a better understanding of clusters and their member fragments. This demonstration will be useful for researchers and practitioners working on large process model repositories, where process standardization is a critical task for increasing the consistency and reducing the complexity of the repository.
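Threshold-based grouping over pairwise fragment distances gives the flavor of approximate clone clustering. The string edit distance below is only a stand-in for the process-model similarity measure used by the tool, and the greedy single-link scheme is an illustrative assumption, not the tool's actual algorithm.

```python
def edit_distance(a, b):
    """Levenshtein distance, standing in for a graph-edit distance
    between process fragments."""
    m, n = len(a), len(b)
    d = [[i + j if i * j == 0 else 0 for j in range(n + 1)]
         for i in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1,
                          d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))
    return d[m][n]

def cluster_clones(fragments, threshold):
    """Greedy single-link clustering: a fragment joins the first
    cluster containing a member within `threshold` edits."""
    clusters = []
    for frag in fragments:
        for cluster in clusters:
            if any(edit_distance(frag, member) <= threshold
                   for member in cluster):
                cluster.append(frag)
                break
        else:
            clusters.append([frag])
    return clusters
```

Each resulting cluster is a candidate set of approximate clones; the filtering parameters mentioned in the abstract would then operate on properties such as cluster size or average intra-cluster distance.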
This paper compares the memory allocation of two Java virtual machines, namely the Oracle Java HotSpot VM 32-bit (OJVM) and the Jamaica JamaicaVM (JJVM). The basic architectural difference between the two machines is that the JJVM uses fixed-size blocks for allocating objects on the heap. This means that objects have to be split into several connected blocks if they are bigger than the specified block size. On the other hand, for small objects a full block must be allocated. The paper contains both a theoretical and an experimental analysis of the memory overhead. The theoretical analysis is based on the specifications of the two virtual machines. The experimental analysis is done with a modified JVMTI agent together with the SPECjvm2008 benchmark.
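The overhead of fixed-size-block allocation can be quantified with a short calculation. The 32-byte block size and the 4-byte per-block link field below are assumptions for illustration, not the JamaicaVM's actual layout.

```python
import math

def block_overhead(object_size, block_size=32, link_bytes=4):
    """Bytes wasted when an object is stored in fixed-size blocks:
    each block reserves `link_bytes` to chain it to the next block,
    and the last block is padded to the full block size.
    Block size and link size are illustrative assumptions."""
    payload = block_size - link_bytes
    blocks = math.ceil(object_size / payload)
    return blocks * block_size - object_size
```

The two regimes from the abstract fall out directly: a large object pays chaining overhead in every block it spans, while a tiny object pays almost a whole block of padding.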
This paper describes adaptive time-frequency analysis of EEG signals, both in theory and in practice. A momentary frequency estimation algorithm is discussed and applied to EEG time series of test subjects performing a concentration experiment. The motivation for deriving and implementing a time-frequency estimator is the assumption that an emotional change implies a transient in the measured EEG time series, which in turn is superimposed by biological white noise as well as artifacts. It will be shown how accurately and robustly the estimator detects the transient even under such complicated conditions.
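A momentary (instantaneous) frequency estimate of the kind discussed can be sketched via the phase of the analytic signal. This FFT-based Hilbert construction is a generic textbook approach, not necessarily the estimator derived in the paper.

```python
import numpy as np

def instantaneous_frequency(x, fs):
    """Estimate the momentary frequency of a real signal x sampled at
    fs Hz from the phase of its analytic signal (FFT-based Hilbert
    transform). Returns one frequency value per sample step."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)               # spectral weights of the Hilbert filter
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(X * h)
    phase = np.unwrap(np.angle(analytic))
    return np.diff(phase) * fs / (2 * np.pi)
```

For a clean sinusoid the estimate sits at the true frequency; on noisy EEG data the raw phase derivative becomes erratic, which is precisely why a robust, adaptive estimator is needed.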
We present a study that investigates user performance benefits of playing video games using 3D motion controllers in 3D stereoscopic vision in comparison to monoscopic viewing. Using the PlayStation 3 game console coupled with the PlayStation Move Controller, we explored five different games that combine 3D stereo and 3D spatial interaction. For each game, quantitative and qualitative measures were taken to determine if users performed better and learned faster in the experimental group (3D stereo display) than in the control group (2D display). A game expertise pre-questionnaire was used to classify participants into beginner and expert game player categories to analyze a possible impact on performance differences. The results show two cases where the 3D stereo display did help participants perform significantly better than with a 2D display. For the first time, we can report a positive effect on gaming performance based on stereoscopic vision, although reserved to isolated tasks and depending on game expertise. We discuss the reasons behind these findings and provide recommendations for game designers who want to make use of 3D stereoscopic vision and 3D motion control to enhance game experiences.
The work presented in this paper focuses on the comparison of well-known and new techniques for designing robust fault diagnosis schemes in the robot domain. The main challenge for fault diagnosis is to allow the robot to effectively cope not only with internal hardware and software faults but with external disturbances and errors from dynamic and complex environments as well.
In the realm of service robots, recovery from faults is indispensable to foster user acceptance. Here fault is to be understood not in the sense of robot-internal faults, but rather as interaction faults while situated in and interacting with an environment (aka external faults). We reason along the most frequent failures in typical scenarios which we observed during real-world demonstrations and competitions using our Care-O-bot III robot. They take place in an apartment-like environment which is known as a closed world. We suggest four different, for now ad hoc, fault categories caused by disturbances, imperfect perception, inadequate planning, or chaining of action sequences. The faults are categorized and then mapped to a handful of partly known, partly extended fault handling techniques. Among them we applied qualitative reasoning, use of simulation as oracle, learning for planning (aka enhancement of plan operators) or, in future, case-based reasoning. Having laid out this frame, we mainly ask open questions related to the applicability of the presented approach, amongst them: how to find new categories, how to extend them, how to assure disjointness, and how to identify old and label new faults on the fly.
XML Encryption and XML Signature are fundamental security standards forming the core of many applications which need to process XML-based data. Due to the increased usage of XML in distributed systems and platforms such as SOA and Cloud settings, the demand for robust and effective security mechanisms has increased as well. Recent research work discovered, however, substantial vulnerabilities in these standards as well as in the vast majority of the available implementations. Amongst them, the so-called XML Signature Wrapping attack belongs to the most relevant ones. With the many possible instances of this attack type, it is feasible to annul security systems relying on XML Signature and to gain access to protected resources, as has been successfully demonstrated lately for various Cloud infrastructures and services. This paper contributes a comprehensive approach to robust and effective XML Signatures for SOAP-based Web Services. An architecture is proposed which integrates the required enhancements to ensure fail-safe and robust signature generation and verification. Following this architecture, a hardened XML Signature library has been implemented. The obtained evaluation results show that the developed concept and library provide the targeted robustness against all kinds of known XML Signature Wrapping attacks. Furthermore, the empirical results underline that these security merits are obtained at low efficiency and performance costs and remain compliant with the underlying standards.
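One widely discussed hardening measure against wrapping is to pin the signed element to its expected structural position instead of resolving it by Id alone. The check below is an illustrative sketch over a hypothetical SOAP message, not the API or internals of the library described in the paper.

```python
import xml.etree.ElementTree as ET

SOAP = "http://schemas.xmlsoap.org/soap/envelope/"
WSU = ("http://docs.oasis-open.org/wss/2004/01/"
       "oasis-200401-wss-wssecurity-utility-1.0.xsd")

def locate_signed_body(envelope_xml, ref_id):
    """Resolve a signature reference defensively: the wsu:Id must match
    exactly one element in the whole tree, and that element must be the
    Body child of the Envelope root (its pinned, expected position).
    A wrapped duplicate moved elsewhere makes the check fail."""
    root = ET.fromstring(envelope_xml)
    matches = [el for el in root.iter()
               if el.get("{%s}Id" % WSU) == ref_id]
    if len(matches) != 1:
        raise ValueError("Id %r resolved %d times" % (ref_id, len(matches)))
    body = root.find("{%s}Body" % SOAP)
    if matches[0] is not body:
        raise ValueError("signed element is not the Envelope/Body")
    return matches[0]
```

A naive verifier that dereferences the Id anywhere in the document would still validate the moved, signed original while the application processes the attacker's injected Body; fixing the resolved element's location closes that gap.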