004 Data processing; Computer science
Departments, institutes and facilities
- Fachbereich Informatik (55)
- Institute of Visual Computing (IVC) (17)
- Institut für Cyber Security & Privacy (ICSP) (9)
- Fachbereich Wirtschaftswissenschaften (6)
- Institut für Sicherheitsforschung (ISF) (4)
- Institut für funktionale Gen-Analytik (IFGA) (2)
- Fachbereich Ingenieurwissenschaften und Kommunikation (1)
- Institut für Verbraucherinformatik (IVI) (1)
Document Type
- Conference Object (41)
- Article (22)
- Report (6)
- Master's Thesis (5)
- Book (monograph, edited volume) (4)
- Part of a Book (3)
- Doctoral Thesis (2)
- Bachelor Thesis (1)
- Conference Proceedings (1)
- Contribution to a Periodical (1)
Year of publication
- 2012 (88)
Keywords
- 3D-Scanner (2)
- ARRs (2)
- Augmented Reality (2)
- Bag of Features (2)
- FDI (2)
- Hybrid systems (2)
- classifier combination (2)
- clustering (2)
- feature extraction (2)
- machine learning (2)
Multi-hop networks have been a research topic for many years, and for some years now the first realizations of such networks have existed. They make it possible to build self-organizing networks without any fixed infrastructure, which makes them interesting for a wide range of civil as well as tactical scenarios. This work focuses on tactical scenarios, such as public-safety, military, or disaster scenarios. In such scenarios, no existing communication infrastructure can be assumed for last-mile communication. Tactical multi-hop networks offer a way to realize last-mile communication nevertheless.
Malware is responsible for massive economic damage. As the preferred tool for digital crime, botnets are becoming increasingly sophisticated, using ever more resilient, distributed infrastructures based on peer-to-peer (P2P) protocols. At the same time, current investigation techniques for malware and botnets at the technical level are time-consuming and highly complex. Fraunhofer FKIE is addressing this problem, researching new ways of intelligent process automation and information management for malware analysis in order to minimize the time needed to investigate these threats.
Today’s computer systems face a vast array of severe threats posed by automated attacks performed by malicious software as well as manual attacks by individual humans. These attacks not only differ in their technical implementation but may also be location-dependent. Consequently, it is necessary to join the information from heterogeneous and distributed attack sensors in order to acquire comprehensive information on ongoing cyber attacks.
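A minimal sketch of the sensor-fusion idea described above, assuming hypothetical honeypot and IDS record formats: events from heterogeneous sensors are normalized into a common schema and merged into one chronological stream. All sensor types and field names are illustrative assumptions, not taken from the source.

```python
# Hypothetical sketch: normalizing events from heterogeneous attack sensors
# into a common record format so they can be correlated centrally.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AttackEvent:
    timestamp: datetime
    sensor: str        # e.g. "honeypot", "ids" (assumed sensor names)
    source_ip: str
    category: str      # e.g. "scan", "exploit"

def normalize_honeypot(record: dict) -> AttackEvent:
    # Assumed honeypot format: epoch seconds and its own attack labels.
    return AttackEvent(
        timestamp=datetime.fromtimestamp(record["ts"], tz=timezone.utc),
        sensor="honeypot",
        source_ip=record["peer"],
        category=record.get("attack_type", "unknown"),
    )

def normalize_ids(record: dict) -> AttackEvent:
    # Assumed IDS format: ISO timestamps and different field names.
    return AttackEvent(
        timestamp=datetime.fromisoformat(record["time"]),
        sensor="ids",
        source_ip=record["src"],
        category=record["signature"],
    )

def merge(events: list[AttackEvent]) -> list[AttackEvent]:
    # A single chronologically ordered stream gives a comprehensive view
    # of ongoing attacks across all sensor locations.
    return sorted(events, key=lambda e: e.timestamp)
```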
Software testing in a web services environment faces different challenges than testing in traditional software environments. Regression testing activities are triggered by software changes or evolution. In web services, evolution is not a choice for service clients: they always have to use the current, updated version of the software. In addition, test execution and invocation are expensive in web services, so algorithms that optimize test case generation and execution are vital. In this environment, we propose several approaches for test case selection in web service regression testing. Testing in this environment should evolve to become part of the service contract: service providers should supply data or usage sessions that help service clients reduce testing expenses by optimizing which test cases are selected and executed.
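An illustrative sketch, not the authors' algorithm: regression test cases are kept only if they exercise changed service operations and are ranked by how often those operations appear in provider-supplied usage sessions. The data structures, the ranking heuristic, and the invocation budget are assumptions for the example.

```python
# Hedged sketch of usage-aware regression test selection for a web service.
from collections import Counter

def select_tests(test_cases, changed_ops, usage_sessions, budget):
    """test_cases: dict test_id -> set of operations the test invokes
    changed_ops: set of operations modified in the new service version
    usage_sessions: list of operation sequences supplied by the provider
    budget: maximum number of (expensive) test invocations allowed"""
    usage = Counter(op for session in usage_sessions for op in session)
    # Keep only tests that touch at least one changed operation.
    relevant = {tid: ops for tid, ops in test_cases.items() if ops & changed_ops}
    # Prefer tests covering the changed operations that clients use most.
    ranked = sorted(
        relevant,
        key=lambda tid: sum(usage[op] for op in relevant[tid] & changed_ops),
        reverse=True,
    )
    return ranked[:budget]
```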
Approximate clone detection is the process of identifying similar process fragments in business process model collections. The tool presented in this paper can efficiently cluster approximate clones in large process model repositories. Once a repository is clustered, users can filter and browse the clusters using different filtering parameters. Our tool can also visualize clusters in the 2D space, allowing a better understanding of clusters and their member fragments. This demonstration will be useful for researchers and practitioners working on large process model repositories, where process standardization is a critical task for increasing the consistency and reducing the complexity of the repository.
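A hedged sketch of the clustering step described above, using a placeholder string-based similarity measure; the actual tool operates on process model fragments with its own similarity notion and filtering parameters, so the threshold and medoid-based grouping here are assumptions for illustration.

```python
# Greedy clustering of approximate clones: fragments whose pairwise
# similarity exceeds a threshold end up in the same cluster.
from difflib import SequenceMatcher

def similarity(frag_a: str, frag_b: str) -> float:
    # Stand-in similarity; real process fragments need a structural measure.
    return SequenceMatcher(None, frag_a, frag_b).ratio()

def cluster_approximate_clones(fragments, threshold=0.8):
    clusters = []
    for frag in fragments:
        for cluster in clusters:
            # Assign the fragment to the first cluster whose representative it matches.
            if similarity(frag, cluster[0]) >= threshold:
                cluster.append(frag)
                break
        else:
            clusters.append([frag])
    # Only groups with at least two members count as approximate clones.
    return [c for c in clusters if len(c) > 1]
```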
In a research project funded by the German Research Foundation, meteorologists, data publication experts, and computer scientists optimised the publication process of meteorological data and developed software that supports metadata review. The project group placed particular emphasis on scientific and technical quality assurance of primary data and metadata. At the end of the process, the software automatically registers a Digital Object Identifier at DataCite. The software has been successfully integrated into the infrastructure of the World Data Center for Climate, but a key goal was to make the results applicable to data publication processes in other sciences as well.
YAWL User Group (2012)
The objective of this thesis is to implement a computer-game-based motivation system for maximal strength testing on the Biodex System 3 Isokinetic Dynamometer. The prototype game has been designed to improve the peak torque produced in an isometric knee extensor strength test. An extensive analysis is performed on a torque data set from a previous study. The torque responses for five-second maximal voluntary contractions of the knee extensor are analyzed to understand the torque response characteristics of different subjects. The parameters identified in the data analysis are used in the implementation of the 'Shark and School of Fish' game. The behavior of the game for different torque responses is analyzed on a separate torque data set from the previous study. The evaluation shows that the game rewards and motivates continuously over a repetition to reach the peak torque value. It also shows that the game rewards the user more if they overcome a baseline torque value within the first second and then gradually increase the torque to reach peak torque.
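A hypothetical sketch of the reward idea summarised above: the player is rewarded for exceeding a baseline torque within the first second and for steadily building torque towards the peak. The sample rate, baseline value, and reward weights are assumed for the example and are not taken from the thesis.

```python
# Illustrative reward function over one repetition of torque samples.
def reward(torque, sample_rate_hz=100, baseline=50.0):
    """torque: list of torque samples (Nm) recorded over one repetition."""
    score = 0.0
    first_second = torque[:sample_rate_hz]
    # Bonus for overcoming the baseline torque within the first second.
    if first_second and max(first_second) >= baseline:
        score += 10.0
    # Continuous reward whenever a sample improves on the running maximum,
    # which encourages a gradual build-up towards the peak torque.
    running_max = 0.0
    for t in torque:
        if t > running_max:
            score += (t - running_max) * 0.1
            running_max = t
    return score
```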
In this paper, various enhanced sales forecast methodologies and models for the automobile market are presented. The methods used deliver highly accurate predictions while remaining explainable. The representation of the economic training data is discussed, as well as its effect on the prediction of newly registered automobiles. The methodology mainly consists of time series analysis and classical Data Mining algorithms, whereas the data is composed of absolute and/or relative market-specific exogenous parameters on a yearly, quarterly, or monthly basis. It can be concluded that the monthly forecasts in particular were improved by this enhanced methodology using absolute, normalized exogenous parameters. Decision Trees are considered the most suitable method in this case, being both accurate and explicable. The German and the US-American automobile markets are used to evaluate the forecast models.
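An illustrative sketch, not the authors' pipeline: monthly new-car registrations are forecast from normalized exogenous indicators with a decision tree regressor. The indicator names, the toy numbers, and the model depth are assumptions for the example.

```python
# Decision-tree forecast from normalized exogenous parameters (toy data).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeRegressor

# Assumed exogenous indicators per month: GDP growth, interest rate, fuel price.
X = np.array([[1.2, 4.0, 1.45],
              [0.9, 4.1, 1.50],
              [1.0, 3.9, 1.55],
              [0.7, 4.2, 1.60]])
y = np.array([240_000, 231_000, 245_000, 228_000])  # registrations per month

# Normalizing the absolute exogenous parameters puts indicators with very
# different scales on a comparable footing.
scaler = StandardScaler()
X_norm = scaler.fit_transform(X)

model = DecisionTreeRegressor(max_depth=3).fit(X_norm, y)

next_month = scaler.transform([[1.1, 4.0, 1.48]])
print(model.predict(next_month))  # forecast for the next month
```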
Computational chemistry began with the birth of computers in the mid-1900s, and its growth has been directly coupled to the technological advances made in computer science and high-performance computing. A popular goal within the field, whether using Newtonian or quantum-based methods, is the accurate modelling of physical forces and energetics through mathematics and algorithm design. Through reliable modelling of the underlying forces, molecular simulations frequently provide atomistic insights into macroscopic experimental observations.
This book constitutes the thoroughly refereed post-conference proceedings of the Third International ICST Conference on e-Infrastructure and e-Services for Developing Countries, AFRICOMM 2011, held in Zanzibar, Tanzania, in November 2011. The 24 revised full papers presented together with 2 poster papers were carefully reviewed and selected from numerous submissions. The papers cover a wide range of topics in the field of information and communication infrastructures. They are organized in two tracks: communication infrastructures for developing countries and electronic services, ICT policy, and regulatory issues for developing countries.
This project investigated the viability of using the Microsoft Kinect to obtain reliable Red-Green-Blue-Depth (RGBD) information. It explored the usability of the Kinect in a variety of environments as well as its ability to detect different classes of materials and objects. This was facilitated through the implementation of Random Sample Consensus (RANSAC) based algorithms and highly parallelized workflows in order to provide time-sensitive results. We found that the Kinect provides detailed and reliable information in a time-sensitive manner. Furthermore, the project results recommend usability and operational parameters for the use of the Kinect as a scientific research tool.
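A minimal RANSAC plane-fitting sketch of the kind used to segment planar surfaces in Kinect RGBD point clouds; the iteration count and distance threshold are illustrative, not the project's actual parameters.

```python
# RANSAC plane fit on a (N, 3) point cloud in metres.
import numpy as np

def ransac_plane(points: np.ndarray, iters=500, dist_thresh=0.01, rng=None):
    """Returns (unit normal, plane offset d, boolean inlier mask)."""
    rng = rng or np.random.default_rng()
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = (np.array([0.0, 0.0, 1.0]), 0.0)
    for _ in range(iters):
        # Sample three points and derive the plane they span.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:            # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ p0
        # Consensus set: points within the distance threshold of the plane.
        inliers = np.abs(points @ normal + d) < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model[0], best_model[1], best_inliers
```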
Traffic simulations for virtual environments are concerned with the behavior of individual traffic participants. The behavior in these simulations is often kept rather simple to abide by the constraints of processing resources. In sophisticated traffic simulations, the behavior of individual traffic participants is also modeled, but the focus lies on the overall behavior of the entire system, e.g. to identify possible bottlenecks of traffic flow [8].
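A toy sketch of the kind of simple per-participant behavior used in virtual-environment traffic simulations: each vehicle on a single lane accelerates towards a maximum speed while keeping a safe gap to its predecessor. All parameters are illustrative assumptions.

```python
# One simulation step for a single lane; vehicles are ordered front to back.
def step(positions, speeds, dt=0.1, max_speed=14.0, accel=2.0, min_gap=5.0):
    """positions/speeds: lists ordered from the lead vehicle to the last one."""
    new_positions, new_speeds = [], []
    for i, (x, v) in enumerate(zip(positions, speeds)):
        gap = positions[i - 1] - x if i > 0 else float("inf")
        if gap < min_gap + v * dt:
            v = max(0.0, gap - min_gap) / dt      # brake to keep the gap
        else:
            v = min(max_speed, v + accel * dt)    # otherwise accelerate
        new_positions.append(x + v * dt)
        new_speeds.append(v)
    return new_positions, new_speeds
```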