Approximate clone detection is the process of identifying similar process fragments in business process model collections. The tool presented in this paper can efficiently cluster approximate clones in large process model repositories. Once a repository is clustered, users can filter and browse the clusters using different filtering parameters. Our tool can also visualize clusters in 2D space, allowing a better understanding of clusters and their member fragments. This demonstration will be useful for researchers and practitioners working with large process model repositories, where process standardization is critical for increasing consistency and reducing complexity.
YAWL User Group (2012)
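The clustering step described above can be sketched in a few lines. The snippet below is a minimal illustration only, not the paper's actual algorithm: it assumes process fragments are serialized as strings, uses `difflib.SequenceMatcher` as a stand-in similarity measure, and groups fragments greedily whenever their similarity exceeds a threshold (single-link style).

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two serialized process fragments."""
    return SequenceMatcher(None, a, b).ratio()

def cluster_fragments(fragments, threshold=0.8):
    """Greedy single-link clustering: a fragment joins the first cluster
    that already contains a member at least `threshold` similar to it."""
    clusters = []
    for frag in fragments:
        for cluster in clusters:
            if any(similarity(frag, member) >= threshold for member in cluster):
                cluster.append(frag)
                break
        else:
            clusters.append([frag])
    return clusters

# Hypothetical serialized fragments; the first two are near-clones.
fragments = [
    "receive order -> check stock -> ship goods",
    "receive order -> check stock -> ship items",
    "register claim -> assess damage -> pay out",
]
print(cluster_fragments(fragments))
```

A real implementation would use a structural similarity measure (e.g. graph-edit distance on the fragment graphs) rather than string matching, which is exactly where efficient indexing becomes important at repository scale.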
The documentation requirements for data published in long-term archives have grown significantly over the last decade. At WDCC, the data publishing process is assisted by “Atarrabi”, a web-based workflow system in which the data authors and the publication agent review and edit metadata. The system ensures high metadata quality for long-term use of the data with persistent identifiers (DOI/URN). These well-defined references (DOIs) allow credit to be given properly to the data producers in any publication.
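A core part of such a review workflow is checking that a record is complete before a persistent identifier is minted. The sketch below is purely illustrative and not Atarrabi's actual logic; the field names are loosely modeled on the mandatory properties of the DataCite metadata schema (creators, title, publisher, publication year).

```python
# Hypothetical pre-registration completeness check; field names are an
# assumption loosely based on DataCite's mandatory metadata properties.
REQUIRED_FIELDS = ["creators", "title", "publisher", "publication_year"]

def missing_fields(metadata: dict) -> list:
    """Return the required fields that are absent or empty in a record."""
    return [f for f in REQUIRED_FIELDS if not metadata.get(f)]

record = {
    "title": "Regional climate simulation output",
    "creators": ["Example, Author"],
    "publisher": "World Data Center for Climate",
}
print(missing_fields(record))  # publication_year has not been filled in yet
```

In a workflow system like the one described, a record with a non-empty result here would be sent back to the data author rather than forwarded for DOI registration.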
This paper describes adaptive time-frequency analysis of EEG signals, both in theory and in practice. A momentary frequency estimation algorithm is discussed and applied to EEG time series of test subjects performing a concentration experiment. The motivation for deriving and implementing a time-frequency estimator is the assumption that an emotional change produces a transient in the measured EEG time series, which is in turn superimposed with biological white noise and artifacts. It is shown how accurately and robustly the estimator detects such transients even under these complicated conditions.
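The paper's specific estimator is not reproduced here, but a common way to compute a momentary (instantaneous) frequency is via the phase of the analytic signal obtained from the Hilbert transform. The sketch below demonstrates that standard approach on a synthetic noisy 10 Hz oscillation; the sampling rate and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def momentary_frequency(x, fs):
    """Instantaneous frequency (Hz) of a real signal x sampled at fs,
    estimated from the unwrapped phase of its analytic signal."""
    analytic = hilbert(x)
    phase = np.unwrap(np.angle(analytic))
    return np.diff(phase) * fs / (2.0 * np.pi)

fs = 250.0                         # an assumed, typical EEG sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
# 10 Hz alpha-band oscillation plus weak white noise
x = np.sin(2 * np.pi * 10.0 * t) + 0.05 * rng.standard_normal(t.size)

f = momentary_frequency(x, fs)
print(float(np.median(f)))         # median estimate is close to the true 10 Hz
```

The median is used because the sample-wise phase derivative is noisy at the signal edges and wherever artifacts dominate; an adaptive estimator like the one in the paper would additionally track frequency changes over time.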
In a research project funded by the German Research Foundation (DFG), meteorologists, data publication experts, and computer scientists optimised the publication process for meteorological data and developed software that supports metadata review. The project group placed particular emphasis on scientific and technical quality assurance of primary data and metadata. At the end of the process, the software automatically registers a Digital Object Identifier (DOI) at DataCite. The software has been successfully integrated into the infrastructure of the World Data Center for Climate, but a key goal was to make the results applicable to data publication processes in other sciences as well.