H-BRS Bibliography
Departments, institutes and facilities
- Fachbereich Informatik (64)
- Fachbereich Ingenieurwissenschaften und Kommunikation (47)
- Fachbereich Sozialpolitik und Soziale Sicherung (41)
- Fachbereich Wirtschaftswissenschaften (40)
- Fachbereich Angewandte Naturwissenschaften (34)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (28)
- Institut für funktionale Gen-Analytik (IFGA) (16)
- Institut für Verbraucherinformatik (IVI) (13)
- Institut für Cyber Security & Privacy (ICSP) (12)
- Institute of Visual Computing (IVC) (9)
Document Type
- Article (68)
- Conference Object (62)
- Part of a Book (29)
- Book (monograph, edited volume) (25)
- Contribution to a Periodical (10)
- Preprint (10)
- Video (8)
- Report (8)
- Research Data (6)
- Doctoral Thesis (5)
- Book review (3)
- Conference Proceedings (1)
- Lecture (1)
- Patent (1)
- Study Thesis (1)
- Working Paper (1)
Year of publication
- 2021 (239)
Has Fulltext
- no (239)
Keywords
- Lehrbuch (7)
- DGQ (6)
- Melcher (6)
- Grundwerkzeug des Qualitätsmanagements (3)
- Machine Learning (3)
- Augmented Reality (2)
- Automatic Differentiation (2)
- Cognitive robot control (2)
- Digitalisierung (2)
- Dimensionality reduction (2)
BWL für Dummies
(2021)
The majority of biomedical knowledge is stored in structured databases or as unstructured text in scientific publications. This vast amount of information has led to numerous machine learning-based biological applications using either text through natural language processing (NLP) or structured data through knowledge graph embedding models (KGEMs). However, representations based on a single modality are inherently limited. To generate better representations of biological knowledge, we propose STonKGs, a Sophisticated Transformer trained on biomedical text and Knowledge Graphs. This multimodal Transformer uses combined input sequences of structured information from KGs and unstructured text data from biomedical literature to learn joint representations. First, we pre-trained STonKGs on a knowledge base assembled by the Integrated Network and Dynamical Reasoning Assembler (INDRA) consisting of millions of text-triple pairs extracted from biomedical literature by multiple NLP systems. Then, we benchmarked STonKGs against two baseline models trained on either one of the modalities (i.e., text or KG) across eight different classification tasks, each corresponding to a different biological application. Our results demonstrate that STonKGs outperforms both baselines, especially on the more challenging tasks with respect to the number of classes, improving upon the F1-score of the best baseline by up to 0.083. Additionally, our pre-trained model as well as the model architecture can be adapted to various other transfer learning applications. Finally, the source code and pre-trained STonKGs models are available at https://github.com/stonkgs/stonkgs and https://huggingface.co/stonkgs/stonkgs-150k.
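The abstract describes combining unstructured text with structured KG triples into a single Transformer input. A minimal sketch of that idea, assuming hypothetical separator conventions and token names (this is not the actual STonKGs implementation; see the linked repositories for the real code):

```python
# Hedged sketch: build one combined input sequence from an evidence
# sentence and an INDRA-style (head, relation, tail) triple, in the
# spirit of the multimodal input the abstract describes. The special
# tokens "[CLS]"/"[SEP]" and the example triple are illustrative
# assumptions, not the paper's actual preprocessing.

def build_multimodal_input(text_tokens, triple):
    """Concatenate text tokens and triple tokens into a single
    sequence, separated by special tokens."""
    head, relation, tail = triple
    return (["[CLS]"] + text_tokens + ["[SEP]"]
            + [head, relation, tail] + ["[SEP]"])

seq = build_multimodal_input(
    ["EGFR", "activates", "MAPK1"],           # evidence sentence tokens
    ("EGFR", "increases_activity", "MAPK1"),  # structured triple
)
```

A real text-triple pair would then be mapped to IDs by a tokenizer and embedded jointly, so that attention can flow between the text and KG portions of the sequence.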
Solving transport network problems can be complicated by non-linear effects. In the particular case of gas transport networks, the most complex non-linear elements are compressors and their drives. They are described by a system of equations composed of a piecewise linear ‘free’ model for the control logic and a non-linear ‘advanced’ model for the calibrated characteristics of the compressor. For all element equations, certain stability criteria must be fulfilled to ensure the absence of folds in the associated system mapping. In this paper, we consider a transformation (warping) of the system from the space of calibration parameters to the space of transport variables that satisfies these criteria. The algorithm drastically improves the stability of the network solver. Numerous tests on realistic networks show that a nearly 100% convergence rate of the solver is achieved with this approach.
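The stability criterion above, the absence of folds in the system mapping, amounts to requiring that the element mapping stay monotone. A toy illustration of that distinction (these functions are illustrative assumptions, not the paper's compressor equations):

```python
# Hedged illustration: a "fold" in a 1-D element mapping means the
# function is non-monotone, which can make a Newton-type network solver
# oscillate between branches. A fold-free (warped) mapping is strictly
# monotone. Both functions below are toy examples for illustration only.

def folded(x):
    return x**3 - x        # non-monotone on [-1, 1]: has a fold

def fold_free(x):
    return x**3 + x        # strictly increasing on [-1, 1]: no fold

def is_monotone(f, lo=-1.0, hi=1.0, n=1000):
    """Check non-decreasing behaviour of f on a sampled grid."""
    xs = [lo + (hi - lo) * i / n for i in range(n + 1)]
    ys = [f(x) for x in xs]
    return all(b >= a for a, b in zip(ys, ys[1:]))
```

In the paper's setting the mapping is multi-dimensional and the warping is constructed between parameter and transport-variable spaces, but the requirement plays the same role: a fold-free mapping keeps the solver's iteration well-behaved.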
The lattice Boltzmann method (LBM) is an efficient simulation technique for computational fluid mechanics and beyond. It is based on a simple stream-and-collide algorithm on Cartesian grids, which is easily compatible with modern machine learning architectures. While it is becoming increasingly clear that deep learning can provide a decisive stimulus for classical simulation techniques, recent studies have not addressed possible connections between machine learning and LBM. Here, we introduce Lettuce, a PyTorch-based LBM code with a threefold aim. Lettuce enables GPU-accelerated calculations with minimal source code, facilitates rapid prototyping of LBM models, and enables integrating LBM simulations with PyTorch's deep learning and automatic differentiation facilities. As a proof of concept for combining machine learning with the LBM, a neural collision model is developed, trained on a doubly periodic shear layer, and then transferred to a different flow, decaying turbulence. We also exemplify the added benefit of PyTorch's automatic differentiation framework in flow control and optimization. To this end, the spectrum of a forced isotropic turbulence is maintained without further constraining the velocity field.
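The stream-and-collide algorithm mentioned in the abstract can be sketched in a few lines. Below is a minimal D2Q9 BGK implementation in NumPy, assuming an illustrative grid size and relaxation time (this is a textbook-style sketch, not the Lettuce code, which uses PyTorch tensors to obtain GPU acceleration and automatic differentiation):

```python
import numpy as np

# Minimal D2Q9 lattice Boltzmann stream-and-collide sketch.
# Grid size and relaxation time tau are illustrative assumptions.

# D2Q9 discrete velocities and lattice weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, u):
    """Second-order equilibrium distribution f_eq(rho, u)."""
    cu = np.einsum("qd,dxy->qxy", c, u)       # c_i . u
    usq = np.einsum("dxy,dxy->xy", u, u)      # |u|^2
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def stream_and_collide(f, tau=0.6):
    """One LBM time step: BGK collision, then periodic streaming."""
    rho = f.sum(axis=0)                            # density
    u = np.einsum("qd,qxy->dxy", c, f) / rho       # velocity
    f = f + (equilibrium(rho, u) - f) / tau        # BGK collision
    for i, (cx, cy) in enumerate(c):               # streaming step
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    return f

# A uniform equilibrium state should be a fixed point of the update.
nx = ny = 8
f0 = equilibrium(np.ones((nx, ny)), np.zeros((2, nx, ny)))
f1 = stream_and_collide(f0.copy())
```

Because every operation here is array arithmetic, porting the same kernel to PyTorch makes the whole simulation differentiable end-to-end, which is what enables the neural collision models and gradient-based flow optimization described in the abstract.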
Kinder – unsere Zukunft!
(2021)
Using Visual and Auditory Cues to Locate Out-of-View Objects in Head-Mounted Augmented Reality
(2021)
Intercultural Management
(2021)
This book gives a comprehensive presentation of how Bond Graph methodology can support model-based control, model-based fault diagnosis, fault accommodation, and failure prognosis. It reviews the state of the art, presents a hybrid integrated approach to Bond Graph model-based fault diagnosis and failure prognosis, and surveys software that can be used for these tasks.
Konzept zum Umgang mit Prüfungsstress und Lernblockaden bei Studierenden in der Studieneingangsphase
(2021)
Steuerlehre für Dummies
(2021)