H-BRS Bibliography: yes (66)
Departments, institutes and facilities: Graduierteninstitut (66)
Document Type: Doctoral Thesis (66)
Has Fulltext: no (66)
Keywords
- Lignin (3)
- Antioxidans (2)
- Evolutionary optimization (2)
- Gitter-Boltzmann-Methode (2)
- Human-Computer Interaction (2)
- Nachhaltigkeit (2)
- Quality diversity (2)
- Robotics (2)
- Spektroskopie (2)
- Strömungssimulation (2)
- Sustainability (2)
- Virtual Reality (2)
- ABTS (1)
- API Gebrauchstauglichkeit (1)
- API usability (1)
- Account (Datenverarbeitung) (1)
- Active and Healthy Aging Technologies (1)
- Additiv (1)
- Aerodynamics (1)
- Aerodynamik (1)
- Agarose (1)
- Ammoniak (1)
- Analytik (1)
- Anoplophora glabripennis (1)
- Antioxidant capacity (1)
- Antioxidanz (1)
- Arbeitspause (1)
- Augmented Reality (1)
- Augmented reality (1)
- Ausbreitung (1)
- Authentifikation (1)
- Automated design (1)
- Autonomes Fahren (1)
- Autonomous Driving (1)
- Autonomous Systems (1)
- Bacteria (1)
- Bakterien (1)
- Benetzbarkeit (1)
- Benzoyl-coenzym A (1)
- Bioactivity (1)
- Bioaktive Verbindung (1)
- Bioaktivität (1)
- Biomass (1)
- Biomasse (1)
- Bodengesundheit (1)
- Charakterisierung (1)
- Chemische Analyse (1)
- Chemometrics (1)
- Chemometrie (1)
- Christmas trees (1)
- Clustering (1)
- Co-creative processes (1)
- Computational creativity (1)
- Computergrafik (1)
- Computersicherheit (1)
- Connectivity in rural areas (1)
- Consumer Informatics (1)
- Consumption (1)
- DNA profiling (1)
- DPPH (1)
- Data Protection (1)
- Demands of Older Adults (1)
- Design Case Study (1)
- Detektion von Explosivstoffen (1)
- Digital design (1)
- Dimensionality reduction (1)
- Divergent optimization (1)
- Drahtloses lokales Netz (1)
- Düngemittel (1)
- Empfehlungssystem (1)
- Empfehlungssysteme (1)
- Employee Privacy (1)
- Energy meteorology (1)
- Enhanced weathering (1)
- Ernte (1)
- Erweiterte Realität <Informatik> (1)
- Extended reality (1)
- FPGA (1)
- FRAP (1)
- Fluiddynamik (1)
- Folin-Ciocalteu (1)
- Food packaging (1)
- Foveated rendering (1)
- Gebrauchstauglicher Datenschutz (1)
- Generative Models (1)
- Genotyp (1)
- Gesteinsmehl (1)
- Gesundheit (1)
- Geteilte autonome Fahrzeuge (1)
- Global illumination (1)
- Glutamin N-phenylacetyltransferase (1)
- Glycin N-acyltransferase (1)
- Glycine N-acyltransferase (1)
- Glycine conjugation (1)
- Glyzinkonjugation (1)
- Grounded Theory (1)
- Gülle (1)
- HSQC NMR (1)
- Harnstoffzyklusdefekt (1)
- Health Technology Design (1)
- Human factors (1)
- Human-Centered Design (1)
- Information Privacy (1)
- Informationsflüsse (1)
- Infrared (1)
- Infrarot (1)
- Innovation (1)
- Integration of New Technologies for the Elderly (1)
- Integration of Technologies for Active and Healthy Aging (1)
- Interaction design (1)
- Interaktionsdesign (1)
- Interferenz (1)
- Irreguläre Gitter (1)
- Isovalerianazidämie (1)
- Isovaleric acidemia (1)
- Knochenzement (1)
- Kompressible Strömung (1)
- Konsum (1)
- Kontaktwinkel (1)
- Kunststoffverpackung (1)
- Large high-resolution displays (1)
- Lattice-Boltzmann (1)
- Lattice-Boltzmann-Methode (1)
- Lebensmittelverpackungen (1)
- Legal Design (1)
- Login (1)
- Low-power design (1)
- MOX Gassensoren (1)
- Miscanthus (1)
- Miscanthus x giganteus (1)
- Mixed methods (1)
- Model-Based Software Development (1)
- Molekulargewicht (1)
- Nachwachsender Rohstoff (1)
- Nadelhölzer (1)
- Network simulation verification (1)
- Neurophysiologie (1)
- Next generation sequencing (NGS) (1)
- Numerische Strömungssimulation (1)
- Nutzerorientierte Methoden (1)
- OH-Zahl-Bestimmungen (1)
- Organosolv lignin (1)
- Organosolv-Lignin (1)
- Organosolv-Verfahren (1)
- Participatory design studies (1)
- Passwort (1)
- Paulownia tomentosa (1)
- Phase II Reaktion (1)
- Phasenübergang (1)
- Phenol-Formaldehyd-Harze (1)
- Phenol-formaldehyde resin (1)
- Phenylacetyl-coenzym A (1)
- Polyurethan (1)
- Polyurethan-Coatings (1)
- Polyurethanbeschichtungen (1)
- Polyurethane (1)
- Privacy Risk Assessment (1)
- Radiance caching (1)
- Raman (1)
- Ray tracing (1)
- Remote laboratory (1)
- Renewable resource (1)
- Ressource (1)
- Right to Informational Self-Determination (1)
- Rock dust (1)
- Rosskastanie (1)
- Schneeglöckchen (1)
- Semi-Lagrange-Verfahren (1)
- Shan-Chen (1)
- Shared autonomous vehicles (1)
- Sicherheits-APIs (1)
- Signal (1)
- Silphium perfoliatum (1)
- Skin cells (1)
- Skin flakes (1)
- Smartphone (1)
- Social practice theory (1)
- Social practices (1)
- Softwareentwicklung (1)
- Soil health (1)
- Solar power (1)
- Soziale Praktiken (1)
- Spectroscopy (1)
- Spectroscopy (1)
- Sperm cells (1)
- Spermatozoa (1)
- Stabilisator (1)
- Stabilization (1)
- Statistische Physik (1)
- Strukturaufklärung (1)
- Strömungsmechanik (1)
- Surrogate Modeling (1)
- Surrogate-assistance (1)
- Synergie (1)
- TD-GC/MS (1)
- Technikfolgenabschätzung (1)
- Technology Assessment (1)
- Theorie der sozialen Praxis (1)
- Thyme (1)
- Thymian (1)
- Touchscreens (1)
- UV (1)
- Usable Privacy (1)
- User-oriented methods (1)
- Verbrauch (1)
- Verbraucherinformatik (1)
- Virtuelle Realität (1)
- Virtuelle Realität (1)
- Visuelle Wahrnehmung (1)
- Weihnachtsbaum (1)
- West Africa (1)
- Whole genome amplification (WGA) (1)
- WiFi-based Long Distance networks (1)
- Wirkstofffreisetzung (1)
- Zweckbindung (1)
- additive (1)
- adhesion (1)
- ammonia (1)
- antimicrobial coatings (1)
- antimikrobielle Beschichtungen (1)
- antioxidant (1)
- benzoyl-coA (1)
- beschleunigte Verwitterung (1)
- bio-based (1)
- biobased plastics (1)
- biobasiert (1)
- biobasierte Kunststoffe (1)
- blown film extrusion (1)
- bulk and local viscoelastic properties (1)
- chain extender cross-linker (1)
- characterization (1)
- coefficient of thermal expansion (1)
- compost disintegration (1)
- coniferous woods (1)
- determination of OH content (1)
- developer centered security (1)
- dielectric analysis (1)
- dielektrická analýza (1)
- diffusion (1)
- distribuce záření (1)
- eating behavior (1)
- entwicklerzentrierte Sicherheit (1)
- external faults (1)
- extrusion blow molding (1)
- fault handling (1)
- fertilizer (1)
- fotokompozit (1)
- glutamine N-phenylacetyltransferase (1)
- guidance (1)
- haptics (1)
- hardness testing (1)
- health intervention (1)
- horse chestnut (1)
- hybrid robot skill representation (1)
- information flows (1)
- kinetika vytvrzování (1)
- light distribution (1)
- lignin (1)
- microindentation (1)
- migration (1)
- molecular weight (1)
- morphology (1)
- multisensory (1)
- nachhaltig (1)
- naive physics (1)
- nudging (1)
- phase II reaction (1)
- phenylacetyl-coA (1)
- photo-polymerization (1)
- poly(butylene adipate-co-terephthalate) (1)
- poly(lactic acid) (1)
- pressure sensitive adhesives (1)
- process-induced structure (1)
- processing-structure-property relationship (1)
- qualitative reasoning (1)
- reaction kinetics (1)
- recommender systems (1)
- ressources (1)
- rheology (1)
- robot context awareness (1)
- robot failure diagnosis (1)
- robot introspection (1)
- robot skill execution failures (1)
- robot skill generalisation (1)
- rock powder (1)
- security APIs (1)
- sensory perception (1)
- service robots (1)
- simulation (1)
- slurry (1)
- snowdrop (1)
- software development (1)
- structure elucidation (1)
- students (1)
- sustainable (1)
- synergism (1)
- thermo-mechanical properties (1)
- transdermal therapeutic systems (1)
- tvrdost (1)
- urea cycle defect (1)
- viscoelastic properties (1)
- visible light curing resin based composites (1)
- viskoelastické vlastnosti (1)
- volatile organic compounds (1)
- vytvrzování světlem (1)
Solving differential-algebraic equations (DAEs) efficiently by means of appropriate numerical schemes for time-integration is an ongoing topic in applied mathematics. Effective computation becomes especially relevant for the large systems that occur in many fields of practical application, for instance when simulating network structures that describe the transport of fluid and gas, or electrical circuits. Due to the stiffness properties of DAEs, time-integration of such problems generally demands implicit strategies. Among the schemes that prove to be an adequate choice are linearly implicit Runge-Kutta methods in the form of Rosenbrock-Wanner (ROW) schemes. Compared to fully implicit methods, they are easy to implement and avoid the solution of non-linear equations by including Jacobian information directly in their formulation. However, Jacobian calculations are costly, so the necessity of computing the exact Jacobian at every successful time-step proves to be a considerable drawback. To overcome this drawback, a ROW-type method is introduced that allows for non-exact Jacobian entries when solving semi-explicit DAEs of index one. The resulting scheme thus makes it possible to exploit several strategies for saving computational effort, such as partial explicit integration of non-stiff components, more advantageous sparse Jacobian structures, or time-lagged Jacobian information. In fact, because it allows for non-exact Jacobian expressions, the given scheme can be interpreted as a generalized ROW-type method for DAEs, since it covers many different ROW-type schemes known from the literature. To derive the order conditions of the ROW-type method introduced, a theory is developed that identifies the occurring differentials and coefficients graphically by means of rooted trees.
Rooted trees for describing numerical methods were originally introduced by J.C. Butcher. They significantly simplify the determination and definition of the relevant characteristics because they allow straightforward procedures to be applied. The theory presented here combines the strategies used to represent ROW-type methods with exact Jacobian for DAEs and ROW-type methods with non-exact Jacobian for ODEs. For this purpose, new types of vertices are introduced in order to describe the occurring non-exact elementary differentials completely. The resulting theory thus automatically comprises the relevant approaches known from the literature. As a consequence, it makes it possible both to recognize the order conditions of the familiar methods it covers and to identify new conditions. With the theory developed, new sets of coefficients are derived that realize the introduced ROW-type method up to orders two and three. Some of them are constructed on the basis of methods known from the literature that satisfy additional conditions for avoiding effects of order reduction. It is shown that these methods can be improved by means of the newly derived order conditions without increasing the number of internal stages. Convergence of the resulting methods is analyzed for several academic test problems. The results confirm the theory and the order conditions found, as only schemes satisfying the predicted order conditions preserve their order when non-exact Jacobian expressions are used.
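The core mechanism of such linearly implicit schemes can be sketched for the plain ODE case: each step solves a single linear system whose matrix may contain an inexact Jacobian. The following minimal one-stage ROW/W-method, the stiff linear test problem, and the diagonal Jacobian approximation are illustrative assumptions, not the thesis's DAE scheme:

```python
import numpy as np

def row_step(f, y, h, A, gamma=1.0):
    # One linearly implicit step (a one-stage ROW/W-method):
    # solve (I - h*gamma*A) k = f(y), then set y_new = y + h*k.
    # A may be any approximation of the Jacobian df/dy; the step
    # stays first-order consistent regardless of that choice.
    n = len(y)
    k = np.linalg.solve(np.eye(n) - h * gamma * A, f(y))
    return y + h * k

# Stiff linear test problem y' = J y with one fast and one slow component.
J = np.array([[-1000.0, 0.0],
              [1.0, -1.0]])
f = lambda y: J @ y

# Deliberately non-exact Jacobian: keep only the cheap diagonal part.
A = np.diag(np.diag(J))

y = np.array([1.0, 1.0])
h = 0.01
for _ in range(100):  # integrate to t = 1; explicit Euler would blow up here
    y = row_step(f, y, h, A)
# Exact solution at t = 1: y[0] = e^(-1000) ~ 0, y[1] ~ 0.3683
```

Dropping the off-diagonal coupling keeps the linear solve cheap while the step remains stable and first-order accurate; which coefficient sets retain higher order under such inexact Jacobians is exactly what the derived order conditions govern.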
Optimization plays an essential role in industrial design, but it is not limited to the minimization of a single objective, such as cost or strength. These tools are also used in conceptual phases to better understand what is possible. To support this exploration we focus on Quality Diversity (QD) algorithms, which produce sets of varied, high-performing solutions. These techniques often require the evaluation of millions of solutions -- making them impractical for design applications. In this thesis we propose methods that radically improve the data-efficiency of QD with machine learning, enabling its application to design. In our first contribution, we develop a method for modeling the performance of evolved neural networks used for control and design. The structures of these networks grow and change, making them difficult to model -- but with a new method we are able to estimate their performance based on their heredity, improving data-efficiency severalfold. In our second contribution we combine model-based optimization with MAP-Elites, a QD algorithm. A model of performance is created from known designs, and MAP-Elites creates a new set of designs using this approximation. A subset of these designs is then evaluated to improve the model, and the process repeats. We show that this approach improves the efficiency of MAP-Elites by orders of magnitude. Our third contribution integrates generative models into MAP-Elites to learn domain-specific encodings. A variational autoencoder is trained on the solutions produced by MAP-Elites, capturing the common “recipe” for high performance. This learned encoding can then be reused by other algorithms for rapid optimization, including MAP-Elites itself. Throughout this thesis, though the focus of our vision is design, we examine applications in other fields, such as robotics. These advances are not exclusive to design, but serve as foundational work on the integration of QD and machine learning.
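For readers unfamiliar with MAP-Elites, the loop it refers to fits in a few lines: an archive keeps the best solution found per cell of a discretized behavior descriptor, and new candidates are bred from randomly chosen elites. The toy objective, the 2-D descriptor (here simply the genome itself), and all parameters below are assumptions for illustration; the thesis's surrogate-assisted variants add a learned performance model on top of this loop.

```python
import random

def map_elites(fitness, n_bins=10, iters=2000, sigma=0.1, seed=0):
    # Archive maps a discretized behavior descriptor (here: the genome
    # itself, a point in [0,1]^2) to the best (fitness, genome) per cell.
    rng = random.Random(seed)
    archive = {}

    def cell(x):
        return (min(int(x[0] * n_bins), n_bins - 1),
                min(int(x[1] * n_bins), n_bins - 1))

    def consider(x):
        c, fit = cell(x), fitness(x)
        if c not in archive or fit > archive[c][0]:
            archive[c] = (fit, x)        # keep the elite of this cell

    for _ in range(100):                  # random initialization
        consider([rng.random(), rng.random()])
    for _ in range(iters):                # mutate a randomly chosen elite
        _, parent = archive[rng.choice(list(archive))]
        child = [min(1.0, max(0.0, g + rng.gauss(0, sigma))) for g in parent]
        consider(child)
    return archive

# Toy objective: performance is higher near the centre of the space.
f = lambda x: -((x[0] - 0.5) ** 2 + (x[1] - 0.5) ** 2)
archive = map_elites(f)
```

The result is not one optimum but an illuminated map of the space: every reached cell holds its own high-performing design, which is what makes the method attractive for conceptual exploration.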
The detection of human skin in images is a very desirable feature for applications such as biometric face recognition, which is increasingly used for, e.g., automated border or access control. However, distinguishing real skin from other materials based on imagery captured in the visual spectrum alone, and in spite of varying skin types and lighting conditions, can be difficult and unreliable. Therefore, spoofing attacks with facial disguises or masks are still a serious problem for state-of-the-art face recognition algorithms. This dissertation presents a novel approach to reliable skin detection based on spectral remission properties in the short-wave infrared (SWIR) spectrum and proposes a cross-modal method that enhances existing solutions for face verification to ensure the authenticity of a face even in the presence of partial disguises or masks. Furthermore, it presents a reference design and the necessary building blocks for an active multispectral camera system that implements this approach, as well as an in-depth evaluation. The system acquires four-band multispectral images within 50 ms. Using a machine-learning-based classifier, it achieves unprecedented skin detection accuracy, even in the presence of skin-like materials used for spoofing attacks. Paired with a commercial face recognition software, the system successfully rejected all evaluated attempts to counterfeit a foreign face.
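The physical idea — skin remission differs characteristically between SWIR bands, collapsing near the water absorption region — can be illustrated with a toy per-pixel rule. The band wavelengths, threshold, and sample values below are fabricated for illustration and are not the dissertation's classifier, which is machine-learning based:

```python
def normalized_difference(a, b):
    # Contrast between two band remission values, in [-1, 1].
    return (a - b) / (a + b + 1e-9)

def is_skin(pixel, threshold=0.35):
    # pixel: remission values in four SWIR bands, assumed ordered
    # roughly ~1050, 1200, 1450, 1550 nm (hypothetical band choice).
    # Skin remission drops sharply near the ~1450 nm water absorption
    # band, so the contrast between a short band and that band is large.
    # The threshold is illustrative, not taken from the dissertation.
    b_short, _, b_water, _ = pixel
    return normalized_difference(b_short, b_water) > threshold

skin_pixel = (0.55, 0.50, 0.10, 0.15)  # fabricated example remission values
mask_pixel = (0.40, 0.42, 0.38, 0.36)  # skin-like material: flat spectrum
```

A learned classifier over such band features can then separate skin from skin-like spoofing materials far more robustly than visual-spectrum color cues.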
The human enzymes GLYAT (glycine N-acyltransferase), GLYATL1 (glutamine N-phenylacetyltransferase) and GLYATL2 (glycine N-acyltransferase-like protein 2) are not only important in the detoxification of xenobiotics via the human liver, but are also involved in the elimination of acyl residues that accumulate in the form of their coenzyme A (CoA) esters in some rare inborn errors of metabolism. This concerns, for example, disorders in the degradation of branched-chain amino acids, such as isovaleric acidemia or propionic acidemia. In addition, the enzymes assist in the elimination of ammonium, which is produced during the transamination of amino acids and accumulates in urea cycle defects. Sequence variants of the enzymes were also investigated, as they may provide evidence of impaired enzyme activities from which therapy adjustments can potentially be derived. A modified Escherichia coli strain that supports solubility and proper folding was chosen for the overexpression and partial biochemical characterization of the enzymes. Since post-translational protein modifications are very limited in bacteria, we also attempted to overexpress the enzymes in human-derived HEK293 cells. In addition to characterization via immunoblots and activity assays, the enzymes were localized intracellularly using GFP coupling and confocal laser scanning microscopy in transfected HEK293 cells. GLYATL2 may have tasks beyond detoxification and metabolic defects; the preliminary molecular biology work was performed as part of this project, while the enzyme activity determinations were outsourced to a co-supervised bachelor thesis. The activity determinations with purified recombinant human enzyme from Escherichia coli showed a threefold higher activity of the GLYAT sequence variant p.(Asn156Ser), which should therefore be considered the probably authentic wild type of the enzyme.
In addition, a reduced activity was shown for the GLYAT variant p.(Gln61Leu), which is very common in South Africa and could therefore be of particular importance in the treatment of isovaleric acidemia, a disorder that is also common there. Intracellularly, GLYAT and GLYATL1 were localized to the mitochondria. As the analyses have shown, sequence variations of GLYAT and GLYATL1 influence their enzyme activity. In the case of reduced GLYAT activity, patients could increasingly be treated with L-carnitine in the sense of an individualized therapy, since the conjugation of the toxic isovaleryl-CoA with glycine is restricted by the GLYAT sequence variation. Activity-reducing variants identified in this project are of particular interest, as they may influence the treatment of certain metabolic defects.
Due to the use of fossil resources, environmental problems have been growing steadily. Recent research therefore focuses on environmentally friendly materials from sustainable feedstocks for future fuels, chemicals, fibers and polymers, with lignocellulosic biomass as the raw material of choice. In particular, research has focused on using lignin as a substitute material in many industrial applications. The antiradical and antimicrobial activities of lignin and lignin-based films are both of great interest for applications such as food packaging additives. The DPPH assay was used to determine the antioxidant activity of Kraft lignin compared to Organosolv lignins from different biomasses. The purification procedure for Kraft lignin showed that two-fold selective extraction is the most efficient, as confirmed by UV-Vis, FTIR, HSQC, 31P NMR, SEC, and XRD. The antioxidant capacity was discussed with respect to biomass source, pulping process, and degree of purification. Lignins obtained from industrial black liquor were compared with beech wood samples: the biomass source influences the DPPH inhibition (softwood > grass) and the TPC (softwood < grass). The DPPH inhibition is affected by the polarity of the extraction solvent, following the trend ethanol > diethyl ether > acetone. Reduced polydispersity has a positive influence on the DPPH inhibition. Storage decreased the DPPH inhibition but increased the TPC values. The DPPH assay was also used to assess the antiradical activity of HPMC/lignin and HPMC/lignin/chitosan films. In both binary (HPMC/lignin) and ternary (HPMC/lignin/chitosan) systems, the 5% addition showed the highest activity and the highest addition the lowest. Both scavenging activity and antimicrobial activity depend on the biomass source: Organosolv of softwood > Kraft of softwood > Organosolv of grass.
Lignins and lignin-containing films showed high antimicrobial activities against Gram-positive and Gram-negative bacteria at 35 °C as well as at low temperatures (0-7 °C). Purification of Kraft lignin has a negative effect on the antimicrobial activity, while storage has a positive effect. Lignin leaching in the produced films affected the activity positively, and the chitosan addition enhances the activity against both Gram-positive and Gram-negative bacteria. Testing the films against food spoilage bacteria that grow at low temperatures revealed activity of the 30% addition in the HPMC/L1 film against both B. thermosphacta and P. fluorescens, while L5 was active only against B. thermosphacta. In the HPMC/lignin/chitosan films, the 5% addition exhibited activity against both food spoilage bacteria.
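As background on the assay used throughout these two paragraphs: DPPH radical-scavenging activity is conventionally reported as percent inhibition of the radical's absorbance relative to a control. The absorbance readings in the example are hypothetical:

```python
def dpph_inhibition(a_control, a_sample):
    # Radical-scavenging activity as percent inhibition of DPPH
    # absorbance (typically read around 517 nm) relative to a control.
    return 100.0 * (a_control - a_sample) / a_control

# Hypothetical absorbance readings: control vs. a lignin-treated solution
inhibition = dpph_inhibition(0.90, 0.35)  # ~61.1 % inhibition
```

Rankings such as "softwood > grass" above compare exactly this quantity across lignin sources at matched concentrations.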
Intelligent virtual agents provide a framework for simulating more life-like behavior and increasing plausibility in virtual training environments. They can improve the learning process if they portray believable behavior that can also be controlled to support the training objectives. In the context of this thesis, cognitive agents are considered a subset of intelligent virtual agents (IVA), with the focus on emulating cognitive processes to achieve believable behavior. The complexity of the employed algorithms, however, is often limited since multiple agents need to be simulated in real-time. Available solutions focus on a subset of the indicated aspects: plausibility, controllability, or real-time capability (scalability). Within this thesis project, an agent architecture for attentive cognitive agents is developed that considers all three aspects at once. The result is a lightweight cognitive agent architecture that can be customized to application-specific requirements. A generic trait-based personality model influences all cognitive processes, facilitating the generation of consistent and individual behavior. An additional mapping process provides a formalized mechanism for transferring the results of psychological studies to the architecture. Personality profiles are combined with an emotion model to achieve situational behavior adaptation. Which action an agent selects in a situation also influences plausibility. An integral element of this selection process is an agent's knowledge about its world. Therefore, synthetic perception is modeled and integrated into the architecture to provide a credible knowledge base. The developed perception module includes a unified sensor interface, a memory hierarchy, and an attention process. With the presented realization of the architecture (CAARVE), it is possible for the first time to simulate cognitive agents whose behavior is simultaneously computable in real-time and controllable.
The architecture's applicability is demonstrated by integrating an agent-based traffic simulation built with CAARVE into a bicycle simulator for road-safety education. The developed ideas and their realization are evaluated using different strategies and scenarios. For example, it is shown how CAARVE agents utilize personality profiles and emotions to plausibly resolve deadlocks in traffic simulations. Controllability and adaptability are demonstrated in additional scenarios. Using the realization, 200 agents can be simulated in real-time (50 FPS), illustrating scalability. The achieved results verify that the developed architecture can generate plausible and controllable agent behavior in real-time. The presented concepts and realizations provide a sound foundation for everyone interested in simulating IVA in real-time environments.
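How a static personality trait and a dynamic emotion can jointly steer action selection — as in the traffic-deadlock example — can be illustrated with a deliberately tiny model. The trait, emotion, and utility rule below are invented for illustration and are not CAARVE's actual formalism:

```python
from dataclasses import dataclass

@dataclass
class Agent:
    patience: float     # static trait in [0, 1]; hypothetical trait name
    frustration: float  # dynamic emotion in [0, 1], rises while blocked

def choose(agent):
    # Utility of waiting grows with the patience trait and shrinks as
    # frustration accumulates; once evading outweighs waiting, one agent
    # gives way and the deadlock breaks.
    wait_utility = agent.patience * (1.0 - agent.frustration)
    return "wait" if wait_utility >= 1.0 - wait_utility else "evade"

calm = Agent(patience=0.9, frustration=0.2)      # utility 0.72: keeps waiting
stressed = Agent(patience=0.3, frustration=0.8)  # utility 0.06: gives way
```

Because the trait profile is fixed per agent while the emotion evolves with the situation, identical scenes play out differently for different personalities, which is the mechanism behind consistent yet individual behavior.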
With the digital transformation, software systems have become an integral part of our society and economy. In every part of our lives, software systems are increasingly utilized to, e.g., simplify housework or optimize business processes. All these applications are connected to the Internet, which already comprises millions of software services consumed by billions of people. Applications that handle such a magnitude of users and data traffic need to be highly scalable and are therefore denoted as Ultra Large Scale (ULS) systems. Roy Fielding defined one of the first approaches for designing modern ULS software systems: in his doctoral thesis, he introduced the architectural style Representational State Transfer (REST), which forms the theoretical foundation of the web. At present, the web is considered the world's largest ULS system. Due to the large number of users and the significance of software for society and the economy, the security of ULS systems is another crucial quality factor besides high scalability.
Collaboration among multiple users on large screens leads to complicated behavior patterns and group dynamics. To gain a deeper understanding of collaboration on vertical, large, high-resolution screens, this dissertation builds on previous research and gains novel insights through new observational studies. Among other things, the collected results reveal new patterns of collaborative coupling, suggest that territorial behavior is less critical than shown in previous research, and demonstrate that workspace awareness can also negatively affect the effectiveness of individual users.
Besides the individual importance of health for every person, the relevance of a "healthy workforce" is also growing. Particularly in times of full employment, skilled-labor shortages and a rising retirement age, the health of employees and the associated working capacity of each individual are moving more strongly into focus. The state, social insurance institutions and companies are increasingly interested in designing workplaces and working conditions in a health-promoting way. Workplace health promotion (BGF) provides the framework for the existing health-promoting interventions, which are found in great variety in occupational settings. In this context, the work break can be regarded as a suitable intervention, though one that can take very diverse forms.
Over the last 50 years, the controlled motion of robots has become a very mature domain of expertise. It can deal with all sorts of topologies and types of joints and actuators, with kinematic as well as dynamic models of devices, and with one or several tools or sensors attached to the mechanical structure. Nevertheless, the domain has not succeeded in standardizing the modelling of robot devices (including such fundamental entities as “reference frames”!), let alone the semantics of their motion specification and control. This thesis aims to solve this long-standing problem, from three different sides: semantic models for robot kinematics and dynamics, semantic models of all possible motion specification and control problems, and software that can support the latter while being configured by a systematic use of the former.
The use of manually fed machines (e.g. table saws) bears risks of injury that are clearly above the average level of other high-risk workplaces.
The wide use of such machines causes severe problems for occupational safety and implies high costs for medical treatments and accident annuities.
This thesis presents a new concept of a multispectral sensor to monitor an area in front of a danger zone to detect the user’s limbs and trigger safeguarding measures to prevent an accident in time.
The sensor concept realizes contact-free material classification; its development comprises a system design and specific safety requirements with respect to international safety standards.
Furthermore, a prototype was implemented using four wavebands, which were selected for skin detection through an analysis of reflectance spectra acquired specifically for this purpose.
In recent years, eXtended Reality (XR) technology such as Augmented Reality and Virtual Reality became both technically feasible and affordable, which led to a sharp rise in demand for professionally designed and developed applications. However, this demand, combined with a rapid pace of innovation, revealed a lack of design tool support for professional interaction designers as well as a knowledge gap regarding their approaches and needs. To address this gap, this thesis engages with the work of professional XR interaction designers in a qualitative study of XR interaction design approaches. It applies two complementary lenses, stemming from the discourses of scientific design and social practice theory, to observe, describe, analyze, and understand professional XR interaction designers' challenges and approaches, with a focus on application prototyping.
During the last 50 years, a broad range of visible light curing resin-based composites (VLC RBC) has been developed for restorative applications in dentistry. Correspondingly, the technologies of light curing units (LCU) have changed from UV to visible blue light, and from quartz-tungsten-halogen over plasma arc to LED LCUs, increasing their light intensity significantly. In this thesis, the influence of the curing conditions (irradiance, exposure time and irradiance distribution of the LCU) on reaction kinetics as well as on the corresponding mechanical and viscoelastic properties was investigated.
The present thesis elucidates the development of (i) a series of small-molecule inhibitors reacting in a covalent-irreversible manner with the targeted proteases and (ii) a fluorescently labeled activity-based probe as a pharmacological tool compound for investigating specific functions of the mentioned enzymes in vitro. Herein, the rational design, organic synthesis and quantitative structure-activity relationships are described extensively.
In this thesis it is posed that the central object of preference discovery is a co-creative process in which the Other can be represented by a machine. It explores efficient methods to enhance introverted intuition using extraverted intuition's communication lines. Possible implementations of such processes are presented using novel algorithms that perform divergent search to feed the users' intuition with many examples of high quality solutions, allowing them to take influence interactively. The machine feeds and reflects upon human intuition, combining both what is possible and preferred. The machine model and the divergent optimization algorithms are the motor behind this co-creative process, in which machine and users co-create and interactively choose branches of an ad hoc hierarchical decomposition of the solution space.
The proposed co-creative process consists of several elements: a formal model for interactive co-creative processes, evolutionary divergent search, diversity and similarity, data-driven methods to discover diversity, limitations of artificial creative agents, matters of efficiency in behavioral and morphological modeling, visualization, a connection to prototype theory, and methods that allow users to influence artificial creative agents. This thesis helps put the human back into the design loop in generative AI and optimization.
In forensic DNA profiling, the occurrence of complex mixed profiles is currently a common issue. Cases involving intimate swabs or skin-flake tape liftings are prone to mixed profiles because more than one donor contributes to a DNA sample. DNA profiling of single spermatozoa and skin flakes could ideally overcome the problems associated with mixed profiles. However, PCR is not sensitive enough to generate STR-based DNA profiles from single cells. Moreover, high-quality intact DNA is required, but is not always available in skin flakes due to degradation. Additionally, single skin flakes are difficult to discriminate from other similar-looking particles on the tape liftings used to secure DNA samples from evidence. The main purpose of this study was to develop a method that enables DNA profiling of single sperm cells and skin flakes. After studying multiple whole genome amplification (WGA) protocols, REPLI-g Single Cell WGA was selected due to its suitability for the pre-amplification of template DNA. Micromanipulation was used to isolate single spermatozoa, and in combination with REPLI-g Single Cell WGA it resulted in successful DNA profiling of single spermatozoa using autosomal STRs as well as X- and Y-chromosomal STRs. The single-spermatozoon DNA profiling method described in this thesis was successfully used to identify male contributors from mock intimate swabs containing a mixture of semen from multiple male contributors. Different dyes were analysed to develop a staining method that discriminates skin flakes from other particles, including those from hair cosmetic products. Of all dyes tested, Orange G was the only one that successfully discriminated skin flakes from hair product particles. Also, an alkaline-based lysis protocol was developed that allowed PCR to be carried out directly on the lysates of single skin flakes. Furthermore, REPLI-g Single Cell WGA was tested on single skin flakes.
In contrast to single spermatozoa, REPLI-g Single Cell WGA was not successful in DNA profiling of single skin flakes. The single skin flake DNA profiling method described in this thesis was successfully used to correctly identify contributors from mock mixed DNA evidence. Additionally, a small-amplicon-based NGS method was tested on single skin flakes. Compared to the PCR and CE approach, the small-amplicon-based NGS method improved DNA profiling of single skin flakes, giving a significant increase in allele recovery. In conclusion, this study shows that mixtures can be circumvented by DNA profiling of single spermatozoa using micromanipulation and WGA. Furthermore, DNA profiling of single skin flakes was improved by staining tape liftings with Orange G, alkaline lysis, direct PCR and a small-amplicon-based NGS approach. Nonetheless, future work is required to assess the performance of the single-spermatozoon method on mock swabs with more diluted semen. Commercially available NGS kits should also be tested on single skin flakes and compared with the in-house NGS method.
Telogen single hairs are a common type of trace at crime scenes. At present they are usually excluded from STR typing because, owing to low DNA amounts and strong DNA degradation, their STR profiles are in many cases incomplete and difficult to interpret. In the present work, a systematic approach was applied to reveal correlations between DNA quantity and DNA degradation and the success of STR typing, and on this basis to predict the typing success of DNA from hairs.
For this purpose, a human-specific (RiboD) and a canine-specific (RiboDog) qPCR-based assay was developed to measure the DNA quantity and to assess DNA integrity by means of a degradation value (D-value). Because the primers used target ubiquitously occurring ribosomal DNA sequences, the underlying principle can be transferred quickly and inexpensively to different species. The performance of the assays was confirmed with serially degraded DNA, and the human assay was validated against the commercial Quantifiler Trio DNA Quantification Kit. Finally, the assays were applied to DNA from single telogen and catagen hairs of humans and dogs to investigate the relationship between DNA quantity and DNA integrity and the completeness of the STR alleles (allele recovery) of DNA profiles obtained with capillary electrophoresis (CE)-based STR kits. For single human hairs, allele recovery depended on both the DNA quantity and the DNA integrity. In contrast, DNA degradation in single dog hairs was consistently lower, and allele recovery depended solely on the amount of DNA extracted.
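The abstract does not spell out how the D-value is computed. A common way to derive a degradation measure from a two-target qPCR assay such as this one is the ratio of the concentration measured with a short amplicon to that measured with a long amplicon: intact DNA amplifies both targets equally, while degradation depletes the long fragments. A minimal sketch under that assumption (function name and example concentrations are invented):

```python
def degradation_value(short_ng_ul: float, long_ng_ul: float) -> float:
    """Degradation value as the ratio of the DNA concentration
    measured with a short qPCR target to that measured with a long
    target: intact DNA gives ~1, degraded DNA gives values > 1."""
    if long_ng_ul <= 0:
        raise ValueError("long-target concentration must be positive")
    return short_ng_ul / long_ng_ul

# Intact DNA: both targets amplify equally well.
print(degradation_value(0.50, 0.50))  # 1.0
# Degraded DNA: long fragments are strongly depleted.
print(degradation_value(0.50, 0.05))  # ~10
```

The same ratio-based degradation index is used, for example, by the Quantifiler Trio kit that the RiboD assay was validated against, which is why it is a plausible stand-in here.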
To further improve STR analysis of degraded human DNA samples, a novel NGS-based assay (maSTR, mini-amplicon STR) was established that amplifies the 16 forensic STR loci of the European Standard Set and amelogenin in parallel as very short amplicons (76-296 bp). With intact DNA, the maSTR assay generated reproducible, complete profiles without allelic drop-ins at around 200 pg of input DNA. At lower DNA amounts, occasional allelic drop-ins occurred, although complete profiles were still obtained with at least 43 pg of DNA.
The combined strategy of RiboD measurements of DNA quantity and integrity and the resulting STR typing success of the maSTR assay was validated on degraded DNA. The strategy was then applied to DNA from single telogen and catagen hairs and compared with the results of the CE-based PowerPlex ESX 17 kit, which analyses the same STR marker set. The STR typing success of both assays depended on the optimal amount of template DNA as well as on DNA integrity. With the maSTR assay, complete profiles were demonstrated with approximately 50 pg of input DNA for slightly degraded DNA from single hairs, and with approximately 500 pg of strongly degraded DNA. Owing to the low DNA amounts of single telogen hairs, the reproducibility of the maSTR results varied, but the assay was consistently superior to the PowerPlex ESX 17 kit in terms of allele recovery.
A comparison with two CE-based STR kits with complementary amplicon length distributions (PowerPlex ESX 17 and ESI 17 Fast), as well as with a commercial NGS kit (ForenSeq DNA Signature Prep), showed that it is not the NGS technology itself but the shortness of the amplicons that is the most important factor for typing degraded DNA. However, the maSTR assay exhibited a higher number of allelic drop-ins in all comparisons with the commercial kits; these occurred more frequently the lower the DNA amount used and the more strongly it was degraded.
Because profiles with allelic drop-ins correspond to mixed profiles, the STR profiles generated with the maSTR assay were examined using methods for interpreting mixed traces. In composite interpretation, all alleles occurring across replicates are counted; in consensus interpretation, only the reproducible alleles. It emerged that composite interpretation is best suited for profiles with few allelic drop-ins (PowerPlex ESX 17-generated profiles), whereas consensus interpretation is best suited for drop-in-containing profiles (maSTR-generated profiles).
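The two interpretation rules reduce to simple set operations over replicate allele calls. A minimal sketch using Python sets (the allele values are invented, and this consensus variant requires an allele in every replicate, whereas some published schemes require only a majority):

```python
# Allele calls at one STR locus from three replicate typings.
replicates = [{"12", "14"}, {"12", "14", "16"}, {"12", "14"}]

# Composite interpretation: every allele seen in any replicate
# is counted (union of the replicate call sets).
composite = set().union(*replicates)

# Consensus interpretation: only alleles reproduced in all
# replicates are counted (intersection of the call sets).
consensus = set.intersection(*replicates)

print(sorted(composite))  # ['12', '14', '16']
print(sorted(consensus))  # ['12', '14']
```

The sketch also illustrates why the thesis pairs consensus interpretation with the drop-in-prone maSTR assay: a spurious allele such as "16" survives composite interpretation but is filtered out by the intersection.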
Finally, the GenoProof Mixture 3 software was used to investigate to what extent semi-continuous and fully continuous probabilistic methods are suitable for the biostatistical evaluation of DNA profiles from single hairs. Owing to the high number of allelic drop-ins, the maSTR assay proved only slightly superior to the CE-based methods, and only for DNA that is available in sufficient quantity and is only slightly degraded; in that range, however, matching the hair profile to the reference profile also succeeds with CE-based methods.
From all results, a recommendation for handling DNA from shed single hairs was derived that is based on the degree of DNA degradation in combination with the DNA quantity. The present work thus lays a foundation for making shed single hairs usable in the routine work of forensic investigations, and where appropriate for applying the approach to other trace types with small amounts of degraded DNA. This could increase the usability of such trace types for forensic criminalistics, especially when the standard CE-based methods fail. In the longer term, NGS is generally superior to CE-based technology for typing degraded DNA because of the high multiplexing capacity of uniform, short markers.
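The abstract states that a quantity/degradation-based handling recommendation was derived but does not reproduce it. Purely as an illustration of what such a triage rule could look like, the following sketch combines the input amounts reported above (~200 pg for complete CE-free-of-drop-in maSTR profiles, ~50 pg for slightly and ~500 pg for strongly degraded DNA); the function name, the boolean degradation flag, and the exact thresholds are all hypothetical, not the thesis's actual recommendation:

```python
def recommend_assay(input_dna_pg: float, degraded: bool) -> str:
    """Hypothetical triage rule: choose a typing strategy from the
    measured DNA quantity and the degradation assessment."""
    if not degraded and input_dna_pg >= 200:
        # Plenty of intact DNA: the standard CE workflow suffices.
        return "CE-based STR kit"
    if (not degraded and input_dna_pg >= 50) or (degraded and input_dna_pg >= 500):
        # Ranges in which the maSTR assay yielded complete profiles.
        return "maSTR assay"
    # Otherwise: replicate typing plus consensus interpretation
    # to control the drop-in rate at very low template amounts.
    return "maSTR assay with replicates and consensus interpretation"

print(recommend_assay(300, degraded=False))  # CE-based STR kit
print(recommend_assay(600, degraded=True))   # maSTR assay
print(recommend_assay(20, degraded=True))
```

A real recommendation would of course use the continuous D-value rather than a binary flag and would be validated against casework samples.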