Designing digital circuits that are efficient in terms of low power has become a very challenging issue. For this reason, low-power digital circuit design is a topic addressed in electrical and computer engineering curricula, but it also requires practical experiments in a laboratory. This PhD research investigates a novel approach, the low-power design laboratory system, by developing a new technical and pedagogical system. The low-power design laboratory system comprises two types of laboratories: the on-site (hands-on) laboratory and the remote laboratory. It has been developed at the Bonn-Rhein-Sieg University of Applied Sciences to teach low-power techniques in the laboratory. Additionally, this thesis contributes a suggestion on how the learning objectives can be complemented by a remote system in order to improve the teaching of low-power digital circuit design. This laboratory system enables online experiments that are performed on physical instruments and deliver real data via the internet. The laboratory experiments use a Field Programmable Gate Array (FPGA) as the design platform on which students implement their circuits, and use image processing as the application for teaching low-power techniques.
This thesis presents the instructions for the low-power design experiments, which follow a top-down hierarchical design methodology. The engineering student designs an algorithm at a high level of abstraction, and the experimental results are obtained and measured at a low level (hardware), so that more information is available to correctly estimate the power dissipation, such as the specification, latency, thermal effects, and technology used. The power dissipation of a digital system is influenced by its specification, design, and technology, as well as by the operating temperature. Digital circuit designers can observe the most influential factors in power dissipation during the laboratory exercises in the on-site system and then use the remote system to investigate the remaining factors. Furthermore, the remote system has clear benefits: it improves learning outcomes, facilitates new teaching methods, reduces costs and maintenance, saves money by reducing the number of instructors, saves instructor time and simplifies instructors' tasks, facilitates equipment sharing, improves reliability, and provides flexibility in using the laboratories.
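The influencing factors listed above (switching activity, load capacitance, supply voltage, clock frequency) enter the standard first-order CMOS dynamic power model. The following sketch uses that textbook formula with illustrative numbers; it is not taken from the thesis's laboratory setup:

```python
# First-order CMOS dynamic (switching) power: P = alpha * C * V^2 * f
def dynamic_power(alpha, c_load, v_dd, f_clk):
    """Dynamic power in watts from activity factor, load capacitance (F),
    supply voltage (V), and clock frequency (Hz)."""
    return alpha * c_load * v_dd ** 2 * f_clk

# Example: 10% switching activity, 100 pF effective capacitance, 1.2 V, 50 MHz
p = dynamic_power(0.1, 100e-12, 1.2, 50e6)
print(f"{p * 1e3:.3f} mW")  # 0.720 mW
```

The quadratic dependence on the supply voltage is why voltage scaling is usually the most effective of the low-power techniques a student can observe in such experiments.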
Due to the use of fossil fuel resources, environmental problems have been growing steadily. Recent research therefore focuses on environmentally friendly materials from sustainable feedstocks for future fuels, chemicals, fibers, and polymers, and lignocellulosic biomass has become the raw material of choice for these new materials. Attention has recently turned to lignin as a substitute material in many industrial applications. The antiradical and antimicrobial activities of lignin and lignin-based films are both of great interest for applications such as food packaging additives. The DPPH assay was used to determine the antioxidant activity of Kraft lignin compared with Organosolv lignins from different biomasses. The purification of Kraft lignin showed that double-fold selective extraction is the most efficient procedure, as confirmed by UV-Vis, FTIR, HSQC, 31P NMR, SEC, and XRD. The antioxidant capacity was discussed with regard to biomass source, pulping process, and degree of purification. Lignins obtained from industrial black liquor were compared with beech wood samples: the biomass source influences the DPPH inhibition (softwood > grass) and the TPC (softwood < grass). The DPPH inhibition is affected by the polarity of the extraction solvent, following the trend ethanol > diethyl ether > acetone. Reduced polydispersity has a positive influence on the DPPH inhibition. Storage decreased the DPPH inhibition but increased the TPC values. The DPPH assay was also used to assess the antiradical activity of HPMC/lignin and HPMC/lignin/chitosan films. In both binary (HPMC/lignin) and ternary (HPMC/lignin/chitosan) systems, the 5% addition showed the highest activity and the highest addition the lowest. Both scavenging and antimicrobial activity depend on the biomass source: Organosolv softwood > Kraft softwood > Organosolv grass.
Lignins and lignin-containing films showed high antimicrobial activities against Gram-positive and Gram-negative bacteria at 35 °C and at low temperatures (0-7 °C). Purification of Kraft lignin had a negative effect on the antimicrobial activity, whereas storage had a positive effect. Lignin leaching from the produced films affected the activity positively, and the addition of chitosan enhanced the activity against both Gram-positive and Gram-negative bacteria. Testing the films against food spoilage bacteria that grow at low temperatures revealed activity of the 30% addition in the HPMC/L1 film against both B. thermosphacta and P. fluorescens, while L5 was active only against B. thermosphacta. Among the HPMC/lignin/chitosan films, the 5% addition exhibited activity against both food spoilage bacteria.
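The DPPH radical-scavenging results discussed above are conventionally reported as percent inhibition of the DPPH absorbance (measured around 517 nm). A minimal sketch of that standard calculation follows; the absorbance values are illustrative, not data from the thesis:

```python
def dpph_inhibition(a_control, a_sample):
    """Radical-scavenging activity as percent inhibition of DPPH absorbance:
    the more the antioxidant bleaches the radical, the lower a_sample."""
    return (a_control - a_sample) / a_control * 100.0

# Example: control absorbance 0.90, sample absorbance 0.36 after incubation
print(dpph_inhibition(0.90, 0.36))  # 60.0 (% inhibition)
```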
This thesis addresses the efficiency of side-channel cryptanalysis. In Part II of this work, we demonstrate how the runtime of the most important analysis tools can be increased considerably using the CUDA platform. Second, we investigate new approaches to profiled side-channel cryptanalysis: the field of machine learning can be adapted to yield clear improvements, but has so far received little attention in this regard. In Part III we present two new methods that share some commonalities but also exhibit differences, so that evaluation results form a more complete picture. Furthermore, in Part IV we propose a side-channel application for the protection of intellectual property (IP). In Part V we turn to practical side-channel cryptanalysis by mounting attacks on a security microcontroller used in a debit (EC) card that is widely deployed in Germany.
As a renewable industrial and energy crop, Miscanthus offers numerous advantages that, beyond direct agricultural uses such as combustion and animal bedding, also permit material use in the chemical sector. As a C4 plant with enhanced photosynthetic activity, Miscanthus moreover exhibits a high CO2 fixation rate. Owing to its low cultivation effort and high yields, Miscanthus is an exceptionally attractive feedstock for the production of renewable fuels and chemicals obtained via thermochemical conversion.
The present thesis elucidates the development of (i) a series of small-molecule inhibitors that react in a covalent, irreversible manner with the targeted proteases and (ii) a fluorescently labeled activity-based probe as a pharmacological tool compound for investigating specific functions of these enzymes in vitro. The rational design, organic synthesis, and quantitative structure-activity relationships are described extensively.
Process-induced changes in the thermo-mechanical viscoelastic properties, and the corresponding morphology, of biodegradable polybutylene adipate terephthalate (PBAT) and polylactic acid (PLA) blown-film blends modified with four multifunctional chain-extending cross-linkers (CECL) were investigated. Introducing CECL significantly modified the properties of the reference PBAT/PLA blend. Thermal analysis showed that the chemical reactions were incomplete after compounding and that film blowing extended them. SEM investigations of the fracture surfaces of blown extrusion films revealed the significant effect of CECL on the morphology formed during processing. The anisotropic morphology introduced during film blowing also proved to affect the degradation processes. Furthermore, the processing-induced reactions of CECL with PBAT/PLA depend on the deformation directions. The blow-up ratio was varied to investigate further process-induced changes, demonstrating their interplay with mechanical and morphological features. In blown-film extrusion, the elongational behavior is a very important characteristic, but its evaluation is often problematic; with the SER Universal Testing Platform, it was possible to determine changes in the time intervals corresponding to the rupture of elongated samples.
In this thesis, unique administrative data, a substantial follow-up period, and advanced statistical measures to handle confounding were utilized to provide new and informative evidence on the effects of vocational rehabilitation programs on work participation outcomes in Germany. While re-affirming the important role of micro-level determinants, the present study provides an extensive example of the individual and fiscal effects that meaningful vocational rehabilitation measures can achieve. The analysis showed that the principal objective, namely to improve participation in employment, was generally achieved. Contrary to the common misconception that "off-the-job training" is relatively ineffective, this thesis provides an empirical example of the positive impact of such programs.
Optimization plays an essential role in industrial design, but it is not limited to minimizing a simple function such as cost or strength. Optimization tools are also used in conceptual phases, to better understand what is possible. To support this exploration we focus on Quality Diversity (QD) algorithms, which produce sets of varied, high-performing solutions. These techniques often require the evaluation of millions of solutions, making them impractical in design cases. In this thesis we propose methods to radically improve the data-efficiency of QD with machine learning, enabling its application to design. In our first contribution, we develop a method of modeling the performance of evolved neural networks used for control and design. The structures of these networks grow and change, making them difficult to model, but with a new method we are able to estimate their performance based on their heredity, improving data-efficiency several-fold. In our second contribution we combine model-based optimization with MAP-Elites, a QD algorithm. A model of performance is created from known designs, and MAP-Elites creates a new set of designs using this approximation. A subset of these designs is then evaluated to improve the model, and the process repeats. We show that this approach improves the efficiency of MAP-Elites by orders of magnitude. Our third contribution integrates generative models into MAP-Elites to learn domain-specific encodings. A variational autoencoder is trained on the solutions produced by MAP-Elites, capturing the common "recipe" for high performance. This learned encoding can then be reused by other algorithms, including MAP-Elites itself, for rapid optimization. Although the focus of our vision throughout this thesis is design, we also examine applications in other fields, such as robotics. These advances are not exclusive to design, but serve as foundational work on the integration of QD and machine learning.
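The core loop of MAP-Elites is compact: an archive keeps one elite per behavior-descriptor cell, new candidates are produced by mutating random elites, and a candidate replaces a cell's occupant only if it performs better. The sketch below illustrates that loop on a toy problem; the objective, descriptor, and parameters are illustrative choices, not the thesis's implementation:

```python
import random

def fitness(x):
    # Toy quality measure (higher is better): closeness to the origin.
    return -(x[0] ** 2 + x[1] ** 2)

def descriptor(x, bins=10):
    # Behavior descriptor: discretize the clipped solution into a 2-D cell.
    clip = lambda v: max(-1.0, min(1.0, v))
    return tuple(int((clip(v) + 1.0) / 2.0 * (bins - 1)) for v in x)

def map_elites(iterations=5000, seed=0):
    rng = random.Random(seed)
    archive = {}  # cell -> (fitness, solution); one elite per cell
    for _ in range(iterations):
        if archive and rng.random() < 0.9:
            # Mutate a randomly chosen elite...
            _, parent = archive[rng.choice(sorted(archive))]
            child = [v + rng.gauss(0.0, 0.1) for v in parent]
        else:
            # ...or sample a fresh random solution.
            child = [rng.uniform(-1.0, 1.0) for _ in range(2)]
        cell, f = descriptor(child), fitness(child)
        if cell not in archive or f > archive[cell][0]:
            archive[cell] = (f, child)  # keep only the best per cell
    return archive

archive = map_elites()
print(len(archive))  # number of filled cells: varied AND high-performing
```

Each evaluation call here is cheap; the thesis's contributions address exactly the case where `fitness` is an expensive simulation, by replacing most such calls with a learned surrogate model.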
The initially large number of variants is reduced by applying custom variant annotation and filtering procedures. This requires complex software toolchains to be set up and data sources to be integrated. Furthermore, growing study sizes require ever greater effort to manage datasets in a multi-user, multi-institution environment. When the cause of a disease or phenotype is unknown, it is common practice to expect numerous iterations of respecifying and refining filter strategies. Supporting data analysis during this phase is essential, because handling the large volume of data manually is impossible, or at best impractical, for users with limited computer literacy. Constant feedback and communication are necessary when filter parameters are adjusted or the study grows with additional samples. Consequently, variant filtering and interpretation become time-consuming and hinder a dynamic, explorative data analysis by experts.
At the end of 2019, about 4.1 billion people on earth were using the internet. Because people entrust their most intimate and private data to their devices, European legislation has declared the protection of natural persons in relation to the processing of personal data a fundamental right. In 2018, some 23 million people worldwide were developing software and thus bore the responsibility of implementing data security and privacy. However, implementing data and application security is a challenge, as evidenced by more than 41 thousand documented security incidents in 2019. Probably the most basic, powerful, and frequently used tools software developers work with are Application Programming Interfaces (APIs). Security APIs are essential tools for bringing data and application security into software products. However, research has revealed that usability problems of security APIs lead to insecure API use during development. Basic security requirements such as securely stored passwords, encrypted files, or secure network connections can become an error-prone challenge and consequently lead to unreliable or missing security and privacy. Because software developers hold a key position in the software development process, security tools that do not operate properly pose a risk to everyone who uses software. Yet little is known about developers' requirements for addressing this problem and improving the usability of security APIs. This thesis is one of the first to examine the usability of security APIs. To this end, the author conducts empirical studies with software developers to examine to what extent information flows can support them in using security APIs to implement secure software. This thesis contributes fundamental results that can be used in future work to identify and improve important information flows in software development.
The studies have clearly shown that developer-tailored information flows with adapted security-relevant content have a positive influence on the correct implementation of security. However, the results have also led to the conclusion that API producers need to pay special attention to the channels through which they direct information flows to API users, and to how the information is designed to be useful to them. In many cases, it is not enough to provide security-relevant information via the documentation alone. Here, proactive methods such as the API security advice proposed in this thesis achieve significantly better results in terms of findability and actionable support. To further increase the effectiveness of the API security advice, this thesis developed a cryptographic API warning design for the terminal by adopting a participatory design approach with experienced software developers. However, it also became clear that a single information flow can only help up to a certain point. As two studies conducted in complex API environments in web development showed, multiple complementary information flows are needed to meet developers' extensive information needs when developing secure software. Several newly evaluated approaches provided promising insights toward more API-consumer-focused documentation designs as a complement to API warnings.
During the last 50 years, a broad range of visible-light-curing resin-based composites (VLC RBC) has been developed for restorative applications in dentistry. Correspondingly, the technologies of light curing units (LCU) have changed from UV to visible blue light, and within the visible range from quartz-tungsten-halogen through plasma-arc to LED LCUs, with significantly increasing light intensity. In this thesis, the influence of the curing conditions, in terms of the irradiance, exposure time, and irradiance distribution of the LCU, on the reaction kinetics and the corresponding mechanical and viscoelastic properties was investigated.
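Irradiance and exposure time combine into the radiant exposure (light dose) delivered to the composite, which is the usual first-order way to compare curing conditions. A one-line sketch of that standard relation; the numbers are illustrative, not the thesis's measurement settings:

```python
def radiant_exposure(irradiance_mw_cm2, exposure_time_s):
    """Radiant exposure (dose) in J/cm^2 from irradiance in mW/cm^2
    and exposure time in seconds."""
    return irradiance_mw_cm2 / 1000.0 * exposure_time_s

# Example: a 1000 mW/cm^2 LED LCU applied for 20 s
print(radiant_exposure(1000, 20))  # 20.0 J/cm^2
```

Whether two conditions with equal dose but different irradiance/time splits cure equivalently (exposure reciprocity) is precisely the kind of question such curing-kinetics studies examine.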
This thesis posits that the central object of preference discovery is a co-creative process in which the Other can be represented by a machine. It explores efficient methods to enhance introverted intuition using extraverted intuition's communication lines. Possible implementations of such processes are presented using novel algorithms that perform divergent search to feed the users' intuition with many examples of high-quality solutions, allowing them to exert influence interactively. The machine feeds and reflects upon human intuition, combining what is possible with what is preferred. The machine model and the divergent optimization algorithms are the motor behind this co-creative process, in which machine and users co-create and interactively choose branches of an ad hoc hierarchical decomposition of the solution space.
The proposed co-creative process consists of several elements: a formal model for interactive co-creative processes, evolutionary divergent search, diversity and similarity, data-driven methods to discover diversity, limitations of artificial creative agents, matters of efficiency in behavioral and morphological modeling, visualization, a connection to prototype theory, and methods that allow users to influence artificial creative agents. This thesis helps put the human back into the design loop in generative AI and optimization.
Lignin is already an intensive field of research; however, links between source, pulping method, and application are rarely described in the literature. In the present work, lignins from different sources (wheat straw, beech, softwood) and pulping methods (AFEX, steam explosion, Organosolv, acid hydrolysis) are analyzed and characterized with regard to their use in polymeric materials. A broad set of methods was employed: FT-IR spectroscopy, UV-Vis, 31P NMR, GPC, pyrolysis GC/MS, and HPLC for purity determination according to the NREL standard protocol. Thermal analyses such as TGA and DSC showed glass transition temperatures around 120 °C and decomposition temperatures between 340 °C and 380 °C. For the Organosolv beech lignin, the results reveal fractions of a purity not previously achieved. The results of this work identify Organosolv beech lignins as a valorizable product for applications in polyurethanes and phenol-formaldehyde resins.
Typically, plastic packaging materials are produced using additives, such as stabilisers, to introduce specific desired properties into the material or, in the case of stabilisers, to prolong the shelf life of the packaging. However, those stabilisers are typically fossil-based and can pose risks to both environmental and human health. The present study therefore presents more sustainable alternatives based on regional renewable resources that show the antioxidant, antimicrobial, and UV-absorbing properties required to serve as plastic stabilisers. In the study, all plants are extracted and characterised not only with regard to antioxidant, antimicrobial, and UV-absorbing effects, but also with regard to additional relevant information such as chemical constituents, molar mass distribution, and absorbance in the visible range. The extraction process is furthermore optimised and, where applicable, reasonable opportunities for waste valorisation are explored and analysed. In addition, interactions between the analysed plant extracts are described, and model films based on polylactic acid are prepared that incorporate the analysed plant extracts. Based on these model films, formulation tests and migration analyses according to EU legislation are conducted.
The well-known aromatic and medicinal plant thyme (Thymus vulgaris L.) contains phenolic terpenoids such as thymol and carvacrol, which have strong antioxidant, antimicrobial, and UV-absorbing effects. Analyses show that these effects can be exploited in both lipophilic and hydrophilic environments, that the variety Varico 3 is a more potent cultivar than the other analysed thyme variants, and that a passive extraction setup can be used for extract preparation, while distillation of the essential oils can be a more efficient approach.
Macromolecular antioxidant polyphenols, particularly proanthocyanidins, have been found in the seed coats of the European horse chestnut (Aesculus hippocastanum L.), which are regularly discarded by the phytopharmaceutical industry. In this study, such effects and compounds are reported for the first time, and a valorisation of these waste materials was analysed successfully. Furthermore, a passive extraction setup for waste materials and whole seeds was developed. In extracts of snowdrops, specifically Galanthus elwesii HOOK.F., high concentrations of tocopherol were found, which confer a particularly high antioxidant capacity in lipophilic environments. Different coniferous woods (Abies div., Picea div.) used as Christmas trees were extracted after separating the biomass into leaves and wood parts, and then analysed with regard to extraction optimisation and the drought resistance of the active substances. Antioxidant and UV-absorbing proanthocyanidins were found even in dried biomasses, allowing the circular use of discarded Christmas trees as bio-based stabilisers, with sustainable paper as a byproduct.
Telogen single hairs are a frequently encountered type of trace evidence at crime scenes. At present they are mostly excluded from STR typing because, owing to low DNA amounts and strong DNA degradation, their STR profiles are in many cases incomplete and difficult to interpret. In the present work, a systematic approach was applied to establish correlations between DNA quantity and DNA degradation on the one hand and STR typing success on the other, and on that basis to predict the typing success of DNA from hairs.
To this end, a human-specific (RiboD) and a canine-specific (RiboDog) qPCR-based assay were developed for measuring the DNA quantity and assessing the DNA integrity via a degradation value (D-value). Because the primers used target ubiquitous ribosomal DNA sequences, the underlying principle can be transferred quickly and inexpensively to other species. The assays were verified with serially degraded DNA, and the human assay was validated against the commercial Quantifiler Trio DNA Quantification Kit. Finally, the assays were used on DNA from telogen and catagen single hairs of humans and dogs to examine the relationship between DNA quantity and DNA integrity on the one hand and the completeness of STR alleles (allele recovery) on the other, in DNA profiles obtained with capillary electrophoresis (CE) STR kits. For human single hairs, allele recovery depended on both the DNA quantity and the DNA integrity. In contrast, DNA degradation in individual dog hairs was consistently lower, and allele recovery depended solely on the amount of extracted DNA.
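qPCR degradation values of this kind are typically computed as the ratio of the concentration measured with a short amplicon target to that measured with a long target, since degradation fragments long targets first. A minimal sketch of that standard ratio follows; the exact definition of the RiboD/RiboDog D-value may differ, and the concentrations here are invented for illustration:

```python
def degradation_value(conc_short_target, conc_long_target):
    """Ratio of short- to long-amplicon qPCR concentrations.
    Values near 1 indicate intact DNA; values well above 1 indicate
    that long fragments have been lost to degradation."""
    return conc_short_target / conc_long_target

# Example: short target quantifies at 0.8 ng/uL, long target at 0.1 ng/uL
print(degradation_value(0.8, 0.1))  # 8.0 -> strongly degraded sample
```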
To further improve the STR analysis of degraded human DNA samples, a novel NGS-based assay (maSTR, mini-amplicon STR) was established that amplifies the 16 forensic STR loci of the European Standard Set plus amelogenin in parallel as very short amplicons (76-296 bp). With intact DNA, the maSTR assay generated reproducible, complete profiles without allelic drop-ins at around 200 pg of input DNA. At lower DNA amounts, occasional allelic drop-ins occurred, although complete profiles were still obtained with at least 43 pg of DNA.
The combined strategy, RiboD measurements of DNA quantity and integrity together with the resulting STR typing success of the maSTR assay, was validated on degraded DNA. The strategy was then applied to DNA from telogen and catagen single hairs and compared with the results of the CE-based PowerPlex ESX 17 kit, which analyzes the same STR marker set. The typing success of both STR assays depended on both an optimal amount of template DNA and the DNA integrity. With the maSTR assay, complete profiles were demonstrated with approximately 50 pg of input DNA for slightly degraded DNA from single hairs, and with approximately 500 pg of strongly degraded DNA. Because of the low DNA amounts in telogen single hairs, the reproducibility of the maSTR results fluctuated, but in terms of allele recovery it was consistently superior to the PowerPlex ESX 17 kit.
A comparison with two CE-based STR kits that are complementary in their amplicon length distributions (PowerPlex ESX 17 and ESI 17 Fast), and with a commercial NGS kit (ForenSeq DNA Signature Prep), showed that it is not the NGS technology itself but the shortness of the amplicons that is the most important factor for typing degraded DNA. In all comparisons with the commercial kits, however, the maSTR assay exhibited a higher number of allelic drop-ins, which occurred the more frequently the smaller and the more degraded the DNA input was.
Because profiles containing allelic drop-ins correspond to mixed profiles, the STR profiles generated by the maSTR assay were analyzed with methods for interpreting mixed traces. In composite interpretation, all alleles occurring across replicates are counted; in consensus interpretation, only the reproducible alleles. Composite interpretation turned out to be best suited to profiles with few allelic drop-ins (PowerPlex ESX 17-generated profiles), and consensus interpretation to profiles rich in allelic drop-ins (maSTR-generated profiles).
Finally, the GenoProof Mixture 3 software was used to examine to what extent semi-continuous and fully continuous probabilistic methods are suitable for the biostatistical evaluation of DNA profiles from single hairs. Owing to its high number of allelic drop-ins, the maSTR assay proved only slightly superior to the CE-based methods, and only for DNA present in sufficient quantity and with low degradation; in that range, however, assigning the hair profile to the reference profile also succeeds with CE-based methods.
From all these results, a recommendation was derived for handling DNA from shed single hairs, based on the degree of DNA degradation in combination with the DNA quantity. The present work thus lays a foundation for making shed single hairs usable in routine forensic casework and, where applicable, for transferring the approach to other trace types with small amounts of degraded DNA. This could increase the usability of such trace types for forensic science, particularly where the standard CE-based methods fail. Looking ahead, NGS, thanks to the high multiplexing capacity of uniform short markers, is generally superior to CE-based techniques for typing degraded DNA.
As robots become ubiquitous and more capable, there is a pressing need for solid robot software development methods in order to increase robots' task spectrum. This thesis is concerned with improving the software engineering of robot perception systems. The presented research employs a model-based approach to provide the means to represent knowledge about robotics software. The thesis is divided into three parts, covering research on the specification, deployment, and adaptation of robot perception systems.
Eight strains of meat spoilage microorganisms commonly found in poultry meat were successfully discriminated and classified using two dispersive Raman spectrometers (a Raman microscope and a portable fiber-optic system) in combination with chemometric methods. Principal Component Analysis (PCA) and multi-class Support Vector Machines (MC-SVM) were applied to develop discrimination and classification models. These models were validated using data sets that were successfully assigned to the correct bacterial genera and even to the correct strain. Discrimination of the bacteria down to strain level was performed on the pre-processed spectral data using a three-stage model based on PCA, and the spectral features and differences among the species on which the discrimination was based were clarified through the PCA loadings. For MC-SVM, the pre-processed spectral data were subjected to PCA and then used to build a classification model. Using the first two principal components, the accuracy of the MC-SVM model was 97.64% and 93.23% for the validation data collected by the Raman microscope and the portable fiber-optic Raman system, respectively. The accuracy reached 100% when using the first eight and ten PCs of the data collected by the Raman microscope and the portable fiber-optic Raman system, respectively. The results reflect the strong discriminative power and high performance of the developed models, the suitability of the pre-processing method used in this study, and the fact that the lower accuracy of the portable fiber-optic Raman system does not adversely affect the discriminative power of the developed models.
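The PCA-then-MC-SVM workflow described above can be sketched as a standard scikit-learn pipeline. The synthetic data below merely stands in for pre-processed Raman spectra; the feature counts, kernel, and preprocessing are assumptions, not the thesis's actual settings:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for preprocessed spectra: 400 samples x 600 "wavenumber"
# features, 8 classes (one per bacterial strain).
X, y = make_classification(n_samples=400, n_features=600, n_informative=40,
                           n_classes=8, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    StandardScaler(),        # normalize each spectral feature
    PCA(n_components=10),    # compress spectra to leading components
    SVC(kernel="linear"),    # multi-class SVM (one-vs-one by default)
)
model.fit(X_train, y_train)
print(f"validation accuracy: {model.score(X_test, y_test):.2f}")
```

Varying `n_components` reproduces the kind of trade-off reported above, where more retained PCs raised the MC-SVM accuracy on the validation data.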
Due to the popularity of the Internet and the networked services that it facilitates, networked devices have become increasingly common in both the workplace and everyday life in recent years—following the trail blazed by smartphones. The data provided by these devices allow for the creation of rich user profiles. As a result, the collection, processing and exchange of such personal data have become drivers of economic growth. History shows that the adoption of new technologies is likely to influence both individual and societal concepts of privacy. Research into privacy has therefore been confronted with continuously changing concepts due to technological progress. From a legal perspective, privacy laws that reflect social values are sought. Privacy enhancing technologies are developed or adapted to take account of technological development. Organizations must also identify protective measures that are effective in terms of scalability and automation. Similarly, research is being conducted from the perspective of Human-Computer Interaction (HCI) to explore design spaces that empower individuals to manage their protection needs with regard to novel data, which they may perceive as sensitive. Taking such an HCI perspective with regard to understanding privacy management on the Internet of Things (IoT), this research mainly focuses on three interrelated goals across the fields of application: 1. Exploring and analyzing how people make sense of data, especially when managing privacy and data disclosure; 2. Identifying, framing and evaluating potential resources for designing sense-making processes; and 3. Exploring the fitness of the identified concepts for inclusion in legal and technical perspectives on supporting decisions regarding privacy on the IoT. Although this work's point of departure is the HCI perspective, it emphasizes the importance of the interrelationships among seemingly independent perspectives. 
Their interdependence is therefore also emphasized and taken into account by subscribing to a user-centered design process throughout this study. More specifically, this thesis adopts a design case study approach, which makes it possible to conduct full user-centered design lifecycles in a concrete application case with participants in the context of everyday life. On this basis, it was possible to investigate several currently relevant domains of the IoT, namely smart metering, smartphones, smart homes, and connected cars. The results show that the participants were less concerned about (raw) data than about the information that could potentially be derived from it. Against the background of the constant collection of highly technical and abstract data, whose content only becomes visible through the application of complex algorithms, this study indicates that people should learn to explore and understand these data flexibly, and it provides insights into how to design support for this aim. From the point of view of designing usable privacy protection measures, the information provided to users about data disclosure should focus on its consequences for their environments and lives. A related concept from law is "informed consent," which I propose should be further developed in order to implement usable mechanisms for individual privacy protection in the era of the IoT. Finally, this thesis demonstrates how research on HCI can be methodologically embedded in a regulative process that informs both the development of technology and the drafting of legislation.
Solving differential-algebraic equations (DAEs) efficiently by means of appropriate numerical schemes for time integration is an ongoing topic in applied mathematics. In this context, efficient computation becomes especially relevant for the large systems that arise in many fields of practical application. Typical examples are simulations of network structures describing the transport of fluid and gas, or electrical circuits. Due to the stiffness properties of DAEs, time integration of such problems generally demands implicit strategies. Among the schemes that prove to be an adequate choice are linearly implicit Runge-Kutta methods in the form of Rosenbrock-Wanner (ROW) schemes. Compared to fully implicit methods, they are easy to implement and avoid the solution of non-linear equations by including Jacobian information within their formulation. However, computing the Jacobian is a costly operation, so the need to evaluate the exact Jacobian at every successful time step proves to be a considerable drawback. To overcome this drawback, a ROW-type method is introduced that allows for non-exact Jacobian entries when solving semi-explicit DAEs of index one. The resulting scheme thus makes it possible to exploit several strategies for saving computational effort, such as partially explicit integration of non-stiff components, more advantageous sparse Jacobian structures, or time-lagged Jacobian information. In fact, because it allows for non-exact Jacobian expressions, the given scheme can be interpreted as a generalized ROW-type method for DAEs: it covers many different ROW-type schemes known from the literature. To derive the order conditions of the ROW-type method introduced, a theory is developed that identifies the occurring differentials and coefficients graphically by means of rooted trees.
Rooted trees for describing numerical methods were originally introduced by J. C. Butcher. They significantly simplify the determination and definition of the relevant characteristics because they allow straightforward procedures to be applied. The theory presented here combines the strategies used to represent ROW-type methods with exact Jacobian for DAEs and ROW-type methods with non-exact Jacobian for ODEs. For this purpose, new types of vertices are introduced in order to describe the occurring non-exact elementary differentials completely. The resulting theory thus automatically comprises the relevant approaches known from the literature. As a consequence, it makes it possible both to recover the order conditions of the familiar methods covered and to identify new conditions. With the theory developed, new sets of coefficients are derived that realize the introduced ROW-type method up to orders two and three. Some of them are constructed from methods known from the literature that satisfy additional conditions for avoiding effects of order reduction. It is shown that these methods can be improved by means of the newly derived order conditions without increasing the number of internal stages. Convergence of the resulting methods is analyzed on several academic test problems. The results confirm the theory and the order conditions found, as only schemes satisfying the predicted order conditions preserve their order when non-exact Jacobian expressions are used.
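The core idea behind linearly implicit ROW-type stepping can be sketched as follows. This is a minimal illustration, not the generalized scheme derived in the thesis: it shows the simplest one-stage case, the linearly implicit Euler method applied to an ODE, where each step solves a single linear system built from Jacobian information instead of iterating Newton's method on a non-linear system. The `jac` argument may deliberately return a non-exact Jacobian, which is the situation the non-exact-Jacobian analysis described above addresses; the function names and signatures here are purely illustrative.

```python
import numpy as np

def rosenbrock_euler_step(f, jac, y, h):
    """One step of the linearly implicit (Rosenbrock) Euler method:
    solve (I - h*J) k = h*f(y), then return y + k.
    Only one linear solve per step; no Newton iteration.
    jac(y) need not be the exact Jacobian of f (W-method viewpoint)."""
    n = y.size
    A = np.eye(n) - h * jac(y)
    k = np.linalg.solve(A, h * f(y))
    return y + k

def integrate(f, jac, y0, t0, t1, steps):
    """Apply the step above with a fixed step size over [t0, t1]."""
    y = np.array(y0, dtype=float)
    h = (t1 - t0) / steps
    for _ in range(steps):
        y = rosenbrock_euler_step(f, jac, y, h)
    return y
```

For the stiff scalar test problem y' = -y, the step reduces to y_new = y / (1 + h), which remains stable for arbitrarily large h; replacing the exact Jacobian -1 by a perturbed value still yields a convergent first-order method, illustrating why tolerating non-exact Jacobians saves effort without losing convergence.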
The art of nudging
(2023)
Do simple and subtle changes in the living and study environment improve the eating behaviour of students in an educational setting? This dissertation provides a not-so-simple answer to this simple question based on the outcomes of four studies that explore the effects and design of artwork nudges (specifically the artwork of Alberto Giacometti) on the eating behaviour of students, applying different research designs. Study 1 explores the effects of a Giacometti-like nudge (a more contemporary version of the original nudge) on the dietary behaviour of high school students in a controlled setting. Study 2 applies different artwork nudges within a virtual vignette setting to measure their effects on virtual meal choices; the degree to which individuals were aware of the nudge's presence is also included as a factor influencing nudge effectiveness. Study 3 assesses susceptibility to nudges, defined as nudgeability, as measured with a questionnaire. Study 4 assesses the effects of the original Giacometti nudge in a real-world university cafeteria setting. Specifically, the immediate and sustained effects of the original Giacometti nudge on students' meal purchases in the university cafeteria are considered. In addition, the role of awareness of the nudge's presence as well as the acceptance of this specific nudge are discussed. The conclusion is drawn that the original Giacometti nudge should only be applied in an educational setting to improve healthy eating behaviour if the intended target groups and environment meet certain conditions. Artwork nudges in general should be applied only after rigorous testing of various types of nudges and further research that reflects healthy eating in its entirety.