The curricula of all degree programs at H-BRS include many practice-oriented activities and focus on hands-on learning. In labs and small classes (30–60 persons), students get a personalized learning environment, complemented by many individual and group projects that foster collaborative work. Students learn in several main areas by working with industry, local organizations, and public institutions.
Less is Often More: Header Whitelisting as Semantic Gap Mitigation in HTTP-Based Software Systems
(2021)
The web is the most widespread digital system in the world and is used for many crucial applications. This makes web application security extremely important and, although there are already many security measures, new vulnerabilities are constantly being discovered. One reason for some of the recent discoveries lies in the presence of intermediate systems—e.g. caches, message routers, and load balancers—on the way between a client and a web application server. The implementations of such intermediaries may interpret HTTP messages differently, which leads to a semantically different understanding of the same message. This so-called semantic gap can cause weaknesses in the entire HTTP message processing chain.
In this paper we introduce the header whitelisting (HWL) approach to address the semantic gap in HTTP message processing pipelines. The basic idea is to normalize and reduce an HTTP request header to the minimum required fields using a whitelist before processing it in an intermediary or on the server, and then restore the original request for the next hop. Our results show that HWL can avoid misinterpretations of HTTP messages in the different components and thus prevent many attacks rooted in a semantic gap including request smuggling, cache poisoning, and authentication bypass.
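The normalize-reduce-restore cycle described above can be sketched in a few lines. The whitelist contents, header values, and function names here are illustrative assumptions for the sake of the sketch, not the paper's actual implementation:

```python
# Minimal sketch of the header whitelisting (HWL) idea: normalize header
# names, keep only whitelisted fields for processing, and keep the original
# request around so it can be restored for the next hop.
# The whitelist below is a hypothetical example.
ALLOWED_HEADERS = {"host", "content-length", "content-type", "authorization"}

def whitelist_headers(headers: dict) -> tuple[dict, dict]:
    """Normalize header names and keep only whitelisted fields.
    Returns the reduced header set and the original for later restoration."""
    original = dict(headers)
    reduced = {
        name.strip().lower(): value
        for name, value in headers.items()
        if name.strip().lower() in ALLOWED_HEADERS
    }
    return reduced, original

def restore_headers(original: dict) -> dict:
    """Restore the unmodified request headers for the next hop."""
    return dict(original)

reduced, original = whitelist_headers({
    "Host": "example.org",
    "Content-Length ": "42",         # trailing space: a classic smuggling vector
    "X-Forwarded-Host": "evil.test"  # not whitelisted, dropped before processing
})
```

Because every component in the chain sees only the normalized, reduced header set, there is no room for divergent interpretations of ambiguous or duplicated fields.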
Lehren an Hochschulen
(2015)
The Servicestelle Lehrbeauftragtenpool is a joint project funded by the Federal Ministry of Education and Research (BMBF) and run by the universities of applied sciences Bonn-Rhein-Sieg, Düsseldorf, Niederrhein, and Rhein-Waal. Our goal is to improve the quality of teaching by bringing people with professional experience into universities and into teaching.
With the start of teaching in the winter semester of 1995/96, the business departments in Sankt Augustin and Rheinbach set themselves the goal of continuous quality assurance and improvement of their degree programmes. Evaluation of teaching and studies was implemented early on. The department regards the teaching and evaluation report as an instrument of self-directed quality assurance.
During robot-assisted therapy, a robot typically needs to be partially or fully controlled by therapists, for instance using a Wizard-of-Oz protocol; this makes therapeutic sessions tedious to conduct, as therapists cannot fully focus on the interaction with the person under therapy. In this work, we develop a learning-based behaviour model that can be used to increase the autonomy of a robot’s decision-making process. We investigate reinforcement learning as a model training technique and compare different reward functions that consider a user’s engagement and activity performance. We also analyse various strategies that aim to make the learning process more tractable, namely i) behaviour model training with a learned user model, ii) policy transfer between user groups, and iii) policy learning from expert feedback. We demonstrate that policy transfer can significantly speed up the policy learning process, although the reward function has an important effect on the actions that a robot can choose. Although the main focus of this paper is the personalisation pipeline itself, we further evaluate the learned behaviour models in a small-scale real-world feasibility study in which six users participated in a sequence learning game with an assistive robot. The results of this study seem to suggest that learning from guidance may result in the most adequate policies in terms of increasing the engagement and game performance of users, but a large-scale user study is needed to verify the validity of that observation.
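To make the reward-function discussion concrete, here is a toy, bandit-style sketch in which a simulated user's engagement and activity performance are combined into a reward that drives the robot's action choice. The user model, weights, and action set are invented for illustration and are not the study's actual setup:

```python
import random

random.seed(0)

ACTIONS = ["easy", "medium", "hard"]  # hypothetical task difficulty levels

def simulated_user(action: str) -> tuple[float, float]:
    """Toy user model: medium difficulty yields high engagement,
    easy yields high performance, hard is frustrating."""
    base = {"easy": (0.6, 0.9), "medium": (0.9, 0.7), "hard": (0.4, 0.3)}[action]
    noise = random.uniform(-0.05, 0.05)
    return base[0] + noise, base[1] + noise

def reward(engagement: float, performance: float, w: float = 0.5) -> float:
    """Weighted combination of engagement and performance."""
    return w * engagement + (1 - w) * performance

# Stateless, bandit-style Q-learning over the three actions.
q = {a: 0.0 for a in ACTIONS}
alpha, epsilon = 0.1, 0.2
for _ in range(2000):
    a = random.choice(ACTIONS) if random.random() < epsilon else max(q, key=q.get)
    eng, perf = simulated_user(a)
    q[a] += alpha * (reward(eng, perf) - q[a])

best_action = max(q, key=q.get)
```

Changing the weight `w` shifts which actions the learned policy favours, which mirrors the paper's observation that the reward function has an important effect on the actions a robot can choose.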
This thesis introduces and demonstrates a novel method by which an autonomous robot learns qualitative models of the world. The method enables the generation of qualitative models that can be used for prediction as well as for directing experiments to improve the model. The qualitative models form the robot's knowledge representation and consist of qualitative trees and a non-deterministic finite automaton. An efficient exploration algorithm that lets the robot collect the most relevant learning samples is also introduced. To demonstrate the methodology, representation, and algorithm, two experiments are described. The first is conducted with a mobile robot and a ball: the robot observes the ball and learns the effect of its actions on the observed attributes of the world. The second is conducted with a mobile robot and five boxes, two non-movable and three movable. The robot actively experiments with the objects and observes the changes in the attributes of the world. The main difference between the two experiments is that the first learns by observation while the second learns by experimentation. In both experiments, the robot learns qualitative models from its actions and observations. Although the primary objective of the robot is to improve itself by predicting the outcome of its actions, the learned models were also used at each step of the learning process to direct the experiments so that the model converges to the final model as quickly as possible.
Rapid and sustained innovation in developed markets triggers the creation of innovative start-ups, some with disruptive innovations. However, when their offering faces a saturated market with satisfactory and widely available established traditional solutions, many innovative start-ups from these markets may fail. The literature on start-ups that successfully brought their innovation to emerging markets shows how leapfrogging from traditional to innovative solutions can offer these start-ups survival and growth opportunities. However, the broad exploitation of leapfrogging processes in emerging markets for the survival or business growth of innovative start-ups from developed markets has not yet been theorized. To contribute to closing this gap, we propose a conceptual framework to assess the readiness of an emerging market to leapfrog to innovative solutions.
The design of the conceptual framework uses a scenario-planning-like approach with two key factors, Context Readiness and Value Network Integration. To test and refine the proposed framework and show its relevance for informed expansion decision-making, we used Participatory Action Research (PAR). The application of the framework is illustrated with the case of telehealth in Morocco.
When users in virtual reality cannot physically walk and self-motions are instead only visually simulated, spatial updating is often impaired. In this paper, we report on a study that investigated whether HeadJoystick, an embodied leaning-based flying interface, could improve performance in a 3D navigational search task that relies on maintaining situational awareness and spatial updating in VR. We compared it to Gamepad, a standard flying interface. For both interfaces, participants were seated on a swivel chair and controlled simulated rotations by physically rotating. They either leaned (forward/backward, right/left, up/down) or used the Gamepad thumbsticks for simulated translation. In a gamified 3D navigational search task, participants had to find eight balls within 5 min. The balls were hidden amongst 16 randomly positioned boxes in a dark environment devoid of any landmarks. Compared to the Gamepad, participants collected more balls using the HeadJoystick. It also minimized the distance travelled, motion sickness, and mental task demand. Moreover, the HeadJoystick was rated better in terms of ease of use, controllability, learnability, overall usability, and self-motion perception. However, participants indicated that the HeadJoystick could be more physically fatiguing after prolonged use. Overall, participants felt more engaged with the HeadJoystick, enjoyed it more, and preferred it. Together, this provides evidence that leaning-based interfaces like HeadJoystick can provide an affordable and effective alternative for flying in VR and potentially for telepresence drones.
Human and robot tasks in household environments include actions such as carrying an object, cleaning a surface, etc. These tasks are performed by means of dexterous manipulation, and for humans, they are straightforward to accomplish. Moreover, humans perform these actions with reasonable accuracy and precision but with much less energy and stress on the actuators (muscles) than the robots do. The high agility in controlling their forces and motions is actually due to "laziness", i.e. humans exploit the existing natural forces and constraints to execute the tasks.
The above-mentioned properties of the human lazy strategy motivate us to relax the problem of controlling robot motions and forces, and solve it with the help of the environment. Therefore, in this work, we developed a lazy control strategy, i.e. task specification models and control architectures that relax several aspects of robot control by exploiting prior knowledge about the task and environment. The developed control strategy is realized in four different robotics use cases. In this work, the Popov-Vereshchagin hybrid dynamics solver is used as one of the building blocks in the proposed control architectures. An extension of the solver’s interface with the artificial Cartesian force and feed-forward joint torque task-drivers is proposed in this thesis.
To validate the proposed lazy control approach, an experimental evaluation was performed in a simulation environment and on a real robot platform.
The lattice Boltzmann method (LBM) stands apart from conventional macroscopic approaches due to its low numerical dissipation and reduced computational cost, attributed to a simple streaming and local collision step. While this property makes the method particularly attractive for applications such as direct noise computation, it also renders the method highly susceptible to instabilities. A vast body of literature exists on stability-enhancing techniques, which can be categorized into selective filtering, regularized LBM, and multi-relaxation time (MRT) models. Although each technique bolsters stability by adding numerical dissipation, they act on different modes. Consequently, there is no universal scheme optimally suited for a wide range of different flows. The reason for this lies in the static nature of these methods; they cannot adapt to local or global flow features. Adaptive filtering using a shear sensor constitutes an exception to this. For this reason, we developed a novel collision operator that uses space- and time-variant collision rates associated with the bulk viscosity. These rates are optimized by a physically informed neural net. In this study, the training data consist of a time series of different instances of a 2D barotropic vortex solution, obtained from a high-order Navier–Stokes solver that embodies desirable numerical features. For this specific test case, our results demonstrate that the relaxation times adapt to the local flow and show a dependence on the velocity field. Furthermore, the novel collision operator demonstrates a better stability-to-precision ratio and outperforms conventional techniques that use an empirical constant for the bulk viscosity.
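For orientation, a minimal single-relaxation-time (BGK) D2Q9 collision step with one constant relaxation time is sketched below; the paper's operator instead uses space- and time-variant rates associated with the bulk viscosity, optimized by a neural net:

```python
import numpy as np

# Standard D2Q9 lattice: weights and discrete velocities.
W = np.array([4/9] + [1/9]*4 + [1/36]*4)
C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def equilibrium(rho, u):
    """Second-order Maxwell-Boltzmann equilibrium distributions."""
    cu = C @ u                     # (9,) projections of u on each direction
    usq = u @ u
    return W * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def bgk_collide(f, tau=0.8):
    """Relax the 9 distributions toward local equilibrium with rate 1/tau.
    In the adaptive operator described above, tau would vary in space/time."""
    rho = f.sum()
    u = (C.T @ f) / rho
    return f + (equilibrium(rho, u) - f) / tau

f = equilibrium(1.0, np.array([0.05, 0.0]))  # a node already at equilibrium
f_post = bgk_collide(f)                      # collision leaves it unchanged
```

A distribution already at equilibrium is a fixed point of the collision, and density and momentum are conserved regardless of `tau`; the stability issues discussed above arise from how far off-equilibrium modes are damped.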
Against the background of the Covid-19 pandemic, working from home has become widespread in Germany since 2020 and has since been adopted by many employers as a new way of working. Working from home can have various positive and negative effects on employees, employers, and society at large. To benefit from as many of the positive effects as possible, a sound home-office concept is required. This article outlines the requirements for such a concept and the basic preconditions associated with working from home. Requirements of various kinds, from technical to social aspects, are considered; they were derived from a study conducted by the author. The article focuses on the critical success factors for ideal work from home, i.e. the requirements that are decisive for the successful implementation of a home-office concept and that influence the perceived effects of working from home. The study cited in the article was conducted as part of Mr. Jeske's thesis, on which the article is based.
Interest in virtual reality (VR) for higher-education teaching is currently growing, driven by the possibility of representing logistically difficult tasks and by positive results from efficacy studies. At the same time, there is a lack of studies that compare immersive VR environments, non-immersive desktop environments, and conventional learning materials, and that evaluate teaching-methodology aspects. This article therefore deals with the design and implementation of a learning environment for higher education that can be used both with a head-mounted display (HMD) and on a desktop, and with its evaluation using an experimental group design. The learning environment was built on a purpose-developed software platform, and its effectiveness was evaluated and compared using two experimental groups (VR vs. desktop environment) and a control group. In a pilot study, both qualitative and quantitative assessments of the usability of the learning environment were positive in both experimental groups. In addition, positive effects on the cognitive and affective impact of the learning environment were found compared to conventional learning materials. However, hardly any differences between VR and desktop use emerged at the cognitive and affective level. The analysis of log data does, however, point to differences in learning and exploration behaviour.
Simulations within virtual environments usually require underlying semantics. For traffic simulations, defined traffic networks are typically used. These networks are mostly created by hand, which is error-prone and time-consuming. This project was carried out as part of the AVeSi project, which researches the development of a realistic traffic simulation for virtual environments. The simulation approach pursued in the project is based on two levels of complexity, a microscopic and a mesoscopic one. Realizing a transition between the simulation levels requires linking the traffic networks, which is likewise very time-consuming. This report presents models for traffic networks on both levels. It then describes an approach that enables the automatic generation and linking of traffic networks for both models. Data in the OpenDRIVE® format serve as the basis for network generation. For the evaluation, realistic OpenStreetMap data were converted into OpenDRIVE® datasets using third-party software. It was demonstrated that the approach makes it possible to generate large traffic networks within a few minutes, on which simulations can be run immediately. However, the quality of the networks generated for the evaluation is not sufficient for environments that demand a high degree of realism, which makes an additional post-processing step necessary. The quality problems could be traced back to the insufficient level of detail of the OpenStreetMap data underlying the evaluation data and to the conversion process not being sufficiently transparent.
This thesis deals with the development of a circuit concept and laboratory prototype of an external illumination unit for research on time-of-flight (ToF) cameras using the amplitude-modulated continuous wave (AMCW) method. The external illumination acts as a high-power repeater of a ToF camera's internal illumination and is capable of emitting the high-frequency square-wave signals used by ToF cameras.
Since ToF cameras usually do not provide an electrical control signal (trigger signal) for operating an external illumination unit, this signal is derived from the camera's optical signal. To this end, a concept for an optical detector (trigger) is presented, consisting of a photodiode, a transimpedance amplifier, and subsequent signal conditioning. It is also shown how a fast external illumination with high radiant power can be implemented using a metal-oxide-semiconductor field-effect transistor (MOSFET) and four vertical-cavity surface-emitting lasers (VCSELs). Two circuit concepts are presented, connecting MOSFET and VCSELs in series and in parallel. VCSELs with a wavelength of 940 nm in the near infrared (NIR), typical for ToF cameras, serve as light sources.
It was shown that the optical trigger can convert signals of up to 100 MHz into electrical output signals. Square trigger signals with rise times of 650 ps and fall times of 440 ps were achieved. The external illumination emitted signals of up to 100 MHz. In combination with the optical trigger, optical signals with rise times of 1.5 ns and fall times of 960 ps were attained, with radiant powers of almost 7 W. The overall system of optical trigger and external illumination exhibits a latency of 16 ns. As a result of this work, a system was built that, based on these results, can most likely be used as an external illumination for research purposes with various ToF cameras. The optical trigger and the illumination can also be used separately.
Question Answering (QA) has gained significant attention in recent years, with transformer-based models improving natural language processing. However, issues of explainability remain, as it is difficult to determine whether an answer is based on a true fact or a hallucination. Knowledge-based question answering (KBQA) methods can address this problem by retrieving answers from a knowledge graph. This paper proposes a hybrid approach to KBQA called FRED, which combines pattern-based entity retrieval with a transformer-based question encoder. The method uses an evolutionary approach to learn SPARQL patterns, which retrieve candidate entities from a knowledge base. A transformer-based regressor is then trained to estimate each pattern's expected F1 score for answering the question, resulting in a ranking of candidate entities. Unlike other approaches, FRED can attribute results to learned SPARQL patterns, making them more interpretable. The method is evaluated on two datasets and yields MAP scores of up to 73 percent, with the transformer-based interpretation falling only 4 pp short of an oracle run. Additionally, the learned patterns successfully complement manually generated ones and generalize well to novel questions.
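The retrieve-then-rank idea can be illustrated with a deliberately simplified sketch. The SPARQL templates and the keyword-matching "regressor" below are invented stand-ins for FRED's evolutionarily learned patterns and transformer-based regressor:

```python
# Two hypothetical SPARQL patterns; in FRED these are learned by an
# evolutionary process, and each retrieves candidate entities from the KB.
PATTERNS = {
    "capital": 'SELECT ?x WHERE { ?y <hasCapital> ?x . ?y <label> ?topic }',
    "location": 'SELECT ?x WHERE { ?x <locatedIn> ?y . ?y <label> ?topic }',
}

def mock_regressor(question: str, pattern_key: str) -> float:
    """Stand-in for the transformer regressor that predicts a pattern's
    expected F1 score: here just a crude keyword match."""
    return 1.0 if pattern_key in question.lower() else 0.1

def rank_patterns(question: str) -> list[str]:
    """Rank patterns by predicted score; candidates retrieved by the top
    pattern become the answer, attributable to the pattern that found them."""
    return sorted(PATTERNS, key=lambda k: mock_regressor(question, k),
                  reverse=True)

ranking = rank_patterns("What is the capital of France?")
```

The key property this sketch preserves is attribution: each returned candidate can be traced back to the concrete SPARQL pattern that produced it, which is what makes the results interpretable.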
In shelters for refugees, a high proportion of children live in an environment that was often created for, and/or is dominated by, adults. The physical conditions, the structure, and communal life on site therefore largely determine children's lifeworlds. Yet children have special rights and needs. The protection of children and an environment conducive to healthy development are essential aspects enshrined in international agreements such as the UN Convention on the Rights of the Child, and they must be implemented. Although the federal states are obliged within the national legal framework to ensure the protection of children in refugee shelters, implementation is often not bindingly regulated. This analysis discusses children's-rights aspects of protecting children in refugee shelters; it also looks at activities and measures of the federal initiative for the protection of refugees in refugee shelters ("Bundesinitiative Schutz von geflüchteten Menschen in Flüchtlingsunterkünften"), launched by the Federal Ministry for Family Affairs, Senior Citizens, Women and Youth (BMFSFJ) and the United Nations Children's Fund (UNICEF), and provides an overview of the latest secondary literature on child protection in collective shelters. The aim of the article is to show which aspects favour the protection of children and where the challenges lie.
This introductory paper is intended as a guide to the topic of artificial intelligence (AI) in the context of German as a foreign or second language (DaF/DaZ). Starting from frequently asked questions, it provides basic information on technical and historical background, prompts for didactic-methodological reflection, and practical ideas for using AI in the DaF/DaZ context.
The dawn of the 21st century has witnessed a tremendous increase in trade pacts among nations, resulting in renewed hopes for sustainable enterprise development in emerging economies worldwide. Ghana and other sub-Saharan African (SSA) countries have signed onto several North-South and South-South free trade agreements in the hope of strengthening their presence in the international trade arena and promoting economic growth in SSA. For over two decades, however, very little has changed, and many hopes have been dashed as enterprises continue to struggle in SSA. Not even the African Continental Free Trade Agreement (AfCFTA) could renew the hopes of sceptics. Several studies have argued that enterprises in SSA could improve their domestic and international competitiveness by establishing mutually beneficial partnerships with their counterparts from the Global North and South. This study delved into the issues that affect North-South and South-South business collaborations and recommends key success factors that could help promote mutually beneficial cross-border business partnerships. The research includes both literature and empirical information on the key success factors of business partnerships between African enterprises as well as between African enterprises and firms from the Global North. We approached the study qualitatively using a phenomenological research design. Research participants included important stakeholders in Africa's and Europe's international trade and sustainable enterprise development ecosystems. The study identified several challenges with current business collaborations and recommended new ways of making such partnerships more beneficial.
Kenya, like all other developing countries in the world, is faced with the task of working strategically towards the achievement of the Sustainable Development Goals (SDGs) 2030. These goals, whose due date coincides with that of the national development blueprint, the Kenya Vision 2030, have become a major focus of attention in the country. Conferences, workshops, and seminars are organized throughout the country on a regular basis by a multiplicity of organizations to address modalities of ensuring a timely achievement of the SDGs. Universities, individually or jointly, are working towards this same target. More specifically, there are priority areas on which the country is focusing strategically towards the achievement of the Kenya Vision 2030 and the SDGs. These strategic areas of focus have been isolated and declared by the President of the Republic of Kenya, His Excellency Uhuru Kenyatta, as the country's "big four priority areas", namely affordable housing, affordable health care, food security, and manufacturing, in a major effort towards achieving the SDGs and the Kenya Vision 2030 as well as creating jobs and wealth. Similarly, Mount Kenya University's top management established the Graduate Enterprise Academy (GEA) in 2013 under the direct patronage of the university's Founder, with the primary aim of helping graduates become job and wealth creators rather than job seekers. So far, over twenty start-ups are running throughout the country under the GEA. Although the GEA's areas of focus extend beyond the President's "Big Four" to include ICT and the creative arts, among others, there are justifiable cases indicating that the GEA's activities also support the national "Big Four" agenda.
This paper presents different start-ups under MKU's Graduate Enterprise Academy, showcased as evidence of MKU's support for the achievement of the national "Big Four" agenda. The paper covers part of an ongoing programme through desktop analyses of reports, with the objective of showcasing MKU's contribution to the national agenda through the Graduate Enterprise Academy for possible scale-up.
In times of climatic or political crises that affect not only human life worldwide but also the environment and the economic situation of a country, a rethinking of tourism is beginning, and ecotourism is becoming increasingly important in Germany as well. This study examines the applicability of this form of tourism in the East African destination of Kenya in the form of a travel package that is partly unique and can be designed individually, illustrated using the example of the charitable organization Mully Children's Family and the affiliated registered tourism company, MCF Africa Safaris. The underlying research aims to determine how the organisation's own tree-planting initiative can be transformed into a tourist niche market and how this must be geared to gain the interest of German eco-tourists. The evaluation of the research results shows high potential for implementing a form of travel that combines active support of the named charity and its initiative with individually selectable holiday activities in the target market Kenya. The study concludes with basic prerequisites whose consideration is essential for successfully integrating the tree-planting niche market and the branch-specific nature of ecotourism into the Kenyan travel market.
Trade of wild-caught animals is illegal for many taxa and in many countries. Common regulatory procedures involve documentation and marking techniques. However, these procedures are subject to fraud and should thus be complemented by routine genetic testing to authenticate the captive-bred origin of animals intended for trade. A suitable class of genetic markers are SNPSTRs, which combine a short tandem repeat (STR) and single nucleotide polymorphisms (SNPs) within one amplicon. This combined marker type can be used for genetic identification and parentage analyses and, in addition, provides insight into haplotype history. As a proof of principle, this study establishes a set of 20 SNPSTR markers for Athene noctua, one of the most trafficked owls in CITES Appendix II. These markers can be coamplified in a single multiplex reaction. Based on population data, the observed and expected heterozygosities of the markers ranged from 0.400 to 1.000 and 0.545 to 0.850, respectively. A combined probability of identity of 5.3×10⁻²³ was achieved with the whole set, and combined parentage exclusion probabilities reached over 99.99%, even if the genotype of one parent was missing. A direct comparison of an owl family and an unrelated owl demonstrated the applicability of the SNPSTR set in parentage testing. The established SNPSTR set thus proved to be highly useful for identifying individuals and analysing parentage to determine wild or captive origin. We propose to implement SNPSTR-based routine certification in wildlife trade as a way to reveal animal laundering and misdeclaration of wild-caught animals.
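As a quick sanity check on the order of magnitude of such a combined probability of identity (PI): assuming independent loci, the per-locus PIs multiply. The per-locus value below is invented for illustration, not taken from the study:

```python
import math

def combined_pi(per_locus_pi):
    """Combined probability that two random individuals share the same
    multilocus genotype, assuming the loci are independent."""
    return math.prod(per_locus_pi)

# 20 hypothetical loci, each with a per-locus PI of 0.08, already give a
# combined PI in the 1e-22 range -- the same order as a 20-marker set.
pi_total = combined_pi([0.08] * 20)
```

This illustrates why even moderately informative markers become extremely discriminating when combined in a multiplex of 20.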
Jahresbericht 2021
(2022)
ITS Jahresbericht 2019
(2020)
ITS Jahresbericht 2018
(2019)
The Information and Communication Technology (ICT) sector is a significant global industry, and addressing climate change is of critical importance. This paper aims to assess the resources utilized by the ICT sector, the associated negative environmental impacts, and potential mitigation measures. In order to understand these aspects, this study attempts to categorize the resources used by ICT, analyze the amount consumed and the resulting negative impacts, and determine what measures exist to mitigate them. An economic and empirical evaluation shows a negative trend in ICT’s resource consumption, mainly due to increased energy consumption and rising carbon emissions from devices such as smartphones and data centers. The investigated countermeasures focus on Green IT strategies that encompass energy efficiency, carbon awareness, and hardware efficiency principles as outlined by the Green Software Foundation. Special attention is given to reducing the environmental footprint of data center operations and smartphones. This paper concludes that Green IT strategies, although promising in theory, are often not implemented at an industry level.
The molecular weight properties of lignins are one of the key elements that need to be analyzed for a successful industrial application of these promising biopolymers. In this study, the use of 1H NMR as well as diffusion-ordered spectroscopy (DOSY NMR), combined with multivariate regression methods, was investigated for the determination of the molecular weight (Mw and Mn) and the polydispersity of organosolv lignins (n = 53, Miscanthus x giganteus, Paulownia tomentosa, and Silphium perfoliatum). The suitability of the models was demonstrated by cross validation (CV) as well as by an independent validation set of samples from different biomass origins (beech wood and wheat straw). CV errors of ca. 7–9 and 14–16% were achieved for all parameters with the models from the 1H NMR spectra and the DOSY NMR data, respectively. The prediction errors for the validation samples were in a similar range for the partial least squares model from the 1H NMR data and for a multiple linear regression using the DOSY NMR data. The results indicate the usefulness of NMR measurements combined with multivariate regression methods as a potential alternative to more time-consuming methods such as gel permeation chromatography.
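The regression step can be sketched with synthetic data. The descriptors, coefficients, and noise level below are invented for the sketch, whereas the study fits PLS and multiple-linear-regression models to real 1H and DOSY NMR measurements:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "spectral descriptors" for 53 samples (mirroring n = 53 above)
# and a synthetic molecular-weight response with additive noise.
n_samples, n_features = 53, 3
X = rng.normal(size=(n_samples, n_features))
true_coef = np.array([1200.0, -300.0, 80.0])
y = 4000.0 + X @ true_coef + rng.normal(scale=50.0, size=n_samples)  # "Mw"

# Multiple linear regression: add an intercept column, solve least squares.
A = np.column_stack([np.ones(n_samples), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Mean relative prediction error in percent, analogous to the CV errors
# reported above (though here evaluated on the training data).
y_hat = A @ coef
rel_err = np.mean(np.abs(y_hat - y) / np.abs(y)) * 100
```

In practice, 1H NMR spectra have far more variables than samples, which is why the study uses partial least squares rather than plain least squares for that data set; the DOSY-based model with few predictors is closer to the sketch above.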
Solar photovoltaic power output is modulated by atmospheric aerosols and clouds and thus contains valuable information on the optical properties of the atmosphere. As a ground-based data source with high spatiotemporal resolution it has great potential to complement other ground-based solar irradiance measurements as well as those of weather models and satellites, thus leading to an improved characterisation of global horizontal irradiance. In this work several algorithms are presented that can retrieve global tilted and horizontal irradiance and atmospheric optical properties from solar photovoltaic data and/or pyranometer measurements. The method is tested on data from two measurement campaigns that took place in the Allgäu region in Germany in autumn 2018 and summer 2019, and the results are compared with local pyranometer measurements as well as satellite and weather model data. Using power data measured at 1 Hz and averaged to 1 min resolution along with a non-linear photovoltaic module temperature model, global horizontal irradiance is extracted with a mean bias error compared to concurrent pyranometer measurements of 5.79 W m−2 (7.35 W m−2) under clear (cloudy) skies, averaged over the two campaigns, whereas for the retrieval using coarser 15 min power data with a linear temperature model the mean bias error is 5.88 and 41.87 W m−2 under clear and cloudy skies, respectively.
During completely overcast periods the cloud optical depth is extracted from photovoltaic power using a lookup table method based on a 1D radiative transfer simulation, and the results are compared to both satellite retrievals and data from the Consortium for Small-scale Modelling (COSMO) weather model. Potential applications of this approach for extracting cloud optical properties are discussed, as well as certain limitations, such as the representation of 3D radiative effects that occur under broken-cloud conditions. In principle this method could provide an unprecedented amount of ground-based data on both irradiance and optical properties of the atmosphere, as long as the required photovoltaic power data are available and properly pre-screened to remove unwanted artefacts in the signal. Possible solutions to this problem are discussed in the context of future work.
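The mean bias error quoted above is simply the average signed difference between retrieved and reference irradiance. A minimal sketch, using made-up numbers rather than campaign data:

```python
import numpy as np

# Hypothetical retrieved GHI and concurrent pyranometer reference (W/m^2);
# the values are invented for illustration, not measurements.
ghi_retrieved = np.array([412.0, 655.3, 128.9, 803.1, 540.0])
ghi_pyranometer = np.array([405.2, 649.0, 131.5, 795.8, 533.9])

# Mean bias error: average signed deviation of the retrieval from the
# reference. A positive value means the retrieval overestimates irradiance.
mbe = np.mean(ghi_retrieved - ghi_pyranometer)

# Root-mean-square error captures scatter as well as bias.
rmse = np.sqrt(np.mean((ghi_retrieved - ghi_pyranometer) ** 2))
print(f"MBE = {mbe:.2f} W/m^2, RMSE = {rmse:.2f} W/m^2")
```

Because the MBE is signed, compensating over- and underestimates can cancel, which is why bias is typically reported separately for clear and cloudy skies as in the abstract above.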
Solar photovoltaic power output is modulated by atmospheric aerosols and clouds and thus contains valuable information on the optical properties of the atmosphere. As a ground-based data source with high spatiotemporal resolution it has great potential to complement other ground-based solar irradiance measurements as well as those of weather models and satellites, thus leading to an improved characterisation of global horizontal irradiance. In this work several algorithms are presented that can retrieve global tilted and horizontal irradiance and atmospheric optical properties from solar photovoltaic data and/or pyranometer measurements. Specifically, the aerosol (cloud) optical depth is inferred during clear sky (completely overcast) conditions. The method is tested on data from two measurement campaigns that took place in Allgäu, Germany in autumn 2018 and summer 2019, and the results are compared with local pyranometer measurements as well as satellite and weather model data. Using power data measured at 1 Hz and averaged to 1 minute resolution, the hourly global horizontal irradiance is extracted with a mean bias error compared to concurrent pyranometer measurements of 11.45 W m−2, averaged over the two campaigns, whereas for the retrieval using coarser 15 minute power data the mean bias error is 16.39 W m−2.
During completely overcast periods the cloud optical depth is extracted from photovoltaic power using a lookup table method based on a one-dimensional radiative transfer simulation, and the results are compared to both satellite retrievals and data from the COSMO weather model. Potential applications of this approach for extracting cloud optical properties are discussed, as well as certain limitations, such as the representation of 3D radiative effects that occur under broken-cloud conditions. In principle this method could provide an unprecedented amount of ground-based data on both irradiance and optical properties of the atmosphere, as long as the required photovoltaic power data are available and properly pre-screened to remove unwanted artefacts in the signal. Possible solutions to this problem are discussed in the context of future work.
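The lookup-table retrieval of cloud optical depth can be sketched as follows. A toy monotonically decreasing transmittance curve stands in here for the 1D radiative transfer simulation, so the table values are assumptions for illustration only:

```python
import numpy as np

# Illustrative lookup table: overcast-sky transmittance as a function of
# cloud optical depth (COD). In the real method this table would come from
# a 1D radiative transfer simulation; here a simple analytic stand-in is
# used so the inversion logic can be shown.
cod_grid = np.linspace(1.0, 100.0, 200)
transmittance_grid = 1.0 / (1.0 + 0.15 * cod_grid)

def retrieve_cod(measured_transmittance):
    """Invert the table by linear interpolation (transmittance -> COD).

    np.interp requires increasing x values, so the decreasing
    transmittance axis is flipped before interpolating."""
    return np.interp(measured_transmittance,
                     transmittance_grid[::-1], cod_grid[::-1])

# Example: a PV-derived transmittance of 0.25 is mapped back to a COD.
cod = retrieve_cod(0.25)
print(f"Retrieved COD: {cod:.1f}")
```

The inversion is only well defined because transmittance decreases monotonically with COD; under broken clouds (3D effects), measured transmittance can exceed the clear-sky value and this simple table lookup breaks down, which is one of the limitations the abstract mentions.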
The electricity grid of the future will be built on renewable energy sources, which are highly variable and dependent on atmospheric conditions. In power grids with an increasingly high penetration of solar photovoltaics (PV), an accurate knowledge of the incoming solar irradiance is indispensable for grid operation and planning, and reliable irradiance forecasts are thus invaluable for energy system operators. In order to better characterise shortwave solar radiation in time and space, data from PV systems themselves can be used, since the measured power provides information about both irradiance and the optical properties of the atmosphere, in particular the cloud optical depth (COD). Indeed, in the European context with highly variable cloud cover, the cloud fraction and COD are important parameters in determining the irradiance, whereas aerosol effects are only of secondary importance.
The rapid increase in solar photovoltaic (PV) installations worldwide has resulted in the electricity grid becoming increasingly dependent on atmospheric conditions, thus requiring more accurate forecasts of incoming solar irradiance. In this context, measured data from PV systems are a valuable source of information about the optical properties of the atmosphere, in particular the cloud optical depth (COD). This work reports first results from an inversion algorithm developed to infer global, direct and diffuse irradiance as well as atmospheric optical properties from PV power measurements, with the goal of assimilating this information into numerical weather prediction (NWP) models.
Photovoltaic (PV) power data are a valuable but as yet under-utilised resource that could be used to characterise global irradiance with unprecedented spatio-temporal resolution. The resulting knowledge of atmospheric conditions can then be fed back into weather models and will ultimately serve to improve forecasts of PV power itself. This provides a data-driven alternative to statistical methods that use post-processing to overcome inconsistencies between ground-based irradiance measurements and the corresponding predictions of regional weather models (see for instance Frank et al., 2018). This work reports first results from an algorithm developed to infer global horizontal irradiance as well as atmospheric optical properties such as aerosol or cloud optical depth from PV power measurements.
Space exposure experiments from the last 15 years have unexpectedly shown that several terrestrial organisms, including some multi-cellular species, are able to survive in open space without protection. The robustness of bdelloid rotifers suggests that these tiny creatures can possibly be added to the still restricted list of animals that can cope with exposure to the harsh conditions of space. Bdelloids are among the smallest animals on Earth. Living all over the world, mostly in semi-terrestrial environments, they appear to be extremely stress tolerant. Their desiccation tolerance at any stage of their life cycle is known to confer tolerance to a variety of stresses, including high doses of radiation and freezing. In addition, they constitute a major scandal in evolutionary biology due to the putative absence of sexual reproduction for at least 60 million years. Adineta vaga, with its unique characteristics and a draft genome available, was selected by the European Space Agency (ESA) as a model system to study the extreme resistance of organisms exposed to the space environment. In this manuscript, we documented the resistance of desiccated A. vaga individuals exposed to increasing doses of X-rays, protons, and Fe ions. Consequences of exposure to different sources of radiation were investigated with regard to cell type, including somatic cells (survival assay) and germ cells (fertility assay). Then, the capacity of A. vaga individuals to repair DNA double-strand breaks (DSBs) induced by different sources of radiation was investigated. Bdelloid rotifers represent a promising model for investigating damage induced by high- or low-LET radiation. The possibility of exposing both hydrated and desiccated specimens may help to decipher the contributions of direct and indirect radiation damage to biological processes. The results achieved through this study consolidate our knowledge of the radioresistance of A. vaga and improve our capacity to compare extreme resistance to radiation among living organisms, including metazoans.
Among the celestial bodies in the Solar System, Mars currently represents the main target for the search for life beyond Earth. However, its surface is constantly exposed to high doses of cosmic rays (CRs) that may pose a threat to any biological system. For this reason, investigations into the limits of resistance of life to space relevant radiation is fundamental to speculate on the chance of finding extraterrestrial organisms on Mars. In the present work, as part of the STARLIFE project, the responses of dried colonies of the black fungus Cryomyces antarcticus Culture Collection of Fungi from Extreme Environments (CCFEE) 515 to the exposure to accelerated iron (LET: 200 keV/μm) ions, which mimic part of CRs spectrum, were investigated. Samples were exposed to the iron ions up to 1000 Gy in the presence of Martian regolith analogues. Our results showed an extraordinary resistance of the fungus in terms of survival, recovery of metabolic activity and DNA integrity. These experiments give new insights into the survival probability of possible terrestrial-like life forms on the present or past Martian surface and shallow subsurface environments.
Polyether- and polyether/ester-based TPUs (thermoplastic polyurethanes) were investigated with wide-angle XRD (X-ray diffraction) and SAXS (small-angle X-ray scattering). Furthermore, SAXS measurements were performed in the temperature range of 30 °C to 130 °C. Polyether-based polymers exhibit only one broad diffraction signal in the region of 2θ = 15° to 25°. In the case of polyurethanes with ether/ester modification, the broad diffraction signal is accompanied by small, sharp diffraction signals. SAXS measurements reveal the size and shape of the crystalline zones of the polymers. Between 30 °C and 130 °C the size of the crystalline zones changes significantly; it decreases in most of the investigated TPUs. In the case of Desmopan 9365D, an increase in particle size was observed.
In this study, we investigate the thermo-mechanical relaxation and crystallization behavior of polyethylene using mesoscale molecular dynamics simulations. Our models specifically mimic constraints that occur in real-life polymer processing: After strong uniaxial stretching of the melt, we quench and release the polymer chains at different loading conditions. These conditions allow for free or hindered shrinkage, respectively. We present the shrinkage and swelling behavior as well as the crystallization kinetics over up to 600 ns simulation time. We are able to precisely evaluate how the interplay of chain length, temperature, local entanglements and orientation of chain segments influences crystallization and relaxation behavior. From our models, we determine the temperature dependent crystallization rate of polyethylene, including crystallization onset temperature.
Because the robust and rapid identification of spoilage microorganisms is becoming increasingly important in industry, IR microspectroscopy, together with robust and versatile chemometric models for data processing and classification, is gaining importance. To further improve the chemometric models, bacterial stress responses were induced and their effect on the IR spectra was studied. Thus, in this work, nine important food-relevant microorganisms were subjected to eight stress conditions, in addition to regular culturing as a reference. Spectral changes compared to normal growth conditions without stressors were found in the regions of 900–1500 cm−1 and 1500–1700 cm−1. These differences might stem from changes in protein secondary structure, exopolymer production, and the concentrations of nucleic acids, lipids, and polysaccharides. As a result, a model for the discrimination of the studied microorganisms at the genus, species, and strain level was established, with an accuracy of 96.6%. This was achieved despite the inclusion of various stress conditions and times after incubation of the bacteria. In addition, a model was developed for each individual microorganism to separate each stress condition or regular treatment with 100% accuracy.
Solar energy is one option to serve the rising global energy demand with low environmental impact [1]. Building an energy system with a considerable share of solar power requires long-term investment and a careful investigation of potential sites. Therefore, understanding the impact of regionally and locally varying meteorological conditions on solar energy production is essential for energy yield projections. Clouds move on short timescales and strongly influence the available solar radiation, as they absorb, reflect, and scatter parts of the incoming light [2]. However, the impact of cloudiness on photovoltaic (PV) power yields, and cloud-induced deviations from average yields, may vary depending on the technology, location, and time scale under consideration.
According to the International Union of Pure and Applied Chemistry (IUPAC) recommendation, gas chromatography (GC) is defined as a separation technique in which the mobile phase is a gas; it is always carried out in a column [1]. GC is a separation and detection method for sample mixtures whose components can be volatilized without thermal decomposition.
According to the International Union of Pure and Applied Chemistry (IUPAC) recommendation, analytical pyrolysis (Py) is defined as the characterization, in an inert atmosphere, of a material or a chemical process by chemical degradation reactions induced by thermal energy [1]. Thermal degradation under controlled conditions is often used as part of an analytical procedure, either to render a sample into a form suitable for subsequent analysis by gas chromatography (GC), mass spectrometry (MS), gas chromatography coupled with mass spectrometry (GC/MS) or with Fourier-transform infrared spectroscopy (GC/FTIR), or by direct monitoring as an analytical technique in its own right [2].
Designing a social protection system is not only a technical exercise but also a very political affair. A systems approach to social protection is shaped by the political elites and the respective coalitions of change, the political institutions, and the political system of a country. This explains why even countries that seem similar in terms of their risk profiles, poverty situations, and economic situations can adopt very different social protection systems or make very different progress with respect to social protection expansion. Both the established welfare states of the Global North and the nascent social protection systems in the Global South are a testimony to this variety.
The changing world poses many challenges to public policies, including social policies – among them social protection policies, which are the main focus of this handbook. In this part of the handbook, we take on a number of these challenges: demographic changes and their interaction with social protection policies; the roles of social protection in coping with the consequences of the COVID-19 pandemic (both topics discussed in Chapters 39 and 43 by Woodall); the challenges of globalisation (discussed in Chapter 40 by Betz) and the limitations it imposes on state sovereignty and its ability to decide on the size of publicly funded programmes, in particular social protection; the challenges to labour markets and effective social protection coverage posed by the automation and digitalisation of businesses (discussed in Chapter 41 by Gassmann); and, last but not least, the potential roles of social protection in facilitating populations' adjustment to climate change (discussed in Chapter 42 by Malerba).
While there is a standard set of instruments that can be used in social protection systems, this needs to be adapted and combined in different ways in order to serve different groups in society best. The needs of a young person who is just starting life and should not be trapped from birth in unfavourable socio-economic conditions are different from the social protection requirements of a retired person who has finished the active part of life and requires income and care security for an indefinite time period.
While social protection has become an important policy field in many low- and middle-income countries (LICs and MICs), 55 per cent of the world's population are still not covered by even one social protection benefit, with 87 per cent of people uncovered in Sub-Saharan Africa and 61 per cent in Asia and the Pacific (ILO 2017). Next to undercoverage, other factors lower the efficiency, effectiveness, and social justice of social protection in many countries, such as the lack of a joint vision and policy strategy, fragmented social protection programmes, duplicated administrative systems and efforts, and irrational prioritisation in spending. These all call for a stronger systems approach to social protection. This handbook is therefore dedicated to social protection systems, highlighting the relevance, but also the challenges, of a harmonised and coordinated approach across different social protection instruments, institutions, actors, and delivery mechanisms. It takes the reader through all possible aspects of social protection systems.
The cube-in-cube approach was used by Paul and Ishai-Cohen to model and derive formulas for the filler-content-dependent Young's moduli of particle-filled composites, assuming perfect filler-matrix adhesion. Their formulas were chosen because of their simplicity and were recalculated using an elementary volume (EV) approach, which transforms spherical inclusions into cubic inclusions. The EV approach led to an expression for the composite moduli that allows an adhesion factor kadh between 0 and 1 to be introduced to account for reduced filler-matrix adhesion. This adhesion factor scales the edge length of the cubic inclusions, thus reducing the stress transfer area between matrix and filler. Fitting the experimental data with the modified Paul model provides reasonable kadh values for PA66, PBT, PP, PE-LD, and BR, which are in line with their surface energies. Further analysis showed that stiffening only occurs if kadh exceeds [Formula: see text] and depends on the ratio of the matrix modulus to the filler modulus. The modified model allows a quick calculation for any particle-filled composite with known matrix modulus EM, filler modulus EF, filler volume content vF, and adhesion factor kadh. Thus, finite element analysis (FEA) simulations of particle-filled polymer parts, as well as materials selection, are significantly eased. FEA of cubic and hexagonal EV arrangements shows that stress distributions within the EV exhibit more shear stress if one deviates from the cubic arrangement. At high filler contents, the assumption that the property of the EV is representative of the whole composite holds only for filler volume contents up to 15 or 20% (corresponding to 30 to 40 weight %). Thus, for the vast majority of commercially available particulate composites, the modified model can be applied.
Furthermore, this indicates that the cube-in-cube approach reaches two limits: (i) the occurrence of increasing shear stresses at filler contents above 20%, due to deviations of the EV arrangement or the spatial filler distribution from a cubic arrangement, and (ii) increasing interaction between particles, with the formation of a particle network within the matrix, violating the EV assumption of homogeneous dispersion.
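For orientation, the classical Paul cube-in-cube expression for perfect adhesion (kadh = 1) can be evaluated directly. The paper's kadh-modified form is not reproduced here, and the example material values below are assumptions chosen for illustration:

```python
def paul_modulus(E_matrix, E_filler, v_filler):
    """Classical Paul cube-in-cube estimate of a particulate composite's
    Young's modulus, assuming perfect filler-matrix adhesion (kadh = 1).

    E_matrix, E_filler: moduli of matrix and filler (same units);
    v_filler: filler volume fraction (0..1).
    """
    m = E_filler / E_matrix
    s = v_filler ** (2.0 / 3.0)  # relative stress-transfer cross-section
    return E_matrix * (1.0 + (m - 1.0) * s) / (1.0 + (m - 1.0) * (s - v_filler))

# Example: a PP-like matrix (1.5 GPa) with a stiff mineral filler (70 GPa)
# at 15 vol% -- within the range where the EV assumption is stated to hold.
E_c = paul_modulus(1.5, 70.0, 0.15)
print(f"Composite modulus: {E_c:.2f} GPa")
```

In the modified model described above, the adhesion factor would scale the stress-transfer cross-section; at vf = 0 the expression correctly reduces to the pure matrix modulus, and the predicted modulus grows with filler content.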
Introduction
(2018)
This handbook describes the processes and success factors of market-oriented university services to the non-academic world, and the processes to integrate these services into teaching. It aims to highlight benchmark examples from Africa and Germany in order to outline motivational factors and influencing aspects, as well as drivers of and barriers to applied university services in developing countries.
Intersectoral collaborations are an integral component of the prevention and control of diseases in a complex health system. On the one hand, One Health (OH) promotes the establishment of intersectoral collaborations for prevention at the human-animal-environment interface. On the other hand, operationalising OH can only be realised through intersectoral collaborations.
This work contributes to broadening the knowledge of the process of operationalising OH by analysing the governance structures behind different initiatives that tackle health problems at the human-animal-environment interface. The cases taken as examples for the analysis are the control of and response to rabies and avian influenza under "classical OH", and the management of floods and droughts for insights into "extended OH". Data from Ghana and India were collected and compared to identify the key elements that enable intersectoral collaboration for OH.
Despite the case studies being heterogeneous in terms of their geographic, economic, social, cultural, and historical contexts, strong similarities were identified in how intersectoral collaborations in OH were initiated, managed, and taken to scale.
The actions documented for rabies prevention and control were historically based on one sector being the leader and implementer of activities, while avian influenza management relied more on intersectoral collaborations with clearly defined sectoral responsibilities. The management of the health impacts of floods and droughts provided a good example of intersectoral collaboration achieved through sectoral integration; however, in the case of Ghana the human health component was only involved at the response stage, while in India there were broader schemes of intersectoral collaboration for prevention, adaptation, and response concerning climate change and disasters.
Internal audits can do more
(2024)
This article shows how the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt e. V., DLR) conducts satisfaction analyses from two perspectives: that of the auditors and that of the management representatives of the audited institutes and facilities. The results feed into the annual audit programme planning, thereby increasing the benefit of internal audits.