With the digital transformation, software systems have become an integral part of our society and economy. In every part of our lives, software systems are increasingly used to, e.g., simplify housework or optimize business processes. All these applications are connected to the Internet, which already comprises millions of software services consumed by billions of people. Applications that handle users and data traffic at such a magnitude must be highly scalable and are therefore denoted as Ultra Large Scale (ULS) systems. Roy Fielding defined one of the first approaches for designing modern ULS software systems. In his doctoral thesis, Fielding introduced the architectural style Representational State Transfer (REST), which forms the theoretical foundation of the web. At present, the web is considered the world's largest ULS system. Due to the large number of users and the significance of software for society and the economy, the security of ULS systems is another crucial quality factor besides high scalability.
Due to the use of fossil fuel resources, environmental problems have been growing steadily. Recent research therefore focuses on the use of environmentally friendly materials from sustainable feedstocks for future fuels, chemicals, fibers and polymers. Lignocellulosic biomass has become the raw material of choice for these new materials. Recently, research has focused on using lignin as a substitute material in many industrial applications. The antiradical and antimicrobial activities of lignin and lignin-based films are both of great interest for applications such as food packaging additives. The DPPH assay was used to determine the antioxidant activity of Kraft lignin compared to Organosolv lignins from different biomasses. The purification procedure of Kraft lignin showed that double-fold selective extraction is the most efficient, as confirmed by UV-Vis, FTIR, HSQC, 31P NMR, SEC, and XRD. The antioxidant capacity was discussed with regard to the biomass source, pulping process, and degree of purification. Lignin obtained from industrial black liquor was compared with beech wood samples: the biomass source influences the DPPH inhibition (softwood > grass) and the TPC (softwood < grass). The DPPH inhibition is affected by the polarity of the extraction solvent, following the trend ethanol > diethyl ether > acetone. Reduced polydispersity has a positive influence on the DPPH inhibition. Storage decreased the DPPH inhibition but increased the TPC values. The DPPH assay was also used to discuss the antiradical activity of HPMC/lignin and HPMC/lignin/chitosan films. In both binary (HPMC/lignin) and ternary (HPMC/lignin/chitosan) systems, the 5% addition showed the highest activity and the highest addition the lowest. Both scavenging activity and antimicrobial activity depend on the biomass source: Organosolv of softwood > Kraft of softwood > Organosolv of grass.
Lignins and lignin-containing films showed high antimicrobial activities against Gram-positive and Gram-negative bacteria at 35 °C and at low temperatures (0-7 °C). Purification of Kraft lignin has a negative effect on the antimicrobial activity, while storage has a positive effect. Lignin leaching from the produced films affected the activity positively, and the addition of chitosan enhances the activity against both Gram-positive and Gram-negative bacteria. Testing the films against food spoilage bacteria that grow at low temperatures revealed that the 30% addition in the HPMC/L1 film was active against both B. thermosphacta and P. fluorescens, while L5 was active only against B. thermosphacta. In HPMC/lignin/chitosan films, the 5% addition exhibited activity against both food spoilage bacteria.
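The radical scavenging activity discussed above is conventionally expressed as percent inhibition of the DPPH radical's absorbance. A minimal sketch of this standard calculation (the absorbance readings are hypothetical, not values from the thesis):

```python
def dpph_inhibition(a_control: float, a_sample: float) -> float:
    """Percent inhibition of the DPPH radical's absorbance (~517 nm).

    a_control: absorbance of the DPPH solution without antioxidant
    a_sample:  absorbance after incubation with the lignin sample
    """
    return (a_control - a_sample) / a_control * 100.0

# Illustrative readings: a drop from 0.80 to 0.26 means 67.5% inhibition.
print(dpph_inhibition(0.80, 0.26))
```

A higher inhibition percentage indicates stronger radical scavenging, which is how the softwood > grass trend above would be quantified.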
For a sustainable development, the electricity sector needs to be decarbonized. In 2017, only 54% of West African households had access to the electrical grid. Renewable sources should therefore play a major role in the development of the power sector in West Africa. Above all, solar power shows the highest potential among renewable energy sources. However, it is highly variable, depending on the atmospheric conditions. This study addresses the challenges for a solar-based power system in West Africa by analyzing the atmospheric variability of solar power. For this purpose, two aspects are investigated. In the first part, the daily power reduction due to atmospheric aerosols is quantified for different solar power technologies. Meteorological data at six ground-based stations are used to model photovoltaic and parabolic trough power during all mostly clear-sky days in 2006, combining a radiative transfer model with a solar power model. The results show that the reduction due to aerosols can be up to 79% for photovoltaic and up to 100% for parabolic trough power plants during a major dust outbreak. The frequent dust outbreaks occurring in West Africa would cause frequent blackouts if sufficient storage capacities are not available. On average, aerosols reduce the daily power yields by 13% to 22% for photovoltaic and by 22% to 37% for parabolic troughs. For the second part, the long-term atmospheric variability and trends of solar irradiance are analyzed and their impact on photovoltaic yields is examined for West Africa. Based on a 35-year satellite data record (1983-2017), the temporal and spatial variability and the general trend are depicted for global and direct horizontal irradiances. Furthermore, photovoltaic yields are calculated on a daily basis. They show a strong meridional gradient, with the highest values of 5 kWh/kWp in the Sahara and Sahel zone and the lowest values (around 4 kWh/kWp) in southern West Africa.
The temporal variability is highest in southern West Africa (up to around 18%) and lowest in the Sahara (around 4.5%). This implies the need for a North-South grid development in order to feed the increasing demand on the highly populated coast with solar power from the northern parts of West Africa. Additionally, global irradiances show a long-term positive trend (up to +5 W/m²/decade) in the Sahara and a negative trend (up to -5 W/m²/decade) in southern West Africa. If this trend continues, the spatial differences in solar power potential will increase in the future. This thesis provides a better understanding of the impact of atmospheric variability on solar power in a challenging environment like West Africa, which is characterized by the strong influence of the African monsoon. In particular, the importance of aerosols is pointed out. Furthermore, long-term changes of irradiance are characterized with regard to their implications for photovoltaic power.
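The daily specific yields quoted above (kWh per installed kWp) can be approximated with a simple performance-ratio model; the performance ratio of 0.8 and the irradiance values are illustrative assumptions for this sketch, not the model used in the study:

```python
def daily_pv_yield(ghi_hourly_wm2, performance_ratio=0.8):
    """Daily specific PV yield in kWh/kWp from hourly global irradiance.

    Under standard test conditions a PV array delivers its nominal power
    at 1000 W/m^2, so the specific yield is the daily irradiation sum
    (kWh/m^2) scaled by a performance ratio lumping temperature, soiling
    and inverter losses (0.8 is an assumed, typical value).
    """
    irradiation_kwh_m2 = sum(ghi_hourly_wm2) / 1000.0
    return performance_ratio * irradiation_kwh_m2

# Twelve daylight hours at a constant 500 W/m^2 give 4.8 kWh/kWp:
print(daily_pv_yield([500] * 12))
```

Feeding such a model with aerosol-attenuated versus aerosol-free irradiance series is the basic mechanism by which daily reduction percentages like those above can be derived.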
Software offshoring has been established as an important business strategy over the last decade. While research on such forms of Global Software Development (GSD) has mainly focused on the situation of large enterprises, small enterprises are increasingly engaging in offshoring, too. Representing the biggest share of the German software industry, small companies are known to be important innovators and market pioneers. They often regard their flexibility and customer-orientation as core competitive advantages. Unlike large corporations, their small size allows them to adopt software development approaches that are characterized by a high agility and flat hierarchies. At the same time, their distinct strategies make it unlikely that they can simply adopt management strategies that were developed for larger companies.
Flexible development approaches like the ones preferred by small corporations have proven to be problematic in the context of offshoring, as their dependency on constant communication is strongly affected by the various barriers of international cooperation between companies. Cooperating closely across company borders, in different time zones, and in culturally diverse teams poses complex obstacles for flexible management approaches. It is still a matter of discussion in fields like Software Engineering and Computer Supported Cooperative Work how these obstacles can be tackled and how they affect companies in the long term. There is agreement, however, that a more detailed understanding of distributed software development practices is needed in order to arrive at feasible technological and organizational solutions.
This dissertation presents results from two ethnographically informed case studies of software offshoring in small German enterprises. By adopting Anselm Strauss' concept of articulation work, we aim to deepen the understanding of managing distributed software development in flexible, customer-oriented organizations. In doing so, we show how practices of coordinating inter-organizational software development are closely related to aspects of organizational learning in small enterprises. By means of interviews with developers and project managers from both parties of the cooperation, we not only take into account the multiple perspectives on the cooperation, but also include the socio-cultural background of international software development projects in our analysis.
Optimization plays an essential role in industrial design, but is not limited to the minimization of a simple function, such as cost or strength. These tools are also used in conceptual phases to better understand what is possible. To support this exploration we focus on Quality Diversity (QD) algorithms, which produce sets of varied, high-performing solutions. These techniques often require the evaluation of millions of solutions, making them impractical in design cases. In this thesis we propose methods to radically improve the data-efficiency of QD with machine learning, enabling its application to design. In our first contribution, we develop a method of modeling the performance of evolved neural networks used for control and design. The structures of these networks grow and change, making them difficult to model, but with a new method we are able to estimate their performance based on their heredity, improving data-efficiency several-fold. In our second contribution we combine model-based optimization with MAP-Elites, a QD algorithm. A model of performance is created from known designs, and MAP-Elites creates a new set of designs using this approximation. A subset of these designs is then evaluated to improve the model, and the process repeats. We show that this approach improves the efficiency of MAP-Elites by orders of magnitude. Our third contribution integrates generative models into MAP-Elites to learn domain-specific encodings. A variational autoencoder is trained on the solutions produced by MAP-Elites, capturing the common "recipe" for high performance. This learned encoding can then be reused by other algorithms for rapid optimization, including MAP-Elites itself. Although the focus of our vision is design, throughout this thesis we also examine applications in other fields, such as robotics. These advances are not exclusive to design, but serve as foundational work on the integration of QD and machine learning.
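The core loop of MAP-Elites, the QD algorithm referenced above, is compact enough to sketch. The following is a minimal, dependency-free illustration; the one-dimensional toy behaviour space, mutation operator, and test function are assumptions for the example, not the thesis' domains:

```python
import random

def map_elites(evaluate, random_solution, mutate, n_bins, iterations):
    """Minimal MAP-Elites: keep the best solution found in each niche
    of a discretized behaviour space.

    evaluate(x) must return (fitness, behaviour) with behaviour in [0, 1).
    Returns the archive: niche index -> (fitness, solution).
    """
    archive = {}
    for _ in range(iterations):
        if archive:
            # Select a random elite and perturb it.
            _, parent = random.choice(list(archive.values()))
            candidate = mutate(parent)
        else:
            candidate = random_solution()
        fitness, behaviour = evaluate(candidate)
        niche = min(int(behaviour * n_bins), n_bins - 1)
        # Replace the current elite only if the candidate performs better.
        if niche not in archive or fitness > archive[niche][0]:
            archive[niche] = (fitness, candidate)
    return archive

# Toy domain: behaviour is x itself, fitness peaks at x = 0.5.
random.seed(0)
archive = map_elites(
    evaluate=lambda x: (-(x - 0.5) ** 2, x),
    random_solution=lambda: random.random(),
    mutate=lambda x: min(max(x + random.gauss(0, 0.1), 0.0), 0.999),
    n_bins=10,
    iterations=2000,
)
print(sorted(archive))  # niches filled with varied, high-performing elites
```

The expense noted in the abstract is visible here: every iteration costs one evaluation, which is exactly what surrogate models and learned encodings are introduced to reduce.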
Discrimination and classification of eight strains of meat spoilage microorganisms commonly found in poultry meat were successfully carried out using two dispersive Raman spectrometers (a Raman Microscope and a Portable Fiber-Optic system) in combination with chemometric methods. Principal Component Analysis (PCA) and Multi-Class Support Vector Machines (MC-SVM) were applied to develop discrimination and classification models. These models were validated using data sets that were successfully assigned to the correct bacterial genera and even to the correct strain. The discrimination of bacteria down to the strain level was performed on the pre-processed spectral data using a 3-stage model based on PCA. The spectral features and differences among the species on which the discrimination was based were clarified through the PCA loadings. In MC-SVM, the pre-processed spectral data were subjected to PCA and used to build a classification model. When using the first two principal components, the accuracy of the MC-SVM model was 97.64% and 93.23% for the validation data collected by the Raman Microscope and the Portable Fiber-Optic Raman system, respectively. The accuracy reached 100% for the validation data when using the first eight and ten PCs from the data collected by the Raman Microscope and the Portable Fiber-Optic Raman system, respectively. The results reflect the strong discriminative power and high performance of the developed models, the suitability of the pre-processing method used in this study, and the fact that the lower accuracy of the Portable Fiber-Optic Raman system does not adversely affect the discriminative power of the developed models.
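The chemometric pipeline (dimensionality reduction with PCA, then classification in score space) can be sketched in a few lines. To keep the example self-contained, a nearest-centroid classifier stands in for the MC-SVM and the "spectra" are synthetic Gaussian bands; both are assumptions of this sketch, not the thesis setup:

```python
import numpy as np

def fit_pca(X, n_components):
    """PCA via SVD of the mean-centred data matrix (rows = spectra)."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def pca_scores(X, mean, components):
    """Project spectra onto the retained principal components."""
    return (X - mean) @ components.T

# Synthetic "spectra": two classes with different band positions plus noise.
rng = np.random.default_rng(42)
grid = np.linspace(0, 1, 200)
band_a = np.exp(-((grid - 0.3) / 0.05) ** 2)
band_b = np.exp(-((grid - 0.7) / 0.05) ** 2)
X_a = band_a + 0.05 * rng.standard_normal((30, 200))
X_b = band_b + 0.05 * rng.standard_normal((30, 200))
X_train = np.vstack([X_a[:25], X_b[:25]])
y_train = np.array([0] * 25 + [1] * 25)

mean, comps = fit_pca(X_train, n_components=2)
scores = pca_scores(X_train, mean, comps)
centroids = np.array([scores[y_train == c].mean(axis=0) for c in (0, 1)])

def classify(x):
    """Assign a spectrum to the class with the nearest centroid in PC space."""
    s = pca_scores(x[None, :], mean, comps)[0]
    return int(np.argmin(np.linalg.norm(centroids - s, axis=1)))

# Held-out spectra are assigned to the correct class:
print([classify(x) for x in X_a[25:]], [classify(x) for x in X_b[25:]])
```

The number of retained components plays the same role as the "first eight and ten PCs" mentioned above: more components capture finer spectral differences at the risk of fitting noise.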
In forensic DNA profiling, the occurrence of complex mixed profiles is currently a common issue. Cases involving intimate swabs or skin flake tape liftings are prone to mixed profiles because more than one donor contributes to a DNA sample. DNA profiling of single spermatozoa and skin flakes could ideally overcome the problems associated with mixed profiles. However, PCR is not sensitive enough to generate STR-based DNA profiles from single cells. Moreover, high-quality intact DNA is required, but is not always available in skin flakes due to degradation. Additionally, single skin flakes are difficult to discriminate from other similar-looking particles on the tape liftings used to secure DNA samples from evidence. The main purpose of this study was to develop a method that enables DNA profiling of single sperm cells and skin flakes. After evaluating multiple whole genome amplification (WGA) protocols, REPLI-g Single Cell WGA was selected due to its suitability for the pre-amplification of template DNA. Micromanipulation was used to isolate single spermatozoa. Micromanipulation in combination with REPLI-g Single Cell WGA resulted in successful DNA profiling of single spermatozoa using autosomal STRs as well as X- and Y-chromosomal STRs. The single-spermatozoon DNA profiling method described in this thesis was successfully used to identify male contributors from mock intimate swabs containing a mixture of semen from multiple male contributors. Different dyes were analysed to develop a staining method that discriminates skin flakes from other particles, including particles from hair cosmetic products. Of all dyes tested, Orange G was the only one that successfully discriminated skin flakes from hair product particles. In addition, an alkaline lysis protocol was developed that allows PCR to be carried out directly on the lysates of single skin flakes. Furthermore, REPLI-g Single Cell WGA was tested on single skin flakes.
In contrast to single spermatozoa, REPLI-g Single Cell WGA was not successful in DNA profiling of single skin flakes. The single skin flake DNA profiling method described in this thesis was successfully used to correctly identify contributors from mock mixed DNA evidence. Additionally, a small-amplicon-based NGS method was tested on single skin flakes. Compared to the PCR and CE approach, the small-amplicon-based NGS method improved DNA profiling of single skin flakes, giving a significant increase in allele recovery. In conclusion, this study shows that mixtures can be circumvented by DNA profiling of single spermatozoa using micromanipulation and WGA. Furthermore, DNA profiling of single skin flakes has been improved by staining tape liftings with Orange G, alkaline lysis, direct PCR, and a small-amplicon-based NGS approach. Nonetheless, future work is required to assess the performance of the single-spermatozoon method on mock swabs with more diluted semen. Commercially available NGS kits should also be tested on single skin flakes and compared with the in-house NGS method.
Due to the popularity of the Internet and the networked services that it facilitates, networked devices have become increasingly common in both the workplace and everyday life in recent years—following the trail blazed by smartphones. The data provided by these devices allow for the creation of rich user profiles. As a result, the collection, processing and exchange of such personal data have become drivers of economic growth. History shows that the adoption of new technologies is likely to influence both individual and societal concepts of privacy. Research into privacy has therefore been confronted with continuously changing concepts due to technological progress. From a legal perspective, privacy laws that reflect social values are sought. Privacy enhancing technologies are developed or adapted to take account of technological development. Organizations must also identify protective measures that are effective in terms of scalability and automation. Similarly, research is being conducted from the perspective of Human-Computer Interaction (HCI) to explore design spaces that empower individuals to manage their protection needs with regard to novel data, which they may perceive as sensitive. Taking such an HCI perspective with regard to understanding privacy management on the Internet of Things (IoT), this research mainly focuses on three interrelated goals across the fields of application: 1. Exploring and analyzing how people make sense of data, especially when managing privacy and data disclosure; 2. Identifying, framing and evaluating potential resources for designing sense-making processes; and 3. Exploring the fitness of the identified concepts for inclusion in legal and technical perspectives on supporting decisions regarding privacy on the IoT. Although this work's point of departure is the HCI perspective, it emphasizes the importance of the interrelationships among seemingly independent perspectives. 
Their interdependence is therefore also emphasized and taken into account by subscribing to a user-centered design process throughout this study. More specifically, this thesis adopts a design case study approach, which makes it possible to conduct full user-centered design lifecycles in a concrete application case with participants in the context of everyday life. Based on this approach, it was possible to investigate several currently relevant domains of the IoT, namely smart metering, smartphones, smart homes and connected cars. The results show that the participants were less concerned about (raw) data than about the information that could potentially be derived from it. Against the background of the constant collection of highly technical and abstract data, whose content only becomes visible through the application of complex algorithms, this study indicates that people should learn to explore and understand these data flexibly, and provides insights into how to design for this aim. From the point of view of designing usable privacy protection measures, the information provided to users about data disclosure should focus on its consequences for users' environments and lives. A related concept from law is "informed consent," which I propose should be developed further in order to implement usable mechanisms for individual privacy protection in the era of the IoT. Finally, this thesis demonstrates how research on HCI can be methodologically embedded in a regulative process that informs both the development of technology and the drafting of legislation.
Solving differential-algebraic equations (DAEs) efficiently by means of appropriate numerical schemes for time-integration is an ongoing topic in applied mathematics. Especially for the large systems that occur in many fields of practical application, efficient computation becomes relevant; corresponding examples arise when simulating network structures that describe the transport of fluid and gas, or electrical circuits. Due to the stiffness properties of DAEs, time-integration of such problems generally demands implicit strategies. Among the schemes that prove to be an adequate choice are linearly implicit Runge-Kutta methods in the form of Rosenbrock-Wanner (ROW) schemes. Compared to fully implicit methods, they are easy to implement and avoid the solution of non-linear equations by including Jacobian information directly in their formulation. However, Jacobian calculations are costly, so the need to compute the exact Jacobian at every successful time step proves to be a considerable drawback. To overcome this drawback, a ROW-type method is introduced that allows for non-exact Jacobian entries when solving semi-explicit DAEs of index one. The resulting scheme thus makes it possible to exploit several strategies for saving computational effort, such as partial explicit integration of non-stiff components, more advantageous sparse Jacobian structures, or time-lagged Jacobian information. In fact, due to the property of allowing non-exact Jacobian expressions, the given scheme can be interpreted as a generalized ROW-type method for DAEs, since it covers many different ROW-type schemes known from the literature. To derive the order conditions of the introduced ROW-type method, a theory is developed that allows the occurring differentials and coefficients to be identified graphically by means of rooted trees.
Rooted trees for describing numerical methods were originally introduced by J.C. Butcher. They significantly simplify the determination and definition of relevant characteristics because they allow straightforward procedures to be applied. The theory presented here combines strategies used to represent ROW-type methods with exact Jacobian for DAEs and ROW-type methods with non-exact Jacobian for ODEs. For this purpose, new types of vertices are introduced in order to describe the occurring non-exact elementary differentials completely. The resulting theory thus automatically comprises the relevant approaches known from the literature. As a consequence, it makes it possible to recover the order conditions of the familiar methods covered and to identify new conditions. With the theory developed, new sets of coefficients are derived that realize the introduced ROW-type method up to orders two and three. Some of them are constructed on the basis of methods known from the literature that satisfy additional conditions for avoiding order reduction. It is shown that these methods can be improved by means of the newly derived order conditions without increasing the number of internal stages. Convergence of the resulting methods is analyzed on several academic test problems. The results verify the theory and the order conditions found, as only schemes satisfying the predicted order conditions preserve their order when non-exact Jacobian expressions are used.
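The simplest member of the ROW family, the linearly implicit Euler scheme, illustrates how Jacobian information enters the formulation and why a non-exact Jacobian J remains admissible. This is a generic textbook sketch for stiff ODEs, not the DAE scheme developed in the thesis:

```python
import numpy as np

def row1_step(f, J, y, h, gamma=1.0):
    """One step of the linearly implicit (Rosenbrock) Euler scheme:

        (I - h*gamma*J) k = f(y),    y_next = y + h*k.

    Only one linear system is solved per step; no Newton iteration is
    needed. J may be an approximation of the exact Jacobian df/dy, which
    affects stability and efficiency but not the basic consistency.
    """
    A = np.eye(y.size) - h * gamma * J
    k = np.linalg.solve(A, f(y))
    return y + h * k

# Stiff test problem y' = -50 y with a deliberately non-exact Jacobian.
f = lambda y: -50.0 * y
J_inexact = np.array([[-45.0]])  # perturbed df/dy (exact value: -50)
y = np.array([1.0])
h, steps = 0.01, 100
for _ in range(steps):
    y = row1_step(f, J_inexact, y, h)
print(y[0])  # decays towards 0, stable despite the inexact Jacobian
```

An explicit Euler step with the same h = 0.01 would oscillate and diverge for this problem (|1 - 50h| > 1 is avoided only marginally; for stiffer λ it fails outright), which is why implicit strategies are demanded above.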
The present thesis elucidates the development of (i) a series of small-molecule inhibitors that react in a covalent-irreversible manner with the targeted proteases and (ii) a fluorescently labeled activity-based probe as a pharmacological tool compound for investigating specific functions of these enzymes in vitro. The rational design, organic synthesis, and quantitative structure-activity relationships are described extensively.
Globalisation and increasing international trade have for years raised the number of introductions of foreign species and the risk posed by invasive pests. While native species have adapted to their habitat over many years and generations, invasive intruders often possess characteristics that make them superior to native species. Because of this, and because of a lack of natural enemies, they have the potential to decimate or completely displace native species; furthermore, as vectors they can introduce pathogens or nematodes with high damage potential. The measures available to the local plant protection services to combat invasive species are limited: they are confined to felling infested trees or plants and carrying out regular controls within the infested area. A spread from single infestations can thereby be prevented, but undetected infestations can spread unimpeded, which points to the main challenge: the detection of the species, both of infestations in the open and of single animals on their path of introduction. There is only little research activity concerning the development of adequate new detection systems for invasive species. In other fields, such as the detection of explosives or narcotics, research activities date back more than a decade, and consequently detection systems are available that are, for example, used for explosive detection at airports. The detection principle is based on the chemistry of these substances.
3D time of flight distance measurement with custom solid state image sensors in CMOS, CCD technology
(2000)
Since we are living in a three-dimensional world, an adequate description of our environment for many applications includes the relative position and motion of the different objects in a scene. Nature has satisfied this need for spatial perception by providing most animals with at least two eyes. This stereo vision ability is the basis that allows the brain to calculate qualitative depth information of the observed scene. Another important parameter in the complex human depth perception is our experience and memory. Although it is far more difficult, a human being is even able to recognize depth information without stereo vision. For example, we can qualitatively deduce the 3D scene from most photos, assuming that the photos contain known objects [COR]. The acquisition, storage, processing and comparison of such a huge amount of information requires enormous computational power - with which nature fortunately provides us. Therefore, for a technical implementation, one should resort to other simpler measurement principles. Additionally, the qualitative distance estimates of such knowledge-based passive vision systems can be replaced by accurate range measurements.
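The measurement principle behind such time-of-flight range sensors is simple: light travels to the scene and back, so the distance follows from half the round-trip time. A minimal illustration of this relation:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance from a round-trip time-of-flight measurement: d = c * t / 2."""
    return C * round_trip_time_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m. The relation also shows
# why millimetre accuracy demands picosecond-level timing resolution.
print(tof_distance(20e-9))
```

This directness is exactly the "simpler measurement principle" contrasted above with knowledge-based passive stereo vision: range comes from a single physical quantity instead of image understanding.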
Evaluation and Optimization of IEEE802.11 multi-hop Backhaul Networks with Directional Antennas
(2020)
A major problem for rural areas is the lack of access to affordable broadband Internet connections. In these areas distances are large, and digging a cable into the ground is extremely expensive considering the small number of potential customers at the end of that cable. This leads to a digital divide, where urban areas enjoy a high-quality service at low cost, while rural areas suffer from the reverse.
This work is dedicated to an alternative technical approach aiming to reduce the cost for Internet Service Providers in rural areas: WiFi-based Long Distance networks. A set of significant contributions to technology-related aspects of WiFi-based Long Distance networks is described in three different fields: propagation on long-distance Wi-Fi links, MAC-layer scheduling, and interference modeling and channel assignment with directional antennas.
For each field, the author surveys and discusses the state of the art, then derives research questions and tackles several open issues to develop these networks further towards a suitable technology for the backhaul segment.
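For the propagation aspect, the starting point of any long-distance link budget is the free-space path loss. A small sketch using the standard Friis-derived formula (d in km, f in MHz; the 10 km link at 5 GHz is an illustrative example, not one taken from the thesis):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 32.44,
    with distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# A 10 km link at 5 GHz (5000 MHz) loses about 126 dB in free space,
# which is why high-gain directional antennas are essential on such links.
print(round(fspl_db(10.0, 5000.0), 1))
```

Doubling the distance adds 6 dB of loss, so every extra hop or kilometre must be recovered through antenna gain, transmit power, or reduced data rates.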
In this thesis, unique administrative data, a substantial follow-up period, and advanced statistical measures for handling confounding have been utilized in order to provide new and informative evidence on the effects of vocational rehabilitation programs on work participation outcomes in Germany. While re-affirming the important role of micro-level determinants, the present study provides an extensive example of the individual and fiscal effects that are possible through meaningful vocational rehabilitation measures. The analysis showed that the principal objective, namely to improve participation in employment, was generally achieved. Contrary to the common misconception that "off-the-job training" is relatively ineffective, this thesis provides an empirical example of the positive impact of these programs.
Computer graphics research strives to synthesize images of a high visual realism that are indistinguishable from real visual experiences. While modern image synthesis approaches can create digital images of astonishing complexity and beauty, processing resources remain a limiting factor. Rendering efficiency is a central challenge here, involving a trade-off between visual fidelity and interactivity. For that reason, there is still a fundamental difference between the perception of the physical world and computer-generated imagery. At the same time, advances in display technologies drive the development of novel display devices: dynamic range, pixel densities, and refresh rates are constantly increasing, and display systems address an ever larger visual field by covering a wider field of view, due either to their size or to being head-mounted. Current research prototypes range from stereo and multi-view systems and head-mounted devices with adaptable lenses up to retinal projection and lightfield/holographic displays. Computer graphics has to keep pace, as driving these devices presents us with immense challenges, most of which are currently unsolved. Fortunately, the human visual system has certain limitations, which means that providing the highest possible visual quality is not always necessary. Visual input passes through the eye's optics, is filtered, and is processed at higher-level structures in the brain. Knowledge of these processes helps in designing novel rendering approaches that allow images to be created at higher quality and within a reduced time frame. This thesis presents state-of-the-art research and models that exploit the limitations of perception in order to increase visual quality while also reducing workload - a concept we call perception-driven rendering.
This research results in several practical rendering approaches that allow some of the fundamental challenges of computer graphics to be tackled. By using different tracking hardware, display systems, and head-mounted devices, we show the potential of each of the presented systems. The capturing of specific processes of the human visual system can be improved by combining multiple measurements using machine learning techniques. Different sampling, filtering, and reconstruction techniques aid the visual quality of the synthesized images. An in-depth evaluation of the presented systems including benchmarks, comparative examination with image metrics as well as user studies and experiments demonstrated that the methods introduced are visually superior or on the same qualitative level as ground truth, whilst having a significantly reduced computational complexity.
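One way perceptual limitations translate into savings is that visual acuity falls off with retinal eccentricity, so peripheral image regions can be shaded at a lower rate. The linear minimum-angle-of-resolution (MAR) acuity model below is a common simplification with illustrative parameter values; it is not the specific model used in this thesis:

```python
def relative_shading_rate(eccentricity_deg: float,
                          mar0: float = 1.0 / 60.0,
                          slope: float = 0.022) -> float:
    """Relative shading rate permitted at a given retinal eccentricity,
    derived from a linear acuity model MAR(e) = mar0 + slope * e.

    mar0 = 1 arcmin (foveal acuity, in degrees) and slope = 0.022 are
    assumed, commonly cited values. Returns 1.0 at the fovea and
    decreases towards the periphery.
    """
    mar = mar0 + slope * eccentricity_deg
    return mar0 / mar

# The permissible shading rate drops steeply away from the gaze point:
for e in (0, 10, 30):
    print(e, round(relative_shading_rate(e), 3))
```

Driving sample density by such a falloff is the basic mechanism behind foveated rendering, one instance of the perception-driven rendering concept described above.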
Process-dependent thermo-mechanical viscoelastic properties and the corresponding morphology of HDPE extrusion blow molded (EBM) parts were investigated. Evaluation of bulk data showed that flow direction, draw ratio, and mold temperature influence the viscoelastic behavior significantly in certain temperature ranges. Flow induced orientations due to higher draw ratio and higher mold temperature lead to higher crystallinities. To determine the local viscoelastic properties, a new microindentation system was developed by merging indentation with dynamic mechanical analysis. The local process-structure-property relationship of EBM parts showed that the cross-sectional temperature distribution is clearly reflected by local crystallinities and local complex moduli. Additionally, a model to calculate three-dimensional anisotropic coefficients of thermal expansion as a function of the process dependent crystallinity was developed based on an elementary volume unit cell with stacked layers of amorphous phase and crystalline lamellae. Good agreement of the predicted thermal expansion coefficients with measured ones was found up to a temperature of 70 °C.
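The spirit of a crystallinity-dependent thermal expansion model can be conveyed with a simple volume-fraction rule of mixtures. The thesis model is anisotropic and based on a stacked-layer unit cell; the scalar version and the CTE values below are illustrative simplifications:

```python
def cte_rule_of_mixtures(x_c: float, alpha_c: float, alpha_a: float) -> float:
    """Effective linear CTE of a two-phase material by a volume-fraction
    rule of mixtures: alpha = x_c*alpha_c + (1 - x_c)*alpha_a.

    x_c: crystalline volume fraction; alpha_c, alpha_a: CTEs (1/K) of the
    crystalline lamellae and the amorphous phase. The amorphous CTE is
    typically larger, so higher crystallinity lowers effective expansion.
    """
    return x_c * alpha_c + (1.0 - x_c) * alpha_a

# Hypothetical values: 60% crystallinity, alpha_c = 0.8e-4, alpha_a = 2.0e-4 1/K:
print(cte_rule_of_mixtures(0.6, 0.8e-4, 2.0e-4))
```

This dependence on x_c is why process parameters that raise local crystallinity (draw ratio, mold temperature) also shift the predicted local thermal expansion.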
The initially large number of variants is reduced by applying custom variant annotation and filtering procedures. This requires complex software toolchains to be set up and data sources to be integrated. Furthermore, increasing study sizes require ever higher efforts to manage datasets in a multi-user and multi-institution environment. When the cause of a disease or phenotype is unknown, numerous iterations of respecification and refinement of filter strategies are to be expected. Data analysis support during this phase is fundamental, because users with limited computer literacy cannot adequately handle the large volume of data. Constant feedback and communication are necessary when filter parameters are adjusted or the study grows with additional samples. Consequently, variant filtering and interpretation become time-consuming and hinder a dynamic and explorative data analysis by experts.
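Such an iterative filter strategy can be pictured as a small, re-parameterizable predicate over annotated variants. This is a hedged sketch: the annotation fields (`pop_freq`, `qual`, `effect`) and the thresholds are illustrative placeholders, not the schema or toolchain used in the thesis.

```python
# Hedged sketch of an iterative variant filter; field names and
# thresholds are illustrative, not the thesis's actual schema.

def filter_variants(variants, max_population_freq=0.01,
                    min_quality=30,
                    effects=("missense", "stop_gained")):
    """Keep rare, well-supported variants of a selected effect class."""
    return [v for v in variants
            if v["pop_freq"] <= max_population_freq
            and v["qual"] >= min_quality
            and v["effect"] in effects]

variants = [
    {"id": "rs1", "pop_freq": 0.20,  "qual": 50, "effect": "missense"},
    {"id": "v2",  "pop_freq": 0.001, "qual": 45, "effect": "stop_gained"},
    {"id": "v3",  "pop_freq": 0.002, "qual": 12, "effect": "missense"},
]
# Refining the strategy is just re-running with new parameters,
# which is where interactive analysis support pays off.
shortlist = filter_variants(variants)                # only "v2" survives
relaxed = filter_variants(variants, min_quality=10)  # "v2" and "v3"
```

Each re-run with adjusted thresholds corresponds to one of the respecification iterations described above, which is why tool support for quick, transparent re-filtering matters so much in practice.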
In the field of domestic service robots, recovery from faults is crucial to promote user acceptance. In this context, this work focuses on specific faults which arise from the interaction of a robot with its real-world environment. Even a well-modelled robot may fail to perform its tasks successfully due to external faults, which occur because of an infinite number of unforeseeable and unmodelled situations. By investigating the most frequent failures in typical scenarios observed in real-world demonstrations and competitions with the autonomous service robots Care-O-Bot III and youBot, we identified four fault classes caused by disturbances, imperfect perception, inadequate planning operators, or the chaining of action sequences. This thesis then presents two approaches to handle external faults caused by insufficient knowledge about the preconditions of planning operators. The first approach reasons about detected external faults using knowledge of naive physics. The naive physics knowledge is represented by the physical properties of objects, which are formalized in a logical framework. The proposed approach applies a qualitative version of physical laws to these properties in order to reason. By interpreting the reasoning results, the robot identifies information about the situations which can cause the fault. Applying this approach to simple manipulation tasks like picking and placing objects shows that naive physics holds great potential for reasoning about unknown external faults in robotics. The second approach acquires missing knowledge about the execution of an action through learning by experimentation. First, it investigates a representation of execution-specific knowledge that can be learned for one particular situation and reused in situations which deviate from the original. The combination of symbolic and geometric models allows us to represent action execution knowledge effectively.
This representation is called an action execution model (AEM) here. The approach provides a learning strategy which uses a physical simulation to generate the training data for learning both the symbolic and the geometric aspects of the model. The experimental analysis, performed on two physical robots, shows that an AEM can reliably describe execution-specific knowledge and thereby serve as a potential model for avoiding the occurrence of external faults.
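Applying qualitative physical laws to symbolic object properties can be pictured as simple rule checks. The sketch below is our own illustration of the idea; the property names and rules are hypothetical and do not reproduce the logical formalism developed in the thesis.

```python
# Hedged sketch: qualitative naive-physics checks for a "place object"
# action. Object/surface properties and rules are our own illustration,
# not the thesis's logical framework.

def placement_fault_hypotheses(obj, surface):
    """Return naive-physics explanations for a failed place action."""
    hypotheses = []
    # Support law: a resting object needs a flat supporting surface.
    if not surface["flat"]:
        hypotheses.append("surface not flat: object cannot rest stably")
    # Support area: the free area must accommodate the object footprint.
    if obj["footprint"] > surface["free_area"]:
        hypotheses.append("support area too small for object footprint")
    # Qualitative stability: a tall object on a narrow base tends to tip.
    if obj["height"] > 3 * obj["base_width"]:
        hypotheses.append("object likely to tip: high centre of mass")
    return hypotheses

# A tall bottle placed on a cluttered table yields two fault hypotheses,
# which the robot can use to explain an otherwise opaque failure.
bottle = {"footprint": 0.04, "height": 0.30, "base_width": 0.05}
table = {"flat": True, "free_area": 0.01}
faults = placement_fault_hypotheses(bottle, table)
```

The point of such qualitative rules is that they require no precise dynamics model: coarse symbolic properties are enough to generate plausible explanations for an unforeseen external fault.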
The design of digital circuits that are efficient in terms of low power has become a very challenging issue. For this reason, low-power digital circuit design is a topic addressed in electrical and computer engineering curricula, but it also requires practical experiments in a laboratory. This PhD research investigates a novel approach, the low-power design laboratory, by developing a new technical and pedagogical system. The low-power design laboratory system is composed of two types of laboratories: the on-site (hands-on) laboratory and the remote laboratory. It has been developed at the Bonn-Rhein-Sieg University of Applied Sciences to teach low-power techniques in the laboratory. Additionally, this thesis suggests how the learning objectives can be complemented by developing a remote system in order to improve the teaching of low-power digital circuit design. This laboratory system enables online experiments that are performed on physical instruments and yield real data via the internet. The laboratory experiments use a Field Programmable Gate Array (FPGA) as the platform on which students implement their circuits, and use image processing as an application for teaching low-power techniques.
This thesis presents the instructions for the low-power design experiments, which follow a top-down hierarchical design methodology. The engineering student designs his/her algorithm at a high level of abstraction, while the experimental results are obtained and measured at a low level (hardware), so that more information, such as the specification, latency, thermal effects, and technology used, is available to correctly estimate the power dissipation. The power dissipation of a digital system is influenced by the specification, the design, the technology used, and the operating temperature. Digital circuit designers can observe the most influential factors in power dissipation during the laboratory exercises in the on-site system and then use the remote system to investigate the remaining factors. Furthermore, the remote system has obvious benefits: it develops learning outcomes, facilitates new teaching methods, reduces costs and maintenance effort, saves instructor time and simplifies instructors' tasks, facilitates equipment sharing, improves reliability, and provides flexibility in using the laboratories.
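Among the factors students can observe, supply voltage and clock frequency dominate dynamic power in CMOS circuits, which to first order follows P = α·C·V²·f. The sketch below illustrates this standard textbook model with made-up parameter values; it is not taken from the thesis's experiments.

```python
# First-order CMOS dynamic power model: P = activity * C * Vdd^2 * f.
# The parameter values below are illustrative, not measured lab data.

def dynamic_power(activity, switched_capacitance_f, vdd_v, freq_hz):
    """Estimate dynamic power (W) of a CMOS digital circuit."""
    return activity * switched_capacitance_f * vdd_v ** 2 * freq_hz

# Example: 10% switching activity, 1 nF effective capacitance,
# 1.2 V supply, 100 MHz clock.
p_nominal = dynamic_power(0.1, 1e-9, 1.2, 100e6)  # 14.4 mW
# Halving Vdd cuts dynamic power by 4x (quadratic dependence), which
# is why voltage scaling is a central low-power design technique.
p_scaled = dynamic_power(0.1, 1e-9, 0.6, 100e6)   # 3.6 mW
```

A lab exercise built on this relation lets students predict the effect of a design change at a high level of abstraction and then verify it against the power actually measured on the FPGA hardware.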
Daryoush Daniel Vaziri illustrates that the use of mixed-methods designs may support the induction of more subtle and complete theories about older adults' use of technologies for the support of active and healthy aging. The results show that older adults' social contexts and environments considerably affect their perspectives, practices, and attitudes with respect to health, quality of life, well-being, and technology use for active and healthy aging support. Data were collected from older adults aged 60+ as well as from relevant secondary stakeholders such as caregivers, policy makers, and health insurance companies.
This thesis is concerned with the efficiency of side-channel cryptanalysis. In Part II of this thesis, we demonstrate how the most important analysis tools can be accelerated considerably with the help of the CUDA platform. Second, we investigate new approaches to profiled side-channel cryptanalysis. The research field of machine learning can be adapted to achieve substantial improvements here, but has so far received little attention in this respect. In Part III of this thesis, we present two new methods which share some commonalities but also exhibit some differences, so that evaluation results can be presented in a more complete picture. Furthermore, in Part IV we propose a side-channel application for the protection of intellectual property (IP). In Part V, we engage more deeply with practical side-channel cryptanalysis by carrying out attacks on a security microcontroller that is used in an EC (debit) card widely deployed in Germany.
As robots are becoming ubiquitous and more capable, the need to introduce solid robot software development methods is pressing in order to increase robots' task spectrum. This thesis is concerned with improving the software engineering of robot perception systems. The presented research employs a model-based approach to provide the means to represent knowledge about robotics software. The thesis is divided into three parts, covering research on the specification, deployment, and adaptation of robot perception systems.