Departments, institutes and facilities
- Fachbereich Informatik (64)
- Fachbereich Angewandte Naturwissenschaften (49)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (42)
- Fachbereich Ingenieurwissenschaften und Kommunikation (33)
- Fachbereich Wirtschaftswissenschaften (22)
- Institut für Cyber Security & Privacy (ICSP) (18)
- Institut für funktionale Gen-Analytik (IFGA) (17)
- Institut für Verbraucherinformatik (IVI) (16)
- Institute of Visual Computing (IVC) (16)
- Institut für Sicherheitsforschung (ISF) (8)
Document Type
- Conference Object (96)
- Article (88)
- Preprint (9)
- Doctoral Thesis (6)
- Part of a Book (5)
- Report (4)
- Book (monograph, edited volume) (3)
- Master's Thesis (3)
- Conference Proceedings (1)
- Research Data (1)
Year of publication
- 2019 (217)
Language
- English (217)
Keywords
- lignin (4)
- Navigation (3)
- security (3)
- work engagement (3)
- Aminoacylase (2)
- Design (2)
- Drosophila (2)
- Exergame (2)
- Extrusion blow molding (2)
- FPGA (2)
For robots acting - and failing - in everyday environments, a predictable behaviour representation is important so that it can be utilised for failure analysis, recovery, and subsequent improvement. Learning from demonstration combined with dynamic motion primitives is one commonly used technique for creating models that are easy to analyse and interpret; however, mobile manipulators complicate such models, since they need the ability to synchronise arm and base motions for performing purposeful tasks. In this paper, we analyse dynamic motion primitives in the context of a mobile manipulator - a Toyota Human Support Robot (HSR) - and introduce a small extension of dynamic motion primitives that makes it possible to perform whole body motion with a mobile manipulator. We then present an extensive set of experiments in which our robot was grasping various everyday objects in a domestic environment, where a sequence of object detection, pose estimation, and manipulation was required for successfully completing the task. Our experiments demonstrate the feasibility of the proposed whole body motion framework for everyday object manipulation, but also illustrate the necessity for highly adaptive manipulation strategies that make better use of a robot's perceptual capabilities.
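The whole body extension itself is not spelled out in the abstract, but the transformation-plus-canonical-system structure underlying a discrete dynamic motion primitive can be sketched as follows. This is a minimal 1-D version in plain Python; the gains, basis functions and parameter values are illustrative assumptions, not the authors' implementation:

```python
import math

def run_dmp(x0, goal, weights, tau=1.0, dt=0.01, K=100.0, D=20.0, alpha=4.0):
    """Integrate a discrete 1-D DMP from x0 towards goal.

    Transformation system: tau*dv = K*(goal - x) - D*v + f(s)
    Canonical system:      tau*ds = -alpha*s
    The forcing term f is a normalized weighted sum of Gaussian basis
    functions along the phase s; it vanishes as s -> 0, so the motion
    always converges to the goal."""
    n = len(weights)
    centers = [math.exp(-alpha * i / max(n - 1, 1)) for i in range(n)]
    widths = [n ** 1.5 / (c + 1e-8) for c in centers]
    x, v, s, t = x0, 0.0, 1.0, 0.0
    while t < 2.0 * tau:
        psi = [math.exp(-h * (s - c) ** 2) for h, c in zip(widths, centers)]
        f = s * (goal - x0) * sum(w * p for w, p in zip(weights, psi)) / (sum(psi) + 1e-8)
        v += dt * (K * (goal - x) - D * v + f) / tau
        x += dt * v / tau
        s += dt * (-alpha * s) / tau
        t += dt
    return x

# Zero weights reduce the DMP to a critically damped spring to the goal.
final = run_dmp(0.0, 1.0, [0.0] * 10)
```

Because the learned forcing term is gated by the phase variable, the trajectory shape can be trained from demonstrations while goal convergence is guaranteed by the spring-damper term; a whole body variant would share one canonical system between arm and base transformation systems.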
2-methylacetoacetyl-coenzyme A thiolase (beta-ketothiolase) deficiency: one disease - two pathways
(2019)
Background: 2-methylacetoacetyl-coenzyme A thiolase deficiency (MATD; deficiency of mitochondrial acetoacetyl-coenzyme A thiolase T2/ “beta-ketothiolase”) is an autosomal recessive disorder of ketone body utilization and isoleucine degradation due to mutations in ACAT1.
Methods: We performed a systematic literature search for all available clinical descriptions of patients with MATD. 244 patients were identified and included in this analysis. Clinical course and biochemical data are presented and discussed.
Results: For 89.6% of patients, at least one acute metabolic decompensation was reported. Age at first symptoms ranged from 2 days to 8 years (median 12 months). More than 82% of patients presented in the first two years of life, while manifestation in the neonatal period was the exception (3.4%). 77.0% of patients (157 of 204) showed normal psychomotor development without neurologic abnormalities.
Conclusion: This comprehensive data analysis provides a systematic overview of all cases with MATD identified in the literature. It demonstrates that MATD is a rather benign disorder with an often favourable outcome when compared with many other organic acidurias.
Background: 3-hydroxy-3-methylglutaryl-coenzyme A lyase deficiency (HMGCLD) is an autosomal recessive disorder of ketogenesis and leucine degradation due to mutations in HMGCL.
Methods: We performed a systematic literature search to identify all published cases. 211 patients for whom relevant clinical data were available were included in this analysis. Clinical course, biochemical findings and mutation data are highlighted and discussed. An overview of all published HMGCL variants is provided.
Results: More than 95% of patients presented with acute metabolic decompensation. Most patients manifested within the first year of life, 42.4% already in the neonatal period. Very few individuals remained asymptomatic. The neurologic long-term outcome was favorable, with 62.6% of patients showing normal development.
Conclusion: This comprehensive data analysis provides a systematic overview of all published cases with HMGCLD, including a list of all known HMGCL mutations.
The choice of suitable semiconducting metal oxide (MOX) gas sensors for the detection of a specific gas or gas mixture is time-consuming, since the sensor’s sensitivity needs to be characterized at multiple temperatures to find its optimal operating conditions. To obtain reliable measurement results, it is very important that the power for the sensor’s integrated heater is stable, regulated and error-free (or error-tolerant). The error-free requirement, in particular, can only be achieved if the power supply implements failure-avoiding and failure-detection methods. The biggest challenge is deriving multiple different voltages from a common supply in an efficient way while keeping the system as small and lightweight as possible. This work presents a reliable, compact, embedded system that addresses the power supply requirements for fully automated simultaneous sensor characterization for up to 16 sensors at multiple temperatures. The system implements efficient (avg. 83.3% efficiency) voltage conversion with low ripple output (<32 mV) and supports static or temperature-cycled heating modes. Voltage and current of each channel are constantly monitored and regulated to guarantee reliable operation. To evaluate the proposed design, 16 sensors were screened; the results are shown in the experimental part of this work.
In Sensor-based Fault Detection and Diagnosis (SFDD) methods, spatial and temporal dependencies among the sensor signals can be modeled to detect faults in the sensors, if the defined dependencies change over time. In this work, we model Granger causal relationships between pairs of sensor data streams to detect changes in their dependencies. We compare the method on simulated signals with the Pearson correlation, and show that the method elegantly handles noise and lags in the signals and provides appreciable dependency detection. We further evaluate the method using sensor data from a mobile robot by injecting both internal and external faults during operation of the robot. The results show that the method is able to detect changes in the system when faults are injected, but is also prone to detecting false positives. This suggests that this method can be used as a weak detection of faults, but other methods, such as the use of a structural model, are required to reliably detect and diagnose faults.
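A minimal version of the pairwise Granger check can be written with ordinary least squares: compare the residual sum of squares of a lag-1 autoregression of x with and without y's past, and form an F-statistic. This lag-1, pure-Python sketch is illustrative only; the paper's formulation may use more lags and a different test statistic:

```python
import random

def _lstsq(rows, ys):
    """Least squares via normal equations and Gaussian elimination."""
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(k)]
    for c in range(k):                      # forward elimination with pivoting
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]; b[c], b[p] = b[p], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            A[r] = [ar - f * ac for ar, ac in zip(A[r], A[c])]; b[r] -= f * b[c]
    coef = [0.0] * k
    for c in range(k - 1, -1, -1):
        coef[c] = (b[c] - sum(A[c][j] * coef[j] for j in range(c + 1, k))) / A[c][c]
    return coef

def granger_f(x, y):
    """F-statistic: does y's past improve a lag-1 prediction of x?"""
    n = len(x) - 1
    rows_r = [[1.0, x[t - 1]] for t in range(1, len(x))]            # restricted
    rows_u = [[1.0, x[t - 1], y[t - 1]] for t in range(1, len(x))]  # unrestricted
    tgt = x[1:]
    def rss(rows):
        c = _lstsq(rows, tgt)
        return sum((yt - sum(ci * ri for ci, ri in zip(c, r))) ** 2
                   for r, yt in zip(rows, tgt))
    rss_r, rss_u = rss(rows_r), rss(rows_u)
    return (rss_r - rss_u) / (rss_u / (n - 3))

random.seed(0)
y = [random.gauss(0, 1) for _ in range(500)]
x = [0.0]
for t in range(1, 500):                     # y drives x with one step of lag
    x.append(0.5 * x[-1] + 0.8 * y[t - 1] + random.gauss(0, 0.1))
f_xy = granger_f(x, y)   # y -> x: large, the causal direction is detected
f_yx = granger_f(y, x)   # x -> y: small, no dependency in this direction
```

A fault that breaks the dependency between two streams would show up as a drop in the F-statistic computed over a sliding window.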
Herein we report an update to ACPYPE, a Python3 tool that now properly converts AMBER to GROMACS topologies for force fields that utilize nondefault and nonuniform 1–4 electrostatic and nonbonded scaling factors or negative dihedral force constants. Prior to this work, ACPYPE only converted AMBER topologies that used uniform, default 1–4 scaling factors and positive dihedral force constants. We demonstrate that the updated ACPYPE accurately transfers the GLYCAM06 force field, which employs nonuniform 1–4 scaling factors as well as negative dihedral force constants, from AMBER to GROMACS topology files. Validation was performed using β-d-GlcNAc through gas-phase analysis of dihedral energy curves and probability density functions. The updated ACPYPE retains all of its original functionality, but now allows the simulation of complex glycomolecular systems in GROMACS using AMBER-originated force fields. ACPYPE is available for download at https://github.com/alanwilter/acpype.
For years, the common logic that underpinned entrepreneurship was to find a niche within a market/sector and then solidify business practice to achieve success in that market segment. The dawn of technologically-based disruptive enterprises such as Uber and Airbnb, coupled with the approaching Fourth Industrial Revolution, seriously calls this conventional business logic into question. In this article, the projected impact of these forces on African entrepreneurs is explored. We look at the role of government, business and education systems in preparing for the impact of the Fourth Industrial Revolution, with specific focus on the need for entrepreneurial skills and training. We also explore the importance of innovation, in terms of both products and processes, to mitigate the impact of these forces.
The aim of this descriptive study is to gain an understanding of departing passengers' perceived fairness of security screening in relation to their satisfaction. The context of the study was a major aviation hub in East Africa. The target population was all departing international passengers. Primary data were collected using a self-administered questionnaire. Respondents were selected using convenience sampling of passengers who had just completed the final security check at the departure area of the airport. A total of 251 usable responses were collected from a target of 384 respondents, giving a response rate of 65 percent.
The findings contribute to the existing body of knowledge on the relationship between perceptions of fairness of security procedures and their influence on satisfaction. A one-way between-groups analysis of variance (ANOVA) was conducted to test for statistical significance. A Cronbach’s alpha of 0.887 was computed, demonstrating a high level of internal consistency of the survey instrument. The adequacy of security procedures, the level of communication provided before and during the screening process, consistency and fairness were found to have a significant relationship to the level of satisfaction reported by passengers. The findings suggest that there are significant differences between groups’ perceptions of different elements of security procedures.
The implications of the study are twofold. The study was cross-sectional and was impacted by significant changes in security procedures at the airport at the time of the study; a longitudinal survey may further mitigate the impact of the variance in responses and support a robust contribution to the development of a theoretical model of airport passenger satisfaction. Airport managers could use the results of this study as inputs to enhance the design of screening procedures in modern hubs, improving the passenger experience and driving revenue growth.
This paper proposes an approach to ANN-based temperature controller design for a plastic injection moulding system. The design approach is applied to the development of a controller based on a combination of a classical ANN and an integrator. The controller provides a fast temperature response and zero steady-state error for three typical heaters (bar, nozzle, and cartridge) of a plastic moulding system. Simulation results in MATLAB Simulink, compared against an industrial PID regulator, show the advantages of the controller, such as significantly less overshoot and a faster transient response (compared to PID with autotuning) for all examined heaters. In order to verify the proposed approach, the designed ANN controller was implemented and tested on an experimental setup based on an STM32 board.
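The ANN-plus-integrator structure can be sketched on a toy first-order thermal plant. The saturating gain below is a hypothetical stand-in for the trained network, and all plant constants and gains are illustrative assumptions; the point of the sketch is that the integrator drives the steady-state error to zero even though the nonlinear term alone cannot:

```python
import math

def ann(e):
    """Stand-in for the trained network: a saturating nonlinear gain
    (the actual controller uses a trained feedforward ANN)."""
    return 50.0 * math.tanh(e / 20.0)

def simulate(setpoint=200.0, dt=0.1, steps=20000, ki=0.05):
    """Closed loop: u = ann(e) + ki * integral(e), on a first-order
    thermal plant with heating gain 0.5 and loss coefficient 0.1."""
    T, ambient, integ = 20.0, 20.0, 0.0
    for _ in range(steps):
        e = setpoint - T
        integ = min(max(integ + e * dt, 0.0), 1000.0)  # integrator with anti-windup clamp
        u = max(ann(e) + ki * integ, 0.0)              # heater power, non-negative
        T += dt * (0.5 * u - 0.1 * (T - ambient))      # plant update (Euler step)
    return T

final_T = simulate()
```

At equilibrium the ANN term is zero (e = 0), so the integrator alone supplies the power needed to hold the setpoint against heat loss, which is exactly the zero steady-state error property claimed for the combined controller.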
This paper stresses the importance of entrepreneurship education for enhancing sustainable development in Kenya. The problems facing the country range from a high rate of poverty and youth and graduate unemployment to overdependence on foreign goods and technology.
This paper therefore argues that entrepreneurship education will equip students with the skills to not only be self-reliant but to become wealth creators. The intervention level of entrepreneurship education has been at tertiary institutions and universities; this paper argues that attitudes and values are acquired at a formative stage in life. Based on a literature review of models that have been used and yielded positive results, this paper proposes an innovative approach to the teaching of entrepreneurship education that is inclusive of the pre-school, primary, secondary, tertiary and university levels. This paper explores the “Mully Model of Applied Entrepreneurship Teaching” as a case study, using interviews, surveys and a review of relevant MCF data. The organization’s success factors within the Kenyan context are discussed.
The paper also recommends that educational programs at all levels of education be made relevant so as to provide youth with the needed entrepreneurial skills. Further, it recommends that experiential learning methodologies be emphasized in the delivery of entrepreneurship education.
The limited sodium availability of freshwater and terrestrial environments was a major physiological challenge during vertebrate evolution. The epithelial sodium channel (ENaC) is present in the apical membrane of sodium-absorbing vertebrate epithelia and evolved as part of a machinery for efficient sodium conservation. ENaC belongs to the degenerin/ENaC protein family and is the only member that opens without an external stimulus. We hypothesized that ENaC evolved from a proton-activated sodium channel present in ionocytes of freshwater vertebrates and therefore investigated whether such ancestral traits are present in ENaC isoforms of the aquatic pipid frog Xenopus laevis. Using whole-cell and single-channel electrophysiology of Xenopus oocytes expressing ENaC isoforms assembled from alpha beta gamma- or delta beta gamma-subunit combinations, we demonstrate that Xenopus delta beta gamma-ENaC is profoundly activated by extracellular acidification within biologically relevant ranges (pH 8.0-6.0). This effect was not observed in Xenopus alpha beta gamma-ENaC or human ENaC orthologs. We show that protons interfere with allosteric ENaC inhibition by extracellular sodium ions, thereby increasing the probability of channel opening. Using homology modeling of ENaC structure and site-directed mutagenesis, we identified a cleft region within the extracellular loop of the delta-subunit that contains several acidic amino acid residues that confer proton-sensitivity and enable allosteric inhibition by extracellular sodium ions. We propose that Xenopus delta beta gamma-ENaC can serve as a model for investigating ENaC transformation from a proton-activated toward a constitutively-active ion channel. Such transformation might have occurred during the evolution of tetrapod vertebrates to enable bulk sodium absorption during the water-to-land transition.
Analytical pyrolysis
(2019)
Analytical pyrolysis deals with the structural identification and quantitation of pyrolysis products with the ultimate aim of establishing the identity of the original material and the mechanisms of its thermal decomposition. The pyrolytic process is carried out in a pyrolyzer interfaced with analytical instrumentation such as gas chromatography (GC), mass spectrometry (MS), gas chromatography coupled with mass spectrometry (GC/MS), or with Fourier-transform infrared spectroscopy (GC/FTIR). By measurement and identification of pyrolysis products, the molecular composition of the original sample can often be reconstructed. This book is the outcome of contributions by experts in the field of pyrolysis and includes applications of analytical pyrolysis-GC/MS to characterize the structure of synthetic organic polymers and lignocellulosic materials as well as cellulosic pulps and isolated lignins, solid wood, waste particle board, and bio-oil. The thermal degradation of cellulose and biomass is examined by scanning electron micrography, FTIR spectroscopy, thermogravimetry (TG), differential thermal analysis, and TG/MS. The calorimetric determination of high heating values of different raw biomass, plastic waste, and biomass/plastic waste mixtures and their by-products resulting from pyrolysis is described.
More and more devices will be connected to the internet [3]. Many devices are part of the so-called Internet of Things (IoT), which contains many low-power devices often powered by a battery. These devices mainly communicate with the manufacturer's back-end and deliver personal data and secrets like passwords.
Lower back pain is one of the most prevalent diseases in Western societies. A large percentage of the European and American populations suffer from back pain at some point in their lives. One successful approach to addressing lower back pain is postural training, which can be supported by wearable devices providing real-time feedback about the user’s posture. In this work, we analyze the changes in posture induced by postural training. To this end, we compare snapshots before and after training, as measured by the Gokhale SpineTracker™. Considering pairs of before and after snapshots in different positions (standing, sitting, and bending), we introduce a feature space that allows for unsupervised clustering. We show that the resulting clusters represent certain groups of postural changes, which are meaningful to professional posture trainers.
The antiradical and antimicrobial activity of lignin and lignin-based films are both of great interest for applications such as food packaging additives. The polyphenolic structure of lignin, in addition to the presence of O-containing functional groups, is potentially responsible for these activities. This study used DPPH assays to assess the antiradical activity of HPMC/lignin and HPMC/lignin/chitosan films. The scavenging activity (SA) of both binary (HPMC/lignin) and ternary (HPMC/lignin/chitosan) systems was affected by the percentage of the added lignin: the 5% addition showed the highest activity and the 30% addition the lowest. Both scavenging activity and antimicrobial activity depend on the biomass source, showing the following trend: organosolv of softwood > kraft of softwood > organosolv of grass. Testing the antimicrobial activities of lignins and lignin-containing films showed high antimicrobial activities against Gram-positive and Gram-negative bacteria at 35 °C and at low temperatures (0-7 °C). Purification of kraft lignin has a negative effect on the antimicrobial activity, while storage has a positive effect. The lignin release in the produced films affected the activity positively, and the chitosan addition enhances the activity even more for both Gram-positive and Gram-negative bacteria. Testing the films against spoilage bacteria that grow at low temperatures revealed the activity of the 30% addition in the HPMC/L1 film against both B. thermosphacta and P. fluorescens, while L5 was active only against B. thermosphacta. In HPMC/lignin/chitosan films, the 5% addition exhibited activity against both B. thermosphacta and P. fluorescens.
Due to global ecological and economic challenges that have been correlated to the transition from fossil-based to renewable resources, fundamental studies are being performed worldwide to replace fossil fuel raw materials in plastic production. One aspect of current research is the development of lignin-derived polyols to substitute expensive fossil-based polyol components for polyurethane and polyester production. This article describes the synthesis of bioactive lignin-based polyurethane coatings using unmodified and demethylated Kraft lignins. Demethylation was performed to enhance the reaction selectivity toward polyurethane formation. The antimicrobial activity was tested according to a slightly modified standard test (JIS Z 2801:2010). Besides effects caused by the lignins themselves, triphenylmethane derivatives (brilliant green and crystal violet) were used as additional antimicrobial substances. Results showed increased antimicrobial capacity against Staphylococcus aureus. Furthermore, the coating color could be varied from dark brown to green and blue, respectively.
Are quality diversity algorithms better at generating stepping stones than objective-based search?
(2019)
The route to the solution of complex design problems often lies through intermediate "stepping stones" which bear little resemblance to the final solution. By greedily following the path of greatest fitness improvement, objective-based search overlooks and discards stepping stones which might be critical to solving the problem. Here, we hypothesize that Quality Diversity (QD) algorithms are a better way to generate stepping stones than objective-based search: by maintaining a large set of solutions which are of high-quality, but phenotypically different, these algorithms collect promising stepping stones while protecting them in their own "ecological niche". To demonstrate the capabilities of QD we revisit the challenge of recreating images produced by user-driven evolution, a classic challenge which spurred work in novelty search and illustrated the limits of objective-based search. We show that QD far outperforms objective-based search in matching user-evolved images. Further, our results suggest some intriguing possibilities for leveraging the diversity of solutions created by QD.
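The QD idea of protecting stepping stones in their own niches is captured by MAP-Elites-style archives: keep the best solution per behaviour niche rather than a single fittest individual. The following toy sketch on a 1-D search space is an illustrative assumption, not the paper's image-matching domain or its specific QD algorithm:

```python
import random

random.seed(1)
NICHES = 10                                  # discretized behaviour space

def fitness(x):
    """Quality: prefer solutions near 0.5 (a toy objective)."""
    return -abs(x - 0.5)

def descriptor(x):
    """Behaviour: which of the NICHES bins of [0, 1) the solution occupies."""
    return min(int(x * NICHES), NICHES - 1)

archive = {}                                 # niche index -> (fitness, solution)
for _ in range(2000):
    if archive and random.random() < 0.5:    # mutate a random elite...
        parent = random.choice(list(archive.values()))[1]
        x = min(max(parent + random.gauss(0, 0.1), 0.0), 0.999)
    else:                                    # ...or sample randomly
        x = random.random()
    niche, f = descriptor(x), fitness(x)
    if niche not in archive or f > archive[niche][0]:
        archive[niche] = (f, x)              # an elite only replaces a weaker occupant

coverage = len(archive)                      # niches filled, regardless of fitness
```

Low-fitness niches survive here even though objective-based search would discard them, which is precisely the "stepping stone" property the paper tests.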
This work aims to create a natural language generation (NLG) base for the further development of systems for automatic examination question generation and automatic summarization at Hochschule Bonn-Rhein-Sieg and Fraunhofer IAIS, respectively. Both tasks are highly relevant today: the first can significantly simplify university teachers' work, and the second can assist in faster retrieval of knowledge from the excessively large amounts of information that people often work with. We focus on the search for an efficient and robust approach to the controlled NLG problem. Therefore, though the initial idea of the project was the use of generative adversarial networks (GANs), we switched our attention to more robust and easily controllable autoencoders. Thus, in this work we implement an autoencoder for unsupervised discovery of latent space representations of text, and show the ability of the system to generate new sentences based on this latent space. Apart from that, we apply Gaussian mixture techniques in order to obtain meaningful text clusters and thereby try to create a tool that would allow us to generate sentences relevant to the semantics of the Gaussian clusters, e.g. positive or negative reviews or examination questions on a certain topic. The developed system is tested on several datasets and compared to the performance of GANs.
This work addresses the issue of finding an optimal flight zone for a side-by-side tracking and following Unmanned Aerial Vehicle (UAV) adhering to space-restricting factors imposed by a dynamic Vector Field Extraction (VFE) algorithm. The VFE algorithm demands a relatively perpendicular field of view from the UAV to the tracked vehicle, thereby enforcing the space-restricting factors, which are distance, angle and altitude. The objective of the UAV is to perform side-by-side tracking and following of a lightweight ground vehicle while acquiring high-quality video of tufts attached to the side of the tracked vehicle. The recorded video is supplied to the VFE algorithm, which produces the positions and deformations of the tufts over time as they interact with the surrounding air, resulting in an airflow model of the tracked vehicle. The present limitations of wind tunnel tests and computational fluid dynamics simulation suggest the use of a UAV for real-world evaluation of the aerodynamic properties of the vehicle’s exterior. The novelty of the proposed approach lies in defining the specific flight zone restricting factors while adhering to the VFE algorithm; as a result, we were able to formalize a locally-static and globally-dynamic geofence attached to the tracked vehicle and enclosing the UAV.
Beyond HCI and CSCW: Challenges and Useful Practices Towards a Human-Centred Vision of AI and IA
(2019)
Change - shaping reality
(2019)
The complex nature of multifactorial diseases, such as Alzheimer's disease, has produced a strong need to design multitarget-directed ligands to address the involved complementary pathways. We performed a purposive structural modification of a tetratarget small molecule, contilisant, and generated a combinatorial library of 28 substituted chromen-4-ones. The compounds comprise a basic moiety which is linker-connected to the 6-position of the heterocyclic chromenone core. The syntheses were accomplished by Mitsunobu- or Williamson-type ether formations. The resulting library members were evaluated against a panel of seven human enzymes, all of which are involved in the pathophysiology of neurodegeneration. A concomitant inhibition of human acetylcholinesterase and human monoamine oxidase B, with IC50 values of 5.58 and 7.20 μM, respectively, was achieved with the dual-target 6-(4-(piperidin-1-yl)butoxy)-4H-chromen-4-one (7).
Currently, a variety of methods exist for creating different types of spatio-temporal world models. Despite the numerous methods for this type of modeling, there exists no methodology for comparing the different approaches or their suitability for a given application, e.g. logistics robots. In order to establish a means for comparing and selecting the best-fitting spatio-temporal world modeling technique, a methodology and a standard set of criteria must be established. To that end, state-of-the-art methods for this type of modeling will be collected, listed, and described. Existing methods used for evaluation will also be collected where possible.
Using the collected methods, new criteria and techniques will be devised to enable the comparison of various methods in a qualitative manner. Experiments will be proposed to further narrow and ultimately select a spatio-temporal model for a given purpose. An example network of autonomous logistic robots, ROPOD, will serve as a case study used to demonstrate the use of the new criteria. This will also serve to guide the design of future experiments that aim to select a spatio-temporal world modeling technique for a given task. ROPOD was specifically selected as it operates in a real-world, human shared environment. This type of environment is desirable for experiments as it provides a unique combination of common and novel problems that arise when selecting an appropriate spatio-temporal world model. Using the developed criteria, a qualitative analysis will be applied to the selected methods to remove unfit options.
Then, experiments will be run on the remaining methods to provide comparative benchmarks. Finally, the results will be analyzed and recommendations to ROPOD will be made.
Multi-robot systems (MRS) are capable of performing a set of tasks by dividing them among the robots in the fleet. One of the challenges of working with multi-robot systems is deciding which robot should execute each task. Multi-robot task allocation (MRTA) algorithms address this problem by explicitly assigning tasks to robots with the goal of maximizing the overall performance of the system. The indoor transportation of goods is a practical application of multi-robot systems in the area of logistics. The ROPOD project works on developing multi-robot system solutions for logistics in hospital facilities, and the correct selection of an MRTA algorithm is crucial for enhancing transportation tasks. Several multi-robot task allocation algorithms exist in the literature, but only a few experimental comparative analyses have been performed. This project analyzes and assesses the performance of MRTA algorithms for allocating supply cart transportation tasks to a fleet of robots. We conducted a qualitative analysis of MRTA algorithms, selected the most suitable ones based on the ROPOD requirements, implemented four of them (MURDOCH, SSI, TeSSI, and TeSSIduo), and evaluated the quality of their allocations using a common experimental setup and 10 experiments. Our experiments include off-line and semi-on-line allocation of tasks as well as scalability tests, and use virtual robots implemented as Docker containers; this design should facilitate deployment of the system on the physical robots. Our experiments conclude that TeSSI and TeSSIduo best suit the ROPOD requirements. Both use temporal constraints to build task schedules and run in polynomial time, which allows them to scale well with the number of tasks and robots. TeSSI distributes the tasks among more robots in the fleet, while TeSSIduo tends to use a lower percentage of the available robots.
Subsequently, we have integrated TeSSI and TeSSIduo to perform multi-robot task allocation for the ROPOD project.
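As a concrete illustration of the auction mechanisms compared above, a sequential single-item (SSI) allocation can be sketched in a few lines: in each round, every robot bids its marginal cost for every unallocated task, and the cheapest bid wins. This is a deliberately simplified, hypothetical sketch (plain travel cost on a 1-D corridor, with none of the temporal constraints that TeSSI and TeSSIduo add):

```python
def ssi_allocate(robots, tasks):
    """robots: {name: position}; tasks: {name: position} on a 1-D corridor.
    Each round, the (robot, task) pair with the lowest marginal travel
    cost wins; the winning robot's position moves to the task."""
    pos = dict(robots)
    schedule = {r: [] for r in robots}
    remaining = dict(tasks)
    while remaining:
        cost, winner, task = min(
            ((abs(pos[r] - p), r, t)               # marginal travel cost bid
             for r in pos for t, p in remaining.items()),
            key=lambda bid: bid[0])
        schedule[winner].append(task)
        pos[winner] = remaining.pop(task)          # robot ends at the task site
    return schedule

sched = ssi_allocate({"r1": 0.0, "r2": 10.0},
                     {"tA": 1.0, "tB": 9.0, "tC": 2.0})
```

Here r1 picks up the two nearby tasks (tA, then tC) while r2 takes tB, showing how sequential bidding accounts for where each robot will be after its previous wins.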
Systemic autoinflammatory diseases (SAIDs) are a group of inflammatory disorders caused by dysregulation in the innate immune system that leads to enhanced immune responses. The clinical diagnosis of SAIDs can be difficult since individually these are rare diseases with considerable phenotypic overlap. Most SAIDs have a strong genetic background, but environmental and epigenetic influences can modulate the clinical phenotype. Molecular diagnosis has become essential for confirmation of clinical diagnosis. To date there are over 30 genes and a variety of modes of inheritance that have been associated with monogenic SAIDs. Mutations in the same gene can lead to very distinct phenotypes and can have different inheritance patterns. In addition, somatic mutations have been reported in several of these conditions. New genetic testing methods and databases are being developed to facilitate the molecular diagnosis of SAIDs, which is of major importance for treatment, prognosis and genetic counselling. The aim of this review is to summarize the latest advances in genetic testing for SAIDs and discuss potential obstacles that might arise during the molecular diagnosis of SAIDs.
Traffic sign recognition is an important component of many advanced driving assistance systems, and it is required for full autonomous driving. Computational performance is usually the bottleneck in using large-scale neural networks for this purpose. SqueezeNet is a good candidate for efficient image classification of traffic signs, but in our experiments it does not reach high accuracy, and we believe this is due to a lack of data, requiring data augmentation. Generative adversarial networks can learn the high-dimensional distribution of empirical data, allowing the generation of new data points. In this paper we apply the pix2pix GAN architecture to generate new traffic sign images and evaluate the use of these images in data augmentation. We were motivated to use pix2pix to translate symbolic sign images to real ones due to the mode collapse in conditional GANs. Through our experiments we found that data augmentation using GANs can increase classification accuracy for circular traffic signs from 92.1% to 94.0%, and for triangular traffic signs from 93.8% to 95.3%, producing an overall improvement of 2%. However, some traditional augmentation techniques can outperform GAN data augmentation, for example contrast variation on circular traffic signs (95.5%) and displacement on triangular traffic signs (96.7%). Our negative results show that while GANs can be naively used for data augmentation, they are not always the best choice, depending on the problem and the variability in the data.
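The traditional augmentations that outperformed the GAN in some cases (contrast variation and displacement) are simple pixel-level operations. A toy grayscale sketch, not the paper's actual pipeline, which would operate on full RGB sign crops:

```python
def contrast(img, factor):
    """Scale each pixel's deviation from mid-gray (128), clipped to [0, 255]."""
    return [[min(max(int(128 + factor * (p - 128)), 0), 255) for p in row]
            for row in img]

def displace(img, dx):
    """Shift the image horizontally by dx pixels, padding with zeros."""
    w = len(img[0])
    return [[row[c - dx] if 0 <= c - dx < w else 0 for c in range(w)]
            for row in img]

img = [[0, 100, 200],
       [50, 128, 250]]
high_contrast = contrast(img, 1.5)   # stretches dark/bright regions apart
shifted = displace(img, 1)           # sign appears one pixel to the right
```

Each transformed copy keeps its original label, so the training set grows without any generative model, which is why such cheap operations remain a strong baseline against GAN augmentation.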
Data-Driven Robot Fault Detection and Diagnosis Using Generative Models: A Modified SFDD Algorithm
(2019)
This paper presents a modification of the data-driven sensor-based fault detection and diagnosis (SFDD) algorithm for online robot monitoring. Our version of the algorithm uses a collection of generative models, in particular restricted Boltzmann machines, each of which represents the distribution of sliding-window correlations between a pair of correlated measurements. We use these models in a residual generation scheme, where high residuals generate conflict sets that are then used in a subsequent diagnosis step. As a proof of concept, the framework is evaluated on a mobile logistics robot for the problem of recognising disconnected wheels. The evaluation demonstrates the feasibility of the framework (on the faulty data set, the models obtained 88.6% precision and 75.6% recall), but also shows that the monitoring results are influenced by the choice of distribution model and the model parameters as a whole.
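The residual-generation idea can be illustrated without RBMs: score each correlated sensor pair over a sliding window, and let pairs whose statistic deviates from its nominal value contribute a conflict set. The plain correlation statistic below is a stand-in assumption for the paper's RBM-based distribution models:

```python
def corr(xs, ys):
    """Pearson correlation of two equal-length sample windows."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy + 1e-12)

def conflicts(streams, nominal, window=20, thresh=0.5):
    """streams: {sensor: samples}; nominal: {(a, b): expected correlation}.
    A pair whose windowed correlation deviates from its nominal value by
    more than thresh yields a high residual; its conflict set {a, b} is
    passed to the diagnosis step (here simply unioned)."""
    implicated = set()
    for (a, b), rho in nominal.items():
        r = corr(streams[a][-window:], streams[b][-window:])
        if abs(r - rho) > thresh:          # high residual -> conflict set
            implicated |= {a, b}
    return implicated

# Nominally, wheel command and wheel odometry move together (rho ~ 1.0).
cmd = [float(i % 10) for i in range(40)]
odom_ok = [c * 1.1 for c in cmd]
odom_faulty = [0.0] * 40                   # e.g. a disconnected wheel
ok = conflicts({"cmd": cmd, "odom": odom_ok}, {("cmd", "odom"): 1.0})
flagged = conflicts({"cmd": cmd, "odom": odom_faulty}, {("cmd", "odom"): 1.0})
```

A real diagnosis step would intersect conflict sets from many pairs to isolate the single faulty sensor rather than flagging both members of each pair.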
Destination Development for Entrepreneurial Tourism in Lake Bosomtwe and Kintampo falls (Ghana)
(2019)
The tourism industry is one of the world's largest industries (counting direct, indirect and induced contributions). Africa has the potential, with its cultural and natural resources, to outpace other regions in attracting valuable tourism dollars. The main aim of the study is to improve the visitor experience at the two tourist sites. To do this, it is necessary to explore the elements and success factors of tourism destination development and to use these as a checklist to identify the strengths and weaknesses of the selected tourist destinations in Ghana, West Africa. The rationale behind the study is to outline the crucial Destination Management (DM) criteria covering all aspects that contribute to the ultimate visitor experience, articulating the roles of the different stakeholders and identifying clear actions for effective tourism development in Ghana. The interview technique was employed to collect data from staff and management of the selected destinations. Data were analyzed for themes related to elements, success factors and challenges of destination development, and new ideas for development were also solicited. It was revealed that some of the elements that make a destination attractive to tourists are good hotels, high hygiene and sanitation standards, good food and amusement activities. The competency gaps identified suggest collaboration with academia to secure a high level of knowledge through research in today's dynamic environment. Some of the critical success factors found are: systematic provision of cultural events, advance knowledge of agents and tour operators, and quality leisure and recreation. It is recommended that product and service development be a joint effort of all stakeholders. The research team therefore has plans underway to proceed to the second phase of the project: gathering the resources to make the Lake Bosomtwe and Kintampo Falls sites attractive to tourists.
Scratch assays enable the study of the migration process of an injured adherent cell layer in vitro. An apparatus for the reproducible performance of scratch assays and cell harvesting has been developed that meets the requirements for reproducibility in tests as well as easy handling. The entirely autoclavable setup is divided into a sample translation and a scratching system. The translational system is compatible with standard culture dishes and can be modified to adapt to different cell culture systems, while the scratching system can be adjusted according to angle, normal force, shape, and material to adapt to specific questions and demanding substrates. As a result, a fully functional prototype can be presented. This system enables the creation of reproducible and clear scratch edges with a low scratch border roughness within a monolayer of cells. Moreover, the apparatus allows the collection of the migrated cells after scratching for further molecular biological investigations without the need for a second processing step. For comparison, the mechanical properties of manually performed scratch assays are evaluated.
Digital transformation in higher education and science is a mission-critical demand to prepare educational institutions for future competition on the international market. In many cases, digitization goes along with the search for and acquisition of new software. For easily exchangeable software, wrong product decisions lead, in the worst case, to calculable financial losses. However, if a planned software product requires many technological adjustments and is to be applied as a central component of a business- and/or security-critical environment, wrong decisions during the software acquisition process may lead to hardly calculable damage. The questions that arise are how to decide on a product and how many resources should be invested in the acquisition process.
We planned to introduce a commercial business support system to replace the currently used in-house developed software. Our goals were to increase our university's level of data security, to ease the interaction between stakeholders, to eliminate media discontinuities, to improve process management and transparency, and to reduce the execution time of automated processes. Along with the introduction of the electronic case file, our agenda stipulates the digitization (and automation) of administrative university processes, especially, but not limited to, the student self-service and the administrative student life cycle. The usual tools and practices commonly applied to (simple) software acquisition failed in our scenario.
With the case study introduced in this paper, we address everyone involved in software acquisition processes. From our experience, we strongly recommend placing greater value on a thoroughly completed acquisition process than on short-term economic advantages.
Bioinspired stem cell-based hard tissue engineering includes numerous aspects: The synthesis and fabrication of appropriate scaffold materials, their analytical characterization, and guided osteogenesis using the sustained release of osteoinducing and/or osteoconducting drugs for mesenchymal stem cell differentiation, growth, and proliferation. Here, the effect of silicon- and silicate-containing materials on osteogenesis at the molecular level has been a particular focus within the last decade. This review summarizes recently published scientific results, including material developments and analysis, with a special focus on silicon hybrid bone composites. First, the sources, bioavailability, and functions of silicon on various tissues are discussed. The second focus is on the effects of calcium-silicate biomineralization and corresponding analytical methods in investigating osteogenesis and bone formation. Finally, recent developments in the manufacturing of Si-containing scaffolds are discussed, including in vitro and in vivo studies, as well as recently filed patents that focus on the influence of silicon on hard tissue formation.
In this thesis, unique administrative data, a relevant time of follow-up and advanced statistical measures to handle confounding have been utilized in order to provide new and informative evidence on the effects of vocational rehabilitation programs on work participation outcomes in Germany. While re-affirming the important role of micro-level determinants, the present study provides an extensive example of the individual and fiscal effects that are possible through meaningful vocational rehabilitation measures. The analysis showed that the principal objective, namely, to improve participation in employment, was generally achieved. Contrary to the common misconception that “off-the-job training” is relatively ineffective, this thesis has provided an empirical example of the positive impact of the programs.
The initially large number of variants is reduced by applying custom variant annotation and filtering procedures. This requires complex software toolchains to be set up and data sources to be integrated. Furthermore, increasing study sizes require growing efforts to manage datasets in a multi-user and multi-institution environment. When the cause of a disease or phenotype is unknown, it is common practice to expect numerous iterations of respecification and refinement of filter strategies. Data analysis support during this phase is fundamental, because handling the large volume of data is otherwise not possible, or inadequate, for users with limited computer literacy. Constant feedback and communication are necessary when filter parameters are adjusted or the study grows with additional samples. Consequently, variant filtering and interpretation become time-consuming and hinder a dynamic and explorative data analysis by experts.
Energy Profiles of the Ring Puckering of Cyclopentane, Methylcyclopentane and Ethylcyclopentane
(2019)
The link between universities and industry has long been of concern both locally and globally, for the obvious reason that it is perceived to enhance organizational performance. The gap between universities and industry has been widening in developing countries, leading to lost opportunities for joint research, product development and job creation. Marketing and entrepreneurship could play a pivotal role in reversing the weakened linkages by building mutual relationships and strengthening bonds between universities and industry. This study sought to examine the role of marketing and entrepreneurship as important tools for enhancing university-industry linkages, and to determine the aspects of marketing and entrepreneurship that have the highest influence on these linkages. It considered the nexus of entrepreneurship and marketing, exemplified by the attributes of innovativeness, creativity, risk-taking, proactive orientation and value creation, as crucial for creating, nurturing and developing sustained linkages between universities and industry. The study targeted 150 small and medium-sized enterprises in Nairobi City County, of which 143 responded, giving a response rate of 95%. Data were collected using a structured questionnaire administered to managers of small and medium-sized enterprises engaged in manufacturing, retail, banking and hospitals. The survey data were analyzed through descriptive statistics, including mean scores and standard deviations, and the hypotheses were tested through regression analysis. The study found that marketing practices, especially those focused on product, promotion and distribution, were key in enhancing university-industry linkage. With regard to entrepreneurial orientation, risk-taking and creativity indicators were found to be more important than innovation in enhancing university-industry linkages.
Estimating the impact of successful completion of vocational education on employment outcomes
(2019)
Although work events can be regarded as pivotal elements of organizational life, only a few studies have examined how positive and negative events relate to and combine to affect work engagement over time. Theory suggests that to better understand how current events affect work engagement (WE), we have to account for recent events that have preceded these current events. We present competing theoretical views on how recent and current work events may affect employees (e.g., getting used to a high frequency of negative events or becoming more sensitive to negative events). Although the occurrence of events implies discrete changes in the experience of work, prior research has not considered whether work events actually accumulate to sustained mid-term changes in WE. To address these gaps in the literature, we conducted a week-level longitudinal study across a period of 15 consecutive weeks among 135 employees, which yielded 849 weekly observations. While positive events were associated with higher levels of WE within the same week, negative events were not. Our results support neither satiation nor sensitization processes. However, high frequencies of negative events in the preceding week amplified the beneficial effects of positive events on WE in the current week. Growth curve analyses show that the benefits of positive events accumulate to sustain high levels of WE. WE dissipates in the absence of continuous experience of positive events. Our study adds a temporal component and informs research that has taken a feature-oriented perspective on the dynamic interplay of job demands and resources.
During the dawn of chemistry, when the temperature of the young Universe had fallen below ∼4000 K, the ions of the light elements produced in Big Bang nucleosynthesis recombined in reverse order of their ionization potential. With their higher ionization potentials, He++ (54.5 eV) and He+ (24.6 eV) combined first with free electrons to form the first neutral atom, prior to the recombination of hydrogen (13.6 eV). At that time, in this metal-free and low-density environment, neutral helium atoms formed the Universe's first molecular bond in the helium hydride ion HeH+, by radiative association with protons (He + H+ → HeH+ + hν). As recombination progressed, the destruction of HeH+ (HeH+ + H → He + H2+) created a first path to the formation of molecular hydrogen, marking the beginning of the Molecular Age. Despite its unquestioned importance for the evolution of the early Universe, the HeH+ molecule had so far escaped unequivocal detection in interstellar space. In the laboratory, the ion was discovered as long ago as 1925, but only in the late seventies was the possibility discussed that HeH+ might exist in local astrophysical plasmas. In particular, the conditions in planetary nebulae were shown to be suitable for the production of potentially detectable HeH+ column densities: the hard radiation field from the central hot white dwarf creates overlapping Strömgren spheres, where HeH+ is predicted to form, primarily by radiative association of He+ and H. With the GREAT spectrometer onboard SOFIA, the HeH+ rotational ground-state transition at λ149.1 μm is now accessible. We report here its detection towards the planetary nebula NGC 7027.
Gas Chromatography
(2019)
Gas chromatography (GC) is one of the most important types of chromatography used in analytical chemistry for separating and analyzing chemical organic compounds. Today, gas chromatography is one of the most widespread investigation methods of instrumental analysis. This technique is used in the laboratories of chemical, petrochemical, and pharmaceutical industries, in research institutes, and also in clinical, environmental, and food and beverage analysis. This book is the outcome of contributions by experts in the field of gas chromatography and includes a short history of gas chromatography, an overview of derivatization methods and sample preparation techniques, a comprehensive study on pyrazole mass spectrometric fragmentation, and a GC/MS/MS method for the determination and quantification of pesticide residues in grape samples.
Neural-network-based object detectors can automate many difficult, tedious tasks. However, they are usually slow and/or require powerful hardware. One main reason is Batch Normalization (BN) [1], an important method for building these detectors. Recent studies present a potential replacement called the Self-normalizing Neural Network (SNN) [2], whose core is a special activation function named the Scaled Exponential Linear Unit (SELU). This replacement seems to retain most of BN's benefits while requiring less computational power. Nonetheless, it is uncertain whether SELU and neural-network-based detectors are compatible with one another. An evaluation of SELU-incorporated networks would help clarify that uncertainty. Such an evaluation is performed through a series of tests on different neural networks. After the evaluation, it is concluded that, while indeed faster, SELU is still not as good as BN for building complex object detector networks.
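The SELU activation at the core of self-normalizing networks is simple to state. A minimal NumPy sketch, using the fixed-point constants derived by Klambauer et al. [2]:

```python
import numpy as np

# Fixed-point constants from the SNN paper [2]
SELU_LAMBDA = 1.0507009873554805
SELU_ALPHA = 1.6732632423543772

def selu(x):
    """Scaled Exponential Linear Unit:
    lambda * x                     for x > 0
    lambda * alpha * (exp(x) - 1)  for x <= 0
    """
    x = np.asarray(x, dtype=float)
    return SELU_LAMBDA * np.where(x > 0, x, SELU_ALPHA * (np.exp(x) - 1))
```

With these constants, activations of a randomly initialized deep network are driven towards zero mean and unit variance, which is the normalizing effect that BN otherwise has to compute explicitly per batch.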
Modern Monte-Carlo-based rendering systems still suffer from the computational complexity involved in generating noise-free images, making it challenging to synthesize interactive previews. We present a framework suited for rendering such previews of static scenes using a caching technique that builds upon a linkless octree. Our approach allows for memory-efficient storage and constant-time lookup to cache diffuse illumination at multiple hit points along the traced paths. Non-diffuse surfaces are dealt with in a hybrid way in order to reconstruct view-dependent illumination while maintaining interactive frame rates. By evaluating the visual fidelity against ground-truth sequences and by benchmarking, we show that our approach compares well to low-noise path-traced results, but with a greatly reduced computational complexity allowing for interactive frame rates. In this way, our caching technique provides a useful tool for global illumination previews and multi-view rendering.
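A linkless octree replaces child pointers with hash maps keyed by location codes, which is what enables the constant-time cache lookup mentioned above. The following is a much-simplified sketch of that idea (single-level lookup, Morton codes as location codes, a scalar as the cached payload); the class and function names are our own, not the paper's:

```python
def morton_key(level, ix, iy, iz):
    """Interleave the bits of the cell indices into a single location code."""
    key = 0
    for bit in range(level):
        key |= ((ix >> bit) & 1) << (3 * bit)
        key |= ((iy >> bit) & 1) << (3 * bit + 1)
        key |= ((iz >> bit) & 1) << (3 * bit + 2)
    return (level, key)

class LinklessOctreeCache:
    """Hash-map-backed octree: no child pointers, O(1) average lookup."""

    def __init__(self, max_level):
        self.max_level = max_level
        self.cells = {}  # (level, location code) -> cached value

    def _cell_index(self, point):
        # point: coordinates in [0, 1)^3, mapped to the finest-level grid
        n = 1 << self.max_level
        return tuple(min(int(c * n), n - 1) for c in point)

    def insert(self, point, value):
        self.cells[morton_key(self.max_level, *self._cell_index(point))] = value

    def lookup(self, point):
        return self.cells.get(morton_key(self.max_level, *self._cell_index(point)))
```

A full implementation would store one hash map per octree level and descend from coarse to fine until the deepest populated cell is found.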
In January 2015, German retail and industry jointly started a sector-wide initiative ("Initiative Tierwohl", ITW) to improve animal welfare standards. The principles of the ITW are communicated mostly via the websites of the ITW and its participating companies. However, it remained uncertain whether these websites provide the information consumers need about ITW products. Based on Schwartz's basic human values, different types of consumers were identified by a cluster analysis (Ward's method, k-means). The results showed that, depending on their expressed meta-values (Self-Transcendence, Openness to Change, Self-Enhancement, or Conservation), respondents had different specific information sources and needs. Online sources were rarely mentioned; the majority of consumers referred to brochures, flyers and interpersonal contacts.
Atmospheric aerosols affect the power production of solar energy systems. Their impact depends on both the atmospheric conditions and the solar technology employed. As a region with a shortage of power production and high solar insolation, West Africa shows high potential for the application of solar power systems. However, dust outbreaks carrying high aerosol loads occur especially in the Sahel, located between the Saharan desert in the north and the Sudanian savanna in the south. They can affect the whole region for several days, with significant effects on power generation. This study investigates the impact of atmospheric aerosols on solar energy production for the example year 2006, making use of six well-instrumented sites in West Africa. Two different solar power technologies, a photovoltaic (PV) and a parabolic trough (PT) power plant, are considered. The daily reduction of solar power due to aerosols is determined over mostly clear-sky days in 2006 with a model chain combining radiative transfer and technology-specific power generation. For mostly clear days, the local daily reduction of PV power at alternating current (PVAC) and of PT power (PTP) due to the presence of aerosols lies between 13% and 22% and between 22% and 37%, respectively. In March 2006 a major dust outbreak occurred, which serves as an example to investigate the impact of an aerosol extreme event on solar power. During the dust outbreak, daily reductions of PVAC and PTP of up to 79% and 100% occur, with a mean reduction of 20% to 40% for PVAC and of 32% to 71% for PTP during the 12 days of the event.
Background & Objective: Due to the policy goals for sustainable energy production, renewable energy plants such as photovoltaics are increasingly in use. The energy production from solar radiation depends strongly on atmospheric conditions. As the weather constantly changes, electrical power generation fluctuates, making technical planning and control of power grids a complex problem. Due to the materials used (semiconductors such as silicon, gallium arsenide, and cadmium telluride), photovoltaic cells are spectrally selective, meaning that only radiation of certain wavelengths is converted into electrical energy. A material property called spectral response characterizes the degree of conversion of solar radiation into electric current at each wavelength of solar light.
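Given the spectral response SR(λ) (in A/W) and the spectral irradiance E(λ) (in W/(m²·nm)), the generated short-circuit current density follows by integrating their product over wavelength: J = ∫ SR(λ)·E(λ) dλ. A minimal sketch with trapezoidal integration; the function name and the flat example spectra in the test are illustrative, not data from the study:

```python
import numpy as np

def photocurrent_density(wavelengths_nm, spectral_response, spectral_irradiance):
    """J = ∫ SR(λ) · E(λ) dλ  — short-circuit current density in A/m².

    wavelengths_nm:      sampling grid in nm (ascending)
    spectral_response:   SR(λ) in A/W
    spectral_irradiance: E(λ) in W/(m²·nm)
    """
    f = spectral_response * spectral_irradiance
    # trapezoidal rule over the (possibly non-uniform) wavelength grid
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(wavelengths_nm)))
```

This is why a spectrally resolved irradiance model matters for PV yield estimates: two days with the same broadband irradiance but different spectra can yield different currents.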
In the field of domestic service robots, recovery from faults is crucial to promote user acceptance. In this context, this work focuses on specific faults which arise from the interaction of a robot with its real-world environment. Even a well-modelled robot may fail to perform its tasks successfully due to external faults which occur because of an infinite number of unforeseeable and unmodelled situations. By investigating the most frequent failures in typical scenarios observed in real-world demonstrations and competitions using the autonomous service robots Care-O-Bot III and youBot, we identified four different fault classes, caused by disturbances, imperfect perception, inadequate planning operators, or the chaining of action sequences. This thesis then presents two approaches to handle external faults caused by insufficient knowledge about the preconditions of a planning operator. The first approach reasons about detected external faults using knowledge of naive physics. The naive physics knowledge is represented by the physical properties of objects, which are formalized in a logical framework. The proposed approach applies a qualitative version of physical laws to these properties in order to reason. By interpreting the reasoning results, the robot identifies information about the situations which can cause the fault. Applying this approach to simple manipulation tasks like picking and placing objects shows that naive physics holds great potential for reasoning about unknown external faults in robotics. The second approach acquires missing knowledge about the execution of an action through learning by experimentation. First, it investigates a representation of execution-specific knowledge that can be learned for one particular situation and reused in situations which deviate from the original. The combination of symbolic and geometric models allows us to represent action execution knowledge effectively.
This representation is called an action execution model (AEM) here. The approach provides a learning strategy which uses a physical simulation to generate the training data for learning both the symbolic and the geometric aspects of the model. The experimental analysis, performed on two physical robots, shows that AEMs can reliably describe execution-specific knowledge and thereby serve as a potential model for avoiding the occurrence of external faults.
In mathematical modeling by means of performance models, the Fitness-Fatigue Model (FF-Model) is a common approach in sport and exercise science to study the training-performance relationship. The FF-Model uses an initial basic level of performance and two antagonistic terms (for fitness and fatigue). By model calibration, parameters are adapted to the subject's individual physical response to training load. Although the simulation of the recorded training data in most cases shows useful results once the model is calibrated and all parameters are adjusted, this method has two major difficulties. First, a fitted value for the basic performance will usually be too high. Second, without modification, the model cannot simply be used for prediction. By rewriting the FF-Model such that the effects of former training history can be analyzed separately – we call these terms preloads – it is possible to close the gap between a more realistic initial performance level and an athlete's actual performance level without distorting other model parameters, and to increase model accuracy substantially. The fitting error of the preload-extended FF-Model is less than 32% of the error of the FF-Model without preloads; the prediction error of the preload-extended FF-Model is around 54% of the error of the FF-Model without preloads.
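The classical FF-Model (in the Banister form) computes performance from an initial basic level plus antagonistic fitness and fatigue terms, each an exponentially weighted sum of past training loads. A minimal sketch of this basic model, before the preload extension; the parameter values in the test are illustrative, not fitted values from the paper:

```python
import numpy as np

def ff_model(loads, p0, k1, tau1, k2, tau2):
    """Classical Fitness-Fatigue model:

    p(t) = p0 + k1 * sum_{i<t} w_i * exp(-(t - i) / tau1)   # fitness
              - k2 * sum_{i<t} w_i * exp(-(t - i) / tau2)   # fatigue

    loads: daily training loads w_i; p0: basic performance level;
    k1, k2: gain factors; tau1 > tau2: decay time constants in days.
    """
    loads = np.asarray(loads, dtype=float)
    perf = np.zeros(len(loads))
    for t in range(len(loads)):
        i = np.arange(t)  # only past training days contribute
        fitness = np.sum(loads[:t] * np.exp(-(t - i) / tau1))
        fatigue = np.sum(loads[:t] * np.exp(-(t - i) / tau2))
        perf[t] = p0 + k1 * fitness - k2 * fatigue
    return perf
```

Because fatigue decays faster than fitness (tau2 < tau1) but weighs more heavily (k2 > k1), performance first drops after a training impulse and later rises above the basic level, the classic supercompensation pattern. The preload extension described above would add analogous exponential terms for training performed before the observation window.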