Systemic autoinflammatory diseases (SAIDs) are a group of inflammatory disorders caused by dysregulation in the innate immune system that leads to enhanced immune responses. The clinical diagnosis of SAIDs can be difficult since individually these are rare diseases with considerable phenotypic overlap. Most SAIDs have a strong genetic background, but environmental and epigenetic influences can modulate the clinical phenotype. Molecular diagnosis has become essential for confirmation of clinical diagnosis. To date there are over 30 genes and a variety of modes of inheritance that have been associated with monogenic SAIDs. Mutations in the same gene can lead to very distinct phenotypes and can have different inheritance patterns. In addition, somatic mutations have been reported in several of these conditions. New genetic testing methods and databases are being developed to facilitate the molecular diagnosis of SAIDs, which is of major importance for treatment, prognosis and genetic counselling. The aim of this review is to summarize the latest advances in genetic testing for SAIDs and discuss potential obstacles that might arise during the molecular diagnosis of SAIDs.
The pyrin inflammasome has evolved as an innate immune sensor to detect bacterial toxin-induced Rho guanosine triphosphatase (Rho GTPase)-inactivation, a process that is similar to the "guard" mechanism in plants. Rho GTPases act as molecular switches to regulate a variety of signal transduction pathways including cytoskeletal organization. Pathogens can modulate Rho GTPase activity to suppress host immune responses such as phagocytosis. Pyrin is encoded by MEFV, the gene that is mutated in patients with familial Mediterranean fever (FMF). FMF is the prototypic autoinflammatory disease characterized by recurring short episodes of systemic inflammation and is a common disorder in many populations in the Mediterranean basin. Pyrin specifically senses modifications in the activity of the small GTPase RhoA, which binds to many effector proteins including the serine/threonine-protein kinases PKN1 and PKN2 and actin-binding proteins. RhoA activation leads to PKN-mediated phosphorylation-dependent pyrin inhibition. Conversely, pathogen virulence factors downregulate RhoA activity in a variety of ways, and these changes are detected by the pyrin inflammasome irrespective of the type of modifications. MEFV pathogenic variants favor the active state of pyrin and elicit proinflammatory cytokine release and pyroptosis. They can be inherited either as a dominant or recessive trait depending on the variant's location and effect on the protein function. Mutations in the C-terminal B30.2 domain are usually considered recessive, although heterozygotes may manifest a biochemical or even a clinical phenotype. These variants are hypomorphic in regard to their effect on intramolecular interactions, but ultimately accentuate pyrin activity. Heterozygous mutations in other domains of pyrin affect residues critical for inhibition or protein oligomerization, and lead to a constitutively active inflammasome.
In healthy carriers of FMF mutations who have the subclinical inflammatory phenotype, the increased activity of pyrin might have been protective against endemic infections over human history. This finding is supported by the observation of high carrier frequencies of FMF-mutations in multiple populations. The pyrin inflammasome also plays a role in mediating inflammation in other autoinflammatory diseases linked to dysregulation in the actin polymerization pathway. Therefore, the assembly of the pyrin inflammasome is initiated in response to fluctuations in cytoplasmic homeostasis and perturbations in cytoskeletal dynamics.
Towards self-explaining social robots. Verbal explanation strategies for a needs-based architecture
(2019)
In order to establish long-term relationships with users, social companion robots and their behaviors need to be comprehensible. Purely reactive behavior such as answering questions or following commands can be readily interpreted by users. However, the robot's proactive behaviors, included in order to increase liveliness and improve the user experience, often raise a need for explanation. In this paper, we provide a concept to produce accessible “why-explanations” for the goal-directed behavior an autonomous, lively robot might produce. To this end we present an architecture that provides reasons for behaviors in terms of comprehensible needs and strategies of the robot, and we propose a model for generating different kinds of explanations.
Plant sap-feeding insects are widespread, having evolved to occupy diverse environmental niches despite exclusive feeding on an impoverished diet lacking in essential amino acids and vitamins. Success depends exquisitely on their symbiotic relationships with microbial symbionts housed within specialized eukaryotic bacteriocyte cells. Each bacteriocyte is packed with symbionts that are individually surrounded by a host-derived symbiosomal membrane representing the absolute host-symbiont interface. The symbiosomal membrane must be a dynamic and selectively permeable structure to enable bidirectional and differential movement of essential nutrients, metabolites, and biosynthetic intermediates, vital for growth and survival of host and symbiont. However, despite this crucial role, the molecular basis of membrane transport across the symbiosomal membrane remains unresolved in all bacteriocyte-containing insects. A transport protein was immuno-localized to the symbiosomal membrane separating the pea aphid Acyrthosiphon pisum from its intracellular symbiont Buchnera aphidicola. The transporter, A. pisum nonessential amino acid transporter 1, or ApNEAAT1 (gene: ACYPI008971), was characterized functionally following heterologous expression in Xenopus oocytes, and mediates both inward and outward transport of small dipolar amino acids (serine, proline, cysteine, alanine, glycine). Electroneutral ApNEAAT1 transport is driven by amino acid concentration gradients and is not coupled to transmembrane ion gradients. Previous metabolite profiling of hemolymph and bacteriocyte, alongside metabolic pathway analysis in host and symbiont, enable prediction of a physiological role for ApNEAAT1 in bidirectional host-symbiont amino acid transfer, supplying both host and symbiont with indispensable nutrients and biosynthetic precursors to facilitate metabolic complementarity.
The limited sodium availability of freshwater and terrestrial environments was a major physiological challenge during vertebrate evolution. The epithelial sodium channel (ENaC) is present in the apical membrane of sodium-absorbing vertebrate epithelia and evolved as part of a machinery for efficient sodium conservation. ENaC belongs to the degenerin/ENaC protein family and is the only member that opens without an external stimulus. We hypothesized that ENaC evolved from a proton-activated sodium channel present in ionocytes of freshwater vertebrates and therefore investigated whether such ancestral traits are present in ENaC isoforms of the aquatic pipid frog Xenopus laevis. Using whole-cell and single-channel electrophysiology of Xenopus oocytes expressing ENaC isoforms assembled from alpha beta gamma- or delta beta gamma-subunit combinations, we demonstrate that Xenopus delta beta gamma-ENaC is profoundly activated by extracellular acidification within biologically relevant ranges (pH 8.0-6.0). This effect was not observed in Xenopus alpha beta gamma-ENaC or human ENaC orthologs. We show that protons interfere with allosteric ENaC inhibition by extracellular sodium ions, thereby increasing the probability of channel opening. Using homology modeling of ENaC structure and site-directed mutagenesis, we identified a cleft region within the extracellular loop of the delta-subunit that contains several acidic amino acid residues that confer proton-sensitivity and enable allosteric inhibition by extracellular sodium ions. We propose that Xenopus delta beta gamma-ENaC can serve as a model for investigating ENaC transformation from a proton-activated toward a constitutively-active ion channel. Such transformation might have occurred during the evolution of tetrapod vertebrates to enable bulk sodium absorption during the water-to-land transition.
For protection from inhaled pathogens many strategies have evolved in the airways such as mucociliary clearance and cough. We have previously shown that protective respiratory reflexes to locally released bacterial bitter taste substances are most probably initiated by tracheal brush cells (BC). Our single-cell RNA-seq analysis of murine BC revealed high expression levels of cholinergic and bitter taste signaling transcripts (Tas2r108, Gnat3, Trpm5). We directly demonstrate the secretion of acetylcholine (ACh) from BC upon stimulation with the Tas2R agonist denatonium. Inhibition of the taste transduction cascade abolished the increase in [Ca2+]i in BC and subsequent ACh release. ACh release is regulated in an autocrine manner: while the muscarinic ACh receptors M3R and M1R are activating, M2R is inhibitory. Paracrine effects of ACh released in response to denatonium included an increased [Ca2+]i in ciliated cells. Stimulation by denatonium or with Pseudomonas quinolone signaling molecules led to an increase in mucociliary clearance in explanted tracheae that was Trpm5- and M3R-mediated. We show that ACh release from BC via the bitter taste cascade leads to immediate paracrine protective responses that can be boosted in an autocrine manner. This mechanism represents the initial step for the activation of innate immune responses against pathogens in the airways.
This article examines similarities and differences in the attitudes and social representations of destination managers towards implementing sustainable tourism between the mountain regions of the Alps and the Dinarides. Bearing in mind the transnational impacts (i.e., environmental, economic and social) of the tourism industry, the research methodology adopted an international perspective by sending a questionnaire to tourism organizations in fourteen different countries in the Alps and the Dinarides. The research is interdisciplinary in nature, because it integrates knowledge from sustainability and management science with tourism geography and social psychology. The findings confirm that social representations of sustainable tourism differ significantly in the two mountain regions.
The media is considered the fourth pillar of a democratic country. It acts as an effective control mechanism to check the other branches of government. But this is only consequential when the media functions in an independent and transparent fashion, with trained and neutral professionals who are aware of the accountability and consequences of their work. All these factors together would further the country as a democratic institution. Traditionally, it was legacy media that were responsible for a one-to-many communication process. Their goal was to provide information to citizens. But this changed with developments in technology and the use of social media in daily life. The internet brought with it new media formats which are easily accessible but also unstructured. These lowered barriers of entry into the media enabled citizens to become active participants in the communication process. As a result, these citizens developed a different relationship with the existing media, in which they were not only receivers of information but also co-producers. Real-time information allows users to communicate with each other and, in turn, to widely shape public opinion on internet platforms. A many-to-many communication style emerged. While, on the one hand, this type of discourse can be an opportunity for citizens to exercise their fundamental freedom of speech and expression, it is, on the other hand, proving detrimental in two respects: lack of neutrality, polarized views and pre-existing misconceptions on the part of citizens, and algorithms and the formation of echo chambers on the part of technology. Several questions arise in this scenario about the capability of citizen journalists, the duties they should adhere to alongside the enjoyment of their rights and freedoms, the risks involved in an unchecked method of communication, and the effect of citizen journalism on the democratic process.
Background & Objective: Due to the policy goals for sustainable energy production, renewable energy plants such as photovoltaics are increasingly in use. The energy production from solar radiation depends strongly on atmospheric conditions. As the weather constantly changes, electrical power generation fluctuates, making technical planning and control of power grids a complex problem. Due to the materials used (semiconductors, e.g. silicon, gallium arsenide, cadmium telluride), photovoltaic cells are spectrally selective: only radiation of certain wavelengths is converted into electrical energy. A material property called the spectral response characterizes the degree of conversion of solar radiation into electric current at each wavelength of solar light.
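The spectral-response relationship described above amounts to a wavelength integral: the generated current density is the integral of SR(λ)·E(λ) over wavelength. A minimal numerical sketch, with all material values invented purely for illustration:

```python
import numpy as np

# Illustrative only: toy spectral response and irradiance, not real data.
wavelengths = np.linspace(300.0, 1200.0, 901)          # nm, 1 nm steps

# Toy spectral irradiance E(lambda): flat 1.0 W/(m^2 nm) across the band.
irradiance = np.ones_like(wavelengths)

# Toy spectral response SR(lambda) of a silicon-like cell: grows linearly
# with wavelength up to a cut-off near 1100 nm, zero above the band gap.
spectral_response = np.where(wavelengths <= 1100.0,
                             0.8 * wavelengths / 1100.0, 0.0)   # A/W

# Current density J = integral of SR * E over wavelength (trapezoid rule).
integrand = spectral_response * irradiance
current_density = float(np.sum(0.5 * (integrand[1:] + integrand[:-1])
                               * np.diff(wavelengths)))         # A/m^2
print(f"{current_density:.1f} A/m^2")
```

Shifting the spectrum (e.g. under cloudy skies) changes the overlap with SR(λ), which is why yield depends on spectral, not just total, irradiance.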
We present a systematization of usable security principles, guidelines and patterns to facilitate the transfer of existing knowledge to researchers and practitioners. Based on a literature review, we extracted 23 principles, 11 guidelines and 47 patterns for usable security and identified their interconnection. The results indicate that current research tends to focus on only a subset of important principles. The fact that some principles are not yet addressed by any design patterns suggests that further work on refining these patterns is needed. We developed an online repository, which stores the harmonized principles, guidelines and patterns. The tool enables users to search for relevant guidance and explore it in an interactive and programmatic manner. We argue that both the insights presented in this article and the web-based repository will be highly valuable for students to get a good overview, practitioners to implement usable security and researchers to identify areas of future research.
Contemporary software is inherently distributed. The principles guiding the design of such software have been mainly manifested by the service-oriented architecture (SOA) concept. In a SOA, applications are orchestrated by software services generally operated by distinct entities. Due to the latter fact, service security has been of importance in such systems ever since. A dominant protocol for implementing SOA-based systems is SOAP, which comes with a well-elaborated security framework. As an alternative to SOAP, the architectural style representational state transfer (REST) is gaining traction as a simple, lightweight and flexible guideline for designing distributed service systems that scale at large. This paper starts by introducing the basic constraints representing REST. Based on these foundations, the focus is then drawn to the security needs of REST-based service systems. The limitations of transport-oriented protection means are emphasized and the demand for specific message-oriented safeguards is assessed. The paper then reviews the current activities with respect to REST-security and finds that the available schemes are mostly HTTP-centered and very heterogeneous. More importantly, all of the analyzed schemes contain vulnerabilities. The paper contributes a methodology on how to establish REST-security as a general security framework for protecting REST-based service systems of any kind by consistent and comprehensive protection means. First adoptions of the introduced approach are presented in relation to REST message authentication, with instantiations for RESTful HTTP (web/cloud services) and the RESTful Constrained Application Protocol (CoAP) (Internet of Things (IoT) services).
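A message-oriented safeguard of the kind the paper calls for can be sketched as a detached HMAC signature over the request parts that must stay intact end to end. This is a generic illustration of the idea, not the specific REST-security scheme the paper develops:

```python
import hashlib
import hmac

def sign_request(secret: bytes, method: str, uri: str, body: bytes) -> str:
    """Compute a detached signature over method, URI and a body digest."""
    message = b"\n".join([method.encode(), uri.encode(),
                          hashlib.sha256(body).digest()])
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def verify_request(secret: bytes, method: str, uri: str, body: bytes,
                   signature: str) -> bool:
    """Recompute the signature and compare in constant time."""
    return hmac.compare_digest(sign_request(secret, method, uri, body),
                               signature)

# A tampered URI invalidates the signature even if the body is unchanged.
sig = sign_request(b"shared-key", "GET", "/orders/42", b"")
ok = verify_request(b"shared-key", "GET", "/orders/42", b"", sig)
tampered = verify_request(b"shared-key", "GET", "/orders/43", b"", sig)
```

Unlike transport-level TLS, such a signature survives intermediaries and can be verified by the terminating service, which is the distinction the paper draws between transport-oriented and message-oriented protection.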
Energy Profiles of the Ring Puckering of Cyclopentane, Methylcyclopentane and Ethylcyclopentane
(2019)
Beyond HCI and CSCW: Challenges and Useful Practices Towards a Human-Centred Vision of AI and IA
(2019)
Incoming solar radiation is an important driver of our climate and weather. Several studies (see for instance Frank et al. 2018) have revealed discrepancies between ground-based irradiance measurements and the predictions of regional weather models. In the realm of electricity generation, accurate forecasts of solar photovoltaic (PV) energy yield are becoming indispensable for cost-effective grid operation: in Germany there are 1.6 million PV systems installed, with a nominal power of 46 GW (Bundesverband Solarwirtschaft 2019). The proliferation of PV systems provides a unique opportunity to characterise global irradiance with unprecedented spatiotemporal resolution, which in turn will allow for highly resolved PV power forecasts.
Renewable energies play an increasingly important role in energy production in Europe. Unlike coal or gas power plants, solar energy production is highly variable in space and time. This is due to the strong variability of clouds and their influence on the surface solar irradiance. Especially in regions with a large contribution from photovoltaic power production, the intermittent energy feed-in to the power grid can be a risk for grid stability. Therefore, good forecasts of the temporal and spatial variability of surface irradiance are necessary to properly regulate the power supply.
Emotion and gender recognition from facial features are important aspects of human empathy, and robots should have these capabilities as well. For this purpose we have designed special convolutional modules that allow a model to recognize emotions and gender with a considerably lower number of parameters, enabling real-time evaluation on a constrained platform. We report accuracies of 96% on the IMDB gender dataset and 66% on the FER-2013 emotion dataset, while requiring a computation time of less than 0.008 seconds on a Core i7 CPU. All our code, demos and pre-trained architectures have been released under an open-source license in our repository at https://github.com/oarriaga/face_classification.
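One standard route to parameter reductions of this magnitude, and a plausible reading of the "special convolutional modules", is the depthwise separable convolution. The layer sizes below are invented for illustration and are not the paper's exact architecture:

```python
def conv_params(kernel: int, c_in: int, c_out: int) -> int:
    # Standard kernel x kernel convolution (bias omitted).
    return kernel * kernel * c_in * c_out

def separable_conv_params(kernel: int, c_in: int, c_out: int) -> int:
    # Depthwise kernel per input channel, then a 1x1 pointwise mixing step.
    return kernel * kernel * c_in + c_in * c_out

# Hypothetical layer: 3x3 kernel, 64 input channels, 128 output channels.
standard = conv_params(3, 64, 128)             # 73,728 weights
separable = separable_conv_params(3, 64, 128)  # 576 + 8,192 = 8,768 weights
reduction = standard / separable               # roughly 8.4x fewer parameters
```

Stacking such modules is how compact vision models keep accuracy while shrinking enough for real-time inference on constrained hardware.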
Risk-based authentication (RBA) is an adaptive security measure to strengthen password-based authentication. RBA monitors additional implicit features during password entry such as device or geolocation information, and requests additional authentication factors if a certain risk level is detected. RBA is recommended by the NIST digital identity guidelines, is used by several large online services, and offers protection against security risks such as password database leaks, credential stuffing, insecure passwords and large-scale guessing attacks. Despite its relevance, the procedures used by RBA-instrumented online services are currently not disclosed. Consequently, there is little scientific research about RBA, slowing down progress and deeper understanding, making it harder for end users to understand the security provided by the services they use and trust, and hindering the widespread adoption of RBA.
In this paper, with a series of studies on eight popular online services, we (i) analyze which features and combinations/classifiers are used and are useful in practical instances, (ii) develop a framework and a methodology to measure RBA in the wild, and (iii) survey and discuss the differences in the user interfaces for RBA. In doing so, our work provides a first deeper understanding of practical RBA deployments and helps foster further research in this direction.
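A toy version of the feature-based risk estimation that RBA performs during password entry might look as follows; the feature names, weights and threshold are invented for illustration and do not reflect any of the services analyzed in the paper:

```python
def risk_score(current: dict, history: list, weights: dict) -> float:
    """Sum weighted penalties for feature values unseen in the login history."""
    score = 0.0
    for feature, weight in weights.items():
        seen = [login.get(feature) for login in history]
        if not seen:
            continue
        # Relative frequency of the observed value in past logins; a value
        # never seen before contributes the feature's full weight.
        frequency = seen.count(current.get(feature)) / len(seen)
        score += weight * (1.0 - frequency)
    return score

WEIGHTS = {"country": 0.6, "device": 0.3, "browser": 0.1}   # invented
history = [{"country": "DE", "device": "laptop", "browser": "firefox"}] * 5

usual = risk_score({"country": "DE", "device": "laptop",
                    "browser": "firefox"}, history, WEIGHTS)
unusual = risk_score({"country": "US", "device": "phone",
                      "browser": "firefox"}, history, WEIGHTS)
# A service would request an additional authentication factor once the
# score exceeds some threshold: here usual is 0.0 while unusual is 0.9.
```

Real deployments use richer features and probabilistic classifiers, but the principle, penalizing deviations from a per-user login history, is the same.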
The need for innovation around the control functions of inverters is great. PV inverters were initially expected to be passive followers of the grid and to disconnect as soon as abnormal conditions occurred. Since future power systems will be dominated by generation and storage resources interfaced through inverters, these converters must move from following to forming and sustaining the grid. As “digital natives”, PV inverters can also play an important role in the digitalisation of distribution networks. In this short review we identified a large potential to make the PV inverter the smart local hub in a distributed energy system. At the micro level, costs and coordination can be improved with bidirectional inverters between the AC grid and PV production, stationary storage, car chargers and DC loads. At the macro level, the distributed nature of PV generation means that the same devices will support both the local distribution network and the global stability of the grid. Much success has been obtained in the former; the latter remains a challenge, in particular in terms of scaling. Yet there is some urgency in researching and demonstrating such solutions. And while digitalisation offers promise in all control aspects, it also raises significant cybersecurity concerns.
This work introduces a semi-Lagrangian lattice Boltzmann (SLLBM) solver for compressible flows (with or without discontinuities). It makes use of a cell-wise representation of the simulation domain and utilizes interpolation polynomials up to fourth order to conduct the streaming step. The SLLBM solver allows for an independent time step size due to the absence of a time integrator and for the use of unusual velocity sets, like a D2Q25, which is constructed by the roots of the fifth-order Hermite polynomial. The properties of the proposed model are shown in diverse example simulations of a Sod shock tube, a two-dimensional Riemann problem and a shock-vortex interaction. It is shown that the cell-based interpolation and the use of Gauss-Lobatto-Chebyshev support points allow for spatially high-order solutions and minimize the mass loss caused by the interpolation. Transformed grids in the shock-vortex interaction show the general applicability to non-uniform grids.
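The interpolation at the heart of the streaming step can be illustrated in one dimension: each node evaluates the distribution function at its departure point with a Lagrange polynomial over the cell's support points. This is a sketch of the principle only, not the solver's implementation:

```python
def lagrange_eval(xs, fs, x):
    """Evaluate the Lagrange interpolation polynomial through the support
    points (xs, fs) at position x."""
    total = 0.0
    for i, xi in enumerate(xs):
        basis = 1.0
        for j, xj in enumerate(xs):
            if i != j:
                basis *= (x - xj) / (xi - xj)
        total += fs[i] * basis
    return total

# Four support points reproduce any cubic exactly, so a departure point
# x - c*dt that falls between grid nodes is recovered without streaming
# error for data that is polynomial up to third order.
xs = [0.0, 1.0, 2.0, 3.0]
fs = [x**3 for x in xs]                    # f(x) = x^3 at the support points
f_departure = lagrange_eval(xs, fs, 1.5)   # exact value is 1.5**3 = 3.375
```

Because the departure point is computed directly, the time step is no longer tied to the lattice spacing, which is the independence the abstract highlights.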
More and more devices will be connected to the internet [3]. Many devices are part of the so-called Internet of Things (IoT), which comprises many low-power devices that are often battery-powered. These devices mainly communicate with the manufacturer's back-end and transmit personal data and secrets such as passwords.
In this thesis, unique administrative data, a relevant time of follow-up and advanced statistical measures to handle confounding have been utilized in order to provide new and informative evidence on the effects of vocational rehabilitation programs on work participation outcomes in Germany. While re-affirming the important role of micro-level determinants, the present study provides an extensive example of the individual and fiscal effects that are possible through meaningful vocational rehabilitation measures. The analysis showed that the principal objective, namely, to improve participation in employment, was generally achieved. Contrary to the common misconception that “off-the-job training” is relatively ineffective, this thesis has provided an empirical example of the positive impact of the programs.
Computer graphics research strives to synthesize images of a high visual realism that are indistinguishable from real visual experiences. While modern image synthesis approaches enable the creation of digital images of astonishing complexity and beauty, processing resources remain a limiting factor. Here, rendering efficiency is a central challenge involving a trade-off between visual fidelity and interactivity. For that reason, there is still a fundamental difference between the perception of the physical world and computer-generated imagery. At the same time, advances in display technologies drive the development of novel display devices. The dynamic range, the pixel densities, and refresh rates are constantly increasing. Display systems address a larger visual field by covering a wider field-of-view, either due to their size or in the form of head-mounted devices. Currently, research prototypes range from stereo and multi-view systems, head-mounted devices with adaptable lenses, up to retinal projection and lightfield/holographic displays. Computer graphics has to keep step, as driving these devices presents us with immense challenges, most of which are currently unsolved. Fortunately, the human visual system has certain limitations, which means that providing the highest possible visual quality is not always necessary. Visual input passes through the eye’s optics, is filtered, and is processed at higher level structures in the brain. Knowledge of these processes helps to design novel rendering approaches that allow the creation of images at a higher quality and within a reduced time-frame. This thesis presents the state-of-the-art research and models that exploit the limitations of perception in order to increase visual quality but also to reduce workload alike - a concept we call perception-driven rendering.
This research results in several practical rendering approaches that allow some of the fundamental challenges of computer graphics to be tackled. By using different tracking hardware, display systems, and head-mounted devices, we show the potential of each of the presented systems. The capturing of specific processes of the human visual system can be improved by combining multiple measurements using machine learning techniques. Different sampling, filtering, and reconstruction techniques aid the visual quality of the synthesized images. An in-depth evaluation of the presented systems including benchmarks, comparative examination with image metrics as well as user studies and experiments demonstrated that the methods introduced are visually superior or on the same qualitative level as ground truth, whilst having a significantly reduced computational complexity.
Process-dependent thermo-mechanical viscoelastic properties and the corresponding morphology of HDPE extrusion blow molded (EBM) parts were investigated. Evaluation of bulk data showed that flow direction, draw ratio, and mold temperature influence the viscoelastic behavior significantly in certain temperature ranges. Flow induced orientations due to higher draw ratio and higher mold temperature lead to higher crystallinities. To determine the local viscoelastic properties, a new microindentation system was developed by merging indentation with dynamic mechanical analysis. The local process-structure-property relationship of EBM parts showed that the cross-sectional temperature distribution is clearly reflected by local crystallinities and local complex moduli. Additionally, a model to calculate three-dimensional anisotropic coefficients of thermal expansion as a function of the process dependent crystallinity was developed based on an elementary volume unit cell with stacked layers of amorphous phase and crystalline lamellae. Good agreement of the predicted thermal expansion coefficients with measured ones was found up to a temperature of 70 °C.
The aim of this study was to investigate whether beneficial vacation effects can be strengthened and prolonged with a smartphone-based intervention. In a four-week longitudinal study among 79 Finnish teachers, we investigated the development of recovery, well-being, and job performance before, during, and after a one-week vacation in three groups: non-users (n = 51), passive (n = 18) and active (n = 10) users. Participants were instructed to actively use a recovery app (called Holidaily) and complete five digital questionnaires. Most recovery experiences and well-being indicators increased during the vacation. Job performance and concentration capacity showed no significant time effects. Among active app users, creativity at work increased from baseline to after the vacation, whereas among non-users it decreased, and among passive users it decreased a few days after the vacation but increased again one and a half weeks after the vacation. The fading of beneficial vacation effects on negative affect seems to have been slower among active app users. Only a few participants used the app actively. Still, the results suggest that a smartphone-based recovery intervention may support beneficial vacation effects.
2-methylacetoacetyl-coenzyme A thiolase (beta-ketothiolase) deficiency: one disease - two pathways
(2019)
Background: 2-methylacetoacetyl-coenzyme A thiolase deficiency (MATD; deficiency of mitochondrial acetoacetyl-coenzyme A thiolase T2/“beta-ketothiolase”) is an autosomal recessive disorder of ketone body utilization and isoleucine degradation due to mutations in ACAT1.
Methods: We performed a systematic literature search for all available clinical descriptions of patients with MATD. 244 patients were identified and included in this analysis. Clinical course and biochemical data are presented and discussed.
Results: For 89.6% of patients at least one acute metabolic decompensation was reported. Age at first symptoms ranged from 2 days to 8 years (median 12 months). More than 82% of patients presented in the first two years of life, while manifestation in the neonatal period was the exception (3.4%). 77.0% of patients (157 of 204) showed normal psychomotor development without neurologic abnormalities.
Conclusion: This comprehensive data analysis provides a systematic overview of all cases with MATD identified in the literature. It demonstrates that MATD is a rather benign disorder with an often favourable outcome when compared with many other organic acidurias.
Background 3-hydroxy-3-methylglutaryl-coenzyme A lyase deficiency (HMGCLD) is an autosomal recessive disorder of ketogenesis and leucine degradation due to mutations in HMGCL.
Method We performed a systematic literature search to identify all published cases. 211 patients for whom relevant clinical data were available were included in this analysis. Clinical course, biochemical findings and mutation data are highlighted and discussed. An overview of all published HMGCL variants is provided.
Results More than 95% of patients presented with acute metabolic decompensation. Most patients manifested within the first year of life, 42.4% already neonatally. Very few individuals remained asymptomatic. The neurologic long-term outcome was favorable with 62.6% of patients showing normal development.
Conclusion This comprehensive data analysis provides a systematic overview of all published cases with HMGCLD, including a list of all known HMGCL mutations.
Modern Monte-Carlo-based rendering systems still suffer from the computational complexity involved in the generation of noise-free images, making it challenging to synthesize interactive previews. We present a framework suited for rendering such previews of static scenes using a caching technique that builds upon a linkless octree. Our approach allows for memory-efficient storage and constant-time lookup to cache diffuse illumination at multiple hitpoints along the traced paths. Non-diffuse surfaces are dealt with in a hybrid way in order to reconstruct view-dependent illumination while maintaining interactive frame rates. By evaluating the visual fidelity against ground truth sequences and by benchmarking, we show that our approach compares well to low-noise path traced results, but with a greatly reduced computational complexity allowing for interactive frame rates. This way, our caching technique provides a useful tool for global illumination previews and multi-view rendering.
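The constant-time lookup a linkless octree affords can be sketched by keying cache entries with the Morton code of their cell instead of chasing child pointers; the (level, code) key layout and the cached value below are assumptions for illustration, not the paper's exact scheme:

```python
def morton3(x: int, y: int, z: int, bits: int = 10) -> int:
    """Interleave the bits of three cell coordinates into one Morton code."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (3 * i)
        code |= ((y >> i) & 1) << (3 * i + 1)
        code |= ((z >> i) & 1) << (3 * i + 2)
    return code

# Pointerless storage: a flat hash map from (octree level, Morton code) to
# cached diffuse radiance, giving O(1) lookups at any hitpoint.
cache = {}
cache[(4, morton3(3, 5, 1))] = (0.8, 0.6, 0.4)      # invented RGB value
radiance = cache.get((4, morton3(3, 5, 1)))
```

Because a cell's code is computable directly from a hitpoint's coordinates, a lookup never has to traverse the tree, which is what makes the cache viable at interactive frame rates.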
The complex nature of multifactorial diseases, such as Morbus Alzheimer, has produced a strong need to design multitarget-directed ligands to address the involved complementary pathways. We performed a purposive structural modification of a tetratarget small molecule, namely contilisant, and generated a combinatorial library of 28 substituted chromen-4-ones. The compounds comprise a basic moiety which is linker-connected to the 6-position of the heterocyclic chromenone core. The syntheses were accomplished by Mitsunobu- or Williamson-type ether formations. The resulting library members were evaluated at a panel of seven human enzymes, all of which are involved in the pathophysiology of neurodegeneration. A concomitant inhibition of human acetylcholinesterase and human monoamine oxidase B, with IC50 values of 5.58 and 7.20 μM, respectively, was achieved with the dual-target 6-(4-(piperidin-1-yl)butoxy)-4H-chromen-4-one (7).
Bone tissue engineering is an ever-changing, rapidly evolving, and highly interdisciplinary field of study, where scientists try to mimic natural bone structure as closely as possible in order to facilitate bone healing. New insights from cell biology, specifically from mesenchymal stem cell differentiation and signaling, lead to new approaches in bone regeneration. Novel scaffold and drug release materials based on polysaccharides gain increasing attention due to their wide availability and good biocompatibility, and can be used as hydrogels and/or hybrid components for drug release and tissue engineering. This article reviews the current state of the art, recent developments, and future perspectives in polysaccharide-based systems used for bone regeneration.
In an effort to assist researchers in choosing basis sets for quantum mechanical modeling of molecules (i.e. balancing calculation cost against desired accuracy), we present a systematic study on the accuracy of computed conformational relative energies and their geometries in comparison to MP2/CBS and MP2/AV5Z data, respectively. In order to do so, we introduce a new nomenclature to unambiguously indicate how a CBS extrapolation was computed. Nineteen minima and transition states of buta-1,3-diene, propan-2-ol and the water dimer were optimized using forty-five different basis sets. Specifically, this includes one Pople (i.e. 6-31G(d)), eight Dunning (i.e. VXZ and AVXZ, X=2-5), twenty-five Jensen (i.e. pc-n, pcseg-n, aug-pcseg-n, pcSseg-n and aug-pcSseg-n, n=0-4) and nine Karlsruhe (e.g. def2-SV(P), def2-QZVPPD) basis sets. The molecules were chosen to represent both common and electronically diverse molecular systems. In comparison to MP2/CBS relative energies computed using the largest Jensen basis sets (i.e. n=2,3,4), the use of smaller sizes (n=0,1,2 and n=1,2,3) provides results that are within 0.11-0.24 and 0.09-0.16 kcal/mol, respectively. To practically guide researchers in their basis set choice, an equation is introduced that ranks basis sets based on a user-defined balance between their accuracy and calculation cost. Furthermore, we explain why the aug-pcseg-2, def2-TZVPPD and def2-TZVP basis sets are very suitable choices to balance speed and accuracy.
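The ranking equation itself is not reproduced in this abstract; the sketch below only illustrates the general idea of a user-weighted accuracy/cost score. All basis set names are real, but the error values, cost values, and weighting form are hypothetical placeholders, not the paper's data or equation.

```python
# Hypothetical ranking of basis sets by a user-defined accuracy/cost
# balance. Error and cost numbers below are illustrative placeholders.

basis_sets = {
    # name: (mean abs. error vs MP2/CBS [kcal/mol], relative CPU cost)
    "pcseg-1":      (0.24, 1.0),
    "aug-pcseg-2":  (0.11, 8.0),
    "def2-TZVP":    (0.14, 6.0),
    "def2-TZVPPD":  (0.12, 9.0),
}

def score(error, cost, w=0.7):
    """Lower is better: w weights accuracy, (1 - w) weights cost.
    Both quantities are normalized to the worst case in the set."""
    max_err = max(e for e, _ in basis_sets.values())
    max_cost = max(c for _, c in basis_sets.values())
    return w * error / max_err + (1 - w) * cost / max_cost

ranked = sorted(basis_sets, key=lambda b: score(*basis_sets[b]))
print(ranked[0])
```

With the accuracy weight w = 0.7, the score favors aug-pcseg-2 in this toy set; shifting w toward 0 would instead favor the cheapest sets.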
Currently, a variety of methods exist for creating different types of spatio-temporal world models. Despite the numerous methods for this type of modeling, there exists no methodology for comparing the different approaches or their suitability for a given application, e.g. logistics robots. In order to establish a means for comparing and selecting the best-fitting spatio-temporal world modeling technique, a methodology and a standard set of criteria must be established. To that end, state-of-the-art methods for this type of modeling will be collected, listed, and described. Existing methods used for evaluation will also be collected where possible.
Using the collected methods, new criteria and techniques will be devised to enable the comparison of various methods in a qualitative manner. Experiments will be proposed to further narrow and ultimately select a spatio-temporal model for a given purpose. An example network of autonomous logistic robots, ROPOD, will serve as a case study used to demonstrate the use of the new criteria. This will also serve to guide the design of future experiments that aim to select a spatio-temporal world modeling technique for a given task. ROPOD was specifically selected as it operates in a real-world, human shared environment. This type of environment is desirable for experiments as it provides a unique combination of common and novel problems that arise when selecting an appropriate spatio-temporal world model. Using the developed criteria, a qualitative analysis will be applied to the selected methods to remove unfit options.
Then, experiments will be run on the remaining methods to provide comparative benchmarks. Finally, the results will be analyzed and recommendations to ROPOD will be made.
Multi-robot systems (MRS) are capable of performing a set of tasks by dividing them among the robots in the fleet. One of the challenges of working with multi-robot systems is deciding which robot should execute each task. Multi-robot task allocation (MRTA) algorithms address this problem by explicitly assigning tasks to robots with the goal of maximizing the overall performance of the system. The indoor transportation of goods is a practical application of multi-robot systems in the area of logistics. The ROPOD project works on developing multi-robot system solutions for logistics in hospital facilities. The correct selection of an MRTA algorithm is crucial for enhancing transportation tasks. Several multi-robot task allocation algorithms exist in the literature, but only a few experimental comparative analyses have been performed. This project analyzes and assesses the performance of MRTA algorithms for allocating supply cart transportation tasks to a fleet of robots. We conducted a qualitative analysis of MRTA algorithms, selected the most suitable ones based on the ROPOD requirements, implemented four of them (MURDOCH, SSI, TeSSI, and TeSSIduo), and evaluated the quality of their allocations using a common experimental setup and 10 experiments. Our experiments include off-line and semi on-line allocation of tasks as well as scalability tests, and use virtual robots implemented as Docker containers. This design should facilitate deployment of the system on the physical robots. Our experiments conclude that TeSSI and TeSSIduo best suit the ROPOD requirements. Both use temporal constraints to build task schedules and run in polynomial time, which allows them to scale well with the number of tasks and robots. TeSSI distributes the tasks among more robots in the fleet, while TeSSIduo tends to use a lower percentage of the available robots.
Subsequently, we have integrated TeSSI and TeSSIduo to perform multi-robot task allocation for the ROPOD project.
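As a rough illustration of how auction-based allocation such as SSI proceeds, the following minimal sketch assigns, in each round, the single cheapest (robot, task) pair by marginal travel cost. The robot names, positions, and straight-line cost model are illustrative assumptions; TeSSI-style bids would additionally account for temporal constraints and task schedules.

```python
# Minimal sketch of sequential single-item (SSI) task allocation:
# in each round, every robot bids its marginal path cost for every
# unallocated task, and the cheapest (robot, task) pair wins.

def path_cost(start, tasks):
    """Total straight-line travel cost to visit tasks in the given order."""
    cost, pos = 0.0, start
    for t in tasks:
        cost += ((pos[0] - t[0]) ** 2 + (pos[1] - t[1]) ** 2) ** 0.5
        pos = t
    return cost

def ssi_allocate(robots, tasks):
    """robots: {name: (x, y)}, tasks: list of (x, y) pickup points."""
    schedules = {r: [] for r in robots}
    unallocated = list(tasks)
    while unallocated:
        best = None  # (marginal cost, robot, task)
        for r, start in robots.items():
            base = path_cost(start, schedules[r])
            for t in unallocated:
                marginal = path_cost(start, schedules[r] + [t]) - base
                if best is None or marginal < best[0]:
                    best = (marginal, r, t)
        _, r, t = best
        schedules[r].append(t)
        unallocated.remove(t)
    return schedules

robots = {"ropod_1": (0.0, 0.0), "ropod_2": (10.0, 0.0)}
tasks = [(1.0, 0.0), (9.0, 0.0), (2.0, 0.0)]
print(ssi_allocate(robots, tasks))
```

In this toy instance the two tasks near the origin go to ropod_1 and the remaining one to ropod_2, mirroring how marginal-cost bidding tends to cluster nearby tasks on one robot.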
The Learning Culture Survey (LCS) is a questionnaire-based study investigating students’ perceptions of and expectations towards Higher Education (HE). The aim of this survey is to improve our understanding of the sources of cultural conflicts in educational scenarios. This understanding should help us to predict potential conflict situations and develop supportive measures.
After three years of development, the LCS was initiated in 2010 in South Korea and Germany. During the following years, the investigations were extended to further countries. The results provided insights about the cultural context of HE in general on the one hand, and about specific (national/regional) characteristics of learners in HE on the other. Most issues targeted by the questionnaire were directly linked to value systems. Thus, we expected from the beginning that the collected data would remain valid over longer periods of time. However, we had no evidence regarding the actual persistence of learning culture. For a study designed to be implemented at a global scale and to provide input for further applications, persistence is a basic condition justifying related investigations.
To answer the question of persistence, we repeated the LCS at our university every four years between 2010 and 2018/19. Apart from a small number of slight changes, explainable by their situational context, the overall results remained consistent over the investigated years. In this paper, after an introduction of the LCS’ concept, setting, and general results from the past years, we present the insights from our most recently finalized longitudinal study on learning culture.
Digital transformation in Higher Education and Science is a mission-critical demand to prepare educational institutions for future competition on the international market. In many cases, digitization goes along with the search for and acquisition of new software. For easily exchangeable software, wrong product decisions lead, in the worst case, to calculable financial losses. However, if a planned software product requires a lot of technological adjustment and is to be applied as a central component of a business- and/or security-critical environment, wrong decisions during the software acquisition process might lead to damage that is hard to quantify. The questions that arise are how to decide on a product and how many resources should be invested in the acquisition process.
We planned to introduce a commercial Business Support System to replace the currently used in-house developed software. Our goals were to increase our university’s level of data security, to ease the interaction between stakeholders, to eliminate media discontinuities, to improve process management and transparency, and to reduce the execution time of automated processes. Along with the introduction of the electronic case file, our agenda stipulates the digitization (and automation) of administrative university processes, especially, but not limited to, the student self-service and the administrative student life cycle. The usual tools and practices commonly applied to (simple) software acquisition failed in our scenario.
With the case study introduced in this paper, we address everyone involved in software acquisition processes: from our experience, we strongly recommend placing greater value on an exhaustively completed acquisition process than on short-term economic advantages.
Large display environments are highly suitable for immersive analytics. They provide enough space for effective co-located collaboration and allow users to immerse themselves in the data. To provide the best setting - in terms of visualization and interaction - for the collaborative analysis of a real-world task, we have to understand the group dynamics during work on large displays. Among other things, we have to study what effects different task conditions have on user behavior.
In this paper, we investigated the effects of task conditions on group behavior regarding collaborative coupling and territoriality during co-located collaboration on a wall-sized display. For that, we designed two tasks: a task that resembles the information foraging loop and a task that resembles the connecting facts activity. Both tasks represent essential sub-processes of the sensemaking process in visual analytics and cause distinct space/display usage conditions. The information foraging activity requires the user to work with individual data elements to look into details. Here, the users predominantly occupy only a small portion of the display. In contrast, the connecting facts activity requires the user to work with the entire information space. Therefore, the user has to overview the entire display.
We observed 12 groups for an average of two hours each and gathered qualitative data and quantitative data. During data analysis, we focused specifically on participants' collaborative coupling and territorial behavior.
We found that participants tended to subdivide the task in order to approach it, in their opinion, more effectively in parallel. We describe the subdivision strategies for both task conditions. We also detected and described multiple user roles, as well as a new coupling style that fits neither of the established categories, loosely or tightly coupled. Moreover, we observed a territory type that has not been mentioned previously in the research literature. In our opinion, this territory type can negatively affect the collaboration process of groups with more than two collaborators. Finally, we investigated critical display regions in terms of ergonomics. We found that users perceived some regions as less comfortable for long-term work.
The Peren-Clement index (PCI) is a methodology to analyze country-specific risk for businesses engaged in international trade and direct investment. This index, established in 1998, provides a guideline when deciding which foreign markets offer the possibility for additional business engagement and investment, and to what extent existing engagement or investment can be increased or should be reduced.
In mathematical modeling by means of performance models, the Fitness-Fatigue Model (FF-Model) is a common approach in sport and exercise science to study the training-performance relationship. The FF-Model uses an initial basic level of performance and two antagonistic terms (for fitness and fatigue). Through model calibration, parameters are adapted to the subject’s individual physical response to training load. Although the simulation of the recorded training data in most cases shows useful results once the model is calibrated and all parameters are adjusted, this method has two major difficulties. First, a fitted value for basic performance will usually be too high. Second, without modification, the model cannot simply be used for prediction. By rewriting the FF-Model such that the effects of former training history can be analyzed separately - we call those terms preload - it is possible to close the gap between a more realistic initial performance level and an athlete's actual performance level without distorting other model parameters, and to increase model accuracy substantially. The fitting error of the preload-extended FF-Model is less than 32% of the error of the FF-Model without preloads. The prediction error of the preload-extended FF-Model is around 54% of the error of the FF-Model without preloads.
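The classical FF-Model combines a basic performance level with exponentially decaying fitness and fatigue responses to past training loads; the sketch below shows how hypothetical "preload" terms for prior training history could enter such a model. All parameter values are illustrative assumptions, not calibrated values from the paper.

```python
import math

# Sketch of the Fitness-Fatigue model with "preload" terms summarizing
# training done before the modeled period. Parameters are illustrative.

def ff_performance(t, loads, p0, k1, tau1, k2, tau2,
                   fit_preload=0.0, fat_preload=0.0):
    """Performance at day t from daily training loads[0..t-1].
    fit_preload/fat_preload decay from day 0 and stand in for prior
    training history, so p0 can stay at a realistic baseline."""
    fitness = fit_preload * math.exp(-t / tau1)
    fatigue = fat_preload * math.exp(-t / tau2)
    for s, w in enumerate(loads[:t]):
        fitness += w * math.exp(-(t - s) / tau1)
        fatigue += w * math.exp(-(t - s) / tau2)
    return p0 + k1 * fitness - k2 * fatigue

loads = [100.0] * 30  # constant daily training load
p_plain = ff_performance(30, loads, p0=260, k1=0.10, tau1=45, k2=0.12, tau2=15)
p_pre = ff_performance(30, loads, p0=260, k1=0.10, tau1=45, k2=0.12, tau2=15,
                       fit_preload=500.0, fat_preload=50.0)
print(p_plain, p_pre)
```

Because the fitness preload decays slowly (tau1 > tau2), the preloaded athlete starts and stays above the plain model's trajectory without inflating the basic performance level p0.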
This work presents preliminary research towards developing an adaptive tool for fault detection and diagnosis of distributed robotic systems using explainable machine learning methods. Autonomous robots are complex systems that require high reliability in order to operate in different environments. This is even more the case for distributed robotic systems, where the task of fault detection and diagnosis becomes exponentially more difficult.
To diagnose systems, models representing the behaviour under investigation need to be developed. With distributed robotic systems generating large amounts of data, machine learning becomes an attractive modelling method, especially because of its high performance. However, with current-day methods such as artificial neural networks (ANNs), the issue of explainability arises: learnt models lack the ability to give explainable reasons for their decisions.
This paper presents current trends in methods for data collection from distributed systems, inductive logic programming (ILP) as an explainable machine learning method, and fault detection and diagnosis.
In the field of service robots, dealing with faults is crucial to promote user acceptance. In this context, this work focuses on specific faults which arise from the interaction of a robot with its real-world environment due to insufficient knowledge for action execution.
In our previous work [1], we have shown that such missing knowledge can be obtained through learning by experimentation. The combination of symbolic and geometric models allows us to represent action execution knowledge effectively. However, we did not propose a suitable representation of the symbolic model.
In this work we investigate such symbolic representation and evaluate its learning capability. The experimental analysis is performed on four use cases using four different learning paradigms. As a result, the symbolic representation together with the most suitable learning paradigm are identified.
In Sensor-based Fault Detection and Diagnosis (SFDD) methods, spatial and temporal dependencies among the sensor signals can be modeled to detect faults in the sensors if the defined dependencies change over time. In this work, we model Granger causal relationships between pairs of sensor data streams to detect changes in their dependencies. We compare the method on simulated signals with the Pearson correlation, and show that the method elegantly handles noise and lags in the signals and provides appreciable dependency detection. We further evaluate the method using sensor data from a mobile robot by injecting both internal and external faults during operation of the robot. The results show that the method is able to detect changes in the system when faults are injected, but is also prone to detecting false positives. This suggests that the method can be used for weak fault detection, but that other methods, such as the use of a structural model, are required to reliably detect and diagnose faults.
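The core idea of a Granger-type dependency test - that lagged values of one signal improve the prediction of another beyond its own history - can be sketched as follows. The synthetic signals, the single-lag model, and the plain residual-sum-of-squares comparison are illustrative simplifications; a real analysis would use a proper F-test over several lags.

```python
import math

# Toy illustration of the idea behind Granger causality (stdlib only):
# x "Granger-causes" y if adding lagged x to an autoregression of y
# reduces the residual error.

x = [math.sin(0.3 * t) + 0.1 * math.cos(2.1 * t) for t in range(200)]
y = [0.0]
for t in range(1, 200):
    y.append(0.5 * y[t - 1] + 0.8 * x[t - 1])   # y driven by lagged x

def rss_restricted(y):
    """Residual sum of squares of y[t] ~ a * y[t-1]."""
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    a = num / den
    return sum((y[t] - a * y[t - 1]) ** 2 for t in range(1, len(y)))

def rss_unrestricted(y, x):
    """RSS of y[t] ~ a * y[t-1] + b * x[t-1] via 2x2 normal equations."""
    syy = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    sxx = sum(x[t - 1] ** 2 for t in range(1, len(y)))
    sxy = sum(x[t - 1] * y[t - 1] for t in range(1, len(y)))
    ry = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    rx = sum(y[t] * x[t - 1] for t in range(1, len(y)))
    det = syy * sxx - sxy ** 2
    a = (ry * sxx - rx * sxy) / det
    b = (rx * syy - ry * sxy) / det
    return sum((y[t] - a * y[t - 1] - b * x[t - 1]) ** 2
               for t in range(1, len(y)))

r_r, r_u = rss_restricted(y), rss_unrestricted(y, x)
print(r_u < 0.5 * r_r)  # lagged x clearly improves the fit
```

Note that an instantaneous Pearson correlation could miss such a purely lagged dependency, which is the motivation for the Granger-based approach in the abstract above.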
This paper proposes an approach to an ANN-based temperature controller design for a plastic injection moulding system. This design approach is applied to the development of a controller based on a combination of a classical ANN and an integrator. The controller provides a fast temperature response and zero steady-state error for three typical heaters (bar, nozzle, and cartridge) of a plastic moulding system. Simulation results in MATLAB Simulink show the advantages of the controller over an industrial PID regulator with autotuning, such as significantly less overshoot and a faster transient response for all examined heaters. In order to verify the proposed approach, the designed ANN controller was implemented and tested using an experimental setup based on an STM32 board.
Quantifying Interference in WiLD Networks using Topography Data and Realistic Antenna Patterns
(2019)
Avoiding possible interference is a key aspect of maximizing the performance of Wi-Fi-based Long Distance (WiLD) networks. In this paper we quantify self-induced interference based on data derived from our testbed and match the findings against simulations. By enhancing current simulation models with two key elements, we significantly reduce the deviation between testbed and simulation: the usage of detailed antenna patterns instead of the cone model, and propagation modeling enhanced by license-free topography data. Based on the gathered data, we discuss several possible optimization approaches, such as physical separation of local radios, tuning the sensitivity of the transmitter, and using centralized rather than distributed channel assignment algorithms. While our testbed is based on 5 GHz Wi-Fi, we briefly discuss the possible impact of our results on other frequency bands.
Synthesis of Substituted Hydroxyapatite for Application in Bone Tissue Engineering and Drug Delivery
(2019)
Gas Chromatography
(2019)
Gas chromatography (GC) is one of the most important types of chromatography used in analytical chemistry for separating and analyzing chemical organic compounds. Today, gas chromatography is one of the most widespread investigation methods of instrumental analysis. This technique is used in the laboratories of chemical, petrochemical, and pharmaceutical industries, in research institutes, and also in clinical, environmental, and food and beverage analysis. This book is the outcome of contributions by experts in the field of gas chromatography and includes a short history of gas chromatography, an overview of derivatization methods and sample preparation techniques, a comprehensive study on pyrazole mass spectrometric fragmentation, and a GC/MS/MS method for the determination and quantification of pesticide residues in grape samples.
It is shown that the electrochemical kinetic model of alkaline methanol oxidation can be reduced by setting certain fast reactions contained in it to a steady state. As a result, the underlying system of ordinary differential equations (ODEs) is transformed into a system of differential-algebraic equations (DAEs). We measure the precision characteristics of this transformation and discuss the consequences of the obtained model reduction.
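As a toy illustration of this kind of reduction (not the actual methanol oxidation model), consider a fast reversible reaction coupled to a slow decay: fixing the fast reaction at steady state replaces one differential equation with an algebraic constraint, and the reduced model closely tracks the full one. All rate constants are illustrative assumptions.

```python
import math

# Toy reduction example: fast reversible A <=> B (k1, k2) coupled to a
# slow decay B -> C (k3). Setting the fast reaction to steady state
# turns the ODE system into a DAE: the algebraic constraint k1*a = k2*b
# replaces one differential equation.

k1, k2, k3 = 100.0, 100.0, 1.0      # fast, fast, slow

# Full ODE system, integrated with explicit Euler (small dt: stiff!).
a, b, dt = 1.0, 0.0, 1e-4
for _ in range(int(1.0 / dt)):
    da = -k1 * a + k2 * b
    db = k1 * a - k2 * b - k3 * b
    a, b = a + dt * da, b + dt * db
pool_full = a + b

# Reduced (DAE) model: b = a * k1 / k2, so the pool p = a + b obeys
# dp/dt = -k3 * p * k1 / (k1 + k2), which integrates in closed form.
pool_reduced = math.exp(-k3 * k1 / (k1 + k2) * 1.0)

print(abs(pool_full - pool_reduced) < 0.02)
```

The agreement improves as the fast rates grow relative to the slow one, which is exactly the regime in which a steady-state reduction is justified.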
The paper presents a topological reduction method applied to gas transport networks, using contraction of series, parallel and tree-like subgraphs. The contraction operations are implemented for pipe elements described by a quadratic friction law. This allows a significant reduction of the graphs and accelerates the solution procedure for stationary network problems. The algorithm has been tested on several realistic network examples. Possible extensions of the method to different friction laws and other elements are discussed.
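For pipe elements with a quadratic friction law of the form p_in^2 - p_out^2 = R*q*|q|, the series and parallel contraction rules can be sketched as follows. The resistance values are illustrative, and the paper's element models may differ in detail.

```python
import math

# Sketch of series/parallel contraction for pipes with the quadratic
# friction law  p_in^2 - p_out^2 = R * q * |q|.

def contract_series(resistances):
    """Same flow through each pipe, squared-pressure drops add up."""
    return sum(resistances)

def contract_parallel(resistances):
    """Same squared-pressure drop across each pipe, flows add up:
    q_i = sqrt(dp2 / R_i)  =>  1/sqrt(R_eq) = sum_i 1/sqrt(R_i)."""
    return 1.0 / sum(1.0 / math.sqrt(r) for r in resistances) ** 2

# Two pipes in series, then in parallel with a third one:
r_ser = contract_series([2.0, 3.0])          # 5.0
r_eq = contract_parallel([r_ser, 5.0])       # two equal branches

# Sanity check: flow through the contracted element equals the sum of
# the branch flows for the same squared-pressure drop dp2 = 10.
dp2 = 10.0
q_branches = 2 * math.sqrt(dp2 / 5.0)
q_eq = math.sqrt(dp2 / r_eq)
print(abs(q_eq - q_branches) < 1e-12)
```

Applying these two rules repeatedly, together with a rule for tree-like subgraphs, is what shrinks the network graph before the stationary problem is solved.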
Survival of patients with pediatric acute lymphoblastic leukemia (ALL) after allogeneic hematopoietic stem cell transplantation (allo-SCT) is mainly compromised by leukemia relapse, carrying dismal prognosis. As novel individualized therapeutic approaches are urgently needed, we performed whole-exome sequencing of leukemic blasts of 10 children with post–allo-SCT relapses with the aim of thoroughly characterizing the mutational landscape and identifying druggable mutations. We found that post–allo-SCT ALL relapses display highly diverse and mostly patient-individual genetic lesions. Moreover, mutational cluster analysis showed substantial clonal dynamics during leukemia progression from initial diagnosis to relapse after allo-SCT. Only very few alterations stayed constant over time. This dynamic clonality was exemplified by the detection of thiopurine resistance-mediating mutations in the nucleotidase NT5C2 in 3 patients’ first relapses, which disappeared in the post–allo-SCT relapses on relief of the selective pressure of maintenance chemotherapy. Moreover, we identified TP53 mutations in 4 of 10 patients after allo-SCT, reflecting acquired chemoresistance associated with selective pressure of prior antineoplastic treatment. Finally, in 9 of 10 children’s post–allo-SCT relapses, we found alterations in genes for which targeted therapies with novel agents are readily available. We could show efficient targeting of leukemic blasts by APR-246 in 2 patients carrying TP53 mutations. Our findings shed light on the genetic basis of post–allo-SCT relapse and may pave the way for unraveling novel therapeutic strategies in this challenging situation.
Scratch assays enable the study of the migration process of an injured adherent cell layer in vitro. An apparatus for the reproducible performance of scratch assays and cell harvesting has been developed that meets the requirements for reproducibility in tests as well as easy handling. The entirely autoclavable setup is divided into a sample translation and a scratching system. The translational system is compatible with standard culture dishes and can be modified to adapt to different cell culture systems, while the scratching system can be adjusted according to angle, normal force, shape, and material to adapt to specific questions and demanding substrates. As a result, a fully functional prototype can be presented. This system enables the creation of reproducible and clear scratch edges with a low scratch border roughness within a monolayer of cells. Moreover, the apparatus allows the collection of the migrated cells after scratching for further molecular biological investigations without the need for a second processing step. For comparison, the mechanical properties of manually performed scratch assays are evaluated.
The number of studies on work breaks and the importance of this subject is growing rapidly, with research showing that work breaks increase employees’ well-being, performance, and workplace safety. However, comparing the results of work break research is difficult since the study designs and methods are heterogeneous and there is no standard theoretical model for work breaks. Based on a systematic literature search, this scoping review included a total of 93 studies on experimental work break research conducted over the last 30 years. This scoping review provides a first structured evaluation regarding the underlying theoretical framework, the variables investigated, and the measurement methods applied. Studies using a combination of measurement methods from the categories “self-report measures,” “performance measures,” and “physiological measures” are most common and to be preferred in work break research. This overview supplies important information for ergonomics researchers, allowing them to design work break studies with a more structured and stronger theory-based approach. A standard theoretical model for work breaks is needed in order to further increase the comparability of studies in the field of experimental work break research in the future.
Although work events can be regarded as pivotal elements of organizational life, only a few studies have examined how positive and negative events relate to and combine to affect work engagement over time. Theory suggests that to better understand how current events affect work engagement (WE), we have to account for recent events that have preceded these current events. We present competing theoretical views on how recent and current work events may affect employees (e.g., getting used to a high frequency of negative events or becoming more sensitive to negative events). Although the occurrence of events implies discrete changes in the experience of work, prior research has not considered whether work events actually accumulate to sustained mid-term changes in WE. To address these gaps in the literature, we conducted a week-level longitudinal study across a period of 15 consecutive weeks among 135 employees, which yielded 849 weekly observations. While positive events were associated with higher levels of WE within the same week, negative events were not. Our results support neither satiation nor sensitization processes. However, high frequencies of negative events in the preceding week amplified the beneficial effects of positive events on WE in the current week. Growth curve analyses show that the benefits of positive events accumulate to sustain high levels of WE. WE dissipates in the absence of continuous experience of positive events. Our study adds a temporal component and informs research that has taken a feature-oriented perspective on the dynamic interplay of job demands and resources.
Application developers constitute an important part of a digital platform’s ecosystem. Knowledge about psychological processes that drive developer behavior in platform ecosystems is scarce. We build on the lead userness construct, which comprises two dimensions, trend leadership and high expected benefits from a solution, to explain how developers’ innovative work behavior (IWB) is stimulated. We employ an efficiency-oriented and a social-political perspective to investigate the relationship between lead userness and IWB. The efficiency-oriented view resonates well with the expected benefit dimension of lead userness, while the social-political view might be interpreted as a reflection of trend leadership. Using structural equation modeling, we test our model with a sample of over 400 developers from three platform ecosystems. We find lead userness to be indirectly associated with IWB and the performance-enhancing view to be the stronger predictor of IWB. Finally, we unravel differences between paid and unpaid app developers in platform ecosystems.
Data-Driven Robot Fault Detection and Diagnosis Using Generative Models: A Modified SFDD Algorithm
(2019)
This paper presents a modification of the data-driven sensor-based fault detection and diagnosis (SFDD) algorithm for online robot monitoring. Our version of the algorithm uses a collection of generative models, in particular restricted Boltzmann machines, each of which represents the distribution of sliding window correlations between a pair of correlated measurements. We use these models in a residual generation scheme, where high residuals generate conflict sets that are then used in a subsequent diagnosis step. As a proof of concept, the framework is evaluated on a mobile logistics robot for the problem of recognising disconnected wheels. The evaluation demonstrates the feasibility of the framework (on the faulty data set, the models achieved 88.6% precision and 75.6% recall), but also shows that the monitoring results are influenced by the choice of distribution model and the model parameters.
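The residual-generation idea can be sketched in simplified form: track the sliding-window correlation of a sensor pair and flag windows whose correlation leaves the nominal range. The fixed threshold below stands in for the restricted Boltzmann machine model of the correlation distribution, and the signals are synthetic stand-ins for wheel measurements.

```python
import math

# Simplified sketch of residual generation from sliding-window
# correlations between a pair of sensor streams. A fixed threshold
# replaces the learned distribution model used in the actual algorithm.

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    if va == 0.0 or vb == 0.0:
        return 0.0  # a constant (e.g. dead) signal: treat as uncorrelated
    return cov / (va * vb)

def residuals(s1, s2, window=20, nominal=0.9):
    """Indices of windows whose correlation drops below `nominal`."""
    flags = []
    for i in range(len(s1) - window + 1):
        if pearson(s1[i:i + window], s2[i:i + window]) < nominal:
            flags.append(i)
    return flags

# Wheel-encoder-like pair: correlated while driving, then one wheel
# is "disconnected" (stuck at zero) from t = 60 onwards.
t_fault = 60
s1 = [math.sin(0.2 * t) for t in range(100)]
s2 = [math.sin(0.2 * t) if t < t_fault else 0.0 for t in range(100)]
flagged = residuals(s1, s2)
print(flagged[0])  # index of the first window flagged near the fault onset
```

In the full algorithm, a window is flagged not by a hard threshold but by how poorly its correlation fits the learned distribution, and the flagged pairs form the conflict sets passed on to diagnosis.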
Atmospheric aerosols affect the power production of solar energy systems. Their impact depends on both the atmospheric conditions and the solar technology employed. As a region with insufficient power production and high solar insolation, West Africa shows high potential for the application of solar power systems. However, dust outbreaks, containing high aerosol loads, occur especially in the Sahel, located between the Saharan desert in the north and the Sudanian Savanna in the south. They might affect the whole region for several days with significant effects on power generation. This study investigates the impact of atmospheric aerosols on solar energy production for the example year 2006, making use of six well-instrumented sites in West Africa. Two different solar power technologies, a photovoltaic (PV) and a parabolic trough (PT) power plant, are considered. The daily reduction of solar power due to aerosols is determined over mostly clear-sky days in 2006 with a model chain combining radiative transfer and technology-specific power generation. For mostly clear days, the local daily reduction of PV power (at alternating current) (PVAC) and PT power (PTP) due to the presence of aerosols lies between 13 % and 22 % and between 22 % and 37 %, respectively. In March 2006 a major dust outbreak occurred, which serves as an example to investigate the impact of an aerosol extreme event on solar power. During the dust outbreak, daily reductions of PVAC and PTP of up to 79 % and 100 % occur, with a mean reduction of 20 % to 40 % for PVAC and of 32 % to 71 % for PTP during the 12 days of the event.