(1) Background: the potency of drugs that interfere with glucose metabolism, i.e., glucose transporters (GLUT) and nicotinamide phosphoribosyltransferase (NAMPT), was analyzed in neuroendocrine tumor (NET, BON-1, and QPG-1 cells) and small cell lung cancer (SCLC, GLC-2, and GLC-36 cells) tumor cell lines. (2) Methods: the proliferation and survival rate of tumor cells was significantly affected by the GLUT inhibitors fasentin and WZB1127, as well as by the NAMPT inhibitors GMX1778 and STF-31. (3) Results: none of the NET cell lines that were treated with NAMPT inhibitors could be rescued with nicotinic acid (usage of the Preiss–Handler salvage pathway), although NAPRT expression could be detected in two NET cell lines. We finally analyzed the specificity of GMX1778 and STF-31 in NET cells in glucose uptake experiments. As previously shown for STF-31 in a panel of NET-excluding tumor cell lines, both drugs specifically inhibited glucose uptake at higher (50 μM), but not at lower (5 μM), concentrations. (4) Conclusions: our data suggest that GLUT and especially NAMPT inhibitors are potential candidates for the treatment of NET tumors.
Climate change is increasingly affecting vulnerable groups and resulting in dire social and economic consequences, especially for those in the Global South. Managing current and emerging climate-related risks will require increasing individuals’ and communities’ resilience, including enhancing absorptive, adaptive, and transformative capacities. Policymakers are now considering the role that social protection policies and programmes can play in building climate resilience by contributing to these capacities. However, there is a limited understanding of the extent to which social protection instruments can influence these three resilience-related capacities. The lack of assessment tools or frameworks might contribute to the limited evidence of social protection’s ability to increase climate resilience. In particular, there appear to be no frameworks or tools that help assess the role of social cash transfers (SCT) in building adaptive capacity. Based on a multi-staged literature review, we develop an adaptive capacity outcomes framework (ACOF) that can help assess SCT’s contribution to building adaptive capacity, and, consequently, resilience. The framework is then tested using impact evaluation and assessment reports from SCT programmes in Indonesia, Zambia, Ethiopia, Bangladesh, and Tanzania. The exercise finds that SCTs alone have a limited contribution to adaptive capacity outcomes, but interventions that combine cash transfers with other components such as nutrition or livelihood training show positive impacts. We find that the ACOF can support assessments of SCT’s contribution towards adaptive capacity. It can help build evidence, evaluate impacts, and, through further research, facilitate learning on SCTs' role in increasing climate resilience.
Skill generalisation and experience acquisition for predicting and avoiding execution failures
(2023)
For performing tasks in their target environments, autonomous robots usually execute and combine skills. Robot skills in general and learning-based skills in particular are usually designed so that flexible skill acquisition is possible, but without an explicit consideration of execution failures, the impact that failure analysis can have on the skill learning process, or the benefits of introspection for effective coexistence with humans. Particularly in human-centered environments, the ability to understand, explain, and appropriately react to failures can affect a robot's trustworthiness and, consequently, its overall acceptability. Thus, in this dissertation, we study the questions of how parameterised skills can be designed so that execution-level decisions are associated with semantic knowledge about the execution process, and how such knowledge can be utilised for avoiding and analysing execution failures. The first major segment of this work is dedicated to developing a representation for skill parameterisation whose objective is to improve the transparency of the skill parameterisation process and enable a semantic analysis of execution failures. We particularly develop a hybrid learning-based representation for parameterising skills, called an execution model, which combines qualitative success preconditions with a function that maps parameters to predicted execution success. The second major part of this work focuses on applications of the execution model representation to address different types of execution failures. We first present a diagnosis algorithm that, given parameters that have resulted in a failure, finds a failure hypothesis by searching for violations of the qualitative model, as well as an experience correction algorithm that uses the found hypothesis to identify parameters that are likely to correct the failure. 
Furthermore, we present an extension of execution models that allows multiple qualitative execution contexts to be considered so that context-specific execution failures can be avoided. Finally, to enable the avoidance of model generalisation failures, we propose an adaptive ontology-assisted strategy for execution model generalisation between object categories that aims to combine the benefits of model-based and data-driven methods; for this, information about category similarities as encoded in an ontology is integrated with outcomes of model generalisation attempts performed by a robot. The proposed methods are exemplified in terms of various use cases - object and handle grasping, object stowing, pulling, and hand-over - and evaluated in multiple experiments performed with a physical robot. The main contributions of this work include a formalisation of the skill parameterisation problem by considering execution failures as an integral part of the skill design and learning process, a demonstration of how a hybrid representation for parameterising skills can contribute towards improving the introspective properties of robot skills, as well as an extensive evaluation of the proposed methods in various experiments. We believe that this work constitutes a small first step towards more failure-aware robots that are suitable to be used in human-centered environments.
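The diagnosis idea described above (searching for violations of the qualitative model, given parameters that resulted in a failure) can be sketched in a few lines. This is an illustrative toy, not the dissertation's implementation; the skill, its parameters, and the precondition names are hypothetical stand-ins:

```python
# Toy sketch of failure diagnosis over a qualitative execution model:
# the "model" is a set of named preconditions, and diagnosis returns
# the ones violated by the parameters that caused the failure.

def diagnose(preconditions, params):
    """Return the names of qualitative preconditions violated by `params`."""
    return [name for name, check in preconditions.items() if not check(params)]

# Hypothetical grasping skill: qualitative preconditions over the grasp pose.
preconditions = {
    "above_object": lambda p: p["dz"] > 0.0,
    "within_reach": lambda p: abs(p["dx"]) < 0.5,
}

failed_params = {"dx": 0.8, "dz": 0.1}
hypothesis = diagnose(preconditions, failed_params)
# `hypothesis` lists the violated precondition(s), here the reach constraint.
```

An experience correction step would then search for nearby parameter values for which `diagnose` returns an empty list.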
Loading of shipping containers for dairy products often includes a press-fit task, which involves manually stacking milk cartons in a container without using pallets or packaging. Automating this task with a mobile manipulator can reduce worker strain, and also enhance the efficiency and safety of the container loading process. This paper proposes an approach called Adaptive Compliant Control with Integrated Failure Recovery (ACCIFR), which enables a mobile manipulator to reliably perform the press-fit task. We base the approach on a compliant control framework learned from demonstration, into which we integrate a monitoring and failure recovery mechanism for successful task execution. Concretely, we monitor the execution through distance and force feedback, detect collisions while the robot is performing the press-fit task, and use wrench measurements to classify the direction of collision; this information informs the subsequent recovery process. We evaluate the method on a miniature container setup, considering variations in the (i) starting position of the end effector, (ii) goal configuration, and (iii) object grasping position. The results demonstrate that the proposed approach outperforms the baseline demonstration-based learning framework regarding adaptability to environmental variations and the ability to recover from collision failures, making it a promising solution for practical press-fit applications.
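The collision-direction step can be illustrated with a minimal sketch that classifies the direction from the dominant component of the measured planar contact force. This is a simplified stand-in for the wrench-based classifier described above; the threshold value and direction labels are hypothetical:

```python
# Simplified illustration (not the ACCIFR classifier): infer the collision
# direction from the dominant component of the planar contact force.

def collision_direction(fx, fy, threshold=2.0):
    """Return 'none' or the dominant direction of an external contact force [N]."""
    if max(abs(fx), abs(fy)) < threshold:
        return "none"  # force too small to count as a collision
    if abs(fx) >= abs(fy):
        return "front" if fx > 0 else "back"
    return "left" if fy > 0 else "right"
```

A recovery policy could then select a retreat motion opposite to the returned direction.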
In the design of robot skills, the focus generally lies on increasing the flexibility and reliability of the robot execution process; however, typical skill representations are not designed for analysing execution failures if they occur or for explicitly learning from failures. In this paper, we describe a learning-based hybrid representation for skill parameterisation called an execution model, which considers execution failures to be a natural part of the execution process. We then (i) demonstrate how execution contexts can be included in execution models, (ii) introduce a technique for generalising models between object categories by combining generalisation attempts performed by a robot with knowledge about object similarities represented in an ontology, and (iii) describe a procedure that uses an execution model for identifying a likely hypothesis of a parameterisation failure. The feasibility of the proposed methods is evaluated in multiple experiments performed with a physical robot in the context of handle grasping, object grasping, and object pulling. The experimental results suggest that execution models contribute towards avoiding execution failures, but also represent a first step towards more introspective robots that are able to analyse some of their execution failures in an explicit manner.
Saliency methods are frequently used to explain Deep Neural Network-based models. Adebayo et al.'s work on evaluating saliency methods for classification models illustrates that certain explanation methods fail the model and data randomization tests. However, on extending the tests to various state-of-the-art object detectors, we illustrate that the ability to explain a model depends more on the model itself than on the explanation method. We perform sanity checks for object detection and define new qualitative criteria to evaluate the saliency explanations, both for object classification and bounding box decisions, using Guided Backpropagation, Integrated Gradients, and their SmoothGrad versions, together with Faster R-CNN, SSD, and EfficientDet-D0, trained on COCO. In addition, the sensitivity of the explanation method to model parameters and data labels varies class-wise, which motivates performing the sanity checks for each class. We find that EfficientDet-D0 is the most interpretable model independent of the saliency method, and that it passes the sanity checks with few problems.
A biodegradable blend of PBAT—poly(butylene adipate-co-terephthalate)—and PLA—poly(lactic acid)—for blown film extrusion was modified with four multi-functional chain extending cross-linkers (CECL). The anisotropic morphology introduced during film blowing affects the degradation processes. Given that two CECL, tris(2,4-di-tert-butylphenyl)phosphite (V1) and 1,3-phenylenebisoxazoline (V2), increased the melt flow rate (MFR), while the other two, aromatic polycarbodiimide (V3) and poly(4,4-dicyclohexylmethanecarbodiimide) (V4), reduced it, their compost (bio-)disintegration behavior was investigated. It was significantly altered with respect to the unmodified reference blend (REF). The disintegration behavior at 30 and 60 °C was investigated by determining changes in mass, Young’s moduli, tensile strengths, elongations at break and thermal properties. In order to quantify the disintegration behavior, the hole areas of blown films were evaluated after compost storage at 60 °C to calculate the kinetics of the time-dependent degrees of disintegration. The kinetic model of disintegration provides two parameters: initiation time and disintegration time. They quantify the effects of the CECL on the disintegration behavior of the PBAT/PLA compound. Differential scanning calorimetry (DSC) revealed a pronounced annealing effect during storage in compost at 30 °C, as well as the occurrence of an additional step-like increase in the heat flow at 75 °C after storage at 60 °C. The disintegration consists of processes which affect the amorphous and crystalline phases of PBAT in different ways and cannot be understood in terms of hydrolytic chain degradation alone. Furthermore, gel permeation chromatography (GPC) revealed molecular degradation only at 60 °C for the REF and V1 after 7 days of compost storage. The observed losses of mass and cross-sectional area seem to be attributable more to mechanical decay than to molecular degradation for the given compost storage times.
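A two-parameter disintegration kinetic of the kind described (an initiation time before holes appear, followed by a characteristic disintegration time) can be sketched as follows. The exponential functional form and the parameter values are assumptions made for illustration, not necessarily the model fitted in the paper:

```python
import math

# Hedged illustration of a two-parameter disintegration kinetic:
# nothing disintegrates before the initiation time t0; afterwards the
# degree of disintegration approaches 1 with characteristic time tau.

def degree_of_disintegration(t, t0, tau):
    """Degree of disintegration (0..1) at compost storage time t [days]."""
    if t <= t0:
        return 0.0
    return 1.0 - math.exp(-(t - t0) / tau)
```

Fitting `t0` and `tau` to measured hole areas would yield the per-CECL parameters discussed above.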
Citizen participation is deemed to be crucial for sustainability and resilience planning. However, generational equity has been missing from recent academic discussions regarding sustainability and resilience. Therefore, the purpose of this paper is to reintroduce the topic of the existence or absence of an intergenerational consensus on the example of a rural community and its perceived brand image attributes and development priorities. The research is based on primary data collected through an online survey, with a sample size of N = 808 respondents in Neunkirchen-Seelscheid, Germany. The data were analyzed using the Kruskal–Wallis test for the presence and/or absence of consensus among the five generations regarding brand image attributes and development priorities. The findings point to divergence between what the median values indicate as the most relevant brand image attributes and development priorities among the citizens and the areas where the Kruskal–Wallis test shows that an intergenerational consensus either does or does not exist. The results imply the need for new concepts and applied approaches to citizen participation for sustainability and resilience, where intergenerational dialogue and equity-building take center stage. In addition to the importance of the theory of citizen participation for sustainability and resilience, our results provide ample evidence for how sustainability and resilience planning documents could potentially benefit from deploying the concept of intergenerational equity. The present research provides sustainability and political science with new conceptual and methodological approaches for taking intergenerational equity into account in regional planning processes in rural and other areas.
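The Kruskal–Wallis test used above compares rank sums across groups. A minimal pure-Python computation of the H statistic (without the tie-correction factor; the sample data are invented for illustration, not the survey data) looks like this:

```python
# Kruskal-Wallis H statistic for k independent samples: pool and rank all
# observations (average ranks for ties), then compare per-group rank sums.

def kruskal_wallis_h(groups):
    """H statistic (no tie correction) for a list of samples."""
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(pooled)
    ranks = [0.0] * n
    i = 0
    while i < n:  # assign 1-based ranks, averaging over ties
        j = i
        while j + 1 < n and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[k] = avg
        i = j + 1
    rank_sums = [0.0] * len(groups)
    for (v, gi), r in zip(pooled, ranks):
        rank_sums[gi] += r
    return 12.0 / (n * (n + 1)) * sum(
        rs * rs / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)
```

The H value is then compared against a chi-squared distribution with k-1 degrees of freedom to decide whether the generations' ratings differ.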
Rosenbrock–Wanner methods for systems of stiff ordinary differential equations have been well known since the seventies. They have been continuously developed and are efficient for differential-algebraic equations of index 1 as well. Their disadvantage that the Jacobian matrix has to be updated in every time step becomes more and more obsolete when automatic differentiation is used. Especially the family of Rodas methods has proven to be a standard in the Julia package DifferentialEquations. However, the fifth-order Rodas5 method undergoes order reduction for certain problem classes. Therefore, the goal of this paper is to compute a new set of coefficients for Rodas5 such that this order reduction is mitigated. The procedure is similar to the derivation of the methods Rodas4P and Rodas4P2. In addition, it is possible to provide new dense output formulas for Rodas5 and the new method Rodas5P. Numerical tests show that, for higher accuracy requirements, Rodas5P always belongs to the best methods within the Rodas family.
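Why stiff problems call for implicit or Rosenbrock-type methods at all can be seen on a one-line example. The sketch below is not Rodas itself, only the textbook contrast between explicit and implicit Euler on the stiff test problem y' = λy with λ = -50:

```python
# On y' = lam*y with lam = -50 and step size h = 0.1, explicit Euler has
# amplification factor 1 + h*lam = -4 and blows up, while implicit Euler
# has factor 1/(1 - h*lam) = 1/6 and decays, mirroring the true solution.

def explicit_euler(lam, y0, h, steps):
    y = y0
    for _ in range(steps):
        y += h * lam * y
    return y

def implicit_euler(lam, y0, h, steps):
    y = y0
    for _ in range(steps):
        y = y / (1 - h * lam)  # solve y_new = y + h*lam*y_new for y_new
    return y
```

Rosenbrock–Wanner methods achieve this implicit-like stability while only solving linear systems with the Jacobian, which is what makes the Rodas family efficient.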
The representation, or encoding, utilized in evolutionary algorithms has a substantial effect on their performance. Examination of the suitability of widely used representations for quality diversity optimization (QD) in robotic domains has yielded inconsistent results regarding the most appropriate encoding method. Given the domain-dependent nature of QD, additional evidence from other domains is necessary. This study compares the impact of several representations, including direct encoding, a dictionary-based representation, parametric encoding, compositional pattern producing networks, and cellular automata, on the generation of voxelized meshes in an architecture setting. The results reveal that some indirect encodings outperform direct encodings and can generate more diverse solution sets, especially when considering full phenotypic diversity. The paper introduces a multi-encoding QD approach that incorporates all evaluated representations in the same archive. Species of encodings compete on the basis of phenotypic features, leading to an approach that demonstrates similar performance to the best single-encoding QD approach. This is noteworthy, as it does not always require the contribution of the best-performing single encoding.
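The multi-encoding idea, letting solutions from different encodings compete in one archive keyed only by phenotypic features, can be sketched in the style of a MAP-Elites archive. The objective, feature descriptor, and the two toy "encodings" below are invented stand-ins, not the voxel-mesh setup of the paper:

```python
import random

# Hedged sketch of a quality-diversity archive: candidates from different
# encodings compete per feature bin; each bin keeps only the fittest elite.

random.seed(0)

def evaluate(phenotype):
    fitness = -sum((x - 0.5) ** 2 for x in phenotype)    # toy objective
    feature = round(sum(phenotype) / len(phenotype), 1)  # 1D feature bin
    return fitness, feature

archive = {}  # feature bin -> (fitness, encoding_name, phenotype)

def try_insert(encoding_name, phenotype):
    fitness, feature = evaluate(phenotype)
    best = archive.get(feature)
    if best is None or fitness > best[0]:
        archive[feature] = (fitness, encoding_name, phenotype)

# Two "encodings" that both decode to a list of 4 values in [0, 1]:
for _ in range(200):
    try_insert("direct", [random.random() for _ in range(4)])
    v = random.random()
    try_insert("uniform", [v] * 4)  # a trivially indirect encoding
```

Counting how many elites each encoding contributes per bin is a simple way to observe the species competition described above.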
The non-filarial and non-communicable disease podoconiosis affects around 4 million people and is characterized by severe leg lymphedema accompanied by painful intermittent acute inflammatory episodes, called acute dermatolymphangioadenitis (ADLA) attacks. Risk factors have been associated with the disease, but the mechanisms of its pathophysiology remain uncertain. Lymphedema can lead to skin lesions, which can serve as entry points for bacteria that may cause ADLA attacks leading to progression of the lymphedema. However, the skin microbiome of affected legs of podoconiosis individuals remains unclear. Thus, we analysed the skin microbiome of podoconiosis legs using next-generation sequencing. We revealed a positive correlation between increasing lymphedema severity and non-commensal anaerobic bacteria, especially Anaerococcus provencensis, as well as a negative correlation with the presence of Corynebacterium, a constituent of normal skin flora. Disease symptoms were generally linked to higher microbial diversity and richness, which deviated from the normal composition of the skin. These findings show an association of distinct bacterial taxa with lymphedema stages, highlighting the important role of bacteria in the pathogenesis of podoconiosis, and may enable the selection of better treatment regimens to manage ADLA attacks and disease progression.
The transport of carbon dioxide through pipelines is one of the important components of the Carbon dioxide Capture and Storage (CCS) systems that are currently being developed. If high flow rates are desired, transport in the liquid or supercritical phase is to be preferred. For technical reasons, the flow must stay in that phase without transitioning to the gaseous state. In this paper, a numerical simulation of the stationary process of carbon dioxide transport with impurities and phase transitions is considered. We use the Homogeneous Equilibrium Model (HEM) and the GERG-2008 thermodynamic equation of state to describe the transport parameters. The algorithms used make it possible to solve scenarios of carbon dioxide transport in the liquid or supercritical phase, with detection of an approach to the phase transition region. Convergence of the solution algorithms is analyzed in connection with fast and abrupt changes of the equation of state and the enthalpy function in the region of phase transitions.
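The stationary computation can be caricatured as a marching scheme along the pipe with a check against the phase boundary. The sketch below is a strong simplification: it assumes constant density, velocity, and friction factor with the Darcy–Weisbach relation and a fixed saturation pressure, whereas the paper uses the HEM with the GERG-2008 equation of state; all numbers are illustrative:

```python
# Strongly simplified stationary pressure march along a pipeline segment by
# segment (Darcy-Weisbach pressure drop, constant properties), stopping when
# the pressure approaches an assumed saturation pressure p_sat.

def march_pressure(p_in, length, segments, rho=800.0, v=2.0,
                   friction=0.015, diameter=0.5, p_sat=6.0e6):
    """Return the pressure profile [Pa]; stops at the phase boundary."""
    dx = length / segments
    p = p_in
    profile = [p]
    for _ in range(segments):
        dp = friction * dx / diameter * rho * v * v / 2.0  # Darcy-Weisbach
        p -= dp
        profile.append(p)
        if p <= p_sat:  # approaching the two-phase region: stop marching
            break
    return profile
```

In the real model, `rho` and `p_sat` would be re-evaluated from the equation of state at every step, which is exactly where the convergence issues discussed above arise.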
Based on data from the WEF Travel & Tourism Report, this study deploys k-means cluster analysis to build a global typology of national destination governance. Previous studies have focused on case studies, while this chapter focuses on the classification of different destination types by deploying a set of relevant indicators: wastewater treatment, fixed broadband internet subscriptions, ground transport efficiency, quality of roads, quality of railroad infrastructure, reliability of police services, and ease of finding skilled employees. The results present a four-cluster solution of national destination governance types, as well as their major characteristics. The chapter then provides and discusses important implications for the theory and practice of destination governance.
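The k-means procedure underlying such a typology is simple enough to sketch. The version below is Lloyd's algorithm on a toy one-dimensional list of indicator scores, not the multi-indicator WEF data:

```python
import random

# Minimal Lloyd's algorithm for 1D k-means: alternate between assigning
# points to the nearest centre and moving each centre to its cluster mean.

def kmeans_1d(values, k, iters=20, seed=1):
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[nearest].append(v)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)
```

On real data, each country would be a vector of the seven indicators and the distance would be Euclidean, but the alternation is identical.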
In the project EILD.nrw, Open Educational Resources (OER) have been developed for teaching databases. Lecturers can use the tools and courses in a variety of learning scenarios. Students of computer science and application subjects can learn the complete life cycle of databases. For this purpose, quizzes, interactive tools, instructional videos, and courses for learning management systems are developed and published under a Creative Commons license. We give an overview of the developed OERs by subject, description, teaching form, and format. Subsequently, we describe how licensing, sustainability, accessibility, contextualization, content description, and technical adaptability are implemented. The feedback of students in ongoing classes is evaluated.
The purpose of this study is to extend previous research on brand innovation by uncovering the process of family winery branding in relation to a new product launch in a VUCA market, based on the case of three Serbian wineries. The study deploys a qualitatively oriented, empirical approach in presenting a multi-case study. Three semi-structured telephone interviews were conducted with owners and/or managers of these three wineries. The results demonstrate that all three family wineries offer high-end products for the domestic market, with the smaller one still experimenting with the strategic direction of innovating for the high-end market, while the two larger ones focus either on autochthonous grape varieties with eye-catching labels or on an authentic brand identity with strong storytelling. Another important aspect identified is the frugal nature of product launches in the family wineries due to limited resources. The paper is among only a few studies on new product development in the wine business literature.
The perceptual upright results from the multisensory integration of the directions indicated by vision and gravity as well as a prior assumption that upright is towards the head. The direction of gravity is signalled by multiple cues, the predominant of which are the otoliths of the vestibular system and somatosensory information from contact with the support surface. Here, we used neutral buoyancy to remove somatosensory information while retaining vestibular cues, thus "splitting the gravity vector" leaving only the vestibular component. In this way, neutral buoyancy can be used as a microgravity analogue. We assessed spatial orientation using the oriented character recognition test (OChaRT, which yields the perceptual upright, PU) under both neutrally buoyant and terrestrial conditions. The effect of visual cues to upright (the visual effect) was reduced under neutral buoyancy compared to on land but the influence of gravity was unaffected. We found no significant change in the relative weighting of vision, gravity, or body cues, in contrast to results found both in long-duration microgravity and during head-down bed rest. These results indicate a relatively minor role for somatosensation in determining the perceptual upright in the presence of vestibular cues. Short-duration neutral buoyancy is a weak analogue for microgravity exposure in terms of its perceptual consequences compared to long-duration head-down bed rest.
AI systems pose unknown challenges for designers, policymakers, and users which aggravates the assessment of potential harms and outcomes. Although understanding risks is a requirement for building trust in technology, users are often excluded from legal assessments and explanations of AI hazards. To address this issue we conducted three focus groups with 18 participants in total and discussed the European proposal for a legal framework for AI. Based on this, we aim to build a (conceptual) model that guides policymakers, designers, and researchers in understanding users’ risk perception of AI systems. In this paper, we provide selected examples based on our preliminary results. Moreover, we argue for the benefits of such a perspective.
Modern engineering relies heavily on computer technologies. This is especially true for thermoplastic manufacturing, such as blow molding. A crucial milestone for digitalization is the continuous integration of data in unified or interoperable systems. While new simulation technologies are constantly developed, data management standards such as STEP fail at integrating them. On the other hand, industrial standards such as "VMAP" manage to improve interoperability for Small and Medium-sized Enterprises. However, they do not provide Simulation Process and Data Management (SPDM) technologies. For SPDM integration of VMAP data, Ontology-Based Data Access is used, which allows continuing the digital thread in custom semantic-based open-source solutions. An ontology of the database format (VMAP) was generated alongside an expandable knowledge graph of data access methods. A Python-based software architecture was developed that automatically uses the semantic representations of the database format and of data access to query data and metadata within the VMAP file. The result is a software architecture template that can be adapted to other data standards and integrated into semantic data management systems. It allows semantic queries on simulation data down to element-wise resolution without integrating the whole model information. The architecture can instantiate a file in a knowledge graph, query a file's metadatum and, in case it is not yet available, find a semantically represented process that allows the creation and instantiation of the required metadatum (see Figure 1). The results of this thesis can be expected to form a basis for semantic SPDM tools.
Solar photovoltaic power output is modulated by atmospheric aerosols and clouds and thus contains valuable information on the optical properties of the atmosphere. As a ground-based data source with high spatiotemporal resolution it has great potential to complement other ground-based solar irradiance measurements as well as those of weather models and satellites, thus leading to an improved characterisation of global horizontal irradiance. In this work several algorithms are presented that can retrieve global tilted and horizontal irradiance and atmospheric optical properties from solar photovoltaic data and/or pyranometer measurements. The method is tested on data from two measurement campaigns that took place in the Allgäu region in Germany in autumn 2018 and summer 2019, and the results are compared with local pyranometer measurements as well as satellite and weather model data. Using power data measured at 1 Hz and averaged to 1 min resolution along with a non-linear photovoltaic module temperature model, global horizontal irradiance is extracted with a mean bias error compared to concurrent pyranometer measurements of 5.79 W m−2 (7.35 W m−2) under clear (cloudy) skies, averaged over the two campaigns, whereas for the retrieval using coarser 15 min power data with a linear temperature model the mean bias error is 5.88 and 41.87 W m−2 under clear and cloudy skies, respectively.
During completely overcast periods the cloud optical depth is extracted from photovoltaic power using a lookup table method based on a 1D radiative transfer simulation, and the results are compared to both satellite retrievals and data from the Consortium for Small-scale Modelling (COSMO) weather model. Potential applications of this approach for extracting cloud optical properties are discussed, as well as certain limitations, such as the representation of 3D radiative effects that occur under broken-cloud conditions. In principle this method could provide an unprecedented amount of ground-based data on both irradiance and optical properties of the atmosphere, as long as the required photovoltaic power data are available and properly pre-screened to remove unwanted artefacts in the signal. Possible solutions to this problem are discussed in the context of future work.
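The core inversion idea of the coarse-resolution retrieval described above, mapping measured PV power back to irradiance with a linear module temperature model, can be sketched as follows. The model form and all parameter values are illustrative assumptions, not those calibrated in the campaigns:

```python
# Hedged sketch: invert a linear PV performance model
#   P = P_stc * (G / G_STC) * (1 + GAMMA * (T_module - T_STC))
# for the plane-of-array irradiance G, given measured DC power.

G_STC = 1000.0   # W/m^2, irradiance at standard test conditions
T_STC = 25.0     # deg C, cell temperature at STC
GAMMA = -0.004   # 1/K, power temperature coefficient (typical c-Si value)

def irradiance_from_power(p_dc, p_stc, t_module):
    """Estimated irradiance [W/m^2] from DC power and module temperature."""
    return G_STC * p_dc / (p_stc * (1.0 + GAMMA * (t_module - T_STC)))
```

The full retrieval additionally converts tilted to horizontal irradiance and, at 1 Hz resolution, replaces the linear temperature term with a non-linear dynamic model.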
When dialogues with voice assistants (VAs) fall apart, users often become confused or even frustrated. To address these issues and related privacy concerns, Amazon recently introduced a feature allowing Alexa users to inquire about why it behaved in a certain way. But how do users perceive this new feature? In this paper, we present preliminary results from research conducted as part of a three-year project involving 33 German households. This project utilized interviews, fieldwork, and co-design workshops to identify common unexpected behaviors of VAs, as well as users’ needs and expectations for explanations. Our findings show that, contrary to its intended purpose, the new feature actually exacerbates user confusion and frustration instead of clarifying Alexa's behavior. We argue that such voice interactions should be characterized as explanatory dialogs that account for VA’s unexpected behavior by providing interpretable information and prompting users to take action to improve their current and future interactions.
Western consumption patterns are strongly associated with environmental pollution and climate change, which challenges us to transform our society and consumption towards a sustainable future. This thesis takes up this challenge and aims to contribute to this debate at the intersection of ICT artifacts and social practices through the examples of food and mobility consumption. The social practice lens is employed as an alternative to the predominant persuasive or motivational lens of design in the respective consumption domains. Against this background, this thesis first presents three research papers that contribute to a broader understanding of dynamic practices and their transformation towards a sustainable stable state. The subsequent research takes up the empirical results of these sections and focuses more intensely on the appropriation of materials and infrastructures by means of recommender systems. With this approach, the thesis contributes to three fields: practice-based Computing, Recommender Systems, and Consumer Informatics.
Smart heating systems are one of the core components of smart homes. A large portion of domestic energy consumption is attributable to HVAC (heating, ventilation and air conditioning) systems, making them a relevant topic in the efforts to support an energy transition in private housing. For that reason, the technology has attracted attention from both the academic and industry communities. User interfaces of smart heating systems have evolved from simple adjusting knobs to advanced data visualization interfaces that allow for more advanced settings such as timetables and status information. With the advent of AI, we are interested in exploring how these interfaces will evolve to build the connection between user needs and the underlying AI system. Hence, this paper aims to provide early design implications for an AI-based user interface for smart heating systems.
Machine learning-based solutions are frequently adopted in applications that involve big data in operations. The performance of a model that is deployed into operations is subject to degradation due to unanticipated changes in the flow of input data. Hence, monitoring data drift becomes essential to maintain the model's desired performance. Based on the conducted review of the literature on drift detection, statistical hypothesis testing makes it possible to investigate whether incoming data are drifting from the training data. Because Maximum Mean Discrepancy (MMD) and Kolmogorov-Smirnov (KS) have been shown in the literature to be reliable distance measures between multivariate distributions, both were selected from several existing techniques for experimentation. For the scope of this work, an image classification use case was studied using the Stream-51 dataset. Based on the results from different drift experiments, both MMD and KS showed high Area Under Curve values; however, KS was faster than MMD and produced fewer false positives. Furthermore, the results showed that using the pre-trained ResNet-18 for feature extraction maintained the high performance of the experimented drift detectors. The results also showed that the performance of the drift detectors depends strongly on the sample sizes of the reference (training) data and of the test data that flow into the pipeline's monitor. Finally, if the test data are a mixture of drifting and non-drifting data, the performance of the drift detectors does not depend on how the drifting data are scattered among the non-drifting ones, but rather on their amount in the test set.
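The KS drift measure used above reduces, per feature, to the maximum distance between two empirical CDFs. A minimal pure-Python version (the sample data in the test are invented; a real monitor would apply this to extracted features and calibrate a rejection threshold) is:

```python
import bisect

# Two-sample Kolmogorov-Smirnov statistic: the maximum absolute difference
# between the empirical CDFs of the reference (training) sample and the
# incoming sample. Values near 0 mean similar distributions; near 1, drift.

def ks_statistic(reference, incoming):
    a, b = sorted(reference), sorted(incoming)

    def ecdf(sample, x):
        # fraction of the sorted sample <= x
        return bisect.bisect_right(sample, x) / len(sample)

    points = sorted(set(a) | set(b))
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in points)
```

A monitor would flag drift when the statistic exceeds a critical value chosen for the given sample sizes and significance level.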
Trust your guts: fostering embodied knowledge and sustainable practices through voice interaction
(2023)
Despite various attempts to prevent food waste and motivate conscious food handling, household members find it difficult to correctly assess the edibility of food. With the rise of ambient voice assistants, we conducted a design case study to support households' in situ decision-making in collaboration with our voice agent prototype, Fischer Fritz. To this end, we carried out 15 contextual inquiries to understand food practices at home. Furthermore, we interviewed six fish experts to inform the design of our voice agent on how to guide consumers and teach food literacy. Finally, we created a prototype and discussed with 15 consumers its impact and capability to convey embodied knowledge to the human who is engaged as a sensor. Our design research goes beyond current Human-Food Interaction automation approaches by emphasizing the human-food relationship in technology design and demonstrating future complementary human-agent collaboration with the aim of increasing humans' competence to sense, think, and act.
The cystic fibrosis transmembrane conductance regulator (CFTR) anion channel and the epithelial Na+ channel (ENaC) play essential roles in transepithelial ion and fluid transport in numerous epithelial tissues. Inhibitors of both channels have been important tools for defining their physiological roles in vitro. However, two commonly used CFTR inhibitors, CFTRinh-172 and GlyH-101, also inhibit non-CFTR anion channels, indicating they are not CFTR-specific. The potential off-target effects of these inhibitors on epithelial cation channels have to date not been addressed. Here, we show that both CFTR blockers, at concentrations routinely employed by many researchers, caused a significant inhibition of store-operated calcium entry (SOCE) that was time-dependent, poorly reversible and independent of CFTR. Patch clamp experiments showed that both CFTRinh-172 and GlyH-101 caused a significant block of Orai1-mediated whole cell currents, establishing that they likely reduce SOCE via modulation of this Ca2+ release-activated Ca2+ (CRAC) channel. In addition to off-target effects on calcium channels, both inhibitors significantly reduced human αβγ-ENaC-mediated currents after heterologous expression in Xenopus oocytes, but had differential effects on δβγ-ENaC function. Molecular docking identified two putative binding sites in the extracellular domain of ENaC for both CFTR blockers. Together, our results indicate that caution is needed when using these two CFTR inhibitors to dissect the role of CFTR, and potentially ENaC, in physiological processes.
There has been growing interest in taste research in the HCI and CSCW communities. However, the focus has been more on stimulating the senses, while the socio-cultural aspects have received less attention. Yet individual taste perception is mediated through social interaction and collective negotiation and does not depend on physical stimulation alone. Therefore, we study the digital mediation of taste by drawing on ethnographic research of four online wine tastings and one self-organized event. We investigated the materials, associated meanings, competences, procedures, and engagements that shaped the performative character of tasting practices. We illustrate how the tastings are built around the taste-making process and how online contexts differ in providing a more diverse and distributed environment. We then explore the implications of our findings for the further mediation of taste as a social and democratized phenomenon through online interaction.
Background
Consumers rely heavily on online user reviews when shopping online, and cybercriminals produce fake reviews to manipulate consumer opinion. Much prior research focuses on the automated detection of these fake reviews, but such detectors remain far from perfect. Therefore, consumers must be able to detect fake reviews on their own. In this study, we survey the research examining how consumers detect fake reviews online.
Methods
We conducted a systematic literature review of the research on fake review detection from the consumer perspective. We included academic literature presenting new empirical data. We provide a narrative synthesis comparing the theories, methods, and outcomes used across studies to identify how consumers detect fake reviews online.
Results
We found only 15 articles that met our inclusion criteria. We classified the most frequently used cues into five categories: (1) review characteristics, (2) textual characteristics, (3) reviewer characteristics, (4) seller characteristics, and (5) characteristics of the platform where the review is displayed.
Discussion
We find that theory is applied inconsistently across studies and that cues to deception are often identified in isolation without any unifying theoretical framework. Consequently, we discuss how such a theoretical framework could be developed.
ESKAPEE Pathogen Biofilm Control on Surfaces with Probiotic Lactobacillaceae and Bacillus species
(2023)
Combatting the rapidly growing threat of antimicrobial resistance and reducing prevalence and transmission of ESKAPEE pathogens in healthcare settings requires innovative strategies, one of which is displacing these pathogens using beneficial microorganisms. Our review comprehensively examines the evidence of probiotic bacteria displacing ESKAPEE pathogens, with a focus on inanimate surfaces. A systematic search was conducted using the PubMed and Web of Science databases on 21 December 2021, and 143 studies were identified examining the effects of Lactobacillaceae and Bacillus spp. cells and products on the growth, colonization, and survival of ESKAPEE pathogens. While the diversity of study methods limits evidence analysis, results presented by narrative synthesis demonstrate that several species have the potential as cells or their products or supernatants to displace nosocomial infection-causing organisms in a variety of in vitro and in vivo settings. Our review aims to aid the development of new promising approaches to control pathogen biofilms in medical settings by informing researchers and policymakers about the potential of probiotics to combat nosocomial infections. More targeted studies are needed to assess safety and efficacy of different probiotic formulations, followed by large-scale studies to assess utility in infection control and medical practice.
Isovaleric acidemia (IVA), due to isovaleryl-CoA dehydrogenase (IVD) deficiency, results in the accumulation of isovaleryl-CoA, isovaleric acid and secondary metabolites. The increase in these metabolites decreases mitochondrial energy production and increases oxidative stress. This contributes to the neuropathological features of IVA. A general assumption in the literature exists that glycine N-acyltransferase (GLYAT) plays a role in alleviating the symptoms experienced by IVA patients through the formation of N-isovalerylglycine. GLYAT forms part of the phase II glycine conjugation pathway in the liver and detoxifies excess acyl-CoA’s namely benzoyl-CoA. However, very few studies support GLYAT as the enzyme that conjugates isovaleryl-CoA to glycine. Furthermore, GLYATL1, a paralogue of GLYAT, conjugates phenylacetyl-CoA to glutamine. Therefore, GLYATL1 might also be a candidate for the formation of N-isovalerylglycine. Based on the findings from the literature review, we proposed that GLYAT or GLYATL1 can form N-isovalerylglycine in IVA patients. To test this hypothesis, we performed an in-silico analysis to determine which enzyme is more likely to conjugate isovaleryl-CoA with glycine using AutoDock Vina. Thereafter, we performed in vitro validation using purified enzyme preparations. The in-silico and in vitro findings suggested that both enzymes could form N-isovaleryglycine albeit at lower affinities than their preferred substrates. Furthermore, an increase in glycine concentration does not result in an increase in N-isovalerylglycine formation. The results from the critical literature appraisal, in-silico, and in vitro validation, suggest the importance of further investigating the reaction kinetics and binding behaviors between these substrates and enzymes in understanding the pathophysiology of IVA.
Indoor spaces exhibit microbial compositions that are distinctly dissimilar from one another and from outdoor spaces. Unique in this regard, and a topic that has only recently come into focus, is the microbiome of hospitals. While knowing exactly which microorganisms propagate how and where in hospitals would undoubtedly be beneficial for preventing hospital-acquired infections, there are, to date, no standardized procedures for how best to study the hospital microbiome. Our study aimed to investigate the microbiome of hospital sanitary facilities, outlining the extent to which hospital microbiome analyses differ according to sample-preparation protocol. For this purpose, fifty samples were collected from two separate hospitals, from three wards and one hospital laboratory, using two different storage media from which DNA was extracted using two different extraction kits and sequenced with two different primer pairs (V1–V2 and V3–V4). There were no observable differences between the sample-preservation media, small differences in detected taxa between the DNA extraction kits (mainly concerning Propionibacteriaceae), and large differences in detected taxa between the two primer pairs V1–V2 and V3–V4. This analysis also showed that microbial occurrences and compositions can vary greatly from toilets to sinks to showers and across wards and hospitals. In surgical wards, patient toilets appeared to be characterized by lower species richness and diversity than staff toilets. Which sampling sites are the best for which assessments should be analyzed in more depth. The fact that the sample processing methods we investigated (apart from the choice of primers) seem to have changed the results only slightly suggests that comparing hospital microbiome studies is a realistic option.
The observed differences in species richness and diversity between patient and staff toilets should be further investigated, as these, if confirmed, could be a result of excreted antimicrobials.
Universities, Entrepreneurship and Enterprise Development in Africa – Conference Proceedings 2022
(2023)
These proceedings document the culmination of the 10th annual joint conference on "Universities, Entrepreneurship and Enterprise Development in Africa," which was held on the 8th and 9th of September 2022 at the Campus Sankt Augustin, Hochschule Bonn-Rhein-Sieg University of Applied Sciences. The conference was a collaboration between the University of Cape Coast, Ghana, and Hochschule Bonn-Rhein-Sieg University of Applied Sciences, Germany.
Accurate forecasting of solar irradiance is crucial for the integration of solar energy into the power grid, for power system planning, and for the operation of solar power plants. The Weather Research and Forecasting (WRF) model, with its solar radiation extension (WRF-Solar), has been used to forecast solar irradiance in various regions worldwide. However, the application of the WRF-Solar model to global horizontal irradiance (GHI) forecasting in West Africa, specifically in Ghana, has not been studied. This study aims to evaluate the performance of the WRF-Solar model for GHI forecasting in Ghana, focusing on three health centers (Kologo, Kumasi, and Akwatia) for the year 2021. We applied two one-way nested domains (D1 = 15 km and D2 = 3 km) to investigate the ability of the WRF-Solar model to forecast GHI up to 72 hours in advance under different atmospheric conditions. The initial and lateral boundary conditions were taken from the ECMWF operational forecasts. In addition, aerosol optical depth (AOD) data at 550 nm from the Copernicus Atmosphere Monitoring Service (CAMS) were considered. The study uses statistical metrics such as mean bias error (MBE) and root mean square error (RMSE) to evaluate the WRF-Solar model against observational data obtained from automatic weather stations at the three health centers in Ghana. The results of this study will contribute to the understanding of the capabilities and limitations of the WRF-Solar model for forecasting GHI in West Africa, particularly in Ghana, and provide valuable information for stakeholders involved in solar energy generation and grid integration towards optimized management in the region.
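The two verification metrics named above, mean bias error (MBE) and root mean square error (RMSE), are standard and easy to compute. The following minimal sketch uses made-up illustrative values, not data from the study:

```python
# Minimal sketch of MBE and RMSE for forecast verification.
import numpy as np

def mean_bias_error(forecast, observed):
    # Positive MBE: the forecast overestimates on average.
    return float(np.mean(forecast - observed))

def root_mean_square_error(forecast, observed):
    # RMSE penalizes large deviations more strongly than MBE.
    return float(np.sqrt(np.mean((forecast - observed) ** 2)))

ghi_obs = np.array([410.0, 520.0, 605.0, 480.0])  # W/m^2, observed (illustrative)
ghi_fc = np.array([430.0, 500.0, 650.0, 470.0])   # W/m^2, forecast (illustrative)

print(mean_bias_error(ghi_fc, ghi_obs))           # 8.75
print(root_mean_square_error(ghi_fc, ghi_obs))    # ~27.0
```

MBE captures systematic over- or underestimation and can mask large compensating errors, which is why it is typically reported alongside RMSE.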
The representation, or encoding, utilized in evolutionary algorithms has a substantial effect on their performance. Examination of the suitability of widely used representations for quality diversity optimization (QD) in robotic domains has yielded inconsistent results regarding the most appropriate encoding method. Given the domain-dependent nature of QD, additional evidence from other domains is necessary. This study compares the impact of several representations, including direct encoding, a dictionary-based representation, parametric encoding, compositional pattern producing networks, and cellular automata, on the generation of voxelized meshes in an architecture setting. The results reveal that some indirect encodings outperform direct encodings and can generate more diverse solution sets, especially when considering full phenotypic diversity. The paper introduces a multi-encoding QD approach that incorporates all evaluated representations in the same archive. Species of encodings compete on the basis of phenotypic features, leading to an approach that demonstrates similar performance to the best single-encoding QD approach. This is noteworthy, as it does not always require the contribution of the best-performing single encoding.
In memoriam Willy Lehnert
(2023)
Pitfalls of using sequence databases for heterologous expression studies - a technical review
(2023)
Synthesis of DNA fragments based on gene sequences available in public resources has become an efficient and affordable method that has gradually replaced traditional cloning efforts such as PCR cloning from cDNA. However, database entries based on genome sequencing results are prone to errors, which can lead to false sequence information and, ultimately, to errors in the functional characterization of proteins such as ion channels and transporters in heterologous expression systems. We have identified five common problems that repeatedly appear in public resources: 1) not every gene has yet been annotated; 2) not all gene annotations are necessarily correct; 3) transcripts may contain automated corrections; 4) there are mismatches between gene, mRNA, and protein sequences; and 5) splicing patterns often lack experimental validation. This technical review highlights these issues and provides a strategy to bypass them in order to avoid critical mistakes that could impact future studies of any gene/protein of interest in heterologous expression systems. Abstract figure legend: Projects involving heterologous gene expression typically follow similar steps. Initially, database research (A) is necessary to retrieve full or partial sequence information for a gene of interest. A multitude of annotated genome assemblies are deposited in public databases and are available for refined searches using individual sequence information. The search results need to be scrutinised and compared with already available information (B). Once the sequence has been determined, the DNA must be obtained by PCR or commercial synthesis (C) for further cloning procedures (D). Eventually, the DNA needs to be transfected (E) and expressed in, e.g., eukaryotic cells (F). Finally, the expression of the gene of interest needs to be documented and its function analysed (G).
Microbiome analyses are essential for understanding microorganism composition and diversity, but interpretation is often challenging due to biological and technical variables. DNA extraction is a critical step that can significantly bias results, particularly in samples containing a high abundance of challenging-to-lyse microorganisms. Taking into consideration the distinctive microenvironments observed in different bodily locations, our study sought to assess the extent of bias introduced by suboptimal bead-beating during DNA extraction across diverse clinical sample types. The question was whether complex targeted extraction methods are always necessary for reliable taxonomic abundance estimation through amplicon sequencing, or if simpler alternatives are effective for some sample types. Hence, for four different clinical sample types (stool, cervical swab, skin swab, and hospital surface swab samples), we compared the results achieved with the targeted manual extraction protocols routinely used in our research lab for each sample type with those from automated protocols not specifically designed for that purpose. Unsurprisingly, we found that for the stool samples, manual extraction protocols with vigorous bead-beating were necessary in order to avoid erroneous taxa proportions on all investigated taxonomic levels and, in particular, false under- or overrepresentation of important genera such as Blautia, Faecalibacterium, and Parabacteroides. However, interestingly, we found that the skin and cervical swab samples yielded similar results with all tested protocols. Our results suggest that the practical level of automation largely depends on the expected microenvironment, with skin and cervical swabs being much easier to process than stool samples. Prudent consideration is necessary when extending the conclusions of this study to applications beyond rough estimations of taxonomic abundance.
The epithelial sodium channel (ENaC) is a key regulator of sodium homeostasis that contributes to blood pressure control. ENaC open probability is adjusted by extracellular sodium ions, a mechanism referred to as sodium self-inhibition (SSI). With a growing number of identified ENaC gene variants associated with hypertension, there is an increasing demand for medium- to high-throughput assays allowing the detection of alterations in ENaC activity and SSI. We evaluated a commercially available automated two-electrode voltage-clamp (TEVC) system that records transmembrane currents of ENaC-expressing Xenopus oocytes in 96-well microtiter plates. We employed guinea pig, human and Xenopus laevis ENaC orthologs that display specific magnitudes of SSI. While demonstrating some limitations over traditional TEVC systems with customized perfusion chambers, the automated TEVC system was able to detect the established SSI characteristics of the employed ENaC orthologs. We were able to confirm a reduced SSI in a gene variant, leading to C479R substitution in the human α-ENaC subunit that has been reported in Liddle syndrome. In conclusion, automated TEVC in Xenopus oocytes can detect SSI of ENaC orthologs and variants associated with hypertension. For precise mechanistic and kinetic analyses of SSI, optimization for faster solution exchange rates is recommended.
Atomic oxygen is a key species in the mesosphere and thermosphere of Venus. It peaks in the transition region between the two dominant atmospheric circulation patterns, the retrograde super-rotating zonal flow below 70 km and the subsolar to antisolar flow above 120 km altitude. However, past and current detection methods are indirect and based on measurements of other molecules in combination with photochemical models. Here, we show direct detection of atomic oxygen on the dayside as well as on the nightside of Venus by measuring its ground-state transition at 4.74 THz (63.2 µm). The atomic oxygen is concentrated at altitudes around 100 km with a maximum column density on the dayside where it is generated by photolysis of carbon dioxide and carbon monoxide. This method enables detailed investigations of the Venusian atmosphere in the region between the two atmospheric circulation patterns in support of future space missions to Venus.
Host-derived succinate accumulates in the airways during bacterial infection. Here, we show that luminal succinate activates murine tracheal brush (tuft) cells through a signaling cascade involving the succinate receptor 1 (SUCNR1), phospholipase Cβ2, and the cation channel transient receptor potential channel subfamily M member 5 (TRPM5). Stimulated brush cells then trigger a long-range Ca2+ wave spreading radially over the tracheal epithelium through a sequential signaling process. First, brush cells release acetylcholine, which excites nearby cells via muscarinic acetylcholine receptors. From there, the Ca2+ wave propagates through gap junction signaling, reaching also distant ciliated and secretory cells. These effector cells translate activation into enhanced ciliary activity and Cl- secretion, which are synergistic in boosting mucociliary clearance, the major innate defense mechanism of the airways. Our data establish tracheal brush cells as a central hub in triggering a global epithelial defense program in response to a danger-associated metabolite.
Microorganisms not only contribute to the spoilage of food but can also cause illnesses through consumption. Consumer concerns and doubts about the shelf life of products and the resulting enormous amounts of food waste have led to a demand for a rapid, robust, and non-destructive method for the detection of microorganisms, especially in the food sector. Therefore, a rapid and simple sampling method for the Raman- and infrared (IR)-microspectroscopic study of microorganisms associated with spoilage processes was developed. For the subsequent evaluation, pre-processing routines as well as chemometric models for the classification of spoilage microorganisms were developed. The microbiological samples are taken using a disinfectable sampling stamp and measured by microspectroscopy without the usual pre-treatments such as purification, separation, washing, and centrifugation. The resulting complex multivariate data sets were pre-processed, reduced by principal component analysis, and classified by discriminant analysis. Classification of independent unlabeled test data showed that microorganisms could be classified at genus, species, and strain levels with an accuracy of 96.5 % (Raman) and 94.5 % (IR), respectively, despite large biological differences and the novel sampling strategy. As bacteria are exposed to constantly changing conditions and their adaptation mechanisms may make them inaccessible to conventional measurement methods, the methods and models developed were investigated for their suitability for microorganisms exposed to stress. Compared to normal growth conditions, spectral changes in lipids, polysaccharides, nucleic acids, and proteins were observed in microorganisms exposed to stress. Models were developed to discriminate microorganisms independent of the involvement of various stress factors and storage times.
Classification of the investigated bacteria yielded accuracies of 97.6 % (Raman) and 96.6 % (IR), respectively, and a robust and meaningful model was developed to discriminate different microorganisms at the genus, species, and strain levels. The obtained results are very promising and show that the methods and models developed for the discrimination of microorganisms as well as the investigation of stress factors on microorganisms by means of Raman- and IR-microspectroscopy have the potential to be used, for example, in the food sector for the rapid determination of surface contamination.
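The chemometric pipeline described above (dimensionality reduction by principal component analysis, followed by discriminant analysis) can be sketched as follows. This is a hedged toy example: the synthetic "spectra" with class-specific bands and the chosen component count are illustrative assumptions, not the study's data or settings.

```python
# Toy sketch of a PCA + linear discriminant analysis (LDA) classifier,
# mimicking the structure of a Raman/IR chemometric pipeline.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)
n_per_class, n_channels = 40, 300  # samples per class, spectral channels

def synth_spectra(peak_idx):
    # Noise floor plus one class-specific band: trivially separable toy data.
    base = rng.normal(0.0, 0.05, size=(n_per_class, n_channels))
    base[:, peak_idx] += 1.0
    return base

X = np.vstack([synth_spectra(50), synth_spectra(150), synth_spectra(250)])
y = np.repeat([0, 1, 2], n_per_class)  # three toy "genera"

model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
model.fit(X, y)
print(model.score(X, y))  # training accuracy on the easily separable toy data
```

In practice the reported accuracies are obtained on independent, unlabeled test data; the sketch only illustrates the model structure, not a validation protocol.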
Solar photovoltaic power output is modulated by atmospheric aerosols and clouds and thus contains valuable information on the optical properties of the atmosphere. As a ground-based data source with high spatiotemporal resolution it has great potential to complement other ground-based solar irradiance measurements as well as those of weather models and satellites, thus leading to an improved characterisation of global horizontal irradiance. In this work several algorithms are presented that can retrieve global tilted and horizontal irradiance and atmospheric optical properties from solar photovoltaic data and/or pyranometer measurements. Specifically, the aerosol (cloud) optical depth is inferred during clear sky (completely overcast) conditions. The method is tested on data from two measurement campaigns that took place in Allgäu, Germany in autumn 2018 and summer 2019, and the results are compared with local pyranometer measurements as well as satellite and weather model data. Using power data measured at 1 Hz and averaged to 1 minute resolution, the hourly global horizontal irradiance is extracted with a mean bias error compared to concurrent pyranometer measurements of 11.45 W m−2, averaged over the two campaigns, whereas for the retrieval using coarser 15 minute power data the mean bias error is 16.39 W m−2.
During completely overcast periods the cloud optical depth is extracted from photovoltaic power using a lookup table method based on a one-dimensional radiative transfer simulation, and the results are compared to both satellite retrievals as well as data from the COSMO weather model. Potential applications of this approach for extracting cloud optical properties are discussed, as well as certain limitations, such as the representation of 3D radiative effects that occur under broken cloud conditions. In principle this method could provide an unprecedented amount of ground-based data on both irradiance and optical properties of the atmosphere, as long as the required photovoltaic power data are available and are properly pre-screened to remove unwanted artefacts in the signal. Possible solutions to this problem are discussed in the context of future work.
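The lookup-table inversion described above can be illustrated with a one-dimensional sketch: a radiative transfer code (not included here) precomputes surface irradiance as a monotonic function of cloud optical depth (COD), and a measured irradiance is then inverted by interpolation. The table values below are invented for demonstration and are not from the paper's radiative transfer simulation.

```python
# Illustrative lookup-table retrieval of cloud optical depth (COD)
# from measured surface irradiance under overcast skies.
import numpy as np

# Hypothetical precomputed table: COD -> surface GHI (W/m^2), monotonically falling.
cod_grid = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
ghi_grid = np.array([620.0, 540.0, 380.0, 240.0, 130.0, 45.0])

def retrieve_cod(measured_ghi):
    """Invert the table by interpolation; np.interp requires increasing
    x-coordinates, so both arrays are reversed."""
    return float(np.interp(measured_ghi, ghi_grid[::-1], cod_grid[::-1]))

print(retrieve_cod(380.0))  # hits a grid point exactly: 5.0
print(retrieve_cod(310.0))  # interpolated between COD 5 and 10: 7.5
```

Real retrievals would interpolate in a multi-dimensional table (solar zenith angle, surface albedo, etc.); the one-dimensional case only conveys the inversion principle.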
Estimates of global horizontal irradiance (GHI) from reanalysis and satellite-based data are the most important information for the design and monitoring of PV systems in Africa, but their quality is unknown due to the lack of in situ measurements. In this study, we evaluate the performance of hourly GHI from state-of-the-art reanalysis and satellite-based products (ERA5, CAMS, MERRA-2, and SARAH-2) against 37 quality-controlled in situ measurements from novel meteorological networks established in Burkina Faso and Ghana under different weather conditions for the year 2020. The effects of clouds and aerosols are also considered in the analysis by using common performance measures for the main quality attributes and a new overall performance value for the joint assessment. The results show that the satellite data perform better than the reanalysis data under different atmospheric conditions. Nevertheless, both data sources exhibit significant errors, with RMSE values exceeding 150 W/m2 under cloudy skies compared to clear skies. The new measure of overall performance clearly shows that the hourly GHI derived from CAMS and SARAH-2 could serve as viable alternative data for assessing solar energy in the different climatic zones of West Africa.
PURPOSE
Cervical cancer (CC) is caused by a persistent high-risk human papillomavirus (hrHPV) infection. The cervico-vaginal microbiome may influence the development of (pre)cancer lesions. The aim of the study was (i) to evaluate the new CC screening program in Germany for the detection of high-grade CC precursor lesions, and (ii) to elucidate the role of the cervico-vaginal microbiome and its potential impact on cervical dysplasia.
METHODS
The microbiome of 310 patients referred to colposcopy was determined by amplicon sequencing and correlated with clinicopathological parameters.
RESULTS
Most patients were referred for colposcopy due to a positive hrHPV result in two consecutive years combined with a normal PAP smear. In 2.1% of these cases, a CIN III lesion was detected. There was a significant positive association between the PAP stage and Lactobacillus vaginalis colonization and between the severity of CC precursor lesions and Ureaplasma parvum.
CONCLUSION
In our cohort, the new cervical cancer screening program resulted in a low rate of additional CIN III detected. It is questionable whether these cases were only identified earlier with additional HPV testing before the appearance of cytological abnormalities, or the new screening program will truly increase the detection rate of CIN III in the long run. Colonization with U. parvum was associated with histological dysplastic lesions. Whether targeted therapy of this pathogen or optimization of the microbiome prevents dysplasia remains speculative.
Stably stratified Taylor–Green vortex simulations are performed by lattice Boltzmann methods (LBM) and compared to other recent works using Navier–Stokes solvers. The density variation is modeled with a separate distribution function in addition to the particle distribution function modeling the flow physics. Different stencils, forcing schemes, and collision models are tested and assessed. The overall agreement of the lattice Boltzmann solutions with reference solutions from other works is very good, even when no explicit subgrid model is used, but the quality depends on the LBM setup. Although the LBM forcing scheme is not decisive for the quality of the solution, the choice of the collision model and of the stencil are crucial for adequate solutions in underresolved conditions. The LBM simulations confirm the suppression of vertical flow motion for decreasing initial Froude numbers. To gain further insight into buoyancy effects, energy decay, dissipation rates, and flux coefficients are evaluated using the LBM model for various Froude numbers.
Neutral buoyancy has been used as an analog for microgravity from the earliest days of human spaceflight. Compared to other options on Earth, neutral buoyancy is relatively inexpensive and presents little danger to astronauts while simulating some aspects of microgravity. Neutral buoyancy removes somatosensory cues to the direction of gravity but leaves vestibular cues intact. Removal of both somatosensory and gravity-direction cues while floating in microgravity, or using virtual reality to establish conflicts between them, has been shown to affect the perception of distance traveled in response to visual motion (vection) and the perception of distance. Does removal of somatosensory cues alone by neutral buoyancy similarly impact these perceptions? During neutral buoyancy we found no significant difference in either perceived distance traveled or perceived size relative to Earth-normal conditions. This contrasts with differences in linear vection reported between short- and long-duration microgravity and Earth-normal conditions. These results indicate that neutral buoyancy is not an effective analog for microgravity for these perceptual effects.
Intelligent virtual agents provide a framework for simulating more life-like behavior and increasing plausibility in virtual training environments. They can improve the learning process if they portray believable behavior that can also be controlled to support the training objectives. In the context of this thesis, cognitive agents are considered a subset of intelligent virtual agents (IVA) with the focus on emulating cognitive processes to achieve believable behavior. The complexity of employed algorithms, however, is often limited since multiple agents need to be simulated in real-time. Available solutions focus on a subset of the indicated aspects: plausibility, controllability, or real-time capability (scalability). Within this thesis project, an agent architecture for attentive cognitive agents is developed that considers all three aspects at once. The result is a lightweight cognitive agent architecture that is customizable to application-specific requirements. A generic trait-based personality model influences all cognitive processes, facilitating the generation of consistent and individual behavior. An additional mapping process provides a formalized mechanism to transfer results of psychological studies to the architecture. Personality profiles are combined with an emotion model to achieve situational behavior adaptation. Which action an agent selects in a situation also influences plausibility. An integral element of this selection process is an agent's knowledge about its world. Therefore, synthetic perception is modeled and integrated into the architecture to provide a credible knowledge base. The developed perception module includes a unified sensor interface, a memory hierarchy, and an attention process. With the presented realization of the architecture (CAARVE), it is possible for the first time to simulate cognitive agents, whose behaviors are simultaneously computable in real-time and controllable. 
The architecture's applicability is demonstrated by integrating an agent-based traffic simulation built with CAARVE into a bicycle simulator for road-safety education. The developed ideas and their realization are evaluated within this work using different strategies and scenarios. For example, it is shown how CAARVE agents utilize personality profiles and emotions to plausibly resolve deadlocks in traffic simulations. Controllability and adaptability are demonstrated in additional scenarios. Using the realization, 200 agents can be simulated in real time (50 FPS), illustrating scalability. The achieved results verify that the developed architecture can generate plausible and controllable agent behavior in real time. The presented concepts and realizations provide a sound foundation for anyone interested in simulating IVA in real-time environments.
This research investigates the efficacy of multisensory cues for locating targets in Augmented Reality (AR). Sensory constraints can impair perception and attention in AR, leading to reduced performance due to factors such as conflicting visual cues or a restricted field of view. To address these limitations, the research proposes head-based multisensory guidance methods that leverage audio-tactile cues to direct users' attention towards target locations. The research findings demonstrate that this approach can effectively reduce the influence of sensory constraints, resulting in improved search performance in AR. Additionally, the thesis discusses the limitations of the proposed methods and provides recommendations for future research.
This paper presents a novel approach to address noise, vibration, and harshness (NVH) issues in electrically assisted bicycles (e-bikes) caused by the drive unit. By investigating and optimising the structural dynamics during early product development, NVH can be decisively improved and valuable resources can be saved, emphasising its significance for enhancing riding performance. The paper offers a comprehensive analysis of the mechanical interactions among the relevant components of the e-bike drive unit, culminating, to the best of our knowledge, in the first high-fidelity model of an entire e-bike drive unit. The proposed model uses the principles of elastic multibody dynamics (eMBD) to elucidate the structural dynamics in dynamic-transient calculations. Comparing power spectra between measured and simulated motion variables validates the chosen model assumptions. The measurements of physical samples utilise accelerometers, contactless laser Doppler vibrometry (LDV), and various test arrangements, which are replicated in simulations and make it possible to measure vibrations on rotating shafts as well as stationary structures. In summary, this integrated system-level approach can serve as a viable starting point for comprehending and managing the NVH behaviour of e-bikes.
Quality diversity algorithms can be used to efficiently create a diverse set of solutions to inform engineers' intuition. But quality diversity is not efficient on very expensive problems, requiring hundreds of thousands of evaluations. Even with the assistance of surrogate models, quality diversity needs hundreds or even thousands of evaluations, which can make its use infeasible. In this study we tackle this problem with a pre-optimization strategy: solving a lower-dimensional optimization problem and then mapping the solutions to the higher-dimensional case. For the use case of designing buildings that minimize wind nuisance, we show that flow features around 3D buildings can be predicted from 2D flow features around building footprints. By sampling the space of 2D footprints with a quality diversity algorithm, a predictive model can be trained that is more accurate than one trained on a set of footprints selected with a space-filling method such as the Sobol sequence. By simulating only 16 buildings in 3D, a set of 1024 building designs with low predicted wind nuisance is created. We show that better machine learning models can be produced by generating training data with quality diversity instead of common sampling techniques. The method can bootstrap generative design in a computationally expensive 3D domain and allows engineers to sweep the design space, understanding wind nuisance in early design phases.
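To make the space-filling baseline concrete, the sketch below generates a low-discrepancy point set with a Halton sequence, a close relative of the Sobol sequence mentioned in the abstract; the function names and point counts are illustrative and not taken from the study.

```python
def van_der_corput(n: int, base: int = 2) -> float:
    """Radical inverse of n in the given base (one Halton coordinate)."""
    q, denom = 0.0, 1.0
    while n:
        n, rem = divmod(n, base)
        denom *= base
        q += rem / denom
    return q

def halton_2d(count: int):
    """First `count` points of the 2-D Halton sequence (bases 2 and 3).

    Successive points fill the unit square evenly, which is the
    space-filling property such baselines rely on.
    """
    return [(van_der_corput(i, 2), van_der_corput(i, 3))
            for i in range(1, count + 1)]

points = halton_2d(8)
```

A quality diversity algorithm would instead bias the sampled footprints toward high-performing yet behaviorally diverse designs, rather than covering the parameter space uniformly.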
The increasing ubiquity of Artificial Intelligence (AI) poses significant political consequences. The rapid proliferation of AI over the past decade has prompted legislators and regulators to attempt to contain AI’s technological consequences. For Germany, relevant design requirements have been expressed by the European Commission’s High-Level Expert Group on Artificial Intelligence (HLEG AI), and, at the national level, by the German government’s Data Ethics Commission (DEK) as well as the German Bundestag’s Commission of Inquiry on Artificial Intelligence (EKKI).
Several species of (poly)saccharides and organic acids often occur simultaneously in various biological matrices, e.g., fruits, plant materials, and biological fluids, and the analysis of such matrices can be a challenging task. Using Aloe vera (A. vera) plant material as an example, the performance of several spectroscopic methods (80 MHz benchtop NMR, NIR, ATR-FTIR, and UV-Vis) for the simultaneous analysis of quality parameters of this plant material was compared. The determined parameters include (poly)saccharides such as aloverose, fructose, and glucose, as well as organic acids (malic, lactic, citric, isocitric, acetic, fumaric, benzoic, and sorbic acids). 500 MHz NMR and high-performance liquid chromatography (HPLC) served as the reference methods.
UV-Vis data can be used only to identify added preservatives (benzoic and sorbic acids) and the drying agent (maltodextrin), and for semiquantitative analysis of malic acid. NIR and MIR spectroscopy combined with multivariate regression deliver a more informative overview of A. vera extracts, additionally quantifying glucose, aloverose, fructose, and citric, isocitric, malic, and lactic acids. Low-field NMR measurements can be used for the quantification of aloverose, glucose, and malic, lactic, acetic, and benzoic acids. The benchtop NMR method was successfully validated in terms of robustness, stability, precision, reproducibility, and limits of detection (LOD) and quantification (LOQ).
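The core idea behind quantifying several analytes from overlapping spectra, as in the multivariate regression above, can be sketched with classical least squares on a toy system of two analytes and three wavelengths. The pure-component spectra and concentrations below are invented for illustration and are not the A. vera data; real workflows typically use PLS regression on full spectra.

```python
# Hypothetical pure-component "spectra" (absorbance per unit concentration)
# at three wavelengths; rows are wavelengths, columns are the two analytes.
S = [[0.9, 0.2],
     [0.5, 0.7],
     [0.1, 0.8]]

def quantify(mixture, S):
    """Solve the 2-analyte normal equations (S^T S) c = S^T y by hand."""
    a = sum(r[0] * r[0] for r in S)          # S^T S entries
    b = sum(r[0] * r[1] for r in S)
    d = sum(r[1] * r[1] for r in S)
    p = sum(r[0] * y for r, y in zip(S, mixture))  # S^T y entries
    q = sum(r[1] * y for r, y in zip(S, mixture))
    det = a * d - b * b
    return (d * p - b * q) / det, (a * q - b * p) / det

# Synthetic mixture: 2.0 units of analyte 1 plus 0.5 units of analyte 2.
y = [2.0 * r[0] + 0.5 * r[1] for r in S]
c1, c2 = quantify(y, S)
```

Because the mixture spectrum is an exact linear combination of the pure-component spectra, least squares recovers the two concentrations exactly; with real, noisy spectra the same equations give the best-fit estimates.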
All spectroscopic techniques are useful for the screening of (poly)saccharides and organic acids in plant extracts and should be applied according to their availability as well as the information and confidence required for the specific analytical goal. Benchtop NMR spectroscopy appears to be the most feasible solution for quality control of A. vera products.
Monitoring the content of dissolved ozone in purified water is often mandatory to ensure appropriate levels of disinfection and sanitization. However, quantification poses challenges, as colorimetric assays require laborious off-line analysis, while commercially available instruments for electrochemical process analysis are expensive and often lack the possibility for miniaturization and discretionary installation. In this study, potentiometric ionic polymer metal composite (IPMC) sensors for the determination of dissolved ozone in ultrapure water (UPW) systems are presented. Commercially available polymer electrolyte membranes are treated via an impregnation-reduction method to obtain nanostructured platinum layers. By applying 25 different synthesis conditions, layer thicknesses of 2.2 to 12.6 µm are obtained. Supporting radiographic analyses indicate that the platinum concentration of the impregnation solution has the highest influence on the obtained metal loading. The sensor response behavior is explained by a Langmuir pseudo-isotherm model and allows the quantification of dissolved ozone down to trace levels of less than 10 µg L⁻¹. Additional statistical evaluations show that the expected Pt loading and radiographic blackening levels can be predicted with high accuracy and significance (R²adj. > 0.90, p < 10⁻¹⁰) solely from the given synthesis conditions.
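A Langmuir pseudo-isotherm calibration of the kind described above can be sketched as follows; the saturation response E_max, affinity constant K, and concentration values are hypothetical placeholders, not the paper's fitted parameters.

```python
def langmuir(c, e_max, k):
    """Langmuir pseudo-isotherm: response saturates as sites fill up."""
    return e_max * k * c / (1.0 + k * c)

def fit_k(concs, responses, e_max, k_grid):
    """Brute-force least-squares fit of the affinity constant K."""
    def sse(k):
        return sum((langmuir(c, e_max, k) - r) ** 2
                   for c, r in zip(concs, responses))
    return min(k_grid, key=sse)

# Synthetic calibration data generated with K = 0.05 L/ug, E_max = 120 (a.u.)
concs = [1, 2, 5, 10, 20, 50, 100]              # dissolved ozone, ug/L
resp = [langmuir(c, 120.0, 0.05) for c in concs]
k_hat = fit_k(concs, resp, 120.0, [i / 1000 for i in range(1, 201)])
```

With noise-free synthetic data the grid search recovers K exactly; in practice both E_max and K would be fitted jointly with a nonlinear least-squares routine.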
This research studies in detail four different assays, namely DPPH (2,2-diphenyl-1-picrylhydrazyl), ABTS (2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid)), FRAP (ferric ion reducing antioxidant potential), and FC (Folin-Ciocalteu), to determine the antioxidant capacity of standard substances as well as 50 organosolv lignins and two kraft lignins. The coefficient of variation was determined for each method; it was lowest for ABTS and highest for DPPH. The best correlation was found between FRAP and FC, both of which rely on a single electron transfer mechanism. A good correlation was also observed between ABTS and both FRAP and FC, even though ABTS relies on a more complex reaction mechanism. The DPPH assay correlates only weakly with the others, implying that it reflects different antioxidative attributes owing to its different reaction mechanism. Lignins obtained from paulownia and silphium have been investigated for the first time regarding their antioxidant capacity. Paulownia lignin is in the same range as beech wood lignin, while silphium lignin resembles wheat straw lignin. Miscanthus lignin is an exception among the grass lignins and possesses a significantly higher antioxidant capacity. All lignins possess a good antioxidant capacity and are thus promising candidates for various applications, e.g., as additives in food packaging or for biomedical purposes.
Electric vehicles (EVs) are rapidly growing in popularity, but range variability has become an important research area with significant implications for EV performance, usability, and overall market adoption. This study aims to unravel the complexities of range variability by examining the contributing factors and offering innovative strategies to mitigate these differences during pack design. Through a detailed analysis of cell parameter deviation, cell connections, battery configuration, battery pack size, and driving behavior, the research illuminates their impact on extractable energy and driving range. The study employed a comprehensive approach and conducted systematic simulation-based experimentation to identify the optimal battery pack configuration based on maximum extractable energy, minimal variability and maximum range. The results reveal insights into the relationship between discharge rate and battery pack performance, and the impact of cell parameter variations on pack energy output. This research advances the understanding of EV performance optimisation, reduces pack-to-pack variability, and extends battery pack lifespan.
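One driver of the pack-to-pack variability discussed above is that, in a series string, the cell with the lowest capacity limits the extractable charge of the whole string. The Monte-Carlo sketch below illustrates this effect; the cell capacity, its spread, and the pack layout are assumed values for illustration, not the study's data.

```python
import random

def series_string_energy(capacities_ah, nominal_v=3.6):
    """Extractable energy of a series string: the weakest cell limits
    the usable charge, which every cell in the string must carry."""
    return min(capacities_ah) * nominal_v * len(capacities_ah)

def simulate(n_cells, n_packs, mean_ah=5.0, sigma_ah=0.05, seed=42):
    """Monte-Carlo sketch of pack-to-pack energy variability caused by
    normally distributed cell-capacity deviation."""
    rng = random.Random(seed)
    energies = []
    for _ in range(n_packs):
        cells = [rng.gauss(mean_ah, sigma_ah) for _ in range(n_cells)]
        energies.append(series_string_energy(cells))
    return energies

packs = simulate(n_cells=100, n_packs=500)
spread = max(packs) - min(packs)
```

Longer strings amplify the effect: the more cells are connected in series, the lower the expected minimum capacity, which is why parallel groupings and cell matching are common mitigation strategies during pack design.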