Software developers build complex systems using many third-party libraries. Documentation is key to understanding and using the functionality provided via the libraries’ APIs. Accordingly, functionality is the main focus of contemporary API documentation, while cross-cutting concerns such as security are almost never considered, especially when the API itself does not provide security features. Documentation of JavaScript libraries for use in web applications, for example, does not specify how to add or adapt a Content Security Policy (CSP) to mitigate content injection attacks such as Cross-Site Scripting (XSS). This is unfortunate, as security-relevant API documentation might influence secure coding practices and help against prevailing major vulnerabilities such as XSS. For the first time, we study the effects of integrating security-relevant information into non-security API documentation. For this purpose, we took CSP as an exemplary study object and extended the official Google Maps JavaScript API documentation with security-relevant CSP information in three distinct manners. We then evaluated the usage of these variations in a between-group eye-tracking lab study involving N=49 participants. Our observations suggest: (1) Developers focus on elements with code examples. They mostly skim the documentation while searching for a quick solution to their programming task, which corroborates results of related studies. (2) The location where CSP-related code examples are placed in non-security API documentation significantly impacts the time it takes to find this security-relevant information. In particular, the study results showed that proximity to function-related code examples in the documentation is a decisive factor. (3) Examples significantly help developers produce secure CSP solutions. (4) Developers have additional information needs that our approach cannot meet.
Overall, our study contributes a first understanding of the impact of security-relevant information in non-security API documentation on CSP implementation. Although further research is required, our findings emphasize that API producers should take responsibility for adequately documenting security aspects, thus supporting the sensitization and training of developers to implement secure systems. This responsibility also holds in seemingly non-security-relevant contexts.
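As a concrete illustration of the kind of security-relevant snippet at stake, the following sketch assembles a CSP header value that would permit loading the Google Maps JavaScript API. The directive sources are illustrative assumptions, not official Google guidance and not the exact material used in the study.

```python
# Illustrative sketch: serializing a Content-Security-Policy header value.
# The allowed source lists below are assumptions for illustration only.

def build_csp(directives: dict[str, list[str]]) -> str:
    """Serialize a directive mapping into a CSP header value."""
    return "; ".join(f"{name} {' '.join(sources)}" for name, sources in directives.items())

maps_csp = build_csp({
    "default-src": ["'self'"],
    "script-src": ["'self'", "https://maps.googleapis.com"],
    "img-src": ["'self'", "data:", "https://maps.gstatic.com"],
    "style-src": ["'self'", "'unsafe-inline'", "https://fonts.googleapis.com"],
})

print(maps_csp)
```

A server would send this string as the `Content-Security-Policy` response header; whether such a snippet is placed next to the functional code examples is exactly the placement question the study examines.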
Sharing economies enabled by technical platforms have been studied regarding their economic, legal, and social effects, as well as with regard to their possible influences on CSCW topics such as work, collaboration, and trust. While a lot of current research focuses on the sharing economy and related communities, there is little work addressing the phenomenon from a socio-technical point of view. Our workshop is meant to address this gap. Building on research themes and discussions from last year’s ECSCW, we seek to engage more deeply with topics such as novel socio-technical approaches for enabling sharing communities, issues around digital consumer and worker protection, and emerging challenges and opportunities of existing platforms and approaches.
3-Hydroxyisobutyrate Dehydrogenase (HIBADH) deficiency - a novel disorder of valine metabolism
(2021)
3-Hydroxyisobutyric acid (3HiB) is an intermediate in the degradation of the branched-chain amino acid valine. Disorders in valine degradation can lead to 3HiB accumulation and its excretion in the urine. This article describes the first two patients with a new metabolic disorder, 3-hydroxyisobutyrate dehydrogenase (HIBADH) deficiency, its phenotype and its treatment with a low-valine diet. The detected mutation in the HIBADH gene leads to nonsense-mediated mRNA decay of the mutant allele and to a complete loss of function of the enzyme. Under strict adherence to a low-valine diet, a rapid decrease of 3HiB excretion in the urine was observed. Due to limited patient numbers and intrafamilial differences in phenotype, with one affected and one unaffected individual, the clinical phenotype of HIBADH deficiency needs further evaluation.
Animal models are often needed in cancer research, but some research questions may be answered with other models, e.g., 3D replicas of patient-specific data, as these mirror the anatomy in more detail. We therefore developed a simple eight-step process to fabricate a 3D replica from computed tomography (CT) data using solely open-access software, and we describe the method in detail. For evaluation, we performed experiments regarding endoscopic tumor treatment with magnetic nanoparticles by magnetic hyperthermia and local drug release. For this, the magnetic nanoparticles need to be accumulated at the tumor site via a magnetic field trap. Using the developed eight-step process, we printed a replica of a locally advanced pancreatic cancer and used it to find the best position for the magnetic field trap. In addition, we describe a method to hold these magnetic field traps stably in place. The results are highly important for the development of endoscopic tumor treatment with magnetic nanoparticles, as the handling and the stable positioning of the magnetic field trap at the stomach wall in close proximity to the pancreatic tumor could be defined and practiced. Finally, the detailed description of the workflow and the use of open-access software allow for a wide range of possible uses.
New cars are increasingly "connected" by default. Since not having a car is not an option for many people, understanding the privacy implications of driving connected cars and using their data-based services is an even more pressing issue than for expendable consumer products. While risk-based approaches to privacy are well established in law, they have only begun to gain traction in HCI. These approaches are understood not only to increase acceptance but also to help consumers make choices that meet their needs. To the best of our knowledge, perceived risks in the context of connected cars have not been studied before. To address this gap, our study reports on the analysis of a survey with 18 open-ended questions distributed to 1,000 households in a medium-sized German city. Our findings provide qualitative insights into existing attitudes and use cases of connected car features and, most importantly, a list of perceived risks themselves. Taking the perspective of consumers, we argue that these can help inform consumers about data use in connected cars in a user-friendly way. Finally, we show how these risks fit into and extend existing risk taxonomies from other contexts with a stronger social perspective on risks of data use.
One of the biggest challenges faced by many tech start-ups from developed markets is to have validated market-fit products/services and to see their solutions implemented. In several sectors, stringent regulations and the law of the handicap of a head start at home can be hurdles that limit the development and even the survival potential of these start-ups. Tech start-ups seeking implementation, learning, and legitimacy may find a solution in expanding into emerging markets. Emerging markets offer business opportunities in sectors in need of new technologies and are “fertile grounds” for developing and testing internationalisation business models. We present here a process designed to help tech start-ups identify, access, shape and seize these opportunities and to overcome both their own specificities and those of emerging markets. The three phases of the proposed process cover the entry node concept, partnership, and the joint development of business, operating and revenue models. The Design Science Research paradigm is used for the design and evaluation of the process. To show the relevance of this process, a case study on the expansion into Morocco of a Dutch start-up active in e-health is used. The study shows the importance of the process for embeddedness in a relevant local value network with a relevant adopters’ system, a key enabler for achieving time- and cost-effective expansion in those specific business and institutional contexts. A pilot to assess the proposed models and evidence of benefits is under development. To boost their chances of growth, tech start-ups from developed markets should consider expansion into emerging markets in their strategy. It would be beneficial for policy makers to adopt a strategy by which to assist tech start-ups in accessing value networks in emerging markets. It is also important for policy makers from emerging markets to consider developing schemes to attract tech start-ups from developed markets.
A qualitative study of Machine Learning practices and engineering challenges in Earth Observation
(2021)
Machine Learning (ML) is ubiquitously on the advance. Like many domains, Earth Observation (EO) increasingly relies on ML applications, where ML methods are applied to process vast amounts of heterogeneous and continuous data streams to answer socially and environmentally relevant questions. However, developing such ML-based EO systems remains challenging: development processes and employed workflows are often barely structured and poorly reported. The application of ML methods and techniques is considered opaque, and this lack of transparency is contradictory to the responsible development of ML-based EO applications. To improve this situation, a better understanding of the current practices and engineering-related challenges in developing ML-based EO applications is required. In this paper, we report observations from an exploratory study in which five experts shared their views on ML engineering in semi-structured interviews. We analysed these interviews with coding techniques as often applied in the domain of empirical software engineering. The interviews provide informative insights into the practical development of ML applications and reveal several engineering challenges. In addition, interviewees participated in a novel workflow sketching task, which provided a tangible reflection of implicit processes. Overall, the results confirm a gap between theoretical conceptions and real practices in ML development, even though the sketched workflows were abstract and textbook-like. The results pave the way for a large-scale investigation of requirements for ML engineering in EO.
The dataset contains the following data from successful and failed executions of the Toyota HSR robot placing a book on a shelf:
- RGB images from the robot's head camera
- Depth images from the robot's head camera
- Rendered images of the robot's 3D model from the point of view of the robot's head camera
- Force-torque readings from a wrist-mounted force-torque sensor
- Joint efforts, velocities and positions
- Extrinsic and intrinsic camera calibration parameters
- Frame-level anomaly annotations

The anomalies that occur during execution include:
- The manipulated book falling down
- Books on the shelf being disturbed significantly
- Camera occlusions
- The robot being disturbed by an external collision

The dataset is split into a train, validation and test set with the following numbers of trials:
- Train: 48 successful trials
- Validation: 6 successful trials
- Test: 60 anomalous trials and 7 successful trials
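The split described above can be captured in a small index. The trial counts come from the dataset description; the key names in this sketch are assumptions, not the dataset's actual file layout.

```python
# Minimal sketch of an index for the described dataset splits. Trial counts
# are taken from the dataset description; the label names are assumptions.

SPLITS = {
    "train":      {"successful": 48, "anomalous": 0},
    "validation": {"successful": 6,  "anomalous": 0},
    "test":       {"successful": 7,  "anomalous": 60},
}

def total_trials(split: str) -> int:
    counts = SPLITS[split]
    return counts["successful"] + counts["anomalous"]

print({name: total_trials(name) for name in SPLITS})
```

Only the test set mixes successful and anomalous executions, which matches the common protocol of training anomaly detectors on normal data only.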
Designs for decorative surfaces, such as flooring, must cover several square meters to avoid visible repeats. While the use of desktop systems to support the designer is feasible, it is challenging for a non-domain expert to get the right impression of a surface's appearance due to limited display sizes and a potentially unnatural interaction with digital designs. At the same time, large-format editing of structure and gloss is becoming increasingly important, as advances in the printing industry allow for more faithful reproduction of such surface details. Unfortunately, existing systems for visualizing surface designs cannot adequately account for gloss, especially for non-domain experts: the complex interplay of light sources and camera position must be controlled through software controls, so only small parts of the data set can be properly inspected at a time, and real-world lighting is not considered. This work presents a system for the processing and realistic visualization of large decorative surface designs. To this end, we present a tabletop solution that is coupled to a live 360° video feed and a spatial tracking system. This allows for reproducing natural view-dependent effects such as real-world reflections and live image-based lighting, and for interacting with the design using virtual light sources through natural interaction techniques, enabling a more accurate inspection even for non-domain experts.
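The view-dependent gloss effects described here can be illustrated with the classic Phong specular term, in which highlight intensity depends on the viewer's position relative to the mirror reflection of the light. This is a generic graphics formula for illustration, not the system's actual renderer.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def phong_specular(normal, light_dir, view_dir, shininess=32.0):
    """Classic Phong specular term: perceived gloss depends on the view direction."""
    n = normalize(normal); l = normalize(light_dir); v = normalize(view_dir)
    # Reflect the light direction about the surface normal: r = 2(n.l)n - l
    ndotl = sum(a * b for a, b in zip(n, l))
    r = tuple(2 * ndotl * a - b for a, b in zip(n, l))
    return max(0.0, sum(a * b for a, b in zip(r, v))) ** shininess

# Looking along the mirror direction yields full highlight intensity ...
head_on = phong_specular((0, 0, 1), (0, 0, 1), (0, 0, 1))
# ... while a grazing view direction sees almost none.
grazing = phong_specular((0, 0, 1), (0, 0, 1), (1, 0, 0.1))
print(head_on, grazing)
```

This strong dependence on camera position is why static software controls make gloss inspection tedious, and why head tracking over a tabletop display helps.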
The article explores SME (small and medium-sized enterprises) brand strategies as a means to position and successfully engage in competitive markets. A derived typology of brand strategy types deals with social profiling and sheds light on the brand strategy internalization of two current managerial paradigms: sustainability and co-creation. N = 895 German SME wineries were examined, leaning on a netnographic analysis of predominantly websites and social media interactions. A two-step clustering method thereby identified eight winery SME brand strategy types. The importance of sustainability across the eight identified brand strategy types is significant. Co-creation turned out to be a key profiling trait characterizing one brand strategy type. The typology illustrates strategic richness, with brand strategies leaning predominantly on traditional values, on sustainability, on external reputation, or on more innovative customer-centric concepts such as co-creation. Hereby, the typology and the identified brand levers invite practitioners to strategically design brand management, governance, and sustainability. Wineries that focus on traditional positioning and legitimacy were found to be cautious in deploying co-creation through social media. Winery brands characterized by engagement in digital co-creation apparently either tend to expand their scope or partially combine it with traditional values, making them the most diverse type identified. Sustainability evidently needs to be addressed by all brand strategies. Despite the industry and country focus, the analyses illustrate the relevance of socially oriented profiling and highlight that sustainability has reached the status of a fundamental business approach that still allows for differentiation. Furthermore, the business models of the SMEs need to deliver the communicated values.
Actors
(2021)
Social protection is for many international organizations a state’s affair. While the state definitely plays an important role, the state is by far not the only actor and there is no predefined institutional arrangement of how social protection should be implemented. An exclusive focus on the state would therefore be short-sighted when assessing and comparing the performance of social protection systems. It is hence important to understand the mix of actors involved, the type of contribution they can make to social protection and their modes of cooperation. This contribution will therefore first sketch out the role and interplay of the main actors in social protection and then challenge some of the common assumptions made around how roles are best allocated in the social protection system concerning the providers of informal social protection, the private sector, civil society organizations (CSO) as well as international actors.
Solving transport network problems can be complicated by non-linear effects. In the particular case of gas transport networks, the most complex non-linear elements are compressors and their drives. They are described by a system of equations composed of a piecewise linear ‘free’ model for the control logic and a non-linear ‘advanced’ model for the calibrated characteristics of the compressor. For all element equations, certain stability criteria must be fulfilled, ensuring the absence of folds in the associated system mapping. In this paper, we consider a transformation (warping) of a system from the space of calibration parameters to the space of transport variables that satisfies these criteria. The algorithm drastically improves the stability of the network solver. Numerous tests on realistic networks show that a nearly 100% convergence rate of the solver is achieved with this approach.
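The 'no folds' requirement can be sketched with a toy piecewise linear characteristic: if the segment values are monotone, the element mapping is invertible and Newton-type network solvers behave well; a non-monotone characteristic folds back on itself. The breakpoints and values below are invented for illustration and are not the paper's actual compressor model.

```python
# Toy illustration of a fold-free vs folded piecewise linear characteristic.
import bisect

def piecewise_linear(breakpoints, values, x):
    """Evaluate a piecewise linear characteristic at x (clamped at the ends)."""
    if x <= breakpoints[0]:
        return values[0]
    if x >= breakpoints[-1]:
        return values[-1]
    i = bisect.bisect_right(breakpoints, x) - 1
    t = (x - breakpoints[i]) / (breakpoints[i + 1] - breakpoints[i])
    return values[i] + t * (values[i + 1] - values[i])

def is_fold_free(values):
    """Monotone segment values mean the mapping has no folds."""
    return all(b >= a for a, b in zip(values, values[1:]))

bp = [0.0, 1.0, 2.0, 3.0]
good = [0.0, 0.5, 1.5, 3.0]   # monotone: stable
bad = [0.0, 1.0, 0.5, 3.0]    # non-monotone: folded

print(is_fold_free(good), is_fold_free(bad), piecewise_linear(bp, good, 1.5))
```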
The clear-sky radiative effect of aerosol–radiation interactions is of relevance for our understanding of the climate system. The influence of aerosol on the surface energy budget is of high interest for the renewable energy sector. In this study, the radiative effect is investigated in particular with respect to seasonal and regional variations for the region of Germany and the year 2015 at the surface and top of atmosphere using two complementary approaches.
First, an ensemble of clear-sky models which explicitly consider aerosols is utilized to retrieve the aerosol optical depth and the surface direct radiative effect of aerosols by means of a clear-sky fitting technique. For this, short-wave broadband irradiance measurements in the absence of clouds are used as a basis. A clear-sky detection algorithm is used to identify cloud-free observations. Considered are measurements of the short-wave broadband global and diffuse horizontal irradiance with shaded and unshaded pyranometers at 25 stations across Germany within the observational network of the German Weather Service (DWD). The clear-sky models used are the Modified MAC model (MMAC), the Meteorological Radiation Model (MRM) v6.1, the Meteorological–Statistical solar radiation model (METSTAT), the European Solar Radiation Atlas (ESRA), Heliosat-1, the Center for Environment and Man solar radiation model (CEM), and the simplified Solis model. The definition of aerosol and atmospheric characteristics of the models are examined in detail for their suitability for this approach.
Second, the radiative effect is estimated using explicit radiative transfer simulations with inputs on the meteorological state of the atmosphere, trace gases and aerosol from the Copernicus Atmosphere Monitoring Service (CAMS) reanalysis. The aerosol optical properties (aerosol optical depth, Ångström exponent, single scattering albedo and asymmetry parameter) are first evaluated with AERONET direct sun and inversion products. The largest inconsistency is found for the aerosol absorption, which is overestimated by about 0.03 or about 30 % by the CAMS reanalysis. Compared to the DWD observational network, the simulated global, direct and diffuse irradiances show reasonable agreement within the measurement uncertainty. The radiative kernel method is used to estimate the resulting uncertainty and bias of the simulated direct radiative effect. The uncertainty is estimated to −1.5 ± 7.7 and 0.6 ± 3.5 W m−2 at the surface and top of atmosphere, respectively, while the annual-mean biases at the surface, top of atmosphere and total atmosphere are −10.6, −6.5 and 4.1 W m−2, respectively.
The retrieval of the aerosol radiative effect with the clear-sky models shows a high level of agreement with the radiative transfer simulations, with an RMSE of 5.8 W m−2 and a correlation of 0.75. The annual mean of the REari (radiative effect of aerosol–radiation interactions) at the surface for the 25 DWD stations shows a value of −12.8 ± 5 W m−2 as the average over the clear-sky models, compared to −11 W m−2 from the radiative transfer simulations. Since all models assume a fixed aerosol characterization, the annual cycle of the aerosol radiative effect cannot be reproduced. Out of this set of clear-sky models, the largest level of agreement is shown by the ESRA and MRM v6.1 models.
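The clear-sky fitting idea can be sketched with a toy Beer-Lambert attenuation model, I = I0 * exp(-m * tau): given an observed cloud-free direct irradiance, search for the optical depth tau that reproduces it. This toy is not any of the named clear-sky models, and the solar constant and air mass values are illustrative assumptions.

```python
import math

def direct_irradiance(tau, i0=1361.0, airmass=1.5):
    """Toy Beer-Lambert clear-sky model for broadband direct irradiance."""
    return i0 * math.exp(-airmass * tau)

def fit_aod(observed, airmass=1.5):
    """Grid-search the optical depth that best reproduces the observation."""
    return min((abs(direct_irradiance(t, airmass=airmass) - observed), t)
               for t in (i / 1000 for i in range(0, 1001)))[1]

true_tau = 0.2
obs = direct_irradiance(true_tau)
print(round(fit_aod(obs), 3))
```

The actual retrieval uses far more elaborate parameterizations of aerosol and atmospheric state, but the inversion principle of matching modeled to measured irradiance is the same.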
This study sought to apply the Structure-Conduct-Performance (SCP) paradigm to Africa's air transport landscape in general. To do so, it examines the past, present, and future expectations of four of Sub-Saharan Africa's biggest aviation economies, namely South Africa, Kenya, Ethiopia, and Nigeria. Secondary data containing historical passenger traffic was analysed, and predictions for growth in the next ten years were proposed. The findings suggest that existing liberalization initiatives, such as the Yamoussoukro Declaration (YD), have produced less than the expected benefits. However, the future of aviation in Africa is somewhat positive, with a growth trajectory expected to follow a linear and gradual path supported by various initiatives, including the Single African Air Transport Market (SAATM) and the African Continental Free Trade Area (AfCFTA). The study's contribution is to illuminate the current discourse on the aviation sector in Africa through the Structure-Conduct-Performance paradigm and to suggest a conceptual model that could be applied to future studies on aviation in Africa.
Voice assistants (VAs) collect data about users’ daily life including interactions with other connected devices, musical preferences, and unintended interactions. While users appreciate the convenience of VAs, their understanding and expectations of data collection by vendors are often vague and incomplete. By making the collected data explorable for consumers, our research-through-design approach seeks to unveil design resources for fostering data literacy and help users in making better informed decisions regarding their use of VAs. In this paper, we present the design of an interactive prototype that visualizes the conversations with VAs on a timeline and provides end users with basic means to engage with data, for instance allowing for filtering and categorization. Based on an evaluation with eleven households, our paper provides insights on how users reflect upon their data trails and presents design guidelines for supporting data literacy of consumers in the context of VAs.
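The filtering and categorization interactions such a timeline offers can be sketched over a small log of conversation records. The records, category names, and field names below are invented examples, not data from the study or any vendor's actual export format.

```python
# Sketch of timeline filtering/categorization over invented VA interaction records.
from datetime import date
from collections import Counter

interactions = [
    {"day": date(2021, 3, 1), "category": "music",      "utterance": "play jazz"},
    {"day": date(2021, 3, 1), "category": "smart_home", "utterance": "lights off"},
    {"day": date(2021, 3, 2), "category": "unintended", "utterance": "(misactivation)"},
    {"day": date(2021, 3, 2), "category": "music",      "utterance": "next song"},
]

def filter_by_category(records, category):
    return [r for r in records if r["category"] == category]

def counts_per_category(records):
    return Counter(r["category"] for r in records)

print(counts_per_category(interactions))
print(filter_by_category(interactions, "unintended"))
```

Surfacing categories such as unintended activations is exactly the kind of data trail the prototype aims to make visible to end users.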
Many workers experience their jobs as effortful or even stressful, which can result in strain. Although recovery from work would be an adaptive strategy to prevent the adverse effects of work-related strain, many workers face problems finding enough time to rest and to mentally disconnect from work during nonwork time. What goes on in workers’ minds after a stressful workday? What is it about their jobs that makes them think about their work? This special issue aims to bridge the gap between research on recovery processes mainly examined in Occupational Health Psychology, and research on work stress and working hours, often investigated in the field of Human Resource Management. We first summarize conceptual and theoretical streams from both fields of research. In the following, we discuss the contributions of the five special issue papers and conclude with key messages and directions for further research.
The analysis of used engine oils from industrial engines enables the study of engine wear and oil degradation in order to evaluate the necessity of oil changes. As the matrix composition of an engine oil strongly depends on its intended application, meaningful diagnostic oil analyses pose considerable challenges. Owing to the broad spectrum of available oil matrices, we have evaluated the applicability of using an internal standard and/or preceding sample digestion for the elemental analysis of used engine oils via inductively coupled plasma optical emission spectroscopy (ICP-OES). Elements originating from both wear particles and additives, as well as particle size influences, could be clearly recognized by their distinct digestion behaviour. While a precise determination of most wear elements can be achieved in the oily matrix, the measurement of additives is preferably performed after sample digestion. Considering a dataset of physicochemical parameters and elemental compositions for several hundred used engine oils, we further investigated the feasibility of predicting the identity and overall condition of an unknown combustion engine using the machine learning library XGBoost. A maximum accuracy of 89.6% in predicting the engine type was achieved, with a mean error of less than 10% of the observed timeframe in predicting the oil running time and even less than 4% for the total engine running time, based purely on common oil check data. Furthermore, obstacles and possibilities to improve the performance of the machine learning models were analysed, and the factors that enabled the prediction were explored with SHapley Additive exPlanations (SHAP). Our results demonstrate that both the identification of an unknown engine and a lifetime assessment can be performed as a first estimation of the actual sample without requiring meticulous documentation.
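The idea of identifying an engine from its oil's elemental fingerprint can be sketched with a toy nearest-centroid classifier. This stands in for the XGBoost models actually used in the study; the engine classes, element set, and concentration values are all invented for illustration.

```python
# Toy nearest-centroid classifier over elemental 'fingerprints' (ppm values),
# a stand-in for the study's XGBoost pipeline; all numbers are invented.
import math

CENTROIDS = {  # mean element concentrations per engine type (invented)
    "gas_engine":    {"Fe": 20.0, "Cu": 5.0,  "Zn": 900.0},
    "diesel_engine": {"Fe": 45.0, "Cu": 12.0, "Zn": 1200.0},
}

def distance(sample, centroid):
    return math.sqrt(sum((sample[e] - centroid[e]) ** 2 for e in centroid))

def predict_engine(sample):
    return min(CENTROIDS, key=lambda name: distance(sample, CENTROIDS[name]))

print(predict_engine({"Fe": 50.0, "Cu": 10.0, "Zn": 1150.0}))
```

A gradient boosting model additionally handles non-linear interactions between elements and physicochemical parameters, which is where the reported accuracy comes from.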
Ice accumulation on the blades of wind turbines can cause them to describe anomalous rotations or no rotation at all, thus affecting the generation of electricity and power output. In this work, we investigate the problem of ice accumulation in wind turbines by framing it as anomaly detection in multivariate time series. Our approach consists of two main parts: first, learning low-dimensional representations of time series using a Variational Recurrent Autoencoder (VRAE), and second, using unsupervised clustering algorithms to classify the learned representations as normal (no ice accumulated) or abnormal (ice accumulated). We evaluated our approach on a custom wind turbine time series dataset; for the two-class problem (one normal versus one abnormal class), we obtained a classification accuracy of up to 96% on test data. For the multi-class problem (one normal versus multiple abnormal classes), we present a qualitative analysis of the low-dimensional learned latent space, providing insights into the capacity of our approach to tackle such a problem. The code to reproduce this work can be found at https://github.com/agrija9/Wind-Turbines-VRAE-Paper.
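The second stage of the approach, clustering latent representations into normal and abnormal groups, can be sketched with a tiny 2-means loop over invented 2-D points standing in for VRAE latent codes. This is an illustrative toy, not the paper's actual clustering setup.

```python
# Toy 2-means over invented 2-D 'latent' points: well-separated normal and
# abnormal embeddings fall into distinct clusters.
import math

def two_means(points, iters=10):
    centers = [points[0], points[-1]]
    for _ in range(iters):
        groups = ([], [])
        for p in points:
            d = [math.dist(p, c) for c in centers]
            groups[d.index(min(d))].append(p)
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) for g in groups if g]
    return centers, groups

normal = [(0.1, 0.0), (0.2, 0.1), (0.0, 0.2)]    # invented no-ice latents
abnormal = [(5.0, 5.1), (5.2, 4.9)]              # invented iced latents
centers, groups = two_means(normal + abnormal)
print(len(groups[0]), len(groups[1]))
```

In practice the separation of the clusters depends entirely on how well the VRAE compresses normal and anomalous dynamics into distinct regions of the latent space.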
This study investigated the application potential of larvae of the Black Soldier Fly, Hermetia illucens (L. 1758) (Diptera: Stratiomyidae), for wastewater treatment, and their potential to remove chemical oxygen demand (COD), ammonia, and phosphorus from liquid manure residue and municipal wastewater containing 1% solids content. Black Soldier Fly larvae were found to reduce the concentration of COD but, unfortunately, to increase the concentrations of ammonia and phosphorus. Feeding on the organic waste of liquid manure residue, the larvae increased their weight by 365% in a solution with 12% solids content and by 595% in a solution with 6% solids content. The study also showed that the larvae can survive in a solution of 1% solids content and can reduce COD by up to 86.4% for liquid manure residue and 46.9% for municipal wastewater after 24 hours. Overall, ammonia increased by 43.9% for liquid manure residue and 98.6% for municipal wastewater, and total phosphorus increased by 11.0% and 88.6% for liquid manure residue and municipal wastewater, respectively, over the 8-day study. Transparent environments tended to reduce the COD content more than dark environments, both for the liquid manure residue (55.8% and 65.4%) and for municipal wastewater (71.5% and 66.4%).
Atomic oxygen in the mesosphere and lower thermosphere measured by terahertz heterodyne spectroscopy
(2021)
Atomic oxygen is a main component of the mesosphere and lower thermosphere (MLT). The photochemistry and the energy balance of the MLT are governed by atomic oxygen, and it also serves as a tracer for dynamical motions in the MLT. However, atomic oxygen is difficult to measure with remote sensing techniques. Concentrations can be inferred indirectly from the oxygen airglow or from observations of OH, which is involved in photochemical processes related to atomic oxygen. Such measurements have been performed with several satellite instruments such as SCIAMACHY, SABER, WINDII and OSIRIS. However, these methods are indirect and rely on photochemical models and assumptions such as quenching rates, radiative lifetimes, and reaction coefficients. The results are not always in agreement, particularly when obtained with different instruments.
Cancer is one of the leading causes of death worldwide [183], with lung tumors being the most frequent cause of cancer deaths in men as well as one of the most common cancers diagnosed in women [40]. As symptoms often arise only in advanced stages, an early diagnosis is especially important to ensure the best and earliest possible treatment. To achieve this, Computed Tomography (CT) scans are frequently used for tumor detection and diagnosis. We will present examples of publicly available CT image data of lung cancer patients and discuss possible methods to realize a system for automated cancer diagnosis. We will also examine the recent SPIE-AAPM Lung CT Challenge [10] data set in detail and describe possible methods and challenges for image segmentation and classification based on this data set.
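One of the simplest segmentation steps discussed for such CT data is intensity thresholding, since air-filled lung tissue is much darker than surrounding anatomy. A toy sketch of the idea only; real pipelines operate on Hounsfield units with dedicated imaging libraries, and the values and threshold below are purely illustrative:

```python
# Hypothetical 2D "CT slice" of intensity values (bright soft tissue vs.
# dark, air-filled lung regions).
slice_2d = [
    [ 60,  55, -700, -650],
    [ 58, -720, -710,  50],
    [ 52, -680,  61,  57],
]

THRESHOLD = -400  # voxels darker than this are treated as lung candidates

# Binary mask: 1 marks a lung-candidate voxel, 0 marks background.
mask = [[1 if v < THRESHOLD else 0 for v in row] for row in slice_2d]
lung_voxels = sum(sum(row) for row in mask)
print(lung_voxels)  # 5 voxels classified as lung candidates
```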
Introduction: Chronic pain is a frequent severe disease and often associated with anxiety, depression, insomnia, disability, and reduced quality of life. This maladaptive condition is further characterized by sensory loss, hyperalgesia, and allodynia. Blue light has been hypothesized to modulate sensory neurons and thereby influence nociception.
Objectives: Here, we compared the effects of blue light vs red light and thermal control on pain sensation in a human experimental pain model.
Methods: Pain, hyperalgesia, and allodynia were induced in 30 healthy volunteers through high-density transcutaneous electrical stimulation. Subsequently, blue light, red light, or thermal control treatment was applied in a cross-over design. The nonvisual effects of the respective light treatments were examined using a well-established quantitative sensory testing protocol. Somatosensory parameters as well as pain intensity and quality were scored.
Results: Blue light substantially reduced spontaneous pain as assessed by numeric rating scale pain scoring. Similarly, pain quality was significantly altered as assessed by the German counterpart of the McGill Pain Questionnaire. Furthermore, blue light showed antihyperalgesic, antiallodynic, and antihypesthesic effects in contrast to red light or thermal control treatment.
Conclusion: Blue-light phototherapy ameliorates pain intensity and quality in a human experimental pain model and reveals antihyperalgesic, antiallodynic, and antihypesthesic effects. Therefore, blue-light phototherapy may be a novel approach to treat pain in multiple conditions.
This book comprehensively shows how Bond Graph methodology can support model-based control, model-based fault diagnosis, fault accommodation, and failure prognosis. It reviews the state of the art, presents a hybrid integrated approach to Bond Graph model-based fault diagnosis and failure prognosis, and surveys software that can be used for these tasks.
This article provides insights into the modalities of business-model change and innovation. On the basis of an analysis of empirical data on small and medium enterprises, a transition from wine-production centrism to expanded activity in hospitality and tourism is explored. Previous research on wine tourism and hospitality predominantly takes a destination perspective, neglecting the organizational winery perspective. The article deploys a mixed-methods approach, combining netnography and content analysis for data collection with grounded research and clustering for theory building. The sample included 885 German wineries. Data stemmed from two distinct sources (websites and a secondary publication in the form of a wine guide) and were analyzed through a two-step clustering algorithm as well as a Principal Component Analysis (PCA). The two-step clustering algorithm resulted in nine different business models, while the PCA grouped the variables into the following two categories: basic winery business model (BM) and BM extension into hospitality and tourism, thereby validating the difference between the two constructs. The results point to the diverse nature of business-model extensions of wineries in tourism and hospitality, depending on their organizational type and size. This study offers a classification of small and medium-sized enterprises' strategic business-model expansion and explores the expansion of the wine industry into wine hospitality and tourism services, starting from the winery organizational perspective, which has not been done before.
Critical consumerism is complex: ethical values are difficult to negotiate, appropriate products are hard to find, and product information is overwhelming. Although recommender systems offer ways to reduce such complexity, current designs are not appropriate for niche practices and use non-personalized, intransparent ethics. To support critical consumption, we conducted a design case study on a personalized food recommender system. To this end, we first conducted an empirical pre-study with 24 consumers to understand value negotiations and current practices, then co-designed the recommender system, and finally evaluated it in a real-world trial with ten consumers. Our findings show how recommender systems can support the negotiation of ethical values within the context of consumption practices, reduce the complexity of finding products and stores, and strengthen consumers. In addition to providing implications for design to support critical consumption practices, we critically reflect on the scope of such recommender systems and their appropriation.
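A personalized, transparent ethics-based ranking can be sketched as a weighted sum of product attributes under the consumer's own value weights. This is only an illustration of the general idea, not the study's actual system; the product names, value dimensions, and weights are hypothetical:

```python
# Hypothetical product catalog: each product is scored per ethical dimension.
products = {
    "oat_drink_a": {"organic": 1.0, "regional": 0.2, "fair_trade": 0.8},
    "oat_drink_b": {"organic": 0.5, "regional": 1.0, "fair_trade": 0.3},
}

def score(product_values, user_weights):
    """Weighted sum of a product's ethical attributes under the consumer's
    own value weights (higher = better match)."""
    return sum(user_weights.get(k, 0.0) * v for k, v in product_values.items())

# A consumer who strongly prioritizes organic production.
user = {"organic": 0.9, "regional": 0.1, "fair_trade": 0.0}
ranked = sorted(products, key=lambda p: score(products[p], user), reverse=True)
print(ranked[0])  # oat_drink_a
```

Because the weights belong to the individual consumer and the score is a plain sum, the ranking stays personal and explainable, in contrast to the intransparent ethics criticized above.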
Many people do not consume as much healthy food as recommended. Nudging has been identified as a promising intervention strategy to increase the consumption of healthy food. The present study analyzed the effects of three body shape nudges (thin, thick, or Giacometti artwork) on food ordering and assessed the mediating role of being aware of the nudge. Students (686) and employees (218) of a German university participated in an online experimental study. After randomization, participants visited a realistic online cafeteria and composed a meal for themselves. Under experimental conditions, participants were exposed to one out of three nudges while choosing dishes: (1) thin body shape, (2) thick body shape, and (3) the Giacometti artwork nudge. The Giacometti nudge resulted in more orders for salad among employees. The thin and thick body shape nudges did not change dish orders. Awareness of the nudge mediated the numbers of calories ordered when using the Giacometti or thin body shape nudges. These findings provide useful insights for health interventions in occupational and public health sectors using nudges. Our study contributes to the research on the Giacometti nudge by showing its effectiveness when participants are aware (it is effective under conditions where it is consciously perceived).
The promotion of sustainable packaging is part of the European Green Deal and plays a key role in the EU’s social and political strategy. One option is the use of renewable resources and biomass waste as raw materials for polymer production. Lignocellulosic biomass from annual and perennial industrial crops and agricultural residues is a major source of polysaccharides, proteins, and lignin, and can also be used to obtain plant-based extracts and essential oils. These biomasses are therefore considered potential substitutes for fossil-based resources. Here, the status quo of bio-based polymers is discussed and evaluated with respect to properties relevant for packaging applications, such as gas and water vapor permeability as well as mechanical properties. So far, their practical use is still restricted due to lower performance in fundamental packaging functions that directly influence food quality and safety, the length of shelf life, and thus the amount of food waste. Besides bio-based polymers, this review focuses on plant extracts as active packaging agents. Incorporating extracts of herbs, flowers, trees, and their fruits is essential to achieve material properties capable of prolonging food shelf life. Finally, the adoption potential of packaging based on polymers from renewable resources is discussed from a bioeconomy perspective.
The idea of a basic income grant (BIG) is not new, and there are ongoing debates, internationally as well as nationally, in low- and middle-income countries just as in high-income countries, about a BIG as a social protection policy option. The challenge is that there are different conceptualisations, which conflate and muddle the understanding. In the context of social assistance provision, a universal basic income grant (UBIG) is often compared and contrasted with targeted cash transfers (CTs). This case study systematically presents the arguments for targeted CTs and UBIGs. Its value is that it brings these arguments together systematically, highlighting the variations in UBIG applications, including the evidence and actual impact of UBIG experiments. The structure of the case study is as follows: Section 2 contrasts and compares the arguments for targeted CTs and UBIGs; Section 3 discusses UBIG experiments and presents the evidence on the application of the UBIG idea; and Section 4 concludes.
With the roll out of social protection programmes to national scale, questions about implementation and delivery move more and more into the centre of debate (e.g. UNDP 2020; UNDP and UNCDF 2014; Kramon 2019). This concerns in particular the local level, where key processes of implementation are taking place, but where at the same time institutional, operational and financial capacities are often the weakest. While social protection programmes are usually based on a clearly defined set of operational rules and regulations – usually set out in a programme manual – in practice these processes often tend to look quite different. Although many social protection programmes have explicitly excluded traditional authorities from playing an active role in programme delivery, there is ample evidence from across countries that in many local contexts, these ‘informal institutions’ continue to play an important role in the delivery of social protection programmes.
In recent years, the basic income grant (BIG) discourse has gained attention worldwide as a potential policy option in social protection, as evidenced by recent public debates, ongoing pilot projects, campaigning efforts, policy measures during Covid-19 and the surge in academic research. A BIG refers to regular cash transfers paid to all members of society irrespective of their socio-economic status, their capacity or willingness to participate in the labour market, or having to meet pre-determined conditions (Offe 2008; Van Parijs 1995, 2003; Wright 2004, 2006). Despite the recent hype around BIG, Iran is the only country worldwide with a scaled-up BIG (Tabatabai 2011, 2012). Other programmes have never gone beyond the pilot stage. This raises the question of why this is the case.
Characterization of Urban Radio Channels and Base Station Antenna Correlation in the 3.75 GHz Band
(2021)
Cystic fibrosis (CF) arises from mutations in the CF transmembrane conductance regulator (CFTR) gene, resulting in progressive and life-limiting respiratory disease. R751L is a rare CFTR mutation that is poorly characterized. Our aims were to describe the clinical and molecular phenotypes associated with R751L. Relevant clinical data were collected from three heterozygote individuals harboring R751L (2 patients with G551D/R751L and 1 with F508del/R751L). Assessment of R751L-CFTR function was made in primary human bronchial epithelial cultures (HBEs) and Xenopus oocytes. Molecular properties of R751L-CFTR were investigated in the presence of known CFTR modulators. Although sweat chloride was elevated in all three patients, the clinical phenotype associated with R751L was mild. Chloride secretion in F508del/R751L HBEs was reduced compared with non-CF HBEs and associated with a reduction in sodium absorption by the epithelial sodium channel (ENaC). However, R751L-CFTR function in Xenopus oocytes, together with folding and cell surface transport of R751L-CFTR, was not different from wild-type CFTR. Overall, R751L-CFTR was associated with reduced sodium chloride absorption but had functional properties similar to wild-type CFTR. This is the first report of R751L-CFTR that combines clinical phenotype with characterization of functional and biological properties of the mutant channel. Our work will build upon existing knowledge of mutations within this region of CFTR and, importantly, inform approaches for clinical management. Elevated sweat chloride and reduced chloride secretion in HBEs may be due to alternative non-CFTR factors, which require further investigation.
Cancer is a complex disease where resistance to therapies and relapses often pose a serious clinical challenge. The scenario is even more complicated when the cancer type itself is heterogeneous in nature, e.g., lymphoma, a cancer of the lymphocytes which constitutes more than 70 different subtypes. Indeed, the treatment options continue to expand in lymphomas. Herein, we provide insights into lymphoma-specific clinical trials based on cytokine-induced killer (CIK) cell therapy and other pre-clinical lymphoma models where CIK cells have been used along with other synergetic tumor-targeting immune modules to improve their therapeutic potential. From a broader perspective, we will highlight that CIK cell therapy has potential, and in this rapidly evolving landscape of cancer therapies its optimization (as a personalized therapeutic approach) will be beneficial in lymphomas.
Components and Architecture for the Implementation of Technology-Driven Employee Data Protection
(2021)
Most people use disaster apps infrequently, primarily in situations of turmoil, when they are physically or emotionally vulnerable. Personal data may be necessary to help them, and data protections may be waived. In some circumstances, free movement and liberties may be curtailed for public protection, as was seen in the current COVID pandemic. Consuming and producing disaster data can deepen problems arising at the confluence of surveillance and disaster capitalism, where data has become a tool for solutionist instrumentarian power (Zuboff 2019, Klein 2008) and part of a destructive mode of one world worlding (Law 2015, Escobar 2020). The special use of disaster apps prompts us to ask what role consumer protection could play in safeguarding democratic liberties. Within this work, a set of current approaches is briefly reviewed and two case studies are presented of what we call appropriation or design against datafication. These combine document analysis and literature research with several months of online and field ethnographic observation. The first case study examines disaster app use in response to the 2010 Haiti earthquake; the second explores COVID contact tracing in Taiwan in 2020/21. Against this backdrop we ask, ‘how could and how should consumer protection respond to problems of surveillance disaster capitalism?’ Drawing on our work with the is IT ethical? Exchange, a co-designed community platform and knowledge exchange for disaster information sharing, and a Societal Readiness Assessment Framework that we are developing alongside it, we explore how co-design methodologies could help define answers.
The identification of energetic materials in containments is an important challenge for analytical methods in the field of safety and security. Opening a package without knowledge of its contents and the resulting hazards entails considerable risk and should be avoided whenever possible. Preferable methods therefore work non-destructively with minimal interaction and are capable of identifying target substances in a containment quickly and reliably. Most spectroscopic methods reach their limits if the target substance is shielded by a covering material. To solve this problem, a combined method is presented: laser drilling followed by identification of the target substance via Raman spectroscopic measurements through microscopic bore holes in the covering material. A pulsed laser beam is used both for the drilling process and as an excitation source for the Raman measurements in the same optical setup. Results show the ability of this new method to obtain high-quality spectra even when performed through microscopically small bore channels. With suitably chosen laser parameters, the method can even be performed on highly sensitive explosives such as triacetone triperoxide (TATP). A further advantage is an observed reduction in unwanted fluorescence signal in the spectral data, resulting from the confocal-like measurement setup with the bore hole acting as an aperture.
Most economies across the globe rely on entrepreneurship for growth. There is evidence to suggest that entrepreneurship creates job opportunities and spurs economic growth and development (Pacheco, Dean, & Payne, 2010; Mojica, Gebremedhin, & Schaeffer, 2010; Solomon, 2007). Even though entrepreneurship is one of the fastest-growing education disciplines globally, researchers are still divided on what should be taught and how it should be taught in institutions of higher learning. Entrepreneurial decision-making is laced with uncertainty and drawbacks. Hence, entrepreneurship learners must be taught using practical and conceptual methodologies that equip them with the requisite knowledge and skills to confront such challenges in their entrepreneurial activities. This calls for entrepreneurship teachers to be innovative and to encourage their learners to be innovative, as entrepreneurship involves the generation of new business ideas. This paper examined teaching methodologies for entrepreneurship education in institutions of higher learning in Kenya. A mixed-method approach with triangulation as the main data collection technique was used. Interviews were conducted with teachers and learners of entrepreneurial education in Kenya, with a view to identifying the most commonly used teaching methodologies of entrepreneurial education and their shortcomings. Course outlines and curricula from twenty (20) institutions of higher learning in Kenya were reviewed. Results indicate that entrepreneurial education in Kenya is largely theoretical and does not meet the needs of the modern entrepreneur. The paper therefore recommends innovative teaching methodologies such as business plan generation, idea generation, innovation, creativity, networking, opportunity recognition, expecting and embracing failure, and adapting to change, which teachers can utilise to prepare students to generate entrepreneurial ideas and identify entrepreneurial opportunities.
Off-lattice Boltzmann methods increase the flexibility and applicability of lattice Boltzmann methods by decoupling the discretizations of time, space, and particle velocities. However, the velocity sets mostly used in off-lattice Boltzmann simulations were originally tailored to on-lattice Boltzmann methods. In this contribution, we show how the accuracy and efficiency of weakly and fully compressible semi-Lagrangian off-lattice Boltzmann simulations are increased by velocity sets derived from cubature rules, i.e. multivariate quadratures, that were not produced by the Gauss product rule. In particular, simulations of 2D shock-vortex interactions indicate that the cubature-derived degree-nine D2Q19 velocity set is capable of replacing the Gauss-product-rule-derived D2Q25. Likewise, the degree-five velocity sets D3Q13 and D3Q21, as well as a degree-seven D3V27 velocity set, were successfully tested for 3D Taylor-Green vortex flows to challenge and surpass the quality of the customary D3Q27 velocity set. In compressible 3D Taylor-Green vortex flows with Mach numbers Ma={0.5; 1.0; 1.5; 2.0}, on-lattice simulations with the velocity sets D3Q103 and D3V107 showed only limited stability, while the off-lattice degree-nine D3Q45 velocity set accurately reproduced the kinetic energy reported in the literature.
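The Gauss product rule that the cubature-derived sets are contrasted with can be illustrated concretely: combining the 1D three-point Gauss-Hermite rule with itself yields the standard nine-velocity (D2Q9) set. A minimal sketch of that construction (the cubature-derived sets such as D2Q19 come from genuinely multivariate quadratures and are not shown here):

```python
import math

# 1D three-point Gauss-Hermite rule (probabilists' weight), normalized:
# abscissae 0, +/-sqrt(3) with weights 2/3, 1/6, 1/6.
xi_1d = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
w_1d = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

# Gauss product rule: tensor product of the 1D rule with itself gives the
# 2D velocity set and weights.
velocities, weights = [], []
for cx, wx in zip(xi_1d, w_1d):
    for cy, wy in zip(xi_1d, w_1d):
        velocities.append((cx, cy))
        weights.append(wx * wy)

print(len(velocities))                   # 9 discrete velocities (D2Q9)
print(max(weights))                      # 4/9, the rest-velocity weight
print(abs(sum(weights) - 1.0) < 1e-12)   # True: weights are normalized
```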
Augmented/Virtual Reality (AR/VR) is still a fragmented space to design for due to the rapidly evolving hardware, the interdisciplinarity of teams, and a lack of standards and best practices. We interviewed 26 professional AR/VR designers and developers to shed light on their tasks, approaches, tools, and challenges. Based on their work and the artifacts they generated, we found that AR/VR application creators fulfill four roles: concept developers, interaction designers, content authors, and technical developers. One person often incorporates multiple roles and faces a variety of challenges during the design process from the initial contextual analysis to the deployment. From analysis of their tool sets, methods, and artifacts, we describe critical key challenges. Finally, we discuss the importance of prototyping for the communication in AR/VR development teams and highlight design implications for future tools to create a more usable AR/VR tool chain.
In tree-based adaptive mesh refinement (AMR) we store refinement trees in the cells of an unstructured coarse mesh. This lets us combine the speed and simpler management of structured refinement trees with the more flexible mesh generation of the unstructured coarse mesh, but it creates a conflict between performance and geometrical accuracy. If we favor speed, we reduce the number of cells in our coarse mesh and hence the accuracy of our geometrical representation. If we want more accurate results, we generate a finer coarse mesh and lose performance by managing more cells in our unstructured coarse mesh. To mitigate this conflict, we present the prototype of a geometry description, which we implement in an existing library. With this description we build geometry-adapted hexahedral refinement trees that also support high-order curved boundary cells. We present examples of how to use this description, and we test the speedup of the new algorithm against coarse meshes with different geometrical errors.
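The core data structure can be sketched in a few lines: each coarse cell owns a structured refinement tree whose leaves are the active fine cells. A toy 2D (quadtree) illustration under hypothetical names; real AMR libraries use 3D octrees, space-filling-curve indexing, and error-based refinement criteria:

```python
# Toy sketch of tree-based AMR: each unstructured coarse cell stores a
# structured refinement tree, here a quadtree keyed by (level, i, j).
class CoarseCell:
    def __init__(self, cell_id):
        self.cell_id = cell_id
        self.leaves = {(0, 0, 0)}  # start with the unrefined root cell

    def refine(self, leaf):
        """Replace a leaf by its four children one level deeper."""
        level, i, j = leaf
        self.leaves.remove(leaf)
        for di in (0, 1):
            for dj in (0, 1):
                self.leaves.add((level + 1, 2 * i + di, 2 * j + dj))

cell = CoarseCell("c0")
cell.refine((0, 0, 0))   # root -> 4 children
cell.refine((1, 0, 0))   # refine one child further
print(len(cell.leaves))  # 7 active leaves: 3 at level 1 + 4 at level 2
```

The trade-off described above shows up directly here: fewer coarse cells means fewer of these trees to manage (faster), but each tree's root must then approximate a larger, less accurate piece of the geometry.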
The Covid-19 pandemic has challenged educators across the world to move their teaching and mentoring from in-person to remote. During nonpandemic semesters at their institutes (e.g. universities), educators can directly provide students the software environment needed to support their learning - either in specialized computer laboratories (e.g. computational chemistry labs) or shared computer spaces. These labs are often supported by staff that maintains the operating systems (OS) and software. But how does one provide a specialized software environment for remote teaching? One solution is to provide students a customized operating system (e.g., Linux) that includes open-source software for supporting your teaching goals. However, such a solution should not require students to install the OS alongside their existing one (i.e. dual/multi-booting) or be used as a complete replacement. Such approaches are risky because of a) the students' possible lack of software expertise, b) the possible disruption of an existing software workflow that is needed in other classes or by other family members, and c) the importance of maintaining a working computer when isolated (e.g. societal restrictions). To illustrate possible solutions, we discuss our approach that used a customized Linux OS and a Docker container in a course that teaches computational chemistry and Python3.
Policy analysis is the cornerstone of evidence-based policy making. It identifies the problems, informs programme design, supports the monitoring of policy implementation, and is needed to evaluate programme impacts (Scott 2005). Rigorous and credible policy evidence is necessary to ensure the transparency and accountability of policy decisions and to secure political and public support and, hence, the allocation of financial resources. Sound policy analysis helps design effective and efficient programmes, thereby maximizing programme impact.
Data has emerged as a central success factor for companies seeking to benefit from digitization. However, the skills needed to successfully create value from data – especially at the management level – are not always well developed. To address this problem, several canvas models have already been designed. Canvas models are usually created to write down an idea in a structured way to promote transparency and traceability. However, some existing data science canvas models mainly address developers and are thus unsuitable for decision-makers and for communication within interdisciplinary teams. Based on a literature review, we identified influencing factors that are essential for the success of data science projects. With this information, the Data Science Canvas was developed in an expert workshop and finally evaluated by practitioners to find out whether such an instrument can support data-driven value creation.
Ghana suffers from frequent power outages, which can be compensated by off-grid energy solutions. Photovoltaic-hybrid systems are becoming more and more important for rural electrification due to their potential to offer a clean and cost-effective energy supply. However, uncertainties related to the prediction of electrical loads and solar irradiance result in inefficient system control and can lead to an unstable electricity supply, which is vital for the high reliability required for applications within the health sector. Model predictive control (MPC) algorithms present a viable option to tackle those uncertainties compared to rule-based methods, but they strongly rely on the quality of the forecasts. This study tests and evaluates (a) a seasonal autoregressive integrated moving average (SARIMA) algorithm, (b) an incremental linear regression (ILR) algorithm, (c) a long short-term memory (LSTM) model, and (d) a customized statistical approach for electrical load forecasting on real load data of a Ghanaian health facility, considering initially limited knowledge of load and pattern changes through the implementation of incremental learning. The correlation of the electrical load with exogenous variables was determined to map out possible enhancements within the algorithms. Results show that all algorithms achieve high accuracy, with a median normalized root mean square error (nRMSE) <0.1, and differ in robustness towards load-shifting events, gradients, and noise. While the SARIMA algorithm and the linear regression model show extreme error outliers of nRMSE >1, the LSTM model and the customized statistical approach perform better, with a median nRMSE of 0.061 and a stable error distribution with a maximum nRMSE of <0.255. This study therefore favors the LSTM model and the statistical approach for MPC applications within photovoltaic-hybrid system solutions in the Ghanaian health sector.
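The nRMSE metric used to compare the forecasting algorithms can be sketched as follows. Normalizing the RMSE by the observed range is one common convention and is an assumption here (the study may normalize differently); the load values and the naive persistence forecast are hypothetical:

```python
import math

def nrmse(actual, forecast):
    """Root mean square error normalized by the observed range."""
    rmse = math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast))
                     / len(actual))
    return rmse / (max(actual) - min(actual))

# Hypothetical hourly load (kW) and a naive persistence forecast
# (each hour predicted by the previous hour's observed value).
load = [10.0, 12.0, 15.0, 11.0]
forecast = [10.0, 10.0, 12.0, 15.0]
print(round(nrmse(load, forecast), 3))  # 0.539
```

Any of the four tested algorithms can be scored with the same function, which is what makes a median-nRMSE comparison across methods meaningful.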
Due to ongoing digitalization, more and more cloud services are finding their way into companies. In this context, data integration across the various software solutions, which are provided both on-premise (local use or licensing for local use of software) and as a service, is of great importance. In this regard, Integration Platform as a Service (IPaaS) models aim to support companies as well as software providers with data integration by providing connectors that enable data flow between different applications and systems, along with other integration services. Since previous research has mostly focused on technical or legal aspects of IPaaS, this article focuses on deriving integration practices as well as design-related barriers and drivers regarding the adoption of IPaaS. We therefore conducted 10 interviews with experts from different software-as-a-service vendors. Our results show that the main factors regarding the adoption of IPaaS are the standardization of data models, the usability and variety of the connectors provided, and issues regarding data privacy, security, and transparency.
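The connector concept at the heart of IPaaS can be sketched as a small abstraction: each connector adapts one application's interface to a shared record format so data can flow between otherwise incompatible systems. The class names and in-memory stand-in below are hypothetical, purely for illustration:

```python
from abc import ABC, abstractmethod

class Connector(ABC):
    """Minimal connector contract: pull records out, push records in."""

    @abstractmethod
    def extract(self):
        """Pull records from the source application."""

    @abstractmethod
    def load(self, records):
        """Push records into the target application."""

class InMemoryConnector(Connector):
    """Stand-in for a real SaaS connector, used here only for illustration."""

    def __init__(self, store=None):
        self.store = store or []

    def extract(self):
        return list(self.store)

    def load(self, records):
        self.store.extend(records)

# Minimal integration flow: copy customer records from a "CRM" to an "ERP".
crm = InMemoryConnector([{"id": 1, "name": "ACME"}])
erp = InMemoryConnector()
erp.load(crm.extract())
print(len(erp.store))  # 1
```

Standardizing on one record format and one connector interface is exactly the design lever the interviewees point to: every new application then needs only one connector, not one adapter per application pair.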
Fabry disease (FD) is an X‐linked lysosomal storage disorder. Deficiency of the lysosomal enzyme alpha‐galactosidase (GLA) leads to accumulation of potentially toxic globotriaosylceramide (Gb3) on a multisystem level. Cardiac and cerebrovascular abnormalities as well as progressive renal failure are severe, life‐threatening long‐term complications. The complete pathophysiology of chronic kidney disease (CKD) in FD and the role of tubular involvement for its progression are unclear.
We established human renal tubular epithelial cell lines from the urine of male FD patients and male controls. The renal tubular system is rich in mitochondria and involved in transport processes at high energy costs. Our studies revealed fragmented mitochondria with disrupted cristae structure in FD patient cells. Oxidative stress levels were elevated and oxidative phosphorylation was up‐regulated in FD pointing at enhanced energetic needs. Mitochondrial homeostasis and energy metabolism revealed major changes as evidenced by differences in mitochondrial number, energy production and fuel consumption. The changes were accompanied by activation of the autophagy machinery in FD. Sirtuin1, an important sensor of (renal) metabolic stress and modifier of different defense pathways, was highly expressed in FD.
Our data show that lysosomal FD impairs mitochondrial function and results in severe disturbance of mitochondrial energy metabolism in renal cells. This insight on a tissue‐specific level points to new therapeutic targets which might enhance treatment efficacy.
Design of a Medium Voltage Generator with DC-Cascade for High Power Wind Energy Conversion Systems
(2021)
This paper presents a new concept for generating medium voltage (MV) in wind power applications that avoids an additional transformer. To this end, the generator must be redesigned under additional constraints, and a new topology for the power rectifier system is introduced in which multiple low voltage (LV) power rectifiers are connected in series and parallel to increase the DC output voltage. This combination of parallel and series connection of rectifiers is referred to as a DC-cascade. With the resulting DC-cascade, a medium output voltage is achieved with low voltage rectifiers and without a bulky transformer, reducing the effort required to reach medium DC voltage with a simple rectifier system. A suitable DC-cascade control is presented and verified with a laboratory test setup. A gearless synchronous generator is investigated that is highly segmented so that each segment can be connected to its own power rectifier. Due to the mixed AC and DC voltage imposed by the DC-cascade structure, the design of the generator insulation becomes more demanding, which influences the copper fill factor and the design of the cooling system. A design strategy for the overall generator is carried out considering these new boundary conditions.
Developing the Circular Economy in Uganda: Prospects for Academia-Public-Private-Partnerships
(2021)
Issues: The circular economy is a production system that optimizes the reusability of by-products/waste as raw materials. As the global population threatens to reach 9 billion by 2050, consumption levels grow proportionally, raising food, material, and energy demands. In Uganda, soil nutrient depletion and energy poverty are key challenges faced by urban and rural communities. Rampant depletion of natural resources calls for a transition from linear economic models towards sustainable production/consumption technologies. This study investigated prospects for academia-public-private partnerships (APPPs) to optimize the reusability of by-products/waste as raw materials. Approach: Quantitative and qualitative tools were used to collect data via document analysis, interviews, and participant observations. The tools were administered to municipal authorities and private waste-collecting agencies in cities and municipalities; officials in the Ministries of Energy and Agriculture; officials in university research units; entrepreneurs dealing in agricultural and energy products; and officials from civil society organizations. Findings: A number of sustainability projects are being undertaken by universities and high schools, government agencies, companies, and civil society organizations in isolation. Singlehandedly, individual agencies lack the requisite capacity to develop closed-loop production/consumption models. Analysis of a few successful RRR projects suggests that APPPs are positioned to promote the circular economy (CE). Transitioning towards a circular economy requires joint ventures to optimize human, technological, and financial resources and to develop policy and institutional frameworks. In Uganda, recycling biotic by-products can promote environmental sustainability, reduce stress on natural resources, enable cost savings, promote green entrepreneurship, and create jobs/livelihoods.
Conclusion: Working jointly, CE could be enhanced via technical and business models from academia, private capital investment by companies, community engagement by CSOs, and the development of supportive policy and institutional frameworks to facilitate decision-making processes. APPPs are positioned to use interactive platforms for creating awareness and promoting sensitization about green values through education and multimedia communication platforms.
In the research project "MetPVNet", both forecast-based operation management in distribution grids and forecasts of the feed-in of PV power from decentralized plants were improved on the basis of satellite data and numerical weather forecasts. Based on a detailed network analysis for a real medium-voltage grid area, it was shown that both the integration of forecast data based on satellite and weather data and the improvement of day-ahead forecasts based on numerical weather models provide significant added value for forecast-based congestion management or redispatch and reactive power management in the distribution grid. Furthermore, forecast improvements for the forecast model of the German Weather Service were achieved by assimilating visible satellite imagery, and cloud and radiation products from satellites were improved, thus improving the database for short-term forecasting as well as for assimilation. In addition, several methods have been developed that will enable future forecast improvements, especially for weather situations with high cloud-induced variability and high forecast errors. This article summarizes the most important project results.
Background: Atypical myopathy (AM), an acquired multiple acyl-CoA dehydrogenase deficiency (MADD) in horses, induces changes in mitochondrial metabolism. Only a few veterinary laboratories offer diagnostic testing for this disease. Both inborn and acquired MADD exist in humans; therefore, determination of organic acids (OA) in urine and acylcarnitines (AC) in blood by assays available in medical laboratories can serve as AM diagnostics. The evolution of OA and AC profiles in surviving horses is unreported.
Methods: AC profiles using electrospray ionization tandem mass spectrometry (ESI-MS/MS) and OA in urine using gas chromatography mass spectrometry (GC–MS) were determined in dried blood spots (DBS, n = 7) and urine samples (n = 5) of horses with AM (n = 7) at disease presentation and in longitudinal samples from 3/4 survivors, and compared to DBS (n = 16) and urine samples (n = 7) from control horses using the Wilcoxon test.
Results: All short-chain (C2-C5) and medium-chain (C6-C12) AC in blood differed significantly (p < 0.008) between horses with AM and controls, except for C5:1 (p = 0.45) and C5OH + C4DC (p = 0.06). In AM survivors the AC concentrations decreased over time but were still partially elevated after 7 days. 14/62 (23%) of OA differed significantly between horses with AM and control horses. Concentrations of ethylmalonic acid, 2-hydroxyglutaric acid and the acylglycines (butyryl-, valeryl-, and hexanoylglycine) were highly elevated in the urine of all horses with AM on the day of disease presentation. In AM survivors, concentrations of those metabolites were initially lower and decreased during remission, approaching normalization after 7 days.
Conclusion: OA and AC profiling by specialized human medical laboratories was used to diagnose AM in horses. Elevations of specific metabolites were still evident several days after disease presentation, allowing diagnosis via analysis of samples from convalescent animals.
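The unpaired group comparison used throughout these results (horses with AM vs. controls, Wilcoxon-type rank test) can be sketched as follows. This is an illustration only: the concentration values are synthetic placeholders, not the study's measurements, and `scipy` is assumed as the statistics library.

```python
# Illustrative sketch of the unpaired Wilcoxon rank-sum (Mann-Whitney U)
# comparison between an affected group and controls. All values below are
# hypothetical placeholders, not data from the study.
from scipy.stats import mannwhitneyu

am_horses = [4.1, 3.8, 5.2, 4.7, 3.9, 4.4, 5.0]      # hypothetical AM group (n = 7)
controls = [1.0, 1.2, 0.9, 1.1, 1.3, 0.8, 1.0, 1.1]  # hypothetical controls (n = 8)

stat, p = mannwhitneyu(am_horses, controls, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")  # a small p indicates a significant group difference
```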
In this thesis it is posited that the central object of preference discovery is a co-creative process in which the Other can be represented by a machine. The thesis explores efficient methods to enhance introverted intuition using extraverted intuition's communication lines. Possible implementations of such processes are presented using novel algorithms that perform divergent search to feed the users' intuition with many examples of high-quality solutions, allowing them to exert influence interactively. The machine feeds and reflects upon human intuition, combining both what is possible and what is preferred. The machine model and the divergent optimization algorithms are the motor behind this co-creative process, in which machine and users co-create and interactively choose branches of an ad hoc hierarchical decomposition of the solution space.
The proposed co-creative process consists of several elements: a formal model for interactive co-creative processes, evolutionary divergent search, diversity and similarity, data-driven methods to discover diversity, limitations of artificial creative agents, matters of efficiency in behavioral and morphological modeling, visualization, a connection to prototype theory, and methods to allow users to influence artificial creative agents. This thesis helps putting the human back into the design loop in generative AI and optimization.
Diversity of Insects in Nature protected Areas (DINA): an interdisciplinary German research project
(2021)
Insect declines and biodiversity loss have attracted much attention in recent years, but lack of comprehensive data, conflicting interests among stakeholders and insufficient policy guidance hinder progress in preserving biodiversity. The project DINA (Diversity of Insects in Nature protected Areas) investigates insect communities in 21 nature reserves in Germany. All selected conservation sites border arable land, with agricultural practices assumed to influence insect populations. We taught citizen scientists how to manage Malaise traps for insect collection, and subsequently used a DNA metabarcoding approach for species identification. Vegetation surveys, plant metabarcoding as well as geospatial and ecotoxicological analyses will help to unravel contributing factors for the deterioration of insect communities. As a pioneering research project in this field, DINA includes a transdisciplinary dialogue involving relevant stakeholders such as local authorities, policymakers, and farmers, which aims at a shared understanding of conservation goals and action pathways. Stakeholder engagement combined with scientific results will support the development of sound policy recommendations to improve legal frameworks, landscape planning, land use, and conservation strategies. With this transdisciplinary approach, we aim to provide the background knowledge to implement policy strategies that will halt further decline of insects in German protected areas.
Intimate swabs taken for examination in sexual assault cases typically yield mixtures of sperm and epithelial cell types. While powerful, differential extraction protocols to overcome such cell type mixtures by separate lysis of epithelial cells and spermatozoa can still prove ineffective, in particular if only few sperm cells are present or if swabs contain sperm from more than one individual leading to complex low level DNA mixtures. A means to avoid such mixtures consists in the analysis of single micromanipulated sperm cells. However, the quantity of DNA from single sperm cells is not sufficient for conventional STR analysis. Here, we describe a simple method for micromanipulating individual sperm cells from intimate swabs and show that whole genome amplification can generate sufficient amounts of DNA from single cells for subsequent DNA profiling. We recovered over 80% of alleles of haploid autosomal STR profiles from the majority of individual sperm cells. Furthermore, we demonstrate that in mixtures of sperm from two contributors, Y-STR and X-STR profiles of individual sperm cells can be used to sort the haploid autosomal profiles to develop the diploid consensus STR profiles of the individual donors. Finally, by analysing single sperm cells from mock sexual assault swabs with one or two sperm donors, we showed that our protocols enabled the identification of the unknown male contributors.
DNA Sequencing
(2021)
This paper gives an overview of how we can benefit from using container technology in our academic work. It aims to be a starting point for fellow researchers who are also thinking about applying these technologies. Hence, we focus on describing our own experiences and motivations instead of proving hard scientific facts.
A school leader’s achievement is not what they study in learning institutions but the way they organize themselves for problem solving and realistic decision making. While this includes some taught hard skills, the bulk of school activities relies on soft skills. Soft skills, however, are frequently neglected, although they play an important role in school principals’ daily operations as instructional supervisors. This study aimed to examine the relationship between soft skills training and principals' performance. The study adopted a cross-sectional mixed survey design. Using the Yamane formula, the sample comprised 167 principals from 286 public secondary schools in Kiambu County, spread proportionally across all 12 sub-counties in the County. The principal research instrument was a questionnaire. The reliability of the instrument, measured with the Cronbach alpha coefficient, was deemed acceptable at .73. The findings showed that a substantial relationship exists between principals' soft skills training and their performance of duties. The study suggests that routine in-service training should be undertaken in the county to improve the development of soft skills. It is also advisable that undergraduate, postgraduate, or in-service training include soft skills as a unit to build knowledge of the value of soft skills.
For several decades, farmers have been mixing rock powders with livestock slurry to reduce its NH3 emissions and increase its nutrient content. However, mixing rock powders with slurry is controversial, and there is currently no scientific evidence for its effects on NH3 and greenhouse gas (GHG) emissions or on changes in its nutrient content due to element release from rock powders. The major aim of this study was therefore to analyse the effects of mixing two commercially established rock powders with cattle slurry on NH3, CO2, N2O and CH4 emissions, and on nutrient release over a course of 46 days. We found that rock powders did not significantly affect CO2 emission rates. NH3 and N2O emission rates did not differ significantly up until the end of the trial, when the emission rates of the rock powder treatments significantly increased for NH3 and significantly decreased for N2O, coinciding with a reduction of the slurry crust. Cumulative NH3 emissions did not, however, differ significantly between treatments. Unexpected and significant increases in CH4 emission rates occurred for the rock powder treatments. Rock powders increased the macro- and micronutrient content of the slurry. The conflicting results are discussed and future research directions are proposed.
Here we provide the electrophysiology data for the manuscript "Two functional epithelial sodium channel isoforms are present in rodents despite pronounced evolutionary pseudogenization and exon fusion", published in Molecular Biology and Evolution (2021): msab271 (doi: 10.1093/molbev/msab271). Data are reported as current values in Excel format, sorted according to the appearance in Figures and supplemented by explanatory text on the procedures/data presentation.
Threats to passwords remain highly relevant due to attacks like phishing or credential stuffing. One way to solve this problem is to remove passwords completely. User studies on passwordless FIDO2 authentication using security tokens have demonstrated its potential to replace passwords. However, widespread acceptance of FIDO2 depends, among other things, on how user accounts can be recovered when the security token becomes permanently unavailable. For this reason, we provide a heuristic evaluation of 12 account recovery mechanisms regarding their properties for FIDO2 passwordless authentication. Our results show that the currently used methods have many drawbacks. Some even rely on passwords, defeating the purpose of passwordless authentication. Still, our evaluation identifies promising account recovery solutions and provides recommendations for further studies.
It is well established that deep networks are efficient at extracting features from a given (source) labeled dataset. However, they do not always generalize well to other (target) datasets, which very often have a different underlying distribution. In this report, we evaluate four different domain adaptation techniques for image classification tasks: DeepCORAL, Deep Domain Confusion, CDAN and CDAN+E. These techniques are unsupervised in that the target dataset does not carry any labels during the training phase. We evaluate model performance on the Office-31 dataset. The GitHub repository for this report can be found here: https://github.com/agrija9/Deep-Unsupervised-Domain-Adaptation.
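To give a flavor of one of the four techniques, DeepCORAL's core idea is a loss term that aligns second-order statistics (feature covariances) across domains. The sketch below is a hedged NumPy illustration on random placeholder features, not code from the linked repository; the function name and data are our own.

```python
# Sketch of the CORAL loss underlying DeepCORAL: the squared Frobenius
# distance between source and target feature covariances, normalized by
# 4*d^2 (d = feature dimension). Features here are random placeholders.
import numpy as np

def coral_loss(source, target):
    d = source.shape[1]
    cs = np.cov(source, rowvar=False)  # d x d source covariance
    ct = np.cov(target, rowvar=False)  # d x d target covariance
    return np.sum((cs - ct) ** 2) / (4 * d ** 2)

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(64, 16))  # stand-in source-domain features
tgt = rng.normal(0.5, 2.0, size=(64, 16))  # stand-in shifted target-domain features

print(coral_loss(src, src))  # identical batches give exactly zero loss
print(coral_loss(src, tgt))  # domain shift yields a positive loss
```

Minimizing such a term alongside the classification loss nudges the network toward domain-invariant features.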
Different analyses and feasibility studies have been conducted on plant extracts of thyme (Thymus vulgaris), European horse chestnut (Aesculus hippocastanum), Nordmann fir (Abies nordmanniana), and snowdrop (Galanthus elwesii) to evaluate bio-based alternatives to common petrol-based stabilisers. For this purpose, in this study, the plant extracts were incorporated into poly-lactic acid (PLA) films at different concentrations. The films' UV absorbance and migration into packed food were analysed via photometric assays (ABTS radical cation scavenging capacity assay, β-carotene assay) and GC–MS analysis. Furthermore, the synergistic antioxidant effects of various combinations of extracts and isolated active compounds were determined. In this way, antioxidant effects can be increased, allowing for a highly effective use of resources. All extracts were successfully incorporated into PLA films and showed notable photoabsorbing effects, while no migration risk was observed. Depending on the extract combination, high synergistic effects of up to 726% can be utilised to improve the effectiveness of bio-based extracts. This applies particularly to tomato paste and Aesculus hippocastanum extracts, which overall show high synergistic and antioxidant effects in combination with each other and with isolated active compounds. The study shows that it is possible to create safe bio-based antioxidant films with even improved properties when using the highlighted target combinations.
Research on entrepreneurial eco-systems (EES) is evolving, with exhortations for empirical studies at regional and local levels to augment national surveys. This study therefore sought to explore the entrepreneurial eco-system of the Central Region of Ghana, which is relatively well endowed with natural resources but lags behind in economic advancement. Using a descriptive research design, quantitative data were collected with self-administered questionnaires from a convenience sample of 44 entrepreneurs under the presidential business support programme in the Central Region in 2019. Data were analysed with IBM SPSS Version 25 by conducting descriptive analysis, such as means (M) and percentages, and exploratory factor analysis. Descriptive results of 37 valid responses showed that the respondents were satisfied, in varying degrees (M = 4.19-5.65), with 11 factors within the eco-system; the top three factors were demand, security and availability of raw materials. Respondents were, however, not satisfied with access to business development services, access to finance, rent charges and access to equipment repairers, which thus pose challenges to their entrepreneurial pursuits. Principal component analysis revealed inter-connectedness among the factors in the eco-system, with strong loadings of measures of institutions and resource endowment on the two components of the solution. Based on the findings, it is concluded that the entrepreneurs surveyed were satisfied with most factors in the EES of the Central Region but dissatisfied with relatively few yet critical factors, which pose major challenges to their entrepreneurial activities. As an exploratory study, the findings suggest that the entrepreneurial eco-system of the Central Region of Ghana is, to some extent, supportive of entrepreneurial activities but faces key challenges. To achieve maximum outcomes, policy interventions should collectively address the factors that interact strongly to influence entrepreneurship within the system.
Applied privacy research has so far focused mainly on consumer relations in private life. Privacy in the context of employment relationships is less well studied, although it is subject to the same legal privacy framework in Europe. The European General Data Protection Regulation (GDPR) has strengthened employees’ right to privacy by obliging employers to provide transparency and intervention mechanisms. For such mechanisms to be effective, employees must have a sound understanding of their functions and value. We explored possible boundaries by conducting a semi-structured interview study with 27 office workers in Germany and elicited mental models of the right to informational self-determination, which is the European proxy for the right to privacy. We provide insights into (1) perceptions of different categories of data, (2) familiarity with the legal framework regarding expectations for privacy controls, and (3) awareness of data processing, data flow, safeguards, and threat models. We found that the legal terms often used in privacy policies to describe categories of data are misleading. We further identified three groups of mental models that differ in their privacy control requirements and willingness to accept restrictions on their privacy rights. We also found ignorance about actual data flow, processing, and safeguard implementation. Participants’ mindsets were shaped by their faith in organizational and technical measures to protect privacy. Employers and developers may benefit from our contributions by understanding the types of privacy controls desired by office workers and the challenges to be considered when conceptualizing and designing usable privacy protections in the workplace.
Although work events can be regarded as pivotal elements of organizational life, only a few studies have examined how positive and negative events relate to and combine to affect work engagement over time. Theory suggests that, to better understand how current events affect work engagement (WE), we have to account for recent events that have preceded these current events. We present competing theoretical views on how recent and current work events may affect employees (e.g., getting used to a high frequency of negative events or becoming more sensitive to negative events). Although the occurrence of events implies discrete changes in the experience of work, prior research has not considered whether work events actually accumulate to sustained mid-term changes in WE. To address these gaps in the literature, we conducted a week-level longitudinal study across a period of 15 consecutive weeks among 135 employees, which yielded 849 weekly observations. While positive events were associated with higher levels of WE within the same week, negative events were not. Our results support neither satiation nor sensitization processes. However, a high frequency of negative events in the preceding week amplified the beneficial effects of positive events on WE in the current week. Growth curve analyses show that the benefits of positive events accumulate to sustain high levels of WE. WE dissipates in the absence of a continuous experience of positive events. Our study adds a temporal component by highlighting that positive events affect work engagement, particularly in light of recent negative events. Our study informs research that has taken a feature-oriented perspective on the dynamic interplay of job demands and resources.
We consider multi-solution optimization and generative models for the generation of diverse artifacts and the discovery of novel solutions. In cases where the domain's factors of variation are unknown or too complex to encode manually, generative models can provide a learned latent space to approximate these factors. When used as a search space, however, the range and diversity of possible outputs are limited to the expressivity and generative capabilities of the learned model. We compare the output diversity of a quality diversity evolutionary search performed in two different search spaces: 1) a predefined parameterized space and 2) the latent space of a variational autoencoder model. We find that the search on an explicit parametric encoding creates more diverse artifact sets than searching the latent space. A learned model is better at interpolating between known data points than at extrapolating or expanding towards unseen examples. We recommend using a generative model's latent space primarily to measure similarity between artifacts rather than for search and generation. Whenever a parametric encoding is obtainable, it should be preferred over a learned representation as it produces a higher diversity of solutions.
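The kind of quality diversity search compared above can be illustrated with a minimal MAP-Elites-style loop over an explicit parametric encoding. The objective, behavior descriptor, and bin count below are toy stand-ins of our own choosing, not the paper's domains.

```python
# Minimal MAP-Elites-style quality diversity sketch: candidates sampled from
# a parametric search space are binned by a behavior descriptor, and the
# archive keeps only the fittest solution per bin. Toy objective/descriptor.
import random

def fitness(x):
    return -(x[0] ** 2 + x[1] ** 2)  # toy objective: prefer points near the origin

def descriptor(x):
    return min(int((x[0] + 1) / 2 * 10), 9)  # toy behavior: 10 bins over x[0]

random.seed(1)
archive = {}  # bin index -> (fitness, solution)
for _ in range(1000):
    x = (random.uniform(-1, 1), random.uniform(-1, 1))
    b = descriptor(x)
    if b not in archive or fitness(x) > archive[b][0]:
        archive[b] = (fitness(x), x)

print(f"{len(archive)} of 10 behavior bins filled")  # archive coverage, a proxy for diversity
```

Swapping the tuple `x` for a decoded latent vector turns the same loop into the latent-space variant; the paper's finding is that the explicit parametric encoding yields the more diverse archive.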
New sustainable, environmentally friendly materials for thermal insulation of buildings are necessary to reduce their carbon footprints. In this study, Miscanthus fiber-reinforced geopolymer composites, foamed with sodium dodecyl sulfate (SDS), were developed using fly ash as a geopolymer precursor. The effects of fiber content, fiber size, curing temperature, foaming agent content, fumed silica specific surface area and fumed silica content on thermal conductivity and compressive strength were evaluated using a Plackett-Burman design of experiments. Furthermore, the microstructure of the geopolymer composites was investigated using X-ray diffraction (XRD), X-ray micro-computed tomography (μCT) and scanning electron microscopy (SEM). The measured characteristic values were in the following ranges: thermal conductivity 0.057 W (m K)−1 to 0.127 W (m K)−1, compressive strength 0.007 MPa to 0.719 MPa and porosity 49 vol% to 76 vol%. The results reveal that thermal conductivity increases with larger fiber size and higher foaming agent content. In contrast, compressive strength is enhanced by a high fiber content. Additionally, SEM images indicate a good interaction between the fibers and the geopolymer matrix, as nearly the whole fiber surface is covered by the geopolymer.