Software developers build complex systems using many third-party libraries. Documentation is key to understanding and using the functionality provided via the libraries' APIs. Functionality is therefore the main focus of contemporary API documentation, while cross-cutting concerns such as security are rarely considered, especially when the API itself does not provide security features. Documentation of JavaScript libraries for use in web applications, for example, does not specify how to add or adapt a Content Security Policy (CSP) to mitigate content injection attacks such as Cross-Site Scripting (XSS). This is unfortunate, as security-relevant API documentation might influence secure coding practices and prevailing major vulnerabilities such as XSS. For the first time, we study the effects of integrating security-relevant information into non-security API documentation. For this purpose, we took CSP as an exemplary study object and extended the official Google Maps JavaScript API documentation with security-relevant CSP information in three distinct manners. We then evaluated the usage of these variations in a between-group eye-tracking lab study involving N=49 participants. Our observations suggest: (1) Developers focus on elements with code examples. They mostly skim the documentation while searching for a quick solution to their programming task. This finding gives further evidence to results of related studies. (2) The location where CSP-related code examples are placed in non-security API documentation significantly impacts the time it takes to find this security-relevant information. In particular, the study results showed that proximity to functionality-related code examples in the documentation is a decisive factor. (3) Examples significantly help developers produce secure CSP solutions. (4) Developers have additional information needs that our approach cannot meet.
Overall, our study contributes to a first understanding of the impact of security-relevant information in non-security API documentation on CSP implementation. Although further research is required, our findings emphasize that API producers should take responsibility for adequately documenting security aspects and thus support the sensitization and training of developers to implement secure systems. This responsibility also holds in seemingly non-security-relevant contexts.
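The study's documentation variants are not reproduced here, but the kind of security-relevant guidance under discussion can be illustrated with a small sketch: assembling a Content-Security-Policy header value for a page that embeds a third-party maps API. The host name and directive values below are illustrative assumptions, not the official recommendation for any particular API.

```python
# Sketch: serializing a CSP directive dict into the header value a web app
# embedding a third-party maps API might send. Host names and directives
# are illustrative assumptions, not official guidance.
def build_csp(directives):
    """Serialize a directive dict into a Content-Security-Policy header value."""
    return "; ".join(
        f"{name} {' '.join(sources)}" for name, sources in directives.items()
    )

policy = build_csp({
    "default-src": ["'self'"],
    # Hypothetical third-party API host; a real policy would list the
    # actual script and image origins the embedded API loads from.
    "script-src": ["'self'", "https://maps.example.com"],
    "img-src": ["'self'", "data:", "https://maps.example.com"],
})
# The server would then send: Content-Security-Policy: <policy>
```

Placing such a snippet next to the functional code examples is exactly the kind of documentation intervention the study evaluates.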
Sharing economies enabled by technical platforms have been studied with regard to their economic, legal, and social effects, as well as their possible influences on CSCW topics such as work, collaboration, and trust. While much current research focuses on the sharing economy and related communities, little work addresses the phenomenon from a socio-technical point of view. Our workshop is meant to address this gap. Building on research themes and discussions from last year's ECSCW, we seek to engage more deeply with topics such as novel socio-technical approaches for enabling sharing communities, issues around digital consumer and worker protection, and emerging challenges and opportunities of existing platforms and approaches.
3-Hydroxyisobutyrate Dehydrogenase (HIBADH) deficiency - a novel disorder of valine metabolism
(2021)
3-Hydroxyisobutyric acid (3HiB) is an intermediate in the degradation of the branched-chain amino acid valine. Disorders in valine degradation can lead to 3HiB accumulation and its excretion in the urine. This article describes the first two patients with a new metabolic disorder, 3-hydroxyisobutyrate dehydrogenase (HIBADH) deficiency, its phenotype, and its treatment with a low-valine diet. The detected mutation in the HIBADH gene leads to nonsense-mediated mRNA decay of the mutant allele and to a complete loss of function of the enzyme. Under strict adherence to a low-valine diet, a rapid decrease of 3HiB excretion in the urine was observed. Due to limited patient numbers and intrafamilial differences in phenotype, with one affected and one unaffected individual, the clinical phenotype of HIBADH deficiency needs further evaluation.
Animal models are often needed in cancer research but some research questions may be answered with other models, e.g., 3D replicas of patient-specific data, as these mirror the anatomy in more detail. We, therefore, developed a simple eight-step process to fabricate a 3D replica from computer tomography (CT) data using solely open access software and described the method in detail. For evaluation, we performed experiments regarding endoscopic tumor treatment with magnetic nanoparticles by magnetic hyperthermia and local drug release. For this, the magnetic nanoparticles need to be accumulated at the tumor site via a magnetic field trap. Using the developed eight-step process, we printed a replica of a locally advanced pancreatic cancer and used it to find the best position for the magnetic field trap. In addition, we described a method to hold these magnetic field traps stably in place. The results are highly important for the development of endoscopic tumor treatment with magnetic nanoparticles as the handling and the stable positioning of the magnetic field trap at the stomach wall in close proximity to the pancreatic tumor could be defined and practiced. Finally, the detailed description of the workflow and use of open access software allows for a wide range of possible uses.
New cars are increasingly "connected" by default. Since not having a car is not an option for many people, understanding the privacy implications of driving connected cars and using their data-based services is an even more pressing issue than for expendable consumer products. While risk-based approaches to privacy are well established in law, they have only begun to gain traction in HCI. These approaches are understood not only to increase acceptance but also to help consumers make choices that meet their needs. To the best of our knowledge, perceived risks in the context of connected cars have not been studied before. To address this gap, our study reports on the analysis of a survey with 18 open-ended questions distributed to 1,000 households in a medium-sized German city. Our findings provide qualitative insights into existing attitudes and use cases of connected car features and, most importantly, a list of perceived risks themselves. Taking the perspective of consumers, we argue that these can help inform consumers about data use in connected cars in a user-friendly way. Finally, we show how these risks fit into and extend existing risk taxonomies from other contexts with a stronger social perspective on risks of data use.
One of the biggest challenges faced by many tech start-ups from developed markets is to have validated, market-fit products/services and to see their solutions implemented. In several sectors, stringent regulations and the law of the handicap of a head start at home can be hurdles that limit the development and even the survival potential of these start-ups. Tech start-ups seeking implementation, learning, and legitimacy may find a solution in expanding into emerging markets. Emerging markets offer business opportunities in sectors in need of new technologies and are “fertile grounds” for developing and testing internationalisation business models. We present here a process designed to help tech start-ups identify, access, shape and seize these opportunities and to overcome their own specificities as well as those of emerging markets. The three phases of the proposed process cover the entry node concept, partnership, and the joint development of business, operating and revenue models. The Design Science Research paradigm is used for the design and evaluation of the process. To show the relevance of this process, a case study on the expansion into Morocco of a Dutch start-up active in e-health is used. The study shows the importance of the process for embeddedness in a locally relevant value network with a relevant adopter system, a key enabler for achieving time- and cost-effective expansion in that specific business and institutional context. A pilot to assess the proposed models and provide evidence of benefits is under development. To boost their chances of growth, tech start-ups from developed markets should consider expansion into emerging markets in their strategy. It would be beneficial for policy makers to adopt a strategy by which to assist tech start-ups in accessing value networks in emerging markets. It is also important for policy makers from emerging markets to consider developing schemes to attract tech start-ups from developed markets.
A qualitative study of Machine Learning practices and engineering challenges in Earth Observation
(2021)
Machine Learning (ML) is ubiquitously on the advance. Like many domains, Earth Observation (EO) increasingly relies on ML applications, where ML methods process vast amounts of heterogeneous and continuous data streams to answer socially and environmentally relevant questions. However, developing such ML-based EO systems remains challenging: development processes and employed workflows are often barely structured and poorly reported. The application of ML methods and techniques is considered opaque, and this lack of transparency is contradictory to the responsible development of ML-based EO applications. To improve this situation, a better understanding of the current practices and engineering-related challenges in developing ML-based EO applications is required. In this paper, we report observations from an exploratory study in which five experts shared their views on ML engineering in semi-structured interviews. We analysed these interviews with coding techniques as often applied in the domain of empirical software engineering. The interviews provide informative insights into the practical development of ML applications and reveal several engineering challenges. In addition, interviewees participated in a novel workflow sketching task, which provided a tangible reflection of implicit processes. Overall, the results confirm a gap between theoretical conceptions and real practices in ML development, even though workflows were sketched abstractly as textbook-like. The results pave the way for a large-scale investigation on requirements for ML engineering in EO.
The dataset contains the following data from successful and failed executions of the Toyota HSR robot placing a book on a shelf:
- RGB images from the robot's head camera
- Depth images from the robot's head camera
- Rendered images of the robot's 3D model from the point of view of the robot's head camera
- Force-torque readings from a wrist-mounted force-torque sensor
- Joint efforts, velocities and positions
- Extrinsic and intrinsic camera calibration parameters
- Frame-level anomaly annotations
The anomalies that occur during execution include:
- The manipulated book falling down
- Books on the shelf being disturbed significantly
- Camera occlusions
- The robot being disturbed by an external collision
The dataset is split into train, validation and test sets with the following numbers of trials:
- Train: 48 successful trials
- Validation: 6 successful trials
- Test: 60 anomalous trials and 7 successful trials
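Assuming a simple in-memory representation of the split described above (the trial counts come from the dataset description; the helper function and dictionary layout are illustrative, not part of the dataset's actual file format), the split can be sketched as:

```python
# Trial counts as stated in the dataset description. The dict layout is an
# illustrative assumption, not the dataset's on-disk structure.
SPLITS = {
    "train": {"successful": 48, "anomalous": 0},
    "validation": {"successful": 6, "anomalous": 0},
    "test": {"successful": 7, "anomalous": 60},
}

def total_trials(splits):
    """Count all trials across all splits."""
    return sum(n for split in splits.values() for n in split.values())

# Only the test set contains anomalous trials, so frame-level anomaly
# annotations are evaluated there, while training uses successful runs only.
assert total_trials(SPLITS) == 121
```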
Designs for decorative surfaces, such as flooring, must cover several square meters to avoid visible repeats. While the use of desktop systems is feasible to support the designer, it is challenging for a non-domain expert to get the right impression of the appearances of surfaces due to limited display sizes and a potentially unnatural interaction with digital designs. At the same time, large-format editing of structure and gloss is becoming increasingly important. Advances in the printing industry allow for more faithful reproduction of such surface details. Unfortunately, existing systems for visualizing surface designs cannot adequately account for gloss, especially for non-domain experts. Here, the complex interaction of light sources and the camera position must be controlled using software controls. As a result, only small parts of the data set can be properly inspected at a time. Also, real-world lighting is not considered here. This work presents a system for the processing and realistic visualization of large decorative surface designs. To this end, we present a tabletop solution that is coupled to a live 360° video feed and a spatial tracking system. This allows for reproducing natural view-dependent effects like real-world reflections, live image-based lighting, and the interaction with the design using virtual light sources employing natural interaction techniques that allow for a more accurate inspection even for non-domain experts.
The article explores SME (small and medium-sized enterprises) brand strategies as a means to position and successfully engage in competitive markets. A derived typology of brand strategy types deals with social profiling and sheds light on brand strategy internalization of two current managerial paradigms: sustainability and co-creation. N = 895 German SME wineries were examined, leaning on a netnographic analysis of predominantly websites and social media interactions. A two-step clustering method thereby identified eight winery SME brand strategy types. The importance of sustainability across the identified eight brand strategy types is significant. Co-creation turned out to be a key profiling trait characterizing one brand strategy type. The typology illustrates strategic richness, with brand strategies leaning predominantly on traditional values, on sustainability, on external reputation, or on more innovative customer-centric concepts such as co-creation. The typology and the identified brand levers thereby invite practitioners to strategically design brand management, governance, and sustainability. Wineries which focus on traditional positioning and legitimacy were found to be cautious in deploying co-creation through social media. Winery brands that are characterized by engagement in digital co-creation apparently either tend to expand their scope or partially combine it with traditional values, making them the most diverse type identified. Sustainability obviously needs to be addressed by all brand strategies. Despite the industry and country focus, the analyses illustrate the relevance of socially oriented profiling and highlight that sustainability has reached the status of a fundamental business approach while still allowing differentiation thereon. Furthermore, the business models of the SMEs need to deliver the communicated values.
Actors
(2021)
Social protection is for many international organizations a state's affair. While the state definitely plays an important role, the state is by far not the only actor and there is no predefined institutional arrangement of how social protection should be implemented. An exclusive focus on the state would therefore be short-sighted when assessing and comparing the performance of social protection systems. It is hence important to understand the mix of actors involved, the type of contribution they can make to social protection and their modes of cooperation. This contribution will therefore first sketch out the role and interplay of the main actors in social protection and then challenge some of the common assumptions made around how roles are best allocated in the social protection system concerning the providers of informal social protection, the private sector, civil society organizations (CSO) as well as international actors.
Solving transport network problems can be complicated by non-linear effects. In the particular case of gas transport networks, the most complex non-linear elements are compressors and their drives. They are described by a system of equations composed of a piecewise-linear ‘free’ model for the control logic and a non-linear ‘advanced’ model for calibrated characteristics of the compressor. For all element equations, certain stability criteria must be fulfilled, ensuring the absence of folds in the associated system mapping. In this paper, we consider a transformation (warping) of a system from the space of calibration parameters to the space of transport variables that satisfies these criteria. The algorithm drastically improves the stability of the network solver. Numerous tests on realistic networks show that a nearly 100% convergence rate of the solver is achieved with this approach.
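As an illustration of the piecewise-linear element models mentioned above, the sketch below interpolates a calibrated compressor characteristic between breakpoints and clamps it outside the calibrated range (one simple way to avoid folds in the mapping). The breakpoint values are invented for illustration and do not come from the paper.

```python
from bisect import bisect_right

def piecewise_linear(breakpoints, values, x):
    """Evaluate a piecewise-linear characteristic at x.

    breakpoints: sorted x-coordinates of the calibration points
    values: characteristic values at those points
    Outside the calibrated range the characteristic is clamped so the
    mapping stays monotone-safe (no folds) at the boundaries.
    """
    if x <= breakpoints[0]:
        return values[0]
    if x >= breakpoints[-1]:
        return values[-1]
    i = bisect_right(breakpoints, x) - 1  # segment containing x
    t = (x - breakpoints[i]) / (breakpoints[i + 1] - breakpoints[i])
    return values[i] + t * (values[i + 1] - values[i])

# Invented compressor characteristic: normalized flow -> pressure ratio
flow = [0.0, 1.0, 2.0, 3.0]
ratio = [1.4, 1.35, 1.2, 1.0]
```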
The clear-sky radiative effect of aerosol–radiation interactions is of relevance for our understanding of the climate system. The influence of aerosol on the surface energy budget is of high interest for the renewable energy sector. In this study, the radiative effect is investigated in particular with respect to seasonal and regional variations for the region of Germany and the year 2015 at the surface and top of atmosphere using two complementary approaches.
First, an ensemble of clear-sky models which explicitly consider aerosols is utilized to retrieve the aerosol optical depth and the surface direct radiative effect of aerosols by means of a clear-sky fitting technique. For this, short-wave broadband irradiance measurements in the absence of clouds are used as a basis. A clear-sky detection algorithm is used to identify cloud-free observations. Considered are measurements of the short-wave broadband global and diffuse horizontal irradiance with shaded and unshaded pyranometers at 25 stations across Germany within the observational network of the German Weather Service (DWD). The clear-sky models used are the Modified MAC model (MMAC), the Meteorological Radiation Model (MRM) v6.1, the Meteorological–Statistical solar radiation model (METSTAT), the European Solar Radiation Atlas (ESRA), Heliosat-1, the Center for Environment and Man solar radiation model (CEM), and the simplified Solis model. The definition of aerosol and atmospheric characteristics of the models are examined in detail for their suitability for this approach.
Second, the radiative effect is estimated using explicit radiative transfer simulations with inputs on the meteorological state of the atmosphere, trace gases and aerosol from the Copernicus Atmosphere Monitoring Service (CAMS) reanalysis. The aerosol optical properties (aerosol optical depth, Ångström exponent, single scattering albedo and asymmetry parameter) are first evaluated with AERONET direct sun and inversion products. The largest inconsistency is found for the aerosol absorption, which is overestimated by about 0.03 or about 30 % by the CAMS reanalysis. Compared to the DWD observational network, the simulated global, direct and diffuse irradiances show reasonable agreement within the measurement uncertainty. The radiative kernel method is used to estimate the resulting uncertainty and bias of the simulated direct radiative effect. The uncertainty is estimated to −1.5 ± 7.7 and 0.6 ± 3.5 W m−2 at the surface and top of atmosphere, respectively, while the annual-mean biases at the surface, top of atmosphere and total atmosphere are −10.6, −6.5 and 4.1 W m−2, respectively.
The retrieval of the aerosol radiative effect with the clear-sky models shows a high level of agreement with the radiative transfer simulations, with an RMSE of 5.8 W m−2 and a correlation of 0.75. The annual mean of the REari at the surface for the 25 DWD stations shows a value of −12.8 ± 5 W m−2 as the average over the clear-sky models, compared to −11 W m−2 from the radiative transfer simulations. Since all models assume a fixed aerosol characterization, the annual cycle of the aerosol radiation effect cannot be reproduced. Out of this set of clear-sky models, the largest level of agreement is shown by the ESRA and MRM v6.1 models.
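The agreement metrics quoted above (RMSE and correlation) can be computed from paired model/simulation values as shown in the sketch below; the sample REari values are invented for illustration and are not data from the study.

```python
import math

def rmse(a, b):
    """Root-mean-square error between two paired series."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def pearson(a, b):
    """Pearson correlation coefficient between two paired series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Invented example: clear-sky-model vs. radiative-transfer REari (W m^-2)
model = [-10.0, -14.0, -12.5, -9.0]
sim = [-11.0, -12.0, -13.0, -8.5]
```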
This study sought to apply the Structure-Conduct-Performance paradigm to Africa's air transport landscape in general. To do so, it examines the past, present, and future expectations of four of Sub-Saharan Africa's biggest aviation economies, namely South Africa, Kenya, Ethiopia, and Nigeria. Secondary data containing historical passenger traffic was analysed, and predictions for growth in the next ten years were proposed. The findings suggest that existing liberalization initiatives, such as the Yamoussoukro Declaration (YD), have produced fewer benefits than expected. However, the future of aviation in Africa is somewhat positive, with a growth trajectory expected to follow a linear and gradual path supported by various initiatives, including the Single African Air Transport Market (SAATM) and the African Continental Free Trade Area (AfCFTA). The study's contribution is to illuminate the current discourse on the aviation sector in Africa through the Structure-Conduct-Performance paradigm and to suggest a conceptual model that could be applied to future studies relating to aviation in Africa.
Voice assistants (VA) collect data about users’ daily life including interactions with other connected devices, musical preferences, and unintended interactions. While users appreciate the convenience of VAs, their understanding and expectations of data collection by vendors are often vague and incomplete. By making the collected data explorable for consumers, our research-through-design approach seeks to unveil design resources for fostering data literacy and help users in making better informed decisions regarding their use of VAs. In this paper, we present the design of an interactive prototype that visualizes the conversations with VAs on a timeline and provides end users with basic means to engage with data, for instance allowing for filtering and categorization. Based on an evaluation with eleven households, our paper provides insights on how users reflect upon their data trails and presents design guidelines for supporting data literacy of consumers in the context of VAs.
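As a rough sketch of the filtering and categorization the prototype provides, consider a minimal in-memory model of interaction records placed on a timeline; the record fields and values are assumptions for illustration, not the prototype's actual data model.

```python
from datetime import date

# Invented voice-assistant interaction records (field names are assumptions)
records = [
    {"day": date(2021, 3, 1), "category": "music", "text": "play jazz"},
    {"day": date(2021, 3, 1), "category": "unintended", "text": "(false wake word)"},
    {"day": date(2021, 3, 2), "category": "smart_home", "text": "lights off"},
]

def filter_records(records, category=None, day=None):
    """Return records matching the given category and/or day,
    mirroring the timeline filtering a data-literacy UI might offer."""
    return [
        r for r in records
        if (category is None or r["category"] == category)
        and (day is None or r["day"] == day)
    ]
```

Surfacing the "unintended" category in such a view is one way a prototype could make accidental data collection visible to end users.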
Many workers experience their jobs as effortful or even stressful, which can result in strain. Although recovery from work would be an adaptive strategy to prevent the adverse effects of work-related strain, many workers face problems finding enough time to rest and to mentally disconnect from work during nonwork time. What goes on in workers’ minds after a stressful workday? What is it about their jobs that makes them think about their work? This special issue aims to bridge the gap between research on recovery processes mainly examined in Occupational Health Psychology, and research on work stress and working hours, often investigated in the field of Human Resource Management. We first summarize conceptual and theoretical streams from both fields of research. In the following, we discuss the contributions of the five special issue papers and conclude with key messages and directions for further research.
The analysis of used engine oils from industrial engines enables the study of engine wear and oil degradation in order to evaluate the necessity of oil changes. As the matrix composition of an engine oil strongly depends on its intended application, meaningful diagnostic oil analyses pose considerable challenges. Owing to the broad spectrum of available oil matrices, we have evaluated the applicability of using an internal standard and/or preceding sample digestion for elemental analysis of used engine oils via inductively coupled plasma optical emission spectroscopy (ICP OES). Elements originating from both wear particles and additives, as well as the influence of particle size, could be clearly recognized by their distinct digestion behaviour. While a precise determination of most wear elements can be achieved in the oily matrix, the measurement of additives is preferably performed after sample digestion. Considering a dataset of physicochemical parameters and elemental compositions for several hundred used engine oils, we have further investigated the feasibility of predicting the identity and overall condition of an unknown combustion engine using the machine learning system XGBoost. A maximum accuracy of 89.6% in predicting the engine type was achieved, a mean error of less than 10% of the observed timeframe in predicting the oil running time, and even less than 4% for the total engine running time, based purely on common oil-check data. Furthermore, obstacles and possibilities to improve the performance of the machine learning models were analysed, and the factors that enabled the prediction were explored with SHapley Additive exPlanations (SHAP). Our results demonstrate that both the identification of an unknown engine and a lifetime assessment can be performed as a first estimation of the actual sample without requiring meticulous documentation.
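A mean error expressed as a percentage of the observed timeframe, as quoted above, corresponds to a mean absolute error normalized by the span of the target variable. The sketch below illustrates only this metric (the study itself used XGBoost for the predictions); the running times are invented.

```python
def mean_relative_error(y_true, y_pred):
    """Mean absolute error as a fraction of the observed range of y_true."""
    span = max(y_true) - min(y_true)
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
    return mae / span

# Invented oil running times (hours) and model predictions
true_hours = [100.0, 400.0, 800.0, 1200.0]
pred_hours = [150.0, 380.0, 760.0, 1300.0]
# mean_relative_error(true_hours, pred_hours) < 0.10 would correspond to
# "a mean error of less than 10% of the observed timeframe"
```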
Ice accumulation on the blades of wind turbines can cause them to rotate anomalously or not at all, thus affecting the generation of electricity and power output. In this work, we investigate the problem of ice accumulation in wind turbines by framing it as anomaly detection on multivariate time series. Our approach consists of two main parts: first, learning low-dimensional representations of time series using a Variational Recurrent Autoencoder (VRAE), and second, using unsupervised clustering algorithms to classify the learned representations as normal (no ice accumulated) or abnormal (ice accumulated). We have evaluated our approach on a custom wind turbine time series dataset. For the two-class problem (one normal versus one abnormal class), we obtained a classification accuracy of up to 96% on test data. For the multi-class problem (one normal versus multiple abnormal classes), we present a qualitative analysis of the low-dimensional learned latent space, providing insights into the capacities of our approach to tackle such a problem. The code to reproduce this work can be found at https://github.com/agrija9/Wind-Turbines-VRAE-Paper.
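A minimal sketch of the second stage, classifying learned latent representations as normal or abnormal, is shown below. The VRAE encoder is omitted, and the latent points and distance threshold are invented; the paper uses unsupervised clustering, and this centroid-distance rule is a simplified stand-in for illustration.

```python
import math

def centroid(points):
    """Component-wise mean of a list of equal-length vectors."""
    dim = len(points[0])
    return [sum(p[d] for p in points) / len(points) for d in range(dim)]

def classify(latents, normal_latents, threshold):
    """Label each latent vector 'abnormal' if it lies farther than
    `threshold` from the centroid of known-normal latent vectors."""
    c = centroid(normal_latents)
    return [
        "abnormal" if math.dist(p, c) > threshold else "normal"
        for p in latents
    ]

# Invented 2D latent vectors: a tight normal cluster and two query points
normal = [[0.1, 0.0], [0.0, 0.2], [-0.1, -0.1]]
queries = [[0.05, 0.05], [3.0, 3.0]]
```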
This study investigated the application potential of Black Soldier Fly larvae (Hermetia illucens L. 1758, Diptera: Stratiomyidae) for wastewater treatment, specifically their potential to remove chemical oxygen demand (COD), ammonia, and phosphorus from liquid manure residue and municipal wastewater containing 1% solids content. Black Soldier Fly larvae were found to reduce the concentration of chemical oxygen demand but, unfortunately, to increase the concentrations of ammonia and phosphorus. Feeding on the organic waste of liquid manure residue, Black Soldier Fly larvae increased their weight by 365% in a solution with 12% solids content and by 595% in a solution with 6% solids content. The study also showed that Black Soldier Fly larvae are able to survive in a solution of 1% solids content and can reduce chemical oxygen demand by up to 86.4% for liquid manure residue and 46.9% for municipal wastewater after 24 hours. Generally, ammonia increased by 43.9% for liquid manure residue and 98.6% for municipal wastewater. Total phosphorus showed increases of 11.0% and 88.6% for liquid manure residue and municipal wastewater, respectively, over the 8-day study. Transparent environments tended to reduce the COD content more than dark environments, both for liquid manure residue (55.8% and 65.4%) and municipal wastewater (71.5% and 66.4%).
Atomic oxygen in the mesosphere and lower thermosphere measured by terahertz heterodyne spectroscopy
(2021)
Atomic oxygen is a main component of the mesosphere and lower thermosphere (MLT). The photochemistry and the energy balance of the MLT are governed by atomic oxygen. In addition, it is a tracer for dynamical motions in the MLT. Atomic oxygen is, however, difficult to measure with remote sensing techniques. Concentrations can be inferred indirectly from the oxygen airglow or from observations of OH, which is involved in photochemical processes related to atomic oxygen. Such measurements have been performed with several satellite instruments such as SCIAMACHY, SABER, WINDII and OSIRIS. However, the methods are indirect and rely on photochemical models and assumptions such as quenching rates, radiative lifetimes, and reaction coefficients. The results are not always in agreement, particularly when obtained with different instruments.
Cancer is one of the leading causes of death worldwide [183], with lung tumors being the most frequent cause of cancer deaths in men as well as one of the most common cancers diagnosed in women [40]. As symptoms often arise only in advanced stages, an early diagnosis is especially important to ensure the best and earliest possible treatment. To achieve this, Computed Tomography (CT) scans are frequently used for tumor detection and diagnosis. We will present examples of publicly available CT image data of lung cancer patients and discuss possible methods to realize a system for automated cancer diagnosis. We will also look at the recent SPIE-AAPM Lung CT Challenge [10] data set in detail and describe possible methods and challenges for image segmentation and classification based on this data set.
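As an illustration of the kind of segmentation step such a system might start with (a minimal sketch, not the challenge pipeline: the toy slice, the threshold value, and the helper name are all hypothetical), low-density lung candidates can be isolated by thresholding Hounsfield units and discarding air connected to the image border:

```python
from collections import deque

def segment_lung_candidates(slice_hu, threshold=-400):
    """Threshold a CT slice (Hounsfield units) and return the sizes of
    connected low-density regions that do not touch the image border.
    Border-connected air (outside the body) is discarded."""
    h, w = len(slice_hu), len(slice_hu[0])
    mask = [[slice_hu[y][x] < threshold for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]

    def flood(sy, sx):
        # BFS over 4-connected low-density pixels
        q = deque([(sy, sx)])
        seen[sy][sx] = True
        size, touches_border = 0, False
        while q:
            y, x = q.popleft()
            size += 1
            if y in (0, h - 1) or x in (0, w - 1):
                touches_border = True
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                    seen[ny][nx] = True
                    q.append((ny, nx))
        return size, touches_border

    sizes = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                size, border = flood(y, x)
                if not border:
                    sizes.append(size)
    return sorted(sizes, reverse=True)

# Toy 7x7 "slice": body tissue (+40 HU) with two air-filled lung regions
# (-800 HU), surrounded by outside air (-1000 HU) at the border.
T, A, O = 40, -800, -1000
toy = [
    [O, O, O, O, O, O, O],
    [O, T, T, T, T, T, O],
    [O, T, A, T, A, T, O],
    [O, T, A, T, A, T, O],
    [O, T, T, T, T, T, O],
    [O, T, T, T, T, T, O],
    [O, O, O, O, O, O, O],
]
print(segment_lung_candidates(toy))  # [2, 2]
```

A real pipeline would operate on 3-D volumes and feed the resulting candidate regions into a classifier; this sketch only shows the thresholding and connected-component idea.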
Introduction: Chronic pain is a frequent and severe condition, often associated with anxiety, depression, insomnia, disability, and reduced quality of life. This maladaptive condition is further characterized by sensory loss, hyperalgesia, and allodynia. Blue light has been hypothesized to modulate sensory neurons and thereby influence nociception.
Objectives: Here, we compared the effects of blue light vs red light and thermal control on pain sensation in a human experimental pain model.
Methods: Pain, hyperalgesia, and allodynia were induced in 30 healthy volunteers through high-density transcutaneous electrical stimulation. Subsequently, blue light, red light, or thermal control treatment was applied in a cross-over design. The nonvisual effects of the respective light treatments were examined using a well-established quantitative sensory testing protocol. Somatosensory parameters as well as pain intensity and quality were scored.
Results: Blue light substantially reduced spontaneous pain as assessed by numeric rating scale pain scoring. Similarly, pain quality was significantly altered as assessed by the German counterpart of the McGill Pain Questionnaire. Furthermore, blue light showed antihyperalgesic, antiallodynic, and antihypesthesic effects in contrast to red light or thermal control treatment.
Conclusion: Blue-light phototherapy ameliorates pain intensity and quality in a human experimental pain model and reveals antihyperalgesic, antiallodynic, and antihypesthesic effects. Therefore, blue-light phototherapy may be a novel approach to treat pain in multiple conditions.
This book shows in a comprehensive presentation how Bond Graph methodology can support model-based control, model-based fault diagnosis, fault accommodation, and failure prognosis by reviewing the state-of-the-art, presenting a hybrid integrated approach to Bond Graph model-based fault diagnosis and failure prognosis, and by providing a review of software that can be used for these tasks.
This article provides insights into the modalities of business-model change and innovation. On the basis of an analysis of empirical data from small and medium enterprises, a transition from wine production centrism to its expanded use in hospitality and tourism is explored. Previous research on wine tourism and hospitality predominantly focuses on a destination perspective, neglecting the organizational winery perspective. The article deploys a mixed methods approach, combining netnography and content analysis for data collection with grounded research and clustering for theory building. The sample included 885 German wineries. Data stemmed from two distinct sources (websites and a secondary publication in the form of a wine guide) and were analyzed through a two-step clustering algorithm as well as a Principal Component Analysis (PCA). The two-step clustering algorithm yielded nine different business models, while the PCA grouped the variables into the following two categories: basic winery business model (BM) and BM extension into hospitality and tourism, thereby validating the difference between the two constructs. The results point to the diverse nature of business model extensions of wineries in tourism and hospitality, depending on their organizational type and size. This study offers a classification of small and medium-sized enterprises' strategic business model expansion and explores the expansion of the wine industry through wine hospitality and tourism services, starting from the winery's organizational perspective, which has not been done before.
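The clustering step described above can be illustrated with a plain k-means sketch (a simplified stand-in for the two-step clustering algorithm actually used in the study; the two winery features and all data points below are hypothetical):

```python
def kmeans2(points, iters=20):
    """Plain 2-cluster k-means on 2-D feature vectors. Deterministic
    initialization: first and last data point serve as starting centers."""
    centers = [points[0], points[-1]]
    for _ in range(iters):
        clusters = [[], []]
        for p in points:
            # assign each point to the nearest center (squared distance)
            d = [(p[0] - cx) ** 2 + (p[1] - cy) ** 2 for cx, cy in centers]
            clusters[d.index(min(d))].append(p)
        # recompute each center as the mean of its cluster
        centers = [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
                   if c else centers[j] for j, c in enumerate(clusters)]
    return clusters

# Hypothetical features per winery: (basic-BM score, hospitality-extension score).
wineries = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1),   # production-centric
            (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]   # extended into tourism
groups = kmeans2(wineries)
print(sorted(len(g) for g in groups))  # [3, 3]
```

The study's second step, PCA, would then reduce the full variable set to the two components (basic BM vs. BM extension); this sketch only shows the grouping idea on two toy features.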
Critical consumerism is complex: ethical values are difficult to negotiate, appropriate products are hard to find, and product information is overwhelming. Although recommender systems offer solutions to reduce such complexity, current designs are not appropriate for niche practices and rely on non-personalized, non-transparent ethics. To support critical consumption, we conducted a design case study on a personalized food recommender system. We first conducted an empirical pre-study with 24 consumers to understand value negotiations and current practices, then co-designed the recommender system, and finally evaluated it in a real-world trial with ten consumers. Our findings show how recommender systems can support the negotiation of ethical values within the context of consumption practices, reduce the complexity of finding products and stores, and strengthen consumers. In addition to providing implications for the design to support critical consumption practices, we critically reflect on the scope of such recommender systems and their appropriation.
Many people do not consume as much healthy food as recommended. Nudging has been identified as a promising intervention strategy to increase the consumption of healthy food. The present study analyzed the effects of three body shape nudges (thin, thick, or Giacometti artwork) on food ordering and assessed the mediating role of nudge awareness. A total of 686 students and 218 employees of a German university participated in an online experimental study. After randomization, participants visited a realistic online cafeteria and composed a meal for themselves. Under experimental conditions, participants were exposed to one of three nudges while choosing dishes: (1) a thin body shape, (2) a thick body shape, and (3) the Giacometti artwork nudge. The Giacometti nudge resulted in more orders for salad among employees. The thin and thick body shape nudges did not change dish orders. Awareness of the nudge mediated the number of calories ordered when using the Giacometti or thin body shape nudges. These findings provide useful insights for health interventions in occupational and public health sectors using nudges. Our study contributes to the research on the Giacometti nudge by showing that it is effective under conditions where it is consciously perceived.
The promotion of sustainable packaging is part of the European Green Deal and plays a key role in the EU’s social and political strategy. One option is the use of renewable resources and biomass waste as raw materials for polymer production. Lignocellulosic biomass from annual and perennial industrial crops and agricultural residues is a major source of polysaccharides, proteins, and lignin, and can also be used to obtain plant-based extracts and essential oils. These biomasses are therefore considered a potential substitute for fossil-based resources. Here, the status quo of bio-based polymers is discussed and evaluated in terms of properties related to packaging applications, such as gas and water vapor permeability as well as mechanical properties. So far, their practical use is still restricted due to lower performance in fundamental packaging functions that directly influence food quality and safety, the length of shelf life, and thus the amount of food waste. Besides bio-based polymers, this review focuses on plant extracts as active packaging agents. Incorporating extracts of herbs, flowers, trees, and their fruits is essential to achieve the material properties needed to prolong food shelf life. Finally, the adoption potential of packaging based on polymers from renewable resources is discussed from a bioeconomy perspective.
The idea of a basic income grant (BIG) is not new, and there are ongoing debates, internationally as well as nationally, about a BIG as a social protection policy option in low- and middle-income countries just as in high-income countries. The challenge is that there are different conceptualisations, which conflate and muddle the understanding. In the context of social assistance provision, a universal basic income grant (UBIG) is often compared and contrasted with targeted cash transfers (CTs). This case study systematically presents the arguments for targeted CTs and UBIGs. Its value is that it systematically brings these arguments together, highlighting the variations in UBIG applications, including the evidence and actual impact of UBIG experiments. The structure of the case study is as follows: Section 2 contrasts and compares the arguments for targeted CTs and a UBIG; Section 3 discusses UBIG experiments and presents the evidence on the application of the UBIG idea; and Section 4 concludes.
With the roll-out of social protection programmes to national scale, questions about implementation and delivery move increasingly into the centre of debate (e.g. UNDP 2020; UNDP and UNCDF 2014; Kramon 2019). This concerns in particular the local level, where key processes of implementation take place but where institutional, operational, and financial capacities are often the weakest. While social protection programmes are usually based on a clearly defined set of operational rules and regulations, typically set out in a programme manual, in practice these processes often look quite different. Although many social protection programmes have explicitly excluded traditional authorities from playing an active role in programme delivery, there is ample evidence from across countries that in many local contexts these ‘informal institutions’ continue to play an important role in the delivery of social protection programmes.
In recent years, the basic income grant (BIG) discourse has gained attention worldwide as a potential policy option in social protection, as testified by recent public debates, ongoing pilot projects, campaigning efforts, policy measures during Covid-19, and the surge in academic research. A BIG refers to regular cash transfers paid to all members of society irrespective of their socio-economic status, their capacity or willingness to participate in the labour market, or having to meet pre-determined conditions (Offe 2008; Van Parijs 1995, 2003; Wright 2004, 2006). Despite the recent hype around the BIG, Iran is the only country worldwide with a scaled-up BIG (Tabatabai 2011, 2012). Other programmes have never gone beyond the pilot stage. This raises the question of why this is the case.
Characterization of Urban Radio Channels and Base Station Antenna Correlation in the 3.75 GHz Band
(2021)
Cystic fibrosis (CF) arises from mutations in the CF transmembrane conductance regulator (CFTR) gene, resulting in progressive and life-limiting respiratory disease. R751L is a rare CFTR mutation that is poorly characterized. Our aims were to describe the clinical and molecular phenotypes associated with R751L. Relevant clinical data were collected from three heterozygote individuals harboring R751L (2 patients with G551D/R751L and 1 with F508del/R751L). Assessment of R751L-CFTR function was made in primary human bronchial epithelial cultures (HBEs) and Xenopus oocytes. Molecular properties of R751L-CFTR were investigated in the presence of known CFTR modulators. Although sweat chloride was elevated in all three patients, the clinical phenotype associated with R751L was mild. Chloride secretion in F508del/R751L HBEs was reduced compared with non-CF HBEs and associated with a reduction in sodium absorption by the epithelial sodium channel (ENaC). However, R751L-CFTR function in Xenopus oocytes, together with folding and cell surface transport of R751L-CFTR, was not different from wild-type CFTR. Overall, R751L-CFTR was associated with reduced sodium chloride absorption but had functional properties similar to wild-type CFTR. This is the first report of R751L-CFTR that combines clinical phenotype with characterization of functional and biological properties of the mutant channel. Our work will build upon existing knowledge of mutations within this region of CFTR and, importantly, inform approaches for clinical management. Elevated sweat chloride and reduced chloride secretion in HBEs may be due to alternative non-CFTR factors, which require further investigation.
Cancer is a complex disease where resistance to therapies and relapses often pose a serious clinical challenge. The scenario is even more complicated when the cancer type itself is heterogeneous in nature, e.g., lymphoma, a cancer of the lymphocytes which constitutes more than 70 different subtypes. Indeed, the treatment options continue to expand in lymphomas. Herein, we provide insights into lymphoma-specific clinical trials based on cytokine-induced killer (CIK) cell therapy and other pre-clinical lymphoma models where CIK cells have been used along with other synergetic tumor-targeting immune modules to improve their therapeutic potential. From a broader perspective, we will highlight that CIK cell therapy has potential, and in this rapidly evolving landscape of cancer therapies its optimization (as a personalized therapeutic approach) will be beneficial in lymphomas.
Components and Architecture for the Implementation of Technology-Driven Employee Data Protection
(2021)
Most people use disaster apps infrequently, primarily in situations of turmoil, when they are physically or emotionally vulnerable. Personal data may be necessary to help them, and data protections may be waived. In some circumstances, free movement and liberties may be curtailed for public protection, as was seen in the COVID-19 pandemic. Consuming and producing disaster data can deepen problems arising at the confluence of surveillance and disaster capitalism, where data has become a tool for solutionist instrumentarian power (Zuboff 2019; Klein 2008) and part of a destructive mode of one-world worlding (Law 2015; Escobar 2020). The special use of disaster apps prompts us to ask what role consumer protection could play in safeguarding democratic liberties. Within this work, a set of current approaches is briefly reviewed and two case studies are presented of what we call appropriation, or design against datafication. These combine document analysis and literature research with several months of online and field ethnographic observation. The first case study examines disaster app use in response to the 2010 Haiti earthquake; the second explores COVID contact tracing in Taiwan in 2020/21. Against this backdrop we ask: how could, and how should, consumer protection respond to problems of surveillance disaster capitalism? Drawing on our work with the isITethical? Exchange, a co-designed community platform and knowledge exchange for disaster information sharing, and a Societal Readiness Assessment Framework that we are developing alongside it, we explore how co-design methodologies could help define answers.
The identification of energetic materials inside containments is an important challenge for analytical methods in the field of safety and security. Opening a package without knowledge of its contents and the resulting hazards involves considerable risks and should be avoided whenever possible. Preferable methods therefore work non-destructively, with minimal interaction, and are capable of identifying target substances in a containment quickly and reliably. Most spectroscopic methods reach their limits if the target substance is shielded by a covering material. To solve this problem, a combined laser drilling method with subsequent identification of the target substance by means of Raman spectroscopic measurements through microscopic bore holes in the covering material is presented. A pulsed laser beam is used both for the drilling process and as an excitation source for the Raman measurements in the same optical setup. Results show the ability of this new method to obtain high-quality spectra even when performed through microscopically small bore channels. With suitably chosen laser parameters, the method can even be applied to highly sensitive explosives such as triacetone triperoxide (TATP). A further advantage is an observed reduction of unwanted fluorescence in the spectral data, resulting from the confocal-like measurement setup with the bore hole acting as an aperture.