H-BRS Bibliography
After more than twenty years of research, the molecular events of apoptotic cell death can be succinctly stated; different pathways, activated by diverse signals, increase the activity of proteases called caspases that rapidly and irreversibly dismantle condemned cells by cleaving specific substrates. During this time, the ideas that apoptosis protects us from tumourigenesis and that cancer chemotherapy works by inducing apoptosis also emerged. Currently, apoptosis research is shifting away from the intracellular events within the dying cell to focus on the effect of apoptotic cells on surrounding tissues. This is producing counterintuitive data showing that our understanding of the role of apoptosis in tumourigenesis and cancer therapy is too simple, with some interesting and provocative implications. Here, we will consider evidence supporting the idea that dying cells signal their presence to the surrounding tissue and, in doing so, elicit repair and regeneration that compensates for any loss of function caused by cell death. We will discuss evidence suggesting that cancer cell proliferation may be driven by inappropriate or corrupted tissue-repair programmes that are initiated by signals from apoptotic cells and show how this may dramatically modify how we view the role of apoptosis in both tumourigenesis and cancer therapy.
Software developers build complex systems using plenty of third-party libraries. Documentation is key to understanding and using the functionality provided via the libraries’ APIs. Therefore, functionality is the main focus of contemporary API documentation, while cross-cutting concerns such as security are almost never considered, especially when the API itself does not provide security features. The documentation of JavaScript libraries for use in web applications, for example, does not specify how to add or adapt a Content Security Policy (CSP) to mitigate content injection attacks like Cross-Site Scripting (XSS). This is unfortunate, as security-relevant API documentation might influence secure coding practices and prevailing major vulnerabilities such as XSS. For the first time, we study the effects of integrating security-relevant information into non-security API documentation. For this purpose, we took CSP as an exemplary study object and extended the official Google Maps JavaScript API documentation with security-relevant CSP information in three distinct manners. Then, we evaluated the usage of these variations in a between-group eye-tracking lab study involving N=49 participants. Our observations suggest: (1) Developers focus on elements with code examples. They mostly skim the documentation while searching for a quick solution to their programming task. This finding gives further evidence to results of related studies. (2) The location where CSP-related code examples are placed in non-security API documentation significantly impacts the time it takes to find this security-relevant information. In particular, the study results showed that proximity to function-related code examples in the documentation is a decisive factor. (3) Examples significantly help to produce secure CSP solutions. (4) Developers have additional information needs that our approach cannot meet.
Overall, our study contributes to a first understanding of the impact of security-relevant information in non-security API documentation on CSP implementation. Although further research is required, our findings emphasize that API producers should take responsibility for adequately documenting security aspects and thus supporting the sensibility and training of developers to implement secure systems. This responsibility also holds in seemingly non-security relevant contexts.
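For illustration, a CSP for a page embedding an external maps library might be assembled as below. This is a minimal sketch: the directive names are standard CSP, but the host lists are assumptions for illustration, not the officially required set for any particular library — the library's own documentation remains authoritative.

```python
# Sketch: assembling a Content-Security-Policy header value for a page that
# embeds an external maps library. The host lists below are illustrative
# assumptions, not the library's officially documented requirements.

def build_csp(directives: dict[str, list[str]]) -> str:
    """Serialise a directive dict into a CSP header value string."""
    return "; ".join(f"{name} {' '.join(sources)}"
                     for name, sources in directives.items())

csp = build_csp({
    "default-src": ["'self'"],
    # scripts: own origin plus the (assumed) maps API host
    "script-src": ["'self'", "https://maps.googleapis.com"],
    # map tiles and marker icons are typically loaded as images
    "img-src": ["'self'", "data:", "https://maps.gstatic.com"],
    "style-src": ["'self'", "'unsafe-inline'"],
})

print(csp)
```

A server would send this string as the `Content-Security-Policy` response header; the study's point is precisely that such a snippet, placed near the functional code examples, is what developers find fastest.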
"Innovation Journalism is the political reporting of the future": interview with David Nordfors
(2011)
The rating methodology presented, based on the approach of S&P, appears appropriate for a fundamental examination of the default probabilities of payment obligations. However, the analysis steps outlined require a considerable adjustment of the methodology in order to work out the different effects of internal and external financing in such a way that the type of analysis applied does not itself introduce distortions into the findings.
The so-called "Deutschlandstipendium" was launched in 2010. According to the statutory requirements, the scholarships are to be awarded on the basis of talent and achievement. In addition, social engagement as well as special social, family or personal circumstances are to be taken into account. For funding, universities first have to raise private sponsorship, which is then matched by the federal and state governments with an equal amount. Private sponsors may specify the degree programmes from which the recipients of the scholarships they co-finance are to be selected. However, universities must ensure that one third of all scholarships are awarded without such earmarking. Sponsors may not exert a direct influence on the selection of individual candidates. Against this background, universities are required to create incentives for private sponsors while designing application and selection procedures that comply with the statutory requirements. This creates a considerable administrative burden for the universities. To reduce it, this article describes a transparent, traceable, time- and cost-saving process implemented as a programmed workflow.
2-methylacetoacetyl-coenzyme A thiolase (beta-ketothiolase) deficiency: one disease - two pathways
(2020)
Background: 2-methylacetoacetyl-coenzyme A thiolase deficiency (MATD; deficiency of mitochondrial acetoacetyl-coenzyme A thiolase T2/ “beta-ketothiolase”) is an autosomal recessive disorder of ketone body utilization and isoleucine degradation due to mutations in ACAT1.
Methods: We performed a systematic literature search for all available clinical descriptions of patients with MATD. Two hundred forty-four patients were identified and included in this analysis. Clinical course and biochemical data are presented and discussed.
Results: For 89.6% of patients, at least one acute metabolic decompensation was reported. Age at first symptoms ranged from 2 days to 8 years (median 12 months). More than 82% of patients presented in the first 2 years of life, while manifestation in the neonatal period was the exception (3.4%). 77.0% of patients (157 of 204) showed normal psychomotor development without neurologic abnormalities.
Conclusion: This comprehensive data analysis provides a systematic overview of all cases with MATD identified in the literature. It demonstrates that MATD is a rather benign disorder with an often favourable outcome when compared with many other organic acidurias.
Nitrile-type inhibitors are known to interact with cysteine proteases in a covalent-reversible manner. The chemotype of 3-cyano-3-aza-β-amino acid derivatives was designed in which the N-cyano group is centrally arranged in the molecule to allow for interactions with the nonprimed and primed binding regions of the target enzymes. These compounds were evaluated as inhibitors of the human cysteine cathepsins K, S, B, and L. They exhibited slow-binding behavior and were found to be exceptionally potent, in particular toward cathepsin K, with second-order rate constants up to 52,900 × 10³ M⁻¹ s⁻¹.
Background: 3-hydroxy-3-methylglutaryl-coenzyme A lyase deficiency (HMGCLD) is an autosomal recessive disorder of ketogenesis and leucine degradation due to mutations in HMGCL.
Methods: We performed a systematic literature search to identify all published cases. Two hundred eleven patients for whom relevant clinical data were available were included in this analysis. Clinical course, biochemical findings and mutation data are highlighted and discussed. An overview of all published HMGCL variants is provided.
Results: More than 95% of patients presented with acute metabolic decompensation. Most patients manifested within the first year of life, 42.4% already neonatally. Very few individuals remained asymptomatic. The neurologic long-term outcome was favorable with 62.6% of patients showing normal development.
Conclusion: This comprehensive data analysis provides a systematic overview of all published cases with HMGCLD, including a list of all known HMGCL mutations.
3-Hydroxyisobutyrate Dehydrogenase (HIBADH) deficiency - a novel disorder of valine metabolism
(2021)
3-Hydroxyisobutyric acid (3HiB) is an intermediate in the degradation of the branched-chain amino acid valine. Disorders in valine degradation can lead to 3HiB accumulation and its excretion in the urine. This article describes the first two patients with a new metabolic disorder, 3-hydroxyisobutyrate dehydrogenase (HIBADH) deficiency, its phenotype and its treatment with a low-valine diet. The detected mutation in the HIBADH gene leads to nonsense-mediated mRNA decay of the mutant allele and to a complete loss-of-function of the enzyme. Under strict adherence to a low-valine diet a rapid decrease of 3HiB excretion in the urine was observed. Due to limited patient numbers and intrafamilial differences in phenotype with one affected and one unaffected individual, the clinical phenotype of HIBADH deficiency needs further evaluation.
Animal models are often needed in cancer research but some research questions may be answered with other models, e.g., 3D replicas of patient-specific data, as these mirror the anatomy in more detail. We, therefore, developed a simple eight-step process to fabricate a 3D replica from computer tomography (CT) data using solely open access software and described the method in detail. For evaluation, we performed experiments regarding endoscopic tumor treatment with magnetic nanoparticles by magnetic hyperthermia and local drug release. For this, the magnetic nanoparticles need to be accumulated at the tumor site via a magnetic field trap. Using the developed eight-step process, we printed a replica of a locally advanced pancreatic cancer and used it to find the best position for the magnetic field trap. In addition, we described a method to hold these magnetic field traps stably in place. The results are highly important for the development of endoscopic tumor treatment with magnetic nanoparticles as the handling and the stable positioning of the magnetic field trap at the stomach wall in close proximity to the pancreatic tumor could be defined and practiced. Finally, the detailed description of the workflow and use of open access software allows for a wide range of possible uses.
Background: Cancer heterogeneity poses a serious challenge concerning the toxicity and adverse effects of therapeutic inhibitors, especially when it comes to combinatorial therapies that involve multiple targeted inhibitors. In particular, in non-small cell lung cancer (NSCLC), a number of studies have reported synergistic effects of drug combinations in preclinical models, while they were only partially successful in the clinical setup, suggesting that alternative clinical strategies (taking genetic background and immune response into account) should be considered. Herein, we investigated the antitumor effect of cytokine-induced killer (CIK) cells in combination with ALK and PD-1 inhibitors in vitro on genetically variable NSCLC cell lines.
Methods: We co-cultured the three genetically different NSCLC cell lines NCI-H2228 (EML4-ALK), A549 (KRAS mutation), and HCC-78 (ROS1 rearrangement) with and without nivolumab (PD-1 inhibitor) and crizotinib (ALK inhibitor). Additionally, we profiled the variability of surface expression of multiple immune checkpoints, the absolute number of dead cells, and intracellular granzyme B in CIK cells using flow cytometry as well as RT-qPCR. ELISA and Western blot were performed to verify the activation of CIK cells.
Results: Our analysis showed that (a) nivolumab significantly weakened PD-1 surface expression on CIK cells without impacting other immune checkpoints or PD-1 mRNA expression, (b) this combination strategy showed an effective response in cell viability, IFN-γ production, and intracellular release of granzyme B in CD3+CD56+ CIK cells, but solely in NCI-H2228, (c) the intrinsic expression of Fas ligand (FasL) as a T-cell activation marker in CIK cells was upregulated by this additive effect, and (d) nivolumab significantly increased Foxp3 expression in the CD4+CD25+ subpopulation of CIK cells. Taken together, we could show that CIK cells in combination with crizotinib and nivolumab can enhance the anti-tumor immune response through FasL activation, leading to increased IFN-γ and granzyme B, but only in NCI-H2228 cells with the EML4-ALK rearrangement. Therefore, we hypothesize that CIK therapy may be a potential alternative for NSCLC patients harboring the EML4-ALK rearrangement; in addition, we support the idea that combination therapies offer significant potential when they are optimized on a patient-by-patient basis.
The simultaneous operation of multiple different semiconducting metal oxide (MOX) gas sensors is demanding for the readout circuitry. The challenge results from the strongly varying signal intensities of the various sensor types to the target gas. While some sensors change their resistance only slightly, other types can react with a resistive change over a range of several decades. Therefore, a suitable readout circuit has to be able to capture all these resistive variations, requiring it to have a very large dynamic range. This work presents a compact embedded system that provides a full, high range input interface (readout and heater management) for MOX sensor operation. The system is modular and consists of a central mainboard that holds up to eight sensor-modules, each capable of supporting up to two MOX sensors, therefore supporting a total maximum of 16 different sensors. Its wide input range is achieved using the resistance-to-time measurement method. The system is solely built with commercial off-the-shelf components and tested over a range spanning from 100 Ω to 5 GΩ (9.7 decades) with an average measurement error of 0.27% and a maximum error of 2.11%. The heater management uses a well-tested power-circuit and supports multiple modes of operation, hence enabling the system to be used in highly automated measurement applications. The experimental part of this work presents the results of an exemplary screening of 16 sensors, which was performed to evaluate the system’s performance.
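The resistance-to-time method mentioned above can be illustrated with the textbook RC-discharge relation: the unknown sensor resistance discharges a known capacitor, and the measured time is inverted to recover the resistance. A minimal sketch, assuming a single reference capacitor and comparator threshold (component values are illustrative, not the paper's):

```python
import math

# Resistance-to-time measurement (sketch): the sensor resistance R discharges
# a known capacitor C from V0 down to a comparator threshold VTH; the elapsed
# time t = R * C * ln(V0 / VTH) is timed and inverted to recover R.
# Component values below are illustrative assumptions.

C = 100e-12     # reference capacitor [F]
V0 = 3.3        # start voltage [V]
VTH = 1.2       # comparator threshold [V]

def time_from_resistance(r: float) -> float:
    """Forward model: expected discharge time for a given resistance [s]."""
    return r * C * math.log(V0 / VTH)

def resistance_from_time(t: float) -> float:
    """Invert the discharge equation to get the sensor resistance [ohm]."""
    return t / (C * math.log(V0 / VTH))

# With these values a 1 GΩ sensor gives roughly 0.1 s and a 100 Ω sensor
# roughly 10 ns -- one timer covers many decades, hence the large dynamic range.
for r in (100.0, 1e6, 1e9):
    print(f"R = {r:.0e} Ω -> t = {time_from_resistance(r):.3e} s")
```

The wide span of the resulting times is exactly why a time measurement maps so naturally onto a 9.7-decade resistance range.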
The choice of suitable semiconducting metal oxide (MOX) gas sensors for the detection of a specific gas or gas mixture is time-consuming since the sensor’s sensitivity needs to be characterized at multiple temperatures to find its optimal operating conditions. To obtain reliable measurement results, it is very important that the power for the sensor’s integrated heater is stable, regulated and error-free (or error-tolerant). The error-free requirement in particular can only be achieved if the power supply implements failure-avoiding and failure-detection methods. The biggest challenge is deriving multiple different voltages from a common supply in an efficient way while keeping the system as small and lightweight as possible. This work presents a reliable, compact, embedded system that addresses the power supply requirements for fully automated simultaneous sensor characterization for up to 16 sensors at multiple temperatures. The system implements efficient (avg. 83.3% efficiency) voltage conversion with low ripple output (<32 mV) and supports static or temperature-cycled heating modes. Voltage and current of each channel are constantly monitored and regulated to guarantee reliable operation. To evaluate the proposed design, 16 sensors were screened. The results are shown in the experimental part of this work.
Electrical signal transmission in power electronic devices takes place through high-purity aluminum bonding wires. Cyclic mechanical and thermal stresses during operation lead to fatigue loads, resulting in premature failure of the wires, which cannot be reliably predicted. The following work presents two fatigue lifetime models calibrated and validated based on experimental fatigue results of an aluminum bonding wire and subsequently transferred and applied to other wire types. The lifetime modeling of Wöhler curves for different load ratios shows good but limited applicability for the linear model. The model can only be applied above 10,000 cycles and within the investigated load range of R = 0.1 to R = 0.7. The nonlinear model shows very good agreement between model prediction and experimental results over the entire investigated cycle range. Furthermore, the predicted Smith diagram is not only consistent in the investigated load range but also in the extrapolated load range from R = −1.0 to R = 0.8. A transfer of both model approaches to other wire types by using their tensile strengths can be implemented as well, although the nonlinear model is more suitable since it covers the entire load and cycle range.
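The role of the load ratio R in such fatigue models can be sketched with a textbook stand-in: a Basquin curve combined with a Goodman mean-stress correction. This is not the paper's calibrated linear or nonlinear model, and all material constants below are invented for illustration:

```python
# Sketch: relating load ratio R = sigma_min / sigma_max to fatigue life via a
# Basquin curve with a Goodman mean-stress correction. A textbook stand-in for
# the paper's calibrated models; all material constants are illustrative.

SIGMA_F = 300.0   # fatigue strength coefficient [MPa] (assumption)
B_EXP = -0.1      # Basquin exponent (assumption)
SIGMA_U = 180.0   # tensile strength, used in the Goodman line (assumption)

def cycles_to_failure(sigma_max: float, r: float) -> float:
    """Predicted cycles to failure for maximum stress sigma_max and ratio R."""
    sigma_min = r * sigma_max
    sigma_a = (sigma_max - sigma_min) / 2          # stress amplitude
    sigma_m = (sigma_max + sigma_min) / 2          # mean stress
    # Goodman correction: equivalent fully reversed stress amplitude
    sigma_ar = sigma_a / (1 - sigma_m / SIGMA_U)
    # Basquin: sigma_ar = SIGMA_F * (2N)^b  ->  N = 0.5 * (sigma_ar/SIGMA_F)^(1/b)
    return 0.5 * (sigma_ar / SIGMA_F) ** (1 / B_EXP)

for r in (0.1, 0.4, 0.7):
    print(f"R = {r}: N ≈ {cycles_to_failure(60.0, r):.2e} cycles")
```

At a fixed maximum stress, raising R shrinks the amplitude and so raises the predicted life; transferring such a model between wire types via their tensile strengths amounts to swapping `SIGMA_U` (and refitting the other constants).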
Climate change is increasingly affecting vulnerable groups and resulting in dire social and economic consequences, especially for those in the Global South. Managing current and emerging climate-related risks will require increasing individuals’ and communities’ resilience, including enhancing absorptive, adaptive, and transformative capacities. Policymakers are now considering the role that social protection policies and programmes can play in building climate resilience by contributing to these capacities. However, there is a limited understanding of the extent to which social protection instruments can influence these three resilience-related capacities. Lack of assessment tools or frameworks might contribute to limited evidence of social protection’s ability to increase climate resilience. In particular, there appear to be no frameworks or tools that help assess the role of social cash transfers (SCT) in building adaptive capacity. Based on a multi-staged literature review, we develop an adaptive capacity outcomes framework (ACOF) that can help assess SCT’s contribution to building adaptive capacity, and, consequently, resilience. The framework is then tested using impact evaluation and assessment reports from SCT programmes in Indonesia, Zambia, Ethiopia, Bangladesh, and Tanzania. The exercise finds that SCTs alone have a limited contribution to adaptive capacity outcomes, but interventions that combine cash transfers with other components such as nutrition or livelihood training show positive impacts. We find that the ACOF can support assessments of SCT’s contribution towards adaptive capacity. It can help build evidence, evaluate impacts, and through further research, can facilitate learning on SCTs' role in increasing climate resilience.
Failure prognosis builds on continuous data acquisition, data processing, and fault diagnosis, and is an essential part of predictive maintenance in smart manufacturing systems, enabling condition-based maintenance, optimised use of plant equipment, improved uptime and yield, and the prevention of safety problems. Given known control inputs into a plant and real sensor outputs or simulated measurements, the model-based part of the proposed hybrid method provides numerical values of unknown parameter degradation functions at sampling time points by the evaluation of equations that have been derived offline from a bicausal diagnostic bond graph. These numerical values are computed concurrently to the constant monitoring of a system and are stored in a buffer of fixed length. The data-driven part of the method provides a sequence of remaining useful life estimates by repeated projection of the parameter degradation into the future based on the use of values in a sliding time window. Existing software can be used to determine the best fitting function and can account for its random parameters. The continuous parameter estimation and its projection into the future can be performed in parallel for multiple isolated simultaneous parametric faults on a multicore, multiprocessor computer.
The proposed hybrid bond graph model-based, data-driven method is verified by an offline simulation case study of a typical power electronic circuit. It can be used to implement embedded systems that enable cooperating machines in smart manufacturing to perform prognosis themselves.
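The data-driven projection step described above can be sketched as follows. A minimal sketch, assuming a linear degradation trend fitted by ordinary least squares over the sliding window (the method itself allows arbitrary fitting functions); the window values and failure threshold are invented for illustration:

```python
# Sketch of the data-driven step: degradation-parameter values collected in a
# sliding window are fitted (here: a linear least-squares trend, an assumption;
# the paper allows arbitrary fitting functions) and projected forward to a
# failure threshold to obtain a remaining-useful-life (RUL) estimate.

def fit_line(ts, ys):
    """Ordinary least squares for y = a + b*t; returns (a, b)."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    b = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
         / sum((t - mt) ** 2 for t in ts))
    return my - b * mt, b

def estimate_rul(ts, ys, threshold, now):
    """Project the fitted trend to the threshold; None if not degrading."""
    a, b = fit_line(ts, ys)
    if b <= 0:
        return None                      # parameter not drifting upward
    t_fail = (threshold - a) / b         # time at which trend hits threshold
    return max(t_fail - now, 0.0)

# Illustrative window: a parameter drifting upward, failure declared at 2.0
window_t = [0, 1, 2, 3, 4]
window_y = [1.00, 1.05, 1.11, 1.14, 1.21]
rul = estimate_rul(window_t, window_y, threshold=2.0, now=4)
print(f"estimated RUL: {rul:.1f} time units")
```

As the window slides forward, re-running `estimate_rul` yields the sequence of RUL estimates the abstract refers to, and each fault's window can be processed on its own core.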
In the design of robot skills, the focus generally lies on increasing the flexibility and reliability of the robot execution process; however, typical skill representations are not designed for analysing execution failures if they occur or for explicitly learning from failures. In this paper, we describe a learning-based hybrid representation for skill parameterisation called an execution model, which considers execution failures to be a natural part of the execution process. We then (i) demonstrate how execution contexts can be included in execution models, (ii) introduce a technique for generalising models between object categories by combining generalisation attempts performed by a robot with knowledge about object similarities represented in an ontology, and (iii) describe a procedure that uses an execution model for identifying a likely hypothesis of a parameterisation failure. The feasibility of the proposed methods is evaluated in multiple experiments performed with a physical robot in the context of handle grasping, object grasping, and object pulling. The experimental results suggest that execution models contribute towards avoiding execution failures, but also represent a first step towards more introspective robots that are able to analyse some of their execution failures in an explicit manner.
A Method for the Sustainable Documentation of Operations Processes in Parcel Distribution Centers
(2018)
There is often no common understanding of operational processes in logistics companies as they are not properly documented. Hence, people execute the same process differently and training is conducted by experienced operators on an ad-hoc basis. Furthermore, continuous process improvement is hampered as neither the ideal process nor current issues in as-is processes are visible. A major reason for the missing documentation is the complexity of existing business process modelling languages. Modelling experts are required for initially describing the processes and also for updating the models after process changes. Furthermore, operations people are usually not used to reading complex process models in EPCs or BPMN diagrams. In order to overcome these limitations, a domain-specific modelling language which facilitates maintaining up-to-date process models has been designed with a large logistics company in Germany. The paper at hand briefly describes this language and illustrates the method on how to apply it in operations environments.
Integrating physical simulation data into data ecosystems challenges the compatibility and interoperability of data management tools. Semantic web technologies and relational databases mostly use other data types, such as measurement or manufacturing design data. Standardizing simulation data storage and harmonizing the data structures with other domains is still a challenge, as current standards such as the ISO standard STEP (ISO 10303 ”Standard for the Exchange of Product model data”) fail to bridge the gap between design and simulation data. This challenge requires new methods, such as ontologies, to rethink simulation results integration. This research describes a new software architecture and application methodology based on the industrial standard ”Virtual Material Modelling in Manufacturing” (VMAP). The architecture integrates large quantities of structured simulation data and their analyses into a semantic data structure. It is capable of providing data permeability from the global digital twin level to the detailed numerical values of data entries and even new key indicators in a three-step approach: It represents a file as an instance in a knowledge graph, queries the file’s metadata, and finds a semantically represented process that enables new metadata to be created and instantiated.
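The three-step approach above (represent a file as an instance, query its metadata, instantiate a new key indicator) can be sketched with a minimal in-memory triple store. All class and property names below are invented for illustration and are not the actual VMAP or ontology vocabulary:

```python
# Minimal in-memory "knowledge graph" sketch of the three-step approach:
# (1) register a simulation result file as an instance, (2) query its
# metadata, (3) derive and instantiate a new key indicator. Class and
# property names are illustrative assumptions, not the VMAP vocabulary.

triples: set[tuple[str, str, str]] = set()

def add(s, p, o):
    triples.add((s, p, str(o)))

def query(s=None, p=None, o=None):
    """Simple triple-pattern match; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Step 1: the result file becomes an instance in the graph
add("sim:run-042", "rdf:type", "ex:SimulationResultFile")
add("sim:run-042", "ex:solver", "ex:FEM")
add("sim:run-042", "ex:maxStress_MPa", 412.5)

# Step 2: query the file's metadata
stress = float(query("sim:run-042", "ex:maxStress_MPa")[0][2])

# Step 3: compute a new key indicator and instantiate it in the graph
yield_strength = 355.0   # illustrative material value
add("sim:run-042", "ex:utilisationRatio", round(stress / yield_strength, 3))
print(query("sim:run-042", "ex:utilisationRatio"))
```

In a real deployment the dict-of-triples would be an RDF store queried via SPARQL, but the pattern — file as instance, metadata query, derived indicator — is the same.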
Recessive mutations in the MPV17 gene cause mitochondrial DNA depletion syndrome, a fatal infantile genetic liver disease in humans. Loss of function in mice leads to glomerulosclerosis and sensorineural deafness accompanied by mitochondrial DNA depletion. Mutations in the yeast homolog Sym1 and in the zebrafish homolog tra cause interesting, but not obviously related phenotypes, although the human gene can complement the yeast Sym1 mutation. The MPV17 protein is a hydrophobic membrane protein of 176 amino acids and unknown function. Initially localised in murine peroxisomes, it was later reported to be a mitochondrial inner membrane protein in humans and in yeast. To resolve this contradiction we tested two new mouse monoclonal antibodies directed against the human MPV17 protein in Western blots and immunohistochemistry on human U2OS cells. One of these monoclonal antibodies showed specific reactivity to a protein of 20 kD absent in MPV17 negative mouse cells. Immunofluorescence studies revealed colocalisation with peroxisomal, endosomal and lysosomal markers, but not with mitochondria. These data reveal a novel connection between a possible peroxisomal/endosomal/lysosomal function and mitochondrial DNA depletion.
With the increasing demand for ultrapure water in the pharmaceutical and semiconductor industry, the need for precise measuring instruments for those applications is also growing. One critical parameter of water quality is the amount of total organic carbon (TOC). This work presents a system that uses the advantage of the increased oxidation power achieved with the UV/O3 advanced oxidation process (AOP) for TOC measurement, combined with a significant miniaturization compared to the state of the art. The miniaturization is achieved by using polymer-electrolyte membrane (PEM) electrolysis cells for ozone generation in combination with UV-LEDs for irradiation of the measuring solution, as both components are significantly smaller than standard equipment. The measuring principle is conductivity measurement after oxidation, and measurements were carried out in the range between 10 and 1000 ppb TOC. The suitability of the system for TOC measurement is demonstrated using the oxidation, by ozonation combined with UV irradiation, of defined concentrations of isopropyl alcohol (IPA).
This paper presents a novel approach to address noise, vibration, and harshness (NVH) issues in electrically assisted bicycles (e-bikes) caused by the drive unit. By investigating and optimising the structural dynamics during early product development, NVH can decisively be improved and valuable resources can be saved, emphasising its significance for enhancing riding performance. The paper offers a comprehensive analysis of the e-bike drive unit’s mechanical interactions among relevant components, culminating—to the best of our knowledge—in the development of the first high-fidelity model of an entire e-bike drive unit. The proposed model uses the principles of elastic multibody dynamics (eMBD) to elucidate the structural dynamics in dynamic-transient calculations. Comparing power spectra between measured and simulated motion variables validates the chosen model assumptions. The measurements of physical samples utilise accelerometers, contactless laser Doppler vibrometry (LDV) and various test arrangements, which are replicated in simulations and provide accessibility to measure vibrations on rotating shafts and stationary structures. In summary, this integrated system-level approach can serve as a viable starting point for comprehending and managing the NVH behaviour of e-bikes.
In this paper, a gas-to-power (GtoP) system for power outages is digitally modeled and experimentally developed. The design includes a solid-state hydrogen storage system composed of TiFeMn as a hydride forming alloy (6.7 kg of alloy in five tanks) and an air-cooled fuel cell (maximum power: 1.6 kW). The hydrogen storage system is charged under room temperature and 40 bar of hydrogen pressure, reaching about 110 g of hydrogen capacity. In an emergency use case of the system, hydrogen is supplied to the fuel cell, and the waste heat coming from the exhaust air of the fuel cell is used for the endothermic dehydrogenation reaction of the metal hydride. This GtoP system demonstrates fast, stable, and reliable responses, providing from 149 W to 596 W under different constant as well as dynamic conditions. A comprehensive and novel simulation approach based on a network model is also applied. The developed model is validated under static and dynamic power load scenarios, demonstrating excellent agreement with the experimental results.
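The storage figures above imply a gravimetric capacity that is easy to check by hand. A worked sketch; the fuel-cell efficiency and the hydrogen lower heating value are illustrative assumptions, not values reported in the paper:

```python
# Back-of-the-envelope check of the storage figures reported above:
# about 110 g of hydrogen stored in 6.7 kg of TiFeMn alloy.

m_h2_g = 110.0          # stored hydrogen [g]
m_alloy_kg = 6.7        # hydride-forming alloy [kg]

capacity_wt = m_h2_g / (m_alloy_kg * 1000.0) * 100.0
print(f"gravimetric capacity: {capacity_wt:.2f} wt.%")

# Usable electric energy, assuming the H2 lower heating value of 33.3 kWh/kg
# and an illustrative 50 % fuel-cell efficiency (assumption, not measured)
LHV_KWH_PER_KG = 33.3
ETA_FC = 0.50
e_kwh = m_h2_g / 1000.0 * LHV_KWH_PER_KG * ETA_FC
print(f"usable energy: {e_kwh:.2f} kWh")
```

At the reported 149–596 W output levels, an energy budget of this order corresponds to several hours of backup operation, which is consistent with the emergency use case described.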
A qualitative study of Machine Learning practices and engineering challenges in Earth Observation
(2021)
Machine Learning (ML) is ubiquitously on the advance. Like many domains, Earth Observation (EO) also increasingly relies on ML applications, where ML methods are applied to process vast amounts of heterogeneous and continuous data streams to answer socially and environmentally relevant questions. However, developing such ML-based EO systems remains challenging: Development processes and employed workflows are often barely structured and poorly reported. The application of ML methods and techniques is considered to be opaque and the lack of transparency is contradictory to the responsible development of ML-based EO applications. To improve this situation a better understanding of the current practices and engineering-related challenges in developing ML-based EO applications is required. In this paper, we report observations from an exploratory study where five experts shared their view on ML engineering in semi-structured interviews. We analysed these interviews with coding techniques as often applied in the domain of empirical software engineering. The interviews provide informative insights into the practical development of ML applications and reveal several engineering challenges. In addition, interviewees participated in a novel workflow sketching task, which provided a tangible reflection of implicit processes. Overall, the results confirm a gap between theoretical conceptions and real practices in ML development even though workflows were sketched abstractly as textbook-like. The results pave the way for a large-scale investigation on requirements for ML engineering in EO.
This work presents the analysis of data recorded by an eye tracking device in the course of evaluating a foveated rendering approach for head-mounted displays (HMDs). Foveated rendering methods adapt the image synthesis process to the user’s gaze, exploiting the human visual system’s limitations to increase rendering performance. Foveated rendering has great potential especially when certain requirements have to be fulfilled, such as low-latency rendering to cope with high display refresh rates. This is crucial for virtual reality (VR), as a high level of immersion, which can only be achieved with high rendering performance and also helps to reduce nausea, is an important factor in this field. We put things in context by first providing basic information about our rendering system, followed by a description of the user study and the collected data. This data stems from fixation tasks that subjects had to perform while being shown fly-through sequences of virtual scenes on an HMD. These fixation tasks consisted of a combination of various scenes and fixation modes. Besides static fixation targets, moving targets on randomized paths as well as a free focus mode were tested. Using this data, we estimate the precision of the utilized eye tracker and analyze the participants’ accuracy in focusing the displayed fixation targets. Here, we also take a look at eccentricity-dependent quality ratings. Comparing this information with the users’ quality ratings given for the displayed sequences then reveals an interesting connection between fixation modes, fixation accuracy and quality ratings.
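A common way to estimate eye-tracker precision, as mentioned above, is the root-mean-square of the angular distances between successive gaze samples during a fixation. A minimal 1D sketch (real gaze data is two-dimensional, and the sample values below are invented for illustration):

```python
import math

# Sketch of a common eye-tracker precision estimate: the RMS of the angular
# distances between successive gaze samples recorded during a fixation.
# This is a 1D simplification; the sample values are invented for illustration.

def rms_precision(angles_deg):
    """RMS of inter-sample angular distances, in degrees of visual angle."""
    diffs = [b - a for a, b in zip(angles_deg, angles_deg[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# gaze eccentricity samples (degrees) during a supposed fixation
samples = [10.00, 10.02, 9.98, 10.01, 10.03, 9.99]
print(f"precision: {rms_precision(samples):.3f}° RMS")
```

Accuracy, in contrast, would be the offset between the mean gaze position and the known target position; the study compares both of these against per-sequence quality ratings.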
Jet engines of airplanes are designed such that damage occurs and accumulates in some components during service without becoming critical up to a certain damage level. Since maintenance, repair and component exchange are very cost-intensive, it is necessary to predict the component lifetime efficiently and with high accuracy. A previously developed lifetime model, based on interpolated results of aerodynamic and structural mechanics simulations, uses material parameters estimated from literature values of standard creep experiments. For improved accuracy, an experimental procedure is developed for the characterization of the short-time creep behavior relevant for the operation of turbine blades in jet engines. To account for microstructural influences resulting from the manufacturing of thin-walled single-crystal turbine blades, small-scale specimens are extracted from used turbine blades and tested in short- and medium-time creep experiments. Based on the experimental results and literature values, a creep model that describes the fracture behavior for a wide range of creep loads is calibrated and is now used for the lifetime prediction of turbine blades under real loading conditions.
Earth’s nearest candidate supermassive black hole lies at the centre of the Milky Way1. Its electromagnetic emission is thought to be powered by radiatively inefficient accretion of gas from its environment2, which is a standard mode of energy supply for most galactic nuclei. X-ray measurements have already resolved a tenuous hot gas component from which the black hole can be fed3. The magnetization of the gas, however, which is a crucial parameter determining the structure of the accretion flow, remains unknown. Strong magnetic fields can influence the dynamics of accretion, remove angular momentum from the infalling gas4, expel matter through relativistic jets5 and lead to synchrotron emission such as that previously observed6, 7, 8. Here we report multi-frequency radio measurements of a newly discovered pulsar close to the Galactic Centre9, 10, 11, 12 and show that the pulsar’s unusually large Faraday rotation (the rotation of the plane of polarization of the emission in the presence of an external magnetic field) indicates that there is a dynamically important magnetic field near the black hole. If this field is accreted down to the event horizon it provides enough magnetic flux to explain the observed emission—from radio to X-ray wavelengths—from the black hole.
Pozzolanic properties of Pennisetum purpureum grass ash were tested with Portland cement. Results show that the ash can be blended with cement without compromising the binding strength of the cement. It was found that Portland cement could be blended with Pennisetum purpureum ash up to a ratio of 3:2 without compromising the compressive strength of the mortar. Mortar with lower cement replacement took longer to set, as evidenced by lower compressive strength within the 28-day aging time. Mortar with higher cement replacement had lower water absorption capacity, an indication that the test pozzolan was of smaller particulate size. XRF analysis and the FTIR spectrum showed that the ash has a high silica content. The XRD pattern showed that the ash was predominantly amorphous. SEM images showed that the ash produced at 600 °C contained residual carbon material.
Host-derived succinate accumulates in the airways during bacterial infection. Here, we show that luminal succinate activates murine tracheal brush (tuft) cells through a signaling cascade involving the succinate receptor 1 (SUCNR1), phospholipase Cβ2, and the cation channel transient receptor potential channel subfamily M member 5 (TRPM5). Stimulated brush cells then trigger a long-range Ca2+ wave spreading radially over the tracheal epithelium through a sequential signaling process. First, brush cells release acetylcholine, which excites nearby cells via muscarinic acetylcholine receptors. From there, the Ca2+ wave propagates through gap junction signaling, reaching also distant ciliated and secretory cells. These effector cells translate activation into enhanced ciliary activity and Cl- secretion, which are synergistic in boosting mucociliary clearance, the major innate defense mechanism of the airways. Our data establish tracheal brush cells as a central hub in triggering a global epithelial defense program in response to a danger-associated metabolite.
The development of advanced robotic systems is challenging, as expertise from multiple domains needs to be integrated conceptually and technically. Model-driven engineering promises an efficient and flexible approach for developing robotics applications that copes with this challenge. Domain-specific modeling allows robotics concerns to be described with concepts and notations closer to the respective problem domain. This raises the level of abstraction and results in models that are easier to understand and validate. Furthermore, model-driven engineering makes it possible to increase the level of automation, e.g. through code generation, and to bridge the gap between modeling and implementation. The anticipated results are improved efficiency and quality of the robotics systems engineering process. Within this contribution, we survey the available literature on domain-specific modeling and languages that target core robotics concerns. In total, 137 publications were identified that comply with a set of defined criteria, which we consider essential for contributions in this field. With the presented survey, we provide an overview of the state of the art of domain-specific modeling approaches in robotics. The surveyed publications are investigated from the perspective of users and developers of model-based approaches in robotics along a set of quantitative and qualitative research questions. The presented quantitative analysis clearly indicates the rising popularity of applying domain-specific modeling approaches to robotics in the academic community. Beyond this statistical analysis, we map the selected publications to a defined set of robotics subdomains and typical development phases in robotic systems engineering as a reference for potential users.
Furthermore, we analyze these contributions from a language engineering viewpoint and discuss aspects such as the methods and tools used for their implementation as well as their documentation status, platform integration, typical use cases and the evaluation strategies used for validation of the proposed approaches. Finally, we conclude with recommendations for discussion in the model-driven engineering and robotics community based on the insights gained in this survey.
Background
Consumers rely heavily on online user reviews when shopping online, and cybercriminals produce fake reviews to manipulate consumer opinion. Much prior research focuses on the automated detection of these fake reviews, but automated detectors are far from perfect. Therefore, consumers must be able to detect fake reviews on their own. In this study, we survey the research examining how consumers detect fake reviews online.
Methods
We conducted a systematic literature review of the research on fake review detection from the consumer perspective. We included academic literature presenting new empirical data. We provide a narrative synthesis comparing the theories, methods and outcomes across studies to identify how consumers detect fake reviews online.
Results
We found only 15 articles that met our inclusion criteria. We classify the most frequently identified cues into five categories: (1) review characteristics, (2) textual characteristics, (3) reviewer characteristics, (4) seller characteristics and (5) characteristics of the platform where the review is displayed.
Discussion
We find that theory is applied inconsistently across studies and that cues to deception are often identified in isolation, without any unifying theoretical framework. Consequently, we discuss how such a theoretical framework could be developed.
Updating a shared data structure in a parallel program is usually done with some sort of high-level synchronization operation to ensure correctness and consistency. Such high-level synchronization operations are realized with appropriate low-level atomic synchronization instructions provided by the target processor architecture. These instructions are costly and often limited in their scalability on larger multi-core / multi-processor systems. In this paper, a technique is discussed that replaces atomic updates of a shared data structure with ordinary and cheaper read/write operations. The necessary conditions are specified that must be fulfilled to ensure overall correctness of the program despite the missing synchronization. The advantage of this technique is the reduction of access costs as well as better scalability due to elided atomic operations. On the other hand, the missing synchronization may cause additional work. Additional work is therefore traded against costly atomic operations. A practical application is shown with level-synchronous parallel Breadth-First Search on an undirected graph, where two vertex frontiers are accessed in parallel. This application scenario is also used for an evaluation of the technique. Tests were done on four different large parallel systems with up to 64-way parallelism. It is shown that for the graph application examined, the amount of additional work caused by the missing synchronization is negligible and the performance is almost always better than that of the approach with atomic operations.
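The core idea — tolerating duplicate frontier entries because the level write is idempotent — can be illustrated with a small sequential sketch (plain Python, not the paper's actual parallel implementation; the "snapshot" stands in for unsynchronized concurrent reads):

```python
from collections import deque

def bfs_reference(adj, src):
    """Standard BFS distances, for comparison."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def bfs_relaxed(adj, src):
    """Level-synchronous BFS where the next frontier is built without
    any test-and-set: two parents that both see v as unvisited may both
    enqueue it, but the level write is idempotent, so duplicates only
    cause redundant work, never wrong results."""
    level = {src: 0}
    frontier = [src]
    depth = 0
    while frontier:
        depth += 1
        # Reads go against a snapshot of `level`, mimicking
        # unsynchronized reads by concurrent workers.
        snapshot = set(level)
        next_frontier = [v for u in frontier for v in adj[u]
                         if v not in snapshot]   # may contain duplicates
        for v in next_frontier:
            level.setdefault(v, depth)           # plain, idempotent write
        # Duplicates survive into the next frontier and are rescanned,
        # trading a little extra work for the elided atomics.
        frontier = [v for v in next_frontier if level[v] == depth]
    return level
```

On a diamond-shaped graph, vertex 3 is enqueued twice (once via each parent), yet the computed levels match the reference BFS exactly.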
We present a system that combines voxel and polygonal representations in a single octree acceleration structure that can be used for ray tracing. Voxels are well suited to creating good levels of detail for high-frequency models, where polygonal simplification usually fails due to the complex structure of the model. Polygonal descriptions, however, provide higher visual fidelity. In addition, voxel representations often oversample the geometric domain, especially for large triangles, whereas a few polygons can be tested for intersection more quickly.
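The underlying trade-off can be sketched as a simple level-of-detail decision (hypothetical names and thresholds, not the paper's actual implementation): a node is intersected against its voxel proxy when its projected screen-space footprint drops to roughly a pixel, and against its triangles otherwise.

```python
import math

def projected_size(node_extent, distance, fov_rad, screen_height_px):
    """Approximate screen-space footprint (in pixels) of an octree node
    of the given world-space extent at the given viewing distance."""
    if distance <= 0.0:
        return float("inf")
    # Height of the view frustum at this distance.
    frustum_height = 2.0 * distance * math.tan(fov_rad / 2.0)
    return node_extent / frustum_height * screen_height_px

def choose_representation(node_extent, distance,
                          fov_rad=math.radians(60),
                          screen_height_px=1080,
                          pixel_threshold=1.0):
    """Pick the cheaper representation: the voxel proxy when the node
    projects to (sub-)pixel size, the exact triangles otherwise."""
    size = projected_size(node_extent, distance, fov_rad, screen_height_px)
    return "voxel" if size <= pixel_threshold else "triangles"
```

A tiny node far from the camera resolves to the voxel proxy, while a large nearby node is still intersected exactly.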
Abschlussarbeiten FAQ/FGA
(2006)
Classical ballet requires dancers to exercise significant muscle control and strength both while stationary and when moving. Following the Royal Academy of Dance (RAD) syllabus, 8 male and 27 female dancers (aged 20.2 ± 1.9 yr) in a full-time university undergraduate dance training program were asked to stand in first position for 10 seconds and then perform 10 repeats of a demi-plié exercise to a counted rhythm. Accelerometer records from the wrist, sacrum, knee and ankle were compared with the numerical scores from a professional dance instructor. The sacrum-mounted sensor detected lateral tilts of the torso in dancers with lower scores (Spearman's rank correlation coefficient r = -0.64, p < 0.005). The RMS acceleration amplitude of the wrist-mounted sensor was linearly correlated with the movement scores (Spearman's rank correlation coefficient r = 0.63, p < 0.005). The application of sacrum- and wrist-mounted sensors for biofeedback during dance training is a realistic, low-cost option.
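The two statistics used in this analysis are straightforward to reproduce; the following is a minimal plain-Python sketch (not the study's actual pipeline) of the RMS acceleration amplitude and Spearman's rank correlation:

```python
def rms(signal):
    """Root-mean-square amplitude of an accelerometer trace."""
    return (sum(x * x for x in signal) / len(signal)) ** 0.5

def ranks(values):
    """1-based average ranks; tied values share the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = mean_rank
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

In practice a library routine such as `scipy.stats.spearmanr` would be used; the hand-rolled version above only illustrates the computation.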
This article deals with the accessibility of business process modelling tools (BPMo tools) and business process modelling languages (BPMo languages). First, the reader is introduced to business process management and the authors' motivation behind this inquiry. Afterwards, the paper reflects on the problems that arise when applying inaccessible BPMo tools. To illustrate these problems, the authors distinguish between two categories of issues and provide practical examples. Finally, the article presents three approaches to improving the accessibility of BPMo tools and BPMo languages.
Herein we report an update to ACPYPE, a Python 3 tool that now properly converts AMBER to GROMACS topologies for force fields that use nondefault and nonuniform 1–4 electrostatic and nonbonded scaling factors or negative dihedral force constants. Prior to this work, ACPYPE only converted AMBER topologies that used uniform, default 1–4 scaling factors and positive dihedral force constants. We demonstrate that the updated ACPYPE accurately transfers the GLYCAM06 force field, which employs nonuniform 1–4 scaling factors as well as negative dihedral force constants, from AMBER to GROMACS topology files. Validation was performed using β-d-GlcNAc through gas-phase analysis of dihedral energy curves and probability density functions. The updated ACPYPE retains all of its original functionality but now allows the simulation of complex glycomolecular systems in GROMACS using AMBER-originated force fields. ACPYPE is available for download at https://github.com/alanwilter/acpype.
At present, data publication is one of the most dynamic topics in e-Research. While the fundamental problems of electronic text publication have been solved in the past decade, standards for the external and internal organisation of data repositories are advanced in some research disciplines but underdeveloped in others. We discuss the differences between an electronic text publication and a data publication and the challenges that result from these differences for the data publication process. We place the data publication process in the context of the human knowledge spiral and discuss key factors for the successful acquisition of research data from the point of view of a data repository. For the relevant activities of the publication process, we list some of the measures and best practices of successful data repositories.
The introduction of new steering concepts such as Steer-by-Wire (SBW) makes it possible to replace the conventional steering wheel with an alternative user interface such as a sidestick. In an SBW system, the sidestick can be used as the user input element instead of a steering wheel. Implementing a sidestick in the human–machine interface (HMI) allows the conventional controls, consisting of a steering wheel, an accelerator and a brake pedal, to be combined into a single element. The sidestick also creates new, interesting and flexible design options which can be used to transform the driver's spatial environment. This article describes an active sidestick for a vehicle which has been developed, integrated and tested in accordance with haptic, ergonomic and safety-relevant requirements. The control strategies used for the active actuators of the sidestick have been investigated and optimised using a Simulink model.
Low power dissipation is a current topic in digital design and should therefore be covered in a state-of-the-art electrical engineering curriculum. This paper describes how low-power design can be addressed within a digital design course. This is beneficial for both topics: low-power design is not taught detached from the systems perspective, and the digital design course is enriched by references to current challenges and applications. The presented course should thus serve as an example of how a course can be developed that also teaches students sustainable engineering.
Introduction: Based on reports and other documents created by different parts of the International Labour Organisation (ILO), the paper analyses the process which led to the adoption of Social Protection Floor Recommendation No. 202 and the shift in focus of social policy advice towards basic protection and towards the countries of the Global South. We look at the actions of the different actors which shape the standard setting and policy stand of the organisation. Objective: To provide a comprehensive analysis of the historical trajectory of ILO social security standards, examining the evolution of principles, conventions, and the global dynamics that have shaped the organization's approach to social protection over time. Materials and methods: The methods include examining ILO documents, relevant subject literature, and the author's participant observations from over twenty years of service in the ILO's Social Security Department, aiming to provide insights into the decision-making processes within the organization. Conclusion: We conclude that change was brought about by: 1) a shift in the membership of the ILO and of its decision-making bodies towards an increased presence and power of representatives from countries of the Global South, 2) a shift in the policy priorities of the global development community towards poverty reduction, and 3) the emergence of experimental social assistance schemes in Global South countries, with designs often ignoring the principles embedded in the ILO standards. The Social Protection Floor Recommendation complements previous standards in response to the challenges of widespread poverty and informality and spreading atypical forms of employment. It provides two directions of policy responses: 1) formalizing informal employment relationships and 2) expanding universal or targeted rights-based social assistance schemes.
Assistance provided by the ILO to member states now focuses more on building non-contributory schemes and on identifying the fiscal space necessary to close the coverage gaps. Nowadays, the ILO must collaborate more than before with other development partners, and the main challenge is to build among them awareness and acceptance of the principles of the ILO social security standards.
The steadily decreasing prices of display technologies and computer graphics hardware contribute to the increasing popularity of multiple-display environments, such as large, high-resolution displays. It is therefore necessary that educational organizations give the new generation of computer scientists an opportunity to become familiar with this kind of technology. However, there is a lack of tools that make it easy to get started. Existing frameworks and libraries that support multi-display rendering are often complex to understand, configure and extend. This is critical especially in an educational context, where the time students have for their projects is limited and quite short. These tools are also known and used mainly in research communities, thus providing less benefit for future non-scientists. In this work we present an extension for the Unity game engine. The extension allows, with a small overhead, for the implementation of applications that can run on both single-display and multi-display systems. It takes care of the most common issues in the context of distributed and multi-display rendering, such as frame, camera and animation synchronization, thus reducing and simplifying the first steps into the topic. In conjunction with Unity, which significantly simplifies the creation of different kinds of virtual environments, the extension enables students to build mock-up virtual reality applications for large, high-resolution displays, and to implement and evaluate new interaction techniques, metaphors and visualization concepts. Unity itself, in our experience, is very popular among computer graphics students and therefore familiar to most of them. It is also often employed in projects of both research institutions and commercial organizations, so learning it provides students with a qualification in high demand.
Adoption of Modern Maize Varieties in India: Insights Based on Expert Elicitation Methodology
(2018)
Advanced thermal gradient mechanical fatigue testing of CMSX-4 with an oxidation protection coating
(2008)
This paper presents recent research on an active multispectral scanning sensor capable of classifying an object's surface material in order to distinguish between different kinds of materials and human skin. The sensor itself has already been presented in previous work and can be used in conjunction with safeguarding equipment at manually-fed machines or robot workplaces, for example. This work shows how an extended sensor system with advanced material classifiers can be used to provide additional value by distinguishing different materials of work pieces in order to suggest different tools or parameters for the machine (e.g. the use of a different saw blade or rotation speed at table saws). Additionally, a first implementation and evaluation of an active multispectral camera system addressing new safety applications is described. Both approaches intend to increase the productivity and the user's acceptance of the sensor technology.
The clear-sky radiative effect of aerosol–radiation interactions is of relevance for our understanding of the climate system. The influence of aerosol on the surface energy budget is of high interest for the renewable energy sector. In this study, the radiative effect is investigated in particular with respect to seasonal and regional variations for the region of Germany and the year 2015 at the surface and top of atmosphere using two complementary approaches.
First, an ensemble of clear-sky models which explicitly consider aerosols is utilized to retrieve the aerosol optical depth and the surface direct radiative effect of aerosols by means of a clear-sky fitting technique. For this, short-wave broadband irradiance measurements in the absence of clouds are used as a basis. A clear-sky detection algorithm is used to identify cloud-free observations. Considered are measurements of the short-wave broadband global and diffuse horizontal irradiance with shaded and unshaded pyranometers at 25 stations across Germany within the observational network of the German Weather Service (DWD). The clear-sky models used are the Modified MAC model (MMAC), the Meteorological Radiation Model (MRM) v6.1, the Meteorological–Statistical solar radiation model (METSTAT), the European Solar Radiation Atlas (ESRA), Heliosat-1, the Center for Environment and Man solar radiation model (CEM), and the simplified Solis model. The definitions of the aerosol and atmospheric characteristics of the models are examined in detail for their suitability for this approach.
Second, the radiative effect is estimated using explicit radiative transfer simulations with inputs on the meteorological state of the atmosphere, trace gases and aerosol from the Copernicus Atmosphere Monitoring Service (CAMS) reanalysis. The aerosol optical properties (aerosol optical depth, Ångström exponent, single scattering albedo and asymmetry parameter) are first evaluated with AERONET direct sun and inversion products. The largest inconsistency is found for the aerosol absorption, which is overestimated by about 0.03 or about 30 % by the CAMS reanalysis. Compared to the DWD observational network, the simulated global, direct and diffuse irradiances show reasonable agreement within the measurement uncertainty. The radiative kernel method is used to estimate the resulting uncertainty and bias of the simulated direct radiative effect. The uncertainty is estimated at −1.5 ± 7.7 and 0.6 ± 3.5 W m−2 at the surface and top of atmosphere, respectively, while the annual-mean biases at the surface, top of atmosphere and total atmosphere are −10.6, −6.5 and 4.1 W m−2, respectively.
The retrieval of the aerosol radiative effect with the clear-sky models shows a high level of agreement with the radiative transfer simulations, with an RMSE of 5.8 W m−2 and a correlation of 0.75. The annual mean of the radiative effect of aerosol–radiation interactions (REari) at the surface for the 25 DWD stations shows a value of −12.8 ± 5 W m−2 as the average over the clear-sky models, compared to −11 W m−2 from the radiative transfer simulations. Since all models assume a fixed aerosol characterization, the annual cycle of the aerosol radiative effect cannot be reproduced. Out of this set of clear-sky models, the largest level of agreement is shown by the ESRA and MRM v6.1 models.
Agiles IT-Controlling
(2022)
While agile methods have found acceptance in IT project management practice for many years, IT controlling still predominantly relies on classical methods. This article examines whether and how the methods used in IT controlling can also follow agile paradigms and how methods from agile IT project management can be adapted.
For many years, the transition from school to university has been one of the central topics of didactic theory, empirical research and educational policy debate. One major problem identified for many students is that, with the Abitur, "a phase of life with mostly clearly defined goals in manageable spatial, family and school structures comes to an end".1) Students who decide against a non-academic career and take up university studies encounter study structures and conditions that may seem foreign and chaotic to them. The path to university opens up a range of options for the individual, but unfortunately it is also always fraught with risks and uncertainties. Decisions must now be prepared and made independently, in an environment that can differ greatly from the familiar school structure.
The following work presents algorithms for the semi-automatic validation, feature extraction and ranking of time-series measurements acquired from MOX gas sensors. Semi-automatic measurement validation is accomplished by extending established curve similarity algorithms with a slope-based signature calculation. Furthermore, a feature-based ranking metric is introduced. It allows each feature to be prioritized individually and can be used to find the best-performing sensors with regard to multiple research questions. Finally, the functionality of the algorithms, as well as the developed software suite, is demonstrated with an exemplary scenario, illustrating how to find the most power-efficient MOX gas sensor in a data set collected during an extensive screening consisting of 16,320 measurements, all taken with different sensors at various temperatures and analytes.
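A feature-based ranking metric of this kind can be sketched as a weighted sum of normalized features (the feature names, weights and the "lower is better" set below are hypothetical illustrations, not the paper's actual features):

```python
def normalize(values, invert=False):
    """Min-max normalize to [0, 1]; invert for 'lower is better' features."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return [1.0 - s for s in scaled] if invert else scaled

def rank_sensors(sensors, weights, invert=frozenset({"power_mW"})):
    """Score each sensor as a weighted sum of its normalized features and
    return (name, score) pairs, best first. The weights encode the
    per-feature priorities for a given research question."""
    names = list(sensors)
    features = {f: [sensors[n][f] for n in names] for f in weights}
    norm = {f: normalize(v, invert=f in invert) for f, v in features.items()}
    scores = {n: sum(weights[f] * norm[f][i] for f in weights)
              for i, n in enumerate(names)}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

With a weight profile that prioritizes low power consumption, the most power-efficient sensor rises to the top of the ranking even if a competitor is slightly more sensitive.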
Alleine Bowling spielen?
(2021)
Perhaps the social distancing of these days is not social at all but simply physical, a body distancing. Today we can be social without meeting in person. We can even make music or sing in a choir "together alone". Nevertheless, something is lost, and in the end we may come to appreciate gatherings in clubs and associations of whatever kind all the more.
Allgemeines Steuerrecht
(2015)
Many workers experience their jobs as effortful or even stressful, which can result in strain. Although recovery from work would be an adaptive strategy to prevent the adverse effects of work-related strain, many workers face problems finding enough time to rest and to mentally disconnect from work during nonwork time. What goes on in workers’ minds after a stressful workday? What is it about their jobs that makes them think about their work? This special issue aims to bridge the gap between research on recovery processes mainly examined in Occupational Health Psychology, and research on work stress and working hours, often investigated in the field of Human Resource Management. We first summarize conceptual and theoretical streams from both fields of research. In the following, we discuss the contributions of the five special issue papers and conclude with key messages and directions for further research.
Amino acids perform multiple essential physiological roles in humans, and accordingly, their importance to health has been the subject of extensive attention. In this special issue of the Journal of Nutrition and Metabolism, we focus on the various inborn errors of amino acid metabolism, their diagnostic challenges, new treatment approaches, and recent advances in patient monitoring as well as clinical outcomes.