H-BRS Bibliography
Professor Dr. Dietmar Fink, holder of the Chair of Management Consulting at Hochschule Bonn-Rhein-Sieg and Managing Director of the Wissenschaftliche Gesellschaft für Management und Beratung (WGMB) in Bonn, on the added value of consulting rankings and the purpose of consulting projects at insurers.
50 Jahre: Von der FH zur HAW
(2022)
The aim of the eighth edition of the scientific workshop "Usable Security and Privacy" at Mensch und Computer 2022 is to present current contributions from research and practice and then to discuss them with the participants. The workshop is intended to continue and develop an established forum in which experts from different fields, e.g. usability and security engineering, can exchange ideas across disciplines.
Safety-critical applications such as autonomous driving use Deep Neural Networks (DNNs) for object detection and segmentation. DNNs fail to predict reliably when they observe an Out-of-Distribution (OOD) input, which can lead to catastrophic consequences. Existing OOD detection methods have been studied extensively for image inputs but have not been explored much for LiDAR inputs. In this study, we therefore proposed two datasets for benchmarking OOD detection in 3D semantic segmentation. We used Maximum Softmax Probability and Entropy scores generated by Deep Ensembles and Flipout versions of RandLA-Net as OOD scores. We observed that Deep Ensembles outperform the Flipout model in OOD detection, with greater AUROC scores for both datasets.
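The two OOD scores named in the abstract can be sketched as follows. This is an illustrative sketch only, assuming per-point softmax outputs collected from ensemble members (or Flipout samples); the function and array names are hypothetical, not taken from the paper's code:

```python
import numpy as np

def ood_scores(member_probs):
    """Compute per-point OOD scores from ensemble softmax outputs.

    member_probs: array of shape (n_members, n_points, n_classes),
    each slice a softmax distribution over semantic classes.
    """
    # Average the member predictions (Deep Ensembles / Flipout samples).
    mean_probs = member_probs.mean(axis=0)            # (n_points, n_classes)

    # Maximum Softmax Probability score: low confidence -> likely OOD.
    msp = 1.0 - mean_probs.max(axis=1)

    # Predictive entropy score: high entropy -> likely OOD.
    entropy = -(mean_probs * np.log(mean_probs + 1e-12)).sum(axis=1)
    return msp, entropy
```

Either score can then be thresholded (or fed into an AUROC computation against in-distribution/OOD labels) to evaluate detection quality.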
Background: Cancer heterogeneity poses a serious challenge concerning the toxicity and adverse effects of therapeutic inhibitors, especially when it comes to combinatorial therapies that involve multiple targeted inhibitors. In particular, in non-small cell lung cancer (NSCLC), a number of studies have reported synergistic effects of drug combinations in preclinical models, while they were only partially successful in the clinical setting, suggesting that alternative clinical strategies (taking genetic background and immune response into account) should be considered. Herein, we investigated the antitumor effect of cytokine-induced killer (CIK) cells in combination with ALK and PD-1 inhibitors in vitro on genetically variable NSCLC cell lines.
Methods: We co-cultured the three genetically different NSCLC cell lines NCI-H2228 (EML4-ALK), A549 (KRAS mutation), and HCC-78 (ROS1 rearrangement) with and without nivolumab (PD-1 inhibitor) and crizotinib (ALK inhibitor). Additionally, we profiled the variability of surface expression of multiple immune checkpoints, the concentration of dead cells, and intracellular granzyme B on CIK cells using flow cytometry as well as RT-qPCR. ELISA and Western blot were performed to verify the activation of CIK cells.
Results: Our analysis showed that (a) nivolumab significantly weakened PD-1 surface expression on CIK cells without impacting other immune checkpoints or PD-1 mRNA expression, (b) this combination strategy showed an effective response in terms of cell viability, IFN-γ production, and intracellular release of granzyme B in CD3+CD56+ CIK cells, but solely in NCI-H2228, (c) the intrinsic expression of Fas ligand (FasL) as a T-cell activation marker in CIK cells was upregulated by this additive effect, and (d) nivolumab significantly increased Foxp3 expression in the CD4+CD25+ subpopulation of CIK cells. Taken together, we could show that CIK cells in combination with crizotinib and nivolumab can enhance the antitumor immune response through FasL activation, leading to increased IFN-γ and granzyme B, but only in NCI-H2228 cells with EML4-ALK rearrangement. We therefore hypothesize that CIK therapy may be a potential alternative for NSCLC patients harboring EML4-ALK rearrangement; in addition, we support the idea that combination therapies offer significant potential when they are optimized on a patient-by-patient basis.
With the increasing demand for ultrapure water in the pharmaceutical and semiconductor industries, the need for precise measuring instruments for these applications is also growing. One critical parameter of water quality is the amount of total organic carbon (TOC). This work presents a system that exploits the increased oxidation power of the UV/O3 advanced oxidation process (AOP) for TOC measurement, combined with a significant miniaturization compared to the state of the art. The miniaturization is achieved by using polymer-electrolyte membrane (PEM) electrolysis cells for ozone generation in combination with UV-LEDs for irradiation of the measuring solution, as both components are significantly smaller than standard equipment. Conductivity measurement after oxidation is the measuring principle, and measurements were carried out in the range between 10 and 1000 ppb TOC. The suitability of the system for TOC measurement is demonstrated by ozonation combined with UV irradiation of defined concentrations of isopropyl alcohol (IPA).
In this paper, a gas-to-power (GtoP) system for power outages is digitally modeled and experimentally developed. The design includes a solid-state hydrogen storage system composed of TiFeMn as a hydride forming alloy (6.7 kg of alloy in five tanks) and an air-cooled fuel cell (maximum power: 1.6 kW). The hydrogen storage system is charged under room temperature and 40 bar of hydrogen pressure, reaching about 110 g of hydrogen capacity. In an emergency use case of the system, hydrogen is supplied to the fuel cell, and the waste heat coming from the exhaust air of the fuel cell is used for the endothermic dehydrogenation reaction of the metal hydride. This GtoP system demonstrates fast, stable, and reliable responses, providing from 149 W to 596 W under different constant as well as dynamic conditions. A comprehensive and novel simulation approach based on a network model is also applied. The developed model is validated under static and dynamic power load scenarios, demonstrating excellent agreement with the experimental results.
Jet engines of airplanes are designed such that, in some components, damage occurs and accumulates in service without being critical up to a certain level. Since maintenance, repair, and component exchange are very cost-intensive, it is necessary to predict the component lifetime efficiently and with high accuracy. A previously developed lifetime model, based on interpolated results of aerodynamic and structural mechanics simulations, uses material parameters estimated from literature values of standard creep experiments. For improved accuracy, an experimental procedure is developed for the characterization of the short-time creep behavior, which is relevant for the operation of turbine blades of jet engines. To account for microstructural influences resulting from the manufacturing of thin-walled single-crystal turbine blades, small-scale specimens from used turbine blades are extracted and tested in short- and medium-time creep experiments. Based on experimental results and literature values, a creep model that describes the fracture behavior for a wide range of creep loads is calibrated and is now used for the lifetime prediction of turbine blades under real loading conditions.
For research in audiovisual interview archives, it is often of interest not only what is said but also how. Sentiment analysis and emotion recognition can help capture and categorize these different facets and make them searchable. In particular for oral history archives, such indexing technologies can be of great interest, as they can help in understanding the role of emotions in historical remembering. However, humans often perceive sentiments and emotions ambiguously and subjectively. Moreover, oral history interviews have multi-layered levels of complex, sometimes contradictory, sometimes very subtle facets of emotion. Therefore, the question arises of what chance machines and humans have of capturing these facets and assigning them to predefined categories. This paper investigates the ambiguity in human perception of emotions and sentiment in German oral history interviews and its impact on machine learning systems. Our experiments reveal substantial differences in human perception for different emotions. Furthermore, we report on ongoing machine learning experiments with different modalities. We show that human perceptual ambiguity and other challenges, such as class imbalance and lack of training data, currently limit the opportunities of these technologies for oral history archives. Nonetheless, our work uncovers promising observations and possibilities for further research.
While the corporate working world is shifting more and more toward agility, IT controlling still remains stuck in old, classical structures. This work examines whether, and to what extent, agile approaches can be applied in IT controlling. This contribution is a modified version of the article "Agiles IT-Controlling" published in the journal "HMD Praxis der Wirtschaftsinformatik" (https://link.springer.com/article/10.1365/s40702-022-00837-0).
Agiles IT-Controlling
(2022)
While agile methods have found favor in IT project management practice for many years, IT controlling still predominantly relies on classical methods. This contribution examines whether and how the methods used in IT controlling can also follow agile paradigms, and how methods from agile IT project management can be adapted.
The following work presents algorithms for semi-automatic validation, feature extraction and ranking of time series measurements acquired from MOX gas sensors. Semi-automatic measurement validation is accomplished by extending established curve similarity algorithms with a slope-based signature calculation. Furthermore, a feature-based ranking metric is introduced. It allows for individual prioritization of each feature and can be used to find the best performing sensors regarding multiple research questions. Finally, the functionality of the algorithms, as well as the developed software suite, are demonstrated with an exemplary scenario, illustrating how to find the most power-efficient MOX gas sensor in a data set collected during an extensive screening consisting of 16,320 measurements, all taken with different sensors at various temperatures and analytes.
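The feature-based ranking metric with individual feature prioritization described above might look roughly like the following. This is a minimal sketch under stated assumptions: features are min-max normalized and combined by user-chosen weights; the function name and weighting scheme are illustrative, not the authors' implementation:

```python
import numpy as np

def rank_sensors(feature_matrix, weights):
    """Rank sensors by a weighted sum of min-max-normalized features.

    feature_matrix: (n_sensors, n_features) extracted feature values.
    weights: (n_features,) priorities; a positive weight means
    "higher is better", a negative weight means "lower is better"
    (e.g. power consumption).
    """
    f = np.asarray(feature_matrix, dtype=float)
    lo, hi = f.min(axis=0), f.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)   # avoid division by zero
    norm = (f - lo) / span                   # each feature now in [0, 1]
    scores = norm @ np.asarray(weights, dtype=float)
    order = np.argsort(scores)[::-1]         # best-scoring sensor first
    return order, scores
```

With a negative weight on a mean-power-draw feature, for instance, the top-ranked sensor would be the most power-efficient one among those scoring well on the remaining criteria.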
The human enzymes GLYAT (glycine N-acyltransferase), GLYATL1 (glutamine N-phenylacetyltransferase) and GLYATL2 (glycine N-acyltransferase-like protein 2) are not only important in the detoxification of xenobiotics via the human liver, but are also involved in the elimination of acyl residues that accumulate in the form of their coenzyme A (CoA) esters in some rare inborn errors of metabolism. This concerns, for example, disorders in the degradation of branched-chain amino acids, such as isovaleric acidemia or propionic acidemia. In addition, the enzymes also assist in the elimination of ammonium, which is produced during the transamination of amino acids and accumulates in urea cycle defects. Sequence variants of the enzymes were also investigated, as they may provide evidence of impaired enzyme activities from which therapy adjustments can potentially be derived. A modified Escherichia coli strain was chosen for the overexpression and partial biochemical characterization of the enzymes in order to promote solubility and proper folding. Since post-translational protein modifications are very limited in bacteria, we also attempted to overexpress the enzymes in HEK293 cells (human-derived). In addition to characterization via immunoblots and activity assays, intracellular localization of the enzymes was performed using GFP coupling and confocal laser scanning microscopy in transfected HEK293 cells. The GLYATL2 enzyme may have tasks beyond detoxification and metabolic defects; the preliminary molecular biology work was performed as part of this project, while the enzyme activity determinations were outsourced to a co-supervised bachelor thesis. The enzyme activity determinations with purified recombinant human enzyme from Escherichia coli showed a threefold higher activity for the GLYAT sequence variant p.(Asn156Ser), which should be considered the probably authentic wild type of the enzyme.
In addition, a reduced activity was shown for the GLYAT variant p.(Gln61Leu), which is very common in South Africa; this could be of particular importance in the treatment of isovaleric acidemia, which is also common there. Intracellularly, GLYAT and GLYATL1 were localized to the mitochondria. As the analyses have shown, sequence variations of GLYAT and GLYATL1 influence their enzyme activity. In the case of reduced GLYAT activity, patients could increasingly be treated with L-carnitine in the sense of an individualized therapy, since the conjugation of the toxic isovaleryl-CoA with glycine is restricted by the GLYAT sequence variation. Activity-reducing variants identified in this project are of particular interest, as they may influence the treatment of certain metabolic defects.
The design of a fully superconducting wind power generator is influenced by several factors. Among them, a low number of pole pairs is desirable to achieve low AC losses in the superconducting stator winding, which greatly influences the cooling system design and, consequently, the efficiency of the entire wind power plant. However, it has been identified that a low number of pole pairs in a superconducting generator tends to greatly increase its output voltage, which in turn creates challenging conditions for the necessary power electronic converter. This study highlights the interdependencies between the design of a fully superconducting 10 MW wind power generator and the corresponding design of its power electronic converter.
Cytokine-induced killer (CIK) cells in combination with dendritic cells (DCs) have shown favorable outcomes in renal cell carcinoma (RCC), yet some patients exhibit recurrence or no response to this therapy. In a broader perspective, enhancing the antitumor response of DC-CIK cells may help to address this issue. Considering this, herein, we investigated the effect of anti-CD40 and anti-CTLA-4 antibodies on the antitumor response of DC-CIK cells against RCC cell lines. Our analysis showed that (a) the anti-CD40 antibody G28.5 increased the CD3+CD56+ effector cells of CIK cells by promoting the maturation and activation of DCs, (b) G28.5 also increased CTLA-4 expression in CIK cells via DCs, but the increase could be hindered by the CTLA-4 inhibitor ipilimumab, (c) adding ipilimumab was also able to significantly increase the proportion of CD3+CD56+ cells in DC-CIK cells, (d) anti-CD40 antibodies predominated over anti-CTLA-4 antibodies for the cytotoxicity, apoptotic effect and IFN-γ secretion of DC-CIK cells against RCC cells, and (e) after ipilimumab treatment, the population of Tregs in CIK cells remained unaffected, but ipilimumab combined with G28.5 significantly reduced the expression of CD28 in CIK cells. Taken together, we suggest that the agonistic anti-CD40 antibody, rather than the CTLA-4 inhibitor, may improve the antitumor response of DC-CIK cells, particularly in RCC. In addition, we point toward the as yet unknown contribution of CD28 to the crosstalk between anti-CTLA-4 antibodies and CIK cells.
How can the important exchange between universities and statutory accident insurance institutions in the field of research be developed further? The Hochschule der DGUV (HGU) and the Hochschule Bonn-Rhein-Sieg (H-BRS) designed an interactive workshop format intended to serve this structured and continuous exchange between science and practice.
The white-ground krater by the Phiale Painter (450–440 BC) exhibited in the "Pietro Griffo" Archaeological Museum in Agrigento (Italy) depicts two scenes from the Perseus myth. The vase is of utmost importance to archaeologists because the figures are drawn on a white background with remarkable delicacy and attention to detail. Although white-ground ceramics are well documented from an archaeological and historical point of view, questions concerning the composition of pigments and binders and the production technique remain unsolved. This kind of vase is a valuable rarity, the use of which is documented in elitist funeral rituals. The study aims to investigate the constituent materials and the execution technique of this magnificent krater. The investigation was carried out in situ using non-destructive and non-invasive techniques. Portable X-ray fluorescence (XRF) and Fourier-transform total reflection infrared spectroscopy complemented the use of visible and ultraviolet light photography to obtain both an overview and specific information on the vase. The XRF data were used to produce false-colour maps showing the location of the various elements detected, using the program SmART_scan. The use of gypsum as the material for the white ground is an important result that deserves to be further investigated in similar vases.
Research has identified nudging as a promising and effective tool to improve healthy eating behavior in a cafeteria setting. However, it remains unclear who is and who is not "nudgeable" (susceptible to nudges). An important influencing factor at the individual level is nudge acceptance. While some progress has been made in determining influences on the acceptance of healthy eating nudges, research on how personal characteristics (such as the perception of social norms) affect nudge acceptance remains scarce. We conducted a survey of 1032 university students to assess the acceptance of nine different types of healthy eating nudges in a cafeteria setting along four influential factors (social norms, health-promoting collaboration, responsibility to promote healthy eating, and procrastination), which are likely to play a role within a university and cafeteria setting. The present study showed that the key influential factors of nudge acceptance were the perceived responsibility to promote healthy eating and health-promoting collaboration. We also identified three different student clusters with respect to nudge acceptance, demonstrating that not all nudges were accepted equally. In particular, default, salience, and priming nudges were at least moderately accepted regardless of the degree of nudgeability. Our findings provide useful policy implications for nudge development by university, cafeteria, and public health officials. Recommendations are formulated for strengthening the theoretical background of nudge acceptance and the susceptibility to nudges.
This paper explores the role of artificial intelligence (AI) in elite sports. We approach the topic from two perspectives. First, we provide a literature-based overview of AI success stories in areas other than sports. We identified multiple approaches in the areas of machine perception, machine learning and modeling, planning and optimization, as well as interaction and intervention, that hold potential for improving training and competition. Second, we assess the present status of AI use in elite sports. To this end, in addition to another literature review, we interviewed leading sports scientists who are closely connected to the main national service institutes for elite sports in their countries. The analysis of this literature review and the interviews shows that most activity is carried out in the methodical categories of signal and image processing. However, projects in the field of modeling and planning have become increasingly popular within the last years. Based on these two perspectives, we extract deficits, issues, and opportunities and summarize them in six key challenges faced by the sports analytics community. These challenges include data collection, controllability of an AI by practitioners, and explainability of AI results.
The implementation of the Sustainable Development Goals (SDGs) and the conservation and protection of nature are among the greatest challenges facing urban regions. There are few approaches so far that link the SDGs to natural diversity and related ecosystem services at the local level and track them in terms of increasing sustainable development at the local level. We want to close this gap by developing a set of indicators that capture ecosystem services in the sense of the SDGs and which are based on data that are freely available throughout Germany and Europe. Based on 10 SDGs and 35 SDG indicators, we are developing an ecosystem service and biodiversity-related indicator set for the evaluation of sustainable development in urban areas. We further show that it is possible to close many of the data gaps between SDGs and locally collected data mentioned in the literature and to translate the universal SDGs to the local level. Our example develops this set of indicators for the Bonn/Rhein-Sieg metropolitan area in North Rhine-Westphalia, Germany, which comprises both rural and densely populated settlements. This set of indicators can also help improve communication and plan sustainable development by increasing transparency in local sustainability, implementing a visible sustainability monitoring system, and strengthening the collaboration between local stakeholders.
Due to the COVID-19 pandemic, health education programs and workplace health promotion (WHP) could only be offered under difficult conditions, if at all. In Germany, for example, mandatory lockdowns, working from home, and physical distancing led to a sharp decline in expenditure on prevention and health promotion from 2019 to 2020. At the same time, the pandemic has negatively affected many people's mental health. Therefore, our goal was to examine audiovisual stimulation as a possible measure in the context of WHP, because its use is contact-free and time-flexible, and it additionally offers voice-guided health education programs. In an online survey following a cross-sectional single-case study design with 393 participants, we examined the associations between audiovisual stimulation and mental health, work engagement, and burnout. Using multiple regression analyses, we identified positive associations between audiovisual stimulation and mental health, burnout, and work engagement. However, longitudinal data are needed to further investigate causal mechanisms between mental health and the use of audiovisual stimulation. Nevertheless, especially with regard to the pandemic, audiovisual stimulation may represent a promising measure for improving mental health at the workplace.
Ausrangierte Nachrichten
(2022)
Important news items fail to reach their intended destination, namely the politically interested and socially open-minded public. This process can be described as agenda cutting. The contribution presents the most important theoretical positions on this still little-researched phenomenon, important findings from previous studies, and our own empirical results on intra-editorial decision-making processes in which topics are struck from the agenda. Finally, the role of the audience as an actor in the process of agenda cutting, which could be described as "news ignorance", is critically examined.
The creeping abolition of free learning materials in the German federal states ranks first in the 2022 top ten of the 'Forgotten News' that the Initiative Nachrichtenaufklärung (INA) e.V. brings to the public every year. Helge Fuhst, head of the television news magazine Tagesthemen and deputy editor-in-chief of ARD-Aktuell, conceded in the media program of a public radio station that he considers this topic highly relevant and that it had indeed not been covered in his TV news program. "The hardest thing is actually leaving topics out," said Fuhst. "It pains us every day when we have to leave topics out. There are few days in the course of the year when we have absolutely no idea what to put in the program" (WDR 2022).
The process of news selection is editorial routine, and part of that routine is leaving topics out, sorting them out, discarding them. When this negative process occurs intentionally, one can speak of agenda cutting. This term from communication studies describes a distinct form of editorial routine that has so far received little study and whose mechanisms, given their considerable influence on the formation of public opinion, urgently belong under the scalpel of media research.
Auswirkungen einer anhaltenden, inflationären Geldpolitik in der Eurozone auf den privaten Sparer
(2022)
This bachelor's thesis critically examines the effects of a persistently inflationary monetary policy in the eurozone on private savers. The thesis shows how the strong expansion of the money supply influences savers' options and decisions, and to what extent such a monetary policy is compatible with savers' interests.
Within the funding line "FDMScouts.nrw", ten universities are working cooperatively on structures and processes for the sustainable establishment of research data management (RDM) at the participating universities of applied sciences.
A decisive factor is to design research data management in a targeted and needs-oriented way and to anchor it both strategically and operationally. The starting point of these efforts is therefore a needs assessment intended to capture researchers' existing data workflows, prior knowledge, and needs regarding RDM. The present questionnaire was created in coordination within the "FDMScouts.nrw" funding line.
The survey instrument is based on the template "Fragenkatalog zur Bedarfserhebung zur Archivierung und Bereitstellung von Forschungsdaten an den rheinland-pfälzischen Universitäten und Hochschulen für angewandte Wissenschaften" (Lemaire et al. 2022). In addition, aspects were taken from "UNEKE: Forschungsdatenspeicherung - Praxis und Bedarfe: Online-Survey 2019" (Brenger et al. 2019) and from "Anforderungserhebung bei den brandenburgischen Hochschulen" (Radtke et al. 2020). A further source was the "Interviewleitfaden zur Bestands- und Bedarfserhebung im Forschungsdatenmanagement (FDM) - Projekt FDM-TUDO" of TU Dortmund (Kletke et al. 2022).
Collaboration among multiple users on large screens leads to complicated behavior patterns and group dynamics. To gain a deeper understanding of collaboration on vertical, large, high-resolution screens, this dissertation builds on previous research and gains novel insights through new observational studies. Among other things, the collected results reveal new patterns of collaborative coupling, suggest that territorial behavior is less critical than shown in previous research, and demonstrate that workspace awareness can also negatively affect the effectiveness of individual users.
In March 2020, the world was hit by the coronavirus disease (COVID‐19) pandemic which led to all‐embracing measures to contain its spread. Most employees were forced to work from home and take care of their children because schools and daycares were closed. We present data from a research project in a large multinational organisation in the Netherlands with monthly quantitative measurements from January to May 2020 (N = 253–516), enriched with qualitative data from participants' comments before and after telework had started. Growth curve modelling showed major changes in employees' work‐related well‐being reflected in decreasing work engagement and increasing job satisfaction. For work‐non‐work balance, workload and autonomy, cubic trends over time were found, reflecting initial declines during crisis onset (March/April) and recovery in May. Participants' additional remarks exemplify that employees struggled with fulfilling different roles simultaneously, developing new routines and managing boundaries between life domains. Moderation analyses demonstrated that demographic variables shaped time trends. The diverging trends in well‐being indicators raise intriguing questions and show that close monitoring and fine‐grained analyses are needed to arrive at a better understanding of the impact of the crisis across time and among different groups of employees.
Research on digital entrepreneurship education (EE) has grown immense, which makes it difficult to maintain an overview. Bibliometric visualization, mapping, and clustering can help to visualize and structure this large body of literature. Hence, the goal of this mapping study is to identify clusters of EE research and derive a taxonomic structure that can serve as a basis for future research. The analyzed data, drawn from Google Scholar through the Publish or Perish tool, comprise 1000 documents published between 2007 and 2022. This taxonomy should, on the one hand, strengthen the bonds within digital entrepreneurial education research; on the other, it should support international research collaboration to boost both interdisciplinary digital entrepreneurial education and its influence on a universal basis. This work strengthens students' understanding of current digital entrepreneurial education research by classifying and distilling the most influential relationships among its contributions and contributors. The bibliographic analysis integrates the citation network, the authors' research areas, and the papers' content on the topic, producing a bibliographic model of authors, paper titles, keywords, and abstracts: Harzing's Publish or Perish tool is used to extract data from Google Scholar, and VOSviewer is used to visualize network maps of co-authorship and term co-occurrence, preparing the data for an intuitive and appropriate understanding of university students' 'digital entrepreneurial intention' research.
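The term co-occurrence counting that underlies such VOSviewer maps can be sketched as follows. This is illustrative only; VOSviewer derives these counts internally from the exported bibliographic records, and the function name is hypothetical:

```python
from collections import Counter
from itertools import combinations

def keyword_cooccurrence(documents):
    """Count pairwise keyword co-occurrences across a document set.

    documents: iterable of per-paper keyword lists.
    Returns a Counter mapping alphabetically sorted keyword pairs
    to the number of papers in which both keywords appear.
    """
    pairs = Counter()
    for keywords in documents:
        # De-duplicate within a paper so each pair counts once per paper.
        for a, b in combinations(sorted(set(keywords)), 2):
            pairs[(a, b)] += 1
    return pairs
```

The resulting pair counts are exactly the edge weights of a term co-occurrence network; a tool like VOSviewer then lays out and clusters that network for visualization.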
There is an unmet need for the development and validation of biomarkers and surrogate endpoints for clinical trials in propionic acidemia (PA) and methylmalonic acidemia (MMA). This review examines the pathophysiology and clinical consequences of PA and MMA that could form the basis for potential biomarkers and surrogate endpoints. Changes in primary metabolites such as methylcitric acid (MCA), MCA:citric acid ratio, oxidation of 13C-propionate (exhaled 13CO2), and propionylcarnitine (C3) have demonstrated clinical relevance in patients with PA or MMA. Methylmalonic acid, another primary metabolite, is a potential biomarker, but only in patients with MMA. Other potential biomarkers in patients with either PA and MMA include secondary metabolites, such as ammonium, or the mitochondrial disease marker, fibroblast growth factor 21. Additional research is needed to validate these biomarkers as surrogate endpoints, and to determine whether other metabolites or markers of organ damage could also be useful biomarkers for clinical trials of investigational drug treatments in patients with PA or MMA. This review examines the evidence supporting a variety of possible biomarkers for drug development in propionic and methylmalonic acidemias.
Bionik
(2022)
How do they do that? … one may ask in view of the astonishing abilities of some living organisms. Biomimetics asks one question further: … and how can we replicate it? This is a focus of this textbook, which not only explains biomimetics through numerous examples but also teaches a methodology for identifying biological solutions and transferring them to technical applications. Basic information on biology and fundamentals of engineering design ensure easy access to the material. With 3D printing as a key technology and a treatment of sustainability, the book also addresses current developments. This holistic view of biomimetics is intended to enable and motivate readers to carry out biomimetic projects of their own. The present edition has been revised and supplemented with current research findings and developments. (Publisher's description)
Buch-Aisthesis
(2022)
Literature can also be viewed as a network of media that stand in relationships of cooperation and competition. This becomes all the more evident when, from the perspective of literary studies and design research, attention is directed to the difference between typographic and other, fundamentally nonverbal visual data. From this, the contributors to the volume derive a relationship between literary and art scholarship and their objects that leads, not least, to a new attention to the scriptural and typographic materiality and mediality of literature. At issue are the theory of reflecting on, and the practice of producing, a specific book aesthetics.
Buch-Diskurse
(2022)
Buchbesprechung
(2022)
The purpose of the study is to provide empirical evidence about the under-researched area of university–government relations in building a culture of entrepreneurial initiatives inside the triple helix model in a rural region. The study deploys a qualitative case study research method based on the content analysis of project documentation and further internal documents from both universities and municipalities. The propositions in the research question are guided by the previous literature and were then examined through an "open coding" process to iteratively analyze, verify, and validate the results from the documents against the previous literature. Results presented in the case study relate both to the project of a municipality–university innovation partnership and to the historic development of the university in its three missions, with particular attention to themes relevant to the important third mission. In addition, a "toolkit" of relevant project activities is presented against the major identified themes, the major project stakeholders, and the relevant Sustainable Development Goals (SDGs). Universities should look beyond a purely economic contribution and should augment all three missions (teaching, research, engagement) by considering the social, environmental, and economic aspects of their activities. Instead of considering a government's role solely as that of a regulator, a much more creative and purposeful cooperation between university and government is possible for creating a regional culture of entrepreneurial initiatives in a rural region.
AI (artificial intelligence) systems are increasingly being used in all aspects of our lives, from mundane routines to sensitive decision-making and even creative tasks. Therefore, an appropriate level of trust is required so that users know when to rely on the system and when to override it. While research has looked extensively at fostering trust in human-AI interactions, the lack of standardized procedures for human-AI trust makes it difficult to interpret results and compare across studies. As a result, the fundamental understanding of trust between humans and AI remains fragmented. This workshop invites researchers to revisit existing approaches and work toward a standardized framework for studying AI trust to answer the open questions: (1) What does trust mean between humans and AI in different contexts? (2) How can we create and convey the calibrated level of trust in interactions with AI? And (3) How can we develop a standardized framework to address new challenges?
We introduce canonical weight normalization for convolutional neural networks. Inspired by the canonical tensor decomposition, we express the weight tensors in so-called canonical networks as scaled sums of outer vector products. In particular, we train network weights in the decomposed form, where scale weights are optimized separately for each mode. Additionally, similarly to weight normalization, we include a global scaling parameter. We study the initialization of the canonical form by running the power method and by drawing randomly from Gaussian or uniform distributions. Our results indicate that we can replace the power method with cheaper initializations drawn from standard distributions. The canonical re-parametrization leads to competitive normalization performance on the MNIST, CIFAR10, and SVHN data sets. Moreover, the formulation simplifies network compression. Once training has converged, the canonical form allows convenient model-compression by truncating the parameter sums.
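A minimal numpy sketch of the canonical re-parametrization described above, assuming a rank-R sum of scaled outer vector products for a single 3-mode weight tensor; the shapes, rank, and truncation rule are illustrative, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
R = 4  # number of rank-1 terms in the canonical sum

# Factor vectors for a 3-mode weight tensor (e.g. out-channels x in-channels x kernel)
U = rng.normal(size=(R, 6))   # mode-1 factors
V = rng.normal(size=(R, 5))   # mode-2 factors
W = rng.normal(size=(R, 3))   # mode-3 factors
s = np.array([2.0, 1.0, 0.5, 0.01])  # per-term scale weights

def assemble(s, U, V, W):
    """Weight tensor as a scaled sum of outer vector products."""
    return np.einsum("r,ri,rj,rk->ijk", s, U, V, W)

full = assemble(s, U, V, W)

# Model compression after training: truncate the smallest-scale terms
keep = np.argsort(-np.abs(s))[:2]
compressed = assemble(s[keep], U[keep], V[keep], W[keep])
print(full.shape, compressed.shape)  # (6, 5, 3) (6, 5, 3)
```

Training would optimize `s`, `U`, `V`, and `W` directly in this decomposed form; by linearity, the error introduced by truncation is exactly the contribution of the dropped rank-1 terms.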
The aim of this paper is to assess the challenges farmers face in enhancing biodiversity. The so-called "trilemma" of land use (WBGU 2021) stems from the multiple demands made on land for mitigating climate change, securing food, and maintaining biodiversity. The agricultural sector is accused of maladministration: it is blamed for soil contamination, animal cruelty, bee mortality, and climate change. That is why farmers are seen as key actors at all levels. They are, however, also key players when it comes to overcoming the problems of the future. Their supportive role is urgently needed, but farmers find themselves caught between a rock and a hard place. Consumers call for sustainable, environmentally friendly production and for inexpensive food products free of pesticide residues, while demanding enough food for all. Farmers are constrained by the wants and needs of consumers, who are influenced by interest groups, and are exposed to direct and indirect influencing factors and their interdependencies. They are also tasked with balancing the scrutiny of the critical public on the one hand and the control exercised by eager authorities on the other.
As part of the DINA (Diversity of Insects in Nature protected Areas) project, a trans- and interdisciplinary research study, we collected and surveyed data from farmers who farm within or close to the 21 nature protected areas selected for the DINA project. Data were collected in a mixed-methods approach using a semi-structured questionnaire with closed and open questions. The methodological and strategic approach and the interdependencies of the issues demonstrate the complexity of today's problems. The conflicts and obstacles farmers face were evaluated; the results show farmers' willingness to implement biodiversity measures and the importance of the appreciation shown to them for doing so. The paper proposes follow-up activities (a quantitative study) to verify the objectives. The results will later lead to recommendations for policymakers and farmers in all German nature protected areas.
Hydrogen is a versatile energy carrier. When produced with renewable energy by water splitting, it is a carbon-neutral alternative to fossil fuels. The industrialization of this technology is currently dominated by electrolyzers powered by solar or wind energy. For small-scale applications, however, more integrated device designs for water splitting using solar energy might optimize hydrogen production due to lower balance-of-system costs and smarter thermal management. Such devices offer the opportunity to thermally couple the solar cell and the electrochemical compartment. In this way, heat losses in the absorber can be turned into an efficiency boost for the device by simultaneously enhancing the catalytic performance of the water splitting reactions, cooling the absorber, and decreasing the ohmic losses.[1,2] However, integrated devices (sometimes also referred to as "artificial leaves") currently suffer from a lower technology readiness level (TRL) than the completely decoupled approach.
Comparative study of 3D object detection frameworks based on LiDAR data and sensor fusion techniques
(2022)
Precisely estimating and understanding the surroundings of the vehicle is the basic and crucial step for an autonomous vehicle. The perception system plays a significant role in providing an accurate interpretation of the vehicle's environment in real time. Generally, the perception system comprises various subsystems such as localization, (static and dynamic) obstacle detection and avoidance, mapping, and others. To perceive the environment, these vehicles are equipped with various passive and active exteroceptive sensors, in particular cameras, radars, and LiDARs. These systems employ deep learning techniques that transform the huge amount of sensor data into semantic information, on which the object detection and localization tasks are performed. For numerous driving tasks, accurate results require the location and depth information of a particular object. By utilizing the additional pose data from sensors such as LiDARs and stereo cameras, 3D object detection methods provide information on the size and location of objects. Based on recent research, 3D object detection frameworks that perform detection and localization on LiDAR data and with sensor fusion techniques show significant performance improvements. In this work, we present a comparative study of the effect of using LiDAR data in object detection frameworks and of the performance improvements achieved through sensor fusion techniques, discuss various state-of-the-art methods in both cases, perform an experimental analysis, and provide future research directions.
Comparing Armature Windings for a 10 MW Fully Superconducting Synchronous Wind Turbine Generator
(2022)
Composite nanoparticles (NPs) consisting of lignin and different polysaccharide (PS) derivatives were prepared. In this synergistic approach, the PS derivative acts as biocompatible matrix that forms spherical NPs while lignin is a functional compound with therapeutic potential (e.g., antioxidative, antimicrobial, antiviral). Organosolv lignin and three different PS derivatives (cellulose acetate/CA, cellulose acetate phthalate/CAPh, xylan phenyl carbonate/XPC) were used in this study. Nanocomposites with particle sizes in the range of about 200–550 nm containing both types of biopolymers are accessible by dialysis of organic PS/lignin solutions against water. In particular, XPC and CAPh, which both contain aromatic substituents, were found to be suitable for incorporation of lignin within the PS nanomatrix. The present work paves the way for future studies in which the pharmaceutical potential and biocompatibility of composite NPs of lignin and PS derivatives with tailored properties are investigated.
Contextual information is widely considered for NLP and knowledge discovery in life sciences since it highly influences the exact meaning of natural language. The scientific challenge is not only to extract such context data, but also to store this data for further query and discovery approaches. Classical approaches use RDF triple stores, which have serious limitations. Here, we propose a multiple step knowledge graph approach using labeled property graphs based on polyglot persistence systems to utilize context data for context mining, graph queries, knowledge discovery and extraction. We introduce the graph-theoretic foundation for a general context concept within semantic networks and show a proof of concept based on biomedical literature and text mining. Our test system contains a knowledge graph derived from the entirety of PubMed and SCAIView data and is enriched with text mining data and domain-specific language data using Biological Expression Language. Here, context is a more general concept than annotations. This dense graph has more than 71M nodes and 850M relationships. We discuss the impact of this novel approach with 27 real-world use cases represented by graph queries. Storing and querying a giant knowledge graph as a labeled property graph is still a technological challenge. Here, we demonstrate how our data model is able to support the understanding and interpretation of biomedical data. We present several real-world use cases that utilize our massive, generated knowledge graph derived from PubMed data and enriched with additional contextual data. Finally, we show a working example in context of biologically relevant information using SCAIView.
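The labeled-property-graph idea can be illustrated with a toy model; the node labels, relationship types, and the `context` property below are invented stand-ins, not the actual SCAIView or Neo4j schema:

```python
# Minimal labeled property graph: nodes carry labels and properties,
# and relationships carry properties too -- here, the context of a mention.
nodes = {
    1: {"labels": {"Document"}, "props": {"pmid": "123"}},
    2: {"labels": {"Entity"}, "props": {"name": "Cathepsin K"}},
    3: {"labels": {"Entity"}, "props": {"name": "Lignin"}},
}
edges = [
    # (source, type, target, properties)
    (1, "MENTIONS", 2, {"context": "disease"}),
    (1, "MENTIONS", 3, {"context": "material"}),
]

def mentions_in_context(nodes, edges, context):
    """Graph query: entity names mentioned under a given context."""
    return sorted(
        nodes[t]["props"]["name"]
        for s, rel, t, props in edges
        if rel == "MENTIONS" and props.get("context") == context
    )

print(mentions_in_context(nodes, edges, "disease"))  # ['Cathepsin K']
```

Because context lives on the relationships rather than as a plain annotation, the same entity node can be queried under different contexts without duplicating it, which is the advantage over flat RDF-style annotation the text alludes to.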
Novel methods for contingency analysis of gas transport networks are presented. They are motivated by the transition of our energy system where hydrogen plays a growing role. The novel methods are based on a specific method for topological reduction and so-called supernodes. Stationary Euler equations with advanced compressor thermodynamics and a gas law allowing for gas compositions with up to 100% hydrogen are used. Several measures and plots support an intuitive comparison and analysis of the results. In particular, it is shown that the newly developed methods can estimate locations and magnitudes of additional capacities (injection, buffering, storage etc.) with a reasonable performance for networks of relevant composition and size.
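The flavor of such a topological reduction can be sketched with a toy series-pipe merge; real gas-network reduction is derived from the stationary Euler equations, whereas here pipe lengths simply add, and the mini-network is invented:

```python
def series_reduce(pipes):
    """Collapse every pass-through (degree-2) junction: two pipes in
    series become one whose length is the sum -- a toy stand-in for
    the equivalent-pipe laws used in real network reduction."""
    pipes = dict(pipes)
    changed = True
    while changed:
        changed = False
        incident = {}
        for (u, v) in pipes:
            incident.setdefault(u, []).append((u, v))
            incident.setdefault(v, []).append((u, v))
        for node, inc in incident.items():
            if len(inc) == 2:
                (a, b), (c, d) = inc
                ends = [x for x in (a, b, c, d) if x != node]
                if len(ends) == 2 and ends[0] != ends[1]:
                    length = pipes.pop((a, b)) + pipes.pop((c, d))
                    pipes[tuple(sorted(ends))] = length
                    changed = True
                    break
    return pipes

# Hypothetical mini-network: A -10- B -5- C -20- D; B and C are pass-through
net = {("A", "B"): 10.0, ("B", "C"): 5.0, ("C", "D"): 20.0}
print(series_reduce(net))  # {('A', 'D'): 35.0}
```

Nodes that survive such a reduction play the role of the supernodes mentioned above: they aggregate the behavior of the collapsed chains between them.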
Controlling
(2022)
Introduction: Recovery experiences have thus far been portrayed as experiences that simply “happen” to people. However, recovery can also be understood from a crafting perspective; that is, individuals may proactively shape their work and non-work activities to recover from stress, satisfy their psychological needs, and achieve optimal functioning.
Materials and Methods: In my talk, I will present the theoretical basis of needs-based crafting based on a conceptual review of the literature. Moreover, I will present empirical findings on the validation of a newly developed off-job crafting scale.
Results: In five substudies, we found that off-job crafting was related to optimal functioning over time. Moreover, the newly developed off-job crafting scale showed good convergent and discriminant validity, internal consistency, and test-retest reliability.
Conclusions: Theoretical and empirical evidence suggests that needs-based crafting can enhance optimal functioning in different life domains and support people in performing their work duties sustainably. Proactive attempts to achieve better recovery through needs satisfaction may be beneficial in an intensified and continually changing and challenging working life. Our line of research provides important avenues for organizational research and practices regarding recovery and needs satisfaction occurring at work and outside work.
This open access book brings together the latest developments from industry and research on automated driving and artificial intelligence.
Environment perception for highly automated driving heavily employs deep neural networks, facing many challenges. How much data do we need for training and testing? How to use synthetic data to save labeling costs for training? How do we increase robustness and decrease memory usage? For inevitably poor conditions: How do we know that the network is uncertain about its decisions? Can we understand a bit more about what actually happens inside neural networks? This leads to a very practical problem particularly for DNNs employed in automated driving: What are useful validation techniques and how about safety?
This book unites the views from both academia and industry, where computer vision and machine learning meet environment perception for highly automated driving. Naturally, aspects of data, robustness, uncertainty quantification, and, last but not least, safety are at the core of it. This book is unique: In its first part, an extended survey of all the relevant aspects is provided. The second part contains the detailed technical elaboration of the various questions mentioned above.
Der Faktor Vernunft
(2022)
We describe a systematic approach for rendering time-varying simulation data produced by exa-scale simulations, using GPU workstations. The data sets we focus on use adaptive mesh refinement (AMR) to overcome memory bandwidth limitations by representing interesting regions in space with high detail. Particularly, our focus is on data sets where the AMR hierarchy is fixed and does not change over time. Our study is motivated by the NASA Exajet, a large computational fluid dynamics simulation of a civilian cargo aircraft that consists of 423 simulation time steps, each storing 2.5 GB of data per scalar field, amounting to a total of 4 TB. We present strategies for rendering this time series data set with smooth animation and at interactive rates using current generation GPUs. We start with an unoptimized baseline and step by step extend that to support fast streaming updates. Our approach demonstrates how to push current visualization workstations and modern visualization APIs to their limits to achieve interactive visualization of exa-scale time series data sets.
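The streaming idea, overlapping the loading of the next time step with the rendering of the current one, can be sketched with a small prefetch loop; the loader and renderer below are stand-ins for disk I/O and GPU work, not the authors' implementation:

```python
import threading
import queue

def stream_render(load, render, timesteps, prefetch=2):
    """Overlap I/O and rendering: a loader thread stays up to
    `prefetch` steps ahead while the main loop renders the current step."""
    buf = queue.Queue(maxsize=prefetch)

    def loader():
        for t in timesteps:
            buf.put((t, load(t)))   # blocks when the buffer is full
        buf.put(None)               # end-of-stream marker

    threading.Thread(target=loader, daemon=True).start()
    while (item := buf.get()) is not None:
        t, data = item
        render(t, data)

# Hypothetical stand-ins for reading a scalar field and drawing a frame
drawn = []
stream_render(lambda t: f"field_{t}",
              lambda t, d: drawn.append((t, d)),
              timesteps=range(4))
print(drawn)
```

The bounded queue is the key design choice: it caps memory at `prefetch` time steps while keeping the renderer fed, the same double-buffering principle a GPU streaming path relies on.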
Hydrophilic surface-enhanced Raman spectroscopy (SERS) substrates were prepared by a combination of TiO2 coating of aluminium plates via a direct titanium tetraisopropoxide (TTIP) coating and drop coating with synthesised gold nanoparticles (AuNPs). Differences in the wettability of the untreated substrates, the slowly dried Ti(OH)4 substrates, and the calcinated as well as plasma-treated TiO2 substrates were analysed by water contact angle (WCA) measurements. The hydrophilic behaviour of the developed substrates helped to improve the distribution of the AuNPs, which is reflected in an overall higher lateral SERS enhancement. The surface enhancement of the substrates was tested with the target molecule rhodamine 6G (R6G) and a fibre-coupled 638 nm Raman spectrometer. Additionally, the morphology of the substrates was characterised using scanning electron microscopy (SEM) and Raman microscopy. The studies showed a reduced influence of the coffee-ring effect on the particle distribution, resulting in a more broadly distributed edge region, which increased the spatial reproducibility of the measured SERS signal in surface-enhanced Raman mapping measurements on the mm scale.
State-of-the-art object detectors are treated as black boxes due to their highly non-linear internal computations. Even with unprecedented advancements in detector performance, the inability to explain how their outputs are generated limits their use in safety-critical applications. Previous work fails to produce explanations for both bounding box and classification decisions, and generally produces separate explanations for individual detectors. In this paper, we propose an open-source Detector Explanation Toolkit (DExT), which implements the proposed approach to generate a holistic explanation for all detector decisions using certain gradient-based explanation methods. We suggest various multi-object visualization methods to merge the explanations of multiple objects detected in an image, as well as the corresponding detections, in a single image. The quantitative evaluation shows that the Single Shot MultiBox Detector (SSD) is more faithfully explained than other detectors regardless of the explanation method. Both quantitative and human-centric evaluations identify SmoothGrad with Guided Backpropagation (GBP) as providing the most trustworthy explanations among the selected methods across all detectors. We expect that DExT will motivate practitioners to evaluate object detectors from the interpretability perspective by explaining both bounding box and classification decisions.
As cameras are ubiquitous in autonomous systems, object detection is a crucial task. Object detectors are widely used in applications such as autonomous driving, healthcare, and robotics. Given an image, an object detector outputs both the bounding box coordinates as well as classification probabilities for each object detected. The state-of-the-art detectors are treated as black boxes due to their highly non-linear internal computations. Even with unprecedented advancements in detector performance, the inability to explain how their outputs are generated limits their use in safety-critical applications in particular. It is therefore crucial to explain the reason behind each detector decision in order to gain user trust, enhance detector performance, and analyze their failure.
Previous work fails to explain as well as evaluate both bounding box and classification decisions individually for various detectors. Moreover, no tools explain each detector decision, evaluate the explanations, and also identify the reasons for detector failures. This restricts the flexibility to analyze detectors. The main contribution presented here is an open-source Detector Explanation Toolkit (DExT). It is used to explain the detector decisions, evaluate the explanations, and analyze detector errors. The detector decisions are explained visually by highlighting the image pixels that most influence a particular decision. The toolkit implements the proposed approach to generate a holistic explanation for all detector decisions using certain gradient-based explanation methods. To the author’s knowledge, this is the first work to conduct extensive qualitative and novel quantitative evaluations of different explanation methods across various detectors. The qualitative evaluation incorporates a visual analysis of the explanations carried out by the author as well as a human-centric evaluation. The human-centric evaluation includes a user study to understand user trust in the explanations generated across various explanation methods for different detectors. Four multi-object visualization methods are provided to merge the explanations of multiple objects detected in an image as well as the corresponding detector outputs in a single image. Finally, DExT implements the procedure to analyze detector failures using the formulated approach.
The visual analysis illustrates that the ability to explain a model is more dependent on the model itself than the actual ability of the explanation method. In addition, the explanations are affected by the object explained, the decision explained, detector architecture, training data labels, and model parameters. The results of the quantitative evaluation show that the Single Shot MultiBox Detector (SSD) is more faithfully explained compared to other detectors regardless of the explanation methods. In addition, a single explanation method cannot generate more faithful explanations than other methods for both the bounding box and the classification decision across different detectors. Both the quantitative and human-centric evaluations identify that SmoothGrad with Guided Backpropagation (GBP) provides more trustworthy explanations among selected methods across all detectors. Finally, a convex polygon-based multi-object visualization method provides more human-understandable visualization than other methods.
The author expects that DExT will motivate practitioners to evaluate object detectors from the interpretability perspective by explaining both bounding box and classification decisions.
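The SmoothGrad idea referred to above, averaging input gradients over noisy copies of the input, can be sketched in a few lines; the toy score function and its analytic gradient are stand-ins for a detector's backpropagated decision gradient:

```python
import numpy as np

def smoothgrad(grad_fn, x, n=500, sigma=0.1, seed=0):
    """Average the saliency gradient over n noisy copies of the input."""
    rng = np.random.default_rng(seed)
    grads = [grad_fn(x + rng.normal(scale=sigma, size=x.shape))
             for _ in range(n)]
    return np.mean(grads, axis=0)

# Toy "detector score" f(x) = sum(x**2), whose gradient is 2x, standing in
# for the gradient of a bounding-box or classification decision w.r.t. pixels.
grad_fn = lambda x: 2.0 * x

x = np.array([1.0, -2.0, 0.5])
saliency = smoothgrad(grad_fn, x)
# For this gradient the added noise averages out, so saliency approaches 2*x
```

In the toolkit setting, `grad_fn` would be a backward pass through the detector for one chosen decision, and the averaged map is what gets visualized over the image.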
In the course of the migration movement of 2015 and 2016, the humane accommodation of refugees in German municipalities gained attention. The rise in the number of asylum seekers in the municipalities, together with the federal initiative "Schutz von geflüchteten Menschen in Flüchtlingsunterkünften" (protection of refugees in refugee shelters), brought about changes with regard to protection standards in municipal refugee accommodation. This article explains these changes using an actor-centered approach from the sociology of organizations. It is based on empirical findings from the project "Organisational Perspectives on Human Security Standards for Refugees in Germany" in two German municipalities.
The aim of this master's thesis was to probe the views of Bonn's citizens on the smart city project of the German city. A literature review helped define the term smart city and identify the smart city concept most commonly used in Germany, which can be summarized as an urban planning concept that uses information and communication technology to build citizen-centric, sustainable cities. Accordingly, a smart city should include transparent communication and the participation of its citizens. The websites and various publications of Bonn were researched to understand its smart city strategy and vision; this revealed inconsistencies. To resolve these inconsistencies, three representatives of the city were interviewed. Based on the knowledge gained up to this point, two groups of Bonn's inhabitants discussed the Smart City Bonn and presented their perception of it. This methodology yielded the following results. The city's communication and participation are in many cases in line with current recommendations for a smart city. Bonn has apparently recognized the relevance of these aspects in theory but should implement them more consistently in practice. Currently, the city council publishes contradictory information and does not plan to incorporate the views of Bonn's citizens in developing the smart city strategy in the first place, as recommended in the literature.
Dienstleister
(2022)
Differential-Algebraic Equations and Beyond: From Smooth to Nonsmooth Constrained Dynamical Systems
(2022)
Digitalisierung in zentralen Feldern der Sozialpolitik: Entwicklungstendenzen, Chancen und Risiken
(2022)
Digitaltechnik
(2022)
Modern digital technology, comprehensive and compact: this textbook and exercise book spans the arc from the fundamentals of digital technology, through design with VHDL and the components of digital systems, to modern microcontrollers of the STM32 series.
The 8th edition has been updated, and the chapters on microprocessors and microcontrollers have been fundamentally revised.
Discarded news
(2022)
When important news fails to reach its recipients, namely the politically interested, socially open-minded public, we sometimes refer to this process as agenda cutting. This article presents the key theoretical positions on this under-researched phenomenon, along with important study results and our own empirical findings on internal editorial decision-making processes whereby topics are removed from the agenda. Finally, we critically examine the role of the audience as an actor in agenda cutting, which could be described as »news ignorance«.[1]
The top story showcased by Initiative Nachrichtenaufklärung (INA) e.V. in 2022 was the creeping abolition of free textbooks in German schools. In a public radio broadcast, the head of TV news magazine Tagesthemen and deputy editor-in-chief of ARD-Aktuell, Helge Fuhst, conceded that he considered this topic highly relevant, yet it had indeed not been covered in his TV news program. »Leaving out topics is, in fact, the most difficult challenge,« Fuhst said. »Having to drop topics hurts every day. There are only a few days a year when we have absolutely no idea what to put on the air.« (WDR 2022)
The process of news selection is editorial routine, which includes omitting, discarding, or abandoning topics. When this negative process is intentional, it can also be referred to as agenda cutting. This term from the field of communications science describes a distinct form of editorial routine that has been little studied to date and whose mechanisms, with their considerable influence on the formation of public opinion, are in urgent need of media research scrutiny.
Discrimination of Stressed and Non-Stressed Food-Related Bacteria Using Raman-Microspectroscopy
(2022)
As the identification of microorganisms becomes more significant in industry, so does the use of microspectroscopy and the development of effective chemometric models for data analysis and classification. Since identification relies on microorganisms cultivated under laboratory conditions, yet bacteria are exposed to a variety of stress factors, such as temperature differences, a method is needed that can take these stress factors and the associated bacterial reactions into account. Therefore, bacterial stress reactions to lifetime conditions (regular treatment, 25 °C, HCl, 2-propanol, NaOH) and sampling conditions (cold sampling, desiccation, heat drying) were induced to explore their effects on Raman spectra and thereby improve the chemometric models. In this study, nine food-relevant bacteria were accordingly exposed to seven stress conditions in addition to routine cultivation as a control. Spectral alterations in lipids, polysaccharides, nucleic acids, and proteins were observed compared with normal growth conditions without stress. Regardless of the involvement of several stress factors and storage times, a model for differentiating the analyzed microorganisms from genus down to strain level was developed. Classification of the independent training dataset at genus and species level for Escherichia coli and at strain level for the other food-relevant microorganisms showed a classification rate of 97.6%.
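The chemometric classification step can be illustrated with a deliberately simple stand-in, a nearest-centroid rule on synthetic single-peak "spectra"; the actual models in the study are more elaborate, and all numbers here are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

def make_spectra(template, n, noise=0.05):
    """Synthetic 'Raman spectra': a class template plus random noise."""
    return template + rng.normal(scale=noise, size=(n, template.size))

grid = np.linspace(0, 1, 50)
template_a = np.exp(-((grid - 0.3) / 0.05) ** 2)  # peak at 0.3
template_b = np.exp(-((grid - 0.7) / 0.05) ** 2)  # peak at 0.7

train_a = make_spectra(template_a, 20)
train_b = make_spectra(template_b, 20)
centroids = np.stack([train_a.mean(axis=0), train_b.mean(axis=0)])

def classify(spectrum):
    """Nearest-centroid rule: assign the class with the closest mean spectrum."""
    return int(np.argmin(np.linalg.norm(centroids - spectrum, axis=1)))

test = make_spectra(template_b, 1)[0]
print(classify(test))  # class 1, since the test spectrum peaks at 0.7
```

Stress factors would enter such a model as additional training spectra per class, which is exactly why the study induces them deliberately: the centroid (or a richer model) then covers the stressed variants too.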
For most people, using their body to authenticate their identity is an integral part of daily life. From our fingerprints to our facial features, our physical characteristics store the information that identifies us as "us." This biometric information is becoming increasingly vital to the way we access and use technology. As more and more platform operators struggle with traffic from malicious bots on their servers, the burden of proof is on users, only this time they have to prove their very humanity and there is no court or jury to judge, but an invisible algorithmic system. In this paper, we critique the invisibilization of artificial intelligence policing. We argue that this practice obfuscates the underlying process of biometric verification. As a result, the new "invisible" tests leave no room for the user to question whether the process of questioning is even fair or ethical. We challenge this thesis by offering a juxtaposition with the science fiction imagining of the Turing test in Blade Runner to reevaluate the ethical grounds for reverse Turing tests, and we urge the research community to pursue alternative routes of bot identification that are more transparent and responsive.
We analyze short-term effects of free hospitalization insurance for the poorest quintile of the population in the province of Khyber Pakhtunkhwa, Pakistan. First, we exploit the fact that eligibility is based on an exogenous poverty score threshold and apply a regression discontinuity design. Second, we exploit imperfect rollout and compare insured and uninsured households using propensity score matching. With both methods we fail to detect significant effects on the incidence of hospitalization. While the program did not meaningfully increase the quantity of health care consumed, insured households more often chose private hospitals, indicating a shift towards higher perceived quality of care.
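The first identification strategy, a regression discontinuity design around the poverty-score threshold, can be sketched on synthetic data. The cutoff value, bandwidth, and outcome model below are invented for illustration; only the mechanics of comparing local linear fits on either side of the threshold follow the method named in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: eligibility flips at an exogenous poverty-score cutoff.
cutoff = 16.17          # illustrative threshold, not the paper's actual value
score = rng.uniform(cutoff - 10, cutoff + 10, 2000)
eligible = (score < cutoff).astype(float)   # lower score (poorer) qualifies

# Outcome: hospitalization indicator with a true jump of 0 at the cutoff,
# mirroring the paper's null finding on utilization.
true_jump = 0.0
p = 0.10 + 0.004 * (score - cutoff) + true_jump * eligible
hosp = rng.binomial(1, np.clip(p, 0, 1))

# Local linear regression within a bandwidth h on each side of the cutoff
h = 3.0

def side_fit(mask):
    x = score[mask] - cutoff
    A = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(A, hosp[mask], rcond=None)
    return beta[0]                      # intercept = outcome limit at the cutoff

left = side_fit((score >= cutoff - h) & (score < cutoff))
right = side_fit((score >= cutoff) & (score <= cutoff + h))
rd_effect = left - right                # eligible-minus-ineligible jump
print(f"estimated discontinuity: {rd_effect:+.3f}")
```

With a true jump of zero the estimated discontinuity hovers near zero, which is the pattern behind a "fail to detect significant effects" result.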
E-Sports on Television: An Analysis of the Opportunities of a New Topic Area for German Broadcasters
(2022)
In recent years, the media presence of e-sports in Germany has grown, bringing the topic to the attention of the general public. Television companies have therefore taken notice of what was originally a niche sport, native to and firmly rooted in the internet, and are expanding their involvement in the field in order to share in the growing success that is forecast for e-sports. Yet covering this trend topic successfully and appropriately appears to be less straightforward than for traditional sports, owing to its particular framework conditions. This raises the question: What are the prospects for covering e-sports on German television channels, given these specific circumstances taken together? The TV broadcasters face the task of winning over an audience that is accustomed to consuming this topic on the internet, while television has been losing ever more viewers to precisely that medium since the start of the digital age. Besides the obstacles that must be overcome, however, e-sports also offers broadcasters added value; both aspects are explored in this thesis.
This exploratory study offers a starting point for further research on e-sports in the German media, especially since existing work focuses mainly on live-streaming platforms or the portrayal of e-sports in traditional media and does not address the intentions and reasoning of the television companies themselves. To close this gap, seven practitioners at German TV channels or broadcasting groups were interviewed, all of whom had covered e-sports to varying degrees or had at least considered doing so. The thesis concludes with eight hypotheses, derived through qualitative content analysis, which indicate the factors that influence the prospects of such coverage and which also serve as starting points for further research on the topic. The subject was examined with regard to a variety of specific aspects and their interactions, such as the different types of broadcasters, conditions within the e-sports industry, and existing corporate structures.
Education for Sustainable Development (ESD, SDG 4) and human well-being (SDG 3) are among the central subjects of the Sustainable Development Goals (SDGs). In this article, based on the Questionnaire for Eudaimonic Well-Being (QEWB), we investigate to what extent (a) there is a connection between EWB and practical commitment to the SDGs and whether (b) there is a deficit in EWB among young people in general. We also want to use the article to draw attention to the need for further research on the links between human well-being and commitment to sustainable development. A total of 114 students between the ages of 18 and 34, who are either engaged in (extra)curricular activities of sustainable development (28 students) or not (86 students), completed the QEWB. The students were surveyed twice: once regarding their current EWB and once regarding their aspired EWB. Our results show that students who are actively engaged in activities for sustainable development report a higher EWB than non-active students. Furthermore, we show that students generally report deficits in EWB and wish for an improvement in their well-being. This especially applies to aspects of EWB related to self-discovery and the sense of meaning in life. Our study suggests that a practice-oriented ESD in particular can have a positive effect on the quality of life of young students and can support them in working on deficits in EWB.
While the recent discussion on Art. 25 GDPR often presents the approach of data protection by design as an innovative idea, the notion of making data protection law more effective by requiring the data controller to implement the legal norms into the processing design is almost as old as the data protection debate itself. However, there is another, more recent shift in establishing the data protection by design approach through law, which is not yet understood to its fullest extent in the debate. Art. 25 GDPR requires the controller not only to implement the legal norms into the processing design but to do so in an effective manner. By explicitly declaring the effectiveness of the protection measures to be the legally required result, the legislator inevitably raises the question of which methods can be used to test and assure such efficacy. In our opinion, extending the legal compatibility assessment to the real effects of the required measures opens this approach to interdisciplinary methodologies. In this paper, we first summarise the current state of research on the methodology established in Art. 25 sect. 1 GDPR and pinpoint some of the challenges of incorporating interdisciplinary research methodologies. On this premise, we present an empirical research methodology and first findings which offer one approach to answering the question of how to specify processing purposes effectively. Lastly, we discuss the implications of these findings for the legal interpretation of Art. 25 GDPR and related provisions, especially with respect to a more effective implementation of transparency and consent, and provide an outlook on possible next research steps.
Effective Neighborhood Feature Exploitation in Graph CNNs for Point Cloud Object-Part Segmentation
(2022)
Part segmentation is semantic segmentation applied to individual objects and has a wide range of applications, from robotic manipulation to medical imaging. This work addresses part segmentation on raw, unordered point clouds of 3D objects. While pioneering deep-learning methods for point clouds typically ignore the local geometric structure around individual points, subsequent methods that extract features by exploiting local geometry have not yielded significant improvements either. To investigate further, this work uses a graph convolutional network (GCN) in an attempt to increase the effectiveness of such neighborhood feature exploitation. Most previous works also focus only on segmenting complete point clouds. Since complete point clouds are rarely available in real-world scenarios, this work additionally proposes approaches for segmenting partial point clouds.
To better capture neighborhood features, this work proposes a novel method that learns regional part descriptors which guide and refine the segmentation predictions. The proposed approach helps the network achieve state-of-the-art performance of 86.4% mIoU on the ShapeNetPart dataset among methods that use neither preprocessing techniques nor voting strategies. To better handle partial point clouds, this work also proposes new strategies for training and testing on partial data. Beyond the significant improvements these strategies achieve over the baseline, the problem of partial point cloud segmentation is also viewed through the alternate lens of semantic shape completion.
Semantic shape completion networks not only help with partial point cloud segmentation but also enrich the information captured by the system by predicting complete point clouds with a semantic label for each point. To this end, a new network architecture for semantic shape completion is proposed, based on the point completion network (PCN), which uses a graph-convolution-based hierarchical decoder for both completion and segmentation. In addition to predicting complete point clouds, results indicate that the network comes within 5% of the mIoU of dedicated segmentation networks on partial point cloud segmentation.
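The mIoU figures quoted above (86.4% on ShapeNetPart, within 5% on partial clouds) use per-part intersection-over-union averaged over an object's parts, which can be computed as follows. The labels are toy data, and the convention of scoring a part absent from both prediction and ground truth as 1.0 is one common ShapeNetPart choice, assumed here rather than taken from the thesis.

```python
import numpy as np

def part_miou(pred, target, num_parts):
    """Mean intersection-over-union across the parts of one object."""
    ious = []
    for part in range(num_parts):
        inter = np.sum((pred == part) & (target == part))
        union = np.sum((pred == part) | (target == part))
        if union == 0:
            ious.append(1.0)        # part absent in both: count as perfect
        else:
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy example: 8 points of an object with 3 parts
target = np.array([0, 0, 0, 1, 1, 2, 2, 2])
pred   = np.array([0, 0, 1, 1, 1, 2, 2, 0])
print(part_miou(pred, target, num_parts=3))   # → 0.611… (11/18)
```

Dataset-level mIoU is then typically the average of this per-object score over all test shapes, so every object contributes equally regardless of its point count.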