H-BRS Bibliography
Departments, institutes and facilities
- Fachbereich Angewandte Naturwissenschaften (23)
- Fachbereich Wirtschaftswissenschaften (23)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (23)
- Fachbereich Informatik (18)
- Institut für funktionale Gen-Analytik (IFGA) (17)
- Fachbereich Ingenieurwissenschaften und Kommunikation (14)
- Internationales Zentrum für Nachhaltige Entwicklung (IZNE) (14)
- Fachbereich Sozialpolitik und Soziale Sicherung (5)
- Institut für Cyber Security & Privacy (ICSP) (5)
- Institut für Verbraucherinformatik (IVI) (5)
Document Type
- Article (66)
- Conference Object (12)
- Part of a Book (7)
- Working Paper (4)
- Bachelor Thesis (2)
- Master's Thesis (2)
- Preprint (2)
- Conference Proceedings (1)
- Part of Periodical (1)
- Report (1)
Year of publication
- 2023 (99)
Has Fulltext
- yes (99)
Keywords
- document similarity (3)
- COVID-19 (2)
- Conservation practice (2)
- ENaC (2)
- GDPR (2)
- Global horizontal irradiance (2)
- Insect decline (2)
- Liebe (2)
- Love (2)
- Named Entity Recognition (2)
Neuromorphic computing aims to mimic the computational principles of the brain in silico and has motivated research into event-based vision and spiking neural networks (SNNs). Event cameras (ECs) capture local, independent changes in brightness, and offer superior power consumption, response latencies, and dynamic ranges compared to frame-based cameras. SNNs replicate neuronal dynamics observed in biological neurons and propagate information in sparse sequences of "spikes". Apart from biological fidelity, SNNs have demonstrated potential as an alternative to conventional artificial neural networks (ANNs), such as in reducing energy expenditure and inference time in visual classification. Although potentially beneficial for robotics, the novel event-driven and spike-based paradigms remain scarcely explored outside the domain of aerial robots.
To investigate the utility of brain-inspired sensing and data processing in a robotics application, we developed a neuromorphic approach to real-time, online obstacle avoidance on a manipulator with an onboard camera. Our approach adapts high-level trajectory plans with reactive maneuvers by processing emulated event data in a convolutional SNN, decoding neural activations into avoidance motions, and adjusting plans in a dynamic motion primitive formulation. We conducted simulated and real experiments with a Kinova Gen3 arm performing simple reaching tasks involving static and dynamic obstacles. Our implementation was systematically tuned, validated, and tested in sets of distinct task scenarios, and compared to a non-adaptive baseline through formalized quantitative metrics and qualitative criteria.
The neuromorphic implementation facilitated reliable avoidance of imminent collisions in most scenarios, with 84% and 92% median success rates in simulated and real experiments, where the baseline consistently failed. Adapted trajectories were qualitatively similar to baseline trajectories, indicating low impacts on safety, predictability and smoothness criteria. Among notable properties of the SNN were the correlation of processing time with the magnitude of perceived motions (captured in events) and robustness to different event emulation methods. Preliminary tests with a DAVIS346 EC showed similar performance, validating our experimental event emulation method. These results motivate future efforts to incorporate SNN learning, utilize neuromorphic processors, and target other robot tasks to further explore this approach.
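The trajectory-adaptation step described above can be illustrated with a minimal dynamic motion primitive (DMP) sketch. This is a generic, hypothetical formulation: the gains `alpha` and `beta`, the time constant `tau`, and the coupling term `c` (standing in for the decoded avoidance motion) are illustrative assumptions, not the parameters of the authors' implementation.

```python
def dmp_step(y, dy, g, f, c, dt, tau=1.0, alpha=25.0, beta=6.25):
    """One Euler step of a DMP transformation system:
    tau^2 * ydd = alpha * (beta * (g - y) - tau * dy) + f + c,
    where g is the goal, f is the learned forcing term, and c is an
    additive coupling term standing in for a decoded avoidance motion
    (illustrative gains, not the authors' parameters)."""
    ydd = (alpha * (beta * (g - y) - tau * dy) + f + c) / tau ** 2
    dy = dy + ydd * dt
    y = y + dy * dt
    return y, dy
```

With `c = 0` this reduces to a plain critically damped point attractor toward the goal; the avoidance behavior enters purely through the coupling term, matching the idea of adapting a high-level plan with reactive maneuvers rather than replanning it.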
In the last two decades, studies that analyse the political economy of sustainable energy transitions have become increasingly available. Yet very few attempts have been made to synthesize the factors discussed in this growing literature. This paper reviews the extant empirical literature on the political economy of sustainable energy transitions. Using a well-defined search strategy, a total of 36 empirical contributions covering the period 2008 to 2022 are reviewed in full text. Overall, the findings highlight the role of vested interests, advocacy coalitions and green constituencies, path dependency, external shocks, the policy and institutional environment, political institutions, and fossil fuel resource endowments as major political economy factors influencing sustainable energy transitions across both high-income countries and low- and middle-income countries. In addition, the paper highlights and discusses some critical knowledge gaps in the existing literature and provides suggestions for a future research agenda.
Airborne and spaceborne platforms are the primary data sources for large-scale forest mapping, but visual interpretation for individual species determination is labor-intensive. Hence, various studies focusing on forests have investigated the benefits of multiple sensors for automated tree species classification. However, transferable deep learning approaches for large-scale applications are still lacking. This gap motivated us to create a novel dataset for tree species classification in central Europe based on multi-sensor data from aerial, Sentinel-1 and Sentinel-2 imagery. In this paper, we introduce the TreeSatAI Benchmark Archive, which contains labels of 20 European tree species (i.e., 15 tree genera) derived from forest administration data of the federal state of Lower Saxony, Germany. We propose models and guidelines for the application of the latest machine learning techniques for the task of tree species classification with multi-label data. Finally, we provide various benchmark experiments, covering artificial neural networks and tree-based machine learning methods, that showcase the information which can be derived from the different sensors. We found that residual neural networks (ResNet) perform sufficiently well, with weighted precision scores of up to 79 %, using only the RGB bands of aerial imagery. This result indicates that the spatial content present within the 0.2 m resolution data is very informative for tree species classification. With the incorporation of Sentinel-1 and Sentinel-2 imagery, performance improved only marginally. However, the sole use of Sentinel-2 still allows for weighted precision scores of up to 74 % using either multi-layer perceptron (MLP) or Light Gradient Boosting Machine (LightGBM) models. Since the dataset is derived from real-world reference data, it contains high class imbalances.
We found that this dataset attribute negatively affects the models' performances for many of the underrepresented classes (i.e., scarce tree species). However, the class-wise precision of the best-performing late fusion model still reached values ranging from 54 % (Acer) to 88 % (Pinus). Based on our results, we conclude that deep learning techniques using aerial imagery could considerably support forestry administration in the provision of large-scale tree species maps at a very high resolution to plan for challenges driven by global environmental change. The original dataset used in this paper is shared via Zenodo (https://doi.org/10.5281/zenodo.6598390, Schulz et al., 2022). For citation of the dataset, we refer to this article.
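The weighted precision metric reported above can be computed with a small helper that weights per-class precision by class support (essentially what scikit-learn's `precision_score(average='weighted')` does for multi-label indicator arrays); the function name and the toy arrays are illustrative.

```python
import numpy as np

def weighted_precision(y_true, y_pred):
    """Support-weighted precision for multi-label 0/1 indicator arrays
    (rows = samples, columns = classes). Classes with no predictions
    contribute a precision of 0."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = (y_true & y_pred).sum(axis=0)          # true positives per class
    predicted = y_pred.sum(axis=0)              # predicted positives per class
    support = y_true.sum(axis=0)                # true instances per class
    prec = np.divide(tp, predicted,
                     out=np.zeros_like(tp, dtype=float),
                     where=predicted > 0)
    return float((prec * support).sum() / support.sum())
```

Because the weighting follows the (imbalanced) class supports, frequent genera such as Pinus dominate this score, which is why per-class precision is also worth reporting.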
A company's financial documents use tables along with text to organize data containing key performance indicators (KPIs), such as profit and loss, and the financial quantities linked to them. The quantity linked to a KPI in a table might not equal the quantity of the similarly described KPI in the text. Auditors spend substantial time manually auditing these financial inconsistencies, a process called consistency checking. In contrast to existing work, this paper attempts to automate this task with the help of transformer-based models. For consistency checking, it is essential that the embeddings of the table's KPIs encode both the semantic knowledge of the KPIs and the structural knowledge of the table. Therefore, this paper proposes a pipeline that uses a tabular model to obtain the table's KPI embeddings. The pipeline takes table and text KPIs as input, generates their embeddings, and then checks whether these KPIs are identical. The pipeline is evaluated on financial documents in the German language, and a comparative analysis of the quality of the cell embeddings from three tabular models is also presented. In the evaluation, the experiment that used English-translated text and table KPIs and the Tabbie model to generate the table KPI embeddings achieved an accuracy of 72.81% on the consistency checking task, outperforming the benchmark and the other tabular models.
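The embedding-comparison step of such a pipeline can be sketched as a simple cosine-similarity check. The threshold value, the function names, and the use of raw cosine similarity are illustrative assumptions standing in for the paper's actual decision step over Tabbie-generated embeddings.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def kpis_consistent(table_kpi_emb, text_kpi_emb, threshold=0.9):
    """Flag a table/text KPI pair as describing the same quantity when
    their embeddings are sufficiently similar. The threshold is an
    illustrative stand-in for a learned consistency classifier."""
    return cosine(table_kpi_emb, text_kpi_emb) >= threshold
```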
This paper presents the preliminary results of the Socialist Republic of Vietnam country case study conducted as part of the research project Sustainable Labour Migration implemented by the University of Applied Science Bonn-Rhein-Sieg. The project focuses on stakeholder perspectives on countries of origin benefits and the sustainability of different transnational skill partnership schemes. Existing and ongoing small-scale initiatives indicate that opportunities exist for all three types of labour mobility pathways, from recruiting youth for apprenticeships and subsequent skilled work to recruitment and recognition of skilled 'professionals' certificates for direct work contracts to initial vocational education and training programs in a dual-track approach. While the latter has the highest potential to be more beneficial than other approaches, pursuing and supporting the scaling up of all three pathways in parallel will have additional, mutually reinforcing and supporting effects. The potential for benefits over and above those already realised by existing skill partnerships appears high, especially considering the favourable framework conditions specific to the long-standing German-Vietnamese relationship. If the potential of well-managed skill partnerships was realised, such sustainable models of skilled labour migration could serve as a unique selling point in the international competition for skilled labour.
The transport of carbon dioxide through pipelines is one of the important components of Carbon dioxide Capture and Storage (CCS) systems that are currently being developed. If high flow rates are desired, transport in the liquid or supercritical phase is preferred. For technical reasons, the transport must stay in that phase, without transitioning to the gaseous state. In this paper, a numerical simulation of the stationary process of carbon dioxide transport with impurities and phase transitions is considered. We use the Homogeneous Equilibrium Model (HEM) and the GERG-2008 thermodynamic equation of state to describe the transport parameters. The algorithms used make it possible to solve scenarios of carbon dioxide transport in the liquid or supercritical phase, with detection of approach to the phase transition region. Convergence of the solution algorithms is analyzed in connection with fast and abrupt changes of the equation of state and the enthalpy function in the region of phase transitions.
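The idea of marching a stationary pipeline solution segment by segment can be conveyed with a toy sketch. It deliberately replaces the paper's compressible HEM/GERG-2008 thermodynamics with a constant density, velocity, and friction factor (incompressible Darcy-Weisbach), so it only illustrates the segment-wise structure of such a solver, not the actual model.

```python
def pressure_profile(p_in, length, diameter, rho, u, f=0.015, n=1000):
    """March a steady pressure profile along a pipeline in n segments
    using the incompressible Darcy-Weisbach relation
    dp = f * (dx / D) * 0.5 * rho * u^2.
    Toy stand-in for a compressible HEM + GERG-2008 formulation:
    density, velocity and friction factor are held constant here."""
    dx = length / n
    p, profile = p_in, [p_in]
    for _ in range(n):
        p -= f * (dx / diameter) * 0.5 * rho * u * u
        profile.append(p)
    return profile
```

In the real model, density and enthalpy would be re-evaluated from the equation of state at every segment, and the solver would check whether the local state approaches the phase transition region.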
A biodegradable blend of PBAT—poly(butylene adipate-co-terephthalate)—and PLA—poly(lactic acid)—for blown film extrusion was modified with four multi-functional chain extending cross-linkers (CECL). The anisotropic morphology introduced during film blowing affects the degradation processes. Given that two CECL, tris(2,4-di-tert-butylphenyl)phosphite (V1) and 1,3-phenylenebisoxazoline (V2), increased the melt flow rate (MFR), while the other two, aromatic polycarbodiimide (V3) and poly(4,4-dicyclohexylmethanecarbodiimide) (V4), reduced it, their compost (bio-)disintegration behavior was investigated. It was significantly altered with respect to the unmodified reference blend (REF). The disintegration behavior at 30 and 60 °C was investigated by determining changes in mass, Young’s moduli, tensile strengths, elongations at break and thermal properties. In order to quantify the disintegration behavior, the hole areas of blown films were evaluated after compost storage at 60 °C to calculate the kinetics of the time-dependent degrees of disintegration. The kinetic model of disintegration provides two parameters: initiation time and disintegration time. They quantify the effects of the CECL on the disintegration behavior of the PBAT/PLA compound. Differential scanning calorimetry (DSC) revealed a pronounced annealing effect during storage in compost at 30 °C, as well as the occurrence of an additional step-like increase in the heat flow at 75 °C after storage at 60 °C. The disintegration consists of processes which affect the amorphous and crystalline phases of PBAT in different manners and cannot be understood as hydrolytic chain degradation alone. Furthermore, gel permeation chromatography (GPC) revealed molecular degradation only at 60 °C for the REF and V1 after 7 days of compost storage. The observed losses of mass and cross-sectional area seem to be attributable more to mechanical decay than to molecular degradation for the given compost storage times.
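The two-parameter kinetic model mentioned above can be illustrated with one plausible functional form: a lag phase up to the initiation time, followed by a first-order process whose characteristic constant is the disintegration time. This functional form is an assumption for illustration; the paper's exact kinetic model may differ.

```python
import math

def degree_of_disintegration(t, t_init, t_dis):
    """Degree of disintegration (0..1) after compost storage time t.
    Assumes no measurable decay before the initiation time t_init and
    a first-order process with characteristic disintegration time
    t_dis afterwards — an assumed functional form for illustration."""
    if t <= t_init:
        return 0.0
    return 1.0 - math.exp(-(t - t_init) / t_dis)
```

Fitting such a curve to the measured hole areas yields one (t_init, t_dis) pair per CECL variant, which is what makes the effects of the four cross-linkers directly comparable.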
Solar photovoltaic power output is modulated by atmospheric aerosols and clouds and thus contains valuable information on the optical properties of the atmosphere. As a ground-based data source with high spatiotemporal resolution it has great potential to complement other ground-based solar irradiance measurements as well as those of weather models and satellites, thus leading to an improved characterisation of global horizontal irradiance. In this work several algorithms are presented that can retrieve global tilted and horizontal irradiance and atmospheric optical properties from solar photovoltaic data and/or pyranometer measurements. The method is tested on data from two measurement campaigns that took place in the Allgäu region in Germany in autumn 2018 and summer 2019, and the results are compared with local pyranometer measurements as well as satellite and weather model data. Using power data measured at 1 Hz and averaged to 1 min resolution along with a non-linear photovoltaic module temperature model, global horizontal irradiance is extracted with a mean bias error compared to concurrent pyranometer measurements of 5.79 W m−2 (7.35 W m−2) under clear (cloudy) skies, averaged over the two campaigns, whereas for the retrieval using coarser 15 min power data with a linear temperature model the mean bias error is 5.88 and 41.87 W m−2 under clear and cloudy skies, respectively.
During completely overcast periods the cloud optical depth is extracted from photovoltaic power using a lookup table method based on a 1D radiative transfer simulation, and the results are compared to both satellite retrievals and data from the Consortium for Small-scale Modelling (COSMO) weather model. Potential applications of this approach for extracting cloud optical properties are discussed, as well as certain limitations, such as the representation of 3D radiative effects that occur under broken-cloud conditions. In principle this method could provide an unprecedented amount of ground-based data on both irradiance and optical properties of the atmosphere, as long as the required photovoltaic power data are available and properly pre-screened to remove unwanted artefacts in the signal. Possible solutions to this problem are discussed in the context of future work.
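The mean bias error used as the comparison metric above is straightforward to compute; this sketch assumes retrieved and reference irradiance series that are aligned on the same timestamps.

```python
import numpy as np

def mean_bias_error(retrieved, reference):
    """Mean bias error between retrieved global horizontal irradiance
    and concurrent pyranometer reference values (same units, aligned
    timestamps). Positive values mean the retrieval overestimates."""
    retrieved = np.asarray(retrieved, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.mean(retrieved - reference))
```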
Solar photovoltaic power output is modulated by atmospheric aerosols and clouds and thus contains valuable information on the optical properties of the atmosphere. As a ground-based data source with high spatiotemporal resolution it has great potential to complement other ground-based solar irradiance measurements as well as those of weather models and satellites, thus leading to an improved characterisation of global horizontal irradiance. In this work several algorithms are presented that can retrieve global tilted and horizontal irradiance and atmospheric optical properties from solar photovoltaic data and/or pyranometer measurements. Specifically, the aerosol (cloud) optical depth is inferred during clear sky (completely overcast) conditions. The method is tested on data from two measurement campaigns that took place in Allgäu, Germany in autumn 2018 and summer 2019, and the results are compared with local pyranometer measurements as well as satellite and weather model data. Using power data measured at 1 Hz and averaged to 1 minute resolution, the hourly global horizontal irradiance is extracted with a mean bias error compared to concurrent pyranometer measurements of 11.45 W m−2, averaged over the two campaigns, whereas for the retrieval using coarser 15 minute power data the mean bias error is 16.39 W m−2.
During completely overcast periods the cloud optical depth is extracted from photovoltaic power using a lookup table method based on a one-dimensional radiative transfer simulation, and the results are compared to both satellite retrievals as well as data from the COSMO weather model. Potential applications of this approach for extracting cloud optical properties are discussed, as well as certain limitations, such as the representation of 3D radiative effects that occur under broken cloud conditions. In principle this method could provide an unprecedented amount of ground-based data on both irradiance and optical properties of the atmosphere, as long as the required photovoltaic power data are available and are properly pre-screened to remove unwanted artefacts in the signal. Possible solutions to this problem are discussed in the context of future work.
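The lookup-table inversion for cloud optical depth can be sketched as a 1D interpolation over pre-computed radiative-transfer results. The grids below and the monotonicity assumption are illustrative, not actual simulation output.

```python
import numpy as np

def retrieve_cod(ghi, cod_grid, simulated_ghi):
    """Invert a 1D radiative-transfer lookup table: return the cloud
    optical depth whose simulated overcast irradiance matches the
    irradiance inferred from PV power. Assumes simulated_ghi decreases
    monotonically with cod_grid (thicker clouds transmit less)."""
    # np.interp requires ascending x values, so flip the
    # monotonically decreasing irradiance axis
    return float(np.interp(ghi, simulated_ghi[::-1], cod_grid[::-1]))
```

A real lookup table would also depend on solar zenith angle and surface albedo, adding dimensions to the interpolation, and would only be valid under the 1D (fully overcast) assumption discussed above.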
When optimizing the process parameters of the acidic ethanolic organosolv process, the aim is usually to maximize the delignification and/or lignin purity. However, process parameters such as temperature, time, ethanol and catalyst concentration, respectively, can also be used to vary the structural properties of the obtained organosolv lignin, including the molecular weight and the ratio of aliphatic versus phenolic hydroxyl groups, among others. This review particularly focuses on these influencing factors and establishes a trend analysis between the variation of the process parameters and the effect on lignin structure. Especially when larger data sets are available, as for process temperature and time, correlations between the distribution of depolymerization and condensation reactions are found, which allow direct conclusions on the proportion of lignin's structural features, independent of the diversity of the biomass used. The newfound insights gained from this review can be used to tailor organosolv lignins isolated for a specific application.
Consumers seem to have lost the desire to express an individual clothing style, as online retail has led to an ever-growing range of choices. Among other things, this results in the use of virtual style consulting services. These services aim to "make" customers as efficient, individual and authentic as possible, and can thus be understood as a paradoxical process of democratization. An explanation for the success of these services is offered with support from Reckwitz's singularization thesis.
Dried serum spots that are well prepared can be attractive alternatives to frozen serum samples for shelving specimens in a medical or research center's biobank and for mailing freshly prepared serum to specialized laboratories. During the pre-analytical phase, complications can arise which are often challenging to identify or are entirely overlooked. These complications can lead to reproducibility issues, which can be avoided in serum protein analysis by implementing optimized storage and transfer procedures. A method that ensures accurate loading of filter paper discs with donor or patient serum fills a gap in dried serum spot preparation and subsequent serum analysis. Pre-punched filter paper discs with a 3 mm diameter are loaded within seconds in a highly reproducible fashion (approximately 10% standard deviation) when fully submerged in 10 μl of serum, in a procedure named the "Submerge and Dry" protocol. Dried serum spots prepared this way can store several hundred micrograms of proteins and other serum components. Serum-borne antigens and antibodies are reproducibly released in 20 μl elution buffer in high yields (approximately 90%). Antigens stored in and eluted from dried serum spots kept their epitopes, and antibodies kept their antigen binding abilities, as assessed by SDS-PAGE, 2D gel electrophoresis-based proteomics, and Western blot analysis, suggesting pre-punched filter paper discs as a handy solution for serological tests.
Several species of (poly)saccharides and organic acids can often be found simultaneously in various biological matrices, e.g., fruits, plant materials, and biological fluids. The analysis of such matrices sometimes represents a challenging task. Using Aloe vera (A. vera) plant materials as an example, the performance of several spectroscopic methods (80 MHz benchtop NMR, NIR, ATR-FTIR and UV-Vis) for the simultaneous analysis of quality parameters of this plant material was compared. The determined parameters include (poly)saccharides such as aloverose, fructose and glucose, as well as organic acids (malic, lactic, citric, isocitric, acetic, fumaric, benzoic and sorbic acids). 500 MHz NMR and high-performance liquid chromatography (HPLC) were used as the reference methods.
UV-Vis data can be used only for the identification of added preservatives (benzoic and sorbic acids) and the drying agent (maltodextrin), and for semiquantitative analysis of malic acid. NIR and MIR spectroscopies combined with multivariate regression can deliver a more informative overview of A. vera extracts, being able to additionally quantify glucose, aloverose, fructose and citric, isocitric, malic and lactic acids. Low-field NMR measurements can be used for the quantification of aloverose, glucose, malic, lactic, acetic, and benzoic acids. The benchtop NMR method was successfully validated in terms of robustness, stability, precision, reproducibility, limit of detection (LOD) and limit of quantification (LOQ).
All spectroscopic techniques are useful for the screening of (poly)saccharides and organic acids in plant extracts and should be applied according to their availability as well as the information and confidence required for the specific analytical goal. Benchtop NMR spectroscopy seems to be the most feasible solution for quality control of A. vera products.
This work presents an open source database with suitable retention parameters for the prediction and simulation of GC separations and gives a short introduction to three common retention models. Useful computer simulations play an important role in saving resources and time during GC method development. Thermodynamic retention parameters for the ABC model and the K-centric model are determined by isothermal measurements. The standardized procedure of measurements and calculations presented in this work has a useful benefit for all chromatographers, analytical chemists, and method developers, because it can be used in their own laboratories to simplify method development. The main benefits, such as the simulation of temperature-programmed GC separations, are demonstrated and compared to measurements. The observed deviations of predicted retention times are in most cases less than 1%. The database includes more than 900 entries with a large range of compounds, such as VOCs, PAHs, FAMEs, PCBs, and allergenic fragrances, over 20 different GC columns.
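For the ABC model, database parameters can be turned into isothermal predictions as sketched below, assuming the common form ln k = A + B/T + C·ln T together with the hold-up time relation t_R = t_M(1 + k). The parameter values are compound- and column-specific entries taken from such a database; the ones in the test are placeholders, not real entries.

```python
import math

def retention_factor_abc(T, A, B, C):
    """Isothermal retention factor k(T) from the ABC model,
    ln k = A + B / T + C * ln(T), with T in kelvin. A, B and C are
    compound- and column-specific parameters from the database."""
    return math.exp(A + B / T + C * math.log(T))

def retention_time(T, t_M, A, B, C):
    """Isothermal retention time t_R = t_M * (1 + k), where t_M is
    the column hold-up time at temperature T."""
    return t_M * (1.0 + retention_factor_abc(T, A, B, C))
```

Temperature-programmed separations are then simulated by integrating the local retention factor along the temperature program, segment by segment, which is where the sub-1% retention time deviations above come from.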
Neutral buoyancy has been used as an analog for microgravity from the earliest days of human spaceflight. Compared to other options on Earth, neutral buoyancy is relatively inexpensive and presents little danger to astronauts while simulating some aspects of microgravity. Neutral buoyancy removes somatosensory cues to the direction of gravity but leaves vestibular cues intact. Removal of both somatosensory and direction of gravity cues while floating in microgravity, or using virtual reality to establish conflicts between them, has been shown to affect the perception of distance traveled in response to visual motion (vection) and the perception of distance. Does removal of somatosensory cues alone by neutral buoyancy similarly impact these perceptions? During neutral buoyancy we found no significant difference in either perceived distance traveled or perceived size relative to Earth-normal conditions. This contrasts with differences in linear vection reported between short- and long-duration microgravity and Earth-normal conditions. These results indicate that neutral buoyancy is not an effective analog for microgravity for these perceptual effects.
PURPOSE
Cervical cancer (CC) is caused by a persistent high-risk human papillomavirus (hrHPV) infection. The cervico-vaginal microbiome may influence the development of (pre)cancer lesions. Aim of the study was (i) to evaluate the new CC screening program in Germany for the detection of high-grade CC precursor lesions, and (ii) to elucidate the role of the cervico-vaginal microbiome and its potential impact on cervical dysplasia.
METHODS
The microbiome of 310 patients referred to colposcopy was determined by amplicon sequencing and correlated with clinicopathological parameters.
RESULTS
Most patients were referred for colposcopy due to a positive hrHPV result in two consecutive years combined with a normal PAP smear. In 2.1% of these cases, a CIN III lesion was detected. There was a significant positive association between the PAP stage and Lactobacillus vaginalis colonization and between the severity of CC precursor lesions and Ureaplasma parvum.
CONCLUSION
In our cohort, the new cervical cancer screening program resulted in a low rate of additionally detected CIN III. It is questionable whether these cases were merely identified earlier through additional HPV testing, before the appearance of cytological abnormalities, or whether the new screening program will truly increase the detection rate of CIN III in the long run. Colonization with U. parvum was associated with histological dysplastic lesions. Whether targeted therapy of this pathogen or optimization of the microbiome prevents dysplasia remains speculative.
Climate change is transforming the risks individuals and households face, with potentially profound socioeconomic consequences such as increased poverty, inequality, and social instability. Social protection is a policy tool that governments use to help individuals and households manage risks linked to income and livelihoods, and to achieve societal outcomes such as reducing poverty and inequality. Despite its potential as a policy response to climate change, the integration of social protection within the climate policy agenda is currently limited. While the concept of risk is key to both sectors, different understandings of the nature and scope of climate change impacts and their implications, as well as of the adequacy of social protection instruments to address them, contribute to the lack of policy and practice integration.
Our goal is to bridge this cognitive gap by highlighting the potential of social protection as a policy response to climate change. Using a comprehensive climate risk lens, we first explore how climate change drives risks that are within the realm of social protection, and their implications, including likely future trends in demand for social protection. Based on this analysis, we critically review existing arguments for what social protection can do and evidence of what it currently does to manage risks arising from climate change. From the analysis, a set of reconceptualised roles emerge for social protection to strategically contribute to climate-resilient development.
A Fourier scatterometry setup is evaluated to recover the key parameters of optical phase gratings. Based on these parameters, systematic errors in the printing process of two-photon polymerization (TPP) gray-scale lithography three-dimensional printers can be compensated, namely tilt and curvature deviations. The proposed setup is significantly cheaper than a confocal microscope, which is usually used to determine calibration parameters for compensation of the TPP printing process. The grating parameters recovered this way are compared to those obtained with a confocal microscope. A clear correlation between confocal and scatterometric measurements is first shown for structures containing either tilt or curvature. The correlation is also shown for structures containing a mixture of tilt and curvature errors (squared Pearson coefficient r2 = 0.92). This compensation method is demonstrated on a TPP printer: a diffractive optical element printed with correction parameters obtained from Fourier scatterometry shows a significant reduction in noise as compared to the uncompensated system. This verifies the successful reduction of tilt and curvature errors. Further improvements of the method are proposed, which may enable the measurements to become more precise than confocal measurements in the future, since scatterometry is not affected by the diffraction limit.
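The squared Pearson coefficient used above to report the agreement between confocal and scatterometric measurements can be computed directly; the sample arrays below are illustrative.

```python
import numpy as np

def r_squared(x, y):
    """Squared Pearson correlation coefficient between two aligned
    measurement series (e.g. confocal vs. scatterometric values)."""
    r = np.corrcoef(np.asarray(x, dtype=float),
                    np.asarray(y, dtype=float))[0, 1]
    return float(r * r)
```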
Trust your guts: fostering embodied knowledge and sustainable practices through voice interaction
(2023)
Despite various attempts to prevent food waste and motivate conscious food handling, household members find it difficult to correctly assess the edibility of food. With the rise of ambient voice assistants, we conducted a design case study to support households’ in situ decision-making process in collaboration with our voice agent prototype, Fischer Fritz. To this end, we conducted 15 contextual inquiries to understand food practices at home. Furthermore, we interviewed six fish experts to inform the design of our voice agent with respect to guiding consumers and teaching food literacy. Finally, we created a prototype and discussed with 15 consumers its impact and capability to convey embodied knowledge to the human, who is engaged as a sensor. Our design research goes beyond current Human-Food Interaction automation approaches by emphasizing the human-food relationship in technology design and demonstrating future complementary human-agent collaboration, with the aim of increasing humans’ competence to sense, think, and act.
Cyanobacteria are gaining considerable interest as a means of supporting the long-term presence of humans on the Moon and settlements on Mars, due to their ability to produce oxygen and their potential as bio-factories for space biotechnology/synthetic biology and other applications. Since many unknowns remain to be resolved before cyanobacterial bioprocesses can move from Earth to space, we investigated the resumption of cell division upon rehydration of dried Chroococcidiopsis sp. CCMEE 029 that had accumulated DNA damage while exposed to space vacuum, Mars-like conditions, and Fe-ion radiation. Upon rehydration, monitoring of the ftsZ gene showed that cell division was arrested until DNA damage was repaired, which took 48 h under laboratory conditions. During the recovery, progressive DNA repair lasting 48 h of rehydration was revealed by a PCR-stop assay. This was followed by overexpression of the ftsZ gene, ranging from 7.5- to 9-fold compared to the non-hydrated samples. Knowing the time required for DNA repair and cell division resumption is mandatory for deep-space experiments designed to unravel the effects of reduced gravity/microgravity on this process. It is also necessary to meet mission requirements for dried-sample implementation and real-time monitoring upon recovery. Future experiments as part of the lunar exploration mission Artemis and the lunar gateway station will undoubtedly help to move cyanobacterial bioprocesses beyond low Earth orbit. From an astrobiological perspective, these experiments will further our understanding of microbial responses to deep-space conditions.