H-BRS Bibliography
Renewable resources are gaining increasing interest as a source of environmentally benign biomaterials, such as drug encapsulation/release compounds and scaffolds for tissue engineering in regenerative medicine. As lignin is the second most abundant natural polymer, interest in its valorization for biomedical utilization is growing rapidly. Depending on resource and isolation procedure, lignin shows specific antioxidant and antimicrobial activity. Today, efforts in research and industry are directed toward lignin utilization as a renewable macromolecular building block for the preparation of polymeric drug encapsulation and scaffold materials. Within the last five years, remarkable progress has been made in the isolation, functionalization and modification of lignin and lignin-derived compounds. However, the literature so far has mainly focused on lignin-derived fuels, lubricants and resins. The purpose of this review is to summarize the current state of the art and to highlight the most important results in the field of lignin-based materials for potential use in biomedicine (reported in 2014–2018). Special focus is placed on lignin-derived nanomaterials for drug encapsulation and release as well as lignin hybrid materials used as scaffolds for guided bone regeneration in stem cell-based therapies.
Antioxidant activity is an essential feature of oxygen-sensitive merchandise and goods, such as food and the corresponding packaging, as well as materials used in cosmetics and biomedicine. For example, vanillin, one of the most prominent antioxidants, is produced from lignin, the second most abundant natural polymer in the world. Antioxidant potential is primarily related to the termination of oxidation propagation reactions through hydrogen transfer. The application of technical lignin as a natural antioxidant has not yet been implemented in the industrial sector, mainly due to the complex heterogeneous structure and polydispersity of lignin. Thus, current research focuses on various isolation and purification strategies to improve the compatibility of lignin material with substrates and to enhance its stabilizing effect.
Solar photovoltaic power output is modulated by atmospheric aerosols and clouds and thus contains valuable information on the optical properties of the atmosphere. As a ground-based data source with high spatiotemporal resolution it has great potential to complement other ground-based solar irradiance measurements as well as those of weather models and satellites, thus leading to an improved characterisation of global horizontal irradiance. In this work several algorithms are presented that can retrieve global tilted and horizontal irradiance and atmospheric optical properties from solar photovoltaic data and/or pyranometer measurements. Specifically, the aerosol (cloud) optical depth is inferred during clear sky (completely overcast) conditions. The method is tested on data from two measurement campaigns that took place in Allgäu, Germany in autumn 2018 and summer 2019, and the results are compared with local pyranometer measurements as well as satellite and weather model data. Using power data measured at 1 Hz and averaged to 1 minute resolution, the hourly global horizontal irradiance is extracted with a mean bias error of 11.45 W m−2 relative to concurrent pyranometer measurements, averaged over the two campaigns, whereas for the retrieval using coarser 15 minute power data the mean bias error is 16.39 W m−2.
During completely overcast periods the cloud optical depth is extracted from photovoltaic power using a lookup table method based on a one-dimensional radiative transfer simulation, and the results are compared to both satellite retrievals as well as data from the COSMO weather model. Potential applications of this approach for extracting cloud optical properties are discussed, as well as certain limitations, such as the representation of 3D radiative effects that occur under broken cloud conditions. In principle this method could provide an unprecedented amount of ground-based data on both irradiance and optical properties of the atmosphere, as long as the required photovoltaic power data are available and are properly pre-screened to remove unwanted artefacts in the signal. Possible solutions to this problem are discussed in the context of future work.
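The lookup-table retrieval of cloud optical depth described above amounts to inverting a monotonic table of simulated power versus optical depth. The sketch below illustrates the idea; the table values are illustrative placeholders, not output of the radiative transfer simulation used in the study:

```python
import numpy as np

# Hypothetical lookup table from a 1D radiative transfer simulation:
# normalised PV power as a monotonically decreasing function of cloud
# optical depth (COD). Values are for illustration only.
cod_grid = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0])
power_grid = np.array([0.80, 0.65, 0.42, 0.25, 0.12, 0.04, 0.01])

def retrieve_cod(p_norm):
    """Invert the table: measured normalised power -> cloud optical depth.

    np.interp requires increasing x values, so both arrays are reversed.
    """
    p = np.clip(p_norm, power_grid[-1], power_grid[0])
    return float(np.interp(p, power_grid[::-1], cod_grid[::-1]))
```

Lower measured power maps to a higher retrieved optical depth; in the real retrieval the table would additionally depend on solar geometry and surface albedo.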
In this contribution, we perform computer simulations to expedite the development of metal hydride-based hydrogen storage systems. These simulations enable an in-depth analysis of the processes within the systems which could otherwise not be achieved, because determining crucial process properties would require measurement instruments that are currently not available for the setup. Therefore, we investigate the reliability of reaction values that are determined by a design of experiments.
Specifically, we first explain our model setup in detail. We define the mathematical terms to obtain insights into the thermal processes and reaction kinetics. We then compare the simulated results to measurements of a 5-gram sample of iron-titanium-manganese (FeTiMn) to obtain the values with the highest agreement with the experimental data. In addition, we improve the model by replacing the commonly used van't Hoff equation with a mathematical expression for the pressure-composition isotherms (PCI) to calculate the equilibrium pressure.
Finally, the parameters' accuracy is checked in yet another comparison with an existing metal hydride system. The simulated results demonstrate high concordance with the experimental data, which advocates the use of kinetic reaction properties approximated by a design of experiments for further design studies. Furthermore, we are able to determine process parameters such as the entropy and enthalpy.
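For reference, the van't Hoff baseline that the authors replace with a PCI expression can be sketched in a few lines; the enthalpy and entropy values below are illustrative, roughly in the range of FeTi-type hydrides, and are not the fitted parameters from the study:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def p_eq_vant_hoff(T, dH=-28000.0, dS=-107.0, p_ref=1e5):
    """Equilibrium pressure from the van't Hoff relation,
    ln(p_eq / p_ref) = dH/(R*T) - dS/R,
    with formation enthalpy dH (J/mol) and entropy dS (J/(mol K))
    both negative for hydride formation. Returns pressure in Pa."""
    return p_ref * math.exp(dH / (R * T) - dS / R)
```

Because a single (dH, dS) pair yields one flat plateau per temperature, this form cannot capture plateau slope and hysteresis, which motivates replacing it with a fitted PCI expression.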
Today, more than 70 million tons of lignin are produced by the pulp and paper industry every year. However, the utilization of lignin as a source for chemical synthesis is still limited due to the complex and heterogeneous lignin structure. The purpose of this study was the selective photodegradation of industrially available kraft lignin in order to obtain suitable fragments and building-block chemicals for further utilization, e.g. polymerization. Thus, kraft lignin obtained from softwood black liquor by acidification was dissolved in sodium hydroxide and irradiated at a wavelength of 254 nm with and without the presence of titanium dioxide in various concentrations. Analyses of the irradiated products via SEC showed decreasing molar masses and decreasing polydispersity indices over time. At the end of the irradiation period the lignin was depolymerised to fragments as small as the lignin monomers. TOC analyses showed minimal mineralisation due to the depolymerisation process.
4GREAT is an extension of the German Receiver for Astronomy at Terahertz frequencies (GREAT) operated aboard the Stratospheric Observatory for Infrared Astronomy (SOFIA). The spectrometer comprises four different detector bands and their associated subsystems for simultaneous and fully independent science operation. All detector beams are co-aligned on the sky. The frequency bands of 4GREAT cover 491-635, 890-1090, 1240-1525 and 2490-2590 GHz, respectively. This paper presents the design and characterization of the instrument, and its in-flight performance. 4GREAT saw first light in June 2018, and has been offered to the interested SOFIA communities starting with observing cycle 6.
In an effort to assist researchers in choosing basis sets for quantum mechanical modeling of molecules (i.e. balancing calculation cost versus desired accuracy), we present a systematic study of the accuracy of computed conformational relative energies and their geometries in comparison to MP2/CBS and MP2/AV5Z data, respectively. In order to do so, we introduce a new nomenclature to unambiguously indicate how a CBS extrapolation was computed. Nineteen minima and transition states of buta-1,3-diene, propan-2-ol and the water dimer were optimized using forty-five different basis sets. Specifically, this includes one Pople (i.e. 6-31G(d)), eight Dunning (i.e. VXZ and AVXZ, X=2-5), twenty-five Jensen (i.e. pc-n, pcseg-n, aug-pcseg-n, pcSseg-n and aug-pcSseg-n, n=0-4) and nine Karlsruhe (e.g. def2-SV(P), def2-QZVPPD) basis sets. The molecules were chosen to represent both common and electronically diverse molecular systems. In comparison to MP2/CBS relative energies computed using the largest Jensen basis sets (i.e. n=2,3,4), the use of smaller sizes (n=0,1,2 and n=1,2,3) provides results that are within 0.11–0.24 and 0.09–0.16 kcal/mol, respectively. To practically guide researchers in their basis set choice, an equation is introduced that ranks basis sets based on a user-defined balance between their accuracy and calculation cost. Furthermore, we explain why the aug-pcseg-2, def2-TZVPPD and def2-TZVP basis sets are very suitable choices to balance speed and accuracy.
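A user-weighted accuracy-versus-cost ranking of the kind described can be illustrated with a simple normalised score; the scoring form and the example numbers below are assumptions for illustration, not the equation or data from the study:

```python
def rank_basis_sets(results, w):
    """Rank basis sets by a user-weighted trade-off between accuracy and cost.

    results: dict mapping basis-set name -> (error, cost); w in [0, 1]
    weights accuracy (w=1) against speed (w=0). Errors and costs are
    min-max normalised so the two criteria are comparable; the best
    (lowest-score) basis set comes first.
    """
    errs = {n: e for n, (e, t) in results.items()}
    costs = {n: t for n, (e, t) in results.items()}
    def norm(d):
        lo, hi = min(d.values()), max(d.values())
        return {n: (v - lo) / (hi - lo) if hi > lo else 0.0 for n, v in d.items()}
    ne, nc = norm(errs), norm(costs)
    score = {n: w * ne[n] + (1 - w) * nc[n] for n in results}
    return sorted(results, key=lambda n: score[n])

# Illustrative (error in kcal/mol, relative cost) values, not measured data:
example = {"pcseg-1": (0.24, 1.0), "aug-pcseg-2": (0.10, 6.0),
           "def2-QZVPPD": (0.03, 40.0)}
```

With `w=1.0` the most accurate set wins; with `w=0.0` the cheapest; intermediate weights favour balanced choices such as the triple-zeta-quality sets the authors recommend.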
The promotion of sustainable packaging is part of the European Green Deal and plays a key role in the EU's social and political strategy. One option is the use of renewable resources and biomass waste as raw materials for polymer production. Lignocellulosic biomass from annual and perennial industrial crops and agricultural residues is a major source of polysaccharides, proteins, and lignin, and can also be used to obtain plant-based extracts and essential oils. Therefore, these biomasses are considered potential substitutes for fossil-based resources. Here, the status quo of bio-based polymers is discussed and evaluated in terms of properties relevant to packaging applications, such as gas and water vapor permeability as well as mechanical properties. So far, their practical use is still restricted due to lower performance in fundamental packaging functions that directly influence food quality and safety, the length of shelf life and thus the amount of food waste. Besides bio-based polymers, this review focuses on plant extracts as active packaging agents. Incorporating extracts of herbs, flowers, trees, and their fruits is essential to achieve the desired material properties capable of prolonging food shelf life. Finally, the adoption potential of packaging based on polymers from renewable resources is discussed from a bioeconomy perspective.
Start-ups compete fiercely with established companies and corporations for qualified employees. The demand for skilled professionals (such as software developers) is greater than ever [1]. How do start-ups present themselves as employers in order to attract staff? This question was investigated in the study „Start-ups als Arbeitgeber" (start-ups as employers).
The images from Silicon Valley are well known: the open-plan office with seating corners to retreat to; swings, foosball tables and video games for relaxing during work breaks; food and drinks available everywhere, naturally free of charge. Many people have these images in mind. Are they reflected in how German start-ups present themselves as employers?
The study presented here does not aim to deliver generalizable results; it is exploratory in design and is intended to stimulate further engagement with this research field in academia and practice.
Long-term variability of solar irradiance and its implications for photovoltaic power in West Africa
(2020)
This paper addresses long-term changes in solar irradiance over West Africa (3° N to 20° N and 20° W to 16° E) and their implications for photovoltaic power systems. Here we use satellite irradiance (Surface Solar Radiation Data Set-Heliosat, Edition 2.1, SARAH-2.1) to derive photovoltaic yields. Based on 35 years of data (1983–2017), the temporal and regional variability as well as long-term trends of global and direct horizontal irradiance are analyzed. Furthermore, a detailed time series analysis is undertaken at four locations. The dry and the wet season are considered separately.
This paper presents the b-it-bots RoboCup@Work team and its current hardware and functional architecture for the KUKA youBot robot. We describe the underlying software framework and the developed capabilities required for operating in industrial environments including features such as reliable and precise navigation, flexible manipulation, robust object recognition and task planning. New developments include an approach to grasp vertical objects, placement of objects by considering the empty space on a workstation, and the process of porting our code to ROS2.
TinyECC 2.0 is an open source library for Elliptic Curve Cryptography (ECC) in wireless sensor networks. This paper analyzes the side channel susceptibility of TinyECC 2.0 on a LOTUS sensor node platform. In our work we measured the electromagnetic (EM) emanation during computation of the scalar multiplication using 56 different configurations of TinyECC 2.0. All of them were found to be vulnerable, but to different degrees. The different degrees of leakage include adversary success using (i) Simple EM Analysis (SEMA) with a single measurement, (ii) SEMA using averaging, and (iii) Multiple-Exponent Single-Data (MESD) with a single measurement of the secret scalar. Most critically, in 30 TinyECC 2.0 configurations a single EM measurement of an ECC private key operation is sufficient to simply read out the secret scalar. MESD requires additional adversary capabilities and affects all TinyECC 2.0 configurations, again with only a single measurement of the ECC private key operation. These findings give evidence that in security applications a TinyECC 2.0 configuration should be chosen that withstands SEMA with a single measurement and that, beyond this, appropriate randomizing countermeasures should be added.
Machine learning and neural networks are now ubiquitous in sonar perception, but the field lags behind computer vision due to the lack of data and pre-trained models specifically for sonar images. In this paper we present the Marine Debris Turntable dataset and produce pre-trained neural networks trained on this dataset, meant to fill the gap of missing pre-trained models for sonar images. We train ResNet-20, MobileNets, DenseNet121, SqueezeNet, MiniXception, and an Autoencoder, over several input image sizes, from 32 x 32 to 96 x 96, on the Marine Debris Turntable dataset. We evaluate these models using transfer learning for low-shot classification on the Marine Debris Watertank dataset and on another dataset captured using a Gemini 720i sonar. Our results show that in both datasets the pre-trained models produce good features that allow good classification accuracy with few samples (10-30 samples per class). The Gemini dataset validates that the features transfer to other kinds of sonar sensors. We expect the community to benefit from the public release of our pre-trained models and the turntable dataset.
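Low-shot classification on frozen pre-trained features, as evaluated here, can be sketched with a nearest-centroid classifier; this is an illustrative stand-in for the transfer-learning evaluation, not the exact protocol used in the paper:

```python
import numpy as np

def nearest_centroid_predict(train_feats, train_labels, test_feats):
    """Classify test samples by cosine similarity to per-class centroids
    computed from a handful of labelled feature vectors (e.g. the
    10-30 samples per class of a low-shot setting).

    train_feats: (n, d) array of pre-trained network features.
    """
    def unit(x):
        return x / np.linalg.norm(x, axis=1, keepdims=True)
    train_feats, test_feats = unit(train_feats), unit(test_feats)
    classes = np.unique(train_labels)
    centroids = unit(np.stack([train_feats[train_labels == c].mean(axis=0)
                               for c in classes]))
    # Cosine similarity of each test vector to each class centroid.
    return classes[np.argmax(test_feats @ centroids.T, axis=1)]
```

If the pre-trained features cluster well by class, even this parameter-free classifier reaches good accuracy with very few labelled samples, which is the property the benchmark measures.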
Background: Virtual reality combined with spherical treadmills is used across species for studying neural circuits underlying navigation.
New Method: We developed an optical flow-based method for tracking treadmill ball motion in real time using a single high-resolution camera.
Results: Tracking accuracy and timing were determined using calibration data. Ball tracking was performed at 500 Hz and integrated with an open source game engine for virtual reality projection. The projection was updated at 120 Hz with a latency with respect to ball motion of 30 ± 8 ms.
Comparison with Existing Method(s): Optical flow-based tracking of treadmill motion is typically achieved using optical mice. The camera-based optical flow tracking system developed here is based on off-the-shelf components and offers control over the image acquisition and processing parameters. This results in flexibility with respect to tracking conditions – such as ball surface texture, lighting conditions, or ball size – as well as camera alignment and calibration.
Conclusions: A fast system for rotational ball motion tracking suitable for virtual reality animal behavior across different scales was developed and characterized.
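The core geometric step of such a tracker — converting an optical-flow displacement on the ball surface into rotation angles — can be sketched as follows; the calibration constants in the example are assumed values, not the published calibration:

```python
def displacement_to_rotation(dx_px, dy_px, px_per_mm, ball_radius_mm):
    """Convert an optical-flow displacement measured on the ball surface
    (in pixels) into rotation angles (radians) about the two axes
    perpendicular to the camera's viewing direction, using the
    arc-length relation: angle = arc / radius.

    px_per_mm is the camera's spatial calibration at the ball surface.
    """
    dx_mm = dx_px / px_per_mm
    dy_mm = dy_px / px_per_mm
    return dx_mm / ball_radius_mm, dy_mm / ball_radius_mm
```

In the full system this conversion runs per frame at the tracking rate (500 Hz here), and a third rotation component about the vertical axis is recovered from the rotational part of the flow field.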
In the literature on occupational stress and recovery from work, several facets of thinking about work in off-job time have been conceptualized. However, research on the focal concepts is currently rather disintegrated. In this study we take a closer look at the five most established concepts, namely (1) psychological detachment, (2) affective rumination, (3) problem-solving pondering, (4) positive work reflection, and (5) negative work reflection. More specifically, we scrutinized (1) whether the five facets of work-related rumination are empirically distinct, (2) whether they yield differential associations with different facets of employee well-being (burnout, work engagement, thriving, satisfaction with life, and flourishing), and (3) to what extent the five facets can be distinguished from and relate to conceptually similar constructs, such as irritation, worry, and neuroticism. We applied structural equation modeling techniques to cross-sectional survey data from 474 employees. Our results provide evidence (1) that the five facets of work-related rumination are highly related, yet empirically distinct, (2) that each facet contributes uniquely to explaining variance in certain aspects of employee well-being, and (3) that they are distinct from related concepts, although there is a high overlap between (lower levels of) psychological detachment and cognitive irritation. Our study contributes to clarifying the structure of work-related rumination and extends the nomological network linking different types of thinking about work in off-job time and employee well-being.
Although work events can be regarded as pivotal elements of organizational life, only a few studies have examined how positive and negative events relate to and combine to affect work engagement over time. Theory suggests that to better understand how current events affect work engagement (WE), we have to account for recent events that have preceded these current events. We present competing theoretical views on how recent and current work events may affect employees (e.g., getting used to a high frequency of negative events or becoming more sensitive to negative events). Although the occurrence of events implies discrete changes in the experience of work, prior research has not considered whether work events actually accumulate to sustained mid-term changes in WE. To address these gaps in the literature, we conducted a week-level longitudinal study across a period of 15 consecutive weeks among 135 employees, which yielded 849 weekly observations. While positive events were associated with higher levels of WE within the same week, negative events were not. Our results support neither satiation nor sensitization processes. However, high frequencies of negative events in the preceding week amplified the beneficial effects of positive events on WE in the current week. Growth curve analyses show that the benefits of positive events accumulate to sustain high levels of WE. WE dissipates in the absence of continuous experience of positive events. Our study adds a temporal component and informs research that has taken a feature-oriented perspective on the dynamic interplay of job demands and resources.
Risk-based authentication (RBA) aims to strengthen password-based authentication rather than replacing it. RBA does this by monitoring and recording additional features during the login process. If feature values at login time differ significantly from those observed before, RBA requests an additional proof of identification. Although RBA is recommended in the NIST digital identity guidelines, it has so far been used almost exclusively by major online services. This is partly due to a lack of open knowledge and implementations that would allow any service provider to roll out RBA protection to its users.
To close this gap, we provide a first in-depth analysis of RBA characteristics in a practical deployment. We observed N=780 users with 247 unique features on a real-world online service for over 1.8 years. Based on our collected data set, we provide (i) a behavior analysis of two RBA implementations that were apparently used by major online services in the wild, (ii) a benchmark of the features to extract a subset that is most suitable for RBA use, (iii) a new feature that has not been used in RBA before, and (iv) factors which have a significant effect on RBA performance. Our results show that RBA needs to be carefully tailored to each online service, as even small configuration adjustments can greatly impact RBA's security and usability properties. We provide insights on the selection of features, their weightings, and the risk classification in order to benefit from RBA after a minimum number of login attempts.
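The core RBA idea — comparing current login features against a user's history and escalating when values look unfamiliar — can be sketched as a minimal illustrative model; this is not the scoring used by the services studied in the paper:

```python
from collections import Counter

class SimpleRBA:
    """Minimal risk-based authentication sketch: the risk of a login is
    the product of per-feature 'unseen-ness' relative to the user's
    login history. Rare or never-seen feature values raise the risk,
    which would trigger an additional proof of identification."""

    def __init__(self, features):
        self.history = {f: Counter() for f in features}
        self.logins = 0

    def record(self, obs):
        """Store the feature values of a successful login."""
        for f, v in obs.items():
            self.history[f][v] += 1
        self.logins += 1

    def risk(self, obs):
        """Laplace-smoothed risk score in (0, 1); higher means riskier."""
        score = 1.0
        for f, v in obs.items():
            p = (self.history[f][v] + 1) / (self.logins + 2)
            score *= 1.0 - p
        return score
```

A deployment would compare the score against a threshold tuned per service — consistent with the paper's finding that small configuration adjustments strongly affect security and usability.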
Turbulent compressible flows are traditionally simulated using explicit Eulerian time integration applied to the Navier-Stokes equations. However, the associated Courant-Friedrichs-Lewy condition severely restricts the maximum time step size. Exploiting the Lagrangian nature of the Boltzmann equation's material derivative, we now introduce a feasible three-dimensional semi-Lagrangian lattice Boltzmann method (SLLBM), which elegantly circumvents this restriction. Previous lattice Boltzmann methods for compressible flows were mostly restricted to two dimensions due to the enormous number of discrete velocities needed in three dimensions. In contrast, this Rapid Communication demonstrates how cubature rules enhance the SLLBM to yield a three-dimensional velocity set with only 45 discrete velocities. Based on simulations of a compressible Taylor-Green vortex we show that the new method accurately captures shocks or shocklets as well as turbulence in 3D without utilizing additional filtering or stabilizing techniques, even when the time step sizes are up to two orders of magnitude larger compared to simulations in the literature. Our new method therefore enables researchers for the first time to study compressible turbulent flows by a fully explicit scheme, whose range of admissible time step sizes is only dictated by physics, while being decoupled from the spatial discretization.
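The CFL restriction that the semi-Lagrangian scheme circumvents can be stated in one line; the numbers in the example test are arbitrary:

```python
def cfl_time_step(dx, u_max, c_s, cfl_number=0.5):
    """Largest stable explicit Eulerian time step under the CFL condition,
    dt <= CFL * dx / (|u|_max + c_s): information may not travel more than
    a CFL-fraction of a grid cell per step, where for compressible flow
    the signal speed is flow speed plus sound speed. A semi-Lagrangian
    scheme such as the SLLBM traces characteristics instead and is not
    bound by this limit."""
    return cfl_number * dx / (u_max + c_s)
```

This makes concrete why time steps up to two orders of magnitude larger are admissible once the scheme is decoupled from the spatial discretization.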
The clear-sky radiative effect of aerosol-radiation interactions (REari) is of relevance for our understanding of the climate system. The influence of aerosols on the surface energy budget is of high interest for the renewable energy sector. In this study, the radiative effect is investigated with respect to seasonal and regional variations for Germany in the year 2015, at the surface and top of atmosphere, using two complementary approaches.
First, an ensemble of clear-sky models which explicitly consider aerosols is utilized to retrieve the aerosol optical depth and the surface direct radiative effect of aerosols by means of a clear sky fitting technique. For this, short-wave broadband irradiance measurements in the absence of clouds are used as a basis. A clear sky detection algorithm is used to identify cloud free observations. Considered are measurements of the shortwave broadband global and diffuse horizontal irradiance with shaded and unshaded pyranometers at 25 stations across Germany within the observational network of the German Weather Service (DWD). Clear sky models used are MMAC, MRMv6.1, METSTAT, ESRA, Heliosat-1, CEM and the simplified Solis model. The definition of aerosol and atmospheric characteristics of the models are examined in detail for their suitability for this approach.
Second, the radiative effect is estimated using explicit radiative transfer simulations with inputs on the meteorological state of the atmosphere, trace gases and aerosol from the CAMS reanalysis. The aerosol optical properties (aerosol optical depth, Ångström exponent, single scattering albedo and asymmetry parameter) are first evaluated with AERONET direct sun and inversion products. The largest inconsistency is found for the aerosol absorption, which is overestimated by about 0.03 or about 30 % by the CAMS reanalysis. Compared to the DWD observational network, the simulated global, direct and diffuse irradiances show reasonable agreement within the measurement uncertainty. The radiative kernel method is used to estimate the resulting uncertainty and bias of the simulated direct radiative effect. The uncertainty is estimated to −1.5 ± 7.7 and 0.6 ± 3.5 W m−2 at the surface and top of atmosphere, respectively, while the annual-mean biases at the surface, top of atmosphere and total atmosphere are −10.6, −6.5 and 4.1 W m−2, respectively.
The retrieval of the aerosol radiative effect with the clear sky models shows a high level of agreement with the radiative transfer simulations, with an RMSE of 5.8 W m−2 and a correlation of 0.75. The annual mean REari at the surface for the 25 DWD stations is −12.8 ± 5 W m−2 averaged over the clear sky models, compared to −11 W m−2 from the radiative transfer simulations. Since all models assume a fixed aerosol characterisation, the annual cycle of the aerosol radiative effect cannot be reproduced. Of this set of clear sky models, the ESRA and MRMv6.1 models show the highest level of agreement.