H-BRS Bibliography
- yes (98)
Departments, institutes and facilities
- Fachbereich Angewandte Naturwissenschaften (33)
- Fachbereich Wirtschaftswissenschaften (25)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (22)
- Fachbereich Informatik (21)
- Internationales Zentrum für Nachhaltige Entwicklung (IZNE) (14)
- Fachbereich Ingenieurwissenschaften und Kommunikation (13)
- Institut für funktionale Gen-Analytik (IFGA) (10)
- Institut für Sicherheitsforschung (ISF) (7)
- Institute of Visual Computing (IVC) (5)
- Centrum für Entrepreneurship, Innovation und Mittelstand (CENTIM) (3)
Document Type
- Article (78)
- Conference Object (12)
- Report (4)
- Part of a Book (3)
- Master's Thesis (1)
Year of publication
- 2022 (98)
Language
- English (98)
Has Fulltext
- yes (98)
Keywords
- Knowledge Graphs (3)
- Machine Learning (3)
- Bioinformatics (2)
- Natural Language Processing (2)
- SERS (2)
- Transformers (2)
- Well-being (2)
- biodiversity (2)
- creep (2)
- cytokine-induced killer cells (2)
- haptics (2)
- immunotherapy (2)
- modeling (2)
- presenteeism (2)
- relaxation (2)
- sustainable development goals (2)
- virtual reality (2)
- 3D Segmentation (1)
- 3D user interface (1)
- AI usage in sports (1)
- AOP (1)
- APC superfamily (1)
- ATB0,+ (1)
- Abstract Syntax Tree (1)
- Agri-environment schemes (1)
- Anthropocene (1)
- Artificial Intelligence (1)
- AuNPs (1)
- Autonomy (1)
- BCFA (1)
- Backorder prediction (1)
- Bayesian CFA (1)
- Bibliographic Analysis (1)
- Biopolymers (1)
- Bounding box explanations (1)
- Brand (1)
- Brand identity (1)
- Brand image (1)
- CD40, CTLA-4 (1)
- CNN (1)
- COVID-19 (1)
- Capital structure (1)
- Cathepsin K (1)
- Cislunar (1)
- Classification explanations (1)
- Codes (1)
- Collective action (1)
- Complexity (1)
- Composites (1)
- Computer Vision (1)
- Contingency analysis (1)
- Current research information systems (1)
- DNA damage (1)
- DNA typing (1)
- Data protection by design (1)
- Data structures (1)
- Deep Learning (1)
- Digital Entrepreneurship Education (1)
- Effective purpose specification (1)
- Enhanced weathering (1)
- Entrepreneurial Intention (1)
- Entrepreneurial family (1)
- Explainable Artificial Intelligence (XAI) (1)
- Explosives (1)
- Extrusionsblasformen (1)
- Family business (1)
- Feedback (1)
- Fertilizer (1)
- Food security (1)
- GDPR (1)
- Gas transport simulation (1)
- Gasturbinenschaufel (1)
- Geometry (1)
- Global explanation (1)
- Gradient-based explanation methods (1)
- Graph Convolutional Neural Networks (1)
- Graph embeddings (1)
- Graph theory (1)
- Graphene (1)
- HCI (1)
- HDBR (1)
- HSP90 (1)
- HTS (1)
- Hardware (1)
- Health Promotion (1)
- Health care consumption (1)
- Health insurance (1)
- Higher Education (1)
- Human orientation perception (1)
- Hydrogen (1)
- Hydrogen storage (1)
- IR microspectroscopy (1)
- Institutional Analysis and Development (IAD) framework (1)
- Institutions of Sustainability (IoS) framework (1)
- Interaction effects (1)
- Interdisciplinarity (1)
- Interests (1)
- International sustainable development (1)
- Job satisfaction (1)
- Knowledge co-production (1)
- Kriechen (1)
- LSTM (1)
- Laboratory (1)
- Lebensdauervorhersage (1)
- LeuT (1)
- Lignin (1)
- Linear viscoelasticity (1)
- Lineare Viskoelastizität (1)
- Local explanation (1)
- MAXQDA (1)
- MOX gas sensors (1)
- Malware (1)
- Markov Cluster Algorithm (1)
- Markov chain Monte Carlo (1)
- Mass transport (1)
- Mechanische Prüfung (1)
- Membrane Transport (1)
- Miscanthus (1)
- Molecular dynamics (1)
- Multi-object visualization (1)
- Multidisciplinary (1)
- NLP (1)
- NSS family (1)
- Nafion™ (1)
- Nano-Systems (1)
- Nanoparticles (1)
- Nickel-based superalloy (1)
- Nickelbasis-Superlegierung (1)
- O3/UV (1)
- OCT (1)
- Object Segmentation (1)
- Object detectors (1)
- One Health doctoral training (1)
- One Health implementation (1)
- Orion (1)
- PAD (1)
- PEM electrolysis (1)
- PLASM (1)
- Pakistan (1)
- Part Segmentation (1)
- Payment for Ecosystem Services (PES) (1)
- Perception (1)
- Perceptual Upright (1)
- Permeation (1)
- Point Cloud Segmentation (1)
- Policy instruments (1)
- Polysaccharide derivatives (1)
- Positive emotions (1)
- Poverty (1)
- Power (1)
- Private equity (1)
- Program evaluation (1)
- Proximity (1)
- Quantitative analysis of explanations (1)
- R-ratio (1)
- Raman spectroscopy (1)
- Raman-microspectroscopy (1)
- Ray tracing (1)
- Research-practice-collaborations (1)
- RheoTack analysis (1)
- Rock dust (1)
- SDG 3 (1)
- SDG 4 (1)
- SLC (1)
- SLC6 (1)
- SLC6A14 (1)
- SMPA loop (1)
- STARLIFE project (1)
- Saliency maps (1)
- Sanity checks for explaining detectors (1)
- Schwindung (1)
- Science Management (1)
- Self-assembling (1)
- Semantic Segmentation (1)
- Semantic search (1)
- Silicon Carbides (1)
- Simulator sickness (1)
- Single family office (1)
- Social protection (1)
- Software Supply Chain (1)
- Soil health (1)
- Space radiation (1)
- Spill-over (1)
- Sustainability (1)
- Sustainable development (1)
- TOC (1)
- Tap water (1)
- Three-dimensional displays (1)
- TiO2-coatings (1)
- Topological reduction (1)
- Topology (1)
- Transdisciplinary research (1)
- Transformation Management (1)
- Treatment (1)
- UXD (1)
- Universal health care (1)
- VOSviewer (1)
- Verzug (1)
- Visually induced motion sickness (1)
- Voltage measurement (1)
- Web scraping (1)
- Winery (1)
- Workload (1)
- Work-life balance (1)
- Yeast (1)
- ability to study (1)
- acceptance (1)
- adaptive trigger (1)
- adhesion factor (1)
- aluminum bonding wire (1)
- ambiguity (1)
- amino acid transporter (1)
- anabolic (1)
- anaplastic lymphoma kinase (1)
- annotation (1)
- antibiotic prophylaxis (1)
- attitude-behavior gap for sustainability (1)
- augmented reality (1)
- authentication (1)
- authoring (1)
- autologous bone graft (1)
- automatic measurement validation (1)
- automatic music generation (1)
- autonomy (1)
- azadipeptide nitrile (1)
- bacteria (1)
- biaxial stretching (1)
- biochemical fingerprinting (1)
- biomarker (1)
- biomaterial (1)
- biometrics (1)
- blown film extrusion (1)
- bone mineral density (1)
- bone remodeling (1)
- cafeteria (1)
- catabolic (1)
- cell viability (1)
- chain-extending cross-linker (1)
- chemosensing (1)
- cholesteric phase (1)
- circular economy (1)
- classification (1)
- clinical trials (1)
- coffee ring effect (1)
- collision (1)
- combination of treatments (1)
- consumer behavior for sustainability (1)
- controller design (1)
- creep compliance (1)
- crystallization (1)
- cube in cube model (1)
- cyanohydrazide warhead (1)
- deformation behavior (1)
- dental implant (1)
- detaching (1)
- digital learning (1)
- discriminant analysis (1)
- double pulse test (1)
- drivers (1)
- drug delivery (1)
- eXplainable artificial intelligence (XAI) (1)
- eco-certification (1)
- eco-products (1)
- ecosystem services (1)
- education for sustainable development (1)
- elementary volume (1)
- elite sports (1)
- emotion recognition (1)
- employee privacy (1)
- engaged university (1)
- environmental certification (1)
- error analysis (1)
- eudaimonic well-being (1)
- explainable AI (1)
- extremophile (1)
- extrusion blow molding (1)
- facial emotion recognition (1)
- factor analysis (1)
- farmers (1)
- feature (1)
- fiber composites (1)
- fingerprint (1)
- flexibility (1)
- forensic genetics (1)
- freedom (1)
- fuel cell (1)
- fully superconducting generator (1)
- gas turbine blade (1)
- gas-to-power (1)
- generation Z (1)
- generational cohort (1)
- geopolymer (1)
- halogen bonding (1)
- head down bed rest (1)
- health (1)
- health intervention (1)
- health management (1)
- health-promoting collaboration (1)
- healthy eating (1)
- holiday (1)
- hybrid system (1)
- hydrides (1)
- hydrogen (1)
- hydrogen bonding (1)
- immune checkpoint inhibition programmed cell death-1 (1)
- impact monitoring (1)
- institutional analysis (1)
- integrative Simulation (1)
- integrative simulation (1)
- interaction design (1)
- interface design (1)
- knowledge transfer (1)
- land use (1)
- language (1)
- latent class analysis (1)
- leave (1)
- leishmaniasis (1)
- lifetime prediction (1)
- local chain orientation (1)
- measurement errors (1)
- mechanical testing (1)
- mesenchymal stem cells (1)
- mesoscale coarse-graining (1)
- methylmalonic acidemia (1)
- microbial contamination (1)
- mixed methods (1)
- mixed methods study (1)
- mixed reality (1)
- molecular docking (1)
- multivariate statistics (1)
- music analysis (1)
- nanomedicine (1)
- nature (1)
- nature-protected areas (1)
- non-small cell lung cancer (1)
- nudge (1)
- nudgeability (1)
- off-job crafting (1)
- optical coherence tomography (1)
- oral history (1)
- organizational policy (1)
- orthotropes prozessabhängiges Materialverhalten (1)
- orthotropic process-dependent material behavior (1)
- osteoblast (1)
- osteoclast (1)
- osteogenic potential (1)
- osteoporosis (1)
- ozonation (1)
- ozone (1)
- particulate composite (1)
- pathophysiology (1)
- photonic sensing (1)
- physical exercise (1)
- poly(butylene adipate terephthalate) (1)
- poly(lactic acid) (1)
- polyethylene (1)
- power converter (1)
- power electronics (1)
- power semiconductors (1)
- presentation attack detection (1)
- pressure sensitive adhesives (1)
- prioritizable ranking (1)
- process-induced morphology (1)
- propionic acidemia (1)
- protease inhibitor (1)
- protein microarray (1)
- prototyping (1)
- psychophysics (1)
- qualitative research (1)
- questionnaire (1)
- recovery (1)
- renal cell carcinoma (1)
- representation learning (1)
- retraction speed dependency (1)
- scaffolds (1)
- self-determination theory (1)
- sensor array (1)
- sensor phenomena and characterization (1)
- sensory characterisation (1)
- sentiment analysis (1)
- short tandem repeat (STR) (1)
- shrinkage (1)
- slope based signature (1)
- small-scale fatigue testing (1)
- social Innovation (1)
- social activities (1)
- social exchange theory (1)
- social media (1)
- socially engaged university (1)
- solute carrier (1)
- space flight analog (1)
- space radiation environment (1)
- speech emotion recognition (1)
- stakeholder analysis (1)
- stress response (1)
- structural equation modeling (1)
- structure (1)
- students (1)
- subjective visual vertical (1)
- supramolecular liquid crystals (1)
- surrogate endpoint (1)
- susceptibility (1)
- sustainability-oriented behavior (1)
- synchronous generator (1)
- systemic regional innovation (1)
- technological innovation (1)
- thermal insulation materials (1)
- thermophoresis (1)
- thermosensing (1)
- third mission (1)
- time series analysis (1)
- tissue engineering (1)
- transfer office (1)
- triiodothyronine (1)
- triple helix (1)
- ultrapure water (1)
- university–government relations (1)
- urban development (1)
- vibration (1)
- virtual reality, XR (1)
- warpage (1)
- weight perception (1)
- whole genome amplification (WGA) (1)
- wide band gap (1)
- wind energy (1)
- work ability (1)
- DRAMMA model (1)
- integrative needs model of crafting (1)
- leisure crafting (1)
- needs satisfaction (1)
- needs-based (1)
- optimal functioning (1)
- validation (1)
Trojanized software packages used in software supply chain attacks constitute an emerging threat. Unfortunately, scalable approaches that allow automated and timely detection of malicious software packages are still lacking, so most detections rely on manual labor and expertise. However, it has been observed that most attack campaigns comprise multiple packages that share the same or similar malicious code. We leverage that fact to automatically reproduce manually identified clusters of known malicious packages that have been used in real-world attacks, thus reducing the need for expert knowledge and manual inspection. Our approach, AST Clustering using MCL to mimic Expertise (ACME), yields promising results with an F1 score of 0.99. Signatures are automatically generated from characteristic code fragments of the clusters and are subsequently used to scan the whole npm registry for unreported malicious packages. We identified and reported six malicious packages, which were consequently removed from npm. Our approach can therefore reduce the manual labor involved in detection and may be employed by maintainers of package repositories to detect software supply chain attacks through trojanized software packages.
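The clustering step that ACME's name refers to, the Markov Cluster (MCL) algorithm, can be sketched as follows. This is a minimal illustrative implementation operating on a plain pairwise-similarity matrix rather than on actual AST features; the function name, thresholds, and parameters are our own choices, not the paper's.

```python
import numpy as np

def mcl(similarity, expansion=2, inflation=2.0, iters=50, tol=1e-6):
    """Minimal Markov Cluster (MCL) iteration on a symmetric
    similarity matrix; returns a list of clusters as index sets."""
    M = similarity.astype(float) + np.eye(len(similarity))  # add self-loops
    M /= M.sum(axis=0, keepdims=True)                       # column-stochastic
    for _ in range(iters):
        prev = M.copy()
        M = np.linalg.matrix_power(M, expansion)            # expansion step
        M = M ** inflation                                  # inflation step
        M /= M.sum(axis=0, keepdims=True)
        if np.abs(M - prev).max() < tol:
            break
    # in the converged matrix, the support of each non-empty row is a cluster
    clusters = []
    for i in range(len(M)):
        members = set(np.nonzero(M[i] > 1e-4)[0])
        if members and members not in clusters:
            clusters.append(members)
    return clusters
```

Packages whose ASTs share (near-)identical malicious fragments produce a dense block in the similarity matrix, which MCL isolates as one cluster.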
Guzzo et al. (2022) argue that open science practices may marginalize inductive and abductive research and preclude leveraging big data for scientific research. We share their assessment that the hypothetico-deductive paradigm has limitations (see also Staw, 2016) and that big data provide grand opportunities (see also Oswald et al., 2020). However, we arrive at very different conclusions. Rather than opposing open science practices that build on a hypothetico-deductive paradigm, we should take the initiative to do open science in a way compatible with the very nature of our discipline, namely by incorporating ambiguity and inductive decision-making. In this commentary, we (a) argue that inductive elements are necessary for research in naturalistic field settings across different stages of the research process, (b) discuss some misconceptions of open science practices that hide or discourage inductive elements, and (c) propose that field researchers can take ownership of open science in a way that embraces ambiguity and induction. We use an example research study to illustrate our points.
Bonding wires made of aluminum are the most widely used means of transmitting electrical signals in power electronic devices. During operation, cyclic mechanical and thermal stresses can lead to fatigue loads and failure of the bonding wires. Predicting or preventing wire failure by design is not yet possible in all cases. The following work presents meaningful fatigue tests at small wire dimensions and investigates the influence of the R-ratio on the lifetime of two different aluminum wires, each with a diameter of 300 μm. The experiments show very reproducible fatigue results with ductile failure behavior. The endurable stress amplitude decreases linearly with increasing stress ratio, which can be displayed in a Smith diagram, even though the applied maximum stresses exceed the initial yield stresses determined by tensile tests. Scaling the fatigue results by the tensile strength indicates that the fatigue level is significantly influenced by the strength of the material. Given these very consistent findings, the development of a generalized fatigue model for predicting the lifetime of bonding wires under arbitrary loading situations seems possible and will be investigated further.
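For reference, the stress ratio relates the load extremes as R = σ_min/σ_max, so the stress amplitude and mean stress, the two quantities a Smith diagram plots against each other, follow directly. A small illustrative helper (the function name is ours, not from the paper):

```python
def cyclic_stress_components(sigma_max, R):
    """Amplitude and mean stress of a cyclic load, given the maximum
    stress and the stress ratio R = sigma_min / sigma_max."""
    sigma_min = R * sigma_max
    sigma_a = (sigma_max - sigma_min) / 2.0   # stress amplitude
    sigma_m = (sigma_max + sigma_min) / 2.0   # mean stress
    return sigma_a, sigma_m
```

For fully reversed loading (R = -1) the mean stress vanishes; for pulsating tension (R = 0) amplitude and mean stress are equal.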
In young adulthood, important foundations are laid for health later in life. Hence, more attention should be paid to health measures concerning students. A research field that is relevant to health but hitherto somewhat neglected in the student context is the phenomenon of presenteeism. Presenteeism refers to working despite illness and is associated with negative health- and work-related effects. This study attempts to bridge the research gap regarding students and examines the effects of and reasons for this behavior. Moreover, the consequences of digital learning for presenteeism behavior are considered. A student survey (N = 1036) and qualitative interviews (N = 11) were conducted. The results of the quantitative study show significant negative relationships between presenteeism and health status, well-being, and ability to study. An increased experience of stress and a low level of detachment, as characteristics of digital learning, also show significant relationships with presenteeism. The qualitative interviews highlighted not wanting to miss anything as the most important reason for presenteeism. The results provide useful insights for developing countermeasures that can be easily integrated into university life, such as establishing fixed learning partners or providing additional digital learning material.
Deploying modern data-driven machine learning methods, most often realized by deep neural networks (DNNs), in safety-critical applications such as health care, industrial plant control, or autonomous driving is highly challenging due to numerous model-inherent shortcomings. These shortcomings are diverse and range from a lack of generalization, through insufficient interpretability and implausible predictions, to directed attacks by means of malicious inputs. Cyber-physical systems employing DNNs are therefore likely to suffer from so-called safety concerns: properties that preclude deployment because no argument or experimental setup can assess the remaining risk. In recent years, an abundance of state-of-the-art techniques aiming to address these safety concerns has emerged. This chapter provides a structured and broad overview of them. We first identify categories of insufficiencies and then describe research activities aiming at their detection, quantification, or mitigation. Our work addresses machine learning experts and safety engineers alike: the former might profit from the broad range of machine learning topics covered and the discussions of the limitations of recent methods; the latter might gain insights into the specifics of modern machine learning methods. We hope that this contribution fuels discussions on desiderata for machine learning systems and on strategies for advancing existing approaches accordingly.
In the field of automatic music generation, one of the greatest challenges is the consistent generation of pieces continuously perceived positively by the majority of the audience, since there is no objective method to determine the quality of a musical composition. However, composing principles, refined over millennia, have shaped the core characteristics of today's music. A hybrid music generation system, mlmusic, that incorporates various static, music-theory-based methods as well as data-driven subsystems is implemented to automatically generate pieces considered acceptable by the average listener. Initially, a MIDI dataset consisting of over 100 hand-picked pieces of various styles and complexities is analysed using basic music theory principles, and the abstracted information is fed into explicitly constrained LSTM networks. For chord progressions, each individual network is trained on a specific sequence length, while phrases are created by consecutively predicting each note's offset, pitch and duration. Using these outputs as a composition's foundation, additional musical elements, along with constrained recurrent rhythmic and tonal patterns, are statically generated. Although no survey of the pieces' reception could be carried out, the successful generation of numerous compositions of varying complexities suggests that the integration of these fundamentally distinctive approaches might lead to success in other branches.
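The length-specific training data and the consecutive prediction of offset, pitch and duration described above can be sketched roughly as follows. This is an illustrative skeleton with invented names, not the mlmusic implementation; the predictive model is left as an arbitrary callable standing in for a trained LSTM.

```python
import numpy as np

def make_training_windows(events, seq_len):
    """Slice a phrase into (input window -> next event) training pairs
    for a network trained on one fixed sequence length. Each event is
    a (offset, pitch, duration) triple."""
    X, y = [], []
    for i in range(len(events) - seq_len):
        X.append(events[i:i + seq_len])
        y.append(events[i + seq_len])
    return np.array(X), np.array(y)

def generate_phrase(model, seed, n_events):
    """Autoregressively extend a seed phrase by repeatedly predicting
    the next (offset, pitch, duration) event; `model` is any callable
    mapping a window of events to the next event."""
    phrase = list(seed)
    for _ in range(n_events):
        nxt = model(np.array(phrase[-len(seed):]))
        phrase.append(tuple(nxt))
    return phrase
```

One network per sequence length, as in the abstract, would mean calling `make_training_windows` once per chosen `seq_len` and training a separate model on each result.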
This study investigates the initial stage of the thermo-mechanical crystallization behavior for uni- and biaxially stretched polyethylene. The models are based on a mesoscale molecular dynamics approach. We take constraints that occur in real-life polymer processing into account, especially with respect to the blowing stage of the extrusion blow-molding process. For this purpose, we deform our systems using a wide range of stretching levels before they are quenched. We discuss the effects of the stretching procedures on the micro-mechanical state of the systems, characterized by entanglement behavior and nematic ordering of chain segments. For the cooling stage, we use two different approaches which allow for free or hindered shrinkage, respectively. During cooling, crystallization kinetics are monitored: We precisely evaluate how the interplay of chain length, temperature, local entanglements and orientation of chain segments influence crystallization behavior. Our models reveal that the main stretching direction dominates microscopic states of the different systems. We are able to show that crystallization mainly depends on the (dis-)entanglement behavior. Nematic ordering plays a secondary role.
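The nematic ordering of chain segments mentioned above is conventionally quantified by the P2 orientational order parameter, obtainable as the largest eigenvalue of the Q-tensor built from segment vectors. A minimal numpy sketch (how the original study computes it may differ):

```python
import numpy as np

def nematic_order(bonds):
    """P2 = <(3 cos^2(theta) - 1) / 2> relative to the dominant director,
    computed as the largest eigenvalue of the Q-tensor. `bonds` is an
    (N, 3) array of chain-segment vectors."""
    u = bonds / np.linalg.norm(bonds, axis=1, keepdims=True)  # unit vectors
    Q = 1.5 * (u.T @ u) / len(u) - 0.5 * np.eye(3)            # Q-tensor
    return np.linalg.eigvalsh(Q)[-1]
```

P2 approaches 1 for segments perfectly aligned with the stretching direction and 0 for an isotropic melt, which is how stretching-induced ordering before quenching can be tracked.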
Modeling of Creep Behavior of Particulate Composites with Focus on Interfacial Adhesion Effect
(2022)
Evaluating the creep compliance of particulate composites with empirical models always yields parameters that depend on the initial stress and the material composition. Efforts to connect model parameters with physical properties have not yet succeeded. Furthermore, during creep, delamination between matrix and filler may occur depending on time and initial stress, reducing interfacial adhesion and load transfer to the filler particles. In this paper, the creep compliance curves of glass-bead-reinforced poly(butylene terephthalate) composites were fitted with the Burgers and Findley models, providing a different set of time-dependent model parameters for each initial stress. Although the Findley model performs well in primary creep and the Burgers model is more suitable once secondary creep comes into play, both allow only a qualitative prediction of creep behavior because the interface adhesion and its time dependency is an implicit, hidden parameter. As Young's modulus is a parameter of these models (and of the majority of other creep models), it was introduced as a filler-content-dependent parameter with the help of the cube-in-cube elementary volume approach of Paul. The analysis led to a time-dependent creep compliance that depends only on the time-dependent creep of the matrix and the normalized particle distance (or the filler volume content), and it allowed the adhesion effect to be accounted for. Comparison with the experimental data confirmed that the elementary-volume-based creep compliance function can be used to predict the realistic creep behavior of particulate composites.
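The two empirical models named above have standard closed forms. A minimal sketch with generic parameter names (the fitting procedure in the paper itself may differ):

```python
import numpy as np

def findley(t, J0, A, n):
    """Findley power law for primary creep: J(t) = J0 + A * t**n."""
    return J0 + A * t**n

def burgers(t, E1, E2, eta1, eta2):
    """Burgers four-element creep compliance: instantaneous spring (E1),
    Kelvin-Voigt element (E2, eta2), and a dashpot (eta1) giving the
    secondary, steady-state creep term:
    J(t) = 1/E1 + (1/E2) * (1 - exp(-E2 * t / eta2)) + t / eta1."""
    return 1 / E1 + (1 / E2) * (1 - np.exp(-E2 * t / eta2)) + t / eta1
```

Fitting either function to a measured compliance curve (e.g. with `scipy.optimize.curve_fit`) yields one parameter set per initial stress, which is exactly the stress dependence the abstract criticizes. Note that only the Burgers model contains a term growing linearly in t, which is why it captures secondary creep.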
Silicon carbide and graphene possess extraordinary chemical and physical properties. Here, these different systems are linked and the changes in structural and dynamic properties are investigated. The simulations use a classical molecular dynamics (MD) approach in which a graphene layer (N = 240 atoms) was grafted at different distances on top of a 6H-SiC structure (N = 2400 atoms) and onto a 3C-SiC structure (N = 1728 atoms). The distances between the graphene layer and the 6H-SiC are 1.0, 1.3 and 1.5 Å, and the distances between the graphene layer and the 3C-SiC are 2.0, 2.3 and 2.5 Å. Each system was equilibrated at room temperature until no further relaxation was observed. The 6H-SiC structure in combination with graphene proves to be more stable than the combination with 3C-SiC, as is evident from the computed energies. The pair distribution functions were influenced slightly by the graphene layer due to steric and energetic changes, as can be seen from the small shifts of the C-C distances. Interactions and bonds between graphene and SiC give rise to small shoulders on the high-frequency SiC peaks in the spectra, while the high-frequency peaks of graphene are completely absent.
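A pair distribution function of the kind analysed above can be computed from particle positions as a distance histogram normalized by ideal-gas shell counts. A minimal periodic-cubic-box sketch (illustrative only, not the simulation code used in the study):

```python
import numpy as np

def pair_distribution(positions, box, r_max, n_bins=100):
    """Radial pair distribution function g(r) for an (N, 3) array of
    positions in a periodic cubic box of edge length `box`."""
    n = len(positions)
    d = positions[:, None, :] - positions[None, :, :]
    d -= box * np.round(d / box)                   # minimum-image convention
    r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(n, k=1)]
    hist, edges = np.histogram(r, bins=n_bins, range=(0, r_max))
    shell = 4 / 3 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    density = n / box ** 3
    ideal = shell * density * n / 2                # expected ideal-gas pairs
    return edges, hist / ideal
```

For an ideal (uniformly random) gas g(r) fluctuates around 1; structural features such as the C-C distance shifts mentioned above appear as displaced peaks relative to the pristine reference system.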
Due to expected positive impacts on business, the application of artificial intelligence has increased widely. The decision-making procedures of these models are often complex and not easily understandable to a company's stakeholders, i.e. the people who have to follow up on recommendations or try to understand automated decisions of a system. This opaqueness and black-box nature might hinder adoption, as users struggle to make sense of and trust the predictions of AI models. Recent research on eXplainable Artificial Intelligence (XAI) has focused mainly on explaining models to AI experts for the purpose of debugging and improving model performance. In this article, we explore how such systems could be made explainable to the stakeholders. To do so, we propose a new convolutional neural network (CNN)-based explainable predictive model for product backorder prediction in inventory management. Backorders are orders that customers place for products that are currently not in stock. The company then takes the risk of producing or acquiring the backordered products while, in the meantime, customers can cancel their orders if fulfilment takes too long, leaving the company with unsold items in its inventory. Hence, for their strategic inventory management, companies need to make decisions based on assumptions. Our argument is that these tasks can be improved by offering explanations for AI recommendations. Our research therefore investigates how such explanations could be provided, employing Shapley additive explanations to explain the model's overall priorities in decision-making. In addition, we introduce locally interpretable surrogate models that can explain any individual prediction of the model. The experimental results demonstrate effectiveness in predicting backorders in terms of standard evaluation metrics, outperforming known related works with an AUC of 0.9489.
Our approach demonstrates how current limitations of predictive technologies can be addressed in the business domain.
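Shapley additive explanations, as used above, attribute a prediction to individual features by averaging each feature's marginal contribution over all feature subsets. For a handful of features the values can be computed exactly; a toy sketch (the baseline-substitution scheme and all names are our illustrative choices, not the article's implementation):

```python
from itertools import combinations
from math import factorial

def shapley_values(predict, x, baseline):
    """Exact Shapley values for one prediction. Absent features are
    replaced by their baseline values; feature i's value is its
    weighted average marginal contribution over all subsets."""
    n = len(x)

    def f(subset):
        z = [x[i] if i in subset else baseline[i] for i in range(n)]
        return predict(z)

    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for r in range(n):
            for S in combinations(others, r):
                # combinatorial weight |S|! (n - |S| - 1)! / n!
                w = factorial(r) * factorial(n - r - 1) / factorial(n)
                phi[i] += w * (f(set(S) | {i}) - f(set(S)))
    return phi
```

A useful sanity check is the efficiency property: the values sum exactly to `predict(x) - predict(baseline)`. Practical SHAP implementations approximate this computation, since exact enumeration is exponential in the number of features.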
Robust Identification and Segmentation of the Outer Skin Layers in Volumetric Fingerprint Data
(2022)
Despite the long history of fingerprint biometrics and its use to authenticate individuals, there are still some unsolved challenges with fingerprint acquisition and presentation attack detection (PAD). Currently available commercial fingerprint capture devices struggle with non-ideal skin conditions, including soft skin in infants. They are also susceptible to presentation attacks, which limits their applicability in unsupervised scenarios such as border control. Optical coherence tomography (OCT) could be a promising solution to these problems. In this work, we propose a digital signal processing chain for segmenting two complementary fingerprints from the same OCT fingertip scan: One fingerprint is captured as usual from the epidermis (“outer fingerprint”), whereas the other is taken from inside the skin, at the junction between the epidermis and the underlying dermis (“inner fingerprint”). The resulting 3D fingerprints are then converted to a conventional 2D grayscale representation from which minutiae points can be extracted using existing methods. Our approach is device-independent and has been proven to work with two different time domain OCT scanners. Using efficient GPGPU computing, it took less than a second to process an entire gigabyte of OCT data. To validate the results, we captured OCT fingerprints of 130 individual fingers and compared them with conventional 2D fingerprints of the same fingers. We found that both the outer and inner OCT fingerprints were backward compatible with conventional 2D fingerprints, with the inner fingerprint generally being less damaged and, therefore, more reliable.
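One common building block for locating the outer skin surface in an OCT volume is a per-A-scan search along the depth axis for the first strong reflection. A deliberately simplified sketch (the paper's GPU-based segmentation chain is far more elaborate; the threshold scheme here is an assumption for illustration):

```python
import numpy as np

def detect_surface(volume, threshold):
    """Depth index of the outer skin surface for each A-scan: the first
    sample along the depth axis whose intensity exceeds `threshold`.
    `volume` has shape (x, y, depth); A-scans with no sample above the
    threshold are marked with -1."""
    above = volume > threshold
    depth = np.argmax(above, axis=-1)      # index of first True per A-scan
    depth[~above.any(axis=-1)] = -1        # no surface found in this A-scan
    return depth
```

The resulting depth map is a 2D height field of the epidermis; the inner fingerprint would require a second, deeper detection at the epidermis-dermis junction.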
The implementation of the Sustainable Development Goals (SDGs) and the conservation and protection of nature are among the greatest challenges facing urban regions. So far, few approaches link the SDGs to natural diversity and related ecosystem services at the local level and track them as measures of increasing sustainable development. We want to close this gap by developing a set of indicators that capture ecosystem services in the sense of the SDGs and that are based on data freely available throughout Germany and Europe. Based on 10 SDGs and 35 SDG indicators, we develop an ecosystem service and biodiversity-related indicator set for evaluating sustainable development in urban areas. We further show that it is possible to close many of the data gaps between the SDGs and locally collected data mentioned in the literature, and to translate the universal SDGs to the local level. As an example, we develop this set of indicators for the Bonn/Rhein-Sieg metropolitan area in North Rhine-Westphalia, Germany, which comprises both rural and densely populated settlements. This set of indicators can also help improve communication and the planning of sustainable development by increasing transparency in local sustainability, implementing a visible sustainability monitoring system, and strengthening collaboration between local stakeholders.
The visual and auditory quality of computer-mediated stimuli for virtual and extended reality (VR/XR) is rapidly improving. Still, it remains challenging to provide a fully embodied sensation and awareness of objects surrounding, approaching, or touching us in a 3D environment, even though such feedback can greatly aid task performance in a 3D user interface. For example, feedback can provide warning signals for potential collisions (e.g., bumping into an obstacle while navigating) or pinpoint areas to which one's attention should be directed (e.g., points of interest or danger). These events inform our motor behaviour and are often associated with perception mechanisms tied to our so-called peripersonal and extrapersonal space models, which relate our body to object distance, direction, and contact point/impact. We discuss these reference spaces to explain the role of different cues in the motor action responses that underlie 3D interaction tasks. However, providing proximity and collision cues can be challenging. Various full-body vibration systems have been developed that stimulate body parts other than the hands, but they can be limited in applicability and feasibility by their cost and operating effort, as well as by hygienic considerations associated with, e.g., COVID-19. Informed by the results of a prior study using low frequencies for collision feedback, in this paper we look at an unobtrusive way to provide spatial, proximity and collision cues. Specifically, we assess the potential of foot-sole stimulation to provide cues about object direction and relative distance, as well as collision direction and force of impact. Results indicate that vibration-based stimuli in particular could be useful within the frame of peripersonal and extrapersonal space perception to support 3DUI tasks. Current results favor the feedback combination of continuous vibrotactor cues for proximity and bass-shaker cues for body collision.
Results show that users could rather easily judge the different cues at a reasonably high granularity. This granularity may be sufficient to support common navigation tasks in a 3DUI.
SLC6A14 (ATB0,+) is unique among SLC proteins in its ability to transport 18 of the 20 proteinogenic (dipolar and cationic) amino acids and naturally occurring and synthetic analogues (including anti-viral prodrugs and nitric oxide synthase (NOS) inhibitors). SLC6A14 mediates amino acid uptake in multiple cell types where increased expression is associated with pathophysiological conditions including some cancers. Here, we investigated how a key position within the core LeuT-fold structure of SLC6A14 influences substrate specificity. Homology modelling and sequence analysis identified the transmembrane domain 3 residue V128 as equivalent to a position known to influence substrate specificity in distantly related SLC36 and SLC38 amino acid transporters. SLC6A14, with and without V128 mutations, was heterologously expressed and function determined by radiotracer solute uptake and electrophysiological measurement of transporter-associated current. Substituting the amino acid residue occupying the SLC6A14 128 position modified the binding pocket environment and selectively disrupted transport of cationic (but not dipolar) amino acids and related NOS inhibitors. By understanding the molecular basis of amino acid transporter substrate specificity we can improve knowledge of how this multi-functional transporter can be targeted and how the LeuT-fold facilitates such diversity in function among the SLC6 family and other SLC amino acid transporters.
The following work presents algorithms for semi-automatic validation, feature extraction and ranking of time series measurements acquired from MOX gas sensors. Semi-automatic measurement validation is accomplished by extending established curve similarity algorithms with a slope-based signature calculation. Furthermore, a feature-based ranking metric is introduced. It allows for individual prioritization of each feature and can be used to find the best performing sensors regarding multiple research questions. Finally, the functionality of the algorithms, as well as the developed software suite, are demonstrated with an exemplary scenario, illustrating how to find the most power-efficient MOX gas sensor in a data set collected during an extensive screening consisting of 16,320 measurements, all taken with different sensors at various temperatures and analytes.
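The feature-based ranking described above — normalizing each extracted feature and weighting it by a user-defined priority — can be sketched as follows. This is an illustrative weighted-sum scheme under assumed conventions (min-max normalization, higher score = better); the function name, feature set, and weights are hypothetical and not taken from the paper.

```python
import numpy as np

def rank_sensors(features, weights):
    """Rank sensors by a weighted sum of normalized feature scores.

    features: (n_sensors, n_features) array of extracted feature values
    weights:  (n_features,) priority weight per feature (illustrative scheme,
              not the paper's exact metric)
    """
    # Min-max normalize each feature column to [0, 1] so that
    # differently scaled features become comparable.
    lo, hi = features.min(axis=0), features.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)
    norm = (features - lo) / span
    # Weighted aggregate score per sensor; higher is better.
    scores = norm @ weights
    return np.argsort(scores)[::-1], scores

# Hypothetical example: 4 sensors, 3 features
# (e.g. sensitivity, stability, inverse power draw).
feats = np.array([[0.9, 0.5, 0.2],
                  [0.6, 0.8, 0.9],
                  [0.4, 0.4, 0.7],
                  [0.8, 0.7, 0.6]])
w = np.array([0.2, 0.2, 0.6])  # prioritize power efficiency
order, scores = rank_sensors(feats, w)
```

Changing the weight vector re-ranks the same measurement set for a different research question, e.g. emphasizing sensitivity instead of power efficiency.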
The cube in cube approach was used by Paul and Ishai-Cohen to model and derive formulas for the filler-content-dependent Young's moduli of particle-filled composites, assuming perfect filler-matrix adhesion. Their formulas were chosen because of their simplicity and recalculated using an elementary volume (EV) approach, which transforms spherical inclusions into cubic inclusions. The EV approach led to expressions for the composite's moduli that allow an adhesion factor kadh, ranging from 0 to 1, to be introduced to take reduced filler-matrix adhesion into account. This adhesion factor scales the edge length of the cubic inclusions, thus reducing the stress transfer area between matrix and filler. Fitting the experimental data with the modified Paul model provides reasonable kadh values for PA66, PBT, PP, PE-LD and BR that are in line with their surface energies. Further analysis showed that stiffening only occurs if kadh exceeds [Formula: see text], which depends on the ratio of matrix modulus to filler modulus. The modified model allows a quick calculation of the modulus of any particle-filled composite for known matrix modulus EM, filler modulus EF, filler volume content vF and adhesion factor kadh. Thus, finite element analysis (FEA) simulations of any particle-filled polymer part, as well as materials selection, are significantly eased. FEA of cubic and hexagonal EV arrangements shows that stress distributions within the EV exhibit more shear stresses the further one deviates from the cubic arrangement. The assumption that the properties of the EV are representative of the whole composite holds only for filler volume contents up to 15 or 20% (corresponding to 30 to 40 weight %). Thus, for the vast majority of commercially available particulate composites, the modified model can be applied.
Furthermore, this indicates that the cube in cube approach reaches two limits: (i) the occurrence of increasing shear stresses at filler contents above 20% due to deviations of the EV arrangement or the spatial filler distribution from the cubic arrangement, and (ii) increasing interaction between particles, with the formation of a particle network within the matrix, violating the EV assumption of homogeneous dispersion.
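The classical Paul estimate, and the idea of an adhesion factor scaling the stress-transfer cross-section of the cubic inclusion, can be sketched numerically. Note this is only an illustration of the mechanism described in the abstract: the classical Paul formula is standard, but the way kadh enters here (scaling the inclusion's edge length, hence its cross-section) is my simplified reading, not the paper's exact modified expression, and it does not reproduce the reported threshold behaviour.

```python
def paul_modulus(E_m, E_f, v_f, k_adh=1.0):
    """Cube-in-cube (Paul) estimate of a particulate composite's Young's
    modulus, with an illustrative adhesion factor.

    E_m, E_f : matrix and filler moduli (same units, e.g. GPa)
    v_f      : filler volume fraction
    k_adh    : adhesion factor in [0, 1]; scales the cubic inclusion's
               edge length (sketch only, NOT the paper's exact formula)
    """
    m = E_f / E_m                      # filler/matrix stiffness ratio
    a2 = (k_adh * v_f ** (1 / 3)) ** 2  # scaled stress-transfer cross-section
    a3 = a2 * v_f ** (1 / 3)            # corresponding volume term
    return E_m * (1 + (m - 1) * a2) / (1 + (m - 1) * (a2 - a3))

# Perfect adhesion (k_adh = 1) recovers the classical Paul prediction;
# k_adh = 0 removes all stress transfer and returns the matrix modulus.
E_perfect = paul_modulus(E_m=2.0, E_f=70.0, v_f=0.15, k_adh=1.0)
E_reduced = paul_modulus(E_m=2.0, E_f=70.0, v_f=0.15, k_adh=0.5)
```

The moduli and filler fraction in the example are arbitrary placeholder values, chosen only to show that reducing k_adh lowers the predicted stiffening.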
The purpose of the study is to provide empirical evidence about the under-researched area of university–government relations in building a culture of entrepreneurial initiatives inside the triple helix model in a rural region. The study deploys a qualitative case study research method based on content analysis of project documentation and further internal documents from both universities and municipalities. The propositions in the research question were guided by the previous literature and then analyzed through an “open coding” process to iteratively analyze, verify, and validate the results from the documents against that literature. The results presented in the case study relate both to the project of a municipality–university innovation partnership and to the historic development of the university in its three missions and, regarding the important third mission, the themes relevant for the project. In addition, a “toolkit” of relevant project activities is presented against the major identified themes, the major project stakeholders, and the relevant Sustainable Development Goals (SDGs). Universities should look beyond a purely economic contribution and should augment all three missions (teaching, research, engagement) by considering the social, environmental, and economic aspects of their activities. Instead of considering a government’s role solely as that of a regulator, a much more creative and purposeful cooperation between university and government is possible for creating a regional culture of entrepreneurial initiatives in a rural region.
Shaping off-job life is becoming increasingly important for workers to attain and maintain optimal functioning (i.e., feeling and performing well). Proactively shaping the job domain (referred to as job crafting) has been extensively studied, but crafting in the off-job domain has received markedly less research attention. Based on the Integrative Needs Model of Crafting, needs-based off-job crafting is defined as workers’ proactive and self-initiated changes in their off-job lives that target the satisfaction of psychological needs. Off-job crafting is posited as a possible means for workers to fulfill their needs and enhance well-being and performance over time. We developed a new scale to measure off-job crafting and examined its relationships to optimal functioning in different work contexts in different regions around the world (the United States, Germany, Austria, Switzerland, Finland, Japan, and the United Kingdom). Furthermore, we examined the criterion, convergent, incremental, discriminant, and structural validity evidence of the Needs-based Off-job Crafting Scale using multiple methods (longitudinal and cross-sectional survey studies and an “example generation” task). The results showed that off-job crafting was related to optimal functioning over time, especially in the off-job domain but also in the job domain. Moreover, the novel off-job crafting scale had good convergent and discriminant validity, internal consistency, and test–retest reliability. To conclude, our series of studies in various countries shows that off-job crafting can enhance optimal functioning in different life domains and support people in performing their duties sustainably. Therefore, shaping off-job life may be beneficial in an intensified and continually changing and challenging working life.
From Conclusion to Coda
(2022)
Emotions are associated with the genesis of visually induced motion sickness in virtual reality
(2022)
Visually induced motion sickness (VIMS) is a well-known side effect of virtual reality (VR) immersion, with symptoms including nausea, disorientation, and oculomotor discomfort. Previous studies have shown that pleasant music, odor, and taste can mitigate VIMS symptomatology, but the mechanism by which this occurs remains unclear. We predicted that positive emotions influence the VIMS-reducing effects. To investigate this, we conducted an experimental study with 68 subjects divided into two groups. The groups were exposed to either positive or neutral emotions before and during the VIMS-provoking stimulus. Otherwise, they performed exactly the same task of estimating the time-to-contact while confronted with a VIMS-provoking moving starfield stimulation. Emotions were induced by means of pre-tested videos and with International Affective Picture System (IAPS) images embedded in the starfield simulation. We monitored emotion induction before, during, and after the simulation, using the Self-Assessment Manikin (SAM) valence and arousal scales. VIMS was assessed before and after exposure using the Simulator Sickness Questionnaire (SSQ) and during simulation using the Fast Motion Sickness Scale (FMS) and FMS-D for dizziness symptoms. VIMS symptomatology did not differ between groups, but valence and arousal were correlated with perceived VIMS symptoms. For instance, reported positive valence prior to VR exposure was found to be related to milder VIMS symptoms and, conversely, experienced symptoms during simulation were negatively related to subjects’ valence. This study sheds light on the complex and potentially bidirectional relationship of VIMS and emotions and provides starting points for further research on the use of positive emotions to prevent VIMS.
A precise characterization of substances is essential for the safe handling of explosives. One parameter regularly characterized is the impact sensitivity, typically determined using a drop hammer. However, the results can vary depending on the test method and even the operator, and it is not possible to distinguish the type of decomposition, such as detonation versus deflagration. In this study, a drop hammer apparatus was constructed to monitor the reaction progress and measure the decomposition reactions of four different primary explosives (tetrazene, silver azide, lead azide, lead styphnate) in order to determine the reproducibility of this method. Additionally, further possible evaluation methods were explored to improve on the current binary statistical analysis. To determine whether classification was possible based on extracted features, the responses of the equipped sensor arrays, which measure and monitor the reactions, were studied and evaluated. Features were extracted from these data and evaluated using multivariate methods such as principal component analysis (PCA) and linear discriminant analysis (LDA). The results indicate that although the measurements show substance-specific trends, they also show a large scatter for each substance. By reducing the dimensionality of the extracted features, different sample clusters can be represented, and the calculated loadings allow the parameters significant for classification to be determined. The results also suggest that differentiation of different reaction mechanisms is feasible. Testing of the regressor function shows reliable results considering the comparatively small amount of data.
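The dimensionality-reduction-and-classification pipeline described above can be sketched as follows. The data here are synthetic stand-ins (the real features come from the sensor-array responses), and the nearest-class-mean step is a simplified substitute for the LDA used in the study; only the general PCA-then-classify structure is taken from the text.

```python
import numpy as np

def pca_fit(X, n_components):
    """PCA via SVD: returns (mean, components) for projecting new samples.
    Rows of the returned components are the loadings mentioned in the text."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:n_components]

rng = np.random.default_rng(0)
# Synthetic stand-in: 4 substance classes, 30 drops each,
# 12 extracted features per drop (all sizes are illustrative).
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(30, 12)) for c in range(4)])
y = np.repeat(np.arange(4), 30)

mu, comps = pca_fit(X, n_components=3)
scores = (X - mu) @ comps.T          # samples in the reduced space

# Nearest-class-mean classification in PCA space (a simple stand-in for
# LDA, which would additionally whiten the within-class scatter).
means = np.array([scores[y == c].mean(axis=0) for c in range(4)])
pred = np.argmin(((scores[:, None, :] - means) ** 2).sum(-1), axis=1)
accuracy = (pred == y).mean()
```

Inspecting the rows of `comps` shows which extracted features drive each principal component, mirroring how the loadings are used in the abstract to identify the parameters significant for classification.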
This paper investigates the effect of voltage sensors on the measurement of transient voltages for power semiconductors in a Double Pulse Test (DPT) environment. We adapt previously published models that were developed for current sensors and apply them to voltage sensors to evaluate their suitability for DPT applications. Similarities and differences between transient current and voltage sensors are investigated and the resulting methodology is applied to commercially available and experimental voltage sensors. Finally, a selection aid for given measurement tasks is derived that focuses on the measurement of fast-switching power semiconductors.