H-BRS Bibliography
In this paper, a performance evaluation of Frequency Modulated Chaotic On-Off Keying (FM-COOK) in AWGN, Rayleigh and Rician fading channels is presented. The simulation results show that an improvement in BER can be gained by combining FM modulation with COOK for SNR values below 10 dB in the AWGN case and below 6 dB in the Rayleigh and Rician fading channels.
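As a purely illustrative aside (not the paper's code), the kind of Monte-Carlo BER estimation behind such results can be sketched for plain on-off keying with a threshold detector. The function name, amplitudes, and channel model below are our own assumptions; the FM modulation of the chaotic carrier is not modeled here:

```python
import numpy as np

rng = np.random.default_rng(0)

def ber_ook(snr_db, n=200_000, rayleigh=False):
    """Monte-Carlo bit error rate of plain on-off keying with a
    mid-amplitude threshold detector (channel gain assumed known)."""
    bits = rng.integers(0, 2, n)
    amp = np.sqrt(2.0)                              # '1' amplitude -> unit average symbol energy
    h = rng.rayleigh(scale=1 / np.sqrt(2), size=n) if rayleigh else np.ones(n)
    sigma = np.sqrt(0.5 * 10 ** (-snr_db / 10))     # noise std for the given SNR
    r = h * bits * amp + rng.normal(0.0, sigma, n)  # received samples
    detected = r > h * amp / 2                      # threshold halfway between levels
    return float(np.mean(detected != bits))
```

Sweeping `snr_db` reproduces the familiar waterfall curve in AWGN, while fading flattens it; the low-SNR region is where the abstract reports FM-COOK's gains.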
For many applications, current information about the bandwidth-related metrics of the utilized connection is very useful, as these metrics directly impact the performance of throughput-sensitive applications such as streaming servers, IPTV and VoIP. In the literature, several tools have been proposed to estimate major bandwidth-related metrics such as capacity, available bandwidth and achievable throughput. The vast majority of these tools fall into one of the Packet Pair (PP), Variable Packet Size (VPS), Self-Loading of Periodic Streams (SLoPS) or Throughput approaches. In this study, seven popular bandwidth estimation tools belonging to these four well-known estimation techniques (nettimer, pathrate, pathchar, pchar, clink, pathload and iperf) are presented and experimentally evaluated in a controlled testbed environment. Unlike previous studies, all tools have been uniformly classified and evaluated according to an objective and systematic classification and evaluation scheme. The performance comparison of the tools considers not only the estimation accuracy but also the probing time and the overhead caused.
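As a hedged illustration of the first of those techniques (not any particular tool's implementation), the Packet Pair principle infers bottleneck capacity from the dispersion that two back-to-back packets acquire at the narrow link; the function and argument names here are hypothetical:

```python
def packet_pair_capacity(packet_size_bits, arrival_first_s, arrival_second_s):
    """Packet Pair principle: two packets sent back-to-back leave the
    bottleneck link spaced by L / C, so the capacity is C = L / dispersion."""
    dispersion = arrival_second_s - arrival_first_s
    if dispersion <= 0:
        raise ValueError("second packet must arrive after the first")
    return packet_size_bits / dispersion  # capacity in bit/s

# e.g. 1500-byte packets arriving 1.2 ms apart suggest a ~10 Mbit/s bottleneck
estimate = packet_pair_capacity(1500 * 8, 0.0, 0.0012)
```

Real tools such as pathrate send many pairs and filter the resulting distribution, since cross-traffic distorts individual dispersion samples.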
YAWL (Yet Another Workflow Language) is an open source Business Process Management System, first released in 2003. YAWL grew out of a university research environment to become a unique system that has been deployed worldwide as a laboratory environment for research in Business Process Management and as a productive system in other scientific domains.
When users in virtual reality cannot physically walk and self-motions are instead only visually simulated, spatial updating is often impaired. In this paper, we report on a study that investigated whether HeadJoystick, an embodied leaning-based flying interface, could improve performance in a 3D navigational search task that relies on maintaining situational awareness and spatial updating in VR. We compared it to Gamepad, a standard flying interface. For both interfaces, participants were seated on a swivel chair and controlled simulated rotations by physically rotating. They either leaned (forward/backward, right/left, up/down) or used the Gamepad thumbsticks for simulated translation. In a gamified 3D navigational search task, participants had to find eight balls within 5 min. The balls were hidden amongst 16 randomly positioned boxes in a dark environment devoid of any landmarks. Compared to the Gamepad, participants collected more balls using the HeadJoystick. It also minimized the distance travelled, motion sickness, and mental task demand. Moreover, the HeadJoystick was rated better in terms of ease of use, controllability, learnability, overall usability, and self-motion perception. However, participants rated the HeadJoystick as potentially more physically fatiguing after prolonged use. Overall, participants felt more engaged with the HeadJoystick, enjoyed it more, and preferred it. Together, this provides evidence that leaning-based interfaces like HeadJoystick can provide an affordable and effective alternative for flying in VR and potentially for telepresence drones.
As competition for tourists becomes more global, understanding and accommodating the needs of international tourists, with their different cultural backgrounds, has become increasingly important. This study highlights the variations in tourist industry service, particularly as they relate to different cultures. Specifically, service failures experienced by Japanese and German tourists in the U.S. were categorized using the Critical Incident Technique (CIT). The results were compared with earlier studies of service failures experienced by American consumers in the tourist industry. The sample consists of 128 Japanese and 94 “Germanic” (German, Austrian, Swiss-German) respondents. Both the Japanese and the German sample rated “inappropriate employee behavior” as the most significant category of service failure. More than half of these respondents said that, because of the failure, they would avoid the offending U.S. business, a much stronger response than an American sample had reported in an earlier study. The implications for managers and researchers are discussed.
Airborne and spaceborne platforms are the primary data sources for large-scale forest mapping, but visual interpretation for individual species determination is labor-intensive. Hence, various studies focusing on forests have investigated the benefits of multiple sensors for automated tree species classification. However, transferable deep learning approaches for large-scale applications are still lacking. This gap motivated us to create a novel dataset for tree species classification in central Europe based on multi-sensor data from aerial, Sentinel-1 and Sentinel-2 imagery. In this paper, we introduce the TreeSatAI Benchmark Archive, which contains labels of 20 European tree species (i.e., 15 tree genera) derived from forest administration data of the federal state of Lower Saxony, Germany. We propose models and guidelines for the application of the latest machine learning techniques to the task of tree species classification with multi-label data. Finally, we provide various benchmark experiments showcasing the information that can be derived from the different sensors, using artificial neural networks and tree-based machine learning methods. We found that residual neural networks (ResNet) perform sufficiently well, with weighted precision scores of up to 79 %, using only the RGB bands of aerial imagery. This result indicates that the spatial content present within the 0.2 m resolution data is very informative for tree species classification. With the incorporation of Sentinel-1 and Sentinel-2 imagery, performance improved only marginally. However, the sole use of Sentinel-2 still allows for weighted precision scores of up to 74 % using either multi-layer perceptron (MLP) or Light Gradient Boosting Machine (LightGBM) models. Since the dataset is derived from real-world reference data, it contains high class imbalances.
We found that this dataset attribute negatively affects the models' performances for many of the underrepresented classes (i.e., scarce tree species). However, the class-wise precision of the best-performing late fusion model still reached values ranging from 54 % (Acer) to 88 % (Pinus). Based on our results, we conclude that deep learning techniques using aerial imagery could considerably support forestry administration in the provision of large-scale tree species maps at a very high resolution to plan for challenges driven by global environmental change. The original dataset used in this paper is shared via Zenodo (https://doi.org/10.5281/zenodo.6598390, Schulz et al., 2022). For citation of the dataset, we refer to this article.
Multi-Merger-Szenarien als Herausforderung für das IT-Controlling - Checklisten zur IT-Integration
(2006)
Agiles IT-Controlling
(2022)
While agile methods have enjoyed broad acceptance in IT project management practice for many years, IT controlling still predominantly relies on classical methods. This article examines whether and how the methods used in IT controlling can also follow agile paradigms and how methods from agile IT project management can be adapted.
Trueness and precision of milled and 3D printed root-analogue implants: A comparative in vitro study
(2023)
A company's financial documents use tables alongside text to organize data containing key performance indicators (KPIs), such as profit and loss, and the financial quantities linked to them. The quantity linked to a KPI in a table might not equal the quantity of the similarly described KPI in the text. Auditors spend substantial time manually finding such financial mistakes; this process is called consistency checking. In contrast to existing work, this paper attempts to automate this task with the help of transformer-based models. For consistency checking, it is essential that the embeddings of the table's KPIs encode both the semantic knowledge of the KPIs and the structural knowledge of the table. Therefore, this paper proposes a pipeline that uses a tabular model to obtain the embeddings of the table's KPIs. The pipeline takes table and text KPIs as input, generates their embeddings, and then checks whether these KPIs are identical. The pipeline is evaluated on financial documents in the German language, and a comparative analysis of the quality of the cell embeddings from the three tabular models is also presented. The experiment that used English-translated text and table KPIs and the Tabbie model to generate the table KPI embeddings achieved an accuracy of 72.81% on the consistency checking task, outperforming the benchmark and the other tabular models.
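The final matching step of such a pipeline can be sketched with a minimal example, assuming the embeddings have already been produced by a tabular model; the function names and the similarity threshold below are our own illustrative assumptions, not the paper's method:

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def kpis_match(table_kpi_emb, text_kpi_emb, threshold=0.8):
    """Declare a table KPI and a text KPI identical when their
    embeddings are sufficiently similar (threshold is an assumption)."""
    return cosine_similarity(table_kpi_emb, text_kpi_emb) >= threshold
```

In practice the decision would be learned rather than thresholded by hand, but the sketch shows why the quality of the table KPI embeddings, the focus of the comparative analysis above, drives the accuracy of the whole pipeline.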
AI (artificial intelligence) systems are increasingly being used in all aspects of our lives, from mundane routines to sensitive decision-making and even creative tasks. Therefore, an appropriate level of trust is required so that users know when to rely on the system and when to override it. While research has looked extensively at fostering trust in human-AI interactions, the lack of standardized procedures for human-AI trust makes it difficult to interpret results and compare across studies. As a result, the fundamental understanding of trust between humans and AI remains fragmented. This workshop invites researchers to revisit existing approaches and work toward a standardized framework for studying AI trust to answer the open questions: (1) What does trust mean between humans and AI in different contexts? (2) How can we create and convey the calibrated level of trust in interactions with AI? And (3) How can we develop a standardized framework to address new challenges?
Software testing in a web services environment faces different challenges compared with testing in traditional software environments. Regression testing activities are triggered by software changes or evolutions. In web services, evolution is not a choice for service clients: they always have to use the current, updated version of the software. In addition, test execution or invocation is expensive in web services, and hence providing algorithms to optimize test case generation and execution is vital. In this environment, we proposed several approaches for test case selection in web services regression testing. Testing in this new environment should evolve to become part of the service contract. Service providers should provide data or usage sessions that can help service clients reduce testing expenses by optimizing the selected and executed test cases.
The epithelial sodium channel (ENaC) is a heterotrimeric ion channel that plays a key role in sodium and water homeostasis in tetrapod vertebrates. In the aldosterone-sensitive distal nephron, hormonally controlled ENaC expression matches dietary sodium intake to its excretion. Furthermore, ENaC mediates sodium absorption across the epithelia of the colon, sweat ducts, reproductive tract, and lung. ENaC is a constitutively active ion channel and its expression, membrane abundance, and open probability (PO) are controlled by multiple intracellular and extracellular mediators and mechanisms [9]. Aberrant ENaC regulation is associated with severe human diseases, including hypertension, cystic fibrosis, pulmonary edema, pseudohypoaldosteronism type 1, and nephrotic syndrome [9].
Lignocellulose feedstock (LCF) provides a sustainable source of components to produce bioenergy, biofuels, and novel biomaterials. Besides hardwood and softwood, so-called low-input plants such as Miscanthus are interesting crops to be investigated as potential feedstock for the second-generation biorefinery. The status quo regarding the availability and composition of different plants, including grasses and fast-growing trees (e.g., Miscanthus, Paulownia), is reviewed here. The second focus of this review is the potential of multivariate data processing for biomass analysis and quality control. Experimental data obtained by spectroscopic methods, such as nuclear magnetic resonance (NMR) and Fourier-transform infrared (FTIR) spectroscopy, can be processed using computational techniques to characterize the 3D structure and energetic properties of the feedstock building blocks, including complex linkages. Here, we provide a brief summary of recently reported experimental data for the structural analysis of LCF biomasses, and give our perspectives on the role of chemometrics in understanding and elucidating LCF composition and lignin 3D structure.
Antioxidant activity is an essential aspect of oxygen-sensitive merchandise and goods, such as food and the corresponding packaging, cosmetics, and biomedicine. Technical lignin has not yet been applied as a natural antioxidant, mainly due to its complex heterogeneous structure and polydispersity. This report presents antioxidant capacity studies completed using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) assay. The influence of purification on lignin structure and activity was investigated. The purification procedure showed that double-fold selective extraction is the most efficient (confirmed by ultraviolet-visible (UV/Vis), Fourier transform infrared (FTIR), heteronuclear single quantum coherence (HSQC) and 31P nuclear magnetic resonance spectroscopy, size exclusion chromatography, and X-ray diffraction), resulting in fractions of very narrow polydispersity (3.2–1.6) with up to four distinct absorption bands in UV/Vis spectroscopy. According to differential scanning calorimetry measurements, the glass transition temperature increased from 123 to 185 °C for the purest fraction. Antioxidant capacity is discussed with regard to the biomass source, the pulping process, and the degree of purification. Lignin obtained from industrial black liquor is compared with beech wood samples: the antioxidant activity (DPPH inhibition) of the kraft lignin fractions was 62–68%, whereas beech and spruce/pine-mixed lignin showed values of 42% and 64%, respectively. The total phenol content (TPC) of the isolated kraft lignin fractions varied between 26 and 35%, whereas that of beech and spruce/pine lignin was 33% and 34%, respectively. Storage decreased the TPC values but increased the DPPH inhibition.
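The DPPH inhibition percentages above follow the standard assay arithmetic, comparing the absorbance of the radical solution with and without the sample; as a small sketch (the function name is ours):

```python
def dpph_inhibition_percent(absorbance_control, absorbance_sample):
    """Standard DPPH assay arithmetic:
    inhibition (%) = (A_control - A_sample) / A_control * 100."""
    return (absorbance_control - absorbance_sample) / absorbance_control * 100.0

# e.g. a lignin fraction quenching the DPPH absorbance from 1.00 to 0.35
inhibition = dpph_inhibition_percent(1.00, 0.35)  # 65 % inhibition
```

Higher inhibition means stronger radical scavenging, which is how the kraft fractions (62–68%) compare against the beech and spruce/pine lignins in the text.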
The antiradical and antimicrobial activities of lignin and lignin-based films are both of great interest for applications such as food packaging additives. The polyphenolic structure of lignin, in addition to the presence of O-containing functional groups, is potentially responsible for these activities. This study used DPPH assays to assess the antiradical activity of HPMC/lignin and HPMC/lignin/chitosan films. The scavenging activity (SA) of both binary (HPMC/lignin) and ternary (HPMC/lignin/chitosan) systems was affected by the percentage of added lignin: the 5% addition showed the highest activity and the 30% addition the lowest. Both scavenging activity and antimicrobial activity depend on the biomass source, showing the following trend: organosolv of softwood > kraft of softwood > organosolv of grass. Testing the antimicrobial activities of lignins and lignin-containing films showed high antimicrobial activities against Gram-positive and Gram-negative bacteria at 35 °C and at low temperatures (0-7 °C). Purification of kraft lignin has a negative effect on the antimicrobial activity, while storage has a positive effect. The lignin release from the produced films affected the activity positively, and the chitosan addition enhanced the activity even further for both Gram-positive and Gram-negative bacteria. Testing the films against spoilage bacteria that grow at low temperatures revealed activity of the 30% addition in the HPMC/L1 film against both B. thermosphacta and P. fluorescens, while L5 was active only against B. thermosphacta. In the HPMC/lignin/chitosan films, the 5% addition exhibited activity against both B. thermosphacta and P. fluorescens.
Once aberrantly activated, the Wnt/β-catenin pathway may result in uncontrolled proliferation and eventually cancer. Efforts to counter and inhibit this pathway are mainly directed against β-catenin, as it serves a role in the cytoplasm and the nucleus. In addition, specially generated lymphocytes are recruited for the purpose of treating liver cancer. Peripheral blood mononuclear lymphocytes are expanded by the timely addition of interferon-γ, interleukin (IL)-1β, IL-2 and anti-cluster of differentiation 3 antibody. The resulting cells are called cytokine-induced killer (CIK) cells. The present study utilised these cells and combined them with drugs inhibiting the Wnt pathway in order to examine whether this resulted in an improvement in the killing ability of CIK cells against liver cancer cells. Ethacrynic acid (EA) and ciclopirox olamine (CPX) were determined to be suitable drug candidates, as established by previous studies. The drugs were administered on their own and combined with CIK cells, and a cell viability assay was then performed. The results suggest that EA-treated cells underwent apoptosis and were significantly affected compared with untreated cells. Unlike EA, CPX killed normal as well as cancerous cells even at low concentrations. When EA was combined with CIK cells, the potency of killing was increased and a greater number of cells died, indicating a synergistic action. In summary, EA may be used as an anti-hepatocellular carcinoma drug, while CPX possesses a high toxicity to cancerous as well as normal cells. It is proposed that EA should be integrated into present therapeutic methods for cancer.
Competitions for Benchmarking: Task and Functionality Scoring Complete Performance Assessment
(2015)
This paper presents the preliminary results of the Socialist Republic of Vietnam country case study conducted as part of the research project Sustainable Labour Migration implemented by the Bonn-Rhein-Sieg University of Applied Sciences. The project focuses on stakeholder perspectives on country-of-origin benefits and the sustainability of different transnational skill partnership schemes. Existing and ongoing small-scale initiatives indicate that opportunities exist for all three types of labour mobility pathways: recruiting youth for apprenticeships and subsequent skilled work, recruiting skilled professionals and recognizing their certificates for direct work contracts, and initial vocational education and training programmes in a dual-track approach. While the latter has the highest potential to be more beneficial than the other approaches, pursuing and supporting the scaling-up of all three pathways in parallel will have additional, mutually reinforcing and supporting effects. The potential for benefits over and above those already realised by existing skill partnerships appears high, especially considering the favourable framework conditions specific to the long-standing German-Vietnamese relationship. If the potential of well-managed skill partnerships were realised, such sustainable models of skilled labour migration could serve as a unique selling point in the international competition for skilled labour.
SLC6A14 (ATB0,+) is unique among SLC proteins in its ability to transport 18 of the 20 proteinogenic (dipolar and cationic) amino acids and naturally occurring and synthetic analogues (including anti-viral prodrugs and nitric oxide synthase (NOS) inhibitors). SLC6A14 mediates amino acid uptake in multiple cell types where increased expression is associated with pathophysiological conditions including some cancers. Here, we investigated how a key position within the core LeuT-fold structure of SLC6A14 influences substrate specificity. Homology modelling and sequence analysis identified the transmembrane domain 3 residue V128 as equivalent to a position known to influence substrate specificity in distantly related SLC36 and SLC38 amino acid transporters. SLC6A14, with and without V128 mutations, was heterologously expressed and function determined by radiotracer solute uptake and electrophysiological measurement of transporter-associated current. Substituting the amino acid residue occupying the SLC6A14 128 position modified the binding pocket environment and selectively disrupted transport of cationic (but not dipolar) amino acids and related NOS inhibitors. By understanding the molecular basis of amino acid transporter substrate specificity we can improve knowledge of how this multi-functional transporter can be targeted and how the LeuT-fold facilitates such diversity in function among the SLC6 family and other SLC amino acid transporters.
Pipeline transport is an efficient method for transporting fluids in energy supply and other technical applications. While natural gas is the classical example, the transport of hydrogen is becoming more and more important; both are transmitted under high pressure in a gaseous state. Also relevant is the transport of carbon dioxide, which is captured at the places of formation, transferred under high pressure in a liquid or supercritical state, and pumped into underground reservoirs for storage. The transport of other fluids is also required in technical applications. At the same time, the transport equations for different fluids are essentially the same, and the simulation can be performed using the same methods. In this paper, the effect of control elements such as compressors, regulators and flap traps on the stability of fluid transport simulations is studied. It is shown that modeling these elements can lead to instabilities, both in stationary and in dynamic simulations. Special regularization methods were developed to overcome these problems. Their functionality, also for dynamic simulations, is demonstrated in a number of numerical experiments.
Farming communities confronted with climate change adopt formal and informal adaptation strategies to mitigate the effects of climate change. While the environmental and social effects of climate change are well documented, there is still a dearth of literature on girl-child marriage (formal marriage or informal union between a child under the age of 18 and an adult or another child) as a response to the effects of climate change. In this research, we ask if girl-child marriage is promoted as a social protection mechanism first, rather than as simply a response to climate-induced poverty. We use qualitative semi-structured interviews and focus group discussions to explore this question in a rural farming community in Northern Ghana. Our findings reveal that climate change shocks result in poverty and compel farmers to marry off their young daughters. The unmarried girl-child is perceived as an ‘extra mouth to feed’, a liability whose marriage becomes a strategy for protecting the family, the family’s reputation, and the girl child. The emphasis in girl-child marriage is not on the girl-child as an individual but on the family as a group. Hence, what is good for the family is assumed to be in the best interest of the girl-child. We place our analysis at the intersection of climate change, social protection, and the incidence of girl-child marriages. We argue that understanding this link is crucial and can contribute significantly to our knowledge of girl-child marriage as well as our ability to address this in Sub-Saharan Africa.
The complexity of decision-making in fleet management has increased markedly in the recent past. This raises the demands on the fleet controller to support the fleet manager with decision-relevant information in the role of an internal service provider. Dynamic carbon accounting makes it possible to address the strategic, structural and cultural requirements of fleet controlling instrumentally by combining activity-based costing, target costing, life cycle costing and the ideas of carbon accounting. Depending on the importance of sustainability for corporate success, the associated expenditures can be recorded in an even more differentiated manner. It is conceivable, for example, to integrate the external costs of NOx emissions, non-methane hydrocarbons, particulates, noise and accidents into the calculation. In this way, each vehicle's contribution to meeting emission targets is made transparent and, through targeted integration into the company's controlling process, can be planned and managed. Since the complexity of economic activity can be expected to increase further, the practical need for dynamic, market-oriented controlling instruments, both in general and specifically in fleet controlling, will continue to grow.
Among the celestial bodies in the Solar System, Mars currently represents the main target in the search for life beyond Earth. However, its surface is constantly exposed to high doses of cosmic rays (CRs) that may pose a threat to any biological system. For this reason, investigations into the limits of the resistance of life to space-relevant radiation are fundamental for speculating on the chance of finding extraterrestrial organisms on Mars. In the present work, as part of the STARLIFE project, the responses of dried colonies of the black fungus Cryomyces antarcticus CCFEE 515 (Culture Collection of Fungi from Extreme Environments) to accelerated iron ions (LET: 200 keV/μm), which mimic part of the CR spectrum, were investigated. Samples were exposed to the iron ions at doses of up to 1000 Gy in the presence of Martian regolith analogues. Our results showed an extraordinary resistance of the fungus in terms of survival, recovery of metabolic activity and DNA integrity. These experiments give new insights into the survival probability of possible terrestrial-like life forms in present or past Martian surface and shallow subsurface environments.
A biodegradable blend of PBAT (poly(butylene adipate-co-terephthalate)) and PLA (poly(lactic acid)) for blown film extrusion was modified with four multi-functional chain-extending cross-linkers (CECL). The anisotropic morphology introduced during film blowing affects the degradation processes. Given that two CECL, tris(2,4-di-tert-butylphenyl)phosphite (V1) and 1,3-phenylenebisoxazoline (V2), increased the melt flow rate (MFR) while the other two, aromatic polycarbodiimide (V3) and poly(4,4-dicyclohexylmethanecarbodiimide) (V4), reduced it, their compost (bio-)disintegration behavior was investigated. It was significantly altered with respect to the unmodified reference blend (REF). The disintegration behavior at 30 and 60 °C was investigated by determining changes in mass, Young's moduli, tensile strengths, elongations at break and thermal properties. In order to quantify the disintegration behavior, the hole areas of blown films were evaluated after compost storage at 60 °C to calculate the kinetics of the time-dependent degrees of disintegration. The kinetic model of disintegration provides two parameters, initiation time and disintegration time, which quantify the effects of the CECL on the disintegration behavior of the PBAT/PLA compound. Differential scanning calorimetry (DSC) revealed a pronounced annealing effect during storage in compost at 30 °C, as well as an additional step-like increase in the heat flow at 75 °C after storage at 60 °C. The disintegration consists of processes which affect the amorphous and crystalline phases of PBAT in different ways and cannot be explained by hydrolytic chain degradation alone. Furthermore, gel permeation chromatography (GPC) revealed molecular degradation only at 60 °C for the REF and V1 after 7 days of compost storage. The observed losses of mass and cross-sectional area seem to be attributable more to mechanical decay than to molecular degradation for the given compost storage times.
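A two-parameter kinetic model of disintegration, with an initiation time and a disintegration time, can be illustrated with a minimal sketch; the linear rise between the two parameters is an assumed functional form for illustration, not necessarily the one fitted in the study:

```python
def degree_of_disintegration(t_days, t_init_days, t_dis_days):
    """Illustrative two-parameter kinetics: no measurable hole area before
    the initiation time, then the degree of disintegration rises (here
    linearly, an assumed form) and saturates at 1 once the disintegration
    time has elapsed."""
    if t_days <= t_init_days:
        return 0.0
    return min(1.0, (t_days - t_init_days) / t_dis_days)
```

Fitting the two parameters to the measured hole areas lets the effect of each CECL be compared as a delayed onset (initiation time) versus a slower decay (disintegration time).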
Process-induced changes in the morphology of biodegradable polybutylene adipate terephthalate (PBAT) and polylactic acid (PLA) blends modified with various multifunctional chain-extending cross-linkers (CECLs) are presented. The morphology of unmodified and modified films produced by blown film extrusion is examined in the extrusion direction (ED) and the transverse direction (TD). While FTIR analysis showed only small peak shifts, indicating that the CECLs modify the molecular weight of the PBAT/PLA blend, SEM investigations of the fracture surfaces of blown extrusion films revealed their significant effect on the morphology formed during processing. Due to the combined shear and elongation deformation during blown film extrusion, rather spherical PLA islands were partly transformed into long fibrils, which tended to decay into chains of elliptical islands if cooled slowly. The introduction of CECLs into the blend changed the thickness of the PLA fibrils, modified the interface adhesion, and altered the deformation behavior of the PBAT matrix from brittle to ductile. The results proved that CECLs react selectively with PBAT, PLA, and their interface. Furthermore, the reactions of CECLs with PBAT/PLA induced by the processing depended on the deformation direction (ED or TD), thus resulting in further non-uniformities of blown extrusion films.
This study investigates the effects of four multifunctional chain-extending cross-linkers (CECL) on the processability, mechanical performance, and structure of polybutylene adipate terephthalate (PBAT) and polylactic acid (PLA) blends produced using film blowing technology. The newly developed reference compound (M·VERA® B5029) and the CECL-modified blends are characterized with respect to their initial properties and the corresponding properties after aging at 50 °C for 1 and 2 months. The tensile strength, seal strength, and melt volume rate (MVR) change markedly after thermal aging, whereas the storage modulus, elongation at break, and tear resistance remain constant. The degradation of the polymer chains (indicated by increased MVR) and cross-linking (indicated by decreased MVR) are examined thoroughly with differential scanning calorimetry (DSC), with the results indicating that the CECL-modified blends do not generally endure thermo-oxidation over time. Further, DSC measurements of 25 µm and 100 µm films reveal that film blowing pronouncedly changes the structures of the compounds. These findings are also confirmed by dynamic mechanical analysis, with the conclusion that tris(2,4-di-tert-butylphenyl)phosphite barely affects the glass transition temperature, whereas changes are seen with the other CECLs. Cross-linking is found for the aromatic polycarbodiimide and poly(4,4-dicyclohexylmethanecarbodiimide) CECLs after melting of granules and films, although overall the most synergetic effect is shown by 1,3-phenylenebisoxazoline.
This review is divided into two interconnected parts, namely a biological and a chemical one. The focus of the first part is on the biological background for constructing tissue-engineered vascular grafts to promote vascular healing. Various cell types, such as embryonic, mesenchymal and induced pluripotent stem cells, progenitor cells, and endothelial and smooth muscle cells will be discussed with respect to their specific markers. The in vitro and in vivo models and their potential to treat vascular diseases are also introduced. The chemical part focuses on strategies using either artificial or natural polymers for scaffold fabrication, including decellularized cardiovascular tissue. An overview will be given on scaffold fabrication including conventional methods and nanotechnologies. Special attention is given to 3D network formation via different chemical and physical cross-linking methods. In particular, electron beam treatment is introduced as a method to combine 3D network formation and surface modification. The review includes recently published scientific data and patents which have been registered within the last decade.
(1) Background: Autologous bone is supposed to contain vital cells that might improve the osseointegration of dental implants. The aim of this study was to investigate particulate and filtered bone chips collected during oral surgery intervention with respect to their osteogenic potential and the extent of microbial contamination to evaluate its usefulness for jawbone reconstruction prior to implant placement. (2) Methods: Cortical and cortical-cancellous bone chip samples of 84 patients were collected. The stem cell character of outgrowing cells was characterized by expression of CD73, CD90 and CD105, followed by osteogenic differentiation. The degree of bacterial contamination was determined by Gram staining, catalase and oxidase tests, and tests to identify the genera of the bacteria found. (3) Results: Pre-surgical antibiotic treatment of the patients significantly increased viability of the collected bone chip cells. No significant difference in plasticity was observed between cells isolated from the cortical and cortical-cancellous bone chip samples. Thus, both types of bone tissue can be used for jawbone reconstruction. The osteogenic differentiation was independent of the quantity and quality of the detected microorganisms, which comprise the most common bacteria in the oral cavity. (4) Discussion: This study shows that the quality of bone chip-derived stem cells is independent of the donor site and the extent of present common microorganisms, highlighting autologous bone tissue, assessable without additional surgical intervention for the patient, as a useful material for dental implantology.
MOTIVATION
The majority of biomedical knowledge is stored in structured databases or as unstructured text in scientific publications. This vast amount of information has led to numerous machine learning-based biological applications using either text through natural language processing (NLP) or structured data through knowledge graph embedding models (KGEMs). However, representations based on a single modality are inherently limited.
RESULTS
To generate better representations of biological knowledge, we propose STonKGs, a Sophisticated Transformer trained on biomedical text and Knowledge Graphs (KGs). This multimodal Transformer uses combined input sequences of structured information from KGs and unstructured text data from biomedical literature to learn joint representations in a shared embedding space. First, we pre-trained STonKGs on a knowledge base assembled by the Integrated Network and Dynamical Reasoning Assembler (INDRA) consisting of millions of text-triple pairs extracted from biomedical literature by multiple NLP systems. Then, we benchmarked STonKGs against three baseline models trained on either one of the modalities (i.e., text or KG) across eight different classification tasks, each corresponding to a different biological application. Our results demonstrate that STonKGs outperforms both baselines, especially on the more challenging tasks with respect to the number of classes, improving upon the F1-score of the best baseline by up to 0.084 (i.e., from 0.881 to 0.965). Finally, our pre-trained model as well as the model architecture can be adapted to various other transfer learning applications.
AVAILABILITY
We make the source code and the Python package of STonKGs available at GitHub (https://github.com/stonkgs/stonkgs) and PyPI (https://pypi.org/project/stonkgs/). The pre-trained STonKGs models and the task-specific classification models are respectively available at https://huggingface.co/stonkgs/stonkgs-150k and https://zenodo.org/communities/stonkgs.
SUPPLEMENTARY INFORMATION
Supplementary data are available at Bioinformatics online.
A general method of topological reduction for network problems is presented using the example of gas transport networks. The method is based on the contraction of series, parallel, and tree-like subgraphs for element equations with quadratic, power-law, and general monotone dependencies. It significantly reduces the complexity of the graph and accelerates the solution procedure for stationary network problems, and has been tested on a large set of realistic network scenarios. Possible extensions of the method are described, including triangulated element equations, continuation of the equations at infinity to provide uniqueness of the solution, and the choice of a Newtonian stabilizer for nearly degenerate systems. The method is applicable to various sectors of the energy field, including gas, water, and electric networks, as well as to the coupling of different sectors.
With increasing life expectancy, demands for dental tissue and whole-tooth regeneration are becoming more significant. Despite great progress in medicine, including regenerative therapies, the complex structure of dental tissues introduces several challenges to the field of regenerative dentistry. Interdisciplinary efforts from cellular biologists, material scientists, and clinical odontologists are being made to establish strategies and find solutions for dental tissue regeneration and/or whole-tooth regeneration. In recent years, many significant discoveries were made regarding signaling pathways and factors shaping calcified tissue genesis, including those of the tooth. Novel biocompatible scaffolds and polymer-based drug release systems are under development and may soon result in clinically applicable biomaterials with the potential to modulate signaling cascades involved in dental tissue genesis and regeneration. Approaches for whole-tooth regeneration utilizing adult stem cells, induced pluripotent stem cells, or tooth germ cell transplantation are emerging as promising alternatives to overcome existing in vitro tissue generation hurdles. In this interdisciplinary review, the most recent advances in cellular signaling guiding dental tissue genesis, novel functionalized scaffolds and drug release materials, various odontogenic cell sources, and methods for tooth regeneration are discussed, thus providing a multi-faceted, up-to-date, and illustrative overview of the tooth regeneration matter, alongside hints for future directions in the challenging field of regenerative dentistry.
The temperature of photovoltaic modules is modelled as a dynamic function of ambient temperature, shortwave and longwave irradiance and wind speed, in order to allow for a more accurate characterisation of their efficiency. A simple dynamic thermal model is developed by extending an existing parametric steady-state model using an exponential smoothing kernel to include the effect of the heat capacity of the system. The four parameters of the model are fitted to measured data from three photovoltaic systems in the Allgäu region in Germany using non-linear optimisation. The dynamic model reduces the root-mean-square error between measured and modelled module temperature to 1.58 K on average, compared to 3.03 K for the steady-state model, whereas the maximum instantaneous error is reduced from 20.02 to 6.58 K.
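The extension of a steady-state module temperature model with an exponential smoothing kernel can be sketched as follows. The Faiman-like form of the steady-state model and the parameter names (u0, u1, tau) are assumptions for illustration; the paper fits four parameters to an existing parametric model whose exact form is not reproduced here:

```python
import math

def steady_state_temp(t_amb, g, v, u0, u1):
    """Assumed Faiman-style steady-state module temperature:
    ambient temperature plus irradiance-driven heating, reduced by
    wind-dependent convective losses."""
    return t_amb + g / (u0 + u1 * v)

def dynamic_temp(t_amb, g, v, u0, u1, tau, dt):
    """Exponentially smooth the steady-state prediction with time
    constant tau (seconds) to mimic the heat capacity of the module:
    the modelled temperature relaxes toward the steady-state value
    instead of jumping with every irradiance fluctuation."""
    t_mod = steady_state_temp(t_amb[0], g[0], v[0], u0, u1)
    alpha = 1.0 - math.exp(-dt / tau)   # smoothing weight per time step
    out = [t_mod]
    for ta, gi, vi in zip(t_amb[1:], g[1:], v[1:]):
        t_ss = steady_state_temp(ta, gi, vi, u0, u1)
        t_mod += alpha * (t_ss - t_mod)
        out.append(t_mod)
    return out
```

Under constant conditions the dynamic model converges to the steady-state value; under fluctuating irradiance it low-pass filters the response, which is what reduces the large instantaneous errors of the purely steady-state model.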
Solar photovoltaic power output is modulated by atmospheric aerosols and clouds and thus contains valuable information on the optical properties of the atmosphere. As a ground-based data source with high spatiotemporal resolution it has great potential to complement other ground-based solar irradiance measurements as well as those of weather models and satellites, thus leading to an improved characterisation of global horizontal irradiance. In this work several algorithms are presented that can retrieve global tilted and horizontal irradiance and atmospheric optical properties from solar photovoltaic data and/or pyranometer measurements. The method is tested on data from two measurement campaigns that took place in the Allgäu region in Germany in autumn 2018 and summer 2019, and the results are compared with local pyranometer measurements as well as satellite and weather model data. Using power data measured at 1 Hz and averaged to 1 min resolution along with a non-linear photovoltaic module temperature model, global horizontal irradiance is extracted with a mean bias error compared to concurrent pyranometer measurements of 5.79 W m−2 (7.35 W m−2) under clear (cloudy) skies, averaged over the two campaigns, whereas for the retrieval using coarser 15 min power data with a linear temperature model the mean bias error is 5.88 and 41.87 W m−2 under clear and cloudy skies, respectively.
During completely overcast periods the cloud optical depth is extracted from photovoltaic power using a lookup table method based on a 1D radiative transfer simulation, and the results are compared to both satellite retrievals and data from the Consortium for Small-scale Modelling (COSMO) weather model. Potential applications of this approach for extracting cloud optical properties are discussed, as well as certain limitations, such as the representation of 3D radiative effects that occur under broken-cloud conditions. In principle this method could provide an unprecedented amount of ground-based data on both irradiance and optical properties of the atmosphere, as long as the required photovoltaic power data are available and properly pre-screened to remove unwanted artefacts in the signal. Possible solutions to this problem are discussed in the context of future work.
Thermo-chemical conversion of cucumber peel waste for biobased energy and chemical production
(2022)
Object-Based Trace Model for Automatic Indicator Computation in the Human Learning Environments
(2021)
This paper proposes a trace model in the form of an object or class model (in the UML sense) that allows the automatic computation of indicators of various kinds, independently of the computer environment for human learning (CEHL). The model is based on the establishment of a trace-based system that encompasses all the logic of trace collection and indicator computation, implemented in the form of a trace database. It is an important contribution to the exploitation of learning traces in a CEHL because it provides a general formalism for modeling traces that allows several indicators to be computed at the same time. Moreover, by including computed indicators as potential learning traces, the model provides a formalism for classifying the various indicators in the form of inheritance relationships, which promotes the reuse of indicators already computed. Economically, the model can allow organizations with different learning platforms to invest in only one trace management system. At the social level, it can enable better sharing of trace databases between the various research institutions in the field of CEHL.
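The key design idea above, treating computed indicators as traces so they can feed further indicator computations, can be sketched with a minimal class model. All class and field names here are hypothetical illustrations, not taken from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class Trace:
    """A raw learning trace collected from any CEHL platform."""
    actor: str
    action: str
    value: float = 0.0

@dataclass
class Indicator(Trace):
    """A computed indicator, modelled as a subclass of Trace so that
    indicators are themselves traces and can be reused as inputs for
    further indicator computations (the inheritance relationship)."""
    sources: list = field(default_factory=list)

class TraceBase:
    """A platform-independent trace-based system (trace database)."""
    def __init__(self):
        self.traces = []

    def collect(self, trace):
        self.traces.append(trace)

    def compute(self, name, actor, combine):
        """Compute an indicator over all of an actor's trace values
        and store it back into the database as a new trace."""
        values = [t.value for t in self.traces if t.actor == actor]
        ind = Indicator(actor, name, combine(values), sources=values)
        self.collect(ind)   # reuse: the indicator is a trace too
        return ind
```

A second indicator computed for the same actor would automatically include the first indicator's value among its inputs, which is the reuse mechanism the model aims at.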
Telepresence robots allow users to be spatially and socially present in remote environments. Yet, it can be challenging to remotely operate telepresence robots, especially in dense environments such as academic conferences or workplaces. In this paper, we primarily focus on the effect that a speed control method, which automatically slows the telepresence robot down when getting closer to obstacles, has on user behaviors. In our first user study, participants drove the robot through a static obstacle course with narrow sections. Results indicate that the automatic speed control method significantly decreases the number of collisions. For the second study we designed a more naturalistic, conference-like experimental environment with tasks that require social interaction, and collected subjective responses from the participants when they were asked to navigate through the environment. While about half of the participants preferred automatic speed control because it allowed for smoother and safer navigation, others did not want to be influenced by an automatic mechanism. Overall, the results suggest that automatic speed control simplifies the user interface for telepresence robots in static dense environments, but should be considered as optionally available, especially in situations involving social interactions.
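A distance-dependent speed cap of the kind studied above can be sketched as a linear ramp. The paper does not specify its control law; the thresholds, the linear interpolation, and the function name are all assumptions for illustration:

```python
def capped_speed(v_cmd, d_obstacle, d_slow=1.5, d_stop=0.3, v_max=1.0):
    """Scale the operator's commanded speed (m/s) down linearly once
    the nearest obstacle is closer than d_slow metres, reaching a full
    stop at d_stop. All numeric thresholds are illustrative defaults,
    not values from the paper."""
    if d_obstacle <= d_stop:
        return 0.0
    if d_obstacle >= d_slow:
        return min(v_cmd, v_max)
    scale = (d_obstacle - d_stop) / (d_slow - d_stop)
    return min(v_cmd, v_max) * scale
```

The operator keeps full control in open space, while near obstacles the cap shrinks continuously, which is what makes collisions less likely without an explicit stop command.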
Salts and proteins comprise two of the basic molecular components of biological materials. Kosmotropic/chaotropic co-solvation and matching ion water affinities explain basic ionic effects on protein aggregation observed in simple solutions. However, it is unclear how these theories apply to proteins in complex biological environments and what the underlying ionic binding patterns are. Using the positive ion Ca2+ and the negatively charged membrane protein SNAP25, we studied ion effects on protein oligomerization in solution, in native membranes and in molecular dynamics (MD) simulations. We find that concentration-dependent ion-induced protein oligomerization is a fundamental chemico-physical principle applying not only to soluble but also to membrane-anchored proteins in their native environment. Oligomerization is driven by the interaction of Ca2+ ions with the carboxylate groups of aspartate and glutamate. From low up to middle concentrations, salt bridges between Ca2+ ions and two or more protein residues lead to increasingly larger oligomers, while at high concentrations oligomers disperse due to overcharging effects. The insights provide a conceptual framework at the interface of physics, chemistry and biology to explain binding of ions to charged protein surfaces on an atomistic scale, as occurring during protein solubilisation, aggregation and oligomerization both in simple solutions and membrane systems.
Defect evolution in thermal barrier coating systems under multi-axial thermomechanical loading
(2005)
Advanced thermal gradient mechanical fatigue testing of CMSX-4 with an oxidation protection coating
(2008)
The EN ISO 13849-1 standard places explicit requirements on safety-related PLC software. How can these be implemented practically in mechanical engineering? This question was addressed by a project funded by the DGUV and carried out at Hochschule Bonn-Rhein-Sieg. This article outlines an approach for implementing the normative requirements. The approach is independent of the safety PLC used and is therefore generally applicable. Reference is made to a total of ten documented examples and a detailed research report, all available for download.
DeltaV Neural is a software application within the DeltaV process automation system that enables the user to configure soft sensors in a simple way. Soft sensors estimate or predict process output variables that are difficult to measure, or that can only be determined at long intervals, by means of surrogate variables that are simpler and faster to measure.
Real-time performance monitoring (RTPM) has received increasing attention in automation technology in recent years. Three selected aspects of RTPM are covered: alarm analysis, controller performance, and actuators. The reduction of alarm messages by means of alarm analysis is illustrated with industrial examples. The goals of such an analysis are to identify (1) incorrect alarm limits, (2) controllers whose disturbances are compensated in manual mode, (3) controllers whose operating-point changes are carried out in manual mode, (4) controllers with manipulated variables at 0% or 100%, (5) incorrect controller parameters, and (6) faults in instrumentation, drives, dampers, or valves. The industrial application of controller performance monitoring is explained using DeltaV Inspect, a software product integrated into the DeltaV process automation system from Emerson Process Management. DeltaV Inspect monitors and evaluates (1) range violations of the controlled variables and actuating signals, (2) the operating modes (manual or automatic), and (3) the control performance. For a constant setpoint and stochastic disturbances, the control performance is computed from the difference between the actual variance of the control error and the theoretically achievable variance. Instead of a correlation or regression analysis, the theoretically achievable minimum variance is computed from the current variance of the control error and the variance of the differences between successive control-error samples.
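A performance index of the kind described above, comparing the total variance of the control error with a short-term "capability" variance estimated from successive error differences, can be sketched as follows. The function name and the exact index definition are illustrative assumptions, not the formula used in DeltaV Inspect:

```python
import math

def variability_index(errors):
    """Compare the total spread of the control error (s_tot) with a
    short-term 'capability' spread (s_cap) estimated from successive
    differences. For purely random errors s_cap is close to s_tot and
    the index is near 0; for slowly drifting errors s_cap is much
    smaller than s_tot and the index approaches 1 (poor regulation)."""
    n = len(errors)
    mean = sum(errors) / n
    s_tot2 = sum((e - mean) ** 2 for e in errors) / (n - 1)
    # mean-square-successive-difference estimator of short-term variance
    s_cap2 = sum((b - a) ** 2 for a, b in zip(errors, errors[1:])) / (2 * (n - 1))
    return 1.0 - math.sqrt(s_cap2 / s_tot2)
```

The appeal of this construction is that it needs only the control-error samples themselves, with no plant model and no correlation or regression analysis.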
With its decision of 7 April 2010, the First Senate of the BFH referred to the Grand Senate the question of whether the subjective error concept should be retained with respect to facts that become known only after the balance sheet has been prepared, but abandoned with respect to improved legal understanding arising after balance sheet preparation. The Grand Senate faces a difficult decision here, since substantive tax accounting law, procedural law, and commercial law overlap on this question. As early as 1991, this area was described as a hopeless and tangled labyrinth in which the principles are hard to discern. In addition, the Grand Senate must weigh, from the standpoint of continuity of case law and legal certainty, whether it is justified to abandon case law that has stood for fifty years. The Grand Senate's decision was delayed by the change of president at the BFH but is now expected in the near future.
In the course of the migration movement of 2015 and 2016, the humane accommodation of refugees in German municipalities gained attention. The rise in the number of asylum seekers in the municipalities, together with the federal initiative "Protection of refugees in refugee accommodation", brought about changes in protection standards in municipal refugee accommodation. This article explains these changes using an actor-centred, organizational-sociological approach. It is based on empirical findings from the project "Organisational Perspectives on Human Security Standards for Refugees in Germany" in two German municipalities.
I. Introduction II. Social protection as part of development policy agendas – an international perspective III. International policy diffusion and national policy change – conceptual foundations IV. The role of international policy diffusion in the transformation of social protection systems – empirical evidence V. Conclusions
The cooperation between researchers and practitioners during the different stages of the research process is promoted because it can benefit both society and research, supporting processes of 'transformation'. While acknowledging the important potential of research–practice collaborations (RPCs), this paper reflects on RPCs from a political-economic perspective in order to also address potential unintended adverse effects on knowledge generation arising from divergent interests, incomplete information, or the unequal distribution of resources. Asymmetries between actors may produce distorted and biased knowledge and may even create or exacerbate existing inequalities. The potential merits and limitations of RPCs therefore need to be gauged. Taking RPCs seriously requires paying attention to these possible tensions, both in general and with respect to international development research in particular: on the one hand, attempts to contribute to societal change and ethical concerns of equity lie at the heart of international development research; on the other hand, such research is especially likely to encounter the asymmetries described above.
Purpose – To describe the development of a novel polyether(meth)acrylate-based resin material class for stereolithography with alterable material characteristics.
Design/methodology/approach – A complete overview of details of the composition parameters, the optimization and the bandwidth of mechanical and processing parameters is given. Initial biological characterization experiments and future application fields are depicted. Process parameters are studied in a commercial 3D Systems Viper stereolithography system, and a new method to determine these parameters is described herein.
Findings – Initial biological characterizations show the non-toxic behavior in a biological environment, caused mainly by the (meth)acrylate-based core components. These photolithographic resins combine an adjustable low Young's modulus with the advantages of a non-toxic (meth)acrylate-based process material. In contrast to the mostly rigid process materials used today in the rapid prototyping industry, these polymeric formulations are able to fulfill the extended need for a soft engineering material. A short overview of sample applications is given.
Practical implications – These polymeric formulations are able to meet the growing demand for a resin class for rapid manufacturing that covers a bandwidth from softer to stiffer materials.
Originality/value – This paper gives an overview of the newly developed material class for stereolithography and should therefore be of high interest to readers interested in novel rapid manufacturing materials and technology.
Cathepsin K (CatK) is a target for the treatment of osteoporosis, arthritis, and bone metastasis. Peptidomimetics with a cyanohydrazide warhead represent a new class of highly potent CatK inhibitors; however, their binding mechanism is unknown. We investigated two model cyanohydrazide inhibitors with differently positioned warheads: an azadipeptide nitrile Gü1303 and a 3-cyano-3-aza-β-amino acid Gü2602. Crystal structures of their covalent complexes were determined with mature CatK as well as a zymogen-like activation intermediate of CatK. Binding mode analysis, together with quantum chemical calculations, revealed that the extraordinary picomolar potency of Gü2602 is entropically favoured by its conformational flexibility at the nonprimed-primed subsites boundary. Furthermore, we demonstrated by live cell imaging that cyanohydrazides effectively target mature CatK in osteosarcoma cells. Cyanohydrazides also suppressed the maturation of CatK by inhibiting the autoactivation of the CatK zymogen. Our results provide structural insights for the rational design of cyanohydrazide inhibitors of CatK as potential drugs.
When optimizing the process parameters of the acidic ethanolic organosolv process, the aim is usually to maximize the delignification and/or lignin purity. However, process parameters such as temperature, time, ethanol and catalyst concentration, respectively, can also be used to vary the structural properties of the obtained organosolv lignin, including the molecular weight and the ratio of aliphatic versus phenolic hydroxyl groups, among others. This review particularly focuses on these influencing factors and establishes a trend analysis between the variation of the process parameters and the effect on lignin structure. Especially when larger data sets are available, as for process temperature and time, correlations between the distribution of depolymerization and condensation reactions are found, which allow direct conclusions on the proportion of lignin's structural features, independent of the diversity of the biomass used. The newfound insights gained from this review can be used to tailor organosolv lignins isolated for a specific application.
Miscanthus crops possess very attractive properties such as high photosynthesis yield and carbon fixation rate. Because of these properties, they are currently considered for use in second-generation biorefineries. Here we analyze the differences in chemical composition between M. x giganteus, a commonly studied Miscanthus genotype, and M. nagara, which is relatively understudied but has useful properties such as increased frost resistance and higher stem stability. Samples of M. x giganteus (Gig35) and M. nagara (NagG10) have been separated by plant portion (leaves and stems) in order to isolate the corresponding lignins. The organosolv process was used for biomass pulping (80% ethanol solution, 170 °C, 15 bar). Biomass composition and lignin structure analysis were performed using composition analysis, Fourier-transform infrared (FTIR), ultraviolet-visible (UV-Vis) and nuclear magnetic resonance (NMR) spectroscopy, thermogravimetric analysis (TGA), size exclusion chromatography (SEC) and pyrolysis gas-chromatography/mass spectrometry (Py-GC/MS) to determine the 3D structure of the isolated lignins, the monolignol ratio and the most abundant linkages depending on genotype and harvesting season. SEC data showed significant differences in the molecular weights and polydispersity indices of stem- versus leaf-derived lignins. Py-GC/MS and hetero-nuclear single quantum correlation (HSQC) NMR revealed different monolignol compositions for the two genotypes (Gig35, NagG10). The monolignol ratio is slightly influenced by the time of harvest: stem-derived lignins of M. nagara showed increasing H and decreasing G unit content over the studied harvesting period (December–April).
As a low-input crop, Miscanthus offers numerous advantages that, in addition to agricultural applications, permit its exploitation for energy, fuel, and material production. Depending on the Miscanthus genotype, season, and harvest time as well as plant component (leaf versus stem), correlations between structure and properties of the corresponding isolated lignins differ. Here, a comparative study is presented between lignins isolated from M. x giganteus, M. sinensis, M. robustus and M. nagara using a catalyst-free organosolv pulping process. The lignins from different plant constituents are also compared regarding their similarities and differences in monolignol ratio and important linkages. Results showed that the plant genotype has the weakest influence on monolignol content and interunit linkages. In contrast, structural differences are more significant among lignins of different harvest time and/or season. Analyses were performed using fast and simple methods such as nuclear magnetic resonance (NMR) spectroscopy. Data was assigned to four different linkages (A: β-O-4 linkage, B: phenylcoumaran, C: resinol, D: β-unsaturated ester). In conclusion, the A content is particularly high in leaf-derived lignins at just under 70%, and significantly lower in stem and mixture lignins at around 60% and almost 65%, respectively. The second most common linkage pattern in all isolated lignins is D, the proportion of which is also strongly dependent on the crop portion. Both stem and mixture lignins have a relatively high share of approximately 20% or more (the maximum is M. sinensis Sin2 with over 30%). In the leaf-derived lignins, the proportions are significantly lower on average. Stem samples should be chosen if the highest possible lignin content is desired, specifically from the M. x giganteus genotype, which revealed lignin contents of up to 27%. Due to its better frost resistance and higher stem stability, M. nagara offers some advantages compared to M. x giganteus.
Miscanthus crops are shown to be a very attractive lignocellulose feedstock (LCF) for second-generation biorefineries and lignin generation in Europe.
Miscanthus x giganteus Stem Versus Leaf-Derived Lignins Differing in Monolignol Ratio and Linkage
(2019)
As a renewable resource, Miscanthus offers numerous advantages such as high photosynthesis activity (as a C4 plant) and an exceptional CO2 fixation rate. These properties make Miscanthus very attractive for industrial exploitation, such as lignin generation. In this paper, we present a systematic study analyzing the correlation of the lignin structure with the Miscanthus genotype and plant portion (stem versus leaf). Specifically, the ratio of the three monolignols and corresponding building blocks as well as the linkages formed between the units have been studied. The lignin amount has been determined for M. x giganteus (Gig17, Gig34, Gig35), M. nagara (NagG10), M. sinensis (Sin2), and M. robustus (Rob4) harvested at different time points (September, December, and April). The influence of the Miscanthus genotype and plant component (leaf vs. stem) has been studied to develop corresponding structure-property relationships (i.e., correlations in molecular weight, polydispersity, and decomposition temperature). Lignin isolation was performed using non-catalyzed organosolv pulping, and the structure analysis includes compositional analysis, Fourier transform infrared (FTIR), ultraviolet/visible (UV-Vis), and hetero-nuclear single quantum correlation nuclear magnetic resonance (HSQC-NMR) spectroscopy, thermogravimetric analysis (TGA), and pyrolysis gas chromatography/mass spectrometry (GC/MS). Structural differences were found for stem- and leaf-derived lignins. Compared to beech wood lignins, Miscanthus lignins possess lower molecular weight and narrow polydispersities (<1.5 Miscanthus vs. >2.5 beech) corresponding to improved homogeneity. In addition to conventional univariate analysis of FTIR spectra, multivariate chemometrics revealed distinct differences for aromatic in-plane deformations of stem- versus leaf-derived lignins. These results emphasize the potential of Miscanthus as a low-input resource and of Miscanthus-derived lignin as a promising agricultural feedstock.
There has been a growing interest in taste research in the HCI and CSCW communities. However, the focus has been more on stimulating the senses, while the socio-cultural aspects have received less attention. Yet individual taste perception is mediated through social interaction and collective negotiation and does not depend on physical stimulation alone. We therefore study the digital mediation of taste, drawing on ethnographic research on four online wine tastings and one self-organized event. We investigated the materials, associated meanings, competences, procedures, and engagements that shaped the performative character of tasting practices. We illustrate how the tastings are built around the taste-making process and how online contexts differ in providing a more diverse and distributed environment. We then explore the implications of our findings for the further mediation of taste as a social and democratized phenomenon through online interaction.
Herein we report an update to ACPYPE, a Python3 tool that now properly converts AMBER to GROMACS topologies for force fields that utilize nondefault and nonuniform 1–4 electrostatic and nonbonded scaling factors or negative dihedral force constants. Prior to this work, ACPYPE only converted AMBER topologies that used uniform, default 1–4 scaling factors and positive dihedral force constants. We demonstrate that the updated ACPYPE accurately transfers the GLYCAM06 force field from AMBER to GROMACS topology files, which employs non-uniform 1–4 scaling factors as well as negative dihedral force constants. Validation was performed using β-d-GlcNAc through gas-phase analysis of dihedral energy curves and probability density functions. The updated ACPYPE retains all of its original functionality, but now allows the simulation of complex glycomolecular systems in GROMACS using AMBER-originated force fields. ACPYPE is available for download at https://github.com/alanwilter/acpype.
Human butyrylcholinesterase (BChE) is a glycoprotein capable of bioscavenging toxic compounds such as organophosphorus (OP) nerve agents. For commercial production of BChE, it is practical to synthesize BChE in non-human expression systems, such as plants or animals. However, the glycosylation profile in these systems is significantly different from the human glycosylation profile, which could result in changes in BChE's structure and function. From our investigation, we found that the glycan attached to ASN241 is both structurally and functionally important due to its close proximity to the BChE tetramerization domain and the active site gorge. To investigate the effects of populating glycosylation site ASN241, monomeric human BChE glycoforms were simulated with and without site ASN241 glycosylated. Our simulations indicate that the structure and function of human BChE are significantly affected by the absence of glycan 241.
Updating a shared data structure in a parallel program is usually done with some sort of high-level synchronization operation to ensure correctness and consistency. Such high-level synchronization operations are realized with appropriate low-level atomic synchronization instructions provided by the target processor architecture. These instructions are costly and often limited in their scalability on larger multi-core/multi-processor systems. In this paper, a technique is discussed that replaces atomic updates of a shared data structure with ordinary and cheaper read/write operations. The conditions that must be fulfilled to ensure overall correctness of the program despite the missing synchronization are specified. The advantage of this technique is the reduction of access costs as well as better scalability due to elided atomic operations. On the other hand, the missing synchronization may cause additional work to be done. Additional work is therefore traded against costly atomic operations. A practical application is shown with level-synchronous parallel Breadth-First Search on an undirected graph where two vertex frontiers are accessed in parallel. This application scenario is also used for an evaluation of the technique. Tests were done on four different large parallel systems with up to 64-way parallelism. It is shown that, for the graph application examined, the amount of additional work caused by the missing synchronization is negligible and the performance is almost always better than the approach with atomic operations.
SpMV Runtime Improvements with Program Optimization Techniques on Different Abstraction Levels
(2016)
The multiplication of a sparse matrix with a dense vector is a performance-critical computational kernel in many applications, especially in the natural and engineering sciences. To speed up this operation, many optimization techniques have been developed in the past, mainly focusing on the data layout for the sparse matrix. Strongly related to the data layout is the program code for the multiplication. But even for a fixed data layout with an accommodated kernel, there are several alternatives for program optimizations. This paper discusses a spectrum of program optimization techniques on different abstraction layers for six different sparse matrix data formats and kernels. At one end of the spectrum, compiler options can be used that hide from the programmer all optimizations done internally by the compiler. At the other end of the spectrum, a multiplication kernel can be programmed using highly sophisticated assembly-level intrinsics, which requires a programmer with a deep understanding of processor architectures. These special instructions can be used to efficiently utilize hardware features of processors, such as vector units, that have the potential to speed up sparse matrix computations. The paper compares the programming effort and required knowledge level for certain program optimizations in relation to the gained runtime improvements.
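As a point of reference for the kernels discussed above, a baseline sparse matrix-vector product over the common compressed sparse row (CSR) layout can be sketched as follows; CSR is used here for illustration and is not necessarily one of the six formats evaluated in the paper:

```python
def spmv_csr(values, col_idx, row_ptr, x):
    """Baseline y = A @ x for a matrix A stored in CSR format:
    values  - nonzero entries, row by row
    col_idx - column index of each nonzero
    row_ptr - start offset of each row in values (len = n_rows + 1)"""
    n_rows = len(row_ptr) - 1
    y = [0.0] * n_rows
    for i in range(n_rows):
        s = 0.0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            s += values[k] * x[col_idx[k]]
        y[i] = s
    return y

# A = [[4, 0, 1],
#      [0, 2, 0],
#      [3, 0, 5]]
values  = [4.0, 1.0, 2.0, 3.0, 5.0]
col_idx = [0, 2, 1, 0, 2]
row_ptr = [0, 2, 3, 5]
print(spmv_csr(values, col_idx, row_ptr, [1.0, 2.0, 3.0]))  # [7.0, 4.0, 18.0]
```

The inner loop over `row_ptr[i]..row_ptr[i+1]` is exactly where the abstraction layers diverge: a compiler may auto-vectorize it from the plain source, while a hand-tuned variant would express it directly with vector intrinsics.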
Industrie 4.0: Digital Economy – Challenge and Opportunity for Companies and the Working World
(2015)
Since the mid-1990s, new information and communication technologies have been used in the working world, to a growing extent and with growing importance. Increasing digitalization is changing both the economy and society. There is even talk of a "fourth industrial revolution", as traditional business models come under pressure.
Universities are addressing the question of how courses and research projects can be supported and complemented by the use of digital tools and learning platforms. Libraries often play a key role here, in quite different ways: as content developers and providers, as support desks, or as e-learning advisory services for teachers and students.