H-BRS Bibliography
Departments, institutes and facilities
- Fachbereich Informatik (86)
- Fachbereich Wirtschaftswissenschaften (69)
- Fachbereich Angewandte Naturwissenschaften (57)
- Fachbereich Ingenieurwissenschaften und Kommunikation (55)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (46)
- Fachbereich Sozialpolitik und Soziale Sicherung (40)
- Institute of Visual Computing (IVC) (18)
- Institut für Verbraucherinformatik (IVI) (16)
- Institut für funktionale Gen-Analytik (IFGA) (16)
- Graduierteninstitut (15)
Document Type
- Article (122)
- Conference Object (65)
- Part of a Book (43)
- Book (monograph, edited volume) (23)
- Preprint (18)
- Doctoral Thesis (15)
- Report (10)
- Contribution to a Periodical (6)
- Master's Thesis (6)
- Working Paper (5)
Year of publication
- 2020 (325)
Keywords
- Digitalisierung (5)
- Inborn error of metabolism (3)
- Lehrbuch (3)
- Organic aciduria (3)
- Quality diversity (3)
- Usable Security (3)
- post-buckling (3)
- ARIMA (2)
- Artificial Intelligence (2)
- Autoencoder (2)
In the present work, kraft lignin was isolated from black liquor by acidic precipitation for use as a macromonomer in the synthesis of thermoplastic polyurethanes with high molar mass. The raw material's initial molar mass was characterized by gel permeation chromatography against a polystyrene standard, which proved to be a very helpful analytical method. Since the kraft lignin was intended to replace the classical polyol component in polyurethane synthesis, its hydroxyl content had to be determined. For this purpose, an established wet-chemical procedure for determining the hydroxyl content of polyols used in polyurethane synthesis was adapted. The reaction time of the kraft lignin acetylation was varied; extending it from 1 h to 3 h drastically reduced the measurement error from 25.5% to 3.6%. To judge whether this accuracy is within the range expected for a wet-chemical procedure with manual titration, the hydroxyl contents of ethanediol and sucrose were also determined; these served as reference substances with defined and known hydroxyl contents. The measurements yielded errors of 2.2% for ethanediol and 1.4% for sucrose. Given the time required, a measurement error of 3.6% is acceptable.
For the synthesis of thermoplastic polyurethanes, kraft lignin was reacted with methylene diphenyl diisocyanate in dimethylacetamide using tin octoate as catalyst, varying the NCO/OH ratio and the reaction time. The synthesized polyurethanes were analyzed by Ubbelohde capillary viscometry, Fourier-transform infrared spectroscopy, and melting point determination. The FTIR spectra confirmed the successful synthesis of polyurethanes from kraft lignin and methylene diphenyl diisocyanate and showed that varying the NCO/OH ratio and the reaction time has no influence on the basic chemical structure of the polyurethane. Ubbelohde capillary viscometry confirmed the thermoplastic properties of the synthesized polyurethane, which can be processed in a thermoplastic wet process. It also revealed the dependence of the polyurethane's molar mass on reaction time and NCO/OH ratio: the molar mass increases with longer reaction time and decreasing NCO/OH ratio. The latter observation is even practically advantageous in view of the health hazards of isocyanates, since it allows their use to be reduced. To investigate the melt processability of the synthesized polyurethane, the melting points of the polymers were determined. In the temperature range of 25 °C to 410 °C, no change of physical state was observed, only a decomposition reaction.
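The wet-chemical hydroxyl determination described above rests on the standard back-titration relation for the hydroxyl number (mg KOH per g of sample). The abstract gives no volumes or concentrations, so the function and example values below are purely illustrative assumptions:

```python
def hydroxyl_number(v_blank_ml, v_sample_ml, c_koh_mol_l, mass_g):
    """Hydroxyl number in mg KOH per g of sample from a back-titration.

    v_blank_ml  : KOH volume consumed by the blank run (mL)
    v_sample_ml : KOH volume consumed by the acetylated sample (mL)
    c_koh_mol_l : concentration of the KOH titrant (mol/L)
    mass_g      : sample mass (g)
    """
    M_KOH = 56.1  # molar mass of KOH (g/mol)
    return (v_blank_ml - v_sample_ml) * c_koh_mol_l * M_KOH / mass_g

# Illustrative values only (not taken from the thesis):
ohn = hydroxyl_number(50.0, 42.5, 0.5, 1.0)  # → 210.375 mg KOH/g
```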
4GREAT is an extension of the German Receiver for Astronomy at Terahertz frequencies (GREAT) operated aboard the Stratospheric Observatory for Infrared Astronomy (SOFIA). The spectrometer comprises four different detector bands and their associated subsystems for simultaneous and fully independent science operation. All detector beams are co-aligned on the sky. The frequency bands of 4GREAT cover 491-635, 890-1090, 1240-1525 and 2490-2590 GHz, respectively. This paper presents the design and characterization of the instrument, and its in-flight performance. 4GREAT saw first light in June 2018, and has been offered to the interested SOFIA communities starting with observing cycle 6.
Purpose: To investigate how completing vocational re-training influenced the income and employment days of working-age people with disabilities in the first 8 years after program admission. The investigation also included the influence of vocational re-training on the likelihood of receiving an earnings incapacity pension and on social security benefit receipt.
Methods: This retrospective cohort study with 8 years of follow-up was based on data from 2399 individuals who had completed either a 1-year vocational re-training program (n = 278) or a 2-year vocational re-training program (n = 1754), or who were admitted into re-training but never completed the program (n = 367). A propensity score-based method was used to account for observed differences and establish comparability between program graduates and program dropouts. Changes in outcomes were examined using the inverse probability-weighted regression adjustment method.
Results: After controlling for other factors, over the 8 years after program admission, graduates of 1-year re-training were on average employed for an additional 405 days, 95% CI [249 days, 561 days], and had earned €24,260 more than without completed re-training, 95% CI [€12,805, €35,715]. Two-year program completers were on average employed for 441 additional days, 95% CI [349 days, 534 days], and had earned €35,972 more than without completed re-training, 95% CI [€27,743, €44,202]. The programs also significantly reduced the number of days on social security and unemployment benefits and lowered the likelihood of an earnings incapacity pension.
Conclusion: Policies to promote the labor market re-integration of persons with disabilities should consider that vocational re-training may be an effective tool for sustainably improving work participation outcomes.
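The propensity-score weighting step behind such estimates can be sketched on simulated data. The covariate, the outcome model, and the effect size below are invented for illustration and are not the study's data; a full inverse probability-weighted regression adjustment (IPWRA) estimator would additionally fit an outcome regression in each group:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated cohort: covariate x, treatment = completed re-training,
# outcome = income gain. True treatment effect is 5000 by construction.
n = 2000
x = rng.normal(size=n)
p_treat = 1.0 / (1.0 + np.exp(-x))      # true propensity depends on x
treat = rng.random(n) < p_treat
income = 5000 * treat + 3000 * x + rng.normal(0, 1000, n)

# Step 1: propensity scores. For the sketch we reuse the true model;
# in practice one would fit a logistic regression of treat on x.
ps = p_treat

# Step 2: inverse-probability weights balance the two groups on x.
w = np.where(treat, 1.0 / ps, 1.0 / (1.0 - ps))

# Step 3: weighted mean difference estimates the average treatment effect.
ate = (np.average(income[treat], weights=w[treat])
       - np.average(income[~treat], weights=w[~treat]))
```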
The motor protein myosin drives a wide range of cellular and muscular functions by generating directed movement and force, fueled through adenosine triphosphate (ATP) hydrolysis. Release of the hydrolysis product adenosine diphosphate (ADP) is a fundamental and regulatory process during force production. However, details about the molecular mechanism accompanying ADP release are scarce due to the lack of representative structures. Here we solved a novel blebbistatin-bound myosin conformation with critical structural elements in positions between the myosin pre-power stroke and rigor states. ADP in this structure is repositioned towards the surface by the phosphate-sensing P-loop, and stabilized in a partially unbound conformation via a salt-bridge between Arg131 and Glu187. A 5 Å rotation separates the mechanical converter in this conformation from the rigor position. The crystallized myosin structure thus resembles a conformation towards the end of the two-step power stroke, associated with ADP release. Computationally reconstructing ADP release from myosin by means of molecular dynamics simulations further supported the existence of an equivalent conformation along the power stroke that shows the same major characteristics in the myosin motor domain as the resolved blebbistatin-bound myosin-II·ADP crystal structure, and identified a communication hub centered on Arg232 that mediates chemomechanical energy transduction.
Green infrastructure improves environmental health in cities, benefits human health, and provides habitat for wildlife. Increasing urbanization has demanded the expansion of urban areas and transformation of existing cities. The adoption of compact design in urban planning is a recommended strategy to minimize environmental impacts; however, it may undermine green infrastructure networks within cities as it sets a battleground for urban space. Under this scenario, multifunctionality of green spaces is highly desirable but reconciling human needs and biodiversity conservation in a limited space is still a challenge. Through a systematic review, we first compiled urban green space's characteristics that affect mental health and urban wildlife support, and then identified potential synergies and trade-offs between these dimensions. A framework based on the One Health approach is proposed, synthesizing the interlinkages between green space quality, mental health, and wildlife support; providing a new holistic perspective on the topic. Looking at the human-wildlife-environment relationships simultaneously may contribute to practical guidance on more effective green space design and management that benefit all dimensions.
The book bridges the gap between business-organizational methods and their digital implementation, since process management increasingly means designing business tasks. In addition to methodological foundations, the work offers many practical examples and exercises. Prof. Gadatsch's book is now regarded as the "current classic", THE definitive standard work on the IT-supported design of business processes.
Comparative Evaluation of Pretrained Transfer Learning Models on Automatic Short Answer Grading
(2020)
Automatic Short Answer Grading (ASAG) is the process of grading student answers by computational approaches, given a question and the desired answer. Previous works implemented methods of concept mapping and facet mapping, and some used conventional word embeddings to extract semantic features; they extracted multiple features manually to train on the corresponding datasets. We use pretrained embeddings of the transfer learning models ELMo, BERT, GPT, and GPT-2 to assess their efficiency on this task. We train with a single feature, cosine similarity, extracted from the embeddings of these models. We compare the RMSE scores and correlation measurements of the four models with previous works on the Mohler dataset. Our work demonstrates that ELMo outperformed the other three models. We also briefly describe the four transfer learning models and conclude with possible causes of the poor results of transfer learning models.
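The single-feature approach described above can be sketched in a few lines. The embedding vectors below are invented placeholders; in the paper they would come from ELMo, BERT, GPT, or GPT-2:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical sentence embeddings for a reference answer and a student answer.
reference = np.array([0.2, 0.7, 0.1, 0.5])
student = np.array([0.3, 0.6, 0.0, 0.4])

# The single feature used to train the grading regressor.
feature = cosine_similarity(reference, student)
```

A regressor trained on this one feature against human-assigned grades then predicts scores for unseen answers.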
Optimization plays an essential role in industrial design, but is not limited to minimization of a simple function, such as cost or strength. These tools are also used in conceptual phases, to better understand what is possible. To support this exploration we focus on Quality Diversity (QD) algorithms, which produce sets of varied, high-performing solutions. These techniques often require the evaluation of millions of solutions, making them impractical in design cases. In this thesis we propose methods to radically improve the data-efficiency of QD with machine learning, enabling its application to design. In our first contribution, we develop a method of modeling the performance of evolved neural networks used for control and design. The structures of these networks grow and change, making them difficult to model, but with a new method we are able to estimate their performance based on their heredity, improving data-efficiency by several times. In our second contribution we combine model-based optimization with MAP-Elites, a QD algorithm. A model of performance is created from known designs, and MAP-Elites creates a new set of designs using this approximation. A subset of these designs is then evaluated to improve the model, and the process repeats. We show that this approach improves the efficiency of MAP-Elites by orders of magnitude. Our third contribution integrates generative models into MAP-Elites to learn domain-specific encodings. A variational autoencoder is trained on the solutions produced by MAP-Elites, capturing the common "recipe" for high performance. This learned encoding can then be reused by other algorithms for rapid optimization, including MAP-Elites. Throughout this thesis, though the focus of our vision is design, we examine applications in other fields, such as robotics. These advances are not exclusive to design, but serve as foundational work on the integration of QD and machine learning.
The encoding of solutions in black-box optimization is a delicate, handcrafted balance between expressiveness and domain knowledge: between exploring a wide variety of solutions and ensuring that those solutions are useful. Our main insight is that this process can be automated by generating a dataset of high-performing solutions with a quality diversity algorithm (here, MAP-Elites), then learning a representation with a generative model (here, a Variational Autoencoder) from that dataset. Our second insight is that this representation can be used to scale quality diversity optimization to higher dimensions, but only if we carefully mix solutions generated with the learned representation and those generated with traditional variation operators. We demonstrate these capabilities by learning a low-dimensional encoding for the inverse kinematics of a thousand-joint planar arm. The results show that learned representations make it possible to solve high-dimensional problems with orders of magnitude fewer evaluations than the standard MAP-Elites, and that, once solved, the produced encoding can be used for rapid optimization of novel, but similar, tasks. The presented techniques not only scale up quality diversity algorithms to high dimensions, but show that black-box optimization encodings can be automatically learned, rather than hand designed.
The way solutions are represented, or encoded, is usually the result of domain knowledge and experience. In this work, we combine MAP-Elites with Variational Autoencoders to learn a Data-Driven Encoding (DDE) that captures the essence of the highest-performing solutions while still being able to encode a wide array of solutions. Our approach learns this data-driven encoding during optimization by balancing between exploiting the DDE to generalize the knowledge contained in the current archive of elites and exploring new representations that are not yet captured by the DDE. Learning the representation during optimization allows the algorithm to solve high-dimensional problems, and provides a low-dimensional representation which can then be re-used. We evaluate the DDE approach by evolving solutions for the inverse kinematics of a planar arm (200 joint angles) and for gaits of a 6-legged robot in action space (a sequence of 60 positions for each of the 12 joints). We show that the DDE approach not only accelerates and improves optimization, but produces a powerful encoding that captures a bias for high performance while expressing a variety of solutions.
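The MAP-Elites loop that fills the archive of elites on which such encodings are trained can be sketched as follows. The toy objective, descriptor, and grid size are placeholders, and the VAE training step is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, BINS = 8, 10        # genome size, grid resolution per descriptor axis

def fitness(x):
    # Toy objective: negative squared norm (maximized at the origin).
    return -float(np.sum(x ** 2))

def descriptor(x):
    # Phenotypic descriptor: first two genes, mapped into [0, 1).
    return np.clip((x[:2] + 1.0) / 2.0, 0.0, 0.999)

archive = {}             # maps a grid cell -> (fitness, genome)

for _ in range(5000):
    if archive and rng.random() < 0.9:
        # Select a random elite and mutate it (Gaussian variation).
        keys = list(archive)
        parent = archive[keys[rng.integers(len(keys))]][1]
        child = parent + rng.normal(0.0, 0.1, DIM)
    else:
        child = rng.uniform(-1.0, 1.0, DIM)   # random restart
    cell = tuple((descriptor(child) * BINS).astype(int))
    f = fitness(child)
    # Keep the child only if its cell is empty or it beats the incumbent.
    if cell not in archive or f > archive[cell][0]:
        archive[cell] = (f, child)
```

A VAE trained on the genomes stored in `archive` would then provide the low-dimensional data-driven encoding; new candidates are produced both by decoding latent samples and by the Gaussian variation shown above.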
Not without reason is communication considered the supreme discipline of workplace health management (BGM). The first step is to sensitize employees to the topic of health and inform them with relevant materials, in order ultimately to motivate them to take part in health offerings. These three steps are also recommended for communication in the digital age. Health platforms and/or health apps can support this communication. Finding the right amount of communication is a further challenge in digital times, as information can easily drown in the flood of e-mails. A combination of push and pull communication has proven effective in sparking employees' interest in health, so that they then choose independently from existing offerings (information, courses, etc.).
Background: 3-hydroxy-3-methylglutaryl-coenzyme A lyase deficiency (HMGCLD) is an autosomal recessive disorder of ketogenesis and leucine degradation due to mutations in HMGCL.
Method: We performed a systematic literature search to identify all published cases. Two hundred eleven patients for whom relevant clinical data were available were included in this analysis. Clinical course, biochemical findings and mutation data are highlighted and discussed. An overview of all published HMGCL variants is provided.
Results: More than 95% of patients presented with acute metabolic decompensation. Most patients manifested within the first year of life, 42.4% already neonatally. Very few individuals remained asymptomatic. The neurologic long-term outcome was favorable with 62.6% of patients showing normal development.
Conclusion: This comprehensive data analysis provides a systematic overview on all published cases with HMGCLD including a list of all known HMGCL mutations.
2-methylacetoacetyl-coenzyme A thiolase (beta-ketothiolase) deficiency: one disease - two pathways
(2020)
Background: 2-methylacetoacetyl-coenzyme A thiolase deficiency (MATD; deficiency of mitochondrial acetoacetyl-coenzyme A thiolase T2/ “beta-ketothiolase”) is an autosomal recessive disorder of ketone body utilization and isoleucine degradation due to mutations in ACAT1.
Methods: We performed a systematic literature search for all available clinical descriptions of patients with MATD. Two hundred forty-four patients were identified and included in this analysis. Clinical course and biochemical data are presented and discussed.
Results: For 89.6% of patients at least one acute metabolic decompensation was reported. Age at first symptoms ranged from 2 days to 8 years (median 12 months). More than 82% of patients presented in the first 2 years of life, while manifestation in the neonatal period was the exception (3.4%). 77.0% of patients (157 of 204) showed normal psychomotor development without neurologic abnormalities.
Conclusion: This comprehensive data analysis provides a systematic overview of all cases with MATD identified in the literature. It demonstrates that MATD is a rather benign disorder with an often favourable outcome when compared with many other organic acidurias.
The development of metals tailored to the metallurgical conditions of laser-based additive manufacturing is crucial to advance the maturity of these materials for their use in structural applications. While efforts in this regard are being carried out around the globe, the use of high-strength eutectic alloys has, so far, received little attention, although previous works showed that rapid solidification techniques can result in ultrafine microstructures with excellent mechanical performance, albeit for small sample sizes. In the present work, a eutectic Ti-32.5Fe alloy has been produced by laser powder bed fusion, aiming at exploiting rapid solidification and the capability to produce bulk ultrafine microstructures provided by this processing technique.
Process energy densities between 160 J/mm³ and 180 J/mm³ resulted in a dense and crack-free material with an oxygen content of ~ 0.45 wt.% in which a hierarchical microstructure is formed by µm-sized η-Ti4Fe2Ox dendrites embedded in an ultrafine eutectic β-Ti/TiFe matrix. The microstructure was studied three-dimensionally using near-field synchrotron ptychographic X-ray computed tomography with an actual spatial resolution down to 39 nm to analyse the morphology of the eutectic and dendritic structures as well as to quantify their mass density, size and distribution. Inter-lamellar spacings down to ~ 30–50 nm were achieved, revealing the potential of laser-based additive manufacturing to generate microstructures smaller than those obtained by classical rapid solidification techniques for bulk materials. The alloy was deformed at 600 °C under compressive loading up to a strain of ~ 30% without damage formation, resulting in a compressive yield stress of ~ 800 MPa.
This study provides a first demonstration of the feasibility to produce eutectic Ti-Fe alloys with ultrafine microstructures by laser powder bed fusion that are suitable for structural applications at elevated temperature.
Describing the elephant: a foundational model of human needs, motivation, behaviour, and wellbeing
(2020)
Models of basic psychological needs have been present and popular in the academic and lay literature for more than a century, yet reviews of needs models show an astonishing lack of consensus. This raises the question of what basic human psychological needs are and whether they can be consolidated into a model or framework that can align previous research and empirical study. The authors argue that the lack of consensus arises from researchers describing parts of the proverbial elephant correctly but failing to describe the full elephant. Through redefining what human needs are and matching this to an evolutionary framework, we can see broad consensus across needs models and neatly slot constructs and psychological and behavioural theories into this framework. This enables a descriptive model of drives, motives, and well-being that can be simply outlined but is refined enough to do justice to the complexities of human behaviour. This also raises some issues of how subjective well-being is, and should be, measured. Further avenues of research and ways to continue building this model and framework are proposed.
Are There Extended Cognitive Improvements from Different Kinds of Acute Bouts of Physical Activity?
(2020)
Acute bouts of physical activity of at least moderate intensity have been shown to enhance cognition in young as well as older adults. This effect has been observed for different kinds of activities such as aerobic or strength and coordination training. However, only a few studies have directly compared these activities regarding their effectiveness. Further, most previous studies have mainly focused on inhibition and have not examined other important core executive functions (i.e., updating, switching) which are also essential for our behavior in daily life (e.g., staying focused, resisting temptations, thinking before acting). Therefore, this study aimed to directly compare two kinds of activities, aerobic and coordinative, and examine how they might affect executive functions (i.e., inhibition, updating, and switching) in a test-retest protocol. This is interesting for practical applications, as coordinative exercises, for example, require little space and would be preferable in settings such as an office or a classroom. Furthermore, we designed our experiment in such a way that learning effects were controlled. We then tested the influence of acute bouts of physical activity on executive functioning in both young and older adults (young 16–22 years, old 65–80 years). Overall, we found no differences between aerobic and coordinative activities and, in fact, benefits from physical activities occurred only in the updating tasks in young adults. Additionally, we also showed some learning effects that might influence the results. Thus, it is important to control cognitive tests for learning effects in test-retest studies, as well as to analyze effects of physical activity on a construct level of executive functions.
Computers can help us to trigger our intuition about how to solve a problem. But how does a computer take into account what a user wants and update these triggers? User preferences are hard to model as they are by nature vague, depend on the user's background and are not always deterministic, changing depending on the context and process under which they were established. We posit that the process of preference discovery should be the object of interest in computer-aided design or ideation. The process should be transparent, informative, interactive and intuitive. We formulate Hyper-Pref, a cyclic co-creative process between human and computer, which triggers the user's intuition about what is possible and is updated according to what the user wants based on their decisions. We combine quality diversity algorithms, a divergent optimization method that can produce many, diverse solutions, with variational autoencoders to model both that diversity and the user's preferences, discovering the preference hypervolume within large search spaces.
In optimization methods that return diverse solution sets, three interpretations of diversity can be distinguished: multi-objective optimization, which searches for diversity in objective space; multimodal optimization, which tries to spread out the solutions in genetic space; and quality diversity, which performs diversity maintenance in phenotypic space. We introduce niching methods that provide more flexibility to the analysis of diversity, and a simple domain to compare the paradigms and provide insights about them. We show that multi-objective optimization does not always produce much diversity, quality diversity is not sensitive to genetic neutrality and creates the most diverse set of solutions, and multimodal optimization produces higher-fitness solutions. An autoencoder is used to discover phenotypic features automatically, producing an even more diverse solution set. Finally, we make recommendations about when to use which approach.
In complex, expensive optimization domains we often narrowly focus on finding high-performing solutions, instead of expanding our understanding of the domain itself. But what if we could quickly understand the complex behaviors that can emerge in such domains instead? We introduce surrogate-assisted phenotypic niching, a quality diversity algorithm which allows us to discover a large, diverse set of behaviors by using computationally expensive phenotypic features. In this work we discover the types of air flow in a 2D fluid dynamics optimization problem. A fast GPU-based fluid dynamics solver is used in conjunction with surrogate models to accurately predict fluid characteristics from the shapes that produce the air flow. We show that these features can be modeled in a data-driven way while sampling to improve performance, rather than explicitly sampling to improve feature models. Our method can reduce the need to run an infeasibly large set of simulations while still being able to design a large diversity of air flows and the shapes that cause them. Discovering a diversity of behaviors helps engineers to better understand expensive domains and their solutions.
In the recent past, Germany has increasingly seen diesel driving bans in major cities. At the same time, large cities are becoming ever more popular as places to live. Transport companies must offer the population sustainable mobility solutions that allow a maximum of flexibility. Modern mobility-as-a-service concepts and innovations in mobility call into question classical, schedule-oriented public transport and, with it, the existence of bus stops. Qualitative expert interviews show that inner-city bus stops will change against the background of the increasing digital networking of mobility providers and the resulting modern mobility-as-a-service concepts. The results suggest that inner-city bus stops will continue to exist in the future and will be complemented by on-demand services. A radical change, such as a widespread introduction of autonomously driving buses, could in the long term lead to a complete redesign of the bus stop.
Collaborative industrial robots are becoming ever more cost-efficient for manufacturing companies. While these systems can be a great help to human workers, they also pose a serious health risk if the mandatory safety measures are implemented inadequately. Conventional safety devices such as fences or light curtains offer good protection, but such static safeguards are problematic in new, highly dynamic work scenarios.
In the research project BeyondSPAI, a functional prototype of a multi-sensor system for safeguarding such dynamic work scenarios was designed, implemented, and field-tested. The core of the system is a robust optical material classification that uses an intelligent InGaAs camera system to distinguish skin from other typical workpiece surfaces (e.g., wood, metals, or plastics). This unique capability is used to reliably detect human co-workers, so that a conventional robot can subsequently operate as a person-aware cobot.
The system is modular and can easily be extended with additional sensors of various kinds. It can be adapted to different brands of industrial robots and quickly integrated into existing robot systems. Depending on which monitoring zone has been entered, the four safety outputs provided by the system can be used either to issue a warning, to slow the robot's motion to a safe speed, or to bring the robot to a safe stop. As soon as all zones are again identified as "clearly free of persons", the robot can accelerate again, resume its original motion, and continue working.
The simultaneous operation of multiple different semiconducting metal oxide (MOX) gas sensors is demanding for the readout circuitry. The challenge results from the strongly varying signal intensities of the various sensor types to the target gas. While some sensors change their resistance only slightly, other types can react with a resistive change over a range of several decades. Therefore, a suitable readout circuit has to be able to capture all these resistive variations, requiring it to have a very large dynamic range. This work presents a compact embedded system that provides a full, high-range input interface (readout and heater management) for MOX sensor operation. The system is modular and consists of a central mainboard that holds up to eight sensor modules, each capable of supporting up to two MOX sensors, for a total maximum of 16 different sensors. Its wide input range is achieved using the resistance-to-time measurement method. The system is built solely with commercial off-the-shelf components and tested over a range spanning from 100 Ω to 5 GΩ (9.7 decades) with an average measurement error of 0.27% and a maximum error of 2.11%. The heater management uses a well-tested power circuit and supports multiple modes of operation, enabling the system to be used in highly automated measurement applications. The experimental part of this work presents the results of an exemplary screening of 16 sensors, which was performed to evaluate the system's performance.
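The resistance-to-time principle behind the system's wide input range can be illustrated with the standard RC-discharge relation: the sensor resistance discharges a known capacitor, and the time for the voltage to fall to a comparator threshold is measured. The component values below are illustrative assumptions, not taken from the paper:

```python
import math

def resistance_from_time(t, c, v0, v_th):
    """Estimate sensor resistance from an RC discharge time measurement.

    t    : time for the capacitor voltage to fall from v0 to v_th (s)
    c    : known reference capacitance (F)
    v0   : starting voltage (V)
    v_th : comparator threshold voltage (V)
    """
    return t / (c * math.log(v0 / v_th))

# Example: 100 nF reference capacitor, 3.3 V supply, 1.0 V threshold.
t_meas = 0.0119392  # measured discharge time in seconds
r = resistance_from_time(t_meas, 100e-9, 3.3, 1.0)  # ≈ 100 kΩ
```

Because the measured quantity is a time interval, the same circuit covers many decades of resistance simply by letting the counter run longer, which is what gives the method its large dynamic range.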
Demand forecast
(2020)