Refine
H-BRS Bibliography
Departments, institutes and facilities
- Fachbereich Wirtschaftswissenschaften (1241)
- Fachbereich Informatik (1148)
- Fachbereich Angewandte Naturwissenschaften (766)
- Fachbereich Ingenieurwissenschaften und Kommunikation (636)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (480)
- Präsidium (404)
- Fachbereich Sozialpolitik und Soziale Sicherung (402)
- Institute of Visual Computing (IVC) (313)
- Institut für funktionale Gen-Analytik (IFGA) (241)
- Internationales Zentrum für Nachhaltige Entwicklung (IZNE) (195)
Document Type
- Article (1603)
- Conference Object (1119)
- Part of a Book (689)
- Part of Periodical (410)
- Book (monograph, edited volume) (370)
- Report (145)
- Preprint (88)
- Working Paper (87)
- Contribution to a Periodical (83)
- Doctoral Thesis (70)
Year of publication
Keywords
- Lehrbuch (85)
- Deutschland (27)
- Nachhaltigkeit (27)
- Controlling (23)
- Unternehmen (23)
- Digitalisierung (17)
- Management (17)
- Betriebswirtschaftslehre (16)
- Machine Learning (16)
- Corporate Social Responsibility (15)
Reducing energy consumption is one of the most pursued economic and ecological challenges, concerning societies as a whole as well as individuals and organizations. While policymakers are starting to take measures toward an energy turnaround and smart home energy monitors are becoming popular, few studies have touched on sustainability in office environments so far, even though they account for almost every second workplace in modern economies. In this paper, we present findings of two parallel studies in an organizational context using strategies oriented toward behavioral change to raise energy awareness. Next to demonstrating potentials, it shows that energy feedback needs to fit the local organizational context to succeed and should consider typical work patterns to foster accountability for consumption.
Smaller, cheaper and more efficient sensors, actuators and radio protocols have made smart home products increasingly affordable for the private mass market. Manufacturers and providers therefore face the challenge of making complex cyber-physical systems manageable for everyone. However, empirical findings on the role of the smart home in everyday life are lacking. We present results from a living lab study in which 14 households were equipped with a commercially available smart home retrofit solution and accompanied empirically over nine months. Based on the analysis of interviews, observations and co-design workshops during the phases of product selection, installation, configuration and longer-term use, we highlight challenges and potentials of smart home systems. Our findings indicate that the smart home is still dominated by technical details. At the same time, users lack adequate means of steering and control to retain decision-making authority in their own homes.
Smart home systems are becoming an integral feature of the emerging home IT market. Under this general term, products mainly address issues of security, energy savings and comfort. Comprehensive systems that cover several use cases are typically operated and managed via a unified dashboard. Unfortunately, research targeting user experience (UX) design for smart home interaction that spans several use cases or covers the entire system is scarce. Furthermore, existing comprehensive and user-centered long-term studies on challenges and needs throughout the phases of information collection, installation and operation of smart home systems are technologically outdated. Our 18-month Living Lab study covering 14 households equipped with smart home technology provides insights on how to design for improving smart home appropriation. This includes a stronger sensibility for household practices during setup and configuration, flexible visualizations for evolving demands and an extension of the smart home beyond its physical location.
The development of intelligent technologies to support everyday life within one's own four walls has accompanied our society since the era of the personal computer. With the advent of the Internet of Things, and favored by ever smaller and cheaper hardware, new potentials are emerging that make the smart home more attractive than ever. A large number of the solutions currently available on the market address the needs for comfort, security and efficient energy use. The promised intelligence (the smartness the term itself suggests) is, especially in retrofit solutions for private households, largely produced by the users' own interaction and corresponding rule-based configurations. This necessary form of interaction and the effort it entails, however, strongly shape the overall smart home user experience and not infrequently lead to frustration or even resignation in use.
New cars are increasingly "connected" by default. Since not having a car is not an option for many people, understanding the privacy implications of driving connected cars and using their data-based services is an even more pressing issue than for expendable consumer products. While risk-based approaches to privacy are well established in law, they have only begun to gain traction in HCI. These approaches are understood not only to increase acceptance but also to help consumers make choices that meet their needs. To the best of our knowledge, perceived risks in the context of connected cars have not been studied before. To address this gap, our study reports on the analysis of a survey with 18 open-ended questions distributed to 1,000 households in a medium-sized German city. Our findings provide qualitative insights into existing attitudes and use cases of connected car features and, most importantly, a list of perceived risks themselves. Taking the perspective of consumers, we argue that these can help inform consumers about data use in connected cars in a user-friendly way. Finally, we show how these risks fit into and extend existing risk taxonomies from other contexts with a stronger social perspective on risks of data use.
Due to the popularity of the Internet and the networked services that it facilitates, networked devices have become increasingly common in both the workplace and everyday life in recent years—following the trail blazed by smartphones. The data provided by these devices allow for the creation of rich user profiles. As a result, the collection, processing and exchange of such personal data have become drivers of economic growth. History shows that the adoption of new technologies is likely to influence both individual and societal concepts of privacy. Research into privacy has therefore been confronted with continuously changing concepts due to technological progress. From a legal perspective, privacy laws that reflect social values are sought. Privacy enhancing technologies are developed or adapted to take account of technological development. Organizations must also identify protective measures that are effective in terms of scalability and automation. Similarly, research is being conducted from the perspective of Human-Computer Interaction (HCI) to explore design spaces that empower individuals to manage their protection needs with regard to novel data, which they may perceive as sensitive. Taking such an HCI perspective with regard to understanding privacy management on the Internet of Things (IoT), this research mainly focuses on three interrelated goals across the fields of application: 1. Exploring and analyzing how people make sense of data, especially when managing privacy and data disclosure; 2. Identifying, framing and evaluating potential resources for designing sense-making processes; and 3. Exploring the fitness of the identified concepts for inclusion in legal and technical perspectives on supporting decisions regarding privacy on the IoT. Although this work's point of departure is the HCI perspective, it emphasizes the importance of the interrelationships among seemingly independent perspectives. Their interdependence is therefore taken into account by subscribing to a user-centered design process throughout this study. More specifically, this thesis adopts a design case study approach. This approach makes it possible to conduct full user-centered design lifecycles in a concrete application case with participants in the context of everyday life. Based on this approach, it was possible to investigate several domains of the IoT that are currently relevant, namely smart metering, smartphones, smart homes and connected cars. The results show that the participants were less concerned about (raw) data than about the information that could potentially be derived from it. Against the background of the constant collection of highly technical and abstract data, the content of which only becomes visible through the application of complex algorithms, this study indicates that people should learn to explore and understand these data flexibly, and provides insights into how to design support for this aim. From the point of view of design for usable privacy protection measures, the information that is provided to users about data disclosure should be focused on the consequences thereof for users' environments and lives. A related concept from law is “informed consent,” which I propose should be further developed in order to implement usable mechanisms for individual privacy protection in the era of the IoT.
Finally, this thesis demonstrates how research on HCI can be methodologically embedded in a regulative process that will inform both the development of technology and the drafting of legislation.
In recent years, a plethora of observations with high spectral resolution of sub-millimetre and far-infrared transitions of methylidene (CH), conducted with Herschel and SOFIA, have demonstrated this radical to be a valuable proxy for molecular hydrogen that can be used for characterising molecular gas within the interstellar medium on a Galactic scale, including the CO-dark component. We report the discovery of the 13CH isotopologue in the interstellar medium using the upGREAT receiver on board SOFIA. We have detected the three hyperfine structure components of the ≈2 THz transition from its X ²Π₁/₂ ground state towards the high-mass star-forming regions Sgr B2(M), G34.26+0.15, W49(N), and W51E and determined 13CH column densities. The ubiquity of molecules containing carbon in the interstellar medium has turned the determination of the ratio between the abundances of the two stable isotopes of carbon, 12C/13C, into a cornerstone for Galactic chemical evolution studies. Whilst displaying a rising gradient with galactocentric distance, this ratio, when measured using observations of different molecules (CO, H2CO, and others), shows systematic variations depending on the tracer used. These observed inconsistencies may arise from optical depth effects, chemical fractionation, or isotope-selective photo-dissociation. Formed from C+ either through UV-driven or turbulence-driven chemistry, CH reflects the fractionation of C+, and does not show any significant fractionation effects, unlike other molecules that were previously used to determine the 12C/13C isotopic ratio. This makes it an ideal tracer for the 12C/13C ratio throughout the Galaxy. By comparing the derived column densities of 13CH with previously obtained SOFIA data of the corresponding transitions of the main isotopologue 12CH, we therefore derive 12C/13C isotopic ratios toward Sgr B2(M), G34.26+0.15, W49(N) and W51E. Adding our values derived from 12CH/13CH to previous calculations of the Galactic isotopic gradient, we derive a revised value of 12C/13C = 5.87(0.45) R_GC + 13.25(2.94).
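As a back-of-the-envelope illustration of the gradient quoted above, the sketch below evaluates 12C/13C = 5.87 R_GC + 13.25 at a few galactocentric radii; the chosen distances are arbitrary examples, and the error propagation naively assumes uncorrelated slope and intercept uncertainties (the covariance is not given here).

```python
import math

# Revised Galactic gradient reported in the abstract:
# 12C/13C = 5.87(0.45) * R_GC + 13.25(2.94), with R_GC in kpc.
SLOPE, SLOPE_ERR = 5.87, 0.45
INTERCEPT, INTERCEPT_ERR = 13.25, 2.94

def carbon_ratio(r_gc_kpc: float) -> tuple[float, float]:
    """Return the 12C/13C ratio and a naive 1-sigma error at R_GC (kpc).

    Assumes uncorrelated slope/intercept errors, which is a simplification.
    """
    ratio = SLOPE * r_gc_kpc + INTERCEPT
    err = math.hypot(SLOPE_ERR * r_gc_kpc, INTERCEPT_ERR)
    return ratio, err

for r in (0.1, 4.0, 8.2):  # e.g. Galactic centre region, inner disk, solar circle
    ratio, err = carbon_ratio(r)
    print(f"R_GC = {r:4.1f} kpc -> 12C/13C ~ {ratio:5.1f} +/- {err:4.1f}")
```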
Discrimination and classification of eight strains related to meat spoilage microorganisms commonly found in poultry meat were successfully carried out using two dispersive Raman spectrometers (a Raman microscope and a portable fiber-optic system) in combination with chemometric methods. Principal Component Analysis (PCA) and Multi-Class Support Vector Machines (MC-SVM) were applied to develop discrimination and classification models. These models were validated using validation data sets, which were successfully assigned to the correct bacterial genera and even to the right strain. The discrimination of bacteria down to the strain level was performed on the pre-processed spectral data using a 3-stage model based on PCA. The spectral features and differences among the species on which the discrimination was based were clarified through the PCA loadings. For MC-SVM, the pre-processed spectral data were subjected to PCA and used to build a classification model. When using the first two principal components, the accuracy of the MC-SVM model was 97.64% and 93.23% for the validation data collected with the Raman microscope and the portable fiber-optic Raman system, respectively. The accuracy reached 100% when using the first eight and ten PCs from the data collected with the Raman microscope and the portable fiber-optic Raman system, respectively. The results reflect the strong discriminative power and high performance of the developed models and the suitability of the pre-processing method used in this study, and show that the lower accuracy of the portable fiber-optic Raman system does not adversely affect the discriminative power of the developed models.
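For readers unfamiliar with the chemometric pipeline, the following minimal sketch shows a PCA-plus-multi-class-SVM model of the kind described, built with scikit-learn; the random placeholder spectra, the eight-component choice and all hyperparameters are illustrative assumptions, not the study's actual data or settings.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Placeholder data: rows are pre-processed Raman spectra, y holds strain labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(240, 1024))   # 240 spectra, 1024 wavenumber channels (dummy)
y = rng.integers(0, 8, size=240)   # 8 bacterial strains (dummy labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# PCA reduces the spectra to a few components; SVC handles the multi-class
# problem via its built-in one-vs-one scheme.
model = make_pipeline(StandardScaler(), PCA(n_components=8), SVC(kernel="rbf", C=10.0))
model.fit(X_train, y_train)
print(f"validation accuracy: {model.score(X_test, y_test):.3f}")
```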
A New Approach of Using Two Wireless Tracking Systems in Mobile Augmented Reality Applications
(2003)
The device (10) has a handrail (18) provided with an optical contactless monitoring device formed as an active sensor system, where the monitoring device is arranged in a region of a guide (14) of the handrail at a front base (16) of an escalator (12) or a moving pavement. The monitoring device has two transmission paths (28, 30) with wavelength bands that are different from each other, where one of the paths includes the handrail. Ratio or difference between signals of the paths is used for recognizing foreign bodies e.g. hands of adults and children.
Service robots performing complex tasks involving people in homes or public environments are becoming more and more common, and there is huge interest from both research and industry. The RoCKIn@Home challenge has been designed to compare and evaluate different approaches and solutions to tasks related to the development of domestic and service robots. RoCKIn@Home competitions were designed and executed according to the benchmarking methodology developed during the project and received very positive feedback from the participating teams. Tasks and functionality benchmarks are explained in detail.
Swedish wheeled mobile robots have remarkable mobility properties allowing them to rotate and translate at the same time. Being holonomic systems, their kinematics model results in the possibility of designing separate and independent position and heading trajectory tracking control laws. Nevertheless, if these control laws should be implemented in the presence of unaccounted actuator saturation, the resulting saturated linear and angular velocity commands could interfere with each other thus dramatically affecting the overall expected performance. Based on Lyapunov’s direct method, a position and heading trajectory tracking control law for Swedish wheeled robots is developed. It explicitly accounts for actuator saturation by using ideas from a prioritized task based control framework.
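The following toy sketch illustrates the general idea of prioritizing the heading command over the position command when wheel-speed limits bind; it is not the Lyapunov-based control law of the paper, and the kinematic matrix, geometry, gains and limits are assumed values for one common mecanum-wheel sign convention.

```python
import numpy as np

# Illustrative sketch (not the paper's control law): proportional position and
# heading tracking for a mecanum/Swedish-wheeled base, with wheel-speed
# saturation handled by scaling the position task while preserving heading.
R_WHEEL = 0.05          # wheel radius [m] (assumed)
LX, LY = 0.20, 0.15     # half wheelbase / half track [m] (assumed)
W_MAX = 20.0            # wheel speed limit [rad/s] (assumed)

# Inverse kinematics for one common sign convention: wheel speeds from (vx, vy, wz).
J = (1.0 / R_WHEEL) * np.array([
    [1.0, -1.0, -(LX + LY)],
    [1.0,  1.0,  (LX + LY)],
    [1.0,  1.0, -(LX + LY)],
    [1.0, -1.0,  (LX + LY)],
])

def saturated_command(pos_err, yaw_err, k_pos=1.0, k_yaw=2.0):
    """Scale the (lower-priority) translational command so that no wheel
    exceeds W_MAX, keeping the (higher-priority) heading command intact."""
    v_pos = np.array([k_pos * pos_err[0], k_pos * pos_err[1], 0.0])
    v_yaw = np.array([0.0, 0.0, k_yaw * yaw_err])
    w_yaw = J @ v_yaw
    w_pos = J @ v_pos
    headroom = W_MAX - np.abs(w_yaw)          # per-wheel speed left for translation
    if np.any(headroom < 0):                  # heading alone saturates: scale it too
        return v_yaw * (W_MAX / np.max(np.abs(w_yaw)))
    scale = min(1.0, np.min(headroom / np.maximum(np.abs(w_pos), 1e-9)))
    return v_pos * scale + v_yaw

print(saturated_command(pos_err=(2.0, -1.0), yaw_err=1.5))
```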
Introduction: Some Basic Remarks on Sustainable Forest Management, Environment and Global Ethics
(2010)
The universities of the "Hochschulallianz für den Mittelstand" deliberately chose this name. We want to work together for small and medium-sized enterprises (the Mittelstand) in Germany. Universities of applied sciences (Fachhochschulen) are, in many regions, the most important education, research and development partner for medium-sized companies. And yet the current BDI innovation indicator states: there is still too much reluctance to engage between scientists and SME managers. Unfortunately, little has changed in this respect for decades.
Since their foundation in the early 1970s, the Fachhochschulen have changed considerably as universities of applied sciences. The range of subjects at many Fachhochschulen is now comparable to that of the universities. In some subjects, the Fachhochschulen even train the majority of graduates. Application-oriented cutting-edge research is part of the self-image of many Fachhochschulen. Against this background, it is incomprehensible, and harmful to the economy's future viability, that Fachhochschulen still face significant competitive disadvantages in the further qualification of early-career researchers. This applies all the more when private universities comparable to Fachhochschulen are granted the right to award doctorates.
The global climate agreement attempts to respond both to the need to redirect climate-damaging economic activity in the industrialized countries (and some emerging economies) and to the avoidance of climate-damaging forms of economic development in emerging and developing countries. But this answer is still too abstract.
You lead a heterogeneous or agile team and realize that you need to rethink your previous leadership principles? In this Spotlight you will learn how to deal with the shift to agility and with diversity in your team. You will receive support on the topic of delegation and for situations in which you are confronted with conflicting leadership goals. In addition, our authors show you, with tips you can apply immediately, how to lead appreciatively and ensure commitment and a shared understanding in the project team.
A new project begins and everyone sets off enthusiastically. But after a few weeks the project manager gets an uneasy feeling: the focus of the project is apparently set wrongly and important interests of the client have not been taken into account. Numerous hours of overtime or even a project stop are looming. Olaf Ihlow spares himself such difficulties by applying, early on, a simple method that delivers meaningful results: systemic clarification of the assignment (systemische Auftragsklärung).
Despite carefully set-up project planning and consistent adherence to the prescribed PM methodology, projects often miss their targets. One of the most important reasons for this is the sharply increased complexity. This calls for a new perspective on project management, as Olaf Ihlow argues in his trend article.
The modern working world requires digital competence, yet universities lack offerings for students to build digital skills. Peer programs can be a sensible approach to fostering digital competence, but empirical evidence of their effectiveness is missing. This study addresses this gap and evaluates the digital competence gains of participants in interdisciplinary peer trainings based on the DigComp framework. The results show that training participants increased their digital competence significantly more than the control group. The training of the peer trainers as well as the peer trainings themselves were rated very positively by all participants.
The internationalization of universities must go hand in hand with suitable offerings for international students. Systemic and culturally sensitive peer coaching can support international students in orienting themselves, integrating and pursuing their professional goals. At the same time, the students trained as peer coaches develop sensitivity to diversity. Overall, culturally sensitive interaction at the university is fostered.
Career changers and newcomers to the teaching of information literacy are given a basic introduction, based on the wide-ranging practical experience of the multiplier network of the AG Informationskompetenz NRW, to strategic concepts, different training offerings for different target groups, the methodology and didactics of training events, organization and infrastructure, as well as evaluation and quality management.
Long-term financing contributes substantially to calming markets and thus to the stability of economies. The German housing finance market is an impressive example of this. However, owing to new financial market regulations such as Basel III and Solvency II, the market for long-term financing will shrink, as banks are given incentives to grant rather short-term loans. Other financial intermediaries, such as insurers or funds, will hardly be able to close this gap. Although alternative financial intermediaries will increase their lending due to regulatory advantages, their lack of experience and incentives means they will not be able to remedy the shortage of long-term financing. What is therefore needed is an adjustment of the regulatory framework so that the banking sector becomes more robust while continuing to fulfil its original economic functions.
Blockchain technology has been one of the major drivers of innovation in recent years. With an underlying blockchain, the operation of distributed applications, so-called Decentralized Applications (DApps), is already technically feasible. This article aims to examine design options for digital consumer participation in blockchain applications. To this end, it provides an introduction to digital consumer participation and to the technical foundations and properties of blockchain technology, including DApps built on it. Finally, technical, ethical-organizational, legal and other requirement areas for implementing digital consumer participation in blockchain applications are addressed.
Molecular modeling is an important subdomain in the field of computational modeling, regarding both scientific and industrial applications. This is because computer simulations on a molecular level are a valuable instrument for studying the impact of microscopic phenomena on macroscopic ones. Accurate molecular models are indispensable for such simulations in order to predict physical target observables, like density, pressure, diffusion coefficients or energetic properties, quantitatively over a wide range of temperatures. Molecular interactions are thereby described mathematically by force fields. The mathematical description includes parameters for both intramolecular and intermolecular interactions. While intramolecular force field parameters can be determined by quantum mechanics, the parameterization of the intermolecular part is often tedious. Recently, an empirical procedure, based on the minimization of a loss function between simulated and experimental physical properties, was published by the authors. Thereby, efficient gradient-based numerical optimization algorithms were used. However, empirical force field optimization is inhibited by the two following central issues appearing in molecular simulations: firstly, they are extremely time-consuming, even on modern and high-performance computer clusters, and secondly, simulation data is affected by statistical noise. The latter provokes the fact that an accurate computation of gradients or Hessians is nearly impossible close to a local or global minimum, mainly because the loss function is flat. Therefore, the question arises of whether to apply a derivative-free method approximating the loss function by an appropriate model function. In this paper, a new Sparse Grid-based Optimization Workflow (SpaGrOW) is presented, which accomplishes this task robustly and, at the same time, keeps the number of time-consuming simulations relatively small. This is achieved by an efficient sampling procedure for the approximation based on sparse grids, which is described in full detail: in order to counteract the fact that sparse grids are fully occupied on their boundaries, a mathematical transformation is applied to generate homogeneous Dirichlet boundary conditions. As the main drawback of sparse grid methods is the assumption that the function to be modeled exhibits certain smoothness properties, it has to be approximated by smooth functions first. Radial basis functions turned out to be very suitable for this task. The smoothing procedure and the subsequent interpolation on sparse grids are performed within sufficiently large compact trust regions of the parameter space. It is shown and explained how the combination of the three ingredients leads to a new efficient derivative-free algorithm, which has the additional advantage that it is capable of reducing the overall number of simulations by a factor of about two in comparison to gradient-based optimization methods. At the same time, the robustness with respect to statistical noise is maintained. This assertion is proven by both theoretical considerations and practical evaluations for molecular simulations on example chemical substances.
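One ingredient of such a workflow, smoothing noisy loss samples with radial basis functions inside a trust region and minimizing the smooth surrogate, can be sketched as follows; the toy loss function, the sample grid, the kernel and the trust-region size are assumptions, and the actual sparse-grid construction and boundary transformation of SpaGrOW are omitted.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

# Illustrative sketch of one SpaGrOW-like ingredient (not the published workflow):
# smooth a noisy loss with radial basis functions inside a trust region, then
# minimize the smooth surrogate instead of the noisy simulations directly.
rng = np.random.default_rng(1)

def noisy_loss(theta):
    """Stand-in for a loss computed from molecular simulations (noise added)."""
    return np.sum((theta - 0.3) ** 2) + 0.01 * rng.normal()

center = np.array([0.0, 0.0])       # current force-field parameter estimate (toy)
radius = 0.5                        # trust-region half-width (assumed)

# Sample the loss on a small full grid inside the trust region (a real sparse-grid
# construction with boundary transformation is omitted for brevity).
axes = np.linspace(-radius, radius, 5)
pts = center + np.array(np.meshgrid(axes, axes)).reshape(2, -1).T
vals = np.array([noisy_loss(p) for p in pts])

surrogate = RBFInterpolator(pts, vals, kernel="thin_plate_spline", smoothing=1e-3)
res = minimize(lambda x: surrogate(x[None, :])[0], x0=center,
               bounds=[(c - radius, c + radius) for c in center])
print("surrogate minimizer inside trust region:", res.x)
```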
This diploma thesis develops the prototype of a direction indicator for autonomously navigating robots. First, important aspects of communicating movement intentions between robots and humans are examined. The design of a human-machine interface for this purpose is then the focus. The main part of the work comprises the selection of a suitable technology for a prototype implementation of the display, the construction of the hardware based on an embedded system, and the programming of this hardware. The prototype completed at the end of the thesis has a USB interface for transferring navigation data. The core of the system is an embedded system based on an ARM7 microcontroller. The display, consisting of 64 LEDs, is driven via the TLC5920 LED driver IC.
Atomic oxygen is a key species in the mesosphere and thermosphere of Venus. It peaks in the transition region between the two dominant atmospheric circulation patterns, the retrograde super-rotating zonal flow below 70 km and the subsolar to antisolar flow above 120 km altitude. However, past and current detection methods are indirect and based on measurements of other molecules in combination with photochemical models. Here, we show direct detection of atomic oxygen on the dayside as well as on the nightside of Venus by measuring its ground-state transition at 4.74 THz (63.2 µm). The atomic oxygen is concentrated at altitudes around 100 km with a maximum column density on the dayside where it is generated by photolysis of carbon dioxide and carbon monoxide. This method enables detailed investigations of the Venusian atmosphere in the region between the two atmospheric circulation patterns in support of future space missions to Venus.
Atomic oxygen in the mesosphere and lower thermosphere measured by terahertz heterodyne spectroscopy
(2021)
Atomic oxygen is a main component of the mesosphere and lower thermosphere (MLT). The photochemistry and the energy balance of the MLT are governed by atomic oxygen. In addition, it is a tracer for dynamical motions in the MLT. It is difficult to measure with remote sensing techniques. Concentrations can be inferred indirectly from the oxygen air glow or from observations of OH, which is involved in photochemical processes related to atomic oxygen. Such measurements have been performed with several satellite instruments such as SCIAMACHY, SABER, WINDII and OSIRIS. However, the methods are indirect and rely on photochemical models and assumptions such as quenching rates, radiative lifetimes, and reaction coefficients. The results are not always in agreement, particularly when obtained with different instruments.
XML Signature Wrapping (XSW) has been a relevant threat to web services for 15 years and remains one today. Using the Personal Health Record (PHR), which is currently under development in Germany, we investigate a current SOAP-based web services system as a case study. In doing so, we highlight several deficiencies in defending against XSW. Using this real-world contemporary example as motivation, we introduce a guideline for more secure XML signature processing that provides practitioners with easier access to the effective countermeasures identified in the current state of research.
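As a rough illustration of one well-known class of countermeasure (not the guideline proposed in the paper), the sketch below checks, after signature verification, that the SOAP Body the application actually processes is exactly the element referenced by the signature and that its Id is unambiguous; the namespaces, the same-document reference form and the helper names are assumptions.

```python
from lxml import etree

# Illustrative post-verification check against XML Signature Wrapping:
# the element the application will process must be the very element the
# signature's Reference points to, and its Id must resolve uniquely.
NS = {"ds": "http://www.w3.org/2000/09/xmldsig#",
      "soap": "http://schemas.xmlsoap.org/soap/envelope/"}
WSU = ("http://docs.oasis-open.org/wss/2004/01/"
       "oasis-200401-wss-wssecurity-utility-1.0.xsd")

def signed_element(envelope: etree._Element) -> etree._Element:
    """Resolve the ds:Reference URI (same-document '#id' form assumed)."""
    uri = envelope.find(".//ds:Reference", NS).get("URI")
    assert uri.startswith("#"), "only same-document references handled here"
    matches = envelope.xpath("//*[@Id=$id or @wsu:Id=$id]",
                             id=uri[1:], namespaces={"wsu": WSU})
    assert len(matches) == 1, "ambiguous Id: possible wrapping attack"
    return matches[0]

def safe_body(envelope: etree._Element) -> etree._Element:
    body = envelope.find("soap:Body", NS)
    # The processed Body must be the very node covered by the signature.
    assert body is signed_element(envelope), "Body is not the signed element"
    return body
```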
This work extends the affordance-inspired robot control architecture introduced in the MACS project [35] and especially its approach to integrate symbolic planning systems given in [24] by providing methods to automated abstraction of affordances to high-level operators. It discusses how symbolic planning instances can be generated automatically based on these operators and introduces an instantiation method to execute the resulting plans. Preconditions and effects of agent behaviour are learned and represented in Gärdenfors conceptual spaces framework. Its notion of similarity is used to group behaviours to abstract operators based on the affordance-inspired, function-centred view on the environment. Ways on how the capabilities of conceptual spaces to map subsymbolic to symbolic representations to generate PDDL planning domains including affordance-based operators are discussed. During plan execution, affordance-based operators are instantiated by agent behaviour based on the situation directly before its execution. The current situation is compared to past ones and the behaviour that has been most successful in the past is applied. Execution failures can be repaired by action substitution. The concept of using contexts to dynamically change dimension salience as introduced by Gärdenfors is realized by using techniques from the field of feature selection. The approach is evaluated using a 3D simulation environment and implementations of several object manipulation behaviours.
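A toy sketch of the grouping idea, with invented behaviours, effect dimensions and salience weights: behaviours whose effects lie close together in the (weighted) conceptual space are merged into one abstract operator, for which a schematic PDDL action is printed. None of the names or thresholds come from the work itself.

```python
import numpy as np

# Toy sketch (assumptions throughout): behaviours are points in a conceptual
# space spanned by effect dimensions; behaviours with similar effects are
# grouped into one abstract, affordance-based operator.
behaviours = {
    "push_box":  np.array([1.0, 0.0, 0.1]),   # effect vector: (moved, lifted, rotated)
    "shove_box": np.array([0.9, 0.0, 0.2]),
    "lift_box":  np.array([0.0, 1.0, 0.0]),
}
salience = np.array([1.0, 1.0, 0.3])          # context-dependent dimension weights

def similar(a, b, threshold=0.5):
    """Weighted Euclidean similarity in the conceptual space."""
    return np.linalg.norm(salience * (behaviours[a] - behaviours[b])) < threshold

# Group behaviours into abstract operators by single-link similarity.
groups = []
for name in behaviours:
    for g in groups:
        if any(similar(name, member) for member in g):
            g.append(name)
            break
    else:
        groups.append([name])

# Emit one schematic PDDL action per group (preconditions/effects kept symbolic).
for i, g in enumerate(groups):
    print(f"(:action abstract-op-{i} ; realized by {g}\n"
          "  :parameters (?o - object)\n"
          "  :precondition (reachable ?o)\n"
          "  :effect (displaced ?o))")
```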
This report describes the design, implementation and usage of a system for managing different systems for automated theorem proving and the proofs they generate. In particular, we focus on a user-friendly web-based interface and a structure for collecting and cataloguing proofs in a uniform way. The second point hopefully helps in understanding the structure of automatically generated proofs and provides a starting point for new insights into strategies for proof planning.
The medicalization thesis and the compression thesis are two "competing" approaches to the question of the state of health in which a longer life, and in particular the years of old age, are spent. Beyond the individual importance of the quantity and quality of life years, this question is highly relevant for the health care system: not only has the number and share of older people risen in the past, a further increase, including in life expectancy, is forecast in the context of demographic change, and the effects on care needs and health care expenditure can be considerable.
Controlling 2020
(2012)
Owing to the growing dynamics of the corporate environment, general uncertainty about the future directions of companies in Germany is steadily increasing. Against this background, and in order to deal proactively with future challenges, controllers should have a particular interest in learning early on which developments will affect controlling practice in the future. Despite growing demand, neither academic nor practice-oriented controlling research currently fulfils this forecasting function to a sufficient degree. As a scholarly contribution to a stronger future orientation in controlling research, this work uses a qualitative meta-analysis of future scenarios of the corporate environment to formulate theses on the development of controlling in Germany up to 2020 and to derive implications for controlling practice.
Companies in mechanical engineering increasingly implement safety functions through the application programming of safety-related controllers. The current standards DIN EN ISO 13849 and DIN EN 62061 for the first time also define requirements for the software development of safety functions. This is intended to prevent dangerous systematic faults in a machine's safety-related application software. A key requirement of these standards is adherence to a structured development process: the V-model. As usual, the further requirements on fault-avoiding and fault-controlling measures during development are kept very general in the standards. Moreover, few published examples and suggestions for implementing these requirements exist so far. The interpretation of the standards during software development in mechanical engineering is therefore often unclear and causes difficulties in implementation. This was the occasion for a project funded by the DGUV and carried out at the Hochschule Bonn-Rhein-Sieg (FF-FP0319, duration 2011 to 2013). In the project, a practically applicable development method, the IFA matrix method, was derived together with regional machine manufacturers and documented with many examples in a research report. That research report forms the core of the present IFA report. With the IFA matrix method presented here, application software for safety functions can be specified, validated and documented in conformity with the standards. In addition, the report provides further information on application programming for safety-related machine controls. The effort for application programming is typically higher for standard controllers than for certified safety controllers; several chapters of the report therefore address the use of standard controllers. For efficient application of the matrix method, the IFA is developing a software tool called SOFTEMA. The report's examples are available for download and can be viewed with SOFTEMA.
Manufacturers of machinery are increasingly using application programming of safety controls in order to implement safety functions. The EN ISO 13849-1 and EN 62061 standards define requirements concerning the development of software employed for safety functions. The IFA began addressing the subject of safety-related application software many years ago. Between 2011 and 2013, Project FF-FP0319 concerning standards-compliant development and documentation of safety-related user software in machine construction was successfully completed at the Bonn-Rhein-Sieg University of Applied Sciences in conjunction with numerous partner bodies from the machine construction sector and with funding from the DGUV. For this purpose, a procedure, the IFA matrix method, was developed, evaluated and documented with reference to examples from industry for implementation of the requirements concerning the development of software for machine safety functions. This paper provides insights into both the IFA matrix method and the new IFA report on the subject, along with information on what further tools are planned.
This work discusses how to use OSM for robotic applications and aims at starting a discussion between the OSM and robotics communities. OSM contains much topological and semantic information that can be used directly in robotics and offers various advantages: 1) A standardized format with existing tooling. 2) The graph structure makes it possible to compose OSM models with domain-specific semantics by adding custom nodes, relations, and key-value pairs. 3) Information about many places is already available and can be used by robots, since the data is maintained by a community effort.
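A minimal sketch of how such annotated OSM data might be read on the robot side; the robot:* tags used here are invented for illustration and are not an established OSM tagging scheme.

```python
import xml.etree.ElementTree as ET

# Minimal sketch: OSM's node/way/tag structure can carry robot-specific
# semantics as additional key-value pairs (tag names invented for illustration).
osm_xml = """
<osm version="0.6">
  <node id="1" lat="50.780" lon="7.180"/>
  <node id="2" lat="50.781" lon="7.181"/>
  <way id="10">
    <nd ref="1"/> <nd ref="2"/>
    <tag k="indoor" v="corridor"/>
    <tag k="robot:traversable" v="yes"/>
    <tag k="robot:max_speed" v="0.5"/>
  </way>
</osm>
"""

root = ET.fromstring(osm_xml)
nodes = {n.get("id"): (float(n.get("lat")), float(n.get("lon"))) for n in root.iter("node")}

for way in root.iter("way"):
    tags = {t.get("k"): t.get("v") for t in way.iter("tag")}
    waypoints = [nodes[nd.get("ref")] for nd in way.iter("nd")]
    if tags.get("robot:traversable") == "yes":
        print(f"way {way.get('id')}: {tags.get('indoor')} at {tags.get('robot:max_speed')} m/s")
        print("  waypoints:", waypoints)
```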
Deployment of modern data-driven machine learning methods, most often realized by deep neural networks (DNNs), in safety-critical applications such as health care, industrial plant control, or autonomous driving is highly challenging due to numerous model-inherent shortcomings. These shortcomings are diverse and range from a lack of generalization over insufficient interpretability and implausible predictions to directed attacks by means of malicious inputs. Cyber-physical systems employing DNNs are therefore likely to suffer from so-called safety concerns, properties that preclude their deployment as no argument or experimental setup can help to assess the remaining risk. In recent years, an abundance of state-of-the-art techniques aiming to address these safety concerns has emerged. This chapter provides a structured and broad overview of them. We first identify categories of insufficiencies to then describe research activities aiming at their detection, quantification, or mitigation. Our work addresses machine learning experts and safety engineers alike: The former ones might profit from the broad range of machine learning topics covered and discussions on limitations of recent methods. The latter ones might gain insights into the specifics of modern machine learning methods. We hope that this contribution fuels discussions on desiderata for machine learning systems and strategies on how to help to advance existing approaches accordingly.
Computers will soon be powerful enough to simulate consciousness. The artificial life community should start trying to understand how consciousness could be simulated. The proposal is to build an artificial life system in which consciousness might be able to evolve. The idea is to develop an internet-wide artificial universe in which the agents can evolve. Users play games by defining agents that form communities. The communities have to perform tasks, or compete, or whatever the specific game demands. The demands should be such that agents that are more aware of their universe are more likely to succeed. The agents reproduce and evolve within their user's machine, but can also sometimes transfer to other machines across the internet. Users will be able to choose the capabilities of their agents from a fixed list, but may also write their own powers for their agents.
The lattice Boltzmann method (LBM) stands apart from conventional macroscopic approaches due to its low numerical dissipation and reduced computational cost, attributed to a simple streaming and local collision step. While this property makes the method particularly attractive for applications such as direct noise computation, it also renders the method highly susceptible to instabilities. A vast body of literature exists on stability-enhancing techniques, which can be categorized into selective filtering, regularized LBM, and multi-relaxation time (MRT) models. Although each technique bolsters stability by adding numerical dissipation, they act on different modes. Consequently, there is not a universal scheme optimally suited for a wide range of different flows. The reason for this lies in the static nature of these methods; they cannot adapt to local or global flow features. Still, adaptive filtering using a shear sensor constitutes an exception to this. For this reason, we developed a novel collision operator that uses space- and time-variant collision rates associated with the bulk viscosity. These rates are optimized by a physically informed neural net. In this study, the training data consists of a time series of different instances of a 2D barotropic vortex solution, obtained from a high-order Navier–Stokes solver that embodies desirable numerical features. For this specific test case our results demonstrate that the relaxation times adapt to the local flow and show a dependence on the velocity field. Furthermore, the novel collision operator demonstrates a better stability-to-precision ratio and outperforms conventional techniques that use an empirical constant for the bulk viscosity.
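For orientation, the sketch below shows the plain stream-and-collide structure of a D2Q9 BGK lattice Boltzmann step with a constant relaxation time; the adaptive, neural-network-optimized collision rates proposed in the paper are deliberately not reproduced here, and the grid size and relaxation time are arbitrary choices.

```python
import numpy as np

# Minimal D2Q9 BGK sketch of the stream-and-collide structure referred to above;
# tau is a constant relaxation time, not the adaptive rates of the paper.
NX, NY, TAU = 64, 64, 0.8
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

f = equilibrium(np.ones((NX, NY)), np.zeros((NX, NY)), np.zeros((NX, NY)))

for _ in range(10):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f += -(f - equilibrium(rho, ux, uy)) / TAU          # BGK collision
    for i, (cx, cy) in enumerate(c):                     # periodic streaming
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)

print("mass conserved:", np.isclose(f.sum(), NX * NY))
```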
With the start of teaching in the winter semester of 1995/96, the business departments in Sankt Augustin and Rheinbach set themselves the goal of continuously assuring and improving the quality of their degree programmes. The evaluation of teaching and studies was implemented early on. The department regards the teaching and evaluation report as an instrument of self-directed quality assurance.
Investition
(2009)
Investition und Finanzierung
(2007)
Even an excellent business idea needs time before it is noticed and rewarded by potential customers. Despite good planning and committed customer acquisition, order volumes can fall short of expectations, not least due to economic weakness. In addition, difficulties can arise in product development, in service delivery and in order processing.
„Auf uns hört ja keiner“
(2019)
Ethik im Medizintourismus
(2022)
Autonomous mobile robots need internal environment representations or models of their environment in order to act in a goal-directed manner, plan actions and navigate effectively. Especially in those situations where a robot cannot be provided with a manually constructed model or in environments that change over time, the robot needs to possess the ability of autonomously constructing models and maintaining these models on its own. To construct a model of an environment multiple sensor readings have to be acquired and integrated into a single representation. Where the robot has to take these sensor readings is determined by an exploration strategy. The strategy allows the robot to sense all environmental structures and to construct a complete model of its workspace. Given a complete environment model, the task of inspection is to guide the robot to all modeled environmental structures in order to detect changes and to update the model if necessary. Informally stated, exploration and inspection provide the means for acquiring as much information as possible by the robot itself. Both exploration and inspection are highly integrated problems. In addition to the according strategies, they require several abilities of a robotic system and comprise various problems from the field of mobile robotics including Simultaneous Localization and Mapping (SLAM), motion planning and control as well as reliable collision avoidance. The goal of this thesis is to develop and implement a complete system and a set of algorithms for robotic exploration and inspection. That is, instead of focusing on specific strategies, robotic exploration and inspection are addressed as the integrated problems that they are. Given the set of algorithms a real mobile service robot has to be able to autonomously explore its workspace, construct a model of its workspace and use this model in subsequent tasks, e.g. for navigating in the workspace or inspecting the workspace itself. The algorithms need to be reliable, robust against environment dynamics and internal failures and applicable online in real-time on a real mobile robot. The resulting system should allow a mobile service robot to navigate effectively and reliably in a domestic environment and avoid all kinds of collisions. In the context of mobile robotics, domestic environments combine the characteristics of being cluttered, dynamic and populated by humans and domestic animals. SLAM is addressed in terms of incremental range image registration, which provides efficient means to construct internal environment representations online while moving through the environment. Two registration algorithms are presented that can be applied on two-dimensional and three-dimensional data together with several extensions and an incremental registration procedure. The algorithms are used to construct two different types of environment representations, memory-efficient sparse points and probabilistic reflection maps. For effective navigation in the robot's workspace, different path planning algorithms are presented for the two types of environment representations. Furthermore, two motion controllers are described that allow a mobile robot to follow planned paths and to approach a target position and orientation.
Finally, this thesis presents different exploration and inspection strategies that use the aforementioned algorithms to move the robot to previously unexplored or uninspected terrain and update the internal environment representations accordingly. These strategies are augmented with algorithms for detecting changes in the environment and for segmenting internal models into individual rooms. The resulting system performed very successfully in the 2008 and 2009 RoboCup@Home competitions.
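To make the registration idea concrete, here is a toy SVD-based rigid alignment of two 2D point sets with known correspondences; the thesis' incremental registration, correspondence search and map representations go well beyond this sketch, and the example scan and transform are invented.

```python
import numpy as np

# Toy sketch of the core of pairwise range-scan registration: an SVD-based
# rigid alignment step with known correspondences (in real ICP the
# correspondences are re-estimated every iteration).
def align_2d(source, target):
    """Return R (2x2) and t (2,) mapping `source` onto `target` in the
    least-squares sense, assuming point i in source corresponds to point i
    in target."""
    mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
    H = (source - mu_s).T @ (target - mu_t)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mu_t - R @ mu_s

# Example: a scan rotated by 30 degrees and shifted is recovered exactly.
rng = np.random.default_rng(2)
scan = rng.uniform(-1, 1, size=(100, 2))
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
moved = scan @ R_true.T + np.array([0.5, -0.2])
R_est, t_est = align_2d(scan, moved)
print(np.allclose(R_est, R_true, atol=1e-6), t_est)
```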
Mobiles Laser-Schneidsystem zur Unterstützung der USBV-Entschärfung und Beweissicherung (mobiLaS)
(2022)
As a resource-poor and export-oriented economy, the Federal Republic of Germany depends to a high degree on securing its cross-border logistics chains and keeping them safe. Given the complex transport structures of cross-border shipments, the inspection and testing procedures employed are of particular importance: on the one hand, cost burdens, interruptions and delays in the transport chain must be minimized; on the other hand, illegal imports, transports and substances in particular must be prevented. The use of specially trained sniffer dogs is of particular importance for suspicion-based and random inspections. As highly capable 'living sensors', they are able to detect a large number of substances. However, the use of sniffer dogs is subject to tight limits: high training effort, narrowly limited deployment times, limited availability. The development of new, optimized deployment procedures for sniffer dogs, e.g. with higher throughput rates and verifiable reliability through the integration of technical systems, is therefore an important contribution to the security and safety of logistics chains.
As a rule, loose explosives in gram quantities are used for dog training. Handling them, however, is subject to very strict rules owing to the hazard potential and for legal reasons. These rules can only be reconciled with difficulty with the requirements of dog training. Handling highly sensitive primary explosives and home-made explosives (e.g. TATP and HMTD) poses a special challenge because of their even greater danger and additional legal regulations. The poster describes the EMPK® (Echtstoff-Mikromengen-Prüfkörper; genuine-substance micro-quantity test specimens), which provide a safe alternative as training aids for explosive detection dogs.
A reference model is always developed to support a specific purpose. The development environment sets the broader context. Limitations are set not only by the size and experience of the modeling team or by budget and time constraints; the intended usage scenario also defines the fundamental contours of a reference model. In practical work with reference models, a range of key issues has emerged that determine how suitable reference models are for daily use. As the result of many projects, the authors have summarized these key issues and formulated critical success factors for reference modeling projects.
The increasing ubiquity of Artificial Intelligence (AI) has significant political consequences. The rapid proliferation of AI over the past decade has prompted legislators and regulators to attempt to contain AI's technological consequences. For Germany, relevant design requirements have been expressed by the European Commission's High-Level Expert Group on Artificial Intelligence (HLEG AI) and, at the national level, by the German government's Data Ethics Commission (DEK) as well as the German Bundestag's Commission of Inquiry on Artificial Intelligence (EKKI).
Wie KI Innere Führung lernt
(2022)
That artificial intelligence (AI) has spread worldwide is a truism. The rapid and unstoppable proliferation of AI over the past ten years speaks for itself, and legislators and regulators have long since begun to follow suit in order to contain AI and its technological consequences. Design requirements relevant for Germany have been expressed by the European Commission's High-Level Expert Group on Artificial Intelligence (HLEG AI) and, at the national level, by the German government's Data Ethics Commission (DEK) and the German Bundestag's Commission of Inquiry on Artificial Intelligence (EKKI).
This introductory paper is intended as an orientation guide on the topic of artificial intelligence (AI) in the context of German as a foreign/second language (DaF/DaZ). Starting from frequently asked questions, it provides basic information on technical and historical backgrounds, suggestions for didactic and methodological reflection, and practical ideas for using AI in the DaF/DaZ context.
Survival of patients with pediatric acute lymphoblastic leukemia (ALL) after allogeneic hematopoietic stem cell transplantation (allo-SCT) is mainly compromised by leukemia relapse, carrying dismal prognosis. As novel individualized therapeutic approaches are urgently needed, we performed whole-exome sequencing of leukemic blasts of 10 children with post–allo-SCT relapses with the aim of thoroughly characterizing the mutational landscape and identifying druggable mutations. We found that post–allo-SCT ALL relapses display highly diverse and mostly patient-individual genetic lesions. Moreover, mutational cluster analysis showed substantial clonal dynamics during leukemia progression from initial diagnosis to relapse after allo-SCT. Only very few alterations stayed constant over time. This dynamic clonality was exemplified by the detection of thiopurine resistance-mediating mutations in the nucleotidase NT5C2 in 3 patients’ first relapses, which disappeared in the post–allo-SCT relapses on relief of selective pressure of maintenance chemotherapy. Moreover, we identified TP53 mutations in 4 of 10 patients after allo-SCT, reflecting acquired chemoresistance associated with selective pressure of prior antineoplastic treatment. Finally, in 9 of 10 children’s post–allo-SCT relapse, we found alterations in genes for which targeted therapies with novel agents are readily available. We could show efficient targeting of leukemic blasts by APR-246 in 2 patients carrying TP53 mutations. Our findings shed light on the genetic basis of post–allo-SCT relapse and may pave the way for unraveling novel therapeutic strategies in this challenging situation.