Refine
H-BRS Bibliography
- yes (204)
Departments, institutes and facilities
- Fachbereich Informatik (58)
- Fachbereich Wirtschaftswissenschaften (51)
- Fachbereich Ingenieurwissenschaften und Kommunikation (31)
- Fachbereich Angewandte Naturwissenschaften (23)
- Institute of Visual Computing (IVC) (20)
- Präsidium (17)
- Institut für funktionale Gen-Analytik (IFGA) (9)
- Internationales Zentrum für Nachhaltige Entwicklung (IZNE) (9)
- Fachbereich Sozialpolitik und Soziale Sicherung (8)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (7)
Document Type
- Article (76)
- Conference Object (52)
- Part of Periodical (19)
- Book (monograph, edited volume) (16)
- Part of a Book (16)
- Report (8)
- Contribution to a Periodical (4)
- Master's Thesis (3)
- Preprint (2)
- Working Paper (2)
Year of publication
- 2013 (204)
Keywords
- Lehrbuch (6)
- Corporate Social Responsibility (3)
- Education (2)
- Three-dimensional displays (2)
- Virtuelle Realität (2)
- ionic liquids (2)
- paper-derived ceramic (2)
- preceramic paper (2)
- 3D real-time echocardiography (1)
- 3D user interface (1)
Information reliability and automatic computation are two important aspects that continuously push the Web to become more semantic. Information uploaded to the Web should be reusable and automatically extractable by other applications, platforms, etc. Several tools exist to explicitly mark up Web content. Web services may also play a positive role in the automatic processing of Web content, especially when they act as flexible and agile agents. However, Web services themselves should be developed with semantics in mind: they should include and provide structured information to facilitate their use, reuse, composition, querying, etc. In this chapter, the authors focus on evaluating state-of-the-art semantic aspects and approaches in Web services. Ultimately, this contributes to the goal of Web knowledge management, execution, and transfer.
BWL für Dummies
(2013)
Updating a shared data structure in a parallel program is usually done with some sort of high-level synchronization operation to ensure correctness and consistency. However, the underlying synchronization instructions in a processor architecture are costly and rather limited in their scalability on larger multi-core/multi-processor systems. In this paper, we examine work queue operations in which such costly atomic update operations are replaced with non-atomic modifiers (simple read+write). In this approach, we trade performing exactly the necessary amount of work with atomic operations against doing more, partly redundant work without atomic operations, without violating the correctness of the algorithm. We show results for the application of this idea to the concrete scenario of parallel Breadth First Search (BFS) algorithms for undirected graphs on two large NUMA shared memory systems with up to 64 cores.
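The core idea of the abstract above can be illustrated with a minimal, single-process sketch (all names are illustrative, not the authors' code): in a level-synchronous BFS, the "already visited" check-and-set need not be atomic, because two workers that both claim the same vertex write the same level value, so duplicates only cause redundant, harmless work.

```python
from collections import defaultdict

def bfs_levels(adj, source):
    """Level-synchronous BFS sketch. The level check-and-set below is
    deliberately non-atomic: in a parallel run, two workers may both see
    a vertex as unvisited and both append it to the next frontier. The
    redundant work is benign, since every such worker writes the same level."""
    level = {source: 0}
    frontier = [source]
    depth = 0
    while frontier:
        next_frontier = []
        for v in frontier:              # in parallel: frontier split among workers
            for w in adj[v]:
                if w not in level:      # non-atomic test ...
                    level[w] = depth + 1  # ... and set: duplicates possible, but harmless
                    next_frontier.append(w)
        # deduplicating next_frontier is optional: extra entries only redo work
        frontier = next_frontier
        depth += 1
    return level

# tiny undirected example graph
adj = defaultdict(list)
for a, b in [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]:
    adj[a].append(b)
    adj[b].append(a)

print(bfs_levels(adj, 0))  # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
```

The trade-off named in the abstract is visible here: an atomic claim would guarantee each vertex is enqueued exactly once, while the relaxed version may enqueue it several times per level yet still computes correct distances.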
Universities are exploring how courses and research projects can be supported and complemented through the use of digital tools and learning platforms. Libraries frequently play a key role here, in quite different ways: as content developers and providers, as support desks, or as e-learning advisory services for teachers and students.
Although most individuals who gamble do so without any adverse consequences, some individuals develop a recurrent, maladaptive pattern of gambling behaviour, often called pathological gambling or gambling disorder, that is associated with financial losses, disruption of family and interpersonal relationships, and co-occurring psychiatric disorders. Identifying whether different types of gambling modalities vary in their ability to lead to maladaptive patterns of gambling behaviour is essential to develop public policies that seek to balance access to gambling opportunities with minimizing risk for the potential adverse consequences of gambling behaviour. Until recently, assessing the risk potential of different types of gambling products was nearly impossible. ASTERIG, initially developed in Germany in 2006-2010, is an assessment tool to measure and evaluate the risk potential of any gambling product based on scores on ten dimensions. In doing so, it also allows a comparison to be drawn between the addictive potential of different gambling products. Furthermore, the tool highlights where the specific risk potential of each gambling product lies. This makes it a valuable tool at the legislative, case law, and administrative levels, as it allows the risk potential of individual gambling products to be identified and compared globally and across ten different dimensions of risk potential. We note that specific gambling products should always be evaluated rather than product groups (lotteries, slot machines) or providers, as there may be variations among those product groups that impact their risk potential. For example, slot machines may vary in the amount of the jackpot, which may influence their risk potential.
Switched power electronic subsystems are widely used in various applications. A fault in one of their components may have a significant effect on the system’s load or may even cause damage. Therefore, it is important to detect and isolate faults and to report true faults to a supervisory system in order to avoid malfunction of or damage to a load. If, in a model-based approach to fault detection and isolation of hybrid systems, switching devices are considered ideal switches, then some equations must be reformulated whenever devices have switched. In this paper, a fixed causality bond graph representation of hybrid system models is used, i.e., computational causalities assigned according to the Standard Causality Assignment Procedure (SCAP) are independent of system modes of operation. The latter are taken into account by transformer moduli mi(t) ∈ {0, 1} ∀t ≥ 0 in a unique set of equations of motion. In a case study, this approach is used for fault diagnosis in a three-phase full-wave rectifier. Residuals of Analytical Redundancy Relations (ARRs) holding for all modes of operation and serving as fault indicators are computed in an offline simulation as part of a DAE system by using a bond graph model of the faulty system instead of the real one and by coupling it to a bond graph of the healthy system by means of residual sinks.
Increased endothelin-1 decreases PKC alpha (PKCα), resulting in high miRNA 15a levels in kidney tumors. Breast cancer cells treated with ET-1, β-estrogen, Tamoxifen, Tamoxifen + β-estrogen and Tamoxifen + ET-1 were analysed regarding miRNA 15a expression. Significantly increased miRNA 15a levels were found after ET-1 treatment, becoming further increased in Tamoxifen + ET-1 treated cells. Our group has already shown that miRNA 15a induces MAPK p38 splicing, resulting in a truncated product called Mxi-2, whose function has yet to be defined in tumors. We describe for the first time in ET-1 induced tumor cells that Mxi-2 forms a complex with Ago2, a miRNA binding protein, which is important for the localization of miRNAs to the 3′UTR of target genes. Furthermore, we show that Mxi-2/Ago2 is important for the interaction with miRNA 1285, which binds to the 3′end of the tumor suppressor gene p53 and is responsible for the downregulation of p53. Tissue arrays from breast cancer patients were analysed for Mxi-2, p53 and PKCα. Since Mxi-2 levels increase in Tamoxifen + ET-1 treated cells, we propose that increasing ET-1 levels in Tamoxifen treated breast cancer patients are responsible for decreasing p53 levels. In summary, ET-1 decreases nuclear PKCα levels while increasing the amount of miRNA 15a. This causes high levels of Mxi-2, necessary for complex formation with Ago2. The newly identified Mxi-2/Ago2 complex interacting with miRNA 1285 leads to increased 3′UTR p53 interaction, resulting in decreased p53 levels and subsequent tumor progression. This newly identified mechanism is a possible explanation for the development of ET-1 induced tumors.
This paper examines how students learn to collaborate in English by participating in an intercultural project in which they worked together on a digital writing project using various online tools. Mixed groups of students, two French and two German, used several synchronous and asynchronous tools to communicate with their counterparts (Facebook, WordPress blog, WIMS e-learning platform, email, videoconferencing). Each group had to produce an article together, comparing French and German attitudes towards a topic they negotiated freely within the group. Before publishing their post, students were expected to peer-review the article written by their group. Once the posts were published, the next stage consisted of voting for the best posts on the e-learning platform, WIMS. A videoconference was also organized to create cohesion between the participants. The results of the student evaluations are presented, together with the administrative and technical implications of the vastly differing university setups.
The BRICS component model: a model-based development paradigm for complex robotics software systems
(2013)
Internet-Ökonomie
(2013)
Angewandte Makroökonomie
(2013)
Macroeconomic events such as the debt crisis, recession, unemployment and inflation not only have economy-wide consequences, but also touch everyday life in many ways. These events are often complex and not always easy for the individual to understand. To prepare students for the global challenges of economy, society and the environment, this textbook explicitly integrates the topic of sustainable development. In addition, the major themes of macroeconomics are partly treated in bundled form in order to make the manifold interconnections between the individual areas more transparent. For students and teachers, this has the advantage, among others, that modular use is possible.
Controlling
(2013)
Goal-oriented management and increased transparency are among the central tasks of management in companies and in non-commercial organisations. Against the background of increasing environmental dynamics and the high complexity in which organisations operate today, modern, goal-oriented controlling is becoming increasingly important for fulfilling these demanding management tasks.
Earth’s nearest candidate supermassive black hole lies at the centre of the Milky Way [1]. Its electromagnetic emission is thought to be powered by radiatively inefficient accretion of gas from its environment [2], which is a standard mode of energy supply for most galactic nuclei. X-ray measurements have already resolved a tenuous hot gas component from which the black hole can be fed [3]. The magnetization of the gas, however, which is a crucial parameter determining the structure of the accretion flow, remains unknown. Strong magnetic fields can influence the dynamics of accretion, remove angular momentum from the infalling gas [4], expel matter through relativistic jets [5] and lead to synchrotron emission such as that previously observed [6-8]. Here we report multi-frequency radio measurements of a newly discovered pulsar close to the Galactic Centre [9-12] and show that the pulsar’s unusually large Faraday rotation (the rotation of the plane of polarization of the emission in the presence of an external magnetic field) indicates that there is a dynamically important magnetic field near the black hole. If this field is accreted down to the event horizon it provides enough magnetic flux to explain the observed emission—from radio to X-ray wavelengths—from the black hole.
Radio pulsars in relativistic binary systems are unique tools to study the curved space-time around massive compact objects. The discovery of a pulsar closely orbiting the super-massive black hole at the centre of our Galaxy, Sgr A⋆, would provide a superb test-bed for gravitational physics. To date, the absence of any radio pulsar discoveries within a few arc minutes of Sgr A⋆ has been explained by one principal factor: extreme scattering of radio waves caused by inhomogeneities in the ionized component of the interstellar medium in the central 100 pc around Sgr A⋆. Scattering, which causes temporal broadening of pulses, can only be mitigated by observing at higher frequencies. Here we describe recent searches of the Galactic centre region performed at a frequency of 18.95 GHz with the Effelsberg radio telescope.
Produktionswirtschaft
(2013)
We derive rates of convergence for limit theorems that reveal the intricate structure of the phase transitions in a mean-field version of the Blume-Emery-Griffith model. The theorems consist of scaling limits for the total spin. The model depends on the inverse temperature β and the interaction strength K. The rate-of-convergence results are obtained as (β,K) converges along appropriate sequences (βn,Kn) to points belonging to various subsets of the phase diagram, which include a curve of second-order points and a tricritical point. We apply Stein's method for normal and non-normal approximation, avoiding the use of transforms and supplying bounds, such as those of Berry-Esseen quality, on the approximation error. We observe an additional phase transition phenomenon in the sense that, depending on how fast Kn and βn converge to points in various subsets of the phase diagram, different rates of convergence to one and the same limiting distribution occur.
Benchmarking
(2013)
Rising cost pressure and the need to improve efficiency have become a reality for employees in the healthcare sector. Many institutions have to optimise their workflows while at the same time improving the quality of their services. The book explains the fundamentals and selected methods of process management that can be applied in healthcare institutions (hospitals, statutory health insurance funds and private health insurers).
Big Data refers to the real-time processing of very large volumes of data for analytical tasks. New technologies and methods make it possible to perform previously unexploited analyses in a very short time. This article shows how Big Data fits into the business management context, which opportunities are open to companies, and what consequences this will have for controllers.
Die Klausur
(2013)
The exercises are part of an examination set in the summer semester of 2013 by Prof. Dr. Andreas Gadatsch (Hochschule Bonn-Rhein-Sieg) in the examination subject IT-Controlling for business students of the Master's programme Informations- und Innovationsmanagement. Time allowed for the complete examination: 90 minutes.
Cloud services are regarded as a promising concept for the provision of IT services and are increasingly displacing classic IT outsourcing models. However, many companies still resort to traditional provisioning concepts (e.g. operating their own data centre, classic outsourcing). One reason for this is a lack of trust in the "cloud". As a rule, customers cannot assess the potential risks and the provider's security level, since they usually have no direct insight into the service provider's processes and structures (Winkelmann, 2010).
The further adoption of cloud services depends heavily on whether customers have sufficient trust in the IT service providers, the technical infrastructure and the overall organisational and legal environment (e.g. the applicable legal protection and the available claims and remedies). One way to improve this situation in the sense of a decision aid is to use a neutral "certificate for the IT security of cloud services". Independent certification of cloud services is therefore of considerable importance for further building trust in the "cloud".
This white paper describes typical categories of cloud services, addresses the high importance of trust in the cloud, and discusses the requirements for a "trustworthy certification of cloud services" (trusted cloud services).
Finally, a market-ready certification scheme for cloud services is presented that can meet the requirements described above.
IT-gestütztes IT-Controlling
(2013)
Greenpocket GmbH, in cooperation with Hochschule Bonn-Rhein-Sieg, conducted a study on the topic of smart homes. As the online survey shows, the willingness to pay for smart home solutions among young, tech-savvy consumers, the digital natives, is higher than previously assumed. The online questionnaire comprised 32 questions in total; parts of the survey were designed according to the Kano model. Among the results: control via smartphone plays an important role. Overall, quality matters more than a low price. In the long term, a smart home should be able to recognise and take into account the user's habits automatically. The connection to social networks, on the other hand, is viewed critically. Controlling household and electrical appliances depending on the electricity price is considered desirable.
Power train models are required to simulate, and hence predict, the energy consumption of vehicles. Efficiencies for the different components in the power train are required. Common procedures use digitalised shell models (or maps) to model the efficiency of Internal Combustion Engines (ICE) and manual gearboxes (MG). Errors are associated with these models and affect the accuracy of the calculation. The accuracy depends on the configuration of the simulation, the digitalisation of the data and the data used. This paper evaluates these sources of error. Understanding the sources of error can improve the results of the modelling by more than eight percent.
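The "map" models mentioned above are typically implemented as interpolation over a measured grid of operating points. A minimal sketch with hypothetical data (the grid values below are invented for illustration, not taken from the paper):

```python
from bisect import bisect_right

def interp_map(xs, ys, grid, x, y):
    """Bilinear interpolation on a rectangular efficiency map.
    xs, ys: sorted axis values; grid[i][j]: efficiency at (xs[i], ys[j])."""
    # locate the grid cell containing (x, y), clamping to the map's edges
    i = min(max(bisect_right(xs, x) - 1, 0), len(xs) - 2)
    j = min(max(bisect_right(ys, y) - 1, 0), len(ys) - 2)
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])      # fractional position in the cell
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    return ((1 - tx) * (1 - ty) * grid[i][j] + tx * (1 - ty) * grid[i + 1][j]
            + (1 - tx) * ty * grid[i][j + 1] + tx * ty * grid[i + 1][j + 1])

# hypothetical ICE efficiency map: engine speed (rpm) x torque (Nm) -> efficiency
speeds  = [1000, 2000, 3000]
torques = [50, 100]
eff = [[0.20, 0.30],
       [0.25, 0.35],
       [0.22, 0.32]]

print(interp_map(speeds, torques, eff, 1500, 75))  # 0.275
```

Digitisation error of the kind the paper evaluates enters exactly here: the interpolated value between grid points deviates from the true efficiency surface, and a coarser grid or noisier measured values propagates directly into the predicted energy consumption.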
Die Vorstandsperspektive
(2013)
Simulations within virtual environments usually require underlying semantics. For traffic simulations, defined road networks are typically used. These networks are mostly created by hand, which is error-prone and time-consuming. This project was carried out within the AVeSi project, which researches the development of a realistic traffic simulation for virtual environments. The simulation approach pursued in the project is based on two levels of complexity, a microscopic and a mesoscopic one. Realising a transition between the simulation levels requires linking the road networks, which is likewise very time-consuming. This report presents network models for both levels. It then describes an approach that enables the automatic generation and linking of road networks for both models. Data in the OpenDRIVE® format serves as the basis for network generation. For the evaluation, realistic OpenStreetMap data was converted into OpenDRIVE® data sets using third-party software. It could be shown that the approach makes it possible to generate large road networks within a few minutes, on which simulations can be run immediately. However, the quality of the networks generated for the evaluation is not sufficient for environments that demand a high degree of realism, making an additional post-processing step necessary. The quality problems could be traced back to the fact that the level of detail of the OpenStreetMap data underlying the evaluation is not high enough and that the conversion process is not sufficiently transparent.
Traffic simulations are generally used to forecast traffic behavior or to simulate non-player characters in computer games and virtual environments. These systems are usually modeled in such a way that traffic rules are strictly followed. However, rule violations are a common part of real-life traffic and thus should be integrated into such models.
Real-Time Simulation of Camera Errors and Their Effect on Some Basic Robotic Vision Algorithms
(2013)
Computers will soon be powerful enough to simulate consciousness. The artificial life community should start to try to understand how consciousness could be simulated. The proposal is to build an artificial life system in which consciousness might be able to evolve. The idea is to develop an internet-wide artificial universe in which agents can evolve. Users play games by defining agents that form communities. The communities have to perform tasks, or compete, or whatever the specific game demands. The demands should be such that agents that are more aware of their universe are more likely to succeed. The agents reproduce and evolve within their user’s machine, but can also sometimes transfer to other machines across the internet. Users will be able to choose the capabilities of their agents from a fixed list, but may also write their own powers for their agents.
This work extends the affordance-inspired robot control architecture introduced in the MACS project [35], and especially its approach to integrating symbolic planning systems given in [24], by providing methods for the automated abstraction of affordances to high-level operators. It discusses how symbolic planning instances can be generated automatically based on these operators and introduces an instantiation method to execute the resulting plans. Preconditions and effects of agent behaviour are learned and represented in Gärdenfors' conceptual spaces framework. Its notion of similarity is used to group behaviours into abstract operators based on the affordance-inspired, function-centred view of the environment. It is discussed how the capability of conceptual spaces to map subsymbolic to symbolic representations can be used to generate PDDL planning domains including affordance-based operators. During plan execution, affordance-based operators are instantiated by agent behaviour based on the situation directly before its execution. The current situation is compared to past ones, and the behaviour that has been most successful in the past is applied. Execution failures can be repaired by action substitution. The concept of using contexts to dynamically change dimension salience, as introduced by Gärdenfors, is realized using techniques from the field of feature selection. The approach is evaluated using a 3D simulation environment and implementations of several object manipulation behaviours.
Molecular modeling is an important subdomain in the field of computational modeling, regarding both scientific and industrial applications. This is because computer simulations on a molecular level are a valuable instrument to study the impact of microscopic on macroscopic phenomena. Accurate molecular models are indispensable for such simulations in order to predict physical target observables, like density, pressure, diffusion coefficients or energetic properties, quantitatively over a wide range of temperatures. Thereby, molecular interactions are described mathematically by force fields. The mathematical description includes parameters for both intramolecular and intermolecular interactions. While intramolecular force field parameters can be determined by quantum mechanics, the parameterization of the intermolecular part is often tedious. Recently, an empirical procedure, based on the minimization of a loss function between simulated and experimental physical properties, was published by the authors. Thereby, efficient gradient-based numerical optimization algorithms were used. However, empirical force field optimization is inhibited by the two following central issues appearing in molecular simulations: firstly, they are extremely time-consuming, even on modern and high-performance computer clusters, and secondly, simulation data is affected by statistical noise. The latter provokes the fact that an accurate computation of gradients or Hessians is nearly impossible close to a local or global minimum, mainly because the loss function is flat. Therefore, the question arises of whether to apply a derivative-free method approximating the loss function by an appropriate model function. In this paper, a new Sparse Grid-based Optimization Workflow (SpaGrOW) is presented, which accomplishes this task robustly and, at the same time, keeps the number of time-consuming simulations relatively small.
This is achieved by an efficient sampling procedure for the approximation based on sparse grids, which is described in full detail: in order to counteract the fact that sparse grids are fully occupied on their boundaries, a mathematical transformation is applied to generate homogeneous Dirichlet boundary conditions. As the main drawback of sparse grids methods is the assumption that the function to be modeled exhibits certain smoothness properties, it has to be approximated by smooth functions first. Radial basis functions turned out to be very suitable to solve this task. The smoothing procedure and the subsequent interpolation on sparse grids are performed within sufficiently large compact trust regions of the parameter space. It is shown and explained how the combination of the three ingredients leads to a new efficient derivative-free algorithm, which has the additional advantage that it is capable of reducing the overall number of simulations by a factor of about two in comparison to gradient-based optimization methods. At the same time, the robustness with respect to statistical noise is maintained. This assertion is proven by both theoretical considerations and practical evaluations for molecular simulations on chemical example substances.
The device (10) has a handrail (18) provided with an optical contactless monitoring device formed as an active sensor system, where the monitoring device is arranged in a region of a guide (14) of the handrail at a front base (16) of an escalator (12) or a moving pavement. The monitoring device has two transmission paths (28, 30) with wavelength bands that are different from each other, where one of the paths includes the handrail. Ratio or difference between signals of the paths is used for recognizing foreign bodies e.g. hands of adults and children.
The simulation of fluid flows is important in many fields of application, especially in industry and infrastructure. The modelling equations applied describe a coupled system of non-linear, hyperbolic partial differential equations given by the one-dimensional shallow water equations, which enable the consistent treatment of free surface flows in open channels as well as pressurised flows in closed pipes. The numerical realisation of these equations remains complicated and challenging due to their characteristic properties, which can give rise to discontinuous solutions.
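For reference, the one-dimensional shallow water equations alluded to above can be written in conservative form (h water depth, u velocity, g gravitational acceleration; the source term S and the pressurised-pipe coupling used in the paper are not reproduced here):

```latex
\begin{aligned}
\partial_t h + \partial_x (h u) &= 0, \\
\partial_t (h u) + \partial_x\!\left( h u^2 + \tfrac{1}{2} g h^2 \right) &= S(x).
\end{aligned}
```

The hyperbolic character of this system is what permits the discontinuous solutions (hydraulic jumps, bores) mentioned in the abstract.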
Application performance improvements through VM parameter modification after runtime analysis
(2013)
In the joint research project, the IZNE analysed the perception of health-related and financial value-creation aspects of corporate mobility management (BMM). For this purpose, 178 companies were surveyed in writing and 22 company managers were interviewed in person about measures of workplace health promotion (BGF), and 1,341 employees from 14 companies in the Bonn area were surveyed about their mobility behaviour. Assessing the actual existence and the health-related and economic benefits of BMM was intended to reveal needs and potential for optimisation.
Grailog embodies a systematics to visualize knowledge sources by graphical elements. Its main benefit is that the resulting visual presentations are easier to read for humans than the original symbolic source code. In this paper we introduce a methodology to handle the mapping from Datalog RuleML, serialized in XML, to an SVG representation of Grailog, also serialized in XML, via eXtensible Stylesheet Language Transformations (XSLT) 2.0/XML; the SVG is then rendered visually by modern Web browsers. This initial mapping is realized to target Grailog's "fully node copied" normal form. Elements can thus be translated one at a time, separating the fundamental Datalog-to-SVG translation concern from the concern of merging node copies for optimal (hyper)graph layout and avoiding its high computational complexity in this online tool. The resulting open source Grailog Knowledge-Source Visualizer (Grailog KS Viz) supports Datalog RuleML with positional relations of arity n>1. The on-the-fly transformation was shown to run on all recent major Web browsers and should be easy to understand, use, and extend.
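The element-at-a-time mapping described above can be illustrated with a much-simplified stand-in (the real Grailog KS Viz uses XSLT 2.0; the element names below follow RuleML's Atom/Rel/Ind pattern, but the layout rules are purely invented for this sketch):

```python
import xml.etree.ElementTree as ET

# a minimal positional Datalog RuleML atom: likes(alice, bob)
RULEML = "<Atom><Rel>likes</Rel><Ind>alice</Ind><Ind>bob</Ind></Atom>"

def atom_to_svg(xml_text):
    """Translate one atom into an SVG fragment, one labelled box per element,
    mirroring the one-element-at-a-time translation strategy."""
    atom = ET.fromstring(xml_text)
    labels = [e.text for e in atom]          # relation name, then arguments
    svg = ET.Element("svg", xmlns="http://www.w3.org/2000/svg",
                     width="300", height="60")
    for i, label in enumerate(labels):       # lay boxes out left to right
        x = 10 + i * 100
        ET.SubElement(svg, "rect", x=str(x), y="10",
                      width="80", height="30", fill="none", stroke="black")
        text = ET.SubElement(svg, "text", x=str(x + 10), y="30")
        text.text = label
    return ET.tostring(svg, encoding="unicode")

svg = atom_to_svg(RULEML)
print(svg)
```

Translating each element independently, as here, corresponds to the "fully node copied" normal form mentioned above: no attempt is made to merge node copies, which keeps the translation simple at the cost of graph layout quality.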
Demographic change and the problems that come with it, such as skills shortages, an ageing workforce and a continuous loss of know-how, are no longer unfamiliar terms. The same cannot be said of possible solutions. In her book, Ms Kramer presents one possible solution, life-phase-oriented human resources policy, and illustrates two different concepts with practical examples. The idea behind it: the traditionally limited perspective of HR policy, with its focus on the first 20 years of working life, is extended in life-phase-oriented HR policy to cover the entire working lifetime.
Distributed systems comprise distributed computing systems, distributed information systems, and distributed pervasive systems. They are often very complex, and their implementation is challenging. Intensive and continuous testing is indispensable to ensure reliability and high quality of a distributed system. The testing process should have a high degree of automation, not only on lower levels (i.e. unit and module testing), but also on higher testing levels (e.g. system, integration, and acceptance tests). To achieve automation on higher testing levels, virtual infrastructure components (e.g. virtual machines, virtual networks) that are offered as Infrastructure as a Service (IaaS) can be employed. The elasticity of on-demand computation resources fits well with the varying resource demands of automated test execution.
A methodology for automated acceptance testing of distributed systems that uses virtual infrastructure is presented. It is founded on a task-oriented model that is used to abstract concurrency and asynchronous, remote communication in distributed systems. The model is used as groundwork for a domain-specific language that allows expressing tests for distributed systems in the form of scenarios. On the one hand, test scenarios are executable and, therefore, fully automated. On the other hand, test scenarios represent requirements to the system under test making an automated, example-based verification possible.
A prototypical implementation is used to apply the developed methodology in the context of two different case studies. The first case study uses RCE as an example of a distributed, workflow-driven integration environment for scientific computing. The second one uses MongoDB as an example of a document-oriented database system that offers distributed data storage through master-slave replication. The results of the experimental evaluation indicate that the developed acceptance testing methodology is a useful approach to design, build, and execute tests for distributed systems with high quality and a high degree of automation.
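The scenario idea sketched in the abstract, executable tests that double as example-based requirements, can be hinted at with a toy model (all names are hypothetical; the real methodology targets genuinely distributed nodes and asynchronous communication, which are faked here by local objects):

```python
# Illustrative sketch of a scenario-style test DSL for a distributed system.
class Node:
    """Stand-in for a remote node; replication is modelled synchronously."""
    def __init__(self, name):
        self.name, self.store = name, {}
    def put(self, key, value):
        self.store[key] = value
    def replicate_to(self, other):
        other.store.update(self.store)   # stand-in for asynchronous replication

class Scenario:
    """Each step is a (description, action) pair; run() executes the steps
    in order and then checks the declared expectations."""
    def __init__(self):
        self.steps, self.checks = [], []
    def step(self, text, action):
        self.steps.append((text, action)); return self
    def expect(self, text, predicate):
        self.checks.append((text, predicate)); return self
    def run(self):
        for _, action in self.steps:
            action()
        return all(pred() for _, pred in self.checks)

master, replica = Node("master"), Node("replica")
ok = (Scenario()
      .step("client writes to master", lambda: master.put("k", 42))
      .step("master replicates", lambda: master.replicate_to(replica))
      .expect("replica sees the write", lambda: replica.store.get("k") == 42)
      .run())
print(ok)  # True
```

The readable step and expectation descriptions are what let a scenario serve both as an automated test and as a requirement that a non-programmer can review, which is the dual role the methodology above assigns to test scenarios.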
Realism and plausibility of computer controlled entities in entertainment software have been enhanced by adding both static personalities and dynamic emotions. Here a generic model is introduced which allows the transfer of findings from real-life personality studies to a computational model. This information is used for decision making. The introduction of dynamic event-based emotions enables adaptive behavior patterns. The advantages of this new model have been validated with a four-way crossroad in a traffic simulation. Driving agents using the introduced model enhanced by dynamics were compared to agents based on static personality profiles and simple rule-based behavior. It has been shown that adding an adaptive dynamic factor to agents improves perceivable plausibility and realism. It also supports coping with extreme situations in a fair and understandable way.
Companies are increasingly exposed to changes in their environment. With growing globalisation and the increasing complexity of value networks, the number, speed and intensity of these changes rise, forcing companies to (re)act promptly, more quickly and more often. Planning makes it possible to anticipate possible developments and to take rational, direction-setting decisions at an early stage, to tap, secure and exploit potentials, and to identify, analyse and minimise risks. Monitoring the decisions allows a root-cause analysis of the deviations between plan and reality. Control thus forms the basis for learning processes in companies.
The criteria for assessing the quality of rubber materials are the polymer or copolymer composition and the additives. These additives include plasticizers, extender oils, carbon black, inorganic fillers, antioxidants, heat and light stabilizers, processing aids, cross-linking agents, accelerators, retarders, adhesives, pigments, smoke and flame retardants, and others. Determination of additives in polymers or copolymers generally requires the extraction of these substances from the matrix as a first step, which can be challenging, and the subsequent analysis of the extracted additives by gas chromatography (GC), GC-mass spectrometry (MS), high performance liquid chromatography (HPLC), HPLC-MS, capillary electrophoresis, thin-layer chromatography, and other analytical techniques. In the present work, nitrile rubber materials were studied using direct analytical flash pyrolysis hyphenated to GC and electrospray ionization MS in both scan and selected ion monitoring modes to demonstrate that this technique is a good tool to identify the organic additives in nitrile rubber.
Improving Robustness of Task Execution Against External Faults Using Simulation Based Approach
(2013)
Robots interacting in complex and cluttered environments may face unexpected situations, referred to as external faults, which prohibit the successful completion of their tasks. In order to function in a more robust manner, robots need to recognise these faults and learn how to deal with them in the future. We present a simulation-based technique to avoid external faults that occur during the execution of releasing actions of a robot. Our technique utilizes simulation to generate a set of labeled examples, which are used by a histogram algorithm to compute a safe region. A safe region consists of a set of releasing states of an object that correspond to successful performances of the action. This technique also suggests a general solution for avoiding external faults not only for the current, observable object but also for any other object of the same shape but different size.
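The histogram step described above can be sketched in one dimension (a deliberate simplification: the release state is reduced to a single position coordinate, and the sample data below is invented): labeled examples from simulation are binned, and bins whose empirical success rate clears a threshold form the safe region.

```python
def safe_region(samples, bin_width=1.0, threshold=0.8):
    """Histogram-based safe-region sketch (illustrative, 1-D release position).
    samples: list of (position, success) pairs from simulated release actions.
    Returns the left edges of bins whose success rate reaches the threshold."""
    bins = {}
    for pos, ok in samples:
        b = int(pos // bin_width)            # assign the sample to a bin
        hit, total = bins.get(b, (0, 0))
        bins[b] = (hit + ok, total + 1)
    return sorted(b * bin_width for b, (hit, total) in bins.items()
                  if hit / total >= threshold)

# simulated labeled examples: releases near positions 2-3 mostly succeed
samples = [(0.2, 0), (0.7, 0), (1.4, 0), (2.1, 1), (2.6, 1), (2.9, 1),
           (3.3, 1), (3.8, 0), (4.5, 0)]
print(safe_region(samples))  # [2.0]
```

Because the region is expressed over release states rather than over a particular object, the same computed region can, as the abstract suggests, be rescaled for another object of the same shape but different size.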