Departments, institutes and facilities
- Fachbereich Informatik (58)
- Fachbereich Wirtschaftswissenschaften (51)
- Fachbereich Ingenieurwissenschaften und Kommunikation (31)
- Fachbereich Angewandte Naturwissenschaften (23)
- Institut für funktionale Gen-Analytik (IFGA) (20)
- Institute of Visual Computing (IVC) (20)
- Präsidium (17)
- Institut für Cyber Security & Privacy (ICSP) (15)
- Institut für Verbraucherinformatik (IVI) (9)
- Internationales Zentrum für Nachhaltige Entwicklung (IZNE) (9)
Document Type
- Article (103)
- Conference Object (71)
- Part of a Book (24)
- Book (monograph, edited volume) (21)
- Part of Periodical (19)
- Report (8)
- Lecture (6)
- Contribution to a Periodical (4)
- Master's Thesis (4)
- Conference Proceedings (2)
Year of publication
- 2013 (272)
Keywords
- Textbook (6)
- Corporate Social Responsibility (3)
- Amiloride (2)
- Education (2)
- Internet (2)
- Mal d 1 (2)
- Molecular dynamics (2)
- Qualitätsmanagement [quality management] (2)
- Three-dimensional displays (2)
- Virtual reality (2)
Benchmarking
(2013)
Die Klausur
(2013)
The exercises are part of an examination set in the summer semester of 2013 by Prof. Dr. Andreas Gadatsch (Hochschule Bonn-Rhein-Sieg) in the subject IT-Controlling for business students in the master's programme Informations- und Innovationsmanagement. Time allowed for the complete examination: 90 minutes.
Rising cost pressure and the need to improve economic efficiency have become a reality for employees in the healthcare sector. Many institutions must optimise their workflows while simultaneously improving the quality of their services. The book explains the fundamentals and selected methods of process management that can be applied in healthcare institutions (hospitals, health insurance funds and health insurers).
Big Data refers to the real-time processing of very large volumes of data for analytical tasks. New technologies and methods make it possible to carry out previously unfeasible analysis tasks in a very short time. This article shows how Big Data fits into the business management context, which opportunities are open to companies, and what consequences this will have for controllers.
Cloud services are regarded as a promising concept for the provision of IT services and are increasingly displacing classic IT outsourcing models. Nevertheless, many companies still fall back on traditional provisioning concepts (e.g. operating their own data centre, classic outsourcing). One reason for this is a lack of trust in the "cloud". Customers are usually unable to assess the potential risks and the provider's security level, as they typically have no direct insight into the service provider's processes and structures (Winkelmann, 2010).
The further spread of cloud services depends heavily on whether customers have sufficient trust in the IT service providers, the technical infrastructure and the entire organisational and legal environment (e.g. the applicable legal protection and the options for asserting claims or taking legal action). One way to improve this situation, as a decision-making aid, is to use a neutral "certificate for the IT security of cloud services". As part of the further building of trust in the "cloud", the independent certification of cloud services is of considerable importance.
This white paper describes typical categories of cloud services, addresses the great importance of trust in the cloud and discusses the requirements for a "trustworthy certification of cloud services" (trusted cloud services).
Finally, a market-ready certification solution for cloud services that can fulfil the requirements described above is presented.
IT-gestütztes IT-Controlling
(2013)
Greenpocket GmbH, in cooperation with Hochschule Bonn-Rhein-Sieg, conducted a study on the topic of the smart home. As the online survey shows, the willingness to pay for smart home solutions among young, tech-savvy consumers, the digital natives, is higher than previously assumed. In total, the online questionnaire comprised 32 questions. Parts of the survey were designed according to the Kano model. Key results: control via smartphone plays an important role. Overall, quality matters more than a low price. In the long term, a smart home should be able to automatically recognise and accommodate the user's habits. Integration with social networks, by contrast, is viewed critically. The ability to control household and electrical appliances depending on the electricity price is considered desirable.
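The Kano-model part of the survey can be illustrated with a small sketch. The questionnaire itself is not reproduced here, so the example answers below are invented; the answer scale and evaluation table follow the standard Kano method (A = Attractive, O = One-dimensional, M = Must-be, I = Indifferent, R = Reverse, Q = Questionable).

```python
# Kano evaluation: classify a feature from paired answers to the
# "functional" question (feature present) and "dysfunctional" question
# (feature absent).
SCALE = ["like", "must-be", "neutral", "live-with", "dislike"]

# Standard Kano evaluation table (rows: functional answer, cols: dysfunctional).
TABLE = [
    ["Q", "A", "A", "A", "O"],   # functional: like
    ["R", "I", "I", "I", "M"],   # functional: must-be
    ["R", "I", "I", "I", "M"],   # functional: neutral
    ["R", "I", "I", "I", "M"],   # functional: live-with
    ["R", "R", "R", "R", "Q"],   # functional: dislike
]

def kano_category(functional: str, dysfunctional: str) -> str:
    return TABLE[SCALE.index(functional)][SCALE.index(dysfunctional)]

def dominant_category(answer_pairs) -> str:
    """Tally categories over all respondents, return the most frequent one."""
    counts = {}
    for f, d in answer_pairs:
        c = kano_category(f, d)
        counts[c] = counts.get(c, 0) + 1
    return max(counts, key=counts.get)
```

For example, a respondent who likes smartphone control when present and dislikes its absence rates it one-dimensional: `kano_category("like", "dislike")` returns `"O"`.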
Power train models are required to simulate and hence predict the energy consumption of vehicles. This requires the efficiencies of the different components in the power train. Common procedures use digitised shell models (or maps) to model the efficiency of internal combustion engines (ICE) and manual gearboxes (MG). These models are subject to errors that affect the accuracy of the calculation. The accuracy depends on the configuration of the simulation, the digitisation of the data and the data used. This paper evaluates these sources of error. Understanding the sources of error can improve the modelling results by more than eight percent.
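As a hypothetical illustration of the kind of digitised efficiency map discussed above, the sketch below bilinearly interpolates an ICE efficiency value from a coarse speed/torque grid. The grid values are invented; the interpolation between measured grid points is one of the digitisation effects that contribute to the errors the paper evaluates.

```python
from bisect import bisect_right

def interpolate_map(x_axis, y_axis, grid, x, y):
    """Bilinear interpolation on a rectangular map, e.g. engine efficiency
    over (speed, torque). Operating points outside the map raise ValueError."""
    if not (x_axis[0] <= x <= x_axis[-1] and y_axis[0] <= y <= y_axis[-1]):
        raise ValueError("operating point outside the map")
    # Locate the grid cell containing (x, y); clamp so the upper edge
    # of the map still falls into the last cell.
    i = min(bisect_right(x_axis, x), len(x_axis) - 1) - 1
    j = min(bisect_right(y_axis, y), len(y_axis) - 1) - 1
    tx = (x - x_axis[i]) / (x_axis[i + 1] - x_axis[i])
    ty = (y - y_axis[j]) / (y_axis[j + 1] - y_axis[j])
    return ((1 - tx) * (1 - ty) * grid[i][j] + tx * (1 - ty) * grid[i + 1][j]
            + (1 - tx) * ty * grid[i][j + 1] + tx * ty * grid[i + 1][j + 1])

# Invented example map: efficiency over engine speed (rpm) and torque (Nm).
speed = [1000, 2000, 3000]
torque = [50, 100, 150]
eff = [[0.25, 0.30, 0.28],
       [0.30, 0.36, 0.33],
       [0.28, 0.34, 0.31]]
```

An operating point exactly on a grid node returns the stored value; points between nodes are blended from the four surrounding cells, which is where the coarseness of the digitised map introduces error.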
BACKGROUND
Metabolic control and dietary management of patients with phenylketonuria (PKU) are based on single blood samples obtained at variable intervals. Sampling conditions are often not well-specified and intermittent variation of phenylalanine concentrations between two measurements remains unknown. We determined phenylalanine and tyrosine concentrations in blood over 24 hours. Additionally, the impact of food intake and physical exercise on phenylalanine and tyrosine concentrations was examined. Subcutaneous microdialysis was evaluated as a tool for monitoring phenylalanine and tyrosine concentrations in PKU patients.
METHODS
Phenylalanine and tyrosine concentrations of eight adult patients with PKU were determined at 60 minute intervals in serum, dried blood and subcutaneous microdialysate and additionally every 30 minutes postprandially in subcutaneous microdialysate. During the study period of 24 hours individually tailored meals with defined phenylalanine and tyrosine contents were served at fixed times and 20 min bicycle-ergometry was performed.
RESULTS
Serum phenylalanine concentrations showed only minor variations, while tyrosine concentrations varied significantly more over the 24-hour period. Food intake within the patients' individual diet had no consistent effect on the mean phenylalanine concentration, but the tyrosine concentration increased by up to 300% in individual patients. The mean phenylalanine concentration remained stable after short-term bicycle exercise, whereas the mean tyrosine concentration declined significantly. Phenylalanine and tyrosine concentrations in dried blood were significantly lower than serum concentrations. No close correlation was found between serum and microdialysis fluid for phenylalanine and tyrosine concentrations.
CONCLUSIONS
Slight diurnal variation of phenylalanine concentrations in serum implies that a single blood sample does reliably reflect the metabolic control in this group of adult patients. Phenylalanine concentrations determined by subcutaneous microdialysis do not correlate with the patients' phenylalanine concentrations in serum/blood.
BACKGROUND
Propionic acidemia is an inherited disorder caused by deficiency of propionyl-CoA carboxylase. Although it is one of the most frequent organic acidurias, information on the outcome of affected individuals is still limited.
STUDY DESIGN/METHODS
Clinical and outcome data of 55 patients with propionic acidemia from 16 European metabolic centers were evaluated retrospectively. 35 patients were diagnosed by selective metabolic screening while 20 patients were identified by newborn screening. Endocrine parameters and bone age were evaluated. In addition, IQ testing was performed and the patients' and their families' quality of life was assessed.
RESULTS
The vast majority of patients (>85%) presented with metabolic decompensation in the neonatal period; asymptomatic individuals were the exception. About three quarters of the study population were mentally retarded, with a median IQ of 55. Apart from neurologic symptoms, complications comprised hematologic abnormalities, cardiac diseases, feeding problems and impaired growth. Most patients considered their quality of life high; however, according to the parents' point of view, psychological problems were four times more common in propionic acidemia patients than in healthy controls.
CONCLUSION
Our data show that the outcome of propionic acidemia is still unfavourable, in spite of improved clinical management. Many patients develop long-term complications affecting different organ systems. Impairment of neurocognitive development is of special concern. Nevertheless, self-assessment of quality of life of the patients and their parents yielded rather positive results.
Ornithine transcarbamylase (OTC) deficiency is the most common urea cycle defect. The clinical presentation in female manifesting carriers varies both in onset and severity. We report on a female with insulin dependent diabetes mellitus and recurrent episodes of hyperammonemia. Since OTC activity measured in a liver biopsy sample was within normal limits, OTC deficiency was initially excluded from the differential diagnoses of hyperammonemia. Due to moderately elevated homocitrulline excretion, hyperornithinemia-hyperammonemia-homocitrullinuria syndrome was suggested, but further assays in fibroblasts showed normal ornithine utilization. Later, when mutation analysis of the OTC gene became available, a known pathogenic missense mutation (c.533C>T) in exon 5, leading to an exchange of threonine-178 by methionine (p.Thr178Met), was detected. Skewed X-inactivation was demonstrated in leukocyte DNA. In the further clinical course, the girl developed marked obesity. By initiating physical activities twice a week, therapeutic control of both diabetes and OTC deficiency improved, but obesity persisted. In conclusion, our case confirms that normal hepatic OTC enzyme activity measured in a single liver biopsy sample does not exclude a clinically relevant mosaic of OTC deficiency because of skewed X-inactivation. Mutation analysis of the OTC gene in whole blood may be a simple way to establish the diagnosis of OTC deficiency. The joint occurrence of OTC deficiency and diabetes in a patient has not been reported before.
Die Vorstandsperspektive
(2013)
Simulations within virtual environments usually require underlying semantics. For traffic simulations, defined traffic networks are generally used. These networks are mostly created by hand, which makes the process error-prone and time-consuming. This project was carried out within the AVeSi project, which researches the development of a realistic traffic simulation for virtual environments. The simulation approach pursued in the project is based on two levels of complexity, a microscopic and a mesoscopic one. To realise a transition between the simulation levels, the traffic networks have to be linked, which is likewise very time-consuming. This report presents models for traffic networks on both levels. It then describes an approach that enables the automatic generation and linking of traffic networks for both models. Data in the OpenDRIVE® format serves as the basis for network generation. For evaluation, realistic OpenStreetMap data was converted into OpenDRIVE® data sets using third-party software. It was shown that the approach makes it possible to generate large traffic networks within a few minutes, on which simulations can be run immediately. However, the quality of the networks generated for the evaluation is not sufficient for environments that demand a high degree of realism, making an additional post-processing step necessary. The quality problems could be traced back to the fact that the level of detail of the OpenStreetMap data underlying the evaluation data was not high enough and the conversion process was not sufficiently transparent.
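The report's actual data model is not reproduced here; purely as a hypothetical sketch, the snippet below illustrates the general idea of deriving a coarse (mesoscopic) road graph from fine-grained (microscopic) lane segments while keeping the cross-level mapping that a transition between the two simulation levels needs. All identifiers are invented.

```python
# Hypothetical sketch: aggregate microscopic lane segments into
# mesoscopic road edges and retain the level-linking information.
micro_segments = [
    # (segment_id, road_id, from_junction, to_junction)
    ("s1", "road_a", "j1", "j2"),
    ("s2", "road_a", "j1", "j2"),   # second lane of the same road
    ("s3", "road_b", "j2", "j3"),
]

def build_meso_network(segments):
    """One mesoscopic edge per (road, from, to) triple; each edge keeps
    the list of microscopic segments it aggregates."""
    meso_edges = {}
    for seg_id, road, src, dst in segments:
        meso_edges.setdefault((road, src, dst), []).append(seg_id)
    return meso_edges

meso = build_meso_network(micro_segments)
# meso[("road_a", "j1", "j2")] lists both lanes, so a vehicle leaving the
# mesoscopic level can be re-inserted into a concrete microscopic lane.
```

The same aggregation run in reverse (edge to segment list) is what makes the micro/meso transition cheap once the networks are generated together.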
Traffic simulations are generally used to forecast traffic behavior or to simulate non-player characters in computer games and virtual environments. These systems are usually modeled in such a way that traffic rules are strictly followed. However, rule violations are a common part of real-life traffic and should therefore be integrated into such models.
Real-Time Simulation of Camera Errors and Their Effect on Some Basic Robotic Vision Algorithms
(2013)
Computers will soon be powerful enough to simulate consciousness, and the artificial life community should start trying to understand how consciousness could be simulated. The proposal is to build an artificial life system in which consciousness might be able to evolve: an internet-wide artificial universe in which agents can evolve. Users play games by defining agents that form communities. The communities have to perform tasks, or compete, or whatever the specific game demands; the demands should be such that agents that are more aware of their universe are more likely to succeed. The agents reproduce and evolve within their user's machine, but can also sometimes transfer to other machines across the internet. Users will be able to choose the capabilities of their agents from a fixed list, but may also write their own powers for their agents.
BACKGROUND
Hyperlysinemia is an autosomal recessive inborn error of L-lysine degradation. To date only one causal mutation in the AASS gene encoding α-aminoadipic semialdehyde synthase has been reported. We aimed to better define the genetic basis of hyperlysinemia.
METHODS
We collected the clinical, biochemical and molecular data in a cohort of 8 hyperlysinemia patients with distinct neurological features.
RESULTS
We found novel causal mutations in AASS in all affected individuals, including 4 missense mutations, 2 deletions and 1 duplication. In two patients originating from one family, the hyperlysinemia was caused by a contiguous gene deletion syndrome affecting AASS and PTPRZ1.
CONCLUSIONS
Hyperlysinemia is caused by mutations in AASS. As hyperlysinemia is generally considered a benign metabolic variant, the more severe neurological disease course in two patients with a contiguous deletion syndrome may be explained by the additional loss of PTPRZ1. Our findings illustrate the importance of detailed biochemical and genetic studies in any hyperlysinemia patient.
This work extends the affordance-inspired robot control architecture introduced in the MACS project [35], and especially its approach to integrating symbolic planning systems given in [24], by providing methods for the automated abstraction of affordances to high-level operators. It discusses how symbolic planning instances can be generated automatically based on these operators and introduces an instantiation method to execute the resulting plans. Preconditions and effects of agent behaviour are learned and represented in Gärdenfors' conceptual spaces framework, whose notion of similarity is used to group behaviours into abstract operators based on the affordance-inspired, function-centred view of the environment. Ways in which the capability of conceptual spaces to map subsymbolic to symbolic representations can be used to generate PDDL planning domains, including affordance-based operators, are discussed. During plan execution, affordance-based operators are instantiated by agent behaviour based on the situation directly before execution: the current situation is compared to past ones, and the behaviour that was most successful in the past is applied. Execution failures can be repaired by action substitution. The concept of using contexts to dynamically change dimension salience, as introduced by Gärdenfors, is realized with techniques from the field of feature selection. The approach is evaluated using a 3D simulation environment and implementations of several object manipulation behaviours.
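As a rough, hypothetical illustration of grouping behaviours by similarity in a conceptual space, the sketch below represents each behaviour's observed effect as a point in a weighted Euclidean space and greedily merges behaviours whose effects fall within a threshold. The effect vectors and salience weights are invented; the weights stand in for Gärdenfors' context-dependent dimension salience.

```python
import math

def distance(p, q, salience):
    """Weighted Euclidean distance; the salience weights model the
    context-dependent importance of each quality dimension."""
    return math.sqrt(sum(w * (a - b) ** 2 for a, b, w in zip(p, q, salience)))

def group_behaviours(effects, salience, threshold):
    """Greedy grouping: behaviours whose effect vectors lie closer than
    `threshold` to a group's prototype share one abstract operator."""
    groups = []
    for name, effect in effects.items():
        for group in groups:
            if distance(effect, group["proto"], salience) < threshold:
                group["members"].append(name)
                break
        else:
            groups.append({"proto": effect, "members": [name]})
    return [g["members"] for g in groups]

# Invented effect vectors, e.g. (Δ distance-to-object, Δ gripper-state):
effects = {"push": (1.0, 0.0), "shove": (1.1, 0.0), "grasp": (0.0, 1.0)}
groups = group_behaviours(effects, salience=(1.0, 1.0), threshold=0.5)
```

With these invented values, "push" and "shove" collapse into one abstract operator because their effects nearly coincide, while "grasp" stays separate.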
Molecular modeling is an important subdomain in the field of computational modeling, regarding both scientific and industrial applications, because computer simulations on a molecular level are a valuable instrument to study the impact of microscopic on macroscopic phenomena. Accurate molecular models are indispensable for such simulations in order to predict physical target observables, like density, pressure, diffusion coefficients or energetic properties, quantitatively over a wide range of temperatures. Molecular interactions are described mathematically by force fields, whose parameters cover both intramolecular and intermolecular interactions. While intramolecular force field parameters can be determined by quantum mechanics, the parameterization of the intermolecular part is often tedious. Recently, an empirical procedure based on the minimization of a loss function between simulated and experimental physical properties was published by the authors, using efficient gradient-based numerical optimization algorithms. However, empirical force field optimization is inhibited by two central issues in molecular simulations: firstly, they are extremely time-consuming, even on modern high-performance computer clusters, and secondly, simulation data is affected by statistical noise. The latter means that an accurate computation of gradients or Hessians is nearly impossible close to a local or global minimum, mainly because the loss function is flat there. Therefore, the question arises of whether to apply a derivative-free method approximating the loss function by an appropriate model function. In this paper, a new Sparse Grid-based Optimization Workflow (SpaGrOW) is presented, which accomplishes this task robustly and, at the same time, keeps the number of time-consuming simulations relatively small.
This is achieved by an efficient sampling procedure for the approximation based on sparse grids, which is described in full detail: in order to counteract the fact that sparse grids are fully occupied on their boundaries, a mathematical transformation is applied to generate homogeneous Dirichlet boundary conditions. As the main drawback of sparse grid methods is the assumption that the function to be modeled exhibits certain smoothness properties, it has to be approximated by smooth functions first. Radial basis functions turned out to be very suitable for this task. The smoothing procedure and the subsequent interpolation on sparse grids are performed within sufficiently large compact trust regions of the parameter space. It is shown and explained how the combination of these three ingredients leads to a new efficient derivative-free algorithm, which has the additional advantage of reducing the overall number of simulations by a factor of about two in comparison to gradient-based optimization methods, while maintaining robustness with respect to statistical noise. This assertion is proven by both theoretical considerations and practical evaluations for molecular simulations on chemical example substances.
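The actual SpaGrOW workflow is considerably more involved; the sketch below only illustrates the smoothing idea in one dimension. Noisy loss evaluations near a flat minimum are replaced by a Gaussian-kernel weighted average before any surface would be interpolated. The kernel choice, its width and the sample values are invented for the example.

```python
import math

def smooth(xs, ys, width):
    """Replace each noisy loss value by a Gaussian-weighted average of all
    samples (Nadaraya-Watson estimate); a simple stand-in for the radial
    basis function smoothing step described above."""
    out = []
    for x in xs:
        weights = [math.exp(-((x - xi) / width) ** 2) for xi in xs]
        total = sum(weights)
        out.append(sum(w * y for w, y in zip(weights, ys)) / total)
    return out

# Invented noisy evaluations of a flat loss near a minimum:
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [1.02, 0.97, 1.01, 0.98, 1.03]
smoothed = smooth(xs, ys, width=1.0)
```

Because each smoothed value is a convex combination of the samples, the noise-induced spread shrinks, which is exactly what makes a subsequent model-based (derivative-free) step feasible where raw gradients would be dominated by noise.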
The device (10) has a handrail (18) provided with an optical, contactless monitoring device designed as an active sensor system. The monitoring device is arranged in the region of a guide (14) of the handrail at a front base (16) of an escalator (12) or a moving walkway. It has two transmission paths (28, 30) with mutually different wavelength bands, one of which includes the handrail. The ratio of, or difference between, the signals of the two paths is used to detect foreign bodies, e.g. the hands of adults and children.
In software development, the "always beta" principle is used to successfully develop innovations based on early and continuous user feedback. In this paper we discuss how this principle can be adapted to the special needs of designing for the smart home, where we do not just take care of the software but also release hardware components. In particular, because of the "materiality" of the smart home, one cannot simply make a beta version available on the web; an essential part of the development process is also to visit the "beta" users in their homes, to build trust, face real-world issues and provide assistance in making the smart home work for them. After presenting our case study, we discuss the challenges we faced and how we dealt with them.
The simulation of fluid flows is of importance to many fields of application, especially in industry and infrastructure. The modelling equations applied describe a coupled system of non-linear, hyperbolic partial differential equations, the one-dimensional shallow water equations, which enable the consistent treatment of free-surface flows in open channels as well as pressurised flows in closed pipes. The numerical realisation of these equations remains complicated and challenging because of their characteristic properties, which can give rise to discontinuous solutions.
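In a standard conservative textbook form (the paper's exact formulation, e.g. its source terms or the pressurised-pipe extension, may differ), the one-dimensional shallow water equations read:

```latex
\partial_t h + \partial_x (h u) = 0, \qquad
\partial_t (h u) + \partial_x\!\left( h u^2 + \tfrac{1}{2} g h^2 \right)
  = -\, g h \,\partial_x z_b ,
```

where $h$ is the water depth, $u$ the depth-averaged velocity, $g$ the gravitational acceleration and $z_b$ the bed elevation. The non-linear flux term $h u^2 + \tfrac{1}{2} g h^2$ is what makes the system hyperbolic and allows discontinuous (shock-type) solutions such as hydraulic jumps to develop from smooth initial data.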
The Java Virtual Machine (JVM) executes the compiled bytecode version of a Java program and acts as a layer between the program and the operating system. The JVM provides additional features such as process, thread, and memory management to manage the execution of these programs. Garbage collection (GC) is part of the memory management and has an impact on the overall runtime performance because it is responsible for removing dead objects from the heap. Currently, the execution of a program needs to be halted during every GC run. The problem with this stop-the-world approach is that all threads in the JVM need to be suspended. It would be desirable to have a thread-local GC that only blocks the current thread and does not affect any other threads. In particular, this would improve the execution of multi-threaded Java programs. An object that is accessible by more than one thread is said to have escaped. It is not possible to determine thread-locally whether escaped objects are still alive, so they cannot be handled by a thread-local GC. To gain significant performance improvements with a thread-local GC, it is therefore necessary to determine whether it is possible to reliably predict if a given object will escape. Experimental results show that the escaping of objects can be predicted with high accuracy based on the line of code the object was allocated from. A thread-local GC was developed to minimize the number of stop-the-world GCs. The prototype implementation delivers a proof of concept showing that this goal can be achieved in certain scenarios.
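The thesis prototype works inside the JVM itself; purely as a language-agnostic sketch of the prediction idea, the snippet below tracks, per allocation site, how often objects allocated there escaped, and predicts future allocations from that history. The site key (file:line), the threshold and the conservative default for unknown sites are invented for this illustration.

```python
# Hypothetical sketch: predict whether an object will escape its thread
# based on the escape history of its allocation site.

class EscapePredictor:
    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.stats = {}  # site -> [escaped_count, total_count]

    def record(self, site, escaped):
        """Update the history of one allocation site with an observation."""
        s = self.stats.setdefault(site, [0, 0])
        if escaped:
            s[0] += 1
        s[1] += 1

    def predicts_escape(self, site):
        """Unknown sites are conservatively treated as escaping, so a
        thread-local GC never has to handle a mispredicted shared object."""
        s = self.stats.get(site)
        if s is None:
            return True
        return s[0] / s[1] >= self.threshold

# Invented usage: one site escapes 25% of the time -> predicted thread-local.
predictor = EscapePredictor()
for escaped in (False, False, False, True):
    predictor.record("Example.java:42", escaped)
```

Objects predicted as thread-local can then be collected without suspending other threads, while predicted-escaping objects fall back to the regular stop-the-world collector.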