Information reliability and automatic computation are two important aspects that are continuously pushing the Web to be more semantic. Information uploaded to the Web should be reusable and automatically extractable by other applications and platforms. Several tools exist to explicitly mark up Web content. Web services can also play a positive role in the automatic processing of Web content, especially when they act as flexible and agile agents. However, Web services themselves should be developed with semantics in mind: they should include and provide structured information to facilitate their use, reuse, composition and querying. In this chapter, the authors focus on evaluating state-of-the-art semantic aspects and approaches in Web services. Ultimately, this contributes to the goal of Web knowledge management, execution, and transfer.
More than 25 years ago, it was a big surprise for physiologists that nitric oxide (NO) was identified as the endothelium-derived relaxing factor which is responsible for endothelium-induced smooth muscle relaxation (Ignarro et al., 1987). Until then, small gaseous molecules were simply regarded as byproducts of cellular metabolism which were unlikely to be of any physiological relevance. The discovery that NO was synthesized by specific enzymes (NO-synthases), upon stimulation by specific, physiologically relevant stimuli (e.g., acetylcholine stimulation of endothelial cells), as well as the fact that it acted on specific cellular targets (e.g., soluble guanylate cyclase), set the course for numerous studies which investigated the physiological roles of gaseous signaling molecules, in other words, gasotransmitters (Wang, 2002).
BWL für Dummies
(2013)
Updating a shared data structure in a parallel program is usually done with some sort of high-level synchronization operation to ensure correctness and consistency. However, the underlying synchronization instructions of a processor architecture are costly and rather limited in their scalability on larger multi-core/multi-processor systems. In this paper, we examine work queue operations where such costly atomic update operations are replaced with non-atomic modifiers (simple read+write). In this approach, we trade the exact amount of work with atomic operations against doing more and redundant work, but without atomic operations and without violating the correctness of the algorithm. We show results for the application of this idea to the concrete scenario of parallel Breadth First Search (BFS) algorithms for undirected graphs on two large NUMA shared-memory systems with up to 64 cores.
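A minimal Java sketch of the trade-off described above (illustrative only, not the paper's implementation): marking a BFS vertex as visited can use a compare-and-set, or a plain read+write that tolerates duplicates. With the latter, a vertex may be enqueued more than once, costing redundant work but never correctness, since processing a vertex twice in BFS is idempotent.

    import java.util.concurrent.atomic.AtomicIntegerArray;

    class BfsMarking {
        // Atomic variant: each vertex is claimed exactly once, but the
        // compare-and-set is costly and scales poorly on large NUMA systems.
        static boolean tryClaimAtomic(AtomicIntegerArray visited, int v) {
            return visited.compareAndSet(v, 0, 1);
        }

        // Non-atomic variant: plain read+write. Two threads may both read 0
        // and both claim v, so v is processed redundantly; the algorithm
        // stays correct, only the amount of work grows.
        static boolean tryClaimRelaxed(int[] visited, int v) {
            if (visited[v] == 0) {
                visited[v] = 1;
                return true;
            }
            return false;
        }
    }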
Universities are grappling with the question of how courses and research projects can be supported and complemented by the use of digital tools and learning platforms. Libraries often play a key role here, in quite different ways: as content developers and providers, as a support desk, or as an e-learning advisory service for teachers and students.
Although most individuals who gamble do so without any adverse consequences, some individuals develop a recurrent, maladaptive pattern of gambling behaviour, often called pathological gambling or gambling disorder, that is associated with financial losses, disruption of family and interpersonal relationships, and co-occurring psychiatric disorders. Identifying whether different types of gambling modalities vary in their ability to lead to maladaptive patterns of gambling behaviour is essential to develop public policies that seek to balance access to gambling opportunities with minimizing risk for the potential adverse consequences of gambling behaviour. Until recently, assessing the risk potential of different types of gambling products was nearly impossible. ASTERIG, initially developed in Germany in 2006-2010, is an assessment tool to measure and evaluate the risk potential of any gambling product based on scores on ten dimensions. In doing so, it also allows a comparison to be drawn between the addictive potential of different gambling products, and it highlights where the specific risk potential of each product lies. This makes it a valuable tool at the legislative, case law, and administrative levels, as it allows the risk potential of individual gambling products to be identified and compared globally and across ten different dimensions of risk potential. We note that specific gambling products should always be evaluated rather than product groups (lotteries, slot machines) or providers, as there may be variations within those product groups that impact their risk potential. For example, slot machines may vary in the size of the jackpot, which may influence their risk potential.
Switched power electronic subsystems are widely used in various applications. A fault in one of their components may have a significant effect on the system's load or may even cause damage. Therefore, it is important to detect and isolate faults and to report true faults to a supervisory system in order to avoid malfunction of, or damage to, a load. If, in a model-based approach to fault detection and isolation of hybrid systems, switching devices are considered as ideal switches, then some equations must be reformulated whenever some devices have switched. In this paper, a fixed-causality bond graph representation of hybrid system models is used, i.e., computational causalities assigned according to the Standard Causality Assignment Procedure (SCAP) are independent of system modes of operation. The latter are taken into account by transformer moduli m_i(t) ∈ {0, 1} ∀t ≥ 0 in a unique set of equations of motion. In a case study, this approach is used for fault diagnosis in a three-phase full-wave rectifier. Residuals of Analytical Redundancy Relations (ARRs), holding for all modes of operation and serving as fault indicators, are computed in an offline simulation as part of a DAE system by using a bond graph model of the faulty system instead of the real one and by coupling it to a bond graph of the healthy system by means of residual sinks.
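As a hedged illustration of the fixed-causality idea (a toy residual, not one of the paper's ARRs): an ideal switch enters the equations as a transformer modulus m(t) ∈ {0, 1}, so a single residual expression can serve as a fault indicator in every mode of operation.

    class ArrResidual {
        // One expression covers both switch states: m = 1 (closed), m = 0 (open).
        // A residual near zero means the measurement is consistent with the
        // healthy model in the current mode; a persistent deviation points
        // to a fault. (Toy placeholder for an Analytical Redundancy Relation.)
        static double residual(double measured, double modelPrediction, int m) {
            return measured - m * modelPrediction;
        }
    }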
Increased endothelin-1 decreases PKC alpha (PKCα), resulting in high miRNA 15a levels in kidney tumors. Breast cancer cells treated with ET-1, β-estrogen, Tamoxifen, Tamoxifen + β-estrogen and Tamoxifen + ET-1 were analysed for miRNA 15a expression. Significantly increased miRNA 15a levels were found after ET-1 treatment, and levels increased further in Tamoxifen + ET-1 treated cells. Our group has already shown that miRNA 15a induces MAPK p38 splicing, resulting in a truncated product called Mxi-2, whose function in tumors has yet to be defined. We describe for the first time in ET-1 induced tumor cells that Mxi-2 forms a complex with Ago2, a miRNA-binding protein that is important for the localization of miRNAs to the 3′UTR of target genes. Furthermore, we show that Mxi-2/Ago2 is important for the interaction with miRNA 1285, which binds to the 3′ end of the tumor suppressor gene p53 and is responsible for the downregulation of p53. Tissue arrays from breast cancer patients were analysed for Mxi-2, p53 and PKCα. Since Mxi-2 levels increase in Tamoxifen + ET-1 treated cells, we claim that increasing ET-1 levels in Tamoxifen-treated breast cancer patients are responsible for decreasing p53 levels. In summary, ET-1 decreases nuclear PKCα levels while increasing the amount of miRNA 15a. This causes high levels of Mxi-2, necessary for complex formation with Ago2. The newly identified Mxi-2/Ago2 complex interacting with miRNA 1285 leads to increased 3′UTR p53 interaction, resulting in decreased p53 levels and subsequent tumor progression. This newly identified mechanism is a possible explanation for the development of ET-1 induced tumors.
This paper examines how students learn to collaborate in English by participating in an intercultural project that focuses on teaching students to work together on a digital writing project using various online tools. Mixed groups of students, two French and two German, used several synchronous and asynchronous tools to communicate with their counterparts (Facebook, a WordPress blog, the WIMS e-learning platform, email, videoconferencing). Students had to produce an article together, comparing French and German attitudes towards a topic they negotiated freely in their groups. Before publishing their post, students were expected to peer-review the article written by their group. Once the posts were published, the next stage consisted of voting for the best posts on the e-learning platform WIMS. A videoconference was also organized to create cohesion between the participants. The results of the student evaluations are presented, together with the administrative and technical implications of the vastly differing university setups.
The BRICS component model: a model-based development paradigm for complex robotics software systems
(2013)
Internet-Ökonomie
(2013)
Angewandte Makroökonomie
(2013)
Macroeconomic events such as the debt crisis, recession, unemployment and inflation not only have consequences for the economy as a whole but also touch everyday life in many ways. These events are often complex and not always easy for the individual to see through. To prepare students for the global challenges of the economy, society and the environment, this textbook explicitly integrates the topic of sustainable development. In addition, the major topics of macroeconomics are partly treated in bundled form in order to make the manifold interrelations between the individual areas more transparent. For students and teachers this has the advantage, among others, that modular use is possible.
Reactive oxygen species and the bacteriostatic and bactericidal effects of isoconazole nitrate
(2013)
Controlling
(2013)
Goal-oriented management and increased transparency are among the central tasks of management in companies and in non-commercial organisations. Against the background of the increasing environmental dynamics and high complexity in which organisations operate today, modern, goal-oriented controlling is becoming ever more important for fulfilling these demanding management tasks.
Earth's nearest candidate supermassive black hole lies at the centre of the Milky Way [1]. Its electromagnetic emission is thought to be powered by radiatively inefficient accretion of gas from its environment [2], which is a standard mode of energy supply for most galactic nuclei. X-ray measurements have already resolved a tenuous hot gas component from which the black hole can be fed [3]. The magnetization of the gas, however, which is a crucial parameter determining the structure of the accretion flow, remains unknown. Strong magnetic fields can influence the dynamics of accretion, remove angular momentum from the infalling gas [4], expel matter through relativistic jets [5] and lead to synchrotron emission such as that previously observed [6, 7, 8]. Here we report multi-frequency radio measurements of a newly discovered pulsar close to the Galactic Centre [9, 10, 11, 12] and show that the pulsar's unusually large Faraday rotation (the rotation of the plane of polarization of the emission in the presence of an external magnetic field) indicates that there is a dynamically important magnetic field near the black hole. If this field is accreted down to the event horizon it provides enough magnetic flux to explain the observed emission (from radio to X-ray wavelengths) from the black hole.
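For context, the standard relation behind this measurement (a textbook formula, not specific to this paper): the polarization angle rotates with the square of the wavelength, \Delta\chi = \mathrm{RM}\,\lambda^2, where the rotation measure

    \mathrm{RM} = 0.81 \int \left(\frac{n_e}{\mathrm{cm}^{-3}}\right) \left(\frac{B_\parallel}{\mu\mathrm{G}}\right) \left(\frac{\mathrm{d}l}{\mathrm{pc}}\right) \ \mathrm{rad\,m^{-2}}

integrates the electron density n_e and the line-of-sight magnetic field B_∥ along the path. With an independent estimate of n_e, an unusually large RM therefore translates directly into a large magnetic field near the source.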
Radio pulsars in relativistic binary systems are unique tools to study the curved space-time around massive compact objects. The discovery of a pulsar closely orbiting the super-massive black hole at the centre of our Galaxy, Sgr A⋆, would provide a superb test-bed for gravitational physics. To date, the absence of any radio pulsar discoveries within a few arc minutes of Sgr A⋆ has been explained by one principal factor: extreme scattering of radio waves caused by inhomogeneities in the ionized component of the interstellar medium in the central 100 pc around Sgr A⋆. Scattering, which causes temporal broadening of pulses, can only be mitigated by observing at higher frequencies. Here we describe recent searches of the Galactic centre region performed at a frequency of 18.95 GHz with the Effelsberg radio telescope.
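For orientation (a standard scaling for interstellar scattering, not a result of this paper): the temporal broadening of pulses falls steeply with observing frequency, approximately as

    \tau_{\mathrm{scatt}} \propto \nu^{-4}

(up to \nu^{-4.4} for a Kolmogorov spectrum of density inhomogeneities), which is why such searches are carried out at frequencies as high as 18.95 GHz.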
Produktionswirtschaft
(2013)
We derive rates of convergence for limit theorems that reveal the intricate structure of the phase transitions in a mean-field version of the Blume-Emery-Griffiths model. The theorems consist of scaling limits for the total spin. The model depends on the inverse temperature β and the interaction strength K. The rates of convergence are obtained as (β, K) converges along appropriate sequences (β_n, K_n) to points belonging to various subsets of the phase diagram, which include a curve of second-order points and a tricritical point. We apply Stein's method for normal and non-normal approximation, avoiding the use of transforms and supplying bounds, such as those of Berry-Esseen quality, on the approximation error. We observe an additional phase transition phenomenon: depending on how fast K_n and β_n converge to points in the various subsets of the phase diagram, different rates of convergence to one and the same limiting distribution occur.
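For reference, a classical bound "of Berry-Esseen quality" for a standardized sum W_n of i.i.d. variables with variance σ² and third absolute moment ρ (a general fact, not the paper's theorem) reads

    \sup_{x \in \mathbb{R}} \left| P(W_n \le x) - \Phi(x) \right| \le \frac{C\,\rho}{\sigma^3 \sqrt{n}},

where Φ is the standard normal distribution function; Stein's method yields bounds of this type directly, including for the non-normal limits arising at the second-order curve and the tricritical point.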
Benchmarking
(2013)
Die Klausur
(2013)
The problems are part of an examination set in the summer semester of 2013 by Prof. Dr. Andreas Gadatsch (Hochschule Bonn-Rhein-Sieg) in the examination subject IT-Controlling for business students of the Master's programme Informations- und Innovationsmanagement. Working time for the entire examination: 90 minutes.
Rising cost pressure and the need to improve economic efficiency have become reality for employees in the healthcare sector. Many institutions have to optimise their workflows while at the same time improving the quality of their services. The book explains the fundamentals and selected methods of process management that can be applied in healthcare institutions (hospitals, health insurance funds and health insurers).
Big data refers to the real-time processing of very large volumes of data for analytical tasks. New technologies and methods make it possible to carry out previously unused analytical tasks in a very short time. This article shows how big data fits into the business management context, which opportunities are open to companies, and what consequences this will have for controllers.
Cloud services are regarded as a promising concept for the provision of IT services and are increasingly displacing classical IT outsourcing models. However, many companies still fall back on traditional provisioning concepts (e.g. operating their own data centre, classical outsourcing). One reason for this is a lack of trust in the "cloud". Customers are generally unable to assess the potential risks and the provider's security level, since they usually have no direct insight into the service provider's processes and structures (Winkelmann, 2010).
The further spread of cloud services depends strongly on whether customers have sufficient trust in the IT service providers, the technical infrastructure and the entire organisational and legal environment (e.g. the applicable legal protection and the possibilities for claims and lawsuits). One way to improve this situation, in the sense of a decision aid, is to use a neutral "certificate for the IT security of cloud services". In the context of building further trust in the "cloud", considerable importance must be attached to the independent certification of cloud services.
This white paper describes typical categories of cloud services, addresses the great importance of trust in the cloud, and discusses the requirements for a "trustworthy certification of cloud services" (trusted cloud services).
Finally, a market-ready solution for the certification of cloud services is presented that can fulfil the requirements described above.
IT-gestütztes IT-Controlling
(2013)
Greenpocket GmbH, in cooperation with Hochschule Bonn-Rhein-Sieg, conducted a study on smart homes. As the online survey shows, the willingness to pay for smart home solutions among young, tech-savvy consumers, the digital natives, is higher than previously assumed. The online questionnaire comprised 32 questions in total, and parts of the survey were designed according to the Kano model. Among the findings: control via smartphone plays an important role, and overall, quality matters more than a low price. In the long term, a smart home should be able to automatically recognise and take account of its user's habits. Connectivity with social networks, by contrast, is viewed critically. Controlling household and electrical appliances depending on the electricity price is seen as desirable.
Power train models are required to simulate, and hence predict, the energy consumption of vehicles. Efficiencies for the different components in the power train are required. Common procedures use digitalised shell models (or maps) to model the efficiency of internal combustion engines (ICE) and manual gearboxes (MG). Errors are associated with these models and affect the accuracy of the calculation. The accuracy depends on the configuration of the simulation, the digitalisation of the data and the data used. This paper evaluates these sources of error. Understanding the sources of error can improve the results of the modelling by more than eight percent.
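A minimal Java sketch of such a map lookup (illustrative; the grid axes, units and class names are hypothetical): engine efficiency is stored on a speed-torque grid and read back by bilinear interpolation, so the density of the digitised data directly bounds the interpolation error discussed above.

    class EfficiencyMap {
        final double[] speed;   // grid axis, e.g. rpm (ascending)
        final double[] torque;  // grid axis, e.g. Nm (ascending)
        final double[][] eta;   // eta[i][j] = efficiency at (speed[i], torque[j])

        EfficiencyMap(double[] s, double[] t, double[][] e) { speed = s; torque = t; eta = e; }

        double lookup(double s, double t) {
            int i = cell(speed, s), j = cell(torque, t);
            double fs = (s - speed[i]) / (speed[i + 1] - speed[i]);
            double ft = (t - torque[j]) / (torque[j + 1] - torque[j]);
            // Bilinear interpolation between the four surrounding grid points.
            return (1 - fs) * ((1 - ft) * eta[i][j]     + ft * eta[i][j + 1])
                 +      fs  * ((1 - ft) * eta[i + 1][j] + ft * eta[i + 1][j + 1]);
        }

        private static int cell(double[] axis, double x) {
            for (int k = 0; k < axis.length - 2; k++)
                if (x < axis[k + 1]) return k;
            return axis.length - 2; // clamp to the last cell
        }
    }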
BACKGROUND
Metabolic control and dietary management of patients with phenylketonuria (PKU) are based on single blood samples obtained at variable intervals. Sampling conditions are often not well-specified and intermittent variation of phenylalanine concentrations between two measurements remains unknown. We determined phenylalanine and tyrosine concentrations in blood over 24 hours. Additionally, the impact of food intake and physical exercise on phenylalanine and tyrosine concentrations was examined. Subcutaneous microdialysis was evaluated as a tool for monitoring phenylalanine and tyrosine concentrations in PKU patients.
METHODS
Phenylalanine and tyrosine concentrations of eight adult patients with PKU were determined at 60 minute intervals in serum, dried blood and subcutaneous microdialysate and additionally every 30 minutes postprandially in subcutaneous microdialysate. During the study period of 24 hours individually tailored meals with defined phenylalanine and tyrosine contents were served at fixed times and 20 min bicycle-ergometry was performed.
RESULTS
Serum phenylalanine concentrations showed only minor variations while tyrosine concentrations varied significantly more over the 24-hour period. Food intake within the patients' individual diet had no consistent effect on the mean phenylalanine concentration but the tyrosine concentration increased by up to 300% individually. Mean phenylalanine concentration remained stable after short-term bicycle exercise whereas mean tyrosine concentration declined significantly. Phenylalanine and tyrosine concentrations in dried blood were significantly lower than serum concentrations. No close correlation was found between serum and microdialysis fluid for phenylalanine and tyrosine concentrations.
CONCLUSIONS
The slight diurnal variation of phenylalanine concentrations in serum implies that a single blood sample reliably reflects the metabolic control in this group of adult patients. Phenylalanine concentrations determined by subcutaneous microdialysis do not correlate with the patients' phenylalanine concentrations in serum/blood.
BACKGROUND
Propionic acidemia is an inherited disorder caused by deficiency of propionyl-CoA carboxylase. Although it is one of the most frequent organic acidurias, information on the outcome of affected individuals is still limited.
STUDY DESIGN/METHODS
Clinical and outcome data of 55 patients with propionic acidemia from 16 European metabolic centers were evaluated retrospectively. 35 patients were diagnosed by selective metabolic screening while 20 patients were identified by newborn screening. Endocrine parameters and bone age were evaluated. In addition, IQ testing was performed and the patients' and their families' quality of life was assessed.
RESULTS
The vast majority of patients (>85%) presented with metabolic decompensation in the neonatal period; asymptomatic individuals were the exception. About three quarters of the study population were mentally retarded, with a median IQ of 55. Apart from neurologic symptoms, complications comprised hematologic abnormalities, cardiac diseases, feeding problems and impaired growth. Most patients considered their quality of life high. However, according to the parents' point of view, psychological problems were four times more common in propionic acidemia patients than in healthy controls.
CONCLUSION
Our data show that the outcome of propionic acidemia is still unfavourable, in spite of improved clinical management. Many patients develop long-term complications affecting different organ systems. Impairment of neurocognitive development is of special concern. Nevertheless, self-assessment of quality of life of the patients and their parents yielded rather positive results.
Ornithine transcarbamylase (OTC) deficiency is the most common urea cycle defect. The clinical presentation in female manifesting carriers varies both in onset and severity. We report on a female with insulin-dependent diabetes mellitus and recurrent episodes of hyperammonemia. Since OTC activity measured in a liver biopsy sample was within normal limits, OTC deficiency was initially excluded from the differential diagnoses of hyperammonemia. Due to moderately elevated homocitrulline excretion, hyperornithinemia-hyperammonemia-homocitrullinuria syndrome was suggested, but further assays in fibroblasts showed normal ornithine utilization. Later, when mutation analysis of the OTC gene became available, a known pathogenic missense mutation (c.533C>T) in exon 5, leading to an exchange of threonine-178 by methionine (p.Thr178Met), was detected. Skewed X-inactivation was demonstrated in leukocyte DNA. In the further clinical course the girl developed marked obesity. By initiating physical activities twice a week, therapeutic control of both diabetes and OTC deficiency improved, but obesity persisted. In conclusion, our case confirms that normal hepatic OTC enzyme activity measured in a single liver biopsy sample does not exclude a clinically relevant mosaic of OTC deficiency caused by skewed X-inactivation. Mutation analysis of the OTC gene in whole blood may be a simple way to establish the diagnosis of OTC deficiency. The joint occurrence of OTC deficiency and diabetes in a patient has not been reported before.
Die Vorstandsperspektive
(2013)
Simulations within virtual environments usually require underlying semantics. For traffic simulations, defined road networks are generally used. These networks are mostly created by hand, which is error-prone and time-consuming. This project was carried out within the AVeSi project, which researches the development of a realistic traffic simulation for virtual environments. The simulation approach pursued in the project is based on two levels of complexity: a microscopic and a mesoscopic one. To realise transitions between the simulation levels, the road networks of both levels must be linked, which is likewise very time-consuming. This report presents network models for both levels. It then describes an approach that enables the automatic generation and linking of road networks for both models. Data in the OpenDRIVE® format serves as the basis for network generation. For evaluation, real-world OpenStreetMap data was converted into OpenDRIVE® data sets using third-party software. It could be shown that the approach makes it possible to generate, within a few minutes, large road networks on which simulations can be run immediately. However, the quality of the networks generated for the evaluation is not sufficient for environments demanding a high degree of realism, which makes an additional post-processing step necessary. The quality problems could be traced back to the fact that the level of detail of the OpenStreetMap data underlying the evaluation data was not high enough and that the conversion process is not sufficiently transparent.
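A hedged sketch of one possible linkage between the two network levels (the data model is hypothetical, not the report's): a mesoscopic edge aggregates the microscopic lane segments generated from the same OpenDRIVE® road, so that agents can switch simulation levels consistently at its end points.

    import java.util.List;

    // Hypothetical two-level linkage: one mesoscopic edge references the
    // microscopic lane segments it aggregates.
    record LaneSegment(long id, double lengthM) {}

    record MesoEdge(long id, List<LaneSegment> microLanes) {
        // The mesoscopic length is derived from the microscopic geometry,
        // keeping the two levels consistent by construction.
        double lengthM() {
            return microLanes.stream().mapToDouble(LaneSegment::lengthM).sum();
        }
    }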
Traffic simulations are generally used to forecast traffic behavior or to simulate non-player characters in computer games and virtual environments. These systems are usually modeled in such a way that traffic rules are strictly followed. However, rule violations are a common part of real-life traffic and should therefore be integrated into such models.
Real-Time Simulation of Camera Errors and Their Effect on Some Basic Robotic Vision Algorithms
(2013)
Computers will soon be powerful enough to simulate consciousness. The artificial life community should start to try to understand how consciousness could be simulated. The proposal is to build an artificial life system in which consciousness might be able to evolve. The idea is to develop an internet-wide artificial universe in which the agents can evolve. Users play games by defining agents that form communities. The communities have to perform tasks, or compete, or whatever the specific game demands. The demands should be such that agents that are more aware of their universe are more likely to succeed. The agents reproduce and evolve within their user's machine, but can also sometimes transfer to other machines across the internet. Users will be able to choose the capabilities of their agents from a fixed list, but may also write their own powers for their agents.
BACKGROUND
Hyperlysinemia is an autosomal recessive inborn error of L-lysine degradation. To date only one causal mutation in the AASS gene encoding α-aminoadipic semialdehyde synthase has been reported. We aimed to better define the genetic basis of hyperlysinemia.
METHODS
We collected the clinical, biochemical and molecular data in a cohort of 8 hyperlysinemia patients with distinct neurological features.
RESULTS
We found novel causal mutations in AASS in all affected individuals, including 4 missense mutations, 2 deletions and 1 duplication. In two patients originating from one family, the hyperlysinemia was caused by a contiguous gene deletion syndrome affecting AASS and PTPRZ1.
CONCLUSIONS
Hyperlysinemia is caused by mutations in AASS. As hyperlysinemia is generally considered a benign metabolic variant, the more severe neurological disease course in two patients with a contiguous deletion syndrome may be explained by the additional loss of PTPRZ1. Our findings illustrate the importance of detailed biochemical and genetic studies in any hyperlysinemia patient.
This work extends the affordance-inspired robot control architecture introduced in the MACS project [35], and especially its approach to integrating symbolic planning systems given in [24], by providing methods for the automated abstraction of affordances into high-level operators. It discusses how symbolic planning instances can be generated automatically based on these operators and introduces an instantiation method to execute the resulting plans. Preconditions and effects of agent behaviour are learned and represented in Gärdenfors' conceptual spaces framework. Its notion of similarity is used to group behaviours into abstract operators based on the affordance-inspired, function-centred view of the environment. It is discussed how the capability of conceptual spaces to map subsymbolic to symbolic representations can be used to generate PDDL planning domains that include affordance-based operators. During plan execution, affordance-based operators are instantiated by agent behaviour based on the situation directly before execution: the current situation is compared to past ones, and the behaviour that has been most successful in the past is applied. Execution failures can be repaired by action substitution. The concept of using contexts to dynamically change dimension salience, as introduced by Gärdenfors, is realized using techniques from the field of feature selection. The approach is evaluated using a 3D simulation environment and implementations of several object manipulation behaviours.
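A hedged Java sketch of the similarity notion involved (the class and parameter names are illustrative, not the thesis's implementation): in a conceptual space, similarity decays with a salience-weighted distance, and changing the context amounts to changing the per-dimension weights.

    class ConceptualSpace {
        // Salience-weighted Euclidean distance between two points; the
        // context supplies one non-negative weight per quality dimension.
        static double distance(double[] a, double[] b, double[] salience) {
            double d2 = 0.0;
            for (int i = 0; i < a.length; i++) {
                double diff = a[i] - b[i];
                d2 += salience[i] * diff * diff;
            }
            return Math.sqrt(d2);
        }

        // Gärdenfors-style similarity: exponentially decaying in distance,
        // with sensitivity parameter c > 0.
        static double similarity(double[] a, double[] b, double[] salience, double c) {
            return Math.exp(-c * distance(a, b, salience));
        }
    }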
Molecular modeling is an important subdomain in the field of computational modeling, regarding both scientific and industrial applications. This is because computer simulations on a molecular level are a valuable instrument to study the impact of microscopic on macroscopic phenomena. Accurate molecular models are indispensable for such simulations in order to predict physical target observables, like density, pressure, diffusion coefficients or energetic properties, quantitatively over a wide range of temperatures. Molecular interactions are described mathematically by force fields, whose parameters cover both intramolecular and intermolecular interactions. While intramolecular force field parameters can be determined by quantum mechanics, the parameterization of the intermolecular part is often tedious. Recently, an empirical procedure based on the minimization of a loss function between simulated and experimental physical properties was published by the authors, using efficient gradient-based numerical optimization algorithms. However, empirical force field optimization is hindered by the two following central issues appearing in molecular simulations: firstly, they are extremely time-consuming, even on modern high-performance computer clusters, and secondly, simulation data is affected by statistical noise. The latter means that an accurate computation of gradients or Hessians is nearly impossible close to a local or global minimum, mainly because the loss function is flat there. Therefore, the question arises of whether to apply a derivative-free method approximating the loss function by an appropriate model function. In this paper, a new Sparse Grid-based Optimization Workflow (SpaGrOW) is presented, which accomplishes this task robustly and, at the same time, keeps the number of time-consuming simulations relatively small. This is achieved by an efficient sampling procedure for the approximation based on sparse grids, which is described in full detail: in order to counteract the fact that sparse grids are fully occupied on their boundaries, a mathematical transformation is applied to generate homogeneous Dirichlet boundary conditions. As the main drawback of sparse grid methods is the assumption that the function to be modeled exhibits certain smoothness properties, it has to be approximated by smooth functions first; radial basis functions turned out to be very suitable for this task. The smoothing procedure and the subsequent interpolation on sparse grids are performed within sufficiently large compact trust regions of the parameter space. It is shown and explained how the combination of these three ingredients leads to a new efficient derivative-free algorithm, which has the additional advantage that it is capable of reducing the overall number of simulations by a factor of about two in comparison to gradient-based optimization methods, while maintaining robustness with respect to statistical noise. This assertion is proven by both theoretical considerations and practical evaluations for molecular simulations of example chemical substances.
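As a reference formula (standard radial basis function interpolation, not SpaGrOW-specific notation): noisy loss values f(x_i) at the sample points x_i are replaced by a smooth surrogate

    s(x) = \sum_{i=1}^{N} \lambda_i \, \varphi\!\left(\lVert x - x_i \rVert\right), \qquad \varphi(r) = e^{-(\varepsilon r)^2},

where the coefficients λ_i solve the linear system A λ = f with A_{ij} = φ(‖x_i − x_j‖). The smooth surrogate s can then be interpolated on a sparse grid within the trust region, where the required smoothness assumptions hold.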
The device (10) has a handrail (18) provided with an optical, contactless monitoring device formed as an active sensor system, where the monitoring device is arranged in the region of a guide (14) of the handrail at a front base (16) of an escalator (12) or a moving pavement. The monitoring device has two transmission paths (28, 30) with wavelength bands that differ from each other, where one of the paths includes the handrail. The ratio of or difference between the signals of the two paths is used to recognize foreign bodies, e.g. the hands of adults and children.
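A minimal sketch of the two-band detection idea (thresholds and names are hypothetical, not from the patent): an object on the handrail attenuates the two wavelength bands differently, so the channel ratio drifting out of a calibrated band signals a foreign body.

    class DualBandDetector {
        final double ratioLow, ratioHigh; // calibrated bounds for the empty handrail

        DualBandDetector(double low, double high) { ratioLow = low; ratioHigh = high; }

        // signalA, signalB: received intensities of the two wavelength bands.
        boolean foreignBodyDetected(double signalA, double signalB) {
            double ratio = signalA / signalB;
            return ratio < ratioLow || ratio > ratioHigh;
        }
    }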
In software development, the "always beta" principle is used to successfully develop innovations based on early and continuous user feedback. In this paper we discuss how this principle could be adapted to the special needs of designing for the Smart Home, where we do not just take care of the software but also release hardware components. In particular, because of the 'materiality' of the Smart Home, one cannot simply make a beta version available on the web; an essential part of the development process is also to visit the 'beta' users in their homes, to build trust, to face the real-world issues, and to provide assistance to make the Smart Home work for them. After presenting our case study, we discuss the challenges we faced and how we dealt with them.
The simulation of fluid flows is of importance to many fields of application, especially in industry and infrastructure. The modelling equations applied describe a coupled system of non-linear, hyperbolic partial differential equations, the one-dimensional shallow water equations, which enable a consistent treatment of free-surface flows in open channels as well as pressurised flows in closed pipes. The numerical realisation of these equations remains complicated and challenging due to their characteristic properties, which can give rise to discontinuous solutions.
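For reference, the one-dimensional shallow water equations in conservative form (standard formulation; bed slope and friction source terms omitted):

    \partial_t h + \partial_x (h u) = 0, \qquad \partial_t (h u) + \partial_x\!\left( h u^2 + \tfrac{1}{2} g h^2 \right) = 0,

with water depth h, velocity u and gravitational acceleration g. The hyperbolic character of this system is precisely what admits the discontinuous (shock-type) solutions mentioned above.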
The Java Virtual Machine (JVM) executes the compiled bytecode version of a Java program and acts as a layer between the program and the operating system. The JVM provides additional features such as Process, Thread, and Memory Management to manage the execution of these programs. The Garbage Collection (GC) is part of the memory management and has an impact on the overall runtime performance because it is responsible for removing dead objects from the heap. Currently, the execution of a program needs to be halted during every GC run. The problem of this stop-the-world approach is that all threads in the JVM need to be suspended. It would be desirable to have a thread-local GC that only blocks the current thread and does not affect any other threads. In particular, this would improve the execution of multi-threaded Java programs. An object that is accessible by more than one thread is called escaped. It is not possible to thread-locally determine if escaped objects are still alive so that they cannot be handled in a thread-local GC. To gain significant performance improvements with a thread-local GC, it is therefore necessary to determine if it is possible to reliably predict if a given object will escape. Experimental results show that the escaping of objects can be predicted with high accuracy based on the line of code the object was allocated from. A thread-local GC was developed to minimize the number of stop-the-world GCs. The prototype implementation delivers a proof-of-concept that shows that this goal can be achieved in certain scenarios.
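A hedged Java sketch of the allocation-site heuristic described above (a per-site counter model, not the thesis's JVM implementation): each allocation site, i.e. the line of code an object was allocated from, records how often its objects escaped, and new objects from a site with a high historical escape rate are predicted to escape and excluded from thread-local collection.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    class EscapePredictor {
        private static final class SiteStats {
            long allocated, escaped; // approximate under concurrency; fine for a heuristic
        }
        private final Map<String, SiteStats> bySite = new ConcurrentHashMap<>();

        void recordAllocation(String site) { stats(site).allocated++; }
        void recordEscape(String site)     { stats(site).escaped++; }

        // Predict "will escape" if past objects from this allocation site
        // escaped more often than the given threshold rate.
        boolean predictsEscape(String site, double threshold) {
            SiteStats s = bySite.get(site);
            return s != null && s.allocated > 0
                && (double) s.escaped / s.allocated > threshold;
        }

        private SiteStats stats(String site) {
            return bySite.computeIfAbsent(site, k -> new SiteStats());
        }
    }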