Refine
Departments, institutes and facilities
- Fachbereich Informatik (46)
- Fachbereich Ingenieurwissenschaften und Kommunikation (23)
- Fachbereich Angewandte Naturwissenschaften (21)
- Institut für funktionale Gen-Analytik (IFGA) (20)
- Institute of Visual Computing (IVC) (20)
- Institut für Cyber Security & Privacy (ICSP) (13)
- Fachbereich Wirtschaftswissenschaften (9)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (6)
- Institut für Verbraucherinformatik (IVI) (6)
- Institut für Sicherheitsforschung (ISF) (4)
Document Type
- Conference Object (68)
- Article (65)
- Part of a Book (8)
- Lecture (5)
- Book (monograph, edited volume) (3)
- Master's Thesis (3)
- Conference Proceedings (2)
- Part of Periodical (2)
- Preprint (2)
- Doctoral Thesis (1)
Year of publication
- 2013 (161)
Language
- English (161)
Keywords
- Amiloride (2)
- Education (2)
- Internet (2)
- Mal d 1 (2)
- Molecular dynamics (2)
- Three-dimensional displays (2)
- apple allergy (2)
- cystic fibrosis (2)
- end user development (2)
- ionic liquids (2)
The Federal Ministry of Labour and Social Affairs (Bundesministerium für Arbeit und Soziales, BMAS) is supporting 73 projects in Germany using European Union (EU) funds in the amount of € 26 million. By providing the subsidies, the European Commission and the German Federal Government are hoping to implement Corporate Social Responsibility (CSR) among German small and medium-sized businesses (SMBs). The project run by Bonn-Rhein-Sieg University is one of these CSR projects. It is aimed at providing comprehensive information on CSR to the businesses in question and at emphasizing their responsibility along the supply chain.
Web-based Editor for YAWL
(2013)
This paper presents a web-based editor that offers YAWL editing capabilities and comprehensive support for the XML format of YAWL. The open-source project Signavio Core Components is extended with a graphical user interface (GUI) for parts of the YAWL language, and an import/export component that converts between YAWL and the internal format of Signavio Core Components. This conversion, between the web-based editor and the official YAWL Editor, is lossless, so both tools may be used together. Compared to the official YAWL Editor, the web-based editor is missing some features, but it can still facilitate the use of the YAWL system in use cases that are not supported by a desktop application.
Computers will soon be powerful enough to simulate consciousness. The artificial life community should start trying to understand how consciousness could be simulated. The proposal is to build an artificial life system in which consciousness might be able to evolve. The idea is to develop an internet-wide artificial universe in which agents can evolve. Users play games by defining agents that form communities. The communities have to perform tasks, or compete, or whatever the specific game demands. The demands should be such that agents that are more aware of their universe are more likely to succeed. The agents reproduce and evolve within their user’s machine, but can also sometimes transfer to other machines across the internet. Users will be able to choose the capabilities of their agents from a fixed list, but may also write their own powers for their agents.
Botnets
(2013)
Malware poses one of the major threats to all currently operated computer systems. The scale of the problem becomes obvious by looking at the global economic loss caused by different kinds of malware, which is estimated to be more than US$ 10 billion every year. Botnets, a special kind of malware, are used to reap economic gains by criminals as well as for politically motivated activities. In contrast to other kinds of malware, botnets utilize a hidden communication channel to receive commands from their operator and communicate their current status. The ability to execute almost arbitrary commands on the infected machines makes botnets a general-purpose tool to perform malicious cyber-activities. (Publisher's description)
The Report starts with an interview between Eric Bettermann, Director of the German radio station Deutsche Welle, and University President Hartmut Ihne, which deals with responsibility in education and our University’s activities in the area of development cooperation. The chapters “Studies & Research”, “Research”, “Campus”, and “The Region and International Issues” cover a wide spectrum of topics that are not rigidly defined, because many topics might just as readily be assigned to other chapters.
In the latest edition, some special pages have been dedicated to the topic of “Taking a break”, i.e. to research semesters and sabbaticals, to breaks as a scientific focal point or to absolutely normal coffee breaks. Breaks are an essential part of our lives.
The criteria for assessing the quality of rubber materials are the polymer or copolymer composition and the additives. These additives include plasticizers, extender oils, carbon black, inorganic fillers, antioxidants, heat and light stabilizers, processing aids, cross-linking agents, accelerators, retarders, adhesives, pigments, smoke and flame retardants, and others. Determination of additives in polymers or copolymers generally requires the extraction of these substances from the matrix as a first step, which can be challenging, and the subsequent analysis of the extracted additives by gas chromatography (GC), GC-mass spectrometry (MS), high performance liquid chromatography (HPLC), HPLC-MS, capillary electrophoresis, thin-layer chromatography, and other analytical techniques. In the present work, nitrile rubber materials were studied using direct analytical flash pyrolysis hyphenated to GC and electrospray ionization MS in both scan and selected ion monitoring modes to demonstrate that this technique is a good tool to identify the organic additives in nitrile rubber.
YAWL Symposium 2013. Proceedings of the First YAWL Symposium, Sankt Augustin, Germany, June 7, 2013
(2013)
Annual Report 2011 - 2012
(2013)
Real-Time Simulation of Camera Errors and Their Effect on Some Basic Robotic Vision Algorithms
(2013)
The BRICS component model: a model-based development paradigm for complex robotics software systems
(2013)
Power train models are required to simulate, and hence predict, the energy consumption of vehicles. Efficiencies for the different components in the power train are required. Common procedures use digitalised shell models (or maps) to model the efficiency of Internal Combustion Engines (ICE) and manual gearboxes (MG). Errors are associated with these models and affect the accuracy of the calculation. The accuracy depends on the configuration of the simulation, the digitalisation of the data and the data used. This paper evaluates these sources of error. Understanding the sources of error can improve the results of the modelling by more than eight percent.
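The map-based efficiency modelling described above can be sketched as bilinear interpolation over a digitalised efficiency map; the grid values below are purely illustrative, not data from the paper:

```python
from bisect import bisect_left

# Hypothetical ICE efficiency map: efficiency as a function of engine
# speed (rpm) and torque (Nm). Real maps are measured on a test bench
# and then digitalised -- the digitalisation itself is an error source.
speeds = [1000.0, 2000.0, 3000.0, 4000.0]   # rpm
torques = [50.0, 100.0, 150.0]              # Nm
eff_map = [                                  # rows: speeds, cols: torques
    [0.25, 0.30, 0.28],
    [0.28, 0.34, 0.32],
    [0.30, 0.36, 0.33],
    [0.27, 0.33, 0.31],
]

def interpolate_efficiency(speed, torque):
    """Bilinear interpolation on the digitalised efficiency map.

    Operating points outside the grid are clamped to its edges, which is
    one of the configuration choices that affects simulation accuracy.
    """
    s = min(max(speed, speeds[0]), speeds[-1])
    t = min(max(torque, torques[0]), torques[-1])
    i = max(1, min(bisect_left(speeds, s), len(speeds) - 1))
    j = max(1, min(bisect_left(torques, t), len(torques) - 1))
    ws = (s - speeds[i - 1]) / (speeds[i] - speeds[i - 1])
    wt = (t - torques[j - 1]) / (torques[j] - torques[j - 1])
    top = eff_map[i - 1][j - 1] * (1 - wt) + eff_map[i - 1][j] * wt
    bot = eff_map[i][j - 1] * (1 - wt) + eff_map[i][j] * wt
    return top * (1 - ws) + bot * ws
```

Different interpolation schemes (nearest-neighbour, bilinear, spline) over the same digitalised data yield different results, which is exactly the kind of configuration-dependent error the paper evaluates.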
This work extends the affordance-inspired robot control architecture introduced in the MACS project [35], and especially its approach to integrating symbolic planning systems given in [24], by providing methods for the automated abstraction of affordances into high-level operators. It discusses how symbolic planning instances can be generated automatically based on these operators and introduces an instantiation method to execute the resulting plans. Preconditions and effects of agent behaviour are learned and represented in Gärdenfors' conceptual spaces framework. Its notion of similarity is used to group behaviours into abstract operators based on the affordance-inspired, function-centred view of the environment. It is discussed how the capability of conceptual spaces to map subsymbolic to symbolic representations can be used to generate PDDL planning domains that include affordance-based operators. During plan execution, affordance-based operators are instantiated by agent behaviour based on the situation directly before execution. The current situation is compared to past ones and the behaviour that has been most successful in the past is applied. Execution failures can be repaired by action substitution. The concept of using contexts to dynamically change dimension salience, as introduced by Gärdenfors, is realized using techniques from the field of feature selection. The approach is evaluated using a 3D simulation environment and implementations of several object manipulation behaviours.
Molecular modeling is an important subdomain in the field of computational modeling, regarding both scientific and industrial applications. This is because computer simulations on a molecular level are a powerful instrument to study the impact of microscopic phenomena on macroscopic ones. Accurate molecular models are indispensable for such simulations in order to predict physical target observables, like density, pressure, diffusion coefficients or energetic properties, quantitatively over a wide range of temperatures. In these simulations, molecular interactions are described mathematically by force fields. The mathematical description includes parameters for both intramolecular and intermolecular interactions. While intramolecular force field parameters can be determined by quantum mechanics, the parameterization of the intermolecular part is often tedious. Recently, an empirical procedure, based on the minimization of a loss function between simulated and experimental physical properties, was published by the authors. In that work, efficient gradient-based numerical optimization algorithms were used. However, empirical force field optimization is inhibited by the following two central issues in molecular simulations: firstly, the simulations are extremely time-consuming, even on modern high-performance computer clusters, and secondly, the simulation data is affected by statistical noise. The latter means that an accurate computation of gradients or Hessians is nearly impossible close to a local or global minimum, mainly because the loss function is flat there. Therefore, the question arises of whether to apply a derivative-free method that approximates the loss function by an appropriate model function. In this paper, a new Sparse Grid-based Optimization Workflow (SpaGrOW) is presented, which accomplishes this task robustly and, at the same time, keeps the number of time-consuming simulations relatively small.
This is achieved by an efficient sampling procedure for the approximation based on sparse grids, which is described in full detail: in order to counteract the fact that sparse grids are fully occupied on their boundaries, a mathematical transformation is applied to generate homogeneous Dirichlet boundary conditions. As the main drawback of sparse grid methods is the assumption that the function to be modeled exhibits certain smoothness properties, it has to be approximated by smooth functions first. Radial basis functions turned out to be very suitable for this task. The smoothing procedure and the subsequent interpolation on sparse grids are performed within sufficiently large compact trust regions of the parameter space. It is shown and explained how the combination of these three ingredients leads to a new, efficient derivative-free algorithm, which has the additional advantage of being capable of reducing the overall number of simulations by a factor of about two in comparison to gradient-based optimization methods. At the same time, robustness with respect to statistical noise is maintained. This assertion is proven by both theoretical considerations and practical evaluations for molecular simulations of chemical example substances.
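The loss function at the core of such an empirical force-field optimization can be sketched as a weighted relative least-squares objective over the parameter vector (a generic textbook form; the paper's exact functional may differ):

```latex
% Weighted relative least-squares loss over force-field parameters \zeta;
% f_i are the physical target observables (density, pressure, ...).
F(\zeta) \;=\; \sum_{i=1}^{n} w_i
  \left( \frac{f_i^{\mathrm{sim}}(\zeta) - f_i^{\mathrm{exp}}}
              {f_i^{\mathrm{exp}}} \right)^{\!2}
\;\longrightarrow\; \min_{\zeta}
```

Because each evaluation of $f_i^{\mathrm{sim}}(\zeta)$ requires a full molecular simulation and carries statistical noise, approximating $F$ by a smooth surrogate on a sparse grid, as described above, avoids the unreliable gradients near the minimum.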
Realism and plausibility of computer controlled entities in entertainment software have been enhanced by adding both static personalities and dynamic emotions. Here a generic model is introduced which allows the transfer of findings from real-life personality studies to a computational model. This information is used for decision making. The introduction of dynamic event-based emotions enables adaptive behavior patterns. The advantages of this new model have been validated with a four-way crossroad in a traffic simulation. Driving agents using the introduced model enhanced by dynamics were compared to agents based on static personality profiles and simple rule-based behavior. It has been shown that adding an adaptive dynamic factor to agents improves perceivable plausibility and realism. It also supports coping with extreme situations in a fair and understandable way.
This paper examines how students learn to collaborate in English by participating in an intercultural project that focuses on teaching students to work together on a digital writing project using various online tools. Mixed groups of students, two French and two German, used several synchronous and asynchronous tools to communicate with their counterparts (Facebook, WordPress blog, WIMS e-learning platform, email, videoconferencing). Students had to produce an article together, comparing French and German attitudes about a topic they negotiated freely in their groups. Before publishing their post, students were expected to peer-review the article written by their group. Once published, the next stage consisted of voting for the best posts on the e-learning platform, WIMS. A videoconference was also organized to create cohesion between the participants. The results of the student evaluations are presented, together with the administrative and technical issues arising from the vastly differing university setups.
Earth’s nearest candidate supermassive black hole lies at the centre of the Milky Way [1]. Its electromagnetic emission is thought to be powered by radiatively inefficient accretion of gas from its environment [2], which is a standard mode of energy supply for most galactic nuclei. X-ray measurements have already resolved a tenuous hot gas component from which the black hole can be fed [3]. The magnetization of the gas, however, which is a crucial parameter determining the structure of the accretion flow, remains unknown. Strong magnetic fields can influence the dynamics of accretion, remove angular momentum from the infalling gas [4], expel matter through relativistic jets [5] and lead to synchrotron emission such as that previously observed [6-8]. Here we report multi-frequency radio measurements of a newly discovered pulsar close to the Galactic Centre [9-12] and show that the pulsar’s unusually large Faraday rotation (the rotation of the plane of polarization of the emission in the presence of an external magnetic field) indicates that there is a dynamically important magnetic field near the black hole. If this field is accreted down to the event horizon it provides enough magnetic flux to explain the observed emission (from radio to X-ray wavelengths) from the black hole.
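The quantity behind this argument is the rotation measure; the standard relations (textbook pulsar astronomy, not reproduced from the paper itself) are:

```latex
% The polarization position angle rotates with wavelength squared:
\Delta\chi = \mathrm{RM}\,\lambda^{2},
\qquad
% RM in rad\,m^{-2}, with electron density n_e in cm^{-3}, line-of-sight
% field B_\parallel in \mu\mathrm{G}, and path element \mathrm{d}l in pc:
\mathrm{RM} = 0.81 \int n_e\, B_{\parallel}\, \mathrm{d}l
```

An unusually large RM toward the pulsar therefore directly constrains the product of electron density and magnetic field strength along the line of sight through the gas near the black hole.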
Although most individuals who gamble do so without any adverse consequences, some individuals develop a recurrent, maladaptive pattern of gambling behaviour, often called pathological gambling or gambling disorder, that is associated with financial losses, disruption of family and interpersonal relationships, and co-occurring psychiatric disorders. Identifying whether different types of gambling modalities vary in their ability to lead to maladaptive patterns of gambling behaviour is essential to develop public policies that seek to balance access to gambling opportunities with minimizing risk for the potential adverse consequences of gambling behaviour. Until recently, assessing the risk potential of different types of gambling products was nearly impossible. ASTERIG, initially developed in Germany in 2006-2010, is an assessment tool to measure and to evaluate the risk potential of any gambling product based on scores on ten dimensions. In doing so, it also allows a comparison to be drawn between the addictive potential of different gambling products. Furthermore, the tool highlights where the specific risk potential of each specific gambling product lies. This makes it a valuable tool at the legislative, case law, and administrative levels as it allows the risk potential of individual gambling products to be identified and to be compared globally and across 10 different dimensions of risk potential. We note that specific gambling products should always be evaluated rather than product groups (lotteries, slot machines) or providers, as there may be variations among those product groups that impact their risk potential. For example, slot machines may vary on the amount of jackpot, which may influence their risk potential.
Distributed systems comprise distributed computing systems, distributed information systems, and distributed pervasive systems. They are often very complex and their implementation is challenging. Intensive and continuous testing is indispensable to ensure reliability and high quality of a distributed system. The testing process should have a high degree of automation, not only on lower levels (i.e. unit and module testing), but also on higher testing levels (e.g. system, integration, and acceptance tests). To achieve automation on higher testing levels, virtual infrastructure components (e.g. virtual machines, virtual networks) offered as Infrastructure as a Service (IaaS) can be employed. The elasticity of on-demand computation resources fits well with the varying resource demands of automated test execution.
A methodology for automated acceptance testing of distributed systems that uses virtual infrastructure is presented. It is founded on a task-oriented model that is used to abstract concurrency and asynchronous, remote communication in distributed systems. The model is used as groundwork for a domain-specific language that allows expressing tests for distributed systems in the form of scenarios. On the one hand, test scenarios are executable and, therefore, fully automated. On the other hand, test scenarios represent requirements to the system under test making an automated, example-based verification possible.
A prototypical implementation is used to apply the developed methodology in the context of two different case studies. The first case study uses RCE as an example of a distributed, workflow-driven integration environment for scientific computing. The second one uses MongoDB as an example of a document-oriented database system that offers distributed data storage through master-slave replication. The results of the experimental evaluation indicate that the developed acceptance testing methodology is a useful approach to design, build, and execute tests for distributed systems with high quality and a high degree of automation.
Updating a shared data structure in a parallel program is usually done with some sort of high-level synchronization operation to ensure correctness and consistency. However, the underlying synchronization instructions in a processor architecture are costly and rather limited in their scalability on larger multi-core/multi-processor systems. In this paper, we examine work queue operations where such costly atomic update operations are replaced with non-atomic modifiers (simple read+write). In this approach, we trade the exact amount of work with atomic operations against doing more and redundant work, but without atomic operations and without violating the correctness of the algorithm. We show results for the application of this idea to the concrete scenario of parallel Breadth First Search (BFS) algorithms for undirected graphs on two large NUMA shared memory systems with up to 64 cores.
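The benign-race idea can be sketched with a sequential rendition of level-synchronous BFS; the comments mark where a parallel execution could interleave the plain read and plain write (a sketch of the idea, not the authors' implementation):

```python
def bfs_benign_race(adj, source):
    """Level-synchronous BFS that tolerates duplicate frontier entries.

    The check-then-set on level[v] is a plain read followed by a plain
    write, with no atomic test-and-set. In a parallel run, two threads
    may both see level[v] == -1, both write the same depth + 1, and both
    enqueue v: redundant work, but never an incorrect level.
    """
    level = [-1] * len(adj)
    level[source] = 0
    frontier = [source]
    depth = 0
    while frontier:
        next_frontier = []
        for u in frontier:                   # in parallel: frontier is
            for v in adj[u]:                 # partitioned among threads
                if level[v] == -1:           # plain (non-atomic) read
                    level[v] = depth + 1     # plain (non-atomic) write
                    next_frontier.append(v)  # duplicates are harmless
        depth += 1
        frontier = next_frontier
    return level
```

Correctness is preserved because any racing writers for a vertex are processing the same BFS level, so they all write the identical value; the only cost is that a duplicated vertex is scanned more than once in the next level.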
Information reliability and automatic computation are two important aspects that are continuously pushing the Web to be more semantic. Information uploaded to the Web should be reusable and automatically extractable by other applications, platforms, etc. Several tools exist to explicitly mark up Web content. Web services may also play a positive role in the automatic processing of Web content, especially when they act as flexible and agile agents. However, Web services themselves should be developed with semantics in mind. They should include and provide structured information to facilitate their use, reuse, composition, querying, etc. In this chapter, the authors focus on evaluating state-of-the-art semantic aspects and approaches in Web services. Ultimately, this contributes to the goal of Web knowledge management, execution, and transfer.
Tamoxifen therapy of invasive breast cancer has been associated with increased levels of endothelin-1 (ET-1) so that an endothelin-1 receptor (ETR) blockade has been suggested as a new therapeutic approach. This study analyzed the relationship between Tamoxifen and ET-1 signalling in invasive breast cancer. Using paraffinized tissue from 50 randomly chosen cases of invasive and in-situ ductal carcinoma from our archive, the expression of ETRs was analyzed by immune histology. ETRs were regularly detectable in normal breast tissue, but rarely in adjacent tumor areas (3/50 cases). By immunoprecipitation, a complex was found consisting of ET-1, estrogen receptors and Tamoxifen. Consequently, transcription of several target genes of ET-1 and estrogen receptors was detectable (interleukin-6, wnt-11 and a vimentin spliceform). In particular, the combination of Tamoxifen, ET-1, and estrogen receptors induced further increasing levels of these target genes. Some of these genes have been found upregulated in metastatically spreading breast cancer cells. We conclude: i) ETRs do not play a role in invasive or in-situ ductal breast cancer; ii) estrogen receptors and Tamoxifen build a complex with ET-1; and iii) upregulated transcription of target genes by ET-1–estrogen receptor–Tamoxifen complex may negatively influence breast cancer prognosis. These results indicate a role for ET-1 in Tamoxifen treated breast cancer patients leading to a potentially worsening prognosis.
Increased endothelin-1 decreases PKC alpha (PKCα), resulting in high miRNA 15a levels in kidney tumors. Breast cancer cells treated with ET-1, β-estrogen, Tamoxifen, Tamoxifen + β-estrogen and Tamoxifen + ET-1 were analysed regarding miRNA 15a expression. Significantly increased miRNA 15a levels were found after ET-1, becoming further increased in Tamoxifen + ET-1 treated cells. Our group already showed that miRNA 15a induces MAPK p38 splicing resulting in a truncated product called Mxi-2, whose function has yet to be defined in tumors. We described for the first time in ET-1 induced tumor cells that Mxi-2 builds a complex with Ago2, a miRNA binding protein, which is important for the localization of miRNAs to the 3′UTR of target genes. Furthermore, we show that Mxi-2/Ago2 is important for the interaction with the miRNA 1285 which binds to the 3′end of the tumor suppressor gene p53, being responsible for the downregulation of p53. Tissue arrays from breast cancer patients were performed, analysing Mxi-2, p53 and PKCα. Since the Mxi-2 levels increase in Tamoxifen + ET-1 treated cells, we claim that increasing ET-1 levels in Tamoxifen treated breast cancer patients are responsible for decreasing p53 levels. In summary, ET-1 decreases nuclear PKCα levels, while increasing the amount of miRNA 15a. This causes high levels of Mxi-2, necessary for complex formation with Ago2. The newly identified Mxi-2/Ago2 complex interacting with miRNA 1285 leads to increased 3′UTR p53 interaction, resulting in decreased p53 levels and subsequent tumor progression. This newly identified mechanism is a possible explanation for the development of ET-1 induced tumors.
More than 25 years ago, it was a big surprise for physiologists that nitric oxide (NO) was identified as the endothelium derived relaxing factor which is responsible for endothelium-induced smooth muscle relaxation (Ignarro et al., 1987). Until then, small gaseous molecules were simply regarded as byproducts of cellular metabolism which were unlikely to be of any physiological relevance. The discovery that NO was synthesized by specific enzymes (NO-synthases), upon stimulation by specific, physiologically relevant stimuli (e.g., acetylcholine stimulation of endothelial cells), as well as the fact that it acted on specific cellular targets (e.g., soluble guanylate cyclase), set the course for numerous studies which investigated the physiological roles of gaseous signaling molecules—in other words, gasotransmitters (Wang, 2002).
Ornithine transcarbamylase (OTC) deficiency is the most common urea cycle defect. The clinical presentation in female manifesting carriers varies both in onset and severity. We report on a female with insulin dependent diabetes mellitus and recurrent episodes of hyperammonemia. Since OTC activity measured in a liver biopsy sample was within normal limits, OTC deficiency was initially excluded from the differential diagnoses of hyperammonemia. Due to moderately elevated homocitrulline excretion, hyperornithinemia-hyperammonemia-homocitrullinuria-syndrome was suggested, but further assays in fibroblasts showed normal ornithine utilization. Later, when mutation analysis of the OTC gene became available, a known pathogenic missense mutation (c.533C>T) in exon 5 leading to an exchange of threonine-178 by methionine (p.Thr178Met) was detected. Skewed X-inactivation was demonstrated in leukocyte DNA. In the further clinical course the girl developed marked obesity. By initiating physical activities twice a week, therapeutic control of both diabetes and OTC deficiency improved, but obesity persisted. In conclusion, our case confirms that normal hepatic OTC enzyme activity measured in a single liver biopsy sample does not exclude a clinical relevant mosaic of OTC deficiency because of skewed X-inactivation. Mutation analysis of the OTC gene in whole blood may be a simple way to establish the diagnosis of OTC deficiency. The joint occurrence of OTC deficiency and diabetes in a patient has not been reported before.
The reciprocal translocation t(12;21)(p13;q22), the most common structural genomic alteration in B-cell precursor acute lymphoblastic leukaemia in children, results in a chimeric transcription factor TEL-AML1 (ETV6-RUNX1). We identified directly and indirectly regulated target genes utilizing an inducible TEL-AML1 system derived from the murine pro B-cell line BA/F3 and a monoclonal antibody directed against TEL-AML1. By integration of promoter binding identified with chromatin immunoprecipitation (ChIP)-on-chip, gene expression and protein output through microarray technology and stable isotope labelling with amino acids in cell culture, we identified 217 directly and 118 indirectly regulated targets of the TEL-AML1 fusion protein. Directly, but not indirectly, regulated promoters were enriched in AML1-binding sites. The majority of promoter regions were specific for the fusion protein and not bound by native AML1 or TEL. Comparison with gene expression profiles from TEL-AML1-positive patients identified 56 concordantly misregulated genes with negative effects on proliferation and cellular transport mechanisms and positive effects on cellular migration and stress responses, including immunological responses. In summary, this work for the first time gives a comprehensive insight into how TEL-AML1 expression may directly and indirectly contribute to altering cells so that they become prone to leukemic transformation.
BACKGROUND
Hyperlysinemia is an autosomal recessive inborn error of L-lysine degradation. To date only one causal mutation in the AASS gene encoding α-aminoadipic semialdehyde synthase has been reported. We aimed to better define the genetic basis of hyperlysinemia.
METHODS
We collected the clinical, biochemical and molecular data in a cohort of 8 hyperlysinemia patients with distinct neurological features.
RESULTS
We found novel causal mutations in AASS in all affected individuals, including 4 missense mutations, 2 deletions and 1 duplication. In two patients originating from one family, the hyperlysinemia was caused by a contiguous gene deletion syndrome affecting AASS and PTPRZ1.
CONCLUSIONS
Hyperlysinemia is caused by mutations in AASS. As hyperlysinemia is generally considered a benign metabolic variant, the more severe neurological disease course in two patients with a contiguous deletion syndrome may be explained by the additional loss of PTPRZ1. Our findings illustrate the importance of detailed biochemical and genetic studies in any hyperlysinemia patient.
BACKGROUND
Metabolic control and dietary management of patients with phenylketonuria (PKU) are based on single blood samples obtained at variable intervals. Sampling conditions are often not well-specified and intermittent variation of phenylalanine concentrations between two measurements remains unknown. We determined phenylalanine and tyrosine concentrations in blood over 24 hours. Additionally, the impact of food intake and physical exercise on phenylalanine and tyrosine concentrations was examined. Subcutaneous microdialysis was evaluated as a tool for monitoring phenylalanine and tyrosine concentrations in PKU patients.
METHODS
Phenylalanine and tyrosine concentrations of eight adult patients with PKU were determined at 60 minute intervals in serum, dried blood and subcutaneous microdialysate and additionally every 30 minutes postprandially in subcutaneous microdialysate. During the study period of 24 hours individually tailored meals with defined phenylalanine and tyrosine contents were served at fixed times and 20 min bicycle-ergometry was performed.
RESULTS
Serum phenylalanine concentrations showed only minor variations while tyrosine concentrations varied significantly more over the 24-hour period. Food intake within the patients' individual diet had no consistent effect on the mean phenylalanine concentration but the tyrosine concentration increased up to 300% individually. Mean phenylalanine concentration remained stable after short-term bicycle-exercise whereas mean tyrosine concentration declined significantly. Phenylalanine and tyrosine concentrations in dried blood were significantly lower than serum concentrations. No close correlation has been found between serum and microdialysis fluid for phenylalanine and tyrosine concentrations.
CONCLUSIONS
The slight diurnal variation of phenylalanine concentrations in serum implies that a single blood sample reliably reflects the metabolic control in this group of adult patients. Phenylalanine concentrations determined by subcutaneous microdialysis do not correlate with the patients' phenylalanine concentrations in serum/blood.
Embodied artificial agents operating in dynamic, real-world environments need architectures that support the special requirements that exist for them. Architectures are not always designed from scratch and the system then implemented all at once, but rather, a step-wise integration of components is often made to increase functionality. Our work aims to increase flexibility and robustness by integrating a task planner into an existing architecture and coupling the planning process with the preexisting execution and the basic monitoring processes. This involved the conversion of monolithic SMACH scenario scripts (state-machine execution scripts) into modular states that can be called dynamically based on the plan that was generated by the planning process. The procedural knowledge encoded in such state machines was used to model the planning domain for two RoboCup@Home scenarios on a Care-O-Bot 3 robot [GRH+08]. This was done for the JSHOP2 [IN03] hierarchical task network (HTN) planner. A component which iterates through a generated plan and calls the appropriate SMACH states [Fie11] was implemented, thus enabling the scenarios. Crucially, individual monitoring actions which enable the robot to monitor the execution of the actions were designed and included, thus providing additional robustness.
In software development, the "always beta" principle is used to successfully develop innovations based on early and continuous user feedback. In this paper we discuss how this principle could be adapted to the special needs of designing for the Smart Home, where we do not just take care of the software, but also release hardware components. In particular, because of the 'materiality' of the Smart Home, one cannot just make a beta version available on the web; an essential part of the development process is also to visit the 'beta' users in their homes, to build trust, to face real-world issues and to provide assistance in making the Smart Home work for them. After presenting our case study, we discuss the challenges we faced and how we dealt with them.
Radio pulsars in relativistic binary systems are unique tools to study the curved space-time around massive compact objects. The discovery of a pulsar closely orbiting the super-massive black hole at the centre of our Galaxy, Sgr A⋆, would provide a superb test-bed for gravitational physics. To date, the absence of any radio pulsar discoveries within a few arc minutes of Sgr A⋆ has been explained by one principal factor: extreme scattering of radio waves caused by inhomogeneities in the ionized component of the interstellar medium in the central 100 pc around Sgr A⋆. Scattering, which causes temporal broadening of pulses, can only be mitigated by observing at higher frequencies. Here we describe recent searches of the Galactic centre region performed at a frequency of 18.95 GHz with the Effelsberg radio telescope.
In discussions of gambling addiction to specific games, the market size and the proceeds generated by the game are usually disregarded. Including these parameters puts the picture of gambling addiction into perspective. A fundamental principle for such an analysis is the separation between absolute numbers and ratios, which is a common procedure in economic contexts.
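The separation between absolute numbers and ratios mentioned above can be illustrated with a trivial calculation (all figures below are invented for illustration, not real market data):

```python
# Invented illustrative numbers: absolute counts of problem gamblers
# versus addiction rates relative to market size.
games = {
    # game: (problem gamblers, total players)
    "game_A": (100_000, 10_000_000),
    "game_B": (10_000, 50_000),
}

for game, (addicted, players) in games.items():
    rate = addicted / players
    print(f"{game}: {addicted} problem gamblers, {rate:.1%} of players")
```

Here game_A dominates in absolute numbers, while game_B shows a far higher rate — the relativization that results from including market size in the analysis.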
The simulation of fluid flows is important to many fields of application, especially in industry and infrastructure. The governing equations form a coupled system of non-linear, hyperbolic partial differential equations — the one-dimensional shallow water equations — which allow a consistent treatment of free-surface flows in open channels as well as pressurised flows in closed pipes. Their numerical realisation remains complicated and challenging because their characteristic properties can give rise to discontinuous solutions.
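For reference, the one-dimensional shallow water equations can be written in their standard conservative form for a rectangular channel (a textbook form, not necessarily the exact formulation used by the authors):

```latex
\begin{align}
  \partial_t h + \partial_x (h u) &= 0, \\
  \partial_t (h u) + \partial_x\!\left( h u^2 + \tfrac{1}{2} g h^2 \right)
    &= g h \left( S_0 - S_f \right),
\end{align}
```

where $h$ is the water depth, $u$ the depth-averaged velocity, $g$ the gravitational acceleration, $S_0$ the bed slope and $S_f$ the friction slope. As a hyperbolic system, it admits discontinuous (shock-type) solutions — the numerical difficulty referred to above.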
The Java Virtual Machine (JVM) executes the compiled bytecode version of a Java program and acts as a layer between the program and the operating system. The JVM provides additional features such as process, thread, and memory management to manage the execution of these programs. Garbage collection (GC) is part of the memory management and has an impact on the overall runtime performance because it is responsible for removing dead objects from the heap. Currently, the execution of a program needs to be halted during every GC run. The problem with this stop-the-world approach is that all threads in the JVM need to be suspended. It would be desirable to have a thread-local GC that only blocks the current thread and does not affect any other threads. In particular, this would improve the execution of multi-threaded Java programs. An object that is accessible by more than one thread is called escaped. It is not possible to determine thread-locally whether escaped objects are still alive, so they cannot be handled by a thread-local GC. To gain significant performance improvements with a thread-local GC, it is therefore necessary to determine whether it is possible to reliably predict if a given object will escape. Experimental results show that the escaping of objects can be predicted with high accuracy based on the line of code at which the object was allocated. A thread-local GC was developed to minimize the number of stop-the-world GCs. The prototype implementation delivers a proof of concept showing that this goal can be achieved in certain scenarios.
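The allocation-site-based escape prediction can be illustrated with a simple per-site statistic: once objects allocated at a given line of code have been observed to escape, future allocations from that site are predicted to escape as well. This is a toy sketch of the idea in Python, not the thesis's JVM implementation; the site keys and threshold are invented.

```python
from collections import defaultdict

class EscapePredictor:
    """Predict whether an object will escape its allocating thread,
    keyed by allocation site (e.g. "file:line"). Toy model of the
    idea described above, not the actual JVM prototype."""
    def __init__(self, threshold=0.5):
        self.escaped = defaultdict(int)   # site -> escaped count
        self.total = defaultdict(int)     # site -> allocation count
        self.threshold = threshold

    def record(self, site, did_escape):
        self.total[site] += 1
        if did_escape:
            self.escaped[site] += 1

    def predict_escape(self, site):
        # Unseen sites are conservatively treated as escaping, so a
        # thread-local GC never frees a potentially shared object.
        if self.total[site] == 0:
            return True
        return self.escaped[site] / self.total[site] >= self.threshold

p = EscapePredictor()
for _ in range(10):
    p.record("Main.java:42", did_escape=False)   # local-only objects
p.record("Main.java:99", did_escape=True)        # shared object
print(p.predict_escape("Main.java:42"))  # False: safe for thread-local GC
print(p.predict_escape("Main.java:99"))  # True
```

The conservative default for unseen sites reflects the safety requirement: mispredicting "local" for an escaped object would let a thread-local GC free a live shared object, whereas mispredicting "escaped" only costs performance.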
Grailog provides a systematic approach to visualizing knowledge sources with graphical elements. Its main benefit is that the resulting visual presentations are easier for humans to read than the original symbolic source code. In this paper we introduce a methodology for mapping Datalog RuleML, serialized in XML, to an SVG representation of Grailog, also serialized in XML, via eXtensible Stylesheet Language Transformations (XSLT) 2.0/XML; the SVG is then rendered visually by modern Web browsers. This initial mapping targets Grailog's "fully node copied" normal form. Elements can thus be translated one at a time, separating the fundamental Datalog-to-SVG translation concern from the concern of merging node copies for optimal (hyper)graph layout, and avoiding the latter's high computational complexity in this online tool. The resulting open-source Grailog Knowledge-Source Visualizer (Grailog KS Viz) supports Datalog RuleML with positional relations of arity n>1. The on-the-fly transformation was shown to run on all recent major Web browsers and should be easy to understand, use, and extend.
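The one-element-at-a-time mapping idea can be illustrated with a toy transformation (in Python rather than XSLT, and with invented element names that do not follow the actual Datalog RuleML schema): each relational atom becomes one SVG node with a text label, with no node merging or layout optimization.

```python
import xml.etree.ElementTree as ET

# Toy Datalog-like facts in XML; element names are invented for this
# sketch and are not the real Datalog RuleML serialization.
SOURCE = """<facts>
  <atom rel="parent"><arg>alice</arg><arg>bob</arg></atom>
  <atom rel="parent"><arg>bob</arg><arg>carol</arg></atom>
</facts>"""

def facts_to_svg(xml_text):
    """Map each atom element to an SVG rectangle with a text label,
    translating one element at a time."""
    facts = ET.fromstring(xml_text)
    svg = ET.Element("svg", xmlns="http://www.w3.org/2000/svg",
                     width="300", height="200")
    for i, atom in enumerate(facts.findall("atom")):
        args = ", ".join(a.text for a in atom.findall("arg"))
        label = f"{atom.get('rel')}({args})"
        y = 20 + i * 40
        ET.SubElement(svg, "rect", x="10", y=str(y),
                      width="200", height="30", fill="none", stroke="black")
        text = ET.SubElement(svg, "text", x="20", y=str(y + 20))
        text.text = label
    return ET.tostring(svg, encoding="unicode")

print(facts_to_svg(SOURCE))
```

Because each source element maps to its own output element, the transformation stays linear in the input size — the layout-optimization concern that is deliberately separated out in the paper's design.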
We developed a scene text recognition system with active vision capabilities, namely auto-focus, adaptive aperture control and auto-zoom. Our localization system is able to delimit text regions in images with complex backgrounds and is based on an attentional cascade, asymmetric AdaBoost, decision trees and Gaussian mixture models. We think that text could become a valuable source of semantic information for robots, and we aim to raise interest in it within the robotics community. Moreover, thanks to the robot's pan-tilt-zoom camera and the active vision behaviors, the robot can exploit its affordances to overcome hindrances to the perceptual task. Detrimental conditions such as poor illumination, blur and low resolution are very hard to deal with once an image has been captured, but can often be prevented. We evaluated the localization algorithm on a public dataset and on one of our own, with encouraging results. Furthermore, we offer an interesting experiment in active vision, which leads us to argue that active sensing in general should be considered early on when addressing complex perceptual problems in embodied agents.
Switched power electronic subsystems are widely used in various applications. A fault in one of their components may have a significant effect on the system's load or may even cause damage. Therefore, it is important to detect and isolate faults and to report true faults to a supervisory system in order to avoid malfunction of, or damage to, a load. If, in a model-based approach to fault detection and isolation of hybrid systems, switching devices are considered ideal switches, then some equations must be reformulated whenever some devices have switched. In this paper, a fixed-causality bond graph representation of hybrid system models is used, i.e., computational causalities assigned according to the Standard Causality Assignment Procedure (SCAP) are independent of system modes of operation. The latter are taken into account by transformer moduli mi(t) ∈ {0, 1} ∀t ≥ 0 in a unique set of equations of motion. In a case study, this approach is used for fault diagnosis in a three-phase full-wave rectifier. Residuals of Analytical Redundancy Relations (ARRs), holding for all modes of operation and serving as fault indicators, are computed in an offline simulation as part of a DAE system by using a bond graph model of the faulty system instead of the real one and by coupling it to a bond graph of the healthy system by means of residual sinks.
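The role of an ARR residual as a fault indicator can be stated schematically (the symbols below are generic, not the paper's notation): an ARR is a constraint among known variables that, evaluated with measured signals and the mode-dependent transformer moduli, yields a residual

```latex
\mathrm{ARR}:\quad
f\big(u(t),\, y(t),\, \dot{y}(t),\, m_1(t), \dots, m_k(t),\, \Theta\big) = r(t),
```

where $u$ are the known inputs, $y$ the measured outputs, $m_i(t) \in \{0,1\}$ the switch moduli encoding the current mode, and $\Theta$ the nominal parameters. In the healthy case $r(t) \approx 0$ for all modes of operation; a residual exceeding its threshold indicates a fault.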
Reactive oxygen species and the bacteriostatic and bactericidal effects of isoconazole nitrate
(2013)
BACKGROUND
Propionic acidemia is an inherited disorder caused by deficiency of propionyl-CoA carboxylase. Although it is one of the most frequent organic acidurias, information on the outcome of affected individuals is still limited.
STUDY DESIGN/METHODS
Clinical and outcome data of 55 patients with propionic acidemia from 16 European metabolic centers were evaluated retrospectively. Thirty-five patients were diagnosed by selective metabolic screening, while 20 patients were identified by newborn screening. Endocrine parameters and bone age were evaluated. In addition, IQ testing was performed and the patients' and their families' quality of life was assessed.
RESULTS
The vast majority of patients (>85%) presented with metabolic decompensation in the neonatal period; asymptomatic individuals were the exception. About three quarters of the study population were mentally retarded, with a median IQ of 55. Apart from neurologic symptoms, complications comprised hematologic abnormalities, cardiac diseases, feeding problems and impaired growth. Most patients considered their quality of life high. However, from the parents' point of view, psychological problems were four times more common in propionic acidemia patients than in healthy controls.
CONCLUSION
Our data show that the outcome of propionic acidemia is still unfavourable, in spite of improved clinical management. Many patients develop long-term complications affecting different organ systems. Impairment of neurocognitive development is of special concern. Nevertheless, self-assessment of quality of life of the patients and their parents yielded rather positive results.
Issues in an issue tracking system contain different kinds of information, such as requirements, features, development tasks, bug reports, bug fixing tasks and refactoring tasks. This information is generally accompanied by discussions or comments, which again contain different kinds of information (e.g. social interaction, implementation ideas, stack traces or error messages). We propose to improve the automatic categorization of this information and to use the categorized data to support software engineering tasks. We want to obtain improvements in two ways. Firstly, we want to obtain algorithmic improvements (e.g. natural language processing techniques) to retrieve and use categorized auxiliary data. Secondly, we want to utilize multiple task-based categorizations to support different software engineering tasks.
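A minimal sketch of the kind of comment categorization proposed above: a pattern-based classifier assigning each line of an issue comment to a category such as stack trace, code, or social interaction. The categories and cue patterns are illustrative assumptions; a real system would use trained NLP models rather than hand-written rules.

```python
import re

# Illustrative category cues; invented for this sketch.
CATEGORY_PATTERNS = {
    "stack_trace": re.compile(r"^\s+at \w+[\w.$]*\(.*\)|Exception\b"),
    "code":        re.compile(r"[{};]"),
    "social":      re.compile(r"\b(thanks|thank you|\+1|agree)\b", re.IGNORECASE),
}

def categorize(line):
    """Assign a single category to one line of an issue comment;
    lines matching no cue fall back to general discussion."""
    for category, pattern in CATEGORY_PATTERNS.items():
        if pattern.search(line):
            return category
    return "discussion"

comment = [
    "Thanks for the report!",
    "java.lang.NullPointerException",
    "    at com.example.Foo.bar(Foo.java:12)",
    "Maybe we should cache the result instead.",
]
print([categorize(line) for line in comment])
# ['social', 'stack_trace', 'stack_trace', 'discussion']
```

Categorized fragments like these could then feed different downstream tasks, e.g. routing stack traces to duplicate-bug detection while excluding social chatter.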
In a landmark decision of March 28th, 2006, the German Federal Constitutional Court stated the following: "According to the current state of research it is certain that gambling and bets can result in morbid addictive behaviour. ... However, different gambling products exhibit different addictive potentials." Until now, specifically identifying the addictive potential of a concrete gambling product has been nearly impossible. Against this background, the Wissenschaftliches Forum Glücksspiel (Gambling Scientific Forum) developed a globally applicable assessment tool to measure and evaluate the risk potential of gambling products.
AsTERiG was developed by the Gambling Scientific Forum between 2006 and 2010. The following scientists were involved in completing this final version and in compiling this survey: Prof. Dr. Reiner Clement, Bonn-Rhein-Sieg University; Prof. Dr. Jörg Ennuschat, University of Konstanz; Prof. Jörg Häfeli, Lucerne University of Applied Sciences and Arts; Prof. Dr. Gerhard Meyer, University of Bremen; Chantal Mörsen, Charité Berlin; Prof. Dr. Dr. Franz W. Peren, Bonn-Rhein-Sieg University; Prof. Dr. Wiltrud Terlau, Bonn-Rhein-Sieg University.
A structural mapping of mutations causing succinyl-CoA:3-ketoacid CoA transferase (SCOT) deficiency
(2013)
Succinyl-CoA:3-ketoacid CoA transferase (SCOT) deficiency is a rare inherited metabolic disorder of ketone metabolism, characterized by ketoacidotic episodes and often permanent ketosis. To date there are ~20 disease-associated alleles of the OXCT1 gene, which encodes the mitochondrial enzyme SCOT. SCOT catalyzes the first, rate-limiting step of ketone body utilization in peripheral tissues by transferring a CoA moiety from succinyl-CoA to form acetoacetyl-CoA, for entry into the tricarboxylic acid cycle for energy production. We have determined the crystal structure of human SCOT, providing a molecular understanding of the reported mutations based on their potential structural effects. An interactive version of this manuscript (which may contain additional mutations appended after acceptance of this manuscript) may be found at the web address:
http://www.thesgc.org/jimd/SCOT
The device (10) has a handrail (18) provided with an optical contactless monitoring device formed as an active sensor system, where the monitoring device is arranged in a region of a guide (14) of the handrail at a front base (16) of an escalator (12) or a moving pavement. The monitoring device has two transmission paths (28, 30) with wavelength bands that are different from each other, where one of the paths includes the handrail. Ratio or difference between signals of the paths is used for recognizing foreign bodies e.g. hands of adults and children.
This paper presents recent research on an active multispectral scanning sensor capable of classifying an object's surface material in order to distinguish between different kinds of materials and human skin. The sensor itself has already been presented in previous work and can be used in conjunction with safeguarding equipment at manually fed machines or robot workplaces, for example. This work shows how an extended sensor system with advanced material classifiers can provide additional value by distinguishing different work-piece materials in order to suggest different tools or parameters for the machine (e.g. the use of a different saw blade or rotation speed at table saws). Additionally, a first implementation and evaluation of an active multispectral camera system addressing new safety applications is described. Both approaches are intended to increase productivity and the user's acceptance of the sensor technology.