H-BRS Bibliography
Year of publication: 2012 (165 entries)
After more than twenty years of research, the molecular events of apoptotic cell death can be succinctly stated; different pathways, activated by diverse signals, increase the activity of proteases called caspases that rapidly and irreversibly dismantle the condemned cell by cleaving specific substrates. During this time, the ideas that apoptosis protects us from tumourigenesis and that cancer chemotherapy works by inducing apoptosis also emerged. Currently, apoptosis research is shifting away from the intracellular events within the dying cell to focus on the effect of apoptotic cells on surrounding tissues. This is producing counterintuitive data showing that our understanding of the role of apoptosis in tumourigenesis and cancer therapy is too simple, with some interesting and provocative implications. Here, we will consider evidence supporting the idea that dying cells signal their presence to the surrounding tissue and, in doing so, elicit repair and regeneration that compensates for any loss of function caused by cell death. We will discuss evidence suggesting that cancer cell proliferation may be driven by inappropriate or corrupted tissue-repair programmes that are initiated by signals from apoptotic cells, and show how this may dramatically modify how we view the role of apoptosis in both tumourigenesis and cancer therapy.
In service robotics, hardly any task can be accomplished without involving objects, as in searching, fetching or delivery tasks. Service robots are expected to capture object-related information in real-world scenes efficiently, coping for instance with clutter and noise, while remaining flexible and scalable enough to memorize a large set of objects. Besides object perception tasks such as object recognition, where the object's identity is analyzed, object categorization is an important visual perception cue that assigns unknown object instances, based on e.g. their appearance or shape, to a corresponding category. We present a pipeline from the detection of object candidates in a domestic scene, through their description, to the final shape categorization of the detected candidates. In order to detect object-related information in cluttered domestic environments, an object detection method is proposed that copes with multiple plane and object occurrences, as in cluttered scenes with shelves. Furthermore, a surface reconstruction method based on Growing Neural Gas (GNG), combined with a shape distribution-based descriptor, is proposed to reflect shape characteristics of object candidates. Beneficial properties provided by the GNG, such as smoothing and denoising effects, support a stable description of the object candidates, which also leads to more stable learning of categories. Based on the presented descriptor, a dictionary approach combined with a supervised shape learner is presented to learn prediction models of shape categories.
Experimental results are shown for shapes related to object categories that typically appear in domestic settings, such as cup, can, box, bottle, bowl, plate and ball. A classification accuracy of about 90% and a sequential execution time of less than two seconds for the categorization of an unknown object are achieved, which demonstrates the soundness of the proposed system design. Additional results on object tracking and false-positive handling are shown to enhance the robustness of the categorization. An initial approach towards incremental shape category learning is also proposed, which learns a new category based on the set of previously learned shape categories.
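A shape distribution-based descriptor of the kind used in this pipeline can be sketched as a histogram of random pairwise point distances (the classic D2 shape distribution). The function below is an illustrative sketch only; the pair count, bin count and normalisation are assumptions, not the thesis' actual parameters:

```python
import numpy as np

def d2_shape_distribution(points, n_pairs=10000, n_bins=32, rng=None):
    """Histogram of random pairwise point distances (D2 shape distribution),
    normalised so the descriptor is scale- and sample-size-invariant."""
    points = np.asarray(points, dtype=float)
    rng = np.random.default_rng(rng)
    i = rng.integers(0, len(points), n_pairs)   # random index pairs
    j = rng.integers(0, len(points), n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    # normalise distances by the largest sample, histogram, then normalise mass
    hist, _ = np.histogram(d / d.max(), bins=n_bins, range=(0.0, 1.0))
    return hist / hist.sum()
```

Because the histogram is normalised both by the maximum sampled distance and by its own mass, the descriptor is invariant to rotation, translation and uniform scaling, which is what makes it usable as input to a dictionary-based category learner.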
The ability to detect people has become a crucial subtask, especially for robotic systems intended for use in public or domestic environments. Robots already provide their services, e.g. in real home improvement markets, guiding people to a desired product. In such a scenario, many of the robot's internal tasks would benefit from knowing the number and positions of people in its vicinity. Navigation, for example, could treat them as dynamically moving objects and predict their next motion directions in order to compute a much safer path. Or the robot could specifically approach customers and offer its services. This requires detecting a person, or even a group of people, in a reasonable range in front of the robot. Challenges of such a real-world task include changing lighting conditions, a dynamic environment and varying people's shapes. In this thesis, a 3D people detection approach based on point cloud data provided by the Microsoft Kinect is implemented and integrated on a mobile service robot. A top-down/bottom-up segmentation is applied to increase the system's flexibility and to provide the capability to detect people even if they are partially occluded. A feature set is proposed to detect people in various pose configurations and motions using a machine learning technique. The system can detect people up to a distance of 5 meters. The experimental evaluation compared different machine learning techniques and showed that standing people can be detected with a rate of 87.29% and sitting people with 74.94% using a Random Forest classifier. Certain objects caused several false detections; to eliminate those, a verification step is proposed that further evaluates the person's shape in 2D space. The detection component has been implemented both as a sequential application (frame rate of 10 Hz) and as a parallel one (frame rate of 16 Hz).
Finally, the component has been embedded into a complete people-search task that explores the environment, finds all people and approaches each detected person.
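A minimal sketch of the classification step, assuming simple bounding-box and moment features of a candidate point-cloud cluster (the thesis' actual feature set is richer and is not reproduced here), could look like this with scikit-learn's Random Forest:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def cluster_features(cloud):
    """Illustrative geometric features of a candidate cluster (N x 3 points):
    bounding-box extent, centroid and per-axis spread."""
    cloud = np.asarray(cloud, dtype=float)
    extent = cloud.max(axis=0) - cloud.min(axis=0)  # width, depth, height
    return np.concatenate([extent, cloud.mean(axis=0), cloud.std(axis=0)])

# Train on labelled clusters (1 = person, 0 = other object), then predict
# for each segmented cluster in a new frame.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
```

The verification step against false positives mentioned above would then re-examine only the clusters the forest labels as person, e.g. by checking the 2D silhouette.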
This paper compares the memory allocation of two Java virtual machines, namely the Oracle Java HotSpot VM 32-bit (OJVM) and the Jamaica JamaicaVM (JJVM). The basic architectural difference between the two machines is that the JJVM uses fixed-size blocks for allocating objects on the heap. This means that objects have to be split into several connected blocks if they are bigger than the specified block size, while for small objects a full block must still be allocated. The paper contains both a theoretical and an experimental analysis of the memory overhead. The theoretical analysis is based on the specifications of the two virtual machines; the experimental analysis is done with a modified JVMTI agent together with the SPECjvm2008 benchmark.
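The theoretical memory overhead of a fixed-size block allocator of the JJVM kind follows directly from rounding each object up to whole blocks. A small sketch (the block size and per-block header are illustrative parameters, not values taken from the paper):

```python
import math

def blocks_needed(obj_size, block_size, header=0):
    """Blocks a fixed-size-block allocator needs for an object of
    obj_size bytes, assuming header bytes of metadata per block."""
    payload = block_size - header
    return math.ceil(obj_size / payload)

def overhead(obj_size, block_size, header=0):
    """Bytes allocated beyond the object's own size."""
    return blocks_needed(obj_size, block_size, header) * block_size - obj_size
```

For a 100-byte object and 32-byte blocks this yields 4 blocks and 28 bytes of overhead; the relative overhead is worst for objects just above a multiple of the block size, and for objects much smaller than one block.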
The objective of this thesis is to implement a computer-game-based motivation system for maximal strength testing on the Biodex System 3 isokinetic dynamometer. The prototype game has been designed to improve the peak torque produced in an isometric knee extensor strength test. An extensive analysis is performed on a torque data set from a previous study: the torque responses for five-second maximal voluntary contractions of the knee extensor are analyzed to understand the torque response characteristics of different subjects. The parameters identified in the data analysis are used in the implementation of the 'Shark and School of Fish' game. The behavior of the game for different torque responses is analyzed on a separate torque data set from the previous study. The evaluation shows that the game rewards and motivates continuously over a repetition to reach the peak torque value, and that it rewards the user more if he overcomes a baseline torque value within the first second and then gradually increases the torque to reach peak torque.
This article deals with the accessibility of business process modelling tools (BPMo tools) and business process modelling languages (BPMo languages). First, the reader is introduced to business process management and the authors' motivation behind this inquiry. Afterwards, the paper reflects on problems that arise when applying inaccessible BPMo tools. To illustrate these problems, the authors distinguish between two different categories of issues and provide practical examples. Finally, the article presents three approaches to improving the accessibility of BPMo tools and BPMo languages.
In a research project funded by the German Research Foundation, meteorologists, data publication experts, and computer scientists optimised the publication process for meteorological data and developed software that supports metadata review. The project group placed particular emphasis on scientific and technical quality assurance of primary data and metadata. At the end of the process, the software automatically registers a Digital Object Identifier at DataCite. The software has been successfully integrated into the infrastructure of the World Data Center for Climate, but a key goal was to make the results applicable to data publication processes in other sciences as well.
Auf dem Weg zur Promotion: Zur Benachteiligung von Fachhochschul-Absolventinnen und -Absolventen
(2012)
With their essay "Auf dem Weg zur Promotion: Strukturelle Benachteiligung von Fachhochschul-Absolventinnen und -Absolventen", the team of Sascha Czornohus, Katrin Dobersalske, Fabian Heuel and Nina Petrow tackles a politically charged and therefore sensitive topic that higher-education policy now debates openly. Ever since research indisputably became one of the missions of the universities of applied sciences, a doctorate became a prerequisite for professorial appointments at them, and all professors were trained at universities, the demand for doctoral access for master's graduates of universities of applied sciences has grown ever louder. Partial solutions have been found, but the debate is widening and now includes (against the background of profile-building and specialisation, with very uneven development of disciplines) the question of whether the right to award doctorates should be granted to entire institutions at all, or should differ from department to department, even at universities. The conviction that the distribution of the right to award doctorates in Germany should be reviewed is spreading. The HSW is interested in a broader discussion of this topic.
AV-Medientechnik
(2012)
Producing films, TV features or music videos professionally requires a firm command of production technology. Based on the relevant technical and physiological fundamentals, this book conveys the essential technical content of film and television production in a practice-oriented way. (publisher's description)
Sustainability performance is usually presented in separately published sustainability reports, to which the DAX-30 companies refer in their annual reports. However, the stakeholder group of analysts and investors, which is particularly important for capital-market-oriented companies, increasingly demands an integrated presentation of all dimensions of the triple bottom line in the management report as well. The statutory disclosure requirements under § 289 (3) and § 315 (1) sentence 4 HGB and DRS 15.32 put additional pressure on the DAX-30 companies. At its core, this thesis addresses the current tension between economy, ecology and corporate social engagement. Based on a comprehensive theoretical analysis, concrete key figures for capturing sustainability indicators are derived and their manifestation at the DAX-30 companies is examined.
In this thesis, a method was developed for analysing molecules on the basis of their molecular surface and local values of physico-chemical and topographical properties. The fuzzy controller developed as the core component of the analysis combines molecular properties and selects the surface features relevant for interactions. The results of the fuzzy controller are used to compute 3D descriptors and to visualise the identified domains on the surface. Two kinds of descriptors are computed: descriptors representing the surface areas of the domains and their membership in the specified binding features, and descriptors describing the spatial arrangement of the domains relative to one another. The surface processed by the fuzzy controller is provided in VRML format for visualisation and further processing. The computed descriptors are used for similarity analysis of ligands and for searching for complementary regions at the binding site of a receptor. MTX in protonated form and DHF, which bind to the enzyme DHF reductase, and the inhibitors sildenafil, tadalafil and vardenafil of the enzyme PDE-5A were analysed under similarity aspects. When determining complementary binding features, the binding features of a ligand serve as the starting point for a search for complementary regions in the binding pocket of the receptor. As application examples, the binding site of the enzyme DHF reductase from its complexes with MTX and DHF and that of the enzyme PDE-5A from its complexes with sildenafil, vardenafil and tadalafil are considered.
Overall, the application examples have shown that the presented fuzzy controller identifies binding features on the molecular surface, and that the rotation- and translation-invariant descriptors based on them can be applied to similarity analysis and to the search for complementary regions.
Big Data
(2012)
A long-known bond graph representation of switching devices is a modulated transformer with a modulus b(t)∈{0,1} ∀t≥0 in conjunction with a resistor R:Ron accounting for the ON-resistance of a switch considered non-ideal. Besides other representations, this simple model has been used in bond graphs for simulating the dynamic behaviour of hybrid systems. A previous article by the author proposed using the transformer–resistor pair in bond graphs for fault diagnosis in hybrid systems. Advantages are a unique bond graph for all system modes, the application of the unmodified standard Sequential Causality Assignment Procedure, fixed computational causalities and the derivation of analytical redundancy relations incorporating 'Boolean' transformer moduli so that they hold for all system modes. Switches temporarily connect and disconnect model parts. As a result, some independent storage elements may temporarily become dependent, so that the number of state variables is not time-invariant. This article addresses this problem in the context of modelling and simulation of fault scenarios in hybrid systems. In order to keep time-invariant preferred integral causality at storage ports, residual sinks previously introduced by the author are used. When two storage elements become dependent at a switching time instant ts, a residual sink is activated. It enforces that the outputs of the two dependent storage elements become immediately equal by imposing the conjugate power variable of appropriate value on their inputs. The approach is illustrated by the bond graph modelling and simulation of some fault scenarios in a standard three-phase switched power inverter supplying power into an RL-load in a delta configuration. A well-developed approach to model-based fault detection and isolation is to evaluate the residuals of analytical redundancy relations.
In this article, analytical redundancy relation residuals have been computed numerically by coupling a bond graph of the faulty system to one of the non-faulty system by means of residual sinks. The presented approach is not confined to power electronic systems but can be used for hybrid systems in other domains as well. In further work, the RL-load may be replaced by a bond graph model of an alternating current motor in order to study the effect of switch failures in the power inverter on the dynamic behaviour of the motor.
For the case when the abstraction of instantaneous state transitions is adopted, this paper proposes to start fault detection and isolation in an engineering system from a single time-invariant causality bond graph representation of a hybrid model. To that end, the paper picks up on a long-known proposal to model switching devices by a transformer modulated by a Boolean variable and a resistor in fixed conductance causality accounting for its ON resistance. Bond graph representations of hybrid system models developed in this way have been used so far mainly for the purpose of simulation. The paper shows that they can well constitute an approach to the bond-graph-based quantitative fault detection and isolation of hybrid system models. Advantages are that the standard sequential causality assignment procedure can be used without modification. A single set of analytical redundancy relations valid for all physically feasible system modes can be (automatically) derived from the bond graph. Stiff model equations due to small values of the ON resistance in the switch model may be avoided by symbolic reformulation of equations and letting the ON resistance of some switches tend to zero, turning them into ideal switches.
First, for two examples considered in the literature, it is shown that the approach proposed in this paper can produce the same analytical redundancy relations as were obtained from a hybrid bond graph with controlled junctions and the use of a sequential causality assignment procedure especially developed for fault detection and isolation purposes. Moreover, the usefulness of the proposed approach is illustrated in two case studies by its application to standard switching circuits extensively used in power electronic systems and by simulation of some fault scenarios. The approach, however, is not confined to the fault detection and isolation of such systems. Analytically validated simulation results obtained by means of the program Scilab give confidence in the approach.
BWL-Formeln für Dummies
(2012)
In the realm of service robots, recovery from faults is indispensable to foster user acceptance. Here, fault is to be understood not in the sense of robot-internal faults, but rather as interaction faults while situated in and interacting with an environment (aka external faults). We reason along the most frequent failures in typical scenarios which we observed during real-world demonstrations and competitions using our Care-O-bot III robot. They take place in an apartment-like environment, which constitutes a known, closed world. We suggest four different, for now ad hoc, fault categories caused by disturbances, imperfect perception, inadequate planning or chaining of action sequences. The faults are categorized and then mapped to a handful of partly known, partly extended fault handling techniques. Among them, we applied qualitative reasoning, use of simulation as an oracle, learning for planning (aka enhancement of plan operators) or, in future, case-based reasoning. Having laid out this frame, we mainly ask open questions related to the applicability of the presented approach, amongst them: how to find new categories, how to extend them, how to assure disjointness, and how to identify old and label new faults on the fly.
Traffic simulations are typically concerned with modeling human behavior as closely as possible to create realistic results. In conventional traffic simulations used for road planning or traffic jam prediction only the overall behavior of an entire system is of interest. In virtual environments, like digital games, simulated traffic participants are merely a backdrop to the player’s experience and only need to be “sufficiently realistic”. Additionally, restricted computational resources, typical for virtual environment applications, usually limit the complexity of simulated behavior in this field. More importantly, two integral aspects of real-world traffic are not considered in current traffic simulations from both fields: misbehavior and risk taking of traffic participants. However, for certain applications like the FIVIS bicycle simulator, these aspects are essential.
Traffic simulations for virtual environments are concerned with the behavior of individual traffic participants. The behavior in these simulations is often kept rather simple to abide by the constraints of limited processing resources. Sophisticated traffic simulations also model the behavior of individual traffic participants, but the focus lies on the overall behavior of the entire system, e.g. to identify possible bottlenecks of traffic flow [8].
Using virtual environment systems for road safety education requires a realistic simulation of road traffic. Current traffic simulations are either too restricted in their complexity of agent behavior or focus on aspects not important in virtual environments. More importantly, none of them are concerned with modeling misbehavior of traffic participants, which is part of everyday traffic and should therefore not be neglected in this context. We present a concept for a traffic simulation that addresses the need for more realistic agent behavior with regard to road safety education. The two major components of this concept are a simulation of persistent agents which minimizes computational overhead, and a model of cognitive processes of human drivers combined with psychological personality profiles to allow for individual behavior and misbehavior.
Before software can be completed and handed over to the end customer, it must pass through the development process. Moving through this process quickly is of crucial importance, especially for the end customer, since it reduces the time spent waiting for the software product. A modular approach, for example, could become problematic if all individual components of a software product were developed first and then joined together in a subsequent phase, the so-called integration phase. The length of this integration phase is hard to predict, so neither the development team nor the end customer knows how long completion of the product will take. This creates a further disadvantage: since the components are developed separately from one another, they may turn out to be incompatible when finally joined together and would then have to be adapted. The consequence would be a waste of personnel and thus also financial resources on the part of the developing company.
Controlling 2020
(2012)
Owing to the growing dynamics of the corporate environment, general uncertainty about the future directions of companies in Germany is steadily increasing. Against this background, and in order to deal proactively with future challenges, controllers should have a particular interest in learning early which developments will affect controlling practice in the future. Despite rising demand, neither academic nor practice-oriented controlling research currently fulfils its forecasting function to a sufficient degree. As a scholarly contribution towards a stronger future orientation in controlling research, this thesis uses a qualitative meta-analysis of future scenarios of the corporate environment to formulate theses about the future development of controlling in Germany up to 2020 and to derive implications for controlling practice.
This article concerns the design and development of information and communication technology, in particular computer systems, with regard to the demographic transition that will influence user capabilities. It is questionable whether currently applied computer systems are able to meet the requirements of altered user groups with diversified capabilities. Such an enquiry is necessary given current forecasts suggesting that the average age of employees in enterprises will increase significantly within the next 50-60 years, while the percentage of computer-aided business tasks operated by human individuals rises from year to year. This development will have specific consequences for enterprises regarding the design and application of computer systems. If computer systems are not adapted to altered user requirements, efficient and productive utilisation could be negatively affected. These consequences motivate extending traditional design methodologies to ensure the application of computer systems that are usable independent of user capabilities.
Approximate clone detection is the process of identifying similar process fragments in business process model collections. The tool presented in this paper can efficiently cluster approximate clones in large process model repositories. Once a repository is clustered, users can filter and browse the clusters using different filtering parameters. Our tool can also visualize clusters in the 2D space, allowing a better understanding of clusters and their member fragments. This demonstration will be useful for researchers and practitioners working on large process model repositories, where process standardization is a critical task for increasing the consistency and reducing the complexity of the repository.
This project investigated the viability of using the Microsoft Kinect to obtain reliable Red-Green-Blue-Depth (RGBD) information. It explored the usability of the Kinect in a variety of environments as well as its ability to detect different classes of materials and objects. This was facilitated through the implementation of Random Sample Consensus (RANSAC) based algorithms and highly parallelized workflows in order to provide time-sensitive results. We found that the Kinect provides detailed and reliable information in a time-sensitive manner. Furthermore, the project results recommend usability and operational parameters for the use of the Kinect as a scientific research tool.
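The core of a RANSAC-based plane extraction, as commonly applied to Kinect point clouds to segment dominant surfaces, can be sketched as follows (the iteration count and inlier tolerance are illustrative choices, not the project's parameters):

```python
import numpy as np

def ransac_plane(points, n_iters=200, tol=0.02, rng=None):
    """Fit a plane to a 3D point cloud with RANSAC: repeatedly fit a
    plane through 3 random points and keep the model with most inliers."""
    points = np.asarray(points, dtype=float)
    rng = np.random.default_rng(rng)
    best_inliers, best_model = None, None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)       # plane normal from the sample
        norm = np.linalg.norm(n)
        if norm < 1e-12:
            continue                          # degenerate (collinear) sample
        n /= norm
        d = -n.dot(p0)
        dist = np.abs(points @ n + d)         # point-to-plane distances
        inliers = dist < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model, best_inliers
```

Each iteration is independent, which is what makes the approach amenable to the highly parallelized workflows mentioned above.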
Development and Validation of a Rapid and Reliable Method for TPMT Genotyping using real-time PCR
(2012)
Die Macht der Sympathie
(2012)
Die sozialen Herausforderungen der Zukunft und die gesellschaftspolitische Rolle von Unternehmen
(2012)
Distributed computing environments allow collaborative problem solving across teams and organisations. A fundamental precondition for collaboration is the ability to find available participants and to exchange information. One way to approach this problem is through central directories or registry services. A major disadvantage of centralized components is that they limit the flexibility to form ad hoc networks targeted at solving a specific problem. To facilitate flexible and dynamic collaborations, ideas from decentralized and self-organising networks can be combined with concepts of service-oriented computing. This project aims to investigate potential solutions for dynamic discovery of network participants and outlines how to manage challenges associated with the development of a discovery protocol for distributed systems. During the course of this project, a prototypical implementation was created that integrates into the open-source distributed, collaborative problem solving environment RCE [9]. It is currently developed at the German Aerospace Center (DLR), but there are plans to make the framework available to a broader community.
This economic study, prepared by the Forschungsinstitut für Glücksspiel und Wetten at the BusinessCampus of Hochschule Bonn-Rhein-Sieg on behalf of the German amusement machine industry, critically and methodically examines the existing assumptions about the social follow-up costs of gambling addiction with respect to commercial gaming machines in restaurants and arcades. It arrives at new, well-founded assessments, particularly in comparison with the risks posed by casinos and lotteries. In a cost-benefit balance, the benefits of commercial cash gaming outweigh the costs several times over.
This thesis developed a control framework for the LAMA library (http://www.libama.org) for configuring solvers of linear systems of equations. To this end, a parser was implemented with the Boost.Spirit library that allows runtime interpretation of a domain-specific language (DSL). The configuration language makes it possible to link solvers without restrictions via their IDs and to assign loggers and logically combined stopping criteria to these solvers.
Despite positive business development, the Volksbanken and Raiffeisenbanken face increasing competitive pressure, which the institutions intend to counter in part by expanding their lending business. For this reason, they must move towards active sales management, including in lending. However, the credit risk inherent in the lending business, measured both by its absolute size and by its effect on results, already constitutes a central risk for cooperative credit institutions, one that must be made manageable through consistent reporting, not least for regulatory reasons. The aim of this thesis is to develop a corresponding credit reporting system for Raiffeisenbank Rheinbach Voreifel eG that could also be applied at other cooperative institutions. It is intended to ensure efficient management of the lending business based on its management perspectives of risk, return and process. At the same time, an effort is made to keep the system user-friendly through a manageable number of key figures.
Sparse matrix-vector multiplication (SpMV) is one of the core operations of high-performance computing for a wide range of scientific applications. For distributed computation on the increasingly popular hybrid compute clusters, the question arises of a suitable partitioning strategy for distributing data and computation. This thesis investigates how the structure of the matrix and the different processor types influence SpMV performance, and proposes a model for achieving a load-balanced distribution. Its essential components are a runtime prediction for current CPUs and GPUs based on a modified roofline model, together with the well-established method of graph partitioning.
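A roofline-style runtime prediction treats SpMV as bounded either by compute (two flops per non-zero) or by memory traffic. A minimal sketch, with illustrative byte counts for a CSR-like format rather than the calibrated model of the thesis:

```python
def spmv_time_estimate(nnz, n_rows, peak_flops, mem_bw,
                       bytes_per_nnz=12, bytes_per_row=8):
    """Roofline-style lower bound on SpMV runtime in seconds:
    the kernel takes at least as long as its compute time
    (2 flops per non-zero) and its memory-transfer time
    (value + column index per non-zero, row pointer + vector traffic
    per row; byte counts are illustrative assumptions)."""
    flops = 2.0 * nnz
    bytes_moved = nnz * bytes_per_nnz + n_rows * bytes_per_row
    return max(flops / peak_flops, bytes_moved / mem_bw)
```

On realistic hardware the memory term dominates, which is why matrix structure (non-zeros per row, index overhead) matters more for load balancing across CPUs and GPUs than raw flop counts.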
The biological effects of bilirubin, still poorly understood, are concentration-dependent ranging from cell protection to toxicity. Here we present data that at high nontoxic physiological concentrations, bilirubin inhibits growth of proliferating human coronary artery smooth muscle cells by three events. It impairs the activation of Raf/ERK/MAPK pathway and the cellular Raf and cyclin D1 content that results in retinoblastoma protein hypophosphorylation on amino acids S608 and S780. These events impede the release of YY1 to the nuclei and its availability to regulate the expression of genes and to support cellular proliferation. Moreover, altered calcium influx and calpain II protease activation leads to proteolytical degradation of transcription factor YY1. We conclude that in the serum-stimulated human vascular smooth muscle primary cell cultures, bilirubin favors growth arrest, and we propose that this activity is regulated by its interaction with the Raf/ERK/MAPK pathway, effect on cyclin D1 and Raf content, altered retinoblastoma protein profile of hypophosphorylation, calcium influx, and YY1 proteolysis. We propose that these activities together culminate in diminished 5 S and 45 S ribosomal RNA synthesis and cell growth arrest. The observations provide important mechanistic insight into the molecular mechanisms underlying the transition of human vascular smooth muscle cells from proliferative to contractile phenotype and the role of bilirubin in this transition.
The debate about the human faculty of cognition, that is, the question of how humans acquire knowledge and insight, is not new; it has been posed for as long as philosophical questions have been asked, yet no definitive answer has been found over the centuries.
We present our approach to extending a Virtual Reality software framework towards use for Augmented Reality applications. Although VR and AR applications have very similar requirements in terms of abstract components (such as 6DOF input, stereoscopic output, and simulation engines), the requirements in terms of hardware and software vary considerably. In this article we share the experience gained from adapting our VR software framework for AR applications and address the design issues involved. The result is a basic VR/AR software layer that allows us to implement interactive applications without fixing their type (VR or AR) beforehand: switching from VR to AR is a matter of changing the application's configuration file. We also give an example of the use of the extended framework: augmenting the magnetic field of bar magnets in physics classes. We describe the setup of the system and the real-time calculation of the magnetic field using a GPU.
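The article's GPU implementation is not reproduced here; as a hedged illustration of the underlying physics only, one common way to visualize a bar magnet's field in real time is to evaluate the point-dipole approximation at every grid cell. The NumPy sketch below shows that formula, B(r) = μ0/(4π) · (3(m·r̂)r̂ − m)/|r|³; the function name and the use of a single dipole are assumptions, not the authors' method.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def dipole_field(points, moment):
    """Magnetic flux density of a point dipole at the origin.

    points -- (N, 3) array of evaluation positions (m)
    moment -- (3,) dipole moment vector (A*m^2)
    """
    r = np.linalg.norm(points, axis=1, keepdims=True)
    r_hat = points / r
    m_dot_r = (r_hat @ moment)[:, None]        # m . r_hat per point
    return MU0 / (4 * np.pi) * (3 * m_dot_r * r_hat - moment) / r**3
```

Because each grid point is independent, this evaluation parallelizes trivially, which is what makes a GPU implementation attractive for real-time augmentation.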
Gas chromatography with simultaneous flame-ionization detection (FID) and nitrogen–phosphorus detection (NPD), as well as gas chromatography–mass spectrometry (GC/MS), has been used to characterize some long-chain primary alkyl amines and alkyl diamines after derivatization with trifluoroacetic anhydride (TFAA).
Recruiting guest physicians from abroad is often considered as a way to meet hospitals' short- and medium-term demand for doctors. However, the demanding requirements of state examination offices, employment agencies, immigration authorities, and embassies prove to be hurdles that are difficult for both physicians and hospitals to overcome. In particular, language difficulties as well as insufficient professional and cultural knowledge on the part of the international physicians can hardly be compensated for without professional support.
Gute Reha ist mehr als reine Krankenbehandlung - Zur Reha-Qualitätssicherung im Sinne der Patienten
(2012)
The criteria for assessing the quality of rubber materials are the polymer or copolymer composition and the additives. These additives include plasticizers, extender oils, carbon black, inorganic fillers, antioxidants, heat and light stabilizers, processing aids, cross-linking agents, accelerators, retarders, adhesives, pigments, smoke and flame retardants, and others. Determination of additives in polymers or copolymers generally requires the extraction of these substances from the matrix as a first step, which can be challenging, and the subsequent analysis of the extracted additives by gas chromatography (GC), GC–mass spectrometry (MS), high performance liquid chromatography (HPLC), HPLC–MS, capillary electrophoresis, thin-layer chromatography, and other analytical techniques. In the present work, nitrile rubber materials were studied using direct analytical flash pyrolysis hyphenated to GC and electrospray ionization MS in both scan and selected ion monitoring modes to demonstrate that this technique is a good tool to identify the organic additives in nitrile rubber.
A robot (e.g. a mobile manipulator) that interacts with its environment to perform its tasks often faces situations in which it is unable to achieve its goals despite the perfect functioning of its sensors and actuators. These situations occur when the behavior of the object(s) manipulated by the robot deviates from its expected course because of unforeseeable circumstances. These deviations are experienced by the robot as unknown external faults. In this work we present an approach that increases the reliability of mobile manipulators against unknown external faults. This approach focuses on manipulator actions that involve releasing an object. The proposed approach, which is triggered after detection of a fault, is formulated as a three-step scheme that takes a definition of a planning operator and an example simulation as its inputs. The planning operator corresponds to the action that fails because of the fault occurrence, whereas the example simulation shows the desired/expected behavior of the objects for the same action. In its first step, the scheme finds a description of the expected behavior of the objects in terms of logical atoms (i.e. a description vocabulary). The description of the simulation is used by the second step to find limits of the parameters of the manipulated object. These parameters are the variables that define the releasing state of the object.
Using randomly chosen values of the parameters within these limits, this step creates different examples of the releasing state of the object. Each of these examples is labelled as desired or undesired according to the behavior exhibited by the object (in the simulation) when it is released in the state corresponding to the example. The description vocabulary is also used to label the examples autonomously. In the third step, an algorithm (N-Bins) uses the labelled examples to suggest a state for the object in which releasing it avoids the occurrence of unknown external faults.
The proposed N-Bins algorithm can also be used for binary classification problems. Therefore, in our experiments we also test its prediction ability along with analyzing the results of our approach. The results show that, under the circumstances peculiar to our approach, the N-Bins algorithm achieves reasonable prediction accuracy where other state-of-the-art classification algorithms fail to do so. Thus, N-Bins also extends the ability of a robot to predict the behavior of the object in order to avoid unknown external faults. In this work we use the OpenRAVE simulation environment, which uses the ODE physics engine to simulate rigid-body dynamics.
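The abstract describes N-Bins only at a high level, so the following is a hypothetical sketch of a binning-style binary classifier in that spirit, not the authors' algorithm: each parameter's range is divided into a fixed number of bins, each bin is labelled by the majority class of the training examples that fall into it, and a query is classified by voting across its per-parameter bins. All names and the majority-vote rule are assumptions.

```python
# Hypothetical binning-based binary classifier (labels are 0/1).

def fit_bins(samples, labels, n_bins=10):
    """Build per-feature bin labels from labelled feature vectors."""
    n_features = len(samples[0])
    model = []
    for j in range(n_features):
        col = [s[j] for s in samples]
        lo, hi = min(col), max(col)
        width = (hi - lo) / n_bins or 1.0   # avoid zero-width bins
        counts = [[0, 0] for _ in range(n_bins)]
        for x, y in zip(col, labels):
            i = min(int((x - lo) / width), n_bins - 1)
            counts[i][y] += 1
        # Label each bin with the majority class of its training examples.
        model.append((lo, width, [int(c[1] >= c[0]) for c in counts]))
    return model

def predict(model, sample):
    """Classify by majority vote of the per-feature bin labels."""
    votes = 0
    for (lo, width, bin_labels), x in zip(model, sample):
        i = min(max(int((x - lo) / width), 0), len(bin_labels) - 1)
        votes += bin_labels[i]
    return int(votes * 2 >= len(model))
```

In the releasing-state setting described above, the samples would be the randomly generated releasing states and the labels the desired/undesired outcomes observed in simulation.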
Interactive Distributed Rendering of 3D Scenes on Multiple Xbox 360 Systems and Personal Computers
(2012)
IT-Controlling
(2012)
IT-Radar für BPM und ERP
(2012)
The IT radar for BPM and ERP provides an instrument to support the active steering and validation of IT strategy. The first results show that classic tasks of BPM and ERP management, such as process integration, remain highly topical, and that new topics such as the business use of private devices (BYOD, Bring Your Own Device), the processing of very large data volumes (big data), and real-time processing (in-memory computing) are pushing intensively onto the agenda of the chief information officer (CIO), but are not displacing the classic tasks.