Mergers and acquisitions take place all over the world and in many industries, typically motivated by corporate politics. While IT management is often not involved in the decision-making, it has to solve a wide range of problems in the post-merger phase. Merging two or more companies implies not only merging their core businesses, but also creating a single, efficiently integrated IT organisation from the individual ones, since keeping the existing IT organisations in place usually does not make sense. In addition, corporate management frequently imposes constraints on the IT infrastructure, e.g., cost reductions. The principal critical success factor when merging IT organisations is uninterrupted operation of the IT business, because a service gap is acceptable neither to in-house functional departments nor to external customers. Therefore, the IT rebuilding phase has to focus on the IT services that facilitate the processes of functional departments, support processes, and the processes of customers and suppliers, so that any transformation work is transparent to internal and external customers. In this article we describe a real-world but anonymised case study. Our goals are to highlight the points that matter when merging IT organisations and to help decision-makers, particularly in the areas of IT organisation and IT personnel. We focus on the organisational and non-technical issues that arise, from a management perspective, i.e., the CIO's view, and provide checklists intended to help IT managers address the most pressing issues. To help CIOs survive the post-merger phase, we give checklists for merging IT organisations, checklists for merging IT human resources, checklists for IT budgets and reporting, and an assessment of activities in a merger scenario. IT hardware, software and infrastructure, as well as running IT projects, are not considered in this paper.
Chief executive officers often associate IT performance measurement with IT cost cutting, although IT performance measurement protects business processes from rising IT costs, whereas IT cost cutting alone only endangers the company's efficiency. This view stigmatises those who practise IT performance measurement in companies as bean-counters. The present paper describes an integrated reference model for IT performance measurement based on a life-cycle model and a performance-oriented framework. The model was created from a practical point of view; it is lean compared with other known concepts and is well suited to small and medium-sized enterprises (SMEs).
BWL für Dummies
(2009)
The starting point of our considerations is the observation that the legitimation of modern forms of knowledge goes hand in hand with the loss of legitimising meta-narratives. This observation applies not only to the classical humanities and social sciences in general, but also concretely to applied management and organisation research. Traditionally, these subordinate genres of discourse are legitimised by the overarching discourse of the Enlightenment and submit to the dictate of modernist rationality (Ant 2004).
Since 1989, the IT-Controlling special interest group of the Business Informatics division of the Gesellschaft für Informatik e. V. has brought together executives from information and IT management and from IT controlling, management and IT consultants, and researchers to discuss methods, applications and challenges of IT controlling. In the German-speaking world, the group is the central professional body for the controlling of corporate information processing (currently most commonly referred to as IT controlling or IV controlling; largely synonymous terms include Informatik-Controlling, information-system controlling and information controlling).
Kinetic Inductance Detectors with Integrated Antennas for Ground and Space-Based Sub-mm Astronomy
(2009)
Very large arrays of Microwave Kinetic Inductance Detectors (MKIDs) have the potential to revolutionize ground- and space-based astronomy. They can offer in excess of 10,000 pixels with large dynamic range and very high sensitivity, in combination with very efficient frequency-division multiplexing at GHz frequencies. In this paper we present the development of a 400-pixel MKID demonstration array, including optical coupling, sensitivity measurements, beam-pattern measurements and readout. The design presented can be scaled to any frequency between 80 GHz and >5 THz because there is no need for superconducting structures that become lossy above the gap frequency of the materials used; such structures would limit the frequency coverage to below 1 THz even for relatively high-gap materials such as NbTiN. An individual pixel of the array consists of a distributed aluminium CPW MKID with an integrated twin-slot antenna at its end. The antenna is placed in the second focus of an elliptical high-purity Si lens. The lens-antenna coupling design allows room for the MKID resonator outside the focal point of the lens. The best dark noise equivalent power of these devices is measured to be NEP = 7×10⁻¹⁹ W/√Hz, and the optical coupling efficiency is around 30%, with no antireflection coating on the Si lens. For the readout we use a commercial arbitrary waveform generator and a 1.5 GHz FFTS. We show that with this concept it is possible to read out in excess of 400 pixels with one board and one pair of coaxial cables.
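The frequency-division multiplexing idea described in the abstract — probing many resonators through one cable pair with a comb of tones generated by an AWG and separated again in the spectrum — can be sketched in miniature. The tone count, spacing and sample rate below are illustrative assumptions, not the values used by the authors, and the frequencies are chosen to fall exactly on FFT bins so the toy spectrum is clean.

```python
import numpy as np

def tone_comb(n_tones=16, f0=10e6, spacing=1e6, fs=100e6, n_samples=10000):
    """Sum of probe tones, one per resonator, as an AWG would generate.
    All tones land exactly on FFT bins (f0 and spacing are integer
    multiples of fs/n_samples), so there is no spectral leakage."""
    t = np.arange(n_samples) / fs
    freqs = f0 + spacing * np.arange(n_tones)
    waveform = np.sum([np.cos(2 * np.pi * f * t) for f in freqs], axis=0)
    return freqs, waveform

def detected_tones(waveform, fs, threshold=0.25):
    """Recover the tone frequencies from the magnitude spectrum,
    loosely analogous to what an FFT spectrometer (FFTS) does."""
    spectrum = np.abs(np.fft.rfft(waveform)) / len(waveform)
    peaks = np.flatnonzero(spectrum > threshold * spectrum.max())
    return peaks * fs / len(waveform)
```

All 16 tones share one waveform (one "cable"), yet each remains individually identifiable in the spectrum — the essence of reading out hundreds of pixels over a single coaxial pair.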
Timely recognition of threats can be significantly supported by security assistance systems that operate continuously and alert the security personnel in case of anomalous events in the surveillance area. We describe the concept and realization of an indoor security assistance system for real-time decision support. The system consists of a computer vision module and a person classification module. The computer vision module provides video event analysis of the entrance region in front of the demonstrator. After entering the control corridor, persons are tracked and classified, and potential threats are localized inside the demonstrator. Data for the person classification are provided by chemical sensors detecting hazardous materials. Due to their limited spatio-temporal resolution, a single chemical sensor can neither localize such material nor associate it with a person. We compensate for this deficiency by fusing the output of multiple, distributed chemical sensors with kinematic data from laser-range scanners. Considering both the computer vision information and the results of the person classification enables the localization of threats and a timely reaction by the security personnel.
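The fusion step — associating alarms from coarse, slow chemical sensors with finely resolved person tracks — can be illustrated by a simple proximity-scoring scheme. The sensor layout, track format and scoring rule below are hypothetical; the abstract does not specify the actual fusion algorithm.

```python
# Hypothetical sketch: score each tracked person by accumulated proximity
# to chemical sensors at the times those sensors raised an alarm.
def associate_alarms(tracks, sensors, alarms, radius=2.0):
    """tracks: {person_id: {t: (x, y)}}; sensors: {sensor_id: (x, y)};
    alarms: list of (t, sensor_id). Returns a score per person."""
    scores = {pid: 0.0 for pid in tracks}
    for t, sid in alarms:
        sx, sy = sensors[sid]
        for pid, track in tracks.items():
            if t in track:
                px, py = track[t]
                d = ((px - sx) ** 2 + (py - sy) ** 2) ** 0.5
                if d <= radius:
                    scores[pid] += 1.0 - d / radius  # closer -> higher score
    return scores

def likely_carrier(scores):
    """Person with the highest accumulated score, or None if all are zero."""
    pid = max(scores, key=scores.get)
    return pid if scores[pid] > 0 else None
```

A single alarm is ambiguous, but accumulating evidence over several alarms and sensor positions singles out the person whose track consistently coincides with the detections.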
Objektrelationale Datenbanken und Rough Sets für die Analyse von Contextualized Attention Metadata
(2009)
We present an interactive system that uses ray tracing as its rendering technique. The system consists of a modular Virtual Reality framework and a cluster-based ray-tracing rendering extension running on a number of Cell Broadband Engine-based servers. The VR framework allows rendering plugins to be loaded at runtime. With this combination it is possible to interactively simulate effects from geometric optics, such as correct reflections and refractions.
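The "correct reflections and refractions" from geometric optics that such a ray tracer reproduces come down to two vector formulas evaluated per ray-surface hit. A minimal sketch (not the framework's actual code) of both, including the total-internal-reflection case:

```python
import math

def reflect(d, n):
    """Reflect unit direction d about unit surface normal n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

def refract(d, n, eta):
    """Refract unit direction d at a surface with unit normal n,
    where eta = n1/n2 is the ratio of refractive indices
    (vector form of Snell's law). Falls back to reflection
    when total internal reflection occurs."""
    cos_i = -sum(di * ni for di, ni in zip(d, n))
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return reflect(d, n)  # total internal reflection
    return tuple(eta * di + (eta * cos_i - math.sqrt(k)) * ni
                 for di, ni in zip(d, n))
```

A recursive ray tracer spawns a reflected and/or refracted secondary ray at each hit using exactly these directions, which is what makes mirror and glass surfaces come out correct.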
This report presents the implementation and evaluation of a computer vision task on a Field Programmable Gate Array (FPGA). As an experimental approach to an application-specific image-processing problem, it provides reliable results for measuring the performance and precision gained compared with similar solutions on General Purpose Processor (GPP) architectures.
The project addresses the problem of detecting Binary Large OBjects (BLOBs) in a continuous video stream. A number of different solutions exist for this problem, but most of them are realized on GPP platforms, where resolution and processing speed define the performance barrier. With their opportunities for parallelization and hardware-level performance, FPGAs become an interesting alternative. This work belongs to the MI6 project of the Computer Vision research group of the University of Applied Sciences Bonn-Rhein-Sieg. It addresses the detection of the user's position and orientation relative to the virtual environment in an Immersion Square.
The goal is to develop a light-emitting device that points from the user towards the point of interest on the projection screen. The projected light dots are used to represent the user in the virtual environment. By detecting the light dots with video cameras, the idea is to infer the position and orientation of the user relative to the screen. For that, the laser dots need to be arranged in a unique pattern, which requires at least five points [29]. For a reliable estimation, a robust computation of the BLOBs' center points is necessary.
This project has covered the development of a BLOB detection system on an FPGA platform. It detects binary, spatially extended objects in a continuous video stream and computes their center points. The results are displayed to the user and were validated against ground truth. The evaluation compares precision and performance gain against similar approaches on GPP platforms.
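The core computation described above — labeling spatially extended foreground regions in a binary image and computing their center points — can be prototyped in software before committing it to hardware. A minimal reference sketch using 4-connected component labeling via iterative flood fill (a software analogue, not the FPGA design itself):

```python
def blob_centers(image):
    """image: 2D list of 0/1 pixels. Returns one (row, col) centroid
    per 4-connected foreground component."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    centers = []
    for y in range(h):
        for x in range(w):
            if image[y][x] and not seen[y][x]:
                # Flood-fill one component, collecting its pixels.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and image[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Centroid = mean pixel coordinate of the component.
                centers.append((sum(p[0] for p in pixels) / len(pixels),
                                sum(p[1] for p in pixels) / len(pixels)))
    return centers
```

An FPGA implementation would typically replace the flood fill with a single-pass, line-by-line labeling scheme that accumulates per-label coordinate sums as pixels stream in, but the computed centroids are the same.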
Today, the development of aircraft and spacecraft is a complex, standardized process that unites various disciplines of science and engineering. Knowledge of flight-physics properties, in particular aerodynamics and flow, is essential for the design of aircraft and spacecraft. To reduce the effort of computing these properties, methods and tools for computer-based simulation have been developed and combined into integrated simulation-based development processes. This makes it possible, for example, to achieve time savings of up to several years compared with physical tests in wind tunnels [Bec08].
In this paper, residual sinks are used in bond-graph model-based quantitative fault detection to couple a model of the faultless process engineering system to a bond-graph model of the faulty system. In this way, integral causality can be used as the preferred computational causality in both models, and there is no need for numerical differentiation. Furthermore, unknown variables do not need to be eliminated from power continuity equations in order to obtain analytical redundancy relations (ARRs) in symbolic form. Residuals indicating faults are computed numerically as components of a descriptor vector of a differential-algebraic equation system derived from the coupled bond graphs. The presented bond-graph approach is aimed especially at models with non-linearities that make it cumbersome or even impossible to derive ARRs from the model equations by elimination of unknown variables. For illustration, the approach is applied to a non-controlled as well as a controlled hydraulic two-tank system. Finally, it is shown that bond-graph modelling can support not only the numerical computation of residuals but also the simultaneous numerical computation of their sensitivities with respect to a parameter.
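The underlying idea of a residual — run a faultless model in parallel with the (possibly faulty) plant and monitor the difference between the two — can be illustrated on a single tank, far simpler than the paper's coupled bond-graph DAE machinery. The tank parameters and the leak fault below are invented for illustration only.

```python
def simulate_tank(leak=0.0, q_in=0.5, area=1.0, k_out=0.2,
                  dt=0.01, steps=2000):
    """Explicit Euler integration of a single tank:
    dh/dt = (q_in - k_out*h - leak*h) / area.
    leak > 0 models an additional, fault-induced outflow."""
    h = 0.0
    levels = []
    for _ in range(steps):
        h += dt * (q_in - k_out * h - leak * h) / area
        levels.append(h)
    return levels

def residual(measured, model):
    """Pointwise difference between plant measurement and the
    faultless model output; near zero as long as the plant is healthy."""
    return [m - n for m, n in zip(measured, model)]
```

With no fault the residual stays at zero; with a leak the measured level drifts below the model prediction and the residual grows negative, which is the signature a fault-detection system thresholds on.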
Forschungsprojekt Web 2.0
(2009)
The term Web 2.0 has been popular in the Internet industry since 2005, when Tim O'Reilly published the article "What is Web 2.0?" on 30 September 2005 (O'Reilly 2005). Since 2007 it has also been familiar to the broader public and has become a veritable buzzword, especially in marketing.