Refine
Departments, institutes and facilities
- Fachbereich Informatik (42)
- Fachbereich Wirtschaftswissenschaften (25)
- Institut für funktionale Gen-Analytik (IFGA) (25)
- Institute of Visual Computing (IVC) (17)
- Fachbereich Angewandte Naturwissenschaften (16)
- Fachbereich Ingenieurwissenschaften und Kommunikation (15)
- Fachbereich Sozialpolitik und Soziale Sicherung (10)
- Institut für Cyber Security & Privacy (ICSP) (10)
- Institut für Verbraucherinformatik (IVI) (5)
- Institut für Sicherheitsforschung (ISF) (2)
Document Type
- Article (62)
- Conference Object (60)
- Part of a Book (22)
- Book (monograph, edited volume) (12)
- Report (8)
- Master's Thesis (7)
- Conference Proceedings (4)
- Bachelor Thesis (2)
- Contribution to a Periodical (2)
- Lecture (2)
Year of publication
- 2011 (183)
Has Fulltext
- no (183)
Keywords
- Unternehmen (3)
- Business Ethnography (2)
- CUDA (2)
- Emergency support system (2)
- Finite-Elemente-Methode (2)
- Global Software Engineering (2)
- Mobile sensors (2)
- 3D Visualisierung (1)
- 3D gaming (1)
- 3D nucleus (1)
A method for minimum range extension with improved accuracy in triangulation laser range finder
(2011)
The development of pulmonary edema can be considered as a combination of alveolar flooding via increased fluid filtration, impaired alveolar-capillary barrier integrity, and disturbed resolution due to decreased alveolar fluid clearance. An important mechanism regulating alveolar fluid clearance is sodium transport across the alveolar epithelium. Transepithelial sodium transport is largely dependent on the activity of sodium channels in alveolar epithelial cells. This paper describes how sodium channels contribute to alveolar fluid clearance under physiological conditions and how deregulation of sodium channel activity might contribute to the pathogenesis of lung diseases associated with pulmonary edema. Furthermore, sodium channels as putative molecular targets for the treatment of pulmonary edema are discussed.
The Web has become an indispensable prerequisite of everyday life, and the Web browser is the most used application on a variety of distinct devices. The content delivered by the Web has changed drastically from static pages to media-rich and interactive Web applications offering nearly the same functionality as native applications, a trend further pushed by the Cloud and more specifically by the Cloud’s SaaS layer. In light of this development, the security and performance of Web browsing have become crucial issues.
Routing attacks are a serious threat to communication in tactical MANETs. TOGBAD is a centralised approach that uses topology graphs to detect such attacks. In this paper, we present TOGBAD's newly added wormhole detection capability. It is an adaptation of a wormhole detection method developed by Hu et al., which is based on nodes' positions; we adapted it to the specific properties of tactical environments. Furthermore, we present simulation results which show TOGBAD's performance in detecting wormhole attacks.
The usage of link-quality-based routing metrics significantly improves the quality of the chosen paths and thereby the performance of the network. However, attackers may try to exploit link qualities for their own purposes. Especially in tactical multi-hop networks, routing may fall prey to an attacker, and such routing attacks are a serious threat to communication. TOGBAD is a centralised approach that uses topology graphs to detect routing attacks. In this paper, we enhance TOGBAD with the capability to detect fake link qualities. We use a challenge/response method to estimate the link qualities in the network and, based on this, perform plausibility checks on the link qualities propagated by the nodes. Furthermore, we study the impact of attackers propagating fake link qualities and present simulation results showing TOGBAD's detection rate.
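The plausibility check at the heart of this approach can be sketched as follows; the function name, the [0, 1] link-quality scale and the tolerance are illustrative assumptions for this sketch, not TOGBAD's actual implementation:

```python
def plausibility_check(reported_lq: float, measured_lq: float,
                       tolerance: float = 0.2) -> bool:
    """Flag a node whose propagated link quality deviates from the
    challenge/response estimate by more than the tolerance.
    Link qualities are assumed to lie in [0, 1]."""
    return abs(reported_lq - measured_lq) > tolerance

# A node advertising a near-perfect link that measures poorly is suspicious:
plausibility_check(reported_lq=0.95, measured_lq=0.40)   # flagged
plausibility_check(reported_lq=0.80, measured_lq=0.75)   # plausible
```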
Wohin in der Reha?
(2011)
Arbeitsmarktintegration - eine Aufgabe der medizinischen Rehabilitation Abhängigkeitskranker?
(2011)
NMR structures of thiostrepton derivatives for characterization of the ribosomal binding site
(2011)
Software offshoring has been established as an important business strategy over the last decade. While research on such forms of Global Software Development (GSD) has mainly focused on the situation of large enterprises, small enterprises are increasingly engaging in offshoring, too. Representing the biggest share of the German software industry, small companies are known to be important innovators and market pioneers. They often regard their flexibility and customer-orientation as core competitive advantages. Unlike large corporations, their small size allows them to adopt software development approaches that are characterized by a high agility and flat hierarchies. At the same time, their distinct strategies make it unlikely that they can simply adopt management strategies that were developed for larger companies.
Flexible development approaches like the ones preferred by small corporations have proven problematic in the context of offshoring, as their strong dependency on constant communication is heavily affected by the various barriers of international cooperation between companies. Cooperating closely across company borders, in different time zones, and in culturally diverse teams poses complex obstacles for flexible management approaches. How these obstacles can be tackled, and how they affect companies in the long term, is still a matter of discussion in fields like Software Engineering and Computer Supported Cooperative Work. Hence, it is agreed that a more detailed understanding of distributed software development practices is needed in order to arrive at feasible technological and organizational solutions.
This dissertation presents results from two ethnographically-informed case studies of software offshoring in small German enterprises. By adopting Anselm Strauss’ concept of articulation work, we want to deepen the understanding of managing distributed software development in flexible, customer-oriented organizations. In doing so, we show how practices of coordinating inter-organizational software development are closely related to aspects of organizational learning in small enterprises. By means of interviews with developers and project managers from both parties of the cooperation, we do not only take into account the multiple perspectives of the cooperation, but also include the socio-cultural background of international software development projects into our analysis.
The recent explosion of available audio-visual media is a new challenge for information retrieval research. Automatic speech recognition (ASR) systems translate spoken content into the text domain, and there is a need for searching and indexing this data, which possesses no logical structure. One possible way to structure it on a high level of abstraction is to find topic boundaries. Two unsupervised topic segmentation methods were evaluated with real-world data in the course of this work. The first one, TSF, models topic shifts as fluctuations in a similarity function over the transcript. The second one, LCSeg, treats topic changes as the places where lexical chains overlap least. Only LCSeg performed close to results reported for a similar real-world corpus; other reported results could not be outperformed. Topic analysis based on repeated word usage renders topic changes more ambiguous than expected, and this issue has more impact on segmentation quality than the state-of-the-art ASR word error rate. It can be concluded that it is advisable to develop topic segmentation algorithms with real-world data to avoid potential biases towards artificial data. Unlike the evaluated approaches based on word usage analysis, methods operating on local contexts can be expected to perform better through emulation of semantic dependencies.
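The idea of placing boundaries at dips of a similarity function over the transcript can be sketched as follows. This is a minimal TextTiling-style sketch over bag-of-words windows; the windowing, the threshold value and the toy data are illustrative assumptions, not the evaluated TSF or LCSeg implementations:

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def boundaries(windows, threshold=0.1):
    """Indices where the similarity between adjacent word windows
    drops below the threshold, i.e. candidate topic boundaries."""
    bows = [Counter(w) for w in windows]
    return [i + 1 for i in range(len(bows) - 1)
            if cosine(bows[i], bows[i + 1]) < threshold]

# Two windows sharing vocabulary, then an abrupt shift to new vocabulary:
w = [["sodium", "channel", "alveolar"],
     ["sodium", "fluid", "channel"],
     ["routing", "attack", "manet"]]
boundaries(w)  # places a boundary before the third window
```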
This report presents the findings of a quantitative study on the use of Open Educational Resources (OER) and Open Educational Practices (OEP) in Higher Education and Adult Learning institutions. The study is based on the results of an online survey targeted at four educational roles: educational policy makers; institutional policy makers/managers; educational professionals; and learners. The report encompasses five chapters and four annexes. Chapter I presents the survey and Chapter II discloses the main research questions and models. Chapter III characterises the universe of respondents. Chapter IV provides a detailed survey analysis including an overview of key statistical data. Finally, Chapter V provides an exploratory in-depth analysis of some key issues: representations, attitudes and uses of OEP. The table of contents and the complete list of diagrams and tables can be found at the end of the report.
In an explorative study, we investigated how German schoolteachers use, reuse, produce and manage Open Educational Resources. The main questions in this research have been what their motivators and barriers are in their use of Open Educational Resources, what others can learn from their Open Educational Practices, and what can be done to raise the dissemination level of OER in schools.
This paper addresses the special skills that learners in Internet-based learning scenarios need. In self-directed learning scenarios, as most Internet-based learning scenarios are designed to be, learners bear the responsibility for their own learning progress. To ease this task, institutions could prepare learners for a situation that may be quite different from their previous learning experiences. Based on a Delphi study conducted with experts from the e-learning sector in Germany, Austria, and Switzerland, the basic requirements have been determined.
Adaptability as a Special Demand on Open Educational Resources: The Cultural Context of e-Learning
(2011)
Producing and providing Open Educational Resources (OERs) is driven by the concepts of openness and sharing. Although a lot of free high-quality resources are already available, practitioners often rewrite learning resources rather than creatively embedding (and thus reusing) existing OERs. In this paper, we analyse the reasons for this in two different educational contexts. As a result of this analysis, we found that uncertainty about possible adaptation needs is one of the major barriers. In order to overcome this barrier and make different learning contexts comparable, we analysed the context of learners; in particular, in the research project ‘Learning Culture’, we investigated the field of culturally motivated expectations and attitudes of learners. This paper shows the results of this research project and discusses which cultural issues should be taken into consideration when OERs are to be adapted from one cultural context to another.
Following the study by Leidner and Kayworth (2006) on the treatment of culture in the Anglo-American research discipline “Information Systems”, a corresponding literature study was conducted for the design-oriented Wirtschaftsinformatik (business informatics) of the German-speaking world. The study examined, in the main publication organs of the discipline, how frequently cultural influences on information technology were addressed, how these influences were treated, and which reference models and reference literature were used. After a brief description of the chosen approach, the results and limitations of the study are presented.
The roadmap for quality and innovation through open educational practices has been conceived as a number of steps, a conceptual document which organisations, learners or professionals can use to improve their open educational practices. After the development of the core concept of the OPAL project, the guidelines for OEP, it became clear that these guidelines would have to play an important part in the roadmap exercise, because they represent the very essence of how to foster and stimulate open educational practices. The roadmap is therefore meant to be an instrument, a tool which helps the different stakeholders to use the guidelines for their own context and purpose.
Selective screening for inborn errors of metabolism--assessment of metabolites in body fluids
(2011)
This paper picks up on one of the ways reported in the literature to represent hybrid models of engineering systems by bond graphs with static causalities. The representation of a switching device by means of a modulated transformer (MTF) controlled by a Boolean variable in conjunction with a resistor has been used so far to build a model for simulation. In this paper, it is shown that it can also constitute an approach to bond graph based quantitative fault detection and isolation in hybrid system models. Advantages are that Analytical Redundancy Relations (ARRs) do not need to be derived again after a switch state has changed. ARRs obtained from the bond graph are valid for all system modes. Furthermore, no adaption of the standard sequential causality assignment procedure (SCAP) with respect to fault detection and isolation (FDI) is needed.
The future of CCS technology depends largely on the development opportunities that the legislator grants it. The draft bill adopted by the German Federal Government on 13 April 2011 approaches the use of CCS technology with great caution. Development as well as planning and investment security have had to take a back seat to concerns about the short- and long-term safety of CO2 storage. The following article discusses legal problems of the approval of CO2 capture plants, transport pipelines and storage sites, as well as of hazard prevention, aftercare and liability.
This report presents an approach to quadrotor dynamics stabilization based on ICP SLAM. Because the quadrotor lacks sensory information to detect its horizontal drift, an additional sensor, a Hokuyo UTM laser scanner, was used to perform on-line ICP-based SLAM. The obtained position estimates were used in control loops to maintain the desired position and orientation of the vehicle. Attitude parameters such as height, yaw and position in space were controlled based on the laser data. As a result, the quadrotor demonstrated two capabilities significant for autonomous navigation: performing on-line SLAM on a flying vehicle and maintaining a desired position in 3D space. A visual approach to optical flow based on the pyramidal Lucas-Kanade algorithm was explored and tested under different environmental conditions, though it has not been integrated into the control loop. The performance of the Hokuyo laser scanner and the associated ICP SLAM algorithm was also tested under different environmental conditions: indoors, outdoors and in the presence of smoke. Results are presented and discussed. The requirement to run the SLAM algorithm on-line and to carry the rather heavy equipment it needs prompted a search for ways to increase the quadrotor's payload and computational power. New hardware and distributed software architectures are therefore presented in the report.
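A control loop of the kind described, holding one attitude parameter (say, height) at a setpoint using pose estimates from SLAM, can be sketched as a minimal single-axis PD controller; the gains and the single-axis simplification are illustrative assumptions, not the report's actual controller:

```python
class PDController:
    """Minimal PD loop for one axis (e.g. height or yaw) driven by
    position estimates from on-line SLAM."""

    def __init__(self, kp: float, kd: float):
        self.kp, self.kd = kp, kd
        self.prev_error = 0.0

    def update(self, setpoint: float, measured: float, dt: float) -> float:
        """Return a control command from the current pose estimate."""
        error = setpoint - measured
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative

# Hold a height of 1.5 m; the proportional term shrinks as the
# estimated height approaches the setpoint:
ctrl = PDController(kp=1.0, kd=0.1)
ctrl.update(setpoint=1.5, measured=0.5, dt=0.1)
ctrl.update(setpoint=1.5, measured=1.4, dt=0.1)
```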
The smart home of the future is typically researched in lab settings or apartments that have been built from scratch. However, comparing the lifecycle of buildings and information technology, it is evident that modernization strategies and technologies are needed to empower residents to modify and extend their homes to make it smarter. In this paper, we describe a case study about the deployment, adaption to and adoption of tailorable home energy management systems in 7 private households. Based on this experience, we want to discuss how hardware and software technologies should be designed so that people could build their own smart home with a high usability and user experience.
The work done in this thesis enhances the MMD algorithm in multi-core environments. The MMD algorithm, a transformation-based algorithm for reversible logic synthesis, is based on the works of Maslov, Miller and Dueck and their original, sequential implementation. It synthesises a formal function specification, provided as a truth table, into a reversible network and is able to perform several optimization steps after the synthesis. This work concentrates on one of these optimization steps, template matching. This approach is used to reduce the size of a reversible circuit by replacing a number of gates that match a template with an implementation of the same function that uses fewer gates. Smaller circuits have several benefits, since they need less area and are less costly. The template matching approach introduced in the original works is computationally expensive, since it tries to match a library of templates against the given circuit. For each template at each position in the circuit, a number of different combinations have to be calculated at runtime, resulting in high execution times, especially for large circuits. In order to make the template matching approach more efficient and usable, it has been reimplemented to take advantage of modern multi-core architectures such as the Cell Broadband Engine or a Graphics Processing Unit. For this work, two algorithmically different approaches, each trying to exploit one architecture's strengths, have been analyzed and improved. For the analysis, these approaches have been cross-implemented on the two target hardware architectures and compared to the original parallel versions. Important metrics for this analysis are the execution time of the algorithm and the result of the minimization with the template matching approach. It could be shown that the algorithmically different approaches produce the same minimization results, independent of the hardware architecture used.
However, both cross-implementations also show significantly higher execution times, which makes them practically irrelevant. The results of the first analysis and comparison led to the decision to enhance only the original parallel approaches. Using the same metrics for successful enhancements as above, it could be shown that improving the algorithmic concepts and exploiting the capabilities of the hardware lead to better execution times and minimization results compared to the original implementations.
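The core of the template replacement step, scanning a gate list for a subsequence that a shorter, functionally equivalent sequence can replace, can be sketched as follows. Real MMD templates match gates under more general conditions (e.g. after moving compatible gates past each other); the gate names and the self-inverse CNOT pair below are illustrative assumptions:

```python
def apply_template(circuit, pattern, replacement):
    """Replace the first contiguous occurrence of `pattern` with the
    functionally equivalent, shorter `replacement`."""
    n = len(pattern)
    for i in range(len(circuit) - n + 1):
        if circuit[i:i + n] == pattern:
            return circuit[:i] + replacement + circuit[i + n:]
    return circuit  # no match: circuit unchanged

# Two identical CNOTs cancel, since the gate is self-inverse:
c = ["NOT(a)", "CNOT(a,b)", "CNOT(a,b)", "NOT(b)"]
apply_template(c, ["CNOT(a,b)", "CNOT(a,b)"], [])
# → ["NOT(a)", "NOT(b)"]
```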
Incremental Bond Graphs
(2011)
Bond Graph Modelling of Engineering Systems: Theory, Applications and Software Support invites readers to consider the potential and the state of the art of bond graph modelling of engineering systems with respect to theory, applications and software support. Bond graph modelling is a physical modelling methodology based on first principles that is particularly suited to modelling multidisciplinary or mechatronic systems. This book covers theoretical issues and methodology topics that have been the subject of ongoing research in past years, presents promising new applications such as the bond graph modelling of fuel cells, and illustrates how bond graph modelling and simulation of mechatronic systems can be supported by software. This up-to-date, comprehensive presentation of various topics has been made possible by the cooperation of a group of authors who are experts in various fields and share the “bond graph way of thinking.”
At present, data publication is one of the most dynamic topics in e-Research. While the fundamental problems of electronic text publication have been solved in the past decade, standards for the external and internal organisation of data repositories are advanced in some research disciplines but underdeveloped in others. We discuss the differences between an electronic text publication and a data publication and the challenges that result from these differences for the data publication process. We place the data publication process in the context of the human knowledge spiral and discuss key factors for the successful acquisition of research data from the point of view of a data repository. For the relevant activities of the publication process, we list some of the measures and best practices of successful data repositories.
Given the increasing media coverage of social networks, the aim of this thesis was to analyse the business model of social networks in more detail. The thesis shows that online social networks belong to those Internet services that have existed for quite some time but only achieved their real breakthrough in recent years. Initially used purely as communication platforms, they are now used for general leisure activities and are increasingly integrated into everyday life. The thesis deals with the economic particularities of online social networks. It analyses network effects, supply and demand behaviour, critical-mass phenomena, tippy markets, network laws, lock-in effects and switching costs. It examines whether and to what extent clearly recognisable business models lie behind online social networks. Building on a critical examination of the variety of existing business models, a viable approach of its own is developed. On this basis, existing online networks are analysed and their degree of innovation is assessed.
Negative headlines about climate change, the waste of non-renewable resources and environmental pollution are spread daily by the media. Ever new food scandals have unsettled consumers; their trust in producers and retailers has been shaken. A growing number of consumers try to integrate environmentally conscious and sustainable behaviour into their everyday lives. The social responsibility of companies, too, has never been highlighted and discussed in public as actively as it is today. Both developments contribute to the fact that the term “Bio” (organic) is currently on everyone's lips; beyond food, it is also applied in sectors such as cosmetics and fashion. The aim of this thesis is to derive recommendations for action for retailers in order to exploit their existing market potential, in particular to retain customers and to win new ones. In addition, possible fields of cooperation between retail and industry are systematised, and recommendations are developed that contribute to improving cooperation in the area of category management for organic products.
Logistikmarkt Russland
(2011)
Russia is one of the important and emerging economic regions, one that poses particular logistical challenges. This thesis analyses and describes the Russian logistics market, which is highly attractive for European and especially German logistics service providers both because of its size and proximity to the EU and because of its enormous growth potential. The aim of this study is to identify relevant trends and development tendencies, both to assist logistics service providers in assessing the strategic market potential of Russia and to consider the economic opportunities and risks in this market.
Superconducting heterodyne receivers have played a vital role in high-resolution spectroscopy applications for astronomy and atmospheric research up to 2 THz. The NbN hot electron bolometer (HEB) mixer, the most sensitive mixer above 1.5 THz, has been used in the Herschel space telescope for 1.4-1.9 THz and has also shown ultra-high sensitivity up to 5.3 THz. Combining a HEB mixer with a novel THz quantum cascade laser (QCL) as local oscillator (LO), such an all-solid-state heterodyne receiver provides the technology for any balloon-, air- and space-borne heterodyne instruments above 2 THz. Here we report the first high-resolution heterodyne spectroscopy measurement using a gas cell and such a HEB-QCL receiver. The receiver employs a 2.9 THz metal-metal waveguide QCL as LO and a NbN HEB as mixer. Using a gas cell filled with methanol (CH3OH) gas in combination with hot/cold blackbody loads as the signal source, we successfully recorded the methanol emission line around 2.918 THz. Spectral lines at different pressures and at different frequencies of the QCL were studied.
Based on our reconfigurable FPGA spectrometer technology, we have developed a read-out system, operating in the frequency domain, for arrays of Microwave Kinetic Inductance Detectors (MKIDs). The readout consists of a combination of two digital boards: a programmable DAC/FPGA board (tone generator) to stimulate the MKID detectors, and an ADC/FPGA unit to analyze the detectors' response. Laboratory measurements show no deterioration of the noise performance compared to low-noise analog mixing. Thus, this technique allows capturing several hundred detector signals with just one pair of coaxial cables.
In addition to their traditional tasks of supplying literature and information, university libraries today also have the task of being present on the Internet via their websites and of positioning themselves there as central information providers. Their web presences serve as an access point and mediator for information as well as a means of advertising and an instrument of public relations. In view of libraries' ever stronger user orientation, the web offerings of five university libraries and one university of applied sciences library are analysed according to various criteria: the contents of the websites are identified, and their structure is examined in detail, as are the navigation options, the language and text design, and the visual design.
Spectral surveys provide the only way to determine the full molecular inventory of an object and hence to build a comprehensive view of the state of the molecular gas and its role in star formation and in the structure and evolution of the ISM. Spectral surveys also provide the most efficient method of identifying new and unexpected species that have to be included in the chemical networks. The most extensive and complete survey of an extragalactic system has been the continuous spectral survey from 129 GHz to 175 GHz carried out by Martín et al. (2006) toward NGC253. This first spectral line survey at 2 mm towards the prototypical starburst galaxy NGC253 has revealed an unexpected chemical richness.
Schwarz-Grün
(2011)
This master thesis describes a supervised approach to the detection and identification of humans in TV-style video sequences. In still images and video sequences, humans appear in different poses and views, fully visible and partly occluded, at varying distances from the camera, at different places, under different illumination conditions, etc. This diversity in appearance makes human detection and identification a particularly challenging problem. A solution to this problem is interesting for a wide range of applications, such as video surveillance and content-based image and video processing. In order to detect humans in views ranging from full body to close-up and in the presence of clutter and occlusion, they are modeled as an assembly of several upper body parts. For each body part, a detector is trained based on a Support Vector Machine and on densely sampled, SIFT-like feature points in a detection window. For more robust human detection, localized body parts are assembled using a learned model of geometric relations based on Gaussians. For flexible human identification, the outward appearance of humans is captured and learned using the Bag-of-Features approach and non-linear Support Vector Machines. Probabilistic votes for each body part are combined to improve classification results. The combined votes yield an identification accuracy of about 80% in our experiments on episodes of the TV series "Buffy the Vampire Slayer". The Bag-of-Features approach has been used in previous work mainly for object classification tasks; our results show that it can also be applied to the identification of humans in video sequences. Despite the difficulty of the given problem, the overall results are good and encourage future work in this direction.
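The vote combination described at the end, per-body-part class probabilities merged into one identity decision, can be sketched as a simple averaging scheme; the identities and probabilities below are illustrative assumptions, and the thesis's actual combination rule may differ:

```python
from collections import defaultdict

def combine_votes(part_votes):
    """Average per-body-part class probabilities and return the
    identity with the highest combined score."""
    totals = defaultdict(float)
    for votes in part_votes:
        for identity, p in votes.items():
            totals[identity] += p
    n = len(part_votes)
    return max(totals, key=lambda k: totals[k] / n)

# Head and torso classifiers both lean towards the same identity:
votes = [{"buffy": 0.7, "willow": 0.3},
         {"buffy": 0.6, "willow": 0.4}]
combine_votes(votes)  # → "buffy"
```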
Liquid–liquid equilibria of dipropylene glycol dimethyl ether and water by molecular dynamics
(2011)
The Anomalous X-ray Pulsar 4U 0142+61 is the only neutron star around which one of the long-sought 'fallback' disks is believed to have been detected, in the mid-IR by Wang et al. [1] using Spitzer. Such a disk originates from material falling back onto the neutron star after the supernova. We search for cold circumstellar material in the 90 GHz continuum using the Plateau de Bure Interferometer. No millimeter flux is detected at the position of 4U 0142+61; the upper flux limit is 150 μJy, corresponding to the 3σ noise rms level. The re-processed Spitzer MIPS 24 μm data presented previously by Wang et al. [2] show some indication of flux enhancement at the position of the neutron star, albeit below the 3σ statistical significance limit. At far-infrared wavelengths the source flux densities are probably below the Herschel confusion limits.
While industrialized countries are becoming service economies, all countries are becoming global. As competition becomes more global, understanding and accommodating the needs of international customers with different cultural backgrounds has become increasingly important. This study highlights cross-cultural perceptions of service problems in the tourist industry.