H-BRS Bibliography
Departments, institutes and facilities
Document Type
- Article (29)
- Conference Object (19)
- Part of a Book (7)
- Report (4)
- Lecture (2)
- Master's Thesis (2)
- Book (monograph, edited volume) (1)
- Conference Proceedings (1)
- Doctoral Thesis (1)
Year of publication
- 2011 (66)
Has Fulltext
- no (66)
Keywords
- Business Ethnography (2)
- Global Software Engineering (2)
- Adaptation (1)
- Atomistic force fields (1)
- Automated multiple development (1)
- Benutzerfreundlichkeit (1)
- Binding Sites (1)
- Ceramides (1)
- Cholesterol (1)
- Cloud (1)
The future of CCS technology depends largely on the scope for development that the legislator grants it. The draft law adopted by the German Federal Government on 13 April 2011 takes a very cautious approach to the use of CCS technology. Development as well as planning and investment certainty have had to take a back seat to concerns about the short- and long-term safety of CO2 storage. The following article discusses legal problems in the approval of CO2 capture plants, transport pipelines and storage sites, as well as hazard prevention, aftercare and liability.
Project management in small and medium-sized enterprises (SMEs)? Frequently absent, observers sigh. SMEs want little to do with modern project management methods of the kind that have proven themselves in large corporations for many years. But beware! Professor Uwe Braehmer warns against sweeping judgements. Some SMEs have mastered the full keyboard of project management. Others are currently discovering new business opportunities through project management. And still others need project management specifically tailored to SMEs. Professor Uwe Braehmer knows: "To many SME managing directors, project management seems too academic and too large-scale." Designed to fit their needs, it could find much broader acceptance among SMEs.
The topic of lateral thinking ("Querdenken") is so topical, exciting and interesting that, after our first conversations, Prof. Dr. Jens Böcker of Hochschule Bonn-Rhein-Sieg decided to assign a student project on it. Prof. Dr. Böcker is a professor of business administration with a focus on marketing in the Department of Business Sciences. The "Forschungsprojekt Querdenken" (lateral thinking research project) was carried out under Prof. Dr. Böcker by a group of students led by project manager Manuel Hammes in the 2010 summer semester.
Alongside their traditional task of supplying literature and information, university libraries today also have the task of being present on the web and positioning themselves there as central information providers. Their websites serve as access points and intermediaries for information, as well as instruments of advertising and public relations. With regard to the ever stronger user orientation of libraries, the web offerings of five university libraries and one university of applied sciences library are analysed according to various criteria: the content found on the websites is identified, and the structure is examined in detail, as are the navigation options, the language and text design, and the visual design.
The work done in this thesis enhances the MMD algorithm in multi-core environments. The MMD algorithm, a transformation-based algorithm for reversible logic synthesis, is based on the work of Maslov, Miller and Dueck and their original, sequential implementation. It synthesises a formal function specification, provided as a truth table, into a reversible network and can perform several optimization steps after synthesis. This work concentrates on one of these optimization steps, template matching. This approach reduces the size of a reversible circuit by replacing a sequence of gates that matches a template with an implementation of the same function that uses fewer gates. Smaller circuits have several benefits, since they require less area and are less costly. The template matching approach introduced in the original work is computationally expensive, since it tries to match a library of templates against the given circuit. For each template at each position in the circuit, a number of different combinations have to be evaluated at runtime, resulting in long execution times, especially for large circuits. To make template matching more efficient and usable, it has been reimplemented to take advantage of modern multi-core architectures such as the Cell Broadband Engine or a Graphics Processing Unit. For this work, two algorithmically different approaches, each designed around the strengths of one of these architectures, have been analyzed and improved. For the analysis, these approaches were cross-implemented on the two target hardware architectures and compared to the original parallel versions. The important metrics for this analysis are the execution time of the algorithm and the minimization result of the template matching. It could be shown that the algorithmically different approaches produce the same minimization results, independent of the hardware architecture used.
However, both cross-implementations also show significantly higher execution times, which makes them practically irrelevant. The results of the first analysis and comparison led to the decision to enhance only the original parallel approaches. Using the same metrics for successful enhancement as above, it could be shown that improving the algorithmic concepts and exploiting the capabilities of the hardware leads to better execution times and minimization results than the original implementations.
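The core idea of template matching described above, finding a gate sequence that a template covers and splicing in a cheaper equivalent, can be sketched as follows. This is a toy illustration, not the thesis's actual implementation: real templates over Toffoli networks match up to cyclic shifts and inversion, whereas here gates are plain strings and `apply_template` only looks for one contiguous match. All names are illustrative.

```python
def apply_template(circuit, pattern, replacement):
    # Scan the circuit for a contiguous occurrence of `pattern` and splice in
    # the functionally equivalent but cheaper `replacement`.
    # Returns the (possibly shortened) circuit; unchanged if no match is found.
    n, m = len(circuit), len(pattern)
    for i in range(n - m + 1):
        if circuit[i:i + m] == pattern:
            return circuit[:i] + replacement + circuit[i + m:]
    return circuit


# Example: two consecutive NOTs on the same line cancel (identity template).
circuit = ["NOT a", "NOT a", "CNOT a b"]
reduced = apply_template(circuit, ["NOT a", "NOT a"], [])
print(reduced)  # → ['CNOT a b']
```

In the parallel versions discussed in the abstract, the outer scan over template/position pairs is the part distributed across cores, since each candidate match can be checked independently.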
Selective screening for inborn errors of metabolism – assessment of metabolites in body fluids
(2011)
In an explorative study, we investigated how German schoolteachers use, reuse, produce and manage Open Educational Resources. The main questions in this research were what their motivators and barriers are in their use of Open Educational Resources, what others can learn from their Open Educational Practices, and what we can do to raise the level of dissemination of OER in schools.
Following the study by Leidner and Kayworth (2006) on the treatment of culture in the Anglo-Saxon research discipline "Information Systems", a corresponding literature study was conducted for the design-oriented "Wirtschaftsinformatik" (business informatics) of the German-speaking world. The study examined, in the discipline's main publication organs, how frequently cultural influences on information technology were addressed, how these influences were treated, and which reference models and reference literature were used. After a brief description of the chosen approach, the results and limitations of the study are presented.
This paper addresses the special skills that learners in Internet-based learning scenarios need. In self-directed learning scenarios, as most Internet-based learning scenarios are designed to be, learners bear the responsibility for their own learning progress. To ease this task, institutions could prepare learners for a situation that may be quite different from their previous learning experiences. Based on a Delphi study conducted with experts from the e-learning sector in Germany, Austria, and Switzerland, the basic requirements have been determined.
Adaptability as a Special Demand on Open Educational Resources: The Cultural Context of e-Learning
(2011)
Producing and providing Open Educational Resources (OERs) is driven by the concepts of openness and sharing. Although a lot of free high-quality resources are already available, practitioners often prefer to rewrite learning resources rather than creatively embed (and thus reuse) existing OERs. In this paper, we analyse the reasons for this in two different educational contexts. As a result of this analysis, we found that uncertainty about possible adaptation needs is one of the major barriers. In order to overcome this barrier and make different learning contexts comparable, we analysed the context of learners; in particular, in the research project 'Learning Culture', we investigated the field of culturally motivated expectations and attitudes of learners. This paper presents the results of this research project and discusses which cultural issues should be taken into consideration when OERs are to be adapted from one cultural context to another.
Spectral surveys provide the only way to determine the full molecular inventory of an object and hence build a comprehensive view of the state of the molecular gas and its role in star formation and in the structure and evolution of the ISM. Of course, spectral surveys also provide the most efficient method of identifying new and unexpected species that have to be included in the chemical networks. The most extensive and complete survey of an extragalactic system has been the continuous spectral survey from 129 GHz to 175 GHz carried out by Martín et al. (2006) toward NGC253. This first spectral line survey at 2 mm towards the prototypical starburst galaxy NGC253 has shown an unexpected chemical richness.
Superconducting heterodyne receivers have played a vital role in high-resolution spectroscopy applications for astronomy and atmospheric research up to 2 THz. The NbN hot electron bolometer (HEB) mixer, the most sensitive mixer above 1.5 THz, has been used in the Herschel space telescope for 1.4–1.9 THz and has also shown ultra-high sensitivity up to 5.3 THz. By combining an HEB mixer with a novel THz quantum cascade laser (QCL) as local oscillator (LO), such an all-solid-state heterodyne receiver provides a technology that can be used for any balloon-, air- or space-borne heterodyne instrument above 2 THz. Here we report the first high-resolution heterodyne spectroscopy measurement using a gas cell and such an HEB-QCL receiver. The receiver employs a 2.9 THz metal-metal waveguide QCL as LO and an NbN HEB as mixer. Using a gas cell filled with methanol (CH3OH) gas in combination with hot/cold blackbody loads as signal source, we successfully recorded the methanol emission line around 2.918 THz. Spectral lines at different pressures and at different QCL frequencies are studied.
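The heterodyne principle behind this receiver is that the mixer downconverts the sky signal to an intermediate frequency (IF) at the difference between the signal and LO frequencies, which conventional electronics can then digitize. A minimal sketch of that relation, with the round numbers from the abstract used purely as illustrative inputs (the actual LO tuning and IF band of the instrument are not specified here):

```python
def intermediate_freq_hz(f_signal_hz, f_lo_hz):
    # An ideal mixer produces output at the difference frequency |f_sig - f_LO|.
    # The IF must fall inside the mixer's IF bandwidth to be detected.
    return abs(f_signal_hz - f_lo_hz)


# Illustrative: a 2.918 THz methanol line against a 2.9 THz LO
# lands at an 18 GHz difference frequency.
print(intermediate_freq_hz(2.918e12, 2.9e12) / 1e9)  # in GHz
```

In practice the LO is tuned so that the line of interest falls within the usable IF band of the HEB mixer, which is why spectra are recorded at several QCL frequencies.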
A method for minimum range extension with improved accuracy in triangulation laser range finder
(2011)
Liquid–liquid equilibria of dipropylene glycol dimethyl ether and water by molecular dynamics
(2011)
Providing Mobile Phone Access in Rural Areas via Heterogeneous Meshed Wireless Back-Haul Networks
(2011)
NMR structures of thiostrepton derivatives for characterization of the ribosomal binding site
(2011)
The smart home of the future is typically researched in lab settings or in apartments that have been built from scratch. However, comparing the lifecycles of buildings and of information technology, it is evident that modernization strategies and technologies are needed to empower residents to modify and extend their homes to make them smarter. In this paper, we describe a case study of the deployment, adaptation to, and adoption of tailorable home energy management systems in 7 private households. Based on this experience, we discuss how hardware and software technologies should be designed so that people can build their own smart home with high usability and a good user experience.
Based on our reconfigurable FPGA spectrometer technology, we have developed a read-out system, operating in the frequency domain, for arrays of Microwave Kinetic Inductance Detectors (MKIDs). The readout consists of a combination of two digital boards: a programmable DAC/FPGA board (tone generator) to stimulate the MKID detectors and an ADC/FPGA unit to analyze the detectors' response. Laboratory measurements show no deterioration of the noise performance compared to low-noise analog mixing. Thus, this technique allows capturing several hundred detector signals with just one pair of coaxial cables.
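The frequency-domain multiplexing described above rests on a simple idea: the tone generator drives every detector with its own probe tone on a shared line, and the readout recovers each detector's response by correlating the digitized sum against each tone (a single-bin DFT per channel). The sketch below illustrates only that principle in plain Python; the sample rate, tone frequencies, and function names are assumptions for the example and have nothing to do with the actual FPGA firmware.

```python
import cmath

def make_tone_comb(freqs_hz, fs_hz, n):
    # Sum of complex probe tones on one line, as the DAC/FPGA board would emit.
    return [sum(cmath.exp(2j * cmath.pi * f * k / fs_hz) for f in freqs_hz)
            for k in range(n)]

def channelize(samples, freqs_hz, fs_hz):
    # Single-bin DFT per probe tone: digitally down-convert the shared line
    # and return the amplitude seen by each detector channel.
    n = len(samples)
    amps = {}
    for f in freqs_hz:
        acc = sum(s * cmath.exp(-2j * cmath.pi * f * k / fs_hz)
                  for k, s in enumerate(samples))
        amps[f] = abs(acc) / n
    return amps


# Three detectors sharing one cable; tones chosen on exact DFT bins.
fs, n = 512e6, 1024
tones = [10e6, 20e6, 40e6]
amps = channelize(make_tone_comb(tones, fs, n), tones, fs)
```

When the tones sit on exact DFT bins, as here, the channels are orthogonal and each recovered amplitude equals the transmitted one; an MKID shifting its resonance would show up as a change in the amplitude and phase of its tone.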
The Web has become an indispensable prerequisite of everyday life, and the Web browser is the most used application on a variety of distinct devices. The content delivered by the Web has changed drastically from static pages to media-rich and interactive Web applications offering nearly the same functionality as native applications, a trend further pushed by the Cloud and, more specifically, the Cloud's SaaS layer. In the light of this development, the security and performance of Web browsing have become crucial issues.