005 Computer programming, programs, data
Departments, institutes and facilities
- Institut für Cyber Security & Privacy (ICSP) (160)
- Institut für Verbraucherinformatik (IVI) (107)
- Fachbereich Informatik (62)
- Fachbereich Wirtschaftswissenschaften (57)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (7)
- Fachbereich Ingenieurwissenschaften und Kommunikation (3)
- Graduierteninstitut (1)
- Institut für funktionale Gen-Analytik (IFGA) (1)
- Institute of Visual Computing (IVC) (1)
- Zentrum für Ethik und Verantwortung (ZEV) (1)
Document Type
- Conference Object (187)
- Article (75)
- Part of a Book (20)
- Book (monograph, edited volume) (12)
- Contribution to a Periodical (8)
- Working Paper (4)
- Conference Proceedings (3)
- Master's Thesis (3)
- Research Data (2)
- Doctoral Thesis (2)
Year of publication
Keywords
- Usable Security (10)
- GDPR (7)
- Cloud (5)
- HTTP (5)
- Privacy (5)
- Usable Privacy (5)
- security (5)
- usable privacy (5)
- Big Data Analysis (4)
- Global Software Engineering (4)
Real-time multimedia communication over the Internet opens up a wide range of new applications. This innovative communication platform is of particular interest to globally operating companies: the use of VoIP solutions or groupware applications, for example, can reduce costs while improving employee collaboration. The same applies to video conferencing systems. Instead of regular meetings, which usually require most participants to travel, conferences can be held virtually by transmitting voice and video data over the Internet. The acceptance of such communication applications depends strongly on the factors of quality of service and security. The real-time media data must be transmitted as continuously as possible, so that both speech and moving images can be played back without stuttering. Since company-internal conferences are confidential, they are held behind closed doors; their counterpart in the electronic world must offer an equivalent. Security mechanisms, however, affect quality-of-service parameters. This must be taken into account and balanced when developing techniques for protecting multimedia communication. Using the example of a video conferencing system for the Internet, this paper shows how security mechanisms can be integrated into real-time multimedia communication applications while taking Quality of Service (QoS) into account.
Reading out metering data in electronic form makes it possible to collect and process the data efficiently and without media discontinuity, from its origin through to billing. This is particularly important in the liberalised energy market, where a large number of market participants must communicate with one another. The SELMA project, funded under the VERNET programme, aims to develop and establish a standard for the secure electronic exchange of metering data. One of the central requirements is to guarantee the authenticity and integrity of metering data read out over open networks, verifiable over the entire lifetime of the data. The technical realisation of these requirements results in a security architecture characterised by the end-to-end use of electronic signatures. With the signed data records, market participants can verify invoices for authenticity and integrity. This paper highlights the legal obstacles that arise when applying the requirements for qualified signatures to electronic metering data exchange, and shows how the greatest possible evidential value can nevertheless be achieved for advanced signatures.
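The verification scheme described above can be illustrated with a minimal sketch. Note the assumptions: SELMA relies on asymmetric (advanced or qualified) electronic signatures, whereas the sketch below substitutes an HMAC tag purely because the Python standard library offers no asymmetric signing; the field names are invented for illustration.

```python
import hashlib
import hmac
import json

def sign_reading(reading: dict, key: bytes) -> dict:
    """Attach a verification tag to a meter reading.
    NOTE: stands in for an asymmetric electronic signature;
    a real SELMA-style system signs with a private key so that
    any market participant can verify with the public key."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"reading": reading, "tag": tag}

def verify_reading(signed: dict, key: bytes) -> bool:
    """Recompute the tag over the canonical payload and compare;
    any modification of the reading invalidates the tag."""
    payload = json.dumps(signed["reading"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(signed["tag"], expected)
```

Canonical serialisation (`sort_keys=True`) matters: signer and verifier must hash byte-identical payloads, which is the same reason XML-based signature formats require canonicalisation.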
This work introduces Grid computing, shows its use in eHealth environments, and elicits trends towards the integration of custodians in eHealth Grids. It considers security and privacy requirements for the use of Grid computing in eHealth scenarios and discusses the possible integration of different types of data custodians. Finally, the paper concludes with an outlook on the development and deployment of eHealth Grids in the near future.
This paper addresses the urgent need for international standardization of context metadata for e-learning environments. E-learning distributed over the Internet can reach a huge number of learners synchronously and asynchronously, but it also has to deal with a variety of different cultures and societies and the complications this entails. Many of these differences demand adaptation processes in which, in particular, the contents are modified to fit the needs of the targeted contexts. In our approach to this task, we determined a list of around 160 significant possible differences and defined them as context metadata. In this paper, we show the results of our research on the determination of context-related influence factors, present approaches to deal with them, and give a first specification of the representing context metadata.
In recent years, a new category of digital signature algorithms based on Elliptic Curve Cryptography (ECC) has become established alongside well-known schemes such as RSA and DSA. So far, however, it is still not obvious how ECC-based signature schemes can be integrated into X.509-based Public Key Infrastructures (PKI). This paper briefly introduces the cryptographic basics of signature schemes based on elliptic curves and points out the cryptographic parameters that are important in this context. Afterwards, the structure and encoding of X.509 certificates and Certificate Revocation Lists (CRL) are discussed with regard to the integration of ECC public keys and ECC signatures, respectively. The paper closes with exemplary implementations of ECC-based security systems.
Data transfer and staging services are common components in Grid-based, or more generally, service-oriented applications. Security mechanisms play a central role in such services, especially when they are deployed in sensitive application fields like e-health. Applying WS-Security and related standards to SOAP-based transfer services is, however, problematic, as a straightforward adoption of SOAP with MTOM introduces considerable inefficiencies into the signature generation process when large data sets are involved. This paper proposes a non-blocking signature generation approach that enables stream-like processing with considerable performance enhancements.
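The efficiency argument hinges on computing the digest incrementally instead of buffering the whole payload before signing. A minimal sketch of this principle (not the paper's actual design; chunk size and function names are illustrative):

```python
import hashlib

CHUNK = 64 * 1024  # process the payload in bounded chunks

def stream_digest(chunks) -> str:
    """Fold chunks into a running SHA-256 state as they arrive.
    Only the constant-size hash state is held in memory, so the
    signature over the final digest can be produced as soon as the
    last chunk has been consumed, without buffering the payload."""
    h = hashlib.sha256()
    for chunk in chunks:
        h.update(chunk)
    return h.hexdigest()
```

The incremental digest is byte-for-byte identical to a one-shot hash of the whole payload, which is what allows a streaming producer and a conventional verifier to interoperate.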
The @neurIST project
(2008)
This paper presents the security architecture of the @neurIST medical information system. @neurIST aims at a research and decision support system for treating diseases that unites multiple medical institutions and service providers, offering technical solutions based on the Service-Oriented Architecture (SOA) paradigm. The security architecture provides secure access to federated medical data spread across multiple sites and protects the privacy of the patients by pseudonymisation of the medical data required for the study.
Trust and Social Capital: Revisiting an Offshoring Failure Story of a Small German Software Company
(2009)
Objektrelationale Datenbanken und Rough Sets für die Analyse von Contextualized Attention Metadata
(2009)
In this paper, we present a solution for testing cultural influences on e-learning in a global context. Based on a metadata approach, we show how cultural influence factors in particular can be determined in order to transfer and adapt learning environments. We present a method for validating these influence factors, both to improve the dynamic metadata specification and for use in the development of (international) e-learning scenarios.
Usable Security und Privacy
(2010)
When entering a password (or another secret), the typed input is most commonly masked, i.e. the characters are hidden behind bullets or asterisks. This, however, complicates the input and greatly decreases the user's confidence, causing issues such as failed login attempts. On the other hand, password masking is an important security requirement in many applications and contexts, as it prevents a third person from reading the password. Thus, simply dropping password masking is not feasible in general. A common solution gives the user the choice of toggling password masking on and off, but due to differing defaults (depending on the application and context) this is rather complex and confusing. Enhanced password visualization technologies beyond simple masking can provide more sophisticated solutions from both a usability and a security perspective. In this paper, available password visualization technologies are presented and discussed. Furthermore, a novel password visualization approach is introduced, the TransparentMask, which provides unique properties in comparison to existing schemes. Amongst these are the ability to detect mistakes while typing and to localize and correct a typo within a certain range. Finally, a security analysis of the TransparentMask shows that the protection level provided by a given password length is slightly decreased in comparison to the fully masked approach.
Publikation von Umweltdaten
(2010)
The Web has become an indispensable prerequisite of everyday life, and the Web browser is the most used application on a variety of distinct devices. The content delivered by the Web has changed drastically from static pages to media-rich and interactive Web applications offering nearly the same functionality as native applications, a trend further pushed by the Cloud and, more specifically, the Cloud’s SaaS layer. In the light of this development, security and performance of Web browsing have become crucial issues.
Software offshoring has been established as an important business strategy over the last decade. While research on such forms of Global Software Development (GSD) has mainly focused on the situation of large enterprises, small enterprises are increasingly engaging in offshoring, too. Representing the biggest share of the German software industry, small companies are known to be important innovators and market pioneers. They often regard their flexibility and customer orientation as core competitive advantages. Unlike large corporations, their small size allows them to adopt software development approaches characterized by high agility and flat hierarchies. At the same time, their distinct strategies make it unlikely that they can simply adopt management strategies that were developed for larger companies.
Flexible development approaches like the ones preferred by small corporations have proven problematic in the context of offshoring, as their dependency on constant communication is strongly affected by the various barriers of international cooperation between companies. Cooperating closely across company borders, in different time zones, and in culturally diverse teams poses complex obstacles for flexible management approaches. It is still a matter of discussion in fields like Software Engineering and Computer Supported Cooperative Work how these obstacles can be tackled and how they affect companies in the long term. It is agreed, however, that we need a more detailed understanding of distributed software development practices in order to arrive at feasible technological and organizational solutions.
This dissertation presents results from two ethnographically-informed case studies of software offshoring in small German enterprises. By adopting Anselm Strauss’ concept of articulation work, we want to deepen the understanding of managing distributed software development in flexible, customer-oriented organizations. In doing so, we show how practices of coordinating inter-organizational software development are closely related to aspects of organizational learning in small enterprises. By means of interviews with developers and project managers from both parties of the cooperation, we do not only take into account the multiple perspectives of the cooperation, but also include the socio-cultural background of international software development projects into our analysis.
XML Encryption and XML Signature are fundamental security standards forming the core of many applications that process XML-based data. Due to the increased usage of XML in distributed systems and platforms such as SOA and Cloud settings, the demand for robust and effective security mechanisms has increased as well. Recent research, however, discovered substantial vulnerabilities in these standards as well as in the vast majority of the available implementations. Amongst them, the so-called XML Signature Wrapping attack is one of the most relevant. With the many possible instances of this attack type, it is feasible to annul security systems relying on XML Signature and to gain access to protected resources, as has been successfully demonstrated lately for various Cloud infrastructures and services. This paper contributes a comprehensive approach to robust and effective XML Signatures for SOAP-based Web Services. An architecture is proposed which integrates the required enhancements to ensure fail-safe and robust signature generation and verification. Following this architecture, a hardened XML Signature library has been implemented. The obtained evaluation results show that the developed concept and library provide the targeted robustness against all known kinds of XML Signature Wrapping attacks. Furthermore, the empirical results underline that these security merits come at low efficiency and performance costs and remain compliant with the underlying standards.
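The core of a wrapping attack is that an Id-based signature reference resolves to one element while the application processes another. One commonly discussed countermeasure, sketched below under assumptions (this is not the paper's hardened library; element and attribute names are illustrative), is to verify both the uniqueness of the Id and the absolute document position of the signed element:

```python
import xml.etree.ElementTree as ET

def find_by_id(root, elem_id):
    """Return all elements carrying the given Id attribute.
    More than one match is a classic wrapping symptom."""
    return [e for e in root.iter() if e.get("Id") == elem_id]

def path_of(root, target):
    """Absolute child-index path from the root to the target."""
    def walk(node, path):
        if node is target:
            return path
        for i, child in enumerate(node):
            found = walk(child, path + (i,))
            if found is not None:
                return found
        return None
    return walk(root, ())

def verify_reference(root, elem_id, expected_path):
    """Accept the reference only if the Id resolves uniquely AND
    the element sits at the position the application processes."""
    matches = find_by_id(root, elem_id)
    return len(matches) == 1 and path_of(root, matches[0]) == expected_path
```

In a wrapped message, the signed element is moved into a wrapper and a forged element takes its expected place; the position check then fails even though the Id still resolves.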
The documentation requirements for data published in long-term archives have grown significantly over the last decade. At WDCC, the data publishing process is assisted by “Atarrabi”, a web-based workflow system in which data authors and the publication agent review and edit metadata information. The system ensures high metadata quality for long-term use of the data with persistent identifiers (DOI/URN). These well-defined references (DOIs) allow credit to be properly given to the data producers in any publication.
New attacks on IT systems in which sensitive data is stolen become known almost daily. This book conveys the essential foundations and technologies needed to secure computer networks. Throughout, the authors emphasise a comprehensible presentation that, wherever possible, dispenses with abstract models and formal notation. Each chapter offers exercises for checking knowledge and understanding.
Botnets
(2013)
Malware poses one of the major threats to all currently operated computer systems. The scale of the problem becomes obvious by looking at the global economic loss caused by different kinds of malware, estimated at more than US$ 10 billion every year. Botnets, a special kind of malware, are used by criminals to reap economic gains as well as for politically motivated activities. In contrast to other kinds of malware, botnets utilize a hidden communication channel to receive commands from their operator and communicate their current status. The ability to execute almost arbitrary commands on the infected machines makes botnets a general-purpose tool for performing malicious cyber-activities. (Publisher's description)
The usage of the Web has experienced vertiginous growth in the last few years, with watching video online as one major driving force. Until the appearance of the HTML5 agglomerate of (still draft) specifications, the access to and consumption of multimedia content on the Web had not been standardized; hence, proprietary Web browser plugins flourished as an intermediate solution. With the introduction of the HTML5 video element, Web browser plugins are replaced by a standardized alternative. Still, HTML5 video is currently limited in many respects, including access to only file-based media. This paper investigates approaches to developing video live streaming solutions based on available Web standards. Besides a pull-based design built on HTTP, a push-based architecture is introduced, making use of the WebSocket protocol, which is part of the HTML5 standards family as well. The evaluation results for both conceptual principles emphasize that push-based approaches have a higher potential for providing resource- and cost-efficient solutions than their pull-based counterparts. In addition, initial approaches to instrumenting the proposed push-based architecture with adaptiveness to network conditions have been developed.
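The push principle contrasted here with HTTP polling can be modelled in a few lines. This is a minimal in-process sketch, not the paper's WebSocket architecture: the hub and class names are invented, and asyncio queues stand in for WebSocket connections, but the key property is the same, namely that the server sends each new segment exactly once per subscriber and clients never poll:

```python
import asyncio

class PushBroadcaster:
    """Push-based delivery: the server writes each new media segment
    into every subscriber's queue as soon as it exists."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self) -> asyncio.Queue:
        # In a WebSocket setting, this corresponds to an open connection.
        q = asyncio.Queue()
        self.subscribers.append(q)
        return q

    def publish(self, segment: bytes) -> None:
        # One write per subscriber per segment; no client-side polling.
        for q in self.subscribers:
            q.put_nowait(segment)

async def demo():
    hub = PushBroadcaster()
    client = hub.subscribe()
    hub.publish(b"segment-0")
    hub.publish(b"segment-1")
    return [await client.get(), await client.get()]
```

A pull-based client would instead issue repeated HTTP requests, many of which return nothing new; the per-segment cost of push is what makes it more resource-efficient.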
SOA-Readiness of REST
(2014)
Service Security Revisited
(2014)
Dieses Buch führt Sie umfassend in die WebSocket-Technik und die damit einhergehenden neuen Entwicklungsmöglichkeiten ein. Unter den zahlreichen exemplarischen Anwendungen finden sich Beispiele auf Basis von Node.js, Vert.x, und JSR 356, als Programmiersprachen werden Java und JavaScript eingesetzt.
Despite the lack of standardisation for building RESTful HTTP applications, the deployment of REST-based Web Services has attracted increased interest. This gap, however, causes an ambiguous interpretation of REST and induces the design and implementation of REST-based systems following proprietary approaches instead of clear and agreed-upon definitions. The resulting issues influence service properties such as the loose coupling of REST-based services via a unitary service contract and the automatic generation of code. To overcome such limitations, at least two prerequisites are required: the availability of specifications for implementing REST-based services, and auxiliaries for auditing the compliance of those services with such specifications. This paper introduces an approach for conformance testing of REST-based Web Services. This appears contradictory at first glance, since there are no specifications available for implementing REST with, e.g., the prevalent technology set HTTP/URI to test against. Still, by providing a conformance test tool aligned with current practice, the exploration of service properties is enabled; moreover, the real demand for standardisation becomes explorable by such an approach. First investigations conducted with the developed conformance test system, targeting major Cloud-based storage services, expose inconsistencies in many respects, which emphasizes the necessity for further research and standardisation.
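A conformance test of this kind boils down to checking observed HTTP exchanges against a rule catalogue. A minimal sketch under assumptions: the rules below are a small illustrative subset of heuristics drawn from HTTP semantics (RFC 7231), not the paper's actual test catalogue, and the function signature is invented:

```python
def check_conformance(method: str, status: int, headers: dict) -> list:
    """Return the constraint violations observed in one HTTP exchange.
    Illustrative subset of rules; a real test suite would cover the
    full method/status/header matrix and representation formats."""
    violations = []
    # A 405 must advertise which methods the resource supports.
    if status == 405 and "Allow" not in headers:
        violations.append("405 response lacks an Allow header")
    # A created resource should be addressable via Location.
    if status == 201 and "Location" not in headers:
        violations.append("201 Created lacks a Location header")
    # A successful GET should declare its media type.
    if method == "GET" and status == 200 and "Content-Type" not in headers:
        violations.append("200 GET response lacks a Content-Type header")
    return violations
```

Running such checks against recorded exchanges from several storage services makes deviations between providers directly comparable, which is exactly what exposes the standardisation gap.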
This paper gives the foundations necessary to understand the mechanism of warning processing and summarizes the state of the art in warning development. This includes a description of the tools researchers use in this scientific field: in particular, models that describe the human way of processing warnings, and mental models. Both are presented in detail with relevant examples. The paper explains how these tools are connected and how they are used to improve the effectiveness of warnings.
Web of Services Security
(2015)