Digital ecosystems are driving the digital transformation of business models. Meanwhile, the associated processing of personal data within these complex systems poses challenges to the protection of individual privacy. In this paper, we explore these challenges from the perspective of digital ecosystems' platform providers. To this end, we present the results of an interview study with seven data protection officers representing a total of 12 digital ecosystems in Germany. We identified current and future challenges for the implementation of data protection requirements, covering legal obligations and data subject rights. Our results support stakeholders involved in the implementation of privacy protection measures in digital ecosystems, and they form the foundation for future privacy-related studies tailored to the specifics of digital ecosystems.
Risk-based authentication (RBA) extends authentication mechanisms to make them more robust against account takeover attacks, such as those using stolen passwords. RBA is recommended by NIST and the NCSC to strengthen password-based authentication and is already used by major online services. Users also consider RBA to be more usable than two-factor authentication and just as secure. However, users currently obtain RBA's high security and usability benefits at the cost of exposing potentially sensitive personal data (e.g., IP address or browser information). This conflicts with user privacy and requires consideration of users' rights regarding the processing of personal data. We outline potential privacy challenges regarding different attacker models and propose improvements to balance privacy in RBA systems. To estimate the properties of the privacy-preserving RBA enhancements in practical environments, we evaluated a subset of them with long-term data from 780 users of a real-world online service. Our results show the potential to increase privacy in RBA solutions; however, this potential is limited to certain parameter choices, which should guide RBA design to protect privacy. We outline research directions that need to be considered to achieve widespread adoption of privacy-preserving RBA with high user acceptance.
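As a minimal, hypothetical sketch of the trade-off described here (not the paper's evaluated mechanism), the following Python snippet scores how unfamiliar a login looks against a user's login history and coarsens the IP address before it is stored; all function names, features, and parameters are invented for illustration.

```python
# Hypothetical sketch of a history-based RBA risk score with a
# privacy-preserving twist: IP addresses are coarsened and hashed
# before storage, so the stored value is less directly identifying.

import hashlib
from collections import Counter

def coarsen_ip(ip: str, prefix_len: int = 3) -> str:
    """Truncate an IPv4 address to its /24 network and hash it
    (an assumed privacy enhancement, traded against some signal)."""
    network = ".".join(ip.split(".")[:prefix_len])
    return hashlib.sha256(network.encode()).hexdigest()[:16]

def risk_score(login: dict, history: list[dict], smoothing: float = 1.0) -> float:
    """Higher score = less familiar login context for this user.
    Each feature contributes the inverse of its smoothed relative frequency."""
    score = 1.0
    for feature in ("ip", "user_agent"):
        counts = Counter(h[feature] for h in history)
        p = (counts[login[feature]] + smoothing) / (len(history) + smoothing * (len(counts) + 1))
        score *= 1.0 / p
    return score

history = [
    {"ip": coarsen_ip("198.51.100.23"), "user_agent": "Firefox/115"},
    {"ip": coarsen_ip("198.51.100.42"), "user_agent": "Firefox/115"},
]
familiar = {"ip": coarsen_ip("198.51.100.99"), "user_agent": "Firefox/115"}
unfamiliar = {"ip": coarsen_ip("203.0.113.5"), "user_agent": "curl/8.0"}

print(risk_score(familiar, history) < risk_score(unfamiliar, history))  # True
```

Note that coarsening to a /24 network still lets logins from the same subnet count as familiar; how much to coarsen is exactly the kind of parameter choice the abstract says must guide privacy-preserving RBA design.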
Privacy in the Workplace (Privatheit am Arbeitsplatz)
(2020)
Users should always play a central role in the development of (software) solutions. The human-centered design (HCD) process in the ISO 9241-210 standard proposes a procedure for systematically involving users. However, due to its level of abstraction, the HCD process provides little guidance on how it should be implemented in practice. In this chapter, we propose three concrete practical methods that enable the reader to develop usable security and privacy (USP) solutions using the HCD process. This chapter equips the reader with the procedural knowledge and recommendations to: (1) derive mental models with regard to security and privacy, (2) analyze USP needs and privacy-related requirements, and (3) collect user characteristics on privacy and structure them by user group profiles and into privacy personas. Together, these approaches help to design user-friendly security and privacy measures based on a firm understanding of the key stakeholders.
Components and Architecture for the Implementation of Technology-Driven Employee Data Protection
(2021)
Applied privacy research has so far focused mainly on consumer relations in private life. Privacy in the context of employment relationships is less well studied, although it is subject to the same legal privacy framework in Europe. The European General Data Protection Regulation (GDPR) has strengthened employees' right to privacy by obliging employers to provide transparency and intervention mechanisms. For such mechanisms to be effective, employees must have a sound understanding of their functions and value. We explored possible boundaries by conducting a semi-structured interview study with 27 office workers in Germany and elicited mental models of the right to informational self-determination, which is the European proxy for the right to privacy. We provide insights into (1) perceptions of different categories of data, (2) familiarity with the legal framework regarding expectations for privacy controls, and (3) awareness of data processing, data flow, safeguards, and threat models. We found that the legal terms commonly used in privacy policies to describe categories of data are misleading. We further identified three groups of mental models that differ in their privacy control requirements and willingness to accept restrictions on their privacy rights. We also found ignorance about actual data flow, processing, and safeguard implementation. Participants' mindsets were shaped by their faith in organizational and technical measures to protect privacy. Employers and developers may benefit from our contributions by understanding the types of privacy controls desired by office workers and the challenges to be considered when conceptualizing and designing usable privacy protections in the workplace.
The ongoing digitisation of everyday working life means that ever larger amounts of employees' personal data are processed by their employers. This development is particularly problematic with regard to employee data protection and the right to informational self-determination. We propose the use of company Privacy Dashboards as a means to compensate for the lack of transparency and control. For the conceptual design, we use, among other things, the method of mental models. We present the methodology and first results of our research and highlight the opportunities that such an approach offers for the user-centred development of Privacy Dashboards.
Continuous authentication has emerged as a promising approach to increase user account security for online services. Unlike traditional authentication methods, continuous authentication provides ongoing security throughout the session, protecting against session takeover attacks due to illegitimate access. The effectiveness of continuous authentication systems relies on the continuous processing of users' sensitive biometric data. To balance the security and privacy trade-offs, it is crucial to understand when users are willing to disclose biometric data for enhanced account security, and to address the inevitable privacy concerns and questions of user acceptance. To address this knowledge gap, we conducted an online study with 830 participants from the U.S., aiming to investigate user perceptions towards continuous authentication across different classes of online services. Our analysis identified four groups of biometric traits that directly reflect users' willingness to disclose them. Our findings demonstrate that willingness to disclose is influenced by both the specific biometric traits and the type of online service involved. User perceptions are strongly shaped by factors such as response efficacy, perceived privacy risks associated with the biometric traits, and concerns about the service providers' handling of such data. Our results emphasize the inadequacy of one-size-fits-all solutions and provide valuable insights for the design and implementation of continuous authentication systems.
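To make the mechanism concrete, here is a minimal, hypothetical sketch of a continuous authentication loop in Python: a running trust score blends in the similarity of each new behavioral sample (here, simulated keystroke intervals) and locks the session when trust falls below a threshold. The similarity measure, decay factor, and thresholds are invented for illustration and are not the scoring model of any system in the study.

```python
# Hypothetical sketch of a session-level continuous authentication loop.

from statistics import mean

def similarity(sample: list[float], profile: list[float]) -> float:
    """Crude behavioral match: 1.0 for identical mean keystroke intervals,
    decaying toward 0.0 as the sample deviates from the enrolled profile."""
    deviation = abs(mean(sample) - mean(profile))
    return max(0.0, 1.0 - deviation / mean(profile))

def run_session(samples, profile, trust=1.0, decay=0.7, lock_below=0.5):
    """Blend each new sample into a running trust score; lock the session
    (forcing re-authentication) once trust drops below the threshold."""
    for i, sample in enumerate(samples):
        trust = decay * trust + (1 - decay) * similarity(sample, profile)
        if trust < lock_below:
            return f"locked at sample {i} (trust={trust:.2f})"
    return f"session ok (trust={trust:.2f})"

profile = [0.18, 0.21, 0.19]               # enrolled keystroke intervals (seconds)
legitimate = [[0.20, 0.19], [0.18, 0.22]]  # similar typing rhythm
takeover   = [[0.55, 0.60], [0.58, 0.62]]  # markedly different rhythm

print(run_session(legitimate, profile))  # session ok
print(run_session(takeover, profile))    # locked
```

The decay factor keeps a single noisy sample from locking out a legitimate user, which is one way such a system could trade false rejections against takeover detection speed.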
The processing of employees' personal data is dramatically increasing, yet there is a lack of tools that allow employees to manage their privacy. In order to develop such tools, one needs to understand what sensitive personal data are and what factors influence employees' willingness to disclose them. Current privacy research, however, lacks such insights, as it has focused on other contexts in recent decades. To fill this research gap, we conducted a cross-sectional survey with 553 employees from Germany. Our survey provides multiple insights into the relationships between perceived data sensitivity and willingness to disclose in the employment context. Among other things, we show that the perceived sensitivity of certain types of data differs substantially from existing studies in other contexts. Moreover, currently used legal and contextual distinctions between different types of data do not accurately reflect the subtleties of employees' perceptions. Instead, using 62 different data elements, we identified four groups of personal data that better reflect the multi-dimensionality of perceptions. However, previously found common disclosure antecedents in the context of online privacy do not seem to affect them. We further identified three groups of employees that differ in their perceived data sensitivity and willingness to disclose, but neither in their privacy beliefs nor in their demographics. Our findings thus provide employers, policy makers, and researchers with a better understanding of employees' privacy perceptions and serve as a basis for future targeted research on specific types of personal data and employees.
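The abstract does not detail how the four data groups were derived; exploratory factor analysis is a common method for this kind of item grouping. As a generic, hypothetical illustration (the data, item counts, and parameters below are invented, not the authors' analysis pipeline), the following sketch recovers item groups from simulated survey ratings with scikit-learn.

```python
# Generic sketch: grouping survey items into latent dimensions via
# exploratory factor analysis on simulated, standardized ratings.

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Simulated ratings: 200 respondents x 6 items; items 0-2 and 3-5 correlate.
latent = rng.normal(size=(200, 2))
loadings = np.array([[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1]], float)
ratings = latent @ loadings.T + 0.3 * rng.normal(size=(200, 6))

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(ratings)
# Assign each item to the factor on which it loads most strongly.
groups = np.abs(fa.components_).argmax(axis=0)
print(groups)  # e.g., [0 0 0 1 1 1] -- two recovered item groups
```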
The European General Data Protection Regulation requires the implementation of Technical and Organizational Measures (TOMs) to reduce the risk of illegitimate processing of personal data. For these measures to be effective, they must be applied correctly by employees who process personal data under the authority of their organization. However, even data processing employees often have limited knowledge of data protection policies and regulations, which increases the likelihood of misconduct and privacy breaches. To lower the likelihood of unintentional privacy breaches, TOMs must be developed with employees' needs, capabilities, and usability requirements in mind. Privacy patterns have proven effective for reducing implementation costs and helping organizations and IT engineers with this task. In this chapter, we introduce the privacy pattern Data Cart, which specifically helps to develop TOMs for data processing employees. Based on a user-centered design approach with employees from two public organizations in Germany, we present a concept that illustrates how Privacy by Design can be effectively implemented. Organizations, IT engineers, and researchers will gain insight into how to improve the usability of privacy-compliant tools for managing personal data.
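The abstract describes Data Cart only at the conceptual level. As one hypothetical reading of its core idea (not the chapter's specification), the following Python sketch bundles a data "checkout" with a declared purpose, legal basis, and retention limit, and rejects fields outside the purpose's scope; all class, field, and mapping names are invented for illustration.

```python
# Hypothetical illustration of a Data Cart-style checkout: privacy
# obligations (purpose, legal basis, retention) travel with the
# processing activity, and data minimization is enforced up front.

from dataclasses import dataclass
from datetime import timedelta

@dataclass
class DataCartRequest:
    requester: str
    purpose: str            # documented processing purpose
    legal_basis: str        # e.g., "GDPR Art. 6(1)(b)"
    data_fields: list[str]  # only the fields actually needed
    retention: timedelta    # when the checkout expires

    def validate(self, allowed_fields: set[str]) -> None:
        """Enforce data minimization: reject fields outside the purpose's scope."""
        excess = set(self.data_fields) - allowed_fields
        if excess:
            raise PermissionError(f"fields not covered by purpose: {excess}")

# Purpose-to-fields mapping a DPO might maintain (illustrative).
PAYROLL_FIELDS = {"name", "bank_account", "salary_group"}

request = DataCartRequest(
    requester="hr.clerk@example.org",
    purpose="monthly payroll run",
    legal_basis="GDPR Art. 6(1)(b)",
    data_fields=["name", "bank_account"],
    retention=timedelta(days=30),
)
request.validate(PAYROLL_FIELDS)  # passes; adding "health_status" would raise
```

The point of such a structure is that the employee never handles raw personal data without an attached, checkable justification, which is one way a TOM can be usable for non-experts.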
The processing of employee personal data is dramatically increasing. To protect employees' fundamental right to privacy, the law provides for the implementation of privacy controls, including transparency and intervention. At present, however, the stakeholders responsible for putting these obligations into action, such as employers and software engineers, simply lack the fundamental knowledge needed to design and implement the necessary controls. Indeed, privacy research has so far focused mainly on consumer relations in the private context, whereas privacy in the employment context is less well studied. Since privacy is highly context-dependent, existing knowledge and privacy controls from other contexts cannot simply be transferred to the employment context. In particular, privacy in employment is subject to different legal and social norms, which require a different conceptualization of the right to privacy than is usual in other contexts. To adequately address these aspects, there is broad consensus that privacy must be regarded as a socio-technical concept in which human factors are considered alongside technical and legal factors. Today, however, there is a particular lack of knowledge about human factors in employee privacy, even though disregarding the needs and concerns of individuals, or a lack of usability, is a common reason for the failure of privacy and security measures in practice.

This dissertation addresses key knowledge gaps on human factors in employee privacy by presenting the results of three in-depth studies with employees in Germany. The results provide insights into employees' perceptions of the right to privacy, as well as their perceptions and expectations regarding the processing of employee personal data. The insights gained provide a foundation for the human-centered design and implementation of employee-centric privacy controls, i.e., privacy controls that incorporate the views, expectations, and capabilities of employees.

Specifically, this dissertation presents the first mental models of employees on the right to informational self-determination, the German equivalent of the right to privacy. The results provide insights into employees' (1) perceptions of categories of data, (2) familiarity with and expectations of the right to privacy, and (3) perceptions of data processing, data flow, safeguards, and threat models. In addition, three major types of mental models are presented, each with a different conceptualization of the right to privacy and a different desire for control.

Moreover, this dissertation provides multiple insights into employees' perceptions of data sensitivity and willingness to disclose personal data in employment. It highlights the uniqueness of the employment context compared to other contexts and breaks down the multi-dimensionality of employees' perceptions of personal data. The dimensions in which employees perceive data are presented, and differences among employees are highlighted. This is complemented by identifying personal characteristics and attitudes toward employers, as well as toward the right to privacy, that influence these perceptions.

Furthermore, this dissertation provides insights into practical aspects of implementing personal data management solutions to safeguard employee privacy. It presents the results of a user-centered design study with employees who process personal data of other employees as part of their job. Based on the results obtained, a privacy pattern is presented that harmonizes privacy obligations with personal data processing activities. The pattern is useful for designing privacy controls that help these employees handle employee personal data in a privacy-compliant manner, taking into account their skills and knowledge, and thus helps to protect employee privacy.

The outcome of this dissertation benefits a wide range of stakeholders involved in the protection of employee privacy. For example, it highlights the challenges to be considered by employers and software engineers when conceptualizing and designing employee-centric privacy controls. Policymakers and researchers gain a better understanding of employees' perceptions of privacy and obtain fundamental knowledge for future research into theoretical and abstract concepts or practical issues of employee privacy. Employers, IT engineers, and researchers gain insights into ways to empower data processing employees to handle employee personal data in a privacy-compliant manner, enabling employers to improve and promote compliance. Since the basic principles underlying informational self-determination have been incorporated into European privacy legislation, we are confident that our results are also relevant to stakeholders outside Germany.