Fachbereich Wirtschaftswissenschaften
H-BRS Bibliography
- yes (319)
Departments, institutes and facilities
- Fachbereich Wirtschaftswissenschaften (319)
- Institut für Verbraucherinformatik (IVI) (102)
- Internationales Zentrum für Nachhaltige Entwicklung (IZNE) (50)
- Centrum für Entrepreneurship, Innovation und Mittelstand (CENTIM) (7)
- Fachbereich Sozialpolitik und Soziale Sicherung (6)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (6)
- Fachbereich Ingenieurwissenschaften und Kommunikation (5)
- Fachbereich Informatik (4)
- Graduierteninstitut (4)
- Institute of Visual Computing (IVC) (3)
Document Type
- Article (136)
- Conference Object (100)
- Part of a Book (35)
- Book (monograph, edited volume) (17)
- Working Paper (8)
- Conference Proceedings (6)
- Preprint (6)
- Doctoral Thesis (4)
- Report (4)
- Contribution to a Periodical (2)
Year of publication
Language
- English (319)
Keywords
- Sustainability (9)
- ICT (7)
- Dementia (5)
- Exergame (5)
- Kenya (5)
- User Experience (5)
- Well-being (5)
- work engagement (5)
- Design (4)
- Eco-Feedback (4)
Queueing Theory
(2024)
In the last two decades, studies that analyse the political economy of sustainable energy transitions have increasingly become available. Yet very few attempts have been made to synthesize the factors discussed in the growing literature. This paper reviews the extant empirical literature on the political economy of sustainable energy transitions. Using a well-defined search strategy, a total of 36 empirical contributions covering the period 2008 to 2022 are reviewed in full text. Overall, the findings highlight the role of vested interests, advocacy coalitions and green constituencies, path dependency, external shocks, the policy and institutional environment, political institutions, and fossil fuel resource endowments as major political economy factors influencing sustainable energy transitions across both high-income countries and low- and middle-income countries. In addition, the paper highlights and discusses some critical knowledge gaps in the existing literature and provides suggestions for a future research agenda.
Mergers and acquisitions take place all over the world and in many industries, typically motivated by corporate politics. While IT management is often not involved in the decision-making, it has to solve a wide range of problems in the post-merger phase. Indeed, merging two or more companies implies not only merging their core businesses, but also creating a new and efficiently integrated IT organisation from the individual ones, since persistence of the current IT organisations usually does not make sense. In addition, corporate management frequently imposes constraints, e.g., cost reductions, on the IT infrastructure. The principal critical success factor when merging IT organisations is the uninterrupted operation of the IT business, because a service gap is acceptable neither for in-house functional departments nor for external customers. Therefore, the IT rebuilding phase has to focus on IT services that facilitate the processes of functional departments, support processes, and processes of customers and suppliers, so that any transformation work is transparent to internal and external customers. In this article we describe a real-world but anonymized case study. Our goals are to highlight the points important for merging IT organisations, and to help decision-makers, particularly in the areas of IT organisation and IT personnel. We focus on the organisational and non-technical issues that arise from a management perspective, i.e., the CIO's view, and provide checklists intended to help IT managers address the most pressing issues. To assist CIOs in surviving the post-merger phase, we provide checklists for merging IT organisations, merging IT human resources, and IT budgets and reporting, and we assess activities in a merger scenario. IT hardware, software and IT infrastructure, as well as running IT projects, are not considered in this paper.
IT performance measurement is often associated by chief executive officers with IT cost cutting, although IT performance measurement actually protects business processes from rising IT costs; indiscriminate IT cost cutting only endangers the company’s efficiency. This view stigmatizes those who practise IT performance measurement in companies as bean-counters. The present paper describes an integrated reference model for IT performance measurement based on a life cycle model and a performance-oriented framework. The model was created from a practical point of view: it is designed lean compared with other known concepts and is therefore well suited to small and medium-sized enterprises (SMEs).
In 1991, researchers at the Center for the Learning Sciences of Carnegie Mellon University were confronted with the confusing question “Where is the AI?” from users who were interacting with AI but did not realize it. Three decades of research later, we are still facing the same issue with users of AI technology. In the absence of users’ awareness of AI-enabled systems and of mutual understanding between designers and users, users’ informal theories about how a system works (“folk theories”) become inevitable, but can lead to misconceptions and ineffective interactions. To shape appropriate mental models of AI-based systems, AI practitioners have suggested explainable AI. However, a profound understanding of current users’ perception of AI is still missing. In this study, we introduce the term “Perceived AI” (PAI) as “AI defined from the perspective of its users”. We then present preliminary results from in-depth interviews with 50 users of AI technology, which provide a framework for our future research approach towards a better understanding of PAI and users’ folk theories.
For most people, using their body to authenticate their identity is an integral part of daily life. From our fingerprints to our facial features, our physical characteristics store the information that identifies us as "us." This biometric information is becoming increasingly vital to the way we access and use technology. As more and more platform operators struggle with traffic from malicious bots on their servers, the burden of proof is on users, only this time they have to prove their very humanity and there is no court or jury to judge, but an invisible algorithmic system. In this paper, we critique the invisibilization of artificial intelligence policing. We argue that this practice obfuscates the underlying process of biometric verification. As a result, the new "invisible" tests leave no room for the user to question whether the process of questioning is even fair or ethical. We challenge this thesis by offering a juxtaposition with the science fiction imagining of the Turing test in Blade Runner to reevaluate the ethical grounds for reverse Turing tests, and we urge the research community to pursue alternative routes of bot identification that are more transparent and responsive.
Technological objects present themselves as necessary, only to become obsolete faster than ever before. This phenomenon has led to a population that experiences a plethora of technological objects and interfaces as they age, which become associated with certain stages of life and disappear thereafter. Noting the expanding body of literature within HCI about appropriation, our work pinpoints an area that needs more attention, “outdated technologies.” In other words, we assert that design practices can profit as much from imaginaries of the future as they can from reassessing artefacts from the past in a critical way. In a two-week fieldwork with 37 HCI students, we gathered an international collection of nostalgic devices from 14 different countries to investigate what memories people still have of older technologies and the ways in which these memories reveal normative and accidental use of technological objects. We found that participants primarily remembered older technologies with positive connotations and shared memories of how they had adapted and appropriated these technologies, rather than normative uses. We refer to this phenomenon as nostalgic reminiscence. In the future, we would like to develop this concept further by discussing how nostalgic reminiscence can be operationalized to stimulate speculative design in the present.
When dialogues with voice assistants (VAs) fall apart, users often become confused or even frustrated. To address these issues and related privacy concerns, Amazon recently introduced a feature allowing Alexa users to inquire about why it behaved in a certain way. But how do users perceive this new feature? In this paper, we present preliminary results from research conducted as part of a three-year project involving 33 German households. This project utilized interviews, fieldwork, and co-design workshops to identify common unexpected behaviors of VAs, as well as users’ needs and expectations for explanations. Our findings show that, contrary to its intended purpose, the new feature actually exacerbates user confusion and frustration instead of clarifying Alexa's behavior. We argue that such voice interactions should be characterized as explanatory dialogs that account for VA’s unexpected behavior by providing interpretable information and prompting users to take action to improve their current and future interactions.
AI (artificial intelligence) systems are increasingly being used in all aspects of our lives, from mundane routines to sensitive decision-making and even creative tasks. Therefore, an appropriate level of trust is required so that users know when to rely on the system and when to override it. While research has looked extensively at fostering trust in human-AI interactions, the lack of standardized procedures for human-AI trust makes it difficult to interpret results and compare across studies. As a result, the fundamental understanding of trust between humans and AI remains fragmented. This workshop invites researchers to revisit existing approaches and work toward a standardized framework for studying AI trust to answer the open questions: (1) What does trust mean between humans and AI in different contexts? (2) How can we create and convey the calibrated level of trust in interactions with AI? And (3) How can we develop a standardized framework to address new challenges?
This paper gives an overview of the development of Fair Trade in six European countries: Austria, France, Germany, the Netherlands, Switzerland and the United Kingdom. After the description of the food retail industry and its market structures in these countries, the main European Fair Trade organizations are analyzed regarding their role within the Fair Trade system. The following part deals with the development of Fair Trade sales in general and with respect to the products coffee, tea, bananas, fruit juice and sugar. An overview of the main activities of national Fair Trade organizations, e.g. public relations activities, completes the analysis. This study shows the enormous upswing of Fair Trade during the last decade and the reasons for this development. Nevertheless, it comes to the conclusion that Fair Trade is still far away from being an essential part of the food retail industry in Europe.
Public preferences
(2021)
For reforms to be acceptable and sustainable in the long run, they should be aligned with public preferences. ‘Preferences’ is a technical term used in the social sciences and humanities, in disciplines such as economics, philosophy or psychology. Broadly speaking, preferences refer to an individual’s judgements of liking one alternative more than others. More specifically, preferences are ‘subjective comparative evaluations, in the form of “Agent prefers X to Y”’ (Hansson and Grüne-Yanoff 2018). Here, we are particularly interested in people’s policy preferences concerning social protection, an area which deserves more attention in policy debates and research.
The cooperation between researchers and practitioners during the different stages of the research process is promoted as it can be of benefit to both society and research, supporting processes of ‘transformation’. While acknowledging the important potential of research–practice–collaborations (RPCs), this paper reflects on RPCs from a political-economic perspective to also address potential unintended adverse effects on knowledge generation due to divergent interests, incomplete information or the unequal distribution of resources. Asymmetries between actors may induce distorted and biased knowledge and even help produce or exacerbate existing inequalities. Potential merits and limitations of RPCs, therefore, need to be gauged. Taking RPCs seriously requires paying attention to these possible tensions, both in general and with respect to international development research in particular: on the one hand, there are attempts to contribute to societal change and the ethical concerns of equity at the heart of international development research; on the other hand, such collaborations are all the more likely to encounter asymmetries.
Over the past two decades many low- and middle-income countries worldwide have started to extend the coverage and improve the functioning of public social protection systems. The research program on international policy diffusion provides empirical evidence that, apart from domestic factors, international interdependencies also matter for national policy change in social protection. However, little is known about the governance structures mediating international policy diffusion in social protection.
Over the past two decades many governments of low and middle income countries have started to introduce social protection measures or to extend the coverage and improve the functioning of public social protection systems. These reforms are a "global phenomenon" and can be observed in many African, Asian and Latin American countries. This paper focuses on international determinants for policy change within social protection by assessing the state of the art of both policy diffusion and policy transfer studies. Empirical studies of policy transfer and diffusion in the field of social protection are furthermore assessed in light of the theoretical background.
The paper examines the effectiveness of transgovernmental policy networks as a governance structure for policy diffusion. The analysis is based on a survey of 50 social protection policy makers and technical practitioners who are country delegates to transgovernmental policy networks within the policy area of social protection. The paper provides anecdotal empirical evidence that policy networks contribute to policy diffusion by inducing mutual learning processes.
Political economy analyses of recent social protection reforms in Asian, African and Latin American countries have increased throughout the last few years. Yet most contributions focus on only one social protection mechanism and do not provide a comparative approach across policy areas. In addition, most studies are empirical, with no or very limited theoretical linkages. This paper aims to explain multiple trajectories of social protection reform processes, looking at cash transfers and social health protection policies in Kenya. It develops a taxonomy and suggests a conceptual framework to assess and explain reform dynamics across different social protection pillars. In order to allow for a more differentiated typology and to understand different reform dynamics, the article uses the approach of gradual institutional change. While existing approaches to institutional change mostly focus on change prompted by exogenous shocks or environmental shifts, this approach takes account of both exogenous and endogenous sources of change.
There has been a growing interest in taste research in the HCI and CSCW communities. However, the focus has been more on stimulating the senses, while the socio-cultural aspects have received less attention. Yet individual taste perception is mediated through social interaction and collective negotiation and does not depend on physical stimulation alone. Therefore, we study the digital mediation of taste by drawing on ethnographic research on four online wine tastings and one self-organized event. We investigated the materials, associated meanings, competences, procedures, and engagements that shaped the performative character of tasting practices. We illustrate how the tastings are built around the taste-making process and how online contexts differ in providing a more diverse and distributed environment. We then explore the implications of our findings for the further mediation of taste as a social and democratized phenomenon through online interaction.
Taste is a complex phenomenon that depends on individual experience and is a matter of collective negotiation and mediation. Nevertheless, it is uncommon to include taste and its many facets in everyday design, particularly in online shopping for fresh food products. To realize this unused potential, we conducted two Co-Design workshops. Based on the participants’ results in the workshops, we prototyped and evaluated a click-dummy smartphone app to explore consumers’ needs for digital taste depiction. We found that emphasizing the natural qualities of food products, external reviews, and personalizing features leads to a reflection on the individual taste experience. The self-reflection enabled by our design allows consumers to develop their taste competencies and thus strengthen their autonomy in decision-making. Ultimately, exploring taste as a social experience adds to a broader understanding of taste beyond a sensory phenomenon.
Dried serum spots that are well prepared can be attractive alternatives to frozen serum samples for shelving specimens in a medical or research center's biobank and for mailing freshly prepared serum to specialized laboratories. During the pre-analytical phase, complications can arise which are often challenging to identify or are entirely overlooked. These complications can lead to reproducibility issues, which can be avoided in serum protein analysis by implementing optimized storage and transfer procedures. With a method that ensures accurate loading of filter paper discs with donor or patient serum, we fill a gap in dried serum spot preparation and subsequent serum analysis. In the "Submerge and Dry" protocol, pre-punched filter paper discs with a 3 mm diameter are loaded within seconds in a highly reproducible fashion (approximately 10% standard deviation) when fully submerged in 10 μl of serum. Such prepared dried serum spots can store several hundred micrograms of proteins and other serum components. Serum-borne antigens and antibodies are reproducibly released in 20 μl elution buffer in high yields (approximately 90%). Dried serum spot-stored and eluted antigens kept their epitopes, and antibodies their antigen-binding abilities, as assessed by SDS-PAGE, 2D gel electrophoresis-based proteomics, and Western blot analysis, suggesting pre-punched filter paper discs as a handy solution for serological tests.
Although most individuals who gamble do so without any adverse consequences, some individuals develop a recurrent, maladaptive pattern of gambling behaviour, often called pathological gambling or gambling disorder, that is associated with financial losses, disruption of family and interpersonal relationships, and co-occurring psychiatric disorders. Identifying whether different types of gambling modalities vary in their ability to lead to maladaptive patterns of gambling behaviour is essential to develop public policies that seek to balance access to gambling opportunities with minimizing risk for the potential adverse consequences of gambling behaviour. Until recently, assessing the risk potential of different types of gambling products was nearly impossible. ASTERIG, initially developed in Germany in 2006-2010, is an assessment tool to measure and to evaluate the risk potential of any gambling product based on scores on ten dimensions. In doing so, it also allows a comparison to be drawn between the addictive potential of different gambling products. Furthermore, the tool highlights where the specific risk potential of each specific gambling product lies. This makes it a valuable tool at the legislative, case law, and administrative levels as it allows the risk potential of individual gambling products to be identified and to be compared globally and across 10 different dimensions of risk potential. We note that specific gambling products should always be evaluated rather than product groups (lotteries, slot machines) or providers, as there may be variations among those product groups that impact their risk potential. For example, slot machines may vary on the amount of jackpot, which may influence their risk potential.
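The multi-dimensional scoring idea behind such an assessment tool can be sketched in a few lines. Note that the dimension names, the 0–4 rating scale, and the unweighted mean below are illustrative assumptions for this sketch only, not the actual ASTERIG instrument or its weighting.

```python
# Toy sketch of a ten-dimension gambling-risk score in the spirit of ASTERIG.
# Dimension names, scale, and aggregation are illustrative assumptions.

ASSUMED_DIMENSIONS = [
    "event_frequency", "payout_interval", "jackpot_size", "near_miss",
    "continuity_of_play", "availability", "multi_stake", "variable_stake",
    "sensory_design", "anonymity",
]

def risk_score(ratings: dict) -> float:
    """Aggregate per-dimension ratings (assumed 0-4) into a mean risk score."""
    missing = set(ASSUMED_DIMENSIONS) - ratings.keys()
    if missing:
        raise ValueError(f"missing dimensions: {sorted(missing)}")
    return sum(ratings[d] for d in ASSUMED_DIMENSIONS) / len(ASSUMED_DIMENSIONS)

# Two hypothetical products: a weekly lottery vs. a fast slot machine,
# illustrating why products (not product groups) should be compared.
lottery = dict.fromkeys(ASSUMED_DIMENSIONS, 1)
slot = dict.fromkeys(ASSUMED_DIMENSIONS, 3)
slot["event_frequency"] = 4  # e.g., very short time between stake and outcome
```

Comparing `risk_score(slot)` with `risk_score(lottery)` illustrates the tool's purpose of ranking individual products, while the per-dimension ratings show where each product's specific risk potential lies.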
A central objective of the German gambling law is to ensure the protection of minors and players (§ 1 Sentence 1 No. 3 GlüStV 2012). Since 2014, gambling facilities for commercial games of chance in gambling halls and restaurants have been certified by the German Safety Standards Authority [Technischer Überwachungsverein – TÜV]. Certification by government-accredited testing organizations, based on the internationally validated, interdisciplinary scientific expertise "Safeguarding the Protection of Minors and Players with Respect to Commercial Gambling in Germany – 2.0", is a quality assurance instrument from a regulatory perspective. It is in the interest of high-quality providers in particular to ensure that they are also perceived as providing this level of quality. Certification leads to market separation. In so doing, the advantages of end-to-end certification should outweigh any disadvantages. Analysing the international environment shows that certification initiatives are necessary and have been put in place in other sectors of the gambling industry.
The primary aim is quality assurance for a responsible handling of commercial games of chance offerings (responsible gambling). The presented testing catalogue for commercial gambling can also provide an impetus in the international context and, as appropriate, a set standard.
Intercultural Management
(2021)
Trust is the lubricant of the sharing economy, especially in peer-to-peer carsharing, where you leave a valuable good to a stranger in the hope of getting it back unscathed. Central mechanisms for handling this information gap nowadays are ratings and reviews of other users. The rise of connected car technology opens new possibilities to increase trust by collecting and providing, e.g., driving behavior data. At the same time, this means an intrusion into the privacy of the user. Therefore, in this work we explore technological approaches that allow building trust without violating the privacy of individuals. We evaluate to what extent blockchain technology and smart contracts are suitable technologies to meet these challenges by setting up a prototype implementation of a blockchain-based carsharing approach. In this context, we present our research approach and evaluate the prototype in terms of trust and privacy.
Trust is the lubricant of the sharing economy. This is true especially in peer-to-peer carsharing, in which one leaves a highly valuable good to a stranger in the hope of getting it back unscathed. Nowadays, ratings of other users are major mechanisms for establishing trust. To foster uptake of peer-to-peer carsharing, connected car technology opens new possibilities to support trust-building, e.g., by adding driving behavior statistics to users' profiles. However, collecting such data intrudes into rentees' privacy. To explore the tension between the need for trust and privacy demands, we conducted three focus group and eight individual interviews. Our results show that connected car technologies can increase trust for car owners and rentees not only before but also during and after rentals. The design of such systems must allow a differentiation between information in terms of type, the context, and the negotiability of information disclosure.
Due to demographic change, people are getting older, and the proportion of people with disabilities is also rising. This article shows the resulting challenges for BPM tool developers. It illustrates how these changes impact the usage of BPM tools, based on an evaluation of an exemplary BPM tool (Cooper & Patterson, 2007) in terms of IT usability and IT accessibility. The evaluation was conducted in a research laboratory at the university.
The curricula of all degree programs at H-BRS have many different practice-oriented activities and focus on hands-on learning. In labs and small classrooms (30–60 persons), students get a personalized learning environment which is complemented with many individual and group projects that foster collaborative work situations. There are several main areas that students learn from working with industry, local organizations or public institutions.
Continued growth in international experiences for U.S. college students is a favorable trend. However, the most substantial increase has occurred with short-term study abroad programs. Many of these programs include extensive travel instead of involving a single site. There is great danger that, if not properly managed, these types of international educational experiences will default into little more than an organized group tour.
In these types of programs it is challenging to induce student participants to engage meaningfully with local residents as the traveling group tends to form into its own portable society. In addition, the current state of wireless communications means that students participating in these types of programs can easily stay plugged into their home social networks which further reduces meaningful interactions in the cultures being visited.
Incorporating well designed research projects into short-term study abroad programs holds the potential to offset some of the inherent limitations of such programs. Research projects can serve both to prepare the students for the trip and promote meaningful cross-cultural interaction while the program is underway.
In this paper, the authors provide suggestions based on their experiences with short-term travel abroad programs which incorporated student research. Several potential problems are identified and suggestions are given for project design.
Neutral buoyancy has been used as an analog for microgravity from the earliest days of human spaceflight. Compared to other options on Earth, neutral buoyancy is relatively inexpensive and presents little danger to astronauts while simulating some aspects of microgravity. Neutral buoyancy removes somatosensory cues to the direction of gravity but leaves vestibular cues intact. Removal of both somatosensory and direction of gravity cues while floating in microgravity or using virtual reality to establish conflicts between them has been shown to affect the perception of distance traveled in response to visual motion (vection) and the perception of distance. Does removal of somatosensory cues alone by neutral buoyancy similarly impact these perceptions? During neutral buoyancy we found no significant difference in either perceived distance traveled nor perceived size relative to Earth-normal conditions. This contrasts with differences in linear vection reported between short- and long-duration microgravity and Earth-normal conditions. These results indicate that neutral buoyancy is not an effective analog for microgravity for these perceptual effects.
Smart home systems change the way we experience the home. While there are established research fields within HCI for visualizing specific use cases of a smart home, studies targeting user demands on visualizations spanning across multiple use cases are rare. Especially, individual data-related demands pose a challenge for usable visualizations. To investigate potentials of an end-user development (EUD) approach for flexibly supporting such demands, we developed a smart home system featuring both pre-defined visualizations and a visualization creation tool. To evaluate our concept, we installed our prototype in 12 households as part of a Living Lab study. Results are based on three interview studies, a design workshop and system log data. We identified eight overarching interests in home data and show how participants used pre-defined visualizations to get an overview and the creation tool to not only address specific use cases but also to answer questions by creating temporary visualizations.
So far, sustainable HCI has mainly focused on the domestic context, but there is a growing body of work looking at the organizational context. These works still rest on the psychological theories of behaviour change used in the domestic context. We supplement this view with an organizational-theory-informed approach that adopts organizational roles as a key element. We show how a role-based analysis can be applied to uncover information needs and to give employees eco-feedback that is linked to their tasks at hand. We illustrate the approach with a qualitative case study that was part of a broader, ongoing action research project conducted in a German production company.
Designing consumption feedback to support sustainable behavior is an active research topic. In recent years, relevant work has suggested a variety of possible design strategies. Addressing the more recent developments in this field, this paper presents a structured literature review, providing an overview of current information design approaches and highlighting open research questions. We suggest a literature-based taxonomy of used strategies, data source and output media with a special focus on design. In particular, we analyze which visual forms are used in current research to reach the identified strategy goals. Our survey reveals that the trend is towards more complex and contextualized feedback and almost every design within sustainable HCI adopts common visualization forms. Furthermore, adopting more advanced visual forms and techniques from information visualization research is helpful when dealing with ever-increasing data sources at home. Yet so far, this combination has often been neglected in feedback design.
The accurate forecasting of solar radiation plays an important role in predictive control applications for energy systems with a high share of photovoltaic (PV) energy. Especially off-grid microgrid applications using predictive control can benefit from forecasts with a high temporal resolution to address sudden fluctuations of PV power. However, cloud formation processes and movements are subject to ongoing research. For now-casting applications, all-sky imagers (ASI) are used to provide appropriate forecasts for the aforementioned applications. Recent research aims to achieve these forecasts via deep learning approaches, either as an image segmentation task that generates a DNI forecast through a cloud-vectoring approach and translates the DNI to a GHI using ground-based measurements (Fabel et al., 2022; Nouri et al., 2021), or as an end-to-end regression task that generates a GHI forecast directly from the images (Paletta et al., 2021; Yang et al., 2021). While end-to-end regression might be the more attractive approach for off-grid scenarios, the literature reports increased performance compared to smart persistence but does not show satisfactory forecasting patterns (Paletta et al., 2021). This work takes a step back and investigates the possibility of translating ASI images to the current GHI in order to deploy the neural network as a feature extractor. An ImageNet-pretrained deep learning model is used to achieve this translation on an openly available dataset from the University of California San Diego (Pedro et al., 2019). The images and measurements were collected in Folsom, California. Results show that the neural network can successfully translate ASI images to GHI for a variety of cloud situations without the need for any external variables. Extending the neural network to a forecasting task also shows promising forecasting patterns, which indicates that the neural network extracts both temporal and instantaneous features from the images to generate GHI forecasts.
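The image-to-GHI regression pipeline described above can be sketched schematically. In this minimal sketch, hand-crafted brightness statistics stand in for the ImageNet-pretrained CNN features, and the data, the feature choice, and the 900 W/m² scaling are synthetic inventions for illustration, not the paper's actual model or dataset.

```python
import numpy as np

# Schematic stand-in for the approach: a feature extractor maps an all-sky
# image to a feature vector, and a regression head maps features to GHI.
# Hand-crafted brightness statistics replace the pretrained CNN here.

rng = np.random.default_rng(0)

def extract_features(img: np.ndarray) -> np.ndarray:
    """Stand-in for CNN features: global and per-quadrant mean brightness."""
    h, w = img.shape
    quads = [img[:h // 2, :w // 2], img[:h // 2, w // 2:],
             img[h // 2:, :w // 2], img[h // 2:, w // 2:]]
    return np.array([img.mean()] + [q.mean() for q in quads])

# Synthetic "sky images"; overall brightness loosely drives the GHI target.
images = rng.uniform(0.0, 1.0, size=(200, 16, 16))
ghi = 900.0 * images.mean(axis=(1, 2)) + rng.normal(0.0, 5.0, size=200)

X = np.stack([extract_features(im) for im in images])
X = np.hstack([X, np.ones((len(X), 1))])        # append bias column
w, *_ = np.linalg.lstsq(X, ghi, rcond=None)     # fit linear regression head

pred = X @ w
rmse = float(np.sqrt(np.mean((pred - ghi) ** 2)))
```

Swapping `extract_features` for a frozen pretrained CNN backbone and the least-squares head for a small trained regressor would recover the structure of the approach the abstract describes.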
The Peren-Clement Index
(2024)
Introduction: Recovery experiences have thus far been portrayed as experiences that simply “happen” to people. However, recovery can also be understood from a crafting perspective; that is, individuals may proactively shape their work and non-work activities to recover from stress, satisfy their psychological needs, and achieve optimal functioning.
Materials and Methods: In my talk, I will present the theoretical basis of needs-based crafting based on a conceptual review of the literature. Moreover, I will present empirical findings on the validation of a newly developed off-job crafting scale.
Results: In five sub studies, we found that off-job crafting was related to optimal functioning over time. Moreover, the newly developed off-job crafting scale had good convergent and discriminant validity, internal consistency, and test-retest reliability.
Conclusions: Theoretical and empirical evidence suggests that needs-based crafting can enhance optimal functioning in different life domains and support people in performing their work duties sustainably. Proactive attempts to achieve better recovery through needs satisfaction may be beneficial in an intensified and continually changing and challenging working life. Our line of research provides important avenues for organizational research and practices regarding recovery and needs satisfaction occurring at work and outside work.
Unlimited paid time off policies are currently fashionable and widely discussed by HR professionals around the globe. While on the one hand, paid time off is considered a key benefit by employees and unlimited paid time off policies (UPTO) are seen as a major perk which may help in recruiting and retaining talented employees, on the other hand, early adopters reported that employees took less time off than previously, presumably leading to higher burnout rates. In this conceptual review, we discuss the theoretical and empirical evidence regarding the potential effects of UPTO on leave utilization, well-being and performance outcomes. We start out by defining UPTO and placing it in a historical and international perspective. Next, we discuss the key role of leave utilization in translating UPTO into concrete actions. The core of our article constitutes the description of the effects of UPTO and the two pathways through which these effects are assumed to unfold: autonomy need satisfaction and detrimental social processes. We moreover discuss the boundary conditions which facilitate or inhibit the successful utilization of UPTO on individual, team, and organizational level. In reviewing the literature from different fields and integrating existing theories, we arrive at a conceptual model and five propositions, which can guide future research on UPTO. We conclude with a discussion of the theoretical and societal implications of UPTO.