Queueing Theory
(2024)
Entrepreneurship education serves as a conduit for new venture creation, as it provides the knowledge and skills needed to increase individuals' self-efficacy to start and run new businesses and to grow existing ones. This study therefore sought to assess the relationship between approaches to the teaching of entrepreneurship and entrepreneurial intention in a cohort of 292 respondents, consisting of students who had studied entrepreneurship at three selected universities. A structured questionnaire was used to obtain data randomly from students. The canonical correlation results indicate that education for and through entrepreneurship is the best approach to promoting entrepreneurial intensity among university students if the aim of teaching entrepreneurship is to promote start-up activities. The findings provide valuable insights for institutions of higher learning and policy makers in Ghana with respect to the appropriate methodologies to be adopted in the teaching of entrepreneurship in our universities.
In the last two decades, studies that analyse the political economy of sustainable energy transitions have become increasingly available. Yet very few attempts have been made to synthesize the factors discussed in this growing literature. This paper reviews the extant empirical literature on the political economy of sustainable energy transitions. Using a well-defined search strategy, a total of 36 empirical contributions covering the period 2008 to 2022 are reviewed in full text. Overall, the findings highlight the role of vested interests, advocacy coalitions and green constituencies, path dependency, external shocks, the policy and institutional environment, political institutions, and fossil fuel resource endowments as major political economy factors influencing sustainable energy transitions across both high-income countries and low- and middle-income countries. In addition, the paper highlights and discusses some critical knowledge gaps in the existing literature and provides suggestions for a future research agenda.
As competition for tourists becomes more global, understanding and accommodating the needs of international tourists, with their different cultural backgrounds, has become increasingly important. This study highlights variations in tourist-industry service failures, particularly as they relate to different cultures. Specifically, service failures experienced by Japanese and German tourists in the U.S. were categorized using the Critical Incident Technique (CIT). The results were compared with earlier studies of service failures experienced by American consumers in the tourist industry. The sample consists of 128 Japanese and 94 “Germanic” (German, Austrian, Swiss-German) respondents. Both the Japanese and the German samples rated “inappropriate employee behavior” as the most significant category of service failure. More than half of these respondents said that, because of the failure, they would avoid the offending U.S. business. This is a much stronger response than an American sample had reported in an earlier study. The implications for managers and researchers are discussed.
Robust Indoor Localization Using Optimal Fusion Filter For Sensors And Map Layout Information
(2014)
Conclusion
(2018)
There is a paradigm shift from traditional content-based education and training to competency-based and practice-oriented training. This shift has occurred because practice-oriented teaching has been found to produce a training outcome that is industry-focused, generating the relevant occupational standards. A competency-based training program often comprises modules broken into segments called learning outcomes. These learning outcomes are based on criteria set by industry, and assessment is designed to ensure students become competent in their respective areas of specialization.
Multidisciplinarity, multiculturalism, and multitasking have taken center stage in the global educational debate. Globalization and improvements in communication have affected the way organisations operate and hence influenced whom they hire. Today, it is common practice to work with people from diverse backgrounds, and doing so requires competencies that go beyond general project management. Intercultural awareness, networking in different global communities, and learning to develop specific communication strategies for different stakeholders are all part of the package of skills and competencies required in today's interconnected world. This has indirect implications for the nature of the skills and competencies institutions and universities must equip their students with to enable them to compete successfully in the working world.
Airborne and spaceborne platforms are the primary data sources for large-scale forest mapping, but visual interpretation for individual species determination is labor-intensive. Hence, various studies focusing on forests have investigated the benefits of multiple sensors for automated tree species classification. However, transferable deep learning approaches for large-scale applications are still lacking. This gap motivated us to create a novel dataset for tree species classification in central Europe based on multi-sensor data from aerial, Sentinel-1 and Sentinel-2 imagery. In this paper, we introduce the TreeSatAI Benchmark Archive, which contains labels of 20 European tree species (i.e., 15 tree genera) derived from forest administration data of the federal state of Lower Saxony, Germany. We propose models and guidelines for the application of the latest machine learning techniques for the task of tree species classification with multi-label data. Finally, we provide various benchmark experiments showcasing the information that can be derived from the different sensors, using artificial neural networks and tree-based machine learning methods. We found that residual neural networks (ResNet) perform sufficiently well, with weighted precision scores up to 79 %, using only the RGB bands of aerial imagery. This result indicates that the spatial content present within the 0.2 m resolution data is very informative for tree species classification. With the incorporation of Sentinel-1 and Sentinel-2 imagery, performance improved marginally. However, the sole use of Sentinel-2 still allows for weighted precision scores of up to 74 % using either multi-layer perceptron (MLP) or Light Gradient Boosting Machine (LightGBM) models. Since the dataset is derived from real-world reference data, it contains high class imbalances.
We found that this dataset attribute negatively affects the models' performances for many of the underrepresented classes (i.e., scarce tree species). However, the class-wise precision of the best-performing late fusion model still reached values ranging from 54 % (Acer) to 88 % (Pinus). Based on our results, we conclude that deep learning techniques using aerial imagery could considerably support forestry administration in the provision of large-scale tree species maps at a very high resolution to plan for challenges driven by global environmental change. The original dataset used in this paper is shared via Zenodo (https://doi.org/10.5281/zenodo.6598390, Schulz et al., 2022). For citation of the dataset, we refer to this article.
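The weighted precision scores reported above average per-class precision weighted by each class's support, so that frequent species dominate the score while scarce species contribute little. A minimal sketch of that metric follows; the species labels are invented for illustration and are not taken from the TreeSatAI dataset.

```python
from collections import Counter

def weighted_precision(y_true, y_pred):
    """Precision averaged over classes, each class weighted by its true-class support."""
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for c in set(y_true):
        # true positives and total predictions for class c
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        predicted = sum(1 for p in y_pred if p == c)
        precision_c = tp / predicted if predicted else 0.0
        score += (support[c] / total) * precision_c
    return score

# Toy genus labels (hypothetical, for illustration only)
y_true = ["Pinus", "Pinus", "Acer", "Acer", "Fagus", "Pinus"]
y_pred = ["Pinus", "Pinus", "Acer", "Pinus", "Fagus", "Fagus"]
score = weighted_precision(y_true, y_pred)
```

Because the weights are the class supports, underrepresented species with poor precision barely lower the aggregate score, which is why the class-wise figures (54 % to 88 %) are reported separately.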
This paper presents the b-it-bots RoboCup@Work team and its current hardware and functional architecture for the KUKA youBot robot. We describe the underlying software framework and the developed capabilities required for operating in industrial environments, including features such as reliable and precise navigation, flexible manipulation, and robust object recognition.
Estimation of Prediction Uncertainty for Semantic Scene Labeling Using Bayesian Approximation
(2018)
With the advancement in technology, autonomous and assisted driving are close to being reality. A key component of such systems is the understanding of the surrounding environment. This understanding can be attained by performing semantic labeling of the driving scenes. Existing deep learning based models have been developed over the years that outperform classical image processing algorithms for the task of semantic labeling. However, the existing models only produce semantic predictions and do not provide a measure of uncertainty about the predictions. Hence, this work focuses on developing a deep learning based semantic labeling model that can produce semantic predictions and their corresponding uncertainties. Autonomous driving needs a real-time operating model; however, the Full Resolution Residual Network (FRRN) [4] architecture, which was found to be the best-performing architecture during the literature search, is not able to satisfy this condition. Hence, a small network, similar to FRRN, has been developed and used in this work. Based on the work of [13], the developed network is then extended by adding dropout layers, and the dropouts are used during testing to perform approximate Bayesian inference. The existing works on uncertainties do not have quantitative metrics to evaluate the quality of the uncertainties estimated by a model. Hence, the area under the curve (AUC) of the receiver operating characteristic (ROC) curve is proposed and used as an evaluation metric in this work. Further, a comparative analysis of the influence of dropout layer position, drop probability, and the number of samples on the quality of uncertainty estimation is performed. Finally, based on the insights gained from the analysis, a model with an optimal configuration of dropout is developed. It is then evaluated on the Cityscapes dataset and shown to outperform the baseline model with an AUC-ROC of about 90%, while the latter has an AUC-ROC of about 80%.
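The core idea of using dropout at test time for approximate Bayesian inference can be sketched in a few lines: run several stochastic forward passes with dropout active, average the softmax outputs, and use the entropy of the averaged distribution as the uncertainty. The "network" below is an arbitrary linear stand-in, not the FRRN-like model from the thesis; weights and inputs are illustrative only.

```python
import math
import random

def dropout(vec, p, rng):
    """Inverted dropout: zero each unit with probability p, rescale survivors by 1/(1-p)."""
    return [0.0 if rng.random() < p else v / (1.0 - p) for v in vec]

def softmax(logits):
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def mc_dropout_predict(logit_fn, features, n_samples=200, p=0.5, seed=0):
    """Average softmax over stochastic forward passes; the entropy of the
    averaged distribution serves as the per-prediction uncertainty."""
    rng = random.Random(seed)
    mean = None
    for _ in range(n_samples):
        probs = softmax(logit_fn(dropout(features, p, rng)))
        mean = probs if mean is None else [m + q for m, q in zip(mean, probs)]
    mean = [m / n_samples for m in mean]
    entropy = -sum(q * math.log(q) for q in mean if q > 0)
    return mean, entropy

# Hypothetical untrained single-layer "network" with two output classes
weights = [[1.5, -0.5, 0.2], [-0.8, 1.2, 0.1]]
logit_fn = lambda x: [sum(w * v for w, v in zip(row, x)) for row in weights]
probs, unc = mc_dropout_predict(logit_fn, [1.0, 0.5, -0.3])
```

In the thesis setting this per-pixel entropy (or a related dispersion measure) is what the AUC-ROC metric then evaluates: good uncertainties should be high where the semantic prediction is wrong and low where it is right.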
A robot (e.g. a mobile manipulator) that interacts with its environment to perform its tasks often faces situations in which it is unable to achieve its goals despite perfect functioning of its sensors and actuators. These situations occur when the behavior of the object(s) manipulated by the robot deviates from its expected course because of unforeseeable circumstances. These deviations are experienced by the robot as unknown external faults. In this work we present an approach that increases the reliability of mobile manipulators against unknown external faults. This approach focuses on the actions of manipulators which involve releasing an object. The proposed approach, which is triggered after detection of a fault, is formulated as a three-step scheme that takes a definition of a planning operator and an example simulation as its inputs. The planning operator corresponds to the action that fails because of the fault occurrence, whereas the example simulation shows the desired/expected behavior of the objects for the same action. In its first step, the scheme finds a description of the expected behavior of the objects in terms of logical atoms (i.e. a description vocabulary). The description of the simulation is used by the second step to find limits of the parameters of the manipulated object. These parameters are the variables that define the releasing state of the object.
Using randomly chosen values of the parameters within these limits, this step creates different examples of the releasing state of the object. Each of these examples is labelled as desired or undesired according to the behavior exhibited by the object (in the simulation) when the object is released in the state corresponding to the example. The description vocabulary is also used to label the examples autonomously. In the third step, an algorithm (i.e. N-Bins) uses the labelled examples to suggest a releasing state for the object that avoids the occurrence of unknown external faults.
The proposed N-Bins algorithm can also be used for binary classification problems. Therefore, in our experiments with the proposed approach, we also test its prediction ability along with analyzing the results of our approach. The results show that, under the circumstances peculiar to our approach, the N-Bins algorithm achieves reasonable prediction accuracy where other state-of-the-art classification algorithms fail to do so. Thus, N-Bins also extends the ability of a robot to predict the behavior of an object in order to avoid unknown external faults. In this work we use the OpenRAVE simulation environment, which uses the ODE physics engine to simulate the dynamics of rigid bodies.
A system that interacts with its environment can be much more robust if it is able to reason about the faults that occur in its environment, despite perfect functioning of its internal components. For robots, which interact with the same environment as human beings, this robustness can be obtained by incorporating human-like reasoning abilities in them. In this work we use naive physics to enable robots to reason about external faults. We propose an approach for diagnosing external faults that uses qualitative reasoning on naive physics concepts. These concepts are mainly individual properties of objects that define their state qualitatively. The reasoning process uses physical laws to generate possible states of the concerned object(s) which could result in a detected external fault. Since effective reasoning about any external fault requires information about the relevant properties and physical laws, we associate different properties and laws with the different types of faults which can be detected by a robot. The underlying ontology of this association is proposed on the basis of studies conducted (by other researchers) on the reasoning of physics novices about everyday physical phenomena. We also formalize some definitions of properties of objects into a small framework represented in first-order logic. These definitions represent naive concepts behind the properties and are intended to be independent of objects and circumstances. The definitions in the framework illustrate our proposal of using different biased definitions of properties for different types of faults. In this work, we also present a brief review of important contributions in the area of naive/qualitative physics. This review helps in understanding the limitations of naive/qualitative physics in general. We also apply our approach to simple scenarios to assess its limitations in particular.
Since this work was done independently of any particular real robotic system, it can be seen as a theoretical proof of concept for the usefulness of naive physics for external fault reasoning in robotics.
Despite perfect functioning of its internal components, a robot can be unsuccessful in performing its tasks because of unforeseen situations. These situations occur when the behavior of the objects in the robot's environment deviates from its expected values. For robots, such deviations are exhibited in the form of unknown external faults which prohibit them from performing their tasks successfully. In this work we propose to use naive physics knowledge to reason about such faults in the robotics domain. We propose an approach that uses naive physics concepts to find information about the situations which result in a detected unknown fault. The naive physics knowledge is represented by the physical properties of objects, which are formalized in a logical framework. The proposed approach applies a qualitative version of physical laws to these properties to reason about the detected fault. By interpreting the reasoning results, the robot finds information about the situations which can cause the fault. We apply the proposed approach to scenarios in which a robot performs manipulation tasks of picking and placing objects. Results of this application show that naive physics holds great promise for reasoning about unknown external faults in robotics.
Due to the use of fossil fuel resources, environmental problems have been growing. Thus, recent research focuses on the use of environmentally friendly materials from sustainable feedstocks for future fuels, chemicals, fibers, and polymers. Lignocellulosic biomass has become the raw material of choice for these new materials. Recently, research has focused on using lignin as a substitute material in many industrial applications. The antiradical and antimicrobial activities of lignin and lignin-based films are both of great interest for applications such as food packaging additives. The DPPH assay was used to determine the antioxidant activity of Kraft lignin compared to Organosolv lignins from different biomasses. The purification procedure of Kraft lignin showed that double-fold selective extraction is the most efficient, as confirmed by UV-Vis, FTIR, HSQC, 31P NMR, SEC, and XRD. The antioxidant capacity was discussed with regard to the biomass source, pulping process, and degree of purification. Lignins obtained from industrial black liquor are compared with beech wood samples: the biomass source influences the DPPH inhibition (softwood > grass) and the TPC (softwood < grass). DPPH inhibition is affected by the polarity of the extraction solvent, following the trend ethanol > diethyl ether > acetone. Reduced polydispersity has a positive influence on the DPPH inhibition. Storage decreased the DPPH inhibition but increased the TPC values. The DPPH assay was also used to discuss the antiradical activity of HPMC/lignin and HPMC/lignin/chitosan films. In both binary (HPMC/lignin) and ternary (HPMC/lignin/chitosan) systems, the 5% addition showed the highest activity and the highest addition the lowest. Both scavenging activity and antimicrobial activity depend on the biomass source: Organosolv of softwood > Kraft of softwood > Organosolv of grass.
Lignins and lignin-containing films showed high antimicrobial activities against Gram-positive and Gram-negative bacteria at 35 °C and at low temperatures (0-7 °C). Purification of Kraft lignin has a negative effect on the antimicrobial activity, while storage has a positive effect. Lignin leaching in the produced films affected the activity positively, and the chitosan addition enhances the activity against both Gram-positive and Gram-negative bacteria. Testing the films against food-spoilage bacteria that grow at low temperatures revealed activity of the 30% addition in the HPMC/L1 film against both B. thermosphacta and P. fluorescens, while L5 was active only against B. thermosphacta. In the HPMC/lignin/chitosan films, the 5% addition exhibited activity against both food-spoilage bacteria.
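The abstract reports DPPH inhibition values without stating how they are computed; the standard formulation of radical-scavenging activity in a DPPH assay, given here for orientation, expresses inhibition as the relative drop in absorbance (typically read at 517 nm) of the DPPH solution after reaction with the sample:

```latex
\text{DPPH inhibition}(\%) \;=\; \frac{A_{\text{control}} - A_{\text{sample}}}{A_{\text{control}}} \times 100
```

where $A_{\text{control}}$ is the absorbance of the DPPH solution without lignin and $A_{\text{sample}}$ the absorbance after incubation with the lignin or film extract.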
Mergers and acquisitions take place all over the world and in many industries, typically motivated by corporate politics. While IT management is often not involved in the decision-making, it has to solve a wide range of problems in the post-merger phase. Indeed, merging two or more companies implies not only merging their core businesses, but also creating a new and efficiently integrated IT organisation from the individual ones, since persistence of the current IT organisations usually does not make sense. In addition, corporate management frequently imposes constraints, e.g., cost reductions, on the IT infrastructure. The principal critical success factor when merging IT organisations is the uninterrupted operation of the IT business, because a service gap is acceptable neither for in-house functional departments nor for external customers. Therefore, the IT rebuilding phase has to focus on IT services that facilitate the processes of functional departments, support processes, and processes of customers and suppliers, so that any transformation work is transparent to internal and external customers. In this article we describe a real-world but anonymized case study. Our goals are to highlight the points important for merging IT organisations and to help decision-makers, particularly in the areas of IT organisation and IT personnel. We focus on the arising organisational and non-technical issues from a management perspective, i.e., the CIO's view, and provide checklists intended to help IT managers address the most pressing issues. To assist CIOs in surviving the post-merger phase, we provide checklists for merging IT organisations, checklists for merging IT human resources, checklists for IT budgets and reporting, and an assessment of activities in a merger scenario. IT hardware, software, and IT infrastructure, as well as running IT projects, are not considered in this paper.
Sind kleinere und mittlere Unternehmen (KMU) bereits auf die Digitale Transformation vorbereitet?
(2018)
A study conducted by the authors found clear indications that many small and medium-sized enterprises (SMEs) do not yet have sufficient maturity for the digital transformation. To address this problem, the authors propose developing an agile IT management concept in order to steer the IT function dynamically and without the formal ballast of classical IT management.
Multi-Merger-Szenarien als Herausforderung für das IT-Controlling - Checklisten zur IT-Integration
(2006)
Digitalisierung für kleinere und mittlere Unternehmen (KMU): Anforderungen an das IT-Management
(2018)
While the corporate working world is shifting ever more towards agility, IT controlling still persists in old, classical structures. This work examines whether and to what extent agile approaches can be used in IT controlling. This contribution is a modified version of the article „Agiles IT-Controlling" published in the journal „HMD Praxis der Wirtschaftsinformatik" (https://link.springer.com/article/10.1365/s40702-022-00837-0).
Agiles IT-Controlling
(2022)
While agile methods have found acceptance in IT project management practice for many years, IT controlling still predominantly uses classical methods. This contribution examines whether and how the methods used in IT controlling can also follow agile paradigms and whether methods of agile IT project management can be adapted.
IT performance measurement is often associated by chief executive officers with IT cost cutting, although IT performance measurement actually protects business processes from increasing IT costs; IT cost cutting alone only endangers the company's efficiency. This opinion stigmatises those who do IT performance measurement in companies as bean-counters. The present paper describes an integrated reference model for IT performance measurement based on a life cycle model and a performance-oriented framework. The presented model was created from a practical point of view. It is designed lean compared with other known concepts and is very appropriate for small and medium enterprises (SMEs).
A plethora of architectural patterns and elements for developing service-oriented applications can be gathered from the state of the art. Most of these approaches are only applicable to single-tenant applications. However, less methodical support is provided for scenarios in which multiple different tenants with varying requirements access the same application stack concurrently. In order to fill this gap, both novel and existing architectural patterns, architectural elements, and fundamental design decisions must be considered and integrated into a framework that leverages the development of multi-tenant applications. This paper addresses this demand and presents the SOAdapt framework. It promotes the development of adaptable multi-tenant applications based on a service-oriented architecture that is capable of incorporating the specific requirements of new tenants in a flexible manner.
The digital transformation is massively changing international cooperation between universities. Beyond the possibilities of virtual mobility, new thematic fields are emerging that change, complement, or newly enable international learning and teaching experiences with digital support. In the area of internationalisation funding (DAAD, Erasmus+, BMBF, among others), projects and funding formats have emerged that combine digitalisation and internationalisation and address these new topics, e.g., didactic formats, administrative processes (also in the context of the OZG and the GDPR), virtual and hybrid mobility, international project and team formats, and finally also content that combines international, intercultural, and interdisciplinary competencies with digital competencies. The proposed workshop aims to bring together relevant projects and structure the topics in order to create an overview of the developments and thus contribute to defining the thematic field of "digitalisation & internationalisation".
Trueness and precision of milled and 3D printed root-analogue implants: A comparative in vitro study
(2023)
The need for innovation around the control functions of inverters is great. PV inverters were initially expected to be passive followers of the grid and to disconnect as soon as abnormal conditions occurred. Since future power systems will be dominated by generation and storage resources interfaced through inverters, these converters must move from following the grid to forming and sustaining it. As “digital natives”, PV inverters can also play an important role in the digitalisation of distribution networks. In this short review we identify a large potential to make the PV inverter the smart local hub in a distributed energy system. At the micro level, costs and coordination can be improved with bidirectional inverters between the AC grid and PV production, stationary storage, car chargers, and DC loads. At the macro level, the distributed nature of PV generation means that the same devices will support both the local distribution network and the global stability of the grid. Much success has been achieved in the former; the latter remains a challenge, in particular in terms of scaling. Yet there is some urgency in researching and demonstrating such solutions. And while digitalisation offers promise in all control aspects, it also raises significant cybersecurity concerns.
A principal step towards solving diverse perception problems is segmentation. Many algorithms benefit from initially partitioning input point clouds into objects and their parts. In accordance with the cognitive sciences, the segmentation goal may be formulated as splitting point clouds into locally smooth convex areas enclosed by sharp concave boundaries. This goal is based on purely geometrical considerations and does not incorporate any constraints, or semantics, of the scene and objects being segmented, which makes it very general and widely applicable. In this work we perform geometrical segmentation of point cloud data according to the stated goal. The data is mapped onto a graph and the task of graph partitioning is considered. We formulate an objective function and derive a discrete optimization problem based on it. Finding the globally optimal solution is an NP-complete problem; in order to circumvent this, spectral methods are applied. Two algorithms that implement the divisive hierarchical clustering scheme are proposed. They derive the graph partition by analyzing the eigenvectors obtained through spectral relaxation. The specifics of our application domain are used to automatically introduce cannot-link constraints into the clustering problem. The algorithms function in a completely unsupervised manner and make no assumptions about the shapes of the objects and structures that they segment. Three publicly available datasets with cluttered real-world scenes and an abundance of box-like, cylindrical, and free-form objects are used to demonstrate convincing performance. Preliminary results of this thesis have been contributed to the International Conference on Autonomous Intelligent Systems (IAS-13).
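The spectral relaxation step mentioned above can be illustrated with the classic Fiedler-vector bipartition: build the graph Laplacian, take the eigenvector of the second-smallest eigenvalue, and split vertices by its sign. This is a generic sketch of that one step (assuming numpy), on a toy graph rather than point-cloud data, and it omits the thesis's objective function and cannot-link constraints.

```python
import numpy as np

def fiedler_bipartition(adjacency):
    """Split a graph in two by the sign of the Fiedler vector (the eigenvector
    of the graph Laplacian with the second-smallest eigenvalue), a standard
    spectral relaxation of the discrete graph-cut problem."""
    A = np.asarray(adjacency, dtype=float)
    D = np.diag(A.sum(axis=1))
    L = D - A                       # unnormalized graph Laplacian
    eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    fiedler = eigvecs[:, 1]         # second-smallest eigenvalue's eigenvector
    return fiedler >= 0             # boolean cluster assignment per vertex

# Toy graph: two triangles joined by a single weak bridge edge (2-3)
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
labels = fiedler_bipartition(A)
```

Cutting at the sign change removes the bridge edge and recovers the two triangles, mirroring how a divisive hierarchical scheme would recursively split a point-cloud graph along weak (concave) boundaries.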
A company's financial documents use tables along with text to organize data containing key performance indicators (KPIs), such as profit and loss, and the financial quantities linked to them. The quantity linked to a KPI in a table might not equal the quantity of a similarly described KPI in the text. Auditors take substantial time to manually check for such financial mistakes, a process called consistency checking. In contrast to existing work, this paper attempts to automate this task with the help of transformer-based models. Furthermore, for consistency checking it is essential that the table KPIs' embeddings encode both the semantic knowledge of the KPIs and the structural knowledge of the table. Therefore, this paper proposes a pipeline that uses a tabular model to obtain the table KPIs' embeddings. The pipeline takes table and text KPIs as input, generates their embeddings, and then checks whether these KPIs are identical. The pipeline is evaluated on financial documents in the German language, and a comparative analysis of the quality of the cell embeddings from the three tabular models is also presented. From the evaluation results, the experiment that used the English-translated text and table KPIs and the Tabbie model to generate the table KPIs' embeddings achieved an accuracy of 72.81% on the consistency checking task, outperforming the benchmark and the other tabular models.
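The matching step of such a pipeline reduces to comparing two embeddings and thresholding their similarity. The sketch below illustrates only that interface: the `embed` function is a deliberately crude character-trigram stand-in (the paper uses transformer-based tabular models such as Tabbie), and the threshold value is purely illustrative.

```python
import math

def embed(kpi_text):
    """Stand-in embedding: character-trigram counts as a sparse vector.
    Hypothetical placeholder for a transformer/tabular-model encoder."""
    s = f"  {kpi_text.lower()}  "
    vec = {}
    for i in range(len(s) - 2):
        tri = s[i:i + 3]
        vec[tri] = vec.get(tri, 0) + 1
    return vec

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v.get(k, 0) for k in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def consistent(table_kpi, text_kpi, threshold=0.8):
    """Flag a table/text KPI pair as describing the same indicator if their
    embeddings are sufficiently similar; the threshold is illustrative."""
    return cosine(embed(table_kpi), embed(text_kpi)) >= threshold

match = consistent("profit and loss", "profit and loss")
```

With a learned encoder in place of `embed`, the same comparison would also match paraphrased KPI descriptions, which is what makes the embedding quality analysis in the paper relevant.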
In 1991, researchers at the Center for the Learning Sciences of Carnegie Mellon University were confronted with the confusing question of “where is AI?” from users who were interacting with AI but did not realize it. Three decades of research later, we are still facing the same issue with users of AI technology. In the absence of users' awareness and a mutual understanding of AI-enabled systems between designers and users, informal theories of users about how a system works (“folk theories”) become inevitable but can lead to misconceptions and ineffective interactions. To shape appropriate mental models of AI-based systems, explainable AI has been suggested by AI practitioners. However, a profound understanding of users' current perception of AI is still missing. In this study, we introduce the term “Perceived AI” (PAI), defined as “AI defined from the perspective of its users”. We then present preliminary results from in-depth interviews with 50 users of AI technology, which provide a framework for our future research approach towards a better understanding of PAI and users' folk theories.
For most people, using their body to authenticate their identity is an integral part of daily life. From our fingerprints to our facial features, our physical characteristics store the information that identifies us as "us." This biometric information is becoming increasingly vital to the way we access and use technology. As more and more platform operators struggle with traffic from malicious bots on their servers, the burden of proof is on users, only this time they have to prove their very humanity and there is no court or jury to judge, but an invisible algorithmic system. In this paper, we critique the invisibilization of artificial intelligence policing. We argue that this practice obfuscates the underlying process of biometric verification. As a result, the new "invisible" tests leave no room for the user to question whether the process of questioning is even fair or ethical. We challenge this thesis by offering a juxtaposition with the science fiction imagining of the Turing test in Blade Runner to reevaluate the ethical grounds for reverse Turing tests, and we urge the research community to pursue alternative routes of bot identification that are more transparent and responsive.
Technological objects present themselves as necessary, only to become obsolete faster than ever before. This phenomenon has led to a population that experiences a plethora of technological objects and interfaces as they age, which become associated with certain stages of life and disappear thereafter. Noting the expanding body of literature within HCI about appropriation, our work pinpoints an area that needs more attention: "outdated technologies." In other words, we assert that design practices can profit as much from imaginaries of the future as they can from critically reassessing artefacts from the past. In two weeks of fieldwork with 37 HCI students, we gathered an international collection of nostalgic devices from 14 different countries to investigate what memories people still have of older technologies and the ways in which these memories reveal normative and accidental use of technological objects. We found that participants primarily remembered older technologies with positive connotations and shared memories of how they had adapted and appropriated these technologies, rather than normative uses. We refer to this phenomenon as nostalgic reminiscence. In the future, we would like to develop this concept further by discussing how nostalgic reminiscence can be operationalized to stimulate speculative design in the present.
When dialogues with voice assistants (VAs) fall apart, users often become confused or even frustrated. To address these issues and related privacy concerns, Amazon recently introduced a feature allowing Alexa users to inquire about why it behaved in a certain way. But how do users perceive this new feature? In this paper, we present preliminary results from research conducted as part of a three-year project involving 33 German households. This project utilized interviews, fieldwork, and co-design workshops to identify common unexpected behaviors of VAs, as well as users’ needs and expectations for explanations. Our findings show that, contrary to its intended purpose, the new feature actually exacerbates user confusion and frustration instead of clarifying Alexa's behavior. We argue that such voice interactions should be characterized as explanatory dialogs that account for VA’s unexpected behavior by providing interpretable information and prompting users to take action to improve their current and future interactions.
AI (artificial intelligence) systems are increasingly being used in all aspects of our lives, from mundane routines to sensitive decision-making and even creative tasks. Therefore, an appropriate level of trust is required so that users know when to rely on the system and when to override it. While research has looked extensively at fostering trust in human-AI interactions, the lack of standardized procedures for human-AI trust makes it difficult to interpret results and compare across studies. As a result, the fundamental understanding of trust between humans and AI remains fragmented. This workshop invites researchers to revisit existing approaches and work toward a standardized framework for studying AI trust to answer the open questions: (1) What does trust mean between humans and AI in different contexts? (2) How can we create and convey the calibrated level of trust in interactions with AI? And (3) How can we develop a standardized framework to address new challenges?
In the fermentation process, sugars are transformed into lactic acid. pH meters have traditionally been used to monitor the fermentation process based on acidity. More recently, near-infrared (NIR) spectroscopy has proven to provide an accurate and non-invasive method to detect when the transformation of sugars into lactic acid is finished. This research proposes simplified NIR spectroscopy using multispectral optical sensors as a simpler and less expensive way to determine the end point of the fermentation process. The NIR spectra of milk and yogurt are compared to find and extract features that can be used to design a simple sensor to monitor the yogurt fermentation process. Multispectral images in four selected wavebands within the NIR spectrum were captured and show different spectral remission characteristics for milk, yogurt, and water, which supports the selection of these wavebands for milk and yogurt classification.
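As a rough illustration of how remission values in four wavebands could separate the three substances, the sketch below classifies a sample by its nearest class centroid; the reference values and the nearest-centroid rule are invented for this example and are not taken from the study.

```python
import math

# Illustrative mean remission per class in four NIR wavebands
# (arbitrary units; values are made up for the sketch).
CENTROIDS = {
    "milk":   (0.82, 0.78, 0.70, 0.55),
    "yogurt": (0.88, 0.85, 0.80, 0.66),
    "water":  (0.10, 0.09, 0.08, 0.05),
}

def classify(sample):
    """Return the class whose centroid is closest in Euclidean distance."""
    return min(CENTROIDS, key=lambda c: math.dist(CENTROIDS[c], sample))

print(classify((0.87, 0.84, 0.79, 0.64)))  # closest to the yogurt centroid
```

In practice, the measured remission features would be calibrated against reference spectra before such a simple distance rule could be applied.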
Information reliability and automatic computation are two important aspects that are continuously pushing the Web to become more semantic. Information uploaded to the Web should be reusable and automatically extractable by other applications, platforms, etc. Several tools exist to explicitly mark up Web content. Web services can also play a positive role in the automatic processing of Web content, especially when they act as flexible and agile agents. However, Web services themselves should be developed with semantics in mind: they should include and provide structured information to facilitate their use, reuse, composition, querying, etc. In this chapter, the authors focus on evaluating state-of-the-art semantic aspects and approaches in Web services. Ultimately, this contributes to the goal of Web knowledge management, execution, and transfer.
Software testing in a web services environment faces different challenges than testing in traditional software environments. Regression testing activities are triggered by software changes or evolutions. In web services, evolution is not a choice for service clients: they always have to use the current, updated version of the software. In addition, test execution or invocation is expensive in web services, so providing algorithms that optimize test case generation and execution is vital. In this environment, we proposed several approaches for test case selection in web service regression testing. Testing in this new environment should evolve to become part of the service contract. Service providers should provide data or usage sessions that can help service clients reduce testing expenses by optimizing the selected and executed test cases.
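As one illustrative heuristic from the coverage-based family of regression test selection techniques (not necessarily the specific approaches proposed in the chapter), a greedy selection over the service operations each test exercises might look like this; the test and operation names are hypothetical:

```python
def select_tests(coverage):
    """Greedy set-cover selection.

    coverage: dict mapping test name -> set of covered service operations.
    Returns a small subset of tests that together cover every operation.
    """
    remaining = set().union(*coverage.values())
    selected = []
    while remaining:
        # Pick the test covering the most not-yet-covered operations.
        best = max(coverage, key=lambda t: len(coverage[t] & remaining))
        if not coverage[best] & remaining:
            break  # no test adds coverage; stop
        selected.append(best)
        remaining -= coverage[best]
    return selected

tests = {
    "t1": {"login", "getQuote"},
    "t2": {"getQuote", "order"},
    "t3": {"order"},
}
print(select_tests(tests))  # two tests suffice to cover all three operations
```

Greedy set cover is a standard approximation; real selection criteria would also weigh invocation cost, since each test run against a remote service is expensive.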
ENaC channels
(2023)
The epithelial sodium channel (ENaC) is a heterotrimeric ion channel that plays a key role in sodium and water homeostasis in tetrapod vertebrates. In the aldosterone-sensitive distal nephron, hormonally controlled ENaC expression matches dietary sodium intake to its excretion. Furthermore, ENaC mediates sodium absorption across the epithelia of the colon, sweat ducts, reproductive tract, and lung. ENaC is a constitutively active ion channel and its expression, membrane abundance, and open probability (PO) are controlled by multiple intracellular and extracellular mediators and mechanisms [9]. Aberrant ENaC regulation is associated with severe human diseases, including hypertension, cystic fibrosis, pulmonary edema, pseudohypoaldosteronism type 1, and nephrotic syndrome [9].
Introduction: After cellulose, lignin is the most abundant biopolymer on earth, accounting for 18-35 % by weight of lignocellulosic biomass. Today, it is a by-product of the paper and pulping industry. Although lignin is available in huge amounts, mainly in the form of so-called black liquor produced via kraft pulping, processes for the valorization of lignin are still limited [1]. Due to its hyperbranched, polyphenol-like structure, lignin has gained increasing interest as a biobased building block for polymer synthesis [2]. The present work focuses on the extraction and purification of lignin from industrial black liquor and the synthesis of lignin-based polyurethanes.
Lignocellulose feedstock (LCF) provides a sustainable source of components to produce bioenergy, biofuel, and novel biomaterials. Besides hardwood and softwood, so-called low-input plants such as Miscanthus are interesting crops to be investigated as potential feedstock for second-generation biorefineries. The status quo regarding the availability and composition of different plants, including grasses and fast-growing trees (i.e., Miscanthus, Paulownia), is reviewed here. The second focus of this review is the potential of multivariate data processing for biomass analysis and quality control. Experimental data obtained by spectroscopic methods, such as nuclear magnetic resonance (NMR) and Fourier-transform infrared spectroscopy (FTIR), can be processed using computational techniques to characterize the 3D structure and energetic properties of the feedstock building blocks, including complex linkages. Here, we provide a brief summary of recently reported experimental data for structural analysis of LCF biomasses and give our perspectives on the role of chemometrics in understanding and elucidating LCF composition and lignin 3D structure.
Renewable resources are gaining increasing interest as a source of environmentally benign biomaterials, such as drug encapsulation/release compounds and scaffolds for tissue engineering in regenerative medicine. As lignin is the second most abundant naturally occurring polymer, interest in its valorization for biomedical applications is growing rapidly. Depending on the resource and isolation procedure, lignin shows specific antioxidant and antimicrobial activity. Today, efforts in research and industry are directed toward lignin utilization as a renewable macromolecular building block for the preparation of polymeric drug encapsulation and scaffold materials. Within the last five years, remarkable progress has been made in the isolation, functionalization, and modification of lignin and lignin-derived compounds. However, the literature so far mainly focuses on lignin-derived fuels, lubricants, and resins. The purpose of this review is to summarize the current state of the art and to highlight the most important results in the field of lignin-based materials for potential use in biomedicine (reported in 2014–2018). Special focus is placed on lignin-derived nanomaterials for drug encapsulation and release as well as lignin hybrid materials used as scaffolds for guided bone regeneration in stem cell-based therapies.
Antioxidant activity is an essential feature required for oxygen-sensitive merchandise and goods, such as food and corresponding packaging as well as materials used in cosmetics and biomedicine. For example, vanillin, one of the most prominent antioxidants, is fabricated from lignin, the second most abundant natural polymer in the world. Antioxidant potential is primarily related to the termination of oxidation propagation reactions through hydrogen transfer. The application of technical lignin as a natural antioxidant has not yet been implemented in the industrial sector, mainly due to the complex heterogeneous structure and polydispersity of lignin. Thus, current research focuses on various isolation and purification strategies to improve the compatibility of lignin material with substrates and to enhance its stabilizing effect.
Antioxidant activity is an essential aspect of oxygen-sensitive merchandise and goods, such as food and corresponding packaging, cosmetics, and biomedicine. Technical lignin has not yet been applied as a natural antioxidant, mainly due to its complex heterogeneous structure and polydispersity. This report presents antioxidant capacity studies performed using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) assay. The influence of purification on lignin structure and activity was investigated. The purification procedure showed that twofold selective extraction is the most efficient (confirmed by ultraviolet-visible (UV/Vis), Fourier transform infrared (FTIR), heteronuclear single quantum coherence (HSQC) and 31P nuclear magnetic resonance spectroscopy, size exclusion chromatography, and X-ray diffraction), resulting in fractions of very narrow polydispersity (3.2–1.6) and up to four distinct absorption bands in UV/Vis spectroscopy. According to differential scanning calorimetry measurements, the glass transition temperature increased from 123 to 185 °C for the purest fraction. Antioxidant capacity is discussed with regard to the biomass source, pulping process, and degree of purification. Lignin obtained from industrial black liquor is compared with beech wood samples: the antioxidant activity (DPPH inhibition) of the kraft lignin fractions was 62–68%, whereas beech and spruce/pine-mixed lignin showed values of 42% and 64%, respectively. The total phenol content (TPC) of the isolated kraft lignin fractions varied between 26 and 35%, whereas beech and spruce/pine lignin were 33% and 34%, respectively. Storage decreased the TPC values but increased the DPPH inhibition.
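The DPPH inhibition percentages reported here follow from the standard scavenging-activity formula, the relative drop in absorbance of the radical solution versus the control; the absorbance values below are invented for illustration:

```python
def dpph_inhibition(a_control, a_sample):
    """Percent DPPH radical scavenging:
    (A_control - A_sample) / A_control * 100."""
    return (a_control - a_sample) / a_control * 100.0

# e.g. control absorbance 0.90, sample absorbance 0.30
# -> about 66.7 % inhibition, within the 62-68 % range
#    reported for the kraft lignin fractions
print(round(dpph_inhibition(0.90, 0.30), 1))
```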
The antiradical and antimicrobial activity of lignin and lignin-based films are both of great interest for applications such as food packaging additives. The polyphenolic structure of lignin, along with the presence of O-containing functional groups, is potentially responsible for these activities. This study used DPPH assays to assess the antiradical activity of HPMC/lignin and HPMC/lignin/chitosan films. The scavenging activity (SA) of both binary (HPMC/lignin) and ternary (HPMC/lignin/chitosan) systems was affected by the percentage of added lignin: the 5% addition showed the highest activity and the 30% addition the lowest. Both scavenging activity and antimicrobial activity depend on the biomass source, showing the following trend: organosolv of softwood > kraft of softwood > organosolv of grass. Testing the antimicrobial activities of lignins and lignin-containing films showed high antimicrobial activities against Gram-positive and Gram-negative bacteria at 35 °C and at low temperatures (0-7 °C). Purification of kraft lignin has a negative effect on the antimicrobial activity, while storage has a positive effect. The release of lignin from the produced films affected the activity positively, and the addition of chitosan enhanced the activity even further against both Gram-positive and Gram-negative bacteria. Testing the films against spoilage bacteria that grow at low temperatures revealed activity of the 30% addition in the HPMC/L1 film against both B. thermosphacta and P. fluorescens, while L5 was active only against B. thermosphacta. In the HPMC/lignin/chitosan films, the 5% addition exhibited activity against both B. thermosphacta and P. fluorescens.
Once aberrantly activated, the Wnt/β-catenin pathway may result in uncontrolled proliferation and eventually cancer. Efforts to counter and inhibit this pathway are mainly directed against β-catenin, as it serves a role in both the cytoplasm and the nucleus. In addition, specially generated lymphocytes are recruited for the purpose of treating liver cancer. Peripheral blood mononuclear lymphocytes are expanded by the timely addition of interferon-γ, interleukin (IL)-1β, IL-2 and an anti-cluster of differentiation 3 antibody. The resulting cells are called cytokine-induced killer (CIK) cells. The present study utilised these cells and combined them with drugs inhibiting the Wnt pathway in order to examine whether this improved the killing ability of CIK cells against liver cancer cells. Ethacrynic acid (EA) and ciclopirox olamine (CPX) were determined to be suitable drug candidates, as established in previous studies. The drugs were administered on their own and combined with CIK cells, and a cell viability assay was then performed. The results suggest that EA-treated cells underwent apoptosis and were significantly affected compared with untreated cells. Unlike EA, CPX killed normal as well as cancerous cells even at low concentrations. When EA was combined with CIK cells, the potency of killing was increased and a greater number of cells died, demonstrating a synergistic action. In summary, EA may be used as an anti-hepatocellular carcinoma drug, while CPX possesses a high toxicity to cancerous as well as normal cells. It is proposed that EA be integrated into present therapeutic methods for cancer.
In today's business world, culture plays a vital role and strongly influences an individual's attitudes, perceptions, and decision-making. Culture is an unavoidable set of rules and norms that shapes people's daily life in a particular environment or society. There are plenty of examples of business failures, stagnation, or failed joint ventures caused by management's inability to recognize cross-cultural challenges and tackle them appropriately.
BWL kompakt für Dummies
(2018)
"BWL kompakt für Dummies" offers an accessible introduction to business administration, whether you need it for training or continuing education or simply want to get up to speed. Tobias Amely presents the essential elements and basic concepts of business administration and shows their relevance to business practice: materials management, provision of services and production, marketing, investment and financing, corporate organization and management, accounting, and controlling.
BWL-Formeln für Dummies
(2012)
BWL-Formeln für Dummies
(2018)
Do you also have trouble remembering formulas? With this handy reference, that is no longer necessary! Tobias Amely, author of the bestseller "BWL für Dummies", has compiled the most important business administration formulas for you. For each formula you will also find an illustrative example of how it is applied and an explanation of what it is actually needed for. "BWL-Formeln für Dummies" is therefore much more than a mere list of formulas.
BWL kompakt für Dummies
(2016)
BWL-Klausuren für Dummies
(2019)
Investment and financing are important topics in business practice and in the study of business administration. Using illustrative examples, this book introduces the fundamentals of the subject and outlines the goals of financial management. Tobias Amely and Christine Immenkötter present the basics of corporate finance and the most important instruments of external and internal financing as well as of financial management. Learn about static and dynamic investment appraisal and what you need to know about investing in securities. In the proven ... für Dummies style, this book provides a clear and easily understandable overview of all the key topics in investment and financing. (Publisher's description)
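As a small illustration of the dynamic investment appraisal covered in the book, the net present value can be computed as follows; the cash flows and discount rate are made up for the example:

```python
def npv(rate, cashflows):
    """Net present value: sum of cash flows discounted to t=0.
    cashflows[0] is the initial outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# -1000 today, +400 at the end of each of the next three years, 5 % rate
print(round(npv(0.05, [-1000, 400, 400, 400]), 2))  # positive -> worthwhile
```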
BWL für Dummies
(2021)
BWL für Dummies
(2009)
BWL für Dummies
(2016)
"BWL für Dummies" provides a competent, concise, and comprehensive introduction to business administration. The essential elements of business administration are presented with a practical orientation and shown in context. The following topics are covered: materials management, provision of services and production, marketing, investment and financing, corporate organization and management, accounting, and controlling.
BWL für Dummies
(2013)
Competitions for Benchmarking: Task and Functionality Scoring Complete Performance Assessment
(2015)
This paper presents the preliminary results of the Socialist Republic of Vietnam country case study conducted as part of the research project Sustainable Labour Migration implemented by the Bonn-Rhein-Sieg University of Applied Sciences. The project focuses on stakeholder perspectives on country-of-origin benefits and the sustainability of different transnational skill partnership schemes. Existing and ongoing small-scale initiatives indicate that opportunities exist for all three types of labour mobility pathways: recruiting youth for apprenticeships and subsequent skilled work, recruiting skilled professionals and recognizing their certificates for direct work contracts, and initial vocational education and training programs in a dual-track approach. While the latter has the highest potential to be more beneficial than other approaches, pursuing and supporting the scaling up of all three pathways in parallel will have additional, mutually reinforcing effects. The potential for benefits over and above those already realised by existing skill partnerships appears high, especially considering the favourable framework conditions specific to the long-standing German-Vietnamese relationship. If the potential of well-managed skill partnerships were realised, such sustainable models of skilled labour migration could serve as a unique selling point in the international competition for skilled labour.
SLC6A14 (ATB0,+) is unique among SLC proteins in its ability to transport 18 of the 20 proteinogenic (dipolar and cationic) amino acids and naturally occurring and synthetic analogues (including anti-viral prodrugs and nitric oxide synthase (NOS) inhibitors). SLC6A14 mediates amino acid uptake in multiple cell types where increased expression is associated with pathophysiological conditions including some cancers. Here, we investigated how a key position within the core LeuT-fold structure of SLC6A14 influences substrate specificity. Homology modelling and sequence analysis identified the transmembrane domain 3 residue V128 as equivalent to a position known to influence substrate specificity in distantly related SLC36 and SLC38 amino acid transporters. SLC6A14, with and without V128 mutations, was heterologously expressed and function determined by radiotracer solute uptake and electrophysiological measurement of transporter-associated current. Substituting the amino acid residue occupying the SLC6A14 128 position modified the binding pocket environment and selectively disrupted transport of cationic (but not dipolar) amino acids and related NOS inhibitors. By understanding the molecular basis of amino acid transporter substrate specificity we can improve knowledge of how this multi-functional transporter can be targeted and how the LeuT-fold facilitates such diversity in function among the SLC6 family and other SLC amino acid transporters.
LiDAR-based Indoor Localization with Optimal Particle Filters using Surface Normal Constraints
(2023)
Sustainability performance is usually presented in separately published sustainability reports, to which the DAX-30 companies refer in their annual reports. However, analysts and investors, a stakeholder group of particular importance for capital-market-oriented companies, increasingly demand an integrated presentation of all dimensions of the triple bottom line in the management report as well. The statutory disclosure requirements under § 289 (3) and § 315 (1) sentence 4 HGB as well as DRS 15.32 put additional pressure on the DAX-30 companies. At its core, this thesis addresses the current tension between the economic, ecological, and social commitments of companies. On the basis of a comprehensive theoretical analysis, concrete key figures for capturing sustainability indicators are developed, and their values for the DAX-30 companies are determined.
The debate about the human capacity for knowledge, that is, the question of how humans attain knowledge and insight, is not new; it has been raised ever since philosophical questions have been asked, although no definitive answer has been found over the centuries.
The starting point of our considerations is the observation that the legitimation of modern forms of knowledge goes hand in hand with the loss of legitimating metanarratives. This observation applies not only to the classical humanities and social sciences in general, but also specifically to applied management and organization research. Traditionally, these subordinate genres of discourse are legitimated by the overarching discourse of the Enlightenment and submit to the dictates of modernist rationality (Ant 2004).
"One word leads to another": interpersonal communication follows certain rules. Those who see through these mechanisms can not only achieve their own conversational goals more effectively, but also understand other people more easily and interact with them more successfully.
Not for nothing is communication competence one of the most sought-after soft skills in professional and everyday life. This introduction to the theory and practice of communication explains the principles of efficient communication and remains a classic for illustrating group dynamics and role behavior. The textbook explains the phenomenon of communication on the basis of various social-psychological studies, theories, examples, and perspectives, encourages broader reflection on it, and provides concrete advice and exercises that effectively improve the reader's own communication practice.
(Publisher's description)
The transport of carbon dioxide through pipelines is one of the important components of carbon dioxide capture and storage (CCS) systems currently being developed. If high flow rates are desired, transport in the liquid or supercritical phase is preferable. For technical reasons, the fluid must remain in that phase without transitioning to the gaseous state. In this paper, a numerical simulation of the stationary process of carbon dioxide transport with impurities and phase transitions is considered. We use the Homogeneous Equilibrium Model (HEM) and the GERG-2008 thermodynamic equation of state to describe the transport parameters. The algorithms used make it possible to simulate scenarios of carbon dioxide transport in the liquid or supercritical phase and to detect when the phase transition region is being approached. Convergence of the solution algorithms is analyzed with respect to the fast and abrupt changes of the equation of state and the enthalpy function in the region of phase transitions.
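As a toy illustration of detecting the approach to the phase boundary, the sketch below replaces the GERG-2008 equation of state used in the paper with a crude Clausius-Clapeyron interpolation between the well-known CO2 triple and critical points; the safety margin is an arbitrary example value:

```python
import math

# CO2 critical point (304.13 K, 73.8 bar) and triple point (216.55 K, 5.18 bar);
# a real simulation would evaluate GERG-2008 instead of this crude fit.
T_CRIT, P_CRIT = 304.13, 73.8
T_TRIP, P_TRIP = 216.55, 5.18
B = math.log(P_CRIT / P_TRIP) / (1 / T_TRIP - 1 / T_CRIT)

def p_sat(T):
    """Approximate saturation pressure in bar for T_TRIP <= T <= T_CRIT."""
    return P_CRIT * math.exp(-B * (1 / T - 1 / T_CRIT))

def liquid_phase_ok(T, p, margin=5.0):
    """Liquid/supercritical transport requires the pressure to stay above
    the saturation curve by a safety margin (margin is an example value)."""
    return T >= T_CRIT or p >= p_sat(T) + margin

print(round(p_sat(280.0), 1))        # roughly 42 bar near 280 K
print(liquid_phase_ok(280.0, 60.0))  # comfortably in the liquid region
```

Such a check, evaluated along the pipeline, is the kind of phase-boundary proximity detection the abstract refers to, only with a far more accurate equation of state.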
Pipeline transport is an efficient method for transporting fluids in energy supply and other technical applications. While natural gas is the classical example, the transport of hydrogen is becoming more and more important; both are transmitted under high pressure in a gaseous state. Also relevant is the transport of carbon dioxide, which is captured at its places of formation, transferred under high pressure in a liquid or supercritical state, and pumped into underground reservoirs for storage. The transport of other fluids is also required in technical applications. Meanwhile, the transport equations for different fluids are essentially the same, and the simulation can be performed using the same methods. In this paper, the effect of control elements such as compressors, regulators, and flap traps on the stability of fluid transport simulations is studied. It is shown that modeling these elements can lead to instabilities in both stationary and dynamic simulations. Special regularization methods were developed to overcome these problems, and their functionality, also for dynamic simulations, is demonstrated in a number of numerical experiments.
The aim of this study is to examine the influence of personality on sustainable behavior, using streaming consumption as an example. Generally rising streaming consumption and the environmental damage it causes on the one hand, and growing public environmental awareness on the other, constitute a contradiction. 204 participants took part in an online survey on these and related aspects. While high levels of the traits agreeableness and openness had a positive effect on environmental attitude, environmental behavior, and environmental concern, a cluster analysis showed that environmentally friendly measures were preferred more strongly by the group whose agreeableness and openness were comparatively low. Knowledge about the environmental consequences of streaming was generally low, which may help explain the contradiction mentioned above. Participants called for raising awareness of this issue. To make streaming consumption more environmentally friendly, all actors involved in the process should be included. The consumers surveyed primarily favored the use of green electricity and largely rejected a change to the payment structure.
Nowadays, we input text not only on stationary devices, but also on handheld devices while walking, driving, or commuting. Text entry on the move, which we term nomadic text entry, is generally slower. This is partly due to the need for users to move their visual focus from the device to their surroundings for navigational purposes and back. To investigate whether better on-device feedback about users' surroundings can improve performance, we present a number of new and existing feedback systems: textual, visual, textual & visual, and textual & visual via a translucent keyboard. Experimental comparisons between the conventional technique and these alternatives established that increased ambient awareness enhances nomadic text entry performance for mobile users. Results showed that the textual and the textual & visual via translucent keyboard conditions increased text entry speed by 14% and 11%, respectively, and reduced the error rate by 13% compared to the regular technique. The two methods also significantly reduced the number of collisions with obstacles.
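The reported speed gains can be expressed with the standard text-entry metrics; the raw character counts below are invented for illustration and are not the study's data:

```python
def wpm(transcribed_chars, seconds):
    """Words per minute, using the 5-characters-per-word convention."""
    return (transcribed_chars / 5.0) / (seconds / 60.0)

def relative_change(baseline, condition):
    """Percentage change of a condition relative to the baseline."""
    return (condition - baseline) / baseline * 100.0

base = wpm(125, 60.0)        # 25 WPM in the regular condition (made up)
textual = wpm(142.5, 60.0)   # 28.5 WPM with textual feedback (made up)
print(round(relative_change(base, textual)))  # a 14 % speed-up
```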
In recent years, electronic payment systems have established themselves as a popular alternative to traditional cash payment. These payment systems usually consist of two elementary components: a terminal and a cash register. They allow the buyer of a product to settle their debt to the seller electronically and without cash. The most frequent business processes, booking and cancelling payment records, are referred to as transactions, since they must either succeed completely or, in the event of an error, have no effect. This book therefore presents the implementation of a reliable payment system with a TeleCash terminal, ensuring the essential transaction properties in the required business processes. It first develops the fundamentals of transactions and a suitable transaction concept, then carries out the concrete realization of the system using the Java Transaction Service, and finally examines the resulting system with respect to its transaction properties.
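The all-or-nothing property described above can be sketched with a toy commit/rollback model; this is a minimal illustration, not the book's TeleCash/JTS implementation:

```python
class Transaction:
    """Toy context manager: changes to the ledger are kept on success
    and fully rolled back if an exception occurs."""

    def __init__(self, ledger):
        self.ledger = ledger
        self._snapshot = None

    def __enter__(self):
        self._snapshot = list(self.ledger)  # remember state for rollback
        return self.ledger

    def __exit__(self, exc_type, exc, tb):
        if exc_type is not None:
            self.ledger[:] = self._snapshot  # error: restore old state
            return True                      # swallow the error after rollback
        return False                         # success: changes are committed

ledger = []
with Transaction(ledger) as l:
    l.append(("booking", 19.99))            # commits
with Transaction(ledger) as l:
    l.append(("booking", 5.00))
    raise RuntimeError("terminal offline")  # triggers rollback
print(ledger)  # only the first booking remains
```

A real payment system would additionally need durability and isolation, which is exactly what transaction services such as JTS provide on top of this basic atomicity idea.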