H-BRS Bibliography
Sind kleinere und mittlere Unternehmen (KMU) bereits auf die Digitale Transformation vorbereitet?
(2018)
A study conducted by the authors found clear indications that many small and medium-sized enterprises (SMEs) do not yet have sufficient maturity for digital transformation. To address this problem, the authors propose developing an agile IT management concept that allows the IT function to be steered dynamically and without the formal overhead of classical IT management.
Multi-Merger-Szenarien als Herausforderung für das IT-Controlling - Checklisten zur IT-Integration
(2006)
Digitalisierung für kleinere und mittlere Unternehmen (KMU): Anforderungen an das IT-Management
(2018)
Mergers and acquisitions take place all over the world and in many industries, typically motivated by corporate politics. While IT management is often not involved in the decision-making, it has to solve a wide range of problems in the post-merger phase. Indeed, merging two or more companies implies not only merging their core businesses, but also creating a new and efficiently integrated IT organisation from the individual ones, since keeping the existing IT organisations as they are usually does not make sense. In addition, corporate management frequently imposes constraints, e.g. cost reductions, on the IT infrastructure. The principal critical success factor when merging IT organisations is the uninterrupted operation of the IT business, because a service gap is acceptable neither for in-house functional departments nor for external customers. Therefore, the IT rebuilding phase has to focus on the IT services that facilitate the processes of functional departments, support processes, and the processes of customers and suppliers, so that any transformation work is transparent to internal and external customers. In this article we describe a real-world but anonymised case study. Our goals are to highlight the points that matter when merging IT organisations and to help decision-makers, particularly in the areas of IT organisation and IT personnel. We focus on the organisational and non-technical issues that arise, taking a management perspective, i.e. the CIO's view, and provide checklists intended to help IT managers address the most pressing issues. To assist CIOs in surviving the post-merger phase, we give checklists for merging IT organisations, checklists for merging IT human resources, checklists for IT budgets and reporting, and we assess activities in a merger scenario. IT hardware, software and IT infrastructure as well as running IT projects are not considered in this paper.
Agiles IT-Controlling
(2022)
While agile methods have enjoyed broad acceptance in IT project management practice for many years, IT controlling still relies predominantly on classical methods. This article examines whether and how the methods used in IT controlling can also follow agile paradigms and how methods from agile IT project management can be adapted for it.
While the corporate working world is shifting more and more towards agility, IT controlling still remains stuck in old, classical structures. This work examines whether and to what extent agile approaches can be used in IT controlling. This contribution is a modified version of the article "Agiles IT-Controlling" published in the journal "HMD Praxis der Wirtschaftsinformatik" (https://link.springer.com/article/10.1365/s40702-022-00837-0).
IT performance measurement is often associated by chief executive officers with IT cost cutting, although it actually protects business processes from increasing IT costs; IT cost cutting alone only endangers the company's efficiency. This view stigmatizes those who carry out IT performance measurement in companies as bean-counters. The present paper describes an integrated reference model for IT performance measurement based on a life-cycle model and a performance-oriented framework. The model was created from a practical point of view; it is designed to be lean compared with other known concepts and is well suited for small and medium-sized enterprises (SMEs).
A plethora of architectural patterns and elements for developing service-oriented applications can be gathered from the state of the art. Most of these approaches are only applicable to single-tenant applications; much less methodical support is provided for scenarios in which multiple tenants with varying requirements access the same application stack concurrently. In order to fill this gap, both novel and existing architectural patterns, architectural elements, and fundamental design decisions must be considered and integrated into a framework that supports the development of multi-tenant applications. This paper addresses this demand and presents the SOAdapt framework. It promotes the development of adaptable multi-tenant applications based on a service-oriented architecture that is capable of incorporating the specific requirements of new tenants in a flexible manner.
Digital transformation is massively changing the international cooperation of higher education institutions. Beyond the possibilities of virtual mobility, new topic areas are emerging that change, complement, or newly enable international learning and teaching experiences with digital support. In the area of internationalisation funding (DAAD, Erasmus+, BMBF, among others), projects and funding formats have emerged that combine digitalisation and internationalisation and address these new topics, e.g. didactic formats, administrative processes (also in the context of the OZG and the GDPR), virtual and hybrid mobility, international project and team formats, and finally content that combines international, intercultural and interdisciplinary competences with digital competences. The proposed workshop is intended to bring such projects together and structure the topics in order to provide an overview of current developments and thus contribute to defining the field of "Digitalisierung & Internationalisierung".
Trueness and precision of milled and 3D printed root-analogue implants: A comparative in vitro study
(2023)
The need for innovation around the control functions of inverters is great. PV inverters were initially expected to be passive followers of the grid and to disconnect as soon as abnormal conditions occurred. Since future power systems will be dominated by generation and storage resources interfaced through inverters, these converters must move from following to forming and sustaining the grid. As "digital natives", PV inverters can also play an important role in the digitalisation of distribution networks. In this short review we identified a large potential to make the PV inverter the smart local hub in a distributed energy system. At the micro level, costs and coordination can be improved with bidirectional inverters between the AC grid and PV production, stationary storage, car chargers and DC loads. At the macro level, the distributed nature of PV generation means that the same devices will support both the local distribution network and the global stability of the grid. Much success has been achieved in the former; the latter remains a challenge, in particular in terms of scaling. Yet there is some urgency in researching and demonstrating such solutions. And while digitalisation offers promise in all control aspects, it also raises significant cybersecurity concerns.
A principal step towards solving diverse perception problems is segmentation. Many algorithms benefit from initially partitioning input point clouds into objects and their parts. In accordance with the cognitive sciences, the segmentation goal may be formulated as splitting point clouds into locally smooth convex areas enclosed by sharp concave boundaries. This goal is based on purely geometrical considerations and does not incorporate any constraints, or semantics, of the scene and objects being segmented, which makes it very general and widely applicable. In this work we perform geometrical segmentation of point cloud data according to the stated goal. The data is mapped onto a graph and the task of graph partitioning is considered. We formulate an objective function and derive a discrete optimization problem based on it. Finding the globally optimal solution is an NP-complete problem; in order to circumvent this, spectral methods are applied. Two algorithms that implement a divisive hierarchical clustering scheme are proposed. They derive a graph partition by analyzing the eigenvectors obtained through spectral relaxation. The specifics of our application domain are used to automatically introduce cannot-link constraints into the clustering problem. The algorithms work in a completely unsupervised manner and make no assumptions about the shapes of the objects and structures that they segment. Three publicly available datasets with cluttered real-world scenes and an abundance of box-like, cylindrical, and free-form objects are used to demonstrate convincing performance. Preliminary results of this thesis have been contributed to the International Conference on Autonomous Intelligent Systems (IAS-13).
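For orientation only, the core idea of spectral graph partitioning described above can be sketched in a few lines. This is a generic illustration (a k-nearest-neighbour graph fed to off-the-shelf spectral clustering), not the thesis' constrained hierarchical algorithm; all function names and parameters are placeholder assumptions.

```python
# Illustrative sketch: generic spectral partitioning of a point cloud mapped
# onto a k-NN graph. It omits the thesis' concavity-based cannot-link
# constraints and divisive hierarchical scheme.
import numpy as np
from sklearn.neighbors import kneighbors_graph
from sklearn.cluster import SpectralClustering

def segment_point_cloud(points: np.ndarray, n_segments: int = 5, k: int = 10):
    """Partition an (N, 3) point cloud into n_segments via spectral clustering."""
    # Build a k-nearest-neighbour connectivity graph and symmetrise it.
    graph = kneighbors_graph(points, n_neighbors=k, mode="connectivity", include_self=False)
    affinity = 0.5 * (graph + graph.T)

    model = SpectralClustering(
        n_clusters=n_segments,
        affinity="precomputed",
        assign_labels="kmeans",
        random_state=0,
    )
    return model.fit_predict(affinity.toarray())

if __name__ == "__main__":
    cloud = np.random.rand(500, 3)      # stand-in for real scan data
    labels = segment_point_cloud(cloud, n_segments=4)
    print(np.bincount(labels))          # points per segment
```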
A company's financial documents use tables along with text to organize data containing key performance indicators (KPIs), such as profit and loss, and the financial quantities linked to them. The quantity linked to a KPI in a table might not equal the quantity of a similarly described KPI in the text. Auditors spend substantial time manually checking for such mismatches, a process called consistency checking. In contrast to existing work, this paper attempts to automate this task with the help of transformer-based models. For consistency checking it is essential that the embeddings of the table's KPIs encode both the semantic knowledge of the KPIs and the structural knowledge of the table. Therefore, this paper proposes a pipeline that uses a tabular model to obtain the embeddings of the table's KPIs. The pipeline takes table and text KPIs as input, generates their embeddings, and then checks whether these KPIs are identical. The pipeline is evaluated on financial documents in the German language, and a comparative analysis of the quality of the cell embeddings from three tabular models is also presented. According to the evaluation results, the experiment that used English-translated text and table KPIs and the Tabbie model to generate the table KPI embeddings achieved an accuracy of 72.81% on the consistency checking task, outperforming the benchmark and the other tabular models.
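As a rough illustration of the embedding-comparison step, the sketch below embeds a table KPI and a text KPI and compares them by cosine similarity. The actual pipeline uses table-aware models such as Tabbie and German financial documents; the sentence-transformers model name and the 0.8 threshold here are stand-in assumptions, not the paper's setup.

```python
# Hedged sketch of the basic idea: embed a KPI mention from a table and one
# from the running text, then decide whether they refer to the same KPI.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

def kpis_match(table_kpi: str, text_kpi: str, threshold: float = 0.8) -> bool:
    """Return True if the two KPI descriptions likely denote the same KPI."""
    emb = model.encode([table_kpi, text_kpi])
    cos = float(np.dot(emb[0], emb[1]) / (np.linalg.norm(emb[0]) * np.linalg.norm(emb[1])))
    return cos >= threshold

# Flag a potential inconsistency if the KPIs match but the reported amounts differ.
table_value, text_value = 12.4, 12.9
if kpis_match("Umsatzerlöse (Mio. EUR)", "revenue in million euros") and table_value != text_value:
    print("Possible inconsistency between table and text")
```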
In 1991 the researchers at the Center for the Learning Sciences of Carnegie Mellon University were confronted with the confusing question "where is AI?" from users who were interacting with AI but did not realize it. Three decades of research later, we are still facing the same issue with users of AI technology. In the absence of users' awareness and of a mutual understanding of AI-enabled systems between designers and users, informal theories of the users about how a system works ("folk theories") become inevitable but can lead to misconceptions and ineffective interactions. To shape appropriate mental models of AI-based systems, explainable AI has been suggested by AI practitioners. However, a profound understanding of the current users' perception of AI is still missing. In this study, we introduce the term "Perceived AI" (PAI) as "AI defined from the perspective of its users". We then present our preliminary results from in-depth interviews with 50 users of AI technology, which provide a framework for our future research approach towards a better understanding of PAI and users' folk theories.
For most people, using their body to authenticate their identity is an integral part of daily life. From our fingerprints to our facial features, our physical characteristics store the information that identifies us as "us." This biometric information is becoming increasingly vital to the way we access and use technology. As more and more platform operators struggle with traffic from malicious bots on their servers, the burden of proof is on users, only this time they have to prove their very humanity and there is no court or jury to judge, but an invisible algorithmic system. In this paper, we critique the invisibilization of artificial intelligence policing. We argue that this practice obfuscates the underlying process of biometric verification. As a result, the new "invisible" tests leave no room for the user to question whether the process of questioning is even fair or ethical. We challenge this thesis by offering a juxtaposition with the science fiction imagining of the Turing test in Blade Runner to reevaluate the ethical grounds for reverse Turing tests, and we urge the research community to pursue alternative routes of bot identification that are more transparent and responsive.
Technological objects present themselves as necessary, only to become obsolete faster than ever before. This phenomenon has led to a population that experiences a plethora of technological objects and interfaces as they age, which become associated with certain stages of life and disappear thereafter. Noting the expanding body of literature within HCI about appropriation, our work pinpoints an area that needs more attention, “outdated technologies.” In other words, we assert that design practices can profit as much from imaginaries of the future as they can from reassessing artefacts from the past in a critical way. In a two-week fieldwork with 37 HCI students, we gathered an international collection of nostalgic devices from 14 different countries to investigate what memories people still have of older technologies and the ways in which these memories reveal normative and accidental use of technological objects. We found that participants primarily remembered older technologies with positive connotations and shared memories of how they had adapted and appropriated these technologies, rather than normative uses. We refer to this phenomenon as nostalgic reminiscence. In the future, we would like to develop this concept further by discussing how nostalgic reminiscence can be operationalized to stimulate speculative design in the present.
When dialogues with voice assistants (VAs) fall apart, users often become confused or even frustrated. To address these issues and related privacy concerns, Amazon recently introduced a feature allowing Alexa users to inquire about why it behaved in a certain way. But how do users perceive this new feature? In this paper, we present preliminary results from research conducted as part of a three-year project involving 33 German households. This project utilized interviews, fieldwork, and co-design workshops to identify common unexpected behaviors of VAs, as well as users’ needs and expectations for explanations. Our findings show that, contrary to its intended purpose, the new feature actually exacerbates user confusion and frustration instead of clarifying Alexa's behavior. We argue that such voice interactions should be characterized as explanatory dialogs that account for VA’s unexpected behavior by providing interpretable information and prompting users to take action to improve their current and future interactions.
AI (artificial intelligence) systems are increasingly being used in all aspects of our lives, from mundane routines to sensitive decision-making and even creative tasks. Therefore, an appropriate level of trust is required so that users know when to rely on the system and when to override it. While research has looked extensively at fostering trust in human-AI interactions, the lack of standardized procedures for human-AI trust makes it difficult to interpret results and compare across studies. As a result, the fundamental understanding of trust between humans and AI remains fragmented. This workshop invites researchers to revisit existing approaches and work toward a standardized framework for studying AI trust to answer the open questions: (1) What does trust mean between humans and AI in different contexts? (2) How can we create and convey the calibrated level of trust in interactions with AI? And (3) How can we develop a standardized framework to address new challenges?
In the fermentation process, sugars are transformed into lactic acid. pH meters have traditionally been used to monitor the fermentation process based on acidity. More recently, near-infrared (NIR) spectroscopy has proven to provide an accurate and non-invasive method to detect when the transformation of sugars into lactic acid is finished. This research proposes the use of simplified NIR spectroscopy based on multispectral optical sensors as a simpler and less expensive means of determining when to end the fermentation process. The NIR spectra of milk and yogurt are compared to find and extract features that can be used to design a simple sensor to monitor the yogurt fermentation process. Multispectral images in four selected wavebands within the NIR spectrum are captured and show different spectral remission characteristics for milk, yogurt and water, which supports the selection of these wavebands for milk and yogurt classification.
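To illustrate how a classifier could use remission values in four selected wavebands to separate milk, yogurt and water, here is a minimal sketch. The waveband features and numbers are invented placeholders, not measurements from the study, and the choice of classifier is an assumption.

```python
# Placeholder sketch: classify a sample from mean remission values in four
# NIR wavebands. All numbers below are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# rows: [band1, band2, band3, band4] mean remission per sample (placeholder data)
X_train = np.array([
    [0.82, 0.78, 0.74, 0.70],   # milk
    [0.65, 0.60, 0.58, 0.55],   # yogurt
    [0.10, 0.09, 0.08, 0.07],   # water
    [0.80, 0.77, 0.73, 0.69],   # milk
    [0.63, 0.59, 0.57, 0.54],   # yogurt
    [0.11, 0.10, 0.08, 0.06],   # water
])
y_train = ["milk", "yogurt", "water", "milk", "yogurt", "water"]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
print(clf.predict([[0.64, 0.60, 0.57, 0.55]]))  # expected: 'yogurt'
```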
Information reliability and automatic computation are two important aspects that are continuously pushing the Web to become more semantic. Information uploaded to the Web should be reusable and automatically extractable by other applications, platforms, etc. Several tools exist to explicitly mark up Web content. Web services can also play a positive role in the automatic processing of Web content, especially when they act as flexible and agile agents. However, Web services themselves should be developed with semantics in mind: they should include and provide structured information to facilitate their use, reuse, composition, querying, etc. In this chapter, the authors focus on evaluating state-of-the-art semantic aspects and approaches in Web services. Ultimately, this contributes to the goal of Web knowledge management, execution, and transfer.
Software testing in a web services environment faces different challenges compared with testing in traditional software environments. Regression testing activities are triggered by software changes or evolution. In web services, evolution is not a choice for service clients: they always have to use the current, updated version of the software. In addition, test execution or invocation is expensive in web services, and hence providing algorithms to optimize test case generation and execution is vital. For this environment, we proposed several approaches for test case selection in web services regression testing. Testing in this new environment should evolve to become part of the service contract. Service providers should provide data or usage sessions that can help service clients reduce testing expenses by optimizing the selected and executed test cases.
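One common way to optimize test case selection of the kind described above is a greedy cover of changed service operations by available test cases or recorded usage sessions. The sketch below illustrates that generic idea only, not the paper's specific selection algorithms; all identifiers and data are invented for illustration.

```python
# Illustrative sketch: greedily pick a small set of tests that covers all
# changed web service operations (a classic set-cover heuristic).
def select_tests(test_coverage: dict, changed_ops: set) -> list:
    """Return a small list of test names covering the changed operations."""
    remaining = set(changed_ops)
    selected = []
    while remaining:
        # pick the test covering the most still-uncovered operations
        best = max(test_coverage, key=lambda t: len(test_coverage[t] & remaining))
        if not (test_coverage[best] & remaining):
            break  # some changed operations are not covered by any test
        selected.append(best)
        remaining -= test_coverage[best]
    return selected

coverage = {
    "t1": {"getQuote", "placeOrder"},
    "t2": {"placeOrder", "cancelOrder"},
    "t3": {"getQuote"},
}
print(select_tests(coverage, {"getQuote", "cancelOrder"}))  # -> ['t1', 't2']
```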
ENaC channels
(2023)
The epithelial sodium channel (ENaC) is a heterotrimeric ion channel that plays a key role in sodium and water homeostasis in tetrapod vertebrates. In the aldosterone-sensitive distal nephron, hormonally controlled ENaC expression matches dietary sodium intake to its excretion. Furthermore, ENaC mediates sodium absorption across the epithelia of the colon, sweat ducts, reproductive tract, and lung. ENaC is a constitutively active ion channel and its expression, membrane abundance, and open probability (PO) are controlled by multiple intracellular and extracellular mediators and mechanisms [9]. Aberrant ENaC regulation is associated with severe human diseases, including hypertension, cystic fibrosis, pulmonary edema, pseudohypoaldosteronism type 1, and nephrotic syndrome [9].
Introduction: After cellulose, lignin is the most abundant biopolymer on earth, accounting for up to 18-35% by weight of lignocellulosic biomass. Today, it is a by-product of the paper and pulping industry. Although lignin is available in huge amounts, mainly in the form of so-called black liquor produced via kraft pulping, processes for the valorization of lignin are still limited [1]. Due to its hyperbranched, polyphenol-like structure, lignin has gained increasing interest as a biobased building block for polymer synthesis [2]. The present work focuses on the extraction and purification of lignin from industrial black liquor and the synthesis of lignin-based polyurethanes.
Lignocellulose feedstock (LCF) provides a sustainable source of components to produce bioenergy, biofuel, and novel biomaterials. Besides hard and soft wood, so-called low-input plants such as Miscanthus are interesting crops to be investigated as potential feedstock for the second-generation biorefinery. The status quo regarding the availability and composition of different plants, including grasses and fast-growing trees (e.g., Miscanthus, Paulownia), is reviewed here. The second focus of this review is the potential of multivariate data processing for biomass analysis and quality control. Experimental data obtained by spectroscopic methods, such as nuclear magnetic resonance (NMR) and Fourier-transform infrared (FTIR) spectroscopy, can be processed using computational techniques to characterize the 3D structure and energetic properties of the feedstock building blocks, including complex linkages. Here, we provide a brief summary of recently reported experimental data for the structural analysis of LCF biomasses, and give our perspectives on the role of chemometrics in understanding and elucidating LCF composition and lignin 3D structure.
Renewable resources are gaining increasing interest as a source of environmentally benign biomaterials, such as drug encapsulation/release compounds and scaffolds for tissue engineering in regenerative medicine. As lignin is the second most abundant natural polymer, interest in its valorization for biomedical use is growing rapidly. Depending on the resource and isolation procedure, lignin shows specific antioxidant and antimicrobial activity. Today, efforts in research and industry are directed toward lignin utilization as a renewable macromolecular building block for the preparation of polymeric drug encapsulation and scaffold materials. Within the last five years, remarkable progress has been made in the isolation, functionalization and modification of lignin and lignin-derived compounds. However, the literature so far mainly focuses on lignin-derived fuels, lubricants and resins. The purpose of this review is to summarize the current state of the art and to highlight the most important results in the field of lignin-based materials for potential use in biomedicine (reported in 2014–2018). Special focus is placed on lignin-derived nanomaterials for drug encapsulation and release as well as lignin hybrid materials used as scaffolds for guided bone regeneration in stem cell-based therapies.
Antioxidant activity is an essential aspect of oxygen-sensitive merchandise and goods, such as food and corresponding packaging, cosmetics, and biomedicine. Technical lignin has not yet been applied as a natural antioxidant, mainly due to its complex heterogeneous structure and polydispersity. This report presents antioxidant capacity studies carried out using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) assay. The influence of purification on lignin structure and activity was investigated. The purification procedure showed that two-fold selective extraction is the most efficient (confirmed by ultraviolet-visible (UV/Vis), Fourier transform infrared (FTIR), heteronuclear single quantum coherence (HSQC) and 31P nuclear magnetic resonance spectroscopy, size exclusion chromatography, and X-ray diffraction), resulting in fractions of very narrow polydispersity (3.2–1.6) and up to four distinct absorption bands in UV/Vis spectroscopy. According to differential scanning calorimetry measurements, the glass transition temperature increased from 123 to 185 °C for the purest fraction. Antioxidant capacity is discussed with regard to the biomass source, pulping process, and degree of purification. Lignins obtained from industrial black liquor are compared with beech wood samples: the antioxidant activity (DPPH inhibition) of the kraft lignin fractions was 62–68%, whereas beech and spruce/pine-mixed lignin showed values of 42% and 64%, respectively. The total phenol content (TPC) of the isolated kraft lignin fractions varied between 26 and 35%, whereas that of beech and spruce/pine lignin was 33% and 34%, respectively. Storage decreased the TPC values but increased the DPPH inhibition.
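For reference, DPPH inhibition values such as those quoted above are conventionally computed from absorbance readings; the following is the standard form of the calculation, not necessarily the exact protocol used in this study:

$$\mathrm{DPPH\ inhibition}\,(\%) = \frac{A_{\mathrm{control}} - A_{\mathrm{sample}}}{A_{\mathrm{control}}} \times 100$$

where $A_{\mathrm{control}}$ is the absorbance of the DPPH solution without the sample and $A_{\mathrm{sample}}$ is the absorbance after incubation with the lignin fraction.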
Antioxidant activity is an essential feature required for oxygen-sensitive merchandise and goods, such as food and corresponding packaging as well as materials used in cosmetics and biomedicine. For example, vanillin, one of the most prominent antioxidants, is produced from lignin, the second most abundant natural polymer in the world. Antioxidant potential is primarily related to the termination of oxidation propagation reactions through hydrogen transfer. The application of technical lignin as a natural antioxidant has not yet been implemented in the industrial sector, mainly due to the complex heterogeneous structure and polydispersity of lignin. Thus, current research focuses on various isolation and purification strategies to improve the compatibility of lignin material with substrates and to enhance its stabilizing effect.
The antiradical and antimicrobial activities of lignin and lignin-based films are both of great interest for applications such as food packaging additives. The polyphenolic structure of lignin, in addition to the presence of O-containing functional groups, is potentially responsible for these activities. This study used DPPH assays to assess the antiradical activity of HPMC/lignin and HPMC/lignin/chitosan films. The scavenging activity (SA) of both the binary (HPMC/lignin) and ternary (HPMC/lignin/chitosan) systems was affected by the percentage of added lignin: the 5% addition showed the highest activity and the 30% addition the lowest. Both the scavenging activity and the antimicrobial activity depend on the biomass source, showing the following trend: organosolv of softwood > kraft of softwood > organosolv of grass. Testing the antimicrobial activities of the lignins and lignin-containing films showed high antimicrobial activity against Gram-positive and Gram-negative bacteria at 35 °C and at low temperatures (0-7 °C). Purification of kraft lignin has a negative effect on the antimicrobial activity, while storage has a positive effect. The release of lignin from the produced films affected the activity positively, and the addition of chitosan enhances the activity even further against both Gram-positive and Gram-negative bacteria. Testing the films against spoilage bacteria that grow at low temperatures revealed activity of the 30% addition in the HPMC/L1 film against both B. thermosphacta and P. fluorescens, while L5 was active only against B. thermosphacta. In the HPMC/lignin/chitosan films, the 5% addition exhibited activity against both B. thermosphacta and P. fluorescens.
Once aberrantly activated, the Wnt/β-catenin pathway may result in uncontrolled proliferation and eventually cancer. Efforts to counter and inhibit this pathway are mainly directed against β-catenin, as it serves a role in both the cytoplasm and the nucleus. In addition, specially generated lymphocytes are recruited for the purpose of treating liver cancer. Peripheral blood mononuclear lymphocytes are expanded by the timely addition of interferon-γ, interleukin (IL)-1β, IL-2 and an anti-cluster of differentiation 3 antibody. The resulting cells are called cytokine-induced killer (CIK) cells. The present study utilised these cells and combined them with drugs inhibiting the Wnt pathway in order to examine whether this improved the killing ability of CIK cells against liver cancer cells. Ethacrynic acid (EA) and ciclopirox olamine (CPX) were determined to be suitable candidate drugs, as established by previous studies. The drugs were administered on their own and combined with CIK cells, and a cell viability assay was then performed. The results suggest that EA-treated cells underwent apoptosis and were significantly affected compared with untreated cells. Unlike EA, CPX killed normal as well as cancerous cells even at low concentrations. After combining EA with CIK cells, the potency of killing was increased and a greater number of cells died, which indicates a synergistic action. In summary, EA may be used as an anti-hepatocellular carcinoma drug, while CPX possesses high toxicity to cancerous as well as to normal cells. It is proposed that EA should be integrated into current therapeutic approaches for cancer.
In today's business world, culture plays a vital role and to a high degree influences the attitude, perception and decision-making processes of individuals. Culture is an unavoidable set of rules and regulations that defines people's daily life in a particular environment or society. There are plenty of examples of business failures, stagnation, or failed joint ventures caused by management's inability to recognize cross-cultural challenges and tackle them appropriately.
BWL kompakt für Dummies
(2018)
"BWL kompakt für Dummies" offers an accessible introduction to business administration, whether you need it for initial or continuing education or simply want to get up to speed. Tobias Amely introduces the essential elements and basic concepts of business administration and shows their relevance to corporate practice: materials management, service provision and production, marketing, investment and financing, corporate organisation and management, accounting and controlling.
BWL-Formeln für Dummies
(2012)
BWL-Formeln für Dummies
(2018)
Do you also find formulas hard to remember? With this handy reference work, that is no longer necessary! Tobias Amely, the author of the bestseller "BWL für Dummies", has compiled the most important business administration formulas for you. For each formula you will also find an illustrative example of how it is used and an explanation of what it is actually needed for. "BWL-Formeln für Dummies" is therefore much more than a mere list of formulas.
BWL kompakt für Dummies
(2016)
BWL-Klausuren für Dummies
(2019)