Departments, institutes and facilities
- Fachbereich Informatik (59)
- Fachbereich Angewandte Naturwissenschaften (32)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (32)
- Fachbereich Ingenieurwissenschaften und Kommunikation (25)
- Institut für Cyber Security & Privacy (ICSP) (20)
- Fachbereich Wirtschaftswissenschaften (17)
- Institut für funktionale Gen-Analytik (IFGA) (16)
- Institute of Visual Computing (IVC) (13)
- Institut für Verbraucherinformatik (IVI) (11)
- Internationales Zentrum für Nachhaltige Entwicklung (IZNE) (6)
Document Type
- Conference Object (79)
- Article (70)
- Part of a Book (10)
- Conference Proceedings (3)
- Book (monograph, edited volume) (2)
- Doctoral Thesis (2)
- Bachelor Thesis (1)
- Contribution to a Periodical (1)
- Research Data (1)
- Master's Thesis (1)
Year of publication
- 2016 (174)
Language
- English (174)
Keywords
- Dielectric analysis (2)
- Fas (2)
- IEEE802.11 (2)
- Intelligent Transport System (2)
- Large, high-resolution displays (2)
- Lignin (2)
- Long-Distance WiFi (2)
- Numerical optimization (2)
- Single Instruction Multiple Data (SIMD) (2)
- SpMV (2)
Recent years have seen extensive adoption of domain generation algorithms (DGAs) by modern botnets. The main goal is to generate a large number of domain names and then use a small subset for actual C&C communication. This makes DGAs very compelling for botmasters who want to harden the infrastructure of their botnets and make it resilient to blacklisting and to attacks such as takedown efforts. While early DGAs were used as a backup communication mechanism, several new botnets use them as their primary communication method, making it extremely important to study DGAs in detail.
In this paper, we perform a comprehensive measurement study of the DGA landscape by analyzing 43 DGA-based malware families and variants. We also present a taxonomy for DGAs and use it to characterize and compare the properties of the studied families. By re-implementing the algorithms, we pre-compute all possible domains they generate, covering the majority of known and active DGAs. We then study the registration status of over 18 million DGA domains and show that the corresponding malware families and related campaigns can be reliably identified by pre-computing future DGA domains. We also give insights into botmasters’ strategies regarding domain registration and identify several pitfalls in previous takedown efforts against DGA-based botnets. We will share the dataset for future research and will also provide a web service to check domains for potential DGA identity.
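To make the pre-computation idea concrete, here is a minimal, hypothetical seeded DGA sketch (the seed string, hashing scheme, label length and TLD are invented for illustration and do not correspond to any of the 43 studied families). Because the algorithm is a deterministic function of a seed and a date, anyone who re-implements it can enumerate every domain for any future date:

```python
import hashlib
from datetime import date

def toy_dga(seed, day, count=10):
    """Generate `count` pseudo-random domains from a seed and a date.
    A deliberately simplified, hypothetical DGA for illustration only."""
    domains = []
    for i in range(count):
        material = "{}-{}-{}".format(seed, day.isoformat(), i).encode()
        digest = hashlib.sha256(material).hexdigest()
        # Map the first 12 hex digits to letters a-p for a plausible label.
        label = "".join(chr(ord("a") + int(c, 16)) for c in digest[:12])
        domains.append(label + ".com")
    return domains

# Deterministic inputs mean a defender can pre-compute all future domains.
future = toy_dga("examplebot", date(2016, 6, 1))
```

A blocklist or sinkhole operator would run exactly this enumeration ahead of time, which is the basis for the registration-status measurements described above.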
The aim of design science research (DSR) in information systems is the user-centred creation of IT artifacts with regard to specific social environments. For culture research in the field, which is necessary for a proper localization of IT artifacts, models and research approaches from the social sciences are usually adopted. Descriptive, dimension-based culture models are most commonly applied for this purpose; they assume culture to be a national phenomenon and tend to reduce it to basic values. Such models are useful in behavioural culture research, which aims to isolate, describe and explain culture-specific attitudes and characteristics within a selected society. In contrast, since concrete decisions for artifact design must be deduced from them, research results in DSR need to go beyond this aim. As a hypothesis, this contribution questions the general applicability of such generic culture dimension models for DSR and focuses on their theoretical foundation, which goes back to Hofstede’s conceptual Onion Model of Culture. The literature-based analysis applied here confirms the hypothesis. Consequently, an alternative conceptual culture model is introduced and discussed as a theoretical foundation for culture research in DSR.
During exercise, heart rate has proven to be a good measure for planning workouts. It is not only simple to measure but also well understood and has been used for workout planning for many years. To use heart rate to control physical exercise, a model that predicts future heart rate as a function of a given strain can be utilized. In this paper, we present a mathematical model based on convolution for predicting the heart rate response to strain, with four physiologically explainable parameters. The model is based on the general idea of the Fitness-Fatigue model for performance analysis, but is revised here for heart rate analysis. Comparisons show that the Convolution model can compete with other known heart rate models. Furthermore, the new model can be improved by reducing the number of parameters. The remaining parameter seems to be a promising indicator of the subject’s actual fitness.
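A minimal sketch of such a convolution model, assuming a two-exponential impulse response in the spirit of the Fitness-Fatigue idea (the four parameters k1, tau1, k2, tau2 and all numeric values below are illustrative, not the fitted values from the paper):

```python
import math

def predict_hr(strain, hr_rest=60.0, k1=0.5, tau1=100.0, k2=0.05, tau2=300.0, dt=1.0):
    """Predict heart rate as the convolution of a strain signal with a
    two-exponential impulse response (four parameters: k1, tau1, k2, tau2).
    All parameter values are illustrative, not fitted values."""
    hr = []
    for t in range(len(strain)):
        response = 0.0
        for s in range(t + 1):
            delta = (t - s) * dt
            # first term: fast positive response; second: slow fatigue-like term
            response += (k1 * math.exp(-delta / tau1)
                         - k2 * math.exp(-delta / tau2)) * strain[s] * dt
        hr.append(hr_rest + response)
    return hr

# 5 minutes of constant strain followed by 5 minutes of rest
strain = [1.0] * 300 + [0.0] * 300
hr = predict_hr(strain)
```

During the loaded phase the predicted heart rate rises above rest; after the strain stops, the convolution kernel lets it decay back, which is the qualitative behavior the model is meant to capture.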
Recessive mutations in the MPV17 gene cause mitochondrial DNA depletion syndrome, a fatal infantile genetic liver disease in humans. Loss of function in mice leads to glomerulosclerosis and sensorineural deafness accompanied by mitochondrial DNA depletion. Mutations in the yeast homolog Sym1 and in the zebrafish homolog tra cause interesting, but not obviously related, phenotypes, although the human gene can complement the yeast Sym1 mutation. The MPV17 protein is a hydrophobic membrane protein of 176 amino acids and unknown function. Initially localised to murine peroxisomes, it was later reported to be a mitochondrial inner membrane protein in humans and in yeast. To resolve this contradiction we tested two new mouse monoclonal antibodies directed against the human MPV17 protein in Western blots and immunohistochemistry on human U2OS cells. One of these monoclonal antibodies showed specific reactivity to a protein of 20 kDa absent in MPV17-negative mouse cells. Immunofluorescence studies revealed colocalisation with peroxisomal, endosomal and lysosomal markers, but not with mitochondria. These data reveal a novel connection between a possible peroxisomal/endosomal/lysosomal function and mitochondrial DNA depletion.
Cognitive robotics aims at understanding biological processes, though it has also the potential to improve future robotics systems. Here we show how a biologically inspired model of motor control with neural fields can be augmented with additional components such that it is able to solve a basic robotics task, that of obstacle avoidance. While obstacle avoidance is a well researched area, the focus here is on the extensibility of a biologically inspired framework. This work demonstrates how easily the biological inspired system can be used to adapt to new tasks. This flexibility is thought to be a major hallmark of biological agents.
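As a rough illustration of how a neural-field substrate can express obstacle avoidance, the sketch below relaxes a 1D Amari-style field over discretized heading directions, with excitatory input at the target heading and inhibitory input at the obstacle heading. All parameters and the kernel shape are invented for illustration and are not those of the cited motor-control model:

```python
import math

def relax_field(inputs, w_exc=2.0, sig_exc=4.0, w_inh=0.9, h=-2.0, steps=200, dt=0.1):
    """Euler-relax a 1D Amari-style neural field: local Gaussian excitation
    plus global inhibition. Illustrative parameters only."""
    n = len(inputs)
    u = [0.0] * n
    def f(x):  # sigmoidal output nonlinearity
        return 1.0 / (1.0 + math.exp(-x))
    for _ in range(steps):
        out = [f(x) for x in u]
        total = sum(out)
        new_u = []
        for i in range(n):
            interaction = sum(
                w_exc * math.exp(-((i - j) ** 2) / (2 * sig_exc ** 2)) * out[j]
                for j in range(n)) - w_inh * total
            new_u.append(u[i] + dt * (-u[i] + h + interaction + inputs[i]))
        u = new_u
    return u

n = 90  # heading bins over 360 degrees (4 degrees each)
target, obstacle = 45, 50
inputs = [3.0 * math.exp(-((i - target) ** 2) / 50.0)
          - 4.0 * math.exp(-((i - obstacle) ** 2) / 20.0) for i in range(n)]
u = relax_field(inputs)
```

The field forms its activation peak near the target heading but shifted away from the inhibited obstacle heading, which is the kind of decision behavior the augmented framework exploits.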
Currently, there are a lot of research activities dealing with gamma titanium aluminide (γ-TiAl) alloys as new materials for low pressure turbine (LPT) blades. Even though the scatter in the mechanical properties of such intermetallic alloys is more pronounced than in conventional metallic alloys, stochastic investigations of γ-TiAl alloys are very rare. For this reason, we analyzed the scatter in the static and dynamic mechanical properties of the cast alloy Ti-48Al-2Cr-2Nb. We found that this alloy shows a size effect in strength which is less pronounced than that of brittle materials. A weakest-link approach is extended to describe a scalable size effect under multiaxial stress states and implemented in a post-processing tool for the reliability analysis of real components. The presented approach is a first applicable reliability model for semi-brittle materials. The developed reliability tool was integrated into a multidisciplinary optimization of the geometry of an LPT blade. Some processes of the optimization were distributed across a wide area network so that specialized tools for each discipline could be employed. The optimization results show that it is possible to increase the aerodynamic efficiency and the structural-mechanical reliability at the same time, while ensuring that the blade can be manufactured in an investment casting process.
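The classical weakest-link (Weibull) formulation underlying such a size effect can be sketched for the uniaxial, uniformly stressed case; the paper's extension covers multiaxial stress states and a scalable, semi-brittle size effect, and the values of sigma0, m and v0 below are illustrative, not identified material parameters:

```python
import math

def failure_probability(stress, volume, sigma0=400.0, m=15.0, v0=1.0):
    """Weakest-link (Weibull) failure probability for a uniformly stressed
    volume: P_f = 1 - exp(-(V/V0) * (sigma/sigma0)**m).
    sigma0, m, v0 are illustrative, not parameters for Ti-48Al-2Cr-2Nb."""
    return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)

# The size effect: at equal stress, a larger specimen is more likely to
# contain a critical defect and therefore more likely to fail.
small = failure_probability(stress=350.0, volume=1.0)
large = failure_probability(stress=350.0, volume=10.0)
```

A low Weibull modulus m makes the size effect strong (brittle behavior); the paper's observation of a weaker-than-brittle size effect corresponds to modifying how failure probability scales with volume.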
The development of advanced robotic systems is challenging, as expertise from multiple domains needs to be integrated conceptually and technically. Model-driven engineering promises an efficient and flexible approach to developing robotics applications that copes with this challenge. Domain-specific modeling allows robotics concerns to be described with concepts and notations closer to the respective problem domain. This raises the level of abstraction and results in models that are easier to understand and validate. Furthermore, model-driven engineering makes it possible to increase the level of automation, e.g. through code generation, and to bridge the gap between modeling and implementation. The anticipated results are improved efficiency and quality in the robotics systems engineering process. In this contribution, we survey the available literature on domain-specific modeling and languages that target core robotics concerns. In total, 137 publications were identified that comply with a set of defined criteria which we consider essential for contributions in this field. With the presented survey, we provide an overview of the state of the art of domain-specific modeling approaches in robotics. The surveyed publications are investigated from the perspective of users and developers of model-based approaches in robotics along a set of quantitative and qualitative research questions. The quantitative analysis clearly indicates the rising popularity of applying domain-specific modeling approaches to robotics in the academic community. Beyond this statistical analysis, we map the selected publications to a defined set of robotics subdomains and to typical development phases in robotic systems engineering as a reference for potential users.
Furthermore, we analyze these contributions from a language engineering viewpoint and discuss aspects such as the methods and tools used for their implementation as well as their documentation status, platform integration, typical use cases and the evaluation strategies used for validation of the proposed approaches. Finally, we conclude with recommendations for discussion in the model-driven engineering and robotics community based on the insights gained in this survey.
WiFi-based Long Distance (WiLD) networks have emerged as a promising alternative approach for Internet access in rural areas. However, the MAC layer, which is based on the IEEE 802.11 standard, assumes contiguous stations in a cell and is spatially restricted to a few hundred meters at most. In this work, we summarize efforts by different researchers to use IEEE 802.11 over long distances. In addition, we introduce WiLDToken, our solution for optimizing throughput and fairness and for reducing delay on WiLD links. Compared with previous alternative MAC layer protocols for WiLD, our focus is on optimizing a single link in a multi-radio, multi-channel mesh. We implement our protocol in the ns-3 network simulator and show that WiLDToken is superior to an adapted version of the Distributed Coordination Function (DCF) for different link distances. We find that the throughput on a single link is close to the physical data rate without a major decrease over longer distances.
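One reason stock DCF degrades on long links is that its acknowledgment timing is sized for a few hundred meters; a back-of-the-envelope sketch of the minimum ACK timeout as a function of link distance (the SIFS and ACK durations used here are typical illustrative values, not parameters taken from WiLDToken):

```python
C = 299_792_458.0  # speed of light in m/s

def min_ack_timeout_us(distance_m, sifs_us=10.0, ack_tx_us=44.0, slack_us=1.0):
    """Minimum ACK timeout (microseconds) for a long-distance 802.11 link:
    the data frame must reach the peer and the ACK must travel back before
    the sender declares a loss. Timing constants are illustrative."""
    prop_us = distance_m / C * 1e6  # one-way propagation delay
    return 2 * prop_us + sifs_us + ack_tx_us + slack_us

short_link = min_ack_timeout_us(450)     # a 450 m link
long_link = min_ack_timeout_us(10_300)   # a 10.3 km link
```

At 10.3 km the round-trip propagation delay alone is roughly 69 microseconds, which exceeds the budget a default short-range configuration assumes; token-based MACs sidestep this by replacing contention timing with explicit handover.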
Online media consumption is the main driving force behind the recent growth of the Web. As real-time media in particular is becoming more and more accessible from a wide range of devices with contrasting screen resolutions, processing resources and network connectivity, a necessary requirement is to provide users with a seamless multimedia experience at the best possible quality, and hence to be able to adapt to the specific device and network conditions. This paper introduces a novel approach to adaptive media streaming in the Web. Despite the pervasive pull-based designs based on HTTP, this paper builds upon a Web-native push-based approach in which both the communication and processing overheads are reduced significantly in comparison to the pull-based counterparts. In order to maintain these properties when extending the scheme with adaptation features, server-side monitoring and control need to be developed as a consequence. Such an adaptive push-based media streaming approach is introduced as the main contribution of this work. Moreover, the obtained evaluation results provide evidence that with adaptive push-based media delivery, on the one hand, an equivalent quality of experience can be provided at lower costs than by adopting pull-based media streaming. On the other hand, improved responsiveness in switching between quality levels can be obtained at no extra cost.
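Server-side adaptation in a push-based scheme ultimately reduces to picking a quality level against a throughput estimate; a minimal sketch, assuming a hypothetical bitrate ladder and safety margin (neither is taken from the paper):

```python
def pick_quality(throughput_kbps, ladder_kbps=(400, 800, 1500, 3000, 6000), margin=0.8):
    """Server-side quality selection for push-based streaming: pick the
    highest bitrate that fits within a safety margin of the measured
    client throughput. Ladder and margin are illustrative assumptions."""
    budget = throughput_kbps * margin
    chosen = ladder_kbps[0]  # always push at least the lowest rung
    for rate in ladder_kbps:
        if rate <= budget:
            chosen = rate
    return chosen
```

For a measured throughput of 2000 kbit/s and a 0.8 margin, the budget is 1600 kbit/s, so the 1500 kbit/s rung is pushed. Because the server holds this logic, a quality switch needs no extra client request round-trip, which is where the responsiveness advantage over pull-based schemes comes from.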
An exploratory study: Analysis of Serbian tourism market and identification of major market segments
(2016)
This paper presents the b-it-bots RoboCup@Work team and its current hardware and functional architecture for the KUKA youBot robot. We describe the underlying software framework and the developed capabilities required for operating in industrial environments, including features such as reliable and precise navigation, flexible manipulation and robust object recognition.
We propose an artificial slime mould model (ASMM) inspired by the plasmodium of Physarum polycephalum (P. polycephalum). The ASMM consists of multiple slimes, and each slime shares energy via a tube with neighboring slimes. Outer slimes sense their environment and conform to it. Outer slimes periodically transmit information about their surrounding environment via a contraction wave to inner slimes. Thus, the ASMM shows how slimes can sense a better environment even if that environment is not adjacent to them. The slimes can subsequently move in the direction of an attractant.
Salts and proteins comprise two of the basic molecular components of biological materials. Kosmotropic/chaotropic co-solvation and matching ion-water affinities explain basic ionic effects on protein aggregation observed in simple solutions. However, it is unclear how these theories apply to proteins in complex biological environments and what the underlying ionic binding patterns are. Using the positive ion Ca2+ and the negatively charged membrane protein SNAP25, we studied ion effects on protein oligomerization in solution, in native membranes and in molecular dynamics (MD) simulations. We find that concentration-dependent ion-induced protein oligomerization is a fundamental chemico-physical principle applying not only to soluble but also to membrane-anchored proteins in their native environment. Oligomerization is driven by the interaction of Ca2+ ions with the carboxylate groups of aspartate and glutamate. From low to intermediate concentrations, salt bridges between Ca2+ ions and two or more protein residues lead to increasingly larger oligomers, while at high concentrations oligomers disperse due to overcharging effects. These insights provide a conceptual framework at the interface of physics, chemistry and biology to explain the binding of ions to charged protein surfaces on an atomistic scale, as occurs during protein solubilisation, aggregation and oligomerization in both simple solutions and membrane systems.
Large sections of the German society are able to buy and consume meat on a daily basis due to progress in the agri-food sector. However, the way meat is produced, traded and consumed has increasingly become an issue that is controversially discussed by the media, non-governmental organisations (NGOs), lobbyists, the industry itself and consumers, often with a negative connotation. The meat industry reacts to this: by creating information campaigns and animal welfare initiatives it aims to stress that it takes its corporate social responsibility (CSR) for consumers and animal welfare seriously. However, the industry’s actions are still criticised as insufficient to improve animal welfare levels significantly. Much of this criticism can be observed in online news portals, where articles about the issue are published and commented on by readers. This makes online portals a valuable source of information, which this study taps. It aims to better understand the multifaceted discussions concerning animal welfare initiatives in online portals. By applying qualitative content analysis and web mining techniques to a sample of documents taken from three major German news sites, it can be shown that online discussions refer to various aspects of sustainability and corporate social responsibility. Findings also indicate that the discussions are framed by financial aspects.
3D printing is an efficient method in the field of additive manufacturing. In order to optimize the properties of manufactured parts, it is essential to adapt the curing behavior of the resin systems to the requirements. Thus, the effects of resin composition, e.g. due to different additives such as thickeners and curing agents, on the curing behavior have to be known. As the resin transforms from a liquid into a solid glass, the time-dependent ion viscosity was measured using DEA with flat IDEX sensors. This allows for a sensitive measurement of resin changes, as the ion viscosity changes by two to four decades. The investigated resin systems are based on the monomers styrene and HEMA. To account for the effects of copolymerization in the calculation of the reaction kinetics, it was assumed that the reaction can be considered a homo-polymerization with a reaction order n ≥ 1. The measured ion viscosity curves are then fitted with the solution of the reaction kinetics, the time-dependent degree of conversion (DC-function), for times exceeding the initiation phase, representing the primary curing. The measured ion viscosity curves can be fitted well with the DC-function, and the determined fit parameters distinguish distinctly between the investigated resin compositions.
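The nth-order kinetics behind such a DC-function admits a closed-form solution, sketched below (the rate constant k and order n are illustrative values, not the fit parameters determined in the study):

```python
import math

def degree_of_conversion(t, k=0.01, n=1.5):
    """Degree of conversion alpha(t) for nth-order homo-polymerisation
    kinetics d(alpha)/dt = k * (1 - alpha)**n, the kind of DC-function
    used to fit primary curing. k and n are illustrative values."""
    if n == 1.0:
        return 1.0 - math.exp(-k * t)
    # closed-form solution of the rate equation for n != 1
    return 1.0 - (1.0 + (n - 1.0) * k * t) ** (-1.0 / (n - 1.0))

# conversion rises monotonically towards (but never reaches) 1
alpha = [degree_of_conversion(t) for t in range(0, 1001, 100)]
```

Fitting this curve shape to the measured ion viscosity, after suitable scaling, yields k and n per resin composition, which is how the compositions can be distinguished.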
During the last 50 years, a broad range of visible-light-curing resin-based composites (VLC RBC) has been developed for restorative applications in dentistry. Correspondingly, the technologies of light curing units (LCU) have changed from UV to visible blue light, and from quartz-tungsten-halogen via plasma-arc to LED LCUs, increasing their light intensity significantly. In this thesis, the influence of the curing conditions, in terms of irradiance, exposure time and irradiance distribution of the LCU, on reaction kinetics as well as on the corresponding mechanical and viscoelastic properties was investigated.
Design of an Active Multispectral SWIR Camera System for Skin Detection and Face Verification
(2016)
Biometric face recognition is becoming more frequently used in different application scenarios. However, spoofing attacks with facial disguises are still a serious problem for state-of-the-art face recognition algorithms. This work proposes an approach to face verification based on spectral signatures of material surfaces in the short-wave infrared (SWIR) range. They allow authentic human skin to be distinguished reliably from other materials, independent of skin type. We present the design of an active SWIR imaging system that acquires four-band multispectral image stacks in real time. The system uses pulsed small-band illumination, which allows for fast image acquisition and high spectral resolution and renders it widely independent of ambient light. After extracting the spectral signatures from the acquired images, detected faces can be verified or rejected by classifying the material as "skin" or "no-skin". The approach is extensively evaluated with respect to both acquisition and classification performance. In addition, we present a database containing RGB and multispectral SWIR face images, as well as spectrometer measurements, of a variety of subjects, which is used to evaluate our approach and will be made available to the research community by the time this work is published.
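The verification step can be sketched as a nearest-signature test on illumination-normalized four-band signatures; all signatures and the threshold below are made-up numbers for illustration, not measurements from the presented database:

```python
def normalize(signature):
    """Scale a multispectral signature so the comparison is independent
    of overall illumination intensity."""
    total = sum(signature)
    return [v / total for v in signature]

def classify(signature, reference, threshold=0.05):
    """Label a pixel 'skin' or 'no-skin' by the mean absolute distance
    between its normalized 4-band SWIR signature and a skin reference.
    Signatures and threshold are hypothetical illustration values."""
    a, b = normalize(signature), normalize(reference)
    dist = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return "skin" if dist < threshold else "no-skin"

SKIN_REF = [0.42, 0.30, 0.12, 0.16]    # hypothetical 4-band skin signature
pixel_skin = [0.84, 0.61, 0.25, 0.33]  # same shape, brighter illumination
pixel_mask = [0.30, 0.30, 0.28, 0.30]  # a flat, non-skin-like signature
```

Normalizing first means a brighter but proportionally identical signature still matches the reference, while a spectrally flat material (such as many mask materials) is rejected regardless of its brightness.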
The analysis of Δ9-tetrahydrocannabinol (THC) and its metabolites 11-hydroxy-Δ9-tetrahydrocannabinol (11-OH-THC) and 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THC-COOH) from blood serum is a routine task in forensic toxicology laboratories. For the examination of consumption habits, the concentration of the phase I metabolite THC-COOH is used. Recommendations for the interpretation of analysis values in medical-psychological assessments (regranting of driver’s licenses, Germany) include threshold values for free, unconjugated THC-COOH. Using a fully automated two-step liquid-liquid extraction, THC, 11-OH-THC, and free, unconjugated THC-COOH were extracted from blood serum, silylated with N-methyl-N-(trimethylsilyl)trifluoroacetamide (MSTFA), and analyzed by GC/MS. The automation was carried out by an x-y-z sample robot equipped with modules for shaking, centrifugation, and solvent evaporation. This method was based on a previously developed manual sample preparation method. Validation guidelines of the Society of Toxicological and Forensic Chemistry (GTFCh) were fulfilled for both methods, the focus of this article being the automated one. Limits of detection and quantification were 0.3 and 0.6 μg/L for THC, 0.1 and 0.8 μg/L for 11-OH-THC, and 0.3 and 1.1 μg/L for THC-COOH, when extracting only 0.5 mL of blood serum. Therefore, the required limit of quantification for THC of 1 μg/L in driving-under-the-influence-of-cannabis cases in Germany (and other countries) can be reached, and the method can be employed in that context. Real and external control samples were analyzed, and a round robin test was passed successfully. To date, the method is employed in daily routine at the Institute of Legal Medicine in Giessen, Germany. Automation helps to avoid errors during sample preparation and reduces the workload of the laboratory personnel. Due to its flexibility, the analysis system can also be employed for other liquid-liquid extractions.
To the best of our knowledge, this is the first publication on a comprehensively automated classical liquid-liquid extraction workflow in the field of forensic toxicological analysis.
Domestic Robotics
(2016)
Doubting - Path to Science
(2016)
We examined the cytotoxic effects of chaetocin on clear cell renal cell carcinoma (ccRCC) cells and the possibility of combining the effects of chaetocin with those of cytokine-induced killer (CIK) cells, assayed by MTT assay and FACS analysis. Chaetocin is a thiodioxopiperazine produced by fungi belonging to the chaetomiaceae family. In 2007, it was first reported that chaetocin shows potent and selective ex vivo anti-cancer activity by inducing reactive oxygen species. CIK cells are generated from CD3+/CD56- T lymphocytes with a double-negative CD4-/CD8- phenotype that are isolated from human blood. The addition of distinct interleukins and antibodies results in the generation of CIK cells that are able to specifically target and destroy renal carcinoma cells. The results of this research indicate that the anti-ccRCC activity of chaetocin is weak and does not show a high degree of selectivity towards clear cell renal cell carcinoma cells. Although the CIK cells show a high degree of selective anti-ccRCC activity, this effect could not be improved by the addition of chaetocin. Chaetocin therefore seems to be no suitable agent for specifically targeting ccRCC cells or for combination therapy with CIK cells in renal cancer.
Effects of Workspace Awareness and Territoriality in Environments with Large, Shared Displays
(2016)
Synchronous cooperative work of multiple collaborators in large, high-resolution display systems involves psychological phenomena such as workspace awareness and human territoriality. These phenomena, and the interplay between them, can have a significant impact on human-human and human-environment interaction. In a non-digital environment, humans rely on their own physical abilities, utilities, and social protocols to control these phenomena (e.g. closing one's eyes or using earplugs to reduce workspace awareness; rotating oneself towards collaborators to increase it). Digital environments, on the other hand, provide the possibility to ease, automate, and unify control processes, thus relieving users of that burden. Yet we first have to understand what effects workspace awareness and territoriality have within a collaborative environment. The aim of this doctoral thesis is to investigate the effects of workspace awareness and territoriality on users and interaction processes in mixed-focus scenarios across various collaborative settings.
Hydrogen sulfide (H2S) is a well-known environmental chemical threat with an unpleasant smell of rotten eggs. Aside from the established toxic effects of high-dose H2S, research over the past decade revealed that cells endogenously produce small amounts of H2S with physiological functions. H2S has therefore been classified as a gasotransmitter. A major challenge for cells and tissues is the maintenance of low physiological concentrations of H2S in order to prevent potential toxicity. Epithelia of the respiratory and gastrointestinal tract are especially faced with this problem, since these barriers are predominantly exposed to exogenous H2S from environmental sources or sulfur-metabolising microbiota. In this paper, we review the cellular mechanisms by which epithelial cells maintain physiological, endogenous H2S concentrations. Furthermore, we suggest a concept by which epithelia use their electrolyte and liquid transport machinery as defence mechanisms in order to eliminate exogenous sources for potentially harmful H2S concentrations.
In this paper, we introduce the international program erp4students as a general example of how to successfully prepare university students for the world of work without giving up the basic principle of higher education, i.e., to exclusively provide sustainable education. We start by introducing the basic concept and design of the program and provide information on the demographic development over the past decade and the implemented quality assurance mechanisms. Subsequently, the scope and design of, and the insights achieved so far from, the Learning Culture Survey are outlined. On the basis of the results found, we finally discuss how erp4students can deal with possible culture-specific issues that may emerge once the program becomes available to learners in the Asian context.
The work at hand outlines a recording setup for capturing hand and finger movements of musicians. The focus is on a series of baseline experiments on the detectability of coloured markers under different lighting conditions. With the goal of capturing and recording hand and finger movements of musicians in mind, requirements for such a system and existing approaches are analysed and compared. The results of the experiments and the analysis of related work show that the envisioned setup is suited for the expected scenario.
Beta-ketothiolase deficiency, also known as mitochondrial acetoacetyl-CoA thiolase (T2) deficiency, is an autosomal recessive disease caused by mutations in the acetyl-CoA acetyltransferase 1 (ACAT1) gene. A German T2-deficient patient who developed a severe ketoacidotic episode at the age of 11 months was revealed to be a compound heterozygote for a previously reported null mutation, c.472A>G (p.N158D), and a novel mutation, c.949G>A (p.D317N), in ACAT1. The c.949G>A mutation was suspected to cause aberrant splicing, as it is located within an exonic splicing enhancer sequence (c.947 CTGACGC) that is a potential binding site for serine/arginine-rich splicing factor 1. A mutation in this sequence, c.951C>T, results in exon 10 skipping. A minigene construct was synthesized that included exon 9, truncated intron 9, exon 10, truncated intron 10, and exon 11, and the splicing of this minigene revealed that the c.949G>A mutant construct caused exon 10 skipping in a proportion of the transcripts. Furthermore, an additional substitution at the first nucleotide of exon 10 (c.941G>C) abolished the effect of the c.949G>A mutation. Transient expression analysis of the c.949G>A mutant cDNA revealed no residual T2 activity in the mutated D317N enzyme. Therefore, c.949G>A (D317N) is a pathogenic missense mutation that diminishes the effect of an exonic splicing enhancer and causes exon 10 skipping. The present study demonstrates that a missense mutation, or even a synonymous substitution, may disrupt enzyme function through interference with splicing.
WiFi-based Long Distance (WiLD) networks have emerged as a promising alternative approach for providing Internet in rural areas. An important factor in the planning of these wireless networks is estimating the path loss. In this work, we present various propagation models we found suitable for point-to-point (P2P) operation in the WiFi frequency bands. We conducted outdoor experiments with commercial off-the-shelf (COTS) hardware in our testbed, which is made up of 7 different long-distance links ranging from 450 m to 10.3 km and a mobile measurement station. We found that for short links with omni-directional antennas, ground reflection is a measurable phenomenon. For longer links, we show that either FSPL or the Longley-Rice model provides accurate results for certain links. We conclude that a good site survey is needed to exclude influences not included in the propagation models.
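The FSPL baseline mentioned above is a one-line formula; a sketch using the standard dB form of the Friis free-space equation (the channel frequency chosen below is an assumption for illustration):

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB for distance in km and frequency in MHz,
    using the standard 32.44 dB constant form of the Friis equation."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

loss_short = fspl_db(0.45, 2437)  # a 450 m link on a 2.4 GHz channel
loss_long = fspl_db(10.3, 2437)   # a 10.3 km link on the same channel
```

Doubling the distance adds about 6 dB of loss, so a link-budget calculation over such a model is a natural first step in planning, with site-specific effects (terrain, ground reflection) added on top as the measurements above indicate.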
SDN and WMN have evolved into sophisticated technologies used in a variety of applications. However, a combined approach called wmSDN has not been widely addressed by the research community. Our idea in this field consists of WiFi-based point-to-point links managed by the OpenFlow protocol. We investigate two different issues regarding this idea. First, which WiFi operational mode is suitable in an OpenFlow-managed broadcast domain? Second, does performance decrease compared with other routing or switching principles? To answer these questions, we set up a real-world testbed and a suitable simulation environment. Unlike previous work, we show that it is possible to use WiFi links without conducting MAC address rewriting at each hop by utilizing the 4-address mode.
Background: Falls are common in older adults and can result in serious injuries. Due to demographic changes, falls and related healthcare costs are likely to increase over the next years. Participation and motivation of older adults in fall prevention measures remain a challenge. The iStoppFalls project developed an information and communication technology (ICT)-based system for older adults to use at home in order to reduce common fall risk factors such as impaired balance and muscle weakness. The system aims at increasing older adults’ motivation to participate in ICT-based fall prevention measures. This article reports on usability, user-experience and user-acceptance aspects affecting the use of the iStoppFalls system by older adults.
Methods: In the course of a 16-week international multicenter study, 153 community-dwelling older adults aged 65+ participated in the iStoppFalls randomized controlled trial, of which half used the system in their home to exercise and assess their risk of falling. During the study, 60 participants completed questionnaires regarding the usability, user experience and user acceptance of the iStoppFalls system. Usability was measured with the System Usability Scale (SUS). For user experience the Physical Activity Enjoyment Scale (PACES) was applied. User acceptance was assessed with the Dynamic Acceptance Model for the Re-evaluation of Technologies (DART). To collect more detailed data on usability, user experience and user acceptance, additional qualitative interviews and observations were conducted with participants.
Results: Participants evaluated the usability of the system with an overall score of 62 (Standard Deviation, SD 15.58) out of 100, which suggests good usability. Most users enjoyed the iStoppFalls games and assessments, as shown by the overall PACES score of 31 (SD 8.03). With a score of 0.87 (SD 0.26), user acceptance results showed that participants accepted the iStoppFalls system for use in their own home. Interview data suggested that certain factors such as motivation, complexity or graphical design differed by gender and age.
Conclusions: The results suggest that the iStoppFalls system has good usability, user experience and user acceptance. It will be important to take these along with factors such as motivation, gender and age into consideration when designing and further developing ICT-based fall prevention systems.
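The 0-100 usability score reported above is a standard System Usability Scale value. As a reminder of how such a score is derived from the ten 1-to-5 Likert items (this is the standard SUS scoring rule, not anything specific to this study):

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0..100).

    `responses` holds the ten item ratings on a 1..5 Likert scale.
    Odd-numbered items are positively worded (contribute rating - 1),
    even-numbered items are negatively worded (contribute 5 - rating);
    the summed contributions are scaled by 2.5 onto the 0..100 range.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5
```

A neutral answer (3) on every item yields a score of 50, so the study's mean of 62 lies above the neutral midpoint of the scale.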
Fault-Channel Watermarks
(2016)
In this paper, several blocking techniques are applied to matrices that do not have a strongly blocked structure. The aim is to make efficient use of vectorization on current CPUs, even for matrices without an explicit block structure of the nonzero elements. Different approaches are known for finding fixed- or variable-sized blocks of nonzero elements in a matrix. We present a new matrix format for 2D rectangular blocks of variable size, allowing fill-in of explicit zero values per block up to a user-definable threshold, and give a heuristic to detect such 2D blocks in a sparse matrix. The performance of sparse matrix-vector multiplication (SpMV) is measured and compared for the chosen block formats. The results show that the benefit of blocking formats depends, as expected, on the structure of the matrix, and that variable-sized block formats can have advantages over fixed-size formats.
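The variable-size 2D block format itself is specific to the paper, but the underlying idea, storing small dense blocks (with explicit zero fill-in) so that each block is multiplied with contiguous, vectorizable operations, can be sketched with a fixed-size block-CSR SpMV. This is a simplified illustration of the general technique, not the authors' format; all names are made up for the example:

```python
import numpy as np

def bcsr_spmv(n_brows, block_cols, block_ptr, blocks, x, r=2, c=2):
    """SpMV y = A @ x for a matrix stored in fixed-size r x c block CSR.

    blocks[k]     -- dense r x c block (fill-in zeros stored explicitly)
    block_cols[k] -- block-column index of block k
    block_ptr     -- offsets of each block row into blocks/block_cols
    """
    y = np.zeros(n_brows * r)
    for bi in range(n_brows):
        for k in range(block_ptr[bi], block_ptr[bi + 1]):
            bj = block_cols[k]
            # Small dense multiply on contiguous data; in a compiled
            # language this inner kernel maps directly onto SIMD units.
            y[bi * r:(bi + 1) * r] += blocks[k] @ x[bj * c:(bj + 1) * c]
    return y
```

Storing the explicit zeros (e.g. the off-diagonal zeros of a nearly diagonal block) costs memory and flops but keeps the inner kernel branch-free and regular, which is exactly the trade-off a fill-in threshold controls.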
In qualitative interviews, we examine attitudes towards driverless cars in order to investigate new mobility services and explore the impact of such services on everyday mobility. We identified three main issues that we would like to discuss in the workshop: (I) designing beyond a driver-centric approach; (II) developing mobility services for cars which drive themselves; and (III) exploring self-driving practices.
The Eighth International Conference on Future Computational Technologies and Applications (FUTURE COMPUTING 2016), held between March 20-24, 2016 in Rome, Italy, continued a series of events targeting advanced computational paradigms and their applications. The goal was to cover (i) advanced research on computational techniques that support human-like decision making, and (ii) applications in various domains. New developments have led to special computational facets such as mechanism-oriented computing, large-scale computing and technology-oriented computing, which are expected to play an important role in cloud systems, on-demand services, autonomic systems, and pervasive applications and services.
Helping Johnny to Analyze Malware: A Usability-Optimized Decompiler and Malware Analysis User Study
(2016)
Reliable and regionally differentiated power forecasts are required to guarantee an efficient and economic energy transition towards renewable energies. Alongside other renewable energy technologies such as wind turbines, photovoltaic systems are an essential component of this transition, being cost-efficient and simple to install. Reliable power forecasts are, however, required for the grid integration of photovoltaic systems, and these in turn require, among other data, high-resolution spatio-temporal global irradiance data. The generation of robust, reviewed global irradiance data is therefore an essential contribution to the energy transition.
Large bone defects require fabricated bone constructs that consist of three main components: an artificial extracellular matrix scaffold, stem cells with the potential to differentiate into osteoblasts, and bioactive substances, such as osteoinductive growth factors to direct the growth and differentiation of cells toward osteogenic lineage within the scaffold.
Hydrogen sulfide contributes to hypoxic inhibition of airway transepithelial sodium absorption
(2016)
Falls and their consequences are arguably the most important events triggering the transition from independent living to institutional care for older adults. Information and communication technology (ICT)-based support for fall prevention and fall risk assessment under the control of the user has tremendous potential to prevent falls over time and to reduce the associated harm and costs. Our research uses participatory design and a persuasive health approach to allow for seamless integration of an ICT-based fall prevention system into older adults’ everyday life. Based on a 6-month field study with 12 participants, we present qualitative results regarding system use and provide insights into attitudes and practices of older adults concerning fall prevention and ICT-supported self-management of health. Our study demonstrates how such a system can foster positive aspects of embodiment and health literacy through continuous monitoring of personal results, improved technical confidence, and quality of life. Implications for designing similar systems are provided.
Brentuximab vedotin (SGN-35) is an antibody–drug conjugate with high selectivity against CD30+ cell lines and more than 300-fold less activity against antigen-negative cells. In recent years, the results of many in vitro and in vivo studies have led to the fast approval of this drug for treating lymphoma patients. Another innovative method to treat tumor cells, including lymphoma cells, is the use of cytokine-induced killer (CIK) cells, which have also been approved and proven to be a safe treatment with only minor adverse events. In this study, a possible additive effect of combining SGN-35 with CIK cells was investigated. The combined treatment significantly reduced the viability of CD30+ cell lines in vitro. Additionally, the number of lymphoma cells was significantly reduced when exposed to CIK cells as well as when exposed to SGN-35. A significant negative effect of SGN-35 on the function of CIK cells could be excluded. These results suggest that SGN-35 and CIK cells in combination might achieve better results in an in vitro setting than either treatment alone. Further investigations in in vivo models must be conducted to obtain a better understanding of the exact mechanisms of both treatments when applied in combination.
The mechanical properties of plastic components, especially those made of semi-crystalline polymers, are considerably influenced by the process conditions. The degree of crystallization influences thermal and mechanical properties. Even more important is the orientation of molecules due to stretching of the polymer melt, which results in anisotropic material properties. To date, none of these effects is considered in the simulation models of blow-molded parts.
Knowledge-Based Instrumentation and Control for Competitive Industry-Inspired Robotic Domains
(2016)
Job-related migration has been fostered across Europe, balancing unemployment in one country with the demand for employees in others. However, the numbers of early school leavers and university dropouts have increased significantly in the host countries. We propose a higher measure of cultural sensitivity in education in order to prevent frustration. The Learning Culture Survey investigates learners’ expectations towards, and perceptions of, education at an international level, with the aim of making culture in the context of education better understandable. After a brief introduction, we summarize the steps taken during the past seven years and the results found. Subsequently, we introduce a method for determining conflict potential, which is based on the understanding of culture as the degree to which people within a society accept deviations from the usual. We close by demonstrating the usefulness of the data and insights from our Learning Culture Survey in practical scenarios.
The Participation Act, introduced in the Netherlands in 2015, puts into practice the idea that every individual has to make a contribution in a participatory society. The Act includes aspects of income support, compulsory activities in return for benefits, and labour market reintegration. Drawing on 45 interviews, we provide insights into interactions between the individual financial and social situation, an individual’s position in society, and reintegration activities. The narratives show the fundamental need for individual freedom and societal meaning, recognition, and appreciation, as well as the complex circumstances in which social assistance recipients make decisions. Conflicts between those needs and the Act lead to the question of how personal and societal objectives can be reconciled.
This paper presents methods for the reduction and compression of meteorological data for web-based wind flow visualizations, which are tailored to the flow visualization technique. Flow data sets represent a large amount of data and are therefore not well suited for mobile networks with low data throughput rates and high latency. Using the mechanisms introduced in this paper, an efficient transfer of thinned out and compressed data can be achieved, while keeping the accuracy of the visualized information almost at the same quality level as for the original data.
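The abstract does not detail the paper's specific thinning and compression mechanisms; one common building block in such pipelines is lossy quantization of the wind components to 8-bit values before transfer, trading a small, bounded accuracy loss for a 4-8x size reduction per component. A minimal sketch (the function names and the ±50 m/s value range are illustrative assumptions, not taken from the paper):

```python
import numpy as np

VMAX = 50.0  # assumed symmetric wind-speed range in m/s

def quantize_wind(u, v, vmax=VMAX):
    """Map float wind components in [-vmax, vmax] onto uint8 (0..255).

    Each grid point shrinks from two float64 values (16 bytes) to
    two bytes; the worst-case rounding error is vmax / 255 per component.
    """
    scale = 255.0 / (2.0 * vmax)
    qu = np.clip(np.round((u + vmax) * scale), 0, 255).astype(np.uint8)
    qv = np.clip(np.round((v + vmax) * scale), 0, 255).astype(np.uint8)
    return qu, qv

def dequantize(q, vmax=VMAX):
    """Recover approximate wind components on the client side."""
    return q.astype(np.float64) * (2.0 * vmax / 255.0) - vmax
```

For a visualization drawn at screen resolution, a per-component error of at most vmax / 255 (about 0.2 m/s here) is typically below what the rendered flow field can display, which is why such lossy steps can keep the visualized result close to the original data.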