Refine
Departments, institutes and facilities
- Fachbereich Informatik (54)
- Institut für funktionale Gen-Analytik (IFGA) (33)
- Fachbereich Angewandte Naturwissenschaften (25)
- Institute of Visual Computing (IVC) (18)
- Institut für Cyber Security & Privacy (ICSP) (11)
- Fachbereich Ingenieurwissenschaften und Kommunikation (9)
- Fachbereich Wirtschaftswissenschaften (8)
- Institut für Sicherheitsforschung (ISF) (4)
- Institut für Verbraucherinformatik (IVI) (4)
- Institut für Detektionstechnologien (IDT) (2)
Document Type
- Article (79)
- Conference Object (60)
- Part of a Book (16)
- Report (5)
- Book (monograph, edited volume) (4)
- Lecture (4)
- Doctoral Thesis (2)
- Master's Thesis (2)
- Conference Proceedings (1)
- Contribution to a Periodical (1)
Year of publication
- 2012 (177)
Language
- English (177)
Keywords
- ENaC (3)
- ISM: molecules (3)
- 3D-Scanner (2)
- ARRs (2)
- Adaptation (2)
- Bag of Features (2)
- CD21 (2)
- FDI (2)
- Hybrid systems (2)
- ISM: kinematics (2)
The ability to breathe air represents a fundamental step in vertebrate evolution that was accompanied by several anatomical and physiological adaptations. The morphology of the air-blood barrier is highly conserved within air-breathing vertebrates. It is formed by three different plies, which are represented by the alveolar epithelium, the basal lamina, and the endothelial layer. Besides these conserved morphological elements, another common feature of vertebrate lungs is that they contain a certain amount of fluid that covers the alveolar epithelium. The volume and composition of the alveolar fluid are regulated by transepithelial ion transport mechanisms expressed in alveolar epithelial cells. These transport mechanisms have been reviewed extensively. Therefore, the present review focuses on the properties and functional significance of the alveolar fluid. How does the fluid enter the alveoli? What is the fate of the fluid in the alveoli? What is the function of the alveolar fluid in the lungs? The review highlights the importance of the alveolar fluid, its volume and its composition. Maintenance of the fluid volume and composition within certain limits is critical to facilitate gas exchange. We propose that the alveolar fluid is an essential element of the air-blood barrier. Therefore, it is appropriate to refer to this barrier as being formed by four plies, namely (1) the thin fluid layer covering the apical membrane of the epithelial cells, (2) the epithelial cell layer, (3) the basal membrane, and (4) the endothelial cells.
The vectorial transport of Na+ across epithelia is crucial for the maintenance of Na+ and water homeostasis in organs such as the kidneys, lung, or intestine. Dysregulated Na+ transport processes are associated with various human diseases such as hypertension, the salt-wasting syndrome pseudohypoaldosteronism type 1, pulmonary edema, cystic fibrosis, or intestinal disorders, which indicate that a precise regulation of epithelial Na+ transport is essential. Novel regulatory signaling molecules are gasotransmitters. There are currently three known gasotransmitters: nitric oxide (NO), carbon monoxide (CO), and hydrogen sulfide (H2S). These molecules are endogenously produced in mammalian cells by specific enzymes and have been shown to regulate various physiological processes. There is a growing body of evidence which indicates that gasotransmitters may also regulate Na+ transport across epithelia. This review will summarize the available data concerning NO, CO, and H2S dependent regulation of epithelial Na+ transport processes and will discuss whether or not these mediators can be considered as true physiological regulators of epithelial Na+ transport biology.
Wesch D, Althaus M, Miranda P, Cruz-Muros I, Fronius M, Gonzalez-Hernandez T, Clauss WG, de la Rosa DA, Giraldez T. Differential N termini in epithelial Na⁺ channel delta-subunit isoforms modulate channel trafficking to the membrane. Am J Physiol Cell Physiol 302: C868-C879, 2012. First published December 7, 2011; doi: 10.1152/ajpcell.00255.2011.
The epithelial Na⁺ channel (ENaC) is a heteromultimeric ion channel that plays a key role in Na⁺ reabsorption across tight epithelia. The canonical ENaC is formed by three analogous subunits, alpha, beta, and gamma. A fourth ENaC subunit, named delta, is expressed in the nervous system of primates, where its role is unknown. The human delta-ENaC gene generates at least two splice isoforms, delta(1) and delta(2), differing in the N-terminal sequence. Neurons in diverse areas of the human and monkey brain differentially express either delta(1) or delta(2), with few cells coexpressing both isoforms, which suggests that they may play specific physiological roles. Here we show that heterologous expression of delta(1) in Xenopus oocytes and HEK293 cells produces higher current levels than delta(2). Patch-clamp experiments showed no differences in single channel current magnitude and open probability between isoforms. Steady-state plasma membrane abundance accounts for the dissimilarity in macroscopic current levels. Differential trafficking between isoforms is independent of beta- and gamma-subunits, PY-motif-mediated endocytosis, or the presence of additional lysine residues in the delta(2)-N terminus. Analysis of the delta(2)-N terminus identified two sequences that independently reduce channel abundance in the plasma membrane. The higher abundance of delta(1) is consistent with an increased insertion rate into the membrane, since the endocytosis rates of both isoforms are indistinguishable. Finally, we conclude that delta-ENaC undergoes dynamin-independent endocytosis as opposed to alpha beta gamma-channels.
The gasotransmitter hydrogen sulphide decreases Na⁺ transport across pulmonary epithelial cells
(2012)
BACKGROUND AND PURPOSE The transepithelial absorption of Na⁺ in the lungs is crucial for the maintenance of the volume and composition of the epithelial lining fluid. The regulation of Na⁺ transport is essential, because hypo- or hyperabsorption of Na⁺ is associated with lung diseases such as pulmonary oedema or cystic fibrosis. This study investigated the effects of the gaseous signalling molecule hydrogen sulphide (H₂S) on Na⁺ absorption across pulmonary epithelial cells. EXPERIMENTAL APPROACH Ion transport processes were electrophysiologically assessed in Ussing chambers on H441 cells grown on permeable supports at the air/liquid interface and on native tracheal preparations of pigs and mice. The effects of H₂S were further investigated on Na⁺ channels expressed in Xenopus oocytes and on Na⁺/K⁺-ATPase activity in vitro. Membrane abundance of the Na⁺/K⁺-ATPase was determined by surface biotinylation and Western blot. Cellular ATP concentrations were measured colorimetrically, and cytosolic Ca²⁺ concentrations were measured with Fura-2. KEY RESULTS H₂S rapidly and reversibly inhibited Na⁺ transport in all the models employed. H₂S had no effect on Na⁺ channels, whereas it decreased Na⁺/K⁺-ATPase currents. H₂S did not affect the membrane abundance of the Na⁺/K⁺-ATPase, its metabolic or calcium-dependent regulation, or its direct activity. However, H₂S inhibited basolateral calcium-dependent K⁺ channels, which consequently decreased Na⁺ absorption by H441 monolayers. CONCLUSIONS AND IMPLICATIONS H₂S impairs pulmonary transepithelial Na⁺ absorption, mainly by inhibiting basolateral Ca²⁺-dependent K⁺ channels. These data suggest that the H₂S signalling system might represent a novel pharmacological target for modifying pulmonary transepithelial Na⁺ transport.
Malware is responsible for massive economic damage. Being the preferred tool for digital crime, botnets are becoming increasingly sophisticated, using ever more resilient, distributed infrastructures based on peer-to-peer (P2P) protocols. At the same time, current techniques for investigating malware and botnets on a technical level are time-consuming and highly complex. Fraunhofer FKIE is addressing this problem, researching new ways of intelligent process automation and information management for malware analysis in order to minimize the time needed to investigate these threats.
Today’s computer systems face a vast array of severe threats posed by automated attacks performed by malicious software as well as manual attacks by individual humans. These attacks not only differ in their technical implementation but may also be location-dependent. Consequently, it is necessary to join the information from heterogeneous and distributed attack sensors in order to acquire comprehensive information on ongoing cyber attacks.
XML Encryption and XML Signature are fundamental security standards forming the core of many applications that need to process XML-based data. Due to the increased usage of XML in distributed systems and platforms such as SOA and Cloud settings, the demand for robust and effective security mechanisms has increased as well. Recent research, however, discovered substantial vulnerabilities in these standards as well as in the vast majority of the available implementations. Among them, the so-called XML Signature Wrapping attack belongs to the most relevant ones. With the many possible instances of this attack type, it is feasible to annul security systems relying on XML Signature and to gain access to protected resources, as has been successfully demonstrated lately for various Cloud infrastructures and services. This paper contributes a comprehensive approach to robust and effective XML Signatures for SOAP-based Web Services. An architecture is proposed which integrates the required enhancements to ensure fail-safe and robust signature generation and verification. Following this architecture, a hardened XML Signature library has been implemented. The obtained evaluation results show that the developed concept and library provide the targeted robustness against all kinds of known XML Signature Wrapping attacks. Furthermore, the empirical results underline that these security merits come at low efficiency and performance costs while remaining compliant with the underlying standards.
Development and Validation of a Rapid and Reliable Method for TPMT Genotyping using real-time PCR
(2012)
The documentation requirements for data published in long-term archives have grown significantly over the last decade. At WDCC the data publishing process is assisted by “Atarrabi”, a web-based workflow system for reviewing and editing metadata by the data authors and the publication agent. The system ensures high metadata quality for long-term use of the data with persistent identifiers (DOI/URN). These well-defined references (DOIs) allow credit to be given properly to the data producers in any publication.
In this paper we summarize our research on international educational contexts and transfer the results to the context of urban life-long learning. We show that collecting and providing relevant data can help instructors as well as learners to raise their awareness of contextual differences and to develop a higher level of acceptance of such differences, and thus, in the long term, avoid frustration in educational processes and reduce dropout rates.
In the context of Internet-based e-Learning, including an international audience is a logical consequence. However, due to uncertainty regarding foreign learners, e-Learning programs are often limited to local or national participants. Understanding the different expectations of learners regarding instructor support is one step towards enabling providers of educational services to tailor educational programs that fit the requirements of an international audience. We asked university students in five countries about their expectations of instructor support and found major differences between the investigated countries.
For learners, feedback can be a strong motivator, but if it fails its purpose, it can equally be a strong reason for frustration and dropouts. Do we have to change our locally implemented feedback strategies when adapting learning contents from national to international settings? In our study, we investigated learners’ understanding and preferences regarding feedback in higher-education scenarios across five different national contexts: Bulgaria, Germany, South Korea, Turkey, and Ukraine.
This presentation shows that students in different cultural contexts have different perceptions of time management and work organization. Particularly in group work scenarios, such differences can have a frustrating impact on students from other cultural contexts because, for example, expectations are not met. Being aware of such differences between the learners in a culturally heterogeneous educational scenario, educators can prevent frustration by preparing their students and providing more specific instructions.
Education is widely seen as an important means of addressing both national and international problems, such as political or religious extremism, poverty, and hunger. If open educational resources (OERs) are to help overcome the educational gap, localization is one of the major issues we need to deal with. Educators as well as learners need support in determining adaptation needs. This paper provides a list of possible influence factors on educational scenarios, which are defined as context metadata. In the given form, the list is to be understood as an addendum to the paper entitled ‘Open Educational Resources: Education for the World?’ by Thomas Richter and Maggie McPherson, published in volume 3, issue 2 of the journal Distance Education in 2012.
Education is widely seen as an important means of addressing both national and international problems, such as political or religious extremism, poverty, and hunger. However, if developing countries are to become societies that can compete properly with Western industrialized countries, not only is a fundamental shift in thinking with regard to the value of education and more/better provision of teaching required, but strong support from other countries is needed as well. This article explores questions such as whether Western policymakers can avoid a repetition of some of the failures of the past few decades in terms of providing foreign aid; how educators and providers of educational scenarios and learning contents can foster and manage the creation of a worldwide knowledge society; and in particular, if the provision of open educational resources (OER) can realistically overcome the educational gap and foster educational justice.
Disorders of the degradation of branched chain amino acids: what is new in clinics and laboratories?
(2012)
Molybdenum cofactor deficiency (MoCD) is a rare inherited metabolic disorder characterized by severe and progressive neurological damage mainly caused by the loss of sulfite oxidase activity. Elevated urinary levels of sulfite, thiosulfate, and S-sulfocysteine (SSC) are hallmarks in the diagnosis of MoCD and sulfite oxidase deficiency (SOD). Recently, a first successful treatment of a human MoCD type A patient based on a substitution therapy with the molybdenum cofactor precursor cPMP has been reported, resulting in nearly complete normalization of MoCD biomarkers. Knowing the rapid progression of the disease symptoms in nontreated patients, an early diagnosis of MoCD as well as a sensitive method to monitor daily changes in SSC levels, a key marker of sulfite toxicity, is crucial for treatment outcome. Here, we describe a fast and sensitive method for the analysis of SSC in human urine samples using high performance liquid chromatography (HPLC). The analysis is based on precolumn derivatization with O-phthaldialdehyde (OPA) and separation on a C18 reverse phase column coupled to UV detection. The method was extended to human serum analysis and no interference with endogenous amino acids was found. Finally, SSC values from 45 pediatric urine, 75 adult urine, and 24 serum samples from control individuals as well as MoCD patients are reported. Our method represents a cost-effective technique for routine diagnosis of MoCD and SOD, and can be used also to monitor treatment efficiency in those sulfite toxicity disorders on a daily basis.
The osmolality of nonionic, iodinated contrast agents as an important factor for renal safety
(2012)
Software testing in a web services environment faces different challenges compared with testing in traditional software environments. Regression testing activities are triggered by software changes or evolution. In web services, evolution is not a choice for service clients; they always have to use the current, updated version of the software. In addition, test execution and invocation are expensive in web services, so algorithms that optimize test case generation and execution are vital. In this environment, we proposed several approaches for test case selection in web services regression testing. Testing in this new environment should evolve to become part of the service contract. Service providers should provide data or usage sessions that help service clients reduce testing expenses by optimizing the selected and executed test cases.
Germany
(2012)
Approximate clone detection is the process of identifying similar process fragments in business process model collections. The tool presented in this paper can efficiently cluster approximate clones in large process model repositories. Once a repository is clustered, users can filter and browse the clusters using different filtering parameters. Our tool can also visualize clusters in the 2D space, allowing a better understanding of clusters and their member fragments. This demonstration will be useful for researchers and practitioners working on large process model repositories, where process standardization is a critical task for increasing the consistency and reducing the complexity of the repository.
In a research project funded by the German Research Foundation, meteorologists, data publication experts, and computer scientists optimised the publication process of meteorological data and developed software that supports metadata review. The project group placed particular emphasis on scientific and technical quality assurance of primary data and metadata. At the end, the software automatically registers a Digital Object Identifier at DataCite. The software has been successfully integrated into the infrastructure of the World Data Center for Climate, but a key goal was to make the results applicable to data publication processes in other sciences as well.
YAWL User Group
(2012)
One of the most common problems in Regenerative Medicine is the regeneration of damaged bone, with the aim of repairing or replacing lost or damaged bone tissue by stimulating the natural regenerative process. Particularly in the fields of orthopedic, plastic, reconstructive, maxillofacial and craniofacial surgery there is a need for successful methods to restore bone. From a regenerative point of view, two different bone replacement problems can be distinguished: large bone defects and small bone defects. Currently, no perfect system exists for the treatment of large bone defects.
The objective of this thesis is to implement a computer-game-based motivation system for maximal strength testing on the Biodex System 3 Isokinetic Dynamometer. The prototype game has been designed to improve the peak torque produced in an isometric knee extensor strength test. An extensive analysis is performed on a torque data set from a previous study. The torque responses for five-second-long maximal voluntary contractions of the knee extensor are analyzed to understand the torque response characteristics of different subjects. The parameters identified in the data analysis are used in the implementation of the 'Shark and School of Fish' game. The behavior of the game for different torque responses is analyzed on a different torque data set from the previous study. The evaluation shows that the game rewards and motivates continuously over a repetition to reach the peak torque value. It also shows that the game rewards the user more if he overcomes a baseline torque value within the first second and then gradually increases the torque to reach peak torque.
The criteria for assessing the quality of rubber materials are the polymer or copolymer composition and the additives. These additives include plasticizers, extender oils, carbon black, inorganic fillers, antioxidants, heat and light stabilizers, processing aids, cross-linking agents, accelerators, retarders, adhesives, pigments, smoke and flame retardants, and others. Determination of additives in polymers or copolymers generally requires the extraction of these substances from the matrix as a first step, which can be challenging, and the subsequent analysis of the extracted additives by gas chromatography (GC), GC–mass spectrometry (MS), high performance liquid chromatography (HPLC), HPLC–MS, capillary electrophoresis, thin-layer chromatography, and other analytical techniques. In the present work, nitrile rubber materials were studied using direct analytical flash pyrolysis hyphenated to GC and electrospray ionization MS in both scan and selected ion monitoring modes to demonstrate that this technique is a good tool to identify the organic additives in nitrile rubber.
In this paper, various enhanced sales forecast methodologies and models for the automobile market are presented. The methods used deliver highly accurate predictions while maintaining the ability to explain the underlying model. The representation of the economic training data is discussed, as well as its effects on the newly registered automobiles to be predicted. The methodology mainly consists of time series analysis and classical Data Mining algorithms, whereas the data is composed of absolute and/or relative market-specific exogenous parameters on a yearly, quarterly, or monthly basis. It can be concluded that the monthly forecasts in particular were improved by this enhanced methodology using absolute, normalized exogenous parameters. Decision Trees are considered the most suitable method in this case, being both accurate and explicable. The German and the US-American automobile markets are used to evaluate the forecast models.
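The abstract names Decision Trees as the preferred forecasting method because they are both accurate and explicable. As an illustration of that underlying principle only — not the authors' actual model — the following Python sketch fits a depth-1 regression tree (a decision stump) on a single exogenous parameter; deeper trees apply the same least-squares split criterion recursively. Function names and data are hypothetical:

```python
def fit_stump(xs, ys):
    """Fit a depth-1 regression tree (decision stump) by least squares.

    Tries every threshold midway between consecutive sorted x values,
    predicts the mean of each side, and keeps the split with the
    smallest sum of squared errors. The resulting rule ("if x <= t
    predict a, else predict b") is trivially explicable, which is the
    property the abstract highlights for Decision Trees.
    """
    pairs = sorted(zip(xs, ys))
    best = None  # (sse, threshold, left_mean, right_mean)
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2.0
        left = [y for x, y in pairs if x <= thr]
        right = [y for x, y in pairs if x > thr]
        ml = sum(left) / len(left)
        mr = sum(right) / len(right)
        sse = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if best is None or sse < best[0]:
            best = (sse, thr, ml, mr)
    _, thr, ml, mr = best
    return lambda x: ml if x <= thr else mr

# Hypothetical example: registrations split cleanly on one indicator.
predict = fit_stump([1.0, 2.0, 3.0, 10.0, 11.0, 12.0],
                    [5.0, 5.0, 5.0, 20.0, 20.0, 20.0])
```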
Computational chemistry began with the birth of computers in the mid 1900s, and its growth has been directly coupled to the technological advances made in computer science and high-performance computing. A popular goal within the field, be it Newtonian or quantum based methods, is the accurate modelling of physical forces and energetics through mathematics and algorithm design. Through reliable modelling of the underlying forces, molecular simulations frequently provide atomistic insights into macroscopic experimental observations.
This book constitutes the thoroughly refereed post-conference proceedings of the Third International ICST Conference on e-Infrastructure and e-Services for Developing Countries, AFRICOMM 2011, held in Zanzibar, Tanzania, in November 2011. The 24 revised full papers presented together with 2 poster papers were carefully reviewed and selected from numerous submissions. The papers cover a wide range of topics in the field of information and communication infrastructures. They are organized in two tracks: communication infrastructures for developing countries and electronic services, ICT policy, and regulatory issues for developing countries.
This project investigated the viability of using the Microsoft Kinect to obtain reliable Red-Green-Blue-Depth (RGBD) information. It explored the usability of the Kinect in a variety of environments as well as its ability to detect different classes of materials and objects. This was facilitated through the implementation of Random Sample Consensus (RANSAC) based algorithms and highly parallelized workflows in order to provide time-sensitive results. We found that the Kinect provides detailed and reliable information in a time-sensitive manner. Furthermore, the project results recommend usability and operational parameters for the use of the Kinect as a scientific research tool.
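The abstract mentions RANSAC-based algorithms for interpreting noisy Kinect depth data, typically by fitting geometric primitives such as planes to point samples. As an illustration of the general RANSAC principle only — not the project's actual implementation — here is a minimal 2D line-fitting sketch in Python; the function name, tolerance, and iteration count are invented for the example:

```python
import random

def ransac_line(points, iters=200, tol=0.1, seed=0):
    """Fit a line y = a*x + b to noisy 2D points via RANSAC.

    Each iteration samples two points, derives the candidate line
    through them, and counts inliers (points within `tol` vertical
    distance). The model with the largest inlier set wins, so gross
    outliers never distort the fit - the property that makes RANSAC
    attractive for noisy depth data.
    """
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:  # skip degenerate vertical sample pairs
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) <= tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# Hypothetical data: ten points on y = 2x + 1 plus two gross outliers.
pts = [(float(x), 2.0 * x + 1.0) for x in range(10)] + [(3.0, 40.0), (7.0, -5.0)]
model, inliers = ransac_line(pts)
```

For depth images the same loop fits a 3D plane from three sampled points instead of a 2D line from two; the inlier test then uses point-to-plane distance.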
Traffic simulations for virtual environments are concerned with the behavior of individual traffic participants. The complexity of behavior in these simulations is often kept rather low to abide by the constraints of processing resources. In sophisticated traffic simulations, the behavior of individual traffic participants is also modeled, but the focus lies on the overall behavior of the entire system, e.g. to identify possible bottlenecks of traffic flow [8].
At previous SIAS conferences, we presented a novel opto-electronic safety sensor system for skin detection at circular saws, jointly developed with the Institute for Occupational Safety and Health of the German Social Accident Insurance (IFA). This work now presents the results of our follow-up research on a prototype of a sensor system for more general production machine applications, including robot workplaces. The system uses off-the-shelf LEDs and photodiodes in combination with dedicated optics and a microcontroller system to implement a so-called spectral light curtain.
In this work, preceramic papers containing 85 wt% Al2O3 were heat-treated at 1600 °C to obtain paper-derived ceramics. In order to increase the preceramic paper density prior to sintering, the papers were calendered at different roll temperatures and pressures. The influences of the calendering parameters on the microstructure and mechanical properties of the preceramic papers and the paper-derived ceramics were investigated. It was expected that especially the mechanical properties of the papers and derived ceramics would be improved by calendering.
Traffic simulations are typically concerned with modeling human behavior as closely as possible to create realistic results. In conventional traffic simulations used for road planning or traffic jam prediction only the overall behavior of an entire system is of interest. In virtual environments, like digital games, simulated traffic participants are merely a backdrop to the player’s experience and only need to be “sufficiently realistic”. Additionally, restricted computational resources, typical for virtual environment applications, usually limit the complexity of simulated behavior in this field. More importantly, two integral aspects of real-world traffic are not considered in current traffic simulations from both fields: misbehavior and risk taking of traffic participants. However, for certain applications like the FIVIS bicycle simulator, these aspects are essential.
Traditionally, traffic simulations are used to predict traffic jams, plan new roads or highways, and estimate road safety. They are also used in computer games and virtual environments. There are two general concepts of modeling traffic: macroscopic and microscopic modeling. Macroscopic traffic models take vehicle collectives into account and do not consider individual vehicles; parameters like average velocity and density are used to model the flow of traffic. In contrast, microscopic traffic models consider each vehicle individually. Therefore, vehicle-specific parameters are of importance, e.g. current velocity, desired velocity, velocity difference to the lead vehicle, and individual time gap.
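The microscopic modeling idea can be made concrete with a standard car-following model such as the Intelligent Driver Model (IDM), which computes a vehicle's acceleration from exactly the parameters listed above: current velocity, desired velocity, velocity difference to the lead vehicle, and time gap. The abstract does not say which model the authors use, so this Python sketch and its parameter values are purely illustrative:

```python
def idm_acceleration(v, v_lead, gap, v0=30.0, T=1.5, a=1.0, b=2.0, s0=2.0):
    """Intelligent Driver Model: acceleration of a following vehicle.

    v      -- current velocity of this vehicle (m/s)
    v_lead -- velocity of the lead vehicle (m/s)
    gap    -- bumper-to-bumper distance to the lead vehicle (m)
    v0     -- desired velocity, T -- desired time gap (s),
    a      -- maximum acceleration, b -- comfortable braking
              deceleration, s0 -- minimum standstill gap (all illustrative).
    """
    dv = v - v_lead  # approach rate towards the lead vehicle
    # Desired dynamic gap: minimum gap + time-gap term + braking term.
    s_star = s0 + v * T + v * dv / (2.0 * (a * b) ** 0.5)
    # Free-road term (approach v0) minus interaction term (keep the gap).
    return a * (1.0 - (v / v0) ** 4 - (s_star / gap) ** 2)
```

On a free road (huge gap, standing start) the formula yields roughly the maximum acceleration `a`; at the desired velocity it tends to zero; and when closing fast on a slower lead vehicle it turns strongly negative, i.e. braking. A simulation updates each vehicle's velocity and position from this acceleration at every time step.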