H-BRS Bibliography
Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE)
Molecular modeling is an important subdomain of computational modeling, with respect to both scientific and industrial applications, because computer simulations on the molecular level are a powerful instrument for studying the impact of microscopic on macroscopic phenomena. Accurate molecular models are indispensable for such simulations in order to predict physical target observables, like density, pressure, diffusion coefficients or energetic properties, quantitatively over a wide range of temperatures. In these simulations, molecular interactions are described mathematically by force fields, whose parameters cover both intramolecular and intermolecular interactions. While intramolecular force field parameters can be determined by quantum mechanics, the parameterization of the intermolecular part is often tedious. Recently, the authors published an empirical procedure based on minimizing a loss function between simulated and experimental physical properties, using efficient gradient-based numerical optimization algorithms. However, empirical force field optimization is hampered by the following two central issues in molecular simulations: firstly, they are extremely time-consuming, even on modern high-performance computer clusters, and secondly, simulation data are affected by statistical noise. The latter makes an accurate computation of gradients or Hessians nearly impossible close to a local or global minimum, mainly because the loss function is flat there. Therefore, the question arises whether to apply a derivative-free method that approximates the loss function by an appropriate model function. In this paper, a new Sparse Grid-based Optimization Workflow (SpaGrOW) is presented, which accomplishes this task robustly and, at the same time, keeps the number of time-consuming simulations relatively small.
This is achieved by an efficient sampling procedure for the approximation based on sparse grids, which is described in full detail: to counteract the fact that sparse grids are fully occupied on their boundaries, a mathematical transformation is applied to generate homogeneous Dirichlet boundary conditions. As the main drawback of sparse grid methods is the assumption that the function to be modeled exhibits certain smoothness properties, it has to be approximated by smooth functions first; radial basis functions turned out to be very suitable for this task. The smoothing procedure and the subsequent interpolation on sparse grids are performed within sufficiently large compact trust regions of the parameter space. It is shown and explained how the combination of these three ingredients leads to a new efficient derivative-free algorithm, which has the additional advantage of reducing the overall number of simulations by a factor of about two in comparison to gradient-based optimization methods, while maintaining robustness with respect to statistical noise. This assertion is proven by both theoretical considerations and practical evaluations for molecular simulations on chemical example substances.
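The surrogate idea described above can be illustrated with a minimal sketch (this is not SpaGrOW itself): a noisy loss is sampled on a coarse grid inside a trust region, smoothed by a Gaussian radial basis function interpolant, and the smooth surrogate is then minimized instead of the noisy loss. The toy loss, the grid, and all parameter values are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_loss(x):
    """Toy stand-in for a simulated-vs-experimental loss with statistical noise."""
    return np.sum((x - 0.3) ** 2) + 0.01 * rng.normal()

def rbf_surrogate(centers, values, eps=2.0):
    """Fit a Gaussian RBF interpolant through (centers, values)."""
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)
    weights = np.linalg.solve(phi, values)

    def model(x):
        dx = np.linalg.norm(centers - x, axis=-1)
        return np.exp(-(eps * dx) ** 2) @ weights

    return model

# Sample the noisy loss on a coarse 5x5 grid inside a trust region [0, 1]^2 ...
pts = np.array([[i / 4, j / 4] for i in range(5) for j in range(5)])
vals = np.array([noisy_loss(p) for p in pts])
model = rbf_surrogate(pts, vals)

# ... and minimize the smooth surrogate on a fine grid instead of the noisy loss.
fine = np.array([[i / 50, j / 50] for i in range(51) for j in range(51)])
best = fine[np.argmin([model(x) for x in fine])]
print(best)  # typically close to the true minimum near (0.3, 0.3)
```

SpaGrOW samples on sparse grids rather than the full grid used here, which is what keeps the number of expensive simulations small in higher-dimensional parameter spaces.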
Quality improvement and time savings in scholarship allocation through an automated workflow
(2013)
For awarding the Deutschlandstipendium scholarships, the university had initially defined a procedure involving many manual steps: students had to submit their application documents in writing. In addition to a letter of motivation and a printout of the current transcript of records, these included all further references needed to assess the application against the statutory selection criteria. As the basis for evaluating the "social criteria", applicants were asked to obtain an assessment from a professor of the university.
The AD 2000 code is the dominant standard for pressure vessel construction in Germany. DIN EN 13445, already widely used in other European countries, receives little attention. Unjustly so, as a recent comparison, carried out as part of a bachelor's thesis, shows: EN 13445 has matured into a genuine alternative. In particular, the main argument against switching, namely rising costs, has long been outdated.
Automated parameterization of intermolecular pair potentials using global optimization techniques
(2014)
In this work, different global optimization techniques are assessed for the automated development of molecular force fields, as used in molecular dynamics and Monte Carlo simulations. The quest of finding suitable force field parameters is treated as a mathematical minimization problem. Intricate problem characteristics such as extremely costly and even abortive simulations, noisy simulation results, and especially multiple local minima naturally lead to the use of sophisticated global optimization algorithms. Five diverse algorithms (pure random search, recursive random search, CMA-ES, differential evolution, and tabu search) are compared to our own tailor-made solution named CoSMoS. CoSMoS is an automated workflow: it models the parameters' influence on the simulation observables in order to detect a globally optimal set of parameters. It is shown how and why this approach is superior to the other algorithms. Applied to suitable test functions and to simulations of phosgene, CoSMoS effectively reduces the number of required simulations and the real time needed for the optimization task.
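Two of the baseline algorithms named above, pure random search and recursive random search, can be sketched in a few lines on a noisy toy test function (this is an illustration, not CoSMoS or the paper's actual benchmark; the test function and all settings are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def loss(x):
    """Toy multimodal test function with additive noise, mimicking noisy
    simulation output over a 2D parameter space."""
    return np.sum(x**2) + 0.5 * np.sum(np.cos(5 * x)) + 0.01 * rng.normal()

def pure_random_search(n, dim=2, lo=-2.0, hi=2.0):
    """Sample n points uniformly over the box and keep the best."""
    pts = rng.uniform(lo, hi, size=(n, dim))
    vals = np.array([loss(p) for p in pts])
    return pts[vals.argmin()], vals.min()

def recursive_random_search(n_rounds=5, n_per=40, dim=2, lo=-2.0, hi=2.0):
    """Repeatedly re-center and shrink the sampling box around the best point."""
    center, width = np.zeros(dim), hi - lo
    best_x, best_f = np.zeros(dim), np.inf
    for _ in range(n_rounds):
        pts = center + rng.uniform(-width / 2, width / 2, size=(n_per, dim))
        vals = np.array([loss(p) for p in pts])
        if vals.min() < best_f:
            best_f, best_x = vals.min(), pts[vals.argmin()]
        center, width = best_x, width * 0.5
    return best_x, best_f

x_prs, f_prs = pure_random_search(200)
x_rrs, f_rrs = recursive_random_search()
print(f_prs, f_rrs)
```

Both variants spend the same budget of 200 evaluations; the recursive variant trades global coverage for local refinement, which is exactly the tension a surrogate-based workflow like CoSMoS tries to resolve.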
The so-called "Deutschlandstipendium" was launched in 2010. Under the statutory requirements, the scholarships are to be awarded on the basis of talent and achievement. In addition, social engagement as well as special social, family or personal circumstances are to be taken into account. For funding, the universities first have to raise private sponsorship, which is then matched by the federal and state governments with the same amount. For the scholarships they co-finance, the private sponsors may specify from which degree programmes the scholarship holders are to be selected. The universities must, however, ensure that one third of all scholarships are awarded without such earmarking. The sponsors may not have any direct influence on the selection of individual candidates. Against this background, universities are expected to create incentives for private sponsors while at the same time designing application and selection procedures that comply with the statutory requirements. This creates considerable administrative effort for the universities. To reduce it, this article describes a transparent, traceable, time- and cost-saving process implemented as a programmed workflow.
The gfo has followed the topic of process organization intensively in recent years and discussed it in depth at several conferences. To examine the current state of implementation of process organization in Germany, an empirical study was conducted in 2014. Beyond the current situation, the study provides insights into expectations about future developments, obstacles to and success factors for introducing a process organization, and the achievement of objectives by process-oriented organizations.
Polyether and polyether/ester based TPUs (thermoplastic polyurethanes) were investigated with wide-angle XRD (X-ray diffraction) and SAXS (small-angle X-ray scattering). The SAXS measurements were performed in the temperature range of 30 °C to 130 °C. Polyether based polymers exhibit only one broad diffraction signal in the region of 2θ = 15° to 25°. In the case of polyurethanes with ether/ester modification, the broad diffraction signal is accompanied by small sharp diffraction signals. SAXS measurements reveal the size and shape of the crystalline zones of the polymer. Between 30 °C and 130 °C the size of the crystalline zones changes significantly: the size decreases in most of the investigated TPUs, whereas in the case of Desmopan 9365D an increase in particle size was observed.
Solar energy is one option to serve the rising global energy demand with low environmental impact [1]. Building an energy system with a considerable share of solar power requires long-term investment and a careful investigation of potential sites. Therefore, understanding the impact of regionally and locally varying meteorological conditions on solar energy production will influence energy yield projections. Clouds move on short timescales and strongly influence the available solar radiation, as they absorb, reflect and scatter parts of the incoming light [2]. However, the impact of cloudiness on photovoltaic (PV) power yields and cloud-induced deviations from average yields may vary depending on the technology, location and time scale under consideration.
Solar energy is one option to serve the rising global energy demand with low environmental impact [1]. Building an energy system with a considerable share of solar power requires long-term investment and a careful investigation of potential sites. Therefore, understanding the impact of regionally and locally varying meteorological conditions on solar energy production will influence energy yield projections. Clouds move on short timescales and strongly influence the available solar radiation, as they absorb, reflect and scatter parts of the incoming light [2]. However, modeling photovoltaic (PV) power yields with spectral resolution and local cloud information gives new insights into the atmospheric impact on solar energy.
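The qualitative effect described in these abstracts, that cloud transmittance depresses PV energy yield relative to clear-sky conditions, can be sketched with a deliberately simple toy model. This is not the authors' spectrally resolved model; the half-sine irradiance profile, the transmittance range, and the efficiency and area values are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

hours = np.arange(24)
# Clear-sky irradiance profile (W/m^2): a simple half-sine over daylight hours.
clear_sky = 1000 * np.clip(np.sin(np.pi * (hours - 6) / 12), 0, None)

# Random hourly cloud transmittance: 1.0 would be clear, lower means cloudier.
transmittance = rng.uniform(0.3, 1.0, size=24)

pv_efficiency = 0.18  # assumed module efficiency
area_m2 = 10.0        # assumed array area

# Daily energy yields in kWh (W * h summed, divided by 1000).
clear_yield = (clear_sky * pv_efficiency * area_m2).sum() / 1000
cloudy_yield = (clear_sky * transmittance * pv_efficiency * area_m2).sum() / 1000

print(f"clear-sky: {clear_yield:.1f} kWh, with clouds: {cloudy_yield:.1f} kWh")
```

A broadband scalar transmittance like this ignores exactly what the second abstract targets: clouds attenuate different wavelengths differently, and PV spectral response means two cloud situations with equal broadband irradiance can yield different power.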