Departments, institutes and facilities
- Fachbereich Ingenieurwissenschaften und Kommunikation (87)
- Institut für Technik, Ressourcenschonung und Energieeffizienz (TREE) (80)
- Fachbereich Informatik (22)
- Zentrum für Innovation und Entwicklung in der Lehre (ZIEL) (6)
- Institute of Visual Computing (IVC) (4)
- Institut für funktionale Gen-Analytik (IFGA) (3)
- Fachbereich Angewandte Naturwissenschaften (2)
- Fachbereich Wirtschaftswissenschaften (2)
- Fachbereich Sozialpolitik und Soziale Sicherung (1)
- Internationales Zentrum für Nachhaltige Entwicklung (IZNE) (1)
Document Type
- Article (72)
- Conference Object (29)
- Preprint (8)
- Part of a Book (6)
- Report (4)
- Contribution to a Periodical (1)
- Doctoral Thesis (1)
- Other (1)
Keywords
- Lattice Boltzmann Method (4)
- Molecular dynamics (4)
- Force field (3)
- Numerical optimization (3)
- polyethylene (3)
- Automatic Differentiation (2)
- Computational fluid dynamics (2)
- Extrusion blow molding (2)
- High-performance computing (2)
- Hydrogen storage (2)
- Lennard-Jones potential (2)
- Molecular modeling (2)
- Molecular simulation (2)
- Monte Carlo (2)
- Pytorch (2)
- Semi-Lagrangian (2)
- Simulation (2)
- crystallization (2)
- ionic liquids (2)
- local chain orientation (2)
- mesoscale coarse-graining (2)
- molecular dynamics (2)
- quantum mechanics (2)
- relaxation (2)
- 3D printing (1)
- ACPYPE (1)
- AMBER (1)
- Access regulation (1)
- Adams-Moulton (1)
- Adaptive resolution schemes (1)
- Alkane (1)
- Antibiotics resistance (1)
- Atmosphere flow (1)
- Atomistic force fields (1)
- Automation (1)
- Automotive (1)
- BDF (1)
- Bachelor’s program (1)
- Basis set (1)
- Bayesian optimization (1)
- Beta strands (1)
- CAE metadata structures (1)
- Carbohydrate (1)
- Compressible (1)
- Compressible flows (1)
- Crystallinity (1)
- Cubature (1)
- Databases, Chemical (1)
- Draw ratio (1)
- Efficiency (1)
- Engineering (1)
- Extensible (1)
- Flow control (1)
- Flow direction (1)
- Fluid Dynamics (1)
- Flux coefficient (1)
- Force field parameters (1)
- Forklifts (1)
- GROW (1)
- Gauss–Hermite quadrature (1)
- Glycam06 (1)
- Gradient-based algorithms (1)
- Gromacs (1)
- High-resolution displays (1)
- Hydrocarbon (1)
- Integrative simulation (1)
- Interdisciplinary education (1)
- Internet (1)
- Introductory project (1)
- Kinetic theory (1)
- Lattice Boltzmann (1)
- Lattice Boltzmann Method Code (1)
- Lattice Boltzmann method (1)
- Lattice-Boltzmann methods (1)
- Lennard-Jones parameters (1)
- MP2.5 (1)
- Machine Learning (1)
- Machine learning (1)
- Membrane protein (1)
- Meso-scale simulation (1)
- Metal hydride (1)
- Metal hydride storage (1)
- Modelling (1)
- Models, Molecular (1)
- Modular software packages (1)
- Mold temperature (1)
- Molecular models (1)
- Molecular rotation (1)
- Molekulardynamik (1)
- Molekulare Simulation (1)
- Monte-Carlo simulation (1)
- Monte-Carlo-Simulation (1)
- Multi-drug efflux (1)
- Multiphase flow (1)
- Multiscale modeling (1)
- Navier-Stokes equation (1)
- Neural networks (1)
- New study course (1)
- Nonbonded scaling factor (1)
- Ocean flow (1)
- Off-lattice Boltzmann (1)
- Optimization (1)
- Organic compounds and Functional groups (1)
- Orthotropic material behavior (1)
- Outer membrane channel (1)
- Parameter optimization (1)
- Peer teaching (1)
- Physical property prediction (1)
- Principal component analysis (1)
- Process dependent material parameters (1)
- Protein folding (1)
- Quality diversity (1)
- Quantum mechanical methods (1)
- Rube Goldberg machine (1)
- SCRUM (1)
- Scientific workflows (1)
- Shan-Chen model (1)
- Simplex optimization (1)
- Simulations (1)
- Soft matter (1)
- Software (1)
- Spurious velocity (1)
- Statistical Physics (1)
- Storage modulus (1)
- Stratified flow (1)
- Study entrants (1)
- Sustainability (1)
- Sustainable engineering (1)
- Taylor-Green (1)
- Taylor–Green vortex (1)
- Thermodynamic data (1)
- Thermodynamics (1)
- Thermodynamische Stoffdaten (1)
- Turbulence (1)
- Unstructured grid (1)
- Vapor–liquid equilibrium (1)
- Visualization (1)
- YASP (1)
- amino acids (1)
- atomistic models (1)
- biaxial stretching (1)
- braking (1)
- complete basis set limit (1)
- complex problems (1)
- cross-disciplinary (1)
- data management (1)
- derivative-free optimization (1)
- design process (1)
- digital manufacturing (1)
- diversity (1)
- driver assistance system (1)
- driver interface (1)
- electrical bicycle drive unit (1)
- employability (1)
- endoscopy (1)
- evaluation (1)
- first-semester students (1)
- force field (1)
- force field development (1)
- force field parameterization (1)
- force fields (1)
- fuel (1)
- gradient-based algorithms (1)
- hands-on experiences (1)
- holistic learning (1)
- hydrocarbon (1)
- interactions (1)
- intercultural learning (1)
- interdisciplinary projects (1)
- international (1)
- learning outcomes (1)
- lipid (1)
- local optimization (1)
- magnetic hyperthermia (1)
- magnetic nanoparticles (1)
- models (1)
- molecular simulations (1)
- mp2 (1)
- multi-body dynamic simulation (1)
- multi-solution optimization (1)
- multiscale parameterization (1)
- multistep (1)
- neural networks (1)
- noise, vibration, and harshness (1)
- non-linear projection (1)
- numerical optimisation (1)
- objective function (1)
- octane (1)
- ontology (1)
- orientation behavior (1)
- pancreatic cancer (1)
- peer-assisted learning (1)
- peptides (1)
- power spectrum (1)
- practical learning (1)
- pre-optimization (1)
- problem based learning (1)
- professors as mentors (1)
- professors as tutors (1)
- propan-2-ol (1)
- proteins (1)
- racing (1)
- replica (1)
- semantic technologies (1)
- shrinkage (1)
- simulation (1)
- simulation process (1)
- smoothing procedures (1)
- sparse grids (1)
- stability (1)
- structural dynamics (1)
- student activating approaches (1)
- temporal discretization (1)
- tensile test (1)
- thermo-mechanical properties (1)
- time integration (1)
- trapezoidal rule (1)
- uniaxial stretching (1)
- vehicle dynamics (1)
- water dimer (1)
- weighting factors (1)
- wind nuisance (1)
Integrating physical simulation data into data ecosystems challenges the compatibility and interoperability of data management tools. Semantic web technologies and relational databases mostly handle other data types, such as measurement or manufacturing design data. Standardizing the storage of simulation data and harmonizing its data structures with other domains remains a challenge, as current standards such as ISO 10303 (STEP, "Standard for the Exchange of Product model data") fail to bridge the gap between design and simulation data. This challenge requires new methods, such as ontologies, to rethink the integration of simulation results. This research describes a new software architecture and application methodology based on the industrial standard "Virtual Material Modelling in Manufacturing" (VMAP). The architecture integrates large quantities of structured simulation data and their analyses into a semantic data structure. It provides data permeability from the global digital twin level down to the detailed numerical values of individual data entries, and even to newly derived key indicators, in a three-step approach: it represents a file as an instance in a knowledge graph, queries the file's metadata, and finds a semantically represented process that enables new metadata to be created and instantiated.
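The three-step approach can be illustrated with a toy in-memory triple store. All class and property names here (`vmap:SimulationFile`, `vmap:maxStress`, and so on) are invented for this sketch; the actual VMAP/ontology vocabulary and the semantic process descriptions differ.

```python
# Toy illustration of the three-step approach: (1) represent a simulation
# file as an instance in a knowledge graph, (2) query its metadata,
# (3) run a process that derives and instantiates a new key indicator.
# All names (vmap:SimulationFile etc.) are hypothetical for this sketch.

triples = set()  # (subject, predicate, object) knowledge graph

def add(s, p, o):
    triples.add((s, p, o))

def query(s=None, p=None, o=None):
    """Pattern match over the graph; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Step 1: the file becomes an instance in the graph.
add("run42.vmap", "rdf:type", "vmap:SimulationFile")
add("run42.vmap", "vmap:maxStress", 312.0)
add("run42.vmap", "vmap:yieldStress", 250.0)

# Step 2: query the file's metadata.
max_stress = query("run42.vmap", "vmap:maxStress")[0][2]
yield_stress = query("run42.vmap", "vmap:yieldStress")[0][2]

# Step 3: a (here hard-coded) process derives a new key indicator
# and instantiates it back into the graph as new metadata.
add("run42.vmap", "vmap:utilizationRatio", max_stress / yield_stress)
```

In a real deployment the triple store would be an RDF graph queried via SPARQL, but the pattern of instance creation, metadata lookup, and derived-indicator instantiation is the same.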
Force field (FF) based molecular modeling is a widely used method for investigating structural and dynamic properties of (bio-)chemical substances and systems. When such a system is modeled or refined, the force field parameters need to be adjusted. This force field parameter optimization can be a tedious task and is always a trade-off in terms of errors in the targeted properties. To better control the balance between the errors of the various properties, in this study we introduce weighting factors for the optimization objectives. Different weighting strategies are compared to fine-tune the balance between bulk-phase density and relative conformational energies (RCE), using n-octane as a representative system. Additionally, a non-linear projection of the individual property-specific parts of the optimized loss function is deployed to further improve the balance between them. The results show that the overall error is reduced. One interesting outcome is the large variation in the resulting optimized force field parameters (FFParams) and corresponding errors, suggesting that the optimization landscape is multi-modal and strongly dependent on the weighting factor setup. We conclude that adjusting the weighting factors can be an important means of lowering the overall error in the FF optimization procedure, giving researchers the possibility to fine-tune their FFs.
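The weighted objective with a non-linear projection can be sketched as follows. The projection choice (`log1p`) and the error values are illustrative only; the study's actual loss function, properties, and weights are not reproduced here.

```python
import math

def weighted_loss(errors, weights, project=math.log1p):
    """Combine property-specific errors (e.g. bulk-phase density vs.
    relative conformational energies) into one objective.  Each error
    is first passed through a non-linear projection to damp dominant
    terms, then weighted.  log1p is an illustrative choice only."""
    return sum(w * project(e) for e, w in zip(errors, weights))

# Hypothetical relative errors: [density, RCE]
errors = [0.02, 0.30]

# Equal weighting lets the larger RCE error dominate the objective ...
loss_equal = weighted_loss(errors, [0.5, 0.5])
# ... while re-weighting shifts the optimizer's focus toward density.
loss_density = weighted_loss(errors, [0.9, 0.1])
```

Tuning the weight vector (and the projection) changes which property the optimizer sacrifices first, which is exactly the balance the abstract describes.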
Stably stratified Taylor–Green vortex simulations are performed by lattice Boltzmann methods (LBM) and compared to other recent works using Navier–Stokes solvers. The density variation is modeled with a separate distribution function in addition to the particle distribution function modeling the flow physics. Different stencils, forcing schemes, and collision models are tested and assessed. The overall agreement of the lattice Boltzmann solutions with reference solutions from other works is very good, even when no explicit subgrid model is used, but the quality depends on the LBM setup. Although the LBM forcing scheme is not decisive for the quality of the solution, the choices of collision model and stencil are crucial for adequate solutions in under-resolved conditions. The LBM simulations confirm the suppression of vertical flow motion for decreasing initial Froude numbers. To gain further insight into buoyancy effects, energy decay, dissipation rates, and flux coefficients are evaluated using the LBM model for various Froude numbers.
This paper presents a novel approach to addressing noise, vibration, and harshness (NVH) issues in electrically assisted bicycles (e-bikes) caused by the drive unit. By investigating and optimising the structural dynamics during early product development, NVH behaviour can be decisively improved and valuable resources can be saved, emphasising its significance for enhancing riding performance. The paper offers a comprehensive analysis of the mechanical interactions among the relevant components of the e-bike drive unit, culminating in, to the best of our knowledge, the first high-fidelity model of an entire e-bike drive unit. The proposed model uses the principles of elastic multibody dynamics (eMBD) to elucidate the structural dynamics in dynamic-transient calculations. Comparing power spectra between measured and simulated motion variables validates the chosen model assumptions. The measurements of physical samples utilise accelerometers, contactless laser Doppler vibrometry (LDV) and various test arrangements, which are replicated in simulations and provide access to vibrations on rotating shafts and stationary structures. In summary, this integrated system-level approach can serve as a viable starting point for comprehending and managing the NVH behaviour of e-bikes.
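The validation step, comparing power spectra of measured and simulated motion variables, can be sketched with a simple periodogram. The signals, sampling rate, and the 120 Hz line are made up for this sketch; they do not correspond to the paper's measurements.

```python
import numpy as np

# Toy validation: compare the power spectra of a "measured" and a
# "simulated" vibration signal.  Frequencies and noise level invented.
fs = 1000.0                       # sampling rate in Hz
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
measured = np.sin(2 * np.pi * 120 * t) + 0.1 * rng.normal(size=t.size)
simulated = np.sin(2 * np.pi * 120 * t)  # model reproduces the 120 Hz line

def power_spectrum(x, fs):
    """One-sided periodogram: squared FFT magnitude per frequency bin."""
    spec = np.abs(np.fft.rfft(x)) ** 2 / x.size
    freqs = np.fft.rfftfreq(x.size, 1.0 / fs)
    return freqs, spec

f_meas, p_meas = power_spectrum(measured, fs)
f_sim, p_sim = power_spectrum(simulated, fs)

# Both spectra should peak at the same drive-unit order (120 Hz here)
print(f_meas[np.argmax(p_meas)], f_sim[np.argmax(p_sim)])
```

If the dominant spectral lines of model and measurement coincide in frequency and comparable amplitude, the model assumptions are considered validated; in practice one would use an averaged (e.g. Welch) estimate rather than a raw periodogram.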
Quality diversity algorithms can be used to efficiently create a diverse set of solutions to inform engineers' intuition. But quality diversity is not efficient for very expensive problems, needing 100,000s of evaluations. Even with the assistance of surrogate models, quality diversity needs 100s or even 1000s of evaluations, which can make its use infeasible. In this study we tackle this problem by using a pre-optimization strategy on a lower-dimensional optimization problem and then mapping the solutions to a higher-dimensional case. For a use case of designing buildings that minimize wind nuisance, we show that we can predict flow features around 3D buildings from 2D flow features around building footprints. By sampling the space of 2D footprints with a quality diversity algorithm, a predictive model can be trained that is more accurate than one trained on a set of footprints selected with a space-filling algorithm like the Sobol sequence. Simulating only 16 buildings in 3D, a set of 1024 building designs with low predicted wind nuisance is created. We show that we can produce better machine learning models by producing training data with quality diversity instead of common sampling techniques. The method can bootstrap generative design in a computationally expensive 3D domain and allows engineers to sweep the design space, understanding wind nuisance in early design phases.
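A minimal MAP-Elites-style quality diversity loop looks as follows. This is a generic toy on a 2D test function, not the paper's wind-nuisance setup; the fitness function, behaviour descriptor, and grid size are all illustrative.

```python
import random

random.seed(1)

def evaluate(x, y):
    """Toy stand-in for an expensive CFD evaluation: a fitness to
    minimise plus a behaviour descriptor that spans the archive."""
    fitness = (x - 0.3) ** 2 + (y - 0.7) ** 2
    descriptor = (int(x * 10), int(y * 10))  # cell in a 10x10 grid
    return fitness, descriptor

archive = {}  # descriptor cell -> (fitness, solution)

for _ in range(2000):
    if archive and random.random() < 0.8:
        # mutate an existing elite
        _, (px, py) = random.choice(list(archive.values()))
        x = min(max(px + random.gauss(0, 0.1), 0.0), 0.999)
        y = min(max(py + random.gauss(0, 0.1), 0.0), 0.999)
    else:
        x, y = random.random(), random.random()
    fit, cell = evaluate(x, y)
    # keep the best solution per cell -> a diverse, high-quality set
    if cell not in archive or fit < archive[cell][0]:
        archive[cell] = (fit, (x, y))

print(len(archive), "elite designs cover the behaviour space")
```

The filled archive, rather than a single optimum, is what makes the approach attractive for generating diverse training data: every cell contributes a distinct, locally good design.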
Alternative mobility solutions are becoming increasingly important today, and e-bikes have long since proven their potential. The associated market has grown enormously over the last ten years, and expectations of the product have grown with it, for example a ride free of disturbing vibrations and noise. The motor freewheel has a decisive influence on the dynamic behaviour. This contribution therefore presents a methodical approach for determining the influence of the motor freewheel on the dynamic behaviour of the e-bike drive unit by means of experiment and simulation.
This study investigates the initial stage of the thermo-mechanical crystallization behavior of uni- and biaxially stretched polyethylene. The models are based on a mesoscale molecular dynamics approach. We take constraints that occur in real-life polymer processing into account, especially with respect to the blowing stage of the extrusion blow-molding process. For this purpose, we deform our systems using a wide range of stretching levels before they are quenched. We discuss the effects of the stretching procedures on the micro-mechanical state of the systems, characterized by entanglement behavior and nematic ordering of chain segments. For the cooling stage, we use two different approaches which allow for free or hindered shrinkage, respectively. During cooling, crystallization kinetics are monitored: We precisely evaluate how the interplay of chain length, temperature, local entanglements and orientation of chain segments influences crystallization behavior. Our models reveal that the main stretching direction dominates the microscopic states of the different systems. We are able to show that crystallization mainly depends on the (dis-)entanglement behavior. Nematic ordering plays a secondary role.
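Nematic ordering of chain segments is commonly quantified by the P2 order parameter. The sketch below fixes the director to the stretching axis for simplicity; the full analysis would diagonalise the ordering tensor instead, and this is not the paper's exact post-processing.

```python
import math

def p2_order(vectors, director=(0.0, 0.0, 1.0)):
    """Nematic order parameter P2 = <(3 cos^2(theta) - 1) / 2> of chain
    segment vectors relative to a fixed director (here: the assumed
    stretching axis).  P2 = 1 for perfect alignment, 0 for isotropy,
    -0.5 for segments perpendicular to the director."""
    total = 0.0
    for v in vectors:
        norm = math.sqrt(sum(c * c for c in v))
        cos_t = sum(c * d for c, d in zip(v, director)) / norm
        total += (3.0 * cos_t ** 2 - 1.0) / 2.0
    return total / len(vectors)

# Perfectly aligned segments -> P2 = 1
aligned = [(0.0, 0.0, 1.0)] * 5
# Segments perpendicular to the director -> P2 = -0.5
perpendicular = [(1.0, 0.0, 0.0)] * 5
```

Tracking P2 over the stretching and cooling stages is one way to monitor how chain-segment orientation evolves relative to the crystallization kinetics.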
How self-reliant Peer Teaching can be set up to augment learning outcomes for university learners (2022)
Off-lattice Boltzmann methods increase the flexibility and applicability of lattice Boltzmann methods by decoupling the discretizations of time, space, and particle velocities. However, the velocity sets mostly used in off-lattice Boltzmann simulations were originally tailored to on-lattice Boltzmann methods. In this contribution, we show how the accuracy and efficiency of weakly and fully compressible semi-Lagrangian off-lattice Boltzmann simulations are increased by velocity sets derived from cubature rules, i.e. multivariate quadratures, which have not been produced by the Gauss product rule. In particular, simulations of 2D shock-vortex interactions indicate that the cubature-derived degree-nine D2Q19 velocity set is capable of replacing the Gauss-product-rule-derived D2Q25. Likewise, the degree-five velocity sets D3Q13 and D3Q21, as well as a degree-seven D3V27 velocity set, were successfully tested for 3D Taylor-Green vortex flows to challenge and surpass the quality of the customary D3Q27 velocity set. In compressible 3D Taylor-Green vortex flows with Mach numbers Ma = {0.5, 1.0, 1.5, 2.0}, on-lattice simulations with the velocity sets D3Q103 and D3V107 showed only limited stability, while the off-lattice degree-nine D3Q45 velocity set accurately reproduced the kinetic energy reported in the literature.
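The Gauss-product-rule baseline that the cubature sets compete with can be constructed in a few lines: take a 1D Gauss-Hermite rule and form the tensor product. The sketch below builds the 25 velocities of a D2Q25-type set; the cubature-derived sets themselves (D2Q19, D3Q45, ...) come from published multivariate rules and are not reproduced here.

```python
import numpy as np

# 1D five-point Gauss-Hermite rule for the probabilists' weight
# exp(-x^2/2), exact for polynomials up to degree nine.
xi, wi = np.polynomial.hermite_e.hermegauss(5)
wi = wi / wi.sum()  # normalise so the weights sum to one

# Tensor (Gauss-product) construction -> the 25 velocities of D2Q25.
velocities = [(x, y) for x in xi for y in xi]
weights = [wx * wy for wx in wi for wy in wi]

print(len(velocities))  # 25 discrete velocities with their weights
```

The product construction is exact but wasteful: its point count grows as 5^d, which is exactly why lower-count cubature rules of the same polynomial degree are attractive.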
In this contribution, we perform computer simulations to expedite the development of metal hydride-based hydrogen storage systems. These simulations enable an in-depth analysis of the processes within the systems which could not otherwise be achieved, because determining crucial process properties would require measurement instruments that are currently not available for the setup. Therefore, we investigate the reliability of reaction values that are determined by a design of experiments.
Specifically, we first explain our model setup in detail. We define the mathematical terms to obtain insights into the thermal processes and reaction kinetics. We then compare the simulated results to measurements of a 5-gram sample of iron-titanium-manganese (FeTiMn) to obtain the values with the highest agreement with the experimental data. In addition, we improve the model by replacing the commonly used van 't Hoff equation with a mathematical expression for the pressure-composition isotherms (PCI) to calculate the equilibrium pressure.
Finally, the parameters' accuracy is checked in a further comparison with an existing metal hydride system. The simulated results demonstrate high concordance with the experimental data, which advocates the use of kinetic reaction properties approximated by a design of experiments for further design studies. Furthermore, we are able to determine process parameters such as the entropy and enthalpy.
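The van 't Hoff relation mentioned above gives the equilibrium pressure from reaction enthalpy and entropy. The sketch uses illustrative absorption values of roughly FeTi-like magnitude, not the fitted parameters or the PCI expression from the study.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def vant_hoff_pressure(T, dH=-28000.0, dS=-107.0, p0=1.0e5):
    """Equilibrium pressure from ln(p_eq / p0) = dH/(R*T) - dS/R.
    dH (J/mol) and dS (J/(mol K)) are illustrative absorption values,
    not fitted FeTiMn parameters."""
    return p0 * math.exp(dH / (R * T) - dS / R)

# For exothermic absorption (dH < 0) the equilibrium pressure
# rises with temperature:
p_300 = vant_hoff_pressure(300.0)
p_320 = vant_hoff_pressure(320.0)
```

Because the real pressure-composition isotherms exhibit plateau slope and hysteresis that the single-line van 't Hoff relation cannot capture, the study replaces it with a fitted PCI expression.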
Turbulent compressible flows are traditionally simulated using explicit time integrators applied to discretized versions of the Navier-Stokes equations. However, the associated Courant-Friedrichs-Lewy condition severely restricts the maximum time-step size. Exploiting the Lagrangian nature of the Boltzmann equation’s material derivative, we now introduce a feasible three-dimensional semi-Lagrangian lattice Boltzmann method (SLLBM), which circumvents this restriction. While many lattice Boltzmann methods for compressible flows were restricted to two dimensions due to the enormous number of discrete velocities in three dimensions, the SLLBM uses only 45 discrete velocities. Based on compressible Taylor-Green vortex simulations we show that the new method accurately captures shocks or shocklets as well as turbulence in 3D without utilizing additional filtering or stabilizing techniques other than the filtering introduced by the interpolation, even when the time-step sizes are up to two orders of magnitude larger compared to simulations in the literature. Our new method therefore enables researchers to study compressible turbulent flows by a fully explicit scheme, whose range of admissible time-step sizes is dictated by physics rather than spatial discretization.
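The core idea that frees the SLLBM from the streaming-imposed time-step limit is semi-Lagrangian interpolation at departure points. A minimal 1D scalar analogue (toy advection, not the full 45-velocity SLLBM) looks like this:

```python
# Minimal 1D semi-Lagrangian advection step: instead of streaming to
# neighbouring nodes (classic on-lattice LBM), each node interpolates
# the value at its departure point x - c*dt.  This remains stable even
# when the step traverses several cells (CFL > 1).

def semi_lagrangian_step(f, c, dt, dx):
    n = len(f)
    out = [0.0] * n
    for i in range(n):
        x_dep = i - c * dt / dx            # departure point (grid units)
        j = int(x_dep // 1) % n            # lower neighbour (periodic)
        frac = x_dep - (x_dep // 1)        # linear interpolation weight
        out[i] = (1.0 - frac) * f[j] + frac * f[(j + 1) % n]
    return out

# Advect a unit spike by 2.5 cells in a single step (CFL = 2.5)
f = [0.0] * 10
f[5] = 1.0
f_new = semi_lagrangian_step(f, c=2.5, dt=1.0, dx=1.0)
```

The interpolation itself introduces the mild filtering mentioned in the abstract; the SLLBM applies this departure-point reconstruction per discrete velocity in 3D with higher-order interpolation.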
Animal models are often needed in cancer research but some research questions may be answered with other models, e.g., 3D replicas of patient-specific data, as these mirror the anatomy in more detail. We, therefore, developed a simple eight-step process to fabricate a 3D replica from computer tomography (CT) data using solely open access software and described the method in detail. For evaluation, we performed experiments regarding endoscopic tumor treatment with magnetic nanoparticles by magnetic hyperthermia and local drug release. For this, the magnetic nanoparticles need to be accumulated at the tumor site via a magnetic field trap. Using the developed eight-step process, we printed a replica of a locally advanced pancreatic cancer and used it to find the best position for the magnetic field trap. In addition, we described a method to hold these magnetic field traps stably in place. The results are highly important for the development of endoscopic tumor treatment with magnetic nanoparticles as the handling and the stable positioning of the magnetic field trap at the stomach wall in close proximity to the pancreatic tumor could be defined and practiced. Finally, the detailed description of the workflow and use of open access software allows for a wide range of possible uses.
In this study, we investigate the thermo-mechanical relaxation and crystallization behavior of polyethylene using mesoscale molecular dynamics simulations. Our models specifically mimic constraints that occur in real-life polymer processing: After strong uniaxial stretching of the melt, we quench and release the polymer chains at different loading conditions. These conditions allow for free or hindered shrinkage, respectively. We present the shrinkage and swelling behavior as well as the crystallization kinetics over up to 600 ns simulation time. We are able to precisely evaluate how the interplay of chain length, temperature, local entanglements and orientation of chain segments influences crystallization and relaxation behavior. From our models, we determine the temperature dependent crystallization rate of polyethylene, including crystallization onset temperature.
The lattice Boltzmann method (LBM) is an efficient simulation technique for computational fluid mechanics and beyond. It is based on a simple stream-and-collide algorithm on Cartesian grids, which is easily compatible with modern machine learning architectures. While it is becoming increasingly clear that deep learning can provide a decisive stimulus for classical simulation techniques, recent studies have not addressed possible connections between machine learning and LBM. Here, we introduce Lettuce, a PyTorch-based LBM code with a threefold aim. Lettuce enables GPU-accelerated calculations with minimal source code, facilitates rapid prototyping of LBM models, and enables integrating LBM simulations with PyTorch's deep learning and automatic differentiation facilities. As a proof of concept for combining machine learning with the LBM, a neural collision model is developed, trained on a doubly periodic shear layer and then transferred to a different flow, decaying turbulence. We also exemplify the added benefit of PyTorch's automatic differentiation framework in flow control and optimization. To this end, the spectrum of a forced isotropic turbulence is maintained without further constraining the velocity field.
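The stream-and-collide algorithm mentioned above is compact enough to sketch in full. This minimal D2Q9 BGK step is written in NumPy for self-containment; it is not Lettuce's actual API, but Lettuce uses the same array semantics on PyTorch tensors, which adds GPU execution and automatic differentiation essentially for free.

```python
import numpy as np

# Minimal D2Q9 BGK stream-and-collide step (NumPy sketch; Lettuce's
# real implementation uses PyTorch tensors and differs in detail).
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, u):
    """Second-order Maxwellian equilibrium for each discrete velocity."""
    cu = np.einsum('qd,dxy->qxy', c, u)
    uu = np.einsum('dxy,dxy->xy', u, u)
    return rho * w[:, None, None] * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * uu)

def step(f, tau=0.8):
    rho = f.sum(axis=0)                        # density moment
    u = np.einsum('qd,qxy->dxy', c, f) / rho   # velocity moment
    f = f - (f - equilibrium(rho, u)) / tau    # BGK collision
    for q in range(9):                         # streaming (periodic)
        f[q] = np.roll(f[q], shift=c[q], axis=(0, 1))
    return f

# A uniform fluid at rest stays at rest and conserves mass.
f = equilibrium(np.ones((16, 16)), np.zeros((2, 16, 16)))
f = step(f)
```

Swapping the arrays for `torch` tensors makes every quantity in this loop differentiable, which is what enables neural collision models and gradient-based flow control.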