Materials Simulation Lab

In our Materials Simulation Lab we bundle our expertise in material simulation, spanning the micro, meso and macro scales. Creating simulations at these different scales is crucial for understanding material behavior, verifying models and comparing them with tests at the corresponding scales.

Application areas

Simulations are created at our institute in a wide variety of areas. Whether the question concerns the hydration process of cement, heat flow through different materials, transport mechanisms or the interaction of concrete with other materials, each of these topics can not only be demonstrated practically in our laboratories but also supported by simulations.

These questions can be considered individually or in combination.

Not only different influences but also different scales can be combined. For example, a multiscale model can capture the influence of the microstructure on meso- or macroscopic material properties.
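As a simple illustration of how microstructural information feeds into a macroscopic property, the short Python sketch below evaluates the classical Voigt and Reuss bounds for the effective stiffness of a two-phase material. The volume fractions and phase moduli are purely illustrative values, not results from our simulations.

```python
# Minimal sketch: first-order homogenization bounds for a two-phase material.
# The phase properties below are illustrative, not measured data.

def voigt_bound(fractions, moduli):
    """Upper bound (iso-strain, rule of mixtures): E = sum(f_i * E_i)."""
    return sum(f * e for f, e in zip(fractions, moduli))

def reuss_bound(fractions, moduli):
    """Lower bound (iso-stress): 1/E = sum(f_i / E_i)."""
    return 1.0 / sum(f / e for f, e in zip(fractions, moduli))

# Example: 60 % cement paste matrix, 40 % stiff aggregate (moduli in GPa)
f = [0.6, 0.4]
E = [20.0, 70.0]

print(f"Voigt (upper) bound: {voigt_bound(f, E):.1f} GPa")
print(f"Reuss (lower) bound: {reuss_bound(f, E):.1f} GPa")
```

An effective modulus obtained from a detailed meso-scale simulation should lie between these two bounds, which makes them a convenient sanity check for multiscale models.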

Applications

Simulation of a phase change material (1D)

To cover this broad spectrum, we also need programs and development environments that can adapt to the challenges at hand. We therefore work with software solutions tailored to each question.

For example, commercial software such as Abaqus can be used for direct proofs and verifications. Open source software, on the other hand, can offer greater freedom in research.

In addition, environments such as Matlab or Python help with evaluation and prototyping in research.
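As an example of such prototyping, the following Python sketch solves a 1D phase change problem of the kind shown in the figure above, using the apparent-heat-capacity method with an explicit finite difference scheme. All material parameters are illustrative placeholders, not the values used in our simulations.

```python
import numpy as np

# 1D melting problem, apparent-heat-capacity method (illustrative parameters).
L = 0.1                  # domain length [m]
nx = 101
dx = L / (nx - 1)
k = 0.5                  # thermal conductivity [W/(m K)]
rho = 800.0              # density [kg/m^3]
cp = 2000.0              # sensible heat capacity [J/(kg K)]
latent = 2.0e5           # latent heat of fusion [J/kg]
T_m, dT = 25.0, 1.0      # melting point and half-width of the mushy zone [K]

T = np.full(nx, 20.0)    # initial temperature [degC]
T[0] = 40.0              # hot boundary

def c_apparent(T):
    """Smear the latent heat over the interval [T_m - dT, T_m + dT]."""
    c = np.full_like(T, cp)
    c[np.abs(T - T_m) < dT] += latent / (2.0 * dT)
    return c

dt = 0.4 * rho * cp * dx**2 / k          # conservative explicit time step
for _ in range(20000):
    c = c_apparent(T)
    T[1:-1] += dt * k / (rho * c[1:-1] * dx**2) * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[0], T[-1] = 40.0, 20.0             # Dirichlet boundary conditions

front = dx * int(np.argmin(np.abs(T - T_m)))
print(f"Approximate position of the melt front: {front:.3f} m")
```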

To study several interacting influences or to apply the phase-field method, multiphysics solutions such as the MOOSE Framework or OpenFOAM are also used.

Experience with various numerical applications:

Applications we have already used, in some cases across several areas, include:

  • Abaqus
  • Hymostruc
  • MOOSE Framework
  • OpenFOAM
  • PHREEQC
  • LAMMPS Molecular Dynamics Simulator
  • Matlab

This list is continuously expanded to find the best solutions for our research and investigations.

Computational resources

Depending on the problem, different computing power is required.

While the computing power of standard off-the-shelf computers may be sufficient for static, linear calculations, higher resolutions or nonlinearities can quickly exceed these resources.
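A rough back-of-envelope estimate illustrates the point. Assuming a sparse finite element system with about 80 non-zero entries per matrix row and a direct solver whose factorization needs roughly twenty times the matrix storage (both numbers are assumptions for illustration, not benchmarks), memory demand grows quickly with the number of degrees of freedom:

```python
# Rough, illustrative estimate of memory demand vs. problem size.
def sparse_matrix_memory_gb(n_dof, nnz_per_row=80, bytes_per_entry=8 + 4):
    """CSR-like storage: value (8 B) + column index (4 B) per non-zero."""
    return n_dof * nnz_per_row * bytes_per_entry / 1e9

for n_dof in (100_000, 1_000_000, 10_000_000):
    matrix = sparse_matrix_memory_gb(n_dof)
    factor = 20 * matrix      # assumed fill-in of a direct factorization
    print(f"{n_dof:>10} DOF: matrix ~{matrix:5.1f} GB, direct solve ~{factor:6.1f} GB")
```

With a few million degrees of freedom, the factorization alone can exceed the memory of a standard workstation.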

To be prepared for this, capacities are available at different levels:

In addition to the normal workstation computers, computers optimized for numerical simulations are also available.

These are computers with:

  • 8-core processors
  • Up to 128 GB RAM for non-linear calculations with a large number of degrees of freedom
  • NVIDIA graphics cards for GPU acceleration with CUDA

High Performance Computing at WiB

Top: Supermicro GPU Server; Bottom: HPE ProLiant

Our institute owns a High Performance Computer (HPC). The system is built around a dual-socket Supermicro GPU node with AMD EPYC™ Milan CPUs and 512 GB of memory. This state-of-the-art technology enables us to perform complex computations with high speed and accuracy.

The AMD EPYC™ processors with a high core count are designed specifically for HPC workloads, delivering superior performance, scalability, and energy efficiency.

Additional graphics cards (GPUs) accelerate compute-intensive tasks such as machine learning, deep learning, and scientific simulations. The combination of these powerful CPUs and GPUs enables faster data processing, leading to quicker time-to-insight and better decision making.

Furthermore, the two nodes together provide 768 GB of DDR4 RAM, allowing for efficient handling of massive amounts of data as well as the computation of large systems with a large number of nodes. This ensures that even the most demanding applications run smoothly without bottlenecks or slowdowns.

With this advanced hardware configuration, our HPC system is capable of performing a wide range of calculations, enabling our team to tackle some of the most challenging problems.

These capabilities allow us to

  • simulate complex systems such as heterogeneous materials with millions of nodes
  • capture nonlinear material behavior where high precision is needed
  • run multi-physics simulations involving the complex coupling of several sets of partial differential equations (PDEs), utilizing the parallel processing capabilities of the CPU and GPU architectures
  • analyze vast amounts of data and develop cutting-edge AI models
  • and more.

This leads to more detailed and realistic simulations of complex systems.

The first node in our HPC system is specifically engineered for tackling large-scale computational workloads and features top-of-the-line components designed to maximize performance and reliability.

Currently the node consists of:

  • Supermicro GPU Node
  • AMD EPYC™ Milan CPUs with 128 Cores
  • 512 GB DDR4 ECC Memory
  • up to 8 double width GPUs
  • high bandwidth storage
CPU cooler of Supermicro GPU node

The second node is built for data processing, for connecting our laboratories, and for virtual desktops that offload medium workloads.

The current configuration consists of:

  • HPE ProLiant Base
  • AMD EPYC™ CPUs 32 Cores (dual socket)
  • 256 GB DDR4 ECC Memory
  • up to 3 double width GPUs
  • high bandwidth storage
Top: Supermicro GPU Server; Bottom: HPE ProLiant

High Performance Computing

If this computing power is not sufficient, we already have experience with the Lichtenberg high-performance computer. There, parallel calculations with a large number of processors can be run using the Message Passing Interface (MPI), so that even very complex problems can be solved.
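A minimal sketch of this pattern with mpi4py is shown below; it only illustrates how each rank processes its own share of the work and how the partial results are combined. The numbers are illustrative.

```python
from mpi4py import MPI
import numpy as np

# Each rank integrates its own slice of sin(pi*x) on [0, 1] with the midpoint
# rule; the partial sums are combined with an MPI reduction.
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

n_total = 10_000_000                  # total sample points (assumed divisible by size)
n_local = n_total // size
x = (rank * n_local + np.arange(n_local) + 0.5) / n_total

local_sum = np.sin(np.pi * x).sum() / n_total
result = comm.allreduce(local_sum, op=MPI.SUM)

if rank == 0:
    print(f"Integral with {size} ranks: {result:.6f} (exact: {2 / np.pi:.6f})")
```

Such a script would be launched on the cluster with, for example, `mpirun -n 64 python integrate.py` (the file name is hypothetical).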

This means that the computing power at the WiB is not a limiting factor.

Lichtenberg high-performance computer at the TU Darmstadt

The Lichtenberg High Performance Computer, part of the Hessian Competence Center for High Performance Computing, provides users with large computing capacities of up to 3.148 PFlop/s and 257 TBytes of RAM.

More info can be found here:

To the homepage of the Lichtenberg Cluster

Presentation of some research areas

The application of the PF model.

Concrete structures are affected by various environmental factors, such as changes in temperature and humidity, external loads, etc., which cause micro-cracks or damage of different shapes and sizes. This greatly reduces the durability of concrete. The autogenous self-healing mechanism is one way to effectively extend the lifespan of concrete. This work aims to investigate the autogenous self-healing of small cracks with a width of less than 0.2 mm in concrete using numerical methods. Autogenous self-healing consists of the dissolution of calcium hydroxide and the precipitation reaction of calcium carbonate. A phase-field model is developed by incorporating chemical reaction kinetics, diffusion and thermodynamics, and is implemented numerically with the finite element method (FEM) within the MOOSE framework. To investigate the evolution of the chemistry of the system, a 1D reaction-diffusion model is set up with the geochemical calculation code PHREEQC. In this model, carbon dioxide is added to water at a logarithmic partial pressure of -3.4 in order to model a situation with an unlimited supply of calcium hydroxide to the water phase. The PHREEQC simulation results were used to parameterize the phase-field model. We further investigate the microscopic crack morphology by performing multiple simulations with parameters informed by the experimental tests.
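As a strongly simplified illustration of the reaction-diffusion part (not the actual PHREEQC/MOOSE model), the Python sketch below lets a dissolved species diffuse into a crack and precipitate with first-order kinetics wherever the concentration exceeds saturation; all parameters are placeholder values.

```python
import numpy as np

# Simplified 1D reaction-diffusion with precipitation (illustrative values).
nx, dx, dt = 200, 1e-6, 1e-4     # grid size, spacing [m], time step [s]
D = 1e-9                          # diffusion coefficient [m^2/s]
k_p = 5.0                         # precipitation rate constant [1/s]
c_sat = 20.0                      # saturation concentration [mol/m^3]

c = np.zeros(nx)                  # dissolved concentration along the crack
c[0] = 22.0                       # supersaturated reservoir at the crack mouth
precip = np.zeros(nx)             # accumulated precipitate

for _ in range(200_000):
    lap = (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx**2
    lap[0] = lap[-1] = 0.0                          # boundaries handled below
    rate = k_p * np.clip(c - c_sat, 0.0, None)      # precipitate only above saturation
    c += dt * (D * lap - rate)
    precip += dt * rate
    c[0] = 22.0                                     # fixed reservoir concentration
    c[-1] = c[-2]                                   # zero-flux far end

print("Precipitate near the crack mouth:", precip[:5].round(3))
```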

The main objective of this project is to use the kinetic Monte Carlo (KMC) approach to compute the dissolution time of Portlandite and of initial phases of cementitious materials such as Alite and Belite.
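The following Python sketch shows the basic kinetic Monte Carlo idea on a toy 2D lattice, where the dissolution rate of each surface site decreases with its number of remaining neighbours. The rates and lattice are made up for illustration and are not a calibrated Portlandite model.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 20
occupied = np.ones((n, n), dtype=bool)   # toy crystal: all lattice sites filled
k0, E_bond = 1.0, 1.5                    # base rate and bond penalty (arbitrary units)

def neighbours(i, j):
    return [((i + di) % n, (j + dj) % n) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

t = 0.0
while occupied.any():
    sites = np.argwhere(occupied)
    # Rate of each occupied site: k0 * exp(-E_bond * number of occupied neighbours)
    rates = np.array([k0 * np.exp(-E_bond * sum(occupied[x] for x in neighbours(i, j)))
                      for i, j in sites])
    total = rates.sum()
    t += rng.exponential(1.0 / total)                 # advance time (Gillespie step)
    idx = rng.choice(len(sites), p=rates / total)     # pick one dissolution event
    occupied[tuple(sites[idx])] = False               # remove that site

print(f"Toy crystal of {n * n} sites fully dissolved after t = {t:.2f} (arbitrary units)")
```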

In the modeling of contact surfaces, the material behavior at large stiffness jumps is investigated. The main focus is on the contribution to the stiffness and on the dynamic behavior of the composite.

This should lead to a more efficient use of materials.

The intention of this project is to use multiscale modelling to compute the mechanical properties of the geopolymer microstructure between the nano and micro levels. To this end, we simulate a coarse-grained model of geopolymer/CNTs based on an available LAMMPS (Large-Scale Atomic/Molecular Massively Parallel Simulator) trajectory file, which has been computed with a molecular dynamics simulation approach. As a next step, the mechanical properties are computed through Monte Carlo approaches, which can also be performed with the LAMMPS package.
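Since the trajectory file is plain text, it can be post-processed directly in Python. The sketch below is a minimal reader for a LAMMPS dump written with unscaled `id type x y z` columns; the exact columns depend on how the `dump` command was set up, and the file name used in the example is hypothetical.

```python
import numpy as np

def read_lammps_dump(path):
    """Minimal reader for a text LAMMPS dump (`dump ... custom ... id type x y z`).
    Yields (timestep, coordinates) for every snapshot in the file."""
    with open(path) as f:
        lines = iter(f)
        n_atoms = 0
        for line in lines:
            if line.startswith("ITEM: TIMESTEP"):
                step = int(next(lines))
            elif line.startswith("ITEM: NUMBER OF ATOMS"):
                n_atoms = int(next(lines))
            elif line.startswith("ITEM: ATOMS"):
                cols = line.split()[2:]                         # column names
                ix, iy, iz = (cols.index(c) for c in ("x", "y", "z"))
                rows = [next(lines).split() for _ in range(n_atoms)]
                xyz = np.array([[r[ix], r[iy], r[iz]] for r in rows], dtype=float)
                yield step, xyz

# Hypothetical usage:
# for step, xyz in read_lammps_dump("geopolymer_cnt.dump"):
#     print(step, xyz.mean(axis=0))    # e.g. centre of the coarse-grained beads
```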

Schematic representation of solute concentration of soluble minerals in situ and adaptive mesh refinement in the simulation for mineral dissolution

Modelling of a mineral dissolution front propagation is of interest in a wide range of scientific and engineering fields. The dissolution of minerals often involves complex physico-chemical processes at the solid–liquid interface (at nano-scale), which at the micro-to-meso-scale can be simplified to the problem of continuously moving boundaries. In this work, we studied the diffusion-controlled congruent dissolution of minerals from a meso-scale phase transition perspective. The dynamic evolution of the solid–liquid interface, during the dissolution process, is numerically simulated by employing the Finite Element Method (FEM) and using the phase-field (PF) approach, the latter implemented in the open-source Multiphysics Object Oriented Simulation Environment (MOOSE). The parameterization of the PF numerical approach is discussed in detail and validated against the experimental results for a congruent dissolution case of NaCl (taken from literature) as well as on analytical models for simple geometries. In addition, the effect of the shape of a dissolving mineral particle was analysed, thus demonstrating that the PF approach is suitable for simulating the mesoscopic morphological evolution of arbitrary geometries. Finally, the comparison of the PF method with experimental results demonstrated the importance of the dissolution rate mechanisms, which can be controlled by the interface reaction rate or by the diffusive transport mechanism.
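One typical analytical reference for simple geometries is the quasi-steady, diffusion-controlled shrinkage of a spherical particle, for which dR/dt = -D (c_sat - c_inf) / (c_solid R), i.e. R(t)^2 = R0^2 - 2 D (c_sat - c_inf) t / c_solid. The Python sketch below evaluates this solution with order-of-magnitude parameter values for a salt-like mineral; they are illustrative and not taken from the validation case of the work described above.

```python
import numpy as np

# Quasi-steady, diffusion-controlled dissolution of a sphere (illustrative values).
D = 1.0e-9         # diffusion coefficient [m^2/s]
c_sat = 5.4e3      # saturation concentration at the interface [mol/m^3]
c_inf = 0.0        # far-field concentration [mol/m^3]
c_solid = 3.7e4    # molar density of the solid [mol/m^3]
R0 = 50e-6         # initial particle radius [m]

t_final = c_solid * R0**2 / (2.0 * D * (c_sat - c_inf))   # complete dissolution time
t = np.linspace(0.0, t_final, 6)
R = np.sqrt(np.clip(R0**2 - 2.0 * D * (c_sat - c_inf) / c_solid * t, 0.0, None))

for ti, Ri in zip(t, R):
    print(f"t = {ti:6.2f} s   R = {Ri * 1e6:6.2f} um")
```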

Simulations done at the WiB

Here you can see some examples of past and current simulations at our institute: