ASCR: Exascale to Burst Bubbles Blocking Carbon Capture

Source: ASCR (Advanced Scientific Computing Research), DOE Office of Science

A visualization of CO2 bubbles forming in a chemical looping reactor. As the bubbles rise, they create large regions of low density that can affect the reactor’s efficiency. (Image: Los Alamos National Laboratory)

Bubbles could block a promising technology that would separate carbon dioxide from industrial emissions and capture the greenhouse gas before it contributes to climate change.

A team of researchers supported by the Department of Energy’s Exascale Computing Project (ECP) aims to break that barrier with code that captures the behavior of the rising bubbles and provides insights into how to suppress them.

Chemical looping reactors (CLRs) burn fuels such as methane using oxygen supplied by metal oxide particles rather than by air. The reaction produces water vapor and carbon dioxide, which are easily separated to yield a pure CO2 stream for sequestration or industrial use. Standard post-combustion separation, by contrast, must remove carbon dioxide from a multi-gas mixture.

The reduced metal particles then enter an air reactor, where they are re-oxidized, generating heat that can be used to produce electricity. The replenished carrier particles return to the fuel reactor, completing the cycle.
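To make the loop concrete, one commonly cited pair of reactions – assuming methane fuel and nickel oxide as a representative oxygen carrier, which the article does not specify – looks like this:

Fuel reactor: CH4 + 4 NiO → CO2 + 2 H2O + 4 Ni
Air reactor:  2 Ni + O2 → 2 NiO

Condensing out the water vapor then leaves a nearly pure CO2 stream, which is what makes the capture step comparatively straightforward.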

However, the complex flow behavior in the fuel reactor typically results in bubbling, reduced performance, and unconverted fuel contaminating the exhaust gases. The multiphase processes of CLRs – involving solids, gases and liquids – are also difficult to scale up to industrial use.

“We want the system to perform well, and we can work towards that goal by maximizing the contact between the fuel and the solid oxidizer,” says Jordan Musser, a physics research scientist at the National Energy Technology Laboratory (NETL). “To do this, we have to prevent the formation of large bubbles that impede the mixing of gas and solids. Another concern is understanding how we can scale up the units we develop in the lab to achieve economies of scale and make these reactors useful to society.”


Musser is principal investigator for the ECP project MFIX-Exa, Performance Prediction for Multiphase Energy Conversion Device. The initiative, which involves collaborators from Lawrence Berkeley National Laboratory and the University of Colorado Boulder, pairs visualization and analysis capabilities developed within ECP with MFIX-Exa, a particle-based multiphase computational fluid dynamics (CFD) code that can run at exascale – a quintillion calculations per second. Resources from the Oak Ridge Leadership Computing Facility contributed to the work.

“MFIX-Exa is open-source code specifically designed for modeling reacting gas-solid flows in complex reactor geometries,” says Musser.

At last fall’s SC22 conference in Dallas, the team presented a video produced by Alexandra Stewart, an intern at Los Alamos National Laboratory (LANL), demonstrating progress in coupling MFIX-Exa with visualization tools. It showed that the project can effectively model and intuitively display bubbling, an important milestone in optimizing reactor design.

Building a model that accurately simulates the CLR environment begins with understanding how its two main components – the fuel reactor and the air reactor – behave. Tracking both introduces a computational load-balancing problem that can affect the efficiency of the modeling software.

MFIX was developed in the 1980s to run on standard computer processors. It’s accurate but slow. The team rewrote MFIX’s entire particle solver, converting the old Fortran code into a massively parallel program. MFIX-Exa runs on graphics processing units (GPUs), the accelerators that boost the speed of exascale machines. It uses multiple strategies to distribute particle calculations and fluid calculations across GPUs so that the hardware stays efficiently and evenly loaded.
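The article doesn’t spell out how that work is split, but the basic idea of weighted load balancing can be sketched in a few lines of C++: give each grid box a cost proportional to its particle count, then hand boxes out greedily so every rank (and its GPU) ends up with roughly the same total work. This is a generic illustration, not MFIX-Exa’s actual algorithm, and all the names in it are invented.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Toy weighted load balancer: assign grid boxes, weighted by particle
// count, to ranks/GPUs so that the total work per rank stays roughly even.
// A generic greedy sketch, not the strategy MFIX-Exa actually uses.
std::vector<int> balance(const std::vector<long>& particles_per_box, int nranks)
{
    std::vector<int> owner(particles_per_box.size());
    std::vector<long> load(nranks, 0);

    // Visit boxes from heaviest to lightest.
    std::vector<std::size_t> order(particles_per_box.size());
    for (std::size_t i = 0; i < order.size(); ++i) order[i] = i;
    std::sort(order.begin(), order.end(), [&](std::size_t a, std::size_t b) {
        return particles_per_box[a] > particles_per_box[b];
    });

    // Give each box to whichever rank currently has the least work.
    for (std::size_t i : order) {
        int r = static_cast<int>(std::min_element(load.begin(), load.end()) - load.begin());
        owner[i] = r;
        load[r] += particles_per_box[i];
    }
    return owner;
}

int main()
{
    std::vector<long> particles = {120000, 300, 95000, 87000, 400, 110000};
    std::vector<int> owner = balance(particles, 3);
    for (std::size_t i = 0; i < owner.size(); ++i)
        std::printf("box %zu -> rank %d\n", i, owner[i]);
}
```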


However, it was difficult to create GPU-enabled code to simulate reactive gas-solid flows. The team leveraged the open-source software framework AMReX, which is designed to enable massively parallel Adaptive Mesh Refinement (AMR) exascale applications. AMR focuses computation on the most important parts of a simulation, capturing detail there while conserving computing power in other parts of the domain. Developed by teams at Berkeley Lab, the National Renewable Energy Laboratory, and Argonne National Laboratory, AMReX supports both commodity processors and GPUs, and multiple programming models and languages.
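For readers curious what AMReX code looks like, a minimal skeleton is sketched below: define a domain, chop it into boxes, map the boxes to MPI ranks (and hence GPUs), and allocate field data on the result. The grid sizes and variable names are placeholders rather than values from the MFIX-Exa reactor runs, and the sketch assumes a standard 3D AMReX build.

```cpp
#include <AMReX.H>
#include <AMReX_MultiFab.H>
#include <AMReX_Print.H>

// Minimal AMReX skeleton (assumes a 3D build): define a domain, split it
// into boxes, distribute the boxes across ranks, and allocate a field.
// Sizes and names here are illustrative only.
int main(int argc, char* argv[])
{
    amrex::Initialize(argc, argv);
    {
        // A 128^3-cell domain.
        amrex::Box domain(amrex::IntVect(0, 0, 0), amrex::IntVect(127, 127, 127));

        // Chop the domain into boxes of at most 32^3 cells each.
        amrex::BoxArray grids(domain);
        grids.maxSize(32);

        // Map each box to an MPI rank (and, on a GPU machine, to a GPU).
        amrex::DistributionMapping dmap(grids);

        // One cell-centered component, no ghost cells.
        amrex::MultiFab density(grids, dmap, 1, 0);
        density.setVal(0.0);

        amrex::Print() << "number of boxes: " << grids.size() << "\n";
    }
    amrex::Finalize();
}
```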

Musser and colleagues used MFIX-Exa to model the fuel reactor in NETL’s 50 kW CLR, detecting bubbles as the simulation ran and tracking characteristics such as their volume and velocity. To develop an application-specific scientific workflow, they collaborated with the software teams behind two other ECP efforts, ALPINE and Cinema.

The ALPINE project, led by LANL’s James Ahrens, is developing infrastructure and algorithms for in situ visualization – analysis that runs concurrently with MFIX-Exa to detect bubbles and other features. The MFIX-Exa workflow also leverages Catalyst, part of Kitware Inc.’s suite of visualization tools, as a library for accessing and running an in situ statistical feature detection algorithm.

Cinema, a novel visualization tool, lets researchers interactively explore and analyze large amounts of post-simulation scientific data. Ahrens developed Cinema; the effort is now led by LANL’s David Rogers.

The statistical feature detection algorithm, run through Catalyst, downsamples the MFIX-Exa simulation data in situ by a factor of more than 300 into a set of fields that assign a single number to each point in space. Developed by former LANL computer scientist Soumya Dutta, the algorithm measures how similar a small region of the simulation is to a target feature – a bubble, or cavity with low particle density. The resulting bubble similarity field gives the probability that a bubble is present in a given region of the simulation. The algorithm can also return other useful fields, such as particle velocity.
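Dutta’s method itself isn’t described in the article, but the general idea of collapsing raw particle data into a per-cell “how bubble-like is this region” number can be sketched as follows. This toy version simply bins particles onto a coarse grid and scores each cell by how far its density falls below the bed average; the real statistical feature detection algorithm is considerably more sophisticated.

```cpp
#include <algorithm>
#include <vector>

// Toy "bubble similarity" field: coarse-grain particle positions onto an
// n x n x n grid and map low-density cells to values near 1 (bubble-like)
// and dense cells to values near 0. Illustrative only; this is not the
// statistical feature detection algorithm used in the MFIX-Exa workflow.
struct Particle { double x, y, z; };

std::vector<double> bubble_similarity(const std::vector<Particle>& particles,
                                      int n, double box_length)
{
    std::vector<double> count(static_cast<std::size_t>(n) * n * n, 0.0);
    const double h = box_length / n;  // coarse cell width

    // Bin particles into coarse cells.
    for (const Particle& p : particles) {
        const int i = static_cast<int>(p.x / h);
        const int j = static_cast<int>(p.y / h);
        const int k = static_cast<int>(p.z / h);
        if (i >= 0 && i < n && j >= 0 && j < n && k >= 0 && k < n)
            count[(static_cast<std::size_t>(k) * n + j) * n + i] += 1.0;
    }

    // Average particle count per cell across the bed.
    double mean = 0.0;
    for (double c : count) mean += c;
    mean /= static_cast<double>(count.size());

    // Score: 1 for an empty cell, falling to 0 at or above average density.
    std::vector<double> field(count.size());
    for (std::size_t c = 0; c < count.size(); ++c)
        field[c] = (mean > 0.0) ? std::max(0.0, 1.0 - count[c] / mean) : 0.0;
    return field;
}
```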


Next, the workflow uses the simulation output fields to create images showing how the bubbles evolve over time. These are linked into a Cinema database and explored with a dedicated viewer, letting subject-matter experts perform post-simulation analysis focused on specific parameters and then visualize bubble dynamics along those dimensions.
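At its core, a Cinema database is a directory of pre-rendered images indexed by a CSV file whose columns hold parameter values plus a FILE column of image paths, which the viewer lets users browse. A workflow could emit such an index roughly as below; the parameter names and file layout are made up for illustration and are not those of the actual MFIX-Exa/ALPINE pipeline.

```cpp
#include <filesystem>
#include <fstream>
#include <string>

// Write a minimal Cinema-style index: a CSV mapping parameter values
// (here, a time step and a camera angle) to pre-rendered image files.
// Parameter names and directory layout are illustrative only.
int main()
{
    std::filesystem::create_directories("bubbles.cdb/image");
    std::ofstream csv("bubbles.cdb/data.csv");

    csv << "time,phi,FILE\n";
    for (int step = 0; step <= 100; step += 10) {
        for (int phi = 0; phi < 360; phi += 90) {
            const std::string file =
                "image/t" + std::to_string(step) + "_phi" + std::to_string(phi) + ".png";
            // A real workflow would render and save the matching image here.
            csv << step << "," << phi << "," << file << "\n";
        }
    }
}
```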

“If you know what visualizations you need during the simulation, you can just push out all those images and not have to render them later,” says Terece Turton, research associate at LANL and integration lead for ALPINE. “When you look at a simulation with billions of particles, it becomes, to a certain extent, intractable to save all of that to disk and post-process it. Also, a lot of the data isn’t particularly useful.”

With Cinema’s real-time and post-simulation visualizations, CLR researchers can observe phenomena such as how fluid velocity changes as bubbles rise, drift laterally, merge and split, creating large low-density regions in the reactor’s fluidized beds that can reduce its efficiency. Finally, with the output of MFIX-Exa, the downsampled bubble fields, and Cinema, they can propose design changes for scaled-up reactor models and assess how those changes affect bubble formation and other aspects of reactor behavior.

“The instant visualization is well worth the time we invest,” says Musser. “We can run a lot of diagnostics on our code just by looking at the flow patterns. It saves a lot of work for our team. It’s phenomenal.”