From 03dcf8fdde169a3d888697c6cd36f9915aa5f04a Mon Sep 17 00:00:00 2001 From: The Open Journals editorial robot <89919391+editorialbot@users.noreply.github.com> Date: Thu, 10 Oct 2024 20:01:29 +0100 Subject: [PATCH] Creating 10.21105.joss.06973.jats --- .../paper.jats/10.21105.joss.06973.jats | 1113 +++++++++++++++++ 1 file changed, 1113 insertions(+) create mode 100644 joss.06973/paper.jats/10.21105.joss.06973.jats diff --git a/joss.06973/paper.jats/10.21105.joss.06973.jats b/joss.06973/paper.jats/10.21105.joss.06973.jats new file mode 100644 index 0000000000..c4d77c93aa --- /dev/null +++ b/joss.06973/paper.jats/10.21105.joss.06973.jats @@ -0,0 +1,1113 @@ + + +
+ + + + +Journal of Open Source Software +JOSS + +2475-9066 + +Open Journals + + + +6973 +10.21105/joss.06973 + +GEOS: A performance portable multi-physics simulation +framework for subsurface applications + + + +https://orcid.org/0000-0002-2536-7867 + +Settgast +Randolph R. + + +* + + +https://orcid.org/0009-0004-0785-5084 + +Aronson +Ryan M. + + + + + + +Besset +Julien R. + + + + +https://orcid.org/0000-0003-2016-5403 + +Borio +Andrea + + + + +https://orcid.org/0000-0003-2648-0586 + +Bui +Quan M. + + + + + +Byer +Thomas J. + + + + +https://orcid.org/0000-0001-6816-6769 + +Castelletto +Nicola + + + + +https://orcid.org/0009-0006-3742-1425 + +Citrain +Aurélien + + + + +https://orcid.org/0009-0008-7108-9651 + +Corbett +Benjamin C. + + + + + +Corbett +James + + + + +https://orcid.org/0000-0002-6439-9263 + +Cordier +Philippe + + + + +https://orcid.org/0000-0001-7458-6401 + +Cremon +Matthias A. + + + + +https://orcid.org/0000-0002-5366-6418 + +Crook +Cameron M. + + + + +https://orcid.org/0000-0002-6024-861X + +Cusini +Matteo + + + + +https://orcid.org/0000-0001-7273-4458 + +Fei +Fan + + + + +https://orcid.org/0000-0003-0683-1203 + +Frambati +Stefano + + + + +https://orcid.org/0000-0002-8833-9425 + +Franc +Jacques + + + + +https://orcid.org/0000-0003-4395-5125 + +Franceschini +Andrea + + + + +https://orcid.org/0000-0001-8150-1090 + +Frigo +Matteo + + + + +https://orcid.org/0000-0002-7408-3350 + +Fu +Pengcheng + + + + +https://orcid.org/0000-0002-6103-4605 + +Gazzola +Thomas + + + + +https://orcid.org/0000-0002-1747-2018 + +Gross +Herve + + + + +https://orcid.org/0000-0001-8229-963X + +Hamon +Francois + + + + +https://orcid.org/0009-0002-8549-7644 + +Han +Brian M. + + + + +https://orcid.org/0000-0002-4543-8618 + +Hao +Yue + + + + + +Hasanzade +Rasim + + + + + +https://orcid.org/0000-0002-0399-0092 + +Homel +Michael + + + + +https://orcid.org/0000-0002-5380-2563 + +Huang +Jian + + + + +https://orcid.org/0000-0001-6658-8941 + +Jin +Tao + + + + +https://orcid.org/0000-0003-4110-7472 + +Ju +Isaac + + + + + +Kachuma +Dickson + + + + +https://orcid.org/0000-0001-5707-165X + +Karimi-Fard +Mohammad + + + + + +Kim +Taeho + + + + +https://orcid.org/0000-0001-9044-1827 + +Klevtsov +Sergey + + + + + +Lapene +Alexandre + + + + +https://orcid.org/0000-0002-3389-523X + +Magri +Victor A. P. + + + + +https://orcid.org/0000-0002-0329-3385 + +Mazuyer +Antoine + + + + + + +N’diaye +Mamadou + + + + +https://orcid.org/0000-0002-6111-6205 + +Osei-Kuffuor +Daniel + + + + + +Povolny +Stefan + + + + +https://orcid.org/0000-0002-5821-9158 + +Ren +Guotong + + + + + +Semnani +Shabnam J. + + + + +https://orcid.org/0000-0003-3550-0657 + +Sherman +Chris S. + + + + + +Rey +Melvin + + + + +https://orcid.org/0000-0002-3084-6635 + +Tchelepi +Hamdi A. + + + + +https://orcid.org/0009-0001-3960-6064 + +Tobin +William R. + + + + +https://orcid.org/0000-0003-4862-4288 + +Tomin +Pavel + + + + +https://orcid.org/0000-0002-8025-2616 + +Untereiner +Lionel + + + + +https://orcid.org/0000-0001-8001-5517 + +Vargas +Arturo + + + + + +Waziri +Sohail + + + + + +https://orcid.org/0000-0002-6055-4553 + +Wen +Xianhuan + + + + +https://orcid.org/0000-0003-3491-142X + +White +Joshua A. 
+ + + + +https://orcid.org/0000-0002-9575-3886 + +Wu +Hui + + + + + +Lawrence Livermore National Laboratory, USA + + + + +TotalEnergies E&P Research & Technology, +USA + + + + +Stanford University, USA + + + + +Chevron Technical Center, USA + + + + +Politecnico di Torino, Italy + + + + +University of California San Diego + + + + +Inria, Universite de Pau et des Pays de +l’Adour + + + + +Independent + + + + +* E-mail: + + +28 +5 +2024 + +9 +102 +6973 + +Authors of papers retain copyright and release the +work under a Creative Commons Attribution 4.0 International License (CC +BY 4.0) +2022 +The article authors + +Authors of papers retain copyright and release the work under +a Creative Commons Attribution 4.0 International License (CC BY +4.0) + + + +reservoir simulations +computational mechanics +multiphase flow +C++ + + + + + + Summary +

GEOS is a simulation framework focused on solving tightly coupled multi-physics problems, with an initial emphasis on subsurface reservoir applications. Currently, GEOS supports capabilities for studying carbon sequestration, geothermal energy, hydrogen storage, and related subsurface applications. The aspect of GEOS that differentiates it from existing reservoir simulators is its ability to simulate tightly coupled compositional flow, poromechanics, fault slip, fracture propagation, and thermal effects within a single framework. Extensive documentation is available on the GEOS documentation pages (GEOS Documentation, 2024). Note that GEOS, as presented here, is a complete rewrite of the earlier code of the same name described in (Settgast et al., 2017).

+
+ + Statement of need +

The threat of climate change has resulted in an increased focus on mitigating carbon emissions into the atmosphere. Carbon Capture and Storage (CCS) of CO2 in subsurface reservoirs and saline aquifers is an important component of the strategy to meet global climate goals. Given the 2050 net-zero emissions goals, the CO2 storage capacities required to offset emissions are orders of magnitude greater than current levels (Intergovernmental Panel on Climate Change IPCC, 2023). Evaluating reservoir performance and the containment risks associated with the injection of liquefied CO2 into the subsurface in a reproducible and transparent manner is an important consideration when assessing new storage sites. As an example of the complexities typical of carbon storage reservoirs, the 11th Society of Petroleum Engineers Comparative Solution Project (SPE11) (Nordbotten et al., 2024) provides a benchmark for evaluating the predictions of carbon storage simulators. The goal of GEOS is to provide the global community with an exascale-capable, open-source tool for simulating the complex coupled physics that occurs when liquefied CO2 is injected into a subsurface reservoir. To this end, GEOS is freely available and focused on the simulation of reservoir integrity through various failure mechanisms such as caprock failure, fault leakage, and wellbore failure. Open-source projects such as OPM (Rasmussen et al., 2021), OpenGeoSys (Naumov et al., 2024), DuMux (Koch et al., 2020), and DARTS (Voskov et al., 2024) are example efforts that share similar objectives. However, GEOS stands out in two key areas: explicit fault modeling coupled with flow and mechanical deformation, and a focus on performance portability on platforms ranging from workstations to exascale supercomputers.

+
+ + GEOS Components +

The core C++17 infrastructure provides common computer science capabilities typically required for solving differential equations using a spatially discretized method. The components of the infrastructure provided by GEOS include a data hierarchy, a discrete mesh data structure, a mesh-based MPI communications interface, degree-of-freedom management, I/O services, and a physics package interface.
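As a concrete illustration of what such a data hierarchy might look like, the following minimal C++ sketch composes named child nodes and named field arrays into a tree addressable by path; the Node type and the example paths are hypothetical and do not correspond to the actual GEOS data structures.

// Minimal sketch of a hierarchical data repository; Node and the example paths are
// hypothetical and not the actual GEOS data structures.
#include <map>
#include <memory>
#include <string>
#include <vector>

// A node owns named child nodes and named data arrays, so mesh objects, fields, and
// physics packages can all be registered and looked up by path.
struct Node
{
  std::map< std::string, std::unique_ptr< Node > > children;
  std::map< std::string, std::vector< double > > fields;

  Node & child( std::string const & name )
  {
    auto & c = children[ name ];
    if( !c )
    {
      c = std::make_unique< Node >();
    }
    return *c;
  }
};

int main()
{
  Node root;
  // Register a pressure field of 1000 cells on a hypothetical element region.
  root.child( "mesh" ).child( "reservoir" ).fields[ "pressure" ].assign( 1000, 0.0 );
  return 0;
}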

+

By design, GEOS is a generic multi-physics simulation platform. The physics package interface encapsulates the development of numerical methods applied to the governing equations relevant to a problem. When implementing a physics package for a set of coupled physics equations, each individual physics package is first developed as a stand-alone capability. The single-physics capabilities are then combined in a coupled physics package and solved through a flexible strategy ranging from solving the fully monolithic system to a split-operator approach.
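A minimal C++ sketch of this composition pattern is shown below. The class names (PhysicsPackage, SinglePhaseFlow, SolidMechanics, CoupledSolver) and the run-time strategy switch are illustrative assumptions and do not correspond to the actual GEOS interfaces.

// Illustrative sketch only: the names below are hypothetical placeholders and do not
// reflect the actual GEOS class names or interfaces.
#include <memory>
#include <vector>

struct PhysicsPackage
{
  virtual ~PhysicsPackage() = default;
  // Assemble this package's contribution to the residual/Jacobian for one time step.
  virtual void assembleSystem( double dt ) = 0;
  // Advance the single-physics state once the linear system has been solved.
  virtual void updateState() = 0;
};

struct SinglePhaseFlow final : PhysicsPackage
{
  void assembleSystem( double ) override { /* flow discretization */ }
  void updateState() override {}
};

struct SolidMechanics final : PhysicsPackage
{
  void assembleSystem( double ) override { /* mechanics discretization */ }
  void updateState() override {}
};

// A coupled package reuses the stand-alone packages and selects a coupling strategy.
struct CoupledSolver final : PhysicsPackage
{
  enum class Strategy { Monolithic, SplitOperator };

  explicit CoupledSolver( Strategy s ) : strategy( s ) {}

  void assembleSystem( double dt ) override
  {
    if( strategy == Strategy::Monolithic )
    {
      // Monolithic: all sub-packages contribute to one block system that is solved
      // at once (the linear solve itself is omitted in this sketch).
      for( auto & p : subPackages ) { p->assembleSystem( dt ); }
    }
    else
    {
      // Split operator: each sub-problem is assembled and its state updated in
      // sequence; the outer coupling iteration and linear solves are omitted here.
      for( auto & p : subPackages ) { p->assembleSystem( dt ); p->updateState(); }
    }
  }

  void updateState() override
  {
    for( auto & p : subPackages ) { p->updateState(); }
  }

  Strategy strategy;
  std::vector< std::unique_ptr< PhysicsPackage > > subPackages;
};

int main()
{
  CoupledSolver poromechanics( CoupledSolver::Strategy::Monolithic );
  poromechanics.subPackages.push_back( std::make_unique< SinglePhaseFlow >() );
  poromechanics.subPackages.push_back( std::make_unique< SolidMechanics >() );
  poromechanics.assembleSystem( 0.1 );   // one hypothetical time step of size 0.1
  poromechanics.updateState();
  return 0;
}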

+

To solve the linear systems that arise from the boundary value problem, GEOS maintains a generic linear algebra interface (LAI) capable of wrapping various linear algebra packages such as hypre (Falgout & Yang, 2002), PETSc (Balay et al., 2024), and Trilinos (Heroux et al., 2005). Currently, only the hypre interface is actively maintained. For multi-physics problems that require the solution of a coupled linear system, GEOS currently relies on the multigrid reduction preconditioning strategy available in hypre (Bui et al., 2020, 2021).
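The sketch below illustrates how such an abstraction layer could look, with a trivial serial diagonal backend standing in for a hypre, PETSc, or Trilinos wrapper; the class names (LAIVector, LAIMatrix, LAISolver, and the Serial* types) are hypothetical and are not the actual GEOS LAI types.

// Illustrative sketch of a generic linear algebra interface; not GEOS code.
#include <cassert>
#include <cstddef>
#include <vector>

// Abstract vector, matrix, and solver types that each backend implements.
struct LAIVector
{
  virtual ~LAIVector() = default;
  virtual void set( std::size_t i, double v ) = 0;
  virtual double get( std::size_t i ) const = 0;
};

struct LAIMatrix
{
  virtual ~LAIMatrix() = default;
  virtual void add( std::size_t row, std::size_t col, double v ) = 0;
};

struct LAISolver
{
  virtual ~LAISolver() = default;
  virtual void solve( LAIMatrix const & A, LAIVector & x, LAIVector const & b ) = 0;
};

// A trivial serial "backend" standing in for a hypre/PETSc/Trilinos wrapper:
// it stores only the diagonal and solves D x = b directly.
struct SerialVector final : LAIVector
{
  explicit SerialVector( std::size_t n ) : data( n, 0.0 ) {}
  void set( std::size_t i, double v ) override { data[ i ] = v; }
  double get( std::size_t i ) const override { return data[ i ]; }
  std::vector< double > data;
};

struct SerialDiagonalMatrix final : LAIMatrix
{
  explicit SerialDiagonalMatrix( std::size_t n ) : diag( n, 0.0 ) {}
  void add( std::size_t row, std::size_t col, double v ) override
  {
    if( row == col ) { diag[ row ] += v; }
  }
  std::vector< double > diag;
};

struct SerialDiagonalSolver final : LAISolver
{
  void solve( LAIMatrix const & A, LAIVector & x, LAIVector const & b ) override
  {
    auto const & D = static_cast< SerialDiagonalMatrix const & >( A );
    for( std::size_t i = 0; i < D.diag.size(); ++i )
    {
      x.set( i, b.get( i ) / D.diag[ i ] );
    }
  }
};

int main()
{
  SerialDiagonalMatrix A( 3 );
  SerialVector x( 3 ), b( 3 );
  for( std::size_t i = 0; i < 3; ++i )
  {
    A.add( i, i, 2.0 );
    b.set( i, 2.0 * static_cast< double >( i ) );
  }
  // Physics code would interact only with the abstract LAI types.
  SerialDiagonalSolver solver;
  solver.solve( A, x, b );
  assert( x.get( 2 ) == 2.0 );
  return 0;
}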

+

The performance portability strategy utilized by GEOS applies LLNL's suite of portability tools: RAJA (David A. Beckingsale et al., 2019), CHAI (CHAI, 2023), and Umpire (D. A. Beckingsale et al., 2020). The RAJA performance portability layer provides performance-portable kernel launching and wrappers for reductions, atomics, and local/shared memory to achieve performance on both CPU and GPU hardware. The combination of CHAI and Umpire provides memory motion management for platforms with heterogeneous memory spaces (i.e., host and device memory). Through this strategy, GEOS has been successfully run on platforms ranging from GPU-based exascale systems to CPU-based laptops with near-optimal performance.
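The short stand-alone example below illustrates the style of portable kernel that RAJA provides; the kernel body and policy choices are illustrative rather than taken from GEOS. In a GPU build, the sequential policies would be swapped for device policies (e.g., RAJA::cuda_exec or RAJA::hip_exec), with CHAI/Umpire moving the underlying arrays between host and device memory.

// Illustrative RAJA usage (not GEOS code): a portable element-wise kernel and a
// portable reduction, written once and retargeted by changing the execution policy.
#include "RAJA/RAJA.hpp"
#include <vector>

int main()
{
  constexpr int N = 1000;
  std::vector< double > a( N, 1.0 ), b( N, 2.0 );
  double * pa = a.data();
  double const * pb = b.data();

  // Sequential policies are used here; device policies (e.g., RAJA::cuda_exec< 256 >)
  // would be substituted for GPU execution, with CHAI/Umpire managing data motion.
  using exec_policy = RAJA::seq_exec;
  using reduce_policy = RAJA::seq_reduce;

  // Element-wise kernel: a[i] += b[i]
  RAJA::forall< exec_policy >( RAJA::RangeSegment( 0, N ), [=]( RAJA::Index_type i )
  {
    pa[ i ] += pb[ i ];
  } );

  // Reduction over the updated array.
  RAJA::ReduceSum< reduce_policy, double > sum( 0.0 );
  RAJA::forall< exec_policy >( RAJA::RangeSegment( 0, N ), [=]( RAJA::Index_type i )
  {
    sum += pa[ i ];
  } );

  return sum.get() == 3.0 * N ? 0 : 1;
}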

+

In addition to its C++ core, the GEOS project provides a Python3 + interface that allows for the integration of the simulation + capabilities into complex Python workflows involving components + unrelated to GEOS.

+
+ + Applications +

To date, GEOS has been used to simulate problems relevant to CO2 storage, enhanced geothermal systems, hydrogen storage, and both conventional and unconventional oil and gas extraction. These simulations often involve coupling between compositional multiphase flow and transport, poroelasticity, thermal transport, and interactions with faults and fractures.

+

As an example of a field case where GEOS has been applied, we + present a coupled compositional flow/mechanics simulation of + CO2 injection and storage at a large real-world storage + site. Figure + [RW_results]a + illustrates the computational mesh and Figure + [RW_results]b shows + results after 25 years of injection. Simulations such as this will + play a critical role in predicting the viability of potential + CO2 storage sites.

+ +

Real-world CO2 storage site: (a) discrete mesh, where transparency is used for the overburden region to reveal the complex faulted structure of the storage reservoir; (b) results of a compositional flow simulation after 25 years of CO2 injection. The CO2 plume is shown in white near the bottom of the well. Colors in the reservoir layer indicate changes in fluid pressure, and the colors in the overburden indicate vertical displacement resulting from the injection. Note that color scales have been removed intentionally.

+ +
+

As an example of the weak scalability of GEOS on an exascale-class system, we present two weak scaling studies on a simple wellbore geometry run on the Frontier supercomputer at Oak Ridge National Laboratory. Frontier comprises 9,472 Cray EX235a nodes, each containing a single AMD EPYC 7A53 CPU and four AMD MI250X GPUs (Atchley et al., 2023). Note that each MI250X comprises two Graphics Compute Dies (GCDs), with each GCD appearing as a GPU to the operating system. A more detailed discussion and instructions to reproduce the results are available in the Performance Benchmarks section of the GEOS documentation.

+

The weak scaling results for mechanics are presented in Figure [fig:Frontier_scaling]a and show nearly flat scaling of the GEOS processes (assembly/field synchronization) up to 32,768 GPUs (81.3×10^9 degrees of freedom). There is a moderate decrease in efficiency in the hypre preconditioner setup and solve, but given the complexity of those algorithms, this level of scaling efficiency is excellent. The weak scaling results for compositional flow, presented in Figure [fig:Frontier_scaling]b, show excellent scaling up to 2,048 GPUs.

+ +

Weak scaling results on ORNL/Frontier: average execution time per Newton iteration vs. the number of GPUs for a mechanics (a) and a compositional flow (b) simulation, respectively.

+ +
+
+ + Acknowledgements +

This work was performed under the auspices of the U.S. Department + of Energy by Lawrence Livermore National Laboratory under Contract + DE-AC52-07NA27344. LLNL release number LLNL-JRNL-864747.

+

This research was supported by the Exascale Computing Project + (ECP), Project Number: 17-SC-20-SC, a collaborative effort of two DOE + organizations - the Office of Science and the National Nuclear + Security Administration, responsible for the planning and preparation + of a capable exascale ecosystem, including software, applications, + hardware, advanced system engineering and early testbed platforms, to + support the nation’s exascale computing imperative.

+

Support was provided by TotalEnergies and Chevron through the FC-MAELSTROM project, a collaborative effort between Lawrence Livermore National Laboratory, TotalEnergies, Chevron, and Stanford University, aiming to develop an exascale-compatible, multiscale, research-oriented simulator for modeling fully coupled flow, transport, and geomechanics in geological formations.

+
+ + + + + + + + SettgastRandolph R. + FuPengcheng + WalshStuart D. C. + WhiteJoshua A. + AnnavarapuChandrasekhar + RyersonFrederick J. + + A fully coupled method for massively parallel simulation of hydraulically driven fractures in 3-dimensions + International Journal for Numerical and Analytical Methods in Geomechanics + 2017 + 41 + 5 + 10.1002/nag.2557 + 627 + 653 + + + + + + BeckingsaleDavid A. + BurmarkJason + HornungRich + JonesHolger + KillianWilliam + KunenAdam J. + PearceOlga + RobinsonPeter + RyujinBrian S. + ScoglandThomas R. W. + + RAJA: Portable performance for large-scale scientific applications + 2019 IEEE/ACM international workshop on performance, portability and productivity in HPC (P3HPC) + 2019 + 10.1109/P3HPC49587.2019.00012 + 71 + 81 + + + + + + CHAI + + CHAI + GitHub repository + GitHub + 2023 + https://github.com/LLNL/chai + + + + + + BeckingsaleD. A. + McFaddenM. J. + DahmJ. P. S. + PankajakshanR. + HornungR. D. + + Umpire: Application-focused management and coordination of complex hierarchical memory + IBM Journal of Research and Development + 2020 + 64 + 3/4 + 10.1147/JRD.2019.2954403 + 15:1 + 15:10 + + + + + + FalgoutR. D. + YangU. M. + + Hypre: A library of high performance preconditioners + Lecture notes in computer science + 2002 + 10.1007/3-540-47789-6_66 + 632 + 641 + + + + + + BalaySatish + AbhyankarShrirang + AdamsMark F. + BensonSteven + BrownJed + BrunePeter + BuschelmanKris + ConstantinescuEmil M. + DalcinLisandro + DenerAlp + EijkhoutVictor + FaibussowitschJacob + GroppWilliam D. + HaplaVáclav + IsaacTobin + JolivetPierre + KarpeevDmitry + KaushikDinesh + KnepleyMatthew G. + KongFande + KrugerScott + MayDave A. + McInnesLois Curfman + MillsRichard Tran + MitchellLawrence + MunsonTodd + RomanJose E. + RuppKarl + SananPatrick + SarichJason + SmithBarry F. + ZampiniStefano + ZhangHong + ZhangHong + ZhangJunchao + + PETSc Web page + 2024 + https://petsc.org/ + + + + + + HerouxM. A. + BartlettR. A. + HowleV. E. + HoekstraR. J. + HuJ. J. + KoldaT. G. + LehoucqR. B. + LongK. R. + PawlowskiR. P. + PhippsE. T. + SalingerA. G. + ThornquistH. K. + TuminaroR. S. + WillenbringJ. M. + WilliamsA. + StanleyK. S. + + An overview of the Trilinos project + ACM Trans. Math. Softw. + 2005 + 31 + 3 + 10.1145/1089014.1089021 + 397 + 423 + + + + + + BuiQuan M. + Osei-KuffuorDaniel + CastellettoNicola + WhiteJoshua A. + + A scalable multigrid reduction framework for multiphase poromechanics of heterogeneous media + SIAM Journal on Scientific Computing + 2020 + 42 + 2 + 10.1137/19M1256117 + B379 + B396 + + + + + + BuiQuan M. + HamonFrançois P. + CastellettoNicola + Osei-KuffuorDaniel + SettgastRandolph R. + WhiteJoshua A. + + Multigrid reduction preconditioning framework for coupled processes in porous and fractured media + Computer Methods in Applied Mechanics and Engineering + 2021 + 387 + 10.1016/j.cma.2021.114111 + 114111 + + + + + + + Intergovernmental Panel on Climate Change IPCC + + Climate Change 2022 - Mitigation of Climate Change: Working Group III Contribution to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change + Cambridge University Press + 2023 + 10.1017/9781009157926 + + + + + GEOS documentation + 2024 + https://geosx-geosx.readthedocs-hosted.com/en/latest/ + + + + + + NordbottenJan M. + FernoMartin A. + FlemischBernd + KovscekAnthony R. 
+ LieKnut Andreas + + The 11th Society of Petroleum Engineers Comparative Solution Project: Problem Definition + SPE Journal + 2024 + 29 + 5 + 10.2118/218015-PA + 2507 + 2524 + + + + + + NaumovDmitri + BilkeLars + LehmannChristoph + FischerThomas + WangWenqing + SilbermannChristian + ThiedauJan + SelzerPhilipp + + OpenGeoSys + Zenodo + 202406 + https://doi.org/10.5281/zenodo.11652195 + 10.5281/zenodo.11652195 + + + + + + KochTimo + GlaserDennis + WeishauptKilian + AckermannSina + BeckMartin + BeckerBeatrix + BurbullaSamuel + ClassHolger + ColtmanEdward + EmmertSimon + FetzerThomas + GruningerChristoph + HeckKatharina + HommelJohannes + KurzTheresa + LippMelanie + MohammadiFarid + ScherrerSamuel + SchneiderMartin + SeitzGabriele + StadlerLeopold + UtzMartin + WeinhardtFelix + FlemischBernd + + DuMux 3 - an open-source simulator for solving flow and transport problems in porous media with a focus on model coupling + Computers & Mathematics with Applications + 2020 + 0898-1221 + 10.1016/j.camwa.2020.02.012 + + + + + + RasmussenAtgeirr Flø + SandveTor Harald + BaoKai + LauserAndreas + HoveJoakim + SkaflestadBård + KlöfkornRobert + BlattMarkus + RustadAlf Birger + SævareidOve + LieKnut-Andreas + ThuneAndreas + + The Open Porous Media Flow reservoir simulator + Computers & Mathematics with Applications + 2021 + 81 + 0898-1221 + https://www.sciencedirect.com/science/article/pii/S0898122120302182 + 10.1016/j.camwa.2020.05.014 + 159 + 185 + + + + + + VoskovDenis + SaifullinIlshat + NovikovAleksei + WapperomMichiel + OrozcoLuisa + SeabraGabriel Serrão + ChenYuan + KhaitMark + LyuXiaocong + TianXiaoming + HoopStephan de + PalhaArtur + + open Delft Advanced Research Terra Simulator (open-DARTS) + Journal of Open Source Software + The Open Journal + 2024 + 9 + 99 + https://doi.org/10.21105/joss.06737 + 10.21105/joss.06737 + 6737 + + + + + + + AtchleyScott + ZimmerChristopher + LangeJohn + BernholdtDavid + Melesse VergaraVeronica + BeckThomas + BrimMichael + BudiardjaReuben + ChandrasekaranSunita + EisenbachMarkus + EvansThomas + EzellMatthew + FrontiereNicholas + GeorgiadouAntigoni + GlenskiJoe + GretePhilipp + HamiltonSteven + HolmenJohn + HueblAxel + JacobsonDaniel + JoubertWayne + McmahonKim + MerzariElia + MooreStan + MyersAndrew + NicholsStephen + OralSarp + PapatheodoreThomas + PerezDanny + RogersDavid M. + SchneiderEvan + VayJean-Luc + YeungP. K. + + Frontier: Exploring exascale + Proceedings of the international conference for high performance computing, networking, storage and analysis + Association for Computing Machinery + New York, NY, USA + 2023 + 9798400701092 + https://doi.org/10.1145/3581784.3607089 + 10.1145/3581784.3607089 + + + + +