
Studies are no longer limited by simulation size or scale, but by the
ability to digest the generated data and infer meaningful insights given
the uncertainty; the bottleneck has moved.
According to Hyperion Research, spending on HPC work in the cloud
is expected to grow from US$2.5 billion in 2018 to US$7.4 billion
in 2023.
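Taken at face value, those figures imply a compound annual growth rate of roughly 24% over the five-year span, since (7.4/2.5)^(1/5) ≈ 1.24.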
Parallel reservoir simulators
The second reason for the improved capability of reservoir simulation
is the emergence of new parallel simulators that are able to exploit the
advantages offered by modern hardware. Although processes such as
seismic imaging are naturally amenable to massive parallelism, it is
more challenging to expose such parallelism in reservoir simulation. It
took more time and effort to create parallel reservoir simulators, which
is evident in how companies have used their HPC systems over the
years. Historically, companies have dedicated the majority of computing
resources to seismic imaging, with reservoir simulation a distant
second. This is gradually changing as energy companies begin to use
more parallel simulators and move to more probabilistic methods of
reservoir modelling.
The increase in computing power is changing how companies view
the use of reservoir simulation. Instead of using one or a few models
to represent the reservoir, they are moving towards more statistical
methods, such as ensemble modelling. Ensemble modelling is a technique
in which thousands of different realisations of a model are simulated to
provide an envelope of possible outcomes with probabilistic weighting.
Ensemble modelling recognises and embraces uncertainty, and provides
statistical bounds on future production. This enables companies to better
understand the uncertainty associated with the reservoir and to avoid
ad-hoc assumptions during decision-making. It also supports the
machine learning and AI methods used by oil companies by creating the
large datasets they require. Methods such as ensemble modelling
and uncertainty quantification demand heavy computing power, which
has historically limited their use in traditional reservoir simulation. This
burden has now been mitigated, and companies such as Eni, with its new
GPU-based HPC5 supercomputer, now choose the best development
scenario by creating ensembles of models and running hundreds to
thousands of simulations.
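To make the idea concrete, the following is a minimal Python sketch of how an ensemble is summarised into probabilistic bounds. The lognormal sample stands in for what would, in practice, be one full simulation run per realisation; the numbers are illustrative, not field data.

import numpy as np

# Stand-in for an ensemble: in practice each value would be the
# cumulative-production forecast from one full simulation run.
rng = np.random.default_rng(seed=1)
n_realisations = 1000
cum_prod = rng.lognormal(mean=2.0, sigma=0.4, size=n_realisations)  # MMbbl, illustrative

# Industry convention: P90 is the value exceeded by 90% of outcomes
# (conservative), P50 the median, P10 the optimistic case.
p90, p50, p10 = np.percentile(cum_prod, [10, 50, 90])
print(f"P90 = {p90:.1f} MMbbl, P50 = {p50:.1f} MMbbl, P10 = {p10:.1f} MMbbl")

The P90/P50/P10 spread is exactly the 'envelope of possible outcomes' described above, and it is only as trustworthy as the number of realisations behind it, which is why the method demands so much computing power.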
Companies are also moving towards developing and modelling
larger, more fine-grained models. Traditional reservoir simulation
involves upscaling, where detail is removed from large geological
models to create smaller simulation models that are faster and more
manageable; a simple sketch of what that averaging discards follows
this paragraph. Modern parallel reservoir simulators, such as the
GPU-based ECHELON, enable companies to dispense with the upscaling
process and instead quickly simulate the model at full geologic scale.
Geologic complexity is a very important factor controlling long-term
recovery, and preserving the complexity developed in modern geologic
modelling tools can be essential for understanding and optimising
recovery. These simulators allow companies to model large, complex
systems at speeds that make the simulation of hundreds or thousands
of ensemble realisations practical. More detailed, higher-resolution
models provide engineers and managers with additional critical
subsurface data that informs decision-making. This is true even for very
small companies, such as the Denver-based consultancy iReservoir, where
models of several million or more active cells have become routine
with the use of ECHELON on small workstations. In another example,
Houston-based Marathon Oil Co. uses ECHELON to run models with tens
of millions of cells in full-field simulations that include multiple wells
with complex fracture geometry.
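As a minimal sketch of what upscaling discards, consider coarsening a layered permeability column into a single value. The correct average depends on the flow direction, and no single coarse number reproduces both; the permeabilities below are illustrative, not taken from a real model.

import numpy as np

# Fine-grid permeabilities for a stack of layers (mD), illustrative.
fine_perm = np.array([500.0, 5.0, 300.0, 1.0])

# Flow parallel to the layers averages arithmetically; flow across
# them averages harmonically, so thin tight layers dominate.
k_parallel = fine_perm.mean()
k_across = len(fine_perm) / np.sum(1.0 / fine_perm)
print(f"arithmetic = {k_parallel:.1f} mD, harmonic = {k_across:.1f} mD")
# ~201.5 mD vs ~3.3 mD: a single upscaled value cannot honour both,
# which is one reason retaining the fine grid matters for recovery.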
The performance of modern parallel reservoir simulators has also led
to increased use of more complex problems, such as compositional
modelling. Compositional modelling allows engineers to track how
the chemical composition of the hydrocarbon changes throughout the
production process. This is important in cases such as carbon dioxide
(CO₂) flooding, where the changing composition of the hydrocarbon
mix can dramatically affect recovery. This type of modelling is very
compute-intensive and thus requires much longer run times than less
complex simulation runs. Because of this, engineers have historically
avoided compositional modelling where possible by making assumptions
and limiting model complexity, which adds to the uncertainty in the
model and negatively affects business decisions.
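At the heart of each compositional step is a phase-split (flash) calculation. The following is a minimal Python sketch of the standard Rachford-Rice solve, assuming fixed K-values; a real simulator would obtain the K-values from an equation of state and repeat this in every grid cell at every timestep, which is where the computational cost comes from. The mixture and K-values here are hypothetical.

# Given overall mole fractions z and equilibrium ratios K, solve the
# Rachford-Rice equation sum(z_i*(K_i-1)/(1+V*(K_i-1))) = 0 for the
# vapour fraction V, then recover liquid (x) and vapour (y) compositions.
def rachford_rice(z, K, tol=1e-10):
    f = lambda V: sum(zi * (Ki - 1.0) / (1.0 + V * (Ki - 1.0))
                      for zi, Ki in zip(z, K))
    lo, hi = 0.0, 1.0   # f decreases with V, so bisection suffices
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    V = 0.5 * (lo + hi)
    x = [zi / (1.0 + V * (Ki - 1.0)) for zi, Ki in zip(z, K)]
    y = [Ki * xi for Ki, xi in zip(K, x)]
    return V, x, y

# Hypothetical three-component mixture (e.g. CO2, methane, heavy fraction).
z = [0.30, 0.40, 0.30]
K = [2.5, 4.0, 0.2]   # illustrative K-values, not measured data
V, x, y = rachford_rice(z, K)
print(f"vapour fraction = {V:.4f}")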
Conclusion
Forces on both the demand and supply sides have reshaped the role
of reservoir simulation in the energy industry. On the demand side there
is a growing emphasis on ensemble methods, larger models and more
complex physics; all three drive the need for fast, scalable simulation of the
type offered by simulators such as ECHELON. On the supply side, multi-core
CPUs and GPUs have emerged as mature foundational platforms for
scientific computing. These technologies generate critical information
for better decision-making and cost savings in an industry where even tiny
improvements in efficiency or production can provide huge rewards.
Figure 1. V100 GPU.
Figure 2. Analysis of an ensemble.
Figure 3. Pricing evaluation.