

nongeological and nonphysical features in the reservoir model. Moreover, the lessons learned are not properly applied to create a realistic reservoir model, and the perceived "history-matched" models are of low predictive value.
As reservoirs and assets mature and data acquisition and processing methods evolve and become more sophisticated, the acquired reservoir data grow substantially in quantity and complexity. Particularly with the expansion of DOF projects that use automated and data-driven IAMod and IAM workflows, the need to solve large-scale, high-resolution modeling problems, to quantify the inherent model uncertainty for more reliable prediction, and to optimize asset performance is becoming prevalent in the E&P industry. The challenges of integrating multiple scales of data with uncertainties in physical parameters and processes make efficient model parameterization and advanced inversion and optimization algorithms imperative, along with the utilization of rapidly evolving HPC architectures.
Within the last three decades, the oil industry has gained traction in developing and implementing stochastic and population-based algorithms in reservoir characterization and simulation workflows. Applications of simulated annealing (SA), an algorithm originally developed for solving combinatorial optimization problems, first emerged in the oil and gas industry in the early 1990s, in areas ranging from stochastic reservoir modeling to the optimization of well scheduling and placement (Deutsch and Journel, 1994; Ouenes et al., 1994), and they have endured through the introduction of advanced SA variants, such as very fast simulated annealing (VFSA), with the recent expansion of unconventional exploration (Sui et al., 2014).
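To make the SA mechanics concrete, the sketch below shows the classical accept/reject loop with a geometric cooling schedule, applied to a purely hypothetical well-placement cost on a synthetic grid. The objective function, neighborhood move, and schedule parameters are illustrative assumptions and are not taken from the cited studies.

```python
import math
import random

def simulated_annealing(objective, initial, neighbor, n_iter=5000,
                        t_start=1.0, t_end=1e-3):
    """Minimize `objective` with a basic simulated annealing loop."""
    current, current_cost = initial, objective(initial)
    best, best_cost = current, current_cost
    for k in range(n_iter):
        # Geometric cooling schedule (one of many possible choices).
        t = t_start * (t_end / t_start) ** (k / max(n_iter - 1, 1))
        candidate = neighbor(current)
        cost = objective(candidate)
        # Always accept improvements; accept uphill moves with Boltzmann probability.
        if cost < current_cost or random.random() < math.exp(-(cost - current_cost) / t):
            current, current_cost = candidate, cost
            if cost < best_cost:
                best, best_cost = candidate, cost
    return best, best_cost

# Hypothetical usage: choose a well location (i, j) on a 50 x 50 grid that
# minimizes a synthetic cost surface standing in for a reservoir-simulator call.
cost = lambda w: (w[0] - 30) ** 2 + (w[1] - 12) ** 2
move = lambda w: ((w[0] + random.randint(-2, 2)) % 50,
                  (w[1] + random.randint(-2, 2)) % 50)
print(simulated_annealing(cost, (0, 0), move))
```

In practice the simulator evaluation hidden behind the objective dominates the runtime, which is one reason faster schedules such as VFSA and HPC parallelization of candidate evaluations became attractive as problem sizes grew.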
Another important advance in oil and gas stochastic modeling was the introduction of techniques for the design of experiments (DoE), a methodology originally developed in agriculture in the late 1920s (Salsburg, 2001). DoE modeling has been used primarily for the rapid quantification of uncertainty using proxy models with response surface analysis (RSA) and various forms of designs (e.g., Latin hypercube, Box-Behnken) in AHM (Cullick et al., 2004; Alpak et al., 2013), sensitivity analyses (Fillacier et al., 2014), and risk evaluation (Sazonov et al., 2015).
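As an illustration of the design-to-proxy idea, the sketch below draws a Latin hypercube design over three hypothetical uncertain reservoir parameters, evaluates a placeholder "simulator" response, fits a quadratic response surface by least squares, and then runs a cheap Monte Carlo on that proxy to estimate P10/P50/P90 of the response. The parameter names, ranges, placeholder response, and percentile workflow are illustrative assumptions, not a published design.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, bounds):
    """Latin hypercube design: one stratified sample per interval in every dimension."""
    dim = len(bounds)
    strata = np.tile(np.arange(n_samples), (dim, 1))      # (dim, n) stratum indices
    u = (rng.permuted(strata, axis=1).T + rng.random((n_samples, dim))) / n_samples
    lo, hi = np.array(bounds, dtype=float).T
    return lo + u * (hi - lo)

# Hypothetical uncertain parameters: permeability multiplier, porosity, aquifer strength.
bounds = [(0.5, 2.0), (0.15, 0.30), (0.0, 1.0)]
X = latin_hypercube(40, bounds)

# Placeholder for an expensive reservoir-simulation response (e.g., cumulative oil).
def simulator(x):
    k_mult, phi, aquifer = x
    return 100.0 * k_mult * phi + 20.0 * aquifer - 5.0 * (k_mult - 1.2) ** 2

y = np.apply_along_axis(simulator, 1, X)

# Quadratic response surface (proxy) fitted by ordinary least squares.
def quad_features(X):
    cross = np.stack([X[:, i] * X[:, j]
                      for i in range(X.shape[1])
                      for j in range(i, X.shape[1])], axis=1)
    return np.hstack([np.ones((len(X), 1)), X, cross])

coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

# Cheap Monte Carlo on the proxy to approximate response uncertainty (P10/P50/P90).
mc = latin_hypercube(10_000, bounds)
pred = quad_features(mc) @ coef
print(np.percentile(pred, [10, 50, 90]))
```

Other classical designs such as Box-Behnken can be substituted in the same way; the value of the proxy is that it replaces thousands of full simulation runs with a cheap algebraic surrogate when screening uncertainty.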
In the early 2000s, the E&P industry started to see an expansion of ensemble-based Bayesian inference and model inversion using, for example, evolutionary algorithms (Schulze-Riegert and Ghedan, 2007), the ensemble Kalman filter (EnKF) (Evensen, 2009), and, more recently, a complementary data assimilation approach, the ensemble smoother (ES) with multiple data assimilation (MDA), by Emerick and Reynolds (2012, 2013) and Maucec et al. (2016, 2017) and