## Publications


Mittal S, Westbroek MJE, King PR, et al., 2020, Path integral Monte Carlo method for the quantum anharmonic oscillator, *European Journal of Physics*, Vol: 41, ISSN: 0143-0807

Gago P, Raeini A, King P, 2020, A spatially resolved fluid-solid interaction model for dense granular packs/soft sand, *Advances in Water Resources*, Vol: 136, Pages: 1-7, ISSN: 0309-1708

Fluid flow through dense granular packs or soft sands can be described as Darcy flow for low injection rates, as grain-grain and grain-wall friction dominates the behaviour of the solid system. For high injection rates, fluid forces can displace grains, forming flow channels or “fractures”, which in turn modify local properties within the system, such as permeability and stress distribution. Because of this kind of “self-organized” behaviour, a spatially resolved model of these interactions is required to capture the dynamics of such systems. In this work, we present a resolved model based on the approach taken by the CFDEM open-source project, which uses LIGGGHTS, a discrete element method (DEM) code, to model the granular behaviour and the OpenFOAM finite-volume library for computational fluid dynamics (CFD) to simulate the fluid behaviour. The capabilities provided by the DEM engine allow properties of the solid phase, such as inter-grain cohesion and solid confinement stress, to be controlled. In this work the original solver provided by the CFDEM project was modified to deal with dense granular packs more effectively. Advantages of the approach presented are that it does not require external “scaling parameters” to reproduce well-known properties of porous materials and that it inherits the performance of the CFDEM project. The model is validated by reproducing well-known properties of static porous materials, such as permeability as a function of porosity, and by calculating the drag coefficient for a sphere held fixed in a uniform flow. Finally, we present fracture patterns obtained when modelling water injection into a Hele-Shaw cell filled with a dense granular pack.

Gago P, King P, Wieladek K, 2020, Fluid-induced fracture into weakly consolidated sand: Impact of confining stress on initialization pressure, *Physical Review E: Statistical, Nonlinear, and Soft Matter Physics*, Vol: 101, Pages: 012907-1-012907-6, ISSN: 1539-3755

This paper studies fluid-injection-driven fractures in granular packs where particles are held together by external confining stresses and weak inter-grain cohesion. We investigate fracture formation in soft sand confined in a radial Hele-Shaw cell. Two main regimes are well known for fluid injection in soft sand: for low fluid injection pressures the pack behaves as a solid porous material, while for high enough injection pressures grain rearrangement takes place. Grain rearrangements lead to the formation of fluid channels or “fractures,” the structure and geometry of which depend on the material and fluid properties. Because of macroscopic grain displacements and the predominant role of dissipative frictional forces in granular system dynamics, these materials do not behave as conventional brittle, linear elastic materials, and the transition between these two regimes cannot usually be described using poroelastic models. In this work we investigate the change in the minimum fluid pressure required to start grain mobilization as a function of the confining stresses applied to the system, using a spatially resolved computational fluid dynamics-discrete element method numerical model. We show that this change is proportional to the applied stress when the confining stresses can be regarded as uniformly distributed among the particles in the system. A preliminary analytical expression for this change is presented.

Ladipo L, Blunt MJ, King PR, 2020, A salinity cut-off method to control numerical dispersion in low-salinity waterflooding simulation, *Journal of Petroleum Science and Engineering*, Vol: 184, ISSN: 0920-4105

Shokrollahzadeh Behbahani S, Masihi M, Ghazanfari MH, et al., 2019, Effect of Characteristic Time on Scaling of Breakthrough Time Distribution for Two-Phase Displacement in Percolation Porous Media, *Transport in Porous Media*, Vol: 130, Pages: 889-902, ISSN: 0169-3913

Determining the breakthrough time of injected water is important when assessing a waterflood in an oil reservoir. The breakthrough time distribution for a passive tracer (for example, water) in percolation porous media (near the percolation threshold) gives insight into the dynamic behaviour of flow in geometrically complex systems. However, applying such a distribution to realistic two-phase displacements requires appropriate scaling of all parameters. Here, we propose two new approaches for scaling the breakthrough time (characteristic times) in two-dimensional flow through percolation porous media. The first is based on the flow geometry, and the second uses the flow parameters of a representative homogeneous model. We have tested the effectiveness of these two approaches using a large number of dynamic simulations. The results show significantly improved distribution curves for the breakthrough (transit) time between an injector and a producer located in a heterogeneous porous medium in comparison with previous scaling methods.

Zhou Y, Muggeridge AH, Berg CF, et al., 2019, Effect of Layering on Incremental Oil Recovery From Tertiary Polymer Flooding, *SPE Reservoir Evaluation & Engineering*, Vol: 22, Pages: 941-951, ISSN: 1094-6470


Westbroek MJE, Coche G-A, King PR, et al., 2019, Pressure statistics from the path integral for Darcy flow through random porous media, *Journal of Physics A: Mathematical and Theoretical*, Vol: 52, ISSN: 1751-8113

The path integral for classical statistical dynamics is used to determine the properties of one-dimensional Darcy flow through a porous medium with a correlated stochastic permeability for several spatial correlation lengths. Pressure statistics are obtained from the numerical evaluation of the path integral by using the Markov chain Monte Carlo method. Comparisons between these pressure distributions and those calculated from the classic finite-volume method for the corresponding stochastic differential equation show excellent agreement for Dirichlet and Neumann boundary conditions. The evaluation of the variance of the pressure based on a continuum description of the medium provides an estimate of the effects of discretization. Log-normal and Gaussian fits to the pressure distributions as a function of position within the porous medium are discussed in relation to the spatial extent of the correlations of the permeability fluctuations.

Westbroek MJE, King PR, Vvedensky DD, et al., 2019, Pressure and flow statistics of Darcy flow from simulated annealing, Publisher: arXiv

The pressure and flow statistics of Darcy flow through a random permeable medium are expressed in a form suitable for evaluation by the method of simulated annealing. There are several attractive aspects to using simulated annealing: (i) any probability distribution can be used for the permeability, (ii) there is no need to invert the transmissibility matrix which, while not a factor for single-phase flow, offers distinct advantages for the case of multiphase flow, and (iii) the action used for simulated annealing is eminently suitable for coarse graining by integrating over the short-wavelength degrees of freedom. In this paper, we show that the pressure and flow statistics obtained by simulated annealing are in excellent agreement with the more conventional finite-volume calculations.

King PR, Masihi M, 2018, Percolation theory in reservoir engineering, ISBN: 9781786345233

This book aims to develop the ideas of percolation theory from fundamentals to practical reservoir engineering applications. Through a focus on field-scale applications of percolation concepts to reservoir engineering problems, it offers an approximation method to determine many important reservoir parameters, such as effective permeability and reservoir connectivity, and the physical analysis of some reservoir engineering properties. Starting with the concepts of percolation theory, it then develops methods for simple geological systems such as sand bodies and fractures. The accuracy and efficiency of the percolation concept for these systems is explained and further extended to more complex, realistic models. Percolation Theory in Reservoir Engineering primarily focuses on larger, reservoir-scale flow and demonstrates methods that can be used to estimate large-scale properties and their uncertainty, crucial for major development and investment decisions in hydrocarbon recovery.

Gago PA, King P, Muggeridge A, 2018, Fractal growth model for estimating breakthrough time and sweep efficiency when waterflooding geologically heterogeneous rocks, *Physical Review Applied*, Vol: 10, ISSN: 2331-7019

We describe a fast method for estimating flow through a porous medium with a heterogeneous permeability distribution. The main applications are to contaminant transport in aquifers and recovery of oil by waterflooding, where such geological heterogeneities can result in regions of bypassed contaminants or oil. The extent of this bypassing is normally assessed by a numerical flow simulation that can take many hours of computer time. Ideally, the impact of uncertainty in the geological description is then evaluated by performing many such simulations using different realizations of the permeability distribution. Obviously, a proper Monte Carlo evaluation may be impossible when the flow simulations are so computationally intensive. Consequently, methods from statistical mechanics, such as percolation theory and random-walker models (such as diffusion-limited aggregation), have been proposed; however, these methods are limited to geological heterogeneities where the correlation lengths are smaller than the system size or to continuous permeability distributions. Here we describe a growth model that can be used to estimate the breakthrough time of the water (and hence the sweep efficiency) in most types of geologically heterogeneous rocks. We show how the model gives good estimates of the breakthrough time of water at the production well in a fraction of the time needed to perform a full flow simulation.

Westbroek MJE, Coche G-A, King PR, et al., 2018, Evaluation of the path integral for flow through random porous media, *Physical Review E*, Vol: 97, ISSN: 2470-0045

We present a path integral formulation of Darcy's equation in one dimension with random permeability described by a correlated multivariate lognormal distribution. This path integral is evaluated with the Markov chain Monte Carlo method to obtain pressure distributions, which are shown to agree with the solutions of the corresponding stochastic differential equation for Dirichlet and Neumann boundary conditions. The extension of our approach to flow through random media in two and three dimensions is discussed.

Westbroek MJE, King PR, Vvedensky DD, et al., 2018, User's guide to Monte Carlo methods for evaluating path integrals, *American Journal of Physics*, Vol: 86, Pages: 293-304, ISSN: 0002-9505

We give an introduction to the calculation of path integrals on a lattice, with the quantum harmonic oscillator as an example. In addition to providing an explicit computational setup and corresponding pseudocode, we pay particular attention to the existence of autocorrelations and the calculation of reliable errors. The over-relaxation technique is presented as a way to counter strong autocorrelations. The simulation methods can be extended to compute observables for path integrals in other settings.
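The lattice recipe described in this abstract (discretise the Euclidean action, update sites with Metropolis, average an observable over sweeps) can be sketched briefly for the harmonic oscillator, the paper's worked example. This is an illustrative toy, not the authors' code; the lattice spacing, sweep counts, and proposal step below are arbitrary choices, and autocorrelation handling and over-relaxation are omitted.

```python
import math
import random

def sweep(x, a, step=0.8, m=1.0, omega=1.0):
    """One Metropolis sweep over all time slices (periodic boundary conditions)."""
    n = len(x)
    for i in range(n):
        left, right = x[(i - 1) % n], x[(i + 1) % n]

        def local_action(v):
            # Pieces of the discretised Euclidean action that involve site i:
            # kinetic links to both neighbours plus the on-site potential.
            kinetic = (m / (2 * a)) * ((v - left) ** 2 + (right - v) ** 2)
            potential = a * 0.5 * m * omega ** 2 * v ** 2
            return kinetic + potential

        new = x[i] + random.uniform(-step, step)
        dS = local_action(new) - local_action(x[i])
        if dS <= 0 or random.random() < math.exp(-dS):
            x[i] = new  # Metropolis accept; otherwise keep the old value

random.seed(2)
a, n = 0.25, 120              # lattice spacing and number of time slices
x = [0.0] * n                 # cold start
for _ in range(500):          # thermalisation sweeps
    sweep(x, a)
samples = []
for _ in range(2000):         # measurement sweeps
    sweep(x, a)
    samples.append(sum(v * v for v in x) / n)
x2 = sum(samples) / len(samples)
print(round(x2, 2))           # estimate of <x^2>; continuum value is 1/(2*m*omega) = 0.5
```

A reliable error bar would require binning or autocorrelation analysis, as the paper emphasises; the naive standard error of `samples` underestimates the true uncertainty.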

Al-Shamma BR, Gosselin O, King PR, 2018, History matching using hybrid parameterisation and optimisation methods, 80th EAGE Conference and Exhibition 2018

Reservoir models are commonly used in the oil and gas industry to predict reservoir behaviour and forecast production in order to make important financial decisions such as reserves estimation, infill well drilling and enhanced oil recovery schemes. Conditioning reservoir models to dynamic production data is known as history matching, which is usually carried out to improve the predicted reservoir performance. Uncertainty quantification is also an important aspect of this task, and encompasses identifying multiple history-matched models that are constrained to a geological concept. History matching and uncertainty quantification can be accomplished by identifying and using efficient optimisation techniques. Assisted history matching usually involves two processes. The first is parameterisation, which reduces the number of matching parameters in order to avoid adjusting too many variables relative to the amount of production data available; over-parameterisation, in addition to an ill-posed formulation of the inverse problem, creates a challenging situation. The second is optimisation, which solves the inverse problem by reducing a misfit (objective) function that quantifies the difference between simulated and observed production data; its main challenges are local minima and premature convergence, and its success depends strongly on the parameterisation strategy used. Analysing various parameterisation methods in combination with diverse optimisation algorithms leads us to suggest novel hybrid approaches addressing both processes of assisted history matching. We propose a multistage combined parameterisation and optimisation history-matching technique. Hybridisation of parameterisation and optimisation algorithms, when designed in an optimum manner, can combine

Al-Dhuwaihi A, King P, Muggeridge A, 2018, Upscaling for polymer flooding

Polymer flooding is a proven EOR/IOR process for viscous and light oil reservoirs alike. However, it results in the formation of two shock fronts that require simulation models with fine grid blocks to represent field-scale fluid movement, so upscaling is required to transfer this fluid behaviour to coarser models. Most upscaling methods are designed for waterflooding only, while upscaling techniques for polymer flooding are rarely discussed in the literature. In this paper, a new upscaling methodology specifically designed for polymer flooding is presented to address this gap. The methodology allows the average flow behaviour to be captured, including the effects of small-scale heterogeneity, whilst compensating for the increased numerical diffusion present in coarse grid models. The method is based on the pore-volume-weighted method for relative permeability pseudoization first derived by Emanuel and Cook (1974) for waterflooding, but extends it to model polymer-specific parameters such as the adsorption isotherm and the viscosity-concentration function. The method is demonstrated on a series of simple reservoir models for a range of aggregation ratios, showing overall improvement in the prediction of oil recovery, water cut, produced polymer concentration with time, and pressure response in the coarse grid models. This is demonstrated by comparing the predictions from the coarse grid, upscaled models with those from fine grid simulations and from coarse grid simulations of the same model reservoir without the new upscaling methodology.

Masihi M, Gago P, King P, 2016, Estimation of the Effective Permeability of Heterogeneous Porous Media by Using Percolation Concepts, *Transport in Porous Media*, Vol: 114, Pages: 169-199, ISSN: 1573-1634

In this paper we present new methods to estimate the effective permeability (k_eff) of heterogeneous porous media with a wide distribution of permeabilities and various underlying structures, using percolation concepts. We first set a threshold permeability (k_th) on the permeability density function and use standard algorithms from percolation theory to check whether the highly permeable grid blocks (i.e. those with permeability higher than k_th), with occupied fraction p, first form a cluster connecting two opposite sides of the system in the direction of flow (a high-permeability flow pathway). We then estimate the effective permeability of the heterogeneous porous medium in different ways: a power law (k_eff = k_th·p^m), a weighted power average (k_eff = [p·k_th^m + (1-p)·k_g^m]^(1/m), with k_g the geometric average of the permeability distribution) and a characteristic shape factor multiplied by the threshold permeability. We found that the characteristic parameters (i.e. the exponent m) can be inferred either from the statistics and properties of the percolation sub-networks at the threshold point (i.e. the high- and low-permeability regions corresponding to permeabilities above and below the threshold) or by comparing the system properties with an uncorrelated random field having the same permeability distribution. These physically based approaches do not need fitting to experimental measurements of effective permeability to estimate the model parameter (the exponent m), as is usually necessary in empirical methods. We examine the accuracy of these methods on different layers of the 10th SPE model and find very good estimates compared with the values determined from commercial flow simulators.
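The threshold construction behind the power-law estimate k_eff = k_th·p^m can be sketched as follows: occupy grid blocks in decreasing order of permeability until a spanning cluster first appears, record the permeability of the last block added (k_th) and the occupied fraction (p), and apply the power law. This is a toy illustration, not the paper's implementation; the grid size, lognormal field parameters, and the exponent m = 0.9 are made-up values.

```python
import math
import random
from collections import deque

def spans(mask):
    """True if occupied cells connect the left edge to the right edge (4-neighbour BFS)."""
    n = len(mask)
    seen = [[False] * n for _ in range(n)]
    q = deque((y, 0) for y in range(n) if mask[y][0])
    for y, x in q:
        seen[y][x] = True
    while q:
        y, x = q.popleft()
        if x == n - 1:
            return True
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < n and 0 <= nx < n and mask[ny][nx] and not seen[ny][nx]:
                seen[ny][nx] = True
                q.append((ny, nx))
    return False

def percolation_keff(k, m=0.9):
    """Occupy blocks in decreasing-permeability order; the block that first creates
    a spanning cluster defines k_th, and the occupied fraction at that point is p."""
    n = len(k)
    cells = sorted(((k[y][x], y, x) for y in range(n) for x in range(n)), reverse=True)
    mask = [[False] * n for _ in range(n)]
    for count, (v, y, x) in enumerate(cells, 1):
        mask[y][x] = True
        if count >= n and spans(mask):      # a spanning path needs at least n cells
            p = count / (n * n)
            return v * p ** m, v, p          # (k_eff estimate, k_th, p)

random.seed(1)
n = 30
k = [[math.exp(random.gauss(0.0, 1.0)) for _ in range(n)] for _ in range(n)]  # lognormal field
keff, kth, p = percolation_keff(k)
```

For an uncorrelated field like this one, p at first spanning sits near the 2-D site percolation threshold (about 0.59), so the construction recovers a known percolation result before any exponent fitting is done.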

Masihi M, Gago P, King P, 2016, Percolation-based effective permeability estimation in real heterogeneous porous media

It has long been understood that flow behaviour in heterogeneous porous media is largely controlled by the continuity of permeability contrasts. With this in mind, we investigate new methods for fast estimation of the effective permeability that concentrate on the properties of the percolating cluster. From percolation concepts we use a threshold permeability value (K_th) at which the grid blocks with the highest permeability values connect two opposite sides of the system in the direction of flow. These methods can be applied to heterogeneous media with a wide range of permeability distributions and various underlying structures. We use power-law relations and weighted power averages whose parameters can be inferred from the statistics and properties of the percolation sub-networks at the threshold point. This approach does not need fitting to experimental conductivity measurements to estimate the model parameter, as is done in empirical methods. We examine the accuracy of these methods on some layers of the 10th SPE model and find very good agreement with the values determined from commercial flow simulators. The results of this work open up new ways of estimating the effective permeability using percolation concepts.

Sadeghnejad S, Masihi M, King PR, et al., 2016, Study the effect of connectivity between two wells on secondary recovery efficiency using percolation approach

Estimating the hydrocarbon available to be produced during secondary oil recovery is an ongoing activity in field development. The primary plan is normally scheduled during the early stage of a field's life through master development plan studies. During this period, due to the lack of reliable data, estimation of field efficiency is usually based on rules of thumb rather than detailed field characterization. Hence, there is great motivation to produce simpler, physically based methodologies. The minimal input requirements of the percolation approach make it a useful tool for formation performance prediction. This approach enables us to attain a better assessment of the efficiency of secondary recovery methods at early production times. The main contribution of this study is to establish a continuum percolation model, based on Monte Carlo simulation, that can estimate the connectivity of good sands between two wells. In classical percolation, connectivity is considered between two lines or two faces of the system in 2- and 3-D, whereas hydrocarbon production is achieved through wells with the shape of lines (e.g., vertical, horizontal, or deviated wells). The results showed that failing to implement the correct geometry of the wells can alter the results estimated from the percolation approach.

Westbroek MJE, King PR, Vvedensky DD, 2016, Path Integral Method for Flow through Random Porous Media

One of the key problems in modelling flow in oil reservoirs is our lack of precise knowledge of the variations in flow properties across the field. At best we can infer the statistics of these variations from field observations. The challenge is to determine the statistics of the flow itself (flow rates, pressures, etc.) from the statistics of the permeability variations. Conventional simulations are computationally very expensive unless smart sampling techniques or surrogate models are used. In this paper we demonstrate the use of a path integral formulation for this problem. To demonstrate how this method works, we start with the one-dimensional Darcy flow problem, q(x) = -K(x) dp(x)/dx, where p(x) is the pressure, q(x) is the flow rate and K(x) is the rock permeability. The randomness of the porous medium is modelled by regarding K as a stochastic quantity which is assumed to follow Gaussian statistics. Because of the randomly varying rock structure, there is a variety of conceivable pressure realisations p(x). The path integral Z is an integral over all realisations with an appropriate probability measure. Once Z is evaluated, either analytically or by standard Monte Carlo methods, any observable of interest, including pressure correlations, can easily be obtained.
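The conventional Monte Carlo route that the path integral is benchmarked against is easy to sketch in 1D: in steady state the flux q is uniform, so the pressure drop across each cell is proportional to its resistance 1/K and each realisation has a closed-form solution. The parameters below (50 cells, lognormal permeability with sigma = 0.5, unit pressure drop) are illustrative choices, not taken from the paper.

```python
import math
import random

def pressure_profile(n=50, sigma=0.5):
    """One realisation of 1D Darcy flow with Dirichlet BCs p(0)=1, p(L)=0 and
    i.i.d. lognormal cell permeabilities. Uniform flux means the pressure
    drop tracks the cumulative resistance sum(1/K)."""
    resist = [1.0 / math.exp(random.gauss(0.0, sigma)) for _ in range(n)]
    total = sum(resist)
    p, acc = [1.0], 0.0
    for r in resist:
        acc += r
        p.append(1.0 - acc / total)
    return p  # p[0] = 1 at the inlet, p[-1] = 0 at the outlet

random.seed(0)
mid = []
for _ in range(5000):
    prof = pressure_profile()
    mid.append(prof[len(prof) // 2])          # pressure at the midpoint
mean = sum(mid) / len(mid)
var = sum((v - mean) ** 2 for v in mid) / len(mid)
```

By exchangeability of the cells the midpoint pressure has mean exactly 1/2; its variance is the kind of pressure statistic that the path-integral evaluation is shown to reproduce.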

Sadeghnejad S, Masihi M, King PR, 2016, Study the connectivity of good sands between two wells represented by two points using percolation theory

One of the major applications of percolation theory in petroleum engineering is the investigation of connectivity in complex formations. Production is normally achieved through a heterogeneous porous medium, and proper assessment of formation connectivity, taking its heterogeneity into account, is important in formation evaluation. Percolation assumes that heterogeneity can be simplified to either permeable or impermeable rock types. On this basis, the system outcome (e.g., prediction of recovery) can be described by simple mathematical relationships that are entirely independent of the small-scale details of the formation. The main contribution of this work is to use a continuum percolation approach to estimate the connectivity of permeable sands via two points (P2P), representing the injection and production wells, and to compare the results with the conventional line-to-line (L2L) connectivity used in previous studies. In particular, the percolation exponents are investigated for both P2P and L2L and their connectivity curves are compared. For this purpose, an object-based technique based on Monte Carlo simulation is used to model the spatial distribution of isotropic sandbodies in 2-D. The results showed that proper modelling of the shape of the wells is a critical issue that can alter the estimated amount of connected hydrocarbon when one uses the percolation approach.

Gago PA, King PR, Muggeridge AH, 2016, Fast estimation of effective permeability and sweep efficiency of waterflooding in geologically heterogeneous reservoirs

Geological heterogeneity can adversely affect the macroscopic sweep efficiency when waterflooding oil reservoirs; however, the exact distribution of permeability and porosity is generally not known. Engineers try to estimate the range of impacts heterogeneity might have on waterflood efficiency by creating multiple geological models and then simulating a waterflood through each of those realizations. Unfortunately, each simulation can be computationally intensive, meaning that it is generally not possible to obtain a statistically valid estimate of the expected sweep and its standard deviation. In this paper we show how the volume of unswept oil can be estimated rapidly (without flow simulations) from a geometrical characterization of the spatial permeability distribution. A "constriction" factor is defined which quantifies the effective cross-sectional area of the zones perpendicular to the principal flow direction. This is combined with a "net-to-gross ratio" (which quantifies the fractional reservoir volume occupied by the zones that contribute to flow) to estimate the effective permeability and the expected recovery factor for that realization. The method is tested using a range of realistic geological models, including SPE10 model 2, and its predictions are shown to agree well with values obtained using a well-established commercial flow simulator.

Alkhatib AM, King PR, 2015, The use of the least-squares probabilistic-collocation method in decision making in the presence of uncertainty for chemical-enhanced-oil-recovery processes, Pages: 747-766, ISSN: 1086-055X

Copyright © 2015 Society of Petroleum Engineers. The least-squares Monte Carlo method (LSM) is a decision-evaluation method that can capture the value of flexibility of a process. This method was shown to provide us with some insight into the effect of uncertainty on decision making and to help us capture the upside potential or mitigate the downside effects for a chemical enhanced-oil-recovery (EOR) process. The method is a stochastic approximate dynamic programming approach to decision making. It is modeled after a forward simulation coupled with a recursive algorithm, which produces the near-optimal policy. It relies on Monte Carlo simulation to produce convergent results. This incurs a significant computational requirement when using this method to evaluate decisions for reservoir-engineering problems because this requires running many reservoir simulations. The objective of this study was to enhance the performance of the LSM by improving the sampling method used to generate the technical uncertainties used in producing the production profiles and to extend its application to different chemical EOR processes. The probabilistic-collocation method has been proved to be a robust and efficient uncertainty-quantification method. It approximates the random input distributions by use of polynomial-chaos expansions and produces a proxy polynomial for the output parameter requiring a limited number of model responses that is conditional on the number of random inputs and the order of the approximation desired. The resulting proxy can then be used to generate the different statistical moments with negligible computational requirement. By use of the sampling methods of the probabilistic-collocation method to approximate the sampling of the technical uncertainties, it is possible to significantly reduce the computational requirement of running the decision-evaluation method. This is known as the least-squares probabilistic-collocation method (LSPCM). Both methods were …
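The LSM recursion the abstract describes can be sketched at its smallest scale: one intermediate decision point. In this hypothetical example a state variable follows a random walk, and the continuation value at the decision time is estimated by least-squares regression of the later payoff on polynomials of the current state (the Longstaff-Schwartz construction that LSM is based on); the numbers and dynamics are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def lsm_two_stage(n_paths=5000):
    """Minimal least-squares Monte Carlo sketch with one decision.

    At t=1 we may stop for payoff max(x - 1, 0) or continue to t=2.
    The continuation value is the regression of the t=2 payoff on a
    quadratic basis of the t=1 state -- the core LSM step.
    """
    x1 = 1.0 + 0.2 * rng.standard_normal(n_paths)        # state at t=1
    x2 = x1 + 0.2 * rng.standard_normal(n_paths)         # state at t=2
    payoff2 = np.maximum(x2 - 1.0, 0.0)
    basis = np.column_stack([np.ones_like(x1), x1, x1 ** 2])
    coef, *_ = np.linalg.lstsq(basis, payoff2, rcond=None)
    continuation = basis @ coef                          # estimated value of waiting
    exercise_now = np.maximum(x1 - 1.0, 0.0)
    value = np.where(exercise_now > continuation, exercise_now, payoff2)
    return value.mean()

v = lsm_two_stage()
```

Each path needs only a forward simulation plus a cheap regression, which is why replacing the raw Monte Carlo sampling with collocation points (the LSPCM idea) directly reduces the number of expensive reservoir simulations.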

Al-Shamma BR, Gosselin O, King P, et al., 2015, Comparative performance of history matching algorithms using diverse parameterization methods: Application to a synthetic and North Sea case, Pages: 146-164

Copyright 2015, Society of Petroleum Engineers. Extending the life of the Johnston field requires an ability to produce reliable forecasts of the effects of reservoir interventions. Reliable forecasts need an effective, robust and accurate history match. The goal of this paper is to evaluate the relative performance of three different global assisted history-matching algorithms. We describe the implementation of an integrated parameterization and optimization method, which was tested on the Brugge synthetic model (SPE benchmark case study), and the results based on the best selected method are applied to the Johnston gas field in the Southern North Sea. There are two key components to assisted history matching: (i) choice of the parameterization, and (ii) selection and performance of the optimization algorithm. In this paper, three parameterization methods are tested (including use of gradients) on three global history-matching algorithms: Particle Swarm Optimization (PSO), Evolutionary Algorithm (EA) and Differential Evolution (DE). A combination of two different algorithms was also examined. We assessed the algorithm efficiency based on the lowest achieved objective function and the time taken to converge to the lowest value; the quality of the parameterization was examined based on the lowest objective value, the best history match and the consistency of geological parameters used for the history match. The results show that the effectiveness of each optimization algorithm is dependent on the parameterization method. When comparing parameterization methods, the Around the Median method seems to give the best results in terms of lowest misfit, and best history match when using the same history-matching algorithm for the Brugge model example. When comparing across algorithms, the DE method performs better than the rest. An iterative combination of algorithms is seen to be the best option and assists in a further minimization of the objective function. The effect of …
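Of the three optimizers compared, DE is the simplest to sketch. The snippet below is a bare-bones DE/rand/1/bin loop, an illustrative implementation rather than the paper's: `misfit` stands in for a history-match objective (here a toy quadratic, since a reservoir simulator call is out of scope), and the mutation/crossover/selection steps are the ones the algorithm's name refers to.

```python
import numpy as np

rng = np.random.default_rng(2)

def differential_evolution(misfit, bounds, pop=20, gens=100, f=0.8, cr=0.9):
    """Bare-bones DE/rand/1/bin minimizer (illustrative sketch).

    `misfit` maps a parameter vector to a scalar objective, e.g. a
    history-match misfit; `bounds` is a list of (lo, hi) pairs.
    """
    lo, hi = np.array(bounds, float).T
    x = lo + rng.random((pop, len(bounds))) * (hi - lo)   # initial population
    fx = np.array([misfit(v) for v in x])
    for _ in range(gens):
        for i in range(pop):
            others = [j for j in range(pop) if j != i]
            a, b, c = x[rng.choice(others, 3, replace=False)]
            trial = np.clip(a + f * (b - c), lo, hi)      # mutation
            mask = rng.random(len(bounds)) < cr           # binomial crossover
            trial = np.where(mask, trial, x[i])
            ft = misfit(trial)
            if ft < fx[i]:                                # greedy selection
                x[i], fx[i] = trial, ft
    return x[fx.argmin()], fx.min()

best, best_misfit = differential_evolution(
    lambda v: ((v - 0.3) ** 2).sum(), bounds=[(-1.0, 1.0)] * 3)
```

Because every trial vector costs one objective evaluation (one simulation run in the history-matching setting), the pop × gens budget is the quantity the paper's efficiency comparison is really measuring.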

Petvipusit KR, Elsheikh AH, King PR, et al., 2015, An efficient optimisation technique using adaptive spectral high-dimensional model representation: Application to CO2 sequestration strategies, Pages: 1576-1595

Copyright © 2015, Society of Petroleum Engineers. The successful operation of CO2 sequestration relies on designing optimal injection strategies that maximise economic performance while guaranteeing long-term storage security. Solving this optimisation problem is computationally demanding. Hence, we propose an efficient surrogate-assisted optimisation technique with three novel aspects: (1) it relies on an ANOVA-like decomposition termed High-Dimensional Model Representation; (2) component-wise interactions are approximated with adaptive sparse grid interpolation; and (3) the surrogate is adaptively partitioned closer to the optimal solution within the optimisation iteration. A High-Dimensional Model Representation (HDMR) represents the model output as a hierarchical sum of component functions with different input variables. This structure enables us to select influential lower-order functions that impact the model output for efficient reduced-order representation of the model. In this work, we build the surrogate based on the HDMR expansion and make use of Sobol indices to adaptively select the significant terms. Then, the selected lower-order terms are approximated by using the Adaptive Sparse Grid Interpolation (ASGI) approach. Once the HDMR is built, a global optimizer is run to decide: (1) the domain shrinking criteria; and (2) the centre point for the next HDMR building. Therefore, this proposed technique is called a walking Cut-AHDMR, as it shrinks the search domain while balancing the trade-off between exploration and exploitation of the optimisation algorithm. The proposed technique is evaluated on a benchmark function and on the PUNQ-S3 reservoir model. Based on our numerical results, the walking Cut-AHDMR is a promising approach: not only does it require substantially fewer forward runs in building the surrogate of high dimension but it also effectively guides the search towards the optimal solution. The proposed method provides an efficient tool to …
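The first-order cut-HDMR expansion underlying the surrogate is compact enough to sketch: f(x) ≈ f(c) + Σᵢ [f(c with xᵢ varied) - f(c)], with each one-dimensional component tabulated on a grid through the cut centre c. The sketch below uses plain linear interpolation as a simplified stand-in for the paper's adaptive sparse-grid interpolants, and omits the Sobol-index term selection and domain shrinking.

```python
import numpy as np

def cut_hdmr_first_order(f, centre, grids):
    """First-order cut-HDMR surrogate (illustrative sketch).

    Component functions f_i(x_i) = f(c with x_i varied) - f(c) are
    tabulated on per-dimension `grids` through the cut centre, and the
    surrogate evaluates f(c) + sum_i f_i(x_i) by linear interpolation.
    """
    centre = np.asarray(centre, float)
    f0 = f(centre)
    tables = []
    for i, g in enumerate(grids):
        vals = []
        for xi in g:
            x = centre.copy()
            x[i] = xi                     # vary one input, hold the rest at c
            vals.append(f(x) - f0)
        tables.append((np.asarray(g, float), np.asarray(vals)))

    def surrogate(x):
        return f0 + sum(np.interp(x[i], g, v) for i, (g, v) in enumerate(tables))

    return surrogate

# exact for additively separable models, e.g. f(x) = x0^2 + 3*x1
s = cut_hdmr_first_order(lambda x: x[0] ** 2 + 3 * x[1],
                         centre=[0.0, 0.0],
                         grids=[np.linspace(-1, 1, 21), np.linspace(-1, 1, 5)])
```

Building this surrogate costs only one forward run per grid node per dimension, which is the source of the "substantially fewer forward runs" claim; higher-order interaction terms are added only when their Sobol indices warrant it.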

Petvipusit KR, Elsheikh AH, Laforce TC,
et al., 2014, Robust optimisation of CO2 sequestration strategies under geological uncertainty using adaptive sparse grid surrogates, *COMPUTATIONAL GEOSCIENCES*, Vol: 18, Pages: 763-778, ISSN: 1420-0597

- Citations: 15

Alkhatib A, King P, 2014, An approximate dynamic programming approach to decision making in the presence of uncertainty for surfactant-polymer flooding, *COMPUTATIONAL GEOSCIENCES*, Vol: 18, Pages: 243-263, ISSN: 1420-0597

- Citations: 6

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.