Gago PA, King PR, Muggeridge AH, 2016, Fast estimation of effective permeability and sweep efficiency of waterflooding in geologically heterogeneous reservoirs
Geological heterogeneity can adversely affect the macroscopic sweep efficiency when waterflooding oil reservoirs; however, the exact distribution of permeability and porosity is generally not known. Engineers try to estimate the range of impacts heterogeneity might have on waterflood efficiency by creating multiple geological models and then simulating a waterflood through each of those realizations. Unfortunately, each simulation can be computationally intensive, meaning that it is generally not possible to obtain a statistically valid estimate of the expected sweep and the associated standard deviation. In this paper we show how the volume of unswept oil can be estimated rapidly (without flow simulations) from a geometrical characterization of the spatial permeability distribution. A "constriction" factor is defined which quantifies the effective cross-sectional area of the zones perpendicular to the principal flow direction. This is combined with a "net-to-gross ratio" (which quantifies the fractional reservoir volume occupied by the zones that contribute to flow) to estimate the effective permeability and the expected recovery factor for that realization. The method is tested using a range of realistic geological models, including SPE10 model 2, and its predictions are shown to agree well with values obtained using a well-established commercial flow simulator.
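The geometric idea in this abstract lends itself to a short sketch. The snippet below is an illustrative reconstruction, not the paper's actual formulae: on a binary 2-D grid of flow/no-flow cells it computes a "constriction" factor as the harmonic mean of the open cross-sectional fractions perpendicular to the principal (x) flow direction, the net-to-gross ratio, and a crude effective-permeability estimate combining the two. All function names are invented for illustration.

```python
# Illustrative sketch (not the paper's exact formulae): geometric
# characterization of a binary 2-D permeability grid.

def slice_fractions(grid):
    """Fraction of flow-contributing (1) cells in each column
    perpendicular to the principal (x) flow direction."""
    ny, nx = len(grid), len(grid[0])
    return [sum(grid[j][i] for j in range(ny)) / ny for i in range(nx)]

def constriction_factor(grid):
    """Harmonic mean of the open cross-sectional fractions: the
    narrowest slices dominate, mimicking flow constriction."""
    fracs = slice_fractions(grid)
    if any(f == 0.0 for f in fracs):
        return 0.0            # a fully blocked slice stops flow entirely
    return len(fracs) / sum(1.0 / f for f in fracs)

def net_to_gross(grid):
    """Fractional volume occupied by flow-contributing cells."""
    cells = [c for row in grid for c in row]
    return sum(cells) / len(cells)

def effective_perm_estimate(grid, k_sand):
    # crude estimate: sand permeability scaled by the constriction factor
    return k_sand * constriction_factor(grid)
```

On a 3x3 grid with one blocked centre cell, the middle slice is two-thirds open, so the constriction factor drops below both the net-to-gross ratio and unity, as intended.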
Masihi M, Gago P, King P, 2016, Percolation-based effective permeability estimation in real heterogeneous porous media
It has long been understood that flow behavior in heterogeneous porous media is largely controlled by the continuity of permeability contrasts. With this in mind, we investigate new methods for fast estimation of the effective permeability that concentrate on the properties of the percolating cluster. From percolation concepts we use a threshold permeability value (Kth), the value at which the gridblocks with the highest permeabilities connect two opposite sides of the system in the direction of flow. These methods can be applied to heterogeneous media with a range of permeability distributions and various underlying structures. We use power-law relations and weighted power averages that can be inferred from the statistics and properties of the percolation sub-networks at the threshold point. This approach does not require fitting to experimental conductivity measurements to estimate the model parameters, as is done in empirical methods. We examine the accuracy of these methods on selected layers of the 10th SPE model and find very good agreement with the values determined from commercial flow simulators. The results of this work offer insights into new methods for estimating the effective permeability using percolation concepts.
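The threshold-permeability construction can be made concrete. The sketch below is a hedged illustration of the general idea (the paper's exact procedure may differ): gridblocks are activated in decreasing permeability order, with a union-find structure tracking connectivity, until the activated blocks first connect the left and right faces; the permeability at that step is Kth.

```python
# Hedged sketch of the threshold-permeability (Kth) idea: add gridblocks
# in decreasing permeability order until the highest-permeability blocks
# first span the system from the left face to the right face.

def find(parent, a):
    while parent[a] != a:
        parent[a] = parent[parent[a]]   # path halving
        a = parent[a]
    return a

def union(parent, a, b):
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[ra] = rb

def threshold_permeability(perm):
    """perm: 2-D list of permeabilities. Returns Kth, the value at which
    the high-permeability blocks first connect left and right faces."""
    ny, nx = len(perm), len(perm[0])
    n = ny * nx
    LEFT, RIGHT = n, n + 1              # virtual face nodes
    parent = list(range(n + 2))
    active = [[False] * nx for _ in range(ny)]
    cells = sorted(((perm[j][i], j, i) for j in range(ny) for i in range(nx)),
                   reverse=True)
    for k, j, i in cells:
        active[j][i] = True
        idx = j * nx + i
        if i == 0:
            union(parent, idx, LEFT)
        if i == nx - 1:
            union(parent, idx, RIGHT)
        for dj, di in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            jj, ii = j + dj, i + di
            if 0 <= jj < ny and 0 <= ii < nx and active[jj][ii]:
                union(parent, idx, jj * nx + ii)
        if find(parent, LEFT) == find(parent, RIGHT):
            return k                    # spanning first occurs here: Kth
    return None
```

Union-find makes each activation near-constant-time, so Kth is found in a single sorted sweep over the grid rather than by repeated flow simulation.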
Masihi M, Gago PA, King PR, 2016, Estimation of the Effective Permeability of Heterogeneous Porous Media by Using Percolation Concepts, TRANSPORT IN POROUS MEDIA, Vol: 114, Pages: 169-199, ISSN: 0169-3913
Sadeghnejad S, Masihi M, King PR, 2016, Study the connectivity of good sands between two wells represented by two points using percolation theory
One of the major applications of percolation theory in petroleum engineering is the investigation of connectivity in complex formations. Production is normally achieved through a heterogeneous porous medium. Proper assessment of formation connectivity, accounting for its heterogeneity, is important in formation evaluation. Percolation assumes that heterogeneity can be simplified to either permeable or impermeable rock types. On this basis, the system outcome (e.g., prediction of recovery) can be easily described by simple mathematical relationships which are entirely independent of the small-scale details of the formation. The main contribution of this work is to use a continuum percolation approach to estimate the connectivity of permeable sands between two points (P2P), representing an injection and a production well, and to compare the results with the conventional line-to-line (L2L) connectivity used in previous studies. In particular, the percolation exponents are investigated for both P2P and L2L and their connectivity curves are compared. For this purpose, an object-based technique based on Monte Carlo simulation is used to model the spatial distribution of isotropic sandbodies in 2-D. The results show that proper modelling of the shape of the wells is a critical issue that can alter the estimated amount of connected hydrocarbon when one uses the percolation approach.
Sadeghnejad S, Masihi M, King PR, et al., 2016, Study the effect of connectivity between two wells on secondary recovery efficiency using percolation approach
Estimating the hydrocarbon available to be produced during secondary oil recovery is an ongoing activity in field development. The primary plan is normally scheduled during the early stage of a field's life through master development plan studies. During this period, due to the lack of reliable data, estimation of the field efficiency is usually based on rules of thumb rather than detailed field characterization. Hence, there is great motivation to produce simpler, physically based methodologies. The minimal input requirements of the percolation approach make it a useful tool for formation performance prediction. This approach enables us to attain a better assessment of the efficiency of secondary recovery methods at early production time. The main contribution of this study is to establish a continuum percolation model based on Monte Carlo simulation that can estimate the connectivity of good sands between two wells. In classical percolation, connectivity is considered between two lines or two faces of the system in 2- and 3-D, whereas hydrocarbon production is achieved through wells with the shape of lines (e.g., vertical, horizontal, or deviated wells). In addition, the results show that failing to implement the correct geometry of the wells can alter the results estimated from the percolation approach.
Westbroek MJE, King PR, Vvedensky DD, 2016, Path Integral Method for Flow through Random Porous Media
One of the key problems in modelling flow in oil reservoirs is our lack of precise knowledge of the variations in flow properties across the field. At best we can infer the statistics of these variations from field observations. The challenge is to determine the statistics of the flow itself (flow rates, pressures, etc.) from the statistics of the permeability variations. Conventional simulations are computationally very expensive unless smart sampling techniques or surrogate models are used. In this paper we demonstrate the use of a path integral formulation for this problem. To demonstrate how this method works, we start with the one-dimensional Darcy flow problem: q(x) = -K(x) dp(x)/dx, where p(x) is the pressure, q(x) is the flow rate and K(x) is the rock permeability. The randomness of the porous medium is modelled by regarding K as a stochastic quantity which is assumed to follow Gaussian statistics. Because of the randomly varying rock structure, there is a variety of conceivable pressure realisations p(x). The path integral Z is an integral over all realisations with an appropriate probability measure. Once Z is evaluated, either analytically or by standard Monte Carlo methods, any observable of interest, including pressure correlations, can be easily obtained.
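The 1-D setting described here is easy to reproduce with plain Monte Carlo, the baseline against which a path-integral treatment would be compared. In this sketch the permeability is lognormal (to keep K positive, whereas the abstract assumes Gaussian statistics), the inlet/outlet pressures are fixed, and for steady incompressible flow the constant flow rate follows from summing the cells' resistances dx/K; all parameter values are illustrative.

```python
# Monte Carlo baseline for 1-D Darcy flow q = -K(x) dp/dx with fixed
# inlet/outlet pressures and a random permeability field.
import random
import statistics

def flow_rate(perms, dx, p_in, p_out):
    """Steady 1-D flow through cells of length dx: since q is constant
    along the tube, it is set by the total (harmonic) resistance."""
    resistance = sum(dx / k for k in perms)
    return (p_in - p_out) / resistance

def monte_carlo_flow(n_cells=50, dx=1.0, p_in=2.0, p_out=1.0,
                     n_samples=2000, seed=0):
    """Sample random permeability fields and return the mean and
    standard deviation of the resulting flow rate."""
    rng = random.Random(seed)
    samples = [flow_rate([rng.lognormvariate(0.0, 0.5)
                          for _ in range(n_cells)], dx, p_in, p_out)
               for _ in range(n_samples)]
    return statistics.mean(samples), statistics.stdev(samples)
```

For two unit cells of unit permeability and a unit pressure drop the flow rate is exactly 0.5, which is a convenient sanity check on the resistance sum.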
Al-Shamma BR, Gosselin O, King P, et al., 2015, Comparative performance of history matching algorithms using diverse parameterization methods: Application to a synthetic and North Sea case, Pages: 146-164
Copyright 2015, Society of Petroleum Engineers. Extending the life of the Johnston field requires an ability to produce reliable forecasts of the effects of reservoir interventions. Reliable forecasts need an effective, robust and accurate history match. The goal of this paper is to evaluate the relative performance of three different global assisted history-matching algorithms. We describe the implementation of an integrated parameterization and optimization method, which was tested on the Brugge synthetic model (SPE benchmark case study), and the results based on the best selected method are applied to the Johnston gas field in the Southern North Sea. There are two key components to assisted history matching: (i) choice of the parameterization, and (ii) selection and performance of the optimization algorithm. In this paper, three parameterization methods are tested (including use of gradients) on three global history-matching algorithms: Particle Swarm Optimization (PSO), Evolutionary Algorithm (EA) and Differential Evolution (DE). A combination of two different algorithms was also examined. We assessed the algorithm efficiency based on the lowest achieved objective function and the time taken to converge to the lowest value; the quality of the parameterization was examined based on the lowest objective value, the best history match and the consistency of geological parameters used for the history match. The results show that the effectiveness of each optimization algorithm is dependent on the parameterization method. When comparing parameterization methods, the Around the Median method seems to give the best results in terms of lowest misfit, and best history match when using the same history matching algorithm for the Brugge model example. When comparing across algorithms the DE method performs better than the rest. An iterative combination of algorithms is seen to be the best option and assists in a further minimization of the objective function. The effect of
Alkhatib AM, King PR, 2015, The use of the least-squares probabilistic-collocation method in decision making in the presence of uncertainty for chemical-enhanced-oil-recovery processes, Pages: 747-766, ISSN: 1086-055X
Copyright © 2015 Society of Petroleum Engineers. The least-squares Monte Carlo method (LSM) is a decision-evaluation method that can capture the value of flexibility of a process. This method was shown to provide us with some insight into the effect of uncertainty on decision making and to help us capture the upside potential or mitigate the downside effects for a chemical enhanced-oil-recovery (EOR) process. The method is a stochastic approximate dynamic programming approach to decision making. It is modeled after a forward simulation coupled with a recursive algorithm, which produces the near-optimal policy. It relies on Monte Carlo simulation to produce convergent results. This incurs a significant computational requirement when using this method to evaluate decisions for reservoir-engineering problems because this requires running many reservoir simulations. The objective of this study was to enhance the performance of the LSM by improving the sampling method used to generate the technical uncertainties used in producing the production profiles and to extend its application to different chemical EOR processes. The probabilistic-collocation method has been proved to be a robust and efficient uncertainty-quantification method. It approximates the random input distributions by use of polynomial-chaos expansions and produces a proxy polynomial for the output parameter requiring a limited number of model responses that is conditional on the number of random inputs and the order of the approximation desired. The resulting proxy can then be used to generate the different statistical moments with negligible computational requirement. By use of the sampling methods of the probabilistic-collocation method to approximate the sampling of the technical uncertainties, it is possible to significantly reduce the computational requirement of running the decision-evaluation method. This is known as the least-squares probabilistic-collocation method (LSPCM). Both methods were then applied to chemical-EOR processes using a number of stylized reservoir models.
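The least-squares Monte Carlo step can be illustrated with a deliberately tiny toy (one decision, scalar state, no reservoir physics): realised future payoffs are regressed on the current state to obtain a continuation-value proxy, and the policy acts whenever the immediate payoff beats it. Everything below is invented for illustration and stands in for the reservoir-simulation workflow described in the abstract.

```python
# Toy illustration of the least-squares Monte Carlo (LSM) regression
# step: estimate the value of continuing by regressing realised future
# payoffs on the current state, then compare with the immediate payoff.
import random
import statistics

def linreg(xs, ys):
    """Ordinary least squares with basis {1, x}."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    slope = cov / var
    return my - slope * mx, slope          # intercept, slope

def lsm_policy(n_paths=5000, seed=1):
    """One-decision toy: acting now pays the current state s; waiting
    pays an uncertain future amount (about 0.5 on average)."""
    rng = random.Random(seed)
    states = [rng.uniform(0.0, 1.0) for _ in range(n_paths)]
    future = [0.5 + rng.gauss(0.0, 0.1) for _ in states]
    a, b = linreg(states, future)          # continuation-value proxy
    return lambda s: "act" if s > a + b * s else "wait"
```

With these toy payoffs the regression learns a continuation value near 0.5 regardless of state, so the policy acts for high states and waits for low ones, which is the shape of decision the LSM recursion produces at each stage.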
Petvipusit KR, Elsheikh AH, King PR, et al., 2015, An efficient optimisation technique using adaptive spectral high-dimensional model representation: Application to CO2 sequestration strategies, Pages: 1576-1595
Copyright © 2015, Society of Petroleum Engineers. The successful operation of CO2 sequestration relies on designing optimal injection strategies that maximise economic performance while guaranteeing long-term storage security. Solving this optimisation problem is computationally demanding. Hence, we propose an efficient surrogate-assisted optimisation technique with three novel aspects: (1) it relies on an ANOVA-like decomposition termed High-Dimensional Model Representation; (2) component-wise interactions are approximated with adaptive sparse grid interpolation; and (3) the surrogate is adaptively partitioned closer to the optimal solution within the optimisation iteration. A High-Dimensional Model Representation (HDMR) represents the model output as a hierarchical sum of component functions with different input variables. This structure enables us to select influential lower-order functions that impact the model output for efficient reduced-order representation of the model. In this work, we build the surrogate based on the HDMR expansion and make use of Sobol indices to adaptively select the significant terms. Then, the selected lower-order terms are approximated by using the Adaptive Sparse Grid Interpolation (ASGI) approach. Once the HDMR is built, a global optimizer is run to decide: 1) the domain shrinking criteria; and 2) the centre point for the next HDMR building. Therefore, this proposed technique is called a walking Cut-AHDMR as it shrinks the search domain while balancing the trade-off between exploration and exploitation of the optimisation algorithm. The proposed technique is evaluated on a benchmark function and on the PUNQ-S3 reservoir model. Based on our numerical results, the walking Cut-AHDMR is a promising approach: not only does it require substantially fewer forward runs in building the surrogate of high dimension but it also effectively guides the search towards the optimal solution. The proposed method p
Abubakar SY, Muggeridge AH, King PR, 2014, Upscaling for thermal recovery, Pages: 468-479
Copyright © (2014) by the Society of Petroleum Engineers All rights reserved. Thermal recovery methods (such as steam and hot water flooding) are increasingly being used to recover bitumen and heavy oils. These schemes are designed using numerical reservoir simulation. Unfortunately, in most cases thermal simulation requires very fine grids to capture both the heat transfer and the associated fluid dynamics and to minimize numerical dispersion. This usually requires powerful computers with large memory and means that simulations take a long time to run. The alternative is to use upscaling. As yet there are no upscaling methodologies suitable for thermal oil recovery methods. Although there is a significant literature on single-phase upscaling and the development of pseudo relative permeabilities for waterflooding applications, these methods do not capture the heat transport or the impact of heat on the oil mobility. In this paper, we use the Buckley-Leverett solution approximated for thermal EOR processes to derive analytical pseudos for both hot water and steam flooding. The methodology involves upscaling both the dependence of oil viscosity on temperature and the relative permeabilities to compensate for the increased numerical dispersion that occurs in coarse grid simulations. The methodology is demonstrated by comparing 1D homogeneous fine and coarse grid simulations. The approach provides significantly improved predictions compared with performing coarse grid simulations without upscaling.
Al-Shamma B, Gosselin O, King P, 2014, Parameterization using sensitivity methods for global history matching techniques, Pages: 887-891
For any history-matching method, an efficient optimisation method is required but, more importantly, an effective selection of parameters. The parameterisation assists in reducing a large number of possible parameters, in the absence of available data measurements, also lowering the number of altered parameters. This paper describes the implementation of flexible integrated parameterisation and optimisation methods, tested on the PUNQ-S3 synthetic model; an iterative series of parameterisations, as a pragmatic strategy; and a comparison between various parameterisation methods: layer-based, gradient-based, median-based, and distribution-based. The chosen parameters are regions or zones where permeabilities and porosity are adjusted using a common multiplier. The selected parameters are then utilized as search parameters to minimize an objective function, which quantifies the mismatch between the observed and simulated production data, using a so-called global minimisation algorithm. Successive parameterisations can be used as part of an iterative process, where the history match is improved by further parameterisation based on the previous "best match". The optimisation techniques cannot perform well without a suitable and effective parameterisation method. This study shows a pragmatic combination of a global technique and various parameterisation methods. It emphasizes that a model with a low objective function value can be far from the true model, and not predictive.
Alkhatib A, King P, 2014, Robust quantification of parametric uncertainty for surfactant-polymer flooding, COMPUTATIONAL GEOSCIENCES, Vol: 18, Pages: 77-101, ISSN: 1420-0597
Alkhatib A, King P, 2014, An approximate dynamic programming approach to decision making in the presence of uncertainty for surfactant-polymer flooding, COMPUTATIONAL GEOSCIENCES, Vol: 18, Pages: 243-263, ISSN: 1420-0597
Alkhatib A, King P, 2014, Enhanced decision making for chemical EOR processes under uncertainty - Applying the LSPC method
The Least Squares Monte Carlo method is a decision evaluation method that can capture the value of flexibility of a process. This method was shown to provide us with some insight into the effect of uncertainty on decision making and to help us capture the upside potential or mitigate the downside effects for a chemical EOR process. The method is a stochastic approximate dynamic programming approach to decision making. It is based on a forward simulation coupled with a recursive algorithm which produces the near-optimal policy. It relies on Monte Carlo simulation to produce convergent results. This incurs a significant computational requirement when using this method to evaluate decisions for reservoir engineering problems because this requires running many reservoir simulations. The objective of this study was to enhance the performance of the Least Squares Monte Carlo method by improving the sampling method used to generate the technical uncertainties used in producing the production profiles. The probabilistic collocation method has been proven to be a robust and efficient uncertainty quantification method. It approximates the random input distributions using polynomial chaos expansions and produces a proxy polynomial for the output parameter requiring a limited number of model responses that is conditional on the number of random inputs and the order of the approximation desired. The resulting proxy can then be used to generate the different statistical moments with negligible computational requirement. By using the sampling methods of the probabilistic collocation method to approximate the sampling of the technical uncertainties, it is possible to significantly reduce the computational requirement of running the decision evaluation method. Thus we introduce the least-squares probabilistic collocation method. Both methods are then applied to chemical EOR problems using a number of stylized reservoir models. The technical uncertainties considered include the residu
Alkhatib AM, King PR, 2014, The use of the least squares probabilistic collocation method in decision making in the presence of uncertainty for chemical EOR processes, Pages: 153-181
Copyright 2014, Society of Petroleum Engineers. The Least Squares Monte Carlo method is a decision evaluation method that can capture the value of flexibility of a process. This method was shown to provide us with some insight into the effect of uncertainty on decision making and to help us capture the upside potential or mitigate the downside effects for a chemical EOR process. The method is a stochastic approximate dynamic programming approach to decision making. It is based on a forward simulation coupled with a recursive algorithm which produces the near-optimal policy. It relies on Monte Carlo simulation to produce convergent results. This incurs a significant computational requirement when using this method to evaluate decisions for reservoir engineering problems because this requires running many reservoir simulations. The objective of this study was to enhance the performance of the Least Squares Monte Carlo method by improving the sampling method used to generate the technical uncertainties used in producing the production profiles. The probabilistic collocation method has been proven to be a robust and efficient uncertainty quantification method. It approximates the random input distributions using polynomial chaos expansions and produces a proxy polynomial for the output parameter requiring a limited number of model responses that is conditional on the number of random inputs and the order of the approximation desired. The resulting proxy can then be used to generate the different statistical moments with negligible computational requirement. By using the sampling methods of the probabilistic collocation method to approximate the sampling of the technical uncertainties, it is possible to significantly reduce the computational requirement of running the decision evaluation method. Thus we introduce the least-squares probabilistic collocation method. Both methods are then applied to surfactant-polymer flooding problems using a number of stylized reservoir models.
Bardy G, Biver P, Caumon G, et al., 2014, Proxy comparison for sorting models and assessing uncertainty on oil recovery profiles
To study the impact of subsurface uncertainties on oil recovery, it is common to build a large set of models which cover these uncertainties. Despite increases in computational capability, as models become more complex it is not possible to perform full-physics flow simulation for all the generated models. This is why stochastic reservoir model sets are often decimated to assess the impact of static uncertainties on dynamic reservoir performance. This contribution focuses on the use of proxies to perform this data set reduction. Many different proxies have been developed, from the simplest to the most complicated, so it is difficult to choose the right one for a particular goal. We present different criteria to compare proxy quality and how proxies help to assess uncertainties on oil recovery. A first criterion is based on the relation which may exist between the model distances computed on the proxy responses and those computed on the flow responses. Another criterion is the speed-up and simplification provided by the proxy compared to the full-physics simulator. These two criteria are very simple and can be applied at an early stage to avoid deploying time-consuming proxies which will not provide accurate information. The last criterion presented here is the confidence interval which can be computed around probabilistic reservoir production forecasts computed on a small representative subset of models. Even if this criterion can be used only when the entire dataset has been simulated, it provides some quantification of a possible bias created by a proxy and of the remaining uncertainties on oil recovery. We present a comparison study of widely different proxy responses applied to a real dataset using this methodology. This gives us some keys to choose a proxy which is a good compromise between accuracy and ease of use.
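The first criterion described, comparing model distances computed on proxy responses with those computed on flow responses, can be sketched directly. The snippet below is illustrative (the paper's exact distance and correlation choices may differ): it computes Euclidean distances between every pair of model responses and scores the proxy by the linear correlation between the two distance lists.

```python
# Sketch of a proxy-quality criterion: correlate pairwise model
# distances computed on proxy responses with those computed on
# full flow-simulation responses. Data shapes are illustrative.
import itertools
import statistics

def pairwise_distances(responses):
    """responses: {model_id: list of response values (e.g. a rate
    time-series)}. Returns Euclidean distances for all model pairs."""
    ids = sorted(responses)
    return [sum((a - b) ** 2
                for a, b in zip(responses[i], responses[j])) ** 0.5
            for i, j in itertools.combinations(ids, 2)]

def pearson(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def proxy_quality(proxy_resp, sim_resp):
    """Correlation of 1.0 means the proxy preserves the relative
    distances between models perfectly, even if its absolute
    responses are biased."""
    return pearson(pairwise_distances(proxy_resp),
                   pairwise_distances(sim_resp))
```

A proxy whose responses are a uniform rescaling of the simulator's scores 1.0 here, which is the point of the criterion: ranking and clustering of models survive even when absolute values do not.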
Petvipusit KR, Elsheikh AH, Laforce TC, et al., 2014, Robust optimisation of CO2 sequestration strategies under geological uncertainty using adaptive sparse grid surrogates, COMPUTATIONAL GEOSCIENCES, Vol: 18, Pages: 763-778, ISSN: 1420-0597
Petvipusit R, El Sheikh AM, King PR, et al., 2014, Robust optimisation using spectral high dimensional model representation - An application to CO2 sequestration strategy
Successful CO2 sequestration relies on operation strategies that maximise performance criteria in the presence of uncertainties. Designing optimal injection strategies under geological uncertainty requires multiple simulation runs over different geological models, rendering it computationally expensive. A surrogate model has been successfully used in several studies to reduce the computational burden by approximating the input-output relationships of the simulator with a limited number of simulation runs. However, building the surrogate is a challenging problem since the cost of building the surrogate increases exponentially with dimension. In the current work, we propose the use of Adaptive Sparse Grid Interpolation coupled with High Dimensional Model Representation (ASGI-HDMR) to build a surrogate of high-dimensional problems. This surrogate is then used to assist with finding robust CO2 injection strategies. High Dimensional Model Representation (HDMR) is an ANOVA-like technique, which is based on the fact that high-order interactions amongst the input variables may not necessarily have an impact on the output variable; the combination of low-order correlations of the input variables can represent the model in high-dimensional problems. Adaptive Sparse Grid Interpolation (ASGI) is a novel surrogate technique that allows automatic refinement in the dimension where added resolution is needed (dimensional adaptivity). The proposed technique is evaluated on several benchmark functions and on the PUNQ-S3 reservoir model that is based on a real field. For the PUNQ-S3 model, robust CO2 injection strategies were estimated efficiently using the combined ASGI-HDMR technique. Based on our numerical results, ASGI-HDMR is a promising approach since it requires significantly fewer forward runs in building an accurate surrogate model for high-dimensional problems in comparison to ASGI without coupling with HDMR. Hence, the ASGI-HDMR enables efficient construction of the surrogates
Sadeghnejad S, Masihi M, Pishvaie M, et al., 2014, Estimating the Connected Volume of Hydrocarbon During Early Reservoir Life by Percolation Theory, ENERGY SOURCES PART A-RECOVERY UTILIZATION AND ENVIRONMENTAL EFFECTS, Vol: 36, Pages: 301-308, ISSN: 1556-7036
Wen H, King PR, Muggeridge AH, et al., 2014, Using percolation theory to estimate recovery from poorly connected sands using pressure depletion
In conventional waterflooding of low to intermediate net-to-gross reservoirs there is always some oil unswept, even in the sands connected to both injection and production wells. This is oil trapped in "dangling ends": flow units only poorly connected to the main flow path. In many cases the unswept volumes can be very large, depending on the properties of the reservoir and fluids and the well locations. In this paper we show how percolation theory can be used to estimate the volumes of oil recovered and those left behind in these dangling ends following a conventional waterflood, without recourse to large-scale simulation. Percolation theory is a general mathematical framework for connectivity and has been used previously to investigate the connectivity of flow units. The structure of these connected clusters in terms of backbones and dangling ends has not previously been studied. The results are also used to estimate the recovery of the unswept oil from dangling ends by a waterflood with a voidage replacement ratio < 1. We use a simple model of stochastically distributed sandbodies to describe the reservoir. Many realizations were generated for a range of net-to-gross ratio values and sandbody and system sizes. In each realization the clusters connecting the injection and production wells were identified. These spanning clusters were subdivided into backbones and dangling ends. The volume fractions of the backbone and dangling ends were then obtained. The statistical average and standard deviation of the volumes associated with these clusters were obtained from the ensemble of realisations. These were used to determine the percolation scaling relationships in terms of simple algebraic formulae that cover the whole range of net-to-gross ratios and system sizes. Our results show that the fraction of dangling ends can reach 20% of the clusters, and 80% among the spanning clusters, indicating a major proportion of the oil would be unswept by conventional waterflooding.
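One way to make the backbone/dangling-end split concrete is the leaf-pruning sketch below. It is an approximation, labelled as such: it strips tree-like dangling ends from the spanning cluster but keeps loops attached by a single bond, which an exact backbone algorithm would also remove. The lattice grid, the left/right "well" faces, and the function names are all illustrative, not the paper's sandbody model.

```python
# Approximate backbone/dangling-end split of a spanning cluster on a
# binary 2-D lattice, with flow from the left face to the right face.
from collections import deque

def spanning_cluster(grid):
    """Cells (value 1) reachable from both the left and right faces."""
    ny, nx = len(grid), len(grid[0])
    def flood(seeds):
        seen, q = set(seeds), deque(seeds)
        while q:
            j, i = q.popleft()
            for dj, di in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                jj, ii = j + dj, i + di
                if 0 <= jj < ny and 0 <= ii < nx and grid[jj][ii] \
                        and (jj, ii) not in seen:
                    seen.add((jj, ii))
                    q.append((jj, ii))
        return seen
    left = flood([(j, 0) for j in range(ny) if grid[j][0]])
    right = flood([(j, nx - 1) for j in range(ny) if grid[j][nx - 1]])
    return left & right

def dangling_fraction(grid):
    """Fraction of the spanning cluster that is dangling ends, using
    iterative leaf pruning as a backbone approximation."""
    cluster = spanning_cluster(grid)
    if not cluster:
        return None                     # no spanning cluster at all
    nx = len(grid[0])
    backbone = set(cluster)
    changed = True
    while changed:
        changed = False
        for (j, i) in sorted(backbone):
            if i in (0, nx - 1):
                continue                # face cells stay attached
            nbrs = sum((j + dj, i + di) in backbone
                       for dj, di in ((1, 0), (-1, 0), (0, 1), (0, -1)))
            if nbrs < 2:                # leaf: a dangling-end cell
                backbone.discard((j, i))
                changed = True
    return 1.0 - len(backbone) / len(cluster)
```

On a small grid with one side branch off the spanning path, the pruning removes exactly that branch, and the returned fraction is the dangling-end volume fraction the abstract's scaling laws are built from.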
Alkhatib A, Babaei M, King PR, 2013, Decision Making Under Uncertainty: Applying the Least-Squares Monte Carlo Method in Surfactant-Flooding Implementation, EAGE Annual Conference and Exhibition incorporating SPE EUROPEC, Publisher: SOC PETROLEUM ENG, Pages: 721-735, ISSN: 1086-055X
Alkhatib A, King P, 2013, Uncertainty quantification of a chemically enhanced oil recovery process: Applying the probabilistic collocation method to a surfactant-polymer flood, SPE Middle East Oil and Gas Show and Conference, MEOS, Proceedings, Vol: 2, Pages: 757-772
Uncertainty in surfactant-polymer flooding is an important challenge to the wide-scale implementation of this process. Any successful design of this enhanced oil recovery process will necessitate a good understanding of uncertainty. Thus it is essential to have the ability to quantify this uncertainty in an efficient manner. Monte Carlo simulation is the traditional approach used for quantifying parametric uncertainty. However, the convergence of Monte Carlo simulation is relatively slow, requiring a large number of realizations to converge. This study proposes the use of the probabilistic collocation method in parametric uncertainty quantification for surfactant-polymer flooding using four synthetic reservoir models. Four sources of uncertainty were considered: the chemical flood residual oil saturation, surfactant and polymer adsorption, and the polymer viscosity multiplier. The output parameter approximated is the recovery factor. The output metrics were the probability density function and the first two moments. These were compared with the results obtained from Monte Carlo simulation over a large number of realizations. Two methods for solving for the coefficients of the output parameter polynomial chaos expansion are compared: Gaussian quadrature and linear regression. The linear regression approach used two types of sampling: Gaussian quadrature nodes and Chebyshev-derived nodes. In general, the probabilistic collocation method was applied successfully to quantify the uncertainty in the recovery factor. Applying the method using Gaussian quadrature produced more accurate results compared with using linear regression with quadrature nodes. Applying the method using linear regression with Chebyshev-derived sampling also performed relatively well. Possible enhancements to improve the performance of the probabilistic collocation method were discussed. These enhancements include: improved sparse sampling, approximation order indepen
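The Gaussian-quadrature route to the polynomial chaos coefficients can be shown in miniature. The sketch below is a one-input illustration, not the paper's multi-input reservoir setup: for a standard-normal input it projects the model output onto Hermite polynomials using the 3-point probabilists' Gauss-Hermite rule, then reads the mean and variance straight off the expansion coefficients.

```python
# Minimal probabilistic-collocation sketch for one standard-normal
# input: second-order Hermite polynomial chaos expansion with 3-point
# Gauss-Hermite quadrature.
import math

# probabilists' 3-point Gauss-Hermite rule (exact through degree 5)
NODES = (-math.sqrt(3.0), 0.0, math.sqrt(3.0))
WEIGHTS = (1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0)
HERMITE = (lambda x: 1.0,               # He0
           lambda x: x,                 # He1
           lambda x: x * x - 1.0)       # He2

def pce_coeffs(model):
    """c_k = E[model(X) He_k(X)] / k!  for X ~ N(0, 1), via quadrature."""
    return [sum(w * model(x) * HERMITE[k](x)
                for w, x in zip(WEIGHTS, NODES)) / math.factorial(k)
            for k in range(3)]

def pce_moments(model):
    """Mean and variance of model(X) from the PCE: mean = c0,
    variance = sum_{k>=1} c_k^2 k!  (Hermite orthogonality)."""
    c = pce_coeffs(model)
    mean = c[0]
    variance = sum(c[k] ** 2 * math.factorial(k) for k in range(1, 3))
    return mean, variance
```

Three model evaluations suffice for any output that is (near-)quadratic in the input, which is the source of the large savings over Monte Carlo sampling noted in the abstract.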
Alkhatib A, King P, 2013, Applying the Probabilistic Collocation Method to surfactant-polymer flooding, Saint Petersburg Russia - From Fundamental Science to Deployment: 17th European Symposium on Improved Oil Recovery, IOR
Enhanced oil recovery has received great attention during the past few years. However, broad-scale implementation requires greater understanding of the relevant uncertainties and their effect on performance. Quantifying this uncertainty is very important for designing these processes, yet traditional methods, which are usually based on Monte Carlo simulations, require a large number of realizations to produce convergent results. We propose the use of a non-intrusive approach known as the Probabilistic Collocation Method (PCM) to quantify parametric uncertainty for surfactant-polymer flooding. The quantification of uncertainty was performed for surfactant/polymer-related state variables such as adsorption rates and residual saturations. The PCM is performed on two reservoir models: a modified section of the SPE10 model and the PUNQ-S3 model. The random input variables' PDFs are first approximated using polynomial chaos expansions, and then probabilistic collocation is used to produce approximations of the reservoir model using the collocation points obtained via Gaussian quadrature and Chebyshev extrema. These approximations can then be used to produce PDFs for output variables such as the recovery factor. Results show that PCM produces results similar to those obtained via Monte Carlo simulation, which requires a large number of simulations, while requiring a significantly lower number of simulation runs.
Babaei M, Elsheikh AH, King PR, 2013, A Comparison Study Between an Adaptive Quadtree Grid and Uniform Grid Upscaling for Reservoir Simulation, TRANSPORT IN POROUS MEDIA, Vol: 98, Pages: 377-400, ISSN: 0169-3913
Babaei M, King PR, 2013, An Upscaling-Static-Downscaling Scheme for Simulation of Enhanced Oil Recovery Processes, TRANSPORT IN POROUS MEDIA, Vol: 98, Pages: 465-484, ISSN: 0169-3913
Ghosh B, King P, 2013, Optimisation of smart well completion design in the presence of uncertainty, Society of Petroleum Engineers - SPE Reservoir Characterisation and Simulation Conference and Exhibition, RCSC 2013: New Approaches in Characterisation and Modelling of Complex Reservoirs, Vol: 2, Pages: 724-740
Intelligent/smart completions are widely used to maximize the value of production wells through higher ultimate hydrocarbon recoveries, to promote better clean-up of unconventional wells during 'flow-back', and to improve sweep efficiency in the case of injector wells. To maximize the economic value of these applications, especially in the presence of uncertainties (geological, reservoir and long-term tool reliability), and to minimize the economic risk, it is vital to optimise the placement and operational settings of Interval Control Devices (ICDs), Autonomous Inflow Control Devices (AICDs) and Interval Control Valves (ICVs). The requirement for optimisation can also arise from the limit present technology places on the maximum number of valves deployable in a single completion string. In this paper an optimisation routine for determining the optimal placement of Interval Control Valves (ICVs) and their inflow settings is presented. The overall optimisation scheme uses the simulated annealing algorithm in conjunction with a commercial reservoir simulator to maximize an objective function that captures the mean and variance of the well's estimated value. Multiple geostatistical realizations are used to incorporate geological/reservoir uncertainty in the optimisation process, and the workflow also accounts for the risk of flow control valve failure. A brief description of the screening methodology (to choose the appropriate inflow control technology) and a decision analysis framework for deploying intelligent completion technology, based on utility theory, are also presented. The optimisation technique was applied to co-optimise the positions and flow cross-section areas of the ICVs in a horizontal well completed in an oil reservoir, using a composite objective function. Geological/reservoir and valve-life uncertainties were incorporated in the routine. The improvement in the well's Net Present Value is between 55 and 70% for the cases investigated.
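The simulated annealing loop at the heart of this scheme can be sketched as follows. The toy objective (mean value minus a variance penalty over a few hypothetical realisation weight vectors) replaces the commercial reservoir simulator, and the step size, cooling schedule and valve count are illustrative assumptions.

```python
import math
import random

random.seed(42)

# Hypothetical per-realisation NPV weights for three valve settings,
# standing in for geostatistical realisations run through a simulator.
REALISATIONS = [(1.0, 0.6, 0.3), (0.8, 0.9, 0.2), (1.1, 0.4, 0.5)]

# Composite objective: mean estimated value minus a variance (risk) penalty.
def objective(settings):
    npvs = [sum(s * w for s, w in zip(settings, weights))
            for weights in REALISATIONS]
    mean = sum(npvs) / len(npvs)
    var = sum((v - mean) ** 2 for v in npvs) / len(npvs)
    return mean - 0.5 * var

def anneal(n_valves=3, steps=2000, t0=1.0, cooling=0.995):
    x = [random.random() for _ in range(n_valves)]  # fractional valve openings
    f = objective(x)
    best, best_f = x[:], f
    t = t0
    for _ in range(steps):
        # Perturb one valve setting and clamp it to the feasible range [0, 1].
        y = x[:]
        i = random.randrange(n_valves)
        y[i] = min(1.0, max(0.0, y[i] + random.uniform(-0.1, 0.1)))
        fy = objective(y)
        # Metropolis rule: always accept improvements, sometimes worse moves.
        if fy > f or random.random() < math.exp((fy - f) / t):
            x, f = y, fy
            if f > best_f:
                best, best_f = x[:], f
        t *= cooling
    return best, best_f

best, best_f = anneal()
```

In the paper's setting each objective evaluation is a full reservoir simulation, so the annealing schedule trades off search quality against simulation budget.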
Petvipusit R, Elsheikh AH, Laforce T, et al., 2013, A robust multi-criterion optimization of CO2 sequestration under model uncertainty, Sustainable Earth Sciences, SES 2013: Technologies for Sustainable Use of the Deep Sub-Surface
Successful CO2 storage in deep saline aquifers relies on economic efficiency, sufficient capacity and long-term security of the storage formation. Unfortunately, these three criteria are generally in conflict, and are difficult to guarantee when geological characterisation of the storage site is lacking. We address these challenges by developing: 1) multi-well CO2 injection strategies using multi-criterion optimization to handle the conflicting objectives; and 2) CO2 injection management that is robust against model uncertainty. The PUNQ-S3 model was modified to include a leaky storage formation in order to study injection strategies and the associated risks of CO2 leakage under geological uncertainty. Based on our numerical results, NSGA-II with the ASGI technique can effectively obtain a set of efficient-frontier injection strategies. In the uncertainty assessment, the impact of model uncertainty on the outcomes is significant. Our findings therefore suggest using a mixture distribution of the objective-function values, as opposed to the traditional Gaussian distribution, to cover model uncertainty.
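The non-dominated sorting that underpins NSGA-II can be sketched in a few lines. The strategy scores below are hypothetical (stored volume, long-term security) pairs, both to be maximised; real NSGA-II adds crowding distance and genetic operators on top of this front extraction.

```python
# Extract the non-dominated (Pareto) front from a set of candidate
# strategies, each scored on objectives that are all to be maximised.
def pareto_front(points):
    front = []
    for p in points:
        # p is dominated if some distinct q is at least as good in every
        # objective (and therefore strictly better in at least one).
        dominated = any(q != p and all(qi >= pi for qi, pi in zip(q, p))
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (stored volume, long-term security) scores for four
# injection strategies; storage and security conflict, as in the abstract.
strategies = [(5.0, 0.2), (4.0, 0.9), (3.0, 0.5), (5.5, 0.1)]
front = pareto_front(strategies)
```

Here the third strategy is dominated (another strategy stores more and is more secure), so the front retains only the genuine storage/security trade-offs.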
Sadeghnejad S, Masihi M, King PR, 2013, Dependency of percolation critical exponents on the exponent of power law size distribution, PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, Vol: 392, Pages: 6189-6197, ISSN: 0378-4371
Sadeghnejad S, Masihi M, Pishvaie M, et al., 2013, Rock Type Connectivity Estimation Using Percolation Theory, MATHEMATICAL GEOSCIENCES, Vol: 45, Pages: 321-340, ISSN: 1874-8961
Al-Bulushi NI, King PR, Blunt MJ, et al., 2012, Artificial neural networks workflow and its application in the petroleum industry, NEURAL COMPUTING & APPLICATIONS, Vol: 21, Pages: 409-421, ISSN: 0941-0643