Introduction

Well tests have been widely used for several decades in the oil industry to estimate reservoir properties such as initial pressure, fluid type and permeability, and to identify reservoir barriers/boundaries in the formation volume near the wellbore (Pitzer 1964). Information collected during well testing usually consists of flow rates, pressure patterns, temperature data and fluid samples (Sulaiman and Hashim 2016).

Well-testing operations in new fields face difficulties such as geological complexity, which requires advanced technology (Wiebe 1980), experienced personnel and longer operation durations. Operating companies regularly develop new fields without extensive evaluation of the difference between predicted and actual reserves (Bittencourt and Horne 1997). Therefore, reservoir engineers need to be careful in interpreting a routine drill stem test (DST) (Dean and Petty 1964; Oliver and Chen 2011). There is no single method of testing and sampling that is best for every purpose under every circumstance (Soh 2010; Whittle et al. 2003). The selection of the test type, sequence and duration has to be balanced against operational risk, geology, environmental constraints, equipment and the economic value derived from affecting early decisions on project appraisal or development (Bottomley et al. 2015; Li et al. 2014).

There are several sources of uncertainties in the analyses and interpretations of well-test data (Ballin et al. 1993; Sulaiman and Hashim 2016; Vargas et al. 2015; Bittencourt and Horne 1997) as follows:

  • Physical errors in the pressure data, such as noise, drift, temperature effects and time shift.

  • Errors in the flow rate measurement.

  • The non-unique response of the reservoir, combined with production-time effects.

  • Uncertainty of rock and fluid properties.

Routine well-test analysis provides data on the average properties of the reservoir near the well, but does not provide the overall reservoir characteristics and boundaries. One of the main reasons for this limitation is that traditional well-test analysis handles transient pressure data collected from a single well over a short duration. For instance, log and modular formation dynamics tester (MDT) data only provide information on the areas adjacent to the wellbore, while seismic data cannot delineate the heterogeneity of the reservoir (Ballin et al. 1993; Khaledialidusti et al. 2015).

The Moga field is located in western Sudan and was discovered at the beginning of 2003. It was a promising location but carried considerable risks, such as the uncontrolled movement of armed, rebellious tribes during the planned drilling and completion operations, the geological nature of the area and the heterogeneity of the petrological layers. These factors led to a situation in which a standard DST operation was carried out for three days without a design for the operation sequence, which should have been based on the characteristics of the field situation.

Drilling the first well was difficult and costly and ended with abandonment of the well due to lost circulation. Various measures were taken during drilling of a test borehole in the second well (Well 22), which was performed as an exploratory well targeting the mid-lower Abu Gabra formation at a proposed depth of 2700 mKB and was safely completed. However, the high mud density caused extensive damage to the near-wellbore area, which distorted the measured oil flow rate so that it does not represent the real reservoir characteristics, compounded by the additional risks of a poorly designed testing time and large pressure noise. In view of these anomalies, however, it would not be logical to conclude that the test cannot be salvaged simply because it was run on an imperfect source of data.

If the well-test analyst does the work without drawing on well-informed, reliable data from different sources, many errors result; these can produce misleading parameters that affect the total estimation of reserves and eventually lead to financial loss for the company.

In this paper, the production rate and pressure data from a noisy gauge were analyzed by combining two gauges, and a new pressure profile was generated from the confidence interval. The results were analyzed using the convolution and deconvolution methods to extract the skin factor, productivity index and average permeability, which resolved the problem of defining the reservoir boundaries and offers further advantages in the analysis, especially for future development planning and for reducing the effect of noisy data in the initial stage.

Literature review

Obtaining results for damaged exploratory petrological formations using deconvolution simulation can be quite challenging compared with convolution simulation; on the other hand, boundary identification using the regular (convolution) simulation can be quite misleading and is more appropriately performed through the deconvolution method. Using the deconvolution method in a high-risk area such as the Moga field offers further advantages in the analysis, especially for future development planning and for reducing the effect of noisy data in the initial stage.

Convolution method

The details of convolution are introduced here through a short review of the physical principle underlying methods that estimate reservoir properties from multiple flow periods, namely the principle of superposition. The principle of superposition is closely related to convolution: it relies on the assumed linearity of the reservoir and the known step response of the system, which makes it possible to compute the pressure response of multiple flow periods by superposing the known step responses. Convolution is used to calculate the pressure response of a known system to any flow rate function.
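
Under this linearity assumption, the relationship can be written as the standard superposition (Duhamel) integral; the notation below (q for flow rate, g for the impulse response, \Delta p_u for the unit-rate step response) follows common well-testing usage rather than any equation given in the original text:

    \Delta p(t) = p_i - p_{wf}(t) = \int_0^t q(\tau)\, g(t - \tau)\, \mathrm{d}\tau

and, for a piecewise-constant rate history q_1, q_2, \ldots, q_n with rate changes at times t_0 < t_1 < \cdots < t_{n-1},

    \Delta p(t) = \sum_{j=1}^{n} \left( q_j - q_{j-1} \right) \Delta p_u\!\left( t - t_{j-1} \right), \qquad q_0 = 0.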

Bourdet (2002) presented a major development of the mid-1980s, in which the log-time derivative of the wellbore storage and skin solution is superimposed on the accepted form of the storage type curve. Combining the two types of curves can lead to a unique match of the field data, doing away with the need for a Horner buildup graph. It was also observed that data taken before the semi-log straight line could be interpreted. As is often the case, the initial claims for this new method were not entirely correct, because some major advantages of the derivative graph were not yet evident. The first impression of the derivative procedure was that high-precision data would be required to make it feasible; the solution was first to develop good numerical procedures to differentiate and categorize the numerous field data. To date, Bourdon-gauge data taken in the 1960s, as well as high-precision liquid-level data and gas-purged capillary tube data, have been successfully differentiated.
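
As a purely illustrative sketch of such a numerical differentiation step (not the procedure used in this study; the function name, the default smoothing window and the overall structure are assumptions), a log-time pressure derivative with a smoothing window L, in the spirit of the Bourdet derivative, could be computed as follows:

    import numpy as np

    def bourdet_derivative(t, dp, L=0.1):
        # t  : elapsed times (t > 0), strictly increasing, 1-D array
        # dp : pressure change at those times, 1-D array
        # L  : smoothing window in natural-log cycles (hypothetical default)
        lnt = np.log(np.asarray(t, dtype=float))
        dp = np.asarray(dp, dtype=float)
        deriv = np.full(dp.shape, np.nan)
        for i in range(1, len(lnt) - 1):
            # choose neighbours at least L log-cycles away where possible
            j = i - 1
            while j > 0 and lnt[i] - lnt[j] < L:
                j -= 1
            k = i + 1
            while k < len(lnt) - 1 and lnt[k] - lnt[i] < L:
                k += 1
            dx1 = lnt[i] - lnt[j]        # left spacing in ln(t)
            dx2 = lnt[k] - lnt[i]        # right spacing in ln(t)
            m1 = (dp[i] - dp[j]) / dx1   # left slope
            m2 = (dp[k] - dp[i]) / dx2   # right slope
            deriv[i] = (m1 * dx2 + m2 * dx1) / (dx1 + dx2)  # weighted average
        return deriv

A smoothing window of roughly 0.1-0.2 log cycles is a common starting point; larger windows suppress more noise but risk masking genuine reservoir features.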

One final important step taken since 1976 concerns computer-aided interpretation. The various type curves and semi-log graphs involved in well-test analysis, together with the differentiation of field data, are perfectly suited to a computer with the appropriate software.

Deconvolution method

In general, deconvolution is the inverse of convolution. Convolution is used to calculate the output of a system when the input and the system dynamics, often described by the impulse response, are known, whereas deconvolution is used to calculate the impulse response when the input and output signals are known. The aim of deconvolution is therefore to calculate the impulse response of the system from the transient pressure response and the flow rate. In the well-testing literature, however, various definitions of deconvolution are used. All of them are based on methods for extracting the impulse response g(t) from the input and output; deconvolution, as the strict inverse of convolution, is not always applied in this sense, and estimating the step response h(t) of the system is sometimes also considered deconvolution. A skilled engineer can eventually obtain the same answers to deconvolution problems using a regular analysis (convolution method) (Xiaohu et al. 2010), but deconvolution still gives a direct view of the underlying model controlling the well, thus leading to the right answer faster, and it provides the equivalent radius of investigation for the test.
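
As a minimal, purely illustrative sketch of the idea (a naive regularized least-squares inversion, not the deconvolution algorithm implemented in the simulator used here or in the cited references; the names, the uniform-sampling assumption and the regularization weight are all assumptions), a discretized impulse response g and the corresponding step response h could be extracted from rate and pressure data as follows:

    import numpy as np

    def deconvolve_impulse_response(q, dp, dt, reg=1e-3):
        # q   : flow rate sampled at a uniform spacing dt, 1-D array
        # dp  : measured pressure change at the same times, 1-D array
        # reg : Tikhonov regularization weight (hypothetical value)
        q = np.asarray(q, dtype=float)
        dp = np.asarray(dp, dtype=float)
        n = len(dp)
        # Convolution matrix A such that dp[i] ~= sum_k A[i, k] * g[k]
        A = np.zeros((n, n))
        for i in range(n):
            for k in range(i + 1):
                A[i, k] = q[i - k] * dt
        # Regularized least squares: minimize ||A g - dp||^2 + reg ||g||^2
        g = np.linalg.solve(A.T @ A + reg * np.eye(n), A.T @ dp)
        h = np.cumsum(g) * dt   # step (unit-rate) response h(t)
        return g, h

Published deconvolution algorithms typically work with the logarithmic derivative of the response and impose smoothness constraints to cope with noisy data, which a simple inversion such as this does not attempt.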

In the field of well testing, deconvolution is considered a reliable tool for estimating the shut-in pressure response of the reservoir under a varying flow rate (Vaferi and Eslamloueyan 2015). This technique improves the choice of type curve by reshaping the data, which in turn improves the estimation of the reservoir properties (Horne and Liu 2013; Liu and Horne 2013). Since 1999, considerable research attention has been devoted to deconvolution in the field of well testing (Gringarten 2006; Gringarten 2010; Houze et al. 2010; Levitan 2007; Onur et al. 2009).

Methodology

Well and reservoir overview

Based on the drilling data and the completion program, the objective of this test was to estimate the skin factor and the initial reservoir pressure in this area, both of which were needed for the next drilling operations (Table 1).

Table 1 Overview data

Well-testing reservoir parameter

Well testing is conducted to obtain data for interpreting the reservoir characteristics. The average parameters, including porosity, permeability, pressure, temperature, depth, oil viscosity and GOR, need to be identified or assumed as the case may be; these data are gathered by the geologist and tested at the PVT laboratory to obtain the expected results.

The input data from the geologist's well reports, fluid properties and completion reports are summarized in Table 2. In the case of the Abu Gabra reservoir, the reservoir is homogeneous; it is therefore assumed to be isotropic and of constant thickness.

Table 2 Reservoir parameters

The study focused on Well-22, specifically on the drill stem test in the Abu Gabra reservoir, because it was the first well to be completed and its production time was very short. As such, careful attention was paid to choosing the appropriate sequence of analysis to work with.

Result and discussion

Convolution analysis

The previous data were plotted using the simulator; Cartesian plots of pressure versus time and pressure difference versus time were generated.

When the pressure gauge data were loaded, it could be seen that noise distorted the early part of the record, as shown in Fig. 1.

Fig. 1 Pressure plot (2 gauges with noisy data)

Two gauges were loaded: the green points refer to the basic gauge and the orange points to the back-up gauge. The regularly distributed stray points were caused by mechanical issues, while the difference between the basic gauge and the back-up gauge caused confusion; instead of choosing one of them to work with, it was better to create a new test record.

Using the difference between the gauge readings, it was possible to obtain a new calibration between the original gauge and the back-up gauge, which partially solved the problem. The history plot is created automatically once the production or pressure data are loaded. The plot is split into a production plot at the top section, with the option of displaying cumulative production, and a pressure plot at the bottom section, as shown in Fig. 2.
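
A minimal, hypothetical sketch of such a gauge-combination step is given below; the resampling approach, the tolerance value and all names are illustrative assumptions rather than the actual workflow used for this test:

    import numpy as np

    def combine_gauges(t1, p1, t2, p2, tol_psi=5.0):
        # t1, p1 : time and pressure from the basic gauge
        # t2, p2 : time and pressure from the back-up gauge
        # tol_psi: agreement tolerance between the gauges (hypothetical)
        p2_on_t1 = np.interp(t1, t2, p2)    # resample back-up onto basic time base
        diff = np.abs(p1 - p2_on_t1)
        combined = 0.5 * (p1 + p2_on_t1)    # simple two-gauge average
        noisy = diff > tol_psi              # flag samples outside the tolerance band
        return combined, noisy

Flagged samples can then be reviewed, down-weighted or excluded before the combined profile is loaded into the analysis.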

Fig. 2 History plot (pressure, rate vs. time)

A clear infinite-acting radial flow (IARF) regime is not found, as shown in Fig. 3. A homogeneous reservoir model with a circular boundary scenario seems to be the best choice, although it remains somewhat uncertain.

Fig. 3 Simulated derivative-type curve for circle scenario

To obtain a representative line, as shown in Fig. 4, more points are needed to work with, as they would clearly indicate the large surge in flow rate that passes very quickly through the reservoir area. The circular boundary and a high skin factor give the best simulation scenarios.

Fig. 4 Simulated semi-log

The simulated semi-log plot also does not provide enough points in the middle-time region of the response. The semi-log line does not sit in the proper region because of the lack of data and the unreliable results, so cross-checking the permeability and skin factor from the simulated semi-log against the simulated history and derivative plots would be a waste of effort, as shown in Fig. 5. The results are illustrated in Table 3.

Fig. 5 Simulated history plot for circle scenario

Table 3 Convolution simulated results

Deconvolution analysis

Using the data extracted from the deconvolution process and the modeling exercise, the assumed model based on the same data gave the characteristic features of a homogeneous reservoir, but with parallel faults, which produced more appropriate results in the simulation process, as shown in Fig. 6.

Fig. 6 Deconvolution derivative plot

After the pressure and reservoir responses were generated, the IARF appeared in short sequences, which can be detected easily in the chosen reservoir region; this provides better estimates of the reservoir characteristic properties for use at a later stage.

Figures 7 and 8 show a yellow line representing the reservoir response; the departure from the flat line indicates a structural feature, in this case parallel faults acting as no-flow boundaries. The results are shown in Table 4.

Fig. 7 Semi-log for deconvolution analysis

Fig. 8 Deconvolution simulated history plot for parallel fault with no-flow scenario

Table 4 Deconvolution simulation results

Discussion

Tables 3 and 4 show the characteristic fracture damage. The change in the skin factor (s) determined by the convolution and deconvolution methods is estimated at 4.3% of the average value, which can be neglected for practical purposes. The variation in the initial reservoir pressure between the two methods is within an acceptable limit. However, the reservoir permeability cannot be obtained reliably from the convolution process because the data do not represent the extended reservoir area; the results show a reduction in permeability of up to 20%, which is beyond the accepted range of practical error. On the other hand, the deconvolution method gave a better result after applying the parallel-fault condition, even though it does not closely match the circular-boundary result. It should be noted that the convolution method can yield a very wrong estimate of the boundary identity, whereas the deconvolution technique is more useful for identifying the boundary. To accept the results of the deconvolution technique, the past well history, with its lost circulation, should be carefully evaluated in retrospect, whereas applying the convolution method would require more time stages to reduce the flow rate by changing the choke size. Furthermore, the convolution method cannot truly forecast the reservoir capacity, because the formation damage created the assumption of a flow rate higher than the reservoir's actual capacity.
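
One plausible reading of the 4.3% figure, assuming it expresses the difference between the two skin estimates relative to their mean (the exact definition is not stated in the text), is

    \frac{\left| s_{\text{conv}} - s_{\text{deconv}} \right|}{\tfrac{1}{2}\left( s_{\text{conv}} + s_{\text{deconv}} \right)} \times 100\% \approx 4.3\%.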

Conclusion

In this study, transient pressure analysis for a vertical well was conducted using the Saphir simulator. The techniques introduced in this work give an excellent estimation of the flow characteristics and address areas of uncertainty in oil exploration, such as the interpretation of the targeted oil thickness, incorrect estimation of the skin factor and overestimation of the productivity index.

The uncertainty associated with the noisy pressure gauges was reduced by using the deconvolution technique. It can be mitigated considerably further if a calculated plan is made for other deep exploration wells by estimating the location of the nearest fault and the heterogeneity change in each layer.

Recommendation

Further research should be developed for the analysis of long-term production data from exploration wells, and the present deconvolution analysis concepts should be extended to include the rest of the wells. The use of the deconvolution technique can produce better results, consistent with the real boundary identity, since common usage of the regular (convolution) technique has frequently been quite misleading with regard to reservoir properties, especially permeability, which can be bracketed using other logs to obtain a better view.