• High-frequency data. The gas, water, and oil rates can be taken every second for real-time and DOF operations. Some meters, such as Coriolis meters and multiphase flow meters (MPFMs), need more than 15 s to measure the rate. Orifice and venturi meters can measure the fluid in seconds or continuously. Vortex and turbine meters depend on the spinner velocity and generally take a few seconds. Downhole memory pressure sensors can record the pressure signal every decisecond (1/10 s); this high sampling frequency is needed to identify the early-time properties in a pressure transient analysis (PTA) test, such as the wellbore storage coefficient and skin factor. Fiber optic distributed temperature sensing (DTS) systems also produce high-frequency data.
• High-definition data. Data that stream at terabytes per minute, such as fiber optic distributed acoustic sensing (DAS) and wellbore acoustic/microseismic data.
Fig. 3.5 depicts the data storage for a real-time database, a 24-h production test database, and an extra-large database designed for seismic and fiber optic information. All the data are stored and compressed in a master data bank. In a second step, the data are cleaned, checked for spikes, filtered, conditioned, and restructured into separate SQL data tables. Depending on the final engineering purpose, the data are organized and downsampled into seconds, minutes, hours, days, or months and then summarized for the final use. For example, PTA data tables are stored in deciseconds, RTA in hours, DCA in days, well test performance (nodal analysis) in hours to days, and numerical models, material balance equation (MBE), economic analysis, and financial calculations in days to months.
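The purpose-driven aggregation described above can be sketched as follows, assuming cleaned per-second rate and pressure data held in a pandas DataFrame; the table names, column names, and resampling rules are illustrative choices, not specifications from the text.

```python
# Minimal sketch: rolling cleaned per-second data up into coarser tables
# for different engineering workflows (names and intervals are illustrative).
import pandas as pd

def summarize_for_workflows(raw: pd.DataFrame) -> dict[str, pd.DataFrame]:
    """raw: DataFrame indexed by timestamp, with columns such as
    'pressure_psi', 'oil_rate_stb_d', 'gas_rate_mscf_d', 'water_rate_stb_d'."""
    return {
        # PTA keeps the finest granularity available (here: as recorded).
        "pta_finest": raw,
        # RTA and nodal analysis typically work on hourly averages.
        "rta_hourly": raw.resample("1h").mean(),
        # DCA and material balance use daily averages.
        "dca_daily": raw.resample("1D").mean(),
        # Economic and financial summaries roll up to monthly values.
        "economics_monthly": raw.resample("MS").mean(),
    }
```

Each entry could then be written to its own SQL table so that every downstream tool reads data already at its natural time resolution.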
3.3.2 Down Sampling Raw Data
The data are frequently reduced, filtered, or simply downsampled to make them manageable for engineering purposes. In signal processing this is called "decimation" and is commonly used in PTA or RTA to reduce the pressure signal data by roughly 10:1 versus the raw data. Downsampling is a data processing technique that reduces the data frequency from seconds to minutes to hours while preserving the main signal changes, variations, and physical meaning of the data. Fig. 3.6 illustrates the downsampling process for an example RTA analysis; note that data in seconds are downsampled to hours. Fig. 3.6A is a plot of the 1-s data taken over a 24-h period; Fig. 3.6B illustrates the data being cleansed of out-of-range values and spikes and then downsampled; and Fig. 3.6C shows the downsampled data at 1-h intervals for the day.
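A minimal sketch of the cleanse-and-downsample step illustrated in Fig. 3.6 is given below, assuming 1-s pressure data in a pandas Series; the valid range, spike window, and threshold values are illustrative assumptions rather than values from the text.

```python
# Sketch of cleansing 1-s pressure data and downsampling to 1-h averages.
import pandas as pd

def cleanse_and_downsample(pressure_1s: pd.Series,
                           valid_range=(0.0, 15000.0),
                           spike_window=61,
                           spike_threshold=3.0,
                           target_rule="1h") -> pd.Series:
    """pressure_1s: 1-second samples indexed by timestamp (psi assumed)."""
    # Drop out-of-range values (e.g., sensor dropouts reported as 0 or -999).
    p = pressure_1s.where(pressure_1s.between(*valid_range))
    # Flag spikes as points far from a rolling median, measured against a
    # robust (MAD-based) estimate of local scatter, and mask them out.
    rolling_median = p.rolling(spike_window, center=True, min_periods=1).median()
    deviation = (p - rolling_median).abs()
    mad = deviation.rolling(spike_window, center=True, min_periods=1).median()
    p = p.where(deviation <= spike_threshold * 1.4826 * mad + 1e-6)
    # Downsample: average the cleansed signal over the target interval
    # (e.g., 1 h), preserving the main trend for RTA-scale analysis.
    return p.resample(target_rule).mean()
```

Averaging over each hour, rather than simply keeping every n-th sample, is one way to retain the overall trend while discarding the high-frequency noise that the finer data carry.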