external factors, such as power interruption or sensor error. When the RTU or transmitter signal is malfunctioning, data spiking can increase. Data spikes are often mistaken for gas or oil wells flowing under slugging conditions. The two can be distinguished by frequency: data spikes occur at high frequency (every second), whereas the spiking behavior of a slugging well occurs at lower frequency (over several minutes). A sketch of this check follows the list.
•  Data do not follow any physical behavior; a signal or multiple signals do not correspond to physical requirements. For example, gas rate is a function of the pressure response, so when the gas rate increases, pressure should by definition decrease, and vice versa. When the well is completed and shut in (gas rate = 0.0), the tubing pressure builds up in proportion to static reservoir pressure. A reading that does not reflect these known physical conditions is therefore probably an error.
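As a minimal illustration of these two rules, the sketch below classifies oscillations by how often excursions recur and flags samples that violate the inverse rate-pressure relationship. The 3-sigma excursion threshold and the 60-s cutoff are illustrative assumptions, not values from the text.

```python
import numpy as np

def classify_oscillation(rate, sample_period_s=1.0):
    """Label oscillations in a rate signal as sensor spiking or
    possible slugging, based on how often excursions recur."""
    rate = np.asarray(rate, dtype=float)
    mean, std = rate.mean(), rate.std()
    # Samples that depart strongly from the overall level
    excursions = np.flatnonzero(np.abs(rate - mean) > 3 * std)
    if excursions.size < 2:
        return "stable"
    # Typical spacing between excursions, converted to seconds
    spacing_s = float(np.median(np.diff(excursions))) * sample_period_s
    # Spikes recur every few seconds; slug cycles span several minutes
    return "sensor spikes" if spacing_s < 60 else "possible slugging"

def physics_consistent(gas_rate, tubing_pressure):
    """Return a boolean mask (length n-1) that is True where gas
    rate and tubing pressure move inversely, as physics requires;
    False marks candidate errors."""
    dq = np.diff(np.asarray(gas_rate, dtype=float))
    dp = np.diff(np.asarray(tubing_pressure, dtype=float))
    return dq * dp <= 0
```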
The above conditions relate to the sensor signal or the SCADA system. But engineers must learn to distinguish between these types of problems and data that may actually be alerting them to real issues in the production system.
Fig. 3.2 shows a plot of gas production and surface pressure, logged every minute over a 24-h period. The plot shows several of the data situations described above: data spikes, missing data, gas rate out of range while pressure remains unchanged, a frozen gas rate, and a gas rate that sometimes reads zero while pressure is similar to when the rate is above zero.
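A screening pass over such a minute-logged series might look like the sketch below; the column names, rate limit, and frozen-signal window are illustrative assumptions.

```python
import pandas as pd

def screen_minute_log(df, rate_col="gas_rate", p_col="pressure",
                      rate_max=100.0, frozen_minutes=30):
    """Flag, per minute, the data situations visible in a plot
    like Fig. 3.2."""
    flags = pd.DataFrame(index=df.index)
    flags["missing"] = df[rate_col].isna() | df[p_col].isna()
    flags["out_of_range"] = df[rate_col] > rate_max
    # A frozen signal repeats the same value over the whole window
    flags["frozen"] = df[rate_col].rolling(frozen_minutes).std() == 0
    # Rate reads zero while pressure still behaves like a flowing well
    flags["suspect_zero"] = (df[rate_col] == 0) & (df[p_col].diff().abs() > 0)
    return flags
```

Taking `flags.any(axis=1)` then marks every suspect minute in the log for review.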


          3.2.1 Data Validation System Architecture

The data validation process can be considered part of overall data quality control (DQC), which involves both raw and processed data. Production DQC is defined as all the operational procedures used routinely to ensure the reliability of monitored data. It consists of examining data to detect errors so that the data may be corrected, filtered, or deleted. Quality control of raw data is performed to eliminate errors of measuring devices, such as sensor malfunction, instability, and interference, and thereby reduce potential corruption of the processed data. These procedures should be performed as close to the source as the data allow. In conventional practice, a first level of these checks is often done in the historian software, but some companies perform the quality checks only after the data have been communicated from the historian to a database. This second option often leads to quality issues and uncertainty about the data.
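A minimal sketch of a first-level check applied at the source, before the reading reaches the historian, might look as follows; the carry-forward fallback and the quality labels are assumptions for illustration, not the book's specification.

```python
from typing import Optional, Tuple

def qc_at_source(value: Optional[float],
                 low: float, high: float,
                 last_good: Optional[float]) -> Tuple[Optional[float], str]:
    """Validate one raw reading against its instrument range before
    it is written to the historian, returning the value to store
    and a quality label."""
    if value is None:
        return last_good, "missing: carried last good value"
    if not (low <= value <= high):
        return last_good, "out of range: filtered"
    return value, "good"
```

Running this check at (or before) the historian keeps the downstream database free of raw instrumentation errors and avoids the uncertainty that deferring quality control to a later database transfer creates.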