
A characteristic feature of constructing artificial distributions for geologic modeling problems is that the influence of various natural factors on the parameter under study can often be judged only from the range of their variation, which indicates the presence of fuzzy data sets. In such cases, an interval-probabilistic specification of the input data, carried through into the calculated results, allows the uncertainty of the basic factors to be taken into account.
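As a minimal sketch of this idea (the parameter names, the porosity range, and the simple pore-volume model below are assumptions for illustration, not taken from the text), an input known only by its range of variation can be given as an interval, and the uncertainty propagated into the calculated result by sampling:

```python
import random

# Hypothetical example: porosity is known only by its range of variation,
# so it is treated as an interval and sampled; area and thickness are
# assumed fixed for simplicity.
POROSITY_RANGE = (0.08, 0.22)   # fractional porosity (assumed bounds)
AREA_M2 = 120.0e6               # reservoir area, m^2 (assumed)
THICKNESS_M = 15.0              # net pay thickness, m (assumed)

def pore_volume(phi: float) -> float:
    """Pore volume, m^3, for a given fractional porosity."""
    return phi * AREA_M2 * THICKNESS_M

samples = sorted(pore_volume(random.uniform(*POROSITY_RANGE))
                 for _ in range(10_000))
low, high = samples[len(samples) // 20], samples[-(len(samples) // 20)]
print(f"90% interval for pore volume: {low:.3e} .. {high:.3e} m^3")
```

The result is then itself an interval (here a 90% range) rather than a single number, which is the sense in which interval-probabilistic inputs carry through to the calculated results.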

           11.2.1. Analytical approach

           11.2.1.1. Entropy of geologic systems
A study of geologic systems, from the viewpoint of the homogeneity/heterogeneity of their structure, may be carried out with the aid of information theory. Any communication is a body of information about some real system. In geologic terms, these may be reports on the various analyses of rocks, oils, gases, and formation waters conducted in the laboratory or obtained from formation testing. Each such communication describes the state, or the level of knowledge, of some geologic system.
Entropy, a special index in information theory, is employed as a measure of the a priori (before the communications are received) uncertainty of the system. The concept of entropy, borrowed from thermodynamics and statistical physics, is a basic concept of information theory. The entropy, H, is determined by the following formula:
H = -\sum_{i=1}^{N} p_i \log p_i \qquad (11.1)

where p_i is the probability of the state of any one of the N components constituting the system. The "minus" sign is introduced in order to make the entropy positive, inasmuch as p_i ≤ 1 and log p_i ≤ 0.
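Formula (11.1) is straightforward to evaluate numerically. A minimal sketch in Python (the lithology abundances are invented for illustration); zero-probability states are skipped, since p log p → 0 as p → 0:

```python
import math

def entropy(probs, base=2.0):
    """H = -sum(p_i * log p_i) over a discrete distribution (Eq. 11.1)."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # p * log p -> 0 as p -> 0, so zero-probability states contribute nothing
    return -sum(p * math.log(p, base) for p in probs if p > 0.0)

# e.g., relative abundances of four lithologies in a section (invented)
print(entropy([0.4, 0.3, 0.2, 0.1]))   # ~1.846 bits
```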
The entropy H is related to the entropy S (commonly employed in thermodynamics) through Boltzmann's constant k, namely:
S = -k \sum_{i=1}^{N} p_i \log p_i \qquad (11.2)
             The entropy has the following properties:
(1) It becomes zero if one of the states of the system is fully known (p_k = 1) and the others are impossible (p_1 = p_2 = \cdots = p_{k-1} = p_{k+1} = \cdots = p_N = 0).
(2) For a given number of states, it is maximal when all the states are equally probable; this maximum increases with the number of states.
(3) It is additive, i.e., it is possible to add up the entropies of different systems.
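All three properties can be verified directly; a minimal numerical check (the distributions are invented):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

# (1) zero when one state is certain
print(entropy([1.0, 0.0, 0.0]))               # 0.0

# (2) maximal, equal to log2(N), when the N states are equiprobable
print(entropy([0.25] * 4), math.log2(4))      # 2.0  2.0

# (3) additive: for independent systems, H(joint) = H(A) + H(B)
a, b = [0.5, 0.5], [0.25, 0.75]
joint = [pa * pb for pa in a for pb in b]
print(entropy(joint), entropy(a) + entropy(b))  # 1.811...  1.811...
```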
In information theory, entropy is generally calculated with base-2 (binary) logarithms, although natural or decimal logarithms are often more convenient to use. The shape of the -p \log p function, for natural and decimal logarithms, is presented in Fig. 11.4.
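Since \log_a x = \ln x / \ln a, entropies computed with different logarithm bases differ only by a constant factor; the choice of base is in effect a choice of units (bits for base 2, nats for base e, decimal units for base 10):

H_{(2)} = H_{(e)} / \ln 2 \approx 1.443\, H_{(e)}, \qquad H_{(2)} = H_{(10)} / \log_{10} 2 \approx 3.322\, H_{(10)}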
The quantity of information on the state of a real system is measured by the decrease in the entropy of that system. The maximum quantity of information, I, is equal to H, the total entropy of the system.
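As a worked illustration of this measure (the interpretations and the test outcome are hypothetical), suppose four structural interpretations of a prospect are equally probable a priori and a formation test then rules out two of them; the information received is the resulting drop in entropy:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

prior = [0.25, 0.25, 0.25, 0.25]   # four equiprobable interpretations
posterior = [0.5, 0.5, 0.0, 0.0]   # the test eliminates two of them

print(entropy(prior) - entropy(posterior))  # 1.0 bit of information received
print(entropy(prior))                       # 2.0 bits = I, if all uncertainty were removed
```

Only when a communication removes the uncertainty completely does the information received reach the total entropy H.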