Page 321 - Becoming Metric Wise

The Informetric Laws

              Clearly, it is impossible for one function, one candidate diversity measure,
              to satisfy all “reasonable” requirements. It seems though that a very good
              choice is given by Leinster and Cobbold (2012) and Zhang et al. (2016):

$$\frac{1}{1-D} \qquad (9.24)$$
              where D is Rao’s quadratic entropy measure (Rao, 1982) defined as:
$$D = \sum_{\substack{i,j=1 \\ i \neq j}}^{N} d_{ij}\, p_i\, p_j \qquad (9.25)$$

Rao describes this index as the expected dissimilarity between two individuals selected randomly with replacement, where $d_{ij}$ is the dissimilarity between species i and j, and $p_i$ ($p_j$) is the proportion of species i (j). If there is only one cell, D is set equal to zero. Rao's quadratic entropy measure is a generalization of the Simpson index, formula (9.23).
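As a minimal sketch of how formulas (9.24) and (9.25) can be evaluated (the function names and the list-based inputs are illustrative assumptions, not from the text):

```python
def rao_quadratic_entropy(p, d):
    """Rao's quadratic entropy, formula (9.25).

    p -- list of proportions p_i (summing to 1)
    d -- dissimilarity matrix, d[i][j] between species i and j
    Sums d_ij * p_i * p_j over all ordered pairs with i != j.
    """
    n = len(p)
    return sum(d[i][j] * p[i] * p[j]
               for i in range(n) for j in range(n) if i != j)

def diversity(p, d):
    """Diversity measure 1 / (1 - D), formula (9.24)."""
    return 1.0 / (1.0 - rao_quadratic_entropy(p, d))
```

As a consistency check with the Simpson connection mentioned in the text: if every dissimilarity $d_{ij}$ ($i \neq j$) equals 1, then D reduces to the Gini-Simpson index $1 - \sum_i p_i^2$, and (9.24) becomes the inverse Simpson index. For two equally abundant, maximally dissimilar species, D = 0.5 and the diversity is 2, the expected "effective number of species."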