Video Coding for Mobile Communications: Efficiency, Complexity, and Resilience

Chapter 2.  Video Coding: Fundamentals


2.5.5.1  Run-Length Encoding
The output of the quantization step may contain long runs of identical symbols. One way to reduce this redundancy is to employ run-length encoding (RLE). There are different forms of RLE. For example, if the quantizer output contains long runs of zeros, then RLE can represent each such run with an intermediate symbol of the form (RUN, LEVEL). Thus, a run of the form 0, 0, 0, 0, 0, 9 can be represented by the intermediate symbol (5, 9): five zeros followed by the level 9.
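The (RUN, LEVEL) scheme described above can be sketched in a few lines of code. This is a minimal illustration, not the book's implementation; the function name and the decision to emit one pair per nonzero level (real codecs also add an end-of-block symbol) are assumptions:

```python
def rle_encode(symbols):
    """Encode a sequence as (RUN, LEVEL) pairs, where RUN is the number
    of zeros preceding each nonzero LEVEL (illustrative sketch only)."""
    pairs = []
    run = 0
    for s in symbols:
        if s == 0:
            run += 1          # extend the current run of zeros
        else:
            pairs.append((run, s))  # emit the run length and the level
            run = 0
    return pairs

# The run 0, 0, 0, 0, 0, 9 from the text becomes the single pair (5, 9):
print(rle_encode([0, 0, 0, 0, 0, 9]))  # [(5, 9)]
```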

2.5.5.2  Entropy Encoding
The quantizer can be considered a discrete memoryless source (DMS) Q that is completely specified by its alphabet R = {r_1, r_2, ..., r_N}, where the r_i are the reconstruction levels, and the associated probabilities of occurrence P = {p(r_1), p(r_2), ..., p(r_N)}. The information contained in a symbol, I(r_i), is given by Equation (2.3), whereas the entropy of the source, H(Q), is given by Equation (2.4).
   Now consider a symbol encoder that assigns a codeword c_i of length l(c_i) bits to symbol r_i. Then the average word length \bar{L} of the code is given by

        \bar{L} = \sum_{i=1}^{N} p(r_i)\, l(c_i)  \quad (\text{bits}),          (2.13)
and the efficiency (\eta) of the code is

        \eta = \frac{H(Q)}{\bar{L}}.                          (2.14)
   Thus, an optimal (\eta = 1) code must have an average word length that is equal to the entropy of the source, i.e., \bar{L} = H(Q). Clearly, this can be achieved if each codeword length is equal to the information content of the associated symbol, that is, l(c_i) = I(r_i). Since I(r_i) decreases as p(r_i) increases (from Equation (2.3)), an efficient code must assign shorter codewords to more probable symbols, and vice versa. This is known as entropy encoding or variable-length coding (VLC) (as opposed to fixed-length coding (FLC)).
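Equations (2.13) and (2.14) can be checked numerically. The following sketch uses a hypothetical source with dyadic probabilities (an assumption made so that an exactly optimal integer-length code exists); the codeword lengths correspond, e.g., to the codewords 0, 10, 110, 111:

```python
import math

def entropy(probs):
    # H(Q) = -sum p(r_i) log2 p(r_i)   (Equation (2.4) in the text)
    return -sum(p * math.log2(p) for p in probs)

def avg_length(probs, lengths):
    # L-bar = sum p(r_i) * l(c_i)      (Equation (2.13))
    return sum(p * l for p, l in zip(probs, lengths))

probs = [0.5, 0.25, 0.125, 0.125]   # hypothetical dyadic source
lengths = [1, 2, 3, 3]              # l(c_i) for codewords 0, 10, 110, 111

H = entropy(probs)                  # 1.75 bits
L = avg_length(probs, lengths)      # 1.75 bits
eta = H / L                         # efficiency (Equation (2.14)) = 1.0
print(H, L, eta)                    # 1.75 1.75 1.0
```

Because l(c_i) = I(r_i) for every symbol here, the code achieves \bar{L} = H(Q) and hence \eta = 1, as the paragraph above states.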
   The most commonly used VLC is Huffman coding [22]. Given a finite set of symbols and their probabilities, Huffman coding yields the optimal^14 integer-length prefix^15 code. The basic principles of Huffman coding can be illustrated using the example given in Figure 2.6. In each stage, the two least probable symbols are combined to form a new symbol with a probability equal


 ^14 Huffman coding is optimal in the sense that no other integer-length VLC can achieve a smaller average word length.
 ^15 In a prefix code, no codeword is a prefix of another codeword. This makes the code uniquely decodable.
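The repeated merging of the two least probable symbols can be sketched in code. This is a minimal illustration of the construction principle, not the book's algorithm listing; the heap-based bookkeeping and the convention of prefixing '0'/'1' to the two merged subtrees are implementation assumptions:

```python
import heapq
import itertools

def huffman_code(probs):
    """Build Huffman codewords for symbols 0..N-1 with the given
    probabilities by repeatedly merging the two least probable nodes
    (illustrative sketch of the procedure behind Figure 2.6)."""
    counter = itertools.count()  # tie-breaker so equal probabilities compare deterministically
    heap = [[p, next(counter), [i]] for i, p in enumerate(probs)]
    heapq.heapify(heap)
    codes = {i: "" for i in range(len(probs))}
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)  # least probable node
        p2, _, syms2 = heapq.heappop(heap)  # second least probable node
        for i in syms1:
            codes[i] = "0" + codes[i]       # extend codewords in one subtree
        for i in syms2:
            codes[i] = "1" + codes[i]       # and in the other
        heapq.heappush(heap, [p1 + p2, next(counter), syms1 + syms2])
    return codes

# Four symbols with probabilities 0.5, 0.25, 0.125, 0.125:
print(huffman_code([0.5, 0.25, 0.125, 0.125]))
# {0: '0', 1: '10', 2: '110', 3: '111'}
```

Note that the resulting codeword lengths (1, 2, 3, 3 bits) shorten with increasing probability, and no codeword is a prefix of another, matching the prefix-code property of footnote 15.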