lyzing the performance of density estimates in terms of the asymptotic mean
integrated squared error, and also addresses high dimensional data.
The book by Silverman [1986] provides a relatively non-theoretical
treatment of density estimation, including a discussion of histograms,
kernel methods, and other approaches. This book is readily accessible to most
statisticians, data analysts, and engineers. It contains applications and
computational details, making the subject easier to understand.
Other books on density estimation include Tapia and Thompson [1978],
Devroye and Gyorfi [1985], Wand and Jones [1995], and Simonoff [1996]. The
Tapia and Thompson book offers a theoretical foundation for density estima-
tion and includes a discussion of Monte Carlo simulations. The Devroye and
Gyorfi text describes the underlying theory of density estimation using the
L1 (absolute error) viewpoint instead of the L2 (squared error). The books by
Wand and Jones and Simonoff look at using kernel methods for smoothing
and exploratory data analysis.
A paper by Izenman [1991] provides a comprehensive review of many
methods in univariate and multivariate density estimation and includes an
extensive bibliography. Besides histograms and kernel methods, he discusses
projection pursuit density estimation [Friedman, Stuetzle, and Schroeder,
1984], maximum penalized likelihood estimators, sieve estimators, and
orthogonal estimators.
For the reader who would like more information on finite mixtures, we rec-
ommend Everitt and Hand [1981] for a general discussion of this topic. The
book provides a summary of the techniques for obtaining mixture models
(estimating the parameters) and illustrates them using applications. That text
also discusses ways to handle the problem of determining the number of
terms in the mixture and other methods for estimating the parameters. It is
appropriate for someone with a general statistics or engineering background.
For readers who would like more information on the theoretical details of
finite mixtures, we refer them to McLachlan and Basford [1988] or Tittering-
ton, Smith and Makov [1985]. A recent book by McLachlan and Peel [2000]
provides many examples of finite mixtures, linking them to machine learn-
ing, data mining, and pattern recognition.
The EM algorithm is described in the text by McLachlan and Krishnan
[1997]. This text offers a unified treatment of the subject and provides
numerous applications of the EM algorithm to regression, factor analysis,
medical imaging, experimental design, finite mixtures, and other areas.
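As a rough illustration of the ideas treated in these references, the following
MATLAB fragment sketches the EM algorithm for a two-component univariate
normal mixture. It is not taken from any of the texts cited above; the simulated
data, starting values, and variable names are arbitrary choices made for the
illustration, and normpdf requires the Statistics Toolbox.

   % Illustrative sketch of EM for a two-term univariate normal mixture.
   x = [randn(100,1) - 2; randn(100,1) + 2];   % simulated data from two modes
   p = 0.5;                  % initial mixing proportion for component 1
   mu = [-1; 1];             % initial component means
   sig = [1; 1];             % initial component standard deviations
   for it = 1:100
      % E-step: posterior probability that each point belongs to component 1
      f1 = p*normpdf(x, mu(1), sig(1));
      f2 = (1-p)*normpdf(x, mu(2), sig(2));
      tau = f1./(f1 + f2);
      % M-step: update the mixing proportion, means, and standard deviations
      p = mean(tau);
      mu(1) = sum(tau.*x)/sum(tau);
      mu(2) = sum((1-tau).*x)/sum(1-tau);
      sig(1) = sqrt(sum(tau.*(x - mu(1)).^2)/sum(tau));
      sig(2) = sqrt(sum((1-tau).*(x - mu(2)).^2)/sum(1-tau));
   end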
For a theoretical discussion of the adaptive mixtures approach, the reader
is referred to Priebe [1993, 1994]. These papers examine the error in adaptive
mixtures density estimates and their convergence properties. A recent paper by
Priebe and Marchette [2000] describes a data-driven method for obtaining
parsimonious mixture model estimates. This methodology addresses some
of the problems with the adaptive/finite mixtures approach: 1) adaptive
mixtures is not designed to yield a parsimonious model, and 2) one must
determine how many terms or component densities to use in a finite mixture model.