
Signatures essential in supervised classification may be developed from
training samples or from laboratory spectral libraries. In addition, there
are two novel classifiers in IDRISI Andes: the Fisher classifier, based on
linear discriminant analysis, and the backpropagation neural network
classifier. Apart from these 13 hard classifiers, there is an extensive set
of 14 soft classifiers for analyzing multispectral data, such as those based
on the Dempster-Shafer theory of evidence, fuzzy logic, and the linear
mixture model. It is possible to combine different classification procedures
(e.g., Bayesian probability calculation with linear spectral unmixing) to
form hybrid procedures that produce more reliable classifications. Also
released in this version is the largest suite of machine learning/neural
network classifiers, such as classification tree analysis, the multilayer
perceptron, the self-organizing feature map, and fuzzy ARTMAP. Six special
modules are designed to analyze hyperspectral images.
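To make the supervised classification idea concrete, the sketch below trains
a Fisher-style linear discriminant classifier from training-sample signatures
and applies it to every pixel of a multispectral image. This is an
illustrative example only, not IDRISI code; the arrays, their shapes, and the
use of the scikit-learn library are assumptions.

    # Illustrative sketch (not IDRISI): Fisher-style supervised classification
    # of a multispectral image with linear discriminant analysis.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Assumed data: a (rows, cols, bands) image and labeled training pixels.
    image = np.random.rand(200, 200, 4)            # placeholder image
    train_pixels = np.random.rand(100, 4)          # placeholder signatures
    train_labels = np.repeat(np.arange(4), 25)     # placeholder class labels

    clf = LinearDiscriminantAnalysis()
    clf.fit(train_pixels, train_labels)

    # Classify every pixel by flattening the image to (n_pixels, n_bands).
    flat = image.reshape(-1, image.shape[-1])
    classified = clf.predict(flat).reshape(image.shape[:2])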
IDRISI Andes is also able to carry out change detection based on image
differencing, image ratioing, time series Fourier analysis, spatial/temporal
correlation, and image profiling over time. Image differencing may be
implemented via change vector analysis and regression-based calibration.
Derived using the temporal resonance module, the temporal index indicates
the degree of correlation between every pair of pixels in multitemporal
images. Special change analysis tools are available for assessing change
quickly. The most celebrated addition to IDRISI Andes is the land change
module for modeling ecological sustainability, developed specifically for
the International Center for Biodiversity Conservation in the Andes. It is
able to analyze land conversion, predict and model future change via Markov
chain analysis or cellular automata, and assess the effect of the change on
biodiversity (Hermann, 2006). The modeled results may be validated against
categorical map data with a set of comparison tools. The transition in land
cover, or the potential for change, can be explored from both static and
dynamic explanatory variables using either logistic regression or a
multilayer perceptron neural network.
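As a rough illustration of change detection by image differencing and change
vector analysis (again not the IDRISI implementation; the placeholder data
and the threshold value are assumptions), consider the following sketch:

    # Illustrative sketch: change vector analysis between two co-registered dates.
    import numpy as np

    date1 = np.random.rand(200, 200, 4)   # placeholder multispectral image, date 1
    date2 = np.random.rand(200, 200, 4)   # placeholder multispectral image, date 2

    # The change vector is the band-wise difference at each pixel; its
    # Euclidean norm measures the overall magnitude of change.
    delta = date2 - date1
    magnitude = np.sqrt((delta ** 2).sum(axis=-1))

    # Flag pixels whose change magnitude exceeds an assumed threshold.
    change_mask = magnitude > 0.5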

               4.1.2  Display and Output
Raster images may be displayed in black and white, in true (24-bit) color,
or with transparency. Color display is possible with any bands designated
as the red, green, and blue layers of an RGB composite (Fig. 4.1). Each
layer can be assigned a symbol file created with a special symbol/palette
development tool. Images may be displayed as three-dimensional (3D)
perspectives, contour plots, and analytical hillshading. The fly-through
module provides real-time interactive animation over a digital elevation
model (DEM). A 3D impression of stereoscopic images may be obtained with
the assistance of a pair of anaglyphic glasses.
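A minimal sketch of the band-to-RGB assignment just described, assuming a
NumPy band stack and the matplotlib library rather than IDRISI's own display
module, might look like this:

    # Illustrative sketch: display three bands as a 24-bit RGB color composite.
    import numpy as np
    import matplotlib.pyplot as plt

    bands = np.random.rand(200, 200, 6)    # placeholder multiband image

    def stretch(band):
        # Linear stretch of one band to the 0-1 range for display.
        return (band - band.min()) / (band.max() - band.min())

    # Assign any three bands to the red, green, and blue layers.
    rgb = np.dstack([stretch(bands[..., 3]),
                     stretch(bands[..., 2]),
                     stretch(bands[..., 1])])

    plt.imshow(rgb)
    plt.title("RGB composite")
    plt.show()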
Moreover, images may be displayed on screen as a map composition, into
which nonimage layers such as hydrography, roads, and elevation