and Zhang Renhua
Institute of Geography, Chinese Academy of Sciences, Bldg 917, Datun Road,
Anwai, Beijing 100101, China, firstname.lastname@example.org
J. of Remote Sensing (Science Press, Beijing), 1997, 1(Suppl.):
Submitted 16 September 1996
Analytical canopy reflectance (CR) models have reached a level of adequacy that makes it possible to estimate vegetation parameters by inverting such models. The increasing efficiency of algorithms and the growing power of computers encourage the development of procedures for estimating vegetation phytometrical parameters over large areas using satellite data and the inversion of theoretical CR models.
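As a minimal sketch of what such an inversion means in practice, the snippet below recovers a single canopy parameter (LAI) by a one-dimensional grid search that minimizes the squared residual between a measured reflectance and a forward model. The forward model here is an invented placeholder, not the MCRM; its coefficients and the search range are illustrative assumptions.

```python
import math

def toy_reflectance(lai, soil_rho=0.15, leaf_rho=0.45):
    """Toy forward model (placeholder for a real CR model): canopy
    reflectance as a gap-fraction mixture of soil and leaf reflectance."""
    gap = math.exp(-0.5 * lai)          # simple exponential gap fraction
    return gap * soil_rho + (1.0 - gap) * leaf_rho

def invert_lai(measured, lo=0.0, hi=8.0, steps=801):
    """Invert the toy model: grid search for the LAI whose modelled
    reflectance best matches the measurement (least squares)."""
    best_lai, best_err = lo, float("inf")
    for i in range(steps):
        lai = lo + (hi - lo) * i / (steps - 1)
        err = (toy_reflectance(lai) - measured) ** 2
        if err < best_err:
            best_lai, best_err = lai, err
    return best_lai

# a synthetic "measurement" generated with LAI = 3.0 is recovered by inversion
rho = toy_reflectance(3.0)
print(round(invert_lai(rho), 2))
```

A real CR model has several free parameters, so in practice a multidimensional optimizer replaces the grid search, but the structure of the problem is the same.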
The Markov chain canopy reflectance model (MCRM) of Kuusk (1995) has demonstrated its ability to work over a wide range of canopy optical and structural parameters, even in the case of serious violations of model assumptions (Kuusk, 1995; 1996). The MCRM is computationally very efficient and can easily be inverted on relatively large sets of reflectance data.

Canopy reflectance models relate canopy directional reflectance to canopy structural and optical parameters. In order to solve the inverse problem on satellite images, we have to convert the digital counts of satellite radiometers into ground-level reflectances. For this conversion it is necessary (1) to determine satellite-level radiances, i.e. to perform absolute calibration of the radiometers, and (2) to estimate the ground-level reflectance of targets, i.e. to perform atmospheric correction of the satellite data.

A straightforward pixel-by-pixel inversion of satellite images is possible; however, the inversion time grows rapidly with increasing image size and number of spectral channels. The CR model inversion on large images can be performed more efficiently if some clustering in the space of spectral signatures is applied. Here the MCRM is inverted on a 256x256-pixel Landsat Thematic Mapper (TM) scene of a test site in Estonia. The spectral images of bands TM2, TM3, TM4, TM5 and TM7, acquired on 8 June 1988, are used.
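The two conversion steps named above can be sketched for one TM band as follows: a linear calibration takes digital counts to at-sensor radiance, and the standard geometric formula takes radiance to top-of-atmosphere reflectance (atmospheric correction would then bring this down to ground level). The gain, offset, solar irradiance and sun elevation below are illustrative assumptions, not the actual calibration for the 8 June 1988 scene.

```python
import math

def dn_to_radiance(dn, gain, offset):
    """Absolute calibration: digital count to at-sensor spectral
    radiance [W m-2 sr-1 um-1] via band-specific gain and offset."""
    return gain * dn + offset

def radiance_to_toa_reflectance(L, esun, sun_elev_deg, d_au=1.0):
    """Top-of-atmosphere reflectance from at-sensor radiance,
    exoatmospheric solar irradiance and solar geometry."""
    theta = math.radians(90.0 - sun_elev_deg)   # solar zenith angle
    return math.pi * L * d_au ** 2 / (esun * math.cos(theta))

# illustrative values only (hypothetical calibration coefficients)
L = dn_to_radiance(120, gain=0.8, offset=-1.5)
rho = radiance_to_toa_reflectance(L, esun=1550.0, sun_elev_deg=45.0)
print(round(rho, 3))
```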
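The clustering speed-up can be illustrated with a minimal k-means over pixel spectra: the model is then inverted once per cluster centroid instead of once per pixel, and every pixel inherits the parameters of its cluster. The two synthetic spectral classes and the choice of k below are invented for illustration; they stand in for the spectral signatures of a real TM scene.

```python
def kmeans(spectra, k, iters=10):
    """Minimal k-means on a list of equal-length spectra (assumes k >= 2).
    Deterministic initialization: evenly spaced samples from the list."""
    n = len(spectra)
    centroids = [spectra[i * (n - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in spectra:
            # assign each spectrum to the nearest centroid (squared distance)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(s, centroids[c])))
            clusters[j].append(s)
        # recompute centroids as cluster means; keep old one if cluster empty
        centroids = [tuple(sum(v) / len(cl) for v in zip(*cl)) if cl
                     else centroids[c]
                     for c, cl in enumerate(clusters)]
    return centroids

# two synthetic 5-band classes, e.g. bare soil vs dense canopy (hypothetical)
soil = [(0.12 + d, 0.15 + d, 0.18 + d, 0.25 + d, 0.22 + d)
        for d in (0.0, 0.01, -0.01)]
veg = [(0.04 + d, 0.03 + d, 0.45 + d, 0.25 + d, 0.12 + d)
       for d in (0.0, 0.01, -0.01)]
cents = kmeans(soil + veg, k=2)
# the CR model would now be inverted once per centroid only
print(len(cents))
```

With thousands of pixels but only tens of spectrally distinct clusters, the number of model inversions drops by several orders of magnitude, which is what makes inversion on whole scenes practical.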