Fusion of Multisensor Data: Review and Comparative Analysis

Introduction
Earth observation satellites provide data covering different portions of the electromagnetic spectrum at different spatial, spectral and temporal resolutions. Satellites such as QuickBird, IKONOS and IRS bundle a high resolution (HR) panchromatic (PAN) band with low resolution (LR) multispectral (MS) bands at a 1:4 resolution ratio, supporting both high spectral and high spatial resolution while minimising on-board data handling needs [1]. Fusion of data from multiple sensors helps delineate objects with comprehensive information by integrating the spatial information present in the PAN image with the spectral information present in the LR MS images. For example, fusing a 1 m IKONOS PAN image with 4 m MS images permits identification of objects approximately one meter in size on the Earth's surface, which is especially useful in urban areas because the characteristics of urban objects are determined not only by their spectra but also by their structure. Remote sensing (RS) data fusion techniques integrate both PAN and MS data and can be performed at the pixel [2], feature [3] and decision [4] levels. This paper reviews the outcomes of seven pixel-based image fusion techniques.
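
As an illustration of what pixel-level fusion involves, the sketch below applies the Brovey transform, one common pixel-based pan-sharpening method, used here purely as an example and not necessarily one of the seven techniques reviewed in this paper. It assumes the LR MS bands have already been resampled to the PAN grid and uses NumPy; array shapes and the epsilon guard are illustrative choices.

import numpy as np

def brovey_fusion(ms, pan):
    # ms  : (H, W, B) multispectral bands, resampled to the PAN grid
    # pan : (H, W)    panchromatic band at its native (higher) resolution
    ms = ms.astype(np.float64)
    pan = pan.astype(np.float64)
    # Summed MS intensity at each pixel; the PAN-to-intensity ratio
    # injects the PAN spatial detail into every band.
    intensity = ms.sum(axis=2, keepdims=True)            # (H, W, 1)
    fused = ms * pan[:, :, None] / (intensity + 1e-12)   # avoid divide-by-zero
    return fused

For instance, with a 4-band MS image upsampled to the 1 m PAN grid, brovey_fusion(ms_upsampled, pan) returns fused bands that retain the MS spectral ratios while taking their spatial detail from the PAN image.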