After reading this article you will learn about: 1. Definition of Remote Sensing 2. Devices used for Remote Sensing 3. General Utility 4. Types 5. Stages 6. Basic Concept 7. Sensors used 8. Potential Applications.
Definition of Remote Sensing:
The term remote sensing was introduced in the USA in the late 1950s to attract funding from the US Office of Naval Research. Parker (1962) defined remote sensing as covering the collection of data about objects which are not in contact with the collecting device.
In a broader sense, remote sensing is defined as the measurement or acquisition of information of some property of an object or phenomenon, by a recording device that is not in physical or intimate contact with the object or phenomenon under study.
So, remote sensing is a multidisciplinary activity which deals with the inventory, monitoring and assessment of natural resources through the analysis of data obtained by observations from a remote platform.
Devices used for Remote Sensing:
Devices used for remote sensing in soil and land survey detect electromagnetic radiation (EMR).
The sensing may be as follows:
(i) Active sensing:
When the sensing device directs EMR at an object and detects the amount of that energy which is reflected back. Examples: radar and radio-wave emissions (wavelength > 10³ µm), and laser-imaging radar (UV, visible and near infrared, 10⁻³-10 µm).
(ii) Passive sensing:
When the sensing device detects EMR originating from another source, primarily the sun.
General Utility of Remote Sensing:
Remote sensing is now generally accepted as not only involving the collection of raw data, but also involving manual and automated raw data processing, imagery analysis and presentation of the derived information.
It is usually confined to sensing in the electromagnetic spectrum, i.e. using energy which functions in the same manner as light, and covers not only the visible spectrum but also the ultraviolet, near infrared, mid infrared, far infrared and radio waves.
If, for example, the hand is placed on a page to assess its surface temperature, this would not be remote sensing, since the sensor (the hand) is in contact with the object.
It should also be noted that the United Nations Committee on the Peaceful Uses of Outer Space (COPUOS), in several of its publications, distinguishes in remote sensing between the collection of data from outer space (satellite remote sensing) and airborne remote sensing (inner space).
Remote sensing relies on detecting differences in the reflected or emitted radiation from different areas on the land surface over a range of wavelengths. It can be performed from various platforms, chiefly aircraft and spacecraft, and the data are recorded either photographically or in digital form.
However, a complete remote sensing programme generally involves integrating more than one “level” of imagery with on-site observations including ground-truthing.
Data must be in digital form for computer processing, so air photographs must be converted to a digital format before undergoing image analysis and sophisticated data processing. Most satellite and airborne scanner imagery is initially recorded in digital form.
In digital form, remotely sensed data provide a natural input to a Geographic Information System (GIS), where they can be put into layers along with other spatial data (cadastral boundaries, roads, buildings, etc.) and linked to attribute data describing properties of a spatial feature in the GIS. In the case of soil, attribute data include pH, soil texture, drainage type and the predominant soil series in a particular area.
Types of Remote Sensing:
1. Airborne Remote Sensing:
Aerial Photography:
Aerial photographs are very useful for interpreting and mapping soils, but the interpretation must be done very carefully because of the risk of error. The photographs must be considered an essential tool for the soil scientist; other remote sensing techniques also give information about soils, but they are not as well known as photographs.
Among these remote sensing techniques, the most often used are certainly multi-spectral images taken from satellites.
The smallest-scale black and white aerial photography in commercial use was about 1:85,000, using an 88 mm lens. For many purposes the standard black and white panchromatic photography used in photogrammetric mapping was not always the most suitable for forest studies.
The value of black and white infrared aerial photographs was well recognised for separating stands of conifers (gymnosperms) and broad-leaved species (hardwoods/angiosperms) in the cool temperate natural forests.
However, this technique was not suitable for separating native conifers from eucalyptus. Black and white infrared aerial photography was widely accepted as the most suitable film type for mapping the boundaries of water surfaces and for separating wet and dry lands.
By the 1960s, colour photography was beginning to compete operationally with black and white photography in forestry. Natural colour photography received tremendous importance in forest studies; its value was widely recognised for disease detection and for the identification of some tree species growing in temperate zones.
The versatility provided by the widening range of film-filter combinations, film types, improved photographic equipment and flying heights from near ground level to high altitudes has contributed to the growing importance of colour aerial photography.
Colour infrared photography has now generally replaced black and white infrared; and colour photography at large, medium and small scales is now increasingly used in periodic national inventories and combined with satellite derived data.
Aerial photography at very large scale (e.g. 1: 1200) has been demonstrated to be suited to the reliable identification of trees of individual species in cool temperate forests and is used operationally in forest inventories.
Aerial photography is used for soil survey and the scale of aerial photography depends upon the flying height of the aircraft above the ground and the focal length of the camera:
Photo scale = Focal length/Flying height
Thus, with a lens of 6 inches (0.5 ft) focal length, the denominator of the acquisition or contact scale of the negatives will be double the flying height in feet. Aerial photographs are not used to identify soil types directly, but rather to locate changes in land surface patterns that may be relatable to differing soil properties.
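The photo scale equals focal length divided by flying height (both in the same units). As a quick numeric sketch (the 10,000 ft flying height is a hypothetical value chosen for illustration):

```python
def photo_scale(focal_length_ft, flying_height_ft):
    """Representative fraction of a vertical aerial photograph:
    scale = focal length / flying height (same units)."""
    return focal_length_ft / flying_height_ft

# A 6-inch (0.5 ft) lens flown at a hypothetical 10,000 ft gives a
# contact-scale denominator of twice the flying height in feet.
scale = photo_scale(0.5, 10_000)
print(f"1:{round(1 / scale)}")  # 1:20000
```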
In ideal circumstances, photographs record differences in reflectance from vegetation or the soil surface that can be correlated directly with the boundaries of the mappable soil units. In addition, it is used for interpreting soil boundaries and planning field work, for field navigation and plotting.
2. Satellite Remote Sensing:
With the development of satellite technology, satellite data are increasingly being applied in forestry, particularly in multi-temporal and multi-stage surveys of forest resources. Multi-temporal refers to the collection of data about a site on more than one occasion. Multi-stage implies that data are collected simultaneously for the same site from more than one altitude.
Aerial photography is used for systematic topographic mapping and for mapping natural resources. More recently, photographs have been taken from space satellites and new methods of recording pictures have been developed.
These new techniques and photography jointly are known as “Remote Sensing”. Remote sensing, though not precisely defined, includes all methods of obtaining pictures or other forms of electromagnetic records of the Earth’s surface from a distance, and the treatment and processing of the picture data.
The term has even been extended to some forms of sea-bed survey and atmospheric monitoring. In practice, photography is used more than any other remote sensing techniques and photographic film is used much more than any other medium to record picture data from other sensor systems such as scanners and side looking radar.
Stages of Remote Sensing/Principles of Remote Sensing:
The various stages of remote sensing, which together also constitute its principle, are given below:
(i) Origin of electromagnetic energy (e.g. sun, transmitter carried by the sensor).
(ii) Transmission of energy from the source to the surface of the earth and its subsequent interaction with intervening atmosphere.
(iii) Interaction between energy and earth surface or self-emission.
(iv) Transmission of the emitted or reflected energy to the remote sensor.
(v) Detection of the energy by the sensor and its conversion into a photographic image or electrical output.
(vi) Recording of the sensor output.
(vii) Data processing for the generation of data base.
(viii) Collection of ground truth and other collateral information.
(ix) Data processing and interpretation.
A remote sensing system consists of a sensor to collect radiation and a platform (an aircraft, balloon, rocket, satellite or even a ground-based stand) on which the sensor can be mounted. The information received by the sensor is appropriately processed and transmitted back to the earth.
This may be done by telemetry, as in the case of unmanned spacecraft, or brought back through films, magnetic tapes etc., as in aircraft or manned spacecraft systems.
The data are reformatted and processed on the ground to produce either photographs or computer compatible magnetic tapes (CCT). The photographs or CCTs are interpreted visually or digitally to produce thematic maps and other resources information.
Basic Concept of Remote Sensing:
For understanding the basic concept of remote sensing, ideas about solar and terrestrial radiation, atmospheric effects, the signature concept and the spectral responses of natural earth surfaces like soil, water etc. are important, and so these are described briefly below:
(i) Radiation from sun and other terrestrial sources:
Electromagnetic radiation spans a large spectrum of wavelengths, from very short γ-rays (10⁻¹⁰ m) to long radio waves (10⁶ m). However, in remote sensing the most useful regions are the visible (0.4-0.7 µm), the reflected IR (0.7-3 µm), the thermal IR (3-5 and 8-14 µm) and the microwave region (0.3-300 cm).
The sun is the most important source of electromagnetic radiation used in conventional optical remote sensing; it can be treated as a blackbody with a surface temperature of about 6000 K.
Solar radiation covers the UV, visible, IR and radio-frequency regions, with the maximum at about 0.55 µm, i.e. in the visible region. However, the radiation from the sun reaching the earth's surface is frequently modified by the atmosphere.
According to Planck's law, all bodies at temperatures above absolute zero emit electromagnetic radiation at various wavelengths:
Wλ = (2πhc²/λ⁵) × 1/[exp(hc/λkT) − 1]
where, Wλ = energy radiated from a blackbody at wavelength λ
h = Planck's constant
c = velocity of light
T = absolute temperature
k = Boltzmann constant
The earth can be treated as a blackbody at ~300 K, emitting electromagnetic radiation with peak emission at about 9.7 µm. According to Planck's law, the radiation emitted by the earth (300 K) is much less at all wavelengths than that emitted by the sun (6000 K).
However, at the earth's surface, because of the great distance between the sun and the earth, the energy in the 7-15 µm wavelength region is dominated by the thermal emission of the earth.
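The peak emission wavelengths of the sun and the earth follow from Wien's displacement law, a corollary of Planck's law; a minimal sketch:

```python
# Wien's displacement law: the wavelength of maximum blackbody emission
# is inversely proportional to the absolute temperature.
WIEN_B = 2898.0  # Wien's displacement constant, in µm·K

def peak_wavelength_um(temperature_k):
    """Wavelength (µm) of maximum blackbody emission at temperature T (K)."""
    return WIEN_B / temperature_k

print(f"Sun  (6000 K): {peak_wavelength_um(6000):.2f} µm")  # ~0.48 µm, visible
print(f"Earth (300 K): {peak_wavelength_um(300):.2f} µm")   # ~9.7 µm, thermal IR
```

Note that Wien's law gives the solar peak slightly below the ~0.55 µm quoted in the text; the exact value depends on whether peak emission is expressed per unit wavelength or per unit frequency.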
(ii) Effects of Atmosphere on Remote Sensing:
Electromagnetic radiation is scattered and absorbed by gases and particulate matter while passing through the atmosphere. Methane, hydrogen, helium and nitrogen compounds also play an important role in altering the incident radiation energy spectrum.
The maximum absorption takes place at wavelengths shorter than 0.3 µm and that is mainly due to Ozone (O3). Besides, there are certain other spectral regions where the electromagnetic radiation is passed through without much attenuation and these are called atmospheric windows (Fig. 12.2).
Remote sensing of the earth's surface is usually confined to these regions of wavelengths. The atmospheric windows used for remote sensing are the 0.4-1.3, 1.5-1.8, 2-2.26, 3-3.6, 4.2-5.0 and 7-15.0 µm regions, and the 10 mm-10 cm wavelength region of the electromagnetic spectrum.
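A simple lookup over these window limits can tell whether a given wavelength is usable for surface sensing; a minimal sketch using the window limits listed above:

```python
# Atmospheric windows from the list above, in µm (the 10 mm-10 cm
# microwave window is expressed here as 10,000-100,000 µm).
WINDOWS_UM = [(0.4, 1.3), (1.5, 1.8), (2.0, 2.26), (3.0, 3.6),
              (4.2, 5.0), (7.0, 15.0), (10_000.0, 100_000.0)]

def in_atmospheric_window(wavelength_um):
    """True if the wavelength (µm) falls inside one of the windows above."""
    return any(lo <= wavelength_um <= hi for lo, hi in WINDOWS_UM)

print(in_atmospheric_window(0.55))  # True  (visible green)
print(in_atmospheric_window(6.0))   # False (strong absorption region)
```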
Even in the atmospheric window regions, scattering (Rayleigh, Mie and non-selective scattering mechanisms) by the atmospheric constituents produces spatial redistribution of energy.
Sensor:
A sensor sees the energy reflected from the target object and the scattered radiation entering its field of view.
The radiance measured at the top of the atmosphere may come from (a) single or multiple scattering by atmospheric constituents reaching the field of view (FOV) of the sensor (La), (b) the diffused downward radiation resulting from scattering, reflected by the target object (Lb), (c) the downward component reflected by an adjacent target and further scattered by the atmosphere into the FOV (Lc), and (d) reflectance of the target under direct solar radiation, attenuated on its way to the top of the atmosphere, which is the actual or net information (Ld).
The sum of the radiances La, Lb and Lc is usually called the path radiance. The path radiance reduces the image contrast. Further, it produces radiometric error, since the information characteristic of the target, Ld, is corrupted.
Therefore, the apparent radiance of the ground targets differs from the intrinsic surface radiance because of the presence of the intervening atmosphere. In principle, the added radiance may be removed if the concentration and optical properties of aerosol are known throughout the image.
The atmosphere, including haze and clouds, is more transparent to microwaves than to the optical and infrared regions. Hence, microwave remote sensing using active sensors like Side-Looking Airborne Radar (SLAR), Synthetic Aperture Radar (SAR) etc. has all-weather capability. However, emission from the atmosphere can affect brightness-temperature measurements of the target, even in the microwave region.
(iii) Signatures Concept of Remote Sensing:
Signature may be defined as any set of observable characteristics which is directly or indirectly related to the identification of an object and/or its condition. Spectral, spatial, temporal and polarization variations are four major characteristics of the targets which facilitate discrimination.
(a) Spectral:
It is the change in reflectance or emittance of objects as a function of wavelength. Colour of objects is a manifestation of spectral variation in reflectance in the visible region.
(b) Spatial:
It is the arrangements of terrain features providing attributes, such as shape, size and texture of objects which lead to their identification.
(c) Temporal:
It is the change of reflectivity or emissivity with time. It may be diurnal or seasonal. The variation in reflectivity during the crop growth helps to differentiate crops which may have similar spectral reflectance, but whose growing cycles may not be the same.
A plot of spectral reflectance vs. growth stages of a crop provides a phenologic pattern, which is characteristic of the crop. Therefore, remote sensing data acquired over the same area at different times can make use of these temporal characteristics to discriminate crops in a better way.
(d) Polarization:
It is the variation relating to changes in polarization of the radiation reflected or emitted by an object. The intensity of polarization is a characteristic of the object and hence can help in distinguishing it, particularly in the microwave region. However, signatures are not totally deterministic. They are statistical in nature, with a certain mean value and some dispersion around it.
(iv) Spectral Response to Some Objects of Natural Earth Surfaces:
(a) Vegetation:
Vegetation exhibits quite distinct spectral reflectance. Plant pigments, leaf structure and water content influence the spectrum in the visible, near-IR and middle-IR wavelength regions, respectively. Low reflectance in the blue and red regions corresponds to two chlorophyll absorption bands, centered at 0.45 and 0.65 µm respectively.
A relative lack of absorption in the green region gives normal vegetation its green colour. With the progress of leaf growth, intercellular air spaces develop and the near-IR reflectance increases markedly. At the senescent stage of the plant, chlorophyll absorption decreases and red reflectance increases; this is accompanied by a decrease in intercellular air spaces and, as a result, reflectance decreases in the near IR (Fig. 12.3).
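This red/near-IR contrast is what standard vegetation indices exploit. As an illustrative sketch (NDVI is a widely used index rather than something defined in this article, and the reflectance values below are hypothetical):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-IR and red
    reflectances: high for healthy green vegetation, low when red
    reflectance rises and near-IR reflectance falls (senescence)."""
    return (nir - red) / (nir + red)

# Illustrative (hypothetical) reflectance values:
healthy = ndvi(nir=0.50, red=0.08)    # strong NIR, strong red absorption
senescent = ndvi(nir=0.30, red=0.25)  # red reflectance up, NIR down
print(f"healthy: {healthy:.2f}, senescent: {senescent:.2f}")
```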
(b) Soil:
Soil reflectance usually increases with wavelength in the visible and near IR regions. However, the soil reflectance is influenced by various parameters like, moisture content, organic matter content, mineral matter content, oxides of Fe and Mn, clay, sand and silt content, slope of the land etc.
For example, if the moisture content of the soil increases, the reflectance in the optical-IR region decreases, most significantly at the water absorption bands.
Also, the day-night temperature difference decreases with increasing soil moisture, due to the higher thermal inertia of water. In view of the wide difference in dielectric constant between water and soil at microwave frequencies, the determination of moisture content becomes possible.
(c) Water:
Water absorbs considerable amount of radiation in the near IR and middle IR regions which enables easy delineation of small water bodies. The turbidity of water usually leads to an increase in its reflectance and as a result the peak reflectance shifts towards longer wavelength. However, the dissolved gases and many inorganic salts do not manifest any change in the spectral response of water.
(d) Snow and Clouds:
Snow has a very high reflectance up to 0.8 µm, beyond which the reflectance decreases sharply. Clouds scatter non-selectively and so appear uniformly bright throughout the 0.3-3 µm range. Cloud tops and snow usually have the same temperature and therefore cannot easily be separated in the thermal IR region.
Sensors used in Remote Sensing:
Sensors can be broadly classified as,
(i) Those operating in the optical-IR (OIR) region, and
(ii) Those operating in the microwave region, because the technology for developing microwave sensors is quite different from that of OIR sensors.
However, both these sensors can be further sub-divided into passive and active sensors.
(i) Passive Sensors:
These may be defined as sensors which sense natural radiation, either emitted or reflected from the earth. It is also possible to produce EMR of a specific wavelength or band of wavelengths and illuminate a terrain on the earth's surface, which leads to the second category.
(ii) Active sensors:
These may be defined as sensors which produce their own electromagnetic radiation. Both types of sensor can be either imaging, like the camera, or non-imaging, like the radiometer.
Some sensor parameters limiting optimum utilization of the data are:
(i) Spatial resolution—the ability to discriminate the smallest object on the ground,
(ii) Spectral resolution—the spectral band width, by which the imagery is taken,
(iii) Radiometric sensitivity—ability to differentiate the spectral reflectance or emittance between various targets, and
(iv) Dynamic range—capability to measure the minimum to maximum reflectance. Besides, the sensor should produce imagery with geometric fidelity.
Various types of sensors can be used for resource survey.
Optical-IR (OIR) sensors (photographic cameras, television cameras, optomechanical scanners, linear imaging self-scanning sensors (LISS) and OIR active sensors) and microwave remote sensors, both passive and active, are successfully used for natural resource inventory.
Platforms:
For reliable data recording, the sensor systems must be placed on suitable platforms. The platforms may be stationary or mobile, depending upon the requirements of the observation and its constraints.
For example, for an imaging system the spatial resolution generally becomes coarser as the platform height increases. The ability of the platform to support the sensor, in terms of weight, volume, power etc., and its stability also have to be taken into consideration.
The most extensively used platforms are aircrafts and satellites.
Generation of Database:
The data acquired by a sensor invariably suffers from a number of errors.
Such errors occur due to various reasons, such as:
(i) Imaging characteristics of the sensor,
(ii) Stability and orbit characteristics of the platform,
(iii) Scene surface characteristics,
(iv) Motion of the earth, and
(v) Atmospheric effects.
Preprocessing should be carried out to correct these errors to the maximum extent, so that the inherent quality of the original information of the scene is retained as far as possible. The outputs of preprocessing, available in standard formats, either photographic or digital, are known as “data products”.
Preprocessing is carried out for the following purposes:
(i) Removing geometric distortion in the imagery;
(ii) Eliminating radiometric distortion in the imagery, and
(iii) Enhancing the contrast in the data so that certain characteristics of interest stand out better in the photograph.
The procedures employed for geometric correction usually treat distortions in two groups: those that are systematic or predictable, and those that are essentially random. Systematic errors may be corrected by applying formulae derived by mathematical modelling of the expected distortions (for example, earth rotation).
Earth rotation correction is applied to line-scan images (optomechanical and LISS types of sensors). The satellite images the earth on its south-bound pass in the case of descending-node acquisition. The image is made up of individual scan lines, and since the earth is rotating from west to east, each successive scan line has to be displaced westward to correct for the relative motion between the satellite and the earth.
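The scan-line displacement just described can be sketched as follows (the shift per line is a hypothetical parameter; in practice it is derived from the earth's rotation rate, the latitude and the satellite's orbit):

```python
def deskew(image, shift_per_line=0.25, fill=0):
    """Displace each successive scan line westward (leftward here) by
    i * shift_per_line pixels (truncated to whole pixels), padding with
    `fill` so that all rows keep the same length."""
    n = len(image)
    max_off = int((n - 1) * shift_per_line)
    out = []
    for i, row in enumerate(image):
        off = int(i * shift_per_line)
        # `max_off - off` fill pixels on the west edge, `off` on the east,
        # so later lines sit further west relative to earlier ones.
        out.append([fill] * (max_off - off) + row + [fill] * off)
    return out

img = [[1, 1, 1]] * 4          # four scan lines of three pixels each
for row in deskew(img, 1.0):   # one-pixel westward shift per line
    print(row)
```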
Random errors arise from uncertainty in the measurement or estimation of these parameters and from modelling limitations. Geometric distortions, if left uncorrected, result in relative positional errors in latitude and longitude. To a first approximation, these distortions can be corrected from the measured or estimated parameters, leaving the random errors uncorrected.
The correction process proceeds by first defining the transformation equations relating the corrected image co-ordinates to the uncorrected data, using the error models and measured system parameters. The transformation equations are then applied to a selected grid of points over the scene, and for the remaining points interpolation is carried out.
As the transformation results in fractional scan-line or pixel values in the uncorrected data, a resampling method is then adopted to determine the gray values at these locations. Radiometric distortion arises from the non-linearity of the detector response, responsivity variation between the detectors, the radiation pattern of the antenna, and line and pixel drop-outs.
Radiometric errors are usually corrected by extensive calibration measurements during laboratory tests. Image enhancement is a radiometric transformation on the pixel to enhance visual discrimination of low contrast image features.
For the image enhancement, the initial step is to generate an image histogram, which describes statistical distribution of gray levels in an image in forms of the number of pixels comprising each gray level.
One simple way to increase contrast is to expand the original gray-level range to fill the total dynamic range of the recording or display system. This may be achieved by subtracting a bias gray value and then increasing the gray-level range with a gain factor, or by saturating the lower and upper extremes of the gray values and expanding the middle range.
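The bias-and-gain stretch described above can be sketched as follows (a minimal sketch assuming an 8-bit display range and a non-uniform image; the sample gray levels are hypothetical):

```python
def stretch(pixels, out_min=0, out_max=255):
    """Linear contrast stretch: subtract the bias (minimum) gray value,
    then apply a gain factor so the data fill the full dynamic range.
    Assumes the pixels are not all equal."""
    lo, hi = min(pixels), max(pixels)
    gain = (out_max - out_min) / (hi - lo)
    return [round(out_min + (p - lo) * gain) for p in pixels]

# A low-contrast image occupying only gray levels 100-140:
print(stretch([100, 110, 120, 130, 140]))  # [0, 64, 128, 191, 255]
```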
However, such enhancements are useful only for visual analysis and generally there is no advantage for digital classification.
Standard data products are generated from the corrected and formatted data by photo writing to produce photographic products in the form of black and white or colour transparencies or prints of different bands or combination of bands. Enlargements are generated to provide images at a specific usable scale. Digital information is provided in specific formats in computer compatible tapes.
Analysis of Generated Data:
There are mainly two methods of data analysis for extracting resource-related information from data products, either independently or in combination with other collateral information: visual interpretation and digital image processing techniques.
(i) Visual Analysis:
Visual interpretation methods are used for extracting information on various natural resources. For example, tone/colour, texture, shadow, shape and size are interpreted by visual analysis.
(ii) Digital Image Processing Techniques:
In this technique, the computer analyses the spectral signatures so as to associate each pixel with a particular feature of the imagery. The value of reflectance measured by a sensor for the same feature, e.g. a wheat field, will not be identical for all pixels; such response variation within a class is to be expected for any earth surface cover.
The digital classification techniques can be grouped into two:
(i) Supervised classifier, and
(ii) Unsupervised classifier.
In a supervised classifier system, the analyst locates specific sites in the remotely sensed data representing homogeneous examples of different classes, like agriculture, forest, water bodies etc.; these are called “training sites”.
However, the unsupervised classification system is based on the exploitation of the inherent tendency of different classes to form separate spectral clusters in the feature space. This system uses algorithms for natural groupings of spectral properties of the pixels.
Gaussian Maximum Likelihood (MXL) is one of the more commonly used supervised classifiers. The maximum likelihood classifier is computation intensive, since the probability of a pixel belonging to each of the defined categories has to be computed.
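A minimal single-band sketch of the maximum likelihood idea follows; the training-site pixel values are hypothetical, and an operational classifier would use multi-band statistics with full covariance matrices rather than one mean and standard deviation per class:

```python
import math
from statistics import mean, stdev

# Hypothetical single-band training pixels from two training sites:
training = {
    "water": [12, 14, 13, 15, 12, 14],
    "vegetation": [48, 52, 50, 47, 51, 49],
}
# Per-class Gaussian parameters estimated from the training sites:
params = {c: (mean(v), stdev(v)) for c, v in training.items()}

def gaussian_likelihood(x, mu, sigma):
    """Probability density of value x under a normal distribution."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def classify(pixel):
    """Assign the pixel to the class with maximum Gaussian likelihood."""
    return max(params, key=lambda c: gaussian_likelihood(pixel, *params[c]))

print(classify(13), classify(50))  # water vegetation
```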
Potential Applications of Remote Sensing in Agriculture, Aquaculture and Inland Fisheries:
Remote sensing is used to survey various natural resources, such as agriculture, forestry, minerals, water and marine resources, and also to study various physical phenomena. Since the same database is utilised by various disciplines, remotely sensed data are ideally suited to the study of inter-relationships between various resources.
Agricultural applications of remote sensing are characterized by a number of phenological, land management and economic features which together ensure that remote sensing plays, and will continue to play, an increasingly significant role in monitoring agricultural tracts. A brief account of the agricultural applications of remote sensing is given in Table 12.1.
Increasing agricultural productivity has been the main concern since scope for increasing area under agriculture is rather limited. This demands judicious and optimal management of both land and water resources.
Considerable achievements have been realised with remote sensing techniques in India over the last few years, including:
(i) Crop production forecasting,
(ii) Land use or cover mapping,
(iii) Mapping of waste lands,
(iv) Soil mapping,
(v) Drought monitoring and its assessment,
(vi) Monitoring of surface water bodies,
(vii) Ground water exploration,
(viii) Flood mapping and damage assessment.
Remote sensing technique can also be applied in marine resources and coastal and mineral resources studies.
For these areas, remote sensing technique can be successfully used in phytoplankton estimation, fluorescence studies for chlorophyll ‘a’ estimation, temperature of the sea surface, wetland mapping, oil slicks etc. for marine resources, and in case of mineral resource study, the identification of rocks and other geological studies can be made by applying remote sensing technique.
Integrated Agricultural Resource Management:
In India, even today, about 80% of the total cultivable land is rainfed, i.e. strictly dependent on rainfall. The lack of proper management of soil, water, fertilizer nutrients and other related agricultural inputs leads to degradation of land, loss of water and soil, and deterioration of soil health, which affect crop production adversely.
Therefore, all these resources are to be managed through an integrated approach. The various basic items of information required for carrying out such integrated plans are available through space-based remote sensing. Some of these are shown in Fig. 12.7.
All those above agricultural resources can be effectively studied by the application of remote sensing technique.