Remote Sensing, Image Classification, and GIS.
The term Remote Sensing covers all sorts of methods for gathering information from afar: our eyes, for example, as well as common snapshot cameras, sonar, radar, and scanners of all types, including those carried on satellites. The latter are the most interesting from the perspective of this course, for several reasons:
As we often say in this course, much of the matter of information systems boils down to these four basic capacities:
The data structures created by satellite imaging systems are images, like our familiar JPGs and TIFs. As such, they are by definition rasters. The fundamental unit of analysis is a cell that records the intensity of reflected light, and our familiar tools and procedures, such as those provided by Raster GIS and Photoshop, will operate on satellite images.

The primary difference between most satellite images and the TIFF and JPG images that we collect with our pocket cameras is that the scanners we send into space are not limited to recording the same sorts of light that our eyes do. While our visual system normally makes use of combinations of the red, green, and blue slices of the spectrum, satellite scanners divide the spectrum into finer divisions, and also collect reflectances of radiation that we can't see. So, while we may still be discussing pixels and cells (our familiar raster data structures), extracting all of the meaning that multispectral images have to offer takes us through a discussion of the electromagnetic spectrum.

Images are categorized according to the amount of data used to record the attributes of each pixel:
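To make the raster idea concrete, here is a minimal sketch, using NumPy and a small synthetic image (the band count, dimensions, and values are illustrative assumptions, not real satellite data). A multispectral image is just a stack of raster grids, one per band, and the data type fixes how many bits record each cell's intensity.

```python
import numpy as np

# A tiny synthetic 3-band raster (e.g., red, green, blue), 4 x 4 cells.
# dtype uint8 means 8 bits, i.e. 256 possible intensity values, per band per cell.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(3, 4, 4), dtype=np.uint8)

print(image.shape)     # (3, 4, 4): (bands, rows, cols)
print(image.dtype)     # uint8
print(image[:, 0, 0])  # the intensities recorded for one cell, one per band
```

A real Landsat scene works the same way, only with more bands, far more cells, and often more bits per cell.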
Here is a nice page of illustrations of image color modes from Designer.com. Or see Wikipedia.
Procedures for Associating and Transforming Multispectral Images
Like the earth itself, images of the earth's reflectivity are very unstructured. Our eyes may be able to see useful patterns in these images if we color and overlay the various components, as in the two images of Las Vegas taken in 1972 and 1992. But we aren't able to systematically access this information as a land cover map, as we would in Raster GIS, until we have associated groups of pixels together, identifying classes of more or less homogeneous land cover. This classification process is accomplished through a class of software known as Multispectral Image Classification Software. This software uses statistical techniques, and information we may know about the reflective properties of things, to develop reflective signatures that can be fit to unstructured, fuzzy data.
First a Little History
The history of remote sensing is tied to the history of photography and the history of the space program. NASA has a very good history of remote sensing in its Remote Sensing Tutorial. A couple of interesting bits that aren't highlighted in the NASA history include the recently declassified Corona project (see this page by Keith Clarke at UCSB), which began collecting space images in 1962. In 1972 NASA sent up the LANDSAT satellite, which has been systematically sending back snapshots ever since. This data, available as part of NASA's Mission to Earth, is one of the greatest data values on the internet, especially since you may now download these images free from the Global Land Cover Facility at the University of Maryland. We should also bring you up to date on recent history, including the emergence of private-sector providers of space imaging such as DigitalGlobe and GeoEye, which can get you very crisp color images with nearly half-meter resolution of nearly anywhere for about $2,500. Private-sector involvement in this industry has introduced some interesting new parameters, including a check on previously secret information (or mis-information) and healthy competition in terms of quality and price.
The Electromagnetic Spectrum
It is beyond the scope of this class to discuss photons, waves, and beams, but suffice it to say that the sun is a huge source of radiation that is absorbed and reflected at different rates by different materials. Some active remote sensing devices, such as RADAR, create their own radiation. Other imaging systems record long-wave infrared (heat) that is emitted from objects. The images we will be discussing -- including most useful geographic images -- are recorded by passive imaging systems that record reflected radiation originating from the sun.
Different materials are distinct not only in the amount of radiation that they reflect, but also in which wavelengths they reflect. For example, your blue pants are reflecting a lot of blue light and absorbing relatively more red and green light. Red, green, and blue are spectral bands, or slices of the spectrum, when we organize it by wavelength or frequency.
Thanks to Utah State University and NASA for these images.
Imaging Systems

In addition to film, there are several other technologies for recording reflected light and other portions of the electromagnetic spectrum. The most common system used to collect commercial satellite imagery is the Multi-Spectral Scanner (MSS) (Diagram, photo). Use of an MSS is convenient, especially in satellite systems, because the image is digital from the start, so there is no problem returning film from the satellite to earth. The abilities of space-based imaging systems are growing at a very rapid rate. Recently, a commercial imaging satellite began sending back very high resolution images (70cm panchromatic, 1 meter multispectral).
Here is a detailed overview of how data are returned from the Landsat Thematic Mapper from NASA's Remote Sensing Tutorial.
Visible and Invisible Radiation

We typically think of photography as being a record of things that we can see, but this is not always the case. Many common remote sensing applications record a piece of the spectrum beyond the red end of the visible spectrum (near-infrared). Although we can't see reflected near-infrared, this sort of light reveals interesting things about vegetation health and moisture.
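One common way to exploit the near-infrared band is the Normalized Difference Vegetation Index (NDVI), which contrasts near-infrared and red reflectance: healthy vegetation reflects near-infrared strongly and absorbs red, so its NDVI approaches +1. Here is a minimal sketch with made-up reflectance values (the numbers are illustrative assumptions, not measurements):

```python
import numpy as np

# Hypothetical red and near-infrared reflectance bands (values in 0..1)
# for a 2 x 2 patch of ground.
red = np.array([[0.10, 0.30], [0.25, 0.05]])
nir = np.array([[0.60, 0.35], [0.30, 0.55]])

# NDVI = (NIR - Red) / (NIR + Red); dense, healthy vegetation approaches +1,
# bare soil and water sit near or below 0.
ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))  # [[0.71 0.08] [0.09 0.83]]
```

The two cells with high NDVI (0.71 and 0.83) behave like vegetation; the others behave like bare ground.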
False Color Images
Understanding a photograph that includes information on reflectivity in the near infrared forces us to think in a new way about how we look at photographs. Because of the limitations of our eyes, and of the computer displays that we use to interface with data, we have to learn to see these images not as records of the way things look, but as records of invisible aspects of things. For an excellent treatment of the subject, see this page from the USGS. Click here to see an example of a multi-band image from Utah State University.
Spectral signatures for various types of objects are best measured by setting up calibration sites on the ground, as was done in this Hawaiian remote sensing project.
While it is possible to determine the exact spectral signature of a blade of grass, the resolution of most satellite or even airborne sensors (that we know about!) means that each cell will usually pick up reflectance from more than one sort of material, yielding what some people call mixed pixels (or mixels). The process of trying to sort out these mixtures involves lots of mathematics and physics.
The Annoying Details
Measuring spectral signatures of objects can yield ideal reflectivity information. Of course, the measurements made from space are often muddied up by clouds and haze. This is why hard-core remote sensing engineers often discuss radiometric correction, which attempts to compensate for the atmospheric conditions that may be filtering the reflectance.
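One of the simplest radiometric-correction tricks is dark-object subtraction: assume the darkest pixels in a band (deep shadow or clear water) should reflect almost nothing, so whatever value they do record is attributed to atmospheric haze and subtracted from the whole band. A minimal sketch, with made-up digital numbers:

```python
import numpy as np

# Hypothetical raw digital numbers for one band of a tiny scene.
band = np.array([[58, 61, 90],
                 [120, 57, 200]], dtype=np.int32)

# Dark-object subtraction: the darkest cell "should" be near zero, so its
# value (57 here) is treated as the additive haze contribution.
haze_offset = band.min()
corrected = np.clip(band - haze_offset, 0, None)
print(corrected)  # [[1 4 33] [63 0 143]]
```

Real radiometric correction is much more involved (sensor calibration, sun angle, atmospheric models), but this shows the basic idea of removing a scene-wide atmospheric offset.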
Image Processing Software
An understanding of the reflectivity of various objects is key for serious interpretation of remotely sensed images. But very useful information can be derived by harvesting spectral signatures from images when you have prior knowledge of what was on the ground. This technique is known as Image Classification. In the simplest sense, the object is to classify pixels according to their intensity. To make it a bit more complicated, you can classify pixels in multi-spectral images into groups of similar intensities across several bands.
The idea of classifying areas with similar signatures is important because of the inherent imprecision of spectral signatures and of remote sensing technology. Assignment of a location to a particular class is accomplished through one of several clustering algorithms, which can be illustrated as mapping each location in a multi-dimensional graph with one axis for each image band, and breaking up that multidimensional space into envelopes of 'color-space.'
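The band-space picture can be sketched with a bare-bones k-means clustering (Lloyd's algorithm) over synthetic pixels; the two "land cover" groups, their band values, and the seeding are all illustrative assumptions, not a real classifier.

```python
import numpy as np

# Each pixel is a point in "band space": one axis per spectral band.
rng = np.random.default_rng(1)

# Two synthetic land-cover groups in a 3-band image, flattened to (pixels, bands).
water = rng.normal(loc=[20, 30, 10], scale=3.0, size=(50, 3))
veg = rng.normal(loc=[40, 90, 120], scale=3.0, size=(50, 3))
pixels = np.vstack([water, veg])

# Minimal k-means, seeded with one pixel from each group so the sketch
# converges deterministically.
centers = pixels[[0, -1]].copy()
for _ in range(10):
    # Assign each pixel to its nearest center in band space...
    dists = ((pixels[:, None, :] - centers) ** 2).sum(axis=2)
    labels = np.argmin(dists, axis=1)
    # ...then move each center to the mean of its assigned pixels.
    centers = np.array([pixels[labels == k].mean(axis=0) for k in range(2)])

# The two clusters carve band space into two envelopes; each synthetic
# land-cover group lands in its own cluster.
print(np.unique(labels[:50]), np.unique(labels[50:]))
```

The cluster envelopes here are the 'color-space' regions described above; an unsupervised classifier finds them from the data alone, while a supervised one fits them to training sites of known land cover.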
Supervised or Trained Classification
Here are some examples of image classifications that were done in previous years of this course: