This map shows current land use in Germantown, Maryland, derived from a supervised classification of multispectral imagery (bands 4, 5, and 6). Eight land use classes are mapped: urban/residential, roads, agriculture, fallow fields, grasses, mixed forest, deciduous forest, and water.
Module 5 introduces digital image classification, where software analyzes and classifies remotely sensed images based on pixel values. In this lab, we focused on two approaches: unsupervised and supervised classification.
Supervised classification uses labeled training samples provided by the user to teach the software what each land cover class looks like spectrally, and then applies those patterns across the entire image. Unsupervised classification, on the other hand, lets the software automatically group pixels into clusters based on their statistical similarity, without any labels at first. The analyst then interprets and labels those clusters as real-world classes using visual interpretation and any available in situ information.
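Outside of ERDAS, the same two ideas can be sketched in a few lines of Python. This is only an illustration of the concepts, not the lab workflow: the image array, training pixels, labels, and class count below are all made up.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import GaussianNB

# Fake 6-band image: (rows, cols, bands) of pixel values.
img = np.random.randint(0, 255, size=(100, 100, 6)).astype(float)
pixels = img.reshape(-1, 6)

# Unsupervised: the algorithm finds clusters; the analyst labels them afterward.
clusters = KMeans(n_clusters=8, n_init=10).fit_predict(pixels)
unsup_map = clusters.reshape(100, 100)

# Supervised: train on labeled sample pixels, then classify every pixel.
train_x = pixels[:500]                       # pretend these came from training AOIs
train_y = np.random.randint(0, 8, size=500)  # pretend analyst-assigned labels
sup_map = GaussianNB().fit(train_x, train_y).predict(pixels).reshape(100, 100)
```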
The first few exercises of the lab walked us through both digital classification methods and introduced us to the tools, tips, and tricks used in the process.
The main exercise, though, was to use the supervised classification techniques we learned to classify a multispectral image of Germantown, MD, into land-use classes.
Building Training Data with AOIs and Signatures
We started with a set of coordinates for several land use types:
For each coordinate, I used the Inquire tool in ERDAS IMAGINE to set a cursor at the correct location. Around each cursor, I created an Area of Interest (AOI) using either the Grow tool (which uses a ‘seed’ pixel and expands to neighboring pixels with similar spectral values) or manually tracing a polygon over the area. We were also tasked with finding and creating signatures for waterways and roadways without specific coordinates.
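The Grow tool is essentially a region-growing operation. A simplified, hypothetical Python version (Euclidean spectral distance to the seed pixel, 4-connected neighbors, and an arbitrary threshold; IMAGINE's actual criteria may differ) might look like this:

```python
from collections import deque
import numpy as np

def grow_aoi(img, seed_rc, threshold=20.0):
    """Grow an AOI mask from a seed pixel: include 4-connected neighbors
    whose spectral distance to the seed pixel is below `threshold`."""
    rows, cols, _ = img.shape
    seed_val = img[seed_rc]
    mask = np.zeros((rows, cols), dtype=bool)
    queue = deque([seed_rc])
    mask[seed_rc] = True
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not mask[nr, nc]:
                if np.linalg.norm(img[nr, nc] - seed_val) < threshold:
                    mask[nr, nc] = True
                    queue.append((nr, nc))
    return mask

# Example (hypothetical): img is a (rows, cols, bands) array, seed at row 120, col 340.
# aoi_mask = grow_aoi(img, (120, 340), threshold=25.0)
```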
Once I was happy with an AOI, I turned it into a spectral signature using the Create New Signature(s) button in the Signature Editor. All of these signatures were saved into a single signature file that represented the full land use scheme for the project. These signatures are the training samples the software uses to classify the image.
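Conceptually, a spectral signature boils down to per-class statistics over the AOI pixels, such as a mean vector and covariance matrix. A rough numpy equivalent, assuming an `img` array and an `aoi_mask` like the sketch above, could be:

```python
import numpy as np

def make_signature(img, aoi_mask):
    """Return the mean vector and covariance matrix of the AOI's pixels,
    i.e. the statistics a parametric classifier trains on."""
    samples = img[aoi_mask]            # (n_pixels, n_bands)
    mean = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)
    return mean, cov

# Hypothetical signature file as a dict keyed by class name:
# signatures = {"agriculture": make_signature(img, ag_mask),
#               "water": make_signature(img, water_mask)}
```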
Checking for Spectral Confusion and Choosing Bands
With signatures in place, I needed to sort out spectral confusion (where two classes look too similar spectrally).
To do this, I:
- Looked at histograms for pairs of classes, band by band, to see where their histograms overlapped and where they separated.
- Used the Signature Mean Plot to compare all signatures across all six bands at once (pictured below; a rough Python equivalent is sketched after this list).
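For anyone without IMAGINE handy, the same kind of mean-plot comparison can be roughed out with matplotlib, assuming a `signatures` dict of per-class (mean, covariance) pairs like the one sketched earlier:

```python
import matplotlib.pyplot as plt

def plot_signature_means(signatures, n_bands=6):
    """Plot each class's mean pixel value per band, one line per signature."""
    bands = range(1, n_bands + 1)
    for name, (mean, _cov) in signatures.items():
        plt.plot(bands, mean[:n_bands], marker="o", label=name)
    plt.xlabel("Band")
    plt.ylabel("Mean pixel value")
    plt.legend()
    plt.show()
```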
From this analysis, I found:
- Bands 4 and 5 showed the best separation between most classes.
- Band 6 also helped separate some of the features.
- Bands 2 and 3 had significant overlap and, therefore, more confusion.
Because of that, I decided to use a 4–5–6 band combination for visualization and interpretation. In the Signature Editor, I set the colors using Approximate True Colors, assigning:
This band combo highlights vegetation nicely and helped me visually check whether the classification made sense.
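As a rough illustration of what that display choice amounts to, a similar false-color composite can be built by stacking bands 4, 5, and 6 into RGB with a simple percentile stretch. The band-to-channel order and stretch percentiles here are assumptions, not the exact colors I assigned in the Signature Editor.

```python
import numpy as np

def stretch(band, lo=2, hi=98):
    """Linear percentile stretch to the 0-1 range for display."""
    p_lo, p_hi = np.percentile(band, (lo, hi))
    return np.clip((band - p_lo) / (p_hi - p_lo), 0, 1)

# img is (rows, cols, bands) with band 1 at index 0.
rgb = np.dstack([stretch(img[..., 3]),   # band 4 -> red
                 stretch(img[..., 4]),   # band 5 -> green
                 stretch(img[..., 5])])  # band 6 -> blue
# plt.imshow(rgb) would then display the 4-5-6 composite.
```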
Running the Supervised Classification and Distance File
Once the signatures and colors were set, I ran a Supervised Classification using the Maximum Likelihood parametric rule. At the same time, I created a Spectral Distance Image (pictured below), which shows how far each pixel is from its closest class in spectral space. Bright areas in the distance image mark places where the classification is less confident and where errors are more likely.
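Under the hood, the maximum likelihood rule scores each pixel against a Gaussian model of every signature and keeps the most likely class, and the distance file records how far the pixel sits from that winning class. Below is a simplified numpy sketch under stated assumptions: Euclidean distance to the winning class mean stands in for the distance image, which may differ from IMAGINE's exact formula.

```python
import numpy as np

def max_likelihood_classify(img, signatures):
    """signatures: {class_id: (mean, cov)}; returns a class map and a distance image."""
    rows, cols, n_bands = img.shape
    pixels = img.reshape(-1, n_bands)
    class_ids = list(signatures)
    log_likes = np.empty((len(class_ids), pixels.shape[0]))

    for i, cid in enumerate(class_ids):
        mean, cov = signatures[cid]
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        diff = pixels - mean
        mahal = np.einsum("ij,jk,ik->i", diff, inv, diff)   # squared Mahalanobis distance
        log_likes[i] = -0.5 * (n_bands * np.log(2 * np.pi) + logdet + mahal)

    best = log_likes.argmax(axis=0)                          # most likely class per pixel
    class_map = np.array(class_ids)[best].reshape(rows, cols)

    # "Distance" here: Euclidean distance to the winning class mean (brighter = less confident).
    means = np.stack([signatures[cid][0] for cid in class_ids])
    dist = np.linalg.norm(pixels - means[best], axis=1).reshape(rows, cols)
    return class_map, dist
```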
Using the original image, the classification, and the distance image side by side, I found at least one area that needed more attention:
- At approximately 300542.53, 4336697.02, a large field was labeled as urban/residential even though it clearly looked like agriculture. I created a new AOI and signature using the polygon tool and updated my signatures to better capture that land use.
Recoding to Final Land Use Classes
The initial supervised output contained multiples of the same subclasses, such as several agriculture types or urban types. To simplify this into a clean land use map, I used the Recode tool and merged the classes into eight final categories:
- Urban/Residential
- Grasses
- Deciduous Forest
- Mixed Forest (Deciduous/Conifer)
- Fallow Field
- Agriculture
- Water
- Roadway
This step gave me a new thematic raster with a single code for each land use type. I will admit I struggled with the recoding at first and repeated the process about four times. The codes were actually being assigned correctly; the recoded raster just doesn't carry the class names over automatically, so the output looked wrong at first glance. Lesson learned: double-check the labels before assuming the recode failed.
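The recode itself is just a value-to-value mapping. As a loose stand-in for the Recode tool, a numpy lookup over the classified raster would look like the following; the original subclass codes and their groupings here are made up, and `class_map` is assumed to be an integer-coded array like the one from the classification sketch.

```python
import numpy as np

# Hypothetical mapping: original signature codes -> final 8 land use codes.
recode_map = {1: 1, 2: 1, 3: 1,   # urban subclasses   -> 1 Urban/Residential
              4: 2,               #                    -> 2 Grasses
              5: 3,               #                    -> 3 Deciduous Forest
              6: 4,               #                    -> 4 Mixed Forest
              7: 5, 8: 5,         # fallow subclasses  -> 5 Fallow Field
              9: 6, 10: 6,        # ag subclasses      -> 6 Agriculture
              11: 7,              #                    -> 7 Water
              12: 8}              #                    -> 8 Roadway

lut = np.zeros(max(recode_map) + 1, dtype=np.uint8)
for old, new in recode_map.items():
    lut[old] = new
recoded = lut[class_map]          # apply the lookup table to every pixel
```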
The final step in IMAGINE, before exporting the final image and moving my work to ArcGIS Pro, was to add an Area field and calculate the acreage of each class.
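The acreage calculation is essentially pixel counting. A back-of-the-envelope version, assuming a 30 m pixel (so each cell covers 900 m², with 1 acre ≈ 4,046.86 m²; the actual cell size of the lab imagery may differ), could be:

```python
import numpy as np

PIXEL_AREA_M2 = 30 * 30            # assumed 30 m x 30 m cells
M2_PER_ACRE = 4046.86

codes, counts = np.unique(recoded, return_counts=True)
for code, count in zip(codes, counts):
    acres = count * PIXEL_AREA_M2 / M2_PER_ACRE
    print(f"class {code}: {acres:,.1f} acres")
```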
The final steps were completed in ArcGIS Pro, resulting in the map layout shown at the top of this blog post. My aesthetically driven mind grapples with the color palette, but the goal is to be able to differentiate between the classes, not to look pretty.
I haven’t decided on my final project, but I will definitely use some of the techniques and processes learned in this week's module.