Goals and Background
This lab was designed to give us students experience correcting remotely sensed images, in this case satellite images, for atmospheric interference. There are two main objectives for this lab:
1) Develop our skills in performing absolute atmospheric correction on remotely sensed images using multiple methods
2) Conduct relative atmospheric correction on remotely sensed images
Methods
Part 1: Absolute atmospheric correction using empirical line calibration
This first portion of the lab conducts atmospheric correction using the empirical line calibration (ELC) technique. The ELC method relies on in situ data, a library of spectral values for many surfaces and materials on the earth's surface, ideally collected at the same time that the sensor captures the imagery. These reflectance values are matched to the reflectance values collected in the imagery through the following equation: CR_k = DN_k * M_k + L_k, where CR_k is the corrected digital output for band k, DN_k is the image band being corrected, M_k is a multiplicative term that adjusts the brightness values of the image, and L_k is an additive term. In this method M_k acts as the gain and L_k acts as the offset. The gain and offset come from regression equations built between the in situ reflectance data and the values recorded by the sensor. There are three steps to completing this correction.
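Although ERDAS handles the regression internally, the idea can be sketched outside the software. The short Python example below is a minimal illustration of the ELC equation above, assuming we already have paired samples for a single band; the DN and reflectance numbers are invented for illustration and are not values from the lab.

```python
import numpy as np

# Paired samples for one band: image digital numbers (DN) at the sample
# pixels and the matching in situ / library reflectance values.
# These numbers are made up for illustration only.
image_dn = np.array([42, 88, 130, 175, 220], dtype=float)
field_reflectance = np.array([0.05, 0.12, 0.21, 0.30, 0.41])

# Fit the ELC regression for this band: reflectance = M_k * DN + L_k.
# np.polyfit with degree 1 returns the slope (gain M_k) and intercept (offset L_k).
gain, offset = np.polyfit(image_dn, field_reflectance, 1)

# Apply the correction to every pixel of the band: CR_k = DN_k * M_k + L_k.
band = np.array([[40, 90], [150, 210]], dtype=float)   # a tiny stand-in for a full band
corrected = band * gain + offset

print(f"gain (M_k) = {gain:.5f}, offset (L_k) = {offset:.5f}")
print(corrected)
```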
Section 1: Background and preparations for conducting Empirical Line Calibration
The first step is to open the Spectral Analysis Work Station in ERDAS Imagine 2015 and bring in the image you want to correct. In this lab we are working with an image of the Eau Claire area collected on August 3rd, 2011 by the Landsat 5 satellite. Once the image is loaded, make sure that the correct sensor is selected (in this case Landsat 5), then click on the Edit Atmospheric Adjustment tool and make sure that ELC is selected as the correction method. Figure 1 is what the window will look like when you have done this step.
Figure 1 This is the Atmospheric Adjustment tool interface.
Section 2: Collecting samples and identifying reference signatures to conduct ELC
Once you have the window open, the next step is to collect spectral signatures of features throughout the image. Those signatures are then paired with in situ signatures from various spectral libraries. The first signature we collected from the Eau Claire image was a roadway. This is done by finding a prominent road in the image, such as a highway, zooming in, and using the Create a Point tool to collect a signature from the middle of the road feature. For this and all the other signatures, it is important to make sure we were only collecting the signature of a single feature, with no vegetation or other material overlapping it; we want the purest and most accurate signatures we can get from the image. We changed the line color to grey and selected a road signature, which is then displayed in the sample chart. We then went into a spectral library and found the corresponding in situ signature, which in this case is called Asphaltic Concrete. The imagery sample is the top line and the library signature is the bottom line in the chart of Figure 2. We repeated this process to collect signatures of forested vegetation, an aluminum rooftop, agricultural land, and water. Figure 3 shows the sample signatures for each of these compared to the in situ spectral library data.
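The chart comparison works because the continuous library spectrum is lined up with the image's bands. ERDAS does this pairing for us, but as a rough sketch of the idea, the Python example below resamples a made-up library spectrum to nominal Landsat 5 TM band-centre wavelengths so it can sit next to an image signature; every number in it is a placeholder, not data from the lab.

```python
import numpy as np

# Approximate Landsat 5 TM band-centre wavelengths in micrometres
# (bands 1, 2, 3, 4, 5, 7); treated here as nominal values for illustration.
tm_band_centers = np.array([0.485, 0.56, 0.66, 0.83, 1.65, 2.215])

# A hypothetical library spectrum: wavelengths (micrometres) and reflectance.
lib_wavelengths = np.linspace(0.4, 2.5, 211)
lib_reflectance = 0.15 + 0.1 * np.sin(lib_wavelengths)   # placeholder curve

# Resample the library spectrum to the sensor band centres so it can be
# plotted against the signature sampled from the image.
lib_at_bands = np.interp(tm_band_centers, lib_wavelengths, lib_reflectance)

# Image signature collected from one sample point (placeholder DN values).
image_signature = np.array([61, 55, 58, 70, 90, 74], dtype=float)

for b, (img, lib) in enumerate(zip(image_signature, lib_at_bands), start=1):
    print(f"band {b}: image DN = {img:.0f}, library reflectance = {lib:.3f}")
```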
Figure 2 The chart on the right-hand side is the spectral signature chart, where the image signature is compared to the library signature.
Figure 3 These are the signature comparison charts for each sample. Grey is concrete, green is forest, yellow is agriculture, cyan is rooftop, and blue is water.
Section 3: Executing atmospheric correction using Empirical Line Calibration
Once we have those signatures collected, the next step is to run the ELC correction. In this part of the lab this is a fully automated process, but later in the lab we will manually create the regression models that the ELC method relies on. Figure 4, in the results portion of the lab, is the final corrected image using the ELC method. Once the new image was created we opened it and compared it to the original by collecting spectral signatures from the same objects in both images. Figure 5 shows the resulting spectral signature graphs comparing the original versus the corrected image signature values. We collected samples from healthy vegetation, roads, water bodies, and agricultural areas to compare the images.
Part 2: Absolute atmospheric correction using enhanced image-based dark object subtraction
The second method we explored in this lab corrects images using dark object subtraction (DOS). This method makes use of several parameters to correct the image, including sensor gain and offset, solar zenith angle, atmospheric scattering, solar irradiance, absorption, and path radiance. There are two steps to the correction in this method:
1) Convert the image collected by the satellite to an at-satellite spectral radiance image
2) Convert the at-satellite radiance image to true surface reflectance
Section 1: Conversion of image (DN) to at-satellite spectral radiance
The first step is to use ERDAS Imagine 2015 to create six models, one for each band of the imagery (1, 2, 3, 4, 5, and 7). All six models are created and run in the same model window and use the original uncorrected Eau Claire image, just as in Part 1. Once the image bands are loaded into the model, the next step is to fill in the function for each band. Figure 6 is the equation used in the function. The majority of the information needed for the equation is found in the metadata. Once the function equations are complete for each band, the output images need to be saved. Once output destinations are selected, the model (Figure 7) can be run.
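The exact expression used in the lab is the one shown in Figure 6. As a hedged illustration of what this step does, the sketch below applies the commonly published Landsat 5 TM DN-to-radiance conversion to a stand-in band, with made-up LMIN, LMAX, and quantization values in place of the numbers that would come from the metadata.

```python
import numpy as np

# Placeholder calibration values for one band; the real LMIN/LMAX and
# quantization limits come from the image metadata file.
lmin, lmax = -1.52, 193.0        # W / (m^2 * sr * um), example values only
qcalmin, qcalmax = 1.0, 255.0    # quantized calibrated pixel range

dn = np.array([[30, 75], [120, 200]], dtype=float)   # stand-in for an image band

# DN to at-satellite spectral radiance:
# L = ((LMAX - LMIN) / (QCALMAX - QCALMIN)) * (DN - QCALMIN) + LMIN
radiance = ((lmax - lmin) / (qcalmax - qcalmin)) * (dn - qcalmin) + lmin
print(radiance)
```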
Figure 6 This is the function equation for the model to correct each band.
Section 2: Conversion of at-satellite radiance image to true surface reflectance
Step 2 follows a very similar procedure to step 1, but this time there is a new equation (Figure 8), and instead of the input bands coming from the original Eau Claire image they come from the radiance image bands created in step 1. The new equation makes use of path radiance, which is estimated by measuring the distance from the origin of the histogram to the actual beginning of the histogram for each band. The solar zenith angle is also used in the equation and is a constant value for all of the bands. The distance between the earth and sun varies and needs to be looked up in a chart that lists distances for every day of the year. Once you have all the values to fill in the equation, it is time to create another model. Just like the first model there are six smaller models, one for each band, using the radiance bands as the input and the new equation for the functions; after creating a new output location the model (Figure 9) can be run. Once the images are run in the model you can see that there is a layer stack tool in the model. This takes the new output bands and stacks them together so that we can compare the new stacked image with the original image to see how well the correction worked. Figure 10, in the results section, is the newly corrected image using the DOS method.
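As with step 1, the formula the lab actually uses is the one in Figure 8. The sketch below illustrates a common form of the radiance-to-reflectance conversion with the dark-object (path radiance) term removed; the ESUN, Earth-Sun distance, sun elevation, and haze values are placeholders, not numbers taken from the Eau Claire metadata.

```python
import numpy as np

# Placeholder inputs for one band; the real values come from the metadata,
# the Earth-Sun distance table, and the band histogram.
esun = 1957.0          # exoatmospheric solar irradiance for the band, W / (m^2 * um)
d = 1.0144             # Earth-Sun distance in astronomical units for the image date
sun_elevation = 58.0   # degrees, from the metadata
theta_sz = np.radians(90.0 - sun_elevation)   # solar zenith angle
path_radiance = 2.3    # haze value read from the origin of the band histogram

radiance = np.array([[18.0, 42.0], [65.0, 90.0]])   # stand-in radiance band from step 1

# Radiance to surface reflectance with the path radiance term removed:
# reflectance = pi * d^2 * (L - L_haze) / (ESUN * cos(theta_sz))
reflectance = np.pi * d**2 * (radiance - path_radiance) / (esun * np.cos(theta_sz))
print(reflectance)
```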
Figure 8 This is the new equation for the second part of the DOS correction method.
Part 3: Relative atmospheric correction using multidate image normalization
The final method we used in the lab to conduct atmospheric correction is called multidate image normalization. When in situ data is not available, which is often the case when working with historical images, this is frequently the method chosen to correct them. The method is based on having images of the same area from two different dates. In our case we are using images of the Chicago area, one collected in 2000 and the other in 2009.
Section 1: Collection of pseudo-invariant features from base image and subsequent image
The first step is to open both images in ERDAS Imagine 2015 in two separate image viewers. Then link and synchronize the two viewers so that they are set to the same extent and zoom and pan together. We zoomed in to O'Hare International Airport, and we are going to use one of the rooftops in the image comparison. In order to do this we open the Spectral Profile tool in the Multispectral drop-down. Then we unlink and unsynchronize the two viewers prior to collecting a profile point. We collect this point from the same location on each image, and it is displayed as a spectral signature in each image's signature graph, as seen in Figure 11. We repeat this point collection procedure for a total of 15 points on each image, each taken from the same location in both images. It is important to make sure the points are collected from the same locations; otherwise this method will not correct the images accurately. The points were collected throughout the image: 5 in Lake Michigan, 5 from urban or built-up areas, and the rest from inland lakes and rivers. Including the O'Hare point there are 15 signatures for each image. Figure 11 shows the spectral signature charts and the point locations for both images.
Figure 11 This is showing the 15 points from which spectral signatures were collected. The image on the left is from 2000 and the one on the right is the 2009 image that will be corrected.
After the signatures are collected, the next step is to conduct a regression analysis. On the signature graph window we click the tabular data view. This gives us a chart of the pixel data for each of the points collected for each of the 6 bands. We take the mean values from each band and enter them into an Excel spreadsheet (Figure 12). There are 15 rows, one for each location, and 6 columns, one for each band. We create two separate tables, one for each image. Once these tables are made, we take the band 1 columns from each table and create a scatter plot. We then add a trend line, from which we get the gain (the slope of the line) and the bias (the y-intercept). In the new charts (Figure 13), 6 in total, one for each pair of bands, we include the regression equation and R-squared values. These values are needed for the function, or equation, that we will be using in the model for this correction method.
Figure 12 These are the two charts of mean pixel values for each sample location.
Figure 13 These are the 6 regression graphs from which we get the regression equations and R-squared values.
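The same gains and biases can also be computed directly from the two tables of band means rather than through Excel trend lines. The sketch below is a minimal illustration of that regression step; the two arrays of means are randomly generated stand-ins for the Figure 12 tables, not the values we collected.

```python
import numpy as np

# Mean pixel values for the 15 pseudo-invariant points, one column per band.
# These arrays are placeholders standing in for the Excel tables in Figure 12.
means_2009 = np.random.default_rng(0).uniform(20, 180, size=(15, 6))  # image to correct
means_2000 = 0.9 * means_2009 + 5 + np.random.default_rng(1).normal(0, 2, size=(15, 6))  # base image

for band in range(6):
    x = means_2009[:, band]          # subsequent (2009) image
    y = means_2000[:, band]          # base (2000) image
    gain, bias = np.polyfit(x, y, 1)        # slope = gain, intercept = bias
    r = np.corrcoef(x, y)[0, 1]
    print(f"band {band + 1}: gain = {gain:.4f}, bias = {bias:.4f}, R^2 = {r**2:.4f}")
```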
Section 2: Development of Atmospheric correction image normalization models
The last part of this method is creating another model, just as we did in Parts 1 and 2, with 6 smaller models, one for each band, inside one large model. The input bands in this model are the 6 bands from the 2009 Chicago image, as we are using the 2000 image to correct the 2009 image. The equation in Figure 14 serves as the function equation for the model (Figure 15), and the new corrected output images are saved. Again, like Part 2, these new output images are stacked to create the final corrected image. Figure 16, in the results section, is the final corrected image.
Figure 14 This is the equation for the function in the model when using the multidate image normalization method.
Figure 15 This is the final model for correcting the 2009 Chicago image via the multidate image normalization method.
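The model in Figure 15 applies those gains and biases band by band and then stacks the outputs. The sketch below mirrors that arithmetic outside of ERDAS using placeholder gain, bias, and band values, purely to show what the model is doing.

```python
import numpy as np

# Gains and biases per band, taken from the regression step above
# (placeholder values here).
gains = np.array([0.91, 0.88, 0.93, 0.90, 0.89, 0.92])
biases = np.array([4.8, 5.3, 3.9, 6.1, 5.0, 4.4])

# Stand-in for the six 2009 Chicago bands (rows x cols each).
bands_2009 = [np.random.default_rng(i).uniform(0, 255, size=(4, 4)) for i in range(6)]

# Apply the per-band normalization: normalized_2009 = gain * band_2009 + bias,
# then stack the corrected bands into a single multi-band array,
# mirroring the layer-stack step at the end of the model.
normalized = np.stack([g * band + b for band, g, b in zip(bands_2009, gains, biases)])
print(normalized.shape)   # (6, 4, 4): six corrected bands stacked together
```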
Results
Part 1
Figure 4 This is the atmospherically corrected image (right) compared to the original image (left). This correction was done using the ELC method.
Part 2
Figure 10 Coming Soon
Part 3
Figure 16 This is the Chicago 2009 corrected image (right) compared to the Chicago 2000 original image (left).
Sources
Landsat satellite imagery is from the Earth Resources Observation and Science Center, United States Geological Survey.
Spectral signatures are from the respective spectral libraries consulted.