With this blog I intend to share GIS, remote sensing, and spatial analysis tips, experiences, and techniques with others. Most of my work is in the field of Landscape Ecology, so there is a focus on ecological applications. Postings include tips and suggestions for data processing and day-to-day GIS tasks, links to my GIS tools and approaches, and links to scientific papers that I've been involved in.
Thursday, July 23, 2015
Problem with Step 6 of the Climatic Water Deficit Toolbox identified and a workaround found
A while back I created the Climatic Water Deficit Toolbox for generating climatic water deficit and actual evapotranspiration rasters. Recently I ran into an issue with step 6, the calculation of the soil water balance. I was getting errors like "ERROR 010240: Could not save raster dataset ...\g_g882 with output format GRID". It was a bit puzzling because I hadn't seen the tool create the g_g grids before. It turns out that when the extents of the available water supply raster and the heat load raster differ, ArcGIS creates temporary GRID files, and these can fail to save. The remedy is to ensure that the available water supply and heat load rasters match exactly in extent, cell alignment, and projection. After the two rasters were lined up the problem went away completely.
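If you hit the same error, one way to line the rasters up is with a short arcpy script. This is a minimal sketch that treats the heat load raster as the reference grid; the file paths are placeholders, not the toolbox's actual inputs:

# Hedged sketch: align the available water supply raster to the heat load
# raster so both share the same projection, cell alignment, and extent.
import arcpy

heatload = r"C:\data\heatload.tif"      # reference raster (placeholder path)
aws_in = r"C:\data\aws.tif"             # raster to align (placeholder path)
aws_out = r"C:\data\aws_aligned.tif"

desc = arcpy.Describe(heatload)
arcpy.env.snapRaster = heatload         # forces matching cell alignment
arcpy.env.extent = desc.extent          # forces matching extent

# Reproject and resample onto the reference grid in one step.
arcpy.ProjectRaster_management(aws_in, aws_out, desc.spatialReference,
                               "BILINEAR", desc.meanCellWidth)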
Thursday, July 16, 2015
What do you do when your shapefile won't display in ArcMap?
From time to time I get e-mail questions from colleagues about what to do about the terrible and dreaded red exclamation mark in ArcMap. Here are a couple of pointers when working with shapefiles and ArcMap documents (MXD files). First off, some general advice on what to do when you get the red exclamation mark:
1. The shapefile is in the same location but the drive letters on the computer are mapped differently (e.g. Z: vs. X:). If you know where the file is located, fixing this is as simple as re-pathing the data. In the Table of Contents (usually the left panel) right click on the layer and go to Properties. Then click the Source tab followed by the Set Data Source button. Navigate to the file and it should update; all other layers in the same folder will be updated too. If you share an MXD across different logins, some locations that are accessible to you may not be accessible to your colleague and vice versa, which is one good reason to always save files to network locations that both of you can access. (To re-path many documents at once, see the scripted sketch after this list.)
2. If the file was moved use the same procedure as above but navigate to the new location.
3. Shapefiles are composed of several different files that need to be kept together. At a minimum there are a .shp, a .dbf, and a .shx, but there will usually also be a .prj and probably a .sbn. These files need to be kept in the same folder at all times. If one or more of them gets moved, move it back.
4. As with #3, if one of those files gets deleted or corrupted you are out of luck. Most likely this isn't the case, though.
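If a whole folder of data has moved, re-pathing every layer by hand gets tedious. Here is a minimal scripted sketch using the arcpy.mapping module (ArcGIS 10.x); the MXD and workspace paths are placeholders:

# Hedged sketch: batch re-path an MXD's layers after a folder move.
import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\projects\study_area.mxd")  # placeholder

# Swap the old workspace path for the new one in every matching layer.
mxd.findAndReplaceWorkspacePaths(r"X:\gis_data", r"Z:\gis_data")

mxd.save()
del mxd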
And here are some general data management tips:
1. Always use the catalog tab in ArcMap to move, copy, or delete files. It'll treat all of the files that compose the shapefile as a single entity.
2. Always place your GIS files in network accessible locations and avoid the local hard drive, desktop, My Documents, or temporary places that might get wiped out periodically.
3. When editing shapefiles save frequently. The more often the better, since ArcMap doesn't have an auto-save feature.
4. Set relative paths for your ArcMap document. File - Map Document Properties - and check 'store relative pathnames to data sources'. There is even a setting to make this the default: Customize - ArcMap Options - General - check 'make relative paths the default for new map documents'. (This can also be set with a script; see the sketch after this list.)
5. Keep good mental notes on where you place files and pay attention when writing outputs. If you are ever in doubt, go to the layer properties and view the file path, or use the List By Source tab in the Table of Contents (looks like a silver can with a yellow arrow).
6. Practice good data management skills. Delete old, unnecessary, duplicate, or corrupt files periodically. Elevate final products to folder locations with clear and memorable paths so that you or anybody else can find the way back if necessary. Give files specific names that describe their purpose and avoid default names like Export_Output. If you are going to have multiple users or multiple dates, consider using suffixes with your initials and the date to help keep track of file versions.
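Tips 4 and 5 lend themselves to scripting as well. A minimal sketch that flags broken layers and switches a map document to relative paths, again using arcpy.mapping (ArcGIS 10.x) with a placeholder path:

# Hedged sketch: report broken data sources and store relative paths.
import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\projects\study_area.mxd")  # placeholder

# List every layer whose source can't be found (the red exclamation marks).
for lyr in arcpy.mapping.ListBrokenDataSources(mxd):
    print("Broken source: {0}".format(lyr.name))

# Equivalent to checking 'store relative pathnames to data sources'.
mxd.relativePaths = True
mxd.save()
del mxd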
Wednesday, July 15, 2015
Remote sensing of cheatgrass die-offs featured on YouTube
This past Tuesday Owen Baughman gave a one-hour talk as part of the Great Basin Landscape Conservation Cooperative's webinar series, which you can view HERE. The talk was about cheatgrass (Bromus tectorum) die-offs and whether they present an opportunity for ecological restoration, the topic he studied for his Master's thesis. Although the focus of the talk was on his field experiments, which I won't describe here (better to hear it in his own words by watching the YouTube video), he did take some time to show some of the preliminary results from our cheatgrass die-off remote sensing study. I'll spare you the remote sensing details and save those for when our upcoming paper comes out, but I'm proud to say that our remote sensing method for mapping cheatgrass die-offs using Landsat imagery appears to be working very well. We developed models for 2014 using the spectral signature of known die-offs and then applied them to 2015 imagery. Sixteen of seventeen predicted die-offs (94%) turned out to be real die-offs when visited in the field, indicating a high level of accuracy. You can see one of the example maps of predicted die-off events in 2015 below.
Thursday, July 2, 2015
Updating 2013 NAIP Imagery in Nevada with Sun Elevation and Azimuth
NAIP imagery (from the National Agriculture Imagery Program of the US Department of Agriculture) constitutes a treasure trove of data that, in my opinion, is underutilized by remote sensing analysts. It is free, covers the entire state, and is updated regularly. So what's not to love? When it comes to image processing and classification, however, there are challenges that people should be aware of before engaging in a major project. One challenge that I've grappled with is how to deal with differential illumination caused by slope and aspect relative to the sun angle at the time of collection. This problem can be remedied with Landsat data because sun elevation and azimuth are included in the metadata. The approach that we've been using in the Great Basin Landscape Ecology Lab involves creating an illumination raster (hillshade) that mimics the sun at the time of collection and then adjusting each band to its predicted brightness using linear regression.
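To make that concrete, here is a rough sketch of one regression-based correction of that general kind. It assumes a DEM and a single band on the same grid; the paths, the use of arcpy's Hillshade tool as the illumination model, and the exact form of the correction are illustrative rather than our lab's actual tool:

# Hedged sketch of regression-based topographic correction: model band
# brightness as a linear function of modeled illumination, then remove
# the illumination-dependent component while preserving the scene mean.
import arcpy
import numpy as np
from arcpy.sa import Hillshade

arcpy.CheckOutExtension("Spatial")

sun_azimuth, sun_elevation = 135.0, 35.0            # from image metadata
illum = Hillshade(r"C:\data\dem10m.tif",            # placeholder DEM path
                  sun_azimuth, sun_elevation)

band = arcpy.RasterToNumPyArray(r"C:\data\band1.tif").astype("float64")
ill = arcpy.RasterToNumPyArray(illum).astype("float64")

# Fit band brightness as a linear function of illumination.
slope, intercept = np.polyfit(ill.ravel(), band.ravel(), 1)

# Flatten the image: subtract the illumination-predicted brightness and
# add back the scene mean so overall brightness is preserved.
corrected = band - (slope * ill + intercept) + band.mean()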
This problem becomes more challenging with NAIP than with Landsat, because NAIP images are flown over many months and at different times throughout the day. The result is that a single NAIP tile can have up to four different sun angles, and these four regions of a single tile can have shadows pointing in four different directions. Obviously this poses a challenge to image classification. Recently I learned that the USDA APFO now includes a seamline file for each state, available for download on the Geospatial Data Gateway - https://gdg.sc.egov.usda.gov/. The approach that I'm now taking for topographic correction with NAIP involves downloading this seamline file, estimating the latitude and longitude of the center of each seamline polygon, and plugging these into some Excel formulas based on a NOAA spreadsheet (http://www.esrl.noaa.gov/gmd/grad/solcalc/calcdetails.html) that calculate sun elevation and azimuth (a Python sketch of the same formulas appears at the end of this post). I did this for the state of Nevada and joined the results back to the original shapefile to produce the maps below.
The above map shows sun elevation for NAIP flights in 2013. A higher sun elevation is good because it minimizes shadowing due to topography and vegetation. Exactly half of the area is estimated to have a sun angle between 20 and 40 degrees, with 13% below 20 degrees, and 37% greater than 40 degrees. Higher sun angles seem to be concentrated in areas north of Las Vegas and south of Tonopah. The map below shows sun azimuth. The vast majority of flights occurred when the sun was in the southeast, presumably because winds tend to be calm during the mornings.
The next steps involve plugging this into some tools that I've developed to topographically correct brightness for each of the four bands and then perform classification. The caveats that come along with this analysis are as follows: 1) we assume the latitude and longitude of the center of each polygon, 2) we average the beginning and end times for each polygon and plug that single time into the sun elevation and azimuth calculation, and 3) we use a 10 meter DEM because a 1 meter DEM is not available for the entire state.
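For anyone who would rather script the sun position step than run it through Excel, here is a hedged Python sketch of the NOAA low-accuracy formulas that the spreadsheet is built on. The example coordinates and time are hypothetical, and results should be spot-checked against the NOAA calculator:

# Hedged sketch of the NOAA solar position formulas (fractional year,
# equation of time, declination, hour angle, elevation, azimuth).
import math
from datetime import datetime

def solar_position(lat, lon, when_utc):
    """Return (elevation, azimuth) in degrees; lon is east-positive."""
    doy = when_utc.timetuple().tm_yday
    hour = when_utc.hour + when_utc.minute / 60.0 + when_utc.second / 3600.0

    # Fractional year in radians.
    g = 2.0 * math.pi / 365.0 * (doy - 1 + (hour - 12.0) / 24.0)

    # Equation of time (minutes) and solar declination (radians).
    eqtime = 229.18 * (0.000075 + 0.001868 * math.cos(g)
                       - 0.032077 * math.sin(g)
                       - 0.014615 * math.cos(2 * g)
                       - 0.040849 * math.sin(2 * g))
    decl = (0.006918 - 0.399912 * math.cos(g) + 0.070257 * math.sin(g)
            - 0.006758 * math.cos(2 * g) + 0.000907 * math.sin(2 * g)
            - 0.002697 * math.cos(3 * g) + 0.00148 * math.sin(3 * g))

    # True solar time (minutes) and hour angle (radians); input is UTC.
    tst = hour * 60.0 + eqtime + 4.0 * lon
    ha = math.radians(tst / 4.0 - 180.0)

    # Solar zenith and elevation.
    latr = math.radians(lat)
    cos_zen = (math.sin(latr) * math.sin(decl)
               + math.cos(latr) * math.cos(decl) * math.cos(ha))
    zen = math.acos(max(-1.0, min(1.0, cos_zen)))
    elevation = 90.0 - math.degrees(zen)

    # Azimuth measured clockwise from north.
    cos_az = ((math.sin(latr) * math.cos(zen) - math.sin(decl))
              / (math.cos(latr) * math.sin(zen)))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    azimuth = (az + 180.0) % 360.0 if ha > 0 else (540.0 - az) % 360.0
    return elevation, azimuth

# Hypothetical example: central Nevada at 10 AM PDT (17:00 UTC).
print(solar_position(39.5, -116.9, datetime(2013, 7, 15, 17, 0)))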