With this blog I intend to share GIS, remote sensing, and spatial analysis tips, experiences, and techniques with others. Most of my work is in the field of Landscape Ecology, so there is a focus on ecological applications. Postings include tips and suggestions for data processing and day-to-day GIS tasks, links to my GIS tools and approaches, and links to scientific papers that I've been involved in.
Monday, December 21, 2015
Geocode by Awesome
When was the last time someone sent you a long list of place names and asked you to make a map out of it? If you're like me, your first thought is probably "Oh really? You couldn't have given me a lat/long?" Geocoding in ArcGIS is a pain because you need a complete reference geodatabase, which might mean every river, stream, lake, street, and lamppost in the USA! Type each address into Google Maps? No way. I'd like to get out of my office chair before the end of 2016. BatchGeo? It makes great maps, but you can't export the results. Enter the Geocode by Awesome Table add-on for Google Sheets. Finally, a solution that lives up to its name!
Friday, December 11, 2015
How to e-mail a shapefile
Most GIS professionals are familiar with e-mailing shapefiles; we've been doing it for a while. However, for folks just starting out in ArcMap, the process of e-mailing a shapefile may be a little mystifying. Nonetheless, sharing your data is one of the most valuable GIS skills. Recently I answered this question for a colleague.
Here is what I wrote:
"It depends on the kind of information that she needs/wants. If he/she wants both the geographic and attribute data then the best way is to take all of the files associated with the shapefile and put them in a zip file together. At a minimum you'd need the .shp, .shx, .dbf, but often there is .proj, .shp.xml, .sbn, and even .cpg. You'll need to use My Computer/Windows Explorer to view all of these files. ArcGIS treats them as if they are one. The zip file keeps all files together. It isn't a necessary step, but it is often easier than sending all of the files as individual attachments.
If she/he is just interested in the attribute data or doesn't have ArcGIS then there are two ways to go.
1. In the attribute table select the rows by highlighting them on the left and then copy and paste into Excel. Lately I've been finding this method to not work in 10.3, however.
2. Open Excel. Open the .dbf file. Copy and paste all of the contents into a new Excel file without altering the original .dbf (or it can corrupt the shapefile).
Finally, don't forget to check the size. Anything larger than about 27 Mb probably won't e-mail, although file size limits are specific to the e-mail service used."
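If you'd rather script the zip step, here is a minimal Python sketch; the shapefile path is hypothetical, but the pattern works for any shapefile and its sidecar files.

import glob
import os
import zipfile

shp = r"C:\data\parcels.shp"  # hypothetical shapefile
base, _ = os.path.splitext(shp)

# Collect every file sharing the shapefile's base name (.shp, .shx,
# .dbf, .prj, .shp.xml, .sbn, .cpg, ...) and zip them together
with zipfile.ZipFile(base + ".zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for sidecar in glob.glob(base + ".*"):
        if not sidecar.endswith(".zip"):
            zf.write(sidecar, arcname=os.path.basename(sidecar))

The glob pattern also catches compound extensions like .shp.xml, so nothing gets left behind.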
Wednesday, December 9, 2015
Paper is now a pre-print
Our paper "Multi-scale connectivity and graph theory highlight critical areas for conservation under climate change" in Ecological Applications is now a pre-print - http://www.esajournals.org/toc/ecap/0/0 .
Thursday, November 5, 2015
Paper accepted in Ecological Applications
My paper entitled "Multi-scale connectivity and graph theory highlight critical areas for conservation under climate change" has been accepted in Ecological Applications pending a few minor revisions! The general idea of this paper is to combine measures of habitat connectivity at the rangewide, metapopulation, and local scales to inform conservation decisions. We use a combination of graph theory, Circuitscape, and least-cost paths to analyze the effect of different renewable energy development scenarios and climate change scenarios on habitat connectivity of the Mohave ground squirrel. This paper follows from our earlier effort to map habitat suitability for the Mohave ground squirrel:
Inman, R. D., Esque, T. C., Nussear, K. E., Leitner, P., Matocq, M. D., Weisberg, P. J., ... & Vandergast, A. G. (2013). Is there room for all of us? Renewable energy and Xerospermophilus mohavensis. Endangered Species Research, 20(1), 1-18.
as well as our earlier report to the California Energy Commission, "Habitat Modeling, Landscape Genetics, and Habitat Connectivity for the Mohave Ground Squirrel to Guide Renewable Energy Development," which included an analysis of climate change effects on habitat availability, landscape genetics, and a more limited set of renewable energy development scenarios. The report is downloadable HERE.
In addition to demonstrating a multi-scale approach for assessing habitat connectivity and informing conservation decisions, the paper also presents a novel methodological contribution that makes graph theory operational for species with continuously distributed habitat.
Illustration by M. A. Walden
Friday, October 23, 2015
New dataset - latitude
Both the Climatic Water Deficit Toolbox and the Day Length Toolbox rely on a latitude raster for some of their calculations. I've had many people ask me where they can get one. Now there's an answer: if you're working in the western US (WA, OR, CA, NV, UT, NM, AZ, WY, CO, MT, ID, ND, SD, and TX), you can download a raster from our lab's webpage HERE.
These data are at 1200 m cell size and are in a UTM Zone 12 NAD83 projection.
If you are working outside of this area, you can still create your own from lines of latitude. First, remove the lines of longitude. Then convert the line vertices to points, taking care to ensure the points are well distributed across your study area. Finally, a second-order trend analysis in Spatial Analyst should do the trick.
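Here is a minimal arcpy sketch of that workflow, assuming a polyline feature class of latitude lines with a "LAT" field holding each line's latitude; the workspace, layer, and field names are all hypothetical.

import arcpy
from arcpy.sa import Trend

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\latitude.gdb"  # hypothetical workspace

# Convert the latitude-line vertices to points so the latitude
# values are spread across the study area
arcpy.FeatureVerticesToPoints_management("graticule_lat", "lat_points", "ALL")

# Fit a second-order polynomial trend surface to the point latitudes
lat_raster = Trend("lat_points", "LAT", cell_size=1200, order=2)
lat_raster.save("latitude_1200m")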
Tuesday, October 20, 2015
ArcGIS Idea - Quickly graph rasters
Check out my ArcGIS Idea for making raster time series easier to visualize.
Friday, October 9, 2015
New tool - Generate points around points (aka generate background points or available points)
Habitat modeling techniques, such as conditional logistic regression, require a random sample of background points to quantify available habitat. One common assumption of resource selection functions is that animals have an equal probability of using background points as occurrence points. For animals that move across the landscape, this assumption is often violated if the entire movement corridor is treated as background. This tool alleviates the concern by creating a local sample of background points around known occurrence points.
There are two versions of this tool. The large version is recommended for most applications because it tiles the data, resulting in faster processing times. It does this by using a parameter specifying the number of features per tile and then splitting the dataset accordingly; the smaller tiles run much faster, and the results are appended continuously to an empty output shapefile. If there is a power outage or some other interruption, you can go back and determine how much of the dataset, and which points, were processed. A sketch of the core idea follows.
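For readers who want the gist without downloading the tool, here is a hedged arcpy sketch of the core idea (not the tool's actual code); the feature class names, the 5 km radius, and the 10 points per buffer are all hypothetical.

import arcpy

arcpy.env.workspace = r"C:\data\habitat.gdb"  # hypothetical workspace

# Buffer each occurrence point to define locally available habitat
arcpy.Buffer_analysis("occ_points", "occ_buffers", "5 Kilometers")

# Draw random background points inside each buffer (10 per buffer here)
arcpy.CreateRandomPoints_management(arcpy.env.workspace, "background_points",
                                    "occ_buffers", "", 10)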
Monday, October 5, 2015
Came across a way to pull out one band from a multiband raster in ModelBuilder
Sometimes it is nice to have a quick and simple way of pulling single bands out of a multi-band image stack without resorting to a Python script. This Stack Exchange post shows how to do it in a nice, easy, and straightforward manner in ArcGIS ModelBuilder. Kudos to xyz for coming up with the answer. I recently used this approach to perform principal component analysis and then continue working with the raster downstream.
In this example the Parse Path tool is used to pull out the path and file name. Band_1 is then appended to the output name in the Raster Calculator using the following expression: "%Path%\%File%\Band_1"*1
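For comparison, the same trick works outside ModelBuilder too: ArcGIS lets you address a single band of a multiband raster as a sub-path. A minimal arcpy sketch, with hypothetical paths:

import os
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")

stack = r"C:\data\pca_stack.tif"  # hypothetical multiband raster

# Address the first band directly as <raster>\Band_1
band1 = Raster(os.path.join(stack, "Band_1"))

# Multiplying by 1, as in the ModelBuilder expression, forces a new output
out = band1 * 1
out.save(r"C:\data\pca_band1.tif")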
Saturday, September 19, 2015
Workaround for issues with the Climatic Water Deficit Toolbox yearly normals
I've encountered some issues with the yearly normals toolset in the Climatic Water Deficit.tbx. It worked well for us a year and a half ago when we did the analysis for Dilts et al. (2015, Journal of Biogeography, doi:10.1111/jbi.12561). Since then, however, I've been running into problems when helping others. I think it has a lot to do with the fact that the tool still relies heavily on the old arcgisscripting module (gp) instead of arcpy. I plan to convert the whole thing to arcpy.
In the meantime, what I recommend is using the time series toolset and tricking the tool into thinking it is processing a time series. This approach has worked really well in a number of cases. Here is how to accomplish it (a scripted sketch of steps 1 and 2 follows the list):
1. Rename your predictor variables from prcp_01.img to prcp_2000_01.img for all months and all three PRISM variable types (tmax, tmin, prcp).
2. Make a copy of each raster so that you also have a prcp_2001_01.img.
3. Run the all-in-one model, or each step separately, in the time series toolset of the Climatic Water Deficit toolbox.
4. Once the tool is done running delete all variables that have 2000 in the name. This first year is a dummy year in which soil water is set to 0 in January.
5. Rename your files from prcp_2001_01.img back to prcp_01.img so that you don't get confused and think that they correspond to a specific year.
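As promised above, here is a minimal arcpy sketch of steps 1 and 2; arcpy's Rename and Copy tools keep a raster's sidecar files together, which plain file renaming would not. The folder and the prcp/tmax/tmin naming pattern are assumptions based on the PRISM file names used above.

import arcpy

arcpy.env.workspace = r"C:\data\prism"  # hypothetical folder of PRISM normals

for var in ("prcp", "tmax", "tmin"):
    for month in range(1, 13):
        normal = "{0}_{1:02d}.img".format(var, month)
        dummy = "{0}_2000_{1:02d}.img".format(var, month)  # dummy spin-up year
        real = "{0}_2001_{1:02d}.img".format(var, month)   # year the tool reports

        # Step 1: give the normals a dummy year (2000)
        arcpy.Rename_management(normal, dummy)

        # Step 2: duplicate each raster as year 2001
        arcpy.Copy_management(dummy, real)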
I hate workarounds, but this has worked out really well for me. It should do the trick until I finish recoding the yearly normals toolset.