Thursday, August 10, 2017

Applications: Final

Recently, I have been wanting to go on an overnight, camping/backpacking trip.  As I live in lower Alabama near the Alabama/Florida state line, I have spent time enjoying the woods of the Florida panhandle, specifically the Blackwater River State Forest in Santa Rosa County, Florida.  There I have seen signs for the Florida National Scenic Trail (FNST).  As this trail is local, I have decided to do my camping excursion on the FNST that is contained in Santa Rosa County.  I only wish to spend a night or two on the trail, so I want to make these campsites as ideal as possible.  Qualities for my ideal campsite include:

1.  Proximity to the trail:  I would prefer to stay within half a mile of the trail, but obviously, the closer to the trail the better.
2.  Located in state forest/land:  The trail does pass near private property, and I would obviously not wish to trespass.
3.  Not located in a city:  Backpacking is about enjoying nature, so I do not want to camp within one mile of a city.
4.  Location near a stream or pond (i.e., a swimming hole):  I would like to rinse off after a long day of hiking.  The campsite does not have to be directly on the water (I am willing to walk a little ways from the campsite to the swimming hole).
5.  Located on dry ground:  Although I want to be near water, I do not want to be sleeping in a wetland.  A 50 foot buffer around a wetland should be sufficient to stay dry.
6.  Located in a tree dense area:  I use a hammock to sleep in so I will need trees to hang up my hammock.

After acquiring the appropriate data, I had to analyze each of my campsite prerequisites.  My ultimate goal was to compare all 6 campsite criteria at the same time, so I knew that I was going to create a weighted overlay model.  For this model, I would need each criterion in raster form, with its attributes assigned an indexed value.  Before converting any of my feature layers whose attributes could not be directly reclassified into rasters, I added a “Value” column to their attribute tables and assigned a value between 1 and 9, with 1 being the worst and 9 being the best.  I could then create the model and have values to compare and/or adjust to select the most ideal campsites.  The following slides outline the analysis methods used on each criterion.
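As a toy illustration of this 1-to-9 indexing scheme (the category names below are my own stand-ins, not the actual parcel attributes), assigning a “Value” column might look like:

```python
# Hypothetical sketch of assigning suitability index values (1 = worst,
# 9 = best) to feature attributes before converting them to rasters.

def assign_value(parcel_use):
    """Return a suitability index for a made-up parcel land-use category."""
    index = {
        "state_park": 9,    # ideal: public recreation land
        "state_timber": 5,  # acceptable: state timberland
    }
    return index.get(parcel_use, 1)  # everything else is least suitable

parcels = ["state_park", "private", "state_timber"]
values = [assign_value(p) for p in parcels]
# values == [9, 1, 5]
```

In the real workflow these values live in the attribute table, but the lookup logic is the same idea.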
First, I needed to identify where the FNST actually was in Santa Rosa County.  From the Florida trail layer, I selected all portions of the FNST that were contained in Santa Rosa County.  Then I performed a Euclidean distance analysis for half a mile around the FNST and reclassified the distances to simplify the values (giving a higher value to the distances closest to the trail).
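Outside of ArcMap, the distance-then-reclassify step can be sketched in plain Python on a toy grid (the cell size and class breaks here are invented for illustration, not the values I actually used):

```python
import math

# Toy stand-in for Euclidean distance + reclassify: compute each cell's
# distance from a "trail" cell, then map distances to index values so the
# cells nearest the trail score highest.

CELL = 0.1        # assumed miles per grid cell
trail = (2, 2)    # grid location of the trail

def distance(cell):
    return CELL * math.hypot(cell[0] - trail[0], cell[1] - trail[1])

def reclassify(d):
    if d <= 0.1:
        return 9   # right on the trail
    elif d <= 0.25:
        return 5
    elif d <= 0.5:
        return 3
    return 1       # beyond half a mile: least desirable

grid = [[reclassify(distance((r, c))) for c in range(5)] for r in range(5)]
# grid[2][2] == 9  (the trail cell itself)
```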
To ascertain state land status, I used a county parcel layer.  I selected the county parcels that are earmarked by the state for parks/recreational and timber usage. I gave park parcels a value of 9, state timberland parcels a value of 5, and all other parcels a value of 1.
Since I know that I will not be seeking any campsites farther than two miles from the trail (even that is a bit generous), and to help narrow down any future analysis, I confined my searches to the county parcels that were within a 2-mile radius of the FNST.  I selected these parcels, exported them as a feature layer, and then converted them into a raster.  I used the 2-mile parcel feature layer as the boundary for all my subsequent analysis.
After identifying all the cities in Santa Rosa County, I placed a 1-mile buffer around the cities and gave all buffers a value of 1.  By merging the city buffer layer with the county layer, I had something to compare the buffer value to.  Any of the county area outside of the buffer zones received a value of 9.  I then converted that feature to a raster and extracted by mask to the 2-mile parcel feature.

Next, I wanted to look at swimming hole proximity.  After clipping county water bodies to the 2-mile parcel layer, I performed a Euclidean distance analysis for 2 miles around the water bodies.  I then reclassified the distances to simplify the distance values (making sure that the higher values were assigned to the distances closest to the actual water bodies).

To establish a campsite on dry ground, I looked for wetlands in the county.  From the wetland layer, I placed a 50-foot buffer around the wetland areas, merged it with the county layer, gave the wetland buffer a value of 1 and the county a value of 9, converted the result to a raster, and extracted to the 2-mile parcel feature.

Finally, landcover analysis would help me locate wooded areas for hanging my hammock.  I extracted by mask the Florida landcover raster to Santa Rosa County.  I did not reclassify, but simply added another value field to the attribute table and manually assigned values to the different land classes.  Anything with water, development, or agriculture received a 1; forested areas received a 9; and anything in between received a 5.
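The manual landcover valuation at the end of that workflow boils down to a simple lookup. A minimal sketch, using made-up class names rather than the actual Florida landcover codes:

```python
# Hypothetical sketch of the manual landcover valuation: water, development,
# and agriculture get 1; forest gets 9; everything in between gets 5.
def landcover_value(land_class):
    if land_class in ("water", "developed", "agriculture"):
        return 1   # no trees, or unusable for a hammock campsite
    if land_class == "forest":
        return 9   # ideal: plenty of trees for the hammock
    return 5       # everything in between (shrub, grassland, etc.)

# e.g. landcover_value("forest") returns 9, landcover_value("shrub") returns 5
```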
I was then ready to create a weighted overlay model that used all the raster layers as inputs.  A model let me perform the weighted analysis numerous times in order to fine-tune the percentages for each input.  I ultimately gave proximity to the trail 10% weight (I knew that I could walk farther from the trail if necessary); 15% weight to proximity to a swimming hole (this too was not absolutely essential); 15% weight to being on dry ground (I will not actually be sleeping on the ground, so I knew I could sacrifice this requirement); 30% weight to being on State property (this was vital, and I actually restricted the value associated with being in a parcel not owned by the state); 15% weight to being outside of a city buffer; and 15% to hammock placement.
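The arithmetic behind a weighted overlay is just a weighted sum of each cell's index values. A minimal sketch using the percentages chosen above (the dictionary keys are my own shorthand for the six criteria):

```python
# Weighted-overlay sketch: each criterion contributes its index value (1-9)
# times its percentage weight; the weights are the ones described above.
weights = {
    "trail": 0.10, "swim": 0.15, "dry": 0.15,
    "state": 0.30, "city": 0.15, "trees": 0.15,
}

def overlay_score(cell):
    """cell: dict mapping criterion name -> index value (1-9)."""
    return sum(weights[k] * cell[k] for k in weights)

# A cell scoring 9 on every criterion gets the maximum score of 9.
best = overlay_score({k: 9 for k in weights})
```

In the real tool, a "restricted" value (like non-state parcels) excludes the cell from the result entirely rather than merely lowering its score.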
From the results of the weighted analysis, the ideal campsite areas are all found within the Blackwater River State Forest.  I pinpointed 4 distinct locations for a campsite, all in the northeastern section of Santa Rosa County.  They are spread around the State Forest so that I have options for hiking whatever distance I would like per day.  I have also included their coordinates so that I can easily locate them using a GPS.  If those exact locations turn out to be inaccessible, they are surrounded by other desirable areas that I could search as well.

The weighted overlay dramatically decreases the area where I would need to look for a campsite.  I would like to incorporate a base map/satellite imagery map with the weighted overlay results so that I could have better reference points for finding the ideal camping areas.  If I redid the assignment, I would probably incorporate some of the side trails off of the FNST, just to know what my options for getting off the trail would be.  All in all, I am satisfied that when the time comes, I will be able to find the perfect camping sites for my backpacking adventure on the Florida National Scenic Trail.
To look at my presentation, please follow this link

Wednesday, August 9, 2017

Python Module 11: Sharing Tools

Module 11 dealt with sharing Python tools and scripts.  Fundamentally, it's about staying organized and properly referenced.  Below are screenshots of the parameter dialog box and the subsequent randomly generated point layer.

Here is a Flow Chart for updating the script tool for ArcMap users:

Coming into this course, Python was very much like a venomous snake to me.  Now I consider it more like a garden-variety snake, and I am willing to get a little closer to it… but I still consider it a snake that needs to be treated with respect and vigilance.  Here are my main takeaways from GIS4102 GIS Programming:
1.      Python was very confusing until I started viewing it like a math problem.  Once I began really thinking in terms of variables and expressions, just as with calculus or algebra, Python became a little less frustrating.
2.      Punctuation and spelling are critical.  Most of my errors or failures-to-run came from a misspelled word.

3.      Tools within ArcMap now make a little more sense because I can see “behind the scenes” and am a little more willing to explore that environment.

Tuesday, August 1, 2017

Python Module 10: Creating Custom Tools

Module 10 took an existing Python stand-alone script and turned it into a custom script tool in ArcMap.  The script we worked with simultaneously clipped multiple layers to a single feature layer.  The steps to create this script tool are as follows:
1.      From ArcMap, create a toolbox in an appropriate folder and add a new script.  Select the existing stand-alone script as the “script file” and select “store relative path names.”
2.      From the parameter properties tab for the script, enter the display name and data type for all the needed parameters. 
3.      Adjust the values for all the parameter properties as necessary (i.e., choosing “yes” for multivalue).  The script tool window will look like this:

4.      From Python, replace any variable file paths with arcpy.GetParameter().  The function's argument is the index of the corresponding parameter's row from Step 2.
5.      Use the str() function on any variables that concatenate the GetParameter() outputs with other strings.
6.      To report the status of the script in the results window, add arcpy.AddMessage() statements (replacing any print statements in the stand-alone script).  Make sure that the statement you want printed is contained entirely within the function's argument.  The final results window will look like this:
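The steps above can be sketched outside of ArcMap. One common pattern (the file names below are hypothetical) is that a multivalue parameter arrives as a single semicolon-delimited string, which the script splits before looping:

```python
# Toy stand-in for the script tool's parameter handling.  In the real tool,
# raw_layers would come from arcpy.GetParameter()/GetParameterAsText(), and
# the messages would go through arcpy.AddMessage() instead of a plain list.
raw_layers = "roads.shp;rivers.shp;cities.shp"  # hypothetical multivalue input
layers = raw_layers.split(";")

messages = []
for layer in layers:
    # str() guards the concatenation when the parameter is not already a string
    messages.append("Clipping " + str(layer))
# messages[0] == "Clipping roads.shp"
```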

In order to share the script with others, we needed to make a zip file by compressing both the script and the script’s toolbox.  

Wednesday, July 26, 2017

Python Module 9: Working with Rasters

This module’s script involved creating a final raster image which meets the following criteria:  1) forest landcover of classification 41, 42, or 43; 2) a slope between 5 and 20 degrees; and 3) an aspect between 150 and 270 degrees.  This required reclassifying raster imagery, assigning variables to slope and aspect function outputs, and using map algebra operators.  The most important thing we needed to do was make sure that the Spatial Analyst extension was enabled and checked out at the start of the script, then checked back in at the end.  I did have some issues when it came to saving the outputs, though.  After producing the reclassified landcover raster, I encountered a “file already exists” error whenever I ran the script again.  I had previously created a new .gdb within the Results folder to store the script’s outputs.  I realized that this .gdb wasn’t authorized to handle any overwrites, since I had set the workspace to Elevation.gdb and only that geodatabase had overwriting enabled.  As a fix, I deleted my new .gdb and saved all my outputs to Elevation.gdb so that they could be overwritten.
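The map-algebra step boils down to a per-cell boolean test. A pure-Python stand-in (the sample cell values are invented) for the three criteria:

```python
# Toy stand-in for the map-algebra combination: a cell passes only if its
# landcover is class 41, 42, or 43, its slope is 5-20 degrees, and its
# aspect is 150-270 degrees.
def suitable(landcover, slope, aspect):
    return (landcover in (41, 42, 43)
            and 5 <= slope <= 20
            and 150 <= aspect <= 270)

# Hypothetical (landcover, slope, aspect) cells:
cells = [(41, 10, 200), (42, 25, 200), (90, 10, 200)]
mask = [suitable(*c) for c in cells]
# mask == [True, False, False]  (too steep; wrong landcover)
```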

Below is my final raster image, the “update” messages that my script printed while it was running, and a flowchart representing how the script works.

Saturday, July 22, 2017

Applications Module 9: Local Government

The initial assignment involved creating a map packet and report detailing the acreage and zoning assessment for parcels adjacent to the client’s (Mr. Zuko’s) property.  The map packet is referred to as Data Driven Pages in ArcMap, and presents a series of pages that index over individual sections of a single map.  I started the project by obtaining parcel and zoning data from the Marion County Property Appraiser.  Then I joined this information to the parcel layer in ArcMap and selected all the parcels that were within a quarter-mile radius of Mr. Zuko’s property.  With the updated and selected parcels, I began the process of creating data driven pages.
First, I set up the grid framework with the Grid Index Features tool (map scale of 1:24,000 to accommodate close-up analysis).  After making another layer by selecting the zones that intersect the grid, I activated Data Driven Pages from the toolbar and used the grid index as the extent for the frame.  Next, I needed to create a locator map.  The goal was to create a gray mask over the index grid and a separate, hollow mask the size of an individual grid cell, so that when the pages are scrolled through, the hollow grid “moves” around the gray mask to reflect which section of the map the current page is showing.  After making two new copies of the grid index layer, I applied the appropriate (aforementioned) symbology to both layers.  The difference between the two layers is found in the page definition query properties: for the stationary, gray index layer, I selected “Don’t match”; for the movable, hollow, single-grid layer, “Match” was selected.  The last step was to export the map as a PDF and create a parcel report from the attribute table of the quarter-mile parcel layer.

The second assignment involved editing land parcels in Gulf County to find appropriate locations for a new county Extension office.  The criterion for an appropriate location was that it had to be greater than or equal to 20 acres of vacant, county-owned land.  I needed to select and merge two parcels into one and edit the resulting attribute table.  From this newly merged parcel, I then had to cut out a portion of it into another section.  This involved the editing toolbar (straight segment and cut polygon tools) and the feature construction toolbar (constrain parallel tool).  Next began the process of narrowing down the parcels to meet the above criterion.  First, I needed to select by attribute for Gulf County-owned property.  Then I had to calculate the geometry in a new field for the acreage of these parcels.  Then, by building a definition query for greater than or equal to 20 acres and joining a VICD (vacant-improved code) table to the selected parcels, I could identify which parcels met the requirements for a new Extension office.  Using the create report tool mentioned earlier, I had a deliverable report to give the county.
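The definition query and table join amount to a filter over parcel records. A minimal sketch with invented records (owner, acres, and a vacant/improved code, where "V" stands for vacant):

```python
# Hypothetical parcel records: (owner, acres, vacant-improved code)
parcels = [
    ("Gulf County", 25.0, "V"),
    ("Gulf County", 12.5, "V"),   # too small
    ("Private",     40.0, "V"),   # not county-owned
    ("Gulf County", 30.0, "I"),   # improved, not vacant
]

# Definition-query equivalent: county-owned, vacant, and >= 20 acres.
candidates = [p for p in parcels
              if p[0] == "Gulf County" and p[2] == "V" and p[1] >= 20.0]
# candidates == [("Gulf County", 25.0, "V")]
```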

Tuesday, July 18, 2017

Python Module 8: Working with Geometries

The objective of this module was to explore feature geometries by reading them from an existing polyline shapefile (river.shp) and writing the actual attributes to a text document.  The script required a search cursor to pull in the ID#, Shape, and Name for each river geometry.  For loops and the getPart() method were necessary to extract the actual X,Y values from the Shape field.  The following flowchart outlines the process, and the screenshot shows the final text file result.
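A pure-Python stand-in for that cursor loop (the row data below is invented, and plain nested lists stand in for the geometry object and its getPart() parts):

```python
# Toy stand-in for the search-cursor loop: each "row" carries an ID, a
# geometry made of parts (lists of (x, y) vertices), and a name; nested
# loops write every vertex out as a line of text.
rows = [
    (1, [[(0.0, 0.0), (1.0, 2.0)]], "Toy River"),
]

lines = []
for oid, shape, name in rows:
    for part in shape:                 # getPart() equivalent
        for x, y in part:              # each vertex in the part
            lines.append("{} {} {} {}".format(oid, x, y, name))
# lines[0] == "1 0.0 0.0 Toy River"
```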

This was the first lesson where I actually understood (partially, anyway) how rows, and looping through them, are handled.  Having a geometry object with multiple parts pulled into the program and then dissected helped the iteration through the rows make sense.

Sunday, July 16, 2017

Applications Module 8: Location Decisions

The assignment involved analysis of Alachua County, FL, to find a suitable home for a couple moving to the area.  The couple stipulated certain requirements for their housing needs.  The first criterion was that it had to be located near both places of work, the University of Florida and the North Florida Regional Medical Center.  This required the creation of individual maps with Euclidean mileage zones surrounding both work sites (using the Euclidean Distance tool).  To simplify the zone categories, we reclassified the values to a numerical ranking (Spatial Analyst Reclassify tool).
Another criterion was that the couple wanted to be around other homeowners (as opposed to renters) and 40- to 49-year-olds.  This required normalizing both values to the overall population.  We would need these datasets later in the assignment to be in the same format as the distance raster zones, so we used the Feature to Raster tool on the normalized values and then again reclassified the values.  The final output was all 4 criteria individually displayed in their own maps.  I could not resolve the exporting/printer setting issues, so I had to create 4 different maps, export them individually into Adobe Illustrator, and compile them there into one final product.
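The normalization step is just dividing each raw count by the total population so differently sized areas can be compared. A minimal sketch with invented census numbers:

```python
# Normalization sketch: raw counts divided by total population, so tracts
# of different sizes become comparable.  All numbers are made up.
tracts = [
    {"owners": 300, "age_40_49": 120, "pop": 1000},
    {"owners": 450, "age_40_49": 90,  "pop": 1500},
]

for t in tracts:
    t["owner_rate"] = t["owners"] / t["pop"]
    t["age_rate"] = t["age_40_49"] / t["pop"]
# tracts[0]["owner_rate"] == 0.3  (30% of the tract are homeowners)
```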

The second map would have shown the result of assigning more importance to one criterion than another, using the Weighted Overlay tool.  Since the couple wanted to focus on locating closer to their workplaces, restricting the scale values of the three least favorable values of both distance files would produce a weighted result.  I was unable to get the model to run, however.  My error message said that the cell size was not set.  Cell size appeared to be set to 300 in all the reclassified input features and in the model itself, so I must be overlooking something.

The newly learned Reclassify and Weighted Overlay tools allow different features to be quickly analyzed and displayed with different levels of importance.