Friday, June 23, 2017

Applications Module 5 - DC Crime Mapping

Module 5 produced two crime analysis maps for the Washington D.C. area.  The first map compared crime locations to police station locations and showed the percentage of crimes each station is responsible for handling.  Based on these factors, two new police stations were recommended, both to reduce the workload on the high-volume stations and to reduce crime in the surrounding areas (since the proposed sites fall within the higher-crime-percentage radius).  Overall, this map showed that D.C.'s highest occurrence of crime is in its central northwestern corridor, but that the presence of a police station does lower the crime rate within a half-mile radius of the station's location.


The second map looked at crime densities in relation to population densities in D.C. city blocks.  It used kernel density analysis, with a search radius of 1,500 map units, for homicides, burglaries, and sex abuse offenses.  This map highlighted the spatial patterns of these three particular offenses.



The major tools used to create these maps were Buffer, Spatial Join, and Display XY Data.  The exercise also revisited creating an address locator and using the Field Calculator, both of which were taught in the Intro course.  The Kernel Density tool and graph creation were new procedures, used to present the data visually and succinctly.
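The buffer-and-spatial-join logic behind the first map can be mimicked outside ArcMap with a simple point-distance check.  This pure-Python sketch (the station names, coordinates, and crime points are all hypothetical, on a planar grid measured in miles) counts the crimes falling within a half-mile of each station, which is essentially what Buffer followed by Spatial Join produces:

```python
import math

def crimes_within(stations, crimes, radius_miles=0.5):
    """Count the crime points within radius_miles of each station (planar coords)."""
    counts = {}
    for name, (sx, sy) in stations.items():
        counts[name] = sum(
            1 for (cx, cy) in crimes
            if math.hypot(cx - sx, cy - sy) <= radius_miles
        )
    return counts

# Hypothetical station and crime coordinates (miles on a local grid).
stations = {"Station A": (0.0, 0.0), "Station B": (2.0, 0.0)}
crimes = [(0.1, 0.2), (0.4, 0.0), (1.9, 0.1), (5.0, 5.0)]

print(crimes_within(stations, crimes))  # {'Station A': 2, 'Station B': 1}
```

Dividing each station's count by the total number of crimes then gives the percentage of crime each station handles, the statistic mapped in the lab.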

Monday, June 19, 2017

Programming Module 5 - Geoprocessing

In this week’s lab, we learned how to create models and how to convert those models into scripts that can run in both ArcMap and PythonWin.  We also worked through the process of creating a toolbox with tools.  Specifically, we created a script containing several tools to eliminate non-prime farmland (as identified in a separate soils layer) from a basin layer.

The flowchart visually steps through the process of creating the layer with the non-prime farmland erased from the basin.  First, in ModelBuilder within ArcMap, I added the basin and soils shapefiles and the Clip tool.  Using the Connect tool, I drew arrows from soils.shp (input features) to the Clip tool and from basin.shp (clip features) to the Clip tool.  After adding the Select tool, I connected the Clip output to the Select tool and used the expression FARMLNDCL = “Not prime farmland” to produce Not_Frmlnd.shp from the clipped soils shapefile.  I then added the Erase tool and connected Not_Frmlnd.shp (erase features) and basin.shp (input features), which created the Erase_Soils shapefile.  After each tool was added, I ran the model to make sure everything had been set up successfully.
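Exported to a stand-alone script, the same clip → select → erase chain looks roughly like the arcpy sketch below.  The workspace path is an assumption, and the Erase tool requires an Advanced license:

```python
import arcpy

arcpy.env.workspace = r"C:\GIS\Module5"  # hypothetical workspace
arcpy.env.overwriteOutput = True

# 1. Clip the soils layer to the basin boundary.
arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_clip.shp")

# 2. Select only the non-prime farmland polygons.
arcpy.Select_analysis("soils_clip.shp", "Not_Frmlnd.shp",
                      "\"FARMLNDCL\" = 'Not prime farmland'")

# 3. Erase those polygons from the basin (Advanced license required).
arcpy.Erase_analysis("basin.shp", "Not_Frmlnd.shp", "Erase_Soils.shp")
```

Each call mirrors one box in the model, which is why running the model after adding each tool is such a useful sanity check.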


Sunday, June 18, 2017

Applications Module 4 - Hurricanes

This week's lab looked at 2012 Hurricane Sandy's storm path and damage analysis.  The hurricane-path map reviewed the concepts of making point features from XY tables and making lines from points.  I enjoyed learning how to make unique symbols from the existing symbol libraries.  We also created labels using a VB script.  When it came to the final map, however, I chose to rearrange and condense the labels, which changed the look of the script-derived labels.

The second, damage-assessment map was a bit more complex.  We had to mosaic imagery and create attribute domains.  The Flicker and Swipe tools were incredibly beneficial for comparing the pre- and post-storm imagery to assess the extent of damage.  Using edit sessions, we located point features and made additions to their attributes.  Since the goal of this map was to display the damage to the study area, two inset maps needed to be created to give spatial reference.


One of my struggles was with the damage assessment table.  I am still not completely positive how to distinguish storm-surge damage from wind damage, nor how to determine the level of inundation damage from the overhead imagery layer, so I found it difficult to complete the assessment table.

Monday, June 12, 2017

Programming Module 4 - Debugging

Part 1 Screen Shot

Part 2 Screen Shot

Part 3 Screen Shot

The above screen shots are evidence that I can, at least superficially, debug Python scripts.  This week’s module focused on finding and handling errors and exceptions, and on how to debug Python.

Part 1 and 2 errors included incorrect capitalization, punctuation, and spelling; incorrect file path names; improper indentation; and poorly constructed parameters.  The check icon, and running the script with the debugger/step tools, helped to find and correct the errors.

Part 3 involved using the try-except method to handle the errors.  This method was confusing to me in the reading and the lecture, but I think that was due to my lack of understanding of the user-interface exchange the method relies on.  The assignment, however, clarified the try-except method's place in the debugging process.  To use it, you first need to identify the portion of the script where the error may be occurring.  Insert try preceding that portion.  Following the portion of the script with the error, insert except Exception, which catches any run-time errors and allows the script to continue.  I assigned the variable e to the exception in order to print it and actually see what it was.
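A minimal sketch of that pattern, with the exception bound to e and printed, looks like this (the failing line here is just an illustrative division by zero, not the script from the lab):

```python
def risky_division(a, b):
    """Return a / b, trapping run-time errors instead of crashing."""
    try:
        result = a / b          # the portion of the script that may fail
    except Exception as e:      # catch the error and let the script continue
        print("Error caught:", e)
        result = None
    return result

print(risky_division(10, 2))  # 5.0
print(risky_division(10, 0))  # prints "Error caught: division by zero", returns None
```

Printing e is what turns a silent crash into a message you can actually act on while debugging.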

Wednesday, June 7, 2017

Applications Module 3 -Tsunami

The overarching design of Module 3 was to examine the risk and evacuation zones associated with a tsunami on Japan’s northeastern coastline.  Specifically, we looked at the impact on Fukushima prefecture and, by association, the impact on and evacuation around the Fukushima-Daiichi Nuclear Power Plant.

The first portion of the lab was a lesson in organizing and creating geodatabases and feature classes, specifically in ArcCatalog.  We had to ensure that all data was in the same projected coordinate system, and had to mosaic two raster datasets with the Mosaic to New Raster and Calculate Statistics tools.  Later, the Multiple Ring Buffer, Clip, and Select by Location tools were used to delineate radiation zones at 3, 7, 15, 30, 40, and 50 miles from the power plant.  Using ModelBuilder, we analyzed the impact a tsunami runup would have on the coastline and the resulting evacuation zones.  This provided a step-by-step introduction to ModelBuilder and how it can simplify and streamline multiple tools, workflows, and goals.
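The multiple-ring-buffer step translates directly to a one-call arcpy sketch (the file names below are assumptions standing in for the lab's actual layers):

```python
import arcpy

# Concentric radiation-zone rings around the plant, in miles.
arcpy.MultipleRingBuffer_analysis(
    "power_plant.shp",        # hypothetical point layer for the plant
    "radiation_zones.shp",    # output feature class
    [3, 7, 15, 30, 40, 50],   # ring distances
    "Miles")
```

Each output ring then feeds the Clip and Select by Location steps that tally what falls inside each zone.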




This map had all the usual “creation” issues: using the art space appropriately for the plethora of required information, visually balancing colors for the different zones/buffers, etc.  But I had two main take-aways from this map's creation.  One involved the use of extent indicators.  I hadn’t needed two extent indicators in any one map before, and learning how to make them visually distinctive (aside from just using labels/text) was a challenge; it forced me to focus on frame styles and colors.  The second take-away involved chart creation.  I waited until I had already imported and worked on my map in Adobe Illustrator to insert the radiation-effects chart.  This proved to be a multifaceted step in Ai, and it resulted in a chart that I am still not entirely satisfied with.  Only after it was created did I check whether ArcMap had easy-to-use chart creation.  Of course it does, and I will be using it in the future!

Programming Module 3 - Python Fundamentals Part 2

We had three separate pieces of code that needed to be analyzed and/or created.  The goal of the assignment was to become aware of, and better understand, scripting errors, comments within scripts, conditional statements, and looping structures.

The first piece of code involved a “game” that used people’s names and randomly generated numbers; we had to find the errors contained within the existing script.  The second piece of code required us to generate a list of 20 randomly picked integers between 0 and 10.

Once I sat down and, essentially, wrote a “paragraph” explaining to myself the process the script would need to follow to complete all the steps (feeding the list back into the random-generation portion to append integers to it), the script became clear to me.  It was very much like shooting in the dark before I actually took the time to get organized!
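That "paragraph" translates into a short loop.  A minimal sketch of the second piece of code, assuming the standard random module (the fixed seed is my addition, just to make the run repeatable):

```python
import random

random.seed(42)  # assumption: fixed seed so the output is repeatable

# Keep feeding the list back into the random-generation step
# until it holds 20 integers.
numbers = []
while len(numbers) < 20:
    numbers.append(random.randint(0, 10))  # inclusive on both ends

print(numbers)
print(len(numbers))  # 20
```

The key realization is that the list itself drives the loop condition, exactly the feedback described above.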

The third piece of code took our randomly generated list of 20 integers and removed every occurrence of a chosen number from it.  This required conditional statements in order to print appropriate messages: either how many times the selected number would be removed, or that it was not found in the list at all.

I was able to get a statement printed if the number was not found in the list, but I was not able to remove the number from the list, so I could not complete the task.  I tried all sorts of different functions and searched through the Python Help, but I knew the solution had to involve something we had already covered, not something I randomly found in Help.
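One way to finish that step using only basics covered earlier (a while loop, the in test, and list.remove) is sketched below; the short list and the target number are hypothetical stand-ins for the 20 random integers:

```python
numbers = [2, 7, 4, 7, 0, 7, 9]  # stand-in for the 20 random integers
target = 7

if target in numbers:
    # Report how many occurrences are about to be removed.
    print("Removing", numbers.count(target), "occurrence(s) of", target)
    while target in numbers:       # remove() deletes one occurrence per call
        numbers.remove(target)
else:
    print(target, "was not found in the list.")

print(numbers)  # [2, 4, 0, 9]
```

The trick is that list.remove only deletes the first match, so it has to sit inside a loop to clear every occurrence.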

 Below is a screen shot of the final output.


Wednesday, May 31, 2017

Applications Module 2 - Lahars

My goal was to identify hazardous drainage areas around Mt. Hood, Oregon, in the event of a volcanic eruption and potential lahar formation.  Drainage-area identification was accomplished through compounding steps with ArcMap's Spatial Analyst toolset.  Of foremost importance was locating Mt. Hood (found with the XY tool) and combining elevation raster data.  The elevation data was fundamental to the Spatial Analyst Hydrology tools, including Fill, Flow Direction, Flow Accumulation, and Stream to Feature.  Analysis of pixel percentages was also needed to determine where flowing water/lahar movement would be most likely to occur.  This required converting floating-point data to integer data (using the Int tool in the Math toolset) so that the attribute table data could be evaluated (with the Con tool).  A ½-mile buffer then needed to be established around the stream locations using the Select by Location and Buffer tools.  After these areas were identified, we tallied the lahar's effect on the population and on the schools located within that ½-mile buffer.
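The Con-style step, flagging only the highest-accumulation pixels as stream channels, can be sketched in plain Python.  The tiny accumulation grid and the cutoff value below are assumptions standing in for the lab's real raster and pixel-percentage threshold:

```python
# Toy flow-accumulation grid (each value = count of upstream cells).
flow_acc = [
    [0.0, 1.0, 3.0, 0.0],
    [2.0, 5.0, 60.0, 1.0],
    [0.0, 4.0, 85.0, 2.0],
    [1.0, 0.0, 90.0, 3.0],
]

threshold = 50  # hypothetical cutoff; the lab derives it from pixel percentages

# Float-to-int conversion followed by a Con-style reclassification:
# 1 where accumulation exceeds the cutoff, else 0.
streams = [[1 if int(v) > threshold else 0 for v in row] for row in flow_acc]

for row in streams:
    print(row)
# Only the high-accumulation column is flagged as a stream channel.
```

The resulting 1-cells are the raster equivalent of the stream features that get buffered by ½ mile in the next step.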
The most important lesson I learned was to label geodatabase files properly, and to fully understand their creation at the time of creation, in order to minimize confusion in the long term.  A challenge I decided to overcome with the help of Adobe Illustrator (although not prescribed in the directions, I found Ai incredibly helpful and fun last semester and thought it would be good to continue using those skills) was the placement of all the city/county/stream labels.

Below is the final product with a summary of findings.