Wednesday, June 21, 2017

Mod 6: Geoprocessing in Python



In this week's assignment we learned how to automate geoprocessing tools using Python scripting. We had to write a script that performs three geoprocessing functions: add XY coordinates, create a buffer, and dissolve the buffer into a single feature.

The steps I took to create this script are as follows:

  • Import arcpy and env
  • Set the workspace
  • Enable overwrite output
  • Add XY coordinates to the hospitals.shp layer using the AddXY_management tool
  • Get messages
  • Run the Buffer_analysis tool on the hospitals.shp layer at a 1000-meter distance
  • Get messages
  • Run the Dissolve tool on the buffer output
  • Get messages
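The steps above can be sketched roughly as follows. This is a minimal sketch, not the graded script: the workspace path and output file names are assumptions, and it requires an ArcGIS (arcpy) installation to run.

```python
# Rough sketch of the Module 6 script (workspace path and output names are assumed)
import arcpy
from arcpy import env

env.workspace = r"C:\GIS\Module6\Data"  # assumed workspace path
env.overwriteOutput = True

# 1) Add XY coordinates to the hospitals layer
arcpy.AddXY_management("hospitals.shp")
print(arcpy.GetMessages())

# 2) Buffer the hospitals at 1000 meters
arcpy.Buffer_analysis("hospitals.shp", "hospitals_buffer.shp", "1000 Meters")
print(arcpy.GetMessages())

# 3) Dissolve the buffers into a single feature
arcpy.Dissolve_management("hospitals_buffer.shp", "hospitals_dissolve.shp")
print(arcpy.GetMessages())
```

Calling GetMessages() after each tool, as in the step list, prints the tool's start time, warnings, and completion status to the PythonWin output window.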

Below is a screenshot of the results in PythonWin.




Module 5: HLS DC Crime Mapping

This week was our first exploration into Homeland Security, looking at crime mapping in Washington, DC. Specifically, we investigated crime patterns in proximity to police stations and recommended locations for new police substations.

The first task in this lab was to geocode our police stations using data from an Excel file. Using a second Excel file containing our crime data, we plotted the crimes by their XY coordinates. We then graphed total crime in DC, as seen in the graph on the map.
Here is where the analysis begins! 
To determine where to propose new substations, we needed to calculate the proximity of crime to the police stations. To do this, we used the Multiple Ring Buffer tool to create buffers at fixed distances of 0.5, 1, and 2 miles around the police stations. We then performed a spatial join to connect each crime to its nearest police station, and a second spatial join to find the percentage of crimes associated with each station. These percentages were depicted using graduated symbols. The last task was to choose locations for proposed police substations based on the analysis. The locations I chose were in close proximity to the two stations with the highest percentages of crime.
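The proximity steps could be scripted rather than run from the toolbox. A minimal sketch, assuming layer names like "police_stations" and "crimes" and a geodatabase path that are not from the lab itself:

```python
# Sketch of the proximity analysis (workspace and layer names are assumptions)
import arcpy

arcpy.env.workspace = r"C:\GIS\Module5\DC_Crime.gdb"  # assumed
arcpy.env.overwriteOutput = True

# Rings of 0.5, 1, and 2 miles around each police station
arcpy.MultipleRingBuffer_analysis(
    "police_stations", "station_buffers", [0.5, 1, 2], "Miles", "", "ALL")

# Join each crime point to its nearest police station
arcpy.SpatialJoin_analysis(
    "crimes", "police_stations", "crimes_joined",
    join_operation="JOIN_ONE_TO_ONE",
    match_option="CLOSEST")
```

The CLOSEST match option is what assigns each crime to its nearest station; the join counts per station can then be turned into percentages of total crime.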

The second part of this lab was to create three maps depicting the density of burglaries, homicides, and sex abuse crimes in Washington, DC.


Using the Kernel Density tool, we calculated the magnitude per unit area of each of the three offenses within a 1,500-meter search radius. These densities are overlaid on the population density per square mile.
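The three density surfaces follow the same pattern, so they can be produced in a loop. A sketch under assumed layer names (the point layers and the 1,500-unit search radius are illustrative; Kernel Density requires the Spatial Analyst extension):

```python
# Sketch of the three density surfaces (layer names and radius are assumptions)
import arcpy
from arcpy.sa import KernelDensity

arcpy.CheckOutExtension("Spatial")  # Kernel Density needs Spatial Analyst
arcpy.env.workspace = r"C:\GIS\Module5\DC_Crime.gdb"  # assumed
arcpy.env.overwriteOutput = True

for offense in ("burglary", "homicide", "sex_abuse"):
    # "NONE" means each point counts once (no population field)
    density = KernelDensity(offense + "_points", "NONE", search_radius=1500)
    density.save(offense + "_density")
```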


Monday, June 19, 2017

Module 5: Geoprocessing in ArcGIS

This week had us learning how to create a new toolbox, a model, and a geoprocessing script and script tool. We also learned how to export a model as a script.

The first part of the assignment was to create a model in ModelBuilder that performed the following tasks:

  • Clips all soils to the extent of the basin polygon.
  • Selects all soils that are classified as "Not prime farmland".
  • Erases the "Not prime farmland" soil selection from the basin polygon.
Below is a screenshot of the model I created:



The model was then exported as a Python script. I ran this script but got an error, so I fixed the script and ran it again with success (as seen below).
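Stripped of the boilerplate ModelBuilder adds, the exported script reduces to three tool calls matching the model's three tasks. A sketch with assumed paths and field name (the "Not prime farmland" query field, FARMLNDCL, is an assumption about the soils attribute table):

```python
# Sketch of the fixed model-export script (paths and field name are assumptions)
import arcpy

arcpy.env.workspace = r"C:\GIS\Module5\Data"  # assumed
arcpy.env.overwriteOutput = True

# Clip all soils to the extent of the basin polygon
arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_clip.shp")

# Select soils classified as "Not prime farmland"
arcpy.Select_analysis("soils_clip.shp", "soils_notprime.shp",
                      "FARMLNDCL = 'Not prime farmland'")

# Erase the not-prime soils from the basin polygon
arcpy.Erase_analysis("basin.shp", "soils_notprime.shp", "basin_erase.shp")
```

A common fix needed after exporting a model is exactly this kind of cleanup: replacing the model's hard-coded absolute paths with workspace-relative names so the script runs outside ModelBuilder.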


Finally, we were to create a script tool by zipping our created toolbox with our script folder. Below is a screenshot of the output shapefile from my model/script tool.






Tuesday, June 13, 2017

Week 4: Natural Hazards: Hurricanes

This lab, while interesting, really hit home, as I have many family members who were affected by Sandy, including a cousin and his family who completely lost their home to the effects of the storm.

The first part of this lab had us creating a map of Hurricane Sandy's path, including the barometric pressure and wind speed at various locations along the path. The path was created by exporting the XY coordinates for each location into our geodatabase. This gave us 31 points, which we then connected into a path using the Points to Line tool.
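The point-to-path workflow can be sketched in two tool calls. The table and coordinate field names here are assumptions, not the lab's actual file names:

```python
# Sketch of building the storm track (table and field names are assumptions)
import arcpy

arcpy.env.workspace = r"C:\GIS\Module4\Sandy.gdb"  # assumed
arcpy.env.overwriteOutput = True

# Turn the coordinate records into a point feature class
arcpy.MakeXYEventLayer_management("sandy_track.csv", "Lon", "Lat",
                                  "track_points_lyr")
arcpy.CopyFeatures_management("track_points_lyr", "track_points")

# Connect the 31 points into a single line following record order
arcpy.PointsToLine_management("track_points", "sandy_path")
```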



In the second part of the lab, we used Pre-Storm and Post-Storm aerial imagery to help us conduct a damage assessment of a specified area hit by the storm. Here we had to digitize structures by adding points to them and assigning values to those points.


Module 4: Debugging & Error Handling

For this lab, we were to take three provided scripts and debug them so that they would run properly.

The first script contained two errors, which I corrected to get the following result:





The second script contained eight errors. Once these errors were corrected, the script printed a list of four shapefiles.



The third script had two parts. Part A contained an error that required adding a try/except statement so that the script could run and then continue on to Part B, which did not contain any errors.
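The pattern here is the standard try/except: Part A is wrapped so a failure is reported instead of halting the script, and Part B still runs. A minimal standalone illustration (the error raised in part_a is contrived, standing in for the bug in the provided script):

```python
def part_a():
    # Contrived error standing in for the bug in the provided Part A
    raise TypeError("bad parameter passed to a geoprocessing tool")

def part_b():
    return "Part B completed"

results = []
try:
    part_a()
except Exception as e:  # report the error instead of crashing the script
    results.append("Part A raised an error: %s" % e)

results.append(part_b())  # Part B runs regardless of Part A's failure
print("\n".join(results))
```

Catching the exception and printing it is what lets the script finish: without the try/except, the interpreter stops at Part A's error and Part B never executes.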



Friday, June 9, 2017

Participation Assignment #1


Silva, A.M.A.N., & Tabora, R.P.M. (2012). Integration of beach hydrodynamic and morphodynamic modelling in a GIS environment. Journal of Coastal Conservation, (June 2013), 201-210.



This article describes the development of a beach morphodynamic tool (BeachMM) built with the Python scripting language, integrating morphological numerical models within a GIS platform. Morphodynamic modelling, in this context, refers to predictive models used to study the interaction and adjustment between seafloor topography and hydrodynamic processes, and the resulting dynamics of sediment motion. With this approach, sandy beaches can be classified into morphodynamic types. The objective of the BeachMM tool was to ease hydrodynamic and morphodynamic modelling by developing a geoprocessing tool for ArcGIS, using the Python programming language, that would simplify procedures, automate data flow between the predictive models and the GIS, and graphically display the results (Silva and Tabora 2013, 202). The integration of a GIS with a morphological numerical model provides a tool to run simulations and to interpret results in a spatial context (Pullar and Springer 2000).

The BeachMM tool was applied to the morphodynamic modelling of an exposed beach (Norte beach) along the Portuguese western coast. This coastal stretch is exposed to the North Atlantic wave regime, characterized by a predominant swell from the NW and a generally less energetic local wind sea with a wider directional spread (Silva and Tabora 2013, 205). Nearshore wave propagation is strongly disturbed by the presence of the Nazare submarine canyon, which interrupts the net southward longshore sediment transport (Dias et al. 2002). The results of the application of the BeachMM tool at this site are represented as the spatial variation of wave energy dissipation. The dissipation pattern revealed the presence of a submarine sand bar roughly parallel to the coast (Silva and Tabora 2013, 208-209). The modelled nearshore velocity pattern agrees with the uncommon development of a beach that surrounded the headland at that time. The results show a southward sediment transport along Norte beach.


In conclusion, the article discusses the merits of the BeachMM tool as applied to the morphodynamic modelling of Norte beach in Portugal. The tool, using the data management tools available in ArcGIS and a customized Python-scripted interface, successfully simplified data flow, reduced human error, and provided a dynamic visualization of the modelling results.

Wednesday, June 7, 2017

Mod 3: Tsunamis

This week we worked on our data management skills by creating file geodatabases to organize feature datasets and the feature classes within them. We then used these files to create evacuation zone maps surrounding the Fukushima II Nuclear Power Plant in Japan.



The first step was isolating the power plant so we could create a Multiple Ring Buffer at various distances from the plant. These buffer zones determined the populations and cities affected by radiation within those distances.

We then conducted a run-up analysis based on known measurements from the actual tsunami event. We isolated the affected areas and determined their elevation and proximity to the coastline using the "Extract by Mask" Spatial Analyst tool. Using the same tool, we isolated the lower-elevation areas by creating a 10,000-meter buffer of the coastline. From there, we used ModelBuilder, starting with the newly created FukuCoastalDEM and dragging in the Con, Raster to Polygon, Create Features to Class, Append, and Intersect tools, to give us a final output of three run-up evacuation zones based on elevation.
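The coastal-DEM and run-up steps could also be scripted outside ModelBuilder. A sketch for a single zone, with assumed layer names and an assumed 10-meter run-up threshold (the lab's actual elevation cutoffs for the three zones are not reproduced here):

```python
# Sketch of the coastal DEM and one run-up zone (names/threshold are assumptions)
import arcpy
from arcpy.sa import Con, ExtractByMask, Raster

arcpy.CheckOutExtension("Spatial")  # Extract by Mask and Con need Spatial Analyst
arcpy.env.workspace = r"C:\GIS\Module3\Fukushima.gdb"  # assumed
arcpy.env.overwriteOutput = True

# Clip the elevation DEM to a 10,000 m coastal buffer
arcpy.Buffer_analysis("coastline", "coast_buffer", "10000 Meters")
coastal_dem = ExtractByMask("elevation_dem", "coast_buffer")
coastal_dem.save("FukuCoastalDEM")

# Keep only cells at or below the assumed run-up elevation (10 m here),
# then convert that raster into an evacuation-zone polygon
runup = Con(Raster("FukuCoastalDEM") <= 10, 1)
arcpy.RasterToPolygon_conversion(runup, "runup_zone_10m")
```

Repeating the Con/Raster to Polygon step at each elevation threshold, then appending and intersecting the results, mirrors what the ModelBuilder chain produces.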