
Showing posts from July, 2019

Module 4: Crime Analysis

The purpose of this week's lab was to explore 3 selected hotspot mapping techniques for crime analysis, applied to 2017 homicides in the Chicago area. The results of each technique were compared against 2018 homicide data to assess each technique's reliability for predicting crime. The first technique was Grid Overlay Hotspot Analysis. The goal was to determine the number of 2017 homicides in each grid cell and select the cells with the highest counts. This was accomplished by first performing a spatial join between the 1/2 mile grid cells and the 2017 homicide data, which added a field representing the number of homicides in each grid cell. I then used the Select by Attributes tool to select all counts greater than 0 and saved the selection as a separate feature class. From the attribute table, I manually selected the top 20% of grid cells (the number to select was calculated by dividing the total by 5) and saved the selection as a separate feature class. To dissolve this feature class,...
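A minimal arcpy sketch of this grid-overlay step is below; the workspace and feature class names (grid_half_mile, homicides_2017, and so on) are assumptions for illustration, not the lab's actual data, and the threshold logic mirrors the approach rather than the exact script.

```python
import arcpy

arcpy.env.workspace = r"C:\GIS\CrimeAnalysis\CrimeAnalysis.gdb"  # assumed workspace
arcpy.env.overwriteOutput = True

# Spatial join: count the 2017 homicides falling in each 1/2-mile grid cell.
# SpatialJoin adds a Join_Count field holding the number of matching points.
arcpy.analysis.SpatialJoin("grid_half_mile", "homicides_2017", "grid_joined",
                           "JOIN_ONE_TO_ONE", "KEEP_ALL", match_option="INTERSECT")

# Keep only cells that contain at least one homicide.
arcpy.analysis.Select("grid_joined", "grid_gt0", "Join_Count > 0")

# Rank the remaining cells by count and keep the top 20% (total divided by 5).
counts = sorted((row[0] for row in arcpy.da.SearchCursor("grid_gt0", ["Join_Count"])),
                reverse=True)
n_top = max(1, len(counts) // 5)
threshold = counts[n_top - 1]
arcpy.analysis.Select("grid_gt0", "grid_top20", f"Join_Count >= {threshold}")

# Dissolve the top-20% cells into a single hotspot footprint.
arcpy.management.Dissolve("grid_top20", "grid_top20_dissolved")
```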

Module 3: Visibility Analysis

The purpose of this week's lab was to complete 4 ESRI trainings exploring the concepts of line of sight analysis, viewshed analysis, 3D visualization, and sharing 3D content. The first training, 3D Visualization Using ArcGIS Pro, covered creating and navigating 3D scenes. 3D maps and scenes can be helpful for visualizing and analyzing data in a more realistic setting. Investigating data in 3D offers a different perspective and can reveal insights that might not be apparent in a 2D setting. Applications for 3D maps are wide ranging and include showing the impact of a new building in an area, displaying a transportation route through an area, or visualizing subsurface features such as wells, pipelines, or fault lines. Despite these applications, navigating 3D maps can initially be cumbersome, and they can be difficult to interpret depending on the data being displayed. In this training, I explored data for Crater Lake in Oregon, and San Diego,...
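For reference, a very small sketch of the kind of viewshed calculation covered in these trainings is below; the elevation raster and observer points are hypothetical names, not the exercise data.

```python
import arcpy
from arcpy.sa import Viewshed

arcpy.CheckOutExtension("Spatial")          # Spatial Analyst license
arcpy.env.workspace = r"C:\GIS\Visibility"  # assumed workspace
arcpy.env.overwriteOutput = True

# Cells visible from at least one observer point receive a value of 1 or more;
# cells that cannot be seen receive 0.
visible = Viewshed("elevation.tif", "observer_points")
visible.save("viewshed.tif")
```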

Module 2: Forestry and LiDAR

The purpose of this week's lab was to find and use LiDAR data in an analysis to calculate forest height and biomass. The original .las LiDAR file was acquired from the Virginia LiDAR online application (https://vgin.maps.arcgis.com/home/index.html). The LiDAR Download Grid N16_5807_20 was downloaded and decompressed using the LAS Optimizer from ESRI. The DEM and DSM were created from the LiDAR layer by filtering the points to Ground and Non Ground, respectively, and then running the LAS Dataset to Raster tool in ArcGIS Pro with a sampling value of 6. The original LiDAR scene and derived DEM are below. To create a forest or tree height layer, the DSM and DEM were used as inputs to the Minus tool in ArcGIS Pro (DSM minus DEM). A tree height distribution chart was created with data from this layer. The chart shows the total count of trees at different heights. It approximates a normal distribution bell curve ranging from -5 (an error value) to 163 feet, with an average of 54 feet. Most of the tree heigh...
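A hedged sketch of this DEM/DSM/height workflow in arcpy is below. The LAS dataset name and workspace are assumptions, and the class-code filters (2 for ground; 1 and 3 through 5 as a stand-in for the Non Ground display filter used in the lab) are illustrative.

```python
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Forestry"  # assumed workspace
arcpy.env.overwriteOutput = True

# Filter the LAS dataset to ground returns for the DEM and to non-ground
# classes (unclassified plus vegetation) for the DSM.
arcpy.management.MakeLasDatasetLayer("N16_5807_20.lasd", "ground_lyr",
                                     class_code=[2])
arcpy.management.MakeLasDatasetLayer("N16_5807_20.lasd", "nonground_lyr",
                                     class_code=[1, 3, 4, 5])

# Rasterize each layer with a sampling value of 6, matching the lab setting.
arcpy.conversion.LasDatasetToRaster("ground_lyr", "dem.tif", "ELEVATION",
                                    sampling_type="CELLSIZE", sampling_value=6)
arcpy.conversion.LasDatasetToRaster("nonground_lyr", "dsm.tif", "ELEVATION",
                                    sampling_type="CELLSIZE", sampling_value=6)

# Tree/canopy height is the surface model minus the bare-earth model.
height = Raster("dsm.tif") - Raster("dem.tif")
height.save("tree_height.tif")
```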

Module 1, Part 2: Corridor Analysis

The purpose of this lab was to create a corridor of potential movement for black bears between two protected areas in the Coronado National Forest. The variables for the analysis were distance to roads, elevation, and land cover. The flow chart for the workflow in this analysis is below: To begin the corridor development, I first developed a habitat suitability model by reclassifying the roads shapefile and the elevation and landcover rasters. The elevation and landcover rasters were reclassified using the Reclassify tool with the cost values provided. The roads shapefile was first converted to a raster using the Polyline to Raster tool, the Euclidean Distance tool was then used to identify distances from the road within the elevation raster's extent, and the result was reclassified using the Reclassify tool with the cost values provided. The habitat suitability model was then developed using the Weighted Overlay tool with all three reclassified rasters, weighted with landcover at 60% and elevation and roa...
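A condensed arcpy sketch of the corridor step is below. The input names, workspace, and remap values are assumptions; it applies Euclidean Distance to the road features directly and uses weighted raster algebra in place of the Weighted Overlay tool's table interface, so it shows the idea rather than the exact lab workflow.

```python
import arcpy
from arcpy.sa import (Raster, Reclassify, RemapRange, EucDistance,
                      CostDistance, Corridor)

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Corridor"   # assumed workspace
arcpy.env.extent = "elevation.tif"         # constrain outputs to the elevation extent
arcpy.env.overwriteOutput = True

# Distance from roads, reclassified to suitability scores (breaks assumed).
road_dist = EucDistance("roads.shp")
road_suit = Reclassify(road_dist, "VALUE",
                       RemapRange([[0, 500, 1], [500, 2000, 5], [2000, 100000, 9]]))

# Elevation reclassified with assumed breaks; landcover assumed already reclassified.
elev_suit = Reclassify("elevation.tif", "VALUE",
                       RemapRange([[0, 1200, 3], [1200, 2000, 9], [2000, 4000, 5]]))
lc_suit = Raster("landcover_reclass.tif")

# Habitat suitability: landcover 60%, elevation 20%, roads 20%.
suitability = 0.6 * lc_suit + 0.2 * elev_suit + 0.2 * road_suit

# Convert suitability to cost (high suitability = low travel cost), build
# accumulated-cost surfaces from each protected area, and combine them.
cost = 10 - suitability
cost_a = CostDistance("protected_a", cost)
cost_b = CostDistance("protected_b", cost)
corridor = Corridor(cost_a, cost_b)
corridor.save("bear_corridor.tif")
```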

Module 1, Part 1: Suitability Analysis

The purpose of this lab was to create a suitability model for a developer. The analysis used provided suitability ratings for land cover, soils, slopes, proximity to streams, and proximity to roads. The flow chart of the analysis workflow is below: I began by using the Reclassify tool to reclassify the landcover raster. I used the Euclidean Distance tool on the rivers and roads shapefiles individually, and then reclassified each output raster using the Reclassify tool. I used the Slope tool to derive slope from the elevation raster and then reclassified the output using the Reclassify tool. I converted the soils shapefile to a raster with the Polygon to Raster tool and reclassified the output raster using the Reclassify tool. To create the final suitability model, I added all the reclassified rasters to the Weighted Overlay tool and ran two analyses. The first analysis gave each variable the same weight, 20%. The second analysis used unequal...
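A minimal sketch of the weighting step is below, again using weighted raster algebra rather than the Weighted Overlay tool itself. The file names, the soils suitability field, the reclassification breaks, and the unequal weights are all assumptions for illustration; the distance-and-reclassify pattern for rivers and roads is the same one shown in the corridor post.

```python
import arcpy
from arcpy.sa import Raster, Reclassify, RemapRange, Slope

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Suitability"  # assumed workspace
arcpy.env.overwriteOutput = True

# Slope from elevation, reclassified with assumed breaks and ratings.
slope_suit = Reclassify(Slope("elevation.tif", "DEGREE"), "VALUE",
                        RemapRange([[0, 5, 9], [5, 15, 5], [15, 90, 1]]))

# Soils polygons rasterized on a hypothetical suitability field.
arcpy.conversion.PolygonToRaster("soils.shp", "SUIT_CODE", "soils_ras.tif")

# Remaining reclassified inputs assumed to exist already.
lc_suit = Raster("landcover_reclass.tif")
soil_suit = Raster("soils_ras.tif")
river_suit = Raster("rivers_reclass.tif")
road_suit = Raster("roads_reclass.tif")

# Run 1: equal weights, 20% each.
equal = 0.2 * (lc_suit + soil_suit + slope_suit + river_suit + road_suit)
equal.save("suitability_equal.tif")

# Run 2: unequal weights (values assumed for illustration only).
unequal = (0.3 * lc_suit + 0.2 * soil_suit + 0.2 * slope_suit +
           0.15 * river_suit + 0.15 * road_suit)
unequal.save("suitability_unequal.tif")
```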

Module 7: Working with Rasters

The purpose of this lab was to explore raster objects and raster geoprocessing functions in a Python scripting environment. The final script used the .Reclassify, .Slope, and .Aspect functions to create a final raster that combines a reclassified landcover.shp showing only forested land with slopes between 5° and 20° and aspects between 150° and 270° derived from elevation.shp. The final output raster is below. The biggest issue I had was with the .Raster function: I defined my new raster as elevraster, but I kept receiving an error that this raster did not exist when trying to run the .Slope function. To correct this, I saved the raster with .save in the working environment folder, which resolved the error. The other error was a syntax error while saving the final raster; I simply had not closed the parentheses. Because the script ran without intermediate steps, it was difficult to ensure that th...
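A reconstructed sketch of the kind of script described above is below. The file names, workspace, and landcover remap values (NLCD-style forest codes 41 through 43) are assumptions; only the slope and aspect thresholds come from the post.

```python
import arcpy
from arcpy.sa import Raster, Reclassify, RemapValue, Slope, Aspect

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Module7"   # assumed working folder
arcpy.env.overwriteOutput = True

# Reclassify landcover so forested classes become 1; all other classes become
# NoData (codes 41-43 as forest is an assumption here).
forest = Reclassify("landcover.tif", "VALUE",
                    RemapValue([[41, 1], [42, 1], [43, 1]]), "NODATA")

# Build the elevation raster object and save it to disk, mirroring the .save
# workaround described above, then derive slope and aspect from it.
elevraster = Raster("elevation.tif")
elevraster.save("elev_copy.tif")
slope = Slope(elevraster, "DEGREE")
aspect = Aspect(elevraster)

# Combine the conditions: forested land on 5-20 degree slopes facing 150-270 degrees.
good = (forest == 1) & (slope > 5) & (slope < 20) & (aspect > 150) & (aspect < 270)
good.save("final_suitable.tif")  # closing parenthesis required here
```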