Geospatial Analysis on Shapefile of Longitude and Latitude Data Using Vertica: Hurricane Bonnie
Like any natural disaster, hurricanes can leave behind extensive damage to life and property. NGOs, government agencies, and insurance companies all ask the same question: “How can we predict the locations where a storm will inflict the most damage?” Modern spatial analysis makes it possible to predict the impact of disasters in high-risk areas, enabling these groups to allocate their resources effectively.
Vertica enables you to analyze location data efficiently and accurately in your database, giving you insight into real-world challenges from your spatial data. In this example, we will perform a point-in-polygon analysis to identify properties that fall within the impact zone of a hurricane, then visualize the most affected areas on a heat map.
Vertica is compatible with the shapefile data format, a popular geospatial vector data format for geographic information system (GIS) software. You can easily load your location data into Vertica in shapefile format and use it to perform a spatial analysis. Vertica also enables you to export your results to a shapefile, which you can load into other data analysis tools for visualization or further analysis. In short, Vertica integrates easily into your spatial analysis workflow.
In this example, we will identify the areas most at risk for damage from Hurricane Bonnie. We will show you each step of the workflow, from downloading the publicly available data sets to generating a heat map that indicates the impact of the hurricane. You need three tools:

- The GDAL ogr2ogr utility, to convert the shapefiles to a common spatial reference system
- Vertica, to load the data and perform the spatial analysis
- A GIS visualization tool, to render the exported heat map shapefiles
The following figure shows the workflow:
To perform the geospatial analysis on the longitude and latitude shapefile data in this example, we use the following Vertica built-in functions:

- STV_ShpCreateTable
- STV_ShpSource and STV_ShpParser
- ST_Intersects
- ST_Intersection
- ST_GeometryType
- STV_SetExportShapefileDirectory
- STV_Export2Shapefile
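Before working with the real data sets, you can sanity-check the point-in-polygon predicate on literal geometries. This is a minimal sketch; the WKT coordinates here are made up for illustration:

```sql
-- A point inside the unit square intersects it
SELECT ST_Intersects(
         ST_GeomFromText('POLYGON((0 0, 1 0, 1 1, 0 1, 0 0))'),
         ST_GeomFromText('POINT(0.5 0.5)'));
-- returns true
```

The same predicate, applied as a join condition between the hurricane polygons and the landmark points, drives the analysis below.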
Convert Shapefile Using GDAL

We are using the Hurricane BONNIE (2016) data set from the National Hurricane Center and the Landmarks data set from the United States Census Bureau TIGER database. Both are geographic longitude-latitude data sets in shapefile format. The BONNIE shapefile is in the SR-ORG7219 spatial reference system and the landmarks shapefile is in the NAD83 spatial reference system. We cannot analyze spatial data stored in different reference systems, so the first step is converting both to WGS84 using the GDAL ogr2ogr tool:
```shell
# Usage: ogr2ogr -f "ESRI Shapefile" -t_srs EPSG:4326 destination.shp source.shp

# Convert hurricane BONNIE polygon data to WGS84
ogr2ogr -f "ESRI Shapefile" -t_srs EPSG:4326 al022016-001_5day_pgn_4326.shp al022016-001_5day_pgn.shp

# Convert landmarks point data to WGS84
ogr2ogr -f "ESRI Shapefile" -t_srs EPSG:4326 tl_2015_09_pointlm_4326.shp tl_2015_09_pointlm.shp -s_srs EPSG:4269
```

You can download the converted data sets from our GitHub repository.
Perform a Geospatial Analysis with Vertica

This analysis identifies areas with high concentrations of properties and landmarks located within the predicted cone of the hurricane.
First, load the shapefile data into Vertica:
```sql
-- Load the hurricane BONNIE data set (polygons) to Vertica
-- STV_ShpCreateTable returns a CREATE TABLE statement
\set bonnie ''''`echo /home/shapefiles/`'BONNIE.shp'''
SELECT STV_ShpCreateTable(USING PARAMETERS file=:bonnie) OVER();

-- Create a Vertica table to store the polygon objects
CREATE TABLE bonnie(
    gid IDENTITY(64) PRIMARY KEY,
    ADVDATE VARCHAR(36),
    geom GEOMETRY(7453)
);

-- Copy from shapefile to Vertica table
COPY bonnie SOURCE STV_ShpSource(file=:bonnie) PARSER STV_ShpParser();
-- 6 Rows Loaded

-- Load the landmarks data set (points) to Vertica
\set landmarks ''''`echo /home/shapefiles/`'landmarks.shp'''
SELECT STV_ShpCreateTable(USING PARAMETERS file=:landmarks) OVER();

-- Create a Vertica table to store the point objects
CREATE TABLE landmarks(
    gid IDENTITY(64) PRIMARY KEY,
    STATEFP VARCHAR(18),
    geom GEOMETRY(85)
);

COPY landmarks SOURCE STV_ShpSource(file=:landmarks) PARSER STV_ShpParser();
-- 236268 Rows Loaded

-- US states data set from the United States Census Bureau TIGER database
-- ftp://ftp2.census.gov/geo/tiger/TIGER2015/STATE/
-- Load US states data set (polygons) to Vertica
\set states ''''`echo /home/shapefiles/`'tl_2015_us_state.shp'''
SELECT STV_ShpCreateTable(USING PARAMETERS file=:states) OVER();

CREATE TABLE states(
    gid IDENTITY(64) PRIMARY KEY,
    REGION VARCHAR(2),
    DIVISION VARCHAR(2),
    STATEFP VARCHAR(2),
    STATENS VARCHAR(8),
    GEOID VARCHAR(2),
    STUSPS VARCHAR(2),
    NAME VARCHAR(100),
    LSAD VARCHAR(2),
    MTFCC VARCHAR(5),
    FUNCSTAT VARCHAR(1),
    ALAND INT8,
    AWATER INT8,
    INTPTLAT VARCHAR(11),
    INTPTLON VARCHAR(12),
    geom GEOMETRY(934733)
);

COPY states SOURCE STV_ShpSource(file=:states) PARSER STV_ShpParser();
-- 56 Rows Loaded
```

Then, perform a spatial join analysis with Vertica:
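After loading, it is worth confirming what landed in each table before joining them. This is a quick sanity-check sketch over the tables created above:

```sql
-- Geometry types and row counts per loaded table
SELECT ST_GeometryType(geom), COUNT(*) FROM bonnie GROUP BY 1;
SELECT ST_GeometryType(geom), COUNT(*) FROM landmarks GROUP BY 1;
SELECT ST_GeometryType(geom), COUNT(*) FROM states GROUP BY 1;
```

The row counts should match the "Rows Loaded" totals reported by the COPY statements.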
```sql
-- Compute points intersecting polygons
-- Count the landmarks affected by Bonnie, grouped by state and time frame
CREATE TABLE counts AS
SELECT ADVDATE, STATEFP, COUNT(*) counts
FROM bonnie a, landmarks b
WHERE ST_Intersects(a.geom, b.geom)
GROUP BY ADVDATE, STATEFP;
```

Next, create a heat map with Vertica:
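To see which state and advisory time frame combinations are most exposed, you can inspect the counts table directly. A minimal sketch using the table defined above:

```sql
-- Top state/advisory combinations by number of affected landmarks
SELECT ADVDATE, STATEFP, counts
FROM counts
ORDER BY counts DESC
LIMIT 10;
```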
```sql
-- Compute the clipped area of each state covered by Bonnie
-- Join the counts with the clipped areas
CREATE TABLE heatmap AS
SELECT c.ADVDATE, c.STATEFP, s.STUSPS, c.counts,
       ST_Intersection(b.geom, s.geom) geom
FROM counts c, bonnie b, states s
WHERE b.ADVDATE = c.ADVDATE AND s.STATEFP = c.STATEFP;
```

Finally, export the Vertica table to shapefiles:
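If you want the heat map shading to reflect landmark density rather than raw counts, one option is to normalize each clipped region's count by its area. This is a sketch, assuming Vertica's ST_Area function, which returns the area in the units of the geometry's spatial reference system:

```sql
-- Landmarks per unit area of each clipped region
SELECT ADVDATE, STUSPS, counts / ST_Area(geom) AS density
FROM heatmap
ORDER BY density DESC;
```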
```sql
-- Export the heatmap table from Vertica to shapefiles
SELECT STV_SetExportShapefileDirectory(USING PARAMETERS path = '/home/shapefiles');

-- 1) Export polygon objects in the heatmap table to a shapefile
SELECT STV_Export2Shapefile(* USING PARAMETERS shapefile = 'heatmap.shp',
                            overwrite = TRUE, shape = 'Polygon') OVER()
FROM heatmap
WHERE ST_GeometryType(geom) = 'ST_Polygon';
-- 42 Rows Exported

-- 2) Export multipolygon objects in the heatmap table to a shapefile
SELECT STV_Export2Shapefile(* USING PARAMETERS shapefile = 'heatmapMulti.shp',
                            overwrite = TRUE, shape = 'MultiPolygon') OVER()
FROM heatmap
WHERE ST_GeometryType(geom) = 'ST_MultiPolygon';
-- 7 Rows Exported
```
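The export is split in two because a shapefile can hold only a single shape type, and ST_Intersection can produce either polygons or multipolygons depending on how a state's boundary overlaps the forecast cone. Before exporting, you can check which geometry types are actually present; a sketch over the heatmap table created above:

```sql
-- Geometry types produced by the clipping step
SELECT ST_GeometryType(geom) AS gtype, COUNT(*)
FROM heatmap
GROUP BY 1;
```

Each WHERE clause in the export then selects the rows matching the shape parameter of its target file.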