Vector tile server for the Wildfire Predictive Services Unit
The intention of this project is to:
- provide tools to easily spin up a vector tile server in OpenShift, in a project-agnostic manner.
- provide tools to manually pull data from an ESRI ArcGIS server into a PostGIS database.
- provide tools that periodically synchronize data from an ESRI ArcGIS server into PostGIS.
- PostGIS database server (it is assumed you have a working PostGIS database server).
- pg_tileserv – serves vector tiles from the PostGIS server.
- proxy server (varnish?) – caches responses.
- sync cronjob – updates the database periodically.
- PostgreSQL server with PostGIS running locally
Configure pg_tileserv
Download the latest pg_tileserv, unzip it and start it.
```bash
mkdir pg_tileserv
cd pg_tileserv
wget https://postgisftw.s3.amazonaws.com/pg_tileserv_latest_linux.zip
unzip pg_tileserv_latest_linux.zip
export DATABASE_URL=postgresql://tileserv:[email protected]/tileserv
./pg_tileserv
```
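Once pg_tileserv is running (by default it listens on port 7800), each spatial table is exposed at a `/{schema}.{table}/{z}/{x}/{y}.pbf` path. A minimal sketch of how a client could assemble such a tile URL — the helper name and the `public.parks` table are illustrative, not part of this repo:

```python
def tile_url(base: str, schema: str, table: str, z: int, x: int, y: int) -> str:
    """Build the pg_tileserv tile URL for one tile of a table's layer."""
    # pg_tileserv publishes each spatial table at /{schema}.{table}/{z}/{x}/{y}.pbf
    return f"{base}/{schema}.{table}/{z}/{x}/{y}.pbf"


# e.g. the zoom-0 tile for a hypothetical public.parks table:
print(tile_url("http://localhost:7800", "public", "parks", 0, 0, 0))
```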
Install binary requirements
```bash
sudo apt install gdal-bin
```
Install python requirements
This step is only required if you're going to use the Python scripts in this repo to load data. If you're loading directly from shapefiles, skip this step.
- an appropriate Python version is installed
- Python Poetry is installed
Create a user and database for your tileserver
```sql
create user tileserv with password 'tileserv';
create database tileserv with owner tileserv;
\c tileserv
CREATE EXTENSION IF NOT EXISTS postgis;
```
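The `DATABASE_URL` passed to pg_tileserv is assembled from these same credentials. A small sketch of the expected `postgresql://` connection URL format — the helper function is illustrative, and the port is assumed to default to 5432:

```python
def database_url(user: str, password: str, host: str, dbname: str, port: int = 5432) -> str:
    """Assemble a libpq-style connection URL of the form pg_tileserv expects."""
    return f"postgresql://{user}:{password}@{host}:{port}/{dbname}"


# matches the user and database created above, served from a local PostGIS:
print(database_url("tileserv", "tileserv", "localhost", "tileserv"))
```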
Using an ArcGIS server REST endpoint
Given some ArcGIS server layer endpoint, e.g. https://maps.gov.bc.ca/arcserver/rest/services/whse/bcgw_pub_whse_legal_admin_boundaries/MapServer/8, run:
```bash
poetry run python fetch_feature_layer.py https://maps.gov.bc.ca/arcserver/rest/services/whse/bcgw_pub_whse_legal_admin_boundaries/MapServer/8
```
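A script like this typically pages through the layer's `query` endpoint. The sketch below only shows how such a paged GeoJSON request URL could be built; the actual implementation of fetch_feature_layer.py may differ, and the helper name and parameter choices here are assumptions:

```python
def build_query_url(layer_url: str, offset: int, page_size: int = 1000) -> str:
    """Build a paged ArcGIS REST query URL returning GeoJSON features."""
    # where=1=1 selects every feature; resultOffset/resultRecordCount page the results
    return (
        f"{layer_url}/query?where=1%3D1&outFields=*&f=geojson"
        f"&resultOffset={offset}&resultRecordCount={page_size}"
    )


layer = "https://maps.gov.bc.ca/arcserver/rest/services/whse/bcgw_pub_whse_legal_admin_boundaries/MapServer/8"
print(build_query_url(layer, 0))
```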
Using a shapefile
```bash
ogr2ogr -f "PostgreSQL" PG:"dbname=tileserv host=localhost user=tileserv password=tileserv" "my_shapefile.shp" -lco precision=NO -nln fire_area_thessian_polygons
```
- You have the oc command line installed and you're logged in.
- You have Docker installed locally.
- You have a PostgreSQL database in your target OpenShift environment that can be accessed by pg_tileserv (you may need to add additional rules to allow your tile server to communicate with your database).
Prepare your OpenShift environment
```bash
# we have docker limits, so pull the images locally - then put them in openshift
# pull local
docker pull eeacms/varnish
docker pull pramsey/pg_tileserv
# tag for upload
docker tag eeacms/varnish image-registry.apps.silver.devops.gov.bc.ca/e1e498-tools/varnish:latest
docker tag pramsey/pg_tileserv image-registry.apps.silver.devops.gov.bc.ca/e1e498-tools/pg_tileserv:latest
# log in to openshift docker
docker login -u developer -p $(oc whoami -t) image-registry.apps.silver.devops.gov.bc.ca
# push it
docker push image-registry.apps.silver.devops.gov.bc.ca/e1e498-tools/varnish:latest
docker push image-registry.apps.silver.devops.gov.bc.ca/e1e498-tools/pg_tileserv:latest
```
```bash
# deploy pg_tileserv
oc -n e1e498-dev process -f tileserv.yaml | oc -n e1e498-dev apply -f -
```
Manually loading data into your OpenShift-hosted PostGIS database
The easiest way to achieve this is to tunnel to your database server and then run the import scripts as if the database were local.
```bash
oc port-forward patroni-wps-mapserver-prototype-1 5432:5432
```