Improve docs around tiles, remove old stuff, bump 0.3.0-rc.1

This commit is contained in:
Paul Bienkowski 2021-11-21 20:17:20 +01:00
parent 12ef37392b
commit ea106539c6
10 changed files with 73 additions and 237 deletions

View file

@ -98,6 +98,32 @@ every time you reset the keycloak database, which lives inside PostgreSQL. The
script `api/tools/reset_database.py` does *not* affect the state of the
keycloak database, however, so this should be rather rare.
### Prepare database
Start the PostgreSQL database:
```bash
docker-compose up -d postgres
```
Then initialize an empty database, creating all extensions and tables
automatically:
```bash
docker-compose run --rm api tools/reset_database.py
```
You should import OpenStreetMap data now, see below for instructions.
To serve dynamic vector tiles from the API, run the following command once:
```bash
docker-compose run --rm api tools/prepare_sql_tiles.py
```
You might need to re-run this command after updates, to (re-)create the
functions in the SQL database that are used when generating vector tiles.
### Boot the application
Now you can run the remaining parts of the application:
@ -106,22 +132,53 @@ Now you can run the remaining parts of the application:
```bash
docker-compose up -d --build api worker frontend
```
If this does not work, please open an issue and describe the problem you're
having, as it is important to us that onboarding is super easy :)
Your frontend should be running at http://localhost:3001 and the API at
http://localhost:3000 -- but you probably only need to access the frontend for
testing. The frontend dev server also proxies all unknown requests to the API,
so the frontend always just requests data at its own URL.
testing.
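If the frontend follows the usual create-react-app convention (an assumption
for illustration, not something this commit confirms; check the frontend code
for the actual mechanism), that proxying would be a single `proxy` field merged
into the existing `frontend/package.json`, pointing at the API:

```json
{
  "proxy": "http://localhost:3000"
}
```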
### Migrating (Development)
Migrations are not implemented yet. Once we need them, we'll add them and
document the usage here.
## Tileserver generation
## Import OpenStreetMap data
You need to import road information from OpenStreetMap for the portal to work.
This information is stored in your PostgreSQL database and used when processing
tracks (instead of querying the Overpass API), as well as for vector tile
generation.
* Install `osm2pgsql`.
* Download the area(s) you would like to import from [GeoFabrik](https://download.geofabrik.de).
* Import each file like this:
```bash
osm2pgsql --create --hstore --style api/roads_import.lua -O flex \
-H localhost -d obs -U obs \
path/to/downloaded/myarea-latest.osm.pbf
```
You might need to adjust the host, database and username (`-H`, `-d`, `-U`) to
your setup, and also provide the correct password when queried. This process
should take a few seconds to minutes, depending on the area size. You can run
the process multiple times, with the same or different area files, to import or
update the data. You can also truncate the `road` table before importing if you
want to remove outdated road information.
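For example, a minimal sketch of clearing the table via the dockerized
PostgreSQL from this setup (assuming the default `postgres` service and the
`obs` database and user; adjust to your environment):

```bash
# Wipes all previously imported road data so the next osm2pgsql run starts
# from a clean table.
docker-compose exec postgres psql -U obs -d obs -c 'TRUNCATE road;'
```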
Refer to the documentation of `osm2pgsql` for assistance. We are using "flex
mode"; the provided script `api/roads_import.lua` describes the transformations
and extractions to perform on the original data.
## Static tile generation
The above instructions do not cover serving static vector tiles of the
collected data. That has to be set up separately; please follow the
instructions in [tile-generator](./tile-generator/README.md).
### Troubleshooting
If any step of the instructions does not work for you, please open an issue and
describe the problem you're having, as it is important to us that onboarding is
super easy :)

View file

@ -1,12 +1,12 @@
HOST = "0.0.0.0"
PORT = 3000
DEBUG = False
DEBUG = True
AUTO_RESTART = True
SECRET = "!!!!!!!!!!!!CHANGE ME!!!!!!!!!!!!"
POSTGRES_URL = "postgresql+asyncpg://obs:obs@postgres/obs"
KEYCLOAK_URL = "http://keycloak:8080/auth/realms/obs-dev/"
KEYCLOAK_CLIENT_ID = "portal"
KEYCLOAK_CLIENT_SECRET = "76b84224-dc24-4824-bb98-9e1ba15bd58f"
KEYCLOAK_CLIENT_SECRET = "c385278e-bd2e-4f13-9937-34b0c0f44c2d"
DEDICATED_WORKER = True
FRONTEND_URL = "http://localhost:3001/"
FRONTEND_DIR = None

View file

@ -1 +1 @@
__version__ = "0.2.0"
__version__ = "0.3.0-rc.1"

View file

@ -1,5 +1,4 @@
#!/usr/bin/env python3
import argparse
import logging
import asyncio
import tempfile
@ -41,27 +40,11 @@ def parse_pg_url(url=app.config.POSTGRES_URL):
async def main():
    logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
    parser = argparse.ArgumentParser(
        description="processes a single track for use in the portal, "
        "using the obs.face algorithms"
    )
    parser.add_argument(
        "--prepare",
        action="store_true",
        help="prepare and import SQL functions for tile generation",
    )
    args = parser.parse_args()
    if args.prepare:
        with tempfile.TemporaryDirectory() as build_dir:
            await generate_data_yml(build_dir)
            sql_snippets = await generate_sql(build_dir)
            await import_sql(sql_snippets)
            await generate_tiles()
async def _run(cmd):
    if isinstance(cmd, list):
@ -162,18 +145,5 @@ async def import_sql(sql_snippets):
        await session.commit()
async def generate_tiles():
    pass
    # .PHONY: generate-tiles-pg
    # generate-tiles-pg: all start-db
    # @echo "Generating tiles into $(MBTILES_LOCAL_FILE) (will delete if already exists) using PostGIS ST_MVT()..."
    # @rm -rf "$(MBTILES_LOCAL_FILE)"
    # # For some reason Ctrl+C doesn't work here without the -T. Must be pressed twice to stop.
    # $(DOCKER_COMPOSE) run -T $(DC_OPTS) openmaptiles-tools generate-tiles
    # @echo "Updating generated tile metadata ..."
    # $(DOCKER_COMPOSE) run $(DC_OPTS) openmaptiles-tools \
    #     mbtiles-tools meta-generate "$(MBTILES_LOCAL_FILE)" $(TILESET_FILE) --auto-minmax --show-ranges
if __name__ == "__main__":
    asyncio.run(main())

View file

@ -71,54 +71,3 @@ services:
# - "traefik.http.routers.traefik.tls.certresolver=leresolver"
# - "traefik.http.routers.traefik.middlewares=basic-auth"
# - "traefik.http.middlewares.basic-auth.basicauth.usersfile=/usersfile"
  openmaptiles-tools:
    image: openmaptiles/openmaptiles-tools:6.0
    environment:
      # Must match the version of this file (first line)
      # download-osm will use it when generating a composer file
      MAKE_DC_VERSION: "3"
      # Allow DIFF_MODE, MIN_ZOOM, and MAX_ZOOM to be overwritten from shell
      DIFF_MODE: ${DIFF_MODE}
      MIN_ZOOM: ${MIN_ZOOM}
      MAX_ZOOM: ${MAX_ZOOM}
      # Provide BBOX from *.bbox file if exists, else from .env
      BBOX: ${BBOX}
      # Imposm configuration file describes how to load updates when enabled
      IMPOSM_CONFIG_FILE: ${IMPOSM_CONFIG_FILE}
      # Control import-sql processes
      MAX_PARALLEL_PSQL: ${MAX_PARALLEL_PSQL}
      PGDATABASE: obs
      PGUSER: obs
      PGPASSWORD: obs
      PGHOST: postgres
      PGPORT: 5432
    volumes:
      - ./source/tile-generator/:/tileset
      - ./data/tiles:/import
      - ./data/tiles:/export
      - ./data/tiles-build/sql:/sql
      - ./data/tiles-build:/mapping
      - ./data/tiles-cache:/cache

  generate-vectortiles:
    image: openmaptiles/generate-vectortiles:6.0
    volumes:
      - ./data/tiles:/export
      - ./data/tiles-build/openmaptiles.tm2source:/tm2source
    environment:
      MBTILES_NAME: ${MBTILES_FILE}
      BBOX: ${BBOX}
      MIN_ZOOM: ${MIN_ZOOM}
      MAX_ZOOM: ${MAX_ZOOM}
      # Control tilelive-copy threads
      COPY_CONCURRENCY: ${COPY_CONCURRENCY}
      #
      PGDATABASE: obs
      PGUSER: obs
      PGPASSWORD: obs
      PGHOST: postgres
      PGPORT: 5432

View file

@ -92,55 +92,6 @@ services:
      - npm
      - start

  openmaptiles-tools:
    image: openmaptiles/openmaptiles-tools:6.0
    environment:
      # Must match the version of this file (first line)
      # download-osm will use it when generating a composer file
      MAKE_DC_VERSION: "3"
      # Allow DIFF_MODE, MIN_ZOOM, and MAX_ZOOM to be overwritten from shell
      DIFF_MODE: ${DIFF_MODE}
      MIN_ZOOM: ${MIN_ZOOM}
      MAX_ZOOM: ${MAX_ZOOM}
      # Provide BBOX from *.bbox file if exists, else from .env
      BBOX: ${BBOX}
      # Imposm configuration file describes how to load updates when enabled
      IMPOSM_CONFIG_FILE: ${IMPOSM_CONFIG_FILE}
      # Control import-sql processes
      MAX_PARALLEL_PSQL: ${MAX_PARALLEL_PSQL}
      PGDATABASE: obs
      PGUSER: obs
      PGPASSWORD: obs
      PGHOST: postgres
      PGPORT: 5432
    volumes:
      - ./tile-generator/:/tileset
      - ./tile-generator/data:/import
      - ./tile-generator/data:/export
      - ./tile-generator/build/sql:/sql
      - ./tile-generator/build:/mapping
      - ./tile-generator/cache:/cache

  generate-vectortiles:
    image: openmaptiles/generate-vectortiles:6.0
    volumes:
      - ./tile-generator/data:/export
      - ./tile-generator/build/openmaptiles.tm2source:/tm2source
    environment:
      MBTILES_NAME: ${MBTILES_FILE}
      BBOX: ${BBOX}
      MIN_ZOOM: ${MIN_ZOOM}
      MAX_ZOOM: ${MAX_ZOOM}
      # Control tilelive-copy threads
      COPY_CONCURRENCY: ${COPY_CONCURRENCY}
      #
      PGDATABASE: obs
      PGUSER: obs
      PGPASSWORD: obs
      PGHOST: postgres
      PGPORT: 5432

  keycloak:
    image: jboss/keycloak
    ports:

View file

@ -1,6 +1,6 @@
{
"name": "openbikesensor-portal-frontend",
"version": "0.2.0-pre",
"version": "0.3.0-rc.1",
"lockfileVersion": 1,
"requires": true,
"dependencies": {

View file

@ -1,6 +1,6 @@
{
"name": "openbikesensor-portal-frontend",
"version": "0.2.0-pre",
"version": "0.3.0-rc.1",
"private": true,
"dependencies": {
"@testing-library/jest-dom": "^5.12.0",

View file

@ -24,5 +24,5 @@
"theme_color": "#114594",
"background_color": "#ffffff",
"manifest_version": 2,
"version": "0.2.0-pre"
"version": "0.3.0-rc.1"
}

View file

@ -1,91 +0,0 @@
# Tile Generation
To display the collected data, we generate vector tiles that can be rendered by
different map renderers, such as
[maplibre-gl-js](https://github.com/MapLibre/maplibre-gl-js) or
[QGIS](https://www.qgis.org/en/site/).
The whole process requires a dockerized setup. You can try to install and run
the tools without Docker, but that is likely to be complicated and is not
documented here.
## Data sources
There are two main sources of data. Both feed into separate tables of a
PostgreSQL database, so that they can be joined for processing.
### Application data
The **API** imports tracks separately and stores the imported data into the
`overtaking_event` table. This is already part of the application and does not
need configuration, apart from specifying the correct `postgres.url` in the API
config.
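For reference, the development config in this repository configures this
connection as follows (the setting is named `POSTGRES_URL` there; adjust
credentials and host to your deployment):

```
POSTGRES_URL = "postgresql+asyncpg://obs:obs@postgres/obs"
```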
### Importing OpenStreetMap data
This is the road information imported from OpenStreetMap itself. Download the
area(s) you would like to import from
[GeoFabrik](https://download.geofabrik.de). Then import the files like this:
```bash
osm2pgsql --create --hstore --style api/roads_import.lua -O flex \
-H localhost -d obs -U obs -W \
path/to/downloaded/myarea-latest.osm.pbf
```
You might need to adjust the host, database and username (`-H`, `-d`, `-U`) to
your setup, and also provide the correct password when queried. This process
should take a few seconds to minutes, depending on the area size. You can run
the process multiple times, with the same or different area files, to import or
update the data. You can also truncate the `road` table before importing if you
want to remove outdated road information.
## Configure
Edit the file `tile-generator/.env` and adjust the following variables:
* `PGDATABASE, PGUSER, ...` if you have different PostgreSQL credentials
* `BBOX`, a bounding box for the area you want to generate (keep it small). Use
[this tool](https://boundingbox.klokantech.com/) to draw an area on a map.
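A minimal `.env` sketch with hypothetical values (pick a bounding box and
credentials matching your own setup):

```bash
# Hypothetical example values for tile-generator/.env
# BBOX is lon_min,lat_min,lon_max,lat_max of a small test area
BBOX=9.85,53.45,10.15,53.65
PGDATABASE=obs
PGUSER=obs
PGPASSWORD=obs
```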
## Generate SQL functions
The [OpenMapTiles](https://openmaptiles.org/) project is used to generate the
vector tiles. For this, a lot of logic is generated and imported into the
PostgreSQL database in the form of user-defined functions. To generate and
import these, run:
```bash
cd tile-generator/
make clean
make
make import-sql
```
## Generate `.mbtiles` file
This file contains all the vector tiles for the selected area and zoom levels,
and different layers of information (according to the layer descriptions in
`tile-generator/layers/` and `tile-generator/openmaptiles.yaml`). It is
generated like this:
```bash
make generate-tiles-pg
```
## Publish vector tiles
The API can serve the generated `.mbtiles` file in the XYZ scheme as PBF
tiles. Set the config variable `TILES_FILE` to point to your generated file.
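A hypothetical example of such an entry in the API config (the path is
illustrative and depends on where you store the generated file):

```
TILES_FILE = "/data/tiles.mbtiles"
```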
The API might be inefficient at publishing the tiles. If you run into trouble
with its capabilities, you might want to switch to a dedicated tile server with
proper caching.
The URL for the tiles is:
```
http://api.example.com/tiles/{z}/{x}/{y}.pbf
```
The API writes this URL into `/config.json` as `obsMapSource` if it is
configured to serve both the tiles *and* the frontend.