Improve readmes

Paul Bienkowski 2021-11-29 00:26:54 +01:00
parent 0e5600b38d
commit 63adbc58cd
3 changed files with 142 additions and 58 deletions


@@ -80,30 +80,30 @@ Then clone the repository as described above.
Login will not be possible until you configure the keycloak realm correctly.
Boot your keycloak instance:

```bash
docker-compose up -d keycloak
```
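
Keycloak may take a while to start. If the admin console is not reachable
right away, you can watch the container logs until startup has finished (a
quick check, not a required step):

```bash
docker-compose logs -f keycloak
```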

Now navigate to http://localhost:3003/ and follow these steps:

- Click *Administration Console* and log in with `admin` / `admin`.
- Hover over the realm name on the top left and click *Add realm*.
- Name the realm `obs-dev` (spelling matters) and create it.
- In the sidebar, navigate to *Configure* → *Clients*, and click *Create* on the top right.
- *Client ID* should be `portal`. Click *Save*.
- In the *Settings* tab, set the new client's *Access Type* to *confidential*,
  enter `http://localhost:3000/login/redirect` as *Valid Redirect URIs*,
  then click *Save*.
- Under *Credentials*, copy the *Secret* and paste it into `api/config.dev.py`
  as `KEYCLOAK_CLIENT_SECRET`. Please do not commit this change to git.
- In the sidebar, navigate to *Manage* → *Users*, and click *Add user* on the top right.
- Give the user a name (e.g. `test`), leave the rest as-is.
- Under the tab *Credentials*, choose a new password, and make it
  non-temporary. Click *Set Password*.

We are going to automate this process. For now, you will have to repeat it
every time you reset your keycloak settings, which are stored inside the
PostgreSQL as well. Luckily, the script `api/tools/reset_database.py` does
*not* affect the state of the keycloak database, so this should be rather rare.

### Prepare database

@@ -118,18 +118,18 @@ takes a while, so check the logs of the docker container until you see:

> PostgreSQL init process complete; ready for start up.

If you don't wait long enough, the following commands might fail. In this case,
you can always stop the container, remove the data directory (`local/postgres`)
and restart the process.
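
A sketch of that recovery, using standard docker-compose commands and the data
directory mentioned above:

```bash
# watch the logs until the init message quoted above appears
docker-compose logs -f postgres

# if the commands below fail anyway: stop, wipe the data directory, retry
docker-compose stop postgres
rm -rf local/postgres
docker-compose up -d postgres
```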

Next, initialize an empty database, which applies the database schema for the
application:

```bash
docker-compose run --rm api tools/reset_database.py
```

To be able to serve dynamic vector tiles from the API, run the following
command once:

```bash
docker-compose run --rm api tools/prepare_sql_tiles.py
```

@@ -138,6 +138,8 @@ docker-compose run --rm api tools/prepare_sql_tiles.py

You might need to re-run this command after updates, to (re-)create the
functions in the SQL database that are used when generating vector tiles.

You should also import OpenStreetMap data now, see below for instructions.

### Boot the application

Now you can run the remaining parts of the application:

@@ -161,7 +163,10 @@ document the usage here.

You need to import road information from OpenStreetMap for the portal to work.
This information is stored in your PostgreSQL database and used when processing
tracks (instead of querying the Overpass API), as well as for vector tile
generation. The process applies to both development and production setups. For
development, you should choose a small area for testing, such as your local
county or city, to keep the amount of data small. For production use, you have
to import the whole region you are serving.

* Install `osm2pgsql`.
* Download the area(s) you would like to import from [GeoFabrik](https://download.geofabrik.de).

@@ -174,24 +179,25 @@ generation.
```

You might need to adjust the host, database and username (`-H`, `-d`, `-U`) to
your setup, and also provide the correct password when queried. For the
development setup the password is `obs`. For production, you might need to
expose the container's port and/or create a TCP tunnel, for example with SSH,
so that you can run the import from your local host and write to the remote
database.

The import process should take a few seconds to minutes, depending on the area
size. A whole country might even take one or more hours. You should probably
not try to import `planet.osm.pbf`.

You can run the process multiple times, with the same or different area files,
to import or update the data. You can also truncate the `road` table before
importing if you want to remove outdated road information.
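
A hedged sketch of both operations (the SSH host and user are placeholders,
and the connection parameters follow the development defaults):

```bash
# production: forward the remote postgres port to your local machine
ssh -N -L 5432:localhost:5432 user@portal.example.com

# optionally clear outdated road data before re-importing
psql -h localhost -d obs -U obs -c 'TRUNCATE road;'
```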

Refer to the documentation of `osm2pgsql` for assistance. We are using "flex
mode"; the provided script `api/roads_import.lua` describes the transformations
and extractions to perform on the original data.
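
For orientation, a flex-mode invocation generally looks like the sketch below;
the input file name and connection parameters are placeholders, and the exact
command for this project may differ:

```bash
osm2pgsql --create --output=flex --style=api/roads_import.lua \
  -H localhost -d obs -U obs your-area-latest.osm.pbf
```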

## Troubleshooting

If any step of the instructions does not work for you, please open an issue and
describe the problem you're having, as it is important to us that onboarding is


@@ -10,12 +10,14 @@ explicitly. Once we implement them, their usage will be described in the

* Shut down all services
* Obviously, now is a good time to perform a full backup ;)
* Update the codebase (`git pull`, `git submodule update`).
* Update docker-compose.yaml from the example. Leave the MongoDB service in
  place for now, but update all other service descriptions. You can remove
  `redis` already. Generate a better password than the default for your
  postgres user. Traefik rules have been simplified as all routes are handled
  by the portal service now.
* Start up the `mongo` and `postgres` services. Wait for postgres to finish
  initializing (see [README](README.md)).
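
  For example, with standard docker-compose commands (service names as used in
  the example compose file):

  ```bash
  docker-compose up -d mongo postgres
  docker-compose logs -f postgres   # wait until initialization has completed
  ```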
* Build the new image (e.g. with `docker-compose build portal`)
* Configure your API. The example config file is `api/config.py.example`, and
  it will need to be mounted to `api/config.py` in the container. Ignore the

@@ -26,13 +28,13 @@ explicitly. Once we implement them, their usage will be described in the
docker-compose run --rm portal python tools/reset_database.py
docker-compose run --rm portal python tools/prepare_sql_tiles.py
```
* Import OSM data (see [README](README.md)).
* Run the database migration script:

  ```bash
  docker-compose run --rm \
    -v $PWD/export:/export \
    portal \
    python tools/import_from_mongodb.py mongodb://mongo/obs \
    --keycloak-users-file /export/users.json
  ```

@@ -42,5 +44,6 @@ explicitly. Once we implement them, their usage will be described in the
  file to match your keycloak configuration. Import the file
  `export/users.json` into your realm; it will re-add all the users from the
  old installation. You should delete the file and the `export/` folder
  afterwards.
* Start `portal`.
* Consider configuring a worker service. See [deployment/README.md](deployment/README.md).


@@ -15,9 +15,9 @@ server's IP address. This documentation uses `portal.example.com` as an
example. The API is hosted at `https://portal.example.com/api`, while the main
frontend is reachable at the domain root.

## Setup instructions

### Clone the repository

First create a folder somewhere in your system; in the example we use
`/opt/openbikesensor`. Export it as `$ROOT` to refer to it more easily.

@@ -29,11 +29,13 @@ export ROOT=/opt/openbikesensor
mkdir -p $ROOT
cd $ROOT
git clone --recursive https://github.com/openbikesensor/portal source/
# If you accidentally cloned without --recursive, fix it by running:
# git submodule update --init --recursive
```

Unless otherwise mentioned, commands below assume your current working
directory to be `$ROOT`.

### Configure `traefik.toml`

@@ -43,8 +45,9 @@ cp source/deployment/examples/traefik.toml config/traefik.toml
vim config/traefik.toml
```

Configure your email in the `config/traefik.toml`. This email is used by
*Let's Encrypt* to send you some emails regarding your certificates.

### Configure `docker-compose.yaml`

@@ -53,21 +56,37 @@ cp source/deployment/examples/docker-compose.yaml docker-compose.yaml
vim docker-compose.yaml
```

* Change the domain where it occurs, such as in `Host()` rules.
* Generate a secure password for the PostgreSQL database user. You will need to
  configure this in the application later.
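
One way to generate such a password (any sufficiently long random string
works):

```bash
openssl rand -hex 24
```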

### Create a keycloak instance

Follow the [official guides](https://www.keycloak.org/documentation) to create
your own keycloak server. You can run keycloak in docker and include it in
your `docker-compose.yaml`, if you like.

Documenting the details of this is out of scope for our project. Please make
sure to configure:

* An admin account for yourself
* A realm for the portal
* A client in that realm with "Access Type" set to "confidential" and a
  redirect URL of this pattern: `https://portal.example.com/login/redirect`

### Prepare database

Follow the procedure outlined in [README.md](../README.md) under "Prepare
database".

### Import OpenStreetMap data

Follow the procedure outlined in [README.md](../README.md) under "Import
OpenStreetMap data".

### Configure portal

```bash
@@ -93,6 +112,32 @@ docker-compose build portal
docker-compose up -d portal
```
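
To verify that the service came up cleanly, you can follow its logs:

```bash
docker-compose logs -f portal
```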

## Running a dedicated worker

Extend your `docker-compose.yaml` with the following service:

```yaml
worker:
  image: openbikesensor-portal
  build:
    context: ./source
  volumes:
    - ./data/api-data:/data
    - ./config/config.py:/opt/obs/api/config.py
  restart: on-failure
  links:
    - postgres
  networks:
    - backend
  command:
    - python
    - tools/process_track.py
```

Change the `DEDICATED_WORKER` option in your config to `True` to stop
processing tracks in the portal container. Then restart the `portal` service
and start the `worker` service, for example as sketched below.
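
Assuming the service names from the snippet above:

```bash
docker-compose restart portal
docker-compose up -d worker
```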

## Miscellaneous

### Logs

@@ -119,4 +164,34 @@ can always roll back to a pre-update state.

To back up your instance's private data you only need to back up the `$ROOT`
folder. This should contain everything needed to start your instance again; no
persistent data lives in docker containers. You should stop the containers for
a clean backup.

This backup contains the imported OSM data as well. That is of course a lot of
redundant data, but very nice to have for a quick restore operation. If you
want to generate smaller, non-redundant backups, or backups during live
operation of the database, use a tool like `pg_dump` and extract only the
required tables:

* `overtaking_event`
* `track`
* `user` (make sure to reference `public.user`, not the postgres user table)
* `comment`

You might also instead use the `--exclude-table` option to ignore the `road`
table only (adjust connection parameters and names):

```bash
pg_dump -h localhost -d obs -U obs -n public -T road -f backup-`date +%F`.sql
```
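
To restore such a dump, a sketch (adjust connection parameters and the file
name; the affected tables may need to be emptied or dropped first, since the
dump includes their schema):

```bash
psql -h localhost -d obs -U obs -f backup-2021-11-29.sql
```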

Also back up the raw uploaded files, i.e. the `local/api-data/tracks`
directory. The processed data can be regenerated, but you can also back that
up, from `local/api-data/processing-output`.
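
A quick sketch using `tar` (paths as above; the archive name is arbitrary):

```bash
tar czf tracks-$(date +%F).tar.gz \
  local/api-data/tracks local/api-data/processing-output
```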

Finally, make sure to create a backup of your keycloak instance. Refer to the
keycloak documentation for how to export its data in a restorable way. This
should work very well if you are storing keycloak data in the PostgreSQL and
exporting that with an exclusion pattern instead of an explicit list.

And then, please test your backup and restore strategy before going live, or at
least before you need it!