Improve readmes

Paul Bienkowski 2021-11-29 00:26:54 +01:00
parent 0e5600b38d
commit 63adbc58cd
3 changed files with 142 additions and 58 deletions


@@ -80,30 +80,30 @@ Then clone the repository as described above.
Login will not be possible until you configure the keycloak realm correctly. Boot your keycloak instance:
```bash
docker-compose up -d keycloak
```
Now navigate to http://localhost:3003/ and follow these steps:
* Click "Administration Console" and log in with `admin` / `admin`
* Hover over the realm name on the top left and click "Add realm"
* Name the Realm `obs-dev` (spelling matters) and create it
* In the sidebar, navigate to Configure -> Clients, and click "Create" on the top right
* Client ID is `portal`. Hit "Save".
* In the Tab "Settings", edit the new client's "Access Type" to `confidential`
and enter as "Valid Redirect URIs": `http://localhost:3000/login/redirect`,
then "Save"
* Under "Credentials", copy the "Secret" and paste it into `api/config.dev.py`
- Click *Administration Console* and log in with `admin` / `admin`.
- Hover over the realm name on the top left and click *Add realm*.
- Name the Realm `obs-dev` (spelling matters) and create it.
- In the sidebar, navigate to *Configure* → *Clients*, and click *Create* on the top right.
- *Client ID* should be `portal`. Click *Save*.
- In the Tab *Settings*, edit the new client's *Access Type* to *confidential*
and enter as *Valid Redirect URIs*: `http://localhost:3000/login/redirect`,
then *Save*
- Under *Credentials*, copy the *Secret* and paste it into `api/config.dev.py`
as `KEYCLOAK_CLIENT_SECRET`. Please do not commit this change to git.
- In the sidebar, navigate to *Manage* → *Users*, and click *Add user* on the top right.
- Give the user a name (e.g. `test`), leave the rest as-is.
- Under the tab *Credentials*, choose a new password, and make it
non-temporary. Click *Set Password*.
We are going to automate this process. For now, you will have to repeat it
every time you reset your keycloak settings, which are stored inside the
PostgreSQL database as well. Luckily, the script `api/tools/reset_database.py`
does *not* affect the state of the keycloak database, so this should be rather rare.
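Until that automation lands, the same steps can be scripted with Keycloak's
bundled admin CLI. The following is an untested sketch — `kcadm.sh` lives in
the keycloak container's `bin/` directory, the `/auth` path and flags vary
between Keycloak versions, and you still have to copy the generated client
secret from the *Credentials* tab afterwards:

```bash
# Log in to the admin API (dev credentials from the compose file: admin/admin)
kcadm.sh config credentials --server http://localhost:3003/auth \
  --realm master --user admin --password admin

# Create the realm and the confidential "portal" client
kcadm.sh create realms -s realm=obs-dev -s enabled=true
kcadm.sh create clients -r obs-dev -s clientId=portal \
  -s publicClient=false \
  -s 'redirectUris=["http://localhost:3000/login/redirect"]'

# Create a test user with a permanent password
kcadm.sh create users -r obs-dev -s username=test -s enabled=true
kcadm.sh set-password -r obs-dev --username test --new-password test
```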
### Prepare database
@@ -118,18 +118,18 @@ takes a while, so check the logs of the docker container until you see:
> PostgreSQL init process complete; ready for start up.
If you don't wait long enough, the following commands might fail. In this case,
you can always stop the container, remove the data directory (`local/postgres`)
and restart the process.
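To watch the container logs for that line, something like this should work
(assuming the service is called `postgres` in your `docker-compose.yaml`):

```bash
docker-compose logs --tail=20 -f postgres
```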
Next, initialize an empty database, which applies the database schema for the
application:
```bash
docker-compose run --rm api tools/reset_database.py
```
To be able to serve dynamic vector tiles from the API, run the following command once:
```bash
docker-compose run --rm api tools/prepare_sql_tiles.py
```
@@ -138,6 +138,8 @@ docker-compose run --rm api tools/prepare_sql_tiles.py
You might need to re-run this command after updates, to (re-)create the
functions in the SQL database that are used when generating vector tiles.
You should also import OpenStreetMap data now, see below for instructions.
### Boot the application
Now you can run the remaining parts of the application:
@@ -161,7 +163,10 @@ document the usage here.
You need to import road information from OpenStreetMap for the portal to work.
This information is stored in your PostgreSQL database and used when processing
tracks (instead of querying the Overpass API), as well as for vector tile
generation. The process applies to both development and production setups. For
development, you should choose a small area for testing, such as your local
county or city, to keep the amount of data small. For production use you have
to import the whole region you are serving.
* Install `osm2pgsql`.
* Download the area(s) you would like to import from [GeoFabrik](https://download.geofabrik.de).
@@ -174,24 +179,25 @@ generation.
You might need to adjust the host, database and username (`-H`, `-d`, `-U`) to
your setup, and also provide the correct password when queried. For the
development setup the password is `obs`. For production, you might need to
expose the container's port and/or create a TCP tunnel, for example with SSH,
such that you can run the import from your local host and write to the remote
database.
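As a sketch of such a tunnel — the hostname is an example, and it assumes the
postgres port is published on the server's loopback interface — you could
forward the remote database port to your machine and then point `-H localhost`
at the tunnel:

```bash
ssh -L 5432:localhost:5432 user@portal.example.com
```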
The import process should take a few seconds to minutes, depending on the area
size. A whole country might even take one or more hours. You should probably
not try to import `planet.osm.pbf`.
You can run the process multiple times, with the same or different area files,
to import or update the data. You can also truncate the `road` table before
importing if you want to remove outdated road information.
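Truncating the table can be done with a one-off `psql` call, for example
(connection parameters as above, adjust to your setup):

```bash
psql -h localhost -d obs -U obs -c 'TRUNCATE road;'
```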
Refer to the documentation of `osm2pgsql` for assistance. We are using "flex
mode"; the provided script `api/roads_import.lua` describes the transformations
and extractions to perform on the original data.
## Troubleshooting
If any step of the instructions does not work for you, please open an issue and
describe the problem you're having, as it is important to us that onboarding is


@@ -10,12 +10,14 @@ explicitly. Once we implement them, their usage will be described in the
* Shut down all services
* Obviously, now is a good time to perform a full backup ;)
* Update the codebase (`git pull`, `git submodule update`).
* Update docker-compose.yaml from the example. Leave the MongoDB service in
place for now, but update all other service descriptions. You can remove
`redis` already. Generate a better password than the default for your
postgres user. Traefik rules have been simplified as all routes are handled
by the portal service now.
* Start up the `mongo` and `postgres` services. Wait for postgres to finish
initializing (see [README](README.md)).
* Build the new image (e.g. with `docker-compose build portal`)
* Configure your API. The example config file is `api/config.py.example`, and
it will need to be mounted to `api/config.py` in the container. Ignore the
@@ -26,13 +28,13 @@ explicitly. Once we implement them, their usage will be described in the
```bash
docker-compose run --rm portal python tools/reset_database.py
docker-compose run --rm portal python tools/prepare_sql_tiles.py
```
* Import OSM data (see [README](README.md)).
* Run the database migration script:
```bash
docker-compose run --rm \
-v $PWD/export:/export \
portal \
python tools/import_from_mongodb.py mongodb://mongo/obs \
--keycloak-users-file /export/users.json
```
@@ -42,5 +44,6 @@ explicitly. Once we implement them, their usage will be described in the
file to match your keycloak configuration. Import the file
`export/users.json` into your realm; it will re-add all the users from the
old installation (a command-line sketch for this import follows below). You
should delete the file and the `export/` folder afterwards.
* Start `portal`.
* Consider configuring a worker service. See [deployment/README.md](deployment/README.md).
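For the `export/users.json` import mentioned above, Keycloak's admin CLI also
offers a partial import, in case you prefer the command line over the admin
console. A rough sketch — realm name and file location depend on your setup,
and you need to be logged in via `kcadm.sh config credentials` first:

```bash
kcadm.sh create partialImport -r your-realm \
  -s ifResourceExists=SKIP -o -f export/users.json
```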


@@ -15,9 +15,9 @@ server's IP address. This documentation uses `portal.example.com` as an
example. The API is hosted at `https://portal.example.com/api`, while the main
frontend is reachable at the domain root.
## Setup instructions
### Clone the repository
First create a folder somewhere in your system. In the example we use
`/opt/openbikesensor`, and export it as `$ROOT` to refer to it more easily.
@@ -29,11 +29,13 @@ export ROOT=/opt/openbikesensor
```bash
mkdir -p $ROOT
cd $ROOT
git clone --recursive https://github.com/openbikesensor/portal source/
# If you accidentally cloned without --recursive, fix it by running:
# git submodule update --init --recursive
```
Unless otherwise mentioned, commands below assume your current working
directory to be `$ROOT`.
### Configure `traefik.toml`
@@ -43,8 +45,9 @@ cp source/deployment/examples/traefik.toml config/traefik.toml
```bash
vim config/traefik.toml
```
Configure your email in the `config/traefik.toml`. This email is used by
*Let's Encrypt* to send you some emails regarding your certificates.
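The exact location of the email setting depends on the Traefik version used by
the example file; it belongs to Traefik's ACME configuration and looks roughly
like this (the section name is an assumption — keep whatever section your copy
of the file already contains):

```toml
# [acme] in Traefik v1, [certificatesResolvers.<name>.acme] in Traefik v2
email = "you@example.com"
```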
### Configure `docker-compose.yaml`
@@ -53,21 +56,37 @@ cp source/deployment/examples/docker-compose.yaml docker-compose.yaml
```bash
vim docker-compose.yaml
```
* Change the domain where it occurs, such as in `Host()` rules.
* Generate a secure password for the PostgreSQL database user. You will need to
configure this in the application later.
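For illustration, a Traefik v2 router rule in a service's labels might look
like the following — the router name and label layout are assumptions, so
match them to your copy of the example file:

```yaml
    labels:
      - traefik.http.routers.portal.rule=Host(`portal.example.com`)
```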
### Create a keycloak instance
Follow the [official guides](https://www.keycloak.org/documentation) to create
your own keycloak server. You can run keycloak in Docker and include it in
your `docker-compose.yaml`, if you like.
Documenting the details of this is out of scope for our project. Please make sure to configure:
* An admin account for yourself
* A realm for the portal
* A client in that realm with "Access Type" set to "confidential" and a
redirect URL of this pattern: `https://portal.example.com/login/redirect`
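If you do include keycloak in your `docker-compose.yaml`, a starting point
could look like the sketch below. Image tag and variable names differ between
Keycloak releases (this uses the pre-Quarkus image that was current around
this release), so check the image's own documentation:

```yaml
  keycloak:
    image: jboss/keycloak:15.0.2
    environment:
      KEYCLOAK_USER: admin
      KEYCLOAK_PASSWORD: choose-a-strong-password
      DB_VENDOR: postgres
      DB_ADDR: postgres
      DB_USER: obs
      DB_PASSWORD: your-postgres-password
      PROXY_ADDRESS_FORWARDING: "true"
    networks:
      - backend
```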
### Prepare database
Follow the procedure outlined in [README.md](../README.md) under "Prepare
database".
### Import OpenStreetMap data
Follow the procedure outlined in [README.md](../README.md) under "Import OpenStreetMap data".
### Configure portal
@@ -93,6 +112,32 @@ docker-compose build portal

```bash
docker-compose up -d portal
```
## Running a dedicated worker
Extend your `docker-compose.yaml` with the following service:
```yaml
  worker:
    image: openbikesensor-portal
    build:
      context: ./source
    volumes:
      - ./data/api-data:/data
      - ./config/config.py:/opt/obs/api/config.py
    restart: on-failure
    links:
      - postgres
    networks:
      - backend
    command:
      - python
      - tools/process_track.py
```
Change the `DEDICATED_WORKER` option in your config to `True` to stop
processing tracks in the portal container. Then restart the `portal` service
and start the `worker` service.
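The config change itself is a single line in your mounted `config.py`:

```python
DEDICATED_WORKER = True
```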
## Miscellaneous
### Logs
@@ -119,4 +164,34 @@ can always roll back to a pre-update state.
To back up your instance's private data you only need to back up the `$ROOT`
folder. This should contain everything needed to start your instance again; no
persistent data lives in docker containers. You should stop the containers for
a clean backup.
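A minimal full-backup sketch along those lines (the target path is an
example):

```bash
docker-compose stop
tar czf /backups/obs-$(date +%F).tar.gz -C $ROOT .
docker-compose start
```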
This backup contains the imported OSM data as well. That is of course a lot of
redundant data, but very nice to have for a quick restore operation. If you
want to generate smaller, non-redundant backups, or backups during live
operation of the database, use a tool like `pg_dump` and extract only the
required tables:
* `overtaking_event`
* `track`
* `user` (make sure to reference `public.user`, not the postgres user table)
* `comment`
You might also instead use the `--exclude-table` option to ignore the `road`
table only (adjust connection parameters and names):
```bash
pg_dump -h localhost -d obs -U obs -n public -T road -f backup-`date +%F`.sql
```
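Restoring such a plain-SQL dump later works with `psql`, for example like this
— you may need to re-create the schema first (e.g. with `reset_database.py`):

```bash
psql -h localhost -d obs -U obs -f backup-2021-11-29.sql
```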
Also back up the raw uploaded files, i.e. the `local/api-data/tracks`
directory. The processed data can be regenerated, but you can also back that
up, from `local/api-data/processing-output`.
Finally, make sure to create a backup of your keycloak instance. Refer to the
keycloak documentation for how to export its data in a restorable way. This
should work very well if you are storing keycloak data in the PostgreSQL and
exporting that with an exclusion pattern instead of an explicit list.
And then, please test your backup and restore strategy before going live, or at
least before you need it!