Move deployment readme to docs/

parent c4cc4a9078
commit 761908a987

README.md | 25
README.md

@@ -36,10 +36,11 @@ git submodule update --init --recursive

 ## Production setup

-There is a guide for a deployment based on docker in the
-[deployment](deployment) folder. Lots of non-docker deployment strategy are
-possible, but they are not "officially" supported, so please do not expect the
-authors of the software to assist in troubleshooting.
+There is a guide for a deployment based on docker at
+[docs/production-deployment.md](docs/production-deployment.md). Lots of
+non-docker deployment strategies are possible, but they are not "officially"
+supported, so please do not expect the authors of the software to assist in
+troubleshooting.

 This is a rather complex application, and it is expected that you know the
 basics of deploying a modern web application securely onto a production server.
@@ -52,7 +53,8 @@ Please note that you will always need to install your own reverse proxy that
 terminates TLS for you and handles certificates. We do not support TLS directly
 in the application, instead, please use this prefered method.

-Upgrading and migrating is descrube
+Upgrading and migrating is described in [UPGRADING.md](./UPGRADING.md) for each
+version.

 ### Migrating (Production)
@@ -75,18 +77,6 @@ docker-compose run --rm api alembic upgrade head
 docker-compose run --rm api tools/prepare_sql_tiles
 ```

-
-docker-compose run --rm api alembic upgrade head
-
-### Upgrading from v0.2 to v0.3
-
-After v0.2 we switched the underlying technology of the API and the database.
-We now have no more MongoDB, instead, everything has moved to the PostgreSQL
-installation. For development setups, it is advised to just reset the whole
-state (remove the `local` folder) and start fresh. For production upgrades,
-please follow the relevant section in [`UPGRADING.md`](./UPGRADING.md).
-
-
 ## Development setup

 We've moved the whole development setup into Docker to make it easy for
@@ -101,7 +91,6 @@ Then clone the repository as described above.

 ### Configure Keycloak

-
 Login will not be possible until you configure the keycloak realm correctly. Boot your keycloak instance:

 ```bash
UPGRADING.md

@@ -3,7 +3,7 @@
 This document describes the general steps to upgrade between major changes.
 Simple migrations, e.g. for adding schema changes, are not documented
 explicitly. Their general usage is described in the [README](./README.md) (for
-development) and [deployment/README.md](deployment/README.md) (for production).
+development) and [docs/production-deployment.md](docs/production-deployment.md) (for production).

 ## 0.7.0
@@ -57,7 +57,7 @@ You can, but do not have to, reimport all tracks. This will generate a GPX file
 for each track and allow the users to download those. If a GPX file has not yet
 been created, the download will fail. To reimport all tracks, log in to your
 PostgreSQL database (instructions are in [README.md](./README.md) for
-development and [deployment/README.md](./deployment/README.md) for production)
+development and [docs/production-deployment.md](./docs/production-deployment.md) for production)
 and run:

 ```sql
@@ -77,7 +77,7 @@ Make sure your worker is running to process the queue.
   `POSTGRES_MAX_OVERFLOW`. Check the example config for sane default values.
 * Re-run `tools/prepare_sql_tiles.py` again (see README)
 * It has been made easier to import OSM data, check
-  [deployment/README.md](deployment/README.md) for the sections "Download
+  [docs/production-deployment.md](./docs/production-deployment.md) for the sections "Download
   OpenStreetMap maps" and "Import OpenStreetMap data". You can now download
   multiple .pbf files and then import them at once, using the docker image
   built with the `Dockerfile`. Alternatively, you can choose to enable [lean
@@ -132,5 +132,5 @@ Make sure your worker is running to process the queue.
   `export/users.json` into your realm, it will re-add all the users from the
   old installation. You should delete the file and `export/` folder afterwards.
 * Start `portal`.
-* Consider configuring a worker service. See [deployment/README.md](deployment/README.md).
+* Consider configuring a worker service. See [docs/production-deployment.md](./docs/production-deployment.md).
(example .env file, deleted)

@@ -1,49 +0,0 @@
-###################################################
-# Keycloak
-###################################################
-
-OBS_KEYCLOAK_URI=login.example.com
-
-# Postgres
-
-OBS_KEYCLOAK_POSTGRES_USER=obs
-OBS_KEYCLOAK_POSTGRES_PASSWORD=<<TODO>>
-OBS_KEYCLOAK_POSTGRES_DB=obs
-OBS_POSTGRES_MAX_OVERFLOW=20
-OBS_POSTGRES_POOL_SIZE=40
-
-# KeyCloak
-
-OBS_KEYCLOAK_POSTGRES_HOST=postgres-keycloak
-OBS_KEYCLOAK_ADMIN_USER=admin
-OBS_KEYCLOAK_ADMIN_PASSWORD=<<TODO>>
-OBS_KEYCLOAK_REALM=obs
-OBS_KEYCLOAK_PORTAL_REDIRECT_URI=https://portal.example.com/*
-
-###################################################
-# Portal
-###################################################
-
-OBS_PORTAL_URI=portal.example.com
-
-# Postgres + osm2pgsql
-
-OBS_POSTGRES_HOST=postgres
-OBS_POSTGRES_USER=obs
-OBS_POSTGRES_PASSWORD=<<TODO>>
-OBS_POSTGRES_DB=obs
-
-# Portal
-
-OBS_HOST=0.0.0.0
-OBS_PORT=3000
-OBS_SECRET=<<TODO>>
-OBS_POSTGRES_URL=postgresql+asyncpg://obs:<<TODO>>@postgres/obs
-OBS_KEYCLOAK_URL=https://login.example.com/auth/realms/obs/
-OBS_KEYCLOAK_CLIENT_ID=portal
-OBS_KEYCLOAK_CLIENT_SECRET=<<TODO>>
-OBS_DEDICATED_WORKER="True"
-OBS_DATA_DIR=/data
-OBS_PROXIES_COUNT=1
-
-###################################################
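The example file above leaves secrets as `<<TODO>>` placeholders. A small sanity check, as a sketch (the `check_env_placeholders` name is made up; the placeholder marker follows the file above), can catch any that were never filled in before the stack is started:

```shell
# check_env_placeholders FILE
# Returns 1 and lists the offending lines if any <<TODO>>
# placeholder is still present in FILE; returns 0 otherwise.
check_env_placeholders() {
  if grep -q '<<TODO>>' "$1"; then
    echo "unfilled placeholders in $1:" >&2
    grep -n '<<TODO>>' "$1" >&2
    return 1
  fi
}
```

For example, `check_env_placeholders .env && docker-compose up -d` refuses to boot the stack while placeholders remain.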
docs/production-deployment.md (moved from deployment/README.md)

@@ -55,12 +55,7 @@ git clone --recursive https://github.com/openbikesensor/portal source/
 ```bash
 mkdir -p /opt/openbikesensor/config
 cd /opt/openbikesensor/

-cp source/deployment/examples/docker-compose.yaml docker-compose.yaml
-cp source/deployment/examples/.env .env
-
-cp source/deployment/examples/traefik.toml config/traefik.toml
-cp source/deployment/examples/config.py config/config.py
+cp -r source/deployment/config source/deployment/docker-compose.yaml source/deployment/.env .
 ```

 ### Create a Docker network
@@ -224,18 +219,6 @@ docker-compose build portal

 *Hint*: This may take up to 10 minutes. In the future, we will provide a prebuild image.

-#### Download OpenStreetMap maps
-
-Download the area(s) you would like to import from
-[GeoFabrik](https://download.geofabrik.de) into `data/pbf`, for example:
-
-```bash
-cd /opt/openbikesensor/
-wget https://download.geofabrik.de/europe/germany/schleswig-holstein-latest.osm.pbf -P data/pbf
-```
-
-*Hint*: Start with a small region/city, since the import can take some hours for huge areas.
-
 #### Prepare database

 Run the following scripts to prepare the database:
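The removed section above builds GeoFabrik download URLs by hand for each region. That pattern can be captured in a tiny helper, as a sketch (the `geofabrik_url` name is made up; the URL scheme follows the wget example above):

```shell
# geofabrik_url REGION
# Builds the download URL for a region's latest .osm.pbf extract,
# matching the wget example in the removed section.
geofabrik_url() {
  echo "https://download.geofabrik.de/$1-latest.osm.pbf"
}

# e.g. list the URLs for several regions before a combined import
for region in europe/germany/schleswig-holstein; do
  echo "would download: $(geofabrik_url "$region")"
done
```

Feeding each URL to `wget -P data/pbf` reproduces the manual steps for any number of regions.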
@@ -248,13 +231,7 @@ For more details, see [README.md](../README.md) under "Prepare database".

 #### Import OpenStreetMap data

-Run the following script, to import the OSM data:
-
-```
-docker-compose run --rm portal tools/osm2pgsql.sh
-```
-
-For more details. see [README.md](../README.md) under "Import OpenStreetMap data".
+Follow [these instructions](./osm-import.md).


 #### Configure portal