Compare commits

No commits in common. "main" and "feature/enable-dependabot-for-update-prs" have entirely different histories.

main ... feature/enable-dependabot-for-update-prs

@@ -1,20 +0,0 @@
-name: Build docker image
-on: [push]
-
-jobs:
-  build-image:
-    runs-on: ubuntu-latest
-    container:
-      image: catthehacker/ubuntu:act-latest
-    steps:
-      - name: Login to Forgejo docker registry
-        uses: docker/login-action@v3.0.0
-        with:
-          registry: git.pub.solar
-          username: hakkonaut
-          password: ${{ secrets.GIT_AUTH_TOKEN }}
-      - name: Build and push
-        uses: docker/build-push-action@v5.1.0
-        with:
-          push: true
-          tags: git.pub.solar/pub-solar/obs-portal:latest

.github/dependabot.yml (vendored, new file, 23 lines)

@@ -0,0 +1,23 @@
+# see also https://docs.github.com/en/code-security/supply-chain-security/keeping-your-dependencies-updated-automatically
+
+version: 2
+updates:
+  - package-ecosystem: "docker"
+    directory: "/"
+    schedule:
+      interval: "daily"
+
+  - package-ecosystem: "gitsubmodule"
+    directory: "/"
+    schedule:
+      interval: "daily"
+
+  - package-ecosystem: "npm"
+    directory: "/frontend"
+    schedule:
+      interval: "daily"
+
+  - package-ecosystem: "pip"
+    directory: "/api"
+    schedule:
+      interval: "daily"

CHANGELOG.md (167 changed lines)

@@ -1,172 +1,5 @@
 # Changelog
 
-## 0.8.1
-
-### Improvements
-
-* The zone (urban/rural) is now also exported with the events GeoJson export.
-
-### Bug Fixes
-
-* Update to a current version of gpstime (python dependency) fixes portal startup.
-
-## 0.8.0
-
-### Features
-
-* Bulk actions on users' owned tracks (reprocess, download, make private, make public, delete) (#269, #38)
-* Easy sorting by device for "multi-device users" (e.g. group lending out OBSes)
-* Region display at higher zoom levels to easily find interesting areas (#112)
-* Export of road statistics on top of the already-existing event statistics (#341)
-
-### Improvements
-
-* Refactored database access to hopefully combat portal crashes (#337)
-* New infrastructure for map imports that makes import of larger maps possible on small VMs (#334)
-* Reference current postgres and postgis versions in docker-compose.yaml files (#286)
-* Configurable terms-and-conditions link (#320)
-* French translation by @cbiteau (#303)
-
-### Bug Fixes
-
-* Logout not working (#285)
-* Duplicate road usage hashes (#335, #253)
-* cannot import name .... (#338)
-
-## 0.7.0
-
-### Features
-
-* Add histogram of overtaking distances in road details panel
-* Flip table in road details panel and make it easier to read
-* Implement difference between urban and rural for events and road segments.
-* Better road zone detection in import
-* Make the frontend translatable and add German translation
-* Add time and user filters to map view (for logged-in users only)
-
-### Improvements
-
-* Make raw track not look like a river (#252)
-* Update many dependencies
-
-### Bug fixes
-
-* Overtaking events are now deleted when the parent track is deleted (#206)
-* Remove useless session creation (#192)
-* Remove some error logs for canceled requests (as the map page tends to do that quite a lot)
-* Fix ExportPage bounding box input
-
-
-## 0.6.2
-
-### Improvements
-
-* Prevent directory traversals inside container on python-served frontend.
-
-## 0.6.1
-
-### Improvements
-
-* Make road details request (clicking on a road segment in the map) way faster
-  by using PostGIS geometry index correctly (#226).
-
-## 0.6.0
-
-Starting in this version, the database schema is created through migrations
-instead of using the `reset_database.py` script. This means that for both the
-initial setup, as well as for upgrades, only the migrations have to be run.
-
-After updating and migrating, it is good practice to regenerate the SQL tile
-functions (`api/tools/prepare_sql_tiles.py`) as well. It doesn't matter if you
-do this when it is not required, so we've written a simple all-in-one update
-script that you can run to do all upgrade tasks. This is now in
-`api/tools/upgrade.py`.
-
-Please check [`UPGRADING.md`](./UPGRADING.md) for more details if you're
-upgrading an existing installation. It contains an important note for this
-upgrade in particular.
-
-## 0.5.1
-
-Maintenance release, only includes build, deployment and documentation changes.
-
-## 0.5.0
-
-### Features
-
-* Use discrete colors for distances, with greens only above 1.5m
-* Use viridis colormap for roads' count layers
-* Generate usage count information (how often has a road been traveled)
-* Project the whole track to the map, and show both versions
-* Log out of OpenID server when logging out of application
-* Convert speed units to km/h in frontend
-* Pages now have titles (#148)
-* Remove map from home page, it was empty anyway (#120)
-
-### Internal
-
-* Add alembic setup for migrating
-* Build osm2pgsql with -j4
-* Update sqlalchemy[asyncio] requirement from ~=1.4.31 to ~=1.4.32 in /api
-
-## 0.4.2
-
-### Features
-
-### Bugfixes
-
-* Fix export route, it should be a child of /api
-
-## 0.4.1
-
-### Features
-
-* Add page for exporting data through web frontend
-* Generate GPX track file when importing a track
-* Add GPX track export button on the track page (accessible for anybody who can
-  see the track)
-
-## 0.4.0
-
-### Improvements
-
-* Retry OpenID Connect connection if it fails on boot
-* Format log outputs with color and improve access log
-* Make pool_size and overflow configurable for worker and portal
-* Add a route for exporting events as GeoJSON/Shapefile
-* Point footer to forum, not slack (fixes #140)
-* Improve wording on profile page ("My" instead of "Your")
-* Show "My tracks" directly in main menu (fixes #136)
-
-### Bugfixes
-
-* Make sure the API can recover from the broken postgresql connection state
-* Remove duplicate events from the same track
-* Fix direction of road segments (fixes #142)
-* Solve a few problems with the colormap scales in the map view
-
-### Docs & deployment
-
-* Greatly improve deployment docs for a simple follow-along routine
-* Use environment variables (`OBS_*`) for configuration
-* Fix port numbers in example files and expose 3000 in the image
-* Add `LEAN_MODE` configuration to disable `road` database table usage and fall
-  back to Overpass API for processing tracks (see
-  [docs/lean-mode.md](docs/lean-mode.md)).
-* Read `config.overrides.py` file if it exists
-* Add osm2pgsql to portal image to be able to import OSM data from within the
-  container
-* Fix path to roads_import.lua in docs
-* Explain to use the portal service, instead of api, in production
-* Use entrypoint instead of command, so you can run process_track.py one-off tasks
-
-### Internals
-
-* Use custom `get_single_arg` everywhere, remove sanicargs (fixes #193)
-* Update requirements and make them consistent
-* Fix error handling, especially for file uploads
-
-
 ## 0.3.4
 
 ### Features

Dockerfile (23 changed lines)

@@ -4,7 +4,7 @@
 # Build the frontend AS builder
 #############################################
 
-FROM node:18 as frontend-builder
+FROM node:17 as frontend-builder
 
 WORKDIR /opt/obs/frontend
 ADD frontend/package.json frontend/package-lock.json /opt/obs/frontend/

@@ -21,21 +21,7 @@ RUN npm run build
 # Build the API and add the built frontend to it
 #############################################
 
-FROM python:3.11.3-bullseye
-
-RUN apt-get update &&\
-    apt-get install -y \
-        libboost-dev \
-        libboost-system-dev \
-        libboost-filesystem-dev \
-        libexpat1-dev \
-        zlib1g-dev \
-        libbz2-dev \
-        libpq-dev \
-        libproj-dev \
-        lua5.3 \
-        liblua5.3-dev &&\
-    rm -rf /var/lib/apt/lists/*
+FROM python:3.9.7-bullseye
 
 WORKDIR /opt/obs/api
 

@@ -48,14 +34,13 @@ ADD api/scripts /opt/obs/scripts
 RUN pip install -e /opt/obs/scripts
 
 ADD api/setup.py /opt/obs/api/
-ADD api/alembic.ini /opt/obs/api/
-ADD api/migrations /opt/obs/api/migrations/
 ADD api/obs /opt/obs/api/obs/
 ADD api/tools /opt/obs/api/tools/
 RUN pip install -e /opt/obs/api/
 
 COPY --from=frontend-builder /opt/obs/frontend/build /opt/obs/frontend/build
 
-EXPOSE 3000
+EXPOSE 8000
 
 CMD ["openbikesensor-api"]

README.md (126 changed lines)

@@ -36,11 +36,10 @@ git submodule update --init --recursive
 
 ## Production setup
 
-There is a guide for a deployment based on docker at
-[docs/production-deployment.md](docs/production-deployment.md). Lots of
-non-docker deployment strategies are possible, but they are not "officially"
-supported, so please do not expect the authors of the software to assist in
-troubleshooting.
+There is a guide for a deployment based on docker in the
+[deployment](deployment) folder. Lots of non-docker deployment strategies are
+possible, but they are not "officially" supported, so please do not expect the
+authors of the software to assist in troubleshooting.
 
 This is a rather complex application, and it is expected that you know the
 basics of deploying a modern web application securely onto a production server.

@@ -53,29 +52,19 @@ Please note that you will always need to install your own reverse proxy that
 terminates TLS for you and handles certificates. We do not support TLS directly
 in the application; instead, please use this preferred method.
 
-Upgrading and migrating is described in [UPGRADING.md](./UPGRADING.md) for each
-version.
-
-### Migrating (Production)
-
-Migrations are done with
-[Alembic](https://alembic.sqlalchemy.org/en/latest/index.html), please refer to
-its documentation for help. Most of the time, running this command will do all
-the migrations you need:
-
-```bash
-docker-compose run --rm api tools/upgrade.py
-```
-
-This command is equivalent to running migrations through *alembic*, then
-regenerating the SQL functions that compute vector tiles directly in the
-database:
-
-```bash
-# equivalent to the above command, you don't usually run these
-docker-compose run --rm api alembic upgrade head
-docker-compose run --rm api tools/prepare_sql_tiles
-```
+Migrations are not implemented yet. Once we need them, we'll add them and
+document the usage here.
+
+### Upgrading from v0.2 to v0.3
+
+After v0.2 we switched the underlying technology of the API and the database.
+We now have no more MongoDB; instead, everything has moved to the PostgreSQL
+installation. For development setups, it is advised to just reset the whole
+state (remove the `local` folder) and start fresh. For production upgrades,
+please follow the relevant section in [`UPGRADING.md`](./UPGRADING.md).
 
 ## Development setup

@@ -91,6 +80,7 @@ Then clone the repository as described above.
 
 ### Configure Keycloak
 
+
 Login will not be possible until you configure the keycloak realm correctly. Boot your keycloak instance:
 
 ```bash

@@ -107,15 +97,8 @@ Now navigate to http://localhost:3003/ and follow these steps:
 - In the Tab *Settings*, edit the new client's *Access Type* to *confidential*
   and enter as *Valid Redirect URIs*: `http://localhost:3000/login/redirect`,
   then *Save*
-- Under *Credentials*, copy the *Secret*. Create a file at `api/config.overrides.py` with the secret in it:
-
-  ```python
-  KEYCLOAK_CLIENT_SECRET="your secret here"
-  ```
-
-  You can use this file in development mode to change settings without editing
-  the git-controlled default file at `api/config.dev.py`. Options in this file
-  take precedence.
+- Under *Credentials*, copy the *Secret* and paste it into `api/config.dev.py`
+  as `KEYCLOAK_CLIENT_SECRET`. Please do not commit this change to git.
 - In the sidebar, navigate to *Manage* → *Users*, and click *Add user* on the top right.
 - Give the user a name (e.g. `test`), leave the rest as-is.
 - Under the tab *Credentials*, choose a new password, and make it

@@ -143,17 +126,23 @@ If you don't wait long enough, the following commands might fail. In this case,
 you can always stop the container, remove the data directory (`local/postgres`)
 and restart the process.
 
-Next, run the upgrade command to generate the database schema:
-
-```bash
-docker-compose run --rm api tools/upgrade.py
-```
-
-You will need to re-run this command after updates, to migrate the database and
-(re-)create the functions in the SQL database that are used when generating
-vector tiles.
-
-You should also [import OpenStreetMap data](docs/osm-import.md) now.
+Next, initialize an empty database, which applies the database schema for the
+application:
+
+```bash
+docker-compose run --rm api tools/reset_database.py
+```
+
+To be able to serve dynamic vector tiles from the API, run the following command once:
+
+```bash
+docker-compose run --rm api tools/prepare_sql_tiles.py
+```
+
+You might need to re-run this command after updates, to (re-)create the
+functions in the SQL database that are used when generating vector tiles.
+
+You should also import OpenStreetMap data now, see below for instructions.
 
 ### Boot the application
 

@@ -169,16 +158,50 @@ testing.
 
 ### Migrating (Development)
 
-Migrations are done with
-[Alembic](https://alembic.sqlalchemy.org/en/latest/index.html), please refer to
-its documentation for help. Most of the time, running this command will do all
-the migrations you need:
-
-```bash
-docker-compose run --rm api alembic upgrade head
-```
+Migrations are not implemented yet. Once we need them, we'll add them and
+document the usage here.
 
 
+## Import OpenStreetMap data
+
+You need to import road information from OpenStreetMap for the portal to work.
+This information is stored in your PostgreSQL database and used when processing
+tracks (instead of querying the Overpass API), as well as for vector tile
+generation. The process applies to both development and production setups. For
+development, you should choose a small area for testing, such as your local
+county or city, to keep the amount of data small. For production use you have
+to import the whole region you are serving.
+
+* Install `osm2pgsql`.
+* Download the area(s) you would like to import from [GeoFabrik](https://download.geofabrik.de).
+* Import each file like this:
+
+  ```bash
+  osm2pgsql --create --hstore --style api/roads_import.lua -O flex \
+    -H localhost -d obs -U obs \
+    path/to/downloaded/myarea-latest.osm.pbf
+  ```
+
+You might need to adjust the host, database and username (`-H`, `-d`, `-U`) to
+your setup, and also provide the correct password when queried. For the
+development setup the password is `obs`. For production, you might need to
+expose the container's port and/or create a TCP tunnel, for example with SSH,
+such that you can run the import from your local host and write to the remote
+database.
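
A sketch of what such a tunnel could look like, assuming standard `ssh -L`
port forwarding and osm2pgsql's `-P` port flag; the local port 15432 and the
server name are placeholders:

```bash
# Forward a local port to the PostgreSQL port on the server
ssh -L 15432:localhost:5432 user@your-server

# In a second shell, run the import against the forwarded port
osm2pgsql --create --hstore --style api/roads_import.lua -O flex \
  -H localhost -P 15432 -d obs -U obs \
  path/to/downloaded/myarea-latest.osm.pbf
```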
+
+The import process should take a few seconds to minutes, depending on the area
+size. A whole country might even take one or more hours. You should probably
+not try to import `planet.osm.pbf`.
+
+You can run the process multiple times, with the same or different area files,
+to import or update the data. However, for this to work, the actual [command
+line arguments](https://osm2pgsql.org/doc/manual.html#running-osm2pgsql) are a
+bit different each time, including when first importing, and the disk space
+required is much higher.
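
For reference, an updatable workflow per the osm2pgsql manual could look
roughly like this (a sketch, not tested against this setup; `--slim` keeps the
intermediate tables that later `--append` runs require, which is where the
extra disk space goes):

```bash
# First import, in updatable ("slim") mode
osm2pgsql --create --slim --hstore --style api/roads_import.lua -O flex \
  -H localhost -d obs -U obs \
  path/to/downloaded/myarea-latest.osm.pbf

# Subsequent imports or updates into the same database
osm2pgsql --append --slim --hstore --style api/roads_import.lua -O flex \
  -H localhost -d obs -U obs \
  path/to/another-area-latest.osm.pbf
```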
+
+Refer to the documentation of `osm2pgsql` for assistance. We are using "flex
+mode"; the provided script `api/roads_import.lua` describes the transformations
+and extractions to perform on the original data.
 
 ## Troubleshooting
 

@@ -186,17 +209,6 @@ If any step of the instructions does not work for you, please open an issue and
 describe the problem you're having, as it is important to us that onboarding is
 super easy :)
 
-### Connecting to the PostgreSQL database
-
-If you need to connect to your development PostgreSQL database, you should
-install `psql` locally. The port 5432 is already forwarded, so you can connect with:
-
-```
-psql -h localhost -U obs -d obs
-```
-
-The password is `obs` as well.
-
 ## License
 
 Copyright (C) 2020-2021 OpenBikeSensor Contributors

UPGRADING.md (120 changed lines)

@@ -1,123 +1,11 @@
 # Upgrading
 
 This document describes the general steps to upgrade between major changes.
 Simple migrations, e.g. for adding schema changes, are not documented
-explicitly. Their general usage is described in the [README](./README.md) (for
-development) and [docs/production-deployment.md](docs/production-deployment.md) (for production).
+explicitly. Once we implement them, their usage will be described in the
+[README](./README.md).
 
 
-## 0.8.1
-
-- Get the release in your source folder (``git pull; git checkout 0.8.1`` and update submodules ``git submodule update --recursive``)
-- Rebuild images ``docker-compose build``
-- No database upgrade is required, but tile functions need an update:
-  ```bash
-  docker-compose run --rm portal tools/prepare_sql_tiles.py
-  ```
-- Start your portal and worker services. ``docker-compose up -d worker portal``
-
-
-## 0.8.0
-
-Upgrade to `0.7.x` first. See below for details. Then follow these steps:
-
-> **Warning** The update includes a reprocessing of tracks after import. Depending on the number of tracks this can take a few hours. The portal is reachable during that time but events disappear and incrementally reappear during reimport.
-
-> **Info** With this version the import process for OpenStreetMap data has changed: the [new process](docs/osm-import.md) is easier on resources and finally permits importing a full country on a low-end VM.
-
-- Do your [usual backup](docs/production-deployment.md)
-- Get the release in your source folder (``git pull; git checkout 0.8.0`` and update submodules ``git submodule update --recursive``)
-- Rebuild images ``docker-compose build``
-- Stop your portal and worker services ``docker-compose stop worker portal``
-- Run the upgrade:
-  ```bash
-  docker-compose run --rm portal tools/upgrade.py
-  ```
-  This automatically does the following:
-  - Migration of database schema using alembic.
-  - Upgrade of SQL tile schema to new schema.
-  - Import the nuts-regions from the web into the database.
-  - Trigger a re-import of all tracks.
-- Start your portal and worker services. ``docker-compose up -d worker portal``
-
-
-## 0.7.0
-
-Upgrade to `0.6.x` first. See below for details. Then follow these steps:
-
-- Rebuild images
-- Stop your portal and worker services.
-- **Migration with alembic**: required
-- **Prepare SQL Tiles**: required
-- Start your portal and worker services.
-- **Reimport tracks**: no action required
-- **OSM Import**: required
-- **Config changes**: add `POSTGRES_MAX_OVERFLOW` and `POSTGRES_POOL_SIZE`
-  variables, see `api/config.py.example`
-
-## 0.6.0
-
-**Make sure to upgrade to `0.5.1` first, by checking out that version tag and
-running migrations, then coming back to this version.** This is required
-because the migrations have been edited to create the initial database schema,
-but if you run the 0.5.1 migrations first, your database will remember that it
-already has all the tables created. This is not required if you set up a new
-installation.
-
-For this update, run these steps:
-
-- Build new images
-- Stop portal and worker services
-- Run the new upgrade tool:
-  ```bash
-  docker-compose run --rm portal tools/upgrade.py
-  ```
-- Start portal and worker services
-
-## 0.5.0
-
-The upgrade requires the following steps in the given order:
-
-- Rebuild images
-- Stop your portal and worker services.
-- **Migration with alembic**: required
-- **Prepare SQL Tiles**: required
-- Start your portal and worker services.
-- **Reimport tracks**: required
-- **OSM Import**: no action required
-- **Config changes**: none
-
-## 0.4.1
-
-You can, but do not have to, reimport all tracks. This will generate a GPX file
-for each track and allow the users to download those. If a GPX file has not yet
-been created, the download will fail. To reimport all tracks, log in to your
-PostgreSQL database (instructions are in [README.md](./README.md) for
-development and [docs/production-deployment.md](./docs/production-deployment.md) for production)
-and run:
-
-```sql
-UPDATE track SET processing_status = 'queued';
-```
-
-You can do this selectively with `WHERE` statements.
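
For instance, a selective variant could requeue only one user's tracks via
`psql` (the `author_id` column exists per the `track` table migration; the id
value 42 is made up):

```bash
# Requeue only the tracks of a single user instead of all tracks
psql -h localhost -U obs -d obs \
  -c "UPDATE track SET processing_status = 'queued' WHERE author_id = 42;"
```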
-
-Make sure your worker is running to process the queue.
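
If in doubt, the worker can be started explicitly; the service name matches
the `docker-compose` commands used elsewhere in this guide:

```bash
docker-compose up -d worker
```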
-
-## 0.4.0
-
-* Rebuild your image, this may take longer than usual, as it will compile
-  `osm2pgsql` for you. Next time, it should be in your docker build cache and
-  be fast again.
-* Add new config flags: `VERBOSE`, `LEAN_MODE`, `POSTGRES_POOL_SIZE`,
-  `POSTGRES_MAX_OVERFLOW`. Check the example config for sane default values.
-* Re-run `tools/prepare_sql_tiles.py` again (see README)
-* It has been made easier to import OSM data, check
-  [docs/production-deployment.md](./docs/production-deployment.md) for the sections "Download
-  OpenStreetMap maps" and "Import OpenStreetMap data". You can now download
-  multiple .pbf files and then import them at once, using the docker image
-  built with the `Dockerfile`. Alternatively, you can choose to enable [lean
-  mode](docs/lean-mode.md). You do not need to reimport data, but setting this
-  up now will make your life easier in the long run ;)
 
 ## v0.2 to v0.3 (MongoDB to PostgreSQL)
 
 * Shut down all services

@@ -166,5 +54,5 @@ Make sure your worker is running to process the queue.
   `export/users.json` into your realm, it will re-add all the users from the
   old installation. You should delete the file and `export/` folder afterwards.
 * Start `portal`.
-* Consider configuring a worker service. See [docs/production-deployment.md](./docs/production-deployment.md).
+* Consider configuring a worker service. See [deployment/README.md](deployment/README.md).

api/.gitignore (vendored, 2 changed lines)

@@ -43,5 +43,3 @@ local/
 # both, because then developers will only update one of them and they'll
 # contradict. For now, npm shall be the canonical default (compare README.md).
 yarn.lock
-
-config.overrides.py

@@ -1,4 +1,4 @@
-FROM python:3.11.3-bullseye
+FROM python:3.9.7-bullseye
 
 WORKDIR /opt/obs/api
 

api/alembic.ini (102 changed lines)

@@ -1,102 +0,0 @@
-# A generic, single database configuration.
-
-[alembic]
-# path to migration scripts
-script_location = migrations
-
-# template used to generate migration files
-# file_template = %%(rev)s_%%(slug)s
-
-# sys.path path, will be prepended to sys.path if present.
-# defaults to the current working directory.
-prepend_sys_path = .
-
-# timezone to use when rendering the date within the migration file
-# as well as the filename.
-# If specified, requires the python-dateutil library that can be
-# installed by adding `alembic[tz]` to the pip requirements
-# string value is passed to dateutil.tz.gettz()
-# leave blank for localtime
-# timezone =
-
-# max length of characters to apply to the
-# "slug" field
-# truncate_slug_length = 40
-
-# set to 'true' to run the environment during
-# the 'revision' command, regardless of autogenerate
-# revision_environment = false
-
-# set to 'true' to allow .pyc and .pyo files without
-# a source .py file to be detected as revisions in the
-# versions/ directory
-# sourceless = false
-
-# version location specification; This defaults
-# to api/migrations/versions. When using multiple version
-# directories, initial revisions must be specified with --version-path.
-# The path separator used here should be the separator specified by "version_path_separator" below.
-# version_locations = %(here)s/bar:%(here)s/bat:api/migrations/versions
-
-# version path separator; As mentioned above, this is the character used to split
-# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
-# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
-# Valid values for version_path_separator are:
-#
-# version_path_separator = :
-# version_path_separator = ;
-# version_path_separator = space
-version_path_separator = os  # Use os.pathsep. Default configuration used for new projects.
-
-# the output encoding used when revision files
-# are written from script.py.mako
-# output_encoding = utf-8
-
-sqlalchemy.url = driver://user:pass@localhost/dbname
-
-
-[post_write_hooks]
-# post_write_hooks defines scripts or Python functions that are run
-# on newly generated revision scripts. See the documentation for further
-# detail and examples
-
-# format using "black" - use the console_scripts runner, against the "black" entrypoint
-# hooks = black
-# black.type = console_scripts
-# black.entrypoint = black
-# black.options = -l 79 REVISION_SCRIPT_FILENAME
-
-# Logging configuration
-[loggers]
-keys = root,sqlalchemy,alembic
-
-[handlers]
-keys = console
-
-[formatters]
-keys = generic
-
-[logger_root]
-level = WARN
-handlers = console
-qualname =
-
-[logger_sqlalchemy]
-level = WARN
-handlers =
-qualname = sqlalchemy.engine
-
-[logger_alembic]
-level = INFO
-handlers =
-qualname = alembic
-
-[handler_console]
-class = StreamHandler
-args = (sys.stderr,)
-level = NOTSET
-formatter = generic
-
-[formatter_generic]
-format = %(levelname)-5.5s [%(name)s] %(message)s
-datefmt = %H:%M:%S

@@ -1,12 +1,9 @@
 HOST = "0.0.0.0"
 PORT = 3000
 DEBUG = True
-VERBOSE = False
 AUTO_RELOAD = True
 AUTO_RESTART = True
 SECRET = "!!!!!!!!!!!!CHANGE ME!!!!!!!!!!!!"
 POSTGRES_URL = "postgresql+asyncpg://obs:obs@postgres/obs"
-POSTGRES_POOL_SIZE = 20
-POSTGRES_MAX_OVERFLOW = 2 * POSTGRES_POOL_SIZE
 KEYCLOAK_URL = "http://keycloak:8080/auth/realms/obs-dev/"
 KEYCLOAK_CLIENT_ID = "portal"
 KEYCLOAK_CLIENT_SECRET = "c385278e-bd2e-4f13-9937-34b0c0f44c2d"

@@ -18,7 +15,6 @@ FRONTEND_DIR = None
 FRONTEND_CONFIG = {
     "imprintUrl": "https://example.com/imprint",
     "privacyPolicyUrl": "https://example.com/privacy",
-    # "termsUrl": "https://example.com/terms", # Link is only shown when set
     "mapHome": {"zoom": 6, "longitude": 10.2, "latitude": 51.3},
     # "banner": {"text": "This is a development installation.", "style": "info"},
 }

@@ -29,7 +25,5 @@ ADDITIONAL_CORS_ORIGINS = [
     "http://localhost:8880/",  # for maputnik on 8880
     "http://localhost:8888/",  # for maputnik on 8888
 ]
-TILE_SEMAPHORE_SIZE = 4
-EXPORT_SEMAPHORE_SIZE = 4
 
 # vim: set ft=python :

@@ -4,16 +4,13 @@ PORT = 3000
 
 # Extended log output, but slower
 DEBUG = False
-VERBOSE = DEBUG
 AUTO_RELOAD = DEBUG
 AUTO_RESTART = DEBUG
 
 # Required to encrypt or sign sessions, cookies, tokens, etc.
 SECRET = "!!!<<<CHANGEME>>>!!!"
 
 # Connection to the database
 POSTGRES_URL = "postgresql+asyncpg://user:pass@host/dbname"
-POSTGRES_POOL_SIZE = 20
-POSTGRES_MAX_OVERFLOW = 2 * POSTGRES_POOL_SIZE
 
 # URL to the keycloak realm, as reachable by the API service. This is not
 # necessarily its publicly reachable URL; keycloak advertises that itself.

@@ -39,7 +36,6 @@ FRONTEND_DIR = "../frontend/build/"
 FRONTEND_CONFIG = {
     "imprintUrl": "https://example.com/imprint",
     "privacyPolicyUrl": "https://example.com/privacy",
-    # "termsUrl": "https://example.com/user_terms_and_conditions", # Link is only shown when set
     "mapHome": {"zoom": 6, "longitude": 10.2, "latitude": 51.3},
     "banner": {"text": "This is a test installation.", "style": "warning"},
 }

@@ -61,13 +57,4 @@ TILES_FILE = None
 # default. Python list, or whitespace separated string.
 ADDITIONAL_CORS_ORIGINS = None
 
-# How many asynchronous requests may be sent to the database to generate tile
-# information. Should be less than POSTGRES_POOL_SIZE to leave some connections
-# to the other features of the API ;)
-TILE_SEMAPHORE_SIZE = 4
-
-# How many asynchronous requests may generate exported data simultaneously.
-# Keep this small.
-EXPORT_SEMAPHORE_SIZE = 1
-
 # vim: set ft=python :

@@ -1 +0,0 @@
-Generic single-database configuration.

@@ -1,83 +0,0 @@
-import asyncio
-from logging.config import fileConfig
-
-from sqlalchemy import engine_from_config
-from sqlalchemy import pool
-
-from alembic import context
-
-# this is the Alembic Config object, which provides
-# access to the values within the .ini file in use.
-config = context.config
-
-# Interpret the config file for Python logging.
-# This line sets up loggers basically.
-if config.config_file_name is not None:
-    fileConfig(config.config_file_name)
-
-# add your model's MetaData object here
-# for 'autogenerate' support
-# from myapp import mymodel
-# target_metadata = mymodel.Base.metadata
-target_metadata = None
-
-# other values from the config, defined by the needs of env.py,
-# can be acquired:
-# my_important_option = config.get_main_option("my_important_option")
-# ... etc.
-
-
-def do_run_migrations(connection):
-    context.configure(connection=connection, target_metadata=target_metadata)
-
-    with context.begin_transaction():
-        context.run_migrations()
-
-
-def run_migrations_offline():
-    """Run migrations in 'offline' mode.
-
-    This configures the context with just a URL
-    and not an Engine, though an Engine is acceptable
-    here as well. By skipping the Engine creation
-    we don't even need a DBAPI to be available.
-
-    Calls to context.execute() here emit the given string to the
-    script output.
-
-    """
-    from obs.api.app import app
-
-    url = app.config.POSTGRES_URL
-    context.configure(
-        url=url,
-        target_metadata=target_metadata,
-        literal_binds=True,
-        dialect_opts={"paramstyle": "named"},
-    )
-
-    with context.begin_transaction():
-        context.run_migrations()
-
-
-async def run_migrations_online():
-    """Run migrations in 'online' mode.
-
-    In this scenario we need to create an Engine
-    and associate a connection with the context.
-
-    """
-    from obs.api.app import app, connect_db
-
-    url = app.config.POSTGRES_URL
-    async with connect_db(url) as engine:
-        async with engine.connect() as connection:
-            await connection.run_sync(do_run_migrations)
-
-        await engine.dispose()
-
-
-if context.is_offline_mode():
-    run_migrations_offline()
-else:
-    asyncio.run(run_migrations_online())

@@ -1,24 +0,0 @@
-"""${message}
-
-Revision ID: ${up_revision}
-Revises: ${down_revision | comma,n}
-Create Date: ${create_date}
-
-"""
-from alembic import op
-import sqlalchemy as sa
-${imports if imports else ""}
-
-# revision identifiers, used by Alembic.
-revision = ${repr(up_revision)}
-down_revision = ${repr(down_revision)}
-branch_labels = ${repr(branch_labels)}
-depends_on = ${repr(depends_on)}
-
-
-def upgrade():
-    ${upgrades if upgrades else "pass"}
-
-
-def downgrade():
-    ${downgrades if downgrades else "pass"}

@@ -1,16 +0,0 @@
-import sqlalchemy as sa
-
-
-def dbtype(name):
-    """
-    Create a UserDefinedType for use in migrations as the type of a column,
-    when the type already exists in the database, but isn't available as a
-    proper sqlalchemy type.
-    """
-
-    class TheType(sa.types.UserDefinedType):
-        def get_col_spec(self):
-            return name
-
-    TheType.__name__ = name
-    return TheType

@@ -1,39 +0,0 @@
-"""create table road
-
-Revision ID: 35e7f1768f9b
-Revises: 5d75febe2d59
-Create Date: 2022-03-30 21:36:48.157457
-
-"""
-from alembic import op
-import sqlalchemy as sa
-from sqlalchemy.dialects import postgresql
-
-from migrations.utils import dbtype
-
-# revision identifiers, used by Alembic.
-revision = "35e7f1768f9b"
-down_revision = "920aed1450c9"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.create_table(
-        "road",
-        sa.Column(
-            "way_id", sa.BIGINT, primary_key=True, index=True, autoincrement=False
-        ),
-        sa.Column("zone", dbtype("zone_type")),
-        sa.Column("name", sa.Text),
-        sa.Column("geometry", dbtype("geometry(LINESTRING,3857)")),
-        sa.Column("directionality", sa.Integer),
-        sa.Column("oneway", sa.Boolean),
-    )
-    op.execute(
-        "CREATE INDEX road_geometry_idx ON road USING GIST (geometry) WITH (FILLFACTOR=100);"
-    )
-
-
-def downgrade():
-    op.drop_table("road")

@@ -1,28 +0,0 @@
-"""create extensions
-
-Revision ID: 3856f240bb6d
-Revises: a9627f63fbed
-Create Date: 2022-03-30 21:31:06.282725
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "3856f240bb6d"
-down_revision = None
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.execute('CREATE EXTENSION IF NOT EXISTS "hstore";')
-    op.execute('CREATE EXTENSION IF NOT EXISTS "postgis";')
-    op.execute('CREATE EXTENSION IF NOT EXISTS "uuid-ossp";')
-
-
-def downgrade():
-    op.execute('DROP EXTENSION "hstore";')
-    op.execute('DROP EXTENSION "postgis";')
-    op.execute('DROP EXTENSION "uuid-ossp";')

@@ -1,30 +0,0 @@
-"""transform overtaking_event geometry to 3857
-
-Revision ID: 587e69ecb466
-Revises: f4b0f460254d
-Create Date: 2023-04-01 14:30:49.927505
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "587e69ecb466"
-down_revision = "f4b0f460254d"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.execute("UPDATE overtaking_event SET geometry = ST_Transform(geometry, 3857);")
-    op.execute(
-        "ALTER TABLE overtaking_event ALTER COLUMN geometry TYPE geometry(POINT, 3857);"
-    )
-
-
-def downgrade():
-    op.execute(
-        "ALTER TABLE overtaking_event ALTER COLUMN geometry TYPE geometry;"
-    )
-    op.execute("UPDATE overtaking_event SET geometry = ST_Transform(geometry, 4326);")

@@ -1,43 +0,0 @@
-"""create table overtaking_event
-
-Revision ID: 5d75febe2d59
-Revises: 920aed1450c9
-Create Date: 2022-03-30 21:36:37.687080
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-from migrations.utils import dbtype
-
-# revision identifiers, used by Alembic.
-revision = "5d75febe2d59"
-down_revision = "9336eef458e7"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.create_table(
-        "overtaking_event",
-        sa.Column("id", sa.Integer, autoincrement=True, primary_key=True, index=True),
-        sa.Column(
-            "track_id", sa.Integer, sa.ForeignKey("track.id", ondelete="CASCADE")
-        ),
-        sa.Column("hex_hash", sa.String, unique=True, index=True),
-        sa.Column("way_id", sa.BIGINT, index=True),
-        sa.Column("direction_reversed", sa.Boolean),
-        sa.Column("geometry", dbtype("GEOMETRY")),
-        sa.Column("latitude", sa.Float),
-        sa.Column("longitude", sa.Float),
-        sa.Column("time", sa.DateTime),
-        sa.Column("distance_overtaker", sa.Float),
-        sa.Column("distance_stationary", sa.Float),
-        sa.Column("course", sa.Float),
-        sa.Column("speed", sa.Float),
-        sa.Index("road_segment", "way_id", "direction_reversed"),
-    )
-
-
-def downgrade():
-    op.drop_table("overtaking_event")

@@ -1,26 +0,0 @@
-"""add_overtaking_event_index
-
-
-Revision ID: 7868aed76122
-Revises: 587e69ecb466
-Create Date: 2023-07-16 13:37:17.694079
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = '7868aed76122'
-down_revision = '587e69ecb466'
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.execute("CREATE INDEX IF NOT EXISTS ix_overtaking_event_geometry ON overtaking_event using GIST(geometry);")
-
-
-def downgrade():
-    op.drop_index("ix_overtaking_event_geometry")

@@ -1,31 +0,0 @@
-"""create enum processing_status
-
-Revision ID: 920aed1450c9
-Revises: 986c6953e431
-Create Date: 2022-03-30 21:36:25.896192
-
-"""
-from alembic import op
-import sqlalchemy as sa
-from sqlalchemy.dialects import postgresql
-
-
-# revision identifiers, used by Alembic.
-revision = "920aed1450c9"
-down_revision = "986c6953e431"
-branch_labels = None
-depends_on = None
-
-
-def _get_enum_type():
-    return postgresql.ENUM(
-        "created", "queued", "processing", "complete", "error", name="processing_status"
-    )
-
-
-def upgrade():
-    _get_enum_type().create(op.get_bind(), checkfirst=True)
-
-
-def downgrade():
-    _get_enum_type().drop(op.get_bind())

@@ -1,42 +0,0 @@
-"""create table comment
-
-Revision ID: 9336eef458e7
-Revises: 9d8c8c38a1d0
-Create Date: 2022-03-30 21:37:02.080429
-
-"""
-from alembic import op
-import sqlalchemy as sa
-from sqlalchemy.dialects.postgresql import UUID
-
-
-# revision identifiers, used by Alembic.
-revision = "9336eef458e7"
-down_revision = "d66baafab5ec"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    NOW = sa.text("NOW()")
-
-    op.create_table(
-        "comment",
-        sa.Column("id", sa.Integer, autoincrement=True, primary_key=True),
-        sa.Column("uid", UUID, server_default=sa.func.uuid_generate_v4()),
-        sa.Column("created_at", sa.DateTime, nullable=False, server_default=NOW),
-        sa.Column(
-            "updated_at", sa.DateTime, nullable=False, server_default=NOW, onupdate=NOW
-        ),
-        sa.Column("body", sa.TEXT),
-        sa.Column(
-            "author_id", sa.Integer, sa.ForeignKey("user.id", ondelete="CASCADE")
-        ),
-        sa.Column(
-            "track_id", sa.Integer, sa.ForeignKey("track.id", ondelete="CASCADE")
-        ),
-    )
-
-
-def downgrade():
-    op.drop_table("comment")

@@ -1,29 +0,0 @@
-"""create enum zone_type
-
-Revision ID: 986c6953e431
-Revises: 3856f240bb6d
-Create Date: 2022-03-30 21:36:19.888268
-
-"""
-from alembic import op
-import sqlalchemy as sa
-from sqlalchemy.dialects import postgresql
-
-
-# revision identifiers, used by Alembic.
-revision = "986c6953e431"
-down_revision = "3856f240bb6d"
-branch_labels = None
-depends_on = None
-
-
-def _get_enum_type():
-    return postgresql.ENUM("rural", "urban", "motorway", name="zone_type")
-
-
-def upgrade():
-    _get_enum_type().create(op.get_bind(), checkfirst=True)
-
-
-def downgrade():
-    _get_enum_type().drop(op.get_bind())

@@ -1,26 +0,0 @@
-"""add user display_name
-
-Revision ID: 99a3d2eb08f9
-Revises: a9627f63fbed
-Create Date: 2022-09-13 07:30:18.747880
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "99a3d2eb08f9"
-down_revision = "a9627f63fbed"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.add_column(
-        "user", sa.Column("display_name", sa.String, nullable=True), schema="public"
-    )
-
-
-def downgrade():
-    op.drop_column("user", "display_name", schema="public")

@@ -1,45 +0,0 @@
-"""create table user
-
-Revision ID: 9d8c8c38a1d0
-Revises: d66baafab5ec
-Create Date: 2022-03-30 21:36:59.375149
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-# revision identifiers, used by Alembic.
-revision = "9d8c8c38a1d0"
-down_revision = "35e7f1768f9b"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    NOW = sa.text("NOW()")
-
-    op.create_table(
-        "user",
-        sa.Column("id", sa.Integer, autoincrement=True, primary_key=True),
-        sa.Column("created_at", sa.DateTime, nullable=False, server_default=NOW),
-        sa.Column(
-            "updated_at", sa.DateTime, nullable=False, server_default=NOW, onupdate=NOW
-        ),
-        sa.Column("sub", sa.String, unique=True, nullable=False),
-        sa.Column("username", sa.String, unique=True, nullable=False),
-        sa.Column("email", sa.String, nullable=False),
-        sa.Column("bio", sa.TEXT),
-        sa.Column("image", sa.String),
-        sa.Column(
-            "are_tracks_visible_for_all",
-            sa.Boolean,
-            server_default=sa.false(),
-            nullable=False,
-        ),
-        sa.Column("api_key", sa.String),
-        sa.Column("match_by_username_email", sa.Boolean, server_default=sa.false()),
-    )
-
-
-def downgrade():
-    op.drop_table("user")

@@ -1,35 +0,0 @@
-"""create table region
-
-Revision ID: a049e5eb24dd
-Revises: a9627f63fbed
-Create Date: 2022-04-02 21:28:43.124521
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-from migrations.utils import dbtype
-
-
-# revision identifiers, used by Alembic.
-revision = "a049e5eb24dd"
-down_revision = "99a3d2eb08f9"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.create_table(
-        "region",
-        sa.Column("id", sa.String(24), primary_key=True, index=True),
-        sa.Column("name", sa.Text),
-        sa.Column("geometry", dbtype("GEOMETRY(GEOMETRY,3857)"), index=False),
-        sa.Column("admin_level", sa.Integer, index=True),
-    )
-    op.execute(
-        "CREATE INDEX region_geometry_idx ON region USING GIST (geometry) WITH (FILLFACTOR=100);"
-    )
-
-
-def downgrade():
-    op.drop_table("region")

@@ -1,34 +0,0 @@
-"""create table road_usage
-
-Revision ID: a9627f63fbed
-Revises:
-Create Date: 2022-03-16 20:26:17.449569
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-# revision identifiers, used by Alembic.
-revision = "a9627f63fbed"
-down_revision = "5d75febe2d59"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.create_table(
-        "road_usage",
-        sa.Column("id", sa.Integer, autoincrement=True, primary_key=True, index=True),
-        sa.Column(
-            "track_id", sa.Integer, sa.ForeignKey("track.id", ondelete="CASCADE")
-        ),
-        sa.Column("hex_hash", sa.String, unique=True, index=True),
-        sa.Column("way_id", sa.BIGINT, index=True),
-        sa.Column("time", sa.DateTime),
-        sa.Column("direction_reversed", sa.Boolean),
-        sa.Index("road_usage_segment", "way_id", "direction_reversed"),
-    )
-
-
-def downgrade():
-    op.drop_table("road_usage")

@@ -1,39 +0,0 @@
-"""add import groups
-
-Revision ID: b8b0fbae50a4
-Revises: f7b21148126a
-Create Date: 2023-03-26 09:41:36.621203
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "b8b0fbae50a4"
-down_revision = "f7b21148126a"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.add_column(
-        "road",
-        sa.Column("import_group", sa.String(), nullable=True),
-    )
-    op.add_column(
-        "region",
-        sa.Column("import_group", sa.String(), nullable=True),
-    )
-
-    # Set existing to "osm2pgsql"
-    road = sa.table("road", sa.column("import_group", sa.String))
-    op.execute(road.update().values(import_group="osm2pgsql"))
-
-    region = sa.table("region", sa.column("import_group", sa.String))
-    op.execute(region.update().values(import_group="osm2pgsql"))
-
-
-def downgrade():
-    op.drop_column("road", "import_group")
-    op.drop_column("region", "import_group")

@@ -1,66 +0,0 @@
-"""create table track
-
-Revision ID: d66baafab5ec
-Revises: 35e7f1768f9b
-Create Date: 2022-03-30 21:36:54.848452
-
-"""
-from alembic import op
-import sqlalchemy as sa
-from sqlalchemy.dialects import postgresql
-from migrations.utils import dbtype
-
-# revision identifiers, used by Alembic.
-revision = "d66baafab5ec"
-down_revision = "9d8c8c38a1d0"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    NOW = sa.text("NOW()")
-
-    op.create_table(
-        "track",
-        sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
-        sa.Column("slug", sa.String, unique=True, nullable=False, index=True),
-        sa.Column("created_at", sa.DateTime, nullable=False, server_default=NOW),
-        sa.Column(
-            "updated_at", sa.DateTime, nullable=False, server_default=NOW, onupdate=NOW
-        ),
-        sa.Column("title", sa.String),
-        sa.Column(
-            "processing_status",
-            dbtype("processing_status"),
-            server_default=sa.literal("created"),
-        ),
-        sa.Column("processing_queued_at", sa.DateTime),
-        sa.Column("processed_at", sa.DateTime),
-        sa.Column("processing_log", sa.TEXT),
-        sa.Column(
-            "customized_title", sa.Boolean, server_default=sa.false(), nullable=False
-        ),
-        sa.Column("description", sa.TEXT),
-        sa.Column("public", sa.Boolean, server_default=sa.false()),
-        sa.Column("uploaded_by_user_agent", sa.String),
-        sa.Column("original_file_name", sa.String),
-        sa.Column("original_file_hash", sa.String, nullable=False),
-        sa.Column(
-            "author_id",
-            sa.Integer,
-            sa.ForeignKey("user.id", ondelete="CASCADE"),
-            nullable=False,
-        ),
-        sa.Column("recorded_at", sa.DateTime),
-        sa.Column("recorded_until", sa.DateTime),
-        sa.Column("duration", sa.Float),
-        sa.Column("length", sa.Float),
-        sa.Column("segments", sa.Integer),
-        sa.Column("num_events", sa.Integer),
-        sa.Column("num_measurements", sa.Integer),
-        sa.Column("num_valid", sa.Integer),
-    )
-
-
-def downgrade():
-    op.drop_table("track")

@@ -1,24 +0,0 @@
-"""add osm id indexes
-
-Revision ID: f4b0f460254d
-Revises: b8b0fbae50a4
-Create Date: 2023-03-30 10:56:22.066768
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "f4b0f460254d"
-down_revision = "b8b0fbae50a4"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.execute("CREATE INDEX IF NOT EXISTS ix_road_way_id ON road (way_id);")
-
-
-def downgrade():
-    op.drop_index("ix_road_way_id")

@@ -1,41 +0,0 @@
-"""add user_device
-
-Revision ID: f7b21148126a
-Revises: a9627f63fbed
-Create Date: 2022-09-15 17:48:06.764342
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "f7b21148126a"
-down_revision = "a049e5eb24dd"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
-    op.create_table(
-        "user_device",
-        sa.Column("id", sa.Integer, autoincrement=True, primary_key=True),
-        sa.Column("user_id", sa.Integer, sa.ForeignKey("user.id", ondelete="CASCADE")),
-        sa.Column("identifier", sa.String, nullable=False),
-        sa.Column("display_name", sa.String, nullable=True),
-        sa.Index("user_id_identifier", "user_id", "identifier", unique=True),
-    )
-    op.add_column(
-        "track",
-        sa.Column(
-            "user_device_id",
-            sa.Integer,
-            sa.ForeignKey("user_device.id", ondelete="RESTRICT"),
-            nullable=True,
-        ),
-    )
-
-
-def downgrade():
-    op.drop_column("track", "user_device_id")
-    op.drop_table("user_device")

@@ -1 +1 @@
-__version__ = "0.8.1"
+__version__ = "0.3.4"

@ -1,11 +1,9 @@
|
|||
import asyncio
|
||||
import logging
|
||||
import re
|
||||
|
||||
from json import JSONEncoder, dumps
|
||||
from functools import wraps, partial
|
||||
from urllib.parse import urlparse
|
||||
from os.path import dirname, join, normpath, abspath, isfile
|
||||
from os.path import dirname, join, normpath, abspath
|
||||
from datetime import datetime, date
from sanic import Sanic, Blueprint

@@ -20,128 +18,22 @@ from sanic_session import Session, InMemorySessionInterface
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import sessionmaker

from obs.api.db import User, make_session, connect_db
from obs.api.cors import setup_options, add_cors_headers
from obs.api.utils import get_single_arg

log = logging.getLogger(__name__)


class SanicAccessMessageFilter(logging.Filter):
    """
    A filter that modifies the log message of a sanic.access log entry to
    include useful information.
    """

    def filter(self, record):
        record.msg = f"{record.request} -> {record.status}"
        return True


def configure_sanic_logging():
    for logger_name in ["sanic.root", "sanic.access", "sanic.error"]:
        logger = logging.getLogger(logger_name)
        for handler in logger.handlers:
            logger.removeHandler(handler)

    logger = logging.getLogger("sanic.access")
    for filter_ in logger.filters:
        logger.removeFilter(filter_)
    logger.addFilter(SanicAccessMessageFilter())
    logging.getLogger("sanic.root").setLevel(logging.WARNING)


app = Sanic(
    "openbikesensor-api",
    env_prefix="OBS_",
)
configure_sanic_logging()

app.config.update(
    dict(
        DEBUG=False,
        VERBOSE=False,
        AUTO_RELOAD=False,
        POSTGRES_POOL_SIZE=20,
        POSTGRES_MAX_OVERFLOW=40,
        DEDICATED_WORKER=True,
        FRONTEND_URL=None,
        FRONTEND_HTTPS=True,
        TILES_FILE=None,
        TILE_SEMAPHORE_SIZE=4,
        EXPORT_SEMAPHORE_SIZE=1,
    )
)

# overwrite from defaults again
app.config.load_environment_vars("OBS_")

if isfile("./config.py"):
    app.update_config("./config.py")

# For developers to override the config without committing it
if isfile("./config.overrides.py"):
    app.update_config("./config.overrides.py")

app = Sanic("OpenBikeSensor Portal API")
app.update_config("./config.py")
c = app.config

api = Blueprint("api", url_prefix="/api")
auth = Blueprint("auth", url_prefix="")

import re

TILE_REQUEST_CANCELLED = re.compile(
    r"Connection lost before response written.*GET /tiles"
)


class NoConnectionLostFilter(logging.Filter):
    def filter(record):
        return not TILE_REQUEST_CANCELLED.match(record.getMessage())


logging.getLogger("sanic.error").addFilter(NoConnectionLostFilter)


def setup_cors(app):
    frontend_url = app.config.get("FRONTEND_URL")
    additional_origins = app.config.get("ADDITIONAL_CORS_ORIGINS")
    if not frontend_url and not additional_origins:
        # No CORS configured
        return

    origins = []
    if frontend_url:
        u = urlparse(frontend_url)
        origins.append(f"{u.scheme}://{u.netloc}")

    if isinstance(additional_origins, str):
        origins += re.split(r"\s+", additional_origins)
    elif isinstance(additional_origins, list):
        origins += additional_origins
    elif additional_origins is not None:
        raise ValueError(
            "invalid option type for ADDITIONAL_CORS_ORIGINS, must be list or space separated str"
        )

    app.ctx.cors_origins = origins

    # Add OPTIONS handlers to any route that is missing it
    app.register_listener(setup_options, "before_server_start")

    # Fill in CORS headers
    app.register_middleware(add_cors_headers, "response")


setup_cors(app)
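
Note: on the main side, `ADDITIONAL_CORS_ORIGINS` may be a list or a whitespace-separated string, and only scheme plus host of `FRONTEND_URL` is kept. A minimal sketch of the resulting origin list (the config values here are made up):

    import re
    from urllib.parse import urlparse

    frontend_url = "https://portal.example.com/app"  # hypothetical config value
    additional = "https://a.example.com https://b.example.com"

    origins = []
    u = urlparse(frontend_url)
    origins.append(f"{u.scheme}://{u.netloc}")  # scheme + host only, the path is dropped
    origins += re.split(r"\s+", additional)     # split the space separated string
    # origins == ['https://portal.example.com', 'https://a.example.com', 'https://b.example.com']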

@app.exception(SanicException, BaseException)
async def _handle_sanic_errors(_request, exception):
    if isinstance(exception, asyncio.CancelledError):
        return None

@api.exception(SanicException, BaseException)
def _handle_sanic_errors(_request, exception):
    log.error("Exception in handler: %s", exception, exc_info=True)
    return json_response(
        {

@@ -173,6 +65,38 @@ def configure_paths(c):
configure_paths(app.config)


def setup_cors(app):
    frontend_url = app.config.get("FRONTEND_URL")
    additional_origins = app.config.get("ADDITIONAL_CORS_ORIGINS")
    if not frontend_url and not additional_origins:
        # No CORS configured
        return

    origins = []
    if frontend_url:
        u = urlparse(frontend_url)
        origins.append(f"{u.scheme}://{u.netloc}")

    if isinstance(additional_origins, str):
        origins += re.split(r"\s+", additional_origins)
    elif isinstance(additional_origins, list):
        origins += additional_origins
    elif additional_origins is not None:
        raise ValueError(
            "invalid option type for ADDITIONAL_CORS_ORIGINS, must be list or space separated str"
        )

    from sanic_cors import CORS

    CORS(
        app,
        origins=origins,
        supports_credentials=True,
    )


setup_cors(app)

# TODO: use a different interface, maybe backed by the PostgreSQL, to allow
# scaling the API
Session(app, interface=InMemorySessionInterface())

@@ -180,19 +104,9 @@ Session(app, interface=InMemorySessionInterface())

@app.before_server_start
async def app_connect_db(app, loop):
    app.ctx._db_engine_ctx = connect_db(
        app.config.POSTGRES_URL,
        app.config.POSTGRES_POOL_SIZE,
        app.config.POSTGRES_MAX_OVERFLOW,
    )
    app.ctx._db_engine_ctx = connect_db(app.config.POSTGRES_URL)
    app.ctx._db_engine = await app.ctx._db_engine_ctx.__aenter__()

    if app.config.TILE_SEMAPHORE_SIZE:
        app.ctx.tile_semaphore = asyncio.Semaphore(app.config.TILE_SEMAPHORE_SIZE)

    if app.config.EXPORT_SEMAPHORE_SIZE:
        app.ctx.export_semaphore = asyncio.Semaphore(app.config.EXPORT_SEMAPHORE_SIZE)


@app.after_server_stop
async def app_disconnect_db(app, loop):

@@ -206,11 +120,6 @@ def remove_right(l, r):
    return l


@app.middleware("request")
async def inject_arg_getter(req):
    req.ctx.get_single_arg = partial(get_single_arg, req)


@app.middleware("request")
async def inject_urls(req):
    if req.app.config.FRONTEND_HTTPS:

@@ -254,6 +163,7 @@
async def inject_session(req):
    req.ctx._session_ctx = make_session()
    req.ctx.db = await req.ctx._session_ctx.__aenter__()
    sessionmaker(req.app.ctx._db_engine, class_=AsyncSession, expire_on_commit=False)()


@app.middleware("response")

@@ -340,12 +250,12 @@ from .routes import (
    info,
    login,
    stats,
    tiles,
    tracks,
    users,
    exports,
    mapdetails,
)

from .routes import tiles, mapdetails
from .routes import frontend


@@ -1,68 +0,0 @@
from collections import defaultdict
from typing import Dict, FrozenSet, Iterable

from sanic import Sanic, response
from sanic_routing.router import Route


def _add_cors_headers(request, response, methods: Iterable[str]) -> None:
    allow_methods = list(set(methods))

    if "OPTIONS" not in allow_methods:
        allow_methods.append("OPTIONS")

    origin = request.headers.get("origin")
    if origin in request.app.ctx.cors_origins:
        headers = {
            "Access-Control-Allow-Methods": ",".join(allow_methods),
            "Access-Control-Allow-Origin": origin,
            "Access-Control-Allow-Credentials": "true",
            "Access-Control-Allow-Headers": (
                "origin, content-type, accept, "
                "authorization, x-xsrf-token, x-request-id"
            ),
            "Access-Control-Expose-Headers": "content-disposition",
        }
        response.headers.extend(headers)


def add_cors_headers(request, response):
    if request.method != "OPTIONS":
        methods = [method for method in request.route.methods]
        _add_cors_headers(request, response, methods)


def _compile_routes_needing_options(routes: Dict[str, Route]) -> Dict[str, FrozenSet]:
    needs_options = defaultdict(list)
    # This is 21.12 and later. You will need to change this for older versions.
    for route in routes.values():
        if "OPTIONS" not in route.methods:
            needs_options[route.uri].extend(route.methods)

    return {uri: frozenset(methods) for uri, methods in dict(needs_options).items()}


def _options_wrapper(handler, methods):
    def wrapped_handler(request, *args, **kwargs):
        nonlocal methods
        return handler(request, methods)

    return wrapped_handler


async def options_handler(request, methods) -> response.HTTPResponse:
    resp = response.empty()
    _add_cors_headers(request, resp, methods)
    return resp


def setup_options(app: Sanic, _):
    app.router.reset()
    needs_options = _compile_routes_needing_options(app.router.routes_all)
    for uri, methods in needs_options.items():
        app.add_route(
            _options_wrapper(options_handler, methods),
            uri,
            methods=["OPTIONS"],
        )
    app.router.finalize()
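
Note: the `cors.py` helper above echoes the request's `Origin` header back only when it is on the configured allow-list; with `Access-Control-Allow-Credentials: true` a wildcard origin would be rejected by browsers, so an exact match is required. A minimal sketch of that check (origin values are made up):

    # sketch of the origin check, not the module's API
    cors_origins = ["https://portal.example.com"]

    def allow_origin_headers(request_origin):
        # Echo the origin back only if it is explicitly allowed; browsers
        # reject "Access-Control-Allow-Origin: *" when credentials are sent.
        if request_origin in cors_origins:
            return {
                "Access-Control-Allow-Origin": request_origin,
                "Access-Control-Allow-Credentials": "true",
            }
        return {}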

@@ -3,7 +3,7 @@ from contextvars import ContextVar
from contextlib import asynccontextmanager
from datetime import datetime
import os
from os.path import exists, join, dirname
from os.path import join, dirname
from json import loads
import re
import math

@@ -12,7 +12,6 @@ import random
import string
import secrets
from slugify import slugify
import logging

from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.ext.asyncio import AsyncSession

@@ -34,33 +33,26 @@ from sqlalchemy import (
    select,
    text,
    literal,
    Text,
)
from sqlalchemy.dialects.postgresql import UUID
from sqlalchemy.dialects.postgresql import HSTORE, UUID


log = logging.getLogger(__name__)

Base = declarative_base()


engine = None
sessionmaker: SessionMaker
sessionmaker = None


@asynccontextmanager
async def make_session():
    async with sessionmaker(autoflush=True) as session:
    async with sessionmaker() as session:
        yield session


async def drop_all():
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.drop_all)


async def init_models():
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.drop_all)
        await conn.execute(text('CREATE EXTENSION IF NOT EXISTS "hstore";'))
        await conn.execute(text('CREATE EXTENSION IF NOT EXISTS "postgis";'))
        await conn.execute(text('CREATE EXTENSION IF NOT EXISTS "uuid-ossp";'))

@@ -73,12 +65,10 @@ def random_string(length):


@asynccontextmanager
async def connect_db(url, pool_size=10, max_overflow=20):
async def connect_db(url):
    global engine, sessionmaker

    engine = create_async_engine(
        url, echo=False, pool_size=pool_size, max_overflow=max_overflow
    )
    engine = create_async_engine(url, echo=False)
    sessionmaker = SessionMaker(engine, class_=AsyncSession, expire_on_commit=False)

    yield engine
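
Note: on both sides `connect_db` is an async context manager that sets the module-level `engine` and `sessionmaker` before yielding. A minimal usage sketch (the connection URL and pool numbers are placeholders; the pool arguments exist on the main branch only):

    import asyncio
    from obs.api.db import connect_db, make_session

    async def main():
        async with connect_db(
            "postgresql+asyncpg://obs:obs@localhost/obs",  # placeholder URL
            pool_size=20,
            max_overflow=40,
        ) as engine:
            async with make_session() as session:
                pass  # run queries against `session` here

    asyncio.run(main())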

@@ -108,28 +98,6 @@ class Geometry(UserDefinedType):
        return func.ST_AsGeoJSON(func.ST_Transform(col, 4326), type_=self)


class LineString(UserDefinedType):
    def get_col_spec(self):
        return "geometry(LineString, 3857)"

    def bind_expression(self, bindvalue):
        return func.ST_GeomFromGeoJSON(bindvalue, type_=self)

    def column_expression(self, col):
        return func.ST_AsGeoJSON(func.ST_Transform(col, 4326), type_=self)


class GeometryGeometry(UserDefinedType):
    def get_col_spec(self):
        return "geometry(GEOMETRY, 3857)"

    def bind_expression(self, bindvalue):
        return func.ST_GeomFromGeoJSON(bindvalue, type_=self)

    def column_expression(self, col):
        return func.ST_AsGeoJSON(func.ST_Transform(col, 4326), type_=self)


class OvertakingEvent(Base):
    __tablename__ = "overtaking_event"
    __table_args__ = (Index("road_segment", "way_id", "direction_reversed"),)

@@ -157,23 +125,12 @@ class OvertakingEvent(Base):

class Road(Base):
    __tablename__ = "road"
    way_id = Column(BIGINT, primary_key=True, index=True, autoincrement=False)
    way_id = Column(BIGINT, primary_key=True, index=True)
    zone = Column(ZoneType)
    name = Column(Text)
    geometry = Column(LineString)
    name = Column(String)
    geometry = Column(Geometry)
    directionality = Column(Integer)
    oneway = Column(Boolean)
    import_group = Column(String)

    __table_args__ = (
        # We keep the index name as osm2pgsql created it, way back when.
        Index(
            "road_geometry_idx",
            "geometry",
            postgresql_using="gist",
            postgresql_with={"fillfactor": 100},
        ),
    )

    def to_dict(self):
        return {

@@ -186,34 +143,11 @@ class Road(Base):
        }


class RoadUsage(Base):
    __tablename__ = "road_usage"
    __table_args__ = (Index("road_usage_segment", "way_id", "direction_reversed"),)

    id = Column(Integer, autoincrement=True, primary_key=True, index=True)
    track_id = Column(Integer, ForeignKey("track.id", ondelete="CASCADE"))
    hex_hash = Column(String, unique=True, index=True)
    way_id = Column(BIGINT, index=True)
    time = Column(DateTime)
    direction_reversed = Column(Boolean)

    def __repr__(self):
        return f"<RoadUsage {self.id}>"

    def __hash__(self):
        return int(self.hex_hash, 16)

    def __eq__(self, other):
        return self.hex_hash == other.hex_hash
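
Note: `RoadUsage` hashes and compares by `hex_hash`, so duplicates collapse when collected into a set, which is what `import_road_usages` later in this diff relies on. A small sketch with invented values:

    # two RoadUsage rows with the same hex_hash are treated as one
    a = RoadUsage(track_id=1, hex_hash="ff01", way_id=42)
    b = RoadUsage(track_id=1, hex_hash="ff01", way_id=42)
    assert a == b
    assert len({a, b}) == 1  # int("ff01", 16) is used as the hash value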

NOW = text("NOW()")


class DuplicateTrackFileError(ValueError):
    pass


class Track(Base):
    __tablename__ = "track"
    id = Column(Integer, primary_key=True, autoincrement=True)

@@ -261,12 +195,6 @@ class Track(Base):
        Integer, ForeignKey("user.id", ondelete="CASCADE"), nullable=False
    )

    user_device_id = Column(
        Integer,
        ForeignKey("user_device.id", ondelete="RESTRICT"),
        nullable=True,
    )

    # Statistics... maybe we'll drop some of this if we can easily compute them from SQL
    recorded_at = Column(DateTime)
    recorded_until = Column(DateTime)

@@ -299,7 +227,6 @@ class Track(Base):
        if for_user_id is not None and for_user_id == self.author_id:
            result["uploadedByUserAgent"] = self.uploaded_by_user_agent
            result["originalFileName"] = self.original_file_name
            result["userDeviceId"] = self.user_device_id

        if self.author:
            result["author"] = self.author.to_dict(for_user_id=for_user_id)

@@ -401,7 +328,6 @@ class User(Base):
    updated_at = Column(DateTime, nullable=False, server_default=NOW, onupdate=NOW)
    sub = Column(String, unique=True, nullable=False)
    username = Column(String, unique=True, nullable=False)
    display_name = Column(String, nullable=True)
    email = Column(String, nullable=False)
    bio = Column(TEXT)
    image = Column(String)

@@ -409,7 +335,7 @@ class User(Base):
    api_key = Column(String)

    # This user can be matched by the email address from the auth service
    # instead of having to match by `sub`. If a matching user logs in, the
    # instead of having to match by `sub`. If a matching user logs in, the
    # `sub` is updated to the new sub and this flag is disabled. This is for
    # migrating *to* the external authentication scheme.
    match_by_username_email = Column(Boolean, server_default=false())

@@ -422,60 +348,11 @@ class User(Base):
        self.api_key = secrets.token_urlsafe(24)

    def to_dict(self, for_user_id=None):
        result = {
            "id": self.id,
            "displayName": self.display_name or self.username,
        return {
            "username": self.username,
            "bio": self.bio,
            "image": self.image,
        }
        if for_user_id == self.id:
            result["username"] = self.username
        return result

    async def rename(self, config, new_name):
        old_name = self.username

        renames = [
            (join(basedir, old_name), join(basedir, new_name))
            for basedir in [config.PROCESSING_OUTPUT_DIR, config.TRACKS_DIR]
        ]

        for src, dst in renames:
            if exists(dst):
                raise FileExistsError(
                    f"cannot move {src!r} to {dst!r}, destination exists"
                )

        for src, dst in renames:
            if not exists(src):
                log.debug("Rename user %s: Not moving %s, not found", self.id, src)
            else:
                log.info("Rename user %s: Moving %s to %s", self.id, src, dst)
                os.rename(src, dst)

        self.username = new_name


class UserDevice(Base):
    __tablename__ = "user_device"
    id = Column(Integer, autoincrement=True, primary_key=True)
    user_id = Column(Integer, ForeignKey("user.id", ondelete="CASCADE"))
    identifier = Column(String, nullable=False)
    display_name = Column(String, nullable=True)

    __table_args__ = (
        Index("user_id_identifier", "user_id", "identifier", unique=True),
    )

    def to_dict(self, for_user_id=None):
        if for_user_id != self.user_id:
            return {}

        return {
            "id": self.id,
            "identifier": self.identifier,
            "displayName": self.display_name,
        }


class Comment(Base):

@@ -501,58 +378,24 @@ class Comment(Base):
    }


class Region(Base):
    __tablename__ = "region"

    id = Column(String(24), primary_key=True, index=True)
    name = Column(Text)
    geometry = Column(GeometryGeometry)
    admin_level = Column(Integer, index=True)
    import_group = Column(String)

    __table_args__ = (
        # We keep the index name as osm2pgsql created it, way back when.
        Index(
            "region_geometry_idx",
            "geometry",
            postgresql_using="gist",
            postgresql_with={"fillfactor": 100},
        ),
    )


Comment.author = relationship("User", back_populates="authored_comments")
User.authored_comments = relationship(
    "Comment",
    order_by=Comment.created_at,
    back_populates="author",
    passive_deletes=True,
    "Comment", order_by=Comment.created_at, back_populates="author"
)

Track.author = relationship("User", back_populates="authored_tracks")
User.authored_tracks = relationship(
    "Track", order_by=Track.created_at, back_populates="author", passive_deletes=True
    "Track", order_by=Track.created_at, back_populates="author"
)

Comment.track = relationship("Track", back_populates="comments")
Track.comments = relationship(
    "Comment", order_by=Comment.created_at, back_populates="track", passive_deletes=True
    "Comment", order_by=Comment.created_at, back_populates="track"
)

OvertakingEvent.track = relationship("Track", back_populates="overtaking_events")
Track.overtaking_events = relationship(
    "OvertakingEvent",
    order_by=OvertakingEvent.time,
    back_populates="track",
    passive_deletes=True,
)

Track.user_device = relationship("UserDevice", back_populates="tracks")
UserDevice.tracks = relationship(
    "Track",
    order_by=Track.created_at,
    back_populates="user_device",
    passive_deletes=False,
    "OvertakingEvent", order_by=OvertakingEvent.time, back_populates="track"
)


@@ -8,7 +8,7 @@ import pytz
from os.path import join
from datetime import datetime

from sqlalchemy import delete, func, select, and_
from sqlalchemy import delete, select
from sqlalchemy.orm import joinedload

from obs.face.importer import ImportMeasurementsCsv

@@ -27,21 +27,12 @@ from obs.face.filter import (

from obs.face.osm import DataSource, DatabaseTileSource

from obs.api.db import OvertakingEvent, RoadUsage, Track, UserDevice, make_session
from obs.api.db import OvertakingEvent, Track, make_session
from obs.api.app import app

log = logging.getLogger(__name__)


def get_data_source():
    """
    Creates a data source based on the configuration of the portal. In *lean*
    mode, the OverpassTileSource is used to fetch data on demand. In normal
    mode, the roads database is used.
    """
    return DataSource(DatabaseTileSource())


async def process_tracks_loop(delay):
    while True:
        try:

@@ -59,7 +50,9 @@ async def process_tracks_loop(delay):
                await asyncio.sleep(delay)
                continue

            data_source = get_data_source()
            tile_source = DatabaseTileSource()
            data_source = DataSource(tile_source)

            await process_track(session, track, data_source)
        except BaseException:
            log.exception("Failed to process track. Will continue.")

@@ -73,7 +66,8 @@ async def process_tracks(tracks):

    :param tracks: A list of strings which
    """
    data_source = get_data_source()
    tile_source = DatabaseTileSource()
    data_source = DataSource(tile_source)

    async with make_session() as session:
        for track_id_or_slug in tracks:

@@ -101,30 +95,6 @@ def to_naive_utc(t):
    return t.astimezone(pytz.UTC).replace(tzinfo=None)


async def export_gpx(track, filename, name):
    import xml.etree.ElementTree as ET

    gpx = ET.Element("gpx")
    metadata = ET.SubElement(gpx, "metadata")
    ET.SubElement(metadata, "name").text = name

    trk = ET.SubElement(gpx, "trk")

    ET.SubElement(trk, "name").text = name
    ET.SubElement(trk, "type").text = "Cycling"

    trkseg = ET.SubElement(trk, "trkseg")

    for point in track:
        trkpt = ET.SubElement(
            trkseg, "trkpt", lat=str(point["latitude"]), lon=str(point["longitude"])
        )
        ET.SubElement(trkpt, "time").text = point["time"].isoformat()

    et = ET.ElementTree(gpx)
    et.write(filename, encoding="utf-8", xml_declaration=True)
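
Note: `export_gpx` only needs dict-like points with `latitude`, `longitude` and a `time` datetime. A minimal call with invented values (the function is declared `async`, so it has to be awaited from a coroutine):

    from datetime import datetime

    points = [
        {"latitude": 52.520, "longitude": 13.400, "time": datetime(2023, 5, 1, 12, 0, 0)},
        {"latitude": 52.521, "longitude": 13.402, "time": datetime(2023, 5, 1, 12, 0, 5)},
    ]
    # inside a coroutine: writes one <trk>/<trkseg> with two <trkpt> elements
    await export_gpx(points, "track.gpx", "morning-ride")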

async def process_track(session, track, data_source):
    try:
        track.processing_status = "complete"

@@ -139,17 +109,14 @@ async def process_track(session, track, data_source):
        os.makedirs(output_dir, exist_ok=True)

        log.info("Annotating and filtering CSV file")
        imported_data, statistics, track_metadata = ImportMeasurementsCsv().read(
        imported_data, statistics = ImportMeasurementsCsv().read(
            original_file_path,
            user_id="dummy",  # TODO: user username or id or nothing?
            dataset_id=Track.slug,  # TODO: use track id or slug or nothing?
            return_metadata=True,
        )

        annotator = AnnotateMeasurements(
            data_source,
            cache_dir=app.config.OBS_FACE_CACHE_DIR,
            fully_annotate_unconfirmed=True,
            data_source, cache_dir=app.config.OBS_FACE_CACHE_DIR
        )
        input_data = await annotator.annotate(imported_data)

@@ -186,69 +153,23 @@ async def process_track(session, track, data_source):
            },
        }

        track_raw_json = {
            "type": "Feature",
            "geometry": {
                "type": "LineString",
                "coordinates": [
                    [m["longitude_GPS"], m["latitude_GPS"]] for m in track_points
                ],
            },
        }

        for output_filename, data in [
            ("measurements.json", measurements_json),
            ("overtakingEvents.json", overtaking_events_json),
            ("track.json", track_json),
            ("trackRaw.json", track_raw_json),
        ]:
            target = join(output_dir, output_filename)
            log.debug("Writing file %s", target)
            with open(target, "w") as fp:
                json.dump(data, fp, indent=4)

        await export_gpx(track_points, join(output_dir, "track.gpx"), track.slug)

        log.info("Clearing old track data...")
        await clear_track_data(session, track)
        await session.commit()

        device_identifier = track_metadata.get("DeviceId")
        if device_identifier:
            if isinstance(device_identifier, list):
                device_identifier = device_identifier[0]

            log.info("Finding or creating device %s", device_identifier)
            user_device = (
                await session.execute(
                    select(UserDevice).where(
                        and_(
                            UserDevice.user_id == track.author_id,
                            UserDevice.identifier == device_identifier,
                        )
                    )
                )
            ).scalar()

            log.debug("user_device is %s", user_device)

            if not user_device:
                user_device = UserDevice(
                    user_id=track.author_id, identifier=device_identifier
                )
                log.debug("Create new device for this user")
                session.add(user_device)

            track.user_device = user_device
        else:
            log.info("No DeviceId in track metadata.")

        log.info("Import events into database...")
        await import_overtaking_events(session, track, overtaking_events)

        log.info("import road usages...")
        await import_road_usages(session, track, track_points)

        log.info("Write track statistics and update status...")
        track.recorded_at = to_naive_utc(statistics["t_min"])
        track.recorded_until = to_naive_utc(statistics["t_max"])

@@ -286,7 +207,6 @@ async def clear_track_data(session, track):
    await session.execute(
        delete(OvertakingEvent).where(OvertakingEvent.track_id == track.id)
    )
    await session.execute(delete(RoadUsage).where(RoadUsage.track_id == track.id))


async def import_overtaking_events(session, track, overtaking_events):

@@ -306,16 +226,11 @@ async def import_overtaking_events(session, track, overtaking_events):
            hex_hash=hex_hash,
            way_id=m.get("OSM_way_id"),
            direction_reversed=m.get("OSM_way_orientation", 0) < 0,
            geometry=func.ST_Transform(
                func.ST_GeomFromGeoJSON(
                    json.dumps(
                        {
                            "type": "Point",
                            "coordinates": [m["longitude"], m["latitude"]],
                        }
                    )
                ),
                3857,
            geometry=json.dumps(
                {
                    "type": "Point",
                    "coordinates": [m["longitude"], m["latitude"]],
                }
            ),
            latitude=m["latitude"],
            longitude=m["longitude"],

@@ -327,51 +242,3 @@ async def import_overtaking_events(session, track, overtaking_events):
        )

    session.add_all(event_models.values())


def get_road_usages(track_points):
    last_key = None
    last = None

    for p in track_points:
        way_id = p.get("OSM_way_id")
        direction_reversed = p.get("OSM_way_orientation", 0) < 0

        key = (way_id, direction_reversed)

        if last_key is None or last_key[0] is None:
            last = p
            last_key = key
            continue

        if last_key != key:
            if last_key[0] is not None:
                yield last
            last_key = key
            last = p

    if last is not None and last_key[0] is not None:
        yield last


async def import_road_usages(session, track, track_points):
    usages = set()
    for p in get_road_usages(track_points):
        direction_reversed = p.get("OSM_way_orientation", 0) < 0
        way_id = p.get("OSM_way_id")
        time = p["time"]

        hex_hash = hashlib.sha256(
            struct.pack("dQ", way_id, int(time.timestamp()))
        ).hexdigest()

        usages.add(
            RoadUsage(
                track_id=track.id,
                hex_hash=hex_hash,
                way_id=way_id,
                time=time.astimezone(pytz.utc).replace(tzinfo=None),
                direction_reversed=direction_reversed,
            )
        )
    session.add_all(usages)
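
Note: the usage hash packs the way ID and the Unix timestamp of the sample into 16 bytes before hashing, so the same way at the same second always yields the same `hex_hash`. A standalone sketch with invented values:

    import hashlib
    import struct
    from datetime import datetime, timezone

    way_id = 4217349  # hypothetical OSM way
    t = datetime(2023, 5, 1, 12, 0, 0, tzinfo=timezone.utc)

    # "dQ" packs way_id as a double and the timestamp as an unsigned 64-bit int
    hex_hash = hashlib.sha256(
        struct.pack("dQ", way_id, int(t.timestamp()))
    ).hexdigest()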
|
||||
|
|
|
@ -1,261 +0,0 @@
|
|||
import json
|
||||
from enum import Enum
|
||||
from contextlib import contextmanager
|
||||
import zipfile
|
||||
import io
|
||||
import re
|
||||
import math
|
||||
from sqlite3 import connect
|
||||
|
||||
import shapefile
|
||||
from obs.api.db import OvertakingEvent
|
||||
from sqlalchemy import select, func, text
|
||||
from sanic.response import raw
|
||||
from sanic.exceptions import InvalidUsage
|
||||
|
||||
from obs.api.app import api, json as json_response
|
||||
from obs.api.utils import use_request_semaphore
|
||||
|
||||
import logging
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class ExportFormat(str, Enum):
|
||||
SHAPEFILE = "shapefile"
|
||||
GEOJSON = "geojson"
|
||||
|
||||
|
||||
def parse_bounding_box(input_string):
|
||||
left, bottom, right, top = map(float, input_string.split(","))
|
||||
return func.ST_SetSRID(
|
||||
func.ST_MakeBox2D(
|
||||
func.ST_Point(left, bottom),
|
||||
func.ST_Point(right, top),
|
||||
),
|
||||
4326,
|
||||
)
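
Note: `parse_bounding_box` expects the usual `left,bottom,right,top` order in WGS84 degrees and builds a PostGIS box expression from it. For example (coordinates invented, covering roughly Germany):

    # "left,bottom,right,top" in WGS84 degrees
    expr = parse_bounding_box("5.8,47.2,15.1,55.1")
    # -> ST_SetSRID(ST_MakeBox2D(ST_Point(5.8, 47.2), ST_Point(15.1, 55.1)), 4326)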

PROJECTION_4326 = (
    'GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],'
    'AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],'
    'UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]]'
)


@contextmanager
def shapefile_zip(shape_type=shapefile.POINT, basename="events"):
    zip_buffer = io.BytesIO()
    shp, shx, dbf = (io.BytesIO() for _ in range(3))
    writer = shapefile.Writer(
        shp=shp, shx=shx, dbf=dbf, shapeType=shape_type, encoding="utf8"
    )

    yield writer, zip_buffer

    writer.balance()
    writer.close()

    zip_file = zipfile.ZipFile(zip_buffer, "a", zipfile.ZIP_DEFLATED, False)
    zip_file.writestr(f"{basename}.shp", shp.getbuffer())
    zip_file.writestr(f"{basename}.shx", shx.getbuffer())
    zip_file.writestr(f"{basename}.dbf", dbf.getbuffer())
    zip_file.writestr(f"{basename}.prj", PROJECTION_4326)
    zip_file.close()


@api.get(r"/export/events")
async def export_events(req):
    async with use_request_semaphore(req, "export_semaphore", timeout=30):
        bbox = req.ctx.get_single_arg("bbox", default="-180,-90,180,90")
        assert re.match(r"(-?\d+\.?\d+,?){4}", bbox)
        bbox = list(map(float, bbox.split(",")))

        fmt = req.ctx.get_single_arg("fmt", convert=ExportFormat)

        events = await req.ctx.db.stream(
            text(
                """
                SELECT
                    ST_AsGeoJSON(ST_Transform(geometry, 4326)) AS geometry,
                    distance_overtaker,
                    distance_stationary,
                    way_id,
                    direction,
                    speed,
                    time_stamp,
                    course,
                    zone
                FROM
                    layer_obs_events(
                        ST_Transform(ST_MakeEnvelope(:bbox0, :bbox1, :bbox2, :bbox3, 4326), 3857),
                        19,
                        NULL,
                        '1900-01-01'::timestamp,
                        '2100-01-01'::timestamp
                    )
                """
            ).bindparams(bbox0=bbox[0], bbox1=bbox[1], bbox2=bbox[2], bbox3=bbox[3])
        )

        if fmt == ExportFormat.SHAPEFILE:
            with shapefile_zip(basename="events") as (writer, zip_buffer):
                writer.field("distance_overtaker", "N", decimal=4)
                writer.field("distance_stationary", "N", decimal=4)
                writer.field("way_id", "N", decimal=0)
                writer.field("direction", "N", decimal=0)
                writer.field("course", "N", decimal=4)
                writer.field("speed", "N", decimal=4)
                writer.field("zone", "C")

                async for event in events:
                    coords = json.loads(event.geometry)["coordinates"]
                    writer.point(*coords)
                    writer.record(
                        distance_overtaker=event.distance_overtaker,
                        distance_stationary=event.distance_stationary,
                        direction=event.direction,
                        way_id=event.way_id,
                        course=event.course,
                        speed=event.speed,
                        zone=event.zone
                        # "time"=event.time,
                    )

            return raw(zip_buffer.getbuffer())

        if fmt == ExportFormat.GEOJSON:
            features = []
            async for event in events:
                geom = json.loads(event.geometry)
                features.append(
                    {
                        "type": "Feature",
                        "geometry": geom,
                        "properties": {
                            "distance_overtaker": event.distance_overtaker
                            if event.distance_overtaker is not None
                            and not math.isnan(event.distance_overtaker)
                            else None,
                            "distance_stationary": event.distance_stationary
                            if event.distance_stationary is not None
                            and not math.isnan(event.distance_stationary)
                            else None,
                            "direction": event.direction
                            if event.direction is not None
                            and not math.isnan(event.direction)
                            else None,
                            "way_id": event.way_id,
                            "course": event.course
                            if event.course is not None and not math.isnan(event.course)
                            else None,
                            "speed": event.speed
                            if event.speed is not None and not math.isnan(event.speed)
                            else None,
                            "time": event.time_stamp,
                            "zone": event.zone,
                        },
                    }
                )

            geojson = {"type": "FeatureCollection", "features": features}
            return json_response(geojson)

        raise InvalidUsage("unknown export format")


@api.get(r"/export/segments")
async def export_segments(req):
    async with use_request_semaphore(req, "export_semaphore", timeout=30):
        bbox = req.ctx.get_single_arg("bbox", default="-180,-90,180,90")
        assert re.match(r"(-?\d+\.?\d+,?){4}", bbox)
        bbox = list(map(float, bbox.split(",")))

        fmt = req.ctx.get_single_arg("fmt", convert=ExportFormat)
        segments = await req.ctx.db.stream(
            text(
                """
                SELECT
                    ST_AsGeoJSON(ST_Transform(geometry, 4326)) AS geometry,
                    way_id,
                    distance_overtaker_mean,
                    distance_overtaker_min,
                    distance_overtaker_max,
                    distance_overtaker_median,
                    overtaking_event_count,
                    usage_count,
                    direction,
                    zone,
                    offset_direction,
                    distance_overtaker_array
                FROM
                    layer_obs_roads(
                        ST_Transform(ST_MakeEnvelope(:bbox0, :bbox1, :bbox2, :bbox3, 4326), 3857),
                        11,
                        NULL,
                        '1900-01-01'::timestamp,
                        '2100-01-01'::timestamp
                    )
                WHERE usage_count > 0
                """
            ).bindparams(bbox0=bbox[0], bbox1=bbox[1], bbox2=bbox[2], bbox3=bbox[3])
        )

        if fmt == ExportFormat.SHAPEFILE:
            with shapefile_zip(shape_type=3, basename="segments") as (
                writer,
                zip_buffer,
            ):
                writer.field("distance_overtaker_mean", "N", decimal=4)
                writer.field("distance_overtaker_max", "N", decimal=4)
                writer.field("distance_overtaker_min", "N", decimal=4)
                writer.field("distance_overtaker_median", "N", decimal=4)
                writer.field("overtaking_event_count", "N", decimal=4)
                writer.field("usage_count", "N", decimal=4)
                writer.field("way_id", "N", decimal=0)
                writer.field("direction", "N", decimal=0)
                writer.field("zone", "C")

                async for segment in segments:
                    geom = json.loads(segment.st_asgeojson)
                    writer.line([geom["coordinates"]])
                    writer.record(
                        distance_overtaker_mean=segment.distance_overtaker_mean,
                        distance_overtaker_median=segment.distance_overtaker_median,
                        distance_overtaker_max=segment.distance_overtaker_max,
                        distance_overtaker_min=segment.distance_overtaker_min,
                        usage_count=segment.usage_count,
                        overtaking_event_count=segment.overtaking_event_count,
                        direction=segment.direction,
                        way_id=segment.way_id,
                        zone=segment.zone,
                    )

            return raw(zip_buffer.getbuffer())

        if fmt == ExportFormat.GEOJSON:
            features = []
            async for segment in segments:
                features.append(
                    {
                        "type": "Feature",
                        "geometry": json.loads(segment.geometry),
                        "properties": {
                            "distance_overtaker_mean": segment.distance_overtaker_mean,
                            "distance_overtaker_max": segment.distance_overtaker_max,
                            "distance_overtaker_median": segment.distance_overtaker_median,
                            "overtaking_event_count": segment.overtaking_event_count,
                            "usage_count": segment.usage_count,
                            "distance_overtaker_array": segment.distance_overtaker_array,
                            "direction": segment.direction,
                            "way_id": segment.way_id,
                            "zone": segment.zone,
                        },
                    }
                )

            geojson = {"type": "FeatureCollection", "features": features}
            return json_response(geojson)

        raise InvalidUsage("unknown export format")
@@ -1,4 +1,4 @@
from os.path import join, exists, isfile, abspath
from os.path import join, exists, isfile

import sanic.response as response
from sanic.exceptions import NotFound

@@ -6,7 +6,6 @@ from sanic.exceptions import NotFound
from obs.api.app import app

if app.config.FRONTEND_CONFIG:

    @app.get("/config.json")
    def get_frontend_config(req):
        result = {

@@ -23,7 +22,7 @@ if app.config.FRONTEND_CONFIG:
                    .replace("111", "{x}")
                    .replace("222", "{y}")
                ],
                "minzoom": 0,
                "minzoom": 12,
                "maxzoom": 14,
            },
        }

@@ -46,9 +45,6 @@ if INDEX_HTML and exists(INDEX_HTML):
            raise NotFound()

        file = join(app.config.FRONTEND_DIR, path)
        if not abspath(file).startswith(abspath(app.config.FRONTEND_DIR)):
            raise NotFound()

        if not exists(file) or not path or not isfile(file):
            return response.html(
                index_file_contents.replace("__BASE_HREF__", req.ctx.frontend_url + "/")
            )

@@ -1,8 +1,5 @@
import asyncio
import logging
import re

from requests.exceptions import RequestException
import os

from sqlalchemy import select

@@ -11,46 +8,36 @@ from oic.oic import Client
from oic.oic.message import AuthorizationResponse, RegistrationResponse
from oic.utils.authn.client import CLIENT_AUTHN_METHOD

from obs.api.app import auth, api
from obs.api.app import auth
from obs.api.db import User

from sanic.response import json, redirect
from sanicargs import parse_parameters

log = logging.getLogger(__name__)

client = Client(client_authn_method=CLIENT_AUTHN_METHOD)

# Do not show verbose library output, even when the application is in debug mode
logging.getLogger("oic").setLevel(logging.INFO)


@auth.before_server_start
async def connect_auth_client(app, loop):
    client.allow["issuer_mismatch"] = True
    try:
        client.provider_config(app.config.KEYCLOAK_URL)
        client.store_registration_info(
            RegistrationResponse(
                client_id=app.config.KEYCLOAK_CLIENT_ID,
                client_secret=app.config.KEYCLOAK_CLIENT_SECRET,
            )
    client.provider_config(app.config.KEYCLOAK_URL)
    client.store_registration_info(
        RegistrationResponse(
            client_id=app.config.KEYCLOAK_CLIENT_ID,
            client_secret=app.config.KEYCLOAK_CLIENT_SECRET,
        )
    except RequestException:
        log.exception(f"could not connect to {app.config.KEYCLOAK_URL}")
        log.info("will retry")
        await asyncio.sleep(2)
        log.info("retrying")
        await connect_auth_client(app, loop)
    )


@auth.route("/login")
async def login(req):
    next_url = req.ctx.get_single_arg("next", default=None)

@parse_parameters
async def login(req, next: str = None):
    session = req.ctx.session
    session["state"] = rndstr()
    session["nonce"] = rndstr()
    session["next"] = next_url
    session["next"] = next
    args = {
        "client_id": client.client_id,
        "response_type": "code",

@@ -92,15 +79,6 @@ async def login_redirect(req):
    preferred_username = userinfo["preferred_username"]
    email = userinfo.get("email")

    clean_username = re.sub(r"[^a-zA-Z0-9_.-]", "", preferred_username)
    if clean_username != preferred_username:
        log.warning(
            "Username %r contained invalid characters and was changed to %r",
            preferred_username,
            clean_username,
        )
        preferred_username = clean_username

    if email is None:
        raise ValueError(
            "user has no email set, please configure keycloak to require emails"
        )

@@ -138,20 +116,16 @@ async def login_redirect(req):
        user = User(sub=sub, username=preferred_username, email=email)
        req.ctx.db.add(user)
    else:
        log.info(
            "Logged in known user (id: %s, sub: %s, %s).",
            user.id,
            user.sub,
            preferred_username,
        )
        log.info("Logged in known user (id: %s, sub: %s).", user.id, user.sub)

        if email != user.email:
            log.debug("Updating user (id: %s) email from auth system.", user.id)
            user.email = email

        if preferred_username != user.username:
            log.debug("Updating user (id: %s) username from auth system.", user.id)
            await user.rename(req.app.config, preferred_username)
        # TODO: re-add username change when we can safely rename users
        # if preferred_username != user.username:
        #     log.debug("Updating user (id: %s) username from auth system.", user.id)
        #     user.username = preferred_username

    await req.ctx.db.commit()

@@ -159,15 +133,3 @@ async def login_redirect(req):

    next_ = session.pop("next", "/") or "/"
    return redirect(next_)


@api.route("/logout")
async def logout(req):
    session = req.ctx.session
    if "user_id" in session:
        del session["user_id"]

    auth_req = client.construct_EndSessionRequest(state=session["state"])
    logout_url = auth_req.request(client.end_session_endpoint)

    return redirect(logout_url + f"&post_logout_redirect_uri={req.ctx.api_url}/logout")

@@ -1,6 +1,5 @@
import json
from functools import partial
import logging
import numpy
import math

@@ -11,37 +10,53 @@ from sanic.exceptions import InvalidUsage

from obs.api.app import api
from obs.api.db import Road, OvertakingEvent, Track
from obs.api.utils import round_to


from .stats import round_to

round_distance = partial(round_to, multiples=0.001)
round_speed = partial(round_to, multiples=0.1)

log = logging.getLogger(__name__)
RAISE = object()


def get_bearing(b, a):
def get_single_arg(req, name, default=RAISE, convert=None):
    try:
        value = req.args[name][0]
    except LookupError as e:
        if default is not RAISE:
            return default
        raise InvalidUsage(f"missing `{name}`") from e

    if convert is not None:
        try:
            value = convert(value)
        except (ValueError, TypeError) as e:
            raise InvalidUsage(f"invalid `{name}`") from e

    return value


def get_bearing(a, b):
    # longitude, latitude
    dL = b[0] - a[0]
    X = numpy.cos(b[1]) * numpy.sin(dL)
    Y = numpy.cos(a[1]) * numpy.sin(b[1]) - numpy.sin(a[1]) * numpy.cos(
        b[1]
    ) * numpy.cos(dL)
    return numpy.arctan2(Y, X) + 0.5 * math.pi
    return numpy.arctan2(X, Y)
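
Note: both variants compute an initial great-circle bearing; the `arctan2(X, Y)` form returns radians clockwise from north, for inputs given in radians. A worked sketch (city coordinates approximate):

    import math
    import numpy

    def get_bearing(a, b):
        # a, b: (longitude, latitude) in radians
        dL = b[0] - a[0]
        X = numpy.cos(b[1]) * numpy.sin(dL)
        Y = numpy.cos(a[1]) * numpy.sin(b[1]) - numpy.sin(a[1]) * numpy.cos(b[1]) * numpy.cos(dL)
        return numpy.arctan2(X, Y)

    berlin = tuple(map(math.radians, (13.40, 52.52)))   # (lon, lat)
    hamburg = tuple(map(math.radians, (10.00, 53.55)))
    deg = (math.degrees(get_bearing(berlin, hamburg)) + 360) % 360
    # roughly 298 degrees, i.e. west-northwest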

# Bins for histogram on overtaker distances. 0, 0.25, ... 2.25, infinity
DISTANCE_BINS = numpy.arange(0, 2.5, 0.25).tolist() + [float('inf')]

@api.route("/mapdetails/road", methods=["GET"])
async def mapdetails_road(req):
    longitude = req.ctx.get_single_arg("longitude", convert=float)
    latitude = req.ctx.get_single_arg("latitude", convert=float)
    radius = req.ctx.get_single_arg("radius", default=100, convert=float)
    longitude = get_single_arg(req, "longitude", convert=float)
    latitude = get_single_arg(req, "latitude", convert=float)
    radius = get_single_arg(req, "radius", default=100, convert=float)

    if not (1 <= radius <= 1000):
        raise InvalidUsage("`radius` parameter must be between 1 and 1000")

    road_geometry = Road.geometry
    road_geometry = func.ST_Transform(Road.geometry, 3857)
    point = func.ST_Transform(
        func.ST_GeomFromGeoJSON(
            json.dumps(

@@ -84,25 +99,26 @@ async def mapdetails_road(req):
    arrays = numpy.array(arrays).T

    if len(arrays) == 0:
        arrays = numpy.array([[], [], [], []], dtype=float)
        arrays = numpy.array([[], [], [], []], dtype=numpy.float)

    data, mask = arrays[:-1], arrays[-1]
    data = data.astype(numpy.float64)
    mask = mask.astype(bool)
    mask = mask.astype(numpy.bool)

    def partition(arr, cond):
        return arr[:, cond], arr[:, ~cond]

    forwards, backwards = partition(data, ~mask)
    print("for", forwards.dtype, "back", backwards.dtype)

    def array_stats(arr, rounder, bins=30):
    def array_stats(arr, rounder):
        if len(arr):
            print("ARR DTYPE", arr.dtype)
            print("ARR", arr)
            arr = arr[~numpy.isnan(arr)]

        n = len(arr)

        hist, bins = numpy.histogram(arr, bins=bins)

        return {
            "statistics": {
                "count": n,

@@ -111,11 +127,6 @@ async def mapdetails_road(req):
                "max": rounder(numpy.max(arr)) if n else None,
                "median": rounder(numpy.median(arr)) if n else None,
            },
            "histogram": {
                "bins": [None if math.isinf(b) else b for b in bins.tolist()],
                "counts": hist.tolist(),
                "zone": road.zone
            },
            "values": list(map(rounder, arr.tolist())),
        }

@@ -128,13 +139,15 @@ async def mapdetails_road(req):
    # convert to degrees, as this is more natural to understand for consumers
    bearing = round_to((bearing / math.pi * 180 + 360) % 360, 1)

    print(road.geometry)

    def get_direction_stats(direction_arrays, backwards=False):
        return {
            "bearing": ((bearing + 180) % 360 if backwards else bearing)
            if bearing is not None
            else None,
            "distanceOvertaker": array_stats(direction_arrays[0], round_distance, bins=DISTANCE_BINS),
            "distanceStationary": array_stats(direction_arrays[1], round_distance, bins=DISTANCE_BINS),
            "distanceOvertaker": array_stats(direction_arrays[0], round_distance),
            "distanceStationary": array_stats(direction_arrays[1], round_distance),
            "speed": array_stats(direction_arrays[2], round_speed),
        }


@@ -4,13 +4,13 @@ from typing import Optional
from operator import and_
from functools import reduce

from sqlalchemy import distinct, select, func, desc
from sqlalchemy import select, func

from sanic.response import json
from sanicargs import parse_parameters

from obs.api.app import api
from obs.api.db import Track, OvertakingEvent, User, Region, UserDevice
from obs.api.utils import round_to
from obs.api.db import Track, OvertakingEvent, User


log = logging.getLogger(__name__)

@@ -26,12 +26,15 @@ TRACK_DURATION_ROUNDING = 120
MINUMUM_RECORDING_DATE = datetime(2010, 1, 1)


@api.route("/stats")
async def stats(req):
    user = req.ctx.get_single_arg("user", default=None)
    start = req.ctx.get_single_arg("start", default=None, convert=datetime)
    end = req.ctx.get_single_arg("end", default=None, convert=datetime)
def round_to(value: float, multiples: float) -> float:
    if value is None:
        return None
    return round(value / multiples) * multiples
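
Note: `round_to` snaps a value to the nearest multiple and passes `None` through; for example:

    round_to(1234, 120)     # -> 1200: durations rounded to 2-minute multiples
    round_to(3.14159, 0.1)  # -> 3.1 (approximately, floating point)
    round_to(None, 120)     # -> None is passed through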


@api.route("/stats")
@parse_parameters
async def stats(req, user: str = None, start: datetime = None, end: datetime = None):
    conditions = [
        Track.recorded_at != None,
        Track.recorded_at > MINUMUM_RECORDING_DATE,

@@ -45,7 +48,7 @@ async def stats(req):

    # Only the user can look for their own stats, for now
    by_user = (
        user is not None and req.ctx.user is not None and req.ctx.user.id == int(user)
        user is not None and req.ctx.user is not None and req.ctx.user.username == user
    )
    if by_user:
        conditions.append(Track.author_id == req.ctx.user.id)

@@ -92,14 +95,6 @@ async def stats(req):
            .where(track_condition)
        )
    ).scalar()
    device_count = (
        await req.ctx.db.execute(
            select(func.count(distinct(UserDevice.id)))
            .select_from(UserDevice)
            .join(Track.user_device)
            .where(track_condition)
        )
    ).scalar()

    result = {
        "numEvents": event_count,

@@ -108,7 +103,6 @@ async def stats(req):
        "trackDuration": round_to(track_duration or 0, TRACK_DURATION_ROUNDING),
        "publicTrackCount": public_track_count,
        "trackCount": track_count,
        "deviceCount": device_count,
    }

    return json(result)

@@ -176,31 +170,3 @@ async def stats(req):
#     });
#   }),
# );


@api.route("/stats/regions")
async def stats(req):
    query = (
        select(
            [
                Region.id,
                Region.name,
                func.count(OvertakingEvent.id).label("overtaking_event_count"),
            ]
        )
        .select_from(Region)
        .join(
            OvertakingEvent,
            func.ST_Within(OvertakingEvent.geometry, Region.geometry),
        )
        .group_by(
            Region.id,
            Region.name,
            Region.geometry,
        )
        .having(func.count(OvertakingEvent.id) > 0)
        .order_by(desc("overtaking_event_count"))
    )

    regions = list(map(dict, (await req.ctx.db.execute(query)).all()))
    return json(regions)


@@ -1,16 +1,11 @@
from gzip import decompress
from sqlite3 import connect
from datetime import datetime, time, timedelta
from typing import Optional, Tuple

import dateutil.parser
from sanic.exceptions import Forbidden, InvalidUsage
from sanic.response import raw

from sqlalchemy import text
from sqlalchemy import select, text
from sqlalchemy.sql.expression import table, column

from obs.api.app import app
from obs.api.utils import use_request_semaphore


def get_tile(filename, zoom, x, y):

@@ -28,84 +23,28 @@ def get_tile(filename, zoom, x, y):

    content = db.execute(
        "SELECT tile_data FROM tiles WHERE zoom_level=? AND tile_column=? AND tile_row=?",
        (zoom, x, (2**zoom - 1) - y),
        (zoom, x, (2 ** zoom - 1) - y),
    ).fetchone()
    return content and content[0] or None


def round_date(date, to="weeks", up=False):
    if to != "weeks":
        raise ValueError(f"cannot round to {to}")

    midnight = time(0, 0, 0, 0)
    start_of_day = date.date()  # ignore time
    weekday = date.weekday()

    is_rounded = date.time() == midnight and weekday == 0
    if is_rounded:
        return date

    if up:
        return datetime.combine(start_of_day + timedelta(days=7 - weekday), midnight)
    else:
        return datetime.combine(start_of_day - timedelta(days=weekday), midnight)
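
Note: `round_date` snaps timestamps to Monday 00:00 boundaries, downwards by default and upwards with `up=True`; for example:

    from datetime import datetime

    # Wednesday 2023-05-03 14:30 rounds to the surrounding Mondays:
    round_date(datetime(2023, 5, 3, 14, 30))           # -> datetime(2023, 5, 1, 0, 0)
    round_date(datetime(2023, 5, 3, 14, 30), up=True)  # -> datetime(2023, 5, 8, 0, 0)
    round_date(datetime(2023, 5, 1))                   # already Monday midnight: unchanged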

# regenerate approx. once each day
TILE_CACHE_MAX_AGE = 3600 * 24


def get_filter_options(
    req,
) -> Tuple[Optional[str], Optional[datetime], Optional[datetime]]:
    """
    Returns parsed, validated and normalized options for filtering map data, a
    tuple of

    * user_id (str|None)
    * start (datetime|None)
    * end (datetime|None)
    """
    user_id = req.ctx.get_single_arg("user", default=None, convert=int)
    if user_id is not None and (req.ctx.user is None or req.ctx.user.id != user_id):
        raise Forbidden()

    parse_date = lambda s: dateutil.parser.parse(s)
    start = req.ctx.get_single_arg("start", default=None, convert=parse_date)
    end = req.ctx.get_single_arg("end", default=None, convert=parse_date)

    start = round_date(start, to="weeks", up=False) if start else None
    end = round_date(end, to="weeks", up=True) if end else None

    if start is not None and end is not None and start >= end:
        raise InvalidUsage(
            "end date must be later than start date (note: dates are rounded to weeks)"
        )

    return user_id, start, end


@app.route(r"/tiles/<zoom:int>/<x:int>/<y:(\d+)\.pbf>")
async def tiles(req, zoom: int, x: int, y: str):
    async with use_request_semaphore(req, "tile_semaphore"):
        if app.config.get("TILES_FILE"):
            tile = get_tile(req.app.config.TILES_FILE, int(zoom), int(x), int(y))
    if app.config.get("TILES_FILE"):
        tile = get_tile(req.app.config.TILES_FILE, int(zoom), int(x), int(y))

        else:
            user_id, start, end = get_filter_options(req)

            tile = await req.ctx.db.scalar(
                text(
                    "select data from getmvt(:zoom, :x, :y, :user_id, :min_time, :max_time) as b(data, key);"
                ).bindparams(
                    zoom=int(zoom),
                    x=int(x),
                    y=int(y),
                    user_id=user_id,
                    min_time=start,
                    max_time=end,
                )
    else:
        tile = await req.ctx.db.scalar(
            text(f"select data from getmvt(:zoom, :x, :y) as b(data, key);").bindparams(
                zoom=int(zoom),
                x=int(x),
                y=int(y),
            )
        )

    gzip = "gzip" in req.headers["accept-encoding"]


@@ -1,18 +1,17 @@
import logging
import re
from datetime import date
from json import load as jsonload
from os.path import join, exists, isfile

from sanic.exceptions import InvalidUsage, NotFound, Forbidden
from sanic.response import file_stream, empty
from slugify import slugify
from sqlalchemy import select, func, and_
from sqlalchemy import select, func
from sqlalchemy.orm import joinedload

from obs.api.db import Track, User, Comment, DuplicateTrackFileError
from obs.api.app import api, require_auth, read_api_key, json
from obs.api.db import Track, Comment, DuplicateTrackFileError
from obs.api.utils import tar_of_tracks

from sanic.response import file_stream, empty
from sanic.exceptions import InvalidUsage, NotFound, Forbidden
from sanicargs import parse_parameters

log = logging.getLogger(__name__)

@@ -25,8 +24,8 @@ def normalize_user_agent(user_agent):
    return m[0] if m else None


async def _return_tracks(req, extend_query, limit, offset, order_by=None):
    if limit <= 0 or limit > 1000:
async def _return_tracks(req, extend_query, limit, offset):
    if limit <= 0 or limit > 100:
        raise InvalidUsage("invalid limit")

    if offset < 0:

@@ -41,7 +40,7 @@ async def _return_tracks(req, extend_query, limit, offset, order_by=None):
        extend_query(select(Track).options(joinedload(Track.author)))
        .limit(limit)
        .offset(offset)
        .order_by(order_by if order_by is not None else Track.created_at)
        .order_by(Track.created_at.desc())
    )

    tracks = (await req.ctx.db.execute(query)).scalars()

@@ -62,117 +61,27 @@ async def _return_tracks(req, extend_query, limit, offset, order_by=None):


@api.get("/tracks")
async def get_tracks(req):
    limit = req.ctx.get_single_arg("limit", default=20, convert=int)
    offset = req.ctx.get_single_arg("offset", default=0, convert=int)
    # author = req.ctx.get_single_arg("author", default=None, convert=int)

@parse_parameters
async def get_tracks(req, limit: int = 20, offset: int = 0, author: str = None):
    def extend_query(q):
        q = q.where(Track.public)

        # if author is not None:
        #     q = q.where(Track.author_id == author)
        if author is not None:
            q = q.where(User.username == author)

        return q

    return await _return_tracks(req, extend_query, limit, offset)


def parse_boolean(s):
    if s is None:
        return None

    s = s.lower()
    if s in ("true", "1", "yes", "y", "t"):
        return True
    if s in ("false", "0", "no", "n", "f"):
        return False

    raise ValueError("invalid value for boolean")


@api.get("/tracks/feed")
@require_auth
async def get_feed(req):
    limit = req.ctx.get_single_arg("limit", default=20, convert=int)
    offset = req.ctx.get_single_arg("offset", default=0, convert=int)
    user_device_id = req.ctx.get_single_arg("user_device_id", default=None, convert=int)

    order_by_columns = {
        "recordedAt": Track.recorded_at,
        "title": Track.title,
        "visibility": Track.public,
        "length": Track.length,
        "duration": Track.duration,
        "user_device_id": Track.user_device_id,
    }
    order_by = req.ctx.get_single_arg(
        "order_by", default=None, convert=order_by_columns.get
    )

    reversed_ = req.ctx.get_single_arg("reversed", convert=parse_boolean, default=False)
    if reversed_:
        order_by = order_by.desc()

    public = req.ctx.get_single_arg("public", convert=parse_boolean, default=None)

@parse_parameters
async def get_feed(req, limit: int = 20, offset: int = 0):
    def extend_query(q):
        q = q.where(Track.author_id == req.ctx.user.id)
        return q.where(Track.author_id == req.ctx.user.id)

        if user_device_id is not None:
            q = q.where(Track.user_device_id == user_device_id)

        if public is not None:
            q = q.where(Track.public == public)

        return q

    return await _return_tracks(req, extend_query, limit, offset, order_by)


@api.post("/tracks/bulk")
@require_auth
async def tracks_bulk_action(req):
    body = req.json
    action = body["action"]
    track_slugs = body["tracks"]

    if action not in ("delete", "makePublic", "makePrivate", "reprocess", "download"):
        raise InvalidUsage("invalid action")

    query = select(Track).where(
        and_(Track.author_id == req.ctx.user.id, Track.slug.in_(track_slugs))
    )

    files = set()

    for track in (await req.ctx.db.execute(query)).scalars():
        if action == "delete":
            await req.ctx.db.delete(track)
        elif action == "makePublic":
            if not track.public:
                track.queue_processing()
            track.public = True
        elif action == "makePrivate":
            if track.public:
                track.queue_processing()
            track.public = False
        elif action == "reprocess":
            track.queue_processing()
        elif action == "download":
            files.add(track.get_original_file_path(req.app.config))

    await req.ctx.db.commit()

    if action == "download":
        username_slug = slugify(req.ctx.user.username, separator="-")
        date_str = date.today().isoformat()
        file_basename = f"tracks_{username_slug}_{date_str}"

        await tar_of_tracks(req, files, file_basename)
        return

    return empty()
|
||||
return await _return_tracks(req, extend_query, limit, offset)
|
||||
|
||||
|
||||
@api.post("/tracks")
|
||||
|
@@ -268,7 +177,6 @@ async def get_track_data(req, slug: str):
        "measurements": "measurements.json",
        "overtakingEvents": "overtakingEvents.json",
        "track": "track.json",
        "trackRaw": "trackRaw.json",
    }

    result = {}
@@ -295,29 +203,7 @@ async def download_original_file(req, slug: str):
    if not track.is_visible_to_private(req.ctx.user):
        raise Forbidden()

    return await file_stream(
        track.get_original_file_path(req.app.config),
        mime_type="text/csv",
        filename=f"{slug}.csv",
    )


@api.get("/tracks/<slug:str>/download/track.gpx")
async def download_track_gpx(req, slug: str):
    track = await _load_track(req, slug)

    if not track.is_visible_to(req.ctx.user):
        raise Forbidden()

    file_path = join(req.app.config.PROCESSING_OUTPUT_DIR, track.file_path, "track.gpx")
    if not exists(file_path) or not isfile(file_path):
        raise NotFound()

    return await file_stream(
        file_path,
        mime_type="application/gpx+xml",
        filename=f"{slug}.gpx",
    )
    return await file_stream(track.get_original_file_path(req.app.config))


@api.put("/tracks/<slug:str>")
@@ -374,10 +260,8 @@ async def put_track(req, slug: str):


@api.get("/tracks/<slug:str>/comments")
async def get_track_comments(req, slug: str):
    limit = req.ctx.get_single_arg("limit", default=20, convert=int)
    offset = req.ctx.get_single_arg("offset", default=0, convert=int)

@parse_parameters
async def get_track_comments(req, slug: str, limit: int = 20, offset: int = 0):
    track = await _load_track(req, slug)

    comment_count = await req.ctx.db.scalar(
@@ -1,11 +1,9 @@
import logging

from sanic.response import json
from sanic.exceptions import InvalidUsage, Forbidden, NotFound
from sqlalchemy import and_, select
from sanic.exceptions import InvalidUsage

from obs.api.app import api, require_auth
from obs.api.db import UserDevice

log = logging.getLogger(__name__)

@@ -14,9 +12,7 @@ from obs.api import __version__ as version

def user_to_json(user):
    return {
        "id": user.id,
        "username": user.username,
        "displayName": user.display_name,
        "email": user.email,
        "bio": user.bio,
        "image": user.image,
@@ -30,48 +26,6 @@ async def get_user(req):
    return json(user_to_json(req.ctx.user) if req.ctx.user else None)


@api.get("/user/devices")
async def get_user_devices(req):
    if not req.ctx.user:
        raise Forbidden()

    query = (
        select(UserDevice)
        .where(UserDevice.user_id == req.ctx.user.id)
        .order_by(UserDevice.id)
    )

    devices = (await req.ctx.db.execute(query)).scalars()

    return json([device.to_dict(req.ctx.user.id) for device in devices])


@api.put("/user/devices/<device_id:int>")
async def put_user_device(req, device_id):
    if not req.ctx.user:
        raise Forbidden()

    body = req.json

    query = (
        select(UserDevice)
        .where(and_(UserDevice.user_id == req.ctx.user.id, UserDevice.id == device_id))
        .limit(1)
    )

    device = (await req.ctx.db.execute(query)).scalar()

    if device is None:
        raise NotFound()

    new_name = body.get("displayName", "").strip()
    if new_name and device.display_name != new_name:
        device.display_name = new_name
        await req.ctx.db.commit()

    return json(device.to_dict())


@api.put("/user")
@require_auth
async def put_user(req):

@@ -82,9 +36,6 @@ async def put_user(req):
        if key in data and isinstance(data[key], (str, type(None))):
            setattr(user, key, data[key])

    if "displayName" in data:
        user.display_name = data["displayName"] or None

    if "areTracksVisibleForAll" in data:
        user.are_tracks_visible_for_all = bool(data["areTracksVisibleForAll"])

@@ -1,162 +0,0 @@
import asyncio
from contextlib import asynccontextmanager
from datetime import datetime
import logging
from os.path import commonpath, join, relpath
import queue
import tarfile

import dateutil.parser
from sanic.exceptions import InvalidUsage, ServiceUnavailable

log = logging.getLogger(__name__)

RAISE = object()


def get_single_arg(req, name, default=RAISE, convert=None):
    try:
        value = req.args[name][0]
    except LookupError as e:
        if default is RAISE:
            raise InvalidUsage(f"missing `{name}`") from e

        value = default

    if convert is not None and value is not None:
        if convert is datetime or convert in ("date", "datetime"):
            convert = lambda s: dateutil.parser.parse(s)

        try:
            value = convert(value)
        except (ValueError, TypeError) as e:
            raise InvalidUsage(f"invalid `{name}`: {str(e)}") from e

    return value
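
For context, a usage sketch of this helper (illustration only, not part of the diff; the routes above access it as `req.ctx.get_single_arg`):

```python
# Hedged sketch: how get_single_arg is used by the handlers in this diff.
# `req` is a Sanic request; parse_boolean is the helper defined in the
# tracks routes above.
limit = get_single_arg(req, "limit", default=20, convert=int)  # -> int, defaults to 20
public = get_single_arg(req, "public", convert=parse_boolean, default=None)
# A missing argument without a default raises InvalidUsage (HTTP 400):
# slug = get_single_arg(req, "slug")
```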

def round_to(value: float, multiples: float) -> float:
    if value is None:
        return None
    return round(value / multiples) * multiples


def chunk_list(lst, n):
    for s in range(0, len(lst), n):
        yield lst[s : s + n]


class chunk:
    def __init__(self, iterable, n):
        self.iterable = iterable
        self.n = n

    def __iter__(self):
        if isinstance(self.iterable, list):
            yield from chunk_list(self.iterable, self.n)
            return

        it = iter(self.iterable)
        while True:
            current = []
            try:
                for _ in range(self.n):
                    current.append(next(it))
                yield current
            except StopIteration:
                if current:
                    yield current
                break

    async def __aiter__(self):
        if hasattr(self.iterable, "__iter__"):
            for item in self:
                yield item
            return

        it = self.iterable.__aiter__()
        while True:
            current = []
            try:
                for _ in range(self.n):
                    current.append(await it.__anext__())
                yield current
            except StopAsyncIteration:
                if len(current):
                    yield current
                break


async def tar_of_tracks(req, files, file_basename="tracks"):
    response = await req.respond(
        content_type="application/x-gtar",
        headers={
            "content-disposition": f'attachment; filename="{file_basename}.tar.bz2"'
        },
    )

    helper = StreamerHelper(response)

    tar = tarfile.open(name=None, fileobj=helper, mode="w|bz2", bufsize=256 * 512)

    root = commonpath(list(files))
    for fname in files:
        log.info("Write file to tar: %s", fname)
        with open(fname, "rb") as fobj:
            tarinfo = tar.gettarinfo(fname)
            tarinfo.name = join(file_basename, relpath(fname, root))
            tar.addfile(tarinfo, fobj)
            await helper.send_all()
    tar.close()
    await helper.send_all()

    await response.eof()


class StreamerHelper:
    def __init__(self, response):
        self.response = response
        self.towrite = queue.Queue()

    def write(self, data):
        self.towrite.put(data)

    async def send_all(self):
        while True:
            try:
                tosend = self.towrite.get(block=False)
                await self.response.send(tosend)
            except queue.Empty:
                break


@asynccontextmanager
async def use_request_semaphore(req, semaphore_name, timeout=10):
    """
    If configured, acquire a semaphore for the map tile request and release it
    after the context has finished.

    If the semaphore cannot be acquired within the timeout, issue a 503 Service
    Unavailable error response that describes that the database is overloaded,
    so users know what the problem is.

    Operates as a noop when the tile semaphore is not enabled.
    """
    semaphore = getattr(req.app.ctx, semaphore_name, None)

    if semaphore is None:
        yield
        return

    try:
        await asyncio.wait_for(semaphore.acquire(), timeout)

        try:
            yield
        finally:
            semaphore.release()

    except asyncio.TimeoutError:
        raise ServiceUnavailable(
            "Too many requests, database overloaded. Please retry later."
        )
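
A minimal sketch of how this context manager would be used in a tile route (handler and `render_tile` are assumed names for illustration, not part of the diff):

```python
# Hedged sketch: gate an expensive tile query behind the semaphore.
# The "tile_semaphore" attribute must have been set on app.ctx at startup;
# otherwise the context manager is a noop, as the docstring above states.
@api.get("/tiles/<zoom:int>/<x:int>/<y:int>")
async def get_tile(req, zoom, x, y):
    async with use_request_semaphore(req, "tile_semaphore"):
        return await render_tile(req, zoom, x, y)  # hypothetical helper
```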

@@ -1,64 +1,28 @@
#!/usr/bin/env python3

import math
import sys
import os
import argparse
import asyncio
import logging

import coloredlogs

from obs.api.app import app
from obs.api.db import connect_db

log = logging.getLogger(__name__)


def format_size(n, b=1024):
    if n == 0:
        return "0 B"
    if n < 0:
        return "-" + format_size(n, b)
    e = math.floor(math.log(n, b))
    prefixes = ["", "Ki", "Mi", "Gi", "Ti"] if b == 1024 else ["", "K", "M", "G", "T"]
    e = min(e, len(prefixes) - 1)
    r = n / b**e
    s = f"{r:0.2f}" if e > 0 else str(n)
    return f"{s} {prefixes[e]}B"


class AccessLogFilter(logging.Filter):
    def filter(self, record):
        if not record.msg:
            record.msg = (
                f"{record.request} - {record.status} ({format_size(record.byte)})"
            )
        return True


def main():
    debug = app.config.DEBUG

    coloredlogs.install(
        level=logging.DEBUG if app.config.get("VERBOSE", debug) else logging.INFO,
        milliseconds=True,
        isatty=True,
    logging.basicConfig(
        level=logging.DEBUG if debug else logging.INFO,
        format="%(levelname)s: %(message)s",
    )

    for ln in ["sanic.root", "sanic.error", "sanic.access"]:
        l = logging.getLogger(ln)
        for h in list(l.handlers):
            l.removeHandler(h)

    logging.getLogger("sanic.access").addFilter(AccessLogFilter())

    app.run(
        host=app.config.HOST,
        port=app.config.PORT,
        debug=debug,
        auto_reload=app.config.get("AUTO_RELOAD", debug),
        access_log=True,
    )

@@ -1,191 +0,0 @@
#!/usr/bin/env python3

import sys
import re
import msgpack

import osmium
import shapely.wkb as wkb
from shapely.ops import transform

HIGHWAY_TYPES = {
    "trunk",
    "primary",
    "secondary",
    "tertiary",
    "unclassified",
    "residential",
    "trunk_link",
    "primary_link",
    "secondary_link",
    "tertiary_link",
    "living_street",
    "service",
    "track",
    "road",
}
ZONE_TYPES = {
    "urban",
    "rural",
    "motorway",
}
URBAN_TYPES = {
    "residential",
    "living_street",
    "road",
}
MOTORWAY_TYPES = {
    "motorway",
    "motorway_link",
}

ADMIN_LEVEL_MIN = 2
ADMIN_LEVEL_MAX = 8
MINSPEED_RURAL = 60

ONEWAY_YES = {"yes", "true", "1"}
ONEWAY_REVERSE = {"reverse", "-1"}


def parse_number(tag):
    if not tag:
        return None

    match = re.search(r"[0-9]+", tag)
    if not match:
        return None

    digits = match.group(0)
    try:
        return int(digits)
    except ValueError:
        return None


def determine_zone(tags):
    highway = tags.get("highway")
    zone = tags.get("zone:traffic")

    if zone is not None:
        if "rural" in zone:
            return "rural"

        if "motorway" in zone:
            return "motorway"

        return "urban"

    # From here on we are guessing based on other tags

    if highway in URBAN_TYPES:
        return "urban"

    if highway in MOTORWAY_TYPES:
        return "motorway"

    maxspeed_source = tags.get("source:maxspeed")
    if maxspeed_source and "rural" in maxspeed_source:
        return "rural"
    if maxspeed_source and "urban" in maxspeed_source:
        return "urban"

    for key in ["maxspeed", "maxspeed:forward", "maxspeed:backward"]:
        maxspeed = parse_number(tags.get(key))
        if maxspeed is not None and maxspeed > MINSPEED_RURAL:
            return "rural"

    # default to urban if we have no idea
    return "urban"


def determine_direction(tags, zone):
    if (
        tags.get("oneway") in ONEWAY_YES
        or tags.get("junction") == "roundabout"
        or zone == "motorway"
    ):
        return 1, True

    if tags.get("oneway") in ONEWAY_REVERSE:
        return -1, True

    return 0, False


class StreamPacker:
    def __init__(self, stream, *args, **kwargs):
        self.stream = stream
        self.packer = msgpack.Packer(*args, autoreset=False, **kwargs)

    def _write_out(self):
        if hasattr(self.packer, "getbuffer"):
            chunk = self.packer.getbuffer()
        else:
            chunk = self.packer.bytes()

        self.stream.write(chunk)
        self.packer.reset()

    def pack(self, *args, **kwargs):
        self.packer.pack(*args, **kwargs)
        self._write_out()

    def pack_array_header(self, *args, **kwargs):
        self.packer.pack_array_header(*args, **kwargs)
        self._write_out()

    def pack_map_header(self, *args, **kwargs):
        self.packer.pack_map_header(*args, **kwargs)
        self._write_out()

    def pack_map_pairs(self, *args, **kwargs):
        self.packer.pack_map_pairs(*args, **kwargs)
        self._write_out()


# A global factory that creates WKB from an osmium geometry
wkbfab = osmium.geom.WKBFactory()

from pyproj import Transformer

project = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True).transform


class OSMHandler(osmium.SimpleHandler):
    def __init__(self, packer):
        self.packer = packer
        super().__init__()

    def way(self, way):
        tags = way.tags

        highway = tags.get("highway")
        if not highway or highway not in HIGHWAY_TYPES:
            return

        access = tags.get("access", None)
        bicycle = tags.get("bicycle", None)
        if access == "no" and bicycle not in ["designated", "yes", "permissive", "destination"]:
            return

        zone = determine_zone(tags)
        directionality, oneway = determine_direction(tags, zone)
        name = tags.get("name")

        geometry = wkb.loads(wkbfab.create_linestring(way), hex=True)
        geometry = transform(project, geometry)
        geometry = wkb.dumps(geometry)
        self.packer.pack(
            [b"\x01", way.id, name, zone, directionality, oneway, geometry]
        )


def main():
    with open(sys.argv[2], "wb") as fout:
        packer = StreamPacker(fout)
        osmhandler = OSMHandler(packer)
        osmhandler.apply_file(sys.argv[1], locations=True)


if __name__ == "__main__":
    main()
@@ -1,22 +1,12 @@
coloredlogs~=15.0.1
sanic==22.6.2
oic~=1.5.0
sanic~=21.9.1
oic~=1.3.0
sanic-session~=0.8.0
python-slugify~=6.1.2
motor~=3.1.1
pyyaml~=5.3.1
sanicargs~=2.1.0
sanic-cors~=1.0.1
python-slugify~=5.0.2
motor~=2.5.1
pyyaml<6
-e git+https://github.com/openmaptiles/openmaptiles-tools#egg=openmaptiles-tools
sqlparse~=0.4.3
sqlalchemy[asyncio]~=1.4.46
asyncpg~=0.27.0
pyshp~=2.3.1
alembic~=1.9.4
stream-zip~=0.0.50
msgpack~=1.0.5
osmium~=3.6.0
psycopg~=3.1.8
shapely~=2.0.1
pyproj~=3.4.1
aiohttp~=3.8.1
# sanic requires websockets and chokes on >=10 in 2022.6.2
websockets<11
sqlparse~=0.4.2
sqlalchemy[asyncio]~=1.4.25
asyncpg~=0.24.0
@@ -1 +1 @@
Subproject commit 664e4d606416417c0651ea1748d32dd36209be6a
Subproject commit 8e9395fd3cd0f1e83b4413546bc2d3cb0c726738
api/setup.py
@@ -10,25 +10,19 @@ setup(
    packages=find_packages(),
    package_data={},
    install_requires=[
        "coloredlogs~=15.0.1",
        "sanic==22.6.2",
        "sanic~=21.9.1",
        "oic>=1.3.0, <2",
        "sanic-session~=0.8.0",
        "python-slugify>=5.0.2,<6.2.0",
        "motor>=2.5.1,<3.1.2",
        "pyyaml<6",
        "sqlparse~=0.4.3",
        "sanicargs~=2.1.0",
        "sanic-cors~=1.0.1",
        "python-slugify~=5.0.2",
        "motor~=2.5.1",
        "sqlparse~=0.4.2",
        "openmaptiles-tools",  # install from git
        "pyshp>=2.2,<2.4",
        "sqlalchemy[asyncio]~=1.4.46",
        "asyncpg~=0.27.0",
        "alembic~=1.9.4",
        "stream-zip~=0.0.50",
    ],
    entry_points={
        "console_scripts": [
            "openbikesensor-api=obs.bin.openbikesensor_api:main",
            "openbikesensor-transform-osm=obs.bin.openbikesensor_transform_osm:main",
        ]
    },
)

@@ -1,108 +0,0 @@
#!/usr/bin/env python3

from dataclasses import dataclass
import asyncio
from os.path import basename, splitext
import sys
import logging

import msgpack
import psycopg

from obs.api.app import app
from obs.api.utils import chunk

log = logging.getLogger(__name__)


ROAD_BUFFER = 1000
AREA_BUFFER = 100


@dataclass
class Road:
    way_id: int
    name: str
    zone: str
    directionality: int
    oneway: int
    geometry: bytes


def read_file(filename):
    """
    Reads a file iteratively, yielding objects as they
    appear. Those may be mixed.
    """

    with open(filename, "rb") as f:
        unpacker = msgpack.Unpacker(f)
        try:
            while True:
                type_id, *data = unpacker.unpack()

                if type_id == b"\x01":
                    yield Road(*data)

        except msgpack.OutOfData:
            pass


async def import_osm(connection, filename, import_group=None):
    if import_group is None:
        import_group = splitext(basename(filename))[0]

    # Pass 1: Find IDs only
    road_ids = []
    for item in read_file(filename):
        road_ids.append(item.way_id)

    async with connection.cursor() as cursor:
        log.info("Pass 1: Delete previously imported data")

        log.debug("Delete import group %s", import_group)
        await cursor.execute(
            "DELETE FROM road WHERE import_group = %s", (import_group,)
        )

        log.debug("Delete roads by way_id")
        for ids in chunk(road_ids, 10000):
            await cursor.execute("DELETE FROM road WHERE way_id = ANY(%s)", (ids,))

        # Pass 2: Import
        log.info("Pass 2: Import roads")
        amount = 0
        for items in chunk(read_file(filename), 10000):
            amount += 10000
            log.info(f"...{amount}/{len(road_ids)} ({100*amount/len(road_ids)}%)")
            async with cursor.copy(
                "COPY road (way_id, name, zone, directionality, oneway, geometry, import_group) FROM STDIN"
            ) as copy:
                for item in items:
                    await copy.write_row(
                        (
                            item.way_id,
                            item.name,
                            item.zone,
                            item.directionality,
                            item.oneway,
                            bytes.hex(item.geometry),
                            import_group,
                        )
                    )


async def main():
    logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")

    url = app.config.POSTGRES_URL
    url = url.replace("+asyncpg", "")

    async with await psycopg.AsyncConnection.connect(url) as connection:
        for filename in sys.argv[1:]:
            log.debug("Loading file: %s", filename)
            await import_osm(connection, filename)


if __name__ == "__main__":
    asyncio.run(main())
@@ -1,93 +0,0 @@
#!/usr/bin/env python3

"""
This script downloads and/or imports regions for statistical analysis into the
PostGIS database. The regions are sourced from:

* EU countries are covered by
  [NUTS](https://ec.europa.eu/eurostat/web/gisco/geodata/reference-data/administrative-units-statistical-units/nuts).
"""

import tempfile
from dataclasses import dataclass
import json
import asyncio
from os.path import basename, splitext
import sys
import logging
from typing import Optional

import aiohttp
import psycopg

from obs.api.app import app
from obs.api.utils import chunk

log = logging.getLogger(__name__)

NUTS_URL = "https://gisco-services.ec.europa.eu/distribution/v2/nuts/geojson/NUTS_RG_01M_2021_3857.geojson"

from pyproj import Transformer

project = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True).transform
from shapely.ops import transform
from shapely.geometry import shape
import shapely.wkb as wkb


async def import_nuts(
    connection, filename=None, level: int = 3, import_group: Optional[str] = None
):
    if import_group is None:
        import_group = f"nuts{level}"

    if filename:
        log.info("Load NUTS from file")
        with open(filename) as f:
            data = json.load(f)
    else:
        log.info("Download NUTS regions from europa.eu")
        async with aiohttp.ClientSession() as session:
            async with session.get(NUTS_URL) as resp:
                data = await resp.json(content_type=None)

    async with connection.cursor() as cursor:
        log.info(
            "Delete previously imported regions with import group %s", import_group
        )
        await cursor.execute(
            "DELETE FROM region WHERE import_group = %s", (import_group,)
        )

        log.info("Import regions")
        async with cursor.copy(
            "COPY region (id, name, geometry, import_group) FROM STDIN"
        ) as copy:
            for feature in data["features"]:
                if feature["properties"]["LEVL_CODE"] == level:
                    geometry = shape(feature["geometry"])
                    # geometry = transform(project, geometry)
                    geometry = wkb.dumps(geometry)
                    geometry = bytes.hex(geometry)
                    await copy.write_row(
                        (
                            feature["properties"]["NUTS_ID"],
                            feature["properties"]["NUTS_NAME"],
                            geometry,
                            import_group,
                        )
                    )


async def main():
    logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")

    url = app.config.POSTGRES_URL
    url = url.replace("+asyncpg", "")

    async with await psycopg.AsyncConnection.connect(url) as connection:
        await import_nuts(connection, sys.argv[1] if len(sys.argv) > 1 else None)


if __name__ == "__main__":
    asyncio.run(main())
@@ -6,12 +6,9 @@ import re
import os
import glob
from os.path import normpath, abspath, join
from typing import List, Tuple


from sqlalchemy import text
import sqlparse
from openmaptiles.sqltomvt import MvtGenerator

from obs.api.app import app
from obs.api.db import connect_db, make_session

@@ -24,32 +21,6 @@ TILE_GENERATOR = normpath(
)
TILESET_FILE = join(TILE_GENERATOR, "openbikesensor.yaml")

EXTRA_ARGS = [
    # name, type, default
    ("user_id", "integer", "NULL"),
    ("min_time", "timestamp", "NULL"),
    ("max_time", "timestamp", "NULL"),
]


class CustomMvtGenerator(MvtGenerator):
    def generate_sqltomvt_func(self, fname, extra_args: List[Tuple[str, str]]) -> str:
        """
        Creates a SQL function that returns a single bytea value or null. This
        method is overridden to allow for custom arguments in the created function
        """
        extra_args_types = "".join([f", {a[1]}" for a in extra_args])
        extra_args_definitions = "".join(
            [f", {a[0]} {a[1]} DEFAULT {a[2]}" for a in extra_args]
        )

        return f"""\
DROP FUNCTION IF EXISTS {fname}(integer, integer, integer{extra_args_types});
CREATE FUNCTION {fname}(zoom integer, x integer, y integer{extra_args_definitions})
RETURNS {'TABLE(mvt bytea, key text)' if self.key_column else 'bytea'} AS $$
{self.generate_sql()};
$$ LANGUAGE SQL STABLE CALLED ON NULL INPUT;"""


def parse_pg_url(url=app.config.POSTGRES_URL):
    m = re.match(

@@ -68,10 +39,7 @@ def parse_pg_url(url=app.config.POSTGRES_URL):

async def main():
    logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
    await prepare_sql_tiles()


async def prepare_sql_tiles():
    with tempfile.TemporaryDirectory() as build_dir:
        await generate_data_yml(build_dir)
        sql_snippets = await generate_sql(build_dir)

@@ -143,20 +111,9 @@ async def generate_sql(build_dir):
    with open(filename, "rt") as f:
        sql_snippets.append(f.read())

    mvt = CustomMvtGenerator(
        tileset=TILESET_FILE,
        postgis_ver="3.0.1",
        zoom="zoom",
        x="x",
        y="y",
        gzip=True,
        test_geometry=False,  # ?
        key_column=True,
    getmvt_sql = await _run(
        f"python $(which generate-sqltomvt) {TILESET_FILE!r} --key --gzip --postgis-ver 3.0.1 --function --fname=getmvt"
    )
    getmvt_sql = mvt.generate_sqltomvt_func("getmvt", EXTRA_ARGS)

    # drop old versions of the function
    sql_snippets.append("DROP FUNCTION IF EXISTS getmvt(integer, integer, integer);")
    sql_snippets.append(getmvt_sql)

    return sql_snippets

@@ -164,11 +121,7 @@ async def generate_sql(build_dir):

async def import_sql(sql_snippets):
    statements = sum(map(sqlparse.split, sql_snippets), [])
    async with connect_db(
        app.config.POSTGRES_URL,
        app.config.POSTGRES_POOL_SIZE,
        app.config.POSTGRES_MAX_OVERFLOW,
    ):
    async with connect_db(app.config.POSTGRES_URL):
        for i, statement in enumerate(statements):
            clean_statement = sqlparse.format(
                statement,
@@ -35,7 +35,7 @@ async def main():

    args = parser.parse_args()

    async with connect_db(app.config.POSTGRES_URL, app.config.POSTGRES_POOL_SIZE, app.config.POSTGRES_MAX_OVERFLOW):
    async with connect_db(app.config.POSTGRES_URL):
        if args.tracks:
            await process_tracks(args.tracks)
        else:
@@ -1,30 +0,0 @@
#!/usr/bin/env python3
import logging
import asyncio

from sqlalchemy import text

from obs.api.app import app
from obs.api.db import connect_db, make_session

log = logging.getLogger(__name__)


async def main():
    logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
    await reimport_tracks()


async def reimport_tracks():

    async with connect_db(
        app.config.POSTGRES_URL,
        app.config.POSTGRES_POOL_SIZE,
        app.config.POSTGRES_MAX_OVERFLOW,
    ):
        async with make_session() as session:
            await session.execute(text("UPDATE track SET processing_status = 'queued';"))
            await session.commit()


if __name__ == "__main__":
    asyncio.run(main())
@@ -1,34 +1,18 @@
#!/usr/bin/env python3
import logging
import asyncio
import argparse

from obs.api.db import drop_all, init_models, connect_db
from obs.api.db import init_models, connect_db
from obs.api.app import app

log = logging.getLogger(__name__)


async def main():
    parser = argparse.ArgumentParser(
        description="drops the whole database, and possibly creates new table schema"
    )

    parser.add_argument(
        "-s",
        "--create-schema",
        action="store_true",
        help="create the schema",
    )

    args = parser.parse_args()

    logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")

    async with connect_db(app.config.POSTGRES_URL):
        await drop_all()
        if args.create_schema:
            await init_models()
        await init_models()
        log.info("Database initialized.")

@@ -1,6 +0,0 @@
#!/usr/bin/env python3

from obs.bin.openbikesensor_transform_osm import main

if __name__ == "__main__":
    main()
@@ -1,32 +0,0 @@
#!/usr/bin/env python3
import asyncio
import logging

log = logging.getLogger(__name__)

from prepare_sql_tiles import prepare_sql_tiles, _run

from import_regions import main as import_nuts

from reimport_tracks import main as reimport_tracks


async def _migrate():
    await _run("alembic upgrade head")


async def main():
    logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
    log.info("Running migrations...")
    await _migrate()
    log.info("Preparing SQL tiles...")
    await prepare_sql_tiles()
    log.info("Importing nuts regions...")
    await import_nuts()
    log.info("Nuts regions imported, scheduling reimport of tracks")
    await reimport_tracks()


if __name__ == "__main__":
    asyncio.run(main())
@@ -1,49 +0,0 @@
###################################################
# Keycloak
###################################################

OBS_KEYCLOAK_URI=login.example.com

# Postgres

OBS_KEYCLOAK_POSTGRES_USER=obs
OBS_KEYCLOAK_POSTGRES_PASSWORD=<<TODO>>
OBS_KEYCLOAK_POSTGRES_DB=obs
OBS_POSTGRES_MAX_OVERFLOW=20
OBS_POSTGRES_POOL_SIZE=40

# KeyCloak

OBS_KEYCLOAK_POSTGRES_HOST=postgres-keycloak
OBS_KEYCLOAK_ADMIN_USER=admin
OBS_KEYCLOAK_ADMIN_PASSWORD=<<TODO>>
OBS_KEYCLOAK_REALM=obs
OBS_KEYCLOAK_PORTAL_REDIRECT_URI=https://portal.example.com/*

###################################################
# Portal
###################################################

OBS_PORTAL_URI=portal.example.com

# Postgres + osm2pgsql

OBS_POSTGRES_HOST=postgres
OBS_POSTGRES_USER=obs
OBS_POSTGRES_PASSWORD=<<TODO>>
OBS_POSTGRES_DB=obs

# Portal

OBS_HOST=0.0.0.0
OBS_PORT=3000
OBS_SECRET=<<TODO>>
OBS_POSTGRES_URL=postgresql+asyncpg://obs:<<TODO>>@postgres/obs
OBS_KEYCLOAK_URL=https://login.example.com/auth/realms/obs/
OBS_KEYCLOAK_CLIENT_ID=portal
OBS_KEYCLOAK_CLIENT_SECRET=<<TODO>>
OBS_DEDICATED_WORKER="True"
OBS_DATA_DIR=/data
OBS_PROXIES_COUNT=1

###################################################
deployment/README.md (new file)
@@ -0,0 +1,205 @@
# Deploying an OpenBikeSensor Portal with Docker

## Introduction

The main idea of this document is to provide an easy docker-based
production-ready setup of the openbikesensor portal. It uses [the traefik
proxy](https://doc.traefik.io/traefik/) as a reverse proxy, which listens
on ports 80 and 443. Based on some labels, traefik routes the domains to the
corresponding docker containers.

## Before Getting Started

The guide and example configuration assume one domain, which points to the
server's IP address. This documentation uses `portal.example.com` as an
example. The API is hosted at `https://portal.example.com/api`, while the main
frontend is reachable at the domain root.

## Setup instructions

### Clone the repository

First create a folder somewhere in your system, in the example we use
`/opt/openbikesensor`, and export it as `$ROOT` to refer to it more easily.

Clone the repository to `$ROOT/source`.

```bash
export ROOT=/opt/openbikesensor
mkdir -p $ROOT
cd $ROOT
git clone --recursive https://github.com/openbikesensor/portal source/
# If you accidentally cloned without --recursive, fix it by running:
# git submodule update --init --recursive
```

Unless otherwise mentioned, commands below assume your current working
directory to be `$ROOT`.


### Configure `traefik.toml`

```bash
mkdir -p config/
cp source/deployment/examples/traefik.toml config/traefik.toml
vim config/traefik.toml
```

Configure your email in the `config/traefik.toml`. This email is used by
*Let's Encrypt* to send you some emails regarding your certificates.


### Configure `docker-compose.yaml`

```bash
cp source/deployment/examples/docker-compose.yaml docker-compose.yaml
vim docker-compose.yaml
```

* Change the domain where it occurs, such as in `Host()` rules.
* Generate a secure password for the PostgreSQL database user. You will need to
  configure this in the application later.


### Create a keycloak instance

Follow the [official guides](https://www.keycloak.org/documentation) to create
your own keycloak server. You can run the keycloak in docker and include it in
your `docker-compose.yaml`, if you like.

Documenting the details of this is out of scope for our project. Please make
sure to configure:

* An admin account for yourself
* A realm for the portal
* A client in that realm with "Access Type" set to "confidential" and a
  redirect URL of this pattern: `https://portal.example.com/login/redirect`


### Prepare database

Follow the procedure outlined in [README.md](../README.md) under "Prepare
database". Whenever the docker-compose service `api` is referenced, replace it
with `portal`, which contains the same python code as the development `api`
service, but also the frontend. For example:

```bash
# development
docker-compose run --rm api tools/prepare_sql_tiles.py
# production
docker-compose run --rm portal tools/prepare_sql_tiles.py
```

### Import OpenStreetMap data

Follow the procedure outlined in [README.md](../README.md) under "Import OpenStreetMap data".


### Configure portal

```bash
cp source/api/config.py.example config/config.py
```

Then edit `config/config.py` to your heart's content (and matching the
configuration of the keycloak). Do not forget to generate a secure secret
string.

Also set `PROXIES_COUNT = 1` in your config, even if that option is not
included in the example file. Read the
[Sanic docs](https://sanicframework.org/en/guide/advanced/proxy-headers.html)
for why this needs to be done. If your reverse proxy supports it, you can also
use a forwarded secret to secure your proxy target from spoofing. This is not
required if your application server does not listen on a public interface, but
it is recommended anyway, if possible.
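
For orientation, a minimal sketch of the relevant `config/config.py` lines (placeholder values only; the option names match those used elsewhere in this repository):

```python
# config/config.py — sketch only, replace the placeholder values
PROXIES_COUNT = 1  # one reverse proxy (traefik) in front of the app
SECRET = "<long random string>"
POSTGRES_URL = "postgresql+asyncpg://obs:<password>@postgres/obs"
KEYCLOAK_URL = "https://login.example.com/auth/realms/obs/"
KEYCLOAK_CLIENT_ID = "portal"
KEYCLOAK_CLIENT_SECRET = "<client secret from keycloak>"
```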

### Build containers and run them

```bash
docker-compose build portal
docker-compose up -d portal
```

## Running a dedicated worker

Extend your `docker-compose.yaml` with the following service:

```yaml
worker:
  image: openbikesensor-portal
  build:
    context: ./source
  volumes:
    - ./data/api-data:/data
    - ./config/config.py:/opt/obs/api/config.py
  restart: on-failure
  links:
    - postgres
  networks:
    - backend
  command:
    - python
    - tools/process_track.py
```

Change the `DEDICATED_WORKER` option in your config to `True` to stop
processing tracks in the portal container. Then restart the `portal` service
and start the `worker` service.
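
In `config/config.py` that is a single line:

```python
DEDICATED_WORKER = True  # the portal stops processing; the worker service takes over
```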

## Miscellaneous

### Logs

To read logs, run

```bash
docker-compose logs -f
```

If something went wrong, you can reconfigure your config files and rerun:

```bash
docker-compose build
docker-compose up -d
```

### Updates

Before updating, make sure that you have properly backed up your instance so you
can always roll back to a pre-update state.

### Backups

To back up your instance's private data you only need to back up the `$ROOT` folder.
This should contain everything needed to start your instance again; no persistent
data lives in docker containers. You should stop the containers for a clean backup.

This backup contains the imported OSM data as well. That is of course a lot of
redundant data, but very nice to have for a quick restore operation. If you
want to generate smaller, nonredundant backups, or backups during live
operation of the database, use a tool like `pg_dump` and extract only the
required tables:

* `overtaking_event`
* `track`
* `user` (make sure to reference `public.user`, not the postgres user table)
* `comment`

You might also instead use the `--exclude-table` option to ignore the `road`
table only (adjust connection parameters and names):

```bash
pg_dump -h localhost -d obs -U obs -n public -T road -f backup-`date +%F`.sql
```

Also back up the raw uploaded files, i.e. the `local/api-data/tracks`
directory. The processed data can be regenerated, but you can also back that
up, from `local/api-data/processing-output`.

Finally, make sure to create a backup of your keycloak instance. Refer to the
keycloak documentation for how to export its data in a restorable way. This
should work very well if you are storing keycloak data in the PostgreSQL and
exporting that with an exclusion pattern instead of an explicit list.

And then, please test your backup and restore strategy before going live, or at
least before you need it!
@@ -1,63 +0,0 @@
# Bind address of the server
# HOST = "127.0.0.1"
# PORT = 3000

# Extended log output, but slower
DEBUG = False
VERBOSE = DEBUG
AUTO_RELOAD = DEBUG

# Required to encrypt or sign sessions, cookies, tokens, etc.
# SECRET = "!!!<<<CHANGEME>>>!!!"

# Connection to the database
# POSTGRES_URL = "postgresql+asyncpg://user:pass@host/dbname"
# POSTGRES_POOL_SIZE = 20
# POSTGRES_MAX_OVERFLOW = 2 * POSTGRES_POOL_SIZE

# URL to the keycloak realm, as reachable by the API service. This is not
# necessarily its publicly reachable URL, keycloak advertises that itself.
# KEYCLOAK_URL = "http://localhost:1234/auth/realms/obs/"

# Auth client credentials
# KEYCLOAK_CLIENT_ID = "portal"
# KEYCLOAK_CLIENT_SECRET = "00000000-0000-0000-0000-000000000000"

# Whether the API should run the worker loop, or a dedicated worker is used
# DEDICATED_WORKER = True

# The root of the frontend. Needed for redirecting after login, and for CORS.
# Set to None if frontend is served by the API.
FRONTEND_URL = None
FRONTEND_HTTPS = True

# Where to find the compiled frontend assets (must include index.html), or None
# to disable serving the frontend.
FRONTEND_DIR = "../frontend/build/"

# Can be an object or a JSON string
FRONTEND_CONFIG = {
    "imprintUrl": "https://example.com/imprint",
    "privacyPolicyUrl": "https://example.com/privacy",
    "mapHome": {"zoom": 6, "longitude": 10.2, "latitude": 51.3},
    "banner": {"text": "This is a test installation.", "style": "warning"},
}

# If the API should serve generated tiles, this is the path where the tiles are
# built. This is an experimental option and probably very inefficient, a proper
# tileserver should be preferred. Set to None to disable.
TILES_FILE = None

# Path overrides:
# API_ROOT_DIR = "??"  # default: api/ inside repository
# DATA_DIR = "??"  # default: $API_ROOT_DIR/..
# PROCESSING_DIR = "??"  # default: DATA_DIR/processing
# PROCESSING_OUTPUT_DIR = "??"  # default: DATA_DIR/processing-output
# TRACKS_DIR = "??"  # default: DATA_DIR/tracks
# OBS_FACE_CACHE_DIR = "??"  # default: DATA_DIR/obs-face-cache

# Additional allowed origins for CORS headers. The FRONTEND_URL is included by
# default. Python list, or whitespace separated string.
ADDITIONAL_CORS_ORIGINS = None

# vim: set ft=python :
@@ -1,22 +0,0 @@

events {}
http {
    proxy_cache_path /data/nginx/cache levels=1:2 keys_zone=STATIC:10m
        inactive=24h max_size=1g;
    server {
        location ~* ^/tiles/\d[012]?/[^?]+$ {
            proxy_pass http://portal:3000;
            proxy_set_header Host $host:3000;
            proxy_buffering on;
            proxy_cache_methods GET HEAD;
            proxy_cache STATIC;
            proxy_cache_valid 200 1d;
            proxy_cache_use_stale error timeout invalid_header updating
                http_500 http_502 http_503 http_504;
        }
        location / {
            proxy_pass http://portal:3000;
            proxy_set_header Host $host:3000;
        }
    }
}
@@ -1,150 +0,0 @@
version: '3.5'

networks:
  gateway:
    external: true
    name: gateway
  backend:
    internal: true

services:

  ############################################################
  # Portal
  ############################################################

  postgres:
    image: "openmaptiles/postgis:7.0"
    environment:
      - POSTGRES_DB=${OBS_POSTGRES_DB}
      - POSTGRES_USER=${OBS_POSTGRES_USER}
      - POSTGRES_PASSWORD=${OBS_POSTGRES_PASSWORD}
    volumes:
      - ./data/postgres/data:/var/lib/postgresql/data
    networks:
      - backend

  portal:
    image: openbikesensor-portal
    build:
      context: ./source
    env_file: .env
    volumes:
      - ./data/api-data:${OBS_DATA_DIR}
      - ./config/config.py:/opt/obs/api/config.py
      - ./data/tiles/:/tiles
      - ./data/pbf/:/pbf
    restart: on-failure
    depends_on:
      - traefik
      - postgres
      - worker
      # - keycloak
    labels:
      - traefik.http.routers.portal.rule=Host(`${OBS_PORTAL_URI}`)
      - traefik.http.routers.portal.entrypoints=websecure
      - traefik.http.routers.portal.tls=true
      - traefik.http.routers.portal.tls.certresolver=leresolver
      - traefik.docker.network=gateway
      # - traefik.http.services.portal.loadbalancer.server.port=3000
    networks:
      - gateway
      - backend

  worker:
    image: openbikesensor-portal
    build:
      context: ./source
    env_file: .env
    volumes:
      - ./data/api-data:${OBS_DATA_DIR}
      - ./config/config.py:/opt/obs/api/config.py
    restart: on-failure
    depends_on:
      - postgres
    networks:
      - backend
    command:
      - python
      - tools/process_track.py

  ############################################################
  # Traefik
  ############################################################

  traefik:
    image: traefik:2.4.8
    restart: always
    ports:
      - "80:80"
      - "443:443"
      # The Web UI (enabled by [api] in traefik.toml)
      # - "8080:8080"

    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./config/traefik.toml:/traefik.toml
      - ./config/usersfile:/usersfile
      - ./config/acme:/acme

    networks:
      - gateway

    labels:
      # global redirect from http to https
      - "traefik.http.routers.http-catchall.rule=hostregexp(`{host:.+}`)"
      - "traefik.http.routers.http-catchall.entrypoints=web"
      # Define middlewares to be used
      - "traefik.http.routers.http-catchall.middlewares=redirect-http-to-https"
      # Configure middlewares
      - "traefik.http.middlewares.redirect-http-to-https.redirectscheme.scheme=https"

  ############################################################
  # Keycloak
  ############################################################

  keycloak:
    image: jboss/keycloak:15.1.0
    restart: always
    networks:
      - gateway
      - backend
    env_file: .env
    environment:
      # database
      - DB_VENDOR=postgres
      - DB_ADDR=${OBS_KEYCLOAK_POSTGRES_HOST}
      - DB_DATABASE=${OBS_KEYCLOAK_POSTGRES_DB}
      - DB_USER=${OBS_KEYCLOAK_POSTGRES_USER}
      - DB_PASSWORD=${OBS_KEYCLOAK_POSTGRES_PASSWORD}
      # admin user
      - KEYCLOAK_USER=${OBS_KEYCLOAK_ADMIN_USER}
      - KEYCLOAK_PASSWORD=${OBS_KEYCLOAK_ADMIN_PASSWORD}
      - PROXY_ADDRESS_FORWARDING=true
      - OBS_KEYCLOAK_PORTAL_REDIRECT_URI=${OBS_KEYCLOAK_PORTAL_REDIRECT_URI}
    depends_on:
      - traefik
      - postgres-keycloak
    labels:
      - "traefik.http.routers.login.rule=Host(`${OBS_KEYCLOAK_URI}`)"
      - "traefik.http.routers.login.entrypoints=websecure"
      - "traefik.http.routers.login.tls=true"
      - "traefik.http.routers.login.tls.certresolver=leresolver"
      # This container runs on two ports (8080/tcp, 8443/tcp). Tell traefik which one to use.
      - "traefik.http.services.login.loadbalancer.server.port=8080"
      # This container runs on more than one network. Tell traefik which one to use.
      - "traefik.docker.network=gateway"

  postgres-keycloak:
    image: postgres:15
    restart: always
    networks:
      - backend
    volumes:
      - ./data/postgres-keycloak:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=${OBS_KEYCLOAK_POSTGRES_DB}
      - POSTGRES_USER=${OBS_KEYCLOAK_POSTGRES_USER}
      - POSTGRES_PASSWORD=${OBS_KEYCLOAK_POSTGRES_PASSWORD}
    labels:
      - traefik.enable=false
deployment/examples/docker-compose.yaml (new file)
@@ -0,0 +1,78 @@
version: '3'

networks:
  gateway:
    external: true
    name: gateway
  backend:
    internal: true

services:
  postgres:
    image: "openmaptiles/postgis:6.0"
    environment:
      POSTGRES_USER: obs
      POSTGRES_PASSWORD: obs
      POSTGRES_DB: obs
    volumes:
      - ./data/postgres/data:/var/lib/postgresql/data

  portal:
    image: openbikesensor-portal
    build:
      context: ./source
    volumes:
      - ./data/api-data:/data
      - ./config/config.py:/opt/obs/api/config.py
      - ./data/tiles/:/tiles
    restart: on-failure
    depends_on:
      - postgres
      # if you introduce a dockerized keycloak instance within this compose also:
      # - keycloak
    labels:
      - traefik.http.routers.portal.rule=Host(`portal.example.com`)
      - traefik.http.routers.portal.entrypoints=websecure
      - traefik.http.routers.portal.tls=true
      - traefik.http.routers.portal.tls.certresolver=leresolver
      - traefik.docker.network=gateway
      - traefik.http.services.whoami.loadbalancer.server.port=80
    networks:
      - gateway
      - backend

  traefik:
    image: traefik:2.4.8
    restart: always
    ports:
      - "80:80"
      - "443:443"
      # The Web UI (enabled by [api] in traefik.toml)
      # - "8080:8080"

    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./config/traefik.toml:/traefik.toml
      - ./config/usersfile:/usersfile
      - ./config/acme:/acme

    networks:
      - gateway

    labels:
      # global redirect from http to https
      - "traefik.http.routers.http-catchall.rule=hostregexp(`{host:.+}`)"
      - "traefik.http.routers.http-catchall.entrypoints=web"
      # Define middlewares to be used
      - "traefik.http.routers.http-catchall.middlewares=redirect-http-to-https"
      # Configure middlewares
      - "traefik.http.middlewares.redirect-http-to-https.redirectscheme.scheme=https"

      # Show Traefik Dashboard. Enable the dashboard in traefik.toml if you use these.
      # - "traefik.http.routers.traefik.rule=Host(`traefik.example.com`)"
      # - "traefik.http.routers.traefik.service=api@internal"
      # - "traefik.http.routers.traefik.tls=true"
      # - "traefik.http.routers.traefik.entrypoints=websecure"
      # - "traefik.http.routers.traefik.tls.certresolver=leresolver"
      # - "traefik.http.routers.traefik.middlewares=basic-auth"
      # - "traefik.http.middlewares.basic-auth.basicauth.usersfile=/usersfile"
@@ -8,7 +8,7 @@ version: '3'

services:
  postgres:
    image: "openmaptiles/postgis:7.0"
    image: "openmaptiles/postgis:6.0"
    environment:
      POSTGRES_USER: obs
      POSTGRES_PASSWORD: obs

@@ -20,7 +20,6 @@ services:

  api:
    image: openbikesensor-api
    tty: true
    build:
      context: ./api/
      dockerfile: Dockerfile

@@ -29,15 +28,10 @@ services:
      - ./api/scripts/obs:/opt/obs/scripts/obs
      - ./api/tools:/opt/obs/api/tools
      - ./api/config.dev.py:/opt/obs/api/config.py
      - ./api/config.overrides.py:/opt/obs/api/config.overrides.py
      - ./frontend/build:/opt/obs/frontend/build
      - ./tile-generator:/opt/obs/tile-generator
      - ./local/api-data:/data
      - ./tile-generator/data/:/tiles
      - ./api/migrations:/opt/obs/api/migrations
      - ./api/alembic.ini:/opt/obs/api/alembic.ini
      - ./local/pbf:/pbf
      - ./local/obsdata:/obsdata
    depends_on:
      - postgres
      - keycloak

@@ -49,7 +43,6 @@ services:

  worker:
    image: openbikesensor-api
    tty: true
    build:
      context: ./api/
      dockerfile: Dockerfile

@@ -58,7 +51,6 @@ services:
      - ./api/scripts/obs:/opt/obs/scripts/obs
      - ./api/tools:/opt/obs/api/tools
      - ./api/config.dev.py:/opt/obs/api/config.py
      - ./api/config.overrides.py:/opt/obs/api/config.overrides.py
      - ./local/api-data:/data
    depends_on:
      - postgres
@@ -1,103 +0,0 @@
# Importing OpenStreetMap data

The application requires a lot of data from the OpenStreetMap to work.

The required information is stored in the PostgreSQL database and used when
processing tracks, as well as for vector tile generation. The process applies
to both development and production setups. For development, you should choose a
small area for testing, such as your local county or city, to keep the amount
of data small. For production use you have to import the whole region you are
serving.

## General pipeline overview

1. Download OpenStreetMap data as one or more `.osm.pbf` files.
2. Transform this data to generate geometry data for all roads and regions, so
   we don't need to look up nodes separately. This step requires a lot of CPU
   and memory, so it can be done "offline" on a high power machine.
3. Import the transformed data into the PostgreSQL/PostGIS database.

## Community hosted transformed data

Since the first two steps are the same for everybody, the community will soon
provide a service where relatively up-to-date transformed data can be
downloaded for direct import. Stay tuned.

## Download data

[GeoFabrik](https://download.geofabrik.de) kindly hosts extracts of the
OpenStreetMap planet by region. Download all regions you're interested in from
there in `.osm.pbf` format, with the tool of your choice, e.g.:

```bash
wget -P local/pbf/ https://download.geofabrik.de/europe/germany/baden-wuerttemberg-latest.osm.pbf
```

## Transform data

To transform downloaded data, you can either use the docker image from a
development or production environment, or locally install the API into your
python environment. Then run the `api/tools/transform_osm.py` script on the data.

```bash
api/tools/transform_osm.py baden-wuerttemberg-latest.osm.pbf baden-wuerttemberg-latest.msgpack
```

In dockerized setups, make sure to mount your data somewhere in the container
and also mount a directory where the result can be written. The development
setup takes care of this, so you can use:

```bash
docker-compose run --rm api tools/transform_osm.py \
  /pbf/baden-wuerttemberg-latest.osm.pbf /obsdata/baden-wuerttemberg-latest.msgpack
```

Repeat this command for every file you want to transform.

## Import transformed data

The command for importing looks like this:

```bash
api/tools/import_osm.py baden-wuerttemberg-latest.msgpack
```

This tool reads your application config from `config.py`, so set that up first
as if you were setting up your application.

In dockerized setups, make sure to mount your data somewhere in the container.
Again, the development setup takes care of this, so you can use:

```bash
docker-compose run --rm api tools/import_osm.py \
  /obsdata/baden-wuerttemberg-latest.msgpack
```

The import process should take a few seconds to minutes, depending on the area
size. You can run the process multiple times, with the same or different area
files, to import or update the data. You can update only one region and leave
the others as they are, or add more filenames to the command line to
bulk-import data.

## How this works

* The transformation is done with a python script that uses
  [pyosmium](https://osmcode.org/pyosmium/) to read the `.osm.pbf` file. This
  script then filters the data for only the required objects (such as road
  segments and administrative areas), and extracts the interesting information
  from those objects.
* The node geolocations are looked up to generate a geometry for each object.
  This requires a lot of memory to run efficiently.
* The geometry is projected to [Web Mercator](https://epsg.io/3857) in this
  step to avoid continuous transformation when tiles are generated later. Most
  operations will work fine in this projection. Projection is done with the
  [pyproj](https://pypi.org/project/pyproj/) library.
* The output is written to a binary file in a very simple format using
  [msgpack](https://github.com/msgpack/msgpack-python), which is way more
  efficient than (Geo-)JSON, for example. This format is streamable, so the
  generated file is never fully written or read into memory. A sketch of this
  streaming roundtrip follows this list.
* The import script reads the msgpack file and sends it to the database using
  [psycopg](https://www.psycopg.org/). This is done because it supports
  PostgreSQL's `COPY FROM` statement, which enables much faster writes to the
  database than a traditional `INSERT VALUES`. The file is streamed directly
  to the database, so it is never read into memory.
|
|
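
The following is a minimal, illustrative sketch of that streaming idea, not the
portal's actual import code. The table layout, record structure and connection
string here are hypothetical; it only shows how a streaming msgpack unpacker
can feed psycopg's `COPY ... FROM STDIN` support without loading the file into
memory:

```python
import msgpack
import psycopg


def import_msgpack(path, dsn="dbname=obs user=obs"):
    with open(path, "rb") as f, psycopg.connect(dsn) as conn:
        # The Unpacker reads the file incrementally, so the whole file
        # is never held in memory at once.
        unpacker = msgpack.Unpacker(f, raw=False)
        with conn.cursor() as cur:
            # COPY is much faster than issuing one INSERT per row.
            with cur.copy("COPY road (way_id, name) FROM STDIN") as copy:
                for record in unpacker:
                    # hypothetical record layout
                    copy.write_row((record["way_id"], record["name"]))


import_msgpack("baden-wuerttemberg-latest.msgpack")
```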
@ -1,414 +0,0 @@
# Deploying an OpenBikeSensor Portal with Docker

## Introduction

The main idea of this document is to provide an easy docker-based
production-ready setup of the openbikesensor portal. It uses [the traefik
proxy](https://doc.traefik.io/traefik/) as a reverse proxy, which listens
on ports 80 and 443. Based on labels, traefik routes the domains to the
corresponding docker containers.

## Requirements

This guide requires a Linux system on which `docker` and `docker-compose` are installed.
Ensure that your system is up to date.

> TODO

```bash
apt install docker.io docker-compose pwgen
```

## Before Getting Started

The example configurations assume two domains which point to the
server's IP address. This documentation uses `portal.example.com` and
`login.example.com`. The API is hosted at `https://portal.example.com/api`,
while the main frontend is reachable at the domain root.

## Setup instructions

First of all, log in to your system via SSH.

### Create working directory

Create a folder somewhere on your system; in this guide we use
`/opt/openbikesensor`:

```bash
mkdir /opt/openbikesensor
```

### Clone the repository

Clone the repository to `/opt/openbikesensor/`:

```bash
cd /opt/openbikesensor/
git clone --recursive https://github.com/openbikesensor/portal source/
# If you accidentally cloned without --recursive, fix it by running:
# git submodule update --init --recursive
```

### Copy predefined configuration files

```bash
mkdir -p /opt/openbikesensor/config
cd /opt/openbikesensor/
cp -r source/deployment/config source/deployment/docker-compose.yaml source/deployment/.env .
```

### Create a Docker network

```bash
docker network create gateway
```

### Traefik

#### Configure `traefik.toml`

```bash
cd /opt/openbikesensor/
nano config/traefik.toml
```

Configure your email in `config/traefik.toml`. This email is used by
*Let's Encrypt* to send you emails regarding your certificates.

#### Start Traefik

```bash
cd /opt/openbikesensor/
docker-compose up -d traefik
docker-compose logs -f traefik
```

> traefik_1 | time="2022-01-03T13:02:36Z" level=info msg="Configuration loaded from file: /traefik.toml"

### Generate passwords

Generate three passwords, for example with `pwgen`:

```bash
pwgen -n 20
```

They will be used in the next steps.

### KeyCloak

#### Configure `.env`

```bash
cd /opt/openbikesensor/
nano .env
```

Configure:

* `OBS_KEYCLOAK_URI`:
  * The subdomain of your KeyCloak
* `OBS_KEYCLOAK_POSTGRES_PASSWORD`:
  * One of the generated passwords, for the KeyCloak postgres
* `OBS_KEYCLOAK_ADMIN_PASSWORD`:
  * One of the generated passwords, for the KeyCloak admin
* `OBS_KEYCLOAK_PORTAL_REDIRECT_URI`:
  * The redirect URI, e.g. the subdomain of your portal (ensure it ends with `/*`)

#### Start KeyCloak

```bash
docker-compose up -d keycloak
docker-compose logs -f keycloak
```

Wait until postgres and KeyCloak are started:

> keycloak_1 | 13:08:55,558 INFO [org.jboss.as] (Controller Boot Thread) WFLYSRV0051: Admin console listening on http://127.0.0.1:9990

Open:

* https://login.example.com/
* Test login to the admin console with your admin account

#### Configure Realm and Client

Jump into the KeyCloak container:

```bash
docker-compose exec keycloak /bin/bash
```

Since we configured the `.env` file, we can now run the following commands
to create a realm and a client:

```bash
# Login
/opt/jboss/keycloak/bin/kcadm.sh config credentials --server http://localhost:8080/auth --realm master --user $KEYCLOAK_USER --password $KEYCLOAK_PASSWORD

# Create Realm
/opt/jboss/keycloak/bin/kcadm.sh create realms -s realm=$OBS_KEYCLOAK_REALM -s enabled=true -o

# Create a client and remember the unique id of the client
CID=$(/opt/jboss/keycloak/bin/kcadm.sh create clients -r $OBS_KEYCLOAK_REALM -s clientId=portal -s "redirectUris=[\"$OBS_KEYCLOAK_PORTAL_REDIRECT_URI\"]" -i)

# Create a secret for the client
/opt/jboss/keycloak/bin/kcadm.sh create clients/$CID/client-secret -r $OBS_KEYCLOAK_REALM

# Get the secret of the client
/opt/jboss/keycloak/bin/kcadm.sh get clients/$CID/client-secret -r $OBS_KEYCLOAK_REALM
```

Exit the container with `exit`. Configure the client secret:

```bash
cd /opt/openbikesensor/
nano .env
```

Configure:

* `OBS_KEYCLOAK_CLIENT_SECRET`:
  * Use the obtained client secret

#### Create a user

* Log in to your KeyCloak with the admin user and select the realm `obs`
* Create a user with username and email for the realm `obs` (*Hint*: email is required by the portal)
* Configure a password in the tab `Credentials` as well

### Portal

#### Configure Postgres

```bash
cd /opt/openbikesensor/
nano .env
```

Configure:

* `OBS_POSTGRES_HOST`:
  * This should be the postgres container, e.g. `postgres`
* `OBS_POSTGRES_USER`:
  * The default postgres user is `obs`
* `OBS_POSTGRES_PASSWORD`:
  * Use one of the generated passwords for postgres
* `OBS_POSTGRES_DB`:
  * The default postgres database is `obs`
* `OBS_POSTGRES_URL`:
  * Use the same information as above to configure the `POSTGRES_URL`;
    this one is used by the portal.

#### Start Postgres for the portal

```bash
cd /opt/openbikesensor/
docker-compose up -d postgres
docker-compose logs -f postgres
```

Wait until it has started:

> postgres_1 | PostgreSQL init process complete; ready for start up.

#### Build the portal image

```bash
cd /opt/openbikesensor/
docker-compose build portal
```

*Hint*: This may take up to 10 minutes. In the future, we will provide a prebuilt image.

#### Prepare database

Run the following script to prepare the database:

```bash
docker-compose run --rm portal tools/upgrade.py
```

For more details, see [README.md](../README.md) under "Prepare database".

#### Import OpenStreetMap data

Follow [these instructions](./osm-import.md).

#### Configure portal

The portal can be configured via env vars or via `config.py`.
It's important to know that `config.py` overrides the env vars.
All env vars start with `OBS_` and are handled by the application without the prefix.
For example, the env var `OBS_SECRET` is the same as `SECRET` within `config.py` and will be `SECRET` within the application.
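
The following is a minimal sketch of this prefix convention, not the portal's
actual config loader; it only illustrates how `OBS_`-prefixed variables map to
unprefixed settings:

```python
import os


def load_obs_settings(prefix="OBS_"):
    """Collect all OBS_*-prefixed env vars, stripping the prefix."""
    return {
        key[len(prefix):]: value
        for key, value in os.environ.items()
        if key.startswith(prefix)
    }


# With OBS_SECRET=abc in the environment, this yields {"SECRET": "abc"}.
print(load_obs_settings())
```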

```bash
cd /opt/openbikesensor/
nano .env
```

Configure:

* `OBS_PORTAL_URI`:
  * The subdomain of your portal
* `OBS_SECRET`:
  * Generate a UUID with `uuidgen` and use it as the secret
* `OBS_POSTGRES_URL`:
  * Should be configured already
* `OBS_KEYCLOAK_URL`:
  * You can find it as the `issuer` when you click on *OpenID Endpoint Configuration* in the realm `obs`
* `OBS_KEYCLOAK_CLIENT_SECRET`:
  * Should be configured already
* `OBS_DEDICATED_WORKER`:
  * Should be set to `"True"`, since the worker will be started with the portal
* `OBS_DATA_DIR`:
  * The data dir must be the same for the portal and the worker.
    The default is `/data` within the containers
* `OBS_PROXIES_COUNT`:
  * This sets `PROXIES_COUNT = 1` in your config
  * Read the [Sanic docs](https://sanicframework.org/en/guide/advanced/proxy-headers.html)
    for why this needs to be done. If your reverse proxy supports it, you can also
    use a forwarded secret to secure your proxy target from spoofing. This is not
    required if your application server does not listen on a public interface, but
    it is recommended anyway, if possible.

Have a look into `config.py` to see which other variables may affect you.

#### Start the portal

```bash
cd /opt/openbikesensor/
docker-compose up -d portal
docker-compose logs -f portal worker
```

> portal_1 | [2022-01-03 13:37:48 +0000] [1] [INFO] Goin' Fast @ http://0.0.0.0:3000

This also starts a dedicated worker container to handle the tracks.

#### Test the portal

* Open: https://portal.example.com/ (URL depends on your setup)
* Log in with the user
* Upload a track via *My Tracks*

You should see something like:

> worker_1 | INFO: Track uuqvcvlm imported.

When you click on *My Tracks*, you should see it on a map.

#### Configure the map position

Open the tab *Map* and zoom to the desired position. The URL contains the corresponding GPS position,
for example:

> 14/53.86449349032097/10.696108517499198

Configure the map position in `config.py` by setting `mapHome` in the variable `FRONTEND_CONFIG`, and restart the portal:

```bash
cd /opt/openbikesensor/
nano config/config.py

docker-compose restart portal
```
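
The relevant part of `config.py` might then look like the following sketch. The
exact shape is an assumption here; the key names mirror the frontend's `mapHome`
config (`zoom`, `latitude`, `longitude`), and the coordinates are just the
example position from above:

```python
# config.py (excerpt) -- illustrative sketch, adjust to your own setup
FRONTEND_CONFIG = {
    # ... other frontend settings ...
    "mapHome": {
        "zoom": 14,
        "latitude": 53.86449349032097,
        "longitude": 10.696108517499198,
    },
}
```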

**Hint**: You may need to disable the browser cache to see the change.

The *Map* tab should now open at the configured map section.
Once you have uploaded some tracks, your map should show a color overlay on the streets.

## Miscellaneous

### Logs

To read the logs, run:

```bash
docker-compose logs -f
```

If something went wrong, you can reconfigure your config files and rerun:

```bash
docker-compose up -d
```

### Updates

Before updating, make sure that you have properly backed up your instance so you
can always roll back to a pre-update state.

#### Migrating

Migrations are done with
[Alembic](https://alembic.sqlalchemy.org/en/latest/index.html); please refer to
its documentation for help. Most of the time, running this command will do all
the migrations you need:

```bash
docker-compose run --rm portal alembic upgrade head
```

You are advised to create a backup (see below) before running a migration, and
to shut down the services before the migration and start them afterwards.

### Backups

To back up your instance's private data, you only need to back up the `$ROOT` folder.
This should contain everything needed to start your instance again; no persistent
data lives in docker containers. You should stop the containers for a clean backup.

This backup contains the imported OSM data as well. That is of course a lot of
redundant data, but very nice to have for a quick restore operation. If you
want to generate smaller, nonredundant backups, or backups during live
operation of the database, use a tool like `pg_dump` and extract only the
required tables:

* `road_usage`
* `overtaking_event`
* `track`
* `user` (make sure to reference `public.user`, not the postgres user table)
* `comment`

You might also instead use the `--exclude-table` option to ignore the `road`
table only (adjust connection parameters and names):

```bash
pg_dump -h localhost -d obs -U obs -n public -T road -f backup-`date +%F`.sql
```

Also back up the raw uploaded files, i.e. the `local/api-data/tracks`
directory. The processed data can be regenerated, but you can also back that
up, from `local/api-data/processing-output`.

Finally, make sure to create a backup of your KeyCloak instance. Refer to the
KeyCloak documentation for how to export its data in a restorable way. This
should work very well if you are storing KeyCloak data in PostgreSQL and
exporting that with an exclusion pattern instead of an explicit list.

And then, please test your backup and restore strategy before going live, or at
least before you need it!

### Connecting to the PostgreSQL database

Here are the quick steps for connecting to your PostgreSQL database, should you
need that:

* Add the `gateway` network to your `postgres` service.
* Add a port forwarding to your `postgres` service:
  ```yaml
  ports:
    - 127.0.0.1:25432:5432
  ```
* Run `docker-compose up -d postgres` again
* You can now connect from your server to the PostgreSQL service with:
  ```
  psql -h localhost -U obs -d obs -p 25432
  ```
  You will need your database password for the connection.
* If you do not want to install `psql` outside your container, you can use an
  SSH tunnel from your local machine to your server and run `psql` locally.

@ -12,7 +12,7 @@
|
|||
"obsMapSource": {
|
||||
"type": "vector",
|
||||
"tiles": ["https://portal.example.com/tiles/{z}/{x}/{y}.pbf"],
|
||||
"minzoom": 0,
|
||||
"minzoom": 12,
|
||||
"maxzoom": 14
|
||||
}
|
||||
}
|
||||
|
|
frontend/package-lock.json (generated, 4753 lines): diff suppressed because it is too large
|
@ -12,11 +12,7 @@
|
|||
"classnames": "^2.3.1",
|
||||
"colormap": "^2.3.2",
|
||||
"downloadjs": "^1.4.7",
|
||||
"echarts": "^5.3.2",
|
||||
"echarts-for-react": "^3.0.2",
|
||||
"fomantic-ui-less": "^2.8.8",
|
||||
"i18next-browser-languagedetector": "^6.1.4",
|
||||
"i18next-http-backend": "^1.4.1",
|
||||
"immer": "^9.0.7",
|
||||
"luxon": "^1.28.0",
|
||||
"maplibre-gl": "^1.15.2",
|
||||
|
@ -29,9 +25,7 @@
|
|||
"proj4": "^2.7.5",
|
||||
"react": "^17.0.2",
|
||||
"react-dom": "^17.0.2",
|
||||
"react-helmet": "^6.1.0",
|
||||
"react-hook-form": "^6.15.8",
|
||||
"react-i18next": "^11.18.1",
|
||||
"react-map-gl": "^6.1.17",
|
||||
"react-markdown": "^5.0.3",
|
||||
"react-redux": "^7.2.6",
|
||||
|
@ -44,13 +38,11 @@
|
|||
"sass": "^1.43.5",
|
||||
"semantic-ui-react": "^2.0.4",
|
||||
"ts-loader": "^9.2.6",
|
||||
"typescript": "^4.7.4",
|
||||
"yaml-loader": "^0.8.0"
|
||||
"typescript": "^4.5.2"
|
||||
},
|
||||
"eslintConfig": {
|
||||
"extends": [
|
||||
"react-app",
|
||||
"plugin:prettier/recommended"
|
||||
"react-app"
|
||||
]
|
||||
},
|
||||
"browserslist": {
|
||||
|
@ -80,12 +72,8 @@
|
|||
"@types/react-router-dom": "^5.3.2",
|
||||
"babel-loader": "^8.2.3",
|
||||
"css-loader": "^5.2.7",
|
||||
"eslint-config-prettier": "^8.5.0",
|
||||
"eslint-config-react-app": "^7.0.1",
|
||||
"eslint-plugin-prettier": "^4.2.1",
|
||||
"html-webpack-plugin": "^5.5.0",
|
||||
"less-loader": "^10.2.0",
|
||||
"prettier": "^2.7.1",
|
||||
"react-refresh": "^0.11.0",
|
||||
"style-loader": "^3.3.1",
|
||||
"webpack": "^5.64.4",
|
||||
|
|
|
@ -69,6 +69,7 @@
|
|||
}
|
||||
|
||||
.pageTitle a {
|
||||
font-family: 'Open Sans Condensed';
|
||||
font-weight: 600;
|
||||
font-size: 18pt;
|
||||
|
||||
|
@ -119,15 +120,6 @@
|
|||
}
|
||||
}
|
||||
|
||||
@media @mobile {
|
||||
.menu.menu {
|
||||
> :global(.ui.container) {
|
||||
height: @menuHeightMobile;
|
||||
align-items: stretch;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
.banner {
|
||||
padding: 8px;
|
||||
z-index: 100;
|
||||
|
|
|
@ -6,16 +6,11 @@ import {BrowserRouter as Router, Switch, Route, Link} from 'react-router-dom'
|
|||
import {useObservable} from 'rxjs-hooks'
|
||||
import {from} from 'rxjs'
|
||||
import {pluck} from 'rxjs/operators'
|
||||
import {Helmet} from 'react-helmet'
|
||||
import {useTranslation} from 'react-i18next'
|
||||
|
||||
import {useConfig} from 'config'
|
||||
import styles from './App.module.less'
|
||||
import {AVAILABLE_LOCALES, setLocale} from 'i18n'
|
||||
|
||||
import {
|
||||
AcknowledgementsPage,
|
||||
ExportPage,
|
||||
HomePage,
|
||||
LoginRedirectPage,
|
||||
LogoutPage,
|
||||
|
@ -26,7 +21,6 @@ import {
|
|||
TrackPage,
|
||||
TracksPage,
|
||||
UploadPage,
|
||||
MyTracksPage,
|
||||
} from 'pages'
|
||||
import {Avatar, LoginButton} from 'components'
|
||||
import api from 'api'
|
||||
|
@ -62,60 +56,44 @@ function Banner({text, style = 'warning'}: {text: string; style: 'warning' | 'in
|
|||
}
|
||||
|
||||
const App = connect((state) => ({login: state.login}))(function App({login}) {
|
||||
const {t} = useTranslation()
|
||||
const config = useConfig()
|
||||
const apiVersion = useObservable(() => from(api.get('/info')).pipe(pluck('version')))
|
||||
|
||||
const hasMap = Boolean(config?.obsMapSource)
|
||||
|
||||
React.useEffect(() => {
|
||||
api.loadUser()
|
||||
}, [])
|
||||
|
||||
return config ? (
|
||||
<Router basename={config.basename}>
|
||||
<Helmet>
|
||||
<meta charSet="utf-8" />
|
||||
<title>OpenBikeSensor Portal</title>
|
||||
</Helmet>
|
||||
{config?.banner && <Banner {...config.banner} />}
|
||||
<Menu className={styles.menu} stackable>
|
||||
<Menu className={styles.menu}>
|
||||
<Container>
|
||||
<Link to="/" component={MenuItemForLink} header className={styles.pageTitle}>
|
||||
OpenBikeSensor
|
||||
</Link>
|
||||
|
||||
{hasMap && (
|
||||
{config?.obsMapSource && (
|
||||
<Link component={MenuItemForLink} to="/map" as="a">
|
||||
{t('App.menu.map')}
|
||||
Map
|
||||
</Link>
|
||||
)}
|
||||
|
||||
<Link component={MenuItemForLink} to="/tracks" as="a">
|
||||
{t('App.menu.tracks')}
|
||||
</Link>
|
||||
|
||||
<Link component={MenuItemForLink} to="/export" as="a">
|
||||
{t('App.menu.export')}
|
||||
Tracks
|
||||
</Link>
|
||||
|
||||
<Menu.Menu position="right">
|
||||
{login ? (
|
||||
<>
|
||||
<Link component={MenuItemForLink} to="/my/tracks" as="a">
|
||||
{t('App.menu.myTracks')}
|
||||
My Tracks
|
||||
</Link>
|
||||
<Dropdown item trigger={<Avatar user={login} className={styles.avatar} />}>
|
||||
<Dropdown.Menu>
|
||||
<Link
|
||||
to="/upload"
|
||||
component={DropdownItemForLink}
|
||||
icon="cloud upload"
|
||||
text={t('App.menu.uploadTracks')}
|
||||
/>
|
||||
<Link to="/settings" component={DropdownItemForLink} icon="cog" text={t('App.menu.settings')} />
|
||||
<Link to="/upload" component={DropdownItemForLink} icon="cloud upload" text="Upload tracks" />
|
||||
<Link to="/settings" component={DropdownItemForLink} icon="cog" text="Settings" />
|
||||
<Dropdown.Divider />
|
||||
<Link to="/logout" component={DropdownItemForLink} icon="sign-out" text={t('App.menu.logout')} />
|
||||
<Link to="/logout" component={DropdownItemForLink} icon="sign-out" text="Logout" />
|
||||
</Dropdown.Menu>
|
||||
</Dropdown>
|
||||
</>
|
||||
|
@ -132,16 +110,14 @@ const App = connect((state) => ({login: state.login}))(function App({login}) {
|
|||
<Route path="/" exact>
|
||||
<HomePage />
|
||||
</Route>
|
||||
{hasMap && (
|
||||
<Route path="/map" exact>
|
||||
<MapPage />
|
||||
</Route>
|
||||
)}
|
||||
<Route path="/map" exact>
|
||||
<MapPage />
|
||||
</Route>
|
||||
<Route path="/tracks" exact>
|
||||
<TracksPage />
|
||||
</Route>
|
||||
<Route path="/my/tracks" exact>
|
||||
<MyTracksPage />
|
||||
<TracksPage privateTracks />
|
||||
</Route>
|
||||
<Route path={`/tracks/:slug`} exact>
|
||||
<TrackPage />
|
||||
|
@ -149,12 +125,6 @@ const App = connect((state) => ({login: state.login}))(function App({login}) {
|
|||
<Route path={`/tracks/:slug/edit`} exact>
|
||||
<TrackEditor />
|
||||
</Route>
|
||||
<Route path="/export" exact>
|
||||
<ExportPage />
|
||||
</Route>
|
||||
<Route path="/acknowledgements" exact>
|
||||
<AcknowledgementsPage />
|
||||
</Route>
|
||||
<Route path="/redirect" exact>
|
||||
<LoginRedirectPage />
|
||||
</Route>
|
||||
|
@ -181,7 +151,7 @@ const App = connect((state) => ({login: state.login}))(function App({login}) {
|
|||
<Grid columns={4} stackable>
|
||||
<Grid.Row>
|
||||
<Grid.Column>
|
||||
<Header as="h5">{t('App.footer.aboutTheProject')}</Header>
|
||||
<Header as="h5">About the project</Header>
|
||||
<List>
|
||||
<List.Item>
|
||||
<a href="https://openbikesensor.org/" target="_blank" rel="noreferrer">
|
||||
|
@ -192,68 +162,56 @@ const App = connect((state) => ({login: state.login}))(function App({login}) {
|
|||
</Grid.Column>
|
||||
|
||||
<Grid.Column>
|
||||
<Header as="h5">{t('App.footer.getInvolved')}</Header>
|
||||
<Header as="h5">Get involved</Header>
|
||||
<List>
|
||||
<List.Item>
|
||||
<a href="https://forum.openbikesensor.org/" target="_blank" rel="noreferrer">
|
||||
{t('App.footer.getHelpInForum')}
|
||||
Get help in forum
|
||||
</a>
|
||||
</List.Item>
|
||||
<List.Item>
|
||||
<a href="https://github.com/openbikesensor/portal/issues/new" target="_blank" rel="noreferrer">
|
||||
{t('App.footer.reportAnIssue')}
|
||||
Report an issue
|
||||
</a>
|
||||
</List.Item>
|
||||
<List.Item>
|
||||
<a href="https://github.com/openbikesensor/portal" target="_blank" rel="noreferrer">
|
||||
{t('App.footer.development')}
|
||||
Development
|
||||
</a>
|
||||
</List.Item>
|
||||
</List>
|
||||
</Grid.Column>
|
||||
|
||||
<Grid.Column>
|
||||
<Header as="h5">{t('App.footer.thisInstallation')}</Header>
|
||||
<Header as="h5">This installation</Header>
|
||||
<List>
|
||||
<List.Item>
|
||||
<a href={config?.privacyPolicyUrl} target="_blank" rel="noreferrer">
|
||||
{t('App.footer.privacyPolicy')}
|
||||
Privacy policy
|
||||
</a>
|
||||
</List.Item>
|
||||
<List.Item>
|
||||
<a href={config?.imprintUrl} target="_blank" rel="noreferrer">
|
||||
{t('App.footer.imprint')}
|
||||
</a>
|
||||
</List.Item>
|
||||
{config?.termsUrl && (
|
||||
<List.Item>
|
||||
<a href={config?.termsUrl} target="_blank" rel="noreferrer">
|
||||
{t('App.footer.terms')}
|
||||
</a>
|
||||
</List.Item>
|
||||
)}
|
||||
<List.Item>
|
||||
<a
|
||||
href={`https://github.com/openbikesensor/portal${
|
||||
apiVersion ? `/releases/tag/${apiVersion}` : ''
|
||||
}`}
|
||||
target="_blank"
|
||||
rel="noreferrer"
|
||||
>
|
||||
{apiVersion ? t('App.footer.version', {apiVersion}) : t('App.footer.versionLoading')}
|
||||
Imprint
|
||||
</a>
|
||||
</List.Item>
|
||||
</List>
|
||||
</Grid.Column>
|
||||
|
||||
<Grid.Column>
|
||||
<Header as="h5">{t('App.footer.changeLanguage')}</Header>
|
||||
<Header as="h5">Info</Header>
|
||||
<List>
|
||||
{AVAILABLE_LOCALES.map((locale) => (
|
||||
<List.Item key={locale}>
|
||||
<a onClick={() => setLocale(locale)}>{t(`locales.${locale}`)}</a>
|
||||
</List.Item>
|
||||
))}
|
||||
<List.Item>
|
||||
<a
|
||||
href={`https://github.com/openbikesensor/portal${
|
||||
apiVersion ? `/releases/tag/v${apiVersion}` : ''
|
||||
}`}
|
||||
target="_blank"
|
||||
rel="noreferrer"
|
||||
>
|
||||
{apiVersion ? `v${apiVersion}` : 'Fetching version...'}
|
||||
</a>
|
||||
</List.Item>
|
||||
</List>
|
||||
</Grid.Column>
|
||||
</Grid.Row>
|
||||
|
|
|
@ -19,21 +19,21 @@ function getColor(s) {
|
|||
}
|
||||
|
||||
export default function Avatar({user, className}) {
|
||||
const {image, displayName} = user || {}
|
||||
const {image, username} = user || {}
|
||||
|
||||
if (image) {
|
||||
return <Comment.Avatar src={image} className={className} />
|
||||
}
|
||||
|
||||
if (!displayName) {
|
||||
if (!username) {
|
||||
return <div className={classnames(className, 'avatar', 'empty-avatar')} />
|
||||
}
|
||||
|
||||
const color = getColor(displayName)
|
||||
const color = getColor(username)
|
||||
|
||||
return (
|
||||
<div className={classnames(className, 'avatar', 'text-avatar')} style={{background: color}}>
|
||||
{displayName && <span>{displayName[0]}</span>}
|
||||
{username && <span>{username[0]}</span>}
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
|
|
@ -1,77 +0,0 @@
|
|||
import React from 'react'
|
||||
import ReactEChartsCore from 'echarts-for-react/lib/core'
|
||||
|
||||
import * as echarts from 'echarts/core'
|
||||
|
||||
import {
|
||||
// LineChart,
|
||||
BarChart,
|
||||
// PieChart,
|
||||
// ScatterChart,
|
||||
// RadarChart,
|
||||
// MapChart,
|
||||
// TreeChart,
|
||||
// TreemapChart,
|
||||
// GraphChart,
|
||||
// GaugeChart,
|
||||
// FunnelChart,
|
||||
// ParallelChart,
|
||||
// SankeyChart,
|
||||
// BoxplotChart,
|
||||
// CandlestickChart,
|
||||
// EffectScatterChart,
|
||||
// LinesChart,
|
||||
// HeatmapChart,
|
||||
// PictorialBarChart,
|
||||
// ThemeRiverChart,
|
||||
// SunburstChart,
|
||||
// CustomChart,
|
||||
} from 'echarts/charts'
|
||||
|
||||
// import components, all suffixed with Component
|
||||
import {
|
||||
// GridSimpleComponent,
|
||||
GridComponent,
|
||||
// PolarComponent,
|
||||
// RadarComponent,
|
||||
// GeoComponent,
|
||||
// SingleAxisComponent,
|
||||
// ParallelComponent,
|
||||
// CalendarComponent,
|
||||
// GraphicComponent,
|
||||
// ToolboxComponent,
|
||||
TooltipComponent,
|
||||
// AxisPointerComponent,
|
||||
// BrushComponent,
|
||||
TitleComponent,
|
||||
// TimelineComponent,
|
||||
// MarkPointComponent,
|
||||
// MarkLineComponent,
|
||||
// MarkAreaComponent,
|
||||
// LegendComponent,
|
||||
// LegendScrollComponent,
|
||||
// LegendPlainComponent,
|
||||
// DataZoomComponent,
|
||||
// DataZoomInsideComponent,
|
||||
// DataZoomSliderComponent,
|
||||
// VisualMapComponent,
|
||||
// VisualMapContinuousComponent,
|
||||
// VisualMapPiecewiseComponent,
|
||||
// AriaComponent,
|
||||
// TransformComponent,
|
||||
DatasetComponent,
|
||||
} from 'echarts/components'
|
||||
|
||||
// Import renderer, note that introducing the CanvasRenderer or SVGRenderer is a required step
|
||||
import {
|
||||
CanvasRenderer,
|
||||
// SVGRenderer,
|
||||
} from 'echarts/renderers'
|
||||
|
||||
// Register the required components
|
||||
echarts.use([TitleComponent, TooltipComponent, GridComponent, BarChart, CanvasRenderer])
|
||||
|
||||
// The usage of ReactEChartsCore are same with above.
|
||||
export default function Chart(props) {
|
||||
return <ReactEChartsCore echarts={echarts} notMerge lazyUpdate {...props} />
|
||||
}
|
|
@ -1,78 +1,15 @@
|
|||
import React, {useMemo} from 'react'
|
||||
|
||||
import styles from './ColorMapLegend.module.less'
|
||||
import {useMemo} from "react";
|
||||
|
||||
type ColorMap = [number, string][]
|
||||
|
||||
function* pairs(arr) {
|
||||
for (let i = 1; i < arr.length; i++) {
|
||||
yield [arr[i - 1], arr[i]]
|
||||
}
|
||||
}
|
||||
function* zip(...arrs) {
|
||||
const l = Math.min(...arrs.map((a) => a.length))
|
||||
for (let i = 0; i < l; i++) {
|
||||
yield arrs.map((a) => a[i])
|
||||
}
|
||||
}
|
||||
import styles from './ColorMapLegend.module.less'
|
||||
|
||||
export function DiscreteColorMapLegend({map}: {map: ColorMap}) {
|
||||
const colors: string[] = map.filter((x, i) => i % 2 == 0) as any
|
||||
const stops: number[] = map.filter((x, i) => i % 2 == 1) as any
|
||||
let min = stops[0]
|
||||
let max = stops[stops.length - 1]
|
||||
const buffer = (max - min) / (stops.length - 1) / 2
|
||||
min -= buffer
|
||||
max += buffer
|
||||
const normalizeValue = (v) => (v - min) / (max - min)
|
||||
const stopPairs = Array.from(pairs([min, ...stops, max]))
|
||||
|
||||
const gradientId = useMemo(() => `gradient${Math.floor(Math.random() * 1000000)}`, [])
|
||||
const gradientUrl = `url(#${gradientId})`
|
||||
|
||||
const parts = Array.from(zip(stopPairs, colors))
|
||||
|
||||
return (
|
||||
<div className={styles.colorMapLegend}>
|
||||
<svg width="100%" height="20" version="1.1" xmlns="http://www.w3.org/2000/svg">
|
||||
<defs>
|
||||
<linearGradient id={gradientId} x1="0" x2="1" y1="0" y2="0">
|
||||
{parts.map(([[left, right], color]) => (
|
||||
<React.Fragment key={left}>
|
||||
<stop offset={normalizeValue(left) * 100 + '%'} stopColor={color} />
|
||||
<stop offset={normalizeValue(right) * 100 + '%'} stopColor={color} />
|
||||
</React.Fragment>
|
||||
))}
|
||||
</linearGradient>
|
||||
</defs>
|
||||
|
||||
<rect id="rect1" x="0" y="0" width="100%" height="100%" fill={gradientUrl} />
|
||||
</svg>
|
||||
|
||||
{stops.map((value) => (
|
||||
<span className={styles.tick} key={value} style={{left: normalizeValue(value) * 100 + '%'}}>
|
||||
{value.toFixed(2)}
|
||||
</span>
|
||||
))}
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
export default function ColorMapLegend({
|
||||
map,
|
||||
twoTicks = false,
|
||||
digits = 2,
|
||||
}: {
|
||||
map: ColorMap
|
||||
twoTicks?: boolean
|
||||
digits?: number
|
||||
}) {
|
||||
export default function ColorMapLegend({map}: {map: ColorMap}) {
|
||||
const min = map[0][0]
|
||||
const max = map[map.length - 1][0]
|
||||
const normalizeValue = (v) => (v - min) / (max - min)
|
||||
const gradientId = useMemo(() => `gradient${Math.floor(Math.random() * 1000000)}`, [])
|
||||
const gradientUrl = `url(#${gradientId})`
|
||||
const tickValues = twoTicks ? [map[0], map[map.length - 1]] : map
|
||||
const gradientId = useMemo(() => `gradient${Math.floor(Math.random() * 1000000)}`, []);
|
||||
const gradientUrl = `url(#${gradientId})`;
|
||||
return (
|
||||
<div className={styles.colorMapLegend}>
|
||||
<svg width="100%" height="20" version="1.1" xmlns="http://www.w3.org/2000/svg">
|
||||
|
@ -86,9 +23,9 @@ export default function ColorMapLegend({
|
|||
|
||||
<rect id="rect1" x="0" y="0" width="100%" height="100%" fill={gradientUrl} />
|
||||
</svg>
|
||||
{tickValues.map(([value]) => (
|
||||
{map.map(([value]) => (
|
||||
<span className={styles.tick} key={value} style={{left: normalizeValue(value) * 100 + '%'}}>
|
||||
{value.toFixed(digits)}
|
||||
{value.toFixed(2)}
|
||||
</span>
|
||||
))}
|
||||
</div>
|
||||
|
|
|
@ -1,11 +1,9 @@
|
|||
import React from 'react'
|
||||
import {Icon, Segment, Header, Button} from 'semantic-ui-react'
|
||||
import {useTranslation} from 'react-i18next'
|
||||
|
||||
import {FileDrop} from 'components'
|
||||
|
||||
export default function FileUploadField({onSelect: onSelect_, multiple}) {
|
||||
const {t} = useTranslation()
|
||||
const labelRef = React.useRef()
|
||||
const [labelRefState, setLabelRefState] = React.useState()
|
||||
|
||||
|
@ -33,14 +31,7 @@ export default function FileUploadField({onSelect: onSelect_, multiple}) {
|
|||
<input
|
||||
type="file"
|
||||
id="upload-field"
|
||||
style={{
|
||||
width: 0,
|
||||
height: 0,
|
||||
position: 'fixed',
|
||||
left: -1000,
|
||||
top: -1000,
|
||||
opacity: 0.001,
|
||||
}}
|
||||
style={{width: 0, height: 0, position: 'fixed', left: -1000, top: -1000, opacity: 0.001}}
|
||||
multiple={multiple}
|
||||
accept=".csv"
|
||||
onChange={onChangeField}
|
||||
|
@ -59,11 +50,11 @@ export default function FileUploadField({onSelect: onSelect_, multiple}) {
|
|||
>
|
||||
<Header icon>
|
||||
<Icon name="cloud upload" />
|
||||
{multiple ? t('FileUploadField.dropOrClickMultiple') : t('FileUploadField.dropOrClick')}
|
||||
Drop file{multiple ? 's' : ''} here or click to select {multiple ? 'them' : 'one'} for upload
|
||||
</Header>
|
||||
|
||||
<Button primary as="span">
|
||||
{multiple ? t('FileUploadField.uploadFiles') : t('FileUploadField.uploadFile')}
|
||||
Upload file{multiple ? 's' : ''}
|
||||
</Button>
|
||||
</Segment>
|
||||
)}
|
||||
|
|
|
@ -1,5 +1,4 @@
|
|||
import {DateTime} from 'luxon'
|
||||
import {useTranslation} from 'react-i18next'
|
||||
|
||||
export default function FormattedDate({date, relative = false}) {
|
||||
if (date == null) {
|
||||
|
@ -11,19 +10,11 @@ export default function FormattedDate({date, relative = false}) {
|
|||
|
||||
let str
|
||||
|
||||
const {i18n} = useTranslation()
|
||||
const locale = i18n.language
|
||||
|
||||
if (relative) {
|
||||
str = dateTime.setLocale(locale).toRelative()
|
||||
str = dateTime.toRelative()
|
||||
} else {
|
||||
str = dateTime.setLocale(locale).toLocaleString(DateTime.DATETIME_MED)
|
||||
str = dateTime.toLocaleString(DateTime.DATETIME_MED)
|
||||
}
|
||||
|
||||
const iso = dateTime.toISO()
|
||||
return (
|
||||
<time dateTime={iso} title={iso}>
|
||||
{str}
|
||||
</time>
|
||||
)
|
||||
return <span title={dateTime.toISO()}>{str}</span>
|
||||
}
|
||||
|
|
|
@ -1,11 +1,9 @@
|
|||
import React from 'react'
|
||||
import {Button} from 'semantic-ui-react'
|
||||
import {useTranslation} from 'react-i18next'
|
||||
|
||||
import api from 'api'
|
||||
|
||||
export default function LoginButton(props) {
|
||||
const {t} = useTranslation()
|
||||
const [busy, setBusy] = React.useState(false)
|
||||
|
||||
const onClick = React.useCallback(
|
||||
|
@ -21,7 +19,7 @@ export default function LoginButton(props) {
|
|||
|
||||
return (
|
||||
<Button onClick={busy ? null : onClick} loading={busy} {...props}>
|
||||
{t('LoginButton.login')}
|
||||
Login
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
|
|
|
@ -2,13 +2,12 @@ import React, {useState, useCallback, useMemo, useEffect} from 'react'
|
|||
import classnames from 'classnames'
|
||||
import {connect} from 'react-redux'
|
||||
import _ from 'lodash'
|
||||
import ReactMapGl, {WebMercatorViewport, ScaleControl, NavigationControl, AttributionControl} from 'react-map-gl'
|
||||
import ReactMapGl, {WebMercatorViewport, ScaleControl, NavigationControl} from 'react-map-gl'
|
||||
import turfBbox from '@turf/bbox'
|
||||
import {useHistory, useLocation} from 'react-router-dom'
|
||||
|
||||
import {useConfig} from 'config'
|
||||
|
||||
import {useCallbackRef} from '../../utils'
|
||||
import {baseMapStyles} from '../../mapstyles'
|
||||
|
||||
import styles from './styles.module.less'
|
||||
|
@ -20,13 +19,11 @@ interface Viewport {
|
|||
}
|
||||
const EMPTY_VIEWPORT: Viewport = {longitude: 0, latitude: 0, zoom: 0}
|
||||
|
||||
export const withBaseMapStyle = connect((state) => ({
|
||||
baseMapStyle: state.mapConfig?.baseMap?.style ?? 'positron',
|
||||
}))
|
||||
export const withBaseMapStyle = connect((state) => ({baseMapStyle: state.mapConfig?.baseMap?.style ?? 'positron'}))
|
||||
|
||||
function parseHash(v: string): Viewport | null {
|
||||
if (!v) return null
|
||||
const m = v.match(/^#([0-9\.]+)\/([0-9\.\-]+)\/([0-9\.\-]+)$/)
|
||||
const m = v.match(/^#([0-9\.]+)\/([0-9\.]+)\/([0-9\.]+)$/)
|
||||
if (!m) return null
|
||||
return {
|
||||
zoom: Number.parseFloat(m[1]),
|
||||
|
@ -39,32 +36,19 @@ function buildHash(v: Viewport): string {
|
|||
return `${v.zoom.toFixed(2)}/${v.latitude}/${v.longitude}`
|
||||
}
|
||||
|
||||
const setViewportToHash = _.debounce((history, viewport) => {
|
||||
history.replace({
|
||||
hash: buildHash(viewport),
|
||||
})
|
||||
}, 200)
|
||||
|
||||
function useViewportFromUrl(): [Viewport | null, (v: Viewport) => void] {
|
||||
const history = useHistory()
|
||||
const location = useLocation()
|
||||
|
||||
const [cachedValue, setCachedValue] = useState(parseHash(location.hash))
|
||||
|
||||
// when the location hash changes, set the new value to the cache
|
||||
useEffect(() => {
|
||||
setCachedValue(parseHash(location.hash))
|
||||
}, [location.hash])
|
||||
|
||||
const value = useMemo(() => parseHash(location.hash), [location.hash])
|
||||
const setter = useCallback(
|
||||
(v) => {
|
||||
setCachedValue(v)
|
||||
setViewportToHash(history, v)
|
||||
history.replace({
|
||||
hash: buildHash(v),
|
||||
})
|
||||
},
|
||||
[history]
|
||||
)
|
||||
|
||||
return [cachedValue || EMPTY_VIEWPORT, setter]
|
||||
return [value || EMPTY_VIEWPORT, setter]
|
||||
}
|
||||
|
||||
function Map({
|
||||
|
@ -72,28 +56,17 @@ function Map({
|
|||
children,
|
||||
boundsFromJson,
|
||||
baseMapStyle,
|
||||
hasToolbar,
|
||||
onViewportChange,
|
||||
...props
|
||||
}: {
|
||||
viewportFromUrl?: boolean
|
||||
children: React.ReactNode
|
||||
boundsFromJson: GeoJSON.Geometry
|
||||
baseMapStyle: string
|
||||
hasToolbar?: boolean
|
||||
onViewportChange: (viewport: Viewport) => void
|
||||
}) {
|
||||
const [viewportState, setViewportState] = useState(EMPTY_VIEWPORT)
|
||||
const [viewportUrl, setViewportUrl] = useViewportFromUrl()
|
||||
|
||||
const [viewport, setViewport_] = viewportFromUrl ? [viewportUrl, setViewportUrl] : [viewportState, setViewportState]
|
||||
const setViewport = useCallback(
|
||||
(viewport: Viewport) => {
|
||||
setViewport_(viewport)
|
||||
onViewportChange?.(viewport)
|
||||
},
|
||||
[setViewport_, onViewportChange]
|
||||
)
|
||||
const [viewport, setViewport] = viewportFromUrl ? [viewportUrl, setViewportUrl] : [viewportState, setViewportState]
|
||||
|
||||
const config = useConfig()
|
||||
useEffect(() => {
|
||||
|
@ -102,40 +75,20 @@ function Map({
|
|||
}
|
||||
}, [config, boundsFromJson])
|
||||
|
||||
const mapSourceHosts = useMemo(
|
||||
() => _.uniq(config?.obsMapSource?.tiles?.map((tileUrl: string) => new URL(tileUrl).host) ?? []),
|
||||
[config?.obsMapSource]
|
||||
)
|
||||
|
||||
const transformRequest = useCallbackRef((url, resourceType) => {
|
||||
if (resourceType === 'Tile' && mapSourceHosts.includes(new URL(url).host)) {
|
||||
return {
|
||||
url,
|
||||
credentials: 'include',
|
||||
}
|
||||
}
|
||||
})
|
||||
|
||||
useEffect(() => {
|
||||
if (boundsFromJson) {
|
||||
const bbox = turfBbox(boundsFromJson)
|
||||
if (bbox.every((v) => Math.abs(v) !== Infinity)) {
|
||||
const [minX, minY, maxX, maxY] = bbox
|
||||
const vp = new WebMercatorViewport({
|
||||
width: 1000,
|
||||
height: 800,
|
||||
}).fitBounds(
|
||||
[
|
||||
[minX, minY],
|
||||
[maxX, maxY],
|
||||
],
|
||||
{
|
||||
padding: 20,
|
||||
offset: [0, -100],
|
||||
}
|
||||
)
|
||||
setViewport(_.pick(vp, ['zoom', 'latitude', 'longitude']))
|
||||
}
|
||||
const [minX, minY, maxX, maxY] = turfBbox(boundsFromJson)
|
||||
const vp = new WebMercatorViewport({width: 1000, height: 800}).fitBounds(
|
||||
[
|
||||
[minX, minY],
|
||||
[maxX, maxY],
|
||||
],
|
||||
{
|
||||
padding: 20,
|
||||
offset: [0, -100],
|
||||
}
|
||||
)
|
||||
setViewport(_.pick(vp, ['zoom', 'latitude', 'longitude']))
|
||||
}
|
||||
}, [boundsFromJson])
|
||||
|
||||
|
@ -145,15 +98,13 @@ function Map({
|
|||
width="100%"
|
||||
height="100%"
|
||||
onViewportChange={setViewport}
|
||||
{...{transformRequest}}
|
||||
{...viewport}
|
||||
{...props}
|
||||
className={classnames(styles.map, props.className)}
|
||||
attributionControl={false}
|
||||
>
|
||||
<AttributionControl style={{top: 0, right: 0}} />
|
||||
<NavigationControl showCompass={false} style={{left: 16, top: hasToolbar ? 64 : 16}} />
|
||||
<ScaleControl maxWidth={200} unit="metric" style={{left: 16, bottom: 16}} />
|
||||
<NavigationControl style={{left: 10, top: 10}} />
|
||||
<ScaleControl maxWidth={200} unit="metric" style={{left: 10, bottom: 10}} />
|
||||
|
||||
{children}
|
||||
</ReactMapGl>
|
||||
)
|
||||
|
|
|
@ -1,7 +1,6 @@
|
|||
import React from 'react'
|
||||
import classnames from 'classnames'
|
||||
import {Container} from 'semantic-ui-react'
|
||||
import {Helmet} from 'react-helmet'
|
||||
|
||||
import styles from './Page.module.less'
|
||||
|
||||
|
@ -10,32 +9,23 @@ export default function Page({
|
|||
children,
|
||||
fullScreen,
|
||||
stage,
|
||||
title,
|
||||
}: {
|
||||
small?: boolean
|
||||
children: ReactNode
|
||||
fullScreen?: boolean
|
||||
stage?: ReactNode
|
||||
title?: string
|
||||
}) {
|
||||
return (
|
||||
<>
|
||||
{title && (
|
||||
<Helmet>
|
||||
<title>{title} - OpenBikeSensor Portal</title>
|
||||
</Helmet>
|
||||
<main
|
||||
className={classnames(
|
||||
styles.page,
|
||||
small && styles.small,
|
||||
fullScreen && styles.fullScreen,
|
||||
stage && styles.hasStage
|
||||
)}
|
||||
<main
|
||||
className={classnames(
|
||||
styles.page,
|
||||
small && styles.small,
|
||||
fullScreen && styles.fullScreen,
|
||||
stage && styles.hasStage
|
||||
)}
|
||||
>
|
||||
{stage}
|
||||
{fullScreen ? children : <Container>{children}</Container>}
|
||||
</main>
|
||||
</>
|
||||
>
|
||||
{stage}
|
||||
{fullScreen ? children : <Container>{children}</Container>}
|
||||
</main>
|
||||
)
|
||||
}
|
||||
|
|
|
@ -1,73 +0,0 @@
|
|||
import React, {useState, useCallback} from 'react'
|
||||
import {pickBy} from 'lodash'
|
||||
import {Loader, Statistic, Pagination, Segment, Header, Menu, Table, Icon} from 'semantic-ui-react'
|
||||
import {useObservable} from 'rxjs-hooks'
|
||||
import {of, from, concat, combineLatest} from 'rxjs'
|
||||
import {map, switchMap, distinctUntilChanged} from 'rxjs/operators'
|
||||
import {Duration, DateTime} from 'luxon'
|
||||
|
||||
import api from 'api'
|
||||
import {useTranslation} from 'react-i18next'
|
||||
|
||||
function formatDuration(seconds) {
|
||||
return (
|
||||
Duration.fromMillis((seconds ?? 0) * 1000)
|
||||
.as('hours')
|
||||
.toFixed(1) + ' h'
|
||||
)
|
||||
}
|
||||
|
||||
export default function Stats() {
|
||||
const {t} = useTranslation()
|
||||
const [page, setPage] = useState(1)
|
||||
const PER_PAGE = 10
|
||||
const stats = useObservable(
|
||||
() => of(null).pipe(switchMap(() => concat(of(null), from(api.get('/stats/regions'))))),
|
||||
null
|
||||
)
|
||||
|
||||
const pageCount = stats ? Math.ceil(stats.length / PER_PAGE) : 1
|
||||
|
||||
return (
|
||||
<>
|
||||
<Header as="h2">{t('RegionStats.title')}</Header>
|
||||
|
||||
<div>
|
||||
<Loader active={stats == null} />
|
||||
|
||||
<Table celled>
|
||||
<Table.Header>
|
||||
<Table.Row>
|
||||
<Table.HeaderCell> {t('RegionStats.regionName')}</Table.HeaderCell>
|
||||
<Table.HeaderCell>{t('RegionStats.eventCount')}</Table.HeaderCell>
|
||||
</Table.Row>
|
||||
</Table.Header>
|
||||
|
||||
<Table.Body>
|
||||
{stats?.slice((page - 1) * PER_PAGE, page * PER_PAGE)?.map((area) => (
|
||||
<Table.Row key={area.id}>
|
||||
<Table.Cell>{area.name}</Table.Cell>
|
||||
<Table.Cell>{area.overtaking_event_count}</Table.Cell>
|
||||
</Table.Row>
|
||||
))}
|
||||
</Table.Body>
|
||||
|
||||
{pageCount > 1 && (
|
||||
<Table.Footer>
|
||||
<Table.Row>
|
||||
<Table.HeaderCell colSpan="2">
|
||||
<Pagination
|
||||
floated="right"
|
||||
activePage={page}
|
||||
totalPages={pageCount}
|
||||
onPageChange={(e, data) => setPage(data.activePage as number)}
|
||||
/>
|
||||
</Table.HeaderCell>
|
||||
</Table.Row>
|
||||
</Table.Footer>
|
||||
)}
|
||||
</Table>
|
||||
</div>
|
||||
</>
|
||||
)
|
||||
}
|
|
@ -5,7 +5,6 @@ import {useObservable} from 'rxjs-hooks'
|
|||
import {of, from, concat, combineLatest} from 'rxjs'
|
||||
import {map, switchMap, distinctUntilChanged} from 'rxjs/operators'
|
||||
import {Duration, DateTime} from 'luxon'
|
||||
import {useTranslation} from 'react-i18next'
|
||||
|
||||
import api from 'api'
|
||||
|
||||
|
@ -18,7 +17,6 @@ function formatDuration(seconds) {
|
|||
}
|
||||
|
||||
export default function Stats({user = null}: {user?: null | string}) {
|
||||
const {t} = useTranslation()
|
||||
const [timeframe, setTimeframe] = useState('all_time')
|
||||
const onClick = useCallback((_e, {name}) => setTimeframe(name), [setTimeframe])
|
||||
|
||||
|
@ -65,56 +63,49 @@ export default function Stats({user = null}: {user?: null | string}) {
|
|||
[timeframe, user]
|
||||
)
|
||||
|
||||
const placeholder = t('Stats.placeholder')
|
||||
|
||||
return (
|
||||
<>
|
||||
<Header as="h2">{user && 'My '}Statistics</Header>
|
||||
|
||||
<div>
|
||||
<Segment attached="top">
|
||||
<Loader active={stats == null} />
|
||||
<Statistic.Group widths={2} size="tiny">
|
||||
<Statistic>
|
||||
<Statistic.Value>
|
||||
{stats ? `${Number(stats?.trackLength / 1000).toFixed(1)} km` : placeholder}
|
||||
</Statistic.Value>
|
||||
<Statistic.Label>{t('Stats.totalTrackLength')}</Statistic.Label>
|
||||
<Statistic.Value>{stats ? `${Number(stats?.trackLength / 1000).toFixed(1)} km` : '...'}</Statistic.Value>
|
||||
<Statistic.Label>Total track length</Statistic.Label>
|
||||
</Statistic>
|
||||
<Statistic>
|
||||
<Statistic.Value>{stats ? formatDuration(stats?.trackDuration) : placeholder}</Statistic.Value>
|
||||
<Statistic.Label>{t('Stats.timeRecorded')}</Statistic.Label>
|
||||
<Statistic.Value>{stats ? formatDuration(stats?.trackDuration) : '...'}</Statistic.Value>
|
||||
<Statistic.Label>Time recorded</Statistic.Label>
|
||||
</Statistic>
|
||||
<Statistic>
|
||||
<Statistic.Value>{stats?.numEvents ?? placeholder}</Statistic.Value>
|
||||
<Statistic.Label>{t('Stats.eventsConfirmed')}</Statistic.Label>
|
||||
<Statistic.Value>{stats?.numEvents ?? '...'}</Statistic.Value>
|
||||
<Statistic.Label>Events confirmed</Statistic.Label>
|
||||
</Statistic>
|
||||
<Statistic>
|
||||
<Statistic.Value>{stats?.trackCount ?? placeholder}</Statistic.Value>
|
||||
<Statistic.Label>{t('Stats.tracksRecorded')}</Statistic.Label>
|
||||
</Statistic>
|
||||
{!user && (
|
||||
<>
|
||||
<Statistic>
|
||||
<Statistic.Value>{stats?.userCount ?? placeholder}</Statistic.Value>
|
||||
<Statistic.Label>{t('Stats.membersJoined')}</Statistic.Label>
|
||||
</Statistic>
|
||||
<Statistic>
|
||||
<Statistic.Value>{stats?.deviceCount ?? placeholder}</Statistic.Value>
|
||||
<Statistic.Label>{t('Stats.deviceCount')}</Statistic.Label>
|
||||
</Statistic>
|
||||
</>
|
||||
{user ? (
|
||||
<Statistic>
|
||||
<Statistic.Value>{stats?.trackCount ?? '...'}</Statistic.Value>
|
||||
<Statistic.Label>Tracks recorded</Statistic.Label>
|
||||
</Statistic>
|
||||
) : (
|
||||
<Statistic>
|
||||
<Statistic.Value>{stats?.userCount ?? '...'}</Statistic.Value>
|
||||
<Statistic.Label>Members joined</Statistic.Label>
|
||||
</Statistic>
|
||||
)}
|
||||
</Statistic.Group>
|
||||
</Segment>
|
||||
|
||||
<Menu widths={3} attached="bottom" size="small">
|
||||
<Menu.Item name="this_month" active={timeframe === 'this_month'} onClick={onClick}>
|
||||
{t('Stats.thisMonth')}
|
||||
This month
|
||||
</Menu.Item>
|
||||
<Menu.Item name="this_year" active={timeframe === 'this_year'} onClick={onClick}>
|
||||
{t('Stats.thisYear')}
|
||||
This year
|
||||
</Menu.Item>
|
||||
<Menu.Item name="all_time" active={timeframe === 'all_time'} onClick={onClick}>
|
||||
{t('Stats.allTime')}
|
||||
All time
|
||||
</Menu.Item>
|
||||
</Menu>
|
||||
</div>
|
||||
|
|
|
@ -1,14 +0,0 @@
|
|||
import React from 'react'
|
||||
import {Icon} from 'semantic-ui-react'
|
||||
import {useTranslation} from 'react-i18next'
|
||||
|
||||
export default function Visibility({public: public_}: {public: boolean}) {
|
||||
const {t} = useTranslation()
|
||||
const icon = public_ ? <Icon color="blue" name="eye" fitted /> : <Icon name="eye slash" fitted />
|
||||
const text = public_ ? t('general.public') : t('general.private')
|
||||
return (
|
||||
<>
|
||||
{icon} {text}
|
||||
</>
|
||||
)
|
||||
}
|
|
@ -1,13 +1,10 @@
|
|||
export {default as Avatar} from './Avatar'
|
||||
export {default as Chart} from './Chart'
|
||||
export {default as ColorMapLegend, DiscreteColorMapLegend} from './ColorMapLegend'
|
||||
export {default as ColorMapLegend} from './ColorMapLegend'
|
||||
export {default as FileDrop} from './FileDrop'
|
||||
export {default as FileUploadField} from './FileUploadField'
|
||||
export {default as FormattedDate} from './FormattedDate'
|
||||
export {default as LoginButton} from './LoginButton'
|
||||
export {default as Map} from './Map'
|
||||
export {default as Page} from './Page'
|
||||
export {default as RegionStats} from './RegionStats'
|
||||
export {default as Stats} from './Stats'
|
||||
export {default as StripMarkdown} from './StripMarkdown'
|
||||
export {default as Visibility} from './Visibility'
|
||||
|
|
|
@ -1,11 +1,16 @@
|
|||
import React from 'react'
|
||||
|
||||
export type MapSource = {
|
||||
type: 'vector'
|
||||
tiles: string[]
|
||||
minzoom: number
|
||||
maxzoom: number
|
||||
}
|
||||
export type MapSoure =
|
||||
| {
|
||||
type: 'vector'
|
||||
url: string
|
||||
}
|
||||
| {
|
||||
type: 'vector'
|
||||
tiles: string[]
|
||||
minzoom: number
|
||||
maxzoom: number
|
||||
}
|
||||
|
||||
export interface Config {
|
||||
apiUrl: string
|
||||
|
@ -14,10 +19,9 @@ export interface Config {
|
|||
longitude: number
|
||||
zoom: number
|
||||
}
|
||||
obsMapSource?: MapSource
|
||||
obsMapSource?: MapSoure
|
||||
imprintUrl?: string
|
||||
privacyPolicyUrl?: string
|
||||
termsUrl?: string
|
||||
banner?: {
|
||||
text: string
|
||||
style?: 'warning' | 'info'
|
||||
|
|
|
@ -1,87 +0,0 @@
|
|||
import {useState, useEffect, useMemo} from 'react'
|
||||
import i18next, {TOptions} from 'i18next'
|
||||
import {BehaviorSubject, combineLatest} from 'rxjs'
|
||||
import {map, distinctUntilChanged} from 'rxjs/operators'
|
||||
import HttpBackend, {BackendOptions, RequestCallback} from 'i18next-http-backend'
|
||||
import {initReactI18next} from 'react-i18next'
|
||||
import LanguageDetector from 'i18next-browser-languagedetector'
|
||||
|
||||
export type AvailableLocales = 'en' | 'de' | 'fr'
|
||||
|
||||
async function request(_options: BackendOptions, url: string, _payload: any, callback: RequestCallback) {
|
||||
try {
|
||||
const [lng] = url.split('/')
|
||||
const locale = await import(`translations/${lng}.yaml`)
|
||||
callback(null, {status: 200, data: locale})
|
||||
} catch (e) {
|
||||
console.error(`Unable to load locale at ${url}\n`, e)
|
||||
callback(null, {status: 404, data: String(e)})
|
||||
}
|
||||
}
|
||||
|
||||
export const AVAILABLE_LOCALES: AvailableLocales[] = ['en', 'de', 'fr']
|
||||
|
||||
const i18n = i18next.createInstance()
|
||||
|
||||
const options: TOptions = {
|
||||
fallbackLng: 'en',
|
||||
|
||||
ns: ['common'],
|
||||
defaultNS: 'common',
|
||||
whitelist: AVAILABLE_LOCALES,
|
||||
|
||||
// loading via webpack
|
||||
backend: {
|
||||
loadPath: '{{lng}}/{{ns}}',
|
||||
parse: (data: any) => data,
|
||||
request,
|
||||
},
|
||||
|
||||
load: 'languageOnly',
|
||||
|
||||
interpolation: {
|
||||
escapeValue: false, // not needed for react as it escapes by default
|
||||
},
|
||||
}
|
||||
|
||||
i18n
|
||||
.use(HttpBackend)
|
||||
.use(initReactI18next)
|
||||
.use(LanguageDetector)
|
||||
.init({...options})
|
||||
|
||||
const locale$ = new BehaviorSubject<AvailableLocales>('en')
|
||||
|
||||
export const translate = i18n.t.bind(i18n)
|
||||
|
||||
export const translate$ = (stringAndData$: [string, any]) =>
|
||||
combineLatest([stringAndData$, locale$.pipe(distinctUntilChanged())]).pipe(
|
||||
map(([stringAndData]) => {
|
||||
if (typeof stringAndData === 'string') {
|
||||
return i18n.t(stringAndData)
|
||||
} else {
|
||||
const [string, data] = stringAndData
|
||||
return i18n.t(string, {data})
|
||||
}
|
||||
})
|
||||
)
|
||||
|
||||
export const setLocale = (locale: AvailableLocales) => {
|
||||
i18n.changeLanguage(locale)
|
||||
locale$.next(locale)
|
||||
}
|
||||
|
||||
export function useLocale() {
|
||||
const [, reload] = useState()
|
||||
|
||||
useEffect(() => {
|
||||
i18n.on('languageChanged', reload)
|
||||
return () => {
|
||||
i18n.off('languageChanged', reload)
|
||||
}
|
||||
}, [])
|
||||
|
||||
return i18n.language
|
||||
}
|
||||
|
||||
export default i18n
|
frontend/src/index.css (new file, 11 lines)
|
@ -0,0 +1,11 @@
|
|||
body {
|
||||
margin: 0;
|
||||
font-family: 'Noto Sans', 'Roboto', -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Oxygen', 'Ubuntu', 'Cantarell',
|
||||
'Fira Sans', 'Droid Sans', 'Helvetica Neue', sans-serif;
|
||||
-webkit-font-smoothing: antialiased;
|
||||
-moz-osx-font-smoothing: grayscale;
|
||||
}
|
||||
|
||||
code {
|
||||
font-family: 'Noto Sans Mono', source-code-pro, Menlo, Monaco, Consolas, 'Courier New', monospace;
|
||||
}
|
|
@ -1,8 +1,6 @@
|
|||
<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<title><%= htmlWebpackPlugin.options.title %></title>
|
||||
</head>
|
||||
<head></head>
|
||||
<body>
|
||||
<noscript>You need to enable JavaScript to run this app.</noscript>
|
||||
<div id="root"></div>
|
||||
|
|
|
@ -1,9 +1,9 @@
|
|||
import React, {Suspense} from 'react'
|
||||
import React from 'react'
|
||||
import {Settings} from 'luxon'
|
||||
import ReactDOM from 'react-dom'
|
||||
import 'fomantic-ui-less/semantic.less'
|
||||
|
||||
import './index.less'
|
||||
import './index.css'
|
||||
import App from './App'
|
||||
|
||||
import 'maplibre-gl/dist/maplibre-gl.css'
|
||||
|
@ -11,16 +11,13 @@ import 'maplibre-gl/dist/maplibre-gl.css'
|
|||
import {Provider} from 'react-redux'
|
||||
|
||||
import store from './store'
|
||||
import './i18n'
|
||||
|
||||
// TODO: remove
|
||||
Settings.defaultLocale = 'de-DE'
|
||||
|
||||
ReactDOM.render(
|
||||
<Provider store={store}>
|
||||
<Suspense fallback={null}>
|
||||
<App />
|
||||
</Suspense>
|
||||
<App />
|
||||
</Provider>,
|
||||
document.getElementById('root')
|
||||
)
|
||||
|
|
|
@ -1,8 +0,0 @@
|
|||
@import 'styles.less';
|
||||
|
||||
body {
|
||||
margin: 0;
|
||||
font-family: @fontFamilyDefault;
|
||||
-webkit-font-smoothing: antialiased;
|
||||
-moz-osx-font-smoothing: grayscale;
|
||||
}
|
|
@ -1,5 +1,4 @@
|
|||
import _ from 'lodash'
|
||||
import produce from 'immer'
|
||||
|
||||
import bright from './bright.json'
|
||||
import positron from './positron.json'
|
||||
|
@ -22,16 +21,6 @@ function rgbArrayToColor(arr) {
|
|||
return ['rgb', ...arr.map((v) => Math.round(v * 255))]
|
||||
}
|
||||
|
||||
function rgbArrayToHtml(arr) {
|
||||
return (
|
||||
'#' +
|
||||
arr
|
||||
.map((v) => Math.round(v * 255).toString(16))
|
||||
.map((v) => (v.length == 1 ? '0' : '') + v)
|
||||
.join('')
|
||||
)
|
||||
}
|
||||
|
||||
export function colormapToScale(colormap, value, min, max) {
|
||||
return [
|
||||
'interpolate-hcl',
|
||||
|
@@ -42,76 +31,33 @@ export function colormapToScale(colormap, value, min, max) {
 }
 
 export const viridis = simplifyColormap(viridisBase.map(rgbArrayToColor), 20)
-export const viridisSimpleHtml = simplifyColormap(viridisBase.map(rgbArrayToHtml), 10)
 export const grayscale = ['#FFFFFF', '#000000']
-export const reds = ['rgba( 255, 0, 0, 0)', 'rgba( 255, 0, 0, 255)']
+export const reds = [
+  'rgba( 255, 0, 0, 0)',
+  'rgba( 255, 0, 0, 255)',
+]
 
 export function colorByCount(attribute = 'event_count', maxCount, colormap = viridis) {
-  return colormapToScale(colormap, ['case', isValidAttribute(attribute), ['get', attribute], 0], 0, maxCount)
+  return colormapToScale(colormap, ['case', ['to-boolean', ['get', attribute]], ['get', attribute], 0], 0, maxCount)
 }
 
-var steps = {rural: [1.6, 1.8, 2.0, 2.2], urban: [1.1, 1.3, 1.5, 1.7]}
-
-export function isValidAttribute(attribute) {
-  if (attribute.endsWith('zone')) {
-    return ['in', ['get', attribute], ['literal', ['rural', 'urban']]]
-  }
-  return ['to-boolean', ['get', attribute]]
-}
-
-export function borderByZone() {
-  return ['match', ['get', 'zone'], 'rural', 'cyan', 'urban', 'blue', 'purple']
-}
-
-export function colorByDistance(attribute = 'distance_overtaker_mean', fallback = '#ABC', zone = 'urban') {
+export function colorByDistance(attribute = 'distance_overtaker_mean', fallback = '#ABC') {
   return [
     'case',
-    ['!', isValidAttribute(attribute)],
+    ['!', ['to-boolean', ['get', attribute]]],
     fallback,
     [
-      'match',
-      ['get', 'zone'],
-      'rural',
-      [
-        'step',
-        ['get', attribute],
-        'rgba(150, 0, 0, 1)',
-        steps['rural'][0],
-        'rgba(255, 0, 0, 1)',
-        steps['rural'][1],
-        'rgba(255, 220, 0, 1)',
-        steps['rural'][2],
-        'rgba(67, 200, 0, 1)',
-        steps['rural'][3],
-        'rgba(67, 150, 0, 1)',
-      ],
-      'urban',
-      [
-        'step',
-        ['get', attribute],
-        'rgba(150, 0, 0, 1)',
-        steps['urban'][0],
-        'rgba(255, 0, 0, 1)',
-        steps['urban'][1],
-        'rgba(255, 220, 0, 1)',
-        steps['urban'][2],
-        'rgba(67, 200, 0, 1)',
-        steps['urban'][3],
-        'rgba(67, 150, 0, 1)',
-      ],
-      [
-        'step',
-        ['get', attribute],
-        'rgba(150, 0, 0, 1)',
-        steps['urban'][0],
-        'rgba(255, 0, 0, 1)',
-        steps['urban'][1],
-        'rgba(255, 220, 0, 1)',
-        steps['urban'][2],
-        'rgba(67, 200, 0, 1)',
-        steps['urban'][3],
-        'rgba(67, 150, 0, 1)',
-      ],
+      'interpolate-hcl',
+      ['linear'],
+      ['get', attribute],
+      1,
+      'rgba(255, 0, 0, 1)',
+      1.3,
+      'rgba(255, 200, 0, 1)',
+      1.5,
+      'rgba(67, 200, 0, 1)',
+      1.7,
+      'rgba(67, 150, 0, 1)',
     ],
   ]
 }
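The zone-dependent scale removed above colors road segments by overtaking distance through MapLibre 'step' expressions over the `steps` thresholds. A plain-JavaScript sketch of the equivalent lookup (function name is illustrative; thresholds and colors are copied from the diff):

// Sketch: plain-JS equivalent of the removed MapLibre 'step' expression.
// Thresholds are meters of overtaking distance; a 'step' returns the first
// color below the first threshold, then advances one color per threshold.
const steps = {rural: [1.6, 1.8, 2.0, 2.2], urban: [1.1, 1.3, 1.5, 1.7]}
const colors = ['rgba(150, 0, 0, 1)', 'rgba(255, 0, 0, 1)', 'rgba(255, 220, 0, 1)', 'rgba(67, 200, 0, 1)', 'rgba(67, 150, 0, 1)']

function colorForDistance(distance, zone = 'urban') {
  const thresholds = steps[zone] ?? steps.urban
  let i = 0
  while (i < thresholds.length && distance >= thresholds[i]) i++
  return colors[i]
}

// e.g. colorForDistance(1.4, 'urban') === 'rgba(255, 220, 0, 1)'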
@@ -121,66 +67,7 @@ export const trackLayer = {
   paint: {
     'line-width': ['interpolate', ['linear'], ['zoom'], 14, 2, 17, 5],
     'line-color': '#F06292',
     'line-opacity': 0.6,
   },
 }
 
-export const getRegionLayers = (adminLevel = 6, baseColor = '#00897B', maxValue = 5000) => [
-  {
-    id: 'region',
-    type: 'fill',
-    source: 'obs',
-    'source-layer': 'obs_regions',
-    minzoom: 0,
-    maxzoom: 10,
-    // filter: [">", "overtaking_event_count", 0],
-    paint: {
-      'fill-color': baseColor,
-      'fill-antialias': true,
-      'fill-opacity': [
-        'interpolate',
-        ['linear'],
-        ['log10', ['max', ['get', 'overtaking_event_count'], 1]],
-        0,
-        0,
-        Math.log10(maxValue),
-        0.9,
-      ],
-    },
-  },
-  {
-    id: 'region-border',
-    type: 'line',
-    source: 'obs',
-    'source-layer': 'obs_regions',
-    minzoom: 0,
-    maxzoom: 10,
-    // filter: [">", "overtaking_event_count", 0],
-    paint: {
-      'line-width': [
-        'interpolate',
-        ['linear'],
-        ['log10', ['max', ['get', 'overtaking_event_count'], 1]],
-        0,
-        0.2,
-        Math.log10(maxValue),
-        1.5,
-      ],
-      'line-color': baseColor,
-    },
-    layout: {
-      'line-join': 'round',
-      'line-cap': 'round',
-    },
-  },
-]
-
 export const trackLayerRaw = produce(trackLayer, (draft) => {
   // draft.paint['line-color'] = '#81D4FA'
   draft.paint['line-width'][4] = 1
   draft.paint['line-width'][6] = 2
   draft.paint['line-dasharray'] = [3, 3]
   delete draft.paint['line-opacity']
 })
 
 export const basemap = positron
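The removed region layers fade regions in by overtaking-event count on a logarithmic scale. A minimal plain-JavaScript sketch of the same fill-opacity computation (function name is illustrative):

// Sketch: the region fill-opacity, computed outside MapLibre.
// Opacity ramps linearly in log10 space from 0 (count <= 1) to 0.9 (count >= maxValue).
function regionFillOpacity(eventCount, maxValue = 5000) {
  const x = Math.log10(Math.max(eventCount, 1))
  const t = Math.min(x / Math.log10(maxValue), 1)
  return 0.9 * t
}

// e.g. regionFillOpacity(1) === 0, regionFillOpacity(5000) === 0.9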
@@ -1,18 +0,0 @@
-import React from 'react'
-import {Header} from 'semantic-ui-react'
-import {useTranslation} from 'react-i18next'
-import Markdown from 'react-markdown'
-
-import {Page} from 'components'
-
-export default function AcknowledgementsPage() {
-  const {t} = useTranslation()
-  const title = t('AcknowledgementsPage.title')
-
-  return (
-    <Page title={title}>
-      <Header as="h2">{title}</Header>
-      <Markdown>{t('AcknowledgementsPage.information')}</Markdown>
-    </Page>
-  )
-}
@@ -1,161 +0,0 @@
-import React, {useState, useCallback, useMemo} from 'react'
-import {Source, Layer} from 'react-map-gl'
-import _ from 'lodash'
-import {Button, Form, Dropdown, Header, Message, Icon} from 'semantic-ui-react'
-import {useTranslation, Trans as Translate} from 'react-i18next'
-import Markdown from 'react-markdown'
-
-import {useConfig} from 'config'
-import {Page, Map} from 'components'
-
-const BoundingBoxSelector = React.forwardRef(({value, name, onChange}, ref) => {
-  const {t} = useTranslation()
-  const [pointNum, setPointNum] = useState(0)
-  const [point0, setPoint0] = useState(null)
-  const [point1, setPoint1] = useState(null)
-
-  const onClick = (e) => {
-    if (pointNum == 0) {
-      setPoint0(e.lngLat)
-    } else {
-      setPoint1(e.lngLat)
-    }
-    setPointNum(1 - pointNum)
-  }
-
-  React.useEffect(() => {
-    if (!point0 || !point1) return
-    const bbox = `${point0[0]},${point0[1]},${point1[0]},${point1[1]}`
-    if (bbox !== value) {
-      onChange(bbox)
-    }
-  }, [point0, point1])
-
-  React.useEffect(() => {
-    if (!value) return
-    const [p00, p01, p10, p11] = value.split(',').map((v) => Number.parseFloat(v))
-    if (!point0 || point0[0] != p00 || point0[1] != p01) setPoint0([p00, p01])
-    if (!point1 || point1[0] != p10 || point1[1] != p11) setPoint1([p10, p11])
-  }, [value])
-
-  return (
-    <div>
-      <Form.Input
-        label={t('ExportPage.boundingBox.label')}
-        {...{name, value}}
-        onChange={(e) => onChange(e.target.value)}
-      />
-
-      <div style={{height: 400, position: 'relative', marginBottom: 16}}>
-        <Map onClick={onClick}>
-          <Source
-            id="bbox"
-            type="geojson"
-            data={
-              point0 && point1
-                ? {
-                    type: 'FeatureCollection',
-                    features: [
-                      {
-                        type: 'Feature',
-                        geometry: {
-                          type: 'Polygon',
-                          coordinates: [
-                            [
-                              [point0[0], point0[1]],
-                              [point1[0], point0[1]],
-                              [point1[0], point1[1]],
-                              [point0[0], point1[1]],
-                              [point0[0], point0[1]],
-                            ],
-                          ],
-                        },
-                      },
-                    ],
-                  }
-                : {}
-            }
-          >
-            <Layer
-              id="bbox"
-              type="line"
-              paint={{
-                'line-width': 4,
-                'line-color': '#F06292',
-              }}
-            />
-          </Source>
-        </Map>
-      </div>
-    </div>
-  )
-})
-
-const MODES = ['events', 'segments']
-const FORMATS = ['geojson', 'shapefile']
-
-export default function ExportPage() {
-  const [mode, setMode] = useState('events')
-  const [bbox, setBbox] = useState('8.294678,49.651182,9.059601,50.108249')
-  const [fmt, setFmt] = useState('geojson')
-  const config = useConfig()
-  const {t} = useTranslation()
-  return (
-    <Page title="Export">
-      <Header as="h2">{t('ExportPage.title')}</Header>
-
-      <Message icon info>
-        <Icon name="info circle" />
-        <Message.Content>
-          <Markdown>{t('ExportPage.information')}</Markdown>
-        </Message.Content>
-      </Message>
-
-      <Form>
-        <Form.Field>
-          <label>{t('ExportPage.mode.label')}</label>
-          <Dropdown
-            placeholder={t('ExportPage.mode.placeholder')}
-            fluid
-            selection
-            options={MODES.map((value) => ({
-              key: value,
-              text: t(`ExportPage.mode.${value}`),
-              value,
-            }))}
-            value={mode}
-            onChange={(_e, {value}) => setMode(value)}
-          />
-        </Form.Field>
-
-        <Form.Field>
-          <label>{t('ExportPage.format.label')}</label>
-          <Dropdown
-            placeholder={t('ExportPage.format.placeholder')}
-            fluid
-            selection
-            options={FORMATS.map((value) => ({
-              key: value,
-              text: t(`ExportPage.format.${value}`),
-              value,
-            }))}
-            value={fmt}
-            onChange={(_e, {value}) => setFmt(value)}
-          />
-        </Form.Field>
-
-        <BoundingBoxSelector value={bbox} onChange={setBbox} />
-
-        <Button
-          primary
-          as="a"
-          href={`${config?.apiUrl}/export/${mode}?bbox=${bbox}&fmt=${fmt}`}
-          target="_blank"
-          rel="noreferrer noopener"
-        >
-          {t('ExportPage.export')}
-        </Button>
-      </Form>
-    </Page>
-  )
-}
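The removed BoundingBoxSelector keeps the two clicked map corners and the text field in sync through a 'lng0,lat0,lng1,lat1' string, which is also what the export endpoint's ?bbox= parameter receives. A standalone sketch of that round-trip (helper names are illustrative):

// Sketch: bbox serialization as used by the removed ExportPage.
function serializeBbox(point0, point1) {
  // point0/point1 are [lng, lat] pairs from map clicks.
  return `${point0[0]},${point0[1]},${point1[0]},${point1[1]}`
}

function parseBbox(value) {
  return value.split(',').map((v) => Number.parseFloat(v))
}

// serializeBbox([8.294678, 49.651182], [9.059601, 50.108249])
//   === '8.294678,49.651182,9.059601,50.108249'  (the page's default bbox)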
12  frontend/src/pages/HomePage.module.less  Normal file
@@ -0,0 +1,12 @@
+@import 'styles.less';
+
+.welcomeMap {
+  height: 60rem;
+  max-height: 70vh;
+  position: relative;
+
+  @media @mobile {
+    margin: -35px -32px 0 -32px;
+    max-height: 70vh;
+  }
+}
@@ -1,19 +1,17 @@
 import React from 'react'
-import {Grid, Loader, Header, Item} from 'semantic-ui-react'
+import {Link} from 'react-router-dom'
+import {Message, Grid, Loader, Header, Item} from 'semantic-ui-react'
 import {useObservable} from 'rxjs-hooks'
 import {of, from} from 'rxjs'
 import {map, switchMap} from 'rxjs/operators'
-import {useTranslation} from 'react-i18next'
 
 import api from 'api'
-import {RegionStats, Stats, Page} from 'components'
-import type {Track} from 'types'
+import {Stats, Page, Map} from 'components'
 
-import {TrackListItem, NoPublicTracksMessage} from './TracksPage'
+import {TrackListItem} from './TracksPage'
+import styles from './HomePage.module.less'
 
 function MostRecentTrack() {
-  const {t} = useTranslation()
-
   const track: Track | null = useObservable(
     () =>
       of(null).pipe(
@@ -26,10 +24,12 @@ function MostRecentTrack() {
 
   return (
     <>
-      <Header as="h2">{t('HomePage.mostRecentTrack')}</Header>
+      <Header as="h2">Most recent track</Header>
       <Loader active={track === null} />
       {track === undefined ? (
-        <NoPublicTracksMessage />
+        <Message>
+          No public tracks yet. <Link to="/upload">Upload the first!</Link>
+        </Message>
       ) : track ? (
         <Item.Group>
           <TrackListItem track={track} />
@@ -44,13 +44,15 @@ export default function HomePage() {
     <Page>
       <Grid stackable>
         <Grid.Row>
-          <Grid.Column width={8}>
+          <Grid.Column width={10}>
+            <div className={styles.welcomeMap}>
+              <Map />
+            </div>
+          </Grid.Column>
+          <Grid.Column width={6}>
             <Stats />
             <MostRecentTrack />
           </Grid.Column>
-          <Grid.Column width={8}>
-            <RegionStats />
-          </Grid.Column>
         </Grid.Row>
       </Grid>
     </Page>
@@ -4,18 +4,16 @@ import {Redirect, useLocation, useHistory} from 'react-router-dom'
 import {Icon, Message} from 'semantic-ui-react'
 import {useObservable} from 'rxjs-hooks'
 import {switchMap, pluck, distinctUntilChanged} from 'rxjs/operators'
-import {useTranslation} from 'react-i18next'
 
 import {Page} from 'components'
 import api from 'api'
 
-const LoginRedirectPage = connect((state) => ({
-  loggedIn: Boolean(state.login),
-}))(function LoginRedirectPage({loggedIn}) {
+const LoginRedirectPage = connect((state) => ({loggedIn: Boolean(state.login)}))(function LoginRedirectPage({
+  loggedIn,
+}) {
   const location = useLocation()
   const history = useHistory()
   const {search} = location
-  const {t} = useTranslation()
 
   /* eslint-disable react-hooks/exhaustive-deps */
 
@@ -37,8 +35,14 @@ const LoginRedirectPage = connect((state) => ({
 
   if (error) {
     return (
-      <Page small title={t('LoginRedirectPage.loginError')}>
-        <LoginError errorText={errorDescription || error} />
+      <Page small>
+        <Message icon error>
+          <Icon name="warning sign" />
+          <Message.Content>
+            <Message.Header>Login error</Message.Header>
+            The login server reported: {errorDescription || error}.
+          </Message.Content>
+        </Message>
       </Page>
     )
   }
@@ -46,21 +50,7 @@ const LoginRedirectPage = connect((state) => ({
   return <ExchangeAuthCode code={code} />
 })
 
-function LoginError({errorText}: {errorText: string}) {
-  const {t} = useTranslation()
-  return (
-    <Message icon error>
-      <Icon name="warning sign" />
-      <Message.Content>
-        <Message.Header>{t('LoginRedirectPage.loginError')}</Message.Header>
-        {t('LoginRedirectPage.loginErrorText', {error: errorText})}
-      </Message.Content>
-    </Message>
-  )
-}
-
 function ExchangeAuthCode({code}) {
-  const {t} = useTranslation()
   const result = useObservable(
     (_$, args$) =>
       args$.pipe(
@@ -78,8 +68,8 @@ function ExchangeAuthCode({code}) {
     <Message icon info>
       <Icon name="circle notched" loading />
       <Message.Content>
-        <Message.Header>{t('LoginRedirectPage.loggingIn')}</Message.Header>
-        {t('LoginRedirectPage.hangTight')}
+        <Message.Header>Logging you in</Message.Header>
+        Hang tight...
       </Message.Content>
     </Message>
   )
@@ -87,14 +77,21 @@ function ExchangeAuthCode({code}) {
     content = <Redirect to="/" />
   } else {
     const {error, error_description: errorDescription} = result
-    content = <LoginError errorText={errorDescription || error} />
+    content = (
+      <>
+        <Message icon error>
+          <Icon name="warning sign" />
+          <Message.Content>
+            <Message.Header>Login error</Message.Header>
+            The login server reported: {errorDescription || error}.
+          </Message.Content>
+        </Message>
+        <pre>{JSON.stringify(result, null, 2)}</pre>
+      </>
+    )
   }
 
-  return (
-    <Page small title="Login">
-      {content}
-    </Page>
-  )
+  return <Page small>{content}</Page>
 }
 
 export default LoginRedirectPage
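Both versions of this page read `code`, `error`, and `error_description` from the OAuth redirect's query string, taken from `location.search`. A minimal sketch of that extraction, assuming plain URLSearchParams parsing rather than the rxjs operators the component actually uses:

// Sketch: extracting OAuth redirect parameters from location.search.
// `search` is e.g. '?code=abc123' or '?error=access_denied&error_description=...'.
function parseRedirectParams(search) {
  const params = new URLSearchParams(search)
  return {
    code: params.get('code'),
    error: params.get('error'),
    errorDescription: params.get('error_description'),
  }
}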
@@ -1,109 +1,53 @@
 import React from 'react'
 import _ from 'lodash'
 import {connect} from 'react-redux'
-import {Link} from 'react-router-dom'
-import {List, Select, Input, Divider, Label, Checkbox, Header} from 'semantic-ui-react'
-import {useTranslation} from 'react-i18next'
+import {List, Select, Input, Divider, Checkbox, Header} from 'semantic-ui-react'
 
 import {
   MapConfig,
   setMapConfigFlag as setMapConfigFlagAction,
   initialState as defaultMapConfig,
 } from 'reducers/mapConfig'
-import {colorByDistance, colorByCount, viridisSimpleHtml} from 'mapstyles'
-import {ColorMapLegend, DiscreteColorMapLegend} from 'components'
-import styles from './styles.module.less'
+import {colorByDistance, colorByCount, reds} from 'mapstyles'
+import {ColorMapLegend} from 'components'
 
-const BASEMAP_STYLE_OPTIONS = ['positron', 'bright']
-
-const ROAD_ATTRIBUTE_OPTIONS = [
-  'distance_overtaker_mean',
-  'distance_overtaker_min',
-  'distance_overtaker_max',
-  'distance_overtaker_median',
-  'overtaking_event_count',
-  'usage_count',
-  'zone',
-]
-
-const DATE_FILTER_MODES = ['none', 'range', 'threshold']
+const BASEMAP_STYLE_OPTIONS = [
+  {value: 'positron', key: 'positron', text: 'Positron'},
+  {value: 'bright', key: 'bright', text: 'OSM Bright'},
+]
+
+const ROAD_ATTRIBUTE_OPTIONS = [
+  {value: 'distance_overtaker_mean', key: 'distance_overtaker_mean', text: 'Overtaker distance mean'},
+  {value: 'distance_overtaker_min', key: 'distance_overtaker_min', text: 'Overtaker distance minimum'},
+  {value: 'distance_overtaker_max', key: 'distance_overtaker_max', text: 'Overtaker distance maximum'},
+  {value: 'distance_overtaker_median', key: 'distance_overtaker_median', text: 'Overtaker distance median'},
+  {value: 'overtaking_event_count', key: 'overtaking_event_count', text: 'Event count'},
+]
 
 type User = Object
 
 function LayerSidebar({
   mapConfig,
   login,
   setMapConfigFlag,
 }: {
   login: User | null
   mapConfig: MapConfig
   setMapConfigFlag: (flag: string, value: unknown) => void
 }) {
-  const {t} = useTranslation()
   const {
     baseMap: {style},
     obsRoads: {show: showRoads, showUntagged, attribute, maxCount},
     obsEvents: {show: showEvents},
-    obsRegions: {show: showRegions},
-    filters: {currentUser: filtersCurrentUser, dateMode, startDate, endDate, thresholdAfter},
   } = mapConfig
 
-  const openStreetMapCopyright = (
-    <List.Item className={styles.copyright}>
-      {t('MapPage.sidebar.copyright.openStreetMap')}{' '}
-      <Link to="/acknowledgements">{t('MapPage.sidebar.copyright.learnMore')}</Link>
-    </List.Item>
-  )
-
   return (
     <div>
       <List relaxed>
         <List.Item>
-          <List.Header>{t('MapPage.sidebar.baseMap.style.label')}</List.Header>
+          <List.Header>Basemap Style</List.Header>
           <Select
-            options={BASEMAP_STYLE_OPTIONS.map((value) => ({
-              value,
-              key: value,
-              text: t(`MapPage.sidebar.baseMap.style.${value}`),
-            }))}
+            options={BASEMAP_STYLE_OPTIONS}
             value={style}
             onChange={(_e, {value}) => setMapConfigFlag('baseMap.style', value)}
           />
         </List.Item>
-        {openStreetMapCopyright}
-        <Divider />
-        <List.Item>
-          <Checkbox
-            toggle
-            size="small"
-            id="obsRegions.show"
-            style={{float: 'right'}}
-            checked={showRegions}
-            onChange={() => setMapConfigFlag('obsRegions.show', !showRegions)}
-          />
-          <label htmlFor="obsRegions.show">
-            <Header as="h4">{t('MapPage.sidebar.obsRegions.title')}</Header>
-          </label>
-        </List.Item>
-        {showRegions && (
-          <>
-            <List.Item>{t('MapPage.sidebar.obsRegions.colorByEventCount')}</List.Item>
-            <List.Item>
-              <ColorMapLegend
-                twoTicks
-                map={[
-                  [0, '#00897B00'],
-                  [5000, '#00897BFF'],
-                ]}
-                digits={0}
-              />
-            </List.Item>
-            <List.Item className={styles.copyright}>
-              {t('MapPage.sidebar.copyright.boundaries')}{' '}
-              <Link to="/acknowledgements">{t('MapPage.sidebar.copyright.learnMore')}</Link>
-            </List.Item>
-          </>
-        )}
         <Divider />
         <List.Item>
           <Checkbox
@@ -115,7 +59,7 @@ function LayerSidebar({
             onChange={() => setMapConfigFlag('obsRoads.show', !showRoads)}
           />
           <label htmlFor="obsRoads.show">
-            <Header as="h4">{t('MapPage.sidebar.obsRoads.title')}</Header>
+            <Header as="h4">Road segments</Header>
           </label>
         </List.Item>
         {showRoads && (
@@ -124,67 +68,38 @@ function LayerSidebar({
             <Checkbox
               checked={showUntagged}
               onChange={() => setMapConfigFlag('obsRoads.showUntagged', !showUntagged)}
-              label={t('MapPage.sidebar.obsRoads.showUntagged.label')}
+              label="Include roads without data"
             />
           </List.Item>
           <List.Item>
-            <List.Header>{t('MapPage.sidebar.obsRoads.attribute.label')}</List.Header>
+            <List.Header>Color based on</List.Header>
             <Select
               fluid
-              options={ROAD_ATTRIBUTE_OPTIONS.map((value) => ({
-                value,
-                key: value,
-                text: t(`MapPage.sidebar.obsRoads.attribute.${value}`),
-              }))}
+              options={ROAD_ATTRIBUTE_OPTIONS}
               value={attribute}
               onChange={(_e, {value}) => setMapConfigFlag('obsRoads.attribute', value)}
             />
           </List.Item>
           {attribute.endsWith('_count') ? (
-            <>
-              <List.Item>
-                <List.Header>{t('MapPage.sidebar.obsRoads.maxCount.label')}</List.Header>
-                <Input
-                  fluid
-                  type="number"
-                  value={maxCount}
-                  onChange={(_e, {value}) => setMapConfigFlag('obsRoads.maxCount', value)}
-                />
-              </List.Item>
-              <List.Item>
-                <ColorMapLegend
-                  map={_.chunk(
-                    colorByCount('obsRoads.maxCount', mapConfig.obsRoads.maxCount, viridisSimpleHtml).slice(3),
-                    2
-                  )}
-                  twoTicks
-                />
-              </List.Item>
-            </>
-          ) : attribute.endsWith('zone') ? (
-            <>
-              <List.Item>
-                <Label size="small" style={{background: 'blue', color: 'white'}}>
-                  {t('general.zone.urban')} (1.5 m)
-                </Label>
-                <Label size="small" style={{background: 'cyan', color: 'black'}}>
-                  {t('general.zone.rural')}(2 m)
-                </Label>
-              </List.Item>
-            </>
-          ) : (
-            <>
-              <List.Item>
-                <List.Header>{_.upperFirst(t('general.zone.urban'))}</List.Header>
-                <DiscreteColorMapLegend map={colorByDistance('distance_overtaker')[3][5].slice(2)} />
-              </List.Item>
-              <List.Item>
-                <List.Header>{_.upperFirst(t('general.zone.rural'))}</List.Header>
-                <DiscreteColorMapLegend map={colorByDistance('distance_overtaker')[3][3].slice(2)} />
-              </List.Item>
-            </>
+            <>
+              <List.Item>
+                <List.Header>Maximum value</List.Header>
+                <Input
+                  fluid
+                  type="number"
+                  value={maxCount}
+                  onChange={(_e, {value}) => setMapConfigFlag('obsRoads.maxCount', value)}
+                />
+              </List.Item>
+              <List.Item>
+                <ColorMapLegend map={_.chunk(colorByCount('obsRoads.maxCount',mapConfig.obsRoads.maxCount, reds).slice(3), 2)} />
+              </List.Item></>
+          ) :
+          (
+            <List.Item>
+              <ColorMapLegend map={_.chunk(colorByDistance('distance_overtaker')[3].slice(3), 2)} />
+            </List.Item>
           )}
-          {openStreetMapCopyright}
         </>
       )}
       <Divider />
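The legends above index directly into the expression arrays returned by the mapstyles helpers. In the main branch, colorByDistance(...) returns ['case', <validity test>, <fallback>, ['match', ...]], so [3] is the 'match' expression, [3][3] the rural 'step' array, and [3][5] the urban 'step' array. A sketch with illustrative stand-in values (the real entries are the rgba strings from the mapstyles diff):

// Sketch: a MapLibre 'step' array is ['step', input, color0, t1, color1, t2, color2, ...].
const urbanStep = ['step', ['get', 'distance_overtaker'], '#960000', 1.1, '#ff0000', 1.3, '#ffdc00', 1.5, '#43c800', 1.7, '#439600']

// Slicing off the first two entries ('step' and the input expression) leaves
// the alternating color/threshold pairs that DiscreteColorMapLegend renders.
const legendPairs = urbanStep.slice(2)
// -> ['#960000', 1.1, '#ff0000', 1.3, '#ffdc00', 1.5, '#43c800', 1.7, '#439600']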
@@ -198,124 +113,16 @@ function LayerSidebar({
             onChange={() => setMapConfigFlag('obsEvents.show', !showEvents)}
           />
           <label htmlFor="obsEvents.show">
-            <Header as="h4">{t('MapPage.sidebar.obsEvents.title')}</Header>
+            <Header as="h4">Event points</Header>
           </label>
         </List.Item>
         {showEvents && (
           <>
             <List.Item>
-              <List.Header>{_.upperFirst(t('general.zone.urban'))}</List.Header>
-              <DiscreteColorMapLegend map={colorByDistance('distance_overtaker')[3][5].slice(2)} />
-            </List.Item>
-            <List.Item>
-              <List.Header>{_.upperFirst(t('general.zone.rural'))}</List.Header>
-              <DiscreteColorMapLegend map={colorByDistance('distance_overtaker')[3][3].slice(2)} />
+              <ColorMapLegend map={_.chunk(colorByDistance('distance_overtaker')[3].slice(3), 2)} />
             </List.Item>
           </>
         )}
         <Divider />
-
-        <List.Item>
-          <Header as="h4">{t('MapPage.sidebar.filters.title')}</Header>
-        </List.Item>
-
-        {login && (
-          <>
-            <List.Item>
-              <Header as="h5">{t('MapPage.sidebar.filters.userData')}</Header>
-            </List.Item>
-
-            <List.Item>
-              <Checkbox
-                toggle
-                size="small"
-                id="filters.currentUser"
-                checked={filtersCurrentUser}
-                onChange={() => setMapConfigFlag('filters.currentUser', !filtersCurrentUser)}
-                label={t('MapPage.sidebar.filters.currentUser')}
-              />
-            </List.Item>
-
-            <List.Item>
-              <Header as="h5">{t('MapPage.sidebar.filters.dateRange')}</Header>
-            </List.Item>
-
-            <List.Item>
-              <Select
-                id="filters.dateMode"
-                options={DATE_FILTER_MODES.map((value) => ({
-                  value,
-                  key: value,
-                  text: t(`MapPage.sidebar.filters.dateMode.${value}`),
-                }))}
-                value={dateMode ?? 'none'}
-                onChange={(_e, {value}) => setMapConfigFlag('filters.dateMode', value)}
-              />
-            </List.Item>
-
-            {dateMode == 'range' && (
-              <List.Item>
-                <Input
-                  type="date"
-                  min="2000-01-03"
-                  step="7"
-                  size="small"
-                  id="filters.startDate"
-                  onChange={(_e, {value}) => setMapConfigFlag('filters.startDate', value)}
-                  value={startDate ?? null}
-                  label={t('MapPage.sidebar.filters.start')}
-                />
-              </List.Item>
-            )}
-
-            {dateMode == 'range' && (
-              <List.Item>
-                <Input
-                  type="date"
-                  min="2000-01-03"
-                  step="7"
-                  size="small"
-                  id="filters.endDate"
-                  onChange={(_e, {value}) => setMapConfigFlag('filters.endDate', value)}
-                  value={endDate ?? null}
-                  label={t('MapPage.sidebar.filters.end')}
-                />
-              </List.Item>
-            )}
-
-            {dateMode == 'threshold' && (
-              <List.Item>
-                <Input
-                  type="date"
-                  min="2000-01-03"
-                  step="7"
-                  size="small"
-                  id="filters.startDate"
-                  value={startDate ?? null}
-                  onChange={(_e, {value}) => setMapConfigFlag('filters.startDate', value)}
-                  label={t('MapPage.sidebar.filters.threshold')}
-                />
-              </List.Item>
-            )}
-
-            {dateMode == 'threshold' && (
-              <List.Item>
-                <span>
-                  {t('MapPage.sidebar.filters.before')}{' '}
-                  <Checkbox
-                    toggle
-                    size="small"
-                    checked={thresholdAfter ?? false}
-                    onChange={() => setMapConfigFlag('filters.thresholdAfter', !thresholdAfter)}
-                    id="filters.thresholdAfter"
-                  />{' '}
-                  {t('MapPage.sidebar.filters.after')}
-                </span>
-              </List.Item>
-            )}
-          </>
-        )}
-        {!login && <List.Item>{t('MapPage.sidebar.filters.needsLogin')}</List.Item>}
       </List>
     </div>
   )
@@ -329,7 +136,6 @@ export default connect(
       (state as any).mapConfig as MapConfig
       //
     ),
-    login: state.login,
   }),
   {setMapConfigFlag: setMapConfigFlagAction}
   //
@@ -1,31 +0,0 @@
-import React from 'react'
-import {createPortal} from 'react-dom'
-import {useTranslation} from 'react-i18next'
-import {List, Header, Icon, Button} from 'semantic-ui-react'
-
-import styles from './styles.module.less'
-
-export default function RegionInfo({region, mapInfoPortal, onClose}) {
-  const {t} = useTranslation()
-  const content = (
-    <>
-      <div className={styles.closeHeader}>
-        <Header as="h3">{region.properties.name || t('MapPage.regionInfo.unnamedRegion')}</Header>
-        <Button primary icon onClick={onClose}>
-          <Icon name="close" />
-        </Button>
-      </div>
-
-      <List>
-        <List.Item>
-          <List.Header>{t('MapPage.regionInfo.eventCount')}</List.Header>
-          <List.Content>{region.properties.overtaking_event_count ?? 0}</List.Content>
-        </List.Item>
-      </List>
-    </>
-  )
-
-  return content && mapInfoPortal
-    ? createPortal(<div className={styles.mapInfoBox}>{content}</div>, mapInfoPortal)
-    : null
-}
@@ -1,75 +1,48 @@
 import React, {useState, useCallback} from 'react'
-import {createPortal} from 'react-dom'
 import _ from 'lodash'
-import {Segment, Menu, Header, Label, Icon, Table, Message, Button} from 'semantic-ui-react'
+import {Segment, Menu, Header, Label, Icon, Table} from 'semantic-ui-react'
 import {Layer, Source} from 'react-map-gl'
 import {of, from, concat} from 'rxjs'
 import {useObservable} from 'rxjs-hooks'
 import {switchMap, distinctUntilChanged} from 'rxjs/operators'
-import {Chart} from 'components'
-import {pairwise} from 'utils'
-import {useTranslation} from 'react-i18next'
 
 import type {Location} from 'types'
 import api from 'api'
 import {colorByDistance, borderByZone} from 'mapstyles'
 
 import styles from './styles.module.less'
 
-function selectFromColorMap(colormap, value) {
-  let last = null
-  for (let i = 0; i < colormap.length; i += 2) {
-    if (colormap[i + 1] > value) {
-      return colormap[i]
-    }
-  }
-  return colormap[colormap.length - 1]
-}
-
-const UNITS = {
-  distanceOvertaker: 'm',
-  distanceStationary: 'm',
-  speed: 'km/h',
-}
-const ZONE_COLORS = {urban: 'blue', rural: 'cyan', motorway: 'purple'}
-const CARDINAL_DIRECTIONS = ['north', 'northEast', 'east', 'southEast', 'south', 'southWest', 'west', 'northWest']
-const getCardinalDirection = (t, bearing) => {
-  if (bearing == null) {
-    return t('MapPage.roadInfo.cardinalDirections.unknown')
-  } else {
-    const n = CARDINAL_DIRECTIONS.length
-    const i = Math.floor(((bearing / 360.0) * n + 0.5) % n)
-    const name = CARDINAL_DIRECTIONS[i]
-    return t(`MapPage.roadInfo.cardinalDirections.${name}`)
-  }
-}
+const UNITS = {distanceOvertaker: 'm', distanceStationary: 'm', speed: 'm/s'}
+const LABELS = {distanceOvertaker: 'Overtaker', distanceStationary: 'Stationary', speed: 'Speed'}
+const ZONE_COLORS = {urban: 'olive', rural: 'brown', motorway: 'purple'}
+const CARDINAL_DIRECTIONS = ['north', 'north-east', 'east', 'south-east', 'south', 'south-west', 'west', 'north-west']
+const getCardinalDirection = (bearing) =>
+  bearing == null
+    ? 'unknown'
+    : CARDINAL_DIRECTIONS[
+        Math.floor(((bearing / 360.0) * CARDINAL_DIRECTIONS.length + 0.5) % CARDINAL_DIRECTIONS.length)
+      ] + ' bound'
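Both variants bucket a bearing into one of eight 45° compass sectors; adding 0.5 before flooring centres each sector on its compass point, so bearings from 337.5° through 22.5° all map to north. A standalone sketch of that arithmetic (names are illustrative):

// Sketch: bearing in degrees -> one of 8 compass sectors.
const DIRECTIONS = ['north', 'north-east', 'east', 'south-east', 'south', 'south-west', 'west', 'north-west']

function cardinalDirection(bearing) {
  const n = DIRECTIONS.length
  const i = Math.floor(((bearing / 360.0) * n + 0.5) % n)
  return DIRECTIONS[i]
}

// cardinalDirection(350) === 'north'; cardinalDirection(50) === 'north-east'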
 function RoadStatsTable({data}) {
-  const {t} = useTranslation()
   return (
     <Table size="small" compact>
       <Table.Header>
         <Table.Row>
-          <Table.HeaderCell textAlign="right"></Table.HeaderCell>
-          {['distanceOvertaker', 'distanceStationary', 'speed'].map((prop) => (
-            <Table.HeaderCell key={prop} textAlign="right">
-              {t(`MapPage.roadInfo.${prop}`)}
-            </Table.HeaderCell>
-          ))}
+          <Table.HeaderCell>Property</Table.HeaderCell>
+          <Table.HeaderCell>n</Table.HeaderCell>
+          <Table.HeaderCell>min</Table.HeaderCell>
+          <Table.HeaderCell>q50</Table.HeaderCell>
+          <Table.HeaderCell>max</Table.HeaderCell>
+          <Table.HeaderCell>mean</Table.HeaderCell>
+          <Table.HeaderCell>unit</Table.HeaderCell>
         </Table.Row>
       </Table.Header>
       <Table.Body>
-        {['count', 'min', 'median', 'max', 'mean'].map((stat) => (
-          <Table.Row key={stat}>
-            <Table.Cell> {t(`MapPage.roadInfo.${stat}`)}</Table.Cell>
-            {['distanceOvertaker', 'distanceStationary', 'speed'].map((prop) => (
-              <Table.Cell key={prop} textAlign="right">
-                {(data[prop]?.statistics?.[stat] * (prop === `speed` && stat != 'count' ? 3.6 : 1)).toFixed(
-                  stat === 'count' ? 0 : 2
-                )}
-                {stat !== 'count' && ` ${UNITS[prop]}`}
-              </Table.Cell>
+        {['distanceOvertaker', 'distanceStationary', 'speed'].map((prop) => (
+          <Table.Row key={prop}>
+            <Table.Cell>{LABELS[prop]}</Table.Cell>
+            {['count', 'min', 'median', 'max', 'mean'].map((stat) => (
+              <Table.Cell key={stat}>{data[prop]?.statistics?.[stat]?.toFixed(stat === 'count' ? 0 : 3)}</Table.Cell>
             ))}
+            <Table.Cell>{UNITS[prop]}</Table.Cell>
           </Table.Row>
         ))}
       </Table.Body>
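Note the unit change in this table: the main branch stores speeds in m/s and multiplies by 3.6 to display km/h, while the feature branch prints the raw m/s value. A sketch of the conversion step (helper name is illustrative):

// Sketch: the main branch's display-time unit conversion.
const MS_TO_KMH = 3.6 // 1 m/s = 3.6 km/h

function formatStat(prop, stat, value) {
  const v = prop === 'speed' && stat !== 'count' ? value * MS_TO_KMH : value
  return v.toFixed(stat === 'count' ? 0 : 2)
}

// formatStat('speed', 'mean', 10) === '36.00'  (10 m/s shown as 36 km/h)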
@@ -77,91 +50,7 @@ function RoadStatsTable({data}) {
   )
 }
 
-function HistogramChart({bins, counts, zone}) {
-  const diff = bins[1] - bins[0]
-  const colortype = zone === 'rural' ? 3 : 5
-  const data = _.zip(
-    bins.slice(0, bins.length - 1).map((v) => v + diff / 2),
-    counts
-  ).map((value) => ({
-    value,
-    itemStyle: {
-      color: selectFromColorMap(colorByDistance()[3][colortype].slice(2), value[0]),
-    },
-  }))
-
-  return (
-    <Chart
-      style={{height: 240}}
-      option={{
-        grid: {top: 30, bottom: 30, right: 30, left: 30},
-        xAxis: {
-          type: 'value',
-          axisLabel: {formatter: (v) => `${Math.round(v * 100)} cm`},
-          min: 0,
-          max: 2.5,
-        },
-        yAxis: {},
-        series: [
-          {
-            type: 'bar',
-            data,
-            barMaxWidth: 20,
-          },
-        ],
-      }}
-    />
-  )
-}
-
-interface ArrayStats {
-  statistics: {
-    count: number
-    mean: number
-    min: number
-    max: number
-    median: number
-  }
-  histogram: {
-    bins: number[]
-    counts: number[]
-  }
-  values: number[]
-}
-
-export interface RoadDirectionInfo {
-  bearing: number
-  distanceOvertaker: ArrayStats
-  distanceStationary: ArrayStats
-  speed: ArrayStats
-}
-
-export interface RoadInfoType {
-  road: {
-    way_id: number
-    zone: 'urban' | 'rural' | null
-    name: string
-    directionality: -1 | 0 | 1
-    oneway: boolean
-    geometry: Object
-  }
-  forwards: RoadDirectionInfo
-  backwards: RoadDirectionInfo
-}
-
-export default function RoadInfo({
-  roadInfo: info,
-  hasFilters,
-  onClose,
-  mapInfoPortal,
-}: {
-  roadInfo: RoadInfoType
-  hasFilters: boolean
-  onClose: () => void
-  mapInfoPortal: HTMLElement
-}) {
-  const {t} = useTranslation()
+export default function RoadInfo({clickLocation}) {
   const [direction, setDirection] = useState('forwards')
 
   const onClickDirection = useCallback(
@@ -173,60 +62,73 @@ export default function RoadInfo({
     [setDirection]
   )
 
-  // TODO: change based on left-hand/right-hand traffic
-  const offsetDirection = info.road.oneway ? 0 : direction === 'forwards' ? 1 : -1
-
-  const content = (
-    <>
-      <div className={styles.closeHeader}>
-        <Header as="h3">{info?.road.name || t('MapPage.roadInfo.unnamedWay')}</Header>
-        <Button primary icon onClick={onClose}>
-          <Icon name="close" />
-        </Button>
-      </div>
-
-      {hasFilters && (
-        <Message info icon>
-          <Icon name="info circle" small />
-          <Message.Content>{t('MapPage.roadInfo.hintFiltersNotApplied')}</Message.Content>
-        </Message>
-      )}
-
-      {info?.road.zone && (
-        <Label size="small" color={ZONE_COLORS[info?.road.zone]}>
-          {t(`general.zone.${info.road.zone}`)}
-        </Label>
-      )}
-
-      {info?.road.oneway && (
-        <Label size="small" color="blue">
-          <Icon name="long arrow alternate right" fitted /> {t('MapPage.roadInfo.oneway')}
-        </Label>
-      )}
-
-      {info?.road.oneway ? null : (
-        <Menu size="tiny" pointing>
-          <Menu.Item header>{t('MapPage.roadInfo.direction')}</Menu.Item>
-          <Menu.Item name="forwards" active={direction === 'forwards'} onClick={onClickDirection}>
-            {getCardinalDirection(t, info?.forwards?.bearing)}
-          </Menu.Item>
-          <Menu.Item name="backwards" active={direction === 'backwards'} onClick={onClickDirection}>
-            {getCardinalDirection(t, info?.backwards?.bearing)}
-          </Menu.Item>
-        </Menu>
-      )}
-
-      {info?.[direction] && <RoadStatsTable data={info[direction]} />}
-
-      {info?.[direction]?.distanceOvertaker?.histogram && (
-        <>
-          <Header as="h5">{t('MapPage.roadInfo.overtakerDistanceDistribution')}</Header>
-          <HistogramChart {...info[direction]?.distanceOvertaker?.histogram} />
-        </>
-      )}
-    </>
-  )
+  const info = useObservable(
+    (_$, inputs$) =>
+      inputs$.pipe(
+        distinctUntilChanged(_.isEqual),
+        switchMap(([location]) =>
+          location
+            ? concat(
+                of(null),
+                from(
+                  api.get('/mapdetails/road', {
+                    query: {
+                      ...location,
+                      radius: 100,
+                    },
+                  })
+                )
+              )
+            : of(null)
+        )
+      ),
+    null,
+    [clickLocation]
+  )
+
+  if (!clickLocation) {
+    return null
+  }
+
+  const loading = info == null
+
+  const offsetDirection = info?.road?.oneway ? 0 : direction === 'forwards' ? 1 : -1 // TODO: change based on left-hand/right-hand traffic
+
+  const content =
+    !loading && !info.road ? (
+      'No road found.'
+    ) : (
+      <>
+        <Header as="h3">{loading ? '...' : info?.road.name || 'Unnamed way'}</Header>
+
+        {info?.road.zone && (
+          <Label size="small" color={ZONE_COLORS[info?.road.zone]}>
+            {info?.road.zone}
+          </Label>
+        )}
+
+        {info?.road.oneway && (
+          <Label size="small" color="blue">
+            <Icon name="long arrow alternate right" fitted /> oneway
+          </Label>
+        )}
+
+        {info?.road.oneway ? null : (
+          <Menu size="tiny" fluid secondary>
+            <Menu.Item header>Direction</Menu.Item>
+            <Menu.Item name="forwards" active={direction === 'forwards'} onClick={onClickDirection}>
+              {getCardinalDirection(info?.forwards?.bearing)}
+            </Menu.Item>
+            <Menu.Item name="backwards" active={direction === 'backwards'} onClick={onClickDirection}>
+              {getCardinalDirection(info?.backwards?.bearing)}
+            </Menu.Item>
+          </Menu>
+        )}
+
+        {info?.[direction] && <RoadStatsTable data={info[direction]} />}
+      </>
+    )
 
   return (
     <>
       {info?.road && (
@@ -254,7 +156,11 @@ export default function RoadInfo({
       </Source>
     )}
 
-      {content && mapInfoPortal && createPortal(<div className={styles.mapInfoBox}>{content}</div>, mapInfoPortal)}
+      {content && (
+        <div className={styles.mapInfoBox}>
+          <Segment loading={loading}>{content}</Segment>
+        </div>
+      )}
     </>
   )
 }
Some files were not shown because too many files have changed in this diff.