Compare commits


277 commits

Author SHA1 Message Date
Benjamin Yule Bädorf 710a37dac3
ci: add docker build and push action
2024-02-22 17:35:35 +01:00
Paul Bienkowski fbf4d739f5 improve sql formatting and parameter passing 2024-02-18 10:13:41 +01:00
gluap ec669fa077
remove accidental submit of build version 2024-01-31 14:14:04 +01:00
gluap f7c0d48c22
do not make inaccessible ways cycleable (often used for tram tracks) 2024-01-31 14:04:15 +01:00
gluap 7bffc3a2b3
do not make inaccessible ways cycleable (often used for tram tracks) 2024-01-31 14:02:47 +01:00
gluap 241a43c4ad
fix for older postgres version 2024-01-26 00:19:59 +01:00
gluap 4940679201
Release: 0.8.1 2024-01-25 22:24:21 +01:00
gluap 6d35001f8d
merge exporting zones from "exports" page 2024-01-25 22:11:11 +01:00
gluap c41aa3f6a0
fix gpstime 2024-01-18 21:26:36 +01:00
Paul Bienkowski 278bcfc603 Format all JS/TS files 2023-08-26 10:26:13 +02:00
Paul Bienkowski ba7de7582d Add openbikesensor-transform-osm command to PATH 2023-08-17 13:46:20 +02:00
Alexandre Fauquette 4fa1d31f33
fix setup (#352) 2023-07-24 18:49:36 +02:00
gluap be6c736148
Add index to fix very slow rendering speed on low zoom levels. 2023-07-16 13:44:11 +02:00
gluap 1faaa6e7b4
nginx config example to cache tiles up to level 12 for a day. 2023-07-16 13:09:26 +02:00
gluap 4f44cc0e56
fix per-user-statistics 2023-07-08 14:05:23 +02:00
gluap 74c7e6444e
remove the non-working buttons from the device edit field. 2023-07-08 13:55:23 +02:00
gluap 4ebffc529f
default to reverse chronological sorting in the table (so users see their newest tracks on top). Fixes #350 2023-07-08 13:17:48 +02:00
gluap b9c9a61ca1
Release: 0.8.0 2023-06-22 22:01:29 +02:00
gluap f23ecc37e4
keep OpenSans (but fallback to noto as suggested in #347) 2023-06-22 21:58:27 +02:00
yyxcv d29c68432d
Fix/default font family (#347)
* remove 'Open Sans' references

* set default font-family

(use the font-family list that was set in index.css)

* small style tweaks to compensate effect of new base font-family

---------

Co-authored-by: yyxcv <yxcv@github.com>
2023-06-22 21:40:24 +02:00
yyxcv 5b91449749
fix: remove compass from map (#346)
(as it can not be rotated anyway)

Co-authored-by: yyxcv <yxcv@github.com>
Co-authored-by: gluap <44007906+gluap@users.noreply.github.com>
2023-06-22 20:31:29 +02:00
yyxcv 31d8390bdc fix: add scrollbar to mapSidebar on overflow 2023-06-22 20:28:43 +02:00
gluap 14c7f6e88b
Merge pull request #321 from openbikesensor/next
together with @opatut: finally. 🚀
2023-06-22 20:27:48 +02:00
gluap c897412f99
Merge pull request #343 from openbikesensor/next-document-upgrade
Next document upgrade
2023-06-19 20:41:36 +02:00
gluap 43765092c3
fix issue when displaying mapdetails (newer numpy deprecates numpy.bool, it was always an alias for bool apparently) 2023-06-10 19:37:34 +02:00
gluap 1a1232f2a7
up openmaptiles version (7.0 has been running on adfc-hessen for ages), add changelog. 2023-06-10 19:33:14 +02:00
gluap 4a87489b3f
documentation update after dancing the dance on obs.adfc-hessen.de 2023-06-10 17:45:56 +02:00
gluap 7dd6b68da8
fix percentage logging 2023-06-10 13:00:45 +02:00
gluap 0233045959
import in chunks to avoid smaller systems choking 2023-06-10 12:56:44 +02:00
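A chunk helper like the one referenced here (and added as a utility in 4c1c95e4ff further down) is a small generator; a minimal sketch of the idea, not the project's actual code:

```python
from itertools import islice
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def chunk(items: Iterable[T], size: int) -> Iterator[List[T]]:
    """Yield lists of at most `size` items, so a large import can be
    processed and committed in batches instead of held in memory at once."""
    iterator = iter(items)
    while batch := list(islice(iterator, size)):
        yield batch
```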
gluap b1cfd30da9
Merge pull request #341 from openbikesensor/next-semaphore-with-road-usage
Add export of road usage after review (without temporal filtering)
2023-05-29 13:22:16 +02:00
gluap da82303042
implement reviewer comments
- rename road_usage -> segments (as we're actually dealing with segments not with roads)
- use "select as" to ensure defined behaviour
- cleanup
2023-05-29 13:20:34 +02:00
gluap 497e1b739a
Add a first shot at upgrade documentation (WIP) 2023-05-25 22:24:05 +02:00
gluap d8e8d9aec1
Merge pull request #337 from openbikesensor/next-semaphore
Semaphore for tile requests
2023-05-25 21:51:07 +02:00
gluap e1763e0d3c
chore: upgrade python in containers. 2023-05-19 18:21:45 +02:00
gluap de029fa3d2
Merge branch 'next-semaphore' into next-semaphore-with-road-usage 2023-05-19 11:31:35 +02:00
gluap 0766467412
Merge branch 'next' into next-semaphore 2023-05-19 11:30:49 +02:00
gluap edc3c37abb
fix openid logout (wasn't working with old keycloak anyhow, but this works at least with new keycloak) 2023-05-19 11:29:15 +02:00
gluap 41ce56ac09
restore .env which was lost (probably due to gitignore) 2023-05-19 11:02:06 +02:00
gluap 7ff88aba15
Add the possibility to export road usage (without temporal filtering) 2023-05-18 13:51:05 +02:00
Paul Bienkowski a6811d4ba2 Also add semaphore for exports 2023-05-13 20:52:45 +02:00
Paul Bienkowski d3fbb113f3 Fix export bounding box ESPG id 2023-05-13 20:52:04 +02:00
Paul Bienkowski c249b1638e Add semaphore to limit simultaneous requests to tile data 2023-05-13 20:42:22 +02:00
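Commit c249b1638e names the technique; a minimal sketch of a request semaphore under asyncio (the API is Sanic-based), with `query_tile_data` as a hypothetical stand-in for the real database call:

```python
import asyncio

# Cap concurrent tile queries; excess requests wait their turn instead of
# piling up on the database.
TILE_SEMAPHORE = asyncio.Semaphore(4)

async def query_tile_data(x: int, y: int, zoom: int) -> bytes:
    await asyncio.sleep(0.01)  # hypothetical stand-in for the real query
    return b""

async def render_tile(x: int, y: int, zoom: int) -> bytes:
    async with TILE_SEMAPHORE:
        return await query_tile_data(x, y, zoom)
```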
Paul Bienkowski dd2e995720 generate proper filenames for bulk download, and use that as base folder inside tar 2023-05-13 20:22:16 +02:00
gluap 612a443dde make the tar strip the common parts of the directory structure. 2023-05-13 20:22:16 +02:00
Paul Bienkowski 1c53230b4d chore: fix log import 2023-05-13 20:22:16 +02:00
gluap 7e44f6d31d implement comments from review; slow fade-in of events 2023-05-13 20:22:16 +02:00
gluap a946ea53c9 Add bulk downloading 2023-05-13 20:22:16 +02:00
gluap fb3e8bf701
log10(0) is not defined - also: make region borders less intrusive depending on the number of events. As it was, it was hard to see the underlying geography. 2023-04-10 13:10:03 +02:00
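If the fix follows the usual pattern, it clamps the input before taking the logarithm; a hedged sketch of that idea, not the actual change:

```python
import math

def log_scale(event_count: int) -> float:
    # log10(0) is undefined, so clamp the count to at least 1;
    # regions without events then map to 0 on the log scale.
    return math.log10(max(event_count, 1))
```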
gluap 56c9d2e455
don't die if no argv is supplied. 2023-04-10 12:42:24 +02:00
gluap c359d945da
the current version of region does not have that column - furthermore it now has an index for its id column already during creation. 2023-04-10 12:38:37 +02:00
gluap a66d96568e
fix downgrade step of event geometry transform. 2023-04-10 12:15:25 +02:00
gluap dc89db5471
fix typo in column name... 2023-04-10 12:05:15 +02:00
gluap 10fd02804e
exclude newest websockets versions that kill sanic. 2023-04-10 12:01:41 +02:00
gluap 5108eb02ce
Update scripts to latest version to avoid version conflicts 2023-04-01 21:43:35 +02:00
Paul Bienkowski 251be4a699 Fix bearings on road info, and German words for those 2023-04-01 20:19:59 +02:00
Paul Bienkowski dd72ed791f Use ESPG 3857 for all geometry columns 2023-04-01 16:44:47 +02:00
Paul Bienkowski ce8054b7ae Use NUTS for region import, not OSM 2023-03-31 21:06:59 +02:00
Paul Bienkowski 0d9ddf4884 Use logging in import_osm 2023-03-30 14:56:21 +02:00
Paul Bienkowski 6fb5dfe6de Fix index names on geometry tables and add way_id/relation_id indexes 2023-03-30 14:18:44 +02:00
gluap 10f6b0c0c9 make sure we generate the right geometry column type and stay with the types we had from osm2pgsql (and thus stay compatible with older installations).
I believe retrofitting the migrations is OK as these were overwritten by osm2pgsql in the past anyhow. Now at least we create the schema everyone is using already.
2023-03-30 14:18:44 +02:00
gluap 8ce5816f53 enable importing and dumping also regions - in the same epsg geometry as we had them from osm2pgsql 2023-03-30 14:18:44 +02:00
gluap dd912bcd0d np.float was a deprecated alias for the builtin float. To avoid this error in existing code, use float by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use np.float64 here. 2023-03-30 14:18:44 +02:00
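For context, the deprecated NumPy aliases (`np.float` here, and the `np.bool` mentioned in 43765092c3 above) are fixed like this; a generic example, not the project's code:

```python
import numpy as np

values = np.array([1.5, 2.5])

# Before (fails on NumPy >= 1.24): values.astype(np.float)
values_as_float = values.astype(float)      # builtin float is the drop-in fix
values_as_f64 = values.astype(np.float64)   # or be explicit about the scalar type

# Before: np.array([True, False], dtype=np.bool)
mask = np.array([True, False], dtype=bool)
```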
gluap 39d90b3606 Update 35e7f1768f9b_create_table_road.py
Weird. Normally alembic should not make spelling errors :D
2023-03-30 14:18:44 +02:00
gluap e13bc759d7 Update Dockerfile
Remove roads_import.lua reference (file is already deleted)
2023-03-30 14:18:44 +02:00
Paul Bienkowski 0a18cda691 Remove osm2pgsql 2023-03-30 14:18:44 +02:00
Paul Bienkowski 761908a987 Move deployment readme to docs/ 2023-03-30 14:18:44 +02:00
Paul Bienkowski c4cc4a9078 Docs for new pipeline 2023-03-30 14:18:44 +02:00
Paul Bienkowski ac90d50239 Remove lean mode 2023-03-30 14:18:44 +02:00
Paul Bienkowski 59f074cb28 New import pipeline with a PBF conversion step 2023-03-30 14:18:44 +02:00
Paul Bienkowski 4c1c95e4ff Add chunk utility 2023-03-30 14:18:44 +02:00
Paul Bienkowski 69d7f64ead Add import_group columns for OSM data tables 2023-03-30 14:18:44 +02:00
Paul Bienkowski 276a2ddc69 Remove HSTORE tags column from region table 2023-03-30 14:18:44 +02:00
Paul Bienkowski de8d371b65 Include device count in stats 2023-03-30 14:18:26 +02:00
gluap cf8358d14b
fix the road_usage issue dennis found. 2023-03-26 22:34:20 +02:00
Dennis Boldt eda3bf2688 Make menu stackable on mobile 2023-03-26 19:09:29 +02:00
gluap df0466c6f1
as discussed with Paul: this makes the region lookup smooth. 2023-03-26 16:46:23 +02:00
gluap 9882b2041f
update what can be updated without breaking stuff. 2023-03-26 14:15:57 +02:00
gluap a7566fb6b3
restrict to admin_level 6
Only compute the regions layer at the required admin level 6 (Kreise/kreisfreie Städte, i.e. districts and district-free cities); we're not using other levels yet, and this speeds it up notably.
2023-03-25 17:43:32 +01:00
gluap b6cf59a09d
fix requirements.txt
remove redundant version spec that breaks pip
```
 => ERROR [stage-2  5/18] RUN pip install -r requirements.txt                                                                                           2.2s
------
 > [stage-2  5/18] RUN pip install -r requirements.txt:
#0 1.825 ERROR: Invalid requirement: 'sqlalchemy[asyncio]~=1.4.39 <2.0' (from line 10 of requirements.txt)
#0 2.055 WARNING: You are using pip version 21.2.4; however, version 23.0.1 is available.
```
2023-03-21 20:10:06 +01:00
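(For reference: pip expects comma-separated version specifiers, so the intended constraint would have to be written as `sqlalchemy[asyncio]~=1.4.39,<2.0`; the space-separated form quoted in the error is invalid, and `~=1.4.39` already implies an upper bound, so the commit simply drops the redundant part.)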
Paul Bienkowski 2f8e40db08
disable dependabot 2023-03-19 22:13:32 +01:00
gluap fa29deb397
Fix that bug sonarcloud is moaning about. 2023-03-12 23:30:52 +01:00
Paul Bienkowski 665816cc98 Fix map size when sidebar is open 2023-03-12 13:45:50 +01:00
Paul Bienkowski 0d44560830 Merge branch 'device-identifiers' into next 2023-03-12 13:38:42 +01:00
Paul Bienkowski 61b74e90fd wip:Build devices page 2023-03-12 13:37:51 +01:00
Paul Bienkowski 2c27a2c549 Pin sqlalchemy better 2023-03-12 13:13:46 +01:00
Paul Bienkowski 141460c79f Split settings page 2023-03-12 13:13:46 +01:00
Paul Bienkowski 4fe7d45dec Bulk update operations on tracks 2023-03-12 13:13:46 +01:00
Paul Bienkowski cbab83e6e3 Build awesome "My Tracks" table with filters and sorting 2023-03-12 13:13:46 +01:00
Paul Bienkowski 5a78d7eb38 Parse device identifiers and create UserDevice entries in database 2023-03-12 13:13:46 +01:00
Paul Bienkowski 56905fdf75 Install stream-zip 2023-03-12 13:13:46 +01:00
Paul Bienkowski 6c458a43f6 Raise maximum on track page limit 2023-03-12 13:09:36 +01:00
Paul Bienkowski 84ab957aa0 fix cors by implementing it ourselves 2023-03-12 13:09:36 +01:00
Paul Bienkowski ed272b4e4a Use TTY in development docker to get line-buffered prints 2023-03-12 13:09:36 +01:00
Paul Bienkowski b9aaf23e0a Clean up sanic logging 2023-03-12 13:09:36 +01:00
Paul Bienkowski 78dca1477c Fix naming of AUTO_RELOAD/AUTO_RESTART 2023-03-12 13:09:36 +01:00
Paul Bienkowski 215801f2b0 Regions: Fix migration order 2023-03-12 13:09:36 +01:00
Paul Bienkowski 6d71b88010 Translate region frontend 2023-03-12 12:57:05 +01:00
Paul Bienkowski e0070fc794 Merge branch 'administrative-area-import' into next 2023-03-12 12:44:09 +01:00
Paul Bienkowski 518bcd81ef Show regions on map page, and move on-click info panel into a proper sidebar 2023-03-12 12:43:08 +01:00
Paul Bienkowski 7ae4ebebb6 Show region stats on home page 2023-03-12 12:42:42 +01:00
Paul Bienkowski 382db5a11e Expose OBS map source for all zoom levels 2023-03-12 12:41:41 +01:00
Paul Bienkowski 3a97b07325 Add tile layer for regions with event count 2023-03-12 12:41:09 +01:00
Paul Bienkowski bea4174b37 Do not generate roads and events for tiles at low zoom levels 2023-03-12 12:40:57 +01:00
Paul Bienkowski 78561d5929 Add route to expose region stats 2023-03-12 12:40:06 +01:00
Paul Bienkowski 7e51976c06 Import regions from administrative boundaries 2023-03-12 12:39:50 +01:00
Paul Bienkowski ec53591ce0 Create Region table 2023-03-12 12:39:23 +01:00
gluap 9e80113089
Add an optionally-displayable "Terms and Conditions" link. 2023-03-11 16:49:42 +01:00
gluap e7b02b170e
Merge pull request #303 from cbiteau/ADD-FR-TRANSLATION
Add French translation
2023-03-11 14:35:24 +01:00
gluap 94d23adcd2
Merge pull request #304 from cbiteau/fix_parse_hash_for_negative_values
Make parseHash function work with negative values
2023-01-08 13:05:10 +01:00
Charly BITEAU d889abc798 Make parseHash function work with negative values 2023-01-07 12:56:17 +01:00
Charly BITEAU 1d2218b2df Add FR translation 2023-01-07 12:23:14 +01:00
Paul Bienkowski c1ccec9664 Add display_name field to users to specify a new name within the application, without changing the login name 2022-09-26 11:53:54 +02:00
Paul Bienkowski dec165341b Clean usernames of invalid characters when the users receive their name from the login server 2022-09-26 11:53:54 +02:00
Paul Bienkowski 426e6c8593 Rename users when they log in with a new preferred_username 2022-09-26 11:53:54 +02:00
Paul Bienkowski 8d1d575215 Debounce map moves before writing browser history (fixes #276) 2022-09-13 08:53:16 +02:00
Paul Bienkowski 0b5fe015d9 Add button to close road info overlay (fixes #275) 2022-09-13 08:53:16 +02:00
Paul Bienkowski 61890c6a5c Do not click on road when toggling sidebar (fixes #274) 2022-09-13 08:53:16 +02:00
Paul Bienkowski c1c3797eb8 Improve placement of map controls and popovers (fixes #272) 2022-09-13 08:53:16 +02:00
Paul Bienkowski fc930fe433 Decrease map page height if banner is enabled (for #272) 2022-09-13 08:53:16 +02:00
Paul Bienkowski 5cfc8aae39 Release: 0.7.0 2022-08-26 10:23:36 +02:00
Paul Bienkowski 8096c2c2d2 changelog and upgrade instructions for 0.7.0 2022-08-26 10:23:28 +02:00
Paul Bienkowski c3ed4f24dd Hint about filters not being applied in road info popover 2022-07-28 22:05:31 +02:00
Paul Bienkowski 1a3b971a71 Refactor filter arguments outside tile handler 2022-07-28 22:05:31 +02:00
Paul Bienkowski 201db32050 Remove logs 2022-07-28 22:05:31 +02:00
Paul Bienkowski a737d1ac1b Make Map sidebar wider for long German words ;) 2022-07-28 22:05:31 +02:00
Paul Bienkowski 57af4614b1 Translate filters in sidebar 2022-07-28 22:05:31 +02:00
Paul Bienkowski 8878a71c14 Limit map date filter to weeks (mondays) 2022-07-28 22:05:31 +02:00
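Snapping a date to its week's Monday is a one-liner; a hypothetical helper matching the idea of the commit above:

```python
from datetime import date, timedelta

def snap_to_monday(d: date) -> date:
    # weekday() is 0 for Monday, so subtracting it lands on the week's Monday
    return d - timedelta(days=d.weekday())
```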
Paul Bienkowski dcfcd21c96 Emit all roads in obs_roads layer regardless of filter, generate only their statistics with filters 2022-07-28 22:05:31 +02:00
Paul Bienkowski c02b40b0d3 Remove log 2022-07-28 22:05:31 +02:00
Paul Bienkowski e0cb36565a Add date-range filters to map 2022-07-28 22:05:31 +02:00
Paul Bienkowski 7716da8844 Add filter toggle for user-owned data to map UI 2022-07-28 22:05:31 +02:00
Paul Bienkowski 5beb5ac0d3 fix dynamic tile arguments and implement in both layers 2022-07-28 22:05:31 +02:00
Paul Bienkowski 598ba8d187 add dynamic tile arguments 2022-07-28 22:05:31 +02:00
Paul Bienkowski 24aaca654f Remove duplicate i18n text 2022-07-28 19:03:24 +02:00
Paul Bienkowski 373fab6e90 fix for sanic 22.6 2022-07-28 19:03:24 +02:00
dependabot[bot] 00f018c61c Update python-slugify requirement from ~=5.0.2 to ~=6.1.2 in /api
Updates the requirements on [python-slugify](https://github.com/un33k/python-slugify) to permit the latest version.
- [Release notes](https://github.com/un33k/python-slugify/releases)
- [Changelog](https://github.com/un33k/python-slugify/blob/master/CHANGELOG.md)
- [Commits](https://github.com/un33k/python-slugify/compare/5.0.2...v6.1.2)

---
updated-dependencies:
- dependency-name: python-slugify
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
dependabot[bot] 4a3681ec16 Update sanic requirement from ~=21.9.3 to ~=22.6.0 in /api
Updates the requirements on [sanic](https://github.com/sanic-org/sanic) to permit the latest version.
- [Release notes](https://github.com/sanic-org/sanic/releases)
- [Changelog](https://github.com/sanic-org/sanic/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/sanic-org/sanic/compare/v21.9.3...v22.6.0)

---
updated-dependencies:
- dependency-name: sanic
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
dependabot[bot] 854332a188 Update motor requirement from ~=2.5.1 to ~=3.0.0 in /api
Updates the requirements on [motor](https://github.com/mongodb/motor) to permit the latest version.
- [Release notes](https://github.com/mongodb/motor/releases)
- [Changelog](https://github.com/mongodb/motor/blob/master/doc/changelog.rst)
- [Commits](https://github.com/mongodb/motor/compare/2.5.1...3.0.0)

---
updated-dependencies:
- dependency-name: motor
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
dependabot[bot] 7ed2e8532e Update sqlalchemy[asyncio] requirement from ~=1.4.32 to ~=1.4.39 in /api
Updates the requirements on [sqlalchemy[asyncio]](https://github.com/sqlalchemy/sqlalchemy) to permit the latest version.
- [Release notes](https://github.com/sqlalchemy/sqlalchemy/releases)
- [Changelog](https://github.com/sqlalchemy/sqlalchemy/blob/main/CHANGES.rst)
- [Commits](https://github.com/sqlalchemy/sqlalchemy/commits)

---
updated-dependencies:
- dependency-name: sqlalchemy[asyncio]
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
dependabot[bot] 3b21459805 Update sanic requirement from ~=21.9.3 to ~=22.3.0 in /api
Updates the requirements on [sanic](https://github.com/sanic-org/sanic) to permit the latest version.
- [Release notes](https://github.com/sanic-org/sanic/releases)
- [Changelog](https://github.com/sanic-org/sanic/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/sanic-org/sanic/compare/v21.9.3...v22.3.0)

---
updated-dependencies:
- dependency-name: sanic
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
dependabot[bot] 10ced9d65e Update pyshp requirement from ~=2.2.0 to ~=2.3.1 in /api
Updates the requirements on [pyshp](https://github.com/GeospatialPython/pyshp) to permit the latest version.
- [Release notes](https://github.com/GeospatialPython/pyshp/releases)
- [Changelog](https://github.com/GeospatialPython/pyshp/blob/master/changelog.txt)
- [Commits](https://github.com/GeospatialPython/pyshp/compare/2.2.0...2.3.1)

---
updated-dependencies:
- dependency-name: pyshp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
dependabot[bot] a736984265 Bump terser from 5.10.0 to 5.14.2 in /frontend
Bumps [terser](https://github.com/terser/terser) from 5.10.0 to 5.14.2.
- [Release notes](https://github.com/terser/terser/releases)
- [Changelog](https://github.com/terser/terser/blob/master/CHANGELOG.md)
- [Commits](https://github.com/terser/terser/commits)

---
updated-dependencies:
- dependency-name: terser
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
dependabot[bot] e60c42990b Bump nanoid from 3.1.30 to 3.3.4 in /frontend
Bumps [nanoid](https://github.com/ai/nanoid) from 3.1.30 to 3.3.4.
- [Release notes](https://github.com/ai/nanoid/releases)
- [Changelog](https://github.com/ai/nanoid/blob/main/CHANGELOG.md)
- [Commits](https://github.com/ai/nanoid/compare/3.1.30...3.3.4)

---
updated-dependencies:
- dependency-name: nanoid
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
dependabot[bot] 3d1ac596b2 Bump follow-redirects from 1.14.5 to 1.15.1 in /frontend
Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.14.5 to 1.15.1.
- [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
- [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.14.5...v1.15.1)

---
updated-dependencies:
- dependency-name: follow-redirects
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
dependabot[bot] 4e31f21059 Bump async from 2.6.3 to 2.6.4 in /frontend
Bumps [async](https://github.com/caolan/async) from 2.6.3 to 2.6.4.
- [Release notes](https://github.com/caolan/async/releases)
- [Changelog](https://github.com/caolan/async/blob/v2.6.4/CHANGELOG.md)
- [Commits](https://github.com/caolan/async/compare/v2.6.3...v2.6.4)

---
updated-dependencies:
- dependency-name: async
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
dependabot[bot] d7a172b39c Bump node from 17 to 18
Bumps node from 17 to 18.

---
updated-dependencies:
- dependency-name: node
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-28 19:03:24 +02:00
Paul Bienkowski 36f1675577 deduplicate i18n for zone names 2022-07-28 13:56:18 +02:00
Paul Bienkowski 85e5e1ba65 Fix sidebar, use upperFirst to retain ß 2022-07-28 13:56:18 +02:00
Paul Bienkowski f158414f24 Translate TrackEditor 2022-07-28 13:56:18 +02:00
Paul Bienkowski 6f7c8d54f2 Translate TrackPage 2022-07-28 13:56:18 +02:00
Paul Bienkowski ab6cc6f6d0 Reorder Trackpage 2022-07-28 13:56:18 +02:00
Paul Bienkowski fe7d7ce274 Translate SettingsPage 2022-07-28 13:56:18 +02:00
Paul Bienkowski 76943fb1f0 Fix translation interpolation 2022-07-28 13:56:18 +02:00
Paul Bienkowski 248f8b4a6f Translate MapPage 2022-07-28 13:56:18 +02:00
Paul Bienkowski a977e2d1c3 Translate login and notfound 2022-07-28 13:56:18 +02:00
Paul Bienkowski 2cff606092 Translate UploadPage 2022-07-28 13:56:18 +02:00
Paul Bienkowski a85379418e Translate ExportPage 2022-07-28 13:56:18 +02:00
Paul Bienkowski 1533fdc450 Translate TracksPage 2022-07-28 13:56:18 +02:00
Paul Bienkowski f1f40a254a HomePage: import missing type 2022-07-28 13:56:18 +02:00
Paul Bienkowski 682b41f2a4 Remove unneeded style file 2022-07-28 13:56:18 +02:00
Paul Bienkowski c020839b31 Translate HomePage, Stats 2022-07-28 13:56:18 +02:00
Paul Bienkowski 31fac13f8a Translate App, LoginButton 2022-07-28 13:56:18 +02:00
Paul Bienkowski 5f3ac69f60 Add react-i18next integration 2022-07-28 13:56:18 +02:00
Paul Bienkowski ed9ed68d83 install eslint 2022-07-26 08:10:03 +02:00
gluap 2755d6b2b5
Merge pull request #242 from openbikesensor/rural_urban
Implement difference between urban and rural for events and roads
2022-07-26 07:58:50 +02:00
Paul Bienkowski 617011c528 make headers in sidebar pretty, and remove colors 2022-07-25 22:24:41 +02:00
Paul Bienkowski 76b1b41b4b derive logic for "include roads without data" from selected attribute 2022-07-25 22:24:31 +02:00
gluap 1ad5fe562e Merge branch 'lost_work_rural_urban' into rural_urban 2022-07-25 18:13:10 +02:00
gluap a11a3c4b8c make color change active 2022-07-25 18:02:09 +02:00
Paul Bienkowski a3d548cd4b fixup for e266a4f40a, print exceptions again 2022-07-22 13:04:51 +02:00
gluap eda0fe29b2 it is called "portal.example.com" everywhere else 2022-07-08 19:10:46 +02:00
Paul Bienkowski a0852fdc41 Fix ExportPage bounding box input 2022-06-26 19:12:28 +02:00
Paul Bienkowski e266a4f40a Remove some error logs for canceled requests (as the map page tends to do that quite a lot) 2022-06-26 13:38:54 +02:00
Paul Bienkowski 0cbf03cd56 Remove useless session creation (possibly #192) 2022-06-26 13:38:26 +02:00
Paul Bienkowski 4907f038da Make raw track not look like a river (fixes #252) 2022-06-26 13:02:01 +02:00
gluap 8bc83a5f18
Update TrackEditor.tsx 2022-06-25 20:55:50 +02:00
gluap afc801aefc save current state. 2022-06-24 18:52:59 +02:00
gluap 225a238e77 more lua ifery to detect rural streets 2022-06-24 18:51:52 +02:00
gluap 5e8830cc15 fix #245 2022-05-21 21:55:24 +02:00
gluap f70f4d5716
Merge pull request #246 from openbikesensor/fix-206-overtaking-events-not-deleted
Fix #206 - Overtaking events are now deleted when the parent track is deleted. I tested this on a background instance, and deleting a track via the web interface does correctly delete the corresponding events.
2022-05-21 21:20:08 +02:00
gluap f36e38b10b some more lua logic to guess rurality. 2022-05-21 21:09:54 +02:00
Dennis Boldt ad5a0bfbf6 Fix #206 - Overtaking events are deleted now, when the parent track is deleted 2022-05-21 14:21:58 +02:00
gluap 8ba5d8e3ad Take the zone from the road, not the event, as suggested by opatut. 2022-05-05 23:16:38 +02:00
gluap 66dd84982c Implement difference between urban and rural for events and road segments. 2022-05-02 22:00:17 +02:00
Paul Bienkowski 8728347695 frontend: Render histogram as chart in road details panel 2022-04-24 11:30:27 +02:00
Paul Bienkowski cb6c94f7a5 frontend: Flip table in road details panel and make it easier to read 2022-04-24 11:30:27 +02:00
Paul Bienkowski 04bf99b7cb api: Add histogram details to /mapdetails/road endpoint 2022-04-24 11:30:27 +02:00
Paul Bienkowski 2fd664f79a frontend: Add echarts dependencies 2022-04-24 11:30:27 +02:00
gluap 2e50e0c59c Release: 0.6.2 2022-04-12 23:27:49 +02:00
gluap 62528a04da
Update CHANGELOG.md 2022-04-12 23:25:15 +02:00
gluap 96d157b226
Merge pull request #232 from openbikesensor/fix-traversal
do not serve files from outside the frontend dir.
2022-04-12 23:19:59 +02:00
gluap c61157aca3 do not serve files from outside the frontend dir. 2022-04-12 20:11:34 +02:00
Paul Bienkowski f229ab4112 Release: 0.6.1 2022-04-03 16:14:05 +02:00
Julian Golderer 4417263019 Remove transformation in mapdetails_road of Road.geometry 2022-04-03 16:11:28 +02:00
Paul Bienkowski e5b48f8ffd Release: 0.6.0 2022-04-03 16:08:54 +02:00
Paul Bienkowski 4d0002e6d8 Create one-in-all upgrade script (fixes #220) 2022-04-03 16:06:12 +02:00
Paul Bienkowski 959cb7d2b7 Add migrations for the whole schema 2022-04-03 16:06:12 +02:00
gluap 850b907995 reorder steps, add remark about rebuilding image (as it was explicit in 0.4.0) as well as stopping the running services before upgrading DB schema. 2022-03-31 08:41:45 +02:00
Paul Bienkowski f0f804ae76 Release: 0.5.1 2022-03-21 23:27:59 +01:00
Paul Bienkowski d7d00ac3fd Add migration content to docker image 2022-03-21 23:15:33 +01:00
Paul Bienkowski 6126e2273b Changelog for 0.5.0 2022-03-21 22:40:22 +01:00
Paul Bienkowski 388539fd71 Include road_usage in backups, please 2022-03-21 22:37:02 +01:00
Paul Bienkowski b1071a34d3 Fix Dockerfile clone URL 2022-03-21 22:36:53 +01:00
Paul Bienkowski 36d6bb026c Release: 0.5.0 2022-03-21 22:28:12 +01:00
Paul Bienkowski 157b970b29 Documentation for migrations 2022-03-21 22:26:07 +01:00
Paul Bienkowski 76270d199e Fix initial database creation schema 2022-03-21 22:21:45 +01:00
Paul Bienkowski 1c52ce7de9 Use discrete colors for distances, with greens only above 1.5m 2022-03-21 22:20:08 +01:00
Paul Bienkowski 6893d7b56f Use viridis colormap for roads' count layers 2022-03-16 22:30:15 +01:00
Paul Bienkowski 36fd8c492c Fix split roads for usage_count 2022-03-16 22:20:15 +01:00
Paul Bienkowski 6ef233a2a2 Show usage count on map 2022-03-16 22:06:06 +01:00
Paul Bienkowski 85fcdea403 Add usage_count to map layer obs_roads 2022-03-16 22:05:58 +01:00
Paul Bienkowski 9a1c412597 Import used roads into database when processing track 2022-03-16 22:05:39 +01:00
Paul Bienkowski 5a5948b653 Add road_usage table 2022-03-16 21:01:36 +01:00
Paul Bienkowski f3a1ca4165 Add alembic setup for migrating 2022-03-16 21:01:20 +01:00
Paul Bienkowski 34660b266c Project the whole track to the map, and show both versions 2022-03-16 20:16:06 +01:00
Paul Bienkowski cb837ef5f2 Build osm2pgsql with -j4 2022-03-14 19:05:55 +01:00
gluap e09c257995 first shot at proper login. 2022-03-10 20:59:16 +01:00
gluap 0c43e49bb4 do not convert "count" into km/h while converting speeds. 2022-03-08 22:21:37 +01:00
gluap 51f75fcf61 convert units to km/h in frontend
people struggle with SI units in that regard.
2022-03-08 21:15:04 +01:00
dependabot[bot] 49f7827b51 Update sqlalchemy[asyncio] requirement from ~=1.4.31 to ~=1.4.32 in /api
Updates the requirements on [sqlalchemy[asyncio]](https://github.com/sqlalchemy/sqlalchemy) to permit the latest version.
- [Release notes](https://github.com/sqlalchemy/sqlalchemy/releases)
- [Changelog](https://github.com/sqlalchemy/sqlalchemy/blob/main/CHANGES)
- [Commits](https://github.com/sqlalchemy/sqlalchemy/commits)

---
updated-dependencies:
- dependency-name: sqlalchemy[asyncio]
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-03-08 19:10:08 +01:00
Paul Bienkowski 8f2861a8c9 Add dynamic titles to all pages via react-helmet (related to #148) 2022-03-04 13:07:56 +01:00
Paul Bienkowski 40d23c537e Compile static title into index.html (fixes #148) 2022-03-04 13:07:34 +01:00
Paul Bienkowski a013dae3fe Remove map from home page, it was empty anyway (fixes #120) 2022-03-04 12:40:00 +01:00
Paul Bienkowski bc17c72fdb Fix logout (fixes #146) 2022-03-04 12:39:29 +01:00
Paul Bienkowski f5be2b20f8 Fix footer version link to github tag 2022-03-04 12:34:39 +01:00
lumbric 835aeeb483 Fix keycloak URI in example config
In the example, login.example.com is used as the subdomain for the keycloak instance.
2022-03-03 15:39:28 +01:00
lumbric fd06baeeb5 Update example config.py
This commit syncs config.py with api/config.py.example. Some new
parameters were missing, causing AttributeErrors during deployment.
2022-03-03 15:38:34 +01:00
Paul Bienkowski 509e784521 Release: 0.4.2 2022-03-03 08:17:43 +01:00
Paul Bienkowski abb935694e Fix export route, it should be a child of /api/ 2022-03-03 08:16:59 +01:00
Paul Bienkowski 38e14c0084 Release: 0.4.1 2022-03-02 19:48:22 +01:00
Paul Bienkowski b72499b29e Update changelog for GPX export feature 2022-03-02 19:48:12 +01:00
Paul Bienkowski 2a9e3549b5 Fix track download dropdown appearing for non-authors, and update tooltip 2022-03-02 19:48:02 +01:00
Paul Bienkowski 741ff0d488 Upgrade guide for 0.4.1 2022-03-02 19:41:06 +01:00
Paul Bienkowski 21055e669a Document how to connect to postgresql 2022-03-02 19:40:14 +01:00
Paul Bienkowski 82f20e6354 Show proper error messages when track download fails 2022-03-02 19:25:40 +01:00
Paul Bienkowski af3e9574e4 Add better note on top of export page indicating experimental status 2022-03-02 19:15:35 +01:00
Paul Bienkowski 7e33fb6424 Prevent JS error on track page when bounding box is invalid 2022-03-02 19:06:40 +01:00
Paul Bienkowski 70fa1a41c4 Provide GPX download from track page 2022-02-25 11:53:28 +01:00
Paul Bienkowski a71dadfc7f Export GPX track while processing 2022-02-25 11:53:13 +01:00
Paul Bienkowski 600457fe19 Fix downloaded filename 2022-02-25 11:52:56 +01:00
Paul Bienkowski a884ac88d8 Add exports page 2022-02-22 17:52:13 +01:00
Paul Bienkowski 8135d4ed51 Release: 0.4.0 2022-02-22 15:56:27 +01:00
Paul Bienkowski ba887e2208 Write changelog for 0.4.0 2022-02-22 15:56:13 +01:00
Paul Bienkowski ec7a4506f9 Document lean mode 2022-02-19 09:27:08 +01:00
Paul Bienkowski 5a7900d269 Lean mode default should be off 2022-02-19 09:27:08 +01:00
Paul Bienkowski bdc68e950e Do not mount /tiles and /mapdetails routes in lean mode 2022-02-19 09:27:08 +01:00
Paul Bienkowski 3ef6dcf5d9 Add lean mode (overpass source and no map view in frontend) 2022-02-19 09:27:08 +01:00
Paul Bienkowski d10b91804c Update readme to mention config.overrides.py 2022-02-18 18:36:31 +01:00
Paul Bienkowski 01bde30d0c Add config.overrides.py for development config 2022-02-18 18:34:02 +01:00
Paul Bienkowski 71a04b1611 Fix track comment route 2022-02-18 18:24:28 +01:00
Paul Bienkowski 7fc9558e42 Use custom get_single_arg everywhere, remove sanicargs (fixes #193) 2022-02-18 12:15:08 +01:00
Paul Bienkowski 8bb5d71186 Extract utility functions 2022-02-18 12:05:04 +01:00
Dennis Boldt 412349cf4f Improve the README 2022-02-18 11:46:00 +01:00
gluap 1735f44769 Port 3000 exposed for consistency
Noticed I can do this myself -- disregard my comment about this fact.
2022-02-18 11:46:00 +01:00
Dennis Boldt e82f2c9a0e Use environment variables to configure the portal
- Added an example .env file, where all variables start with `OBS_`
- `OBS_` variables are handled by the portal as configuration variables
- Uncomment some variables in the config.py, since the config.py overrides the env-vars
- Use env-vars and env-file in the docker-compose.yaml
- Add the worker to the docker-compose.yaml
- Add KeyCloak with its own postgres to the docker-compose.yaml
2022-02-18 11:46:00 +01:00
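A minimal sketch of the prefix convention this commit describes (the variable name below is hypothetical; only the `OBS_` prefix is from the commit):

```python
import os

# Collect every OBS_-prefixed environment variable as a config entry,
# e.g. a hypothetical OBS_POSTGRES_URL becomes the key POSTGRES_URL.
config = {
    key[len("OBS_"):]: value
    for key, value in os.environ.items()
    if key.startswith("OBS_")
}
```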
Dennis Boldt 0f816e1680 Fix env-var names in osm2pgsql.sh 2022-02-18 11:46:00 +01:00
Dennis Boldt e4e9f921b6 Fix isPath 2022-02-18 11:46:00 +01:00
Dennis Boldt 9df2914b86 Allow env-vars starting with OBS_ to configure the portal 2022-02-18 11:46:00 +01:00
Dennis Boldt af174bc930 osm2pgsql can use a connection string
See option -d at https://osm2pgsql.org/doc/man/latest.html
2022-02-18 11:46:00 +01:00
Dennis Boldt 12224db3b9 Use osm2pgsql.sh to import osm data into the database with the portal image
- Added POSTGRES env-vars to the portal (only handled by osm2pgsql.sh until now)
- Added ./data/pbf:/pbf as host volume mount to the portal
- Added osm2pgsql.sh, which uses the env-vars and pbf-mount to import the osm data into the database
2022-02-18 11:46:00 +01:00
Dennis Boldt 85911a2c97 Add osm2pgsql to the portal container
- Build osm2pgsql in a separate worker
- Fix EXPOSE port to 3000
- Add postgres to the backend network
2022-02-18 11:46:00 +01:00
gluap b43f7a2ebb retry: should only modify client on success. 2022-02-18 11:21:04 +01:00
gluap 4505ddd0ee add retry to keycloak connection 2022-02-18 11:21:04 +01:00
gluap b8ab7da1a9 This allows us to recover from the condition with the broken connections. 2022-02-18 11:19:43 +01:00
gluap 5ac2900e63 make pool_size and overflow configurable for worker and portal 2022-02-18 11:19:43 +01:00
gluap 6a34eaf819 Make setup.py consistent with requirements.txt
pyyaml, sqlalchemy and asyncpg were missing. I wasn't able to identify where we use pyyaml, but left it in to not accidentally break things.
2022-02-18 11:15:00 +01:00
Paul Bienkowski 0d49945018 Fix logging and use coloredlogs for nicer output 2022-02-18 11:15:00 +01:00
Paul Bienkowski 96642d2255 Fix format strings 2022-02-18 11:15:00 +01:00
Paul Bienkowski b66784f1ed Fix path to roads_import.lua in docs 2022-02-18 11:15:00 +01:00
gluap 15aaf06168
ignore submodules - it's not branch-aware 2022-02-11 17:09:37 +01:00
gluap 41e7fb001c
Merge pull request #194 from openbikesensor/bump_versions_2022-01-14
bump those versions that we can bump
2022-02-09 21:15:00 +01:00
gluap 530c604623 bump those versions that we can bump 2022-02-08 22:51:15 +01:00
gluap 8a4fbf954c
Remove frontend checking
It's noisy and buildtime-only.
2022-02-08 21:59:02 +01:00
Paul Bienkowski 2ce0338f38 Refactor routes/exports.py 2022-01-19 20:39:05 +01:00
Paul Bienkowski 3da467800d Remove broken duplicate route 2022-01-19 20:36:03 +01:00
Paul Bienkowski 0c256d8923 Add export routes 2022-01-19 09:11:54 +01:00
Andreas Mandel 1c09725ff1
Merge pull request #158 from openbikesensor/feature/enable-dependabot-for-update-prs
Enable dependabot for update pull requests
2022-01-18 19:49:29 +01:00
140 changed files with 12544 additions and 1922 deletions

```diff
@@ -0,0 +1,20 @@
+name: Build docker image
+on: [push]
+
+jobs:
+  build-image:
+    runs-on: ubuntu-latest
+    container:
+      image: catthehacker/ubuntu:act-latest
+    steps:
+      - name: Login to Forgejo docker registry
+        uses: docker/login-action@v3.0.0
+        with:
+          registry: git.pub.solar
+          username: hakkonaut
+          password: ${{ secrets.GIT_AUTH_TOKEN }}
+      - name: Build and push
+        uses: docker/build-push-action@v5.1.0
+        with:
+          push: true
+          tags: git.pub.solar/pub-solar/obs-portal:latest
```

.github/dependabot.yml

```diff
@@ -1,23 +0,0 @@
-# see also https://docs.github.com/en/code-security/supply-chain-security/keeping-your-dependencies-updated-automatically
-version: 2
-updates:
-  - package-ecosystem: "docker"
-    directory: "/"
-    schedule:
-      interval: "daily"
-
-  - package-ecosystem: "gitsubmodule"
-    directory: "/"
-    schedule:
-      interval: "daily"
-
-  - package-ecosystem: "npm"
-    directory: "/frontend"
-    schedule:
-      interval: "daily"
-
-  - package-ecosystem: "pip"
-    directory: "/api"
-    schedule:
-      interval: "daily"
```

CHANGELOG.md

```diff
@@ -1,5 +1,172 @@
 # Changelog
 
+## 0.8.1
+
+### Improvements
+
+* The zone (urban/rural) is now also exported with the events GeoJSON export.
+
+### Bug Fixes
+
+* Update to a current version of gpstime (Python dependency) to fix portal startup.
+
+## 0.8.0
+
+### Features
+
+* Bulk actions on users' owned tracks (reprocess, download, make private, make public, delete) (#269, #38)
+* Easy sorting by device for "multi-device users" (e.g. a group lending out OBSes)
+* Region display at higher zoom levels to easily find interesting areas (#112)
+* Export of road statistics on top of the already-existing event statistics (#341)
+
+### Improvements
+
+* Refactored database access to hopefully combat portal crashes (#337)
+* New infrastructure for map imports that makes import of larger maps possible on small VMs (#334)
+* Reference current postgres and postgis versions in docker-compose.yaml files (#286)
+* Configurable terms-and-conditions link (#320)
+* French translation by @cbiteau (#303)
+
+### Bug Fixes
+
+* Logout not working (#285)
+* Duplicate road usage hashes (#335, #253)
+* "cannot import name ..." (#338)
+
+## 0.7.0
+
+### Features
+
+* Add histogram of overtaking distances in road details panel
+* Flip table in road details panel and make it easier to read
+* Implement difference between urban and rural for events and road segments
+* Better road zone detection in import
+* Make the frontend translatable and add German translation
+* Add time and user filters to map view (for logged-in users only)
+
+### Improvements
+
+* Make raw track not look like a river (#252)
+* Update many dependencies
+
+### Bug fixes
+
+* Overtaking events are now deleted when the parent track is deleted (#206)
+* Remove useless session creation (#192)
+* Remove some error logs for canceled requests (as the map page tends to do that quite a lot)
+* Fix ExportPage bounding box input
+
+## 0.6.2
+
+### Improvements
+
+* Prevent directory traversals inside the container on the python-served frontend.
+
+## 0.6.1
+
+### Improvements
+
+* Make road details request (clicking on a road segment in the map) way faster
+  by using PostGIS geometry index correctly (#226).
+
+## 0.6.0
+
+Starting in this version, the database schema is created through migrations
+instead of using the `reset_database.py` script. This means that for both the
+initial setup, as well as for upgrades, only the migrations have to be run.
+
+After updating and migrating, it is good practice to regenerate the SQL tile
+functions (`api/tools/prepare_sql_tiles.py`) as well. It doesn't matter if you
+do this when it is not required, so we've written a simple all-in-one update
+script that you can run to do all upgrade tasks. This is now in
+`api/tools/upgrade.py`.
+
+Please check [`UPGRADING.md`](./UPGRADING.md) for more details if you're
+upgrading an existing installation. It contains an important note for this
+upgrade in particular.
+
+## 0.5.1
+
+Maintenance release, only includes build, deployment and documentation changes.
+
+## 0.5.0
+
+### Features
+
+* Use discrete colors for distances, with greens only above 1.5m
+* Use viridis colormap for roads' count layers
+* Generate usage count information (how often has a road been traveled)
+* Project the whole track to the map, and show both versions
+* Log out of OpenID server when logging out of application
+* Convert speed units to km/h in frontend
+* Pages now have titles (#148)
+* Remove map from home page, it was empty anyway (#120)
+
+### Internal
+
+* Add alembic setup for migrating
+* Build osm2pgsql with -j4
+* Update sqlalchemy[asyncio] requirement from ~=1.4.31 to ~=1.4.32 in /api
+
+## 0.4.2
+
+### Bugfixes
+
+* Fix export route, it should be a child of /api
+
+## 0.4.1
+
+### Features
+
+* Add page for exporting data through web frontend
+* Generate GPX track file when importing a track
+* Add GPX track export button on the track page (accessible for anybody who can
+  see the track)
+
+## 0.4.0
+
+### Improvements
+
+* Retry OpenID Connect connection if it fails on boot
+* Format log outputs with color and improve access log
+* Make pool_size and overflow configurable for worker and portal
+* Add a route for exporting events as GeoJSON/Shapefile
+* Point footer to forum, not slack (fixes #140)
+* Improve wording on profile page ("My" instead of "Your")
+* Show "My tracks" directly in main menu (fixes #136)
+
+### Bugfixes
+
+* Make sure the API can recover from the broken postgresql connection state
+* Remove duplicate events from the same track
+* Fix direction of road segments (fixes #142)
+* Solve a few problems with the colormap scales in the map view
+
+### Docs & deployment
+
+* Greatly improve deployment docs for a simple follow-along routine
+* Use environment variables (`OBS_*`) for configuration
+* Fix port numbers in example files and expose 3000 in the image
+* Add `LEAN_MODE` configuration to disable `road` database table usage and fall
+  back to Overpass API for processing tracks (see
+  [docs/lean-mode.md](docs/lean-mode.md)).
+* Read `config.overrides.py` file if it exists
+* Add osm2pgsql to portal image to be able to import OSM data from within the
+  container
+* Fix path to roads_import.lua in docs
+* Explain to use the portal service, instead of api, in production
+* Use entrypoint instead of command, so you can run one-off tasks like process_track.py
+
+### Internals
+
+* Use custom `get_single_arg` everywhere, remove sanicargs (fixes #193)
+* Update requirements and make them consistent
+* Fix error handling, especially for file uploads
+
 ## 0.3.4
 
 ### Features
```

Dockerfile

```diff
@@ -4,7 +4,7 @@
 # Build the frontend AS builder
 #############################################
 
-FROM node:17 as frontend-builder
+FROM node:18 as frontend-builder
 
 WORKDIR /opt/obs/frontend
 ADD frontend/package.json frontend/package-lock.json /opt/obs/frontend/
@@ -21,7 +21,21 @@ RUN npm run build
 # Build the API and add the built frontend to it
 #############################################
 
-FROM python:3.9.7-bullseye
+FROM python:3.11.3-bullseye
+
+RUN apt-get update &&\
+    apt-get install -y \
+        libboost-dev \
+        libboost-system-dev \
+        libboost-filesystem-dev \
+        libexpat1-dev \
+        zlib1g-dev \
+        libbz2-dev \
+        libpq-dev \
+        libproj-dev \
+        lua5.3 \
+        liblua5.3-dev &&\
+    rm -rf /var/lib/apt/lists/*
 
 WORKDIR /opt/obs/api
@@ -34,13 +48,14 @@ ADD api/scripts /opt/obs/scripts
 RUN pip install -e /opt/obs/scripts
 
 ADD api/setup.py /opt/obs/api/
+ADD api/alembic.ini /opt/obs/api/
+ADD api/migrations /opt/obs/api/migrations/
 ADD api/obs /opt/obs/api/obs/
 ADD api/tools /opt/obs/api/tools/
 RUN pip install -e /opt/obs/api/
 
 COPY --from=frontend-builder /opt/obs/frontend/build /opt/obs/frontend/build
 
-EXPOSE 8000
+EXPOSE 3000
 
 CMD ["openbikesensor-api"]
```

README.md

````diff
@@ -36,10 +36,11 @@ git submodule update --init --recursive
 
 ## Production setup
 
-There is a guide for a deployment based on docker in the
-[deployment](deployment) folder. Lots of non-docker deployment strategy are
-possible, but they are not "officially" supported, so please do not expect the
-authors of the software to assist in troubleshooting.
+There is a guide for a deployment based on docker at
+[docs/production-deployment.md](docs/production-deployment.md). Lots of
+non-docker deployment strategies are possible, but they are not "officially"
+supported, so please do not expect the authors of the software to assist in
+troubleshooting.
 
 This is a rather complex application, and it is expected that you know the
 basics of deploying a modern web application securely onto a production server.
@@ -52,19 +53,29 @@ Please note that you will always need to install your own reverse proxy that
 terminates TLS for you and handles certificates. We do not support TLS directly
 in the application, instead, please use this preferred method.
 
+Upgrading and migrating is described in [UPGRADING.md](./UPGRADING.md) for each
+version.
+
 ### Migrating (Production)
 
-Migrations are not implemented yet. Once we need them, we'll add them and
-document the usage here.
+Migrations are done with
+[Alembic](https://alembic.sqlalchemy.org/en/latest/index.html), please refer to
+its documentation for help. Most of the time, running this command will do all
+the migrations you need:
 
-### Upgrading from v0.2 to v0.3
+```bash
+docker-compose run --rm api tools/upgrade.py
+```
 
-After v0.2 we switched the underlying technology of the API and the database.
-We now have no more MongoDB, instead, everything has moved to the PostgreSQL
-installation. For development setups, it is advised to just reset the whole
-state (remove the `local` folder) and start fresh. For production upgrades,
-please follow the relevant section in [`UPGRADING.md`](./UPGRADING.md).
+This command is equivalent to running migrations through *alembic*, then
+regenerating the SQL functions that compute vector tiles directly in the
+database:
+
+```bash
+# equivalent to the above command, you don't usually run these
+docker-compose run --rm api alembic upgrade head
+docker-compose run --rm api tools/prepare_sql_tiles
+```
 
 ## Development setup
@@ -80,7 +91,6 @@ Then clone the repository as described above.
 
 ### Configure Keycloak
 
-
 Login will not be possible until you configure the keycloak realm correctly. Boot your keycloak instance:
 
 ```bash
@@ -97,8 +107,15 @@ Now navigate to http://localhost:3003/ and follow these steps:
 - In the Tab *Settings*, edit the new client's *Access Type* to *confidential*
   and enter as *Valid Redirect URIs*: `http://localhost:3000/login/redirect`,
   then *Save*
-- Under *Credentials*, copy the *Secret* and paste it into `api/config.dev.py`
-  as `KEYCLOAK_CLIENT_SECRET`. Please do not commit this change to git.
+- Under *Credentials*, copy the *Secret*. Create a file at `api/config.overrides.py` with the secret in it:
+
+  ```python
+  KEYCLOAK_CLIENT_SECRET="your secret here"
+  ```
+
+  You can use this file in development mode to change settings without editing
+  the git-controlled default file at `api/config.dev.py`. Options in this file
+  take precedence.
 - In the sidebar, navigate to *Manage* &rarr; *Users*, and click *Add user* on the top right.
 - Give the user a name (e.g. `test`), leave the rest as-is.
 - Under the tab *Credentials*, choose a new password, and make it
@@ -126,23 +143,17 @@ If you don't wait long enough, the following commands might fail. In this case,
 you can always stop the container, remove the data directory (`local/postgres`)
 and restart the process.
 
-Next, initialize an empty database, which applies the database schema for the
-application:
+Next, run the upgrade command to generate the database schema:
 
 ```bash
-docker-compose run --rm api tools/reset_database.py
+docker-compose run --rm api tools/upgrade.py
 ```
 
-To be able serve dynamic vector tiles from the API, run the following command once:
+You will need to re-run this command after updates, to migrate the database and
+(re-)create the functions in the SQL database that are used when generating
+vector tiles.
 
-```bash
-docker-compose run --rm api tools/prepare_sql_tiles.py
-```
-
-You might need to re-run this command after updates, to (re-)create the
-functions in the SQL database that are used when generating vector tiles.
-
-You should also import OpenStreetMap data now, see below for instructions.
+You should also [import OpenStreetMap data](docs/osm-import.md) now.
 
 ### Boot the application
@@ -158,50 +169,16 @@ testing.
 
 ### Migrating (Development)
 
-Migrations are not implemented yet. Once we need them, we'll add them and
-document the usage here.
+Migrations are done with
+[Alembic](https://alembic.sqlalchemy.org/en/latest/index.html), please refer to
+its documentation for help. Most of the time, running this command will do all
+the migrations you need:
+
+```bash
+docker-compose run --rm api alembic upgrade head
+```
 
-## Import OpenStreetMap data
-
-You need to import road information from OpenStreetMap for the portal to work.
-This information is stored in your PostgreSQL database and used when processing
-tracks (instead of querying the Overpass API), as well as for vector tile
-generation. The process applies to both development and production setups. For
-development, you should choose a small area for testing, such as your local
-county or city, to keep the amount of data small. For production use you have
-to import the whole region you are serving.
-
-* Install `osm2pgsql`.
-* Download the area(s) you would like to import from [GeoFabrik](https://download.geofabrik.de).
-* Import each file like this:
-
-  ```bash
-  osm2pgsql --create --hstore --style api/roads_import.lua -O flex \
-    -H localhost -d obs -U obs \
-    path/to/downloaded/myarea-latest.osm.pbf
-  ```
-
-You might need to adjust the host, database and username (`-H`, `-d`, `-U`) to
-your setup, and also provide the correct password when queried. For the
-development setup the password is `obs`. For production, you might need to
-expose the containers port and/or create a TCP tunnel, for example with SSH,
-such that you can run the import from your local host and write to the remote
-database.
-
-The import process should take a few seconds to minutes, depending on the area
-size. A whole country might even take one or more hours. You should probably
-not try to import `planet.osm.pbf`.
-
-You can run the process multiple times, with the same or different area files,
-to import or update the data. However, for this to work, the actual [command
-line arguments](https://osm2pgsql.org/doc/manual.html#running-osm2pgsql) are a
-bit different each time, including when first importing, and the disk space
-required is much higher.
-
-Refer to the documentation of `osm2pgsql` for assistance. We are using "flex
-mode", the provided script `api/roads_import.lua` describes the transformations
-and extractions to perform on the original data.
 
 ## Troubleshooting
@@ -209,6 +186,17 @@ If any step of the instructions does not work for you, please open an issue and
 describe the problem you're having, as it is important to us that onboarding is
 super easy :)
 
+### Connecting to the PostgreSQL database
+
+If you need to connect to your development PostgreSQL database, you should
+install `psql` locally. The port 5432 is already forwarded, so you can connect with:
+
+```
+psql -h localhost -U obs -d obs
+```
+
+The password is `obs` as well.
+
 ## License
 
 Copyright (C) 2020-2021 OpenBikeSensor Contributors
````

View file

@ -1,11 +1,123 @@
# Upgrading # Upgrading
This document describes the general steps to upgrade between major changes. This document describes the general steps to upgrade between major changes.
Simple migrations, e.g. for adding schema changes, are not documented Simple migrations, e.g. for adding schema changes, are not documented
explicitly. Once we implement them, their usage will be described in the explicitly. Their general usage is described in the [README](./README.md) (for
[README](./README.md). development) and [docs/production-deployment.md](docs/production-deployment.md) (for production).
## 0.8.1
- Get the release in your source folder (``git pull; git checkout 0.8.0`` and update submodules ``git submodule update --recursive``)
- Rebuild images ``docker-compose build``
- No database upgrade is required, but tile functions need an update:
```bash
docker-compose run --rm portal tools/prepare_sql_tiles.py
```
- Start your portal and worker services. ``docker-compose up -d worker portal``
## 0.8.0
Upgrade to `0.7.x` first. See below for details. Then follow these steps:
> **Warning** The update includes a reprocessing of tracks after import. Depending on the number of tracks this can take a few hours. The portal is reachable during that time but events disappear and incrementally reappear during reimport.
> **Info** With this version the import process for OpenStreetMap data has changed: the [new process](docs/osm-import.md) is easier on resources and finally permits to import a full country on a low-end VM.
- Do your [usual backup](docs/production-deployment.md)
- get the release in your source folder (``git pull; git checkout 0.8.0`` and update submodules ``git submodule update --recursive``)
- Rebuild images ``docker-compose build``
- Stop your portal and worker services ``docker-compose stop worker portal``
- run upgrade
```bash
docker-compose run --rm portal tools/upgrade.py
```
this automatically does the following
- Migration of database schema using alembic.
- Upgrade of SQL tile schema to new schema.
- Import the nuts-regions from the web into the database.
- Trigger a re-import of all tracks.
- Start your portal and worker services. ``docker-compose up -d worker portal``
## 0.7.0
Upgrade to `0.6.x` first. See below for details. Then follow these steps:
- Rebuild images
- Stop your portal and worker services.
- **Migration with alembic**: required
- **Prepare SQL Tiles**: required
- Start your portal and worker services.
- **Reimport tracks**: no action required
- **OSM Import**: required
- **Config changes**: add `POSTGRES_MAX_OVERFLOW` and `POSTGRES_POOL_SIZE`
variables, see `api/config.py.example`
## 0.6.0
**Make sure to upgrade to `0.5.1` first, by checking out that version tag and
running migrations, then coming back to this version.** This is required
because the migrations have been edited to create the initial database schema,
but if you run the 0.5.1 migrations first, your database will remember that it
already has all the tables created. This is not required if you set up a new
installation.
For this update, run these steps:
- Build new images
- Stop portal and worker services
- Run the new upgrade tool:
```bash
docker-compose run --rm portal tools/upgrade.py
```
- Start portal and worker services
## 0.5.0
The upgrade requires the following steps in the given order
- Rebuild images
- Stop your portal and worker services.
- **Migration with alembic**: required
- **Prepare SQL Tiles**: required
- Start your portal and worker services.
- **Reimport tracks**: required
- **OSM Import**: no action required
- **Config changes**: none
## 0.4.1
You can, but do not have to, reimport all tracks. This will generate a GPX file
for each track and allow the users to download those. If a GPX file has not yet
been created, the download will fail. To reimport all tracks, log in to your
PostgreSQL database (instructions are in [README.md](./README.md) for
development and [docs/production-deployment.md](./docs/production-deployment.md) for production)
and run:
```sql
UPDATE track SET processing_status = 'queued';
```
You can do this selectively with `WHERE` statements.
Make sure your worker is running to process the queue.
## 0.4.0
* Rebuild your image; this may take longer than usual, as it will compile
  `osm2pgsql` for you. Next time, it should be in your docker build cache and
  be fast again.
* Add the new config flags `VERBOSE`, `LEAN_MODE`, `POSTGRES_POOL_SIZE` and
  `POSTGRES_MAX_OVERFLOW`. Check the example config for sane default values
  (see the sketch after this list).
* Run `tools/prepare_sql_tiles.py` again (see README).
* Importing OSM data has been made easier: check
  [docs/production-deployment.md](./docs/production-deployment.md) for the sections "Download
  OpenStreetMap maps" and "Import OpenStreetMap data". You can now download
  multiple .pbf files and then import them at once, using the docker image
  built with the `Dockerfile`. Alternatively, you can choose to enable [lean
  mode](docs/lean-mode.md). You do not need to reimport data, but setting this
  up now will make your life easier in the long run ;)
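A sketch of the new flags as they might appear in your config (the `LEAN_MODE` default shown here is an assumption; check `api/config.py.example` and [docs/lean-mode.md](docs/lean-mode.md)):

```python
VERBOSE = False  # extended log output, but slower
LEAN_MODE = False  # assumption: off unless you opt in
POSTGRES_POOL_SIZE = 20
POSTGRES_MAX_OVERFLOW = 2 * POSTGRES_POOL_SIZE
```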
## v0.2 to v0.3 (MongoDB to PostgreSQL)

* Shut down all services

@@ -54,5 +166,5 @@ explicitly. Once we implement them, their usage will be described in the

  `export/users.json` into your realm, it will re-add all the users from the
  old installation. You should delete the file and `export/` folder afterwards.
* Start `portal`.
-* Consider configuring a worker service. See [deployment/README.md](deployment/README.md).
* Consider configuring a worker service. See [docs/production-deployment.md](./docs/production-deployment.md).

api/.gitignore (vendored)

@@ -43,3 +43,5 @@ local/
# both, because then developers will only update one of them and they'll
# contradict. For now, npm shall be the canonical default (compare README.md).
yarn.lock
config.overrides.py

@@ -1,4 +1,4 @@
-FROM python:3.9.7-bullseye
FROM python:3.11.3-bullseye
WORKDIR /opt/obs/api

api/alembic.ini (new file)

@@ -0,0 +1,102 @@
# A generic, single database configuration.
[alembic]
# path to migration scripts
script_location = migrations
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s
# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .
# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =
# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; This defaults
# to api/migrations/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:api/migrations/versions
# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os # Use os.pathsep. Default configuration used for new projects.
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
sqlalchemy.url = driver://user:pass@localhost/dbname
[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
qualname =
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
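With this file in place, the migrations that `tools/upgrade.py` applies can also be run by hand. A sketch, assuming the standard docker-compose setup (the image's working directory is `/opt/obs/api`, where `alembic.ini` lives):

```bash
docker-compose run --rm portal alembic upgrade head
```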

@@ -1,9 +1,12 @@
HOST = "0.0.0.0"
PORT = 3000
DEBUG = True
-AUTO_RESTART = True
VERBOSE = False
AUTO_RELOAD = True
SECRET = "!!!!!!!!!!!!CHANGE ME!!!!!!!!!!!!"
POSTGRES_URL = "postgresql+asyncpg://obs:obs@postgres/obs"
POSTGRES_POOL_SIZE = 20
POSTGRES_MAX_OVERFLOW = 2 * POSTGRES_POOL_SIZE
KEYCLOAK_URL = "http://keycloak:8080/auth/realms/obs-dev/"
KEYCLOAK_CLIENT_ID = "portal"
KEYCLOAK_CLIENT_SECRET = "c385278e-bd2e-4f13-9937-34b0c0f44c2d"

@@ -15,6 +18,7 @@ FRONTEND_DIR = None
FRONTEND_CONFIG = {
    "imprintUrl": "https://example.com/imprint",
    "privacyPolicyUrl": "https://example.com/privacy",
    # "termsUrl": "https://example.com/terms",  # Link is only shown when set
    "mapHome": {"zoom": 6, "longitude": 10.2, "latitude": 51.3},
    # "banner": {"text": "This is a development installation.", "style": "info"},
}

@@ -25,5 +29,7 @@ ADDITIONAL_CORS_ORIGINS = [
    "http://localhost:8880/",  # for maputnik on 8880
    "http://localhost:8888/",  # for maputnik on 8888
]
TILE_SEMAPHORE_SIZE = 4
EXPORT_SEMAPHORE_SIZE = 4

# vim: set ft=python :

@@ -4,13 +4,16 @@ PORT = 3000
# Extended log output, but slower
DEBUG = False
-AUTO_RESTART = DEBUG
VERBOSE = DEBUG
AUTO_RELOAD = DEBUG

# Required to encrypt or sign sessions, cookies, tokens, etc.
SECRET = "!!!<<<CHANGEME>>>!!!"

# Connection to the database
POSTGRES_URL = "postgresql+asyncpg://user:pass@host/dbname"
POSTGRES_POOL_SIZE = 20
POSTGRES_MAX_OVERFLOW = 2 * POSTGRES_POOL_SIZE

# URL to the keycloak realm, as reachable by the API service. This is not
# necessarily its publicly reachable URL, keycloak advertises that itself.

@@ -36,6 +39,7 @@ FRONTEND_DIR = "../frontend/build/"
FRONTEND_CONFIG = {
    "imprintUrl": "https://example.com/imprint",
    "privacyPolicyUrl": "https://example.com/privacy",
    # "termsUrl": "https://example.com/user_terms_and_conditions",  # Link is only shown when set
    "mapHome": {"zoom": 6, "longitude": 10.2, "latitude": 51.3},
    "banner": {"text": "This is a test installation.", "style": "warning"},
}

@@ -57,4 +61,13 @@ TILES_FILE = None
# default. Python list, or whitespace separated string.
ADDITIONAL_CORS_ORIGINS = None

# How many asynchronous requests may be sent to the database to generate tile
# information. Should be less than POSTGRES_POOL_SIZE to leave some connections
# to the other features of the API ;)
TILE_SEMAPHORE_SIZE = 4

# How many asynchronous requests may generate exported data simultaneously.
# Keep this small.
EXPORT_SEMAPHORE_SIZE = 1

# vim: set ft=python :

api/migrations/README (new file)

@@ -0,0 +1 @@
Generic single-database configuration.

api/migrations/env.py (new file)

@@ -0,0 +1,83 @@
import asyncio
from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from alembic import context
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = None
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def do_run_migrations(connection):
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
def run_migrations_offline():
"""Run migrations in 'offline' mode.
This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.
Calls to context.execute() here emit the given string to the
script output.
"""
from obs.api.app import app
url = app.config.POSTGRES_URL
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
)
with context.begin_transaction():
context.run_migrations()
async def run_migrations_online():
"""Run migrations in 'online' mode.
In this scenario we need to create an Engine
and associate a connection with the context.
"""
from obs.api.app import app, connect_db
url = app.config.POSTGRES_URL
async with connect_db(url) as engine:
async with engine.connect() as connection:
await connection.run_sync(do_run_migrations)
await engine.dispose()
if context.is_offline_mode():
run_migrations_offline()
else:
asyncio.run(run_migrations_online())

@@ -0,0 +1,24 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
def upgrade():
${upgrades if upgrades else "pass"}
def downgrade():
${downgrades if downgrades else "pass"}

api/migrations/utils.py (new file)

@@ -0,0 +1,16 @@
import sqlalchemy as sa
def dbtype(name):
"""
Create a UserDefinedType for use in migrations as the type of a column,
when the type already exists in the database, but isn't available as a
proper sqlalchemy type.
"""
class TheType(sa.types.UserDefinedType):
def get_col_spec(self):
return name
TheType.__name__ = name
return TheType
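The migration scripts below use this helper wherever a column's type already exists in the database, for example:

```python
import sqlalchemy as sa

from migrations.utils import dbtype

# As in the "create table road" migration below: the enum and geometry types
# already exist in the database, dbtype() merely names them for SQLAlchemy.
zone = sa.Column("zone", dbtype("zone_type"))
geometry = sa.Column("geometry", dbtype("geometry(LINESTRING,3857)"))
```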

@@ -0,0 +1,39 @@
"""create table road
Revision ID: 35e7f1768f9b
Revises: 5d75febe2d59
Create Date: 2022-03-30 21:36:48.157457
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
from migrations.utils import dbtype
# revision identifiers, used by Alembic.
revision = "35e7f1768f9b"
down_revision = "920aed1450c9"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"road",
sa.Column(
"way_id", sa.BIGINT, primary_key=True, index=True, autoincrement=False
),
sa.Column("zone", dbtype("zone_type")),
sa.Column("name", sa.Text),
sa.Column("geometry", dbtype("geometry(LINESTRING,3857)")),
sa.Column("directionality", sa.Integer),
sa.Column("oneway", sa.Boolean),
)
op.execute(
"CREATE INDEX road_geometry_idx ON road USING GIST (geometry) WITH (FILLFACTOR=100);"
)
def downgrade():
op.drop_table("road")

@@ -0,0 +1,28 @@
"""create extensions
Revision ID: 3856f240bb6d
Revises: a9627f63fbed
Create Date: 2022-03-30 21:31:06.282725
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "3856f240bb6d"
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
op.execute('CREATE EXTENSION IF NOT EXISTS "hstore";')
op.execute('CREATE EXTENSION IF NOT EXISTS "postgis";')
op.execute('CREATE EXTENSION IF NOT EXISTS "uuid-ossp";')
def downgrade():
op.execute('DROP EXTENSION "hstore";')
op.execute('DROP EXTENSION "postgis";')
op.execute('DROP EXTENSION "uuid-ossp";')

@@ -0,0 +1,30 @@
"""transform overtaking_event geometry to 3857
Revision ID: 587e69ecb466
Revises: f4b0f460254d
Create Date: 2023-04-01 14:30:49.927505
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "587e69ecb466"
down_revision = "f4b0f460254d"
branch_labels = None
depends_on = None
def upgrade():
op.execute("UPDATE overtaking_event SET geometry = ST_Transform(geometry, 3857);")
op.execute(
"ALTER TABLE overtaking_event ALTER COLUMN geometry TYPE geometry(POINT, 3857);"
)
def downgrade():
op.execute(
"ALTER TABLE overtaking_event ALTER COLUMN geometry TYPE geometry;"
)
op.execute("UPDATE overtaking_event SET geometry = ST_Transform(geometry, 4326);")

@@ -0,0 +1,43 @@
"""create table overtaking_event
Revision ID: 5d75febe2d59
Revises: 920aed1450c9
Create Date: 2022-03-30 21:36:37.687080
"""
from alembic import op
import sqlalchemy as sa
from migrations.utils import dbtype
# revision identifiers, used by Alembic.
revision = "5d75febe2d59"
down_revision = "9336eef458e7"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"overtaking_event",
sa.Column("id", sa.Integer, autoincrement=True, primary_key=True, index=True),
sa.Column(
"track_id", sa.Integer, sa.ForeignKey("track.id", ondelete="CASCADE")
),
sa.Column("hex_hash", sa.String, unique=True, index=True),
sa.Column("way_id", sa.BIGINT, index=True),
sa.Column("direction_reversed", sa.Boolean),
sa.Column("geometry", dbtype("GEOMETRY")),
sa.Column("latitude", sa.Float),
sa.Column("longitude", sa.Float),
sa.Column("time", sa.DateTime),
sa.Column("distance_overtaker", sa.Float),
sa.Column("distance_stationary", sa.Float),
sa.Column("course", sa.Float),
sa.Column("speed", sa.Float),
sa.Index("road_segment", "way_id", "direction_reversed"),
)
def downgrade():
op.drop_table("overtaking_event")

@@ -0,0 +1,26 @@
"""add_overtaking_event_index
Revision ID: 7868aed76122
Revises: 587e69ecb466
Create Date: 2023-07-16 13:37:17.694079
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '7868aed76122'
down_revision = '587e69ecb466'
branch_labels = None
depends_on = None
def upgrade():
op.execute("CREATE INDEX IF NOT EXISTS ix_overtaking_event_geometry ON overtaking_event using GIST(geometry);")
def downgrade():
op.drop_index("ix_overtaking_event_geometry")

@@ -0,0 +1,31 @@
"""create enum processing_status
Revision ID: 920aed1450c9
Revises: 986c6953e431
Create Date: 2022-03-30 21:36:25.896192
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision = "920aed1450c9"
down_revision = "986c6953e431"
branch_labels = None
depends_on = None
def _get_enum_type():
return postgresql.ENUM(
"created", "queued", "processing", "complete", "error", name="processing_status"
)
def upgrade():
_get_enum_type().create(op.get_bind(), checkfirst=True)
def downgrade():
_get_enum_type().drop(op.get_bind())

@@ -0,0 +1,42 @@
"""create table comment
Revision ID: 9336eef458e7
Revises: 9d8c8c38a1d0
Create Date: 2022-03-30 21:37:02.080429
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects.postgresql import UUID
# revision identifiers, used by Alembic.
revision = "9336eef458e7"
down_revision = "d66baafab5ec"
branch_labels = None
depends_on = None
def upgrade():
NOW = sa.text("NOW()")
op.create_table(
"comment",
sa.Column("id", sa.Integer, autoincrement=True, primary_key=True),
sa.Column("uid", UUID, server_default=sa.func.uuid_generate_v4()),
sa.Column("created_at", sa.DateTime, nullable=False, server_default=NOW),
sa.Column(
"updated_at", sa.DateTime, nullable=False, server_default=NOW, onupdate=NOW
),
sa.Column("body", sa.TEXT),
sa.Column(
"author_id", sa.Integer, sa.ForeignKey("user.id", ondelete="CASCADE")
),
sa.Column(
"track_id", sa.Integer, sa.ForeignKey("track.id", ondelete="CASCADE")
),
)
def downgrade():
op.drop_table("comment")

@@ -0,0 +1,29 @@
"""create enum zone_type
Revision ID: 986c6953e431
Revises: 3856f240bb6d
Create Date: 2022-03-30 21:36:19.888268
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision = "986c6953e431"
down_revision = "3856f240bb6d"
branch_labels = None
depends_on = None
def _get_enum_type():
return postgresql.ENUM("rural", "urban", "motorway", name="zone_type")
def upgrade():
_get_enum_type().create(op.get_bind(), checkfirst=True)
def downgrade():
_get_enum_type().drop(op.get_bind())

@@ -0,0 +1,26 @@
"""add user display_name
Revision ID: 99a3d2eb08f9
Revises: a9627f63fbed
Create Date: 2022-09-13 07:30:18.747880
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "99a3d2eb08f9"
down_revision = "a9627f63fbed"
branch_labels = None
depends_on = None
def upgrade():
op.add_column(
"user", sa.Column("display_name", sa.String, nullable=True), schema="public"
)
def downgrade():
op.drop_column("user", "display_name", schema="public")

@@ -0,0 +1,45 @@
"""create table user
Revision ID: 9d8c8c38a1d0
Revises: d66baafab5ec
Create Date: 2022-03-30 21:36:59.375149
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "9d8c8c38a1d0"
down_revision = "35e7f1768f9b"
branch_labels = None
depends_on = None
def upgrade():
NOW = sa.text("NOW()")
op.create_table(
"user",
sa.Column("id", sa.Integer, autoincrement=True, primary_key=True),
sa.Column("created_at", sa.DateTime, nullable=False, server_default=NOW),
sa.Column(
"updated_at", sa.DateTime, nullable=False, server_default=NOW, onupdate=NOW
),
sa.Column("sub", sa.String, unique=True, nullable=False),
sa.Column("username", sa.String, unique=True, nullable=False),
sa.Column("email", sa.String, nullable=False),
sa.Column("bio", sa.TEXT),
sa.Column("image", sa.String),
sa.Column(
"are_tracks_visible_for_all",
sa.Boolean,
server_default=sa.false(),
nullable=False,
),
sa.Column("api_key", sa.String),
sa.Column("match_by_username_email", sa.Boolean, server_default=sa.false()),
)
def downgrade():
op.drop_table("user")

@@ -0,0 +1,35 @@
"""create table region
Revision ID: a049e5eb24dd
Revises: a9627f63fbed
Create Date: 2022-04-02 21:28:43.124521
"""
from alembic import op
import sqlalchemy as sa
from migrations.utils import dbtype
# revision identifiers, used by Alembic.
revision = "a049e5eb24dd"
down_revision = "99a3d2eb08f9"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"region",
sa.Column("id", sa.String(24), primary_key=True, index=True),
sa.Column("name", sa.Text),
sa.Column("geometry", dbtype("GEOMETRY(GEOMETRY,3857)"), index=False),
sa.Column("admin_level", sa.Integer, index=True),
)
op.execute(
"CREATE INDEX region_geometry_idx ON region USING GIST (geometry) WITH (FILLFACTOR=100);"
)
def downgrade():
op.drop_table("region")

@@ -0,0 +1,34 @@
"""create table road_usage
Revision ID: a9627f63fbed
Revises:
Create Date: 2022-03-16 20:26:17.449569
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "a9627f63fbed"
down_revision = "5d75febe2d59"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"road_usage",
sa.Column("id", sa.Integer, autoincrement=True, primary_key=True, index=True),
sa.Column(
"track_id", sa.Integer, sa.ForeignKey("track.id", ondelete="CASCADE")
),
sa.Column("hex_hash", sa.String, unique=True, index=True),
sa.Column("way_id", sa.BIGINT, index=True),
sa.Column("time", sa.DateTime),
sa.Column("direction_reversed", sa.Boolean),
sa.Index("road_usage_segment", "way_id", "direction_reversed"),
)
def downgrade():
op.drop_table("road_usage")

@@ -0,0 +1,39 @@
"""add import groups
Revision ID: b8b0fbae50a4
Revises: f7b21148126a
Create Date: 2023-03-26 09:41:36.621203
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "b8b0fbae50a4"
down_revision = "f7b21148126a"
branch_labels = None
depends_on = None
def upgrade():
op.add_column(
"road",
sa.Column("import_group", sa.String(), nullable=True),
)
op.add_column(
"region",
sa.Column("import_group", sa.String(), nullable=True),
)
# Set existing to "osm2pgsql"
road = sa.table("road", sa.column("import_group", sa.String))
op.execute(road.update().values(import_group="osm2pgsql"))
region = sa.table("region", sa.column("import_group", sa.String))
op.execute(region.update().values(import_group="osm2pgsql"))
def downgrade():
op.drop_column("road", "import_group")
op.drop_column("region", "import_group")

@@ -0,0 +1,66 @@
"""create table track
Revision ID: d66baafab5ec
Revises: 35e7f1768f9b
Create Date: 2022-03-30 21:36:54.848452
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
from migrations.utils import dbtype
# revision identifiers, used by Alembic.
revision = "d66baafab5ec"
down_revision = "9d8c8c38a1d0"
branch_labels = None
depends_on = None
def upgrade():
NOW = sa.text("NOW()")
op.create_table(
"track",
sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
sa.Column("slug", sa.String, unique=True, nullable=False, index=True),
sa.Column("created_at", sa.DateTime, nullable=False, server_default=NOW),
sa.Column(
"updated_at", sa.DateTime, nullable=False, server_default=NOW, onupdate=NOW
),
sa.Column("title", sa.String),
sa.Column(
"processing_status",
dbtype("processing_status"),
server_default=sa.literal("created"),
),
sa.Column("processing_queued_at", sa.DateTime),
sa.Column("processed_at", sa.DateTime),
sa.Column("processing_log", sa.TEXT),
sa.Column(
"customized_title", sa.Boolean, server_default=sa.false(), nullable=False
),
sa.Column("description", sa.TEXT),
sa.Column("public", sa.Boolean, server_default=sa.false()),
sa.Column("uploaded_by_user_agent", sa.String),
sa.Column("original_file_name", sa.String),
sa.Column("original_file_hash", sa.String, nullable=False),
sa.Column(
"author_id",
sa.Integer,
sa.ForeignKey("user.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column("recorded_at", sa.DateTime),
sa.Column("recorded_until", sa.DateTime),
sa.Column("duration", sa.Float),
sa.Column("length", sa.Float),
sa.Column("segments", sa.Integer),
sa.Column("num_events", sa.Integer),
sa.Column("num_measurements", sa.Integer),
sa.Column("num_valid", sa.Integer),
)
def downgrade():
op.drop_table("track")

@@ -0,0 +1,24 @@
"""add osm id indexes
Revision ID: f4b0f460254d
Revises: b8b0fbae50a4
Create Date: 2023-03-30 10:56:22.066768
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "f4b0f460254d"
down_revision = "b8b0fbae50a4"
branch_labels = None
depends_on = None
def upgrade():
op.execute("CREATE INDEX IF NOT EXISTS ix_road_way_id ON road (way_id);")
def downgrade():
op.drop_index("ix_road_way_id")

@@ -0,0 +1,41 @@
"""add user_device
Revision ID: f7b21148126a
Revises: a9627f63fbed
Create Date: 2022-09-15 17:48:06.764342
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "f7b21148126a"
down_revision = "a049e5eb24dd"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"user_device",
sa.Column("id", sa.Integer, autoincrement=True, primary_key=True),
sa.Column("user_id", sa.Integer, sa.ForeignKey("user.id", ondelete="CASCADE")),
sa.Column("identifier", sa.String, nullable=False),
sa.Column("display_name", sa.String, nullable=True),
sa.Index("user_id_identifier", "user_id", "identifier", unique=True),
)
op.add_column(
"track",
sa.Column(
"user_device_id",
sa.Integer,
sa.ForeignKey("user_device.id", ondelete="RESTRICT"),
nullable=True,
),
)
def downgrade():
op.drop_column("track", "user_device_id")
op.drop_table("user_device")

@@ -1 +1 @@
-__version__ = "0.3.4"
__version__ = "0.8.1"

@@ -1,9 +1,11 @@
import asyncio
import logging
import re
from json import JSONEncoder, dumps
from functools import wraps, partial
from urllib.parse import urlparse
-from os.path import dirname, join, normpath, abspath
from os.path import dirname, join, normpath, abspath, isfile
from datetime import datetime, date

from sanic import Sanic, Blueprint
@@ -18,22 +20,128 @@ from sanic_session import Session, InMemorySessionInterface
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
-from sqlalchemy.orm import sessionmaker

from obs.api.db import User, make_session, connect_db
from obs.api.cors import setup_options, add_cors_headers
from obs.api.utils import get_single_arg

log = logging.getLogger(__name__)

-app = Sanic("OpenBikeSensor Portal API")
-app.update_config("./config.py")

class SanicAccessMessageFilter(logging.Filter):
"""
A filter that modifies the log message of a sanic.access log entry to
include useful information.
"""
def filter(self, record):
record.msg = f"{record.request} -> {record.status}"
return True
def configure_sanic_logging():
for logger_name in ["sanic.root", "sanic.access", "sanic.error"]:
logger = logging.getLogger(logger_name)
for handler in logger.handlers:
logger.removeHandler(handler)
logger = logging.getLogger("sanic.access")
for filter_ in logger.filters:
logger.removeFilter(filter_)
logger.addFilter(SanicAccessMessageFilter())
logging.getLogger("sanic.root").setLevel(logging.WARNING)
app = Sanic(
"openbikesensor-api",
env_prefix="OBS_",
)
configure_sanic_logging()
app.config.update(
dict(
DEBUG=False,
VERBOSE=False,
AUTO_RELOAD=False,
POSTGRES_POOL_SIZE=20,
POSTGRES_MAX_OVERFLOW=40,
DEDICATED_WORKER=True,
FRONTEND_URL=None,
FRONTEND_HTTPS=True,
TILES_FILE=None,
TILE_SEMAPHORE_SIZE=4,
EXPORT_SEMAPHORE_SIZE=1,
)
)
# overwrite from defaults again
app.config.load_environment_vars("OBS_")
if isfile("./config.py"):
app.update_config("./config.py")
# For developers to override the config without committing it
if isfile("./config.overrides.py"):
app.update_config("./config.overrides.py")
c = app.config

api = Blueprint("api", url_prefix="/api")
auth = Blueprint("auth", url_prefix="")
import re
TILE_REQUEST_CANCELLED = re.compile(
r"Connection lost before response written.*GET /tiles"
)
class NoConnectionLostFilter(logging.Filter):
def filter(record):
return not TILE_REQUEST_CANCELLED.match(record.getMessage())
logging.getLogger("sanic.error").addFilter(NoConnectionLostFilter)
def setup_cors(app):
frontend_url = app.config.get("FRONTEND_URL")
additional_origins = app.config.get("ADDITIONAL_CORS_ORIGINS")
if not frontend_url and not additional_origins:
# No CORS configured
return
origins = []
if frontend_url:
u = urlparse(frontend_url)
origins.append(f"{u.scheme}://{u.netloc}")
if isinstance(additional_origins, str):
origins += re.split(r"\s+", additional_origins)
elif isinstance(additional_origins, list):
origins += additional_origins
elif additional_origins is not None:
raise ValueError(
"invalid option type for ADDITIONAL_CORS_ORIGINS, must be list or space separated str"
)
app.ctx.cors_origins = origins
# Add OPTIONS handlers to any route that is missing it
app.register_listener(setup_options, "before_server_start")
# Fill in CORS headers
app.register_middleware(add_cors_headers, "response")
setup_cors(app)
@app.exception(SanicException, BaseException)
async def _handle_sanic_errors(_request, exception):
if isinstance(exception, asyncio.CancelledError):
return None
-@api.exception(SanicException, BaseException)
-def _handle_sanic_errors(_request, exception):
    log.error("Exception in handler: %s", exception, exc_info=True)
    return json_response(
        {
@@ -65,38 +173,6 @@ def configure_paths(c):

configure_paths(app.config)

-def setup_cors(app):
-    frontend_url = app.config.get("FRONTEND_URL")
-    additional_origins = app.config.get("ADDITIONAL_CORS_ORIGINS")
-    if not frontend_url and not additional_origins:
-        # No CORS configured
-        return
-
-    origins = []
-    if frontend_url:
-        u = urlparse(frontend_url)
-        origins.append(f"{u.scheme}://{u.netloc}")
-
-    if isinstance(additional_origins, str):
-        origins += re.split(r"\s+", additional_origins)
-    elif isinstance(additional_origins, list):
-        origins += additional_origins
-    elif additional_origins is not None:
-        raise ValueError(
-            "invalid option type for ADDITIONAL_CORS_ORIGINS, must be list or space separated str"
-        )
-
-    from sanic_cors import CORS
-
-    CORS(
-        app,
-        origins=origins,
-        supports_credentials=True,
-    )
-
-setup_cors(app)

# TODO: use a different interface, maybe backed by the PostgreSQL, to allow
# scaling the API
Session(app, interface=InMemorySessionInterface())
@@ -104,9 +180,19 @@ Session(app, interface=InMemorySessionInterface())

@app.before_server_start
async def app_connect_db(app, loop):
-    app.ctx._db_engine_ctx = connect_db(app.config.POSTGRES_URL)
    app.ctx._db_engine_ctx = connect_db(
        app.config.POSTGRES_URL,
        app.config.POSTGRES_POOL_SIZE,
        app.config.POSTGRES_MAX_OVERFLOW,
    )
    app.ctx._db_engine = await app.ctx._db_engine_ctx.__aenter__()

    if app.config.TILE_SEMAPHORE_SIZE:
        app.ctx.tile_semaphore = asyncio.Semaphore(app.config.TILE_SEMAPHORE_SIZE)

    if app.config.EXPORT_SEMAPHORE_SIZE:
        app.ctx.export_semaphore = asyncio.Semaphore(app.config.EXPORT_SEMAPHORE_SIZE)

@app.after_server_stop
async def app_disconnect_db(app, loop):
@@ -120,6 +206,11 @@ def remove_right(l, r):
    return l

@app.middleware("request")
async def inject_arg_getter(req):
    req.ctx.get_single_arg = partial(get_single_arg, req)

@app.middleware("request")
async def inject_urls(req):
    if req.app.config.FRONTEND_HTTPS:
@@ -163,7 +254,6 @@ async def inject_urls(req):
async def inject_session(req):
    req.ctx._session_ctx = make_session()
    req.ctx.db = await req.ctx._session_ctx.__aenter__()
-    sessionmaker(req.app.ctx._db_engine, class_=AsyncSession, expire_on_commit=False)()

@app.middleware("response")
@@ -250,12 +340,12 @@ from .routes import (
    info,
    login,
    stats,
-    tiles,
    tracks,
    users,
-    mapdetails,
    exports,
)
from .routes import tiles, mapdetails
from .routes import frontend

api/obs/api/cors.py (new file)

@@ -0,0 +1,68 @@
from collections import defaultdict
from typing import Dict, FrozenSet, Iterable
from sanic import Sanic, response
from sanic_routing.router import Route
def _add_cors_headers(request, response, methods: Iterable[str]) -> None:
allow_methods = list(set(methods))
if "OPTIONS" not in allow_methods:
allow_methods.append("OPTIONS")
origin = request.headers.get("origin")
if origin in request.app.ctx.cors_origins:
headers = {
"Access-Control-Allow-Methods": ",".join(allow_methods),
"Access-Control-Allow-Origin": origin,
"Access-Control-Allow-Credentials": "true",
"Access-Control-Allow-Headers": (
"origin, content-type, accept, "
"authorization, x-xsrf-token, x-request-id"
),
"Access-Control-Expose-Headers": "content-disposition",
}
response.headers.extend(headers)
def add_cors_headers(request, response):
if request.method != "OPTIONS":
methods = [method for method in request.route.methods]
_add_cors_headers(request, response, methods)
def _compile_routes_needing_options(routes: Dict[str, Route]) -> Dict[str, FrozenSet]:
needs_options = defaultdict(list)
# This is 21.12 and later. You will need to change this for older versions.
for route in routes.values():
if "OPTIONS" not in route.methods:
needs_options[route.uri].extend(route.methods)
return {uri: frozenset(methods) for uri, methods in dict(needs_options).items()}
def _options_wrapper(handler, methods):
def wrapped_handler(request, *args, **kwargs):
nonlocal methods
return handler(request, methods)
return wrapped_handler
async def options_handler(request, methods) -> response.HTTPResponse:
resp = response.empty()
_add_cors_headers(request, resp, methods)
return resp
def setup_options(app: Sanic, _):
app.router.reset()
needs_options = _compile_routes_needing_options(app.router.routes_all)
for uri, methods in needs_options.items():
app.add_route(
_options_wrapper(options_handler, methods),
uri,
methods=["OPTIONS"],
)
app.router.finalize()
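For reference, this module is wired up in the application module above: the allowed origins are stored on `app.ctx.cors_origins`, `setup_options` is registered as a `before_server_start` listener, and `add_cors_headers` runs as a response middleware:

```python
# excerpt from setup_cors() in the application module above
app.ctx.cors_origins = origins

# Add OPTIONS handlers to any route that is missing it
app.register_listener(setup_options, "before_server_start")

# Fill in CORS headers
app.register_middleware(add_cors_headers, "response")
```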

@@ -3,7 +3,7 @@ from contextvars import ContextVar
from contextlib import asynccontextmanager
from datetime import datetime
import os
-from os.path import join, dirname
from os.path import exists, join, dirname
from json import loads
import re
import math
@@ -12,6 +12,7 @@ import random
import string
import secrets
from slugify import slugify
import logging

from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.ext.asyncio import AsyncSession
@@ -33,26 +34,33 @@ from sqlalchemy import (
    select,
    text,
    literal,
    Text,
)
-from sqlalchemy.dialects.postgresql import HSTORE, UUID
from sqlalchemy.dialects.postgresql import UUID

log = logging.getLogger(__name__)

Base = declarative_base()

engine = None
-sessionmaker = None
sessionmaker: SessionMaker

@asynccontextmanager
async def make_session():
-    async with sessionmaker() as session:
    async with sessionmaker(autoflush=True) as session:
        yield session

async def drop_all():
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.drop_all)

async def init_models():
    async with engine.begin() as conn:
-        await conn.run_sync(Base.metadata.drop_all)
        await conn.execute(text('CREATE EXTENSION IF NOT EXISTS "hstore";'))
        await conn.execute(text('CREATE EXTENSION IF NOT EXISTS "postgis";'))
        await conn.execute(text('CREATE EXTENSION IF NOT EXISTS "uuid-ossp";'))
@@ -65,10 +73,12 @@ def random_string(length):
@asynccontextmanager
-async def connect_db(url):
async def connect_db(url, pool_size=10, max_overflow=20):
    global engine, sessionmaker

-    engine = create_async_engine(url, echo=False)
    engine = create_async_engine(
        url, echo=False, pool_size=pool_size, max_overflow=max_overflow
    )
    sessionmaker = SessionMaker(engine, class_=AsyncSession, expire_on_commit=False)

    yield engine
@@ -98,6 +108,28 @@ class Geometry(UserDefinedType):
        return func.ST_AsGeoJSON(func.ST_Transform(col, 4326), type_=self)
class LineString(UserDefinedType):
def get_col_spec(self):
return "geometry(LineString, 3857)"
def bind_expression(self, bindvalue):
return func.ST_GeomFromGeoJSON(bindvalue, type_=self)
def column_expression(self, col):
return func.ST_AsGeoJSON(func.ST_Transform(col, 4326), type_=self)
class GeometryGeometry(UserDefinedType):
def get_col_spec(self):
return "geometry(GEOMETRY, 3857)"
def bind_expression(self, bindvalue):
return func.ST_GeomFromGeoJSON(bindvalue, type_=self)
def column_expression(self, col):
return func.ST_AsGeoJSON(func.ST_Transform(col, 4326), type_=self)
class OvertakingEvent(Base):
    __tablename__ = "overtaking_event"
    __table_args__ = (Index("road_segment", "way_id", "direction_reversed"),)
@@ -125,12 +157,23 @@ class OvertakingEvent(Base):
class Road(Base):
    __tablename__ = "road"
-    way_id = Column(BIGINT, primary_key=True, index=True)
    way_id = Column(BIGINT, primary_key=True, index=True, autoincrement=False)
    zone = Column(ZoneType)
-    name = Column(String)
    name = Column(Text)
-    geometry = Column(Geometry)
    geometry = Column(LineString)
    directionality = Column(Integer)
    oneway = Column(Boolean)
import_group = Column(String)
__table_args__ = (
# We keep the index name as osm2pgsql created it, way back when.
Index(
"road_geometry_idx",
"geometry",
postgresql_using="gist",
postgresql_with={"fillfactor": 100},
),
)
    def to_dict(self):
        return {

@@ -143,11 +186,34 @@ class Road(Base):
        }
class RoadUsage(Base):
__tablename__ = "road_usage"
__table_args__ = (Index("road_usage_segment", "way_id", "direction_reversed"),)
id = Column(Integer, autoincrement=True, primary_key=True, index=True)
track_id = Column(Integer, ForeignKey("track.id", ondelete="CASCADE"))
hex_hash = Column(String, unique=True, index=True)
way_id = Column(BIGINT, index=True)
time = Column(DateTime)
direction_reversed = Column(Boolean)
def __repr__(self):
return f"<RoadUsage {self.id}>"
def __hash__(self):
return int(self.hex_hash, 16)
def __eq__(self, other):
return self.hex_hash == other.hex_hash
NOW = text("NOW()")

class DuplicateTrackFileError(ValueError):
    pass

class Track(Base):
    __tablename__ = "track"
    id = Column(Integer, primary_key=True, autoincrement=True)
@@ -195,6 +261,12 @@ class Track(Base):
        Integer, ForeignKey("user.id", ondelete="CASCADE"), nullable=False
    )
user_device_id = Column(
Integer,
ForeignKey("user_device.id", ondelete="RESTRICT"),
nullable=True,
)
    # Statistics... maybe we'll drop some of this if we can easily compute them from SQL
    recorded_at = Column(DateTime)
    recorded_until = Column(DateTime)
@@ -227,6 +299,7 @@ class Track(Base):
        if for_user_id is not None and for_user_id == self.author_id:
            result["uploadedByUserAgent"] = self.uploaded_by_user_agent
            result["originalFileName"] = self.original_file_name
result["userDeviceId"] = self.user_device_id
        if self.author:
            result["author"] = self.author.to_dict(for_user_id=for_user_id)
@@ -328,6 +401,7 @@ class User(Base):
    updated_at = Column(DateTime, nullable=False, server_default=NOW, onupdate=NOW)
    sub = Column(String, unique=True, nullable=False)
    username = Column(String, unique=True, nullable=False)
display_name = Column(String, nullable=True)
    email = Column(String, nullable=False)
    bio = Column(TEXT)
    image = Column(String)
@@ -348,11 +422,60 @@ class User(Base):
        self.api_key = secrets.token_urlsafe(24)

    def to_dict(self, for_user_id=None):
-        return {
-            "username": self.username,
        result = {
            "id": self.id,
            "displayName": self.display_name or self.username,
            "bio": self.bio,
            "image": self.image,
        }
if for_user_id == self.id:
result["username"] = self.username
return result
async def rename(self, config, new_name):
old_name = self.username
renames = [
(join(basedir, old_name), join(basedir, new_name))
for basedir in [config.PROCESSING_OUTPUT_DIR, config.TRACKS_DIR]
]
for src, dst in renames:
if exists(dst):
raise FileExistsError(
f"cannot move {src!r} to {dst!r}, destination exists"
)
for src, dst in renames:
if not exists(src):
log.debug("Rename user %s: Not moving %s, not found", self.id, src)
else:
log.info("Rename user %s: Moving %s to %s", self.id, src, dst)
os.rename(src, dst)
self.username = new_name
class UserDevice(Base):
__tablename__ = "user_device"
id = Column(Integer, autoincrement=True, primary_key=True)
user_id = Column(Integer, ForeignKey("user.id", ondelete="CASCADE"))
identifier = Column(String, nullable=False)
display_name = Column(String, nullable=True)
__table_args__ = (
Index("user_id_identifier", "user_id", "identifier", unique=True),
)
def to_dict(self, for_user_id=None):
if for_user_id != self.user_id:
return {}
return {
"id": self.id,
"identifier": self.identifier,
"displayName": self.display_name,
}
class Comment(Base):
@@ -378,24 +501,58 @@ class Comment(Base):
        }
class Region(Base):
__tablename__ = "region"
id = Column(String(24), primary_key=True, index=True)
name = Column(Text)
geometry = Column(GeometryGeometry)
admin_level = Column(Integer, index=True)
import_group = Column(String)
__table_args__ = (
# We keep the index name as osm2pgsql created it, way back when.
Index(
"region_geometry_idx",
"geometry",
postgresql_using="gist",
postgresql_with={"fillfactor": 100},
),
)
Comment.author = relationship("User", back_populates="authored_comments")
User.authored_comments = relationship(
-    "Comment", order_by=Comment.created_at, back_populates="author"
    "Comment",
order_by=Comment.created_at,
back_populates="author",
passive_deletes=True,
)

Track.author = relationship("User", back_populates="authored_tracks")
User.authored_tracks = relationship(
-    "Track", order_by=Track.created_at, back_populates="author"
    "Track", order_by=Track.created_at, back_populates="author", passive_deletes=True
)

Comment.track = relationship("Track", back_populates="comments")
Track.comments = relationship(
-    "Comment", order_by=Comment.created_at, back_populates="track"
    "Comment", order_by=Comment.created_at, back_populates="track", passive_deletes=True
)

OvertakingEvent.track = relationship("Track", back_populates="overtaking_events")
Track.overtaking_events = relationship(
-    "OvertakingEvent", order_by=OvertakingEvent.time, back_populates="track"
    "OvertakingEvent",
order_by=OvertakingEvent.time,
back_populates="track",
passive_deletes=True,
)
Track.user_device = relationship("UserDevice", back_populates="tracks")
UserDevice.tracks = relationship(
"Track",
order_by=Track.created_at,
back_populates="user_device",
passive_deletes=False,
)

@@ -8,7 +8,7 @@ import pytz
from os.path import join
from datetime import datetime

-from sqlalchemy import delete, select
from sqlalchemy import delete, func, select, and_
from sqlalchemy.orm import joinedload

from obs.face.importer import ImportMeasurementsCsv
@@ -27,12 +27,21 @@ from obs.face.filter import (
from obs.face.osm import DataSource, DatabaseTileSource

-from obs.api.db import OvertakingEvent, Track, make_session
from obs.api.db import OvertakingEvent, RoadUsage, Track, UserDevice, make_session
from obs.api.app import app

log = logging.getLogger(__name__)
def get_data_source():
"""
Creates a data source based on the configuration of the portal. In *lean*
mode, the OverpassTileSource is used to fetch data on demand. In normal
mode, the roads database is used.
"""
return DataSource(DatabaseTileSource())
async def process_tracks_loop(delay):
    while True:
        try:
@@ -50,9 +59,7 @@ async def process_tracks_loop(delay):
                    await asyncio.sleep(delay)
                    continue

-                tile_source = DatabaseTileSource()
-                data_source = DataSource(tile_source)
                data_source = get_data_source()

                await process_track(session, track, data_source)
        except BaseException:
            log.exception("Failed to process track. Will continue.")
@@ -66,8 +73,7 @@ async def process_tracks(tracks):
    :param tracks: A list of strings which
    """
-    tile_source = DatabaseTileSource()
-    data_source = DataSource(tile_source)
    data_source = get_data_source()

    async with make_session() as session:
        for track_id_or_slug in tracks:
@@ -95,6 +101,30 @@ def to_naive_utc(t):
    return t.astimezone(pytz.UTC).replace(tzinfo=None)
async def export_gpx(track, filename, name):
import xml.etree.ElementTree as ET
gpx = ET.Element("gpx")
metadata = ET.SubElement(gpx, "metadata")
ET.SubElement(metadata, "name").text = name
trk = ET.SubElement(gpx, "trk")
ET.SubElement(trk, "name").text = name
ET.SubElement(trk, "type").text = "Cycling"
trkseg = ET.SubElement(trk, "trkseg")
for point in track:
trkpt = ET.SubElement(
trkseg, "trkpt", lat=str(point["latitude"]), lon=str(point["longitude"])
)
ET.SubElement(trkpt, "time").text = point["time"].isoformat()
et = ET.ElementTree(gpx)
et.write(filename, encoding="utf-8", xml_declaration=True)
async def process_track(session, track, data_source):
    try:
        track.processing_status = "complete"
@@ -109,14 +139,17 @@ async def process_track(session, track, data_source):
        os.makedirs(output_dir, exist_ok=True)

        log.info("Annotating and filtering CSV file")
-        imported_data, statistics = ImportMeasurementsCsv().read(
        imported_data, statistics, track_metadata = ImportMeasurementsCsv().read(
            original_file_path,
            user_id="dummy",  # TODO: user username or id or nothing?
            dataset_id=Track.slug,  # TODO: use track id or slug or nothing?
            return_metadata=True,
        )

        annotator = AnnotateMeasurements(
-            data_source, cache_dir=app.config.OBS_FACE_CACHE_DIR
            data_source,
            cache_dir=app.config.OBS_FACE_CACHE_DIR,
            fully_annotate_unconfirmed=True,
        )
        input_data = await annotator.annotate(imported_data)
@@ -153,23 +186,69 @@ async def process_track(session, track, data_source):
            },
        }
track_raw_json = {
"type": "Feature",
"geometry": {
"type": "LineString",
"coordinates": [
[m["longitude_GPS"], m["latitude_GPS"]] for m in track_points
],
},
}
        for output_filename, data in [
            ("measurements.json", measurements_json),
            ("overtakingEvents.json", overtaking_events_json),
            ("track.json", track_json),
("trackRaw.json", track_raw_json),
        ]:
            target = join(output_dir, output_filename)
            log.debug("Writing file %s", target)
            with open(target, "w") as fp:
                json.dump(data, fp, indent=4)
await export_gpx(track_points, join(output_dir, "track.gpx"), track.slug)
log.info("Clearing old track data...") log.info("Clearing old track data...")
await clear_track_data(session, track) await clear_track_data(session, track)
await session.commit() await session.commit()
device_identifier = track_metadata.get("DeviceId")
if device_identifier:
if isinstance(device_identifier, list):
device_identifier = device_identifier[0]
log.info("Finding or creating device %s", device_identifier)
user_device = (
await session.execute(
select(UserDevice).where(
and_(
UserDevice.user_id == track.author_id,
UserDevice.identifier == device_identifier,
)
)
)
).scalar()
log.debug("user_device is %s", user_device)
if not user_device:
user_device = UserDevice(
user_id=track.author_id, identifier=device_identifier
)
log.debug("Create new device for this user")
session.add(user_device)
track.user_device = user_device
else:
log.info("No DeviceId in track metadata.")
log.info("Import events into database...") log.info("Import events into database...")
await import_overtaking_events(session, track, overtaking_events) await import_overtaking_events(session, track, overtaking_events)
log.info("import road usages...")
await import_road_usages(session, track, track_points)
log.info("Write track statistics and update status...") log.info("Write track statistics and update status...")
track.recorded_at = to_naive_utc(statistics["t_min"]) track.recorded_at = to_naive_utc(statistics["t_min"])
track.recorded_until = to_naive_utc(statistics["t_max"]) track.recorded_until = to_naive_utc(statistics["t_max"])
@@ -207,6 +286,7 @@ async def clear_track_data(session, track):
    await session.execute(
        delete(OvertakingEvent).where(OvertakingEvent.track_id == track.id)
    )
await session.execute(delete(RoadUsage).where(RoadUsage.track_id == track.id))
async def import_overtaking_events(session, track, overtaking_events):
@@ -226,11 +306,16 @@ async def import_overtaking_events(session, track, overtaking_events):
                hex_hash=hex_hash,
                way_id=m.get("OSM_way_id"),
                direction_reversed=m.get("OSM_way_orientation", 0) < 0,
-                geometry=json.dumps(
                geometry=func.ST_Transform(
                    func.ST_GeomFromGeoJSON(
                        json.dumps(
                            {
                                "type": "Point",
                                "coordinates": [m["longitude"], m["latitude"]],
                            }
                        )
                    ),
                    3857,
                ),
                latitude=m["latitude"],
                longitude=m["longitude"],
@@ -242,3 +327,51 @@ async def import_overtaking_events(session, track, overtaking_events):
        )

    session.add_all(event_models.values())
def get_road_usages(track_points):
last_key = None
last = None
for p in track_points:
way_id = p.get("OSM_way_id")
direction_reversed = p.get("OSM_way_orientation", 0) < 0
key = (way_id, direction_reversed)
if last_key is None or last_key[0] is None:
last = p
last_key = key
continue
if last_key != key:
if last_key[0] is not None:
yield last
last_key = key
last = p
if last is not None and last_key[0] is not None:
yield last
async def import_road_usages(session, track, track_points):
usages = set()
for p in get_road_usages(track_points):
direction_reversed = p.get("OSM_way_orientation", 0) < 0
way_id = p.get("OSM_way_id")
time = p["time"]
hex_hash = hashlib.sha256(
struct.pack("dQ", way_id, int(time.timestamp()))
).hexdigest()
usages.add(
RoadUsage(
track_id=track.id,
hex_hash=hex_hash,
way_id=way_id,
time=time.astimezone(pytz.utc).replace(tzinfo=None),
direction_reversed=direction_reversed,
)
)
session.add_all(usages)
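In effect, `get_road_usages` yields one representative point per contiguous run of points that share a way and direction, skipping points without a way ID. A small sketch of the behavior, using the generator defined above:

```python
from datetime import datetime, timedelta

t0 = datetime(2023, 1, 1, 12, 0)
points = [
    {"OSM_way_id": 1, "OSM_way_orientation": 1, "time": t0},
    {"OSM_way_id": 1, "OSM_way_orientation": 1, "time": t0 + timedelta(seconds=1)},
    {"OSM_way_id": 2, "OSM_way_orientation": 1, "time": t0 + timedelta(seconds=2)},
]

# one usage each for way 1 and way 2
assert [p["OSM_way_id"] for p in get_road_usages(points)] == [1, 2]
```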

@@ -0,0 +1,261 @@
import json
from enum import Enum
from contextlib import contextmanager
import zipfile
import io
import re
import math
from sqlite3 import connect
import shapefile
from obs.api.db import OvertakingEvent
from sqlalchemy import select, func, text
from sanic.response import raw
from sanic.exceptions import InvalidUsage
from obs.api.app import api, json as json_response
from obs.api.utils import use_request_semaphore
import logging
log = logging.getLogger(__name__)
class ExportFormat(str, Enum):
SHAPEFILE = "shapefile"
GEOJSON = "geojson"
def parse_bounding_box(input_string):
left, bottom, right, top = map(float, input_string.split(","))
return func.ST_SetSRID(
func.ST_MakeBox2D(
func.ST_Point(left, bottom),
func.ST_Point(right, top),
),
4326,
)
PROJECTION_4326 = (
'GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],'
'AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],'
'UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]]'
)
@contextmanager
def shapefile_zip(shape_type=shapefile.POINT, basename="events"):
zip_buffer = io.BytesIO()
shp, shx, dbf = (io.BytesIO() for _ in range(3))
writer = shapefile.Writer(
shp=shp, shx=shx, dbf=dbf, shapeType=shape_type, encoding="utf8"
)
yield writer, zip_buffer
writer.balance()
writer.close()
zip_file = zipfile.ZipFile(zip_buffer, "a", zipfile.ZIP_DEFLATED, False)
zip_file.writestr(f"{basename}.shp", shp.getbuffer())
zip_file.writestr(f"{basename}.shx", shx.getbuffer())
zip_file.writestr(f"{basename}.dbf", dbf.getbuffer())
zip_file.writestr(f"{basename}.prj", PROJECTION_4326)
zip_file.close()
@api.get(r"/export/events")
async def export_events(req):
async with use_request_semaphore(req, "export_semaphore", timeout=30):
bbox = req.ctx.get_single_arg("bbox", default="-180,-90,180,90")
assert re.match(r"(-?\d+\.?\d+,?){4}", bbox)
bbox = list(map(float, bbox.split(",")))
fmt = req.ctx.get_single_arg("fmt", convert=ExportFormat)
events = await req.ctx.db.stream(
text(
"""
SELECT
ST_AsGeoJSON(ST_Transform(geometry, 4326)) AS geometry,
distance_overtaker,
distance_stationary,
way_id,
direction,
speed,
time_stamp,
course,
zone
FROM
layer_obs_events(
ST_Transform(ST_MakeEnvelope(:bbox0, :bbox1, :bbox2, :bbox3, 4326), 3857),
19,
NULL,
'1900-01-01'::timestamp,
'2100-01-01'::timestamp
)
"""
).bindparams(bbox0=bbox[0], bbox1=bbox[1], bbox2=bbox[2], bbox3=bbox[3])
)
if fmt == ExportFormat.SHAPEFILE:
with shapefile_zip(basename="events") as (writer, zip_buffer):
writer.field("distance_overtaker", "N", decimal=4)
writer.field("distance_stationary", "N", decimal=4)
writer.field("way_id", "N", decimal=0)
writer.field("direction", "N", decimal=0)
writer.field("course", "N", decimal=4)
writer.field("speed", "N", decimal=4)
writer.field("zone", "C")
async for event in events:
coords = json.loads(event.geometry)["coordinates"]
writer.point(*coords)
writer.record(
distance_overtaker=event.distance_overtaker,
distance_stationary=event.distance_stationary,
direction=event.direction,
way_id=event.way_id,
course=event.course,
speed=event.speed,
zone=event.zone
# "time"=event.time,
)
return raw(zip_buffer.getbuffer())
if fmt == ExportFormat.GEOJSON:
features = []
async for event in events:
geom = json.loads(event.geometry)
features.append(
{
"type": "Feature",
"geometry": geom,
"properties": {
"distance_overtaker": event.distance_overtaker
if event.distance_overtaker is not None
and not math.isnan(event.distance_overtaker)
else None,
"distance_stationary": event.distance_stationary
if event.distance_stationary is not None
and not math.isnan(event.distance_stationary)
else None,
"direction": event.direction
if event.direction is not None
and not math.isnan(event.direction)
else None,
"way_id": event.way_id,
"course": event.course
if event.course is not None and not math.isnan(event.course)
else None,
"speed": event.speed
if event.speed is not None and not math.isnan(event.speed)
else None,
"time": event.time_stamp,
"zone": event.zone,
},
}
)
geojson = {"type": "FeatureCollection", "features": features}
return json_response(geojson)
raise InvalidUsage("unknown export format")
@api.get(r"/export/segments")
async def export_segments(req):
async with use_request_semaphore(req, "export_semaphore", timeout=30):
bbox = req.ctx.get_single_arg("bbox", default="-180,-90,180,90")
assert re.match(r"(-?\d+\.?\d+,?){4}", bbox)
bbox = list(map(float, bbox.split(",")))
fmt = req.ctx.get_single_arg("fmt", convert=ExportFormat)
segments = await req.ctx.db.stream(
text(
"""
SELECT
ST_AsGeoJSON(ST_Transform(geometry, 4326)) AS geometry,
way_id,
distance_overtaker_mean,
distance_overtaker_min,
distance_overtaker_max,
distance_overtaker_median,
overtaking_event_count,
usage_count,
direction,
zone,
offset_direction,
distance_overtaker_array
FROM
layer_obs_roads(
ST_Transform(ST_MakeEnvelope(:bbox0, :bbox1, :bbox2, :bbox3, 4326), 3857),
11,
NULL,
'1900-01-01'::timestamp,
'2100-01-01'::timestamp
)
WHERE usage_count > 0
"""
).bindparams(bbox0=bbox[0], bbox1=bbox[1], bbox2=bbox[2], bbox3=bbox[3])
)
if fmt == ExportFormat.SHAPEFILE:
with shapefile_zip(shape_type=3, basename="segments") as (
writer,
zip_buffer,
):
writer.field("distance_overtaker_mean", "N", decimal=4)
writer.field("distance_overtaker_max", "N", decimal=4)
writer.field("distance_overtaker_min", "N", decimal=4)
writer.field("distance_overtaker_median", "N", decimal=4)
writer.field("overtaking_event_count", "N", decimal=4)
writer.field("usage_count", "N", decimal=4)
writer.field("way_id", "N", decimal=0)
writer.field("direction", "N", decimal=0)
writer.field("zone", "C")
async for segment in segments:
geom = json.loads(segment.geometry)
writer.line([geom["coordinates"]])
writer.record(
distance_overtaker_mean=segment.distance_overtaker_mean,
distance_overtaker_median=segment.distance_overtaker_median,
distance_overtaker_max=segment.distance_overtaker_max,
distance_overtaker_min=segment.distance_overtaker_min,
usage_count=segment.usage_count,
overtaking_event_count=segment.overtaking_event_count,
direction=segment.direction,
way_id=segment.way_id,
zone=segment.zone,
)
return raw(zip_buffer.getbuffer())
if fmt == ExportFormat.GEOJSON:
features = []
async for segment in segments:
features.append(
{
"type": "Feature",
"geometry": json.loads(segment.geometry),
"properties": {
"distance_overtaker_mean": segment.distance_overtaker_mean,
"distance_overtaker_max": segment.distance_overtaker_max,
"distance_overtaker_median": segment.distance_overtaker_median,
"overtaking_event_count": segment.overtaking_event_count,
"usage_count": segment.usage_count,
"distance_overtaker_array": segment.distance_overtaker_array,
"direction": segment.direction,
"way_id": segment.way_id,
"zone": segment.zone,
},
}
)
geojson = {"type": "FeatureCollection", "features": features}
return json_response(geojson)
raise InvalidUsage("unknown export format")
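A quick sanity check of these endpoints from a plain HTTP client, as a sketch (the base URL is an assumption; heavy requests are throttled by the export semaphore):

```python
import requests

BASE = "https://portal.example.com/api"  # hypothetical portal instance

# GeoJSON export of single events within a bbox (left,bottom,right,top, WGS84)
resp = requests.get(
    f"{BASE}/export/events",
    params={"bbox": "8.0,49.0,9.0,50.0", "fmt": "geojson"},
)
resp.raise_for_status()
features = resp.json()["features"]

# Shapefile export returns a zip archive with .shp/.shx/.dbf/.prj members
resp = requests.get(
    f"{BASE}/export/segments",
    params={"bbox": "8.0,49.0,9.0,50.0", "fmt": "shapefile"},
)
with open("segments.zip", "wb") as f:
    f.write(resp.content)
```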

View file

@@ -1,4 +1,4 @@
-from os.path import join, exists, isfile
+from os.path import join, exists, isfile, abspath
 import sanic.response as response
 from sanic.exceptions import NotFound
@@ -6,6 +6,7 @@ from sanic.exceptions import NotFound
 from obs.api.app import app
 if app.config.FRONTEND_CONFIG:
     @app.get("/config.json")
     def get_frontend_config(req):
         result = {
@@ -22,7 +23,7 @@ if app.config.FRONTEND_CONFIG:
                 .replace("111", "{x}")
                 .replace("222", "{y}")
             ],
-            "minzoom": 12,
+            "minzoom": 0,
             "maxzoom": 14,
         },
     }
@@ -45,6 +46,9 @@ if INDEX_HTML and exists(INDEX_HTML):
         raise NotFound()
     file = join(app.config.FRONTEND_DIR, path)
+    if not abspath(file).startswith(abspath(app.config.FRONTEND_DIR)):
+        raise NotFound()
     if not exists(file) or not path or not isfile(file):
         return response.html(
             index_file_contents.replace("__BASE_HREF__", req.ctx.frontend_url + "/")
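The added `abspath` comparison is a standard path-traversal guard. A minimal standalone sketch of the same check (the directory path is illustrative):

```python
from os.path import abspath, join

FRONTEND_DIR = "/opt/obs/frontend/build"  # illustrative static root

def is_safe(path):
    # Resolve the joined path and require it to stay inside FRONTEND_DIR,
    # so a request for "../../etc/passwd" cannot escape the static root.
    return abspath(join(FRONTEND_DIR, path)).startswith(abspath(FRONTEND_DIR))

assert is_safe("static/js/main.js")
assert not is_safe("../../etc/passwd")
```

Note that a bare prefix check would also accept a sibling directory such as `build2`; `os.path.commonpath` is stricter, but for serving files out of a fixed build directory the check above covers the traversal case.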

View file

@@ -1,5 +1,8 @@
+import asyncio
 import logging
-import os
+import re
+from requests.exceptions import RequestException
 from sqlalchemy import select
@@ -8,20 +11,23 @@ from oic.oic import Client
 from oic.oic.message import AuthorizationResponse, RegistrationResponse
 from oic.utils.authn.client import CLIENT_AUTHN_METHOD
-from obs.api.app import auth
+from obs.api.app import auth, api
 from obs.api.db import User
 from sanic.response import json, redirect
-from sanicargs import parse_parameters
 log = logging.getLogger(__name__)
 client = Client(client_authn_method=CLIENT_AUTHN_METHOD)
+# Do not show verbose library output, even when the application is in debug mode
+logging.getLogger("oic").setLevel(logging.INFO)
 @auth.before_server_start
 async def connect_auth_client(app, loop):
     client.allow["issuer_mismatch"] = True
+    try:
         client.provider_config(app.config.KEYCLOAK_URL)
         client.store_registration_info(
             RegistrationResponse(
@@ -29,15 +35,22 @@ async def connect_auth_client(app, loop):
                 client_secret=app.config.KEYCLOAK_CLIENT_SECRET,
             )
         )
+    except RequestException:
+        log.exception(f"could not connect to {app.config.KEYCLOAK_URL}")
+        log.info("will retry")
+        await asyncio.sleep(2)
+        log.info("retrying")
+        await connect_auth_client(app, loop)
 @auth.route("/login")
-@parse_parameters
-async def login(req, next: str = None):
+async def login(req):
+    next_url = req.ctx.get_single_arg("next", default=None)
     session = req.ctx.session
     session["state"] = rndstr()
     session["nonce"] = rndstr()
-    session["next"] = next
+    session["next"] = next_url
     args = {
         "client_id": client.client_id,
         "response_type": "code",
@@ -79,6 +92,15 @@ async def login_redirect(req):
     preferred_username = userinfo["preferred_username"]
     email = userinfo.get("email")
+    clean_username = re.sub(r"[^a-zA-Z0-9_.-]", "", preferred_username)
+    if clean_username != preferred_username:
+        log.warning(
+            "Username %r contained invalid characters and was changed to %r",
+            preferred_username,
+            clean_username,
+        )
+        preferred_username = clean_username
     if email is None:
         raise ValueError(
             "user has no email set, please configure keycloak to require emails"
@@ -116,16 +138,20 @@ async def login_redirect(req):
         user = User(sub=sub, username=preferred_username, email=email)
         req.ctx.db.add(user)
     else:
-        log.info("Logged in known user (id: %s, sub: %s).", user.id, user.sub)
+        log.info(
+            "Logged in known user (id: %s, sub: %s, %s).",
+            user.id,
+            user.sub,
+            preferred_username,
+        )
         if email != user.email:
             log.debug("Updating user (id: %s) email from auth system.", user.id)
             user.email = email
-        # TODO: re-add username change when we can safely rename users
-        # if preferred_username != user.username:
-        #     log.debug("Updating user (id: %s) username from auth system.", user.id)
-        #     user.username = preferred_username
+        if preferred_username != user.username:
+            log.debug("Updating user (id: %s) username from auth system.", user.id)
+            await user.rename(req.app.config, preferred_username)
     await req.ctx.db.commit()
@@ -133,3 +159,15 @@ async def login_redirect(req):
     next_ = session.pop("next", "/") or "/"
     return redirect(next_)
+@api.route("/logout")
+async def logout(req):
+    session = req.ctx.session
+    if "user_id" in session:
+        del session["user_id"]
+    auth_req = client.construct_EndSessionRequest(state=session["state"])
+    logout_url = auth_req.request(client.end_session_endpoint)
+    return redirect(logout_url + f"&post_logout_redirect_uri={req.ctx.api_url}/logout")
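The sanitization keeps only characters that are safe for usernames; a sketch of the same pattern:

```python
import re

def clean(preferred_username):
    # Keep only alphanumerics, underscore, dot and dash, as above.
    return re.sub(r"[^a-zA-Z0-9_.-]", "", preferred_username)

assert clean("jane.doe-42") == "jane.doe-42"
assert clean("jane döe!") == "janede"  # umlaut, space and "!" are dropped
```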

View file

@@ -1,5 +1,6 @@
 import json
 from functools import partial
+import logging
 import numpy
 import math
@@ -10,53 +11,37 @@ from sanic.exceptions import InvalidUsage
 from obs.api.app import api
 from obs.api.db import Road, OvertakingEvent, Track
+from obs.api.utils import round_to
-from .stats import round_to
 round_distance = partial(round_to, multiples=0.001)
 round_speed = partial(round_to, multiples=0.1)
-RAISE = object()
+log = logging.getLogger(__name__)
-def get_single_arg(req, name, default=RAISE, convert=None):
-    try:
-        value = req.args[name][0]
-    except LookupError as e:
-        if default is not RAISE:
-            return default
-        raise InvalidUsage("missing `{name}`") from e
-    if convert is not None:
-        try:
-            value = convert(value)
-        except (ValueError, TypeError) as e:
-            raise InvalidUsage("invalid `{name}`") from e
-    return value
-def get_bearing(a, b):
+def get_bearing(b, a):
     # longitude, latitude
     dL = b[0] - a[0]
     X = numpy.cos(b[1]) * numpy.sin(dL)
     Y = numpy.cos(a[1]) * numpy.sin(b[1]) - numpy.sin(a[1]) * numpy.cos(
         b[1]
     ) * numpy.cos(dL)
-    return numpy.arctan2(X, Y)
+    return numpy.arctan2(Y, X) + 0.5 * math.pi
+# Bins for histogram on overtaker distances. 0, 0.25, ... 2.25, infinity
+DISTANCE_BINS = numpy.arange(0, 2.5, 0.25).tolist() + [float('inf')]
 @api.route("/mapdetails/road", methods=["GET"])
 async def mapdetails_road(req):
-    longitude = get_single_arg(req, "longitude", convert=float)
-    latitude = get_single_arg(req, "latitude", convert=float)
-    radius = get_single_arg(req, "radius", default=100, convert=float)
+    longitude = req.ctx.get_single_arg("longitude", convert=float)
+    latitude = req.ctx.get_single_arg("latitude", convert=float)
+    radius = req.ctx.get_single_arg("radius", default=100, convert=float)
     if not (1 <= radius <= 1000):
         raise InvalidUsage("`radius` parameter must be between 1 and 1000")
-    road_geometry = func.ST_Transform(Road.geometry, 3857)
+    road_geometry = Road.geometry
     point = func.ST_Transform(
         func.ST_GeomFromGeoJSON(
             json.dumps(
@@ -99,26 +84,25 @@ async def mapdetails_road(req):
     arrays = numpy.array(arrays).T
     if len(arrays) == 0:
-        arrays = numpy.array([[], [], [], []], dtype=numpy.float)
+        arrays = numpy.array([[], [], [], []], dtype=float)
     data, mask = arrays[:-1], arrays[-1]
     data = data.astype(numpy.float64)
-    mask = mask.astype(numpy.bool)
+    mask = mask.astype(bool)
     def partition(arr, cond):
         return arr[:, cond], arr[:, ~cond]
     forwards, backwards = partition(data, ~mask)
-    print("for", forwards.dtype, "back", backwards.dtype)
-    def array_stats(arr, rounder):
+    def array_stats(arr, rounder, bins=30):
         if len(arr):
-            print("ARR DTYPE", arr.dtype)
-            print("ARR", arr)
             arr = arr[~numpy.isnan(arr)]
         n = len(arr)
+        hist, bins = numpy.histogram(arr, bins=bins)
         return {
             "statistics": {
                 "count": n,
@@ -127,6 +111,11 @@ async def mapdetails_road(req):
                 "max": rounder(numpy.max(arr)) if n else None,
                 "median": rounder(numpy.median(arr)) if n else None,
             },
+            "histogram": {
+                "bins": [None if math.isinf(b) else b for b in bins.tolist()],
+                "counts": hist.tolist(),
+                "zone": road.zone
+            },
             "values": list(map(rounder, arr.tolist())),
         }
@@ -139,15 +128,13 @@ async def mapdetails_road(req):
     # convert to degrees, as this is more natural to understand for consumers
     bearing = round_to((bearing / math.pi * 180 + 360) % 360, 1)
-    print(road.geometry)
     def get_direction_stats(direction_arrays, backwards=False):
         return {
             "bearing": ((bearing + 180) % 360 if backwards else bearing)
             if bearing is not None
             else None,
-            "distanceOvertaker": array_stats(direction_arrays[0], round_distance),
-            "distanceStationary": array_stats(direction_arrays[1], round_distance),
+            "distanceOvertaker": array_stats(direction_arrays[0], round_distance, bins=DISTANCE_BINS),
+            "distanceStationary": array_stats(direction_arrays[1], round_distance, bins=DISTANCE_BINS),
             "speed": array_stats(direction_arrays[2], round_speed),
         }
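The last histogram bin is open-ended, and its infinite edge is mapped to `None` because JSON has no representation for infinity. A sketch of what `array_stats` computes with `DISTANCE_BINS`:

```python
import math
import numpy

# Same edges as DISTANCE_BINS above: 0, 0.25, ..., 2.25, infinity
bins = numpy.arange(0, 2.5, 0.25).tolist() + [float("inf")]

distances = numpy.array([0.4, 0.9, 1.1, 1.45, 3.2])
hist, edges = numpy.histogram(distances, bins=bins)

print(hist.tolist())  # the 3.2 m reading falls into the open-ended last bin
print([None if math.isinf(b) else b for b in edges.tolist()])
```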

View file

@@ -4,13 +4,13 @@ from typing import Optional
 from operator import and_
 from functools import reduce
-from sqlalchemy import select, func
+from sqlalchemy import distinct, select, func, desc
 from sanic.response import json
-from sanicargs import parse_parameters
 from obs.api.app import api
-from obs.api.db import Track, OvertakingEvent, User
+from obs.api.db import Track, OvertakingEvent, User, Region, UserDevice
+from obs.api.utils import round_to
 log = logging.getLogger(__name__)
@@ -26,15 +26,12 @@ TRACK_DURATION_ROUNDING = 120
 MINUMUM_RECORDING_DATE = datetime(2010, 1, 1)
-def round_to(value: float, multiples: float) -> float:
-    if value is None:
-        return None
-    return round(value / multiples) * multiples
 @api.route("/stats")
-@parse_parameters
-async def stats(req, user: str = None, start: datetime = None, end: datetime = None):
+async def stats(req):
+    user = req.ctx.get_single_arg("user", default=None)
+    start = req.ctx.get_single_arg("start", default=None, convert=datetime)
+    end = req.ctx.get_single_arg("end", default=None, convert=datetime)
     conditions = [
         Track.recorded_at != None,
         Track.recorded_at > MINUMUM_RECORDING_DATE,
@@ -48,7 +45,7 @@ async def stats(req, user: str = None, start: datetime = None, end: datetime = None):
     # Only the user can look for their own stats, for now
     by_user = (
-        user is not None and req.ctx.user is not None and req.ctx.user.username == user
+        user is not None and req.ctx.user is not None and req.ctx.user.id == int(user)
     )
     if by_user:
         conditions.append(Track.author_id == req.ctx.user.id)
@@ -95,6 +92,14 @@ async def stats(req, user: str = None, start: datetime = None, end: datetime = None):
             .where(track_condition)
         )
     ).scalar()
+    device_count = (
+        await req.ctx.db.execute(
+            select(func.count(distinct(UserDevice.id)))
+            .select_from(UserDevice)
+            .join(Track.user_device)
+            .where(track_condition)
+        )
+    ).scalar()
     result = {
         "numEvents": event_count,
@@ -103,6 +108,7 @@ async def stats(req, user: str = None, start: datetime = None, end: datetime = None):
         "trackDuration": round_to(track_duration or 0, TRACK_DURATION_ROUNDING),
         "publicTrackCount": public_track_count,
         "trackCount": track_count,
+        "deviceCount": device_count,
     }
     return json(result)
@@ -170,3 +176,31 @@ async def stats(req, user: str = None, start: datetime = None, end: datetime = None):
     #     });
     #   }),
     # );
+@api.route("/stats/regions")
+async def stats(req):
+    query = (
+        select(
+            [
+                Region.id,
+                Region.name,
+                func.count(OvertakingEvent.id).label("overtaking_event_count"),
+            ]
+        )
+        .select_from(Region)
+        .join(
+            OvertakingEvent,
+            func.ST_Within(OvertakingEvent.geometry, Region.geometry),
+        )
+        .group_by(
+            Region.id,
+            Region.name,
+            Region.geometry,
+        )
+        .having(func.count(OvertakingEvent.id) > 0)
+        .order_by(desc("overtaking_event_count"))
+    )
+    regions = list(map(dict, (await req.ctx.db.execute(query)).all()))
+    return json(regions)
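Both statistics endpoints are plain GETs; a usage sketch (the base URL is an assumption):

```python
import requests

BASE = "https://portal.example.com/api"  # hypothetical portal instance

# Global statistics, optionally restricted to a date range
stats = requests.get(
    f"{BASE}/stats", params={"start": "2023-01-01", "end": "2024-01-01"}
).json()
print(stats["numEvents"], stats["deviceCount"])

# Overtaking event counts per region, ordered by count
for region in requests.get(f"{BASE}/stats/regions").json()[:3]:
    print(region["name"], region["overtaking_event_count"])
```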

View file

@@ -1,11 +1,16 @@
 from gzip import decompress
 from sqlite3 import connect
+from datetime import datetime, time, timedelta
+from typing import Optional, Tuple
+import dateutil.parser
+from sanic.exceptions import Forbidden, InvalidUsage
 from sanic.response import raw
-from sqlalchemy import select, text
+from sqlalchemy import text
+from sqlalchemy.sql.expression import table, column
 from obs.api.app import app
+from obs.api.utils import use_request_semaphore
 def get_tile(filename, zoom, x, y):
@@ -23,26 +28,82 @@ def get_tile(filename, zoom, x, y):
     content = db.execute(
         "SELECT tile_data FROM tiles WHERE zoom_level=? AND tile_column=? AND tile_row=?",
-        (zoom, x, (2 ** zoom - 1) - y),
+        (zoom, x, (2**zoom - 1) - y),
     ).fetchone()
     return content and content[0] or None
+def round_date(date, to="weeks", up=False):
+    if to != "weeks":
+        raise ValueError(f"cannot round to {to}")
+    midnight = time(0, 0, 0, 0)
+    start_of_day = date.date()  # ignore time
+    weekday = date.weekday()
+    is_rounded = date.time() == midnight and weekday == 0
+    if is_rounded:
+        return date
+    if up:
+        return datetime.combine(start_of_day + timedelta(days=7 - weekday), midnight)
+    else:
+        return datetime.combine(start_of_day - timedelta(days=weekday), midnight)
 # regenerate approx. once each day
 TILE_CACHE_MAX_AGE = 3600 * 24
+def get_filter_options(
+    req,
+) -> Tuple[Optional[str], Optional[datetime], Optional[datetime]]:
+    """
+    Returns parsed, validated and normalized options for filtering map data, a
+    tuple of
+    * user_id (str|None)
+    * start (datetime|None)
+    * end (datetime|None)
+    """
+    user_id = req.ctx.get_single_arg("user", default=None, convert=int)
+    if user_id is not None and (req.ctx.user is None or req.ctx.user.id != user_id):
+        raise Forbidden()
+    parse_date = lambda s: dateutil.parser.parse(s)
+    start = req.ctx.get_single_arg("start", default=None, convert=parse_date)
+    end = req.ctx.get_single_arg("end", default=None, convert=parse_date)
+    start = round_date(start, to="weeks", up=False) if start else None
+    end = round_date(end, to="weeks", up=True) if end else None
+    if start is not None and end is not None and start >= end:
+        raise InvalidUsage(
+            "end date must be later than start date (note: dates are rounded to weeks)"
+        )
+    return user_id, start, end
 @app.route(r"/tiles/<zoom:int>/<x:int>/<y:(\d+)\.pbf>")
 async def tiles(req, zoom: int, x: int, y: str):
+    async with use_request_semaphore(req, "tile_semaphore"):
         if app.config.get("TILES_FILE"):
             tile = get_tile(req.app.config.TILES_FILE, int(zoom), int(x), int(y))
         else:
+            user_id, start, end = get_filter_options(req)
             tile = await req.ctx.db.scalar(
-                text(f"select data from getmvt(:zoom, :x, :y) as b(data, key);").bindparams(
+                text(
+                    "select data from getmvt(:zoom, :x, :y, :user_id, :min_time, :max_time) as b(data, key);"
+                ).bindparams(
                     zoom=int(zoom),
                     x=int(x),
                     y=int(y),
+                    user_id=user_id,
+                    min_time=start,
+                    max_time=end,
                 )
             )
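`round_date` snaps filter boundaries to Monday midnight, presumably to keep the set of distinct tile variants small and cacheable; a sketch of its behavior (assuming `round_date` is imported from this module):

```python
from datetime import datetime

d = datetime(2023, 6, 8, 15, 30)  # a Thursday afternoon

assert round_date(d, to="weeks", up=False) == datetime(2023, 6, 5)   # Monday before
assert round_date(d, to="weeks", up=True) == datetime(2023, 6, 12)   # Monday after
assert round_date(datetime(2023, 6, 5), to="weeks", up=True) == datetime(2023, 6, 5)
```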

View file

@@ -1,17 +1,18 @@
 import logging
 import re
+from datetime import date
 from json import load as jsonload
 from os.path import join, exists, isfile
-from sqlalchemy import select, func
+from sanic.exceptions import InvalidUsage, NotFound, Forbidden
+from sanic.response import file_stream, empty
+from slugify import slugify
+from sqlalchemy import select, func, and_
 from sqlalchemy.orm import joinedload
-from obs.api.db import Track, User, Comment, DuplicateTrackFileError
 from obs.api.app import api, require_auth, read_api_key, json
-from sanic.response import file_stream, empty
-from sanic.exceptions import InvalidUsage, NotFound, Forbidden
-from sanicargs import parse_parameters
+from obs.api.db import Track, Comment, DuplicateTrackFileError
+from obs.api.utils import tar_of_tracks
 log = logging.getLogger(__name__)
@@ -24,8 +25,8 @@ def normalize_user_agent(user_agent):
     return m[0] if m else None
-async def _return_tracks(req, extend_query, limit, offset):
-    if limit <= 0 or limit > 100:
+async def _return_tracks(req, extend_query, limit, offset, order_by=None):
+    if limit <= 0 or limit > 1000:
         raise InvalidUsage("invalid limit")
     if offset < 0:
@@ -40,7 +41,7 @@ async def _return_tracks(req, extend_query, limit, offset):
         extend_query(select(Track).options(joinedload(Track.author)))
         .limit(limit)
         .offset(offset)
-        .order_by(Track.created_at.desc())
+        .order_by(order_by if order_by is not None else Track.created_at)
     )
     tracks = (await req.ctx.db.execute(query)).scalars()
@@ -61,27 +62,117 @@ async def _return_tracks(req, extend_query, limit, offset):
 @api.get("/tracks")
-@parse_parameters
-async def get_tracks(req, limit: int = 20, offset: int = 0, author: str = None):
+async def get_tracks(req):
+    limit = req.ctx.get_single_arg("limit", default=20, convert=int)
+    offset = req.ctx.get_single_arg("offset", default=0, convert=int)
+    # author = req.ctx.get_single_arg("author", default=None, convert=int)
     def extend_query(q):
         q = q.where(Track.public)
-        if author is not None:
-            q = q.where(User.username == author)
+        # if author is not None:
+        #     q = q.where(Track.author_id == author)
         return q
     return await _return_tracks(req, extend_query, limit, offset)
+def parse_boolean(s):
+    if s is None:
+        return None
+    s = s.lower()
+    if s in ("true", "1", "yes", "y", "t"):
+        return True
+    if s in ("false", "0", "no", "n", "f"):
+        return False
+    raise ValueError("invalid value for boolean")
 @api.get("/tracks/feed")
 @require_auth
-@parse_parameters
-async def get_feed(req, limit: int = 20, offset: int = 0):
-    def extend_query(q):
-        return q.where(Track.author_id == req.ctx.user.id)
-    return await _return_tracks(req, extend_query, limit, offset)
+async def get_feed(req):
+    limit = req.ctx.get_single_arg("limit", default=20, convert=int)
+    offset = req.ctx.get_single_arg("offset", default=0, convert=int)
+    user_device_id = req.ctx.get_single_arg("user_device_id", default=None, convert=int)
+    order_by_columns = {
+        "recordedAt": Track.recorded_at,
+        "title": Track.title,
+        "visibility": Track.public,
+        "length": Track.length,
+        "duration": Track.duration,
+        "user_device_id": Track.user_device_id,
+    }
+    order_by = req.ctx.get_single_arg(
+        "order_by", default=None, convert=order_by_columns.get
+    )
+    reversed_ = req.ctx.get_single_arg("reversed", convert=parse_boolean, default=False)
+    if reversed_:
+        order_by = order_by.desc()
+    public = req.ctx.get_single_arg("public", convert=parse_boolean, default=None)
+    def extend_query(q):
+        q = q.where(Track.author_id == req.ctx.user.id)
+        if user_device_id is not None:
+            q = q.where(Track.user_device_id == user_device_id)
+        if public is not None:
+            q = q.where(Track.public == public)
+        return q
+    return await _return_tracks(req, extend_query, limit, offset, order_by)
+@api.post("/tracks/bulk")
+@require_auth
+async def tracks_bulk_action(req):
+    body = req.json
+    action = body["action"]
+    track_slugs = body["tracks"]
+    if action not in ("delete", "makePublic", "makePrivate", "reprocess", "download"):
+        raise InvalidUsage("invalid action")
+    query = select(Track).where(
+        and_(Track.author_id == req.ctx.user.id, Track.slug.in_(track_slugs))
+    )
+    files = set()
+    for track in (await req.ctx.db.execute(query)).scalars():
+        if action == "delete":
+            await req.ctx.db.delete(track)
+        elif action == "makePublic":
+            if not track.public:
+                track.queue_processing()
+            track.public = True
+        elif action == "makePrivate":
+            if track.public:
+                track.queue_processing()
+            track.public = False
+        elif action == "reprocess":
+            track.queue_processing()
+        elif action == "download":
+            files.add(track.get_original_file_path(req.app.config))
+    await req.ctx.db.commit()
+    if action == "download":
+        username_slug = slugify(req.ctx.user.username, separator="-")
+        date_str = date.today().isoformat()
+        file_basename = f"tracks_{username_slug}_{date_str}"
+        await tar_of_tracks(req, files, file_basename)
+        return
+    return empty()
 @api.post("/tracks")
@@ -177,6 +268,7 @@ async def get_track_data(req, slug: str):
         "measurements": "measurements.json",
         "overtakingEvents": "overtakingEvents.json",
         "track": "track.json",
+        "trackRaw": "trackRaw.json",
     }
     result = {}
@@ -203,7 +295,29 @@ async def download_original_file(req, slug: str):
     if not track.is_visible_to_private(req.ctx.user):
         raise Forbidden()
-    return await file_stream(track.get_original_file_path(req.app.config))
+    return await file_stream(
+        track.get_original_file_path(req.app.config),
+        mime_type="text/csv",
+        filename=f"{slug}.csv",
+    )
+@api.get("/tracks/<slug:str>/download/track.gpx")
+async def download_track_gpx(req, slug: str):
+    track = await _load_track(req, slug)
+    if not track.is_visible_to(req.ctx.user):
+        raise Forbidden()
+    file_path = join(req.app.config.PROCESSING_OUTPUT_DIR, track.file_path, "track.gpx")
+    if not exists(file_path) or not isfile(file_path):
+        raise NotFound()
+    return await file_stream(
+        file_path,
+        mime_type="application/gpx+xml",
+        filename=f"{slug}.gpx",
+    )
 @api.put("/tracks/<slug:str>")
@@ -260,8 +374,10 @@ async def put_track(req, slug: str):
 @api.get("/tracks/<slug:str>/comments")
-@parse_parameters
-async def get_track_comments(req, slug: str, limit: int = 20, offset: int = 0):
+async def get_track_comments(req, slug: str):
+    limit = req.ctx.get_single_arg("limit", default=20, convert=int)
+    offset = req.ctx.get_single_arg("offset", default=0, convert=int)
     track = await _load_track(req, slug)
     comment_count = await req.ctx.db.scalar(
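A sketch of how a client would drive the new bulk endpoint and the sortable feed (base URL and session handling are assumptions):

```python
import requests

BASE = "https://portal.example.com/api"  # hypothetical portal instance
session = requests.Session()  # assumed to carry a logged-in session cookie

# Queue two of your own tracks for reprocessing in one call
session.post(
    f"{BASE}/tracks/bulk",
    json={"action": "reprocess", "tracks": ["my-first-track", "my-second-track"]},
)

# Your own tracks, newest recording first, public ones only
feed = session.get(
    f"{BASE}/tracks/feed",
    params={"order_by": "recordedAt", "reversed": "true", "public": "true"},
).json()
```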

View file

@@ -1,9 +1,11 @@
 import logging
 from sanic.response import json
-from sanic.exceptions import InvalidUsage
+from sanic.exceptions import InvalidUsage, Forbidden, NotFound
+from sqlalchemy import and_, select
 from obs.api.app import api, require_auth
+from obs.api.db import UserDevice
 log = logging.getLogger(__name__)
@@ -12,7 +14,9 @@ from obs.api import __version__ as version
 def user_to_json(user):
     return {
+        "id": user.id,
         "username": user.username,
+        "displayName": user.display_name,
         "email": user.email,
         "bio": user.bio,
         "image": user.image,
@@ -26,6 +30,48 @@ async def get_user(req):
     return json(user_to_json(req.ctx.user) if req.ctx.user else None)
+@api.get("/user/devices")
+async def get_user_devices(req):
+    if not req.ctx.user:
+        raise Forbidden()
+    query = (
+        select(UserDevice)
+        .where(UserDevice.user_id == req.ctx.user.id)
+        .order_by(UserDevice.id)
+    )
+    devices = (await req.ctx.db.execute(query)).scalars()
+    return json([device.to_dict(req.ctx.user.id) for device in devices])
+@api.put("/user/devices/<device_id:int>")
+async def put_user_device(req, device_id):
+    if not req.ctx.user:
+        raise Forbidden()
+    body = req.json
+    query = (
+        select(UserDevice)
+        .where(and_(UserDevice.user_id == req.ctx.user.id, UserDevice.id == device_id))
+        .limit(1)
+    )
+    device = (await req.ctx.db.execute(query)).scalar()
+    if device is None:
+        raise NotFound()
+    new_name = body.get("displayName", "").strip()
+    if new_name and device.display_name != new_name:
+        device.display_name = new_name
+        await req.ctx.db.commit()
+    return json(device.to_dict())
 @api.put("/user")
 @require_auth
 async def put_user(req):
@@ -36,6 +82,9 @@ async def put_user(req):
         if key in data and isinstance(data[key], (str, type(None))):
             setattr(user, key, data[key])
+    if "displayName" in data:
+        user.display_name = data["displayName"] or None
     if "areTracksVisibleForAll" in data:
         user.are_tracks_visible_for_all = bool(data["areTracksVisibleForAll"])
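The device endpoints follow the same conventions; a sketch (assuming the serialized device dict carries an `id` field):

```python
import requests

BASE = "https://portal.example.com/api"  # hypothetical portal instance
session = requests.Session()  # assumed to carry a logged-in session cookie

devices = session.get(f"{BASE}/user/devices").json()
session.put(
    f"{BASE}/user/devices/{devices[0]['id']}",  # "id" key is an assumption
    json={"displayName": "Commuter bike"},
)
```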

api/obs/api/utils.py Normal file (162 lines)
View file

@@ -0,0 +1,162 @@
import asyncio
from contextlib import asynccontextmanager
from datetime import datetime
import logging
from os.path import commonpath, join, relpath
import queue
import tarfile
import dateutil.parser
from sanic.exceptions import InvalidUsage, ServiceUnavailable
log = logging.getLogger(__name__)
RAISE = object()
def get_single_arg(req, name, default=RAISE, convert=None):
try:
value = req.args[name][0]
except LookupError as e:
if default is RAISE:
raise InvalidUsage(f"missing `{name}`") from e
value = default
if convert is not None and value is not None:
if convert is datetime or convert in ("date", "datetime"):
convert = lambda s: dateutil.parser.parse(s)
try:
value = convert(value)
except (ValueError, TypeError) as e:
raise InvalidUsage(f"invalid `{name}`: {str(e)}") from e
return value
def round_to(value: float, multiples: float) -> float:
if value is None:
return None
return round(value / multiples) * multiples
def chunk_list(lst, n):
for s in range(0, len(lst), n):
yield lst[s : s + n]
class chunk:
def __init__(self, iterable, n):
self.iterable = iterable
self.n = n
def __iter__(self):
if isinstance(self.iterable, list):
yield from chunk_list(self.iterable, self.n)
return
it = iter(self.iterable)
while True:
current = []
try:
for _ in range(self.n):
current.append(next(it))
yield current
except StopIteration:
if current:
yield current
break
async def __aiter__(self):
if hasattr(self.iterable, "__iter__"):
for item in self:
yield item
return
it = self.iterable.__aiter__()
while True:
current = []
try:
for _ in range(self.n):
current.append(await it.__anext__())
yield current
except StopAsyncIteration:
if len(current):
yield current
break
async def tar_of_tracks(req, files, file_basename="tracks"):
response = await req.respond(
content_type="application/x-gtar",
headers={
"content-disposition": f'attachment; filename="{file_basename}.tar.bz2"'
},
)
helper = StreamerHelper(response)
tar = tarfile.open(name=None, fileobj=helper, mode="w|bz2", bufsize=256 * 512)
root = commonpath(list(files))
for fname in files:
log.info("Write file to tar: %s", fname)
with open(fname, "rb") as fobj:
tarinfo = tar.gettarinfo(fname)
tarinfo.name = join(file_basename, relpath(fname, root))
tar.addfile(tarinfo, fobj)
await helper.send_all()
tar.close()
await helper.send_all()
await response.eof()
class StreamerHelper:
def __init__(self, response):
self.response = response
self.towrite = queue.Queue()
def write(self, data):
self.towrite.put(data)
async def send_all(self):
while True:
try:
tosend = self.towrite.get(block=False)
await self.response.send(tosend)
except queue.Empty:
break
@asynccontextmanager
async def use_request_semaphore(req, semaphore_name, timeout=10):
"""
If configured, acquire a semaphore for the map tile request and release it
after the context has finished.
If the semaphore cannot be acquired within the timeout, issue a 503 Service
Unavailable error response that describes that the database is overloaded,
so users know what the problem is.
Operates as a noop when the tile semaphore is not enabled.
"""
semaphore = getattr(req.app.ctx, semaphore_name, None)
if semaphore is None:
yield
return
try:
await asyncio.wait_for(semaphore.acquire(), timeout)
try:
yield
finally:
semaphore.release()
except asyncio.TimeoutError:
raise ServiceUnavailable(
"Too many requests, database overloaded. Please retry later."
)
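A few quick examples of the helpers above (assuming they are imported from `obs.api.utils`):

```python
from obs.api.utils import chunk, round_to

assert round_to(13.37, multiples=0.25) == 13.25
assert list(chunk([1, 2, 3, 4, 5], 2)) == [[1, 2], [3, 4], [5]]
```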

View file

@@ -1,28 +1,64 @@
 #!/usr/bin/env python3
+import math
 import sys
 import os
 import argparse
 import asyncio
 import logging
+import coloredlogs
 from obs.api.app import app
 from obs.api.db import connect_db
+log = logging.getLogger(__name__)
+def format_size(n, b=1024):
+    if n == 0:
+        return "0 B"
+    if n < 0:
+        return "-" + format_size(n, b)
+    e = math.floor(math.log(n, b))
+    prefixes = ["", "Ki", "Mi", "Gi", "Ti"] if b == 1024 else ["", "K", "M", "G", "T"]
+    e = min(e, len(prefixes) - 1)
+    r = n / b**e
+    s = f"{r:0.2f}" if e > 0 else str(n)
+    return f"{s} {prefixes[e]}B"
+class AccessLogFilter(logging.Filter):
+    def filter(self, record):
+        if not record.msg:
+            record.msg = (
+                f"{record.request} - {record.status} ({format_size(record.byte)})"
+            )
+        return True
 def main():
     debug = app.config.DEBUG
-    logging.basicConfig(
-        level=logging.DEBUG if debug else logging.INFO,
-        format="%(levelname)s: %(message)s",
+    coloredlogs.install(
+        level=logging.DEBUG if app.config.get("VERBOSE", debug) else logging.INFO,
+        milliseconds=True,
+        isatty=True,
     )
+    for ln in ["sanic.root", "sanic.error", "sanic.access"]:
+        l = logging.getLogger(ln)
+        for h in list(l.handlers):
+            l.removeHandler(h)
+    logging.getLogger("sanic.access").addFilter(AccessLogFilter())
     app.run(
         host=app.config.HOST,
         port=app.config.PORT,
         debug=debug,
         auto_reload=app.config.get("AUTO_RELOAD", debug),
+        access_log=True,
     )
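`format_size` renders byte counts with binary prefixes; for illustration (assuming it is imported from the module above):

```python
assert format_size(0) == "0 B"
assert format_size(512) == "512 B"
assert format_size(1536) == "1.50 KiB"
assert format_size(5 * 1024**2) == "5.00 MiB"
```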

View file

@@ -0,0 +1,191 @@
#!/usr/bin/env python3
import sys
import re
import msgpack
import osmium
import shapely.wkb as wkb
from shapely.ops import transform
HIGHWAY_TYPES = {
"trunk",
"primary",
"secondary",
"tertiary",
"unclassified",
"residential",
"trunk_link",
"primary_link",
"secondary_link",
"tertiary_link",
"living_street",
"service",
"track",
"road",
}
ZONE_TYPES = {
"urban",
"rural",
"motorway",
}
URBAN_TYPES = {
"residential",
"living_street",
"road",
}
MOTORWAY_TYPES = {
"motorway",
"motorway_link",
}
ADMIN_LEVEL_MIN = 2
ADMIN_LEVEL_MAX = 8
MINSPEED_RURAL = 60
ONEWAY_YES = {"yes", "true", "1"}
ONEWAY_REVERSE = {"reverse", "-1"}
def parse_number(tag):
if not tag:
return None
match = re.search(r"[0-9]+", tag)
if not match:
return None
digits = match.group(0)
try:
return int(digits)
except ValueError:
return None
def determine_zone(tags):
highway = tags.get("highway")
zone = tags.get("zone:traffic")
if zone is not None:
if "rural" in zone:
return "rural"
if "motorway" in zone:
return "motorway"
return "urban"
# From here on we are guessing based on other tags
if highway in URBAN_TYPES:
return "urban"
if highway in MOTORWAY_TYPES:
return "motorway"
maxspeed_source = tags.get("source:maxspeed")
if maxspeed_source and "rural" in maxspeed_source:
return "rural"
if maxspeed_source and "urban" in maxspeed_source:
return "urban"
for key in ["maxspeed", "maxspeed:forward", "maxspeed:backward"]:
maxspeed = parse_number(tags.get(key))
if maxspeed is not None and maxspeed > MINSPEED_RURAL:
return "rural"
# default to urban if we have no idea
return "urban"
def determine_direction(tags, zone):
if (
tags.get("oneway") in ONEWAY_YES
or tags.get("junction") == "roundabout"
or zone == "motorway"
):
return 1, True
if tags.get("oneway") in ONEWAY_REVERSE:
return -1, True
return 0, False
class StreamPacker:
def __init__(self, stream, *args, **kwargs):
self.stream = stream
self.packer = msgpack.Packer(*args, autoreset=False, **kwargs)
def _write_out(self):
if hasattr(self.packer, "getbuffer"):
chunk = self.packer.getbuffer()
else:
chunk = self.packer.bytes()
self.stream.write(chunk)
self.packer.reset()
def pack(self, *args, **kwargs):
self.packer.pack(*args, **kwargs)
self._write_out()
def pack_array_header(self, *args, **kwargs):
self.packer.pack_array_header(*args, **kwargs)
self._write_out()
def pack_map_header(self, *args, **kwargs):
self.packer.pack_map_header(*args, **kwargs)
self._write_out()
def pack_map_pairs(self, *args, **kwargs):
self.packer.pack_map_pairs(*args, **kwargs)
self._write_out()
# A global factory that creates WKB from a osmium geometry
wkbfab = osmium.geom.WKBFactory()
from pyproj import Transformer
project = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True).transform
class OSMHandler(osmium.SimpleHandler):
def __init__(self, packer):
self.packer = packer
super().__init__()
def way(self, way):
tags = way.tags
highway = tags.get("highway")
if not highway or highway not in HIGHWAY_TYPES:
return
access = tags.get("access", None)
bicycle = tags.get("bicycle", None)
if access == "no" and bicycle not in ["designated", "yes", "permissive", "destination"]:
return
zone = determine_zone(tags)
directionality, oneway = determine_direction(tags, zone)
name = tags.get("name")
geometry = wkb.loads(wkbfab.create_linestring(way), hex=True)
geometry = transform(project, geometry)
geometry = wkb.dumps(geometry)
self.packer.pack(
[b"\x01", way.id, name, zone, directionality, oneway, geometry]
)
def main():
with open(sys.argv[2], "wb") as fout:
packer = StreamPacker(fout)
osmhandler = OSMHandler(packer)
osmhandler.apply_file(sys.argv[1], locations=True)
if __name__ == "__main__":
main()
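The zone and direction heuristics are pure functions over the tag mapping, which makes them easy to probe; a sketch (plain dicts stand in for osmium tag lists, which expose the same `get` interface):

```python
assert parse_number("30 mph") == 30
assert determine_zone({"highway": "residential"}) == "urban"
assert determine_zone({"zone:traffic": "DE:rural"}) == "rural"
assert determine_zone({"highway": "tertiary", "maxspeed": "100"}) == "rural"
assert determine_direction({"oneway": "yes"}, "urban") == (1, True)
assert determine_direction({"oneway": "-1"}, "urban") == (-1, True)
```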

View file

@@ -1,12 +1,22 @@
-sanic~=21.9.1
-oic~=1.3.0
+coloredlogs~=15.0.1
+sanic==22.6.2
+oic~=1.5.0
 sanic-session~=0.8.0
-sanicargs~=2.1.0
-sanic-cors~=1.0.1
-python-slugify~=5.0.2
-motor~=2.5.1
-pyyaml<6
+python-slugify~=6.1.2
+motor~=3.1.1
+pyyaml~=5.3.1
 -e git+https://github.com/openmaptiles/openmaptiles-tools#egg=openmaptiles-tools
-sqlparse~=0.4.2
-sqlalchemy[asyncio]~=1.4.25
-asyncpg~=0.24.0
+sqlparse~=0.4.3
+sqlalchemy[asyncio]~=1.4.46
+asyncpg~=0.27.0
+pyshp~=2.3.1
+alembic~=1.9.4
+stream-zip~=0.0.50
+msgpack~=1.0.5
+osmium~=3.6.0
+psycopg~=3.1.8
+shapely~=2.0.1
+pyproj~=3.4.1
+aiohttp~=3.8.1
+# sanic requires websockets and chokes on >=10 in 22.6.2
+websockets<11

@@ -1 +1 @@
-Subproject commit 8e9395fd3cd0f1e83b4413546bc2d3cb0c726738
+Subproject commit 664e4d606416417c0651ea1748d32dd36209be6a

View file

@@ -10,19 +10,25 @@ setup(
     packages=find_packages(),
     package_data={},
     install_requires=[
-        "sanic~=21.9.1",
+        "coloredlogs~=15.0.1",
+        "sanic==22.6.2",
         "oic>=1.3.0, <2",
         "sanic-session~=0.8.0",
-        "sanicargs~=2.1.0",
-        "sanic-cors~=1.0.1",
-        "python-slugify~=5.0.2",
-        "motor~=2.5.1",
-        "sqlparse~=0.4.2",
+        "python-slugify>=5.0.2,<6.2.0",
+        "motor>=2.5.1,<3.1.2",
+        "pyyaml<6",
+        "sqlparse~=0.4.3",
         "openmaptiles-tools",  # install from git
+        "pyshp>=2.2,<2.4",
+        "sqlalchemy[asyncio]~=1.4.46",
+        "asyncpg~=0.27.0",
+        "alembic~=1.9.4",
+        "stream-zip~=0.0.50",
     ],
     entry_points={
         "console_scripts": [
             "openbikesensor-api=obs.bin.openbikesensor_api:main",
+            "openbikesensor-transform-osm=obs.bin.openbikesensor_transform_osm:main",
         ]
     },
 )

api/tools/import_osm.py Executable file (108 lines)
View file

@@ -0,0 +1,108 @@
#!/usr/bin/env python3
from dataclasses import dataclass
import asyncio
from os.path import basename, splitext
import sys
import logging
import msgpack
import psycopg
from obs.api.app import app
from obs.api.utils import chunk
log = logging.getLogger(__name__)
ROAD_BUFFER = 1000
AREA_BUFFER = 100
@dataclass
class Road:
way_id: int
name: str
zone: str
directionality: int
oneway: int
geometry: bytes
def read_file(filename):
"""
Reads a file iteratively, yielding the objects it contains in the order they
appear. Those may be mixed.
"""
with open(filename, "rb") as f:
unpacker = msgpack.Unpacker(f)
try:
while True:
type_id, *data = unpacker.unpack()
if type_id == b"\x01":
yield Road(*data)
except msgpack.OutOfData:
pass
async def import_osm(connection, filename, import_group=None):
if import_group is None:
import_group = splitext(basename(filename))[0]
# Pass 1: Find IDs only
road_ids = []
for item in read_file(filename):
road_ids.append(item.way_id)
async with connection.cursor() as cursor:
log.info("Pass 1: Delete previously imported data")
log.debug("Delete import group %s", import_group)
await cursor.execute(
"DELETE FROM road WHERE import_group = %s", (import_group,)
)
log.debug("Delete roads by way_id")
for ids in chunk(road_ids, 10000):
await cursor.execute("DELETE FROM road WHERE way_id = ANY(%s)", (ids,))
# Pass 2: Import
log.info("Pass 2: Import roads")
amount = 0
for items in chunk(read_file(filename), 10000):
amount += 10000
log.info(f"...{amount}/{len(road_ids)} ({100*amount/len(road_ids)}%)")
async with cursor.copy(
"COPY road (way_id, name, zone, directionality, oneway, geometry, import_group) FROM STDIN"
) as copy:
for item in items:
await copy.write_row(
(
item.way_id,
item.name,
item.zone,
item.directionality,
item.oneway,
bytes.hex(item.geometry),
import_group,
)
)
async def main():
logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
url = app.config.POSTGRES_URL
url = url.replace("+asyncpg", "")
async with await psycopg.AsyncConnection.connect(url) as connection:
for filename in sys.argv[1:]:
log.debug("Loading file: %s", filename)
await import_osm(connection, filename)
if __name__ == "__main__":
asyncio.run(main())
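The importer expects the msgpack files produced by the transform step, e.g. `python api/tools/import_osm.py roads.msgpack` (filename illustrative). Each file's basename becomes its `import_group`, so re-importing the same file first deletes the rows from the previous run.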

api/tools/import_regions.py Executable file (93 lines)
View file

@@ -0,0 +1,93 @@
#!/usr/bin/env python3
"""
This script downloads and/or imports regions for statistical analysis into the
PostGIS database. The regions are sourced from:
* EU countries are covered by
[NUTS](https://ec.europa.eu/eurostat/web/gisco/geodata/reference-data/administrative-units-statistical-units/nuts).
"""
import tempfile
from dataclasses import dataclass
import json
import asyncio
from os.path import basename, splitext
import sys
import logging
from typing import Optional
import aiohttp
import psycopg
from obs.api.app import app
from obs.api.utils import chunk
log = logging.getLogger(__name__)
NUTS_URL = "https://gisco-services.ec.europa.eu/distribution/v2/nuts/geojson/NUTS_RG_01M_2021_3857.geojson"
from pyproj import Transformer
project = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True).transform
from shapely.ops import transform
from shapely.geometry import shape
import shapely.wkb as wkb
async def import_nuts(
connection, filename=None, level: int = 3, import_group: Optional[str] = None
):
if import_group is None:
import_group = f"nuts{level}"
if filename:
log.info("Load NUTS from file")
with open(filename) as f:
data = json.load(f)
else:
log.info("Download NUTS regions from europa.eu")
async with aiohttp.ClientSession() as session:
async with session.get(NUTS_URL) as resp:
data = await resp.json(content_type=None)
async with connection.cursor() as cursor:
log.info(
"Delete previously imported regions with import group %s", import_group
)
await cursor.execute(
"DELETE FROM region WHERE import_group = %s", (import_group,)
)
log.info("Import regions")
async with cursor.copy(
"COPY region (id, name, geometry, import_group) FROM STDIN"
) as copy:
for feature in data["features"]:
if feature["properties"]["LEVL_CODE"] == level:
geometry = shape(feature["geometry"])
# geometry = transform(project, geometry)
geometry = wkb.dumps(geometry)
geometry = bytes.hex(geometry)
await copy.write_row(
(
feature["properties"]["NUTS_ID"],
feature["properties"]["NUTS_NAME"],
geometry,
import_group,
)
)
async def main():
logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
url = app.config.POSTGRES_URL
url = url.replace("+asyncpg", "")
async with await psycopg.AsyncConnection.connect(url) as connection:
await import_nuts(connection, sys.argv[1] if len(sys.argv) > 1 else None)
if __name__ == "__main__":
asyncio.run(main())

View file

@@ -6,9 +6,12 @@ import re
 import os
 import glob
 from os.path import normpath, abspath, join
+from typing import List, Tuple
 from sqlalchemy import text
 import sqlparse
+from openmaptiles.sqltomvt import MvtGenerator
 from obs.api.app import app
 from obs.api.db import connect_db, make_session
@@ -21,6 +24,32 @@
 )
 TILESET_FILE = join(TILE_GENERATOR, "openbikesensor.yaml")
+EXTRA_ARGS = [
+    # name, type, default
+    ("user_id", "integer", "NULL"),
+    ("min_time", "timestamp", "NULL"),
+    ("max_time", "timestamp", "NULL"),
+]
+class CustomMvtGenerator(MvtGenerator):
+    def generate_sqltomvt_func(self, fname, extra_args: List[Tuple[str, str]]) -> str:
+        """
+        Creates a SQL function that returns a single bytea value or null. This
+        method is overridden to allow for custom arguments in the created function
+        """
+        extra_args_types = "".join([f", {a[1]}" for a in extra_args])
+        extra_args_definitions = "".join(
+            [f", {a[0]} {a[1]} DEFAULT {a[2]}" for a in extra_args]
+        )
+        return f"""\
+DROP FUNCTION IF EXISTS {fname}(integer, integer, integer{extra_args_types});
+CREATE FUNCTION {fname}(zoom integer, x integer, y integer{extra_args_definitions})
+RETURNS {'TABLE(mvt bytea, key text)' if self.key_column else 'bytea'} AS $$
+{self.generate_sql()};
+$$ LANGUAGE SQL STABLE CALLED ON NULL INPUT;"""
 def parse_pg_url(url=app.config.POSTGRES_URL):
     m = re.match(
@@ -39,7 +68,10 @@ def parse_pg_url(url=app.config.POSTGRES_URL):
 async def main():
     logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
+    await prepare_sql_tiles()
+async def prepare_sql_tiles():
     with tempfile.TemporaryDirectory() as build_dir:
         await generate_data_yml(build_dir)
         sql_snippets = await generate_sql(build_dir)
@@ -111,9 +143,20 @@ async def generate_sql(build_dir):
         with open(filename, "rt") as f:
             sql_snippets.append(f.read())
-    getmvt_sql = await _run(
-        f"python $(which generate-sqltomvt) {TILESET_FILE!r} --key --gzip --postgis-ver 3.0.1 --function --fname=getmvt"
-    )
+    mvt = CustomMvtGenerator(
+        tileset=TILESET_FILE,
+        postgis_ver="3.0.1",
+        zoom="zoom",
+        x="x",
+        y="y",
+        gzip=True,
+        test_geometry=False,  # ?
+        key_column=True,
+    )
+    getmvt_sql = mvt.generate_sqltomvt_func("getmvt", EXTRA_ARGS)
+    # drop old versions of the function
+    sql_snippets.append("DROP FUNCTION IF EXISTS getmvt(integer, integer, integer);")
     sql_snippets.append(getmvt_sql)
     return sql_snippets
@@ -121,7 +164,11 @@ async def import_sql(sql_snippets):
     statements = sum(map(sqlparse.split, sql_snippets), [])
-    async with connect_db(app.config.POSTGRES_URL):
+    async with connect_db(
+        app.config.POSTGRES_URL,
+        app.config.POSTGRES_POOL_SIZE,
+        app.config.POSTGRES_MAX_OVERFLOW,
+    ):
         for i, statement in enumerate(statements):
             clean_statement = sqlparse.format(
                 statement,

View file

@@ -35,7 +35,7 @@ async def main():
     args = parser.parse_args()
-    async with connect_db(app.config.POSTGRES_URL):
+    async with connect_db(app.config.POSTGRES_URL, app.config.POSTGRES_POOL_SIZE, app.config.POSTGRES_MAX_OVERFLOW):
         if args.tracks:
             await process_tracks(args.tracks)
         else:

api/tools/reimport_tracks.py Executable file (30 lines)
View file

@@ -0,0 +1,30 @@
#!/usr/bin/env python3
import logging
import asyncio
from sqlalchemy import text
from obs.api.app import app
from obs.api.db import connect_db, make_session
log = logging.getLogger(__name__)
async def main():
logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
await reimport_tracks()
async def reimport_tracks():
async with connect_db(
app.config.POSTGRES_URL,
app.config.POSTGRES_POOL_SIZE,
app.config.POSTGRES_MAX_OVERFLOW,
):
async with make_session() as session:
await session.execute(text("UPDATE track SET processing_status = 'queued';"))
await session.commit()
if __name__ == "__main__":
asyncio.run(main())

View file

@@ -1,17 +1,33 @@
 #!/usr/bin/env python3
 import logging
 import asyncio
+import argparse
-from obs.api.db import init_models, connect_db
+from obs.api.db import drop_all, init_models, connect_db
 from obs.api.app import app
 log = logging.getLogger(__name__)
 async def main():
+    parser = argparse.ArgumentParser(
+        description="drops the whole database, and possibly creates new table schema"
+    )
+    parser.add_argument(
+        "-s",
+        "--create-schema",
+        action="store_true",
+        help="create the schema",
+    )
+    args = parser.parse_args()
     logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
     async with connect_db(app.config.POSTGRES_URL):
+        await drop_all()
+        if args.create_schema:
             await init_models()
             log.info("Database initialized.")

api/tools/transform_osm.py Executable file (6 lines)
View file

@@ -0,0 +1,6 @@
#!/usr/bin/env python3
from obs.bin.openbikesensor_transform_osm import main
if __name__ == "__main__":
main()

api/tools/upgrade.py Executable file (32 lines)
View file

@@ -0,0 +1,32 @@
#!/usr/bin/env python3
import asyncio
import logging
log = logging.getLogger(__name__)
from prepare_sql_tiles import prepare_sql_tiles, _run
from import_regions import main as import_nuts
from reimport_tracks import main as reimport_tracks
async def _migrate():
await _run("alembic upgrade head")
async def main():
logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")
log.info("Running migrations...")
await _migrate()
log.info("Preparing SQL tiles...")
await prepare_sql_tiles()
log.info("Importing nuts regions...")
await import_nuts()
log.info("Nuts regions imported, scheduling reimport of tracks")
await reimport_tracks()
if __name__ == "__main__":
asyncio.run(main())

deployment/.env Normal file (49 lines)
View file

@@ -0,0 +1,49 @@
###################################################
# Keycloak
###################################################
OBS_KEYCLOAK_URI=login.example.com
# Postgres
OBS_KEYCLOAK_POSTGRES_USER=obs
OBS_KEYCLOAK_POSTGRES_PASSWORD=<<TODO>>
OBS_KEYCLOAK_POSTGRES_DB=obs
OBS_POSTGRES_MAX_OVERFLOW=20
OBS_POSTGRES_POOL_SIZE=40
# KeyCloak
OBS_KEYCLOAK_POSTGRES_HOST=postgres-keycloak
OBS_KEYCLOAK_ADMIN_USER=admin
OBS_KEYCLOAK_ADMIN_PASSWORD=<<TODO>>
OBS_KEYCLOAK_REALM=obs
OBS_KEYCLOAK_PORTAL_REDIRECT_URI=https://portal.example.com/*
###################################################
# Portal
###################################################
OBS_PORTAL_URI=portal.example.com
# Postgres + osm2pgsql
OBS_POSTGRES_HOST=postgres
OBS_POSTGRES_USER=obs
OBS_POSTGRES_PASSWORD=<<TODO>>
OBS_POSTGRES_DB=obs
# Portal
OBS_HOST=0.0.0.0
OBS_PORT=3000
OBS_SECRET=<<TODO>>
OBS_POSTGRES_URL=postgresql+asyncpg://obs:<<TODO>>@postgres/obs
OBS_KEYCLOAK_URL=https://login.example.com/auth/realms/obs/
OBS_KEYCLOAK_CLIENT_ID=portal
OBS_KEYCLOAK_CLIENT_SECRET=<<TODO>>
OBS_DEDICATED_WORKER="True"
OBS_DATA_DIR=/data
OBS_PROXIES_COUNT=1
###################################################

View file

@@ -1,205 +0,0 @@
# Deploying an OpenBikeSensor Portal with Docker
## Introduction
The main idea of this document is to provide an easy docker-based
production-ready setup of the openbikesensor portal. It uses [the traefik
proxy](https://doc.traefik.io/traefik/) as a reverse proxy, which listens
on ports 80 and 443. Based on container labels, traefik routes each domain to
the corresponding docker container.
## Before Getting Started
The guide and example configuration assumes one domain, which points to the
server's IP address. This documentation uses `portal.example.com` as an
example. The API is hosted at `https://portal.example.com/api`, while the main
frontend is reachable at the domain root.
## Setup instructions
### Clone the repository
First create a folder somewhere in your system, in the example we use
`/opt/openbikesensor` and export it as `$ROOT` to more easily refer to it.
Clone the repository to `$ROOT/source`.
```bash
export ROOT=/opt/openbikesensor
mkdir -p $ROOT
cd $ROOT
git clone --recursive https://github.com/openbikesensor/portal source/
# If you accidentally cloned without --recursive, fix it by running:
# git submodule update --init --recursive
```
Unless otherwise mentioned, commands below assume your current working
directory to be `$ROOT`.
### Configure `traefik.toml`
```bash
mkdir -p config/
cp source/deployment/examples/traefik.toml config/traefik.toml
vim config/traefik.toml
```
Configure your email in the `config/traefik.toml`. This email is used by
*Let's Encrypt* to send you some emails regarding your certificates.
### Configure `docker-compose.yaml`
```bash
cp source/deployment/examples/docker-compose.yaml docker-compose.yaml
vim docker-compose.yaml
```
* Change the domain where it occurs, such as in `Host()` rules.
* Generate a secure password for the PostgreSQL database user. You will need to
configure this in the application later.
### Create a keycloak instance
Follow the [official guides](https://www.keycloak.org/documentation) to create
your own keycloak server. You can run the keycloak in docker and include it in
your `docker-compose.yaml`, if you like.
Documenting the details of this is out of scope for our project. Please make
sure to configure:
* An admin account for yourself
* A realm for the portal
* A client in that realm with "Access Type" set to "confidential" and a
redirect URL of this pattern: `https://portal.example.com/login/redirect`
### Prepare database
Follow the procedure outlined in [README.md](../README.md) under "Prepare
database". Whenever the docker-compose service `api` is referenced, replace it
with `portal`, which contains the same python code as the development `api`
service, but also the frontend. For example:
```bash
# development
docker-compose run --rm api tools/prepare_sql_tiles.py
# production
docker-compose run --rm portal tools/prepare_sql_tiles.py
```
### Import OpenStreetMap data
Follow the procedure outlined in [README.md](../README.md) under "Import OpenStreetMap data".
### Configure portal
```bash
cp source/api/config.py.example config/config.py
```
Then edit `config/config.py` to your heart's content (matching the
configuration of your keycloak). Do not forget to generate a secure secret
string.
Also set `PROXIES_COUNT = 1` in your config, even if that option is not
included in the example file. Read the
[Sanic docs](https://sanicframework.org/en/guide/advanced/proxy-headers.html)
for why this needs to be done. If your reverse proxy supports it, you can also
use a forwarded secret to secure your proxy target from spoofing. This is not
required if your application server does not listen on a public interface, but
it is recommended anyway, if possible.
### Build the container and run it
```bash
docker-compose build portal
docker-compose up -d portal
```
## Running a dedicated worker
Extend your `docker-compose.yaml` with the following service:
```yaml
worker:
  image: openbikesensor-portal
  build:
    context: ./source
  volumes:
    - ./data/api-data:/data
    - ./config/config.py:/opt/obs/api/config.py
  restart: on-failure
  links:
    - postgres
  networks:
    - backend
  command:
    - python
    - tools/process_track.py
```
Change the `DEDICATED_WORKER` option in your config to `True` to stop
processing tracks in the portal container. Then restart the `portal` service
and start the `worker` service.
## Miscellaneous
### Logs
To read logs, run
```bash
docker-compose logs -f
```
If something went wrong, you can reconfigure your config files and rerun:
```bash
docker-compose build
docker-compose up -d
```
### Updates
Before updating, make sure that you have properly backed up your instance so you
can always roll back to a pre-update state.
### Backups
To back up your instance's private data, you only need to back up the `$ROOT`
folder. This should contain everything needed to start your instance again; no
persistent data lives in docker containers. You should stop the containers for
a clean backup.
This backup contains the imported OSM data as well. That is of course a lot of
redundant data, but very nice to have for a quick restore operation. If you
want to generate smaller, nonredundant backups, or backups during live
operation of the database, use a tool like `pg_dump` and extract only the
required tables:
* `overtaking_event`
* `track`
* `user` (make sure to reference `public.user`, not the postgres user table)
* `comment`
You might also instead use the `--exclude-table` option to ignore the `road`
table only (adjust connection parameters and names):
```bash
pg_dump -h localhost -d obs -U obs -n public -T road -f backup-`date +%F`.sql
```
Also back up the raw uploaded files, i.e. the `local/api-data/tracks`
directory. The processed data can be regenerated, but you can also back that
up, from `local/api-data/processing-output`.
Finally, make sure to create a backup of your keycloak instance. Refer to the
keycloak documentation for how to export its data in a restorable way. This
should work very well if you are storing keycloak data in PostgreSQL and
exporting that with an exclusion pattern instead of an explicit list.
And then, please test your backup and restore strategy before going live, or at
least before you need it!


@ -0,0 +1,63 @@
# Bind address of the server
# HOST = "127.0.0.1"
# PORT = 3000
# Extended log output, but slower
DEBUG = False
VERBOSE = DEBUG
AUTO_RELOAD = DEBUG
# Required to encrypt or sign sessions, cookies, tokens, etc.
# SECRET = "!!!<<<CHANGEME>>>!!!"
# Connection to the database
# POSTGRES_URL = "postgresql+asyncpg://user:pass@host/dbname"
# POSTGRES_POOL_SIZE = 20
# POSTGRES_MAX_OVERFLOW = 2 * POSTGRES_POOL_SIZE
# URL to the keycloak realm, as reachable by the API service. This is not
# necessarily its publicly reachable URL, keycloak advertises that itself.
# KEYCLOAK_URL = "http://localhost:1234/auth/realms/obs/"
# Auth client credentials
# KEYCLOAK_CLIENT_ID = "portal"
# KEYCLOAK_CLIENT_SECRET = "00000000-0000-0000-0000-000000000000"
# Whether the API should run the worker loop, or a dedicated worker is used
# DEDICATED_WORKER = True
# The root of the frontend. Needed for redirecting after login, and for CORS.
# Set to None if frontend is served by the API.
FRONTEND_URL = None
FRONTEND_HTTPS = True
# Where to find the compiled frontend assets (must include index.html), or None
# to disable serving the frontend.
FRONTEND_DIR = "../frontend/build/"
# Can be an object or a JSON string
FRONTEND_CONFIG = {
    "imprintUrl": "https://example.com/imprint",
    "privacyPolicyUrl": "https://example.com/privacy",
    "mapHome": {"zoom": 6, "longitude": 10.2, "latitude": 51.3},
    "banner": {"text": "This is a test installation.", "style": "warning"},
}
# If the API should serve generated tiles, this is the path where the tiles are
# built. This is an experimental option and probably very inefficient; a proper
# tileserver should be preferred. Set to None to disable.
TILES_FILE = None
# Path overrides:
# API_ROOT_DIR = "??" # default: api/ inside repository
# DATA_DIR = "??" # default: $API_ROOT_DIR/..
# PROCESSING_DIR = "??" # default: DATA_DIR/processing
# PROCESSING_OUTPUT_DIR = "??" # default: DATA_DIR/processing-output
# TRACKS_DIR = "??" # default: DATA_DIR/tracks
# OBS_FACE_CACHE_DIR = "??" # default: DATA_DIR/obs-face-cache
# Additional allowed origins for CORS headers. The FRONTEND_URL is included by
# default. Python list, or whitespace separated string.
ADDITIONAL_CORS_ORIGINS = None
# vim: set ft=python :


@ -0,0 +1,22 @@
events {}
http {
    proxy_cache_path /data/nginx/cache levels=1:2 keys_zone=STATIC:10m
                     inactive=24h max_size=1g;
    server {
        location ~* ^/tiles/\d[012]?/[^?]+$ {
            proxy_pass http://portal:3000;
            proxy_set_header Host $host:3000;
            proxy_buffering on;
            proxy_cache_methods GET HEAD;
            proxy_cache STATIC;
            proxy_cache_valid 200 1d;
            proxy_cache_use_stale error timeout invalid_header updating
                                  http_500 http_502 http_503 http_504;
        }
        location / {
            proxy_pass http://portal:3000;
            proxy_set_header Host $host:3000;
        }
    }
}


@ -0,0 +1,150 @@
version: '3.5'
networks:
  gateway:
    external: true
    name: gateway
  backend:
    internal: true
services:
  ############################################################
  # Portal
  ############################################################
  postgres:
    image: "openmaptiles/postgis:7.0"
    environment:
      - POSTGRES_DB=${OBS_POSTGRES_DB}
      - POSTGRES_USER=${OBS_POSTGRES_USER}
      - POSTGRES_PASSWORD=${OBS_POSTGRES_PASSWORD}
    volumes:
      - ./data/postgres/data:/var/lib/postgresql/data
    networks:
      - backend
  portal:
    image: openbikesensor-portal
    build:
      context: ./source
    env_file: .env
    volumes:
      - ./data/api-data:${OBS_DATA_DIR}
      - ./config/config.py:/opt/obs/api/config.py
      - ./data/tiles/:/tiles
      - ./data/pbf/:/pbf
    restart: on-failure
    depends_on:
      - traefik
      - postgres
      - worker
      # - keycloak
    labels:
      - traefik.http.routers.portal.rule=Host(`${OBS_PORTAL_URI}`)
      - traefik.http.routers.portal.entrypoints=websecure
      - traefik.http.routers.portal.tls=true
      - traefik.http.routers.portal.tls.certresolver=leresolver
      - traefik.docker.network=gateway
      # - traefik.http.services.portal.loadbalancer.server.port=3000
    networks:
      - gateway
      - backend
  worker:
    image: openbikesensor-portal
    build:
      context: ./source
    env_file: .env
    volumes:
      - ./data/api-data:${OBS_DATA_DIR}
      - ./config/config.py:/opt/obs/api/config.py
    restart: on-failure
    depends_on:
      - postgres
    networks:
      - backend
    command:
      - python
      - tools/process_track.py
  ############################################################
  # Traefik
  ############################################################
  traefik:
    image: traefik:2.4.8
    restart: always
    ports:
      - "80:80"
      - "443:443"
      # The Web UI (enabled by [api] in traefik.toml)
      # - "8080:8080"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./config/traefik.toml:/traefik.toml
      - ./config/usersfile:/usersfile
      - ./config/acme:/acme
    networks:
      - gateway
    labels:
      # global redirect from http to https
      - "traefik.http.routers.http-catchall.rule=hostregexp(`{host:.+}`)"
      - "traefik.http.routers.http-catchall.entrypoints=web"
      # Define middlewares to be used
      - "traefik.http.routers.http-catchall.middlewares=redirect-http-to-https"
      # Configure middlewares
      - "traefik.http.middlewares.redirect-http-to-https.redirectscheme.scheme=https"
  ############################################################
  # Keycloak
  ############################################################
  keycloak:
    image: jboss/keycloak:15.1.0
    restart: always
    networks:
      - gateway
      - backend
    env_file: .env
    environment:
      # database
      - DB_VENDOR=postgres
      - DB_ADDR=${OBS_KEYCLOAK_POSTGRES_HOST}
      - DB_DATABASE=${OBS_KEYCLOAK_POSTGRES_DB}
      - DB_USER=${OBS_KEYCLOAK_POSTGRES_USER}
      - DB_PASSWORD=${OBS_KEYCLOAK_POSTGRES_PASSWORD}
      # admin user
      - KEYCLOAK_USER=${OBS_KEYCLOAK_ADMIN_USER}
      - KEYCLOAK_PASSWORD=${OBS_KEYCLOAK_ADMIN_PASSWORD}
      - PROXY_ADDRESS_FORWARDING=true
      - OBS_KEYCLOAK_PORTAL_REDIRECT_URI=${OBS_KEYCLOAK_PORTAL_REDIRECT_URI}
    depends_on:
      - traefik
      - postgres-keycloak
    labels:
      - "traefik.http.routers.login.rule=Host(`${OBS_KEYCLOAK_URI}`)"
      - "traefik.http.routers.login.entrypoints=websecure"
      - "traefik.http.routers.login.tls=true"
      - "traefik.http.routers.login.tls.certresolver=leresolver"
      # This container runs on two ports (8080/tcp, 8443/tcp). Tell traefik which one to use.
      - "traefik.http.services.login.loadbalancer.server.port=8080"
      # This container runs on more than one network. Tell traefik which one to use.
      - "traefik.docker.network=gateway"
  postgres-keycloak:
    image: postgres:15
    restart: always
    networks:
      - backend
    volumes:
      - ./data/postgres-keycloak:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=${OBS_KEYCLOAK_POSTGRES_DB}
      - POSTGRES_USER=${OBS_KEYCLOAK_POSTGRES_USER}
      - POSTGRES_PASSWORD=${OBS_KEYCLOAK_POSTGRES_PASSWORD}
    labels:
      - traefik.enable=false


@ -1,78 +0,0 @@
version: '3'
networks:
  gateway:
    external: true
    name: gateway
  backend:
    internal: true
services:
  postgres:
    image: "openmaptiles/postgis:6.0"
    environment:
      POSTGRES_USER: obs
      POSTGRES_PASSWORD: obs
      POSTGRES_DB: obs
    volumes:
      - ./data/postgres/data:/var/lib/postgresql/data
  portal:
    image: openbikesensor-portal
    build:
      context: ./source
    volumes:
      - ./data/api-data:/data
      - ./config/config.py:/opt/obs/api/config.py
      - ./data/tiles/:/tiles
    restart: on-failure
    depends_on:
      - postgres
      # if you introduce a dockerized keycloak instance within this compose also:
      # - keycloak
    labels:
      - traefik.http.routers.portal.rule=Host(`portal.example.com`)
      - traefik.http.routers.portal.entrypoints=websecure
      - traefik.http.routers.portal.tls=true
      - traefik.http.routers.portal.tls.certresolver=leresolver
      - traefik.docker.network=gateway
      - traefik.http.services.whoami.loadbalancer.server.port=80
    networks:
      - gateway
      - backend
  traefik:
    image: traefik:2.4.8
    restart: always
    ports:
      - "80:80"
      - "443:443"
      # The Web UI (enabled by [api] in traefik.toml)
      # - "8080:8080"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./config/traefik.toml:/traefik.toml
      - ./config/usersfile:/usersfile
      - ./config/acme:/acme
    networks:
      - gateway
    labels:
      # global redirect from http to https
      - "traefik.http.routers.http-catchall.rule=hostregexp(`{host:.+}`)"
      - "traefik.http.routers.http-catchall.entrypoints=web"
      # Define middlewares to be used
      - "traefik.http.routers.http-catchall.middlewares=redirect-http-to-https"
      # Configure middlewares
      - "traefik.http.middlewares.redirect-http-to-https.redirectscheme.scheme=https"
      # Show Traefik Dashboard. Enable the dashboard in traefik.toml if you use these.
      # - "traefik.http.routers.traefik.rule=Host(`traefik.example.com`)"
      # - "traefik.http.routers.traefik.service=api@internal"
      # - "traefik.http.routers.traefik.tls=true"
      # - "traefik.http.routers.traefik.entrypoints=websecure"
      # - "traefik.http.routers.traefik.tls.certresolver=leresolver"
      # - "traefik.http.routers.traefik.middlewares=basic-auth"
      # - "traefik.http.middlewares.basic-auth.basicauth.usersfile=/usersfile"


@ -8,7 +8,7 @@ version: '3'
 services:
   postgres:
-    image: "openmaptiles/postgis:6.0"
+    image: "openmaptiles/postgis:7.0"
     environment:
       POSTGRES_USER: obs
       POSTGRES_PASSWORD: obs
@ -20,6 +20,7 @@ services:
   api:
     image: openbikesensor-api
+    tty: true
     build:
       context: ./api/
       dockerfile: Dockerfile
@ -28,10 +29,15 @@ services:
       - ./api/scripts/obs:/opt/obs/scripts/obs
       - ./api/tools:/opt/obs/api/tools
       - ./api/config.dev.py:/opt/obs/api/config.py
+      - ./api/config.overrides.py:/opt/obs/api/config.overrides.py
       - ./frontend/build:/opt/obs/frontend/build
       - ./tile-generator:/opt/obs/tile-generator
       - ./local/api-data:/data
       - ./tile-generator/data/:/tiles
+      - ./api/migrations:/opt/obs/api/migrations
+      - ./api/alembic.ini:/opt/obs/api/alembic.ini
+      - ./local/pbf:/pbf
+      - ./local/obsdata:/obsdata
     depends_on:
       - postgres
       - keycloak
@ -43,6 +49,7 @@ services:
   worker:
     image: openbikesensor-api
+    tty: true
     build:
       context: ./api/
       dockerfile: Dockerfile
@ -51,6 +58,7 @@ services:
       - ./api/scripts/obs:/opt/obs/scripts/obs
       - ./api/tools:/opt/obs/api/tools
       - ./api/config.dev.py:/opt/obs/api/config.py
+      - ./api/config.overrides.py:/opt/obs/api/config.overrides.py
       - ./local/api-data:/data
     depends_on:
       - postgres

103
docs/osm-import.md Normal file

@ -0,0 +1,103 @@
# Importing OpenStreetMap data
The application requires a lot of data from OpenStreetMap to work.
The required information is stored in the PostgreSQL database and used when
processing tracks, as well as for vector tile generation. The process applies
to both development and production setups. For development, you should choose a
small area for testing, such as your local county or city, to keep the amount
of data small. For production use you have to import the whole region you are
serving.
## General pipeline overview
1. Download OpenStreetMap data as one or more `.osm.pbf` files.
2. Transform this data to generate geometry data for all roads and regions, so
we don't need to look up nodes separately. This step requires a lot of CPU
and memory, so it can be done "offline" on a high-powered machine.
3. Import the transformed data into the PostgreSQL/PostGIS database.
## Community hosted transformed data
Since the first two steps are the same for everybody, the community will soon
provide a service where relatively up-to-date transformed data can be
downloaded for direct import. Stay tuned.
## Download data
[GeoFabrik](https://download.geofabrik.de) kindly hosts extracts of the
OpenStreetMap planet by region. Download all regions you're interested in from
there in `.osm.pbf` format, with the tool of your choice, e.g.:
```bash
wget -P local/pbf/ https://download.geofabrik.de/europe/germany/baden-wuerttemberg-latest.osm.pbf
```
## Transform data
To transform downloaded data, you can either use the docker image from a
development or production environment, or locally install the API into your
python environment. Then run the `api/tools/transform_osm.py` script on the data.
```bash
api/tools/transform_osm.py baden-wuerttemberg-latest.osm.pbf baden-wuerttemberg-latest.msgpack
```
In dockerized setups, make sure to mount your data somewhere in the container
and also mount a directory where the result can be written. The development
setup takes care of this, so you can use:
```bash
docker-compose run --rm api tools/transform_osm.py \
/pbf/baden-wuerttemberg-latest.osm.pbf /obsdata/baden-wuerttemberg-latest.msgpack
```
Repeat this command for every file you want to transform.
## Import transformed data
The command for importing looks like this:
```bash
api/tools/import_osm.py baden-wuerttemberg-latest.msgpack
```
This tool reads your application config from `config.py`, so set that up first
as if you were setting up your application.
In dockerized setups, make sure to mount your data somewhere in the container.
Again, the development setup takes care of this, so you can use:
```bash
docker-compose run --rm api tools/import_osm.py \
/obsdata/baden-wuerttemberg-latest.msgpack
```
The import process should take a few seconds to minutes, depending on the area
size. You can run the process multiple times, with the same or different area
files, to import or update the data. You can update only one region and leave
the others as they are, or add more filenames to the command line to
bulk-import data.
## How this works
* The transformation is done with a python script that uses
[pyosmium](https://osmcode.org/pyosmium/) to read the `.osm.pbf` file. This
script then filters the data for only the required objects (such as road
segments and administrative areas), and extracts the interesting information
from those objects.
* The node geolocations are looked up to generate a geometry for each object.
This requires a lot of memory to run efficiently.
* The geometry is projected to [Web Mercator](https://epsg.io/3857) in this
  step to avoid continuous transformation when tiles are generated later. Most
  operations will work fine in this projection. Projection is done with the
  [pyproj](https://pypi.org/project/pyproj/) library.
* The output is written to a binary file in a very simple format using
  [msgpack](https://github.com/msgpack/msgpack-python), which is much more
  efficient than (Geo-)JSON, for example. This format is streamable, so the
  generated file is never fully written or read into memory.
* The import script reads the msgpack file and sends it to the database using
  [psycopg](https://www.psycopg.org/). This is done because it supports
  PostgreSQL's `COPY FROM` statement, which enables much faster writes to the
  database than a traditional `INSERT ... VALUES`. The file is streamed
  directly to the database, so it is never read into memory (see the sketch
  below).
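For illustration, here is a minimal sketch of that streaming import, assuming psycopg2 and a hypothetical `road` table with just two columns; the real tool uses the portal's actual schema and its `config.py` connection settings:
```python
import io
import msgpack
import psycopg2

# Hypothetical table and columns, for illustration only.
COPY_SQL = "COPY road (way_id, name) FROM STDIN"


class RowStream(io.RawIOBase):
    """File-like adapter that renders msgpack records as COPY text rows."""

    def __init__(self, unpacker):
        self.lines = (f"{r['way_id']}\t{r['name']}\n".encode() for r in unpacker)
        self.buffer = b""

    def readable(self):
        return True

    def read(self, size=-1):
        # Pull just enough records to satisfy the requested chunk size.
        while size < 0 or len(self.buffer) < size:
            line = next(self.lines, None)
            if line is None:
                break
            self.buffer += line
        if size < 0:
            out, self.buffer = self.buffer, b""
        else:
            out, self.buffer = self.buffer[:size], self.buffer[size:]
        return out


def import_msgpack(path, dsn):
    # msgpack.Unpacker consumes the file incrementally, so the dump is
    # streamed into PostgreSQL without being loaded into memory at once.
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur, open(path, "rb") as f:
        cur.copy_expert(COPY_SQL, RowStream(msgpack.Unpacker(f, raw=False)))
```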


@ -0,0 +1,414 @@
# Deploying an OpenBikeSensor Portal with Docker
## Introduction
The main idea of this document is to provide an easy docker-based
production-ready setup of the openbikesensor portal. It uses the [traefik
proxy](https://doc.traefik.io/traefik/) as a reverse proxy, which listens
on port 80 and 443. Based on some labels, traefik routes the domains to the
corresponding docker containers.
## Requirements
This guide requires a Linux system with `docker` and `docker-compose` installed.
Ensure that your system is up to date.
> TODO
```bash
apt install docker.io docker-compose pwgen
```
## Before Getting Started
The example configurations assume two domains, which point to the
server's IP address. This documentation uses `portal.example.com` and
`login.example.com`. The API is hosted at `https://portal.example.com/api`,
while the main frontend is reachable at the domain root.
## Setup instructions
First of all, log in to your system via SSH.
### Create working directory
Create a folder somewhere on your system; in this guide we use
`/opt/openbikesensor`:
```bash
mkdir /opt/openbikesensor
```
### Clone the repository
Clone the repository to `/opt/openbikesensor/`:
```bash
cd /opt/openbikesensor/
git clone --recursive https://github.com/openbikesensor/portal source/
# If you accidentally cloned without --recursive, fix it by running:
# git submodule update --init --recursive
```
### Copy predefined configuration files
```bash
mkdir -p /opt/openbikesensor/config
cd /opt/openbikesensor/
cp -r source/deployment/config source/deployment/docker-compose.yaml source/deployment/.env .
```
### Create a Docker network
```bash
docker network create gateway
```
### Traefik
#### Configure `traefik.toml`
```bash
cd /opt/openbikesensor/
nano config/traefik.toml
```
Configure your email in the `config/traefik.toml`. This email is used by
*Let's Encrypt* to send you some emails regarding your certificates.
#### Start Traefik
```bash
cd /opt/openbikesensor/
docker-compose up -d traefik
docker-compose logs -f traefik
```
> traefik_1 | time="2022-01-03T13:02:36Z" level=info msg="Configuration loaded from file: /traefik.toml"
### Generate passwords
Generate three passwords, for example with `pwgen`:
```bash
pwgen -n 20
```
They will be used in the next steps.
### KeyCloak
#### Configure `.env`
```bash
cd /opt/openbikesensor/
nano .env
```
Configure:
* `OBS_KEYCLOAK_URI`:
  * The subdomain of your keycloak
* `OBS_KEYCLOAK_POSTGRES_PASSWORD`:
  * One of the generated passwords, for the KeyCloak postgres
* `OBS_KEYCLOAK_ADMIN_PASSWORD`:
  * One of the generated passwords, for the KeyCloak admin
* `OBS_KEYCLOAK_PORTAL_REDIRECT_URI`:
  * The redirect URI, e.g. the subdomain of your portal (ensure it ends with `/*`)
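Putting it together, the relevant lines in `.env` might look like this (illustrative values; substitute your own domains and generated passwords):
```
OBS_KEYCLOAK_URI=login.example.com
OBS_KEYCLOAK_POSTGRES_PASSWORD=<generated password 1>
OBS_KEYCLOAK_ADMIN_PASSWORD=<generated password 2>
OBS_KEYCLOAK_PORTAL_REDIRECT_URI=https://portal.example.com/*
```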
#### Start KeyCloak
```bash
docker-compose up -d keycloak
docker-compose logs -f keycloak
```
Wait until postgres and keycloak are started:
> keycloak_1 | 13:08:55,558 INFO [org.jboss.as] (Controller Boot Thread) WFLYSRV0051: Admin console listening on http://127.0.0.1:9990
Open:
* https://login.example.com/
* Test login to the admin console with your admin account
#### Configure Realm and Client
Jump into the KeyCloak container:
```bash
docker-compose exec keycloak /bin/bash
```
Since we configured the `.env` file, we can now run the following commands
to create a realm and a client:
```bash
# Login
/opt/jboss/keycloak/bin/kcadm.sh config credentials --server http://localhost:8080/auth --realm master --user $KEYCLOAK_USER --password $KEYCLOAK_PASSWORD
# Create Realm
/opt/jboss/keycloak/bin/kcadm.sh create realms -s realm=$OBS_KEYCLOAK_REALM -s enabled=true -o
# Create a client and remember the unique id of the client
CID=$(/opt/jboss/keycloak/bin/kcadm.sh create clients -r $OBS_KEYCLOAK_REALM -s clientId=portal -s "redirectUris=[\"$OBS_KEYCLOAK_PORTAL_REDIRECT_URI\"]" -i)
# Create a secret for the client
/opt/jboss/keycloak/bin/kcadm.sh create clients/$CID/client-secret -r $OBS_KEYCLOAK_REALM
# Get the secret of the client
/opt/jboss/keycloak/bin/kcadm.sh get clients/$CID/client-secret -r $OBS_KEYCLOAK_REALM
```
Exit the container with `exit`. Configure the client secret:
```bash
cd /opt/openbikesensor/
nano .env
```
Configure:
* `OBS_KEYCLOAK_CLIENT_SECRET`:
  * Use the obtained client secret
#### Create a user
* Log in to your Keycloak with the admin user and select the realm `obs`
* Create a user with username and email for the realm `obs` (*Hint*: email is required by the portal)
* Configure a password in the `Credentials` tab as well
### Portal
#### Configure Postgres
```bash
cd /opt/openbikesensor/
nano .env
```
Configure:
* `OBS_POSTGRES_HOST`:
  * This should be the postgres container, e.g. `postgres`
* `OBS_POSTGRES_USER`:
  * The default postgres user is `obs`
* `OBS_POSTGRES_PASSWORD`:
  * Use one of the generated passwords for postgres
* `OBS_POSTGRES_DB`:
  * The default postgres database is `obs`
* `OBS_POSTGRES_URL`:
  * Use the same information as above to configure the `POSTGRES_URL`;
    this one is used by the portal.
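With the defaults above, the resulting URL might look like this (illustrative; substitute your generated password):
```
OBS_POSTGRES_URL=postgresql+asyncpg://obs:<generated password 3>@postgres/obs
```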
#### Start Postgres for the portal
```
cd /opt/openbikesensor/
docker-compose up -d postgres
docker-compose logs -f postgres
```
Wait until started:
> postgres_1 | PostgreSQL init process complete; ready for start up.
#### Build the portal image
```bash
cd /opt/openbikesensor/
docker-compose build portal
```
*Hint*: This may take up to 10 minutes. In the future, we will provide a prebuilt image.
#### Prepare database
Run the following scripts to prepare the database:
```bash
docker-compose run --rm portal tools/upgrade.py
```
For more details, see [README.md](../README.md) under "Prepare database".
#### Import OpenStreetMap data
Follow [these instructions](./osm-import.md).
#### Configure portal
The portal can be configured via env vars or via `config.py`.
It's important to know that `config.py` overrides the env vars.
All env vars start with `OBS_` and are handled by the application without the prefix.
For example, the env var `OBS_SECRET` corresponds to `SECRET` in `config.py` and is available as `SECRET` within the application.
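For example, the following two settings are equivalent (illustrative placeholder value), with the `config.py` version taking precedence:
```
# .env
OBS_SECRET=<your secret>
```
```python
# config.py
SECRET = "<your secret>"
```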
```bash
cd /opt/openbikesensor/
nano .env
```
Configure:
* `OBS_PORTAL_URI`:
  * The subdomain of your portal
* `OBS_SECRET`:
  * Generate a UUID with `uuidgen` and use it as the secret
* `OBS_POSTGRES_URL`:
  * Should be configured already
* `OBS_KEYCLOAK_URL`:
  * You can find it as the `issuer` when you click on *OpenID Endpoint Configuration* in the realm `obs`
* `OBS_KEYCLOAK_CLIENT_SECRET`:
  * Should be configured already
* `OBS_DEDICATED_WORKER`:
  * Should be set to `"True"`, since the worker will be started alongside the portal
* `OBS_DATA_DIR`:
  * The data dir must be the same for the portal and the worker.
    The default is `/data` within the containers
* `OBS_PROXIES_COUNT`:
  * This sets `PROXIES_COUNT = 1` in your config (see the sketch below)
  * Read the [Sanic docs](https://sanicframework.org/en/guide/advanced/proxy-headers.html)
    for why this needs to be done. If your reverse proxy supports it, you can also
    use a forwarded secret to secure your proxy target from spoofing. This is not
    required if your application server does not listen on a public interface, but
    it is recommended anyway, if possible.
Have a look at `config.py` to see which other variables may affect you.
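As a sketch, the corresponding proxy-related entries in `config.py` could look like this; the forwarded-secret line is optional and only applies if your reverse proxy injects a matching secret header (see the Sanic docs linked above):
```python
# config.py
PROXIES_COUNT = 1  # number of trusted reverse proxies in front of the app (here: traefik)
# Optional hardening against spoofed forwarded headers:
# FORWARDED_SECRET = "<random string shared with the proxy>"
```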
#### Start the portal
```bash
cd /opt/openbikesensor/
docker-compose up -d portal
docker-compose logs -f portal worker
```
> portal_1 | [2022-01-03 13:37:48 +0000] [1] [INFO] Goin' Fast @ http://0.0.0.0:3000
This also starts a dedicated worker container to handle the tracks.
#### Test the portal
* Open: https://portal.example.com/ (URL depends on your setup)
* Login with the user
* Upload a track via My Tracks
You should see something like:
> worker_1 | INFO: Track uuqvcvlm imported.
When you click on *My Tracks*, you should see it on a map.
#### Configure the map position
Open the *Map* tab and zoom to the desired position. The URL contains the corresponding GPS position,
for example:
> 14/53.86449349032097/10.696108517499198
Configure the map position by setting `mapHome` in the `FRONTEND_CONFIG` variable in `config.py`, then restart the portal:
```
cd /opt/openbikesensor/
nano config/config.py
docker-compose restart portal
```
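The corresponding entry in `config.py` could look like this, reusing the zoom, latitude and longitude from the URL hash above (values illustrative):
```python
FRONTEND_CONFIG = {
    # ... keep the other keys of your existing FRONTEND_CONFIG ...
    "mapHome": {"zoom": 14, "latitude": 53.8645, "longitude": 10.6961},
}
```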
**Hint**: You may need to bypass your browser cache to see the change.
The *Map* tab should now open at the configured position.
Once you have uploaded some tracks, your map should show a color overlay on the streets.
## Miscellaneous
### Logs
To read the logs, run
```bash
docker-compose logs -f
```
If something went wrong, you can reconfigure your config files and rerun:
```bash
docker-compose up -d
```
### Updates
Before updating, make sure that you have properly backed up your instance so you
can always roll back to a pre-update state.
#### Migrating
Migrations are done with
[Alembic](https://alembic.sqlalchemy.org/en/latest/index.html), please refer to
its documentation for help. Most of the time, running this command will do all
the migrations you need:
```bash
docker-compose run --rm portal alembic upgrade head
```
You are advised to create a backup (see below) before running a migration, and
to shut down the services before the migration and start them afterwards.
### Backups
To back up your instance's private data, you only need to back up the working
directory (`/opt/openbikesensor` in this guide). This should contain everything
needed to start your instance again; no persistent data lives in docker
containers. You should stop the containers for a clean backup.
This backup contains the imported OSM data as well. That is of course a lot of
redundant data, but very nice to have for a quick restore operation. If you
want to generate smaller, nonredundant backups, or backups during live
operation of the database, use a tool like `pg_dump` and extract only the
required tables:
* `road_usage`
* `overtaking_event`
* `track`
* `user` (make sure to reference `public.user`, not the postgres user table)
* `comment`
You might also instead use the `--exclude-table` option to ignore the `road`
table only (adjust connection parameters and names):
```bash
pg_dump -h localhost -d obs -U obs -n public -T road -f backup-`date +%F`.sql
```
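To restore such a dump into a fresh database, a plain `psql` invocation should work (adjust connection parameters and the file name):
```bash
psql -h localhost -d obs -U obs -f backup-<date>.sql
```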
Also back up the raw uploaded files, i.e. the `data/api-data/tracks`
directory. The processed data can be regenerated, but you can also back that
up, from `data/api-data/processing-output`.
Finally, make sure to create a backup of your keycloak instance. Refer to the
keycloak documentation for how to export its data in a restorable way. This
should work very well if you are storing keycloak data in PostgreSQL and
exporting that with an exclusion pattern instead of an explicit list.
And then, please test your backup and restore strategy before going live, or at
least before you need it!
### Connecting to the PostgreSQL database
Here are the quick steps for connecting to your PostgreSQL database, should you
need that:
* Add the `gateway` network to your `postgres` service.
* Add a port forwarding to your `postgres` service:
```yaml
ports:
- 127.0.0.1:25432:5432
```
* Run `docker-compose up -d postgres` again
* You can now connect from your server to the PostgreSQL service with:
```
psql -h localhost -U obs -d obs -p 25432
```
You will need your database password for the connection.
* If you do not want to install `psql` outside your container, you can use an
SSH tunnel from your local machine to your server and run `psql` locally.
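For example (hypothetical user and host; the local port matches the forwarding above):
```bash
# on your local machine
ssh -L 25432:localhost:25432 user@portal.example.com
psql -h localhost -p 25432 -U obs -d obs
```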


@ -12,7 +12,7 @@
   "obsMapSource": {
     "type": "vector",
     "tiles": ["https://portal.example.com/tiles/{z}/{x}/{y}.pbf"],
-    "minzoom": 12,
+    "minzoom": 0,
     "maxzoom": 14
   }
 }

File diff suppressed because it is too large


@ -12,7 +12,11 @@
     "classnames": "^2.3.1",
     "colormap": "^2.3.2",
     "downloadjs": "^1.4.7",
+    "echarts": "^5.3.2",
+    "echarts-for-react": "^3.0.2",
     "fomantic-ui-less": "^2.8.8",
+    "i18next-browser-languagedetector": "^6.1.4",
+    "i18next-http-backend": "^1.4.1",
     "immer": "^9.0.7",
     "luxon": "^1.28.0",
     "maplibre-gl": "^1.15.2",
@ -25,7 +29,9 @@
     "proj4": "^2.7.5",
     "react": "^17.0.2",
     "react-dom": "^17.0.2",
+    "react-helmet": "^6.1.0",
     "react-hook-form": "^6.15.8",
+    "react-i18next": "^11.18.1",
     "react-map-gl": "^6.1.17",
     "react-markdown": "^5.0.3",
     "react-redux": "^7.2.6",
@ -38,11 +44,13 @@
     "sass": "^1.43.5",
     "semantic-ui-react": "^2.0.4",
     "ts-loader": "^9.2.6",
-    "typescript": "^4.5.2"
+    "typescript": "^4.7.4",
+    "yaml-loader": "^0.8.0"
   },
   "eslintConfig": {
     "extends": [
-      "react-app"
+      "react-app",
+      "plugin:prettier/recommended"
     ]
   },
   "browserslist": {
@ -72,8 +80,12 @@
     "@types/react-router-dom": "^5.3.2",
     "babel-loader": "^8.2.3",
     "css-loader": "^5.2.7",
+    "eslint-config-prettier": "^8.5.0",
+    "eslint-config-react-app": "^7.0.1",
+    "eslint-plugin-prettier": "^4.2.1",
     "html-webpack-plugin": "^5.5.0",
     "less-loader": "^10.2.0",
+    "prettier": "^2.7.1",
     "react-refresh": "^0.11.0",
     "style-loader": "^3.3.1",
     "webpack": "^5.64.4",

@ -69,7 +69,6 @@
 }
 .pageTitle a {
-  font-family: 'Open Sans Condensed';
   font-weight: 600;
   font-size: 18pt;
@ -120,6 +119,15 @@
   }
 }
+@media @mobile {
+  .menu.menu {
+    > :global(.ui.container) {
+      height: @menuHeightMobile;
+      align-items: stretch;
+    }
+  }
+}
 .banner {
   padding: 8px;
   z-index: 100;


@ -6,11 +6,16 @@ import {BrowserRouter as Router, Switch, Route, Link} from 'react-router-dom'
 import {useObservable} from 'rxjs-hooks'
 import {from} from 'rxjs'
 import {pluck} from 'rxjs/operators'
+import {Helmet} from 'react-helmet'
+import {useTranslation} from 'react-i18next'
 import {useConfig} from 'config'
 import styles from './App.module.less'
+import {AVAILABLE_LOCALES, setLocale} from 'i18n'
 import {
+  AcknowledgementsPage,
+  ExportPage,
   HomePage,
   LoginRedirectPage,
   LogoutPage,
@ -21,6 +26,7 @@ import {
   TrackPage,
   TracksPage,
   UploadPage,
+  MyTracksPage,
 } from 'pages'
 import {Avatar, LoginButton} from 'components'
 import api from 'api'
@ -56,44 +62,60 @@ function Banner({text, style = 'warning'}: {text: string; style: 'warning' | 'in
 }
 const App = connect((state) => ({login: state.login}))(function App({login}) {
+  const {t} = useTranslation()
   const config = useConfig()
   const apiVersion = useObservable(() => from(api.get('/info')).pipe(pluck('version')))
+  const hasMap = Boolean(config?.obsMapSource)
   React.useEffect(() => {
     api.loadUser()
   }, [])
   return config ? (
     <Router basename={config.basename}>
+      <Helmet>
+        <meta charSet="utf-8" />
+        <title>OpenBikeSensor Portal</title>
+      </Helmet>
       {config?.banner && <Banner {...config.banner} />}
-      <Menu className={styles.menu}>
+      <Menu className={styles.menu} stackable>
         <Container>
           <Link to="/" component={MenuItemForLink} header className={styles.pageTitle}>
             OpenBikeSensor
           </Link>
-          {config?.obsMapSource && (
+          {hasMap && (
             <Link component={MenuItemForLink} to="/map" as="a">
-              Map
+              {t('App.menu.map')}
             </Link>
           )}
           <Link component={MenuItemForLink} to="/tracks" as="a">
-            Tracks
+            {t('App.menu.tracks')}
+          </Link>
+          <Link component={MenuItemForLink} to="/export" as="a">
+            {t('App.menu.export')}
           </Link>
           <Menu.Menu position="right">
             {login ? (
               <>
                 <Link component={MenuItemForLink} to="/my/tracks" as="a">
-                  My Tracks
+                  {t('App.menu.myTracks')}
                 </Link>
                 <Dropdown item trigger={<Avatar user={login} className={styles.avatar} />}>
                   <Dropdown.Menu>
-                    <Link to="/upload" component={DropdownItemForLink} icon="cloud upload" text="Upload tracks" />
-                    <Link to="/settings" component={DropdownItemForLink} icon="cog" text="Settings" />
+                    <Link
+                      to="/upload"
+                      component={DropdownItemForLink}
+                      icon="cloud upload"
+                      text={t('App.menu.uploadTracks')}
+                    />
+                    <Link to="/settings" component={DropdownItemForLink} icon="cog" text={t('App.menu.settings')} />
                     <Dropdown.Divider />
-                    <Link to="/logout" component={DropdownItemForLink} icon="sign-out" text="Logout" />
+                    <Link to="/logout" component={DropdownItemForLink} icon="sign-out" text={t('App.menu.logout')} />
                   </Dropdown.Menu>
                 </Dropdown>
               </>
@ -110,14 +132,16 @@ const App = connect((state) => ({login: state.login}))(function App({login}) {
           <Route path="/" exact>
             <HomePage />
           </Route>
+          {hasMap && (
             <Route path="/map" exact>
               <MapPage />
             </Route>
+          )}
           <Route path="/tracks" exact>
             <TracksPage />
           </Route>
           <Route path="/my/tracks" exact>
-            <TracksPage privateTracks />
+            <MyTracksPage />
           </Route>
           <Route path={`/tracks/:slug`} exact>
             <TrackPage />
@ -125,6 +149,12 @@ const App = connect((state) => ({login: state.login}))(function App({login}) {
           <Route path={`/tracks/:slug/edit`} exact>
             <TrackEditor />
           </Route>
+          <Route path="/export" exact>
+            <ExportPage />
+          </Route>
+          <Route path="/acknowledgements" exact>
+            <AcknowledgementsPage />
+          </Route>
           <Route path="/redirect" exact>
             <LoginRedirectPage />
           </Route>
@ -151,7 +181,7 @@ const App = connect((state) => ({login: state.login}))(function App({login}) {
         <Grid columns={4} stackable>
           <Grid.Row>
             <Grid.Column>
-              <Header as="h5">About the project</Header>
+              <Header as="h5">{t('App.footer.aboutTheProject')}</Header>
               <List>
                 <List.Item>
                   <a href="https://openbikesensor.org/" target="_blank" rel="noreferrer">
@ -162,56 +192,68 @@ const App = connect((state) => ({login: state.login}))(function App({login}) {
             </Grid.Column>
             <Grid.Column>
-              <Header as="h5">Get involved</Header>
+              <Header as="h5">{t('App.footer.getInvolved')}</Header>
               <List>
                 <List.Item>
                   <a href="https://forum.openbikesensor.org/" target="_blank" rel="noreferrer">
-                    Get help in forum
+                    {t('App.footer.getHelpInForum')}
                   </a>
                 </List.Item>
                 <List.Item>
                   <a href="https://github.com/openbikesensor/portal/issues/new" target="_blank" rel="noreferrer">
-                    Report an issue
+                    {t('App.footer.reportAnIssue')}
                   </a>
                 </List.Item>
                 <List.Item>
                   <a href="https://github.com/openbikesensor/portal" target="_blank" rel="noreferrer">
-                    Development
+                    {t('App.footer.development')}
                   </a>
                 </List.Item>
               </List>
             </Grid.Column>
             <Grid.Column>
-              <Header as="h5">This installation</Header>
+              <Header as="h5">{t('App.footer.thisInstallation')}</Header>
               <List>
                 <List.Item>
                   <a href={config?.privacyPolicyUrl} target="_blank" rel="noreferrer">
-                    Privacy policy
+                    {t('App.footer.privacyPolicy')}
                   </a>
                 </List.Item>
                 <List.Item>
                   <a href={config?.imprintUrl} target="_blank" rel="noreferrer">
-                    Imprint
+                    {t('App.footer.imprint')}
+                  </a>
+                </List.Item>
+                {config?.termsUrl && (
+                  <List.Item>
+                    <a href={config?.termsUrl} target="_blank" rel="noreferrer">
+                      {t('App.footer.terms')}
+                    </a>
+                  </List.Item>
+                )}
+                <List.Item>
+                  <a
+                    href={`https://github.com/openbikesensor/portal${
+                      apiVersion ? `/releases/tag/${apiVersion}` : ''
+                    }`}
+                    target="_blank"
+                    rel="noreferrer"
+                  >
+                    {apiVersion ? t('App.footer.version', {apiVersion}) : t('App.footer.versionLoading')}
                   </a>
                 </List.Item>
               </List>
             </Grid.Column>
             <Grid.Column>
-              <Header as="h5">Info</Header>
+              <Header as="h5">{t('App.footer.changeLanguage')}</Header>
               <List>
-                <List.Item>
-                  <a
-                    href={`https://github.com/openbikesensor/portal${
-                      apiVersion ? `/releases/tag/v${apiVersion}` : ''
-                    }`}
-                    target="_blank"
-                    rel="noreferrer"
-                  >
-                    {apiVersion ? `v${apiVersion}` : 'Fetching version...'}
-                  </a>
+                {AVAILABLE_LOCALES.map((locale) => (
+                  <List.Item key={locale}>
+                    <a onClick={() => setLocale(locale)}>{t(`locales.${locale}`)}</a>
                   </List.Item>
+                ))}
               </List>
             </Grid.Column>
           </Grid.Row>


@ -19,21 +19,21 @@ function getColor(s) {
 }
 export default function Avatar({user, className}) {
-  const {image, username} = user || {}
+  const {image, displayName} = user || {}
   if (image) {
     return <Comment.Avatar src={image} className={className} />
   }
-  if (!username) {
+  if (!displayName) {
     return <div className={classnames(className, 'avatar', 'empty-avatar')} />
   }
-  const color = getColor(username)
+  const color = getColor(displayName)
   return (
     <div className={classnames(className, 'avatar', 'text-avatar')} style={{background: color}}>
-      {username && <span>{username[0]}</span>}
+      {displayName && <span>{displayName[0]}</span>}
     </div>
   )
 }


@ -0,0 +1,77 @@
import React from 'react'
import ReactEChartsCore from 'echarts-for-react/lib/core'
import * as echarts from 'echarts/core'
import {
  // LineChart,
  BarChart,
  // PieChart,
  // ScatterChart,
  // RadarChart,
  // MapChart,
  // TreeChart,
  // TreemapChart,
  // GraphChart,
  // GaugeChart,
  // FunnelChart,
  // ParallelChart,
  // SankeyChart,
  // BoxplotChart,
  // CandlestickChart,
  // EffectScatterChart,
  // LinesChart,
  // HeatmapChart,
  // PictorialBarChart,
  // ThemeRiverChart,
  // SunburstChart,
  // CustomChart,
} from 'echarts/charts'

// import components, all suffixed with Component
import {
  // GridSimpleComponent,
  GridComponent,
  // PolarComponent,
  // RadarComponent,
  // GeoComponent,
  // SingleAxisComponent,
  // ParallelComponent,
  // CalendarComponent,
  // GraphicComponent,
  // ToolboxComponent,
  TooltipComponent,
  // AxisPointerComponent,
  // BrushComponent,
  TitleComponent,
  // TimelineComponent,
  // MarkPointComponent,
  // MarkLineComponent,
  // MarkAreaComponent,
  // LegendComponent,
  // LegendScrollComponent,
  // LegendPlainComponent,
  // DataZoomComponent,
  // DataZoomInsideComponent,
  // DataZoomSliderComponent,
  // VisualMapComponent,
  // VisualMapContinuousComponent,
  // VisualMapPiecewiseComponent,
  // AriaComponent,
  // TransformComponent,
  DatasetComponent,
} from 'echarts/components'

// Import renderer; note that introducing the CanvasRenderer or SVGRenderer is a required step
import {
  CanvasRenderer,
  // SVGRenderer,
} from 'echarts/renderers'

// Register the required components
echarts.use([TitleComponent, TooltipComponent, GridComponent, BarChart, CanvasRenderer])

// The usage of ReactEChartsCore is the same as above.
export default function Chart(props) {
  return <ReactEChartsCore echarts={echarts} notMerge lazyUpdate {...props} />
}


@ -1,15 +1,78 @@
-import {useMemo} from "react";
-type ColorMap = [number, string][]
+import React, {useMemo} from 'react'
 import styles from './ColorMapLegend.module.less'
-export default function ColorMapLegend({map}: {map: ColorMap}) {
+type ColorMap = [number, string][]
+function* pairs(arr) {
+  for (let i = 1; i < arr.length; i++) {
+    yield [arr[i - 1], arr[i]]
+  }
+}
+function* zip(...arrs) {
+  const l = Math.min(...arrs.map((a) => a.length))
+  for (let i = 0; i < l; i++) {
+    yield arrs.map((a) => a[i])
+  }
+}
+export function DiscreteColorMapLegend({map}: {map: ColorMap}) {
+  const colors: string[] = map.filter((x, i) => i % 2 == 0) as any
+  const stops: number[] = map.filter((x, i) => i % 2 == 1) as any
+  let min = stops[0]
+  let max = stops[stops.length - 1]
+  const buffer = (max - min) / (stops.length - 1) / 2
+  min -= buffer
+  max += buffer
+  const normalizeValue = (v) => (v - min) / (max - min)
+  const stopPairs = Array.from(pairs([min, ...stops, max]))
+  const gradientId = useMemo(() => `gradient${Math.floor(Math.random() * 1000000)}`, [])
+  const gradientUrl = `url(#${gradientId})`
+  const parts = Array.from(zip(stopPairs, colors))
+  return (
+    <div className={styles.colorMapLegend}>
+      <svg width="100%" height="20" version="1.1" xmlns="http://www.w3.org/2000/svg">
+        <defs>
+          <linearGradient id={gradientId} x1="0" x2="1" y1="0" y2="0">
+            {parts.map(([[left, right], color]) => (
+              <React.Fragment key={left}>
+                <stop offset={normalizeValue(left) * 100 + '%'} stopColor={color} />
+                <stop offset={normalizeValue(right) * 100 + '%'} stopColor={color} />
+              </React.Fragment>
+            ))}
+          </linearGradient>
+        </defs>
+        <rect id="rect1" x="0" y="0" width="100%" height="100%" fill={gradientUrl} />
+      </svg>
+      {stops.map((value) => (
+        <span className={styles.tick} key={value} style={{left: normalizeValue(value) * 100 + '%'}}>
+          {value.toFixed(2)}
+        </span>
+      ))}
+    </div>
+  )
+}
+export default function ColorMapLegend({
+  map,
+  twoTicks = false,
+  digits = 2,
+}: {
+  map: ColorMap
+  twoTicks?: boolean
+  digits?: number
+}) {
   const min = map[0][0]
   const max = map[map.length - 1][0]
   const normalizeValue = (v) => (v - min) / (max - min)
-  const gradientId = useMemo(() => `gradient${Math.floor(Math.random() * 1000000)}`, []);
-  const gradientUrl = `url(#${gradientId})`;
+  const gradientId = useMemo(() => `gradient${Math.floor(Math.random() * 1000000)}`, [])
+  const gradientUrl = `url(#${gradientId})`
+  const tickValues = twoTicks ? [map[0], map[map.length - 1]] : map
   return (
     <div className={styles.colorMapLegend}>
       <svg width="100%" height="20" version="1.1" xmlns="http://www.w3.org/2000/svg">
@ -23,9 +86,9 @@ export default function ColorMapLegend({map}: {map: ColorMap}) {
         <rect id="rect1" x="0" y="0" width="100%" height="100%" fill={gradientUrl} />
       </svg>
-      {map.map(([value]) => (
+      {tickValues.map(([value]) => (
         <span className={styles.tick} key={value} style={{left: normalizeValue(value) * 100 + '%'}}>
-          {value.toFixed(2)}
+          {value.toFixed(digits)}
         </span>
       ))}
     </div>


@ -1,9 +1,11 @@
 import React from 'react'
 import {Icon, Segment, Header, Button} from 'semantic-ui-react'
+import {useTranslation} from 'react-i18next'
 import {FileDrop} from 'components'
 export default function FileUploadField({onSelect: onSelect_, multiple}) {
+  const {t} = useTranslation()
   const labelRef = React.useRef()
   const [labelRefState, setLabelRefState] = React.useState()
@ -31,7 +33,14 @@ export default function FileUploadField({onSelect: onSelect_, multiple}) {
       <input
         type="file"
         id="upload-field"
-        style={{width: 0, height: 0, position: 'fixed', left: -1000, top: -1000, opacity: 0.001}}
+        style={{
+          width: 0,
+          height: 0,
+          position: 'fixed',
+          left: -1000,
+          top: -1000,
+          opacity: 0.001,
+        }}
         multiple={multiple}
         accept=".csv"
         onChange={onChangeField}
@ -50,11 +59,11 @@ export default function FileUploadField({onSelect: onSelect_, multiple}) {
       >
         <Header icon>
           <Icon name="cloud upload" />
-          Drop file{multiple ? 's' : ''} here or click to select {multiple ? 'them' : 'one'} for upload
+          {multiple ? t('FileUploadField.dropOrClickMultiple') : t('FileUploadField.dropOrClick')}
         </Header>
         <Button primary as="span">
-          Upload file{multiple ? 's' : ''}
+          {multiple ? t('FileUploadField.uploadFiles') : t('FileUploadField.uploadFile')}
         </Button>
       </Segment>
     )}


@ -1,4 +1,5 @@
 import {DateTime} from 'luxon'
+import {useTranslation} from 'react-i18next'
 export default function FormattedDate({date, relative = false}) {
   if (date == null) {
@ -10,11 +11,19 @@
   let str
+  const {i18n} = useTranslation()
+  const locale = i18n.language
   if (relative) {
-    str = dateTime.toRelative()
+    str = dateTime.setLocale(locale).toRelative()
   } else {
-    str = dateTime.toLocaleString(DateTime.DATETIME_MED)
+    str = dateTime.setLocale(locale).toLocaleString(DateTime.DATETIME_MED)
   }
-  return <span title={dateTime.toISO()}>{str}</span>
+  const iso = dateTime.toISO()
+  return (
+    <time dateTime={iso} title={iso}>
+      {str}
+    </time>
+  )
 }


@ -1,9 +1,11 @@
 import React from 'react'
 import {Button} from 'semantic-ui-react'
+import {useTranslation} from 'react-i18next'
 import api from 'api'
 export default function LoginButton(props) {
+  const {t} = useTranslation()
   const [busy, setBusy] = React.useState(false)
   const onClick = React.useCallback(
@ -19,7 +21,7 @@
   return (
     <Button onClick={busy ? null : onClick} loading={busy} {...props}>
-      Login
+      {t('LoginButton.login')}
     </Button>
   )
 }


@ -2,12 +2,13 @@ import React, {useState, useCallback, useMemo, useEffect} from 'react'
 import classnames from 'classnames'
 import {connect} from 'react-redux'
 import _ from 'lodash'
-import ReactMapGl, {WebMercatorViewport, ScaleControl, NavigationControl} from 'react-map-gl'
+import ReactMapGl, {WebMercatorViewport, ScaleControl, NavigationControl, AttributionControl} from 'react-map-gl'
 import turfBbox from '@turf/bbox'
 import {useHistory, useLocation} from 'react-router-dom'
 import {useConfig} from 'config'
+import {useCallbackRef} from '../../utils'
 import {baseMapStyles} from '../../mapstyles'
 import styles from './styles.module.less'
@ -19,11 +20,13 @@ interface Viewport {
 }
 const EMPTY_VIEWPORT: Viewport = {longitude: 0, latitude: 0, zoom: 0}
-export const withBaseMapStyle = connect((state) => ({baseMapStyle: state.mapConfig?.baseMap?.style ?? 'positron'}))
+export const withBaseMapStyle = connect((state) => ({
+  baseMapStyle: state.mapConfig?.baseMap?.style ?? 'positron',
+}))
 function parseHash(v: string): Viewport | null {
   if (!v) return null
-  const m = v.match(/^#([0-9\.]+)\/([0-9\.]+)\/([0-9\.]+)$/)
+  const m = v.match(/^#([0-9\.]+)\/([0-9\.\-]+)\/([0-9\.\-]+)$/)
   if (!m) return null
   return {
     zoom: Number.parseFloat(m[1]),
@ -36,19 +39,32 @@ function buildHash(v: Viewport): string {
   return `${v.zoom.toFixed(2)}/${v.latitude}/${v.longitude}`
 }
+const setViewportToHash = _.debounce((history, viewport) => {
+  history.replace({
+    hash: buildHash(viewport),
+  })
+}, 200)
 function useViewportFromUrl(): [Viewport | null, (v: Viewport) => void] {
   const history = useHistory()
   const location = useLocation()
-  const value = useMemo(() => parseHash(location.hash), [location.hash])
+  const [cachedValue, setCachedValue] = useState(parseHash(location.hash))
+  // when the location hash changes, set the new value to the cache
+  useEffect(() => {
+    setCachedValue(parseHash(location.hash))
+  }, [location.hash])
   const setter = useCallback(
     (v) => {
-      history.replace({
-        hash: buildHash(v),
-      })
+      setCachedValue(v)
+      setViewportToHash(history, v)
     },
     [history]
   )
-  return [value || EMPTY_VIEWPORT, setter]
+  return [cachedValue || EMPTY_VIEWPORT, setter]
 }
 function Map({
@ -56,17 +72,28 @@ function Map({
   children,
   boundsFromJson,
   baseMapStyle,
+  hasToolbar,
+  onViewportChange,
   ...props
 }: {
   viewportFromUrl?: boolean
   children: React.ReactNode
   boundsFromJson: GeoJSON.Geometry
   baseMapStyle: string
+  hasToolbar?: boolean
+  onViewportChange: (viewport: Viewport) => void
 }) {
   const [viewportState, setViewportState] = useState(EMPTY_VIEWPORT)
   const [viewportUrl, setViewportUrl] = useViewportFromUrl()
-  const [viewport, setViewport] = viewportFromUrl ? [viewportUrl, setViewportUrl] : [viewportState, setViewportState]
+  const [viewport, setViewport_] = viewportFromUrl ? [viewportUrl, setViewportUrl] : [viewportState, setViewportState]
+  const setViewport = useCallback(
+    (viewport: Viewport) => {
+      setViewport_(viewport)
+      onViewportChange?.(viewport)
+    },
+    [setViewport_, onViewportChange]
+  )
   const config = useConfig()
   useEffect(() => {
@ -75,10 +102,29 @@ function Map({
     }
   }, [config, boundsFromJson])
+  const mapSourceHosts = useMemo(
+    () => _.uniq(config?.obsMapSource?.tiles?.map((tileUrl: string) => new URL(tileUrl).host) ?? []),
+    [config?.obsMapSource]
+  )
+  const transformRequest = useCallbackRef((url, resourceType) => {
+    if (resourceType === 'Tile' && mapSourceHosts.includes(new URL(url).host)) {
+      return {
+        url,
+        credentials: 'include',
+      }
+    }
+  })
   useEffect(() => {
     if (boundsFromJson) {
-      const [minX, minY, maxX, maxY] = turfBbox(boundsFromJson)
-      const vp = new WebMercatorViewport({width: 1000, height: 800}).fitBounds(
+      const bbox = turfBbox(boundsFromJson)
+      if (bbox.every((v) => Math.abs(v) !== Infinity)) {
+        const [minX, minY, maxX, maxY] = bbox
+        const vp = new WebMercatorViewport({
+          width: 1000,
+          height: 800,
+        }).fitBounds(
         [
           [minX, minY],
           [maxX, maxY],
@ -90,6 +136,7 @@ function Map({
       )
       setViewport(_.pick(vp, ['zoom', 'latitude', 'longitude']))
       }
+    }
   }, [boundsFromJson])
   return (
@ -98,13 +145,15 @@ function Map({
       width="100%"
       height="100%"
       onViewportChange={setViewport}
+      {...{transformRequest}}
      {...viewport}
       {...props}
       className={classnames(styles.map, props.className)}
+      attributionControl={false}
     >
-      <NavigationControl style={{left: 10, top: 10}} />
-      <ScaleControl maxWidth={200} unit="metric" style={{left: 10, bottom: 10}} />
+      <AttributionControl style={{top: 0, right: 0}} />
+      <NavigationControl showCompass={false} style={{left: 16, top: hasToolbar ? 64 : 16}} />
+      <ScaleControl maxWidth={200} unit="metric" style={{left: 16, bottom: 16}} />
       {children}
     </ReactMapGl>
   )


@@ -1,6 +1,7 @@
 import React from 'react'
 import classnames from 'classnames'
 import {Container} from 'semantic-ui-react'
+import {Helmet} from 'react-helmet'

 import styles from './Page.module.less'

@@ -9,13 +10,21 @@ export default function Page({
   children,
   fullScreen,
   stage,
+  title,
 }: {
   small?: boolean
   children: ReactNode
   fullScreen?: boolean
   stage?: ReactNode
+  title?: string
 }) {
   return (
+    <>
+      {title && (
+        <Helmet>
+          <title>{title} - OpenBikeSensor Portal</title>
+        </Helmet>
+      )}
     <main
       className={classnames(
         styles.page,
@@ -27,5 +36,6 @@ export default function Page({
       {stage}
       {fullScreen ? children : <Container>{children}</Container>}
     </main>
+    </>
   )
 }
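A usage sketch for the new title prop (the page name is invented); react-helmet swaps the document title while the page is mounted:

import React from 'react'
import {Page} from 'components'

// Renders <title>Example - OpenBikeSensor Portal</title> in the document head.
export default function ExamplePage() {
  return <Page title="Example">{/* page content */}</Page>
}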

View file

@@ -0,0 +1,73 @@
import React, {useState, useCallback} from 'react'
import {pickBy} from 'lodash'
import {Loader, Statistic, Pagination, Segment, Header, Menu, Table, Icon} from 'semantic-ui-react'
import {useObservable} from 'rxjs-hooks'
import {of, from, concat, combineLatest} from 'rxjs'
import {map, switchMap, distinctUntilChanged} from 'rxjs/operators'
import {Duration, DateTime} from 'luxon'
import api from 'api'
import {useTranslation} from 'react-i18next'
function formatDuration(seconds) {
return (
Duration.fromMillis((seconds ?? 0) * 1000)
.as('hours')
.toFixed(1) + ' h'
)
}
export default function Stats() {
const {t} = useTranslation()
const [page, setPage] = useState(1)
const PER_PAGE = 10
const stats = useObservable(
() => of(null).pipe(switchMap(() => concat(of(null), from(api.get('/stats/regions'))))),
null
)
const pageCount = stats ? Math.ceil(stats.length / PER_PAGE) : 1
return (
<>
<Header as="h2">{t('RegionStats.title')}</Header>
<div>
<Loader active={stats == null} />
<Table celled>
<Table.Header>
<Table.Row>
<Table.HeaderCell> {t('RegionStats.regionName')}</Table.HeaderCell>
<Table.HeaderCell>{t('RegionStats.eventCount')}</Table.HeaderCell>
</Table.Row>
</Table.Header>
<Table.Body>
{stats?.slice((page - 1) * PER_PAGE, page * PER_PAGE)?.map((area) => (
<Table.Row key={area.id}>
<Table.Cell>{area.name}</Table.Cell>
<Table.Cell>{area.overtaking_event_count}</Table.Cell>
</Table.Row>
))}
</Table.Body>
{pageCount > 1 && (
<Table.Footer>
<Table.Row>
<Table.HeaderCell colSpan="2">
<Pagination
floated="right"
activePage={page}
totalPages={pageCount}
onPageChange={(e, data) => setPage(data.activePage as number)}
/>
</Table.HeaderCell>
</Table.Row>
</Table.Footer>
)}
</Table>
</div>
</>
)
}
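The table above paginates purely client-side: the full region list is fetched once and sliced per page. Extracted as a sketch (the helper name is invented):

// Page numbers are 1-indexed, matching semantic-ui's Pagination.
function paginate<T>(items: T[], page: number, perPage = 10): T[] {
  return items.slice((page - 1) * perPage, page * perPage)
}

// paginate(regions, 2) returns elements 10..19; the footer only renders
// when Math.ceil(items.length / perPage) > 1.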

View file

@@ -5,6 +5,7 @@ import {useObservable} from 'rxjs-hooks'
 import {of, from, concat, combineLatest} from 'rxjs'
 import {map, switchMap, distinctUntilChanged} from 'rxjs/operators'
 import {Duration, DateTime} from 'luxon'
+import {useTranslation} from 'react-i18next'

 import api from 'api'

@@ -17,6 +18,7 @@ function formatDuration(seconds) {
 }

 export default function Stats({user = null}: {user?: null | string}) {
+  const {t} = useTranslation()
   const [timeframe, setTimeframe] = useState('all_time')
   const onClick = useCallback((_e, {name}) => setTimeframe(name), [setTimeframe])

@@ -63,49 +65,56 @@ export default function Stats({user = null}: {user?: null | string}) {
     [timeframe, user]
   )

+  const placeholder = t('Stats.placeholder')
+
   return (
     <>
-      <Header as="h2">{user && 'My '}Statistics</Header>
       <div>
         <Segment attached="top">
           <Loader active={stats == null} />
           <Statistic.Group widths={2} size="tiny">
             <Statistic>
-              <Statistic.Value>{stats ? `${Number(stats?.trackLength / 1000).toFixed(1)} km` : '...'}</Statistic.Value>
-              <Statistic.Label>Total track length</Statistic.Label>
+              <Statistic.Value>
+                {stats ? `${Number(stats?.trackLength / 1000).toFixed(1)} km` : placeholder}
+              </Statistic.Value>
+              <Statistic.Label>{t('Stats.totalTrackLength')}</Statistic.Label>
             </Statistic>
             <Statistic>
-              <Statistic.Value>{stats ? formatDuration(stats?.trackDuration) : '...'}</Statistic.Value>
-              <Statistic.Label>Time recorded</Statistic.Label>
+              <Statistic.Value>{stats ? formatDuration(stats?.trackDuration) : placeholder}</Statistic.Value>
+              <Statistic.Label>{t('Stats.timeRecorded')}</Statistic.Label>
             </Statistic>
             <Statistic>
-              <Statistic.Value>{stats?.numEvents ?? '...'}</Statistic.Value>
-              <Statistic.Label>Events confirmed</Statistic.Label>
+              <Statistic.Value>{stats?.numEvents ?? placeholder}</Statistic.Value>
+              <Statistic.Label>{t('Stats.eventsConfirmed')}</Statistic.Label>
             </Statistic>
-            {user ? (
             <Statistic>
-              <Statistic.Value>{stats?.trackCount ?? '...'}</Statistic.Value>
-              <Statistic.Label>Tracks recorded</Statistic.Label>
+              <Statistic.Value>{stats?.trackCount ?? placeholder}</Statistic.Value>
+              <Statistic.Label>{t('Stats.tracksRecorded')}</Statistic.Label>
             </Statistic>
-            ) : (
+            {!user && (
+              <>
                 <Statistic>
-                  <Statistic.Value>{stats?.userCount ?? '...'}</Statistic.Value>
-                  <Statistic.Label>Members joined</Statistic.Label>
+                  <Statistic.Value>{stats?.userCount ?? placeholder}</Statistic.Value>
+                  <Statistic.Label>{t('Stats.membersJoined')}</Statistic.Label>
                 </Statistic>
+                <Statistic>
+                  <Statistic.Value>{stats?.deviceCount ?? placeholder}</Statistic.Value>
+                  <Statistic.Label>{t('Stats.deviceCount')}</Statistic.Label>
+                </Statistic>
+              </>
             )}
           </Statistic.Group>
         </Segment>
         <Menu widths={3} attached="bottom" size="small">
           <Menu.Item name="this_month" active={timeframe === 'this_month'} onClick={onClick}>
-            This month
+            {t('Stats.thisMonth')}
           </Menu.Item>
           <Menu.Item name="this_year" active={timeframe === 'this_year'} onClick={onClick}>
-            This year
+            {t('Stats.thisYear')}
           </Menu.Item>
           <Menu.Item name="all_time" active={timeframe === 'all_time'} onClick={onClick}>
-            All time
+            {t('Stats.allTime')}
           </Menu.Item>
         </Menu>
       </div>

View file

@@ -0,0 +1,14 @@
import React from 'react'
import {Icon} from 'semantic-ui-react'
import {useTranslation} from 'react-i18next'
export default function Visibility({public: public_}: {public: boolean}) {
const {t} = useTranslation()
const icon = public_ ? <Icon color="blue" name="eye" fitted /> : <Icon name="eye slash" fitted />
const text = public_ ? t('general.public') : t('general.private')
return (
<>
{icon} {text}
</>
)
}

View file

@ -1,10 +1,13 @@
export {default as Avatar} from './Avatar' export {default as Avatar} from './Avatar'
export {default as ColorMapLegend} from './ColorMapLegend' export {default as Chart} from './Chart'
export {default as ColorMapLegend, DiscreteColorMapLegend} from './ColorMapLegend'
export {default as FileDrop} from './FileDrop' export {default as FileDrop} from './FileDrop'
export {default as FileUploadField} from './FileUploadField' export {default as FileUploadField} from './FileUploadField'
export {default as FormattedDate} from './FormattedDate' export {default as FormattedDate} from './FormattedDate'
export {default as LoginButton} from './LoginButton' export {default as LoginButton} from './LoginButton'
export {default as Map} from './Map' export {default as Map} from './Map'
export {default as Page} from './Page' export {default as Page} from './Page'
export {default as RegionStats} from './RegionStats'
export {default as Stats} from './Stats' export {default as Stats} from './Stats'
export {default as StripMarkdown} from './StripMarkdown' export {default as StripMarkdown} from './StripMarkdown'
export {default as Visibility} from './Visibility'

View file

@@ -1,16 +1,11 @@
 import React from 'react'

-export type MapSoure =
-  | {
-      type: 'vector'
-      url: string
-    }
-  | {
+export type MapSource = {
   type: 'vector'
   tiles: string[]
   minzoom: number
   maxzoom: number
 }

 export interface Config {
   apiUrl: string
@@ -19,9 +14,10 @@ export interface Config {
     longitude: number
     zoom: number
   }
-  obsMapSource?: MapSoure
+  obsMapSource?: MapSource
   imprintUrl?: string
   privacyPolicyUrl?: string
+  termsUrl?: string
   banner?: {
     text: string
     style?: 'warning' | 'info'
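For reference, a value satisfying the collapsed MapSource type could look like this (URL and zoom range are invented for illustration):

import type {MapSource} from 'config'

// A single vector source; the old union variant with a bare `url`
// field is gone after this change.
const exampleSource: MapSource = {
  type: 'vector',
  tiles: ['https://portal.example.com/tiles/{z}/{x}/{y}.pbf'],
  minzoom: 0,
  maxzoom: 14,
}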

frontend/src/i18n.ts (new file, 87 lines)

@@ -0,0 +1,87 @@
import {useState, useEffect, useMemo} from 'react'
import i18next, {TOptions} from 'i18next'
import {BehaviorSubject, combineLatest} from 'rxjs'
import {map, distinctUntilChanged} from 'rxjs/operators'
import HttpBackend, {BackendOptions, RequestCallback} from 'i18next-http-backend'
import {initReactI18next} from 'react-i18next'
import LanguageDetector from 'i18next-browser-languagedetector'
export type AvailableLocales = 'en' | 'de' | 'fr'
async function request(_options: BackendOptions, url: string, _payload: any, callback: RequestCallback) {
try {
const [lng] = url.split('/')
const locale = await import(`translations/${lng}.yaml`)
callback(null, {status: 200, data: locale})
} catch (e) {
console.error(`Unable to load locale at ${url}\n`, e)
callback(null, {status: 404, data: String(e)})
}
}
export const AVAILABLE_LOCALES: AvailableLocales[] = ['en', 'de', 'fr']
const i18n = i18next.createInstance()
const options: TOptions = {
fallbackLng: 'en',
ns: ['common'],
defaultNS: 'common',
whitelist: AVAILABLE_LOCALES,
// loading via webpack
backend: {
loadPath: '{{lng}}/{{ns}}',
parse: (data: any) => data,
request,
},
load: 'languageOnly',
interpolation: {
escapeValue: false, // not needed for react as it escapes by default
},
}
i18n
.use(HttpBackend)
.use(initReactI18next)
.use(LanguageDetector)
.init({...options})
const locale$ = new BehaviorSubject<AvailableLocales>('en')
export const translate = i18n.t.bind(i18n)
export const translate$ = (stringAndData$: [string, any]) =>
combineLatest([stringAndData$, locale$.pipe(distinctUntilChanged())]).pipe(
map(([stringAndData]) => {
if (typeof stringAndData === 'string') {
return i18n.t(stringAndData)
} else {
const [string, data] = stringAndData
return i18n.t(string, {data})
}
})
)
export const setLocale = (locale: AvailableLocales) => {
i18n.changeLanguage(locale)
locale$.next(locale)
}
export function useLocale() {
const [, reload] = useState()
useEffect(() => {
i18n.on('languageChanged', reload)
return () => {
i18n.off('languageChanged', reload)
}
}, [])
return i18n.language
}
export default i18n
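Consumers can either go through react-i18next, as the components in this diff do, or use the helpers exported here. A small locale-switcher sketch (the component and markup are invented):

import React from 'react'
import {AVAILABLE_LOCALES, AvailableLocales, setLocale, useLocale} from 'i18n'

export function LocaleSwitcher() {
  // useLocale() subscribes to i18next's 'languageChanged' event, so this
  // component re-renders whenever setLocale() switches the language.
  const current = useLocale()
  return (
    <select value={current} onChange={(e) => setLocale(e.target.value as AvailableLocales)}>
      {AVAILABLE_LOCALES.map((locale) => (
        <option key={locale} value={locale}>
          {locale}
        </option>
      ))}
    </select>
  )
}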

View file

@@ -1,11 +0,0 @@
body {
margin: 0;
font-family: 'Noto Sans', 'Roboto', -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Oxygen', 'Ubuntu', 'Cantarell',
'Fira Sans', 'Droid Sans', 'Helvetica Neue', sans-serif;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
code {
font-family: 'Noto Sans Mono', source-code-pro, Menlo, Monaco, Consolas, 'Courier New', monospace;
}

View file

@@ -1,6 +1,8 @@
 <!DOCTYPE html>
 <html lang="en">
-  <head></head>
+  <head>
+    <title><%= htmlWebpackPlugin.options.title %></title>
+  </head>
   <body>
     <noscript>You need to enable JavaScript to run this app.</noscript>
     <div id="root"></div>

View file

@@ -1,9 +1,9 @@
-import React from 'react'
+import React, {Suspense} from 'react'
 import {Settings} from 'luxon'
 import ReactDOM from 'react-dom'
 import 'fomantic-ui-less/semantic.less'
-import './index.css'
+import './index.less'
 import App from './App'
 import 'maplibre-gl/dist/maplibre-gl.css'

@@ -11,13 +11,16 @@ import 'maplibre-gl/dist/maplibre-gl.css'
 import {Provider} from 'react-redux'
 import store from './store'
+import './i18n'

 // TODO: remove
 Settings.defaultLocale = 'de-DE'

 ReactDOM.render(
   <Provider store={store}>
+    <Suspense fallback={null}>
       <App />
+    </Suspense>
   </Provider>,
   document.getElementById('root')
 )

frontend/src/index.less (new file, 8 lines)

@@ -0,0 +1,8 @@
@import 'styles.less';
body {
margin: 0;
font-family: @fontFamilyDefault;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}

View file

@@ -1,4 +1,5 @@
 import _ from 'lodash'
+import produce from 'immer'

 import bright from './bright.json'
 import positron from './positron.json'

@@ -21,6 +22,16 @@ function rgbArrayToColor(arr) {
   return ['rgb', ...arr.map((v) => Math.round(v * 255))]
 }

+function rgbArrayToHtml(arr) {
+  return (
+    '#' +
+    arr
+      .map((v) => Math.round(v * 255).toString(16))
+      .map((v) => (v.length == 1 ? '0' : '') + v)
+      .join('')
+  )
+}
+
 export function colormapToScale(colormap, value, min, max) {
   return [
     'interpolate-hcl',

@@ -31,34 +42,77 @@ export function colormapToScale(colormap, value, min, max) {
 }

 export const viridis = simplifyColormap(viridisBase.map(rgbArrayToColor), 20)
+export const viridisSimpleHtml = simplifyColormap(viridisBase.map(rgbArrayToHtml), 10)
 export const grayscale = ['#FFFFFF', '#000000']
-export const reds = [
-  'rgba( 255, 0, 0, 0)',
-  'rgba( 255, 0, 0, 255)',
-]
+export const reds = ['rgba( 255, 0, 0, 0)', 'rgba( 255, 0, 0, 255)']

 export function colorByCount(attribute = 'event_count', maxCount, colormap = viridis) {
-  return colormapToScale(colormap, ['case', ['to-boolean', ['get', attribute]], ['get', attribute], 0], 0, maxCount)
+  return colormapToScale(colormap, ['case', isValidAttribute(attribute), ['get', attribute], 0], 0, maxCount)
 }

+var steps = {rural: [1.6, 1.8, 2.0, 2.2], urban: [1.1, 1.3, 1.5, 1.7]}
+
+export function isValidAttribute(attribute) {
+  if (attribute.endsWith('zone')) {
+    return ['in', ['get', attribute], ['literal', ['rural', 'urban']]]
+  }
+  return ['to-boolean', ['get', attribute]]
+}
+
+export function borderByZone() {
+  return ['match', ['get', 'zone'], 'rural', 'cyan', 'urban', 'blue', 'purple']
+}
+
-export function colorByDistance(attribute = 'distance_overtaker_mean', fallback = '#ABC') {
+export function colorByDistance(attribute = 'distance_overtaker_mean', fallback = '#ABC', zone = 'urban') {
   return [
     'case',
-    ['!', ['to-boolean', ['get', attribute]]],
+    ['!', isValidAttribute(attribute)],
     fallback,
     [
-      'interpolate-hcl',
-      ['linear'],
-      ['get', attribute],
-      1,
-      'rgba(255, 0, 0, 1)',
-      1.3,
-      'rgba(255, 200, 0, 1)',
-      1.5,
-      'rgba(67, 200, 0, 1)',
-      1.7,
-      'rgba(67, 150, 0, 1)',
+      'match',
+      ['get', 'zone'],
+      'rural',
+      [
+        'step',
+        ['get', attribute],
+        'rgba(150, 0, 0, 1)',
+        steps['rural'][0],
+        'rgba(255, 0, 0, 1)',
+        steps['rural'][1],
+        'rgba(255, 220, 0, 1)',
+        steps['rural'][2],
+        'rgba(67, 200, 0, 1)',
+        steps['rural'][3],
+        'rgba(67, 150, 0, 1)',
+      ],
+      'urban',
+      [
+        'step',
+        ['get', attribute],
+        'rgba(150, 0, 0, 1)',
+        steps['urban'][0],
+        'rgba(255, 0, 0, 1)',
+        steps['urban'][1],
+        'rgba(255, 220, 0, 1)',
+        steps['urban'][2],
+        'rgba(67, 200, 0, 1)',
+        steps['urban'][3],
+        'rgba(67, 150, 0, 1)',
+      ],
+      [
+        'step',
+        ['get', attribute],
+        'rgba(150, 0, 0, 1)',
+        steps['urban'][0],
+        'rgba(255, 0, 0, 1)',
+        steps['urban'][1],
+        'rgba(255, 220, 0, 1)',
+        steps['urban'][2],
+        'rgba(67, 200, 0, 1)',
+        steps['urban'][3],
+        'rgba(67, 150, 0, 1)',
+      ],
     ],
   ]
 }

@@ -67,7 +121,66 @@ export const trackLayer = {
   paint: {
     'line-width': ['interpolate', ['linear'], ['zoom'], 14, 2, 17, 5],
     'line-color': '#F06292',
+    'line-opacity': 0.6,
   },
 }
export const getRegionLayers = (adminLevel = 6, baseColor = '#00897B', maxValue = 5000) => [
{
id: 'region',
type: 'fill',
source: 'obs',
'source-layer': 'obs_regions',
minzoom: 0,
maxzoom: 10,
// filter: [">", "overtaking_event_count", 0],
paint: {
'fill-color': baseColor,
'fill-antialias': true,
'fill-opacity': [
'interpolate',
['linear'],
['log10', ['max', ['get', 'overtaking_event_count'], 1]],
0,
0,
Math.log10(maxValue),
0.9,
],
},
},
{
id: 'region-border',
type: 'line',
source: 'obs',
'source-layer': 'obs_regions',
minzoom: 0,
maxzoom: 10,
// filter: [">", "overtaking_event_count", 0],
paint: {
'line-width': [
'interpolate',
['linear'],
['log10', ['max', ['get', 'overtaking_event_count'], 1]],
0,
0.2,
Math.log10(maxValue),
1.5,
],
'line-color': baseColor,
},
layout: {
'line-join': 'round',
'line-cap': 'round',
},
},
]
export const trackLayerRaw = produce(trackLayer, (draft) => {
// draft.paint['line-color'] = '#81D4FA'
draft.paint['line-width'][4] = 1
draft.paint['line-width'][6] = 2
draft.paint['line-dasharray'] = [3, 3]
delete draft.paint['line-opacity']
})
 export const basemap = positron
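A note on how these expressions are consumed: colorByDistance() returns a MapLibre style expression, so it is passed straight into a layer's paint properties. A sketch (the layer id and line width are invented), plus a worked reading of the 'step' ramp:

import {colorByDistance} from 'mapstyles'

// The expression evaluates per feature: an urban road with
// distance_overtaker_mean = 1.4 m falls between the urban steps 1.3 and
// 1.5 and is therefore drawn in the yellow bucket 'rgba(255, 220, 0, 1)'.
const roadsLayer = {
  id: 'obs_roads_example',
  type: 'line',
  paint: {
    'line-color': colorByDistance('distance_overtaker_mean'),
    'line-width': 2,
  },
}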

View file

@@ -0,0 +1,18 @@
import React from 'react'
import {Header} from 'semantic-ui-react'
import {useTranslation} from 'react-i18next'
import Markdown from 'react-markdown'
import {Page} from 'components'
export default function AcknowledgementsPage() {
const {t} = useTranslation()
const title = t('AcknowledgementsPage.title')
return (
<Page title={title}>
<Header as="h2">{title}</Header>
<Markdown>{t('AcknowledgementsPage.information')}</Markdown>
</Page>
)
}

View file

@@ -0,0 +1,161 @@
import React, {useState, useCallback, useMemo} from 'react'
import {Source, Layer} from 'react-map-gl'
import _ from 'lodash'
import {Button, Form, Dropdown, Header, Message, Icon} from 'semantic-ui-react'
import {useTranslation, Trans as Translate} from 'react-i18next'
import Markdown from 'react-markdown'
import {useConfig} from 'config'
import {Page, Map} from 'components'
const BoundingBoxSelector = React.forwardRef(({value, name, onChange}, ref) => {
const {t} = useTranslation()
const [pointNum, setPointNum] = useState(0)
const [point0, setPoint0] = useState(null)
const [point1, setPoint1] = useState(null)
const onClick = (e) => {
if (pointNum == 0) {
setPoint0(e.lngLat)
} else {
setPoint1(e.lngLat)
}
setPointNum(1 - pointNum)
}
React.useEffect(() => {
if (!point0 || !point1) return
const bbox = `${point0[0]},${point0[1]},${point1[0]},${point1[1]}`
if (bbox !== value) {
onChange(bbox)
}
}, [point0, point1])
React.useEffect(() => {
if (!value) return
const [p00, p01, p10, p11] = value.split(',').map((v) => Number.parseFloat(v))
if (!point0 || point0[0] != p00 || point0[1] != p01) setPoint0([p00, p01])
if (!point1 || point1[0] != p10 || point1[1] != p11) setPoint1([p10, p11])
}, [value])
return (
<div>
<Form.Input
label={t('ExportPage.boundingBox.label')}
{...{name, value}}
onChange={(e) => onChange(e.target.value)}
/>
<div style={{height: 400, position: 'relative', marginBottom: 16}}>
<Map onClick={onClick}>
<Source
id="bbox"
type="geojson"
data={
point0 && point1
? {
type: 'FeatureCollection',
features: [
{
type: 'Feature',
geometry: {
type: 'Polygon',
coordinates: [
[
[point0[0], point0[1]],
[point1[0], point0[1]],
[point1[0], point1[1]],
[point0[0], point1[1]],
[point0[0], point0[1]],
],
],
},
},
],
}
: {}
}
>
<Layer
id="bbox"
type="line"
paint={{
'line-width': 4,
'line-color': '#F06292',
}}
/>
</Source>
</Map>
</div>
</div>
)
})
const MODES = ['events', 'segments']
const FORMATS = ['geojson', 'shapefile']
export default function ExportPage() {
const [mode, setMode] = useState('events')
const [bbox, setBbox] = useState('8.294678,49.651182,9.059601,50.108249')
const [fmt, setFmt] = useState('geojson')
const config = useConfig()
const {t} = useTranslation()
return (
<Page title="Export">
<Header as="h2">{t('ExportPage.title')}</Header>
<Message icon info>
<Icon name="info circle" />
<Message.Content>
<Markdown>{t('ExportPage.information')}</Markdown>
</Message.Content>
</Message>
<Form>
<Form.Field>
<label>{t('ExportPage.mode.label')}</label>
<Dropdown
placeholder={t('ExportPage.mode.placeholder')}
fluid
selection
options={MODES.map((value) => ({
key: value,
text: t(`ExportPage.mode.${value}`),
value,
}))}
value={mode}
onChange={(_e, {value}) => setMode(value)}
/>
</Form.Field>
<Form.Field>
<label>{t('ExportPage.format.label')}</label>
<Dropdown
placeholder={t('ExportPage.format.placeholder')}
fluid
selection
options={FORMATS.map((value) => ({
key: value,
text: t(`ExportPage.format.${value}`),
value,
}))}
value={fmt}
onChange={(_e, {value}) => setFmt(value)}
/>
</Form.Field>
<BoundingBoxSelector value={bbox} onChange={setBbox} />
<Button
primary
as="a"
href={`${config?.apiUrl}/export/${mode}?bbox=${bbox}&fmt=${fmt}`}
target="_blank"
rel="noreferrer noopener"
>
{t('ExportPage.export')}
</Button>
</Form>
</Page>
)
}
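The export button is a plain GET against the API, so the same download can be scripted. An equivalent fetch sketch (the apiUrl value is invented, and cookie-based session auth is assumed):

// Downloads the page's default selection as GeoJSON.
const apiUrl = 'https://portal.example.com/api' // from config, illustrative
const bbox = '8.294678,49.651182,9.059601,50.108249'

async function downloadEvents() {
  const res = await fetch(`${apiUrl}/export/events?bbox=${bbox}&fmt=geojson`, {
    credentials: 'include', // send the session cookie, assuming cookie auth
  })
  return res.json()
}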

View file

@@ -1,12 +0,0 @@
@import 'styles.less';
.welcomeMap {
height: 60rem;
max-height: 70vh;
position: relative;
@media @mobile {
margin: -35px -32px 0 -32px;
max-height: 70vh;
}
}

View file

@@ -1,17 +1,19 @@
 import React from 'react'
-import {Link} from 'react-router-dom'
-import {Message, Grid, Loader, Header, Item} from 'semantic-ui-react'
+import {Grid, Loader, Header, Item} from 'semantic-ui-react'
 import {useObservable} from 'rxjs-hooks'
 import {of, from} from 'rxjs'
 import {map, switchMap} from 'rxjs/operators'
+import {useTranslation} from 'react-i18next'

 import api from 'api'
-import {Stats, Page, Map} from 'components'
+import {RegionStats, Stats, Page} from 'components'
+import type {Track} from 'types'

-import {TrackListItem} from './TracksPage'
+import {TrackListItem, NoPublicTracksMessage} from './TracksPage'
-import styles from './HomePage.module.less'

 function MostRecentTrack() {
+  const {t} = useTranslation()
   const track: Track | null = useObservable(
     () =>
       of(null).pipe(
@@ -24,12 +26,10 @@ function MostRecentTrack() {
   return (
     <>
-      <Header as="h2">Most recent track</Header>
+      <Header as="h2">{t('HomePage.mostRecentTrack')}</Header>
       <Loader active={track === null} />
       {track === undefined ? (
-        <Message>
-          No public tracks yet. <Link to="/upload">Upload the first!</Link>
-        </Message>
+        <NoPublicTracksMessage />
       ) : track ? (
         <Item.Group>
           <TrackListItem track={track} />
@@ -44,15 +44,13 @@ export default function HomePage() {
     <Page>
       <Grid stackable>
         <Grid.Row>
-          <Grid.Column width={10}>
-            <div className={styles.welcomeMap}>
-              <Map />
-            </div>
-          </Grid.Column>
-          <Grid.Column width={6}>
+          <Grid.Column width={8}>
             <Stats />
             <MostRecentTrack />
           </Grid.Column>
+          <Grid.Column width={8}>
+            <RegionStats />
+          </Grid.Column>
         </Grid.Row>
       </Grid>
     </Page>

View file

@@ -4,16 +4,18 @@ import {Redirect, useLocation, useHistory} from 'react-router-dom'
 import {Icon, Message} from 'semantic-ui-react'
 import {useObservable} from 'rxjs-hooks'
 import {switchMap, pluck, distinctUntilChanged} from 'rxjs/operators'
+import {useTranslation} from 'react-i18next'

 import {Page} from 'components'
 import api from 'api'

-const LoginRedirectPage = connect((state) => ({loggedIn: Boolean(state.login)}))(function LoginRedirectPage({
-  loggedIn,
-}) {
+const LoginRedirectPage = connect((state) => ({
+  loggedIn: Boolean(state.login),
+}))(function LoginRedirectPage({loggedIn}) {
   const location = useLocation()
   const history = useHistory()
   const {search} = location
+  const {t} = useTranslation()

   /* eslint-disable react-hooks/exhaustive-deps */

@@ -35,14 +37,8 @@ const LoginRedirectPage = connect((state) => ({loggedIn: Boolean(state.login)}))
   if (error) {
     return (
-      <Page small>
-        <Message icon error>
-          <Icon name="warning sign" />
-          <Message.Content>
-            <Message.Header>Login error</Message.Header>
-            The login server reported: {errorDescription || error}.
-          </Message.Content>
-        </Message>
+      <Page small title={t('LoginRedirectPage.loginError')}>
+        <LoginError errorText={errorDescription || error} />
       </Page>
     )
   }

@@ -50,7 +46,21 @@ const LoginRedirectPage = connect((state) => ({loggedIn: Boolean(state.login)}))
   return <ExchangeAuthCode code={code} />
 })

+function LoginError({errorText}: {errorText: string}) {
+  const {t} = useTranslation()
+  return (
+    <Message icon error>
+      <Icon name="warning sign" />
+      <Message.Content>
+        <Message.Header>{t('LoginRedirectPage.loginError')}</Message.Header>
+        {t('LoginRedirectPage.loginErrorText', {error: errorText})}
+      </Message.Content>
+    </Message>
+  )
+}
+
 function ExchangeAuthCode({code}) {
+  const {t} = useTranslation()
   const result = useObservable(
     (_$, args$) =>
       args$.pipe(
@@ -68,8 +78,8 @@ function ExchangeAuthCode({code}) {
       <Message icon info>
         <Icon name="circle notched" loading />
         <Message.Content>
-          <Message.Header>Logging you in</Message.Header>
-          Hang tight...
+          <Message.Header>{t('LoginRedirectPage.loggingIn')}</Message.Header>
+          {t('LoginRedirectPage.hangTight')}
         </Message.Content>
       </Message>
     )
@@ -77,21 +87,14 @@ function ExchangeAuthCode({code}) {
     content = <Redirect to="/" />
   } else {
     const {error, error_description: errorDescription} = result
-    content = (
-      <>
-        <Message icon error>
-          <Icon name="warning sign" />
-          <Message.Content>
-            <Message.Header>Login error</Message.Header>
-            The login server reported: {errorDescription || error}.
-          </Message.Content>
-        </Message>
-        <pre>{JSON.stringify(result, null, 2)}</pre>
-      </>
-    )
+    content = <LoginError errorText={errorDescription || error} />
   }

-  return <Page small>{content}</Page>
+  return (
+    <Page small title="Login">
+      {content}
+    </Page>
+  )
 }

 export default LoginRedirectPage

View file

@@ -1,53 +1,109 @@
 import React from 'react'
 import _ from 'lodash'
 import {connect} from 'react-redux'
+import {Link} from 'react-router-dom'
-import {List, Select, Input, Divider, Checkbox, Header} from 'semantic-ui-react'
+import {List, Select, Input, Divider, Label, Checkbox, Header} from 'semantic-ui-react'
+import {useTranslation} from 'react-i18next'

 import {
   MapConfig,
   setMapConfigFlag as setMapConfigFlagAction,
   initialState as defaultMapConfig,
 } from 'reducers/mapConfig'
-import {colorByDistance, colorByCount, reds} from 'mapstyles'
-import {ColorMapLegend} from 'components'
+import {colorByDistance, colorByCount, viridisSimpleHtml} from 'mapstyles'
+import {ColorMapLegend, DiscreteColorMapLegend} from 'components'
+import styles from './styles.module.less'

-const BASEMAP_STYLE_OPTIONS = [
-  {value: 'positron', key: 'positron', text: 'Positron'},
-  {value: 'bright', key: 'bright', text: 'OSM Bright'},
-]
+const BASEMAP_STYLE_OPTIONS = ['positron', 'bright']

 const ROAD_ATTRIBUTE_OPTIONS = [
-  {value: 'distance_overtaker_mean', key: 'distance_overtaker_mean', text: 'Overtaker distance mean'},
-  {value: 'distance_overtaker_min', key: 'distance_overtaker_min', text: 'Overtaker distance minimum'},
-  {value: 'distance_overtaker_max', key: 'distance_overtaker_max', text: 'Overtaker distance maximum'},
-  {value: 'distance_overtaker_median', key: 'distance_overtaker_median', text: 'Overtaker distance median'},
-  {value: 'overtaking_event_count', key: 'overtaking_event_count', text: 'Event count'},
+  'distance_overtaker_mean',
+  'distance_overtaker_min',
+  'distance_overtaker_max',
+  'distance_overtaker_median',
+  'overtaking_event_count',
+  'usage_count',
+  'zone',
 ]

+const DATE_FILTER_MODES = ['none', 'range', 'threshold']
+
+type User = Object
+
 function LayerSidebar({
   mapConfig,
+  login,
   setMapConfigFlag,
 }: {
+  login: User | null
   mapConfig: MapConfig
   setMapConfigFlag: (flag: string, value: unknown) => void
 }) {
+  const {t} = useTranslation()
   const {
     baseMap: {style},
     obsRoads: {show: showRoads, showUntagged, attribute, maxCount},
     obsEvents: {show: showEvents},
+    obsRegions: {show: showRegions},
+    filters: {currentUser: filtersCurrentUser, dateMode, startDate, endDate, thresholdAfter},
   } = mapConfig

+  const openStreetMapCopyright = (
+    <List.Item className={styles.copyright}>
+      {t('MapPage.sidebar.copyright.openStreetMap')}{' '}
+      <Link to="/acknowledgements">{t('MapPage.sidebar.copyright.learnMore')}</Link>
+    </List.Item>
+  )
+
   return (
     <div>
       <List relaxed>
         <List.Item>
-          <List.Header>Basemap Style</List.Header>
+          <List.Header>{t('MapPage.sidebar.baseMap.style.label')}</List.Header>
           <Select
-            options={BASEMAP_STYLE_OPTIONS}
+            options={BASEMAP_STYLE_OPTIONS.map((value) => ({
+              value,
+              key: value,
+              text: t(`MapPage.sidebar.baseMap.style.${value}`),
+            }))}
             value={style}
             onChange={(_e, {value}) => setMapConfigFlag('baseMap.style', value)}
           />
         </List.Item>
+        {openStreetMapCopyright}
+        <Divider />
+        <List.Item>
+          <Checkbox
+            toggle
+            size="small"
+            id="obsRegions.show"
+            style={{float: 'right'}}
+            checked={showRegions}
+            onChange={() => setMapConfigFlag('obsRegions.show', !showRegions)}
+          />
+          <label htmlFor="obsRegions.show">
+            <Header as="h4">{t('MapPage.sidebar.obsRegions.title')}</Header>
+          </label>
+        </List.Item>
+        {showRegions && (
+          <>
+            <List.Item>{t('MapPage.sidebar.obsRegions.colorByEventCount')}</List.Item>
+            <List.Item>
+              <ColorMapLegend
+                twoTicks
+                map={[
+                  [0, '#00897B00'],
+                  [5000, '#00897BFF'],
+                ]}
+                digits={0}
+              />
+            </List.Item>
+            <List.Item className={styles.copyright}>
+              {t('MapPage.sidebar.copyright.boundaries')}{' '}
+              <Link to="/acknowledgements">{t('MapPage.sidebar.copyright.learnMore')}</Link>
+            </List.Item>
+          </>
+        )}
         <Divider />
         <List.Item>
           <Checkbox
@@ -59,7 +115,7 @@ function LayerSidebar({
             onChange={() => setMapConfigFlag('obsRoads.show', !showRoads)}
           />
           <label htmlFor="obsRoads.show">
-            <Header as="h4">Road segments</Header>
+            <Header as="h4">{t('MapPage.sidebar.obsRoads.title')}</Header>
           </label>
         </List.Item>
         {showRoads && (
@@ -68,14 +124,18 @@ function LayerSidebar({
             <Checkbox
               checked={showUntagged}
               onChange={() => setMapConfigFlag('obsRoads.showUntagged', !showUntagged)}
-              label="Include roads without data"
+              label={t('MapPage.sidebar.obsRoads.showUntagged.label')}
             />
           </List.Item>
           <List.Item>
-            <List.Header>Color based on</List.Header>
+            <List.Header>{t('MapPage.sidebar.obsRoads.attribute.label')}</List.Header>
             <Select
               fluid
-              options={ROAD_ATTRIBUTE_OPTIONS}
+              options={ROAD_ATTRIBUTE_OPTIONS.map((value) => ({
+                value,
+                key: value,
+                text: t(`MapPage.sidebar.obsRoads.attribute.${value}`),
+              }))}
               value={attribute}
               onChange={(_e, {value}) => setMapConfigFlag('obsRoads.attribute', value)}
             />
@@ -83,7 +143,7 @@ function LayerSidebar({
           {attribute.endsWith('_count') ? (
             <>
               <List.Item>
-                <List.Header>Maximum value</List.Header>
+                <List.Header>{t('MapPage.sidebar.obsRoads.maxCount.label')}</List.Header>
                 <Input
                   fluid
                   type="number"
@@ -92,14 +152,39 @@ function LayerSidebar({
                 />
               </List.Item>
               <List.Item>
-                <ColorMapLegend map={_.chunk(colorByCount('obsRoads.maxCount', mapConfig.obsRoads.maxCount, reds).slice(3), 2)} />
-              </List.Item></>
-          ) : (
-            <List.Item>
-              <ColorMapLegend map={_.chunk(colorByDistance('distance_overtaker')[3].slice(3), 2)} />
-            </List.Item>
-          )}
+                <ColorMapLegend
+                  map={_.chunk(
+                    colorByCount('obsRoads.maxCount', mapConfig.obsRoads.maxCount, viridisSimpleHtml).slice(3),
+                    2
+                  )}
+                  twoTicks
+                />
+              </List.Item>
+            </>
+          ) : attribute.endsWith('zone') ? (
+            <>
+              <List.Item>
+                <Label size="small" style={{background: 'blue', color: 'white'}}>
+                  {t('general.zone.urban')} (1.5&nbsp;m)
+                </Label>
+                <Label size="small" style={{background: 'cyan', color: 'black'}}>
+                  {t('general.zone.rural')} (2&nbsp;m)
+                </Label>
+              </List.Item>
+            </>
+          ) : (
+            <>
+              <List.Item>
+                <List.Header>{_.upperFirst(t('general.zone.urban'))}</List.Header>
+                <DiscreteColorMapLegend map={colorByDistance('distance_overtaker')[3][5].slice(2)} />
+              </List.Item>
+              <List.Item>
+                <List.Header>{_.upperFirst(t('general.zone.rural'))}</List.Header>
+                <DiscreteColorMapLegend map={colorByDistance('distance_overtaker')[3][3].slice(2)} />
+              </List.Item>
+            </>
+          )}
+          {openStreetMapCopyright}
           </>
         )}
         <Divider />
@@ -113,16 +198,124 @@ function LayerSidebar({
             onChange={() => setMapConfigFlag('obsEvents.show', !showEvents)}
           />
           <label htmlFor="obsEvents.show">
-            <Header as="h4">Event points</Header>
+            <Header as="h4">{t('MapPage.sidebar.obsEvents.title')}</Header>
           </label>
         </List.Item>
         {showEvents && (
           <>
             <List.Item>
-              <ColorMapLegend map={_.chunk(colorByDistance('distance_overtaker')[3].slice(3), 2)} />
+              <List.Header>{_.upperFirst(t('general.zone.urban'))}</List.Header>
+              <DiscreteColorMapLegend map={colorByDistance('distance_overtaker')[3][5].slice(2)} />
+            </List.Item>
+            <List.Item>
+              <List.Header>{_.upperFirst(t('general.zone.rural'))}</List.Header>
+              <DiscreteColorMapLegend map={colorByDistance('distance_overtaker')[3][3].slice(2)} />
             </List.Item>
           </>
         )}
<Divider />
<List.Item>
<Header as="h4">{t('MapPage.sidebar.filters.title')}</Header>
</List.Item>
{login && (
<>
<List.Item>
<Header as="h5">{t('MapPage.sidebar.filters.userData')}</Header>
</List.Item>
<List.Item>
<Checkbox
toggle
size="small"
id="filters.currentUser"
checked={filtersCurrentUser}
onChange={() => setMapConfigFlag('filters.currentUser', !filtersCurrentUser)}
label={t('MapPage.sidebar.filters.currentUser')}
/>
</List.Item>
<List.Item>
<Header as="h5">{t('MapPage.sidebar.filters.dateRange')}</Header>
</List.Item>
<List.Item>
<Select
id="filters.dateMode"
options={DATE_FILTER_MODES.map((value) => ({
value,
key: value,
text: t(`MapPage.sidebar.filters.dateMode.${value}`),
}))}
value={dateMode ?? 'none'}
onChange={(_e, {value}) => setMapConfigFlag('filters.dateMode', value)}
/>
</List.Item>
{dateMode == 'range' && (
<List.Item>
<Input
type="date"
min="2000-01-03"
step="7"
size="small"
id="filters.startDate"
onChange={(_e, {value}) => setMapConfigFlag('filters.startDate', value)}
value={startDate ?? null}
label={t('MapPage.sidebar.filters.start')}
/>
</List.Item>
)}
{dateMode == 'range' && (
<List.Item>
<Input
type="date"
min="2000-01-03"
step="7"
size="small"
id="filters.endDate"
onChange={(_e, {value}) => setMapConfigFlag('filters.endDate', value)}
value={endDate ?? null}
label={t('MapPage.sidebar.filters.end')}
/>
</List.Item>
)}
{dateMode == 'threshold' && (
<List.Item>
<Input
type="date"
min="2000-01-03"
step="7"
size="small"
id="filters.startDate"
value={startDate ?? null}
onChange={(_e, {value}) => setMapConfigFlag('filters.startDate', value)}
label={t('MapPage.sidebar.filters.threshold')}
/>
</List.Item>
)}
{dateMode == 'threshold' && (
<List.Item>
<span>
{t('MapPage.sidebar.filters.before')}{' '}
<Checkbox
toggle
size="small"
checked={thresholdAfter ?? false}
onChange={() => setMapConfigFlag('filters.thresholdAfter', !thresholdAfter)}
id="filters.thresholdAfter"
/>{' '}
{t('MapPage.sidebar.filters.after')}
</span>
</List.Item>
)}
</>
)}
{!login && <List.Item>{t('MapPage.sidebar.filters.needsLogin')}</List.Item>}
       </List>
     </div>
   )
@@ -136,6 +329,7 @@ export default connect(
       (state as any).mapConfig as MapConfig
       //
     ),
+    login: state.login,
   }),
   {setMapConfigFlag: setMapConfigFlagAction}
   //
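setMapConfigFlag takes a dotted path into the MapConfig tree ('filters.dateMode', 'obsRoads.show', and so on). The reducer behind it is not shown in this diff; the following is only an illustrative sketch of such an update (the helper is invented; immer and lodash are assumed to be available, as both appear elsewhere in the frontend):

import _ from 'lodash'
import produce from 'immer'

// Sets one nested flag immutably, e.g. setFlag(config, 'filters.dateMode', 'range').
function setFlag<T extends object>(config: T, flag: string, value: unknown): T {
  return produce(config, (draft) => {
    _.set(draft as object, flag, value)
  })
}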

View file

@@ -0,0 +1,31 @@
import React from 'react'
import {createPortal} from 'react-dom'
import {useTranslation} from 'react-i18next'
import {List, Header, Icon, Button} from 'semantic-ui-react'
import styles from './styles.module.less'
export default function RegionInfo({region, mapInfoPortal, onClose}) {
const {t} = useTranslation()
const content = (
<>
<div className={styles.closeHeader}>
<Header as="h3">{region.properties.name || t('MapPage.regionInfo.unnamedRegion')}</Header>
<Button primary icon onClick={onClose}>
<Icon name="close" />
</Button>
</div>
<List>
<List.Item>
<List.Header>{t('MapPage.regionInfo.eventCount')}</List.Header>
<List.Content>{region.properties.overtaking_event_count ?? 0}</List.Content>
</List.Item>
</List>
</>
)
return content && mapInfoPortal
? createPortal(<div className={styles.mapInfoBox}>{content}</div>, mapInfoPortal)
: null
}

View file

@@ -1,48 +1,75 @@
 import React, {useState, useCallback} from 'react'
+import {createPortal} from 'react-dom'
 import _ from 'lodash'
-import {Segment, Menu, Header, Label, Icon, Table} from 'semantic-ui-react'
+import {Segment, Menu, Header, Label, Icon, Table, Message, Button} from 'semantic-ui-react'
 import {Layer, Source} from 'react-map-gl'
 import {of, from, concat} from 'rxjs'
 import {useObservable} from 'rxjs-hooks'
 import {switchMap, distinctUntilChanged} from 'rxjs/operators'
+import {Chart} from 'components'
+import {pairwise} from 'utils'
+import {useTranslation} from 'react-i18next'
+import type {Location} from 'types'

 import api from 'api'
+import {colorByDistance, borderByZone} from 'mapstyles'

 import styles from './styles.module.less'

-const UNITS = {distanceOvertaker: 'm', distanceStationary: 'm', speed: 'm/s'}
-const LABELS = {distanceOvertaker: 'Overtaker', distanceStationary: 'Stationary', speed: 'Speed'}
-const ZONE_COLORS = {urban: 'olive', rural: 'brown', motorway: 'purple'}
-const CARDINAL_DIRECTIONS = ['north', 'north-east', 'east', 'south-east', 'south', 'south-west', 'west', 'north-west']
-const getCardinalDirection = (bearing) =>
-  bearing == null
-    ? 'unknown'
-    : CARDINAL_DIRECTIONS[
-        Math.floor(((bearing / 360.0) * CARDINAL_DIRECTIONS.length + 0.5) % CARDINAL_DIRECTIONS.length)
-      ] + ' bound'
+function selectFromColorMap(colormap, value) {
+  let last = null
+  for (let i = 0; i < colormap.length; i += 2) {
+    if (colormap[i + 1] > value) {
+      return colormap[i]
+    }
+  }
+  return colormap[colormap.length - 1]
+}
+
+const UNITS = {
+  distanceOvertaker: 'm',
+  distanceStationary: 'm',
+  speed: 'km/h',
+}
+const ZONE_COLORS = {urban: 'blue', rural: 'cyan', motorway: 'purple'}
+const CARDINAL_DIRECTIONS = ['north', 'northEast', 'east', 'southEast', 'south', 'southWest', 'west', 'northWest']
+const getCardinalDirection = (t, bearing) => {
+  if (bearing == null) {
+    return t('MapPage.roadInfo.cardinalDirections.unknown')
+  } else {
+    const n = CARDINAL_DIRECTIONS.length
+    const i = Math.floor(((bearing / 360.0) * n + 0.5) % n)
+    const name = CARDINAL_DIRECTIONS[i]
+    return t(`MapPage.roadInfo.cardinalDirections.${name}`)
+  }
+}

 function RoadStatsTable({data}) {
+  const {t} = useTranslation()
   return (
     <Table size="small" compact>
       <Table.Header>
         <Table.Row>
-          <Table.HeaderCell>Property</Table.HeaderCell>
-          <Table.HeaderCell>n</Table.HeaderCell>
-          <Table.HeaderCell>min</Table.HeaderCell>
-          <Table.HeaderCell>q50</Table.HeaderCell>
-          <Table.HeaderCell>max</Table.HeaderCell>
-          <Table.HeaderCell>mean</Table.HeaderCell>
-          <Table.HeaderCell>unit</Table.HeaderCell>
+          <Table.HeaderCell textAlign="right"></Table.HeaderCell>
+          {['distanceOvertaker', 'distanceStationary', 'speed'].map((prop) => (
+            <Table.HeaderCell key={prop} textAlign="right">
+              {t(`MapPage.roadInfo.${prop}`)}
+            </Table.HeaderCell>
+          ))}
         </Table.Row>
       </Table.Header>
       <Table.Body>
-        {['distanceOvertaker', 'distanceStationary', 'speed'].map((prop) => (
-          <Table.Row key={prop}>
-            <Table.Cell>{LABELS[prop]}</Table.Cell>
-            {['count', 'min', 'median', 'max', 'mean'].map((stat) => (
-              <Table.Cell key={stat}>{data[prop]?.statistics?.[stat]?.toFixed(stat === 'count' ? 0 : 3)}</Table.Cell>
-            ))}
-            <Table.Cell>{UNITS[prop]}</Table.Cell>
-          </Table.Row>
-        ))}
+        {['count', 'min', 'median', 'max', 'mean'].map((stat) => (
+          <Table.Row key={stat}>
+            <Table.Cell>{t(`MapPage.roadInfo.${stat}`)}</Table.Cell>
+            {['distanceOvertaker', 'distanceStationary', 'speed'].map((prop) => (
+              <Table.Cell key={prop} textAlign="right">
+                {(data[prop]?.statistics?.[stat] * (prop === `speed` && stat != 'count' ? 3.6 : 1)).toFixed(
+                  stat === 'count' ? 0 : 2
+                )}
+                {stat !== 'count' && ` ${UNITS[prop]}`}
+              </Table.Cell>
+            ))}
+          </Table.Row>
+        ))}
       </Table.Body>
@@ -50,7 +77,91 @@ function RoadStatsTable({data}) {
   )
 }

-export default function RoadInfo({clickLocation}) {
+function HistogramChart({bins, counts, zone}) {
const diff = bins[1] - bins[0]
const colortype = zone === 'rural' ? 3 : 5
const data = _.zip(
bins.slice(0, bins.length - 1).map((v) => v + diff / 2),
counts
).map((value) => ({
value,
itemStyle: {
color: selectFromColorMap(colorByDistance()[3][colortype].slice(2), value[0]),
},
}))
return (
<Chart
style={{height: 240}}
option={{
grid: {top: 30, bottom: 30, right: 30, left: 30},
xAxis: {
type: 'value',
axisLabel: {formatter: (v) => `${Math.round(v * 100)} cm`},
min: 0,
max: 2.5,
},
yAxis: {},
series: [
{
type: 'bar',
data,
barMaxWidth: 20,
},
],
}}
/>
)
}
interface ArrayStats {
statistics: {
count: number
mean: number
min: number
max: number
median: number
}
histogram: {
bins: number[]
counts: number[]
}
values: number[]
}
export interface RoadDirectionInfo {
bearing: number
distanceOvertaker: ArrayStats
distanceStationary: ArrayStats
speed: ArrayStats
}
export interface RoadInfoType {
road: {
way_id: number
zone: 'urban' | 'rural' | null
name: string
directionality: -1 | 0 | 1
oneway: boolean
geometry: Object
}
forwards: RoadDirectionInfo
backwards: RoadDirectionInfo
}
export default function RoadInfo({
roadInfo: info,
hasFilters,
onClose,
mapInfoPortal,
}: {
roadInfo: RoadInfoType
hasFilters: boolean
onClose: () => void
mapInfoPortal: HTMLElement
}) {
const {t} = useTranslation()
   const [direction, setDirection] = useState('forwards')

   const onClickDirection = useCallback(
@@ -62,70 +173,57 @@
     [setDirection]
   )

-  const info = useObservable(
-    (_$, inputs$) =>
-      inputs$.pipe(
-        distinctUntilChanged(_.isEqual),
-        switchMap(([location]) =>
-          location
-            ? concat(
-                of(null),
-                from(
-                  api.get('/mapdetails/road', {
-                    query: {
-                      ...location,
-                      radius: 100,
-                    },
-                  })
-                )
-              )
-            : of(null)
-        )
-      ),
-    null,
-    [clickLocation]
-  )
-
-  if (!clickLocation) {
-    return null
-  }
-
-  const loading = info == null
-
-  const offsetDirection = info?.road?.oneway ? 0 : direction === 'forwards' ? 1 : -1 // TODO: change based on left-hand/right-hand traffic
-
-  const content =
-    !loading && !info.road ? (
-      'No road found.'
-    ) : (
+  // TODO: change based on left-hand/right-hand traffic
+  const offsetDirection = info.road.oneway ? 0 : direction === 'forwards' ? 1 : -1
+
+  const content = (
     <>
-      <Header as="h3">{loading ? '...' : info?.road.name || 'Unnamed way'}</Header>
+      <div className={styles.closeHeader}>
+        <Header as="h3">{info?.road.name || t('MapPage.roadInfo.unnamedWay')}</Header>
+        <Button primary icon onClick={onClose}>
+          <Icon name="close" />
+        </Button>
+      </div>
+      {hasFilters && (
+        <Message info icon>
+          <Icon name="info circle" small />
+          <Message.Content>{t('MapPage.roadInfo.hintFiltersNotApplied')}</Message.Content>
+        </Message>
+      )}
       {info?.road.zone && (
         <Label size="small" color={ZONE_COLORS[info?.road.zone]}>
-          {info?.road.zone}
+          {t(`general.zone.${info.road.zone}`)}
         </Label>
       )}
       {info?.road.oneway && (
         <Label size="small" color="blue">
-          <Icon name="long arrow alternate right" fitted /> oneway
+          <Icon name="long arrow alternate right" fitted /> {t('MapPage.roadInfo.oneway')}
         </Label>
       )}
       {info?.road.oneway ? null : (
-        <Menu size="tiny" fluid secondary>
-          <Menu.Item header>Direction</Menu.Item>
+        <Menu size="tiny" pointing>
+          <Menu.Item header>{t('MapPage.roadInfo.direction')}</Menu.Item>
           <Menu.Item name="forwards" active={direction === 'forwards'} onClick={onClickDirection}>
-            {getCardinalDirection(info?.forwards?.bearing)}
+            {getCardinalDirection(t, info?.forwards?.bearing)}
           </Menu.Item>
           <Menu.Item name="backwards" active={direction === 'backwards'} onClick={onClickDirection}>
-            {getCardinalDirection(info?.backwards?.bearing)}
+            {getCardinalDirection(t, info?.backwards?.bearing)}
           </Menu.Item>
         </Menu>
       )}
       {info?.[direction] && <RoadStatsTable data={info[direction]} />}
+      {info?.[direction]?.distanceOvertaker?.histogram && (
+        <>
+          <Header as="h5">{t('MapPage.roadInfo.overtakerDistanceDistribution')}</Header>
+          <HistogramChart {...info[direction]?.distanceOvertaker?.histogram} />
+        </>
+      )}
     </>
   )
@@ -156,11 +254,7 @@
       </Source>
     )}

-    {content && (
-      <div className={styles.mapInfoBox}>
-        <Segment loading={loading}>{content}</Segment>
-      </div>
-    )}
+    {content && mapInfoPortal && createPortal(<div className={styles.mapInfoBox}>{content}</div>, mapInfoPortal)}
   </>
 )
}
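A worked reading of selectFromColorMap above: the map passed in is a 'step' expression minus its first two entries, i.e. [defaultColor, stop1, color1, stop2, color2, ...]. Scanning pairs until a stop exceeds the value reproduces the step buckets used on the map (the hex values below translate the rgba colors from mapstyles):

// Shape of colorByDistance()[3][5].slice(2) for the urban zone:
const urbanMap = ['#960000', 1.1, '#ff0000', 1.3, '#ffdc00', 1.5, '#43c800', 1.7, '#439600']

// selectFromColorMap(urbanMap, 1.4) scans: 1.1 > 1.4? no; 1.3 > 1.4? no;
// 1.5 > 1.4? yes -> returns '#ffdc00', the same yellow bucket the 'step'
// expression assigns to a 1.4 m urban overtaking distance.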

Some files were not shown because too many files have changed in this diff.