
refactor: update Django to 4.2 LTS #1545

Closed · wants to merge 26 commits

Changes from 23 commits (26 commits total)
6e591c0
add: new vegetation index
NtskwK Nov 13, 2023
0655a48
Merge branch 'OpenDroneMap:master' into master
NtskwK Nov 22, 2023
426c62d
Merge branch 'OpenDroneMap:master' into master
NtskwK Apr 6, 2024
1567a8b
Merge branch 'OpenDroneMap:master' into master
NtskwK Apr 25, 2024
cac4eeb
Merge branch 'OpenDroneMap:master' into master
NtskwK Apr 30, 2024
f0213e7
Merge branch 'OpenDroneMap:master' into master
NtskwK May 9, 2024
a3ab91f
Merge branch 'OpenDroneMap:master' into master
NtskwK Jun 2, 2024
62c33ec
Merge branch 'OpenDroneMap:master' into master
NtskwK Jul 20, 2024
37e1619
Merge branch 'OpenDroneMap:master' into master
NtskwK Aug 19, 2024
e2e4fb3
style: Update scss style to fit webpack in new version
NtskwK Aug 19, 2024
50c70de
Merge branch 'master' of github.com:NtskwK/WebODM
NtskwK Aug 19, 2024
edb3b99
feat: add .vscode to .gitignore
NtskwK Aug 19, 2024
844297b
feat: remove version tag from docker-compose files
NtskwK Aug 19, 2024
197e72c
feat: update postgresql to 16.1
NtskwK Aug 19, 2024
e1656cd
feat: update webpack config for hash filename
NtskwK Aug 19, 2024
83ab170
refactor: update celery loglevel in worker.sh for new performance
NtskwK Aug 19, 2024
d694ed8
refactor: ingore authentication.py for rest_framework_jwt.authentication
NtskwK Aug 19, 2024
b103c57
feat: add GDAL version check and makemigrations in start.sh
NtskwK Aug 19, 2024
9e1c5b8
refactor: change Docker base image to Ubuntu 20.04
NtskwK Aug 25, 2024
040392f
chore: update rio_tiler to 6.6.1
NtskwK Aug 25, 2024
5289cd1
refactor: update Django to 4.2 LTS
NtskwK Aug 25, 2024
f32feae
feat: create postgis_raster extensions with Django
NtskwK Aug 25, 2024
bda999d
Merge branch 'OpenDroneMap:master' into master
NtskwK Aug 25, 2024
44fc1ba
fix: add slash behind path routes
NtskwK Aug 27, 2024
fe39546
refactor: update Docker base image to Ubuntu 22.04
NtskwK Aug 27, 2024
dc6a15b
refactor: update docker actions
NtskwK Aug 27, 2024
1 change: 1 addition & 0 deletions .gitignore
@@ -95,6 +95,7 @@ node_modules/
webpack-stats.json
pip-selfcheck.json
.idea/
.vscode/
package-lock.json
.cronenv
.initialized
54 changes: 38 additions & 16 deletions Dockerfile
@@ -1,33 +1,55 @@
FROM ubuntu:21.04
FROM ubuntu:20.04
Contributor: Curious about the move from 21 to 20 here. Given the upcoming EOL of 20.04, would it make sense to work on bumping this to 22.04 or higher?

Contributor Author: The Ubuntu version in the webodm_db and webodm_webapp images should be the same, so that apt installs the same package versions in both. Maybe I should update both?

Contributor Author: Done. The base image is Ubuntu 22.04 now, because PDAL cannot be installed from apt on Ubuntu 24.04.

Contributor: Yeah, 24 is still fairly new, but 22 has plenty of life left in it.
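Following up on the thread above, one hypothetical way to keep the two images' package versions aligned is to pin both Dockerfiles to the same base release through a shared build argument (the argument name here is illustrative, not part of the PR):

```dockerfile
# Illustrative sketch, shared by webodm_db and webodm_webapp:
# both Dockerfiles take the release from one build arg, so apt
# resolves identical package versions in both images.
ARG UBUNTU_RELEASE=22.04
FROM ubuntu:${UBUNTU_RELEASE}
```

A compose file could then pass the same `UBUNTU_RELEASE` build arg to both services, making a future bump a one-line change.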

MAINTAINER Piero Toffanin <pt@masseranolabs.com>

ARG TEST_BUILD
ARG DEBIAN_FRONTEND=noninteractive
ENV PYTHONUNBUFFERED 1
ENV PYTHONPATH $PYTHONPATH:/webodm
ENV PROJ_LIB=/usr/share/proj
ENV PYTHONUNBUFFERED=1
ENV PYTHONPATH=$PYTHONPATH:/webodm
ENV NODE_MAJOR=20
ENV PYTHON_MAJOR=3.9
ENV GDAL_VERSION=3.8.5
ENV LD_LIBRARY_PATH=/usr/local/lib

# Prepare directory
ADD . /webodm/
WORKDIR /webodm

# Use old-releases for 21.04
RUN printf "deb http://old-releases.ubuntu.com/ubuntu/ hirsute main restricted\ndeb http://old-releases.ubuntu.com/ubuntu/ hirsute-updates main restricted\ndeb http://old-releases.ubuntu.com/ubuntu/ hirsute universe\ndeb http://old-releases.ubuntu.com/ubuntu/ hirsute-updates universe\ndeb http://old-releases.ubuntu.com/ubuntu/ hirsute multiverse\ndeb http://old-releases.ubuntu.com/ubuntu/ hirsute-updates multiverse\ndeb http://old-releases.ubuntu.com/ubuntu/ hirsute-backports main restricted universe multiverse" > /etc/apt/sources.list
RUN apt-get -o Acquire::Retries=3 -qq update > /dev/null && \
apt-get -o Acquire::Retries=3 -qq install -y --no-install-recommends wget curl git g++ clang make cmake postgresql-client > /dev/null && \

# Install Node.js using new Node install method
RUN apt-get -qq update && apt-get -o Acquire::Retries=3 -qq install -y --no-install-recommends wget curl && \
apt-get -o Acquire::Retries=3 install -y ca-certificates gnupg && \
# Install PDAL, letsencrypt, psql, cron
apt-get -o Acquire::Retries=3 -qq install -y --no-install-recommends binutils pdal certbot gettext tzdata libproj-dev libpq-dev > /dev/null && \

# Install Python in target version
apt-get -qq autoremove -y python3 && \
apt-get -o Acquire::Retries=3 -qq install -y --no-install-recommends python$PYTHON_MAJOR-dev python$PYTHON_MAJOR-full && \
ln -s /usr/bin/python$PYTHON_MAJOR /usr/bin/python && \
curl https://bootstrap.pypa.io/get-pip.py | python && \

echo $(pip -V) && \
echo $(python -V) && \

# Build GDAL from source
wget --no-check-certificate -q https://github.com/OSGeo/gdal/releases/download/v$GDAL_VERSION/gdal-$GDAL_VERSION.tar.gz && \
tar -xzf gdal-$GDAL_VERSION.tar.gz && \
cd gdal-$GDAL_VERSION && mkdir build && cd build && \
cmake .. && cmake --build . -j$(nproc) --target install && \
cd / && rm -rf gdal-$GDAL_VERSION gdal-$GDAL_VERSION.tar.gz && \

# Install pip reqs
cd /webodm && \
pip config set global.index-url https://pypi.tuna.tsinghua.edu.cn/simple && \
pip install --quiet -U pip && \
pip install --quiet -r requirements.txt "boto3==1.34.145" && \

# Install Node.js using new Node install method
apt-get -o Acquire::Retries=3 -qq install -y ca-certificates gnupg && \
mkdir -p /etc/apt/keyrings && \
curl -fsSL https://deb.nodesource.com/gpgkey/nodesource-repo.gpg.key | gpg --dearmor -o /etc/apt/keyrings/nodesource.gpg && \
NODE_MAJOR=20 && \
echo "deb [signed-by=/etc/apt/keyrings/nodesource.gpg] https://deb.nodesource.com/node_$NODE_MAJOR.x nodistro main" | tee /etc/apt/sources.list.d/nodesource.list && \
apt-get -o Acquire::Retries=3 -qq update && apt-get -o Acquire::Retries=3 -qq install -y nodejs && \
# Install Python3, GDAL, PDAL, nginx, letsencrypt, psql
apt-get -o Acquire::Retries=3 -qq update && apt-get -o Acquire::Retries=3 -qq install -y --no-install-recommends python3 python3-pip python3-setuptools python3-wheel git g++ python3-dev python2.7-dev libpq-dev binutils libproj-dev gdal-bin pdal libgdal-dev python3-gdal nginx certbot gettext-base cron postgresql-client-13 gettext tzdata && \
update-alternatives --install /usr/bin/python python /usr/bin/python2.7 1 && update-alternatives --install /usr/bin/python python /usr/bin/python3.9 2 && \
# Install pip reqs
pip install pip==24.0 && pip install -r requirements.txt "boto3==1.14.14" && \
# Setup cron
apt-get -o Acquire::Retries=3 -qq install -y --no-install-recommends nginx cron && \
ln -s /webodm/nginx/crontab /var/spool/cron/crontabs/root && chmod 0644 /webodm/nginx/crontab && service cron start && chmod +x /webodm/nginx/letsencrypt-autogen.sh && \
/webodm/nodeodm/setup.sh && /webodm/nodeodm/cleanup.sh && cd /webodm && \
npm install --quiet -g webpack@5.89.0 && npm install --quiet -g webpack-cli@5.1.4 && npm install --quiet && webpack --mode production && \
@@ -36,7 +58,7 @@ RUN apt-get -qq update && apt-get -o Acquire::Retries=3 -qq install -y --no-inst
python manage.py rebuildplugins && \
python manage.py translate build --safe && \
# Cleanup
apt-get remove -y g++ python3-dev libpq-dev && apt-get autoremove -y && \
apt-get remove -y g++ python2 && apt-get autoremove -y && \
apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* && \
rm /webodm/webodm/secret_key.py

19 changes: 9 additions & 10 deletions app/admin.py
@@ -3,11 +3,10 @@
import zipfile
import shutil

from django.conf.urls import url
from django.contrib import admin
from django.contrib import messages
from django.http import HttpResponseRedirect
from django.urls import reverse
from django.urls import reverse, path, re_path
from django.utils.html import format_html
from guardian.admin import GuardedModelAdmin
from django.contrib.auth.admin import UserAdmin as BaseUserAdmin
@@ -129,23 +128,23 @@ def author(self, obj):
def get_urls(self):
urls = super().get_urls()
custom_urls = [
url(
r'^(?P<plugin_name>.+)/enable/$',
re_path(
'(?P<plugin_name>.+)/enable/',
self.admin_site.admin_view(self.plugin_enable),
name='plugin-enable',
),
url(
r'^(?P<plugin_name>.+)/disable/$',
re_path(
'(?P<plugin_name>.+)/disable/',
self.admin_site.admin_view(self.plugin_disable),
name='plugin-disable',
),
url(
r'^(?P<plugin_name>.+)/delete/$',
re_path(
'(?P<plugin_name>.+)/delete/',
self.admin_site.admin_view(self.plugin_delete),
name='plugin-delete',
),
url(
r'^actions/upload/$',
re_path(
'actions/upload/',
self.admin_site.admin_view(self.plugin_upload),
name='plugin-upload',
),
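The hunk above swaps the removed `django.conf.urls.url` for `re_path` and drops the `^`/`$` anchors. A small stdlib sketch (plain `re`, no Django) of what dropping the anchors means — Django's resolver uses search semantics, so an unanchored pattern also matches longer paths:

```python
import re

# Anchored, as in the old url(r'^(?P<plugin_name>.+)/enable/$', ...)
anchored = re.compile(r'^(?P<plugin_name>.+)/enable/$')
# Unanchored, as in the new re_path('(?P<plugin_name>.+)/enable/', ...)
unanchored = re.compile(r'(?P<plugin_name>.+)/enable/')

print(bool(anchored.search('myplugin/enable/')))        # exact path matches
print(bool(anchored.search('myplugin/enable/extra')))   # trailing segment rejected
print(bool(unanchored.search('myplugin/enable/extra'))) # now accepted as well
```

Whether the looser matching is acceptable depends on how the admin site mounts these URLs; it is worth noting as a behavioral difference, not just a rename.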
14 changes: 8 additions & 6 deletions app/api/authentication.py
@@ -1,6 +1,8 @@
from rest_framework_jwt.authentication import BaseJSONWebTokenAuthentication


class JSONWebTokenAuthenticationQS(BaseJSONWebTokenAuthentication):
def get_jwt_value(self, request):
return request.query_params.get('jwt')
# from rest_framework_jwt.authentication import BaseJSONWebTokenAuthentication
#
# jwt_decode_handler = api_settings.JWT_DECODE_HANDLER
# jwt_get_username_from_payload_handler = api_settings.JWT_PAYLOAD_GET_USERNAME_HANDLER
#
# class JSONWebTokenAuthenticationQS(BaseJSONWebTokenAuthentication):
# def get_jwt_value(self, request):
# return request.query_params.get('jwt')
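This file comments out the `rest_framework_jwt` query-string authenticator rather than porting it. If the same behavior is wanted on top of `djangorestframework-simplejwt` (which this PR adopts in urls.py), the usual pattern is a subclass that reads the token from the query string. The sketch below is hypothetical and uses a local stub in place of simplejwt's `JWTAuthentication` so it is self-contained; the real class does provide `get_validated_token` and `get_user`:

```python
# Stub standing in for rest_framework_simplejwt.authentication.JWTAuthentication.
class JWTAuthentication:
    def get_validated_token(self, raw):
        return {"token": raw}          # real method validates signature/expiry
    def get_user(self, validated):
        return "user-for-" + validated["token"]

class JWTQueryStringAuthentication(JWTAuthentication):
    """Authenticate from ?jwt=... instead of the Authorization header."""
    def authenticate(self, request):
        raw = request.query_params.get("jwt")
        if raw is None:
            return None                # fall through to other authenticators
        validated = self.get_validated_token(raw)
        return self.get_user(validated), validated

class FakeRequest:
    query_params = {"jwt": "abc"}

print(JWTQueryStringAuthentication().authenticate(FakeRequest()))
```

Query-string tokens leak into access logs and referrers, which may be why the class was dropped rather than ported; if it is restored, that trade-off deserves a comment.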
53 changes: 53 additions & 0 deletions app/api/formulas.py
@@ -120,6 +120,59 @@
'help': _('Temperature in Centikelvin degrees.')
},


# new vegetation index

'SIPI': {
'expr': '(N - B) / (N - R)',
'help': _('Structure-Insensitive Pigment Index.')
},
'mNDVI': {
'expr': '(N - Re) / (N + Re - 2 * B)',
'help': _('Modified Normalized Difference Vegetation Index.')
},
'RENDVI': {
'expr': '(N - Re) / (N + Re)',
'help': _('Red Edge Normalized Difference Vegetation Index.')
},
'MTVI_1': {
'expr': '1.2 * (1.2 * (N - G) - 2.5 * (R - G))',
'help': _('Modified Transformed Vegetation Index 1.')
},
'MTVI_2': {
'expr': '1.5 * (1.2 * (N - G) - 2.5 * (R - G)) / (((2 * N + 1) ** 2 - (6 * N - 5 * (R ** 0.5)) - 0.5) ** 0.5)',
'help': _('Modified Transformed Vegetation Index 2.')
},
'GI': {
'expr': 'G / R',
'help': _('Greenness Index.'),
},
'TVI': {
'expr': '0.5 * (120 * (N - G) - 200 * (R - G))',
'help': _('Triangular Vegetation Index.')
},
'MCARI_1': {
'expr': '1.2 * (2.5 * (N - R) - 1.3 * (N - G))',
'help': _('Modified Chlorophyll Absorption Reflectance Index 1.')
},
'CI': {
'expr': 'N / G - 1',
'help': _('Chlorophyll Index.')
},
'SR': {
'expr': 'N / R',
'help': _('Simple Ratio.')
},
'mSR_2': {
'expr': '(N / Re - 1) / ((N / Re) ** 0.5 + 1)',
'help': _('Modified Simple Ratio (red edge).')
},
'IPVI': {
'expr': 'N / (N + R)',
'help': _('Infrared Percentage Vegetation Index.')
},


# more?

'_TESTRB': {
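Illustrative only (this is not WebODM's evaluator): the `'expr'` strings above are applied per-pixel to the band rasters. Two of the new indices computed for a single hypothetical pixel show the arithmetic:

```python
def sipi(n, b, r):
    """Structure-Insensitive Pigment Index: (N - B) / (N - R)."""
    return (n - b) / (n - r)

def ipvi(n, r):
    """Infrared Percentage Vegetation Index: N / (N + R)."""
    return n / (n + r)

# Hypothetical reflectances for a healthy-vegetation pixel.
N, R, B = 0.45, 0.08, 0.05
print(round(sipi(N, B, R), 3))  # 1.081
print(round(ipvi(N, R), 3))     # 0.849
```

Note that several of these expressions divide by band differences (`N - R` in SIPI, for example), so the evaluator's handling of zero denominators matters for flat or saturated pixels.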
34 changes: 18 additions & 16 deletions app/api/tiler.py
@@ -11,12 +11,11 @@
from rio_tiler.errors import TileOutsideBounds
from rio_tiler.utils import has_alpha_band, \
non_alpha_indexes, render, create_cutline
from rio_tiler.utils import _stats as raster_stats
from rio_tiler.models import ImageStatistics, ImageData
from rio_tiler.models import Metadata as RioMetadata
from rio_tiler.utils import get_array_statistics
from rio_tiler.models import BandStatistics
from rio_tiler.profiles import img_profiles
from rio_tiler.colormap import cmap as colormap, apply_cmap
from rio_tiler.io import COGReader
from rio_tiler.io import Reader
from rio_tiler.errors import InvalidColorMapName, AlphaBandWarning
import numpy as np
from .custom_colormaps_helper import custom_colormaps
@@ -97,7 +96,7 @@ def get(self, request, pk=None, project_pk=None, tile_type=""):
if not os.path.isfile(raster_path):
raise exceptions.NotFound()

with COGReader(raster_path) as src:
with Reader(raster_path) as src:
minzoom, maxzoom = get_zoom_safe(src)

return Response({
@@ -164,7 +163,7 @@ def get(self, request, pk=None, project_pk=None, tile_type=""):
if not os.path.isfile(raster_path):
raise exceptions.NotFound()
try:
with COGReader(raster_path) as src:
with Reader(raster_path) as src:
band_count = src.dataset.meta['count']
if boundaries_feature is not None:
boundaries_cutline = create_cutline(src.dataset, boundaries_feature, CRS.from_string('EPSG:4326'))
@@ -187,18 +186,20 @@ def get(self, request, pk=None, project_pk=None, tile_type=""):
data = np.ma.array(data)
data.mask = mask == 0
stats = {
str(b + 1): raster_stats(data[b], percentiles=(pmin, pmax), bins=255, range=hrange)
str(b + 1): get_array_statistics(data[b], percentiles=(pmin, pmax), bins=255, range=hrange)
for b in range(data.shape[0])
}
stats = {b: ImageStatistics(**s) for b, s in stats.items()}
metadata = RioMetadata(statistics=stats, **src.info().dict())
stats = {b: BandStatistics(**s[0]) for b, s in stats.items()}
else:
if (boundaries_cutline is not None) and (boundaries_bbox is not None):
metadata = src.metadata(pmin=pmin, pmax=pmax, hist_options=histogram_options, nodata=nodata
, bounds=boundaries_bbox, vrt_options={'cutline': boundaries_cutline})
stats = src.statistics(percentiles=(pmin, pmax), hist_options=histogram_options, nodata=nodata,
vrt_options={'cutline': boundaries_cutline})
else:
metadata = src.metadata(pmin=pmin, pmax=pmax, hist_options=histogram_options, nodata=nodata)
info = json.loads(metadata.json())
stats = src.statistics(percentiles=(pmin, pmax), hist_options=histogram_options, nodata=nodata)
info = src.info().model_dump()
info['statistics']= {}
for k,v in stats.items():
info['statistics'][k] = v.model_dump()
except IndexError as e:
# Caught when trying to get an invalid raster metadata
raise exceptions.ValidationError("Cannot retrieve raster metadata: %s" % str(e))
Expand All @@ -207,8 +208,9 @@ def get(self, request, pk=None, project_pk=None, tile_type=""):
for b in info['statistics']:
info['statistics'][b]['min'] = hrange[0]
info['statistics'][b]['max'] = hrange[1]
info['statistics'][b]['percentiles'][0] = max(hrange[0], info['statistics'][b]['percentiles'][0])
info['statistics'][b]['percentiles'][1] = min(hrange[1], info['statistics'][b]['percentiles'][1])
info['statistics'][b]['percentiles'] = [None] * 2
info['statistics'][b]['percentiles'][0] = max(hrange[0], info['statistics'][b]['percentile_' + str(int(pmin))])
info['statistics'][b]['percentiles'][1] = min(hrange[1], info['statistics'][b]['percentile_' + str(int(pmax))])

cmap_labels = {
"viridis": "Viridis",
@@ -351,7 +353,7 @@ def get(self, request, pk=None, project_pk=None, tile_type="", z="", x="", y="",
if not os.path.isfile(url):
raise exceptions.NotFound()

with COGReader(url) as src:
with Reader(url) as src:
if not src.tile_exists(z, x, y):
raise exceptions.NotFound(_("Outside of bounds"))

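The core of the tiler change above: newer rio-tiler returns per-band statistics with flat `percentile_N` fields instead of the old `percentiles` list, so the view rebuilds the `[low, high]` pair and clamps it to the requested display range. A minimal sketch of that reshaping (function name and sample values are illustrative):

```python
def rebuild_percentiles(band_stats, pmin, pmax, hrange):
    """Rebuild the old-style [low, high] pair from percentile_N fields,
    clamped to the display range hrange = (lo, hi)."""
    low = max(hrange[0], band_stats['percentile_' + str(int(pmin))])
    high = min(hrange[1], band_stats['percentile_' + str(int(pmax))])
    return [low, high]

band = {'percentile_2': -12.0, 'percentile_98': 310.0}  # example values
print(rebuild_percentiles(band, 2, 98, (0, 255)))       # [0, 255] after clamping
```

Building the key from `int(pmin)` assumes the request's percentiles are whole numbers matching the keys rio-tiler generated; fractional percentiles would need the same formatting on both sides.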
56 changes: 28 additions & 28 deletions app/api/urls.py
@@ -1,4 +1,4 @@
from django.conf.urls import url, include
from django.urls import include, re_path, path

from app.api.presets import PresetViewSet
from app.plugins.views import api_view_handler
@@ -8,7 +8,8 @@
from .processingnodes import ProcessingNodeViewSet, ProcessingNodeOptionsView
from .admin import AdminUserViewSet, AdminGroupViewSet, AdminProfileViewSet
from rest_framework_nested import routers
from rest_framework_jwt.views import obtain_jwt_token
from rest_framework_simplejwt.views import TokenObtainPairView, TokenRefreshView

from .tiler import TileJson, Bounds, Metadata, Tiles, Export
from .potree import Scene, CameraView
from .workers import CheckTask, GetTaskResult
@@ -27,44 +28,43 @@
admin_router = routers.DefaultRouter()
admin_router.register(r'admin/users', AdminUserViewSet, basename='admin-users')
admin_router.register(r'admin/groups', AdminGroupViewSet, basename='admin-groups')
admin_router.register(r'admin/profiles', AdminProfileViewSet, basename='admin-groups')
admin_router.register(r'admin/profiles', AdminProfileViewSet, basename='admin-profiles')

urlpatterns = [
url(r'processingnodes/options/$', ProcessingNodeOptionsView.as_view()),
re_path(r'processingnodes/options/$', ProcessingNodeOptionsView.as_view()),

url(r'^', include(router.urls)),
url(r'^', include(tasks_router.urls)),
url(r'^', include(admin_router.urls)),
re_path(r'^', include(router.urls)),
re_path(r'^', include(tasks_router.urls)),
re_path(r'^', include(admin_router.urls)),

url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<tile_type>orthophoto|dsm|dtm)/tiles\.json$', TileJson.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<tile_type>orthophoto|dsm|dtm)/bounds$', Bounds.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<tile_type>orthophoto|dsm|dtm)/metadata$', Metadata.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<tile_type>orthophoto|dsm|dtm)/tiles/(?P<z>[\d]+)/(?P<x>[\d]+)/(?P<y>[\d]+)\.?(?P<ext>png|jpg|webp)?$', Tiles.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<tile_type>orthophoto|dsm|dtm)/tiles/(?P<z>[\d]+)/(?P<x>[\d]+)/(?P<y>[\d]+)@(?P<scale>[\d]+)x\.?(?P<ext>png|jpg|webp)?$', Tiles.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<asset_type>orthophoto|dsm|dtm|georeferenced_model)/export$', Export.as_view()),
re_path('projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<tile_type>orthophoto|dsm|dtm)/tiles\.json', TileJson.as_view()),
re_path('projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<tile_type>orthophoto|dsm|dtm)/bounds', Bounds.as_view()),
re_path('projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<tile_type>orthophoto|dsm|dtm)/metadata', Metadata.as_view()),
re_path('projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<tile_type>orthophoto|dsm|dtm)/tiles/(?P<z>[\d]+)/(?P<x>[\d]+)/(?P<y>[\d]+)\.?(?P<ext>png|jpg|webp)?', Tiles.as_view()),
re_path('projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<tile_type>orthophoto|dsm|dtm)/tiles/(?P<z>[\d]+)/(?P<x>[\d]+)/(?P<y>[\d]+)@(?P<scale>[\d]+)x\.?(?P<ext>png|jpg|webp)?', Tiles.as_view()),
re_path('projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/(?P<asset_type>orthophoto|dsm|dtm|georeferenced_model)/export', Export.as_view()),

url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/download/(?P<asset>.+)$', TaskDownloads.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/assets/(?P<unsafe_asset_path>.+)$', TaskAssets.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/import$', TaskAssetsImport.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/backup$', TaskBackup.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/images/thumbnail/(?P<image_filename>.+)$', Thumbnail.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/images/download/(?P<image_filename>.+)$', ImageDownload.as_view()),
re_path(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/download/(?P<asset>.+)$', TaskDownloads.as_view()),
re_path(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/assets/(?P<unsafe_asset_path>.+)$', TaskAssets.as_view()),
re_path(r'projects/(?P<project_pk>[^/.]+)/tasks/import$', TaskAssetsImport.as_view()),
re_path(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/backup$', TaskBackup.as_view()),
re_path(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/images/thumbnail/(?P<image_filename>.+)$', Thumbnail.as_view()),
re_path(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/images/download/(?P<image_filename>.+)$', ImageDownload.as_view()),

url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/3d/scene$', Scene.as_view()),
url(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/3d/cameraview$', CameraView.as_view()),
re_path(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/3d/scene$', Scene.as_view()),
re_path(r'projects/(?P<project_pk>[^/.]+)/tasks/(?P<pk>[^/.]+)/3d/cameraview$', CameraView.as_view()),

url(r'workers/check/(?P<celery_task_id>.+)', CheckTask.as_view()),
url(r'workers/get/(?P<celery_task_id>.+)', GetTaskResult.as_view()),
re_path(r'workers/check/(?P<celery_task_id>.+)', CheckTask.as_view()),
re_path(r'workers/get/(?P<celery_task_id>.+)', GetTaskResult.as_view()),

url(r'^auth/', include('rest_framework.urls')),
url(r'^token-auth/', obtain_jwt_token),
path(r'auth', include('rest_framework.urls')),

url(r'^plugins/(?P<plugin_name>[^/.]+)/(.*)$', api_view_handler),
re_path('plugins/(?P<plugin_name>[^/.]+)/(.*)', api_view_handler),
]

if settings.ENABLE_USERS_API:
urlpatterns.append(url(r'users', UsersList.as_view()))
urlpatterns.append(path(r'users', UsersList.as_view()))

if settings.EXTERNAL_AUTH_ENDPOINT != '':
urlpatterns.append(url(r'^external-token-auth/', ExternalTokenAuth.as_view()))
urlpatterns.append(path(r'external-token-auth', ExternalTokenAuth.as_view()))
