Cropsar_px process migration #78
base: main
Conversation
However, I am not sure that including the UDF dependencies within the UDF itself is the best solution. I noticed that the service then takes much more time to run; I guess that is because it is trying to install everything on the fly for each chunk.
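For reference, the on-the-fly pattern under discussion looks roughly like this (a minimal sketch, not the PR's exact code; the archive URL and paths are placeholders). The guard on the shared directory means only the first chunk on a worker pays the download and extraction cost, but every chunk still re-runs the check:

```python
import os
import sys
import zipfile

import requests

# Placeholder values; the real UDF reads these from its constants module.
DEPS_URL = "https://example.org/cropsar_px/venv_cropsar.zip"
DEPS_DIR = "/tmp/cropsar_px_dependencies"


def ensure_dependencies() -> None:
    """Download and extract the dependency archive once per worker."""
    if os.path.isdir(DEPS_DIR) and os.listdir(DEPS_DIR):
        # A previous chunk on this worker already set things up.
        return
    os.makedirs(DEPS_DIR, exist_ok=True)
    zip_path = os.path.join(DEPS_DIR, "deps.zip")
    with requests.get(DEPS_URL, stream=True, timeout=60) as response:
        response.raise_for_status()
        with open(zip_path, "wb") as f:
            for chunk in response.iter_content(chunk_size=1024 * 1024):
                f.write(chunk)
    with zipfile.ZipFile(zip_path) as archive:
        archive.extractall(DEPS_DIR)
    os.remove(zip_path)  # clean up the archive after extraction


def add_to_path() -> None:
    """Make the extracted packages importable inside the UDF."""
    if DEPS_DIR not in sys.path:
        sys.path.insert(0, DEPS_DIR)


ensure_dependencies()
add_to_path()
```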
Just did a superficial check.
"from_parameter": "data" | ||
}, | ||
"runtime": "Python", | ||
"udf": "#!/usr/bin/env python\nimport contextlib as __stickytape_contextlib\n\n@__stickytape_contextlib.contextmanager\ndef __stickytape_temporary_dir():\n import tempfile\n import shutil\n dir_path = tempfile.mkdtemp()\n try:\n yield dir_path\n finally:\n shutil.rmtree(dir_path)\n\nwith __stickytape_temporary_dir() as __stickytape_working_dir:\n def __stickytape_write_module(path, contents):\n import os, os.path\n\n def make_package(path):\n parts = path.split(\"/\")\n partial_path = __stickytape_working_dir\n for part in parts:\n partial_path = os.path.join(partial_path, part)\n if not os.path.exists(partial_path):\n os.mkdir(partial_path)\n with open(os.path.join(partial_path, \"__init__.py\"), \"wb\") as f:\n f.write(b\"\\n\")\n\n make_package(os.path.dirname(path))\n\n full_path = os.path.join(__stickytape_working_dir, path)\n with open(full_path, \"wb\") as module_file:\n module_file.write(contents)\n\n import sys as __stickytape_sys\n __stickytape_sys.path.insert(0, __stickytape_working_dir)\n\n __stickytape_write_module('cropsar_px_openeo/__init__.py', b'from cropsar_px_openeo.config.config import Config\\r\\n\\r\\nconfig = Config()\\r\\n')\n __stickytape_write_module('cropsar_px_openeo/config/__init__.py', b'')\n __stickytape_write_module('cropsar_px_openeo/config/config.py', b'import configparser\\r\\nimport os\\r\\nfrom logging import Logger\\r\\nfrom pathlib import Path\\r\\n\\r\\n\\r\\nclass Config:\\r\\n def __init__(self, environment=os.environ.get(\"CROPSAR_PX_ENV\", \"cdse\")):\\r\\n self.logger = Logger(\"configparser\")\\r\\n self.config = self._load_config(environment=environment)\\r\\n\\r\\n def _get_config_path(self, environment: str) -> Path:\\r\\n \"\"\"\\r\\n Retrieve the full path of the configuration file for a specific environment.\\r\\n :param environment: Name of the environment for which to load the config.\\r\\n :return: Path variable pointing to the configuration of the selected environment\\r\\n \"\"\"\\r\\n return Path(__file__).parent.parent.parent.parent / \"config\" / f\"{environment}.cfg\"\\r\\n\\r\\n def _load_config(self, environment: str) -> configparser.ConfigParser:\\r\\n \"\"\"\\r\\n Load in the config file\\r\\n :param environment: String representing the environment for which to load the config\\r\\n :return:\\r\\n \"\"\"\\r\\n path = self._get_config_path(environment=environment)\\r\\n if path.exists():\\r\\n config = configparser.ConfigParser()\\r\\n config.read(path)\\r\\n self.logger.info(f\"Readed configuration from {path}\")\\r\\n else:\\r\\n config = None\\r\\n self.logger.warning(\\r\\n f\"Could not find config file for environment {environment}, \" f\"please create a file at {path}\"\\r\\n )\\r\\n return config\\r\\n\\r\\n def get_collection_id(self, collection_type: str) -> str:\\r\\n \"\"\"\\r\\n Retrieve the ID of the collection based on the provided type. 
This will be read from the [collections] section\\r\\n within the configuration\\r\\n :param collection_type: Type of the collection for which to get the ID\\r\\n :return: String representing the ID of the openEO collection\\r\\n \"\"\"\\r\\n self.logger.debug(f\"Reading collection {collection_type} from configuration\")\\r\\n return self.config[\"collections\"][collection_type]\\r\\n\\r\\n def get_openeo_url(self) -> str:\\r\\n \"\"\"\\r\\n Read the openEO URL from the config file\\r\\n :return: URL of the openEO endpoint\\r\\n \"\"\"\\r\\n self.logger.debug(\"Reading openEO endpoint from configuration\")\\r\\n return self.config[\"openeo\"][\"url\"]\\r\\n\\r\\n def get_openeo_credentials(self) -> dict:\\r\\n \"\"\"\\r\\n Read the openEO credentials from the config file\\r\\n :return: Dictionary containing the \\'client_id\\', \\'client_secret\\' and \\'provider\\' that can be used to authenticate\\r\\n with openEO\\r\\n \"\"\"\\r\\n self.logger.debug(\"Reading openEO endpoint from configuration\")\\r\\n return {\\r\\n \"id\": self.config[\"openeo\"][\"client_id\"],\\r\\n \"secret\": self.config[\"openeo\"][\"client_secret\"],\\r\\n \"provider\": self.config[\"openeo\"][\"provider\"],\\r\\n }\\r\\n\\r\\n @staticmethod\\r\\n def _get_namespace_url(url: str, namespace: str, service: str):\\r\\n \"\"\"\\r\\n Create the namespace URL to access a certain service from a namespace\\r\\n :param url: Base openEO URL\\r\\n :param namespace: Name of the namespace\\r\\n :param service: ID of the service\\r\\n :return: String representing the URL on which the service is accessible\\r\\n \"\"\"\\r\\n return f\"https://{url}/openeo/processes/{namespace}/{service}\"\\r\\n\\r\\n def get_service_info(self, service: str) -> dict:\\r\\n \"\"\"\\r\\n Read the `id` and `namespace` of a service from the config file\\r\\n :param service: Name of the service from with to read the information\\r\\n :return: Dictionary containing the `id` and `namespace` of the service\\r\\n \"\"\"\\r\\n self.logger.debug(f\"Looking up service information for {service}\")\\r\\n key = f\"service_{service}\"\\r\\n return {\\r\\n \"id\": self.config[key][\"id\"],\\r\\n \"namespace\": self._get_namespace_url(\\r\\n url=self.config[\"openeo\"][\"url\"],\\r\\n namespace=self.config[key][\"namespace\"],\\r\\n service=self.config[key][\"id\"],\\r\\n ),\\r\\n }\\r\\n\\r\\n def get_udp_info(self) -> dict:\\r\\n \"\"\"\\r\\n Return the name and namespace of the UDP process to use for the CropSAR service\\r\\n :return: Dictionary containing the `id` and `namespace` of the service\\r\\n \"\"\"\\r\\n self.logger.debug(\"Looking up the UDP process name\")\\r\\n return {\\r\\n \"id\": self.config[\"udp\"][\"process\"],\\r\\n \"namespace\": self._get_namespace_url(\\r\\n url=self.config[\"openeo\"][\"url\"],\\r\\n namespace=self.config[\"udp\"][\"namespace\"],\\r\\n service=self.config[\"udp\"][\"process\"],\\r\\n ),\\r\\n }\\r\\n\\r\\n def get_udp_summary(self) -> str:\\r\\n \"\"\"\\r\\n Return the summary of the UDP process to use for the CropSAR service\\r\\n :return: String representing the summary of the CropSAR s ervice\\r\\n \"\"\"\\r\\n self.logger.debug(\"Looking up the UDP summary\")\\r\\n return self.config[\"udp\"][\"summary\"]\\r\\n\\r\\n def get_udf_archives(self) -> list:\\r\\n \"\"\"\\r\\n Return the list of archives that should be included when executing the CropSAR UDP\\r\\n :return: List of UDF archives\\r\\n \"\"\"\\r\\n self.logger.debug(\"Looking up the UDF archives\")\\r\\n return 
[f\"{self.config[\\'udp\\'][\\'udf_archive\\']}#tmp/env/venv_cropsar\"]\\r\\n')\n __stickytape_write_module('cropsar_px_openeo/types.py', b'from typing import Literal\\r\\n\\r\\nOutput = Literal[\"NDVI\", \"FAPAR\", \"FCOVER\", \"RGB_NIR\"]\\r\\nOrbitDirection = Literal[\"ASCENDING\", \"DESCENDING\"]\\r\\n')\n __stickytape_write_module('cropsar_px_openeo/udf/__init__.py', b'')\n __stickytape_write_module('cropsar_px_openeo/udf/constants.py', b'TEMPORAL_BUFFER = 80\\r\\nSPATIAL_WINDOW_SIZE = 128\\r\\nSPATIAL_RESOLUTION = 10\\r\\nTEMPORAL_FREQUENCY = \"5D\"\\r\\n\\r\\nDEPENDENCY_ZIP = \"https://artifactory.vgt.vito.be/artifactory/auxdata-public/cropsar_px/2.0.0/venv_cropsar.zip\"\\r\\nDOWNLOAD_CHUNK_SIZE = 1024 * 1024\\r\\nDOWNLOAD_TIMEOUT = 60\\r\\n\\r\\nSENTINEL1_BANDS = (\"VH\", \"VV\")\\r\\nORBIT_DIRECTIONS = (\"ASCENDING\", \"DESCENDING\")\\r\\n\\r\\nSENTINEL2_BANDS = {\"NDVI\": [\"NDVI\"], \"FAPAR\": [\"FAPAR\"], \"FCOVER\": [\"FCOVER\"]}\\r\\nSENTINEL2_NDVI = \"NDVI\"\\r\\n\\r\\nMODEL_URLS = {\\r\\n \"NDVI\": \"https://artifactory.vgt.vito.be/artifactory/auxdata-public/cropsar_px/models/20230628T094900_cnn_transformer_multi_repr2_ndvi_only.zip\",\\r\\n \"FAPAR\": \"https://artifactory.vgt.vito.be/artifactory/auxdata-public/cropsar_px/models/20230628T100340_cnn_transformer_multi_repr2_fapar_only.zip\",\\r\\n \"FCOVER\": \"https://artifactory.vgt.vito.be/artifactory/auxdata-public/cropsar_px/models/20230629T081418_cnn_transformer_multi_repr2_fcover_only.zip\",\\r\\n \"RGB_NIR\": \"https://artifactory.vgt.vito.be/artifactory/auxdata-public/cropsar_px/models/20230629T041857_cnn_transformer_multi_repr2_rgb_nir.zip\",\\r\\n}\\r\\n')\n __stickytape_write_module('cropsar_px_openeo/udf/preprocess/__init__.py', b'')\n __stickytape_write_module('cropsar_px_openeo/udf/preprocess/s1.py', b'import numpy as np\\r\\nimport pandas\\r\\nimport xarray\\r\\n\\r\\nfrom cropsar_px_openeo.udf.constants import ORBIT_DIRECTIONS, SENTINEL1_BANDS\\r\\nfrom cropsar_px_openeo.udf.preprocess import speckle\\r\\nfrom cropsar_px_openeo.utils.logger import Logger\\r\\n\\r\\nlogger = Logger(__name__, udf=True)\\r\\n\\r\\n\\r\\ndef prepare_s1(\\r\\n array: xarray.DataArray,\\r\\n temporal_index: pandas.DatetimeIndex,\\r\\n) -> xarray.DataArray:\\r\\n \"\"\"\\r\\n Prepare the Sentinel-1 data to be used as input for the CropSAR_px model.\\r\\n :param array: input data array\\r\\n :param temporal_index: temporal index\\r\\n :return: Sentinel-1 data in dimension order (t, bands, y, x)\\r\\n \"\"\"\\r\\n # select Sentinel-1 bands, input is in power units\\r\\n s1 = filter_s1_bands(array)\\r\\n s1 = multitemporal_speckle_filter(s1)\\r\\n s1 = resample_s1(s1, temporal_index)\\r\\n s1 = s1.transpose(\"t\", \"bands\", \"y\", \"x\")\\r\\n return to_dB(s1)\\r\\n\\r\\n\\r\\ndef to_dB(array: xarray.DataArray) -> xarray.DataArray:\\r\\n return 10 * np.log10(array)\\r\\n\\r\\n\\r\\ndef filter_s1_bands(array: xarray.DataArray) -> xarray.DataArray:\\r\\n \"\"\"\\r\\n Filter the Sentinel-1 bands from the input array.\\r\\n If the input array contains Sentinel-1 bands for both orbit directions, the best one will be selected.\\r\\n :param array: input data array\\r\\n :return: filtered data array containing VV and VH bands\\r\\n \"\"\"\\r\\n if all(band in array.bands for band in SENTINEL1_BANDS):\\r\\n logger.info(f\"Using provided Sentinel-1 bands {SENTINEL1_BANDS}\")\\r\\n return array.sel(bands=list(SENTINEL1_BANDS))\\r\\n\\r\\n logger.info(\"Selecting best orbit direction for Sentinel-1 bands\")\\r\\n # select one of the bands for 
both orbit directions\\r\\n band_orbs = array.sel(bands=[f\"{SENTINEL1_BANDS[0]}_{orbit_direction}\" for orbit_direction in ORBIT_DIRECTIONS])\\r\\n # count the number of valid pixels, take the one with the most\\r\\n best_band = band_orbs.bands[band_orbs.count(dim=(\"x\", \"y\", \"t\")).argmax()].item()\\r\\n # derive the direction from the band name\\r\\n best_direction = best_band.split(\"_\")[-1]\\r\\n logger.info(f\"Selecting {best_direction} orbit direction\")\\r\\n # get the bands for the best orbit direction\\r\\n s1 = array.sel(bands=[f\"{band}_{best_direction}\" for band in SENTINEL1_BANDS])\\r\\n # rename them to VH and VV\\r\\n s1[\"bands\"] = list(SENTINEL1_BANDS)\\r\\n return s1\\r\\n\\r\\n\\r\\ndef multitemporal_speckle_filter(array: xarray.DataArray) -> xarray.DataArray:\\r\\n \"\"\"\\r\\n Apply a multi-temporal speckle filter to the Sentinel-1 data in the input array.\\r\\n :param array: Sentinel-1 input data\\r\\n :return: Sentinel-1 data with multi-temporal speckle filter applied\\r\\n \"\"\"\\r\\n array = array.transpose(\"bands\", \"t\", \"y\", \"x\")\\r\\n for band in array.bands:\\r\\n data = array.loc[dict(bands=band)].values\\r\\n # Speckle filtering uses 0 as nodata\\r\\n idxnodata = np.isnan(data)\\r\\n data[idxnodata] = 0\\r\\n\\r\\n filtered = np.rollaxis(\\r\\n speckle.mtfilter(\\r\\n np.rollaxis(data, 0, 3), # input shape for mtfilter: (rows, cols, t)\\r\\n \"gamma\",\\r\\n ),\\r\\n 2,\\r\\n 0,\\r\\n ) # go back to shape: (t, rows, cols)\\r\\n\\r\\n filtered[idxnodata] = np.nan\\r\\n array.loc[dict(bands=band)] = filtered\\r\\n\\r\\n return array\\r\\n\\r\\n\\r\\ndef resample_s1(array: xarray.DataArray, temporal_index: pandas.DatetimeIndex) -> xarray.DataArray:\\r\\n \"\"\"\\r\\n Resample the Sentinel-1 data to 5-day frequency and interpolate NaN values.\\r\\n :param array: Sentinel-1 data\\r\\n :param temporal_index: temporal index with 5-day frequency\\r\\n :return: resampled Sentinel-1 data\\r\\n \"\"\"\\r\\n return (\\r\\n array.resample(t=temporal_index.freqstr)\\r\\n .mean(skipna=True)\\r\\n .interpolate_na(dim=\"t\", method=\"linear\")\\r\\n .reindex(t=temporal_index, method=\"ffill\", tolerance=temporal_index.freqstr)\\r\\n )\\r\\n')\n __stickytape_write_module('cropsar_px_openeo/udf/preprocess/speckle.py', b'import numpy as np\\r\\nfrom scipy import ndimage\\r\\n\\r\\n# based on https://git.vito.be/projects/LCLU/repos/satio/browse/satio/utils/speckle.py\\r\\n\\r\\n\\r\\ndef mtfilter(stack, kernel, mtwin=7, enl=3):\\r\\n \"\"\"\\r\\n stack: np array with multi-temporal stack of backscatter images (linear\\r\\n scale)\\r\\n\\r\\n kernel: \\'mean\\',\\'gauss\\',\\'gamma\\' - \\'gamma\\' is recommended (slower than the\\r\\n other kernels though)\\r\\n\\r\\n mtwin: filter window size - recommended mtwin=7\\r\\n\\r\\n enl: only required for kernel \\'gamma\\' - recommended for S1 enl = 3\\r\\n \"\"\"\\r\\n rows, cols, layers = stack.shape\\r\\n filtim = np.zeros((rows, cols, layers))\\r\\n\\r\\n rcs = image_sum = image_num = image_fil = None # pylance unbound warning\\r\\n\\r\\n for no in range(0, layers):\\r\\n # Initiate arrays\\r\\n if no == 0:\\r\\n image_sum = np.zeros((rows, cols))\\r\\n image_num = np.zeros((rows, cols))\\r\\n image_fil = np.zeros((rows, cols, layers))\\r\\n\\r\\n if kernel == \"mean\":\\r\\n rcs = ndimage.uniform_filter(stack[:, :, no], size=mtwin, mode=\"mirror\")\\r\\n elif kernel == \"gauss\":\\r\\n rcs = ndimage.gaussian_filter(stack[:, :, no], mtwin / 4, mode=\"mirror\")\\r\\n elif kernel == \"gamma\":\\r\\n rcs = 
GammaMAP(stack[:, :, no], mtwin, enl, 0)\\r\\n\\r\\n with np.errstate(divide=\"ignore\", invalid=\"ignore\"):\\r\\n ratio = stack[:, :, no] / rcs\\r\\n ratio[np.isnan(ratio)] = 0\\r\\n\\r\\n image_sum = image_sum + ratio\\r\\n image_num = image_num + (ratio > 0)\\r\\n image_fil[:, :, no] = rcs\\r\\n\\r\\n with np.errstate(invalid=\"ignore\"):\\r\\n for no in range(0, layers):\\r\\n im = stack[:, :, no]\\r\\n filtim1 = image_fil[:, :, no] * image_sum / image_num\\r\\n filtim1[np.isnan(filtim1)] = 0\\r\\n fillmask = (filtim1 == 0) & (im > 0)\\r\\n filtim1[fillmask] = im[fillmask]\\r\\n mask = im > 0\\r\\n filtim1[mask == 0] = im[mask == 0]\\r\\n filtim[:, :, no] = filtim1\\r\\n\\r\\n return filtim\\r\\n\\r\\n\\r\\ndef GammaMAP(band, size, ENL, ndv):\\r\\n img = band\\r\\n img[band == ndv] = 0.0\\r\\n sig_v2 = 1.0 / ENL\\r\\n ENL2 = ENL + 1.0\\r\\n sfak = 1.0 + sig_v2\\r\\n img_mean2 = ndimage.uniform_filter(pow(img, 2), size=size)\\r\\n img_mean2[img == ndv] = 0.0\\r\\n img_mean = ndimage.uniform_filter(img, size=size)\\r\\n img_mean[img == ndv] = 0.0\\r\\n var_z = img_mean2 - pow(img_mean, 2)\\r\\n out = img_mean\\r\\n\\r\\n with np.errstate(divide=\"ignore\", invalid=\"ignore\"):\\r\\n fact1 = var_z / pow(img_mean, 2)\\r\\n fact1[np.isnan(fact1)] = 0\\r\\n\\r\\n mask = (fact1 > sig_v2) & ((var_z - pow(img_mean, 2) * sig_v2) > 0.0)\\r\\n\\r\\n if mask.any():\\r\\n n = (pow(img_mean, 2) * sfak) / (var_z - pow(img_mean, 2) * sig_v2)\\r\\n phalf = (img_mean * (ENL2 - n)) / (2 * n)\\r\\n q = ENL * img_mean * img / n\\r\\n out[mask] = -phalf[mask] + np.sqrt(pow(phalf[mask], 2) + q[mask])\\r\\n\\r\\n out[img == 0.0] = ndv\\r\\n return out\\r\\n')\n __stickytape_write_module('cropsar_px_openeo/utils/__init__.py', b'')\n __stickytape_write_module('cropsar_px_openeo/utils/logger.py', b'import logging\\r\\nfrom typing import Any\\r\\n\\r\\nfrom openeo.udf import inspect\\r\\n\\r\\n\\r\\nclass Logger:\\r\\n \"\"\"\\r\\n Custom logger instance to support default and\\r\\n UDF logging (https://open-eo.github.io/openeo-python-client/udf.html#logging-from-a-udf)\\r\\n \"\"\"\\r\\n\\r\\n def __init__(self, name: str, udf=False):\\r\\n \"\"\"\\r\\n Create a new logger instance\\r\\n :param name: Name of the logger instance to use\\r\\n :param udf: Flag indicating if the logger is used within a UDF (default: False)\\r\\n \"\"\"\\r\\n self.udf = udf\\r\\n self.logger = logging.getLogger(name)\\r\\n\\r\\n def debug(self, message: str, data: Any = None):\\r\\n self.logger.debug(message)\\r\\n self._inspect(data, message, \"debug\")\\r\\n\\r\\n def info(self, message: str, data: Any = None):\\r\\n self.logger.info(message)\\r\\n self._inspect(data, message, \"info\")\\r\\n\\r\\n def warn(self, message: str, data: Any = None):\\r\\n self.logger.warning(message)\\r\\n self._inspect(data, message, \"warning\")\\r\\n\\r\\n def error(self, message: str, data: Any = None):\\r\\n self.logger.error(message)\\r\\n self._inspect(data, message, \"error\")\\r\\n\\r\\n def _inspect(self, data: Any, message: str, level: str):\\r\\n if self.udf:\\r\\n inspect(data=data, message=message, level=level)\\r\\n')\n __stickytape_write_module('cropsar_px_openeo/udf/preprocess/s2.py', b'import numpy\\r\\nimport pandas\\r\\nimport xarray\\r\\n\\r\\nfrom cropsar_px_openeo.udf.constants import SENTINEL2_BANDS, SENTINEL2_NDVI\\r\\nfrom cropsar_px_openeo.udf.preprocess.filter_dips import flaglocalminima\\r\\n\\r\\n\\r\\ndef prepare_s2(array: xarray.DataArray, output: str, temporal_index: pandas.DatetimeIndex) -> xarray.DataArray:\\r\\n 
\"\"\"\\r\\n Prepare the Sentinel-2 data to be used as input for the CropSAR_px model.\\r\\n :param array: input data array\\r\\n :param output: output type\\r\\n :param temporal_index: temporal index\\r\\n :return: Sentinel-2 data in dimension order (t, bands, y, x)\\r\\n \"\"\"\\r\\n s2 = filter_s2_bands(array, output)\\r\\n ndvi = get_ndvi(array)\\r\\n s2 = multitemporal_mask(s2, ndvi)\\r\\n s2 = resample_s2(s2, temporal_index)\\r\\n return s2.transpose(\"t\", \"bands\", \"y\", \"x\")\\r\\n\\r\\n\\r\\ndef filter_s2_bands(array: xarray.DataArray, output: str) -> xarray.DataArray:\\r\\n \"\"\"\\r\\n Filter out the Sentinel-2 bands based on the output type.\\r\\n :param array: input data array\\r\\n :param output: output variable\\r\\n :return: filtered Sentinel-2 data array\\r\\n \"\"\"\\r\\n return array.sel(bands=SENTINEL2_BANDS[output])\\r\\n\\r\\n\\r\\ndef get_ndvi(array: xarray.DataArray) -> xarray.DataArray:\\r\\n return array.sel(bands=SENTINEL2_NDVI)\\r\\n\\r\\n\\r\\ndef resample_s2(array: xarray.DataArray, temporal_index: pandas.DatetimeIndex) -> xarray.DataArray:\\r\\n \"\"\"\\r\\n Resample the Sentinel-2 data to 5-day frequency, selecting the best acquisitions.\\r\\n :param array: Sentinel-2 data\\r\\n :param temporal_index: temporal index with 5-day frequency\\r\\n :return: resampled Sentinel-2 data\\r\\n \"\"\"\\r\\n return (\\r\\n array.resample(t=temporal_index.freqstr)\\r\\n .map(_take_best_acquisition)\\r\\n .reindex(t=temporal_index, method=\"ffill\", tolerance=temporal_index.freqstr)\\r\\n )\\r\\n\\r\\n\\r\\ndef _take_best_acquisition(group: xarray.DataArray) -> xarray.DataArray:\\r\\n \"\"\"\\r\\n Take the best acquisition from a group based on the number of not NaN pixels.\\r\\n :param group: group of acquisitions\\r\\n :return: best acquisition\\r\\n \"\"\"\\r\\n return group.isel(t=group.notnull().sum(dim=[\"bands\", \"x\", \"y\"]).argmax())\\r\\n\\r\\n\\r\\ndef multitemporal_mask(s2: xarray.DataArray, ndvi: xarray.DataArray) -> xarray.DataArray:\\r\\n \"\"\"\\r\\n Filter out dips in the timeseries by performing multi-temporal dip detection.\\r\\n The multi-temporal dip detection is performed on the NDVI data, the result is then applied to the Sentinel-2 data.\\r\\n :param s2: Sentinel-2 data\\r\\n :param ndvi: NDVI data\\r\\n :return: masked Sentinel-2 data\\r\\n \"\"\"\\r\\n ndvi_mask = multitemporal_mask_ndvi(ndvi)\\r\\n return s2.where(ndvi_mask.notnull())\\r\\n\\r\\n\\r\\ndef multitemporal_mask_ndvi(ndvi: xarray.DataArray) -> xarray.DataArray:\\r\\n \"\"\"\\r\\n Apply multi-temporal dip detection to NDVI data.\\r\\n :param ndvi: NDVI data\\r\\n :return: masked NDVI data\\r\\n \"\"\"\\r\\n timestamps = list(ndvi.t.values)\\r\\n daily_daterange = pandas.date_range(\\r\\n start=timestamps[0], end=timestamps[-1] + pandas.Timedelta(days=1), freq=\"D\"\\r\\n ).floor(\"D\")\\r\\n ndvi_daily = ndvi.reindex(t=daily_daterange, method=\"bfill\", tolerance=\"1D\")\\r\\n\\r\\n # run multi-temporal dip detection\\r\\n step = 256\\r\\n for idx in numpy.r_[: ndvi_daily.values.shape[1] : step]:\\r\\n for idy in numpy.r_[: ndvi_daily.values.shape[2] : step]:\\r\\n ndvi_daily.values[:, idx : idx + step, idy : idy + step] = flaglocalminima(\\r\\n ndvi_daily.values[:, idx : idx + step, idy : idy + step],\\r\\n maxdip=0.01,\\r\\n maxdif=0.1,\\r\\n maxgap=60,\\r\\n maxpasses=5,\\r\\n )\\r\\n # get the original timestamps\\r\\n return ndvi_daily.sel(t=timestamps, method=\"ffill\", tolerance=\"1D\")\\r\\n')\n __stickytape_write_module('cropsar_px_openeo/udf/preprocess/filter_dips.py', 
b'import numbers\\r\\nfrom typing import Union\\r\\n\\r\\nimport numpy as np\\r\\n\\r\\nfrom cropsar_px_openeo.utils.logger import Logger\\r\\n\\r\\nlogger = Logger(__name__, udf=True)\\r\\n\\r\\n# based on https://github.com/WorldCereal/worldcereal-classification/blob/v1.1.1/src/worldcereal/utils/masking.py\\r\\n\\r\\n\\r\\ndef flaglocalminima(\\r\\n npdatacube: np.ndarray,\\r\\n maxdip: Union[float, None] = None,\\r\\n maxdif: Union[float, None] = None,\\r\\n maxgap: Union[int, None] = None,\\r\\n maxpasses: int = 1,\\r\\n verbose: bool = True,\\r\\n):\\r\\n \"\"\"\\r\\n Remove dips and difs (replace by np.nan) from the input npdatacube.\\r\\n\\r\\n dip on position i:\\r\\n (xn - xi) < (n-l) * maxdip AND (xm - xi) < (m-i) * maxdip\\r\\n n first not-None position with value \\'left\\' of i\\r\\n m first not-None position with value \\'right\\' of i\\r\\n\\r\\n dif on position i:\\r\\n (xn - xi) < (n-l) * maxdif OR (xm - xi) < (m-i) * maxdif\\r\\n n first not-None position with value \\'left\\' of i\\r\\n m first not-None position with value \\'right\\' of i\\r\\n \"\"\"\\r\\n return _flaglocalextrema_ct(\\r\\n npdatacube, maxdip, maxdif, maxgap=maxgap, maxpasses=maxpasses, doflagmaxima=False, verbose=verbose\\r\\n )\\r\\n\\r\\n\\r\\ndef _flaglocalextrema_ct(\\r\\n npdatacube: np.ndarray,\\r\\n maxdip: Union[float, None],\\r\\n maxdif: Union[float, None],\\r\\n maxgap: Union[int, None] = None,\\r\\n maxpasses: int = 1,\\r\\n doflagmaxima: bool = False,\\r\\n verbose: bool = True,\\r\\n):\\r\\n def slopeprev(npdatacube, maxgap):\\r\\n \"\"\" \"\"\"\\r\\n shiftedval = np.full_like(npdatacube, np.nan, dtype=float)\\r\\n shifteddis = np.full_like(npdatacube, 1, dtype=int)\\r\\n numberofrasters = npdatacube.shape[0]\\r\\n shiftedval[1:numberofrasters, ...] = npdatacube[0 : numberofrasters - 1, ...]\\r\\n\\r\\n if np.isscalar(npdatacube[0]):\\r\\n nans = np.isnan(npdatacube)\\r\\n for iIdx in range(1, numberofrasters):\\r\\n if nans[iIdx - 1]:\\r\\n shiftedval[iIdx] = shiftedval[iIdx - 1] # can still be nan in case series started with nan\\r\\n shifteddis[iIdx] = shifteddis[iIdx - 1] + 1\\r\\n\\r\\n else:\\r\\n for iIdx in range(1, numberofrasters):\\r\\n nans = np.isnan(npdatacube[iIdx - 1])\\r\\n shiftedval[iIdx][nans] = shiftedval[iIdx - 1][nans]\\r\\n shifteddis[iIdx][nans] = shifteddis[iIdx - 1][nans] + 1\\r\\n\\r\\n slopetoprev = (shiftedval - npdatacube) / shifteddis\\r\\n comparable = ~np.isnan(slopetoprev)\\r\\n if maxgap is not None:\\r\\n comparable &= shifteddis <= maxgap\\r\\n\\r\\n return slopetoprev, comparable\\r\\n\\r\\n def slopenext(npdatacube, maxgap):\\r\\n \"\"\" \"\"\"\\r\\n shiftedval = np.full_like(npdatacube, np.nan, dtype=float)\\r\\n shifteddis = np.full_like(npdatacube, 1, dtype=int)\\r\\n numberofrasters = npdatacube.shape[0]\\r\\n shiftedval[0 : numberofrasters - 1, ...] 
= npdatacube[1:numberofrasters, ...]\\r\\n\\r\\n if np.isscalar(npdatacube[0]):\\r\\n nans = np.isnan(npdatacube)\\r\\n for iIdx in range(numberofrasters - 2, -1, -1):\\r\\n if nans[iIdx + 1]:\\r\\n shiftedval[iIdx] = shiftedval[iIdx + 1] # can still be nan in case series started with nan\\r\\n shifteddis[iIdx] = shifteddis[iIdx + 1] + 1\\r\\n\\r\\n else:\\r\\n for iIdx in range(numberofrasters - 2, -1, -1):\\r\\n nans = np.isnan(npdatacube[iIdx + 1])\\r\\n shiftedval[iIdx][nans] = shiftedval[iIdx + 1][nans]\\r\\n shifteddis[iIdx][nans] = shifteddis[iIdx + 1][nans] + 1\\r\\n\\r\\n slopetonext = (shiftedval - npdatacube) / shifteddis\\r\\n comparable = ~np.isnan(slopetonext)\\r\\n if maxgap is not None:\\r\\n comparable &= shifteddis <= maxgap\\r\\n\\r\\n return slopetonext, comparable\\r\\n\\r\\n def masklocalminima(slopesraster, thresholdvalue):\\r\\n return slopesraster > thresholdvalue\\r\\n\\r\\n def masklocalmaxima(slopesraster, thresholdvalue):\\r\\n return slopesraster < thresholdvalue\\r\\n\\r\\n maskextrema = masklocalmaxima if doflagmaxima else masklocalminima\\r\\n\\r\\n if maxdip is not None and (not isinstance(maxdip, numbers.Real) or (float(maxdip) != maxdip) or (maxdip <= 0)):\\r\\n raise ValueError(\"maxdip must be positive number or None\")\\r\\n if maxdif is not None and (not isinstance(maxdif, numbers.Real) or (float(maxdif) != maxdif) or (maxdif <= 0)):\\r\\n raise ValueError(\"maxdif must be positive number or None\")\\r\\n if maxgap is not None and (not isinstance(maxgap, numbers.Real) or (int(maxgap) != maxgap) or (maxgap <= 0)):\\r\\n raise ValueError(\"maxgap must be positive integer or None\")\\r\\n\\r\\n initialnumberofvalues = np.sum(~np.isnan(npdatacube))\\r\\n previousnumberofvalues = initialnumberofvalues\\r\\n for iteration in range(maxpasses):\\r\\n prevslope, prevcomparable = slopeprev(npdatacube, maxgap)\\r\\n nextslope, nextcomparable = slopenext(npdatacube, maxgap)\\r\\n\\r\\n isdip = None\\r\\n if maxdip is not None:\\r\\n isdip = prevcomparable & nextcomparable\\r\\n isdip[isdip] = isdip[isdip] & maskextrema(prevslope[isdip], maxdip)\\r\\n isdip[isdip] = isdip[isdip] & maskextrema(nextslope[isdip], maxdip)\\r\\n\\r\\n isdif = None\\r\\n if maxdif is not None:\\r\\n isdif = np.full_like(npdatacube, False, dtype=bool)\\r\\n isdif[prevcomparable] = isdif[prevcomparable] | maskextrema(prevslope[prevcomparable], maxdif)\\r\\n isdif[nextcomparable] = isdif[nextcomparable] | maskextrema(nextslope[nextcomparable], maxdif)\\r\\n\\r\\n if isdip is not None:\\r\\n npdatacube[isdip] = np.nan\\r\\n if isdif is not None:\\r\\n npdatacube[isdif] = np.nan\\r\\n\\r\\n remainingnumberofvalues = np.sum(~np.isnan(npdatacube))\\r\\n removednumberofvalues = previousnumberofvalues - remainingnumberofvalues\\r\\n if verbose:\\r\\n logger.debug(\\r\\n \"localextrema_ct pass(%s) - removed %s values. %s values remaining. 
%s values removed in total\"\\r\\n % (\\r\\n iteration + 1,\\r\\n removednumberofvalues,\\r\\n remainingnumberofvalues,\\r\\n initialnumberofvalues - remainingnumberofvalues,\\r\\n )\\r\\n )\\r\\n previousnumberofvalues = remainingnumberofvalues\\r\\n if removednumberofvalues <= 0 and maxpasses > 1:\\r\\n if verbose:\\r\\n logger.debug(\"localextrema_ct pass(%s) - exits\" % (iteration + 1))\\r\\n break\\r\\n\\r\\n return npdatacube\\r\\n')\n __stickytape_write_module('cropsar_px_openeo/udf/utils/models.py', b'import os\\r\\nfrom pathlib import Path\\r\\n\\r\\nfrom cropsar_px_openeo.types import Output\\r\\nfrom cropsar_px_openeo.udf.constants import MODEL_URLS\\r\\nfrom cropsar_px_openeo.udf.utils.helpers import download, unzip\\r\\nfrom cropsar_px_openeo.utils.logger import Logger\\r\\nfrom vito_cropsar.models import InpaintingCnnTransformer\\r\\n\\r\\nlogger = Logger(__name__, udf=True)\\r\\n\\r\\n\\r\\ndef load_model(output: Output) -> InpaintingCnnTransformer:\\r\\n \"\"\"\\r\\n Load the prediction model based on the selected output type.\\r\\n\\r\\n :param output: str\\r\\n :return: prediction model\\r\\n \"\"\"\\r\\n model_url = MODEL_URLS[output]\\r\\n model_zip = os.path.basename(model_url)\\r\\n model_name, _ext = os.path.splitext(model_zip)\\r\\n model_dir = Path(\"tmp/models\")\\r\\n model_dir.mkdir(parents=True, exist_ok=True)\\r\\n\\r\\n zip_path = model_dir / model_zip\\r\\n model_path = model_dir / model_name\\r\\n\\r\\n if not zip_path.exists() or not model_path.exists():\\r\\n logger.debug(\"Could not find model file locally\")\\r\\n download(model_url, zip_path)\\r\\n unzip(zip_path, model_path)\\r\\n else:\\r\\n logger.debug(\"Found model file locally\")\\r\\n\\r\\n return InpaintingCnnTransformer.load(model_path)\\r\\n')\n __stickytape_write_module('cropsar_px_openeo/udf/utils/helpers.py', b'import os\\r\\nimport zipfile\\r\\nfrom typing import Union\\r\\n\\r\\nimport requests\\r\\nfrom cropsar_px_openeo.udf.constants import DOWNLOAD_CHUNK_SIZE, DOWNLOAD_TIMEOUT\\r\\nfrom cropsar_px_openeo.utils.logger import Logger\\r\\n\\r\\nlogger = Logger(__name__, udf=True)\\r\\n\\r\\n\\r\\ndef download(url: str, file_path: Union[str, os.PathLike]):\\r\\n \"\"\"\\r\\n Download a file from a URL and save it to the specified path.\\r\\n :param url: URL to download\\r\\n :param file_path: path to save the file to\\r\\n \"\"\"\\r\\n logger.debug(f\"Downloading {url} to {file_path}\")\\r\\n with requests.get(url, stream=True, timeout=DOWNLOAD_TIMEOUT) as r:\\r\\n r.raise_for_status()\\r\\n with open(file_path, \"wb\") as f:\\r\\n for chunk in r.iter_content(chunk_size=DOWNLOAD_CHUNK_SIZE):\\r\\n f.write(chunk)\\r\\n\\r\\n\\r\\ndef unzip(zip_path: Union[str, os.PathLike], target_path: Union[str, os.PathLike]):\\r\\n \"\"\"\\r\\n Unzip a ZIP-file to the target path.\\r\\n :param zip_path: path of the ZIP-file\\r\\n :param target_path: target path\\r\\n \"\"\"\\r\\n logger.debug(f\"Unzipping {zip_path} to {target_path}\")\\r\\n with zipfile.ZipFile(zip_path) as z:\\r\\n z.extractall(target_path)\\r\\n')\n # ruff: noqa: E402\n # set the required dependencies\n import functools\n import os\n import sys\n from pathlib import Path\n import requests\n import os\n import zipfile\n import sys\n from openeo.udf import inspect\n # Fixed directories for dependencies and model files\n stickytape_dir = sys.path[0]\n if 'tmp' not in stickytape_dir:\n stickytape_dir = os.getcwd()\n DEPENDENCIES_DIR = str(os.path.join(stickytape_dir, 'cropsar_px_dependencies'))\n 
DEPENDENCIES_URL='https://artifactory.vgt.vito.be/artifactory/auxdata-public/cropsar_px/2.0.0/venv_cropsar_lite.zip#tmp/env/venv_cropsar'\n \n def download_file(url, path):\n \"\"\"\n Downloads a file from the given URL to the specified path.\n \"\"\"\n response = requests.get(url, stream=True)\n with open(path, \"wb\") as file:\n file.write(response.content)\n \n \n def extract_zip(zip_path, extract_to):\n \"\"\"\n Extracts a zip file from zip_path to the specified extract_to directory.\n \"\"\"\n with zipfile.ZipFile(zip_path, \"r\") as zip_ref:\n zip_ref.extractall(extract_to)\n os.remove(zip_path) # Clean up the zip file after extraction\n \n \n def add_directory_to_sys_path(directory):\n \"\"\"\n Adds a directory to the Python sys.path if it's not already present.\n \"\"\"\n if directory not in sys.path:\n sys.path.insert(0, directory)\n \n @functools.lru_cache(maxsize=5)\n def setup_dependencies(dependencies_url=DEPENDENCIES_URL):\n \"\"\"\n Main function to set up the dependencies by downloading, extracting,\n and adding necessary directories to sys.path.\n \"\"\"\n \n inspect(message=\"Create directories\")\n # Ensure base directories exist\n os.makedirs(DEPENDENCIES_DIR, exist_ok=True)\n \n # Download and extract dependencies if not already present\n if not os.listdir(DEPENDENCIES_DIR):\n \n inspect(message=\"Extract dependencies\")\n zip_path = os.path.join(DEPENDENCIES_DIR, \"temp.zip\")\n download_file(dependencies_url, zip_path)\n extract_zip(zip_path, DEPENDENCIES_DIR)\n \n # Add the extracted dependencies directory to sys.path\n add_directory_to_sys_path(DEPENDENCIES_DIR)\n inspect(message=\"Added to the sys path\")\n setup_dependencies()\n \n import time\n import numpy\n import pandas\n import xarray\n import sys\n from openeo.udf import XarrayDataCube\n from openeo.udf import inspect\n \n start = time.time()\n \n from cropsar_px_openeo.types import Output\n from cropsar_px_openeo.udf.constants import SENTINEL2_BANDS, TEMPORAL_BUFFER, TEMPORAL_FREQUENCY\n from cropsar_px_openeo.udf.preprocess.s1 import prepare_s1\n from cropsar_px_openeo.udf.preprocess.s2 import prepare_s2\n \n try:\n from cropsar_px_openeo.udf.utils.models import load_model\n except ImportError:\n inspect(data=sys.path,message=\"Failed to import load_model\")\n \n from cropsar_px_openeo.utils.logger import Logger\n from vito_cropsar.inference.predict_arbitrary_shape import main as predict_arbitrary_shape\n logger = Logger(__name__, udf=True)\n \n def apply_datacube(cube: XarrayDataCube, context: dict) -> XarrayDataCube: # noqa\n logger.info(str(context))\n \n startdate = context[\"startdate\"]\n enddate = context[\"enddate\"]\n output = context[\"output\"]\n \n result = process(array=cube.array, startdate=startdate, enddate=enddate, output=output)\n return XarrayDataCube(result)\n \n \n def log_time(message: str, previous=time.time()) -> float:\n now = time.time()\n logger.debug(f\"{message} ({previous - time.time()} seconds)\")\n return now\n \n \n def process(\n array: xarray.DataArray,\n startdate: str,\n enddate: str,\n output: Output,\n ) -> xarray.DataArray: # noqa\n \"\"\"\n Apply the CropSAR_px algorithm to the provided input data.\n \n :param array: input data (Sentinel-1 + Sentinel-2)\n :param startdate: requested start date\n :param enddate: requested end date\n :param output: output type\n :return:\n \"\"\"\n time = log_time(\"Initiated environment\")\n \n input_temporal_index = pandas.date_range(\n start=pandas.to_datetime(startdate) - pandas.to_timedelta(TEMPORAL_BUFFER, unit=\"day\"),\n 
end=pandas.to_datetime(enddate) + pandas.to_timedelta(TEMPORAL_BUFFER, unit=\"day\"),\n freq=TEMPORAL_FREQUENCY,\n )\n output_temporal_index = pandas.date_range(\n start=pandas.to_datetime(startdate), end=pandas.to_datetime(enddate), freq=TEMPORAL_FREQUENCY\n )\n \n s1 = prepare_s1(array, input_temporal_index)\n s2 = prepare_s2(array, output, input_temporal_index)\n time = log_time(\"Prepared data\", time)\n \n # input checks:\n if numpy.isnan(s1).all() or numpy.isnan(s2).all():\n # don't do a prediction, because it will be based on no input data\n logger.info(\"Not enough input data to make a prediction\")\n return get_empty_array(array, output, output_temporal_index)\n \n model = load_model(output)\n time = log_time(\"Loaded model\", time)\n \n result = predict_arbitrary_shape(s2=s2.values, s1=s1.values, model=model)\n log_time(\"Finished predictions\", time)\n \n # filter result to requested [startdate, enddate] range\n return xarray.DataArray(\n data=result[input_temporal_index.isin(output_temporal_index)],\n dims=[\"t\", \"bands\", \"y\", \"x\"],\n coords={\"bands\": SENTINEL2_BANDS[output], \"t\": output_temporal_index, \"y\": s2.y, \"x\": s2.x},\n )\n \n \n def get_empty_array(array: xarray.DataArray, output: Output, temporal_index: pandas.DatetimeIndex) -> xarray.DataArray:\n \"\"\"\n Get an empty DataArray based on the output type and the shape of the input data.\n :return:\n \"\"\"\n output_bands = SENTINEL2_BANDS[output]\n logger.debug(\"Returning empty data array\")\n return xarray.DataArray(\n data=numpy.full(\n shape=(len(temporal_index), len(output_bands), array.y.shape[0], array.x.shape[0]), fill_value=numpy.nan\n ),\n dims=[\"t\", \"bands\", \"y\", \"x\"],\n coords={\"t\": temporal_index, \"bands\": output_bands, \"y\": array.y, \"x\": array.x},\n )\n " |
Is it possible to include the script that generates this UDP JSON, or at least the UDF used here? It's near impossible to properly read this UDF code.
Yes, sure, I can try to include that.
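For illustration, such a generator could look roughly like the following. This is a hedged sketch, not the actual generate.py: stickytape is evidently the tool that produced the inlined `__stickytape_*` preamble above, but the file names, window size, and graph structure here are assumptions:

```python
import json

import stickytape  # bundles a script and its local imports into one file


def build_udp() -> dict:
    # Inline the UDF entry point plus its local package imports into a single
    # self-contained script (hypothetical path).
    udf_code = stickytape.script("udf/cropsar_px_udf.py")
    return {
        "id": "cropsar_px",
        "process_graph": {
            "applyneighborhood1": {
                "process_id": "apply_neighborhood",
                "arguments": {
                    "data": {"from_parameter": "data"},
                    "process": {
                        "process_graph": {
                            "runudf1": {
                                "process_id": "run_udf",
                                "arguments": {
                                    "data": {"from_parameter": "data"},
                                    "runtime": "Python",
                                    "udf": udf_code,
                                },
                                "result": True,
                            }
                        }
                    },
                    # 128 px window assumed from SPATIAL_WINDOW_SIZE above;
                    # overlap omitted for brevity.
                    "size": [
                        {"dimension": "x", "value": 128, "unit": "px"},
                        {"dimension": "y", "value": 128, "unit": "px"},
                    ],
                },
                "result": True,
            }
        },
    }


if __name__ == "__main__":
    with open("cropsar_px.json", "w") as f:
        json.dump(build_udp(), f, indent=2)
```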
@soxofaan, before replicating files in this PR, I created a PR in the VITO repo for further discussion.
The process includes a nested UDF and data loading. The equivalent of generate.py here is UDP.py in that repo.
Could you give me your feedback on that PR before I make any changes to this one?
fyi @HansVRP
The graph is based on this repo: Cropsar_px_cdse
The Cropsar UDF/UDP is quite complex. I would not recommend porting that entire functionality over. However, if it is available in a public repo, it might be nice to add the info to the documentation.
Unfortunately no, the CropSAR repo is not publicly available.
The appending of dependencies has also been adapted in this PR.
Sorry, I have a hard time connecting all the dots here. I'm afraid I'm missing the necessary context to figure this out in a reasonable time. Don't let a review from me block the progress of this issue.
Hi Pratichhya,
I would not dive too deep into how the UDF/UDP exactly works or alter the algorithm.
But please review https://esa-apex.github.io/apex_documentation/guides/udp_writer_guide.html
and make sure to follow it as closely as possible.
If some of these conventions cannot be followed, please explain why.
By default, the output variable is set to NDVI.
However, by supplying one of the supported values listed above to the output parameter, a different result can be obtained.
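A hedged usage sketch of that parameter (the namespace URL is the pinned one from the benchmark scenario below; the polygon parameter name, the end-date parameter name, and the coordinates are assumptions):

```python
import openeo

UDP_NAMESPACE = (
    "https://raw.githubusercontent.com/ESA-APEx/apex_algorithms/"
    "3215de22eded7c87c46122f073d15c7258af26b4/openeo_udp/cropsar_px/cropsar_px.json"
)

connection = openeo.connect("openeo.dataspace.copernicus.eu").authenticate_oidc()

cube = connection.datacube_from_process(
    "cropsar_px",
    namespace=UDP_NAMESPACE,
    # Parameter name and coordinates are illustrative:
    polygon={
        "type": "Polygon",
        "coordinates": [[[5.0, 51.2], [5.1, 51.2], [5.1, 51.3], [5.0, 51.3], [5.0, 51.2]]],
    },
    startdate="2021-01-01",
    enddate="2021-06-30",  # name assumed from the UDF's context keys
    output="FAPAR",  # overrides the default "NDVI"
)
cube.download("cropsar_px_fapar.nc")
```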
According to the documentation:
We recommend including these sections:
Description: A short description of what the algorithm does.
Performance characteristics: Information on the computational efficiency of the algorithm. Include a relative cost if available.
Examples: One or more examples of the algorithm in action, using images or plots to show a typical result. Point to a STAC metadata file with an actual output asset to give users full insight into what is generated.
Literature references: If your algorithm is based on scientific literature, provide references to the relevant publications.
Known limitations: Any known limitations of the algorithm, such as the type of data it works best with, or the size of the area it can process efficiently.
Known artifacts: Use images and plots to clearly show any known artifacts that the algorithm may produce. This helps users to understand what to expect from the algorithm.
Noted.
benchmark_scenarios/cropsar_px.json
Outdated
"namespace": "https://raw.githubusercontent.com/ESA-APEx/apex_algorithms/3215de22eded7c87c46122f073d15c7258af26b4/openeo_udp/cropsar_px/cropsar_px.json", | ||
"process_id": "cropsar_px" | ||
}, | ||
"saveresult1": { |
According to the documentation:
Avoid use of save_result
Most openEO process graphs end in a save_result process. However, this is not recommended for UDPs, as the user may want to perform additional processing steps before generating the result. Having a DataCube (raster or vector) as the final output is therefore recommended, unless your service wants to enforce specific settings on how the output is generated.
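To illustrate why (a hedged sketch reusing the invocation pattern from above; the post-processing step is arbitrary): when the UDP ends in a data cube rather than save_result, the caller stays free to chain further processing before choosing an output format:

```python
import openeo

UDP_NAMESPACE = (
    "https://raw.githubusercontent.com/ESA-APEx/apex_algorithms/"
    "3215de22eded7c87c46122f073d15c7258af26b4/openeo_udp/cropsar_px/cropsar_px.json"
)

connection = openeo.connect("openeo.dataspace.copernicus.eu").authenticate_oidc()
cube = connection.datacube_from_process(
    "cropsar_px",
    namespace=UDP_NAMESPACE,
    startdate="2021-01-01",
    # remaining parameters as in the earlier sketch
)
# A user-side post-processing step that a hard-coded save_result would block:
rescaled = cube.apply(lambda x: x * 10000)
rescaled.download("cropsar_px.nc")  # the client appends save_result only here
```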
Noted
],
"type": "Polygon"
},
"startdate": "2021-01-01" |
According to the documentation:
Temporal filtering
For openEO processes that support arbitrary temporal ranges, we recommend using temporal_extent as the name of the parameter to ensure consistency with other openEO processes, such as load_collection.
Many cases also require a time range with a fixed length. In such a case, you can allow specifying only the start or end date, and use the date_shift process to construct the second date in the temporal interval, ensuring a fixed length. This avoids faulty user input.
https://esa-apex.github.io/apex_documentation/guides/udp_writer_guide.html
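A hedged sketch of that date_shift pattern (the parameter wiring and the one-year window are illustrative, not this PR's code):

```python
from openeo.api.process import Parameter
from openeo.processes import date_shift

# The UDP exposes only a start date...
startdate = Parameter.string("startdate", description="Start date of the time series")

# ...and derives the end date internally, guaranteeing a fixed window length
# and ruling out faulty user input.
enddate = date_shift(date=startdate, value=1, unit="year")
temporal_extent = [startdate, enddate]
```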
Yes; however, in this case the process takes the start and end date separately and has date_shift implemented in the source code itself, so in that scenario I chose to go ahead with the separate date schema.
Sounds good, do make sure it is well documented.
Updated with the recent changes, and also addressed the dependency comments from the MOGPR PR.
Force-pushed from fca2f83 to 854bd5f
@HansVRP Rebased this branch too.
algorithm_catalog/cropsar_px.json
Outdated
{
    "name": "Bram Janssen",
    "position": "Researcher",
    "organization": "VITO",
    "links": [
        {
            "href": "https://www.vito.be/",
            "rel": "about",
            "type": "text/html"
        }
    ],
    "contactInstructions": "Contact via VITO",
    "roles": [
        "principal investigator"
    ]
},
I believe there can only be one PI?
Oops, sorry, I didn't know that. Updated to only one PI.
@Pratichhya was this one updated to the new folder structure?
No, not this one.
The latest version of cropsar_px has been migrated to this repo. The changes include: