diff --git a/sentinelhub/.tests b/sentinelhub/.tests index 30d6720..53b2825 100644 --- a/sentinelhub/.tests +++ b/sentinelhub/.tests @@ -7,3 +7,5 @@ cloudless_process_api.ipynb soil_erosion_risk.ipynb from_browser_to_jupyter.ipynb xcube_on_CDSE.ipynb +Terra_em_Foco_Conference_Introduction_to_Sentinel_Hub_APIs_in_CDSE_Part_One.ipynb +Terra_em_Foco_Conference_Introduction_to_Sentinel_Hub_APIs_in_CDSE_Part_Two.ipynb \ No newline at end of file diff --git a/sentinelhub/Terra_em_Foco_Conference_Introduction_to_Sentinel_Hub_APIs_in_CDSE_Part_One.ipynb b/sentinelhub/Terra_em_Foco_Conference_Introduction_to_Sentinel_Hub_APIs_in_CDSE_Part_One.ipynb new file mode 100644 index 0000000..2993149 --- /dev/null +++ b/sentinelhub/Terra_em_Foco_Conference_Introduction_to_Sentinel_Hub_APIs_in_CDSE_Part_One.ipynb @@ -0,0 +1,840 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "c937629a-b80d-4e4e-885d-44175ca5ef6b", + "metadata": {}, + "source": [ + "# Terra em Foco Conference: Introduction to Sentinel Hub APIs in CDSE: Part One" + ] + }, + { + "cell_type": "markdown", + "id": "b23794fe-73b4-43cb-a243-85ecea6560e2", + "metadata": {}, + "source": [ + "*This notebook was written for a two-hour training session at Terra em Foco, the Portuguese Earth Observation National Conference held in Braga, Portugal, 12-13th September 2024.*\n", + "\n", + "The Copernicus Data Space Ecosystem offers immediate access to large amounts of open and free Earth observation data from the Copernicus Sentinel satellites, including both new and historical Sentinel images, as well as Copernicus Contributing Missions. As the effects of climate change intensify, the use of Earth observation data will become ever more important for monitoring wildfires in southern Europe, including Portugal. These two notebooks will show how you can utilise Copernicus data to do just this.\n", + "\n", + "The Copernicus Data Space Ecosystem supports users in accessing, viewing, using, downloading, and analyzing data. The Copernicus Data Space Ecosystem is set up to further improve access to and exploitation of the EU’s Copernicus satellite data. The service aims to support users in building various applications needed to provide accurate, timely and objective information, which is crucial to creating a more sustainable future.\n", + "\n", + "The Copernicus Data Space Ecosystem offers multiple Application Programming Interfaces (APIs), ranging from catalogue, product download and visualization services to processing web services such as STAC, openEO and the Sentinel Hub APIs. This Jupyter notebook focuses on the Sentinel Hub APIs. The Sentinel Hub API is a RESTful interface that provides access to various satellite imagery archives. It allows you to access raw satellite data, rendered images, statistical analysis, and other features.\n", + "\n", + "This notebook contains three examples:\n", + "\n", + "1. Forest fire detection across south west Portugal utilising Sentinel-3 SLSTR\n", + "2. Visualisation of the forest fire during the event using Sentinel-2\n", + "3. Detection and mapping of the burned area post event\n", + "\n", + "**Note:** You do not need any prior experience of Sentinel Hub APIs to be able to follow the functions in this notebook. However, it is advised that you have some basic knowledge of Python and its data science libraries.\n", + "\n", + "Firstly, before getting started we should import some libraries. 
In addition, to common data science libraries we will be utilising the [Sentinel Hub Python package](https://sentinelhub-py.readthedocs.io/en/latest/index.html). This comes preinstalled in the Copernicus Data Space Ecosystem Jupyter Lab. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8eef2c05-e174-461b-9d0d-6f937799d00a", + "metadata": {}, + "outputs": [], + "source": [ + "# General utilities\n", + "import warnings\n", + "from typing import Any, Optional, Tuple\n", + "\n", + "# Plotting\n", + "import geopandas as gpd\n", + "import matplotlib.patches as mpatches\n", + "import matplotlib.pyplot as plt\n", + "import numpy as np\n", + "\n", + "# Sentinel Hub services\n", + "from sentinelhub import (\n", + " CRS,\n", + " DataCollection,\n", + " Geometry,\n", + " MimeType,\n", + " SentinelHubRequest,\n", + " SHConfig,\n", + ")\n", + "from shapely.geometry import shape\n", + "\n", + "warnings.filterwarnings(\"ignore\")\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "id": "3485eb57-85e4-442f-be96-cbd213750340", + "metadata": {}, + "source": [ + "In the following cell, we have defined a python function to help us process and visualise the data later on in the notebook:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5a0c5099-f7d8-43ca-9e50-a52b39b19909", + "metadata": {}, + "outputs": [], + "source": [ + "def plot_image(\n", + " image: np.ndarray,\n", + " factor: float = 1.0,\n", + " clip_range: Optional[Tuple[float, float]] = None,\n", + " **kwargs: Any\n", + ") -> None:\n", + " \"\"\"Utility function for plotting RGB images.\"\"\"\n", + " fig, ax = plt.subplots(nrows=1, ncols=1, figsize=(15, 15))\n", + " if clip_range is not None:\n", + " ax.imshow(np.clip(image * factor, *clip_range), **kwargs)\n", + " else:\n", + " ax.imshow(image * factor, **kwargs)\n", + " ax.set_xticks([])\n", + " ax.set_yticks([])" + ] + }, + { + "cell_type": "markdown", + "id": "e377a2f5-d38d-4070-b985-d32608c26336", + "metadata": { + "tags": [] + }, + "source": [ + "## Authentication\n", + "\n", + "You can obtain the credentials for the Sentinel Hub services (`client_id` & `client_secret`) in your [Dashboard](https://shapps.dataspace.copernicus.eu/dashboard/#/). In the user settings, you can create a new OAuth client to generate these credentials. You can find more detailed instructions on the corresponding [documentation page](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Overview/Authentication.html).\n", + "\n", + "Now that you have your `client_id` & `client_secret`, it is recommended to configure a new profile in your Sentinel Hub Python package. Instructions for configuring your Sentinel Hub Python package can be found [here](https://sentinelhub-py.readthedocs.io/en/latest/configure.html). Using these instructions, you can create a profile specifically tailored to use the package to access the Copernicus Data Space Ecosystem data collections. This is useful because changes to the Config class in your notebook are usually only temporary. If you save the configuration in your profile, you do not have to generate new credentials or overwrite/change the default profile every time you start or write a new Jupyter notebook.\n", + "\n", + "If you are using the Sentinel Hub Python package for the Copernicus Data Space Ecosystem for the first time, you should create a profile specifically for the Copernicus Data Space Ecosystem. 
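If you would like to confirm that your `client_id` and `client_secret` are valid before configuring the package, a minimal, optional sketch using the standard OAuth2 client-credentials flow against the CDSE token endpoint (assuming the `requests` library is available in your environment) could look like this:

```python
# Optional sanity check (sketch only): request an OAuth2 token directly to
# confirm that the client id/secret work. Assumes `requests` is installed.
import getpass
import requests

token_url = "https://identity.dataspace.copernicus.eu/auth/realms/CDSE/protocol/openid-connect/token"
response = requests.post(
    token_url,
    data={
        "grant_type": "client_credentials",
        "client_id": getpass.getpass("Enter your SentinelHub client id"),
        "client_secret": getpass.getpass("Enter your SentinelHub client secret"),
    },
)
print(response.status_code, "token received" if response.ok else response.text)
```

Whichever way you check your credentials, the recommended approach in this notebook is to save them once in a dedicated profile.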
You can do this in the following cell:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d88cef5a-2fa1-49f7-9b0d-7caecd9b98b1", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "# Only run this cell if you have not created a configuration.\n", + "\n", + "# config = SHConfig()\n", + "# config.sh_client_id = getpass.getpass(\"Enter your SentinelHub client id\")\n", + "# config.sh_client_secret = getpass.getpass(\"Enter your SentinelHub client secret\")\n", + "# config.sh_token_url = \"https://identity.dataspace.copernicus.eu/auth/realms/CDSE/protocol/openid-connect/token\"\n", + "# config.sh_base_url = \"https://sh.dataspace.copernicus.eu\"\n", + "# config.save(\"cdse\")" + ] + }, + { + "cell_type": "markdown", + "id": "a12b4d52-8c4c-435e-a51d-92ecca1b376e", + "metadata": {}, + "source": [ + "However, if you have already configured a profile in Sentinel Hub Python for the Copernicus Data Space Ecosystem, then you can run the below cell entering the profile name as a string replacing ``." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "735d5ffb-c793-4e2e-b171-02dffed2b601", + "metadata": {}, + "outputs": [], + "source": [ + "config = SHConfig(\"cdse\")" + ] + }, + { + "cell_type": "markdown", + "id": "6db0936b-3549-4bfd-a74f-241a71bcc114", + "metadata": {}, + "source": [ + "## 1: Forest fire detection across south west Portugal utilising Sentinel-3 SLSTR" + ] + }, + { + "cell_type": "markdown", + "id": "acbf79b4-dc0c-4698-a880-8d08c340f905", + "metadata": {}, + "source": [ + "In this first example, we will make a processing API request to visualise hotspots using Sentinel-3 SLSTR. \n", + "\n", + "Firstly, some information about the AOI and the case study that we will be using for this training. The AOI is located in south west Portugal and was the site of a wildfire in August 2023. Located near the town of Odemira, over 7000 hectares was burnt resulting in the evacuation of 1400 people as over 1000 firefighters were drafted in to fight and control the spread of the fire. [1](https://www.theguardian.com/world/2023/aug/08/firefighters-tackling-blaze-raging-in-southern-portugal-fire) [2](https://www.reuters.com/world/europe/more-than-1000-evacuated-portugal-wildfire-spreads-2023-08-08/)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "72eeb579-0d77-480f-a1e7-d4ddb05a735f", + "metadata": {}, + "outputs": [], + "source": [ + "area_of_interest = \"\"\"{\n", + " \"type\": \"Polygon\",\n", + " \"coordinates\": [\n", + " [\n", + " [\n", + " -8.920627,\n", + " 37.274053\n", + " ],\n", + " [\n", + " -8.178291,\n", + " 37.274053\n", + " ],\n", + " [\n", + " -8.178291,\n", + " 37.810869\n", + " ],\n", + " [\n", + " -8.920627,\n", + " 37.810869\n", + " ],\n", + " [\n", + " -8.920627,\n", + " 37.274053\n", + " ]\n", + " ]\n", + " ]\n", + "}\n", + "\"\"\"\n", + "\n", + "aoi = gpd.read_file(area_of_interest)\n", + "aoi[\"geometry\"] = aoi\n", + "aoi[\"area\"] = aoi.area\n", + "aoi.explore(\"area\", color=\"Green\", legend=False)" + ] + }, + { + "cell_type": "markdown", + "id": "3def4c69-4664-47e4-968e-4dd7d3f38cef", + "metadata": {}, + "source": [ + "If you wish to make your own area of interest you can use the [Request Builder](https://shapps.dataspace.copernicus.eu/requests-builder/) application and draw and AOI using the interface before exporting it into your directory. You can then copy and paste the `geojson` contents into the `area_of_interest` variable in the cell above. 
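Alternatively, rather than pasting the exported contents into a string, you could read the exported file directly with `geopandas`. A minimal sketch, assuming a hypothetical file named `my_aoi.geojson` exported from the Request Builder into your working directory:

```python
# Sketch: load an AOI exported from the Request Builder instead of pasting it.
# "my_aoi.geojson" is a hypothetical filename - replace it with your own export.
import geopandas as gpd

aoi = gpd.read_file("my_aoi.geojson")
aoi["area"] = aoi.area            # note: in squared degrees while the CRS is EPSG:4326
aoi.explore(color="Green", legend=False)
```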
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6cdf8d18-5afc-4854-93e6-a6ee8d8ffd01", + "metadata": {}, + "outputs": [], + "source": [ + "full_geometry = Geometry(aoi.to_crs(32630).geometry.values[0], crs=CRS.UTM_30N)" + ] + }, + { + "cell_type": "markdown", + "id": "caf9d9ed-2721-4732-bdb7-fbfe70ae7d4c", + "metadata": {}, + "source": [ + "### Processing API\n", + "\n", + "The [Processing API](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Process.html) (or shortly \"Process API\") is the most commonly used API in Sentinel Hub as it provides images based on satellite data. Users can request raw satellite data, simple band combinations such as false colour composites, calculations of simple remote sensing indices like NDVI, or more advanced processing such as calculation of Leaf area index (LAI).\n", + "\n", + "Even though satellite imagery data are often distributed in \"tiles\", we do not want users to be limited to these. Tiles are an artificially introduced entity to make data distribution easier to handle. However, users should not have to care about whether their AOI is on one tile or another, or perhaps on the border of two tiles. This is why Sentinel Hub API hides this complexity and simply makes the data available over chosen area of interest and temporal period of interest. Tiles are therefore automatically stitched together based on defined parameters (AOI, time period, cloud coverage, priority, etc., depending on the data type)." + ] + }, + { + "cell_type": "markdown", + "id": "fe441139-5901-4872-9adf-b7c4a2916bcd", + "metadata": {}, + "source": [ + "### Write an Evalscript\n", + "\n", + "An [evalscript](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Evalscript.html) (or \"custom script\") is a piece of Javascript code which defines how the satellite data shall be processed by Sentinel Hub and what values the service shall return. It is a required part of any process, batch processing or OGC request.\n", + "\n", + "In the following Evalscript we will perform all the processing steps that we performed in the \"traditional\" approach." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0f1b8c6c-ef81-4896-b7d2-636581a634ca", + "metadata": {}, + "outputs": [], + "source": [ + "evalscript_fire_detection = \"\"\"\n", + "// high accuracy Detect active fire points \n", + "//Sentinel-3 SLSTR\n", + "//by Tiznger startup co\n", + "//www.tiznegar.com\n", + "\n", + "var SAHM= ((S6 - S5) / (S6 + S5));\n", + "\n", + "if(SAHM>.05 && S1<.23){\n", + " return[5*S3, 1*S2, 1*S1]\n", + "}\n", + "\n", + "else {\n", + " return [S6,S3,S2]\n", + "}\n", + "\n", + "//Red color indicates active fire areas and points\n", + "\"\"\"" + ] + }, + { + "cell_type": "markdown", + "id": "3d10ee3b-ff1a-420a-8638-f986d970a9df", + "metadata": {}, + "source": [ + "### Build the request\n", + "\n", + "We build the request according to the [API Reference](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/ApiReference.html), using the `SentinelHubRequest` class. Each Process API request also needs an [evalscript](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Evalscript.html). An evalscript (or \"custom script\") is a piece of Javascript code which defines how the satellite data shall be processed by Sentinel Hub and what values the service shall return. 
It is a required part of any [process](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Process.html), [batch processing](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Batch.html) or [OGC request](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/OGC.html).\n", + "\n", + "The information that we specify in the `SentinelHubRequest` object is:\n", + "- an evalscript,\n", + "- a list of input data collections with time interval,\n", + "- a format of the response,\n", + "- a bounding box and its size (size or resolution).\n", + "- `mosaickingOrder` (optional): in this example we have used `leastRecent` which will return pixels from the least recent acquisition in the specified time period.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "92e82f02-5754-46b3-8c4e-e4e6548748a6", + "metadata": {}, + "outputs": [], + "source": [ + "request_true_color = SentinelHubRequest(\n", + " evalscript=evalscript_fire_detection,\n", + " input_data=[\n", + " SentinelHubRequest.input_data(\n", + " data_collection=DataCollection.SENTINEL3_SLSTR.define_from(\n", + " name=\"s3\", service_url=\"https://sh.dataspace.copernicus.eu\"\n", + " ),\n", + " time_interval=(\"2023-08-06\", \"2023-08-06\"),\n", + " other_args={\"dataFilter\": {\"mosaickingOrder\": \"leastRecent\"}},\n", + " )\n", + " ],\n", + " responses=[SentinelHubRequest.output_response(\"default\", MimeType.PNG)],\n", + " geometry=full_geometry,\n", + " size=[1000, 920],\n", + " config=config,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "574920cd-42a8-4ecc-a3ce-880bb4061b74", + "metadata": {}, + "source": [ + "The method `get_data()` will always return a list of length 1 with the available image from the requested time interval in the form of numpy arrays." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "47dff8dd-4e9e-4f9b-a821-550358b2b21e", + "metadata": {}, + "outputs": [], + "source": [ + "true_color_imgs = request_true_color.get_data()" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8ce09318-c3b6-4b62-9c17-6c8d00d5f0d7", + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "print(\n", + " f\"Returned data is of type = {type(true_color_imgs)} and length {len(true_color_imgs)}.\"\n", + ")\n", + "print(\n", + " f\"Single element in the list is of type {type(true_color_imgs[-1])} and has shape {true_color_imgs[-1].shape}\"\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5c56d96f-119d-4c51-b97c-049abd1aa0d0", + "metadata": {}, + "outputs": [], + "source": [ + "image = true_color_imgs[0]\n", + "print(f\"Image type: {image.dtype}\")\n", + "\n", + "# plot function\n", + "plot_image(image, factor=1.5 / 255, clip_range=(0, 1))" + ] + }, + { + "cell_type": "markdown", + "id": "6cd7f14a-ec38-4258-a8cd-0d2ee46b9ad2", + "metadata": {}, + "source": [ + "## 2: Visualising forest fires using Sentinel-2 imagery" + ] + }, + { + "cell_type": "markdown", + "id": "f63c72be-0a1e-4ee4-ad07-ffd0140ad137", + "metadata": {}, + "source": [ + "In this second example, we will visualise the wildfire we detected using Sentinel-2 imagery.\n", + "\n", + "### Write an Evalscript\n", + "\n", + "An [evalscript](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Evalscript.html) (or \"custom script\") is a piece of Javascript code which defines how the satellite data shall be processed by Sentinel Hub and what values the service shall return. 
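For orientation, the basic structure of a `VERSION=3` evalscript looks like the following sketch, stored as a Python string the way the other evalscripts in this notebook are. It is illustrative only and is not used in any request below; the full visualisation script used in this example comes right after.

```python
# Sketch only: a minimal VERSION=3 evalscript illustrating the
# setup()/evaluatePixel() structure. Not used in the requests below.
evalscript_minimal_example = """
//VERSION=3
function setup() {
    return {
        input: ["B12", "B8A", "B04", "dataMask"],
        output: { bands: 4 }
    };
}

function evaluatePixel(sample) {
    return [2.5 * sample.B12, 2.5 * sample.B8A, 2.5 * sample.B04, sample.dataMask];
}
"""
```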
It is a required part of any process, batch processing or OGC request.\n", + "\n", + "In the following Evalscript we will visualise the forest fire using the SWIR bands that Sentinel-2 offers. It's important to note this is just a visualisation to illustrate the forest fire. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c2db87cb-822b-4809-8e16-69936a160b57", + "metadata": {}, + "outputs": [], + "source": [ + "evalscript_wildfire_visualisation = \"\"\"\n", + "// VERSION=3\n", + "// QuickFire V1.0.0 by Pierre Markuse (https://twitter.com/Pierre_Markuse)\n", + "// Adjusted for use in the Copernicus Browser (https://dataspace.copernicus.eu/browser/)\n", + "// CC BY 4.0 International (https://creativecommons.org/licenses/by/4.0/)\n", + "\n", + "// Copernicus Browser does not have the band CLP, this was replaced with the isCloud() function\n", + "// but do try to turn off cloudAvoidance if results aren't as expected.\n", + "\n", + "function setup() {\n", + " return {\n", + " input: [\"B02\", \"B03\", \"B04\", \"B08\", \"B8A\", \"B11\", \"B12\", \"dataMask\"],\n", + " output: { bands: 4 }\n", + " };\n", + "}\n", + "\n", + "function isCloud(samples) {\n", + " const NGDR = index(samples.B03, samples.B04);\n", + " const bRatio = (samples.B03 - 0.175) / (0.39 - 0.175);\n", + " return bRatio > 1 || (bRatio > 0 && NGDR > 0);\n", + "}\n", + "\n", + "function stretch(val, min, max) { return (val - min) / (max - min); }\n", + "\n", + "function satEnh(arr, s) {\n", + " var avg = arr.reduce((a, b) => a + b, 0) / arr.length;\n", + " return arr.map(a => avg * (1 - s) + a * s);\n", + "}\n", + "\n", + "function layerBlend(lay1, lay2, lay3, op1, op2, op3) {\n", + " return lay1.map(function (num, index) {\n", + " return (num / 100 * op1 + (lay2[index] / 100 * op2) + (lay3[index] / 100 * op3));\n", + " });\n", + "}\n", + "\n", + "function evaluatePixel(sample) {\n", + " const hsThreshold = [2.0, 1.5, 1.25, 1.0];\n", + " const hotspot = 1;\n", + " const style = 1;\n", + " const hsSensitivity = 1.0;\n", + " const boost = 1.2;\n", + "\n", + " const cloudAvoidance = 1;\n", + " const avoidanceHelper = 0.8;\n", + "\n", + " const offset = -0.007;\n", + " const saturation = 1.10;\n", + " const brightness = 1.40;\n", + " const sMin = 0.15;\n", + " const sMax = 0.99;\n", + "\n", + " const showBurnscars = 0;\n", + " const burnscarThreshold = -0.25;\n", + " const burnscarStrength = 0.3;\n", + "\n", + " const NDWI = (sample.B03 - sample.B08) / (sample.B03 + sample.B08);\n", + " const NDVI = (sample.B08 - sample.B04) / (sample.B08 + sample.B04);\n", + " const waterHighlight = 0;\n", + " const waterBoost = 2.0;\n", + " const NDVI_threshold = 0.05;\n", + " const NDWI_threshold = 0.0;\n", + " const waterHelper = 0.1;\n", + "\n", + " const Black = [0, 0, 0];\n", + " const NBRindex = (sample.B08 - sample.B12) / (sample.B08 + sample.B12);\n", + " const naturalColorsCC = [Math.sqrt(brightness * sample.B04 + offset), Math.sqrt(brightness * sample.B03 + offset), Math.sqrt(brightness * sample.B02 + offset)];\n", + " const naturalColors = [(2.5 * brightness * sample.B04 + offset), (2.5 * brightness * sample.B03 + offset), (2.5 * brightness * sample.B02 + offset)];\n", + " const URBAN = [Math.sqrt(brightness * sample.B12 * 1.2 + offset), Math.sqrt(brightness * sample.B11 * 1.4 + offset), Math.sqrt(brightness * sample.B04 + offset)];\n", + " const SWIR = [Math.sqrt(brightness * sample.B12 + offset), Math.sqrt(brightness * sample.B8A + offset), Math.sqrt(brightness * sample.B04 + offset)];\n", + " const NIRblue = 
colorBlend(sample.B08, [0, 0.25, 1], [[0 / 255, 0 / 255, 0 / 255], [0 / 255, 100 / 255, 175 / 255], [150 / 255, 230 / 255, 255 / 255]]);\n", + " const classicFalse = [sample.B08 * brightness, sample.B04 * brightness, sample.B03 * brightness];\n", + " const NIR = [sample.B08 * brightness, sample.B08 * brightness, sample.B08 * brightness];\n", + " const atmoPen = [sample.B12 * brightness, sample.B11 * brightness, sample.B08 * brightness];\n", + " var enhNaturalColors = [0, 0, 0];\n", + " for (let i = 0; i < 3; i += 1) { enhNaturalColors[i] = (brightness * ((naturalColors[i] + naturalColorsCC[i]) / 2) + (URBAN[i] / 10)); }\n", + "\n", + " const manualCorrection = [0.04, 0.00, -0.05];\n", + "\n", + " var Viz = layerBlend(URBAN, SWIR, naturalColorsCC, 10, 10, 90); // Choose visualization(s) and opacity here\n", + "\n", + " if (waterHighlight) {\n", + " if ((NDVI < NDVI_threshold) && (NDWI > NDWI_threshold) && (sample.B04 < waterHelper)) {\n", + " Viz[1] = Viz[1] * 1.2 * waterBoost + 0.1;\n", + " Viz[2] = Viz[2] * 1.5 * waterBoost + 0.2;\n", + " }\n", + " }\n", + "\n", + " Viz = satEnh(Viz, saturation);\n", + " for (let i = 0; i < 3; i += 1) {\n", + " Viz[i] = stretch(Viz[i], sMin, sMax);\n", + " Viz[i] += manualCorrection[i];\n", + " }\n", + "\n", + " if (hotspot) {\n", + " if ((!cloudAvoidance) || (!isCloud(sample) && (sample.B02 < avoidanceHelper))) {\n", + " switch (style) {\n", + " case 1:\n", + " if ((sample.B12 + sample.B11) > (hsThreshold[0] / hsSensitivity)) return [((boost * 0.50 * sample.B12) + Viz[0]), ((boost * 0.50 * sample.B11) + Viz[1]), Viz[2], sample.dataMask];\n", + " if ((sample.B12 + sample.B11) > (hsThreshold[1] / hsSensitivity)) return [((boost * 0.50 * sample.B12) + Viz[0]), ((boost * 0.20 * sample.B11) + Viz[1]), Viz[2], sample.dataMask];\n", + " if ((sample.B12 + sample.B11) > (hsThreshold[2] / hsSensitivity)) return [((boost * 0.50 * sample.B12) + Viz[0]), ((boost * 0.10 * sample.B11) + Viz[1]), Viz[2], sample.dataMask];\n", + " if ((sample.B12 + sample.B11) > (hsThreshold[3] / hsSensitivity)) return [((boost * 0.50 * sample.B12) + Viz[0]), ((boost * 0.00 * sample.B11) + Viz[1]), Viz[2], sample.dataMask];\n", + " break;\n", + " case 2:\n", + " if ((sample.B12 + sample.B11) > (hsThreshold[3] / hsSensitivity)) return [1, 0, 0, sample.dataMask];\n", + " break;\n", + " case 3:\n", + " if ((sample.B12 + sample.B11) > (hsThreshold[3] / hsSensitivity)) return [1, 1, 0, sample.dataMask];\n", + " break;\n", + " case 4:\n", + " if ((sample.B12 + sample.B11) > (hsThreshold[3] / hsSensitivity)) return [Viz[0] + 0.2, Viz[1] - 0.2, Viz[2] - 0.2, sample.dataMask];\n", + " break;\n", + " default:\n", + " }\n", + " }\n", + " }\n", + "\n", + " if (showBurnscars) {\n", + " if (NBRindex < burnscarThreshold) {\n", + " Viz[0] = Viz[0] + burnscarStrength;\n", + " Viz[1] = Viz[1] + burnscarStrength;\n", + " }\n", + " }\n", + "\n", + " return [Viz[0], Viz[1], Viz[2], sample.dataMask];\n", + "}\n", + "\"\"\"" + ] + }, + { + "cell_type": "markdown", + "id": "f384b7b4-ce0c-4910-bcda-d5c53ba4bab7", + "metadata": {}, + "source": [ + "### Build the request\n", + "\n", + "We build the request according to the [API Reference](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/ApiReference.html), using the `SentinelHubRequest` class. Each Process API request also needs an [evalscript](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Evalscript.html). 
An evalscript (or \"custom script\") is a piece of Javascript code which defines how the satellite data shall be processed by Sentinel Hub and what values the service shall return. It is a required part of any [process](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Process.html), [batch processing](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Batch.html) or [OGC request](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/OGC.html).\n", + "\n", + "The information that we specify in the `SentinelHubRequest` object is:\n", + "- an evalscript,\n", + "- a list of input data collections with time interval,\n", + "- a format of the response,\n", + "- a bounding box and its size (size or resolution).\n", + "- `mosaickingOrder` (optional): in this example we have used `leastRecent` which will return pixels from the least recent acquisition in the specified time period.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "20b8f5f3-0193-49e5-be84-97a4dc7b68ea", + "metadata": {}, + "outputs": [], + "source": [ + "request_wildfire_visualisation = SentinelHubRequest(\n", + " evalscript=evalscript_wildfire_visualisation,\n", + " input_data=[\n", + " SentinelHubRequest.input_data(\n", + " data_collection=DataCollection.SENTINEL2_L2A.define_from(\n", + " name=\"s2\", service_url=\"https://sh.dataspace.copernicus.eu\"\n", + " ),\n", + " time_interval=(\"2023-08-07\", \"2023-08-07\"),\n", + " other_args={\"dataFilter\": {\"mosaickingOrder\": \"leastRecent\"}},\n", + " )\n", + " ],\n", + " responses=[SentinelHubRequest.output_response(\"default\", MimeType.PNG)],\n", + " geometry=full_geometry,\n", + " size=[1000, 920],\n", + " config=config,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "6b1959b5-d314-40b1-9162-d2dfc970a122", + "metadata": {}, + "source": [ + "The method `get_data()` will always return a list of length 1 with the available image from the requested time interval in the form of numpy arrays." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "23a2dec4-9ce7-4c1a-952e-62eab6900f8b", + "metadata": {}, + "outputs": [], + "source": [ + "imgs = request_wildfire_visualisation.get_data()" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f3199b4e-f7fa-43c0-827e-23f5d4f94615", + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "print(f\"Returned data is of type = {type(imgs)} and length {len(imgs)}.\")\n", + "print(\n", + " f\"Single element in the list is of type {type(imgs[-1])} and has shape {imgs[-1].shape}\"\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "df292cbf-82c0-443e-97b1-f44e18f42660", + "metadata": {}, + "outputs": [], + "source": [ + "image = imgs[0]\n", + "print(f\"Image type: {image.dtype}\")\n", + "\n", + "# plot function\n", + "plot_image(image, factor=1 / 255, clip_range=(0, 1))" + ] + }, + { + "cell_type": "markdown", + "id": "3265dc76-8478-4098-a786-e89087aa95f0", + "metadata": {}, + "source": [ + "## 3: Mapping and measuring the area of burn scars" + ] + }, + { + "cell_type": "markdown", + "id": "2b7c7787-8b67-4d88-b3c3-80e055cabab7", + "metadata": {}, + "source": [ + "Now we will focus on the post event, using the normalised burn ratio we can visualise the extent of the burn scar very easily. 
Let's create a more focussed area of interest to map out the burn scar area post event:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "96c14388-a81c-440f-a0fa-ab04376d12ba", + "metadata": {}, + "outputs": [], + "source": [ + "area_of_interest = \"\"\"{\n", + " \"type\": \"Polygon\",\n", + " \"coordinates\": [\n", + " [\n", + " [\n", + " -8.920627,\n", + " 37.362517\n", + " ],\n", + " [\n", + " -8.516631,\n", + " 37.362517\n", + " ],\n", + " [\n", + " -8.516631,\n", + " 37.628372\n", + " ],\n", + " [\n", + " -8.920627,\n", + " 37.628372\n", + " ],\n", + " [\n", + " -8.920627,\n", + " 37.362517\n", + " ]\n", + " ]\n", + " ]\n", + "}\n", + "\"\"\"\n", + "\n", + "aoi = gpd.read_file(area_of_interest)\n", + "aoi_simplified = aoi.geometry.simplify(0.001)\n", + "aoi[\"geometry\"] = aoi_simplified\n", + "aoi[\"area\"] = aoi.area\n", + "aoi.explore(\"area\", color=\"Green\", legend=False)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0d7d5183-15f3-4b7b-b725-a5ff5545f058", + "metadata": {}, + "outputs": [], + "source": [ + "full_geometry = Geometry(aoi.to_crs(32630).geometry.values[0], crs=CRS.UTM_30N)" + ] + }, + { + "cell_type": "markdown", + "id": "7c456ac3-5f13-494e-ac8e-0ac72271b3d2", + "metadata": {}, + "source": [ + "### Write an Evalscript & Build the Request\n", + "This time we have combined the evalscript and the API request into a single cell:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "da9524bd-8c05-44a4-ada9-d1f0df46c693", + "metadata": {}, + "outputs": [], + "source": [ + "evalscript_burn_scar_map = \"\"\"\n", + "//VERSION=3\n", + "// Burneed area detection\n", + "// Author: Monja B. Šebela\n", + "\n", + "function setup() {\n", + " return {\n", + " input: [\"B02\", \"B03\", \"B04\", \"B08\", \"B11\", \"B12\", \"dataMask\"],\n", + " output: { bands: 4 }\n", + " };\n", + "}\n", + "\n", + "function evaluatePixel(samples) {\n", + "\tvar NDWI=index(samples.B03, samples.B08); \n", + "\tvar NDVI=index(samples.B08, samples.B04);\n", + "\tvar INDEX= ((samples.B11 - samples.B12) / (samples.B11 + samples.B12))+(samples.B08);\n", + "\n", + " \tif((INDEX>0.2)||(samples.B02>0.1)||(samples.B11<0.1)||(NDVI>0.3)||(NDWI > 0.1)){\n", + " \t\treturn[2.5*samples.B04, 2.5*samples.B03, 2.5*samples.B02, samples.dataMask]\n", + "\t}\n", + "\telse {\n", + " \treturn [1, 0, 0, samples.dataMask]\n", + "\t}\n", + "}\n", + "\"\"\"\n", + "\n", + "request_burn_scar_map = SentinelHubRequest(\n", + " evalscript=evalscript_burn_scar_map,\n", + " input_data=[\n", + " SentinelHubRequest.input_data(\n", + " data_collection=DataCollection.SENTINEL2_L2A.define_from(\n", + " name=\"s2\", service_url=\"https://sh.dataspace.copernicus.eu\"\n", + " ),\n", + " time_interval=(\"2023-08-10\", \"2023-08-15\"),\n", + " other_args={\"dataFilter\": {\"mosaickingOrder\": \"leastRecent\"}},\n", + " )\n", + " ],\n", + " responses=[SentinelHubRequest.output_response(\"default\", MimeType.PNG)],\n", + " geometry=full_geometry,\n", + " size=[1000, 920],\n", + " config=config,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "f199ac76-9251-42c3-a715-bd731933e2dd", + "metadata": {}, + "source": [ + "The method `get_data()` will always return a list of length 1 with the available image from the requested time interval in the form of numpy arrays." 
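Beyond visual inspection, once the cells below have been run you could also turn the returned image into a rough burned-area estimate. A hedged sketch, assuming the request above keeps `size=[1000, 920]` and that burned pixels are rendered as pure red in the 8-bit PNG:

```python
# Sketch only: rough burned-area estimate from the rendered PNG returned below.
# Burned pixels are returned as [1, 0, 0] by the evalscript, i.e. (255, 0, 0)
# in the 8-bit PNG; the pixel footprint is approximated from the UTM bounds.
minx, miny, maxx, maxy = aoi.to_crs(32630).total_bounds   # metres (EPSG:32630)
pixel_w = (maxx - minx) / 1000
pixel_h = (maxy - miny) / 920

img = burn_scar_imgs[0]
burned = (img[..., 0] == 255) & (img[..., 1] == 0) & (img[..., 2] == 0)
print(f"Approximate burned area: {burned.sum() * pixel_w * pixel_h / 10_000:.0f} ha")
```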
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5dae52b6-7d58-4e26-b7ec-f0b155eca9f8", + "metadata": {}, + "outputs": [], + "source": [ + "burn_scar_imgs = request_burn_scar_map.get_data()" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "aa0bc5d8-6262-4fa3-9bb0-4c5717a44877", + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "print(\n", + " f\"Returned data is of type = {type(burn_scar_imgs)} and length {len(burn_scar_imgs)}.\"\n", + ")\n", + "print(\n", + " f\"Single element in the list is of type {type(burn_scar_imgs[-1])} and has shape {burn_scar_imgs[-1].shape}\"\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b57bbffe-aac8-4161-9278-075c804f19be", + "metadata": {}, + "outputs": [], + "source": [ + "image = burn_scar_imgs[0]\n", + "print(f\"Image type: {image.dtype}\")\n", + "\n", + "# plot function\n", + "plot_image(image, factor=1.5 / 255, clip_range=(0, 1))" + ] + }, + { + "cell_type": "markdown", + "id": "b7ec57be-13a6-4cdf-b4d5-5c7670ccf4cf", + "metadata": {}, + "source": [ + "The detection of the burn scar is not perfect, but we can now see the spatial extent using the threshold we defined in the evalscript.\n", + "\n", + "#### Exercise:\n", + "\n", + "- Can you improve the detection of the burn scar in the above image? \n", + " - Hint: Can you find and adjust the parameter in the evalscript that does this or formulate your own method to do this.\n", + "- If you have time, are you able to find a way to map severely burned and moderately burned areas?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b1ff820c-7041-443a-b302-e22b265c389d", + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Sentinel Hub", + "language": "python", + "name": "sentinelhub" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.14" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/sentinelhub/Terra_em_Foco_Conference_Introduction_to_Sentinel_Hub_APIs_in_CDSE_Part_Two.ipynb b/sentinelhub/Terra_em_Foco_Conference_Introduction_to_Sentinel_Hub_APIs_in_CDSE_Part_Two.ipynb new file mode 100644 index 0000000..40014a6 --- /dev/null +++ b/sentinelhub/Terra_em_Foco_Conference_Introduction_to_Sentinel_Hub_APIs_in_CDSE_Part_Two.ipynb @@ -0,0 +1,884 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "c937629a-b80d-4e4e-885d-44175ca5ef6b", + "metadata": {}, + "source": [ + "# Terra em Foco Conference: Introduction to Sentinel Hub APIs in CDSE: Part Two" + ] + }, + { + "cell_type": "markdown", + "id": "b23794fe-73b4-43cb-a243-85ecea6560e2", + "metadata": {}, + "source": [ + "*This notebook was written for a two hour training session at the Terra em Foco, the Portuguese Earth Observation National Conference held in Braga, Portugal 12-13th September 2024.*\n", + "\n", + "The Copernicus Data Space Ecosystem offers immediate access to large amounts of open and free Earth observation data from the Copernicus Sentinel satellites, including both new and historical Sentinel images, as well as Copernicus Contributing Missions. As the effects of climate change intensifies, the use of earth observation data will become ever more important to monitor wildfires in the southern Europe including Portugal. 
These two notebooks will show how you can utilise Copernicus data to do just this.\n", + "\n", + "The Copernicus Data Space Ecosystem supports users in accessing, viewing, using, downloading, and analyzing data. The Copernicus Data Space Ecosystem is set up to further improve access to and exploitation of the EU’s Copernicus satellite data. The service aims to support users in building various applications needed to provide accurate, timely and objective information, which is crucial to creating a more sustainable future.\n", + "\n", + "The Copernicus Data Space Ecosystem offers multiple Application Programming Interfaces (APIs), ranging from catalogue, product download and visualization services to processing web services such as STAC, openEO and the Sentinel Hub APIs. This Jupyter notebook focuses on the Sentinel Hub APIs. The Sentinel Hub API is a RESTful interface that provides access to various satellite imagery archives. It allows you to access raw satellite data, rendered images, statistical analysis, and other features.\n", + "\n", + "You will be shown how to:\n", + "\n", + "- Visually monitor the recovery of vegetation post event\n", + "- Delineate and then polygonise the burn scar using Process API and Python\n", + "- Use the polygon to create time series of vegetation indices.\n", + "\n", + "**Note:** You should already have some experience of Sentinel Hub APIs to be able to follow the functions in this notebook; if you do not, begin by working through part one of the workshop. It is also advised that you have some basic knowledge of Python and its data science libraries.\n", + "\n", + "Firstly, before getting started we should import some libraries. In addition to common data science libraries, we will be utilising the [Sentinel Hub Python package](https://sentinelhub-py.readthedocs.io/en/latest/index.html). This comes preinstalled in the Copernicus Data Space Ecosystem Jupyter Lab. 
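If you want to confirm which version of the package is available in your environment before starting, a quick optional check is:

```python
# Optional: print the installed version of the Sentinel Hub Python package.
import sentinelhub

print(sentinelhub.__version__)
```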
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8eef2c05-e174-461b-9d0d-6f937799d00a", + "metadata": {}, + "outputs": [], + "source": [ + "# General utilities\n", + "import getpass\n", + "import warnings\n", + "from pathlib import Path\n", + "from typing import Any, Optional, Tuple\n", + "import datetime\n", + "\n", + "# Plotting\n", + "import geopandas as gpd\n", + "import matplotlib.patches as mpatches\n", + "import matplotlib.pyplot as plt\n", + "import numpy as np\n", + "import pandas as pd\n", + "\n", + "# Reading satellite imagery\n", + "import rasterio as rio\n", + "from matplotlib import colors\n", + "from rasterio.mask import mask\n", + "from rasterio.merge import merge\n", + "from rasterio.warp import calculate_default_transform, reproject\n", + "from rasterio.features import shapes\n", + "from rasterio.features import sieve\n", + "\n", + "# Sentinel Hub services\n", + "from sentinelhub import (\n", + " CRS,\n", + " DataCollection,\n", + " Geometry,\n", + " MimeType,\n", + " SentinelHubRequest,\n", + " SentinelHubStatistical,\n", + " SHConfig,\n", + " SentinelHubDownloadClient,\n", + ")\n", + "from shapely.geometry import shape\n", + "\n", + "warnings.filterwarnings(\"ignore\")\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "id": "3485eb57-85e4-442f-be96-cbd213750340", + "metadata": {}, + "source": [ + "In the following cell, we have defined some python functions to help us process and visualise the data later on in the notebook:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5a0c5099-f7d8-43ca-9e50-a52b39b19909", + "metadata": {}, + "outputs": [], + "source": [ + "def plot_image(\n", + " image: np.ndarray,\n", + " factor: float = 1.0,\n", + " clip_range: Optional[Tuple[float, float]] = None,\n", + " **kwargs: Any,\n", + ") -> None:\n", + " \"\"\"Utility function for plotting RGB images.\"\"\"\n", + " fig, ax = plt.subplots(nrows=1, ncols=1, figsize=(15, 15))\n", + " if clip_range is not None:\n", + " ax.imshow(np.clip(image * factor, *clip_range), **kwargs)\n", + " else:\n", + " ax.imshow(image * factor, **kwargs)\n", + " ax.set_xticks([])\n", + " ax.set_yticks([])\n", + "\n", + "\n", + "def vectorize_raster(raster_path, vector_path):\n", + " with rio.open(raster_path) as src:\n", + " raster_data = src.read(1) # read only the first band\n", + " transform = (\n", + " src.transform\n", + " ) # to convert pixel coordinates to geographic coordinates\n", + " crs = src.crs\n", + "\n", + " shapes_generator = shapes(\n", + " raster_data, transform=transform\n", + " ) # add mask=raster_data != 0 to exclude nodata values\n", + "\n", + " vector_data = [\n", + " {\"properties\": {\"value\": value}, \"geometry\": shape}\n", + " for shape, value in shapes_generator\n", + " ]\n", + "\n", + " gdf = gpd.GeoDataFrame.from_features(vector_data, crs=crs)\n", + " # save to file\n", + " # gdf.to_file(vector_path, driver=\"GeoJSON\")\n", + " return gdf\n", + "\n", + "\n", + "def reproject_raster(inpath, outpath, crs, method=\"nearest\"):\n", + " \"Reproject a raster to a new coordinate system.\"\n", + "\n", + " dst_crs = f\"EPSG:{crs}\"\n", + "\n", + " with rio.open(inpath) as src:\n", + " transform, width, height = calculate_default_transform(\n", + " src.crs, dst_crs, src.width, src.height, *src.bounds\n", + " )\n", + " kwargs = src.meta.copy()\n", + " kwargs.update(\n", + " {\"crs\": dst_crs, \"transform\": transform, \"width\": width, \"height\": height}\n", + " )\n", + "\n", + " with rio.open(outpath, \"w\", **kwargs) as 
dst:\n", + " for i in range(1, src.count + 1):\n", + " reproject(\n", + " source=rio.band(src, i),\n", + " destination=rio.band(dst, i),\n", + " src_transform=src.transform,\n", + " src_crs=src.crs,\n", + " dst_transform=transform,\n", + " dst_crs=dst_crs,\n", + " resampling=rio_resample(method),\n", + " )\n", + "\n", + "\n", + "def write_raster(path, raster, crs, transform, nodata, driver=\"GTiff\"):\n", + " \"\"\"Write a raster to a file.\"\"\"\n", + "\n", + " with rio.open(\n", + " path,\n", + " \"w\",\n", + " driver=driver,\n", + " height=raster.shape[0],\n", + " width=raster.shape[1],\n", + " count=1,\n", + " dtype=raster.dtype,\n", + " crs=crs,\n", + " transform=transform,\n", + " nodata=nodata,\n", + " ) as dst:\n", + " dst.write(raster, 1)\n", + "\n", + "\n", + "# define functions to extract statistics for all acquisition dates\n", + "\n", + "\n", + "def extract_stats(date, stat_data):\n", + " d = {}\n", + " for key, value in stat_data[\"outputs\"].items():\n", + " stats = value[\"bands\"][\"B0\"][\"stats\"]\n", + " if stats[\"sampleCount\"] == stats[\"noDataCount\"]:\n", + " continue\n", + " else:\n", + " d[\"date\"] = [date]\n", + " for stat_name, stat_value in stats.items():\n", + " if stat_name == \"sampleCount\" or stat_name == \"noDataCount\":\n", + " continue\n", + " else:\n", + " d[f\"{key}_{stat_name}\"] = [stat_value]\n", + " return pd.DataFrame(d)\n", + "\n", + "\n", + "def read_acquisitions_stats(stat_data):\n", + " df_li = []\n", + " for aq in stat_data:\n", + " date = aq[\"interval\"][\"from\"][:10]\n", + " df_li.append(extract_stats(date, aq))\n", + " return pd.concat(df_li)" + ] + }, + { + "cell_type": "markdown", + "id": "e377a2f5-d38d-4070-b985-d32608c26336", + "metadata": { + "tags": [] + }, + "source": [ + "## Authentication\n", + "\n", + "You can obtain the credentials for the Sentinel Hub services (`client_id` & `client_secret`) in your [Dashboard](https://shapps.dataspace.copernicus.eu/dashboard/#/). In the user settings, you can create a new OAuth client to generate these credentials. You can find more detailed instructions on the corresponding [documentation page](https://documentation.dataspace.copernicus.eu/APIs/SentinelHub/Overview/Authentication.html).\n", + "\n", + "Now that you have your `client_id` & `client_secret`, it is recommended to configure a new profile in your Sentinel Hub Python package. Instructions for configuring your Sentinel Hub Python package can be found [here](https://sentinelhub-py.readthedocs.io/en/latest/configure.html). Using these instructions, you can create a profile specifically tailored to use the package to access the Copernicus Data Space Ecosystem data collections. This is useful because changes to the Config class in your notebook are usually only temporary. If you save the configuration in your profile, you do not have to generate new credentials or overwrite/change the default profile every time you start or write a new Jupyter notebook.\n", + "\n", + "If you are using the Sentinel Hub Python package for the Copernicus Data Space Ecosystem for the first time, you should create a profile specifically for the Copernicus Data Space Ecosystem. 
You can do this in the following cell:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d88cef5a-2fa1-49f7-9b0d-7caecd9b98b1", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "# Only run this cell if you have not created a configuration.\n", + "\n", + "# config = SHConfig()\n", + "# config.sh_client_id = getpass.getpass(\"Enter your SentinelHub client id\")\n", + "# config.sh_client_secret = getpass.getpass(\"Enter your SentinelHub client secret\")\n", + "# config.sh_token_url = \"https://identity.dataspace.copernicus.eu/auth/realms/CDSE/protocol/openid-connect/token\"\n", + "# config.sh_base_url = \"https://sh.dataspace.copernicus.eu\"\n", + "# config.save(\"cdse\")" + ] + }, + { + "cell_type": "markdown", + "id": "a12b4d52-8c4c-435e-a51d-92ecca1b376e", + "metadata": {}, + "source": [ + "However, if you have already configured a profile in Sentinel Hub Python for the Copernicus Data Space Ecosystem, then you can run the below cell entering the profile name as a string replacing ``." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "735d5ffb-c793-4e2e-b171-02dffed2b601", + "metadata": {}, + "outputs": [], + "source": [ + "config = SHConfig(\"cdse\")" + ] + }, + { + "cell_type": "markdown", + "id": "506537ad-4c1a-43aa-a336-33d04f92e1fc", + "metadata": {}, + "source": [ + "We are in a new notebook so we need to redefine our AOI here:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "bdd742cf-c98a-4ffb-ba36-802af01f78b4", + "metadata": {}, + "outputs": [], + "source": [ + "area_of_interest = \"\"\"{\n", + " \"type\": \"Polygon\",\n", + " \"coordinates\": [\n", + " [\n", + " [\n", + " -8.920627,\n", + " 37.362517\n", + " ],\n", + " [\n", + " -8.516631,\n", + " 37.362517\n", + " ],\n", + " [\n", + " -8.516631,\n", + " 37.628372\n", + " ],\n", + " [\n", + " -8.920627,\n", + " 37.628372\n", + " ],\n", + " [\n", + " -8.920627,\n", + " 37.362517\n", + " ]\n", + " ]\n", + " ]\n", + "}\n", + "\"\"\"\n", + "\n", + "aoi = gpd.read_file(area_of_interest)\n", + "aoi[\"geometry\"] = aoi\n", + "aoi[\"area\"] = aoi.area\n", + "aoi.explore(\"area\", color=\"Green\", legend=False)" + ] + }, + { + "cell_type": "markdown", + "id": "8e793d58-3785-45b9-96b8-d65b1f9dc5ce", + "metadata": {}, + "source": [ + "## 4: Monitoring the recovery of vegetation post event" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e091e623-539d-42ff-9f37-bb228d34c62f", + "metadata": {}, + "outputs": [], + "source": [ + "start = datetime.datetime(2023, 8, 10)\n", + "end = datetime.datetime(2024, 7, 10)\n", + "n_chunks = 10\n", + "tdelta = (end - start) / n_chunks\n", + "edges = [(start + i * tdelta).date().isoformat() for i in range(n_chunks)]\n", + "slots = [(edges[i], edges[i + 1]) for i in range(len(edges) - 1)]\n", + "\n", + "print(\"Time windows:\\n\")\n", + "for slot in slots:\n", + " print(slot)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7ec29abf-5f69-4e91-bf36-30dbdc41b0b2", + "metadata": {}, + "outputs": [], + "source": [ + "full_geometry = Geometry(aoi.to_crs(32630).geometry.values[0], crs=CRS.UTM_30N)\n", + "\n", + "evalscript_swir_composite = \"\"\"\n", + "//VERSION=3\n", + "function setup() {\n", + " return {\n", + " input: [\"B12\",\"B8A\",\"B04\", \"dataMask\"],\n", + " output: { bands: 4 }\n", + " };\n", + "}\n", + "\n", + "function evaluatePixel(sample) {\n", + " return [sample.B12,sample.B8A,sample.B04, sample.dataMask ];\n", + "}\n", + "\"\"\"\n", + "\n", + "\n", + "def 
get_swir_composite_request(time_interval):\n", + " return SentinelHubRequest(\n", + " evalscript=evalscript_swir_composite,\n", + " input_data=[\n", + " SentinelHubRequest.input_data(\n", + " data_collection=DataCollection.SENTINEL2_L2A.define_from(\n", + " name=\"s2\", service_url=\"https://sh.dataspace.copernicus.eu\"\n", + " ),\n", + " time_interval=time_interval,\n", + " other_args={\"dataFilter\": {\"mosaickingOrder\": \"leastCC\"}},\n", + " )\n", + " ],\n", + " responses=[SentinelHubRequest.output_response(\"default\", MimeType.PNG)],\n", + " geometry=full_geometry,\n", + " size=[1000, 920],\n", + " config=config,\n", + " )" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f1662395-9b50-4557-9730-71bc5717eefb", + "metadata": {}, + "outputs": [], + "source": [ + "# create a list of requests\n", + "list_of_requests = [get_swir_composite_request(slot) for slot in slots]\n", + "list_of_requests = [request.download_list[0] for request in list_of_requests]\n", + "\n", + "# download data with multiple threads\n", + "data = SentinelHubDownloadClient(config=config).download(\n", + " list_of_requests, max_threads=5\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "bcba7cab-52ec-4c5c-a753-c32bf4d28747", + "metadata": {}, + "outputs": [], + "source": [ + "# some stuff for pretty plots\n", + "ncols = 3\n", + "nrows = 3\n", + "aspect_ratio = 1000 / 920\n", + "subplot_kw = {\"xticks\": [], \"yticks\": [], \"frame_on\": False}\n", + "\n", + "fig, axs = plt.subplots(\n", + " ncols=ncols,\n", + " nrows=nrows,\n", + " figsize=(5 * ncols * aspect_ratio, 5 * nrows),\n", + " subplot_kw=subplot_kw,\n", + ")\n", + "\n", + "for idx, image in enumerate(data):\n", + " ax = axs[idx // ncols][idx % ncols]\n", + " ax.imshow(np.clip(image * 2.5 / 255, 0, 1))\n", + " ax.set_title(f\"{slots[idx][0]} - {slots[idx][1]}\", fontsize=10)\n", + "\n", + "plt.tight_layout()" + ] + }, + { + "cell_type": "markdown", + "id": "8be41c59-0014-43f4-be60-71c098ba307e", + "metadata": {}, + "source": [ + "## 5. Delineate and then polygonize the burn scar" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "10ba417b-5327-4539-917d-cdd3d0ba54da", + "metadata": {}, + "outputs": [], + "source": [ + "# Set the directory where results will be stored\n", + "results_dir = \"./data\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "35b517cb-b9ac-460f-aa8b-0acbede0923a", + "metadata": {}, + "outputs": [], + "source": [ + "evalscript_burn_scar_detection = \"\"\"\n", + "//VERSION=3\n", + "// Burneed area detection\n", + "// Author: Monja B. 
Šebela\n", + "\n", + "function setup() {\n", + " return {\n", + " input: [\"B02\", \"B03\", \"B04\", \"B08\", \"B11\", \"B12\", \"dataMask\"],\n", + " output: { bands: 3, sampleType:\"UINT8\" }\n", + " };\n", + "}\n", + "\n", + "function evaluatePixel(samples) {\n", + "\tvar NDWI=index(samples.B03, samples.B08); \n", + "\tvar NDVI=index(samples.B08, samples.B04);\n", + "\tvar INDEX= ((samples.B11 - samples.B12) / (samples.B11 + samples.B12))+(samples.B08);\n", + "\n", + " \tif((INDEX>0.2)||(samples.B02>0.1)||(samples.B11<0.1)||(NDVI>0.3)||(NDWI > 0.1)){\n", + " \t\treturn[0,0,0]\n", + "\t}\n", + "\telse {\n", + " \treturn[1,1,1]\n", + "\t}\n", + "}\n", + "\"\"\"\n", + "\n", + "request = SentinelHubRequest(\n", + " evalscript=evalscript_burn_scar_detection,\n", + " input_data=[\n", + " SentinelHubRequest.input_data(\n", + " data_collection=DataCollection.SENTINEL2_L2A.define_from(\n", + " name=\"s2\", service_url=\"https://sh.dataspace.copernicus.eu\"\n", + " ),\n", + " time_interval=(\"2023-08-12\", \"2023-08-12\"),\n", + " other_args={\"dataFilter\": {\"mosaickingOrder\": \"leastRecent\"}},\n", + " )\n", + " ],\n", + " responses=[SentinelHubRequest.output_response(\"default\", MimeType.TIFF)],\n", + " geometry=full_geometry,\n", + " size=[1000, 920],\n", + " data_folder=results_dir,\n", + " config=config,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "80300e9b-7e0a-4248-b20b-47e1fcf062cc", + "metadata": {}, + "source": [ + "The method `get_data()` will always return a list of length 1 with the available image from the requested time interval in the form of numpy arrays." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7a03f991-be01-42db-8bbd-550db5a043d1", + "metadata": {}, + "outputs": [], + "source": [ + "sh_request = request.get_data(save_data=True)" + ] + }, + { + "cell_type": "markdown", + "id": "70fcd72a-ab81-4c67-a62d-d18b05f9a17b", + "metadata": {}, + "source": [ + "#### Clean up the burn scar\n", + "\n", + "One of the limitations of Sentinel Hub services is that it fumctions on a per-pixel basis. Therefore to perform a morphological operation on the results we will open the raster and apply the last step on the downloaded file." + ] + }, + { + "cell_type": "markdown", + "id": "f207a7f5-8319-4e18-9b04-7f58e443ce15", + "metadata": {}, + "source": [ + "We then open the raster with `rasterio` and clean the data as we did in the \"traditional\" approach. We save the output, as well as the reprojected version to be able to plot the data interactively." 
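To make the cleaning step concrete, here is a small, self-contained illustration of `rasterio.features.sieve` on a toy array: connected regions smaller than the `size` threshold are replaced by the value of their surroundings, which is how isolated false-positive pixels are removed.

```python
# Toy illustration of sieving: small connected regions are removed,
# larger ones are kept. This mirrors what is done to the burn scar below.
import numpy as np
from rasterio.features import sieve

toy = np.zeros((8, 8), dtype="uint8")
toy[1:5, 1:5] = 1   # 16-pixel region: survives the sieve
toy[6, 6] = 1       # isolated pixel: removed

print(sieve(toy, size=4))
```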
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f4dc736a-ccff-4b40-aaf5-40c241446eb3", + "metadata": {}, + "outputs": [], + "source": [ + "burn_scar_raster = request.get_filename_list()" + ] + }, + { + "cell_type": "markdown", + "id": "efbb686d-1615-403d-ae51-884cc2f464d2", + "metadata": {}, + "source": [ + "Next, we can sieve the raster to remove noise from the result:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4ed6497a-504e-43e3-b324-a0b3b583890a", + "metadata": {}, + "outputs": [], + "source": [ + "burn_scar = rio.open(f\"data/%s\" % burn_scar_raster[0])\n", + "\n", + "burn_scar_cleaned = sieve(burn_scar, size=400)\n", + "\n", + "# Save the output\n", + "write_raster(\n", + " f\"{results_dir}/burn_area.tif\",\n", + " burn_scar_cleaned.astype(\"uint8\"),\n", + " burn_scar.crs,\n", + " burn_scar.transform,\n", + " 0,\n", + ")\n", + "\n", + "# Reproject to EPSG 4326\n", + "reproject_raster(\n", + " f\"{results_dir}/burn_area.tif\",\n", + " f\"{results_dir}/burn_area_4326.tif\",\n", + " 3857,\n", + " method=\"nearest\",\n", + ")\n", + "\n", + "burn_scar.close()" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3ee4cead-1acf-4d31-b01a-5b6c6bfaf486", + "metadata": {}, + "outputs": [], + "source": [ + "cmap_burn_scar = colors.ListedColormap([\"green\", \"red\"])\n", + "bounds_burn_scar = [0, 0.5, 1]\n", + "norm_burn_scar = colors.BoundaryNorm(bounds_burn_scar, cmap_burn_scar.N)\n", + "\n", + "fig, ax = plt.subplots(figsize=(12, 8))\n", + "im = ax.imshow(burn_scar_cleaned, cmap=cmap_burn_scar, norm=norm_burn_scar)\n", + "ax.set_xticks([])\n", + "ax.set_yticks([])\n", + "ax.legend(\n", + " handles=[\n", + " mpatches.Patch(color=\"green\", label=\"Not burned\"),\n", + " mpatches.Patch(color=\"red\", label=\"Burned\"),\n", + " ]\n", + ")\n", + "ax.set_title(\"Cleaned Burn Scar\")\n", + "\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "55c39229-cb78-4912-bd5a-0fd79a45fa8a", + "metadata": {}, + "source": [ + "Once happy with the result, we can vectorise the raster so that we can use it as an input in the next step in the workflow." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6e465229-cf90-4c66-a284-c88aefdb4ac2", + "metadata": {}, + "outputs": [], + "source": [ + "raster_path = \"data/burn_area_4326.tif\"\n", + "vector_path = \"data/vectorized.geojson\"\n", + "\n", + "burn_scar = vectorize_raster(raster_path, vector_path)" + ] + }, + { + "cell_type": "markdown", + "id": "bb4a2a69-e57e-40f1-b759-7e5f8c0860a3", + "metadata": {}, + "source": [ + "If we examine the burn scar vector, we can currently see that it is made up of several features. To simplify this we can dissolve it into a single Multipolygon feature." 
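Once you have filtered and dissolved the polygon in the cells that follow, you may also want to quantify it. A minimal sketch that reprojects the dissolved `burn_scar` GeoDataFrame to the metric UTM zone used earlier and reports the area in hectares:

```python
# Sketch only: approximate burned area in hectares from the dissolved polygon.
# Assumes `burn_scar` has already been filtered to value == 1 and dissolved.
burn_scar_utm = burn_scar.to_crs(32630)               # coordinates in metres
print(f"Approximate burned area: {burn_scar_utm.area.sum() / 10_000:.0f} ha")
```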
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "06856120-d09c-4e66-884d-ac5ae29f39bf", + "metadata": {}, + "outputs": [], + "source": [ + "burn_scar" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "271bfea0-e945-45be-9a6c-a115ff51e329", + "metadata": {}, + "outputs": [], + "source": [ + "burn_scar[\"area\"] = burn_scar.area\n", + "burn_scar[\"area\"]\n", + "burn_scar = burn_scar[burn_scar[\"value\"] == 1.0]\n", + "burn_scar = burn_scar[burn_scar[\"area\"] >= 0]\n", + "burn_scar = burn_scar.dissolve()\n", + "burn_scar" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "bde96706-be71-4b33-aec9-044dd931f3c4", + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "burn_scar.explore(\"value\", color=\"Green\", legend=False)" + ] + }, + { + "cell_type": "markdown", + "id": "c1b5ecc6-fdbc-4402-b645-1c58ef1ff2af", + "metadata": {}, + "source": [ + "Now we're ready to use the polygon as an input into a Statistical API request." + ] + }, + { + "cell_type": "markdown", + "id": "96287cfb-ba69-46d4-a5b8-588aef4225f7", + "metadata": {}, + "source": [ + "## 6. Plot an mean NDVI and mean Burn Ratio timeseries using the burnscar post event\n", + "\n", + "We will do this using Statistical API. In the following request, we actually request two different indices in two seperate outputs." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9a01b000-2c1c-4095-b860-d0965aeadfdf", + "metadata": {}, + "outputs": [], + "source": [ + "full_geometry = Geometry(burn_scar.to_crs(32630).geometry.values[0], crs=CRS.UTM_30N)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "feed98cd-b1ac-437a-9f82-dbf8c3bc3cca", + "metadata": {}, + "outputs": [], + "source": [ + "yearly_time_interval = \"2023-07-01\", \"2024-07-01\"\n", + "\n", + "ndvi_evalscript = \"\"\"\n", + "//VERSION=3\n", + "\n", + "function setup() {\n", + " return {\n", + " input: [\n", + " {\n", + " bands: [\n", + " \"B04\",\n", + " \"B06\",\n", + " \"B07\",\n", + " \"B08\",\n", + " \"B8A\",\n", + " \"B12\",\n", + " \"dataMask\"\n", + " ]\n", + " }\n", + " ],\n", + " output: [\n", + " {\n", + " id: \"ndvi\",\n", + " bands: 1\n", + " },\n", + " {\n", + " id: \"burnratio\",\n", + " bands: 1\n", + " }, \n", + " {\n", + " id: \"dataMask\",\n", + " bands: 1\n", + " }\n", + " ]\n", + " }\n", + "}\n", + "\n", + "\n", + "function evaluatePixel(samples) {\n", + "\n", + " return {\n", + " ndvi: [index(samples.B08, samples.B04)],\n", + " burnratio: [index(samples.B08, samples.B12)],\n", + " dataMask: [samples.dataMask]\n", + " };\n", + " }\n", + "\"\"\"\n", + "\n", + "aggregation = SentinelHubStatistical.aggregation(\n", + " evalscript=ndvi_evalscript,\n", + " time_interval=yearly_time_interval,\n", + " aggregation_interval=\"P5D\",\n", + " resolution=(10, 10),\n", + ")\n", + "\n", + "input_data = SentinelHubStatistical.input_data(\n", + " DataCollection.SENTINEL2_L2A.define_from(\"s2\", service_url=config.sh_base_url),\n", + " other_args={\"dataFilter\": {\"maxCloudCoverage\": 10}},\n", + ")\n", + "\n", + "ndvi_requests = []\n", + "\n", + "request = SentinelHubStatistical(\n", + " aggregation=aggregation,\n", + " input_data=[input_data],\n", + " geometry=full_geometry,\n", + " config=config,\n", + ")\n", + "response1 = request.get_data()\n", + "response1" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f1cc9ee3", + "metadata": {}, + "outputs": [], + "source": [ + "result_df1 = 
read_acquisitions_stats(response1[0][\"data\"])\n", + "result_df1" + ] + }, + { + "cell_type": "markdown", + "id": "a40d5ace-f4a7-4369-9121-39d3985520cf", + "metadata": {}, + "source": [ + "We can take this another step further, and display the data in a time series using the Matplotlib python library:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "90d22ebf-5a10-4279-b776-227137f60ef4", + "metadata": {}, + "outputs": [], + "source": [ + "fig_stat, ax_stat = plt.subplots(1, 1, figsize=(18, 6))\n", + "t1 = result_df1[\"date\"]\n", + "t2 = result_df1[\"date\"]\n", + "ndvi_mean_field1 = result_df1[\"ndvi_mean\"]\n", + "ndvi_std_field1 = result_df1[\"ndvi_stDev\"]\n", + "ndvi_mean_field2 = result_df1[\"burnratio_max\"]\n", + "ndvi_std_field2 = result_df1[\"burnratio_stDev\"]\n", + "ax_stat.plot(t1, ndvi_mean_field1, label=\"NDVI mean\")\n", + "ax_stat.fill_between(\n", + " t1,\n", + " ndvi_mean_field1 - ndvi_std_field1,\n", + " ndvi_mean_field1 + ndvi_std_field1,\n", + " alpha=0.3,\n", + " label=\"NDVI stDev\",\n", + ")\n", + "ax_stat.plot(t2, ndvi_mean_field2, label=\"Burn Ratio mean\")\n", + "ax_stat.fill_between(\n", + " t2,\n", + " ndvi_mean_field2 - ndvi_std_field2,\n", + " ndvi_mean_field2 + ndvi_std_field2,\n", + " alpha=0.3,\n", + " label=\"Burn Ratio stDev\",\n", + ")\n", + "ax_stat.tick_params(axis=\"x\", labelrotation=30, labelsize=12)\n", + "ax_stat.tick_params(axis=\"y\", labelsize=12)\n", + "ax_stat.set_xlabel(\"Date\", size=15)\n", + "ax_stat.set_ylabel(\"NDVI/unitless\", size=15)\n", + "ax_stat.legend(loc=\"lower right\", prop={\"size\": 12})\n", + "ax_stat.set_title(\"NDVI time series\", fontsize=20)\n", + "for label in ax_stat.get_xticklabels()[1::20]:\n", + " label.set_visible(False)" + ] + }, + { + "cell_type": "markdown", + "id": "90b698f9-e96f-48cc-8675-383de068c2af", + "metadata": {}, + "source": [ + "## 7. Exercise: Compare burned area with an area of unburned vegetation\n", + "\n", + "Now that you have learned how you visualise time series derived from statistical API, you can apply your knowledge to try some different comparisons; here are some hints on what you could try out:\n", + "- create another AOI - you can visit the Request Builder application, draw an AOI and then download it there. \n", + "- put together a new request with the new AOI\n", + "- change the date range and/or the interval period. try changing the cloud cover percentage used.\n", + "- update the previous cell to plot the two AOIs onto the same plot." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "38a287d5-4b80-4976-be47-66dcf18dcb62", + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Sentinel Hub", + "language": "python", + "name": "sentinelhub" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.14" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +}
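As a starting point for this exercise, the sketch below shows one way the pieces could fit together. The second AOI is a placeholder polygon that you would replace with one drawn in the Request Builder, and it reuses the `aggregation`, `input_data` and `read_acquisitions_stats` objects defined in section 6.

```python
# Sketch only: compare the burn scar with a second, unburned AOI.
# The coordinates below are placeholders - export your own AOI instead.
unburned_geojson = """{
    "type": "Polygon",
    "coordinates": [[[-8.45, 37.40], [-8.30, 37.40], [-8.30, 37.50],
                     [-8.45, 37.50], [-8.45, 37.40]]]
}"""

unburned_aoi = gpd.read_file(unburned_geojson)
unburned_geometry = Geometry(unburned_aoi.to_crs(32630).geometry.values[0], crs=CRS.UTM_30N)

unburned_request = SentinelHubStatistical(
    aggregation=aggregation,       # same evalscript, interval and resolution as section 6
    input_data=[input_data],
    geometry=unburned_geometry,
    config=config,
)
result_df2 = read_acquisitions_stats(unburned_request.get_data()[0]["data"])

# Plot the two mean NDVI series on the same axes.
fig, ax = plt.subplots(figsize=(18, 6))
ax.plot(pd.to_datetime(result_df1["date"]), result_df1["ndvi_mean"], label="Burn scar NDVI mean")
ax.plot(pd.to_datetime(result_df2["date"]), result_df2["ndvi_mean"], label="Unburned AOI NDVI mean")
ax.tick_params(axis="x", labelrotation=30)
ax.set_xlabel("Date")
ax.set_ylabel("NDVI")
ax.legend()
ax.set_title("Burned vs. unburned mean NDVI")
```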