Changes from all commits (25 commits)
- 441063e  Parametrize test parameters across test functions (Sep 18, 2025)
- 5d760bc  Fix pre-commit failure (Sep 18, 2025)
- b87582d  Fix pyramid store detection in get_tile test (Sep 18, 2025)
- 7890228  Update tests/test_app.py (jbusecke, Sep 18, 2025)
- ad28eab  updated zarr; added zarr v3 test fixture; parametrize caching for tests (Sep 18, 2025)
- 9161be4  Remove dask dependency (Sep 18, 2025)
- a018066  updated zarr fixtures+scripts; minor test changes+additions;add dask … (Sep 18, 2025)
- 3e57c11  update tilejson expected responses (Sep 18, 2025)
- 4d20b59  All tests passing with zarr v2 and v3 (Oct 8, 2025)
- 21f33d1  add icechunk fixture and generation script (Oct 8, 2025)
- e6d86ea  Remove debugs, fix most tests (Oct 8, 2025)
- d99859c  Merge branch 'main' into support-icechunk (jbusecke, Oct 8, 2025)
- 791d618  Bump xarray + renable testing with cache (Oct 8, 2025)
- e1d90bf  Some more debugging of the tile test (Oct 8, 2025)
- 98809ae  Fix errors by pinning rio-tiler (Oct 8, 2025)
- 89b1755  Add git installation to Dockerfile (jbusecke, Oct 8, 2025)
- ab81349  dummy whitespace commit (jbusecke, Oct 8, 2025)
- de5e640  add pytest-xdist and regen uv.lock (Oct 8, 2025)
- 900f2d5  Add notebook deps and example notebook (Oct 8, 2025)
- accce17  remove rio-tiler limit (vincentsarago, Oct 9, 2025)
- 9d7c6a8  Add virtual icechunk tests + fixtures + responses (Oct 9, 2025)
- cbd9105  Merge branch 'support-icechunk' of https://github.com/developmentseed… (Oct 9, 2025)
- 32a9c43  Switch back titiler deps to PR branch (Oct 9, 2025)
- 9aed634  Test increasing timeout (Oct 9, 2025)
- 32f3788  Add test notebook (Oct 9, 2025)
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -52,7 +52,7 @@ jobs:
     uv run pre-commit run --all-files

 - name: Run tests
-  run: uv run pytest
+  run: uv run pytest -n auto

 cdk-checks:
   needs: [tests]
1 change: 1 addition & 0 deletions .gitignore
@@ -105,3 +105,4 @@ cdk.out/
 node_modules
 cdk.context.json
 *.nc
+.DS_Store
3 changes: 3 additions & 0 deletions README.md
@@ -14,6 +14,8 @@ Example of application built with `titiler.xarray` [package](https://development
 # It's recommended to install dependencies in a virtual environment
 uv sync --dev
 export TEST_ENVIRONMENT=true # set this when running locally to mock redis
+#optional: Disable caching
+#export TITILER_MULTIDIM_ENABLE_CACHE=false
 uv run uvicorn titiler.multidim.main:app --reload
 ```
@@ -94,3 +96,4 @@ The following steps detail how to to setup and deploy the CDK stack from your lo

 In AWS Lambda environment we need to have specific version of botocore, S3FS, FSPEC and other libraries.
 To make sure the application will both work locally and in AWS Lambda environment you can install the dependencies using `python -m pip install -r infrastructure/aws/requirement-lambda.txt`
+
2 changes: 1 addition & 1 deletion infrastructure/aws/cdk/app.py
@@ -42,7 +42,7 @@ def __init__(
     scope: Construct,
     id: str,
     memory: int = 1024,
-    timeout: int = 30,
+    timeout: int = 60,
     runtime: aws_lambda.Runtime = aws_lambda.Runtime.PYTHON_3_12,
     concurrent: Optional[int] = None,
     permissions: Optional[List[iam.PolicyStatement]] = None,
2 changes: 1 addition & 1 deletion infrastructure/aws/lambda/Dockerfile
@@ -7,7 +7,7 @@ FROM public.ecr.aws/lambda/python:${PYTHON_VERSION} AS builder
 COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

 # Install system dependencies needed for compilation
-RUN dnf install -y gcc-c++ && dnf clean all
+RUN dnf install -y gcc-c++ git && dnf clean all

 # Set working directory for build
 WORKDIR /build
457 changes: 457 additions & 0 deletions notebooks/test_native_icechunk.ipynb
Large diffs are not rendered by default.
27 changes: 21 additions & 6 deletions pyproject.toml
@@ -25,8 +25,8 @@ classifiers = [
 ]
 dynamic = ["version"]
 dependencies = [
-    "titiler.core>=0.23.0,<0.24",
-    "titiler.xarray>=0.23.0,<0.24",
+    "titiler-core>=0.23.0,<0.25",
+    "titiler-xarray>=0.23.0,<0.25",
     "aiohttp",
     "aiobotocore>=2.24.0",
     "boto3>=1.39.0",
@@ -41,8 +41,9 @@ dependencies = [
     "requests",
     "rioxarray",
     "s3fs",
-    "xarray",
-    "zarr>=2,<3",
+    "xarray>=2025.10.1",
+    "zarr>3.1.0",
+    "icechunk>=1.1.9",
 ]

 [project.optional-dependencies]
@@ -56,6 +57,7 @@ lambda = [

 [dependency-groups]
 dev = [
+    "dask>=2025.9.1",
     "fakeredis>=2.23.5",
     "httpx",
     "ipykernel>=6.30.1",
@@ -65,21 +67,36 @@ dev = [
     "pytest-asyncio>=0.24.0",
     "pytest-cov>=5.0.0",
     "pytest>=8.3.2",
+    "pytest-xdist",
     "uvicorn>=0.34.0",
     "yappi>=1.6.0",
+    "virtualizarr",
+    "obstore",
 ]
 deployment = [
     "aws-cdk-lib~=2.177.0",
     "constructs>=10.4.2",
     "pydantic-settings~=2.0",
     "python-dotenv>=1.0.1",
 ]
+notebooks = [
+    "folium",
+    "httpx",
+    "matplotlib",
+]

 [project.urls]
 Homepage = "https://github.com/developmentseed/titiler-xarray"
 Issues = "https://github.com/developmentseed/titiler-xarray/issues"
 Source = "https://github.com/developmentseed/titiler-xarray"

+[tool.uv.sources]
+titiler-xarray = { git = "https://github.com/jbusecke/titiler.git", branch = "jbusecke-icechunk-reader", subdirectory = "src/titiler/xarray" }
+titiler-core = { git = "https://github.com/jbusecke/titiler.git", branch = "jbusecke-icechunk-reader", subdirectory = "src/titiler/core" }
+# For local testing: TODO revert to merged titiler feature
+#titiler-xarray = { path = "../titiler/src/titiler/xarray", editable = true }
+#titiler-core = { path = "../titiler/src/titiler/core", editable = true }
+
 [tool.coverage.run]
 branch = true
 parallel = true
@@ -126,8 +143,6 @@ explicit_package_bases = true
 requires = ["pdm-backend"]
 build-backend = "pdm.backend"

-
-
 [tool.pdm.version]
 source = "file"
 path = "src/titiler/multidim/__init__.py"
30 changes: 26 additions & 4 deletions tests/conftest.py
@@ -1,15 +1,37 @@
-"""titiler.multidim tests configuration."""
+"""Auto-parametrized fixture that runs both cache configurations."""

+import sys
 import pytest
 from fastapi.testclient import TestClient


-@pytest.fixture
-def app(monkeypatch):
-    """App fixture."""
+# This fixture will automatically parametrize ALL tests that use it
+@pytest.fixture(
+    params=[
+        pytest.param({"cache": True}, id="with_cache"),
+        pytest.param({"cache": False}, id="without_cache"),
+    ]
+)
+def app(request, monkeypatch):
+    """Auto-parametrized app fixture that runs tests with both cache configurations."""
+    config = request.param
+    enable_cache = config.get("cache", False)
+
+    # Set environment variables using monkeypatch (auto-cleanup)
+    monkeypatch.setenv("TITILER_MULTIDIM_DEBUG", "TRUE")
+    monkeypatch.setenv("TEST_ENVIRONMENT", "1")
+    monkeypatch.setenv(
+        "TITILER_MULTIDIM_ENABLE_CACHE", "TRUE" if enable_cache else "FALSE"
+    )
+
+    # Clear module cache to ensure fresh import
+    modules_to_clear = [
+        key for key in sys.modules.keys() if key.startswith("titiler.multidim")
+    ]
+    for module in modules_to_clear:
+        del sys.modules[module]
+
+    # Import and return the app
+    from titiler.multidim.main import app

     with TestClient(app) as client:
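The effect of the fixture change above is that every test requesting `app` now runs twice, with the `TITILER_MULTIDIM_ENABLE_CACHE` variable derived from each param. A minimal stdlib-only sketch of that param-to-environment mapping (names mirror the conftest; no titiler import needed, and this is an illustration, not the actual fixture):

```python
# The two fixture params from conftest.py above
params = [{"cache": True}, {"cache": False}]

# The same mapping the fixture applies via monkeypatch.setenv
env_values = ["TRUE" if p.get("cache", False) else "FALSE" for p in params]

print(env_values)  # one test invocation per value
```

This mirrors why the test IDs come out as `with_cache` and `without_cache` in pytest's output.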
60 changes: 60 additions & 0 deletions tests/fixtures/generate_test_icechunk.py
@@ -0,0 +1,60 @@
+"""Create icechunk fixtures (native and later virtual)."""
+# TODO: these files could also be generated together with the zarr files using the same data
+
+import numpy as np
+import xarray as xr
+import icechunk as ic
+
+# Define dimensions and chunk sizes
+res = 5
+time_dim = 10
+lat_dim = 36
+lon_dim = 72
+chunk_size = {"time": 10, "lat": 10, "lon": 10}
+
+# Create coordinates
+time = np.arange(time_dim)
+lat = np.linspace(-90.0 + res / 2, 90.0 - res / 2, lat_dim)
+lon = np.linspace(-180.0 + res / 2, 180.0 - res / 2, lon_dim)
+
+dtype = np.float64
+# Initialize variables with random data
+CDD0 = xr.DataArray(
+    np.random.rand(time_dim, lat_dim, lon_dim).astype(dtype),
+    dims=("time", "lat", "lon"),
+    name="CDD0",
+)
+DISPH = xr.DataArray(
+    np.random.rand(time_dim, lat_dim, lon_dim).astype(dtype),
+    dims=("time", "lat", "lon"),
+    name="DISPH",
+)
+FROST_DAYS = xr.DataArray(
+    np.random.rand(time_dim, lat_dim, lon_dim).astype(dtype),
+    dims=("time", "lat", "lon"),
+    name="FROST_DAYS",
+)
+GWETPROF = xr.DataArray(
+    np.random.rand(time_dim, lat_dim, lon_dim).astype(dtype),
+    dims=("time", "lat", "lon"),
+    name="GWETPROF",
+)
+
+# Create dataset
+ds = xr.Dataset(
+    {
+        "CDD0": CDD0.chunk(chunk_size),
+        "DISPH": DISPH.chunk(chunk_size),
+        "FROST_DAYS": FROST_DAYS.chunk(chunk_size),
+        "GWETPROF": GWETPROF.chunk(chunk_size),
+    },
+    coords={"time": time, "lat": lat, "lon": lon},
+)
+storage = ic.local_filesystem_storage("tests/fixtures/icechunk_native")
+config = ic.RepositoryConfig.default()
+repo = ic.Repository.create(storage=storage, config=config)
+session = repo.writable_session("main")
+store = session.store
+
+ds.to_zarr(store, consolidated=False)
+session.commit("Add initial data")
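The chunking in the fixture script above makes the store small but still multi-chunk, which matters for exercising tile reads. With dimensions (time=10, lat=36, lon=72) and chunks of 10 along each axis, each variable splits into 1 x 4 x 8 = 32 chunks. A stdlib-only check of that arithmetic:

```python
import math

# Fixture dimensions and chunk sizes, copied from the script above
dims = {"time": 10, "lat": 36, "lon": 72}
chunks = {"time": 10, "lat": 10, "lon": 10}

# Number of chunks along each axis is ceil(dim / chunk)
per_axis = {d: math.ceil(dims[d] / chunks[d]) for d in dims}
n_chunks = math.prod(per_axis.values())

print(per_axis, n_chunks)
```

The partial chunks along `lat` (36 is not a multiple of 10) and `lon` (72 likewise) are what make this a useful edge-case fixture.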
61 changes: 61 additions & 0 deletions tests/fixtures/generate_test_icechunk_virtual.py
@@ -0,0 +1,61 @@
+"""Here we test generating icechunk virtual files"""
+
+from virtualizarr import open_virtual_mfdataset
+from virtualizarr.parsers import HDFParser
+from virtualizarr.registry import ObjectStoreRegistry
+
+import obstore
+import icechunk
+
+# NOTE: For now Ill build stores that are stored locally, but point to data on s3.
+# Eventually this should probably be built out with a bunch of different options? Not sure if local storage referencing local files would make sense?
+
+# Store that we cannot access from the tests (to ensure proper error handling) - MUR would fit the bill
+
+# Store that points to a public s3 bucket (Using NLDAS as examples - see https://github.com/virtual-zarr/nldas-icechunk/tree/master for details)
+
+urls = [
+    "s3://nasa-waterinsight/NLDAS3/forcing/daily/200101/NLDAS_FOR0010_D.A20010101.030.beta.nc",
+    "s3://nasa-waterinsight/NLDAS3/forcing/daily/200101/NLDAS_FOR0010_D.A20010102.030.beta.nc",
+    "s3://nasa-waterinsight/NLDAS3/forcing/daily/200101/NLDAS_FOR0010_D.A20010103.030.beta.nc",
+]
+
+bucket = "s3://nasa-waterinsight"
+store = obstore.store.from_url(bucket, region="us-west-2", skip_signature=True)
+registry = ObjectStoreRegistry({bucket: store})
+parser = HDFParser()
+
+vds = open_virtual_mfdataset(
+    urls,
+    parser=parser,
+    registry=registry,
+)
+
+storage = icechunk.local_filesystem_storage(
+    "tests/fixtures/icechunk_virtual_accessible"
+)
+
+config = icechunk.RepositoryConfig.default()
+config.set_virtual_chunk_container(
+    icechunk.VirtualChunkContainer(
+        "s3://nasa-waterinsight/NLDAS3/forcing/daily/",
+        icechunk.s3_store(region="us-west-2"),
+    )
+)
+
+virtual_credentials = icechunk.containers_credentials(
+    {
+        "s3://nasa-waterinsight/NLDAS3/forcing/daily/": icechunk.s3_anonymous_credentials()
+    }
+)
+
+repo = icechunk.Repository.open_or_create(
+    storage=storage,
+    config=config,
+    authorize_virtual_chunk_access=virtual_credentials,
+)
+
+session = repo.writable_session("main")
+vds.vz.to_icechunk(session.store)
+session.commit("Committed test dataset with virtual chunks")
+print("Done committing virtual dataset with publicly accessible chunk to icechunk repo")
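In the script above, virtual-chunk access is scoped by URL prefix: the container's `url_prefix` must cover every virtual chunk location, otherwise the anonymous credentials registered for that container do not apply. A stdlib-only sketch of that prefix matching, using the NLDAS URLs listed above:

```python
# Container prefix and chunk URLs, copied from the script above
prefix = "s3://nasa-waterinsight/NLDAS3/forcing/daily/"
urls = [
    "s3://nasa-waterinsight/NLDAS3/forcing/daily/200101/NLDAS_FOR0010_D.A20010101.030.beta.nc",
    "s3://nasa-waterinsight/NLDAS3/forcing/daily/200101/NLDAS_FOR0010_D.A20010102.030.beta.nc",
    "s3://nasa-waterinsight/NLDAS3/forcing/daily/200101/NLDAS_FOR0010_D.A20010103.030.beta.nc",
]

# Every virtual chunk must fall under the container prefix for the
# registered anonymous credentials to be used when reading it.
covered = all(u.startswith(prefix) for u in urls)
print(covered)
```

This is why the same prefix string appears three times in the script (container, credentials mapping, and implicitly in the chunk URLs).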
32 changes: 23 additions & 9 deletions tests/fixtures/generate_test_zarr.py
@@ -1,4 +1,4 @@
-"""Create zarr fixture."""
+"""Create zarr fixtures for v2 and v3."""

 import numpy as np
 import xarray as xr
@@ -8,31 +8,32 @@
 time_dim = 10
 lat_dim = 36
 lon_dim = 72
-chunk_size = (10, 10, 10)
+chunk_size = {"time": 10, "lat": 10, "lon": 10}

 # Create coordinates
 time = np.arange(time_dim)
-lat = np.linspace(-90 + res / 2, 90 - res / 2, lat_dim)
-lon = np.linspace(-180 + res / 2, 180 - res / 2, lon_dim)
+lat = np.linspace(-90.0 + res / 2, 90.0 - res / 2, lat_dim)
+lon = np.linspace(-180.0 + res / 2, 180.0 - res / 2, lon_dim)

+dtype = np.float64
 # Initialize variables with random data
 CDD0 = xr.DataArray(
-    np.random.rand(time_dim, lat_dim, lon_dim).astype(np.uint8),
+    np.random.rand(time_dim, lat_dim, lon_dim).astype(dtype),
     dims=("time", "lat", "lon"),
     name="CDD0",
 )
 DISPH = xr.DataArray(
-    np.random.rand(time_dim, lat_dim, lon_dim).astype(np.uint8),
+    np.random.rand(time_dim, lat_dim, lon_dim).astype(dtype),
     dims=("time", "lat", "lon"),
     name="DISPH",
 )
 FROST_DAYS = xr.DataArray(
-    np.random.rand(time_dim, lat_dim, lon_dim).astype(np.uint8),
+    np.random.rand(time_dim, lat_dim, lon_dim).astype(dtype),
     dims=("time", "lat", "lon"),
     name="FROST_DAYS",
 )
 GWETPROF = xr.DataArray(
-    np.random.rand(time_dim, lat_dim, lon_dim).astype(np.uint8),
+    np.random.rand(time_dim, lat_dim, lon_dim).astype(dtype),
     dims=("time", "lat", "lon"),
     name="GWETPROF",
 )
@@ -49,4 +50,17 @@
 )

 # Save dataset to a local Zarr store
-ds.to_zarr("tests/fixtures/test_zarr_store.zarr", mode="w")
+ds.to_zarr(
+    "tests/fixtures/zarr_store_v3.zarr",
+    mode="w",
+    zarr_format=3,
+    consolidated=False,
+)
+
+# Save dataset to a local Zarr store
+ds.to_zarr(
+    "tests/fixtures/zarr_store_v2.zarr",
+    mode="w",
+    zarr_format=2,
+    consolidated=True,
+)
Binary files not shown.
1 change: 1 addition & 0 deletions tests/fixtures/icechunk_native/refs/branch.main/ref.json
@@ -0,0 +1 @@
+{"snapshot":"X7NF54E8W362EQT4PJDG"}
Binary files not shown.
18 changes: 18 additions & 0 deletions tests/fixtures/icechunk_virtual_accessible/config.yaml
@@ -0,0 +1,18 @@
+inline_chunk_threshold_bytes: null
+get_partial_values_concurrency: null
+compression: null
+max_concurrent_requests: null
+caching: null
+storage: null
+virtual_chunk_containers:
+  s3://nasa-waterinsight/NLDAS3/forcing/daily/:
+    name: null
+    url_prefix: s3://nasa-waterinsight/NLDAS3/forcing/daily/
+    store: !s3
+      region: us-west-2
+      endpoint_url: null
+      anonymous: false
+      allow_http: false
+      force_path_style: false
+      network_stream_timeout_seconds: 60
+manifest: null
Binary files not shown.
@@ -0,0 +1 @@
+{"snapshot":"80FX0M404MCT0R3665E0"}
Binary files not shown.