Chore: update demos for release #60


Merged: 19 commits, merged on Jul 24, 2024
47a110c
upgrade postgres and redis versions, upgrade airflow-scheduled-job de…
xeniape Jul 16, 2024
767e4eb
update versions in hbase-hdfs-load-cycling-data and airflow-scheduled…
xeniape Jul 16, 2024
2a81489
Update logging demo for the next release
siegfriedweber Jul 18, 2024
f62f996
Update signal-processing demo and container image
Techassi Jul 18, 2024
879ef47
upgrade minio version and nifi-kafka-druid-superset-s3 versions and d…
xeniape Jul 18, 2024
a64721b
anomaly-detection: update trino, superset, spark products
razvan Jul 18, 2024
914cedd
docs(demos/data-lakehouse-iceberg-trino-spark): update requirements t…
NickLarsenNZ Jul 19, 2024
52eb86d
docs(demos/data-lakehouse-iceberg-trino-spark): be less specific abou…
NickLarsenNZ Jul 19, 2024
9b5df59
docs(demos/data-lakehouse-iceberg-trino-spark): update the link for tpch
NickLarsenNZ Jul 19, 2024
904148d
docs(demos/data-lakehouse-iceberg-trino-spark): upgrade the NOTE abou…
NickLarsenNZ Jul 19, 2024
b22b870
bump versions for jupyterhub-pyspark demo
xeniape Jul 19, 2024
63f08ee
bump opa version for the trino-superset-s3 stack
razvan Jul 19, 2024
44a9f7e
bump testing-tools in nifi-kafka-druid demos and revert druid version…
xeniape Jul 19, 2024
69621fc
bump end-2-end-security versions
maltesander Jul 22, 2024
5dfeac6
consolidate stackable versions
maltesander Jul 22, 2024
cdad9dd
more version bumps (untested)
maltesander Jul 22, 2024
5ae448a
bump to 24.7.0
maltesander Jul 23, 2024
241f4d9
adapt to release 24.7
maltesander Jul 23, 2024
6264a8a
Apply suggestions from code review
NickLarsenNZ Jul 24, 2024
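Most of the commits listed above are mechanical image-tag bumps from the `24.3.0` release suffix to `24.7.0`. As a hedged illustration only (this script is not part of the PR; the regex and version strings are illustrative), such a bump across manifest text can be sketched as:

```python
import re

OLD, NEW = "24.3.0", "24.7.0"

def bump_stackable_tag(text: str) -> str:
    """Rewrite image tags such as ...:1.0.0-stackable24.3.0 to the new release suffix."""
    # Only touch the "-stackableX.Y.Z" suffix so product versions stay untouched.
    return re.sub(rf"(-stackable){re.escape(OLD)}\b", rf"\g<1>{NEW}", text)

manifest_line = "image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0"
print(bump_stackable_tag(manifest_line))
# → image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
```

Running a script like this over the demo manifests is one way to keep such release bumps consistent; the non-mechanical changes in this PR (product versions, chart versions, docs) still need manual review.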
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: start-pyspark-job
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
# N.B. it is possible for the scheduler to report that a DAG exists, only for the worker task to fail if a pod is unexpectedly
# restarted. Additionally, the db-init job takes a few minutes to complete before the cluster is deployed. The wait/watch steps
# below are not "water-tight" but add a layer of stability by at least ensuring that the db is initialized and ready and that
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: start-date-job
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
# N.B. it is possible for the scheduler to report that a DAG exists, only for the worker task to fail if a pod is unexpectedly
# restarted. Additionally, the db-init job takes a few minutes to complete before the cluster is deployed. The wait/watch steps
# below are not "water-tight" but add a layer of stability by at least ensuring that the db is initialized and ready and that
@@ -9,11 +9,11 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-kafka
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
command: ["bash", "-c", "echo 'Waiting for all kafka brokers to be ready' && kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/instance=kafka -l app.kubernetes.io/name=kafka"]
containers:
- name: create-nifi-ingestion-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/LakehouseKafkaIngest.xml && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
@@ -12,11 +12,11 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-kafka
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
command: ["bash", "-c", "echo 'Waiting for all kafka brokers to be ready' && kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/name=kafka -l app.kubernetes.io/instance=kafka"]
containers:
- name: create-spark-ingestion-job
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
command: ["bash", "-c", "echo 'Submitting Spark job' && kubectl apply -f /tmp/manifest/spark-ingestion-job.yaml"]
volumeMounts:
- name: manifest
@@ -9,11 +9,11 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-testdata
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
command: ["bash", "-c", "echo 'Waiting for job load-test-data to finish' && kubectl wait --for=condition=complete --timeout=30m job/load-test-data"]
containers:
- name: create-tables-in-trino
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "python -u /tmp/script/script.py"]
volumeMounts:
- name: script
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-superset
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
4 changes: 2 additions & 2 deletions demos/end-to-end-security/create-spark-report.yaml
@@ -12,7 +12,7 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-trino-tables
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
command:
- bash
- -euo
@@ -23,7 +23,7 @@ spec:
kubectl wait --timeout=30m --for=condition=complete job/create-tables-in-trino
containers:
- name: create-spark-report
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
command:
- bash
- -euo
2 changes: 1 addition & 1 deletion demos/end-to-end-security/create-trino-tables.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-tables-in-trino
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable23.11.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "python -u /tmp/script/script.py"]
volumeMounts:
- name: script
@@ -9,7 +9,7 @@ spec:
spec:
containers:
- name: create-hfile-and-import-to-hbase
image: docker.stackable.tech/stackable/hbase:2.4.17-stackable24.3.0
image: docker.stackable.tech/stackable/hbase:2.4.18-stackable24.7.0
env:
- name: HADOOP_USER_NAME
value: stackable
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: distcp-cycling-data
image: docker.stackable.tech/stackable/hadoop:3.3.6-stackable24.3.0
image: docker.stackable.tech/stackable/hadoop:3.4.0-stackable24.7.0
env:
- name: HADOOP_USER_NAME
value: stackable
@@ -1,6 +1,6 @@
# docker build -t docker.stackable.tech/demos/pyspark-k8s-with-scikit-learn:3.3.0-stackable0.0.0-dev .

FROM docker.stackable.tech/stackable/pyspark-k8s:3.5.0-stackable24.3.0
FROM docker.stackable.tech/stackable/pyspark-k8s:3.5.0-stackable24.7.0

COPY requirements.txt .

@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: load-ny-taxi-data
image: docker.stackable.tech/stackable/hadoop:3.3.6-stackable24.3.0
image: docker.stackable.tech/stackable/hadoop:3.4.0-stackable24.7.0
command: ["bash", "-c", "/stackable/hadoop/bin/hdfs dfs -mkdir -p /ny-taxi-data/raw \
&& cd /tmp \
&& for month in 2020-09; do \
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-druid-ingestion-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor"]
volumeMounts:
- name: ingestion-job-spec
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-nifi-ingestion-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/IngestEarthquakesToKafka.xml && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
2 changes: 1 addition & 1 deletion demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-superset
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-druid-ingestion-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/stations-ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor && curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/measurements-ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor && curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/measurements-compaction-job-spec.json https://druid-coordinator:8281/druid/coordinator/v1/config/compaction"]
volumeMounts:
- name: ingestion-job-spec
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-nifi-ingestion-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/IngestWaterLevelsToKafka.xml && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-superset
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
4 changes: 2 additions & 2 deletions demos/signal-processing/Dockerfile-nifi
@@ -1,5 +1,5 @@
# docker build -f ./Dockerfile-nifi -t docker.stackable.tech/demos/nifi:1.25.0-postgresql .
# docker build -f ./Dockerfile-nifi -t docker.stackable.tech/demos/nifi:1.27.0-postgresql .

FROM docker.stackable.tech/stackable/nifi:1.25.0-stackable24.3.0
FROM docker.stackable.tech/stackable/nifi:1.27.0-stackable24.7.0

RUN curl --fail -o /stackable/nifi/postgresql-42.6.0.jar "https://repo.stackable.tech/repository/misc/postgresql-timescaledb/postgresql-42.6.0.jar"
4 changes: 2 additions & 2 deletions demos/signal-processing/create-nifi-ingestion-job.yaml
@@ -9,13 +9,13 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-timescale-job
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
command: ["bash", "-c", "echo 'Waiting for timescaleDB tables to be ready'
&& kubectl wait --for=condition=complete job/create-timescale-tables-job"
]
containers:
- name: create-nifi-ingestion-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "export PGPASSWORD=$(cat /timescale-admin-credentials/password) && \
curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/signal-processing/DownloadAndWriteToDB.xml && \
sed -i \"s/PLACEHOLDERPGPASSWORD/$PGPASSWORD/g\" DownloadAndWriteToDB.xml && \
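The placeholder-substitution step in the command above (read a mounted credential, splice it into the downloaded flow definition) can be sketched in Python. This is an illustrative sketch only: the file paths below are throwaway stand-ins, and only the `PLACEHOLDERPGPASSWORD` token comes from the manifest itself.

```python
from pathlib import Path

def render_flow(template_path: str, password_path: str) -> str:
    """Substitute the mounted TimescaleDB password into a NiFi flow template."""
    password = Path(password_path).read_text().strip()
    return Path(template_path).read_text().replace("PLACEHOLDERPGPASSWORD", password)

# Throwaway files standing in for the mounted secret and the downloaded XML:
Path("/tmp/pw").write_text("s3cr3t\n")
Path("/tmp/flow.xml").write_text("<password>PLACEHOLDERPGPASSWORD</password>")
print(render_flow("/tmp/flow.xml", "/tmp/pw"))
# → <password>s3cr3t</password>
```

The in-container `sed` one-liner does the same job; the advantage of the manifest's approach is that it needs no extra tooling beyond what the testing-tools image already ships.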
2 changes: 1 addition & 1 deletion demos/signal-processing/create-timescale-tables.yaml
@@ -9,7 +9,7 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-timescale
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
command: ["bash", "-c", "echo 'Waiting for timescaleDB to be ready'
&& kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/name=postgresql-timescaledb"
]
@@ -8,11 +8,11 @@ spec:
spec:
initContainers:
- name: wait-for-testdata
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "echo 'Waiting for job load-ny-taxi-data to finish' && kubectl wait --for=condition=complete --timeout=30m job/load-ny-taxi-data"]
containers:
- name: create-spark-anomaly-detection-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "echo 'Submitting Spark job' && kubectl apply -f /tmp/manifest/spark-ad-job.yaml"]
volumeMounts:
- name: manifest
@@ -37,7 +37,7 @@ data:
name: spark-ad
spec:
sparkImage:
productVersion: 3.5.0
productVersion: 3.5.1
mode: cluster
mainApplicationFile: local:///spark-scripts/spark-ad.py
deps:
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-superset
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/spark-k8s-anomaly-detection-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
2 changes: 1 addition & 1 deletion demos/trino-taxi-data/create-table-in-trino.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-ny-taxi-data-table-in-trino
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "python -u /tmp/script/script.py"]
volumeMounts:
- name: script
2 changes: 1 addition & 1 deletion demos/trino-taxi-data/setup-superset.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-superset
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/trino-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
14 changes: 7 additions & 7 deletions docs/modules/demos/pages/data-lakehouse-iceberg-trino-spark.adoc
@@ -34,7 +34,7 @@ $ stackablectl demo install data-lakehouse-iceberg-trino-spark
[#system-requirements]
== System requirements

The demo was developed and tested on a kubernetes cluster with 10 nodes (4 cores (8 threads), 20GiB RAM and 30GB HDD).
The demo was developed and tested on a Kubernetes cluster with about 12 nodes (4 cores with hyperthreading/SMT, 20GiB RAM and 30GB HDD).
Instance types that loosely correspond to this on the Hyperscalers are:

- *Google*: `e2-standard-8`
@@ -179,7 +179,7 @@ As you can see, the table `house-sales` is partitioned by year. Go ahead and cli…

image::data-lakehouse-iceberg-trino-spark/minio_5.png[]

You can see that Trino has placed a single file into the selected folder containing all the house sales of that
You can see that Trino has placed data into the selected folder containing all the house sales of that
particular year.

== NiFi
@@ -491,7 +491,7 @@ Here you can see all the available Trino catalogs.
* `tpcds`: https://trino.io/docs/current/connector/tpcds.html[TPCDS connector] providing a set of schemas to support the
http://www.tpc.org/tpcds/[TPC Benchmark™ DS]
* `tpch`: https://trino.io/docs/current/connector/tpch.html[TPCH connector] providing a set of schemas to support the
http://www.tpc.org/tpcds/[TPC Benchmark™ DS]
http://www.tpc.org/tpch/[TPC Benchmark™ H]
* `lakehouse`: The lakehouse area containing the enriched, performantly accessible data

== Superset
@@ -541,14 +541,14 @@ image::data-lakehouse-iceberg-trino-spark/superset_7.png[]
On the left, select the database `Trino lakehouse`, the schema `house_sales`, and set `See table schema` to
`house_sales`.

image::data-lakehouse-iceberg-trino-spark/superset_8.png[]

[NOTE]
[IMPORTANT]
====
This older screenshot shows how the table preview would look like. Currently, there is an https://github.com/apache/superset/issues/25307[open issue]
The older screenshot below shows what the table preview would look like. Currently, there is an https://github.com/apache/superset/issues/25307[open issue]
with previewing Trino tables using the Iceberg connector. This doesn't affect the following execution of the SQL statement.
====

image::data-lakehouse-iceberg-trino-spark/superset_8.png[]

In the right textbox, you can enter the desired SQL statement. If you want to avoid making one up, use the following:

[source,sql]
4 changes: 2 additions & 2 deletions docs/modules/demos/pages/logging.adoc
@@ -46,10 +46,10 @@ This demo will
synchronization, and providing group services. This demo makes its log data observable in OpenSearch Dashboards.
** *Vector*: A tool for building observability pipelines. This demo uses Vector as a log agent to gather and transform
the logs and as an aggregator to forward the collected logs to OpenSearch.
** *OpenSearch*: A data store and search engine. This demo uses it to store and index the of the log data.
** *OpenSearch*: A data store and search engine. This demo uses it to store and index the log data.
** *OpenSearch Dashboards*: A visualization and user interface. This demo uses it to make the log data easily accessible
to the user.
* Create a view in OpenSearch Dashboards for convenient browsing the log data.
* Create a view in OpenSearch Dashboards to conveniently browse the log data.

You can see the deployed products and their relationship in the following diagram:

@@ -23,7 +23,7 @@ This demo should not be run alongside other demos.

To run this demo, your system needs at least:

* 8 {k8s-cpu}[cpu units] (core/hyperthread)
* 10 {k8s-cpu}[cpu units] (core/hyperthread)
* 32GiB memory
* 35GiB disk storage

4 changes: 2 additions & 2 deletions stacks/_templates/jupyterhub.yaml
@@ -3,7 +3,7 @@ name: jupyterhub
repo:
name: jupyterhub
url: https://jupyterhub.github.io/helm-chart/
version: 3.2.1
version: 3.3.7
options:
hub:
config:
@@ -49,7 +49,7 @@ options:
HADOOP_CONF_DIR: "/home/jovyan/hdfs"
initContainers:
- name: download-notebook
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
command: ['sh', '-c', 'curl https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/notebook.ipynb -o /notebook/notebook.ipynb']
volumeMounts:
- mountPath: /notebook
2 changes: 1 addition & 1 deletion stacks/_templates/keycloak.yaml
@@ -48,7 +48,7 @@ spec:
- name: tls
mountPath: /tls/
- name: create-auth-class
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev # We need 0.0.0-dev, so we get kubectl
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command: ["/bin/bash", "-c"]
args:
- |
2 changes: 1 addition & 1 deletion stacks/_templates/minio-distributed-small.yaml
@@ -3,7 +3,7 @@ name: minio
repo:
name: minio
url: https://charts.min.io/
version: 5.0.14
version: 5.2.0
options:
additionalLabels:
stackable.tech/vendor: Stackable
2 changes: 1 addition & 1 deletion stacks/_templates/minio-distributed.yaml
@@ -3,7 +3,7 @@ name: minio
repo:
name: minio
url: https://charts.min.io/
version: 5.0.14
version: 5.2.0
options:
additionalLabels:
stackable.tech/vendor: Stackable
2 changes: 1 addition & 1 deletion stacks/_templates/minio.yaml
@@ -3,7 +3,7 @@ name: minio
repo:
name: minio
url: https://charts.min.io/
version: 5.0.14
version: 5.2.0
options:
additionalLabels:
stackable.tech/vendor: Stackable