
Releases: elyra-ai/elyra

v3.10.0

07 Jul 22:40


New feature highlights

Pipeline editor: mount shared volumes in custom pipeline nodes

Pipeline nodes that are implemented using custom components or generic components can now be configured to utilize data volume mounts. Take advantage of volume mounts if two or more pipeline nodes need to efficiently share data. Data volume mounts can be configured as pipeline defaults (applying to all generic and custom nodes) or for individual nodes.
Volume mounts are only supported by the Kubeflow Pipelines and Apache Airflow runtimes.
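In both runtimes, a data volume mount pairs a container mount path with a persistent volume claim (PVC) that must already exist in the cluster. As a sketch, a PVC backing a shared volume might look like the following; the name, size, and access mode are illustrative, and whether ReadWriteMany (needed if nodes run concurrently) is available depends on your storage class:

```yaml
# Illustrative PVC that a pipeline's data volume mounts could reference.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: shared-pipeline-data
spec:
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 1Gi
```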


URL and Airflow catalog connectors: support user credentials

Elyra uses catalog connectors to make custom components available to the Visual Pipeline Editor. The catalog connectors for the URL component catalog, the Apache Airflow package catalog, and the Apache Airflow provider package catalog were extended to allow for input of user credentials to support access to secured resources.


What's Changed

New Features

Bug Fixes

  • Fix for Script Editor console does not properly highlight error messages by @VNA818-RPI in #2800
  • Update docs to reflect proper generic op output file usage by @akchinSTC in #2798
  • Fix invalid repository URLs in 'Elyra in an air gapped environment' topic by @ptitzler in #2810
  • Fix Airflow operation processing for number data types by @kiersten-stokes in #2815
  • Remove final instance of BashOperator from tests by @kiersten-stokes in #2817
  • Optionally search for operators in airflow.contrib.operators package by @ptitzler in #2819
  • Fix version issue in release script by @ptitzler in #2824

Other

New Contributors

Full Changelog: v3.9.1...v3.10.0

v3.9.1

15 Jun 13:43


What's Changed

Bug Fixes

Other

Full Changelog: v3.9.0...v3.9.1

v3.9.0

03 Jun 22:11


New feature highlights

Access sensitive information in generic pipeline nodes

Jupyter notebooks, Python scripts, or R scripts might require access to resources that are protected using sensitive information, such as an API key or a user ID and password. If you are running pipelines on Kubeflow Pipelines or Apache Airflow, you can take advantage of Kubernetes secrets that are defined in your cluster. Starting with version 3.9, you can configure pipelines to expose these secrets as environment variables, which notebooks or scripts can access.

Using secrets
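As a sketch of how a notebook or script consumes such a secret: the secret and variable names below are hypothetical, and the mapping from secret entry to environment variable is configured in the pipeline's node properties rather than in code.

```python
import os

# The secret would be created in the cluster ahead of time, e.g.:
#   kubectl create secret generic api-credentials --from-literal=API_KEY=s3cr3t
# The pipeline configuration then exposes the secret's API_KEY entry to the
# generic node as an environment variable of the same (hypothetical) name.

def read_api_key(env_var: str = "API_KEY") -> str:
    """Return a credential exposed to this node as an environment variable."""
    value = os.environ.get(env_var)
    if value is None:
        raise RuntimeError(
            f"{env_var} is not set; check the node's Kubernetes secret configuration"
        )
    return value
```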

Pipeline CLI: identify pipeline dependencies

The elyra-pipeline describe CLI command output now includes information about the following dependencies for nodes that utilize generic components: container images, data volumes, and Kubernetes secrets. The machine-readable output (produced when the --json option is specified) is most commonly used to automate processes, such as impact analysis and dependency checking. In the example below the output of the command is piped to the jq command-line processor, which extracts information about the container images that the pipeline's notebooks or scripts are executed in:

$ elyra-pipeline describe --json my.pipeline | jq '.dependencies.container_images[]'
"tensorflow/tensorflow:2.8.0"

This information could be used to identify pipelines that use a specific container image version or to verify that the container images are available in a specific container registry.
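The same extraction can be scripted without jq, for example in Python. The sample payload below is trimmed to the fields used here: `dependencies.container_images` follows the jq filter shown above, while the other dependency key names are assumptions for illustration.

```python
import json

# Illustrative, trimmed `elyra-pipeline describe --json` output.
# Only dependencies.container_images is taken from the example above;
# the other keys are assumed for illustration.
describe_output = """
{
  "dependencies": {
    "container_images": ["tensorflow/tensorflow:2.8.0"],
    "volumes": [],
    "kubernetes_secrets": []
  }
}
"""

def container_images(describe_json: str) -> list:
    """Return the container images the pipeline's notebooks/scripts run in."""
    return json.loads(describe_json)["dependencies"]["container_images"]
```

A CI job could, for instance, check every returned image against an approved registry before promoting the pipeline.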

Create code snippets from notebook cells

Create a code snippet by selecting one or more cells in a Jupyter notebook.


Documentation: Running Elyra in an air gapped environment

The new documentation topic covers considerations for running Elyra in an air gapped environment.

What's Changed

New Features

Bug Fixes

Other

New Contributors

Full Changelog: v3.8.1...v3.9.0

v3.8.1

25 May 14:52


What's Changed

Other

Full Changelog: v3.8.0...v3.8.1

v3.8.0

03 May 14:24


New feature highlights

Pipeline editor: define pipeline defaults for runtime images and environment variables

Pipeline nodes that are implemented using generic components (those being used to run Jupyter notebooks, Python scripts, or R scripts) are configured using properties. These properties define, for example, the container image to be used as the execution environment. In previous releases it was required to explicitly associate each pipeline node with a container image. Elyra 3.8+ allows for the selection of pipeline defaults that are applied to all applicable nodes and can optionally be overridden for individual nodes. Two such defaults are the runtime image and environment variables, reducing the number of steps required to configure nodes.


Tip: Hover over a node in the canvas to view a summary of its properties.


Pipeline editor: mount shared volumes in generic pipeline nodes

Pipeline nodes that are implemented using generic components can now be configured to utilize data volume mounts. Take advantage of volume mounts if two or more pipeline nodes need to efficiently share data. Data volume mounts can be configured as pipeline defaults (applying to all generic nodes) or for individual nodes.
Volume mounts are only supported by the Kubeflow Pipelines and Apache Airflow runtimes.


Pipeline editor: organize generic pipeline inputs and outputs on object storage

Pipeline nodes that are implemented using generic components (those being used to run Jupyter notebooks, Python scripts, or R scripts) utilize object storage buckets as storage for input artifacts, such as Jupyter notebooks or scripts, and output artifacts, like completed notebooks or data files. The new object storage path prefix pipeline property enables you to designate a custom location where those artifacts are stored.


Kubeflow Pipelines with Argo: support emissary workflow executor

Elyra now supports Kubeflow Pipelines installations that are configured to use the emissary executor as Argo workflow executor. There is no need to enable or configure anything in Elyra. Refer to the updated requirements for custom pipeline components for important information.

Kubeflow Pipelines runtime configurations: public API endpoint

Kubeflow Pipelines runtime configurations have been extended to allow for the optional configuration of a public API endpoint. You should configure this endpoint if your Kubeflow Pipelines authentication type is configured as KUBERNETES_SERVICE_ACCOUNT_TOKEN. If configured, Elyra uses this URL (instead of the API URL) to generate links that provide access to the Kubeflow Central Dashboard.


Metadata CLI: import metadata

The elyra-metadata CLI now supports import of runtime configurations, runtime images, code snippets, and component catalogs. Run elyra-metadata import --help to learn more or check out the documentation.

What's Changed

New Features

Bug Fixes

Other


v3.7.0

31 Mar 14:10


New feature highlights

JupyterLab 3.3 (including settings editor)

Elyra v3.7 is the first release that takes advantage of JupyterLab 3.3. One of the highlights is the settings editor, which enables you to configure JupyterLab (and installed extensions). To customize the behavior of the Visual Pipeline Editor, click the settings link on the canvas or choose 'Settings' > 'Advanced Settings Editor' from the main menu.


Search for elyra to see the complete list of customization options. There's only one in version 3.7: defining what happens when you double-click a pipeline node.


Pipeline editor: view component definitions

The pipeline editor now allows for viewing of custom component definitions. Select a node that utilizes a custom component, open the context menu, and select 'Open Component Definition':


The component's source (YAML for Kubeflow Pipelines components, Python source code for Apache Airflow operators) is rendered using the new generic elyra-code-viewer-extension extension.


Note: editing of custom component definitions is not supported.

Pipeline editor: refresh the component palette on demand

The pipeline editor's palette displays components that are stored in local or remote component catalogs. Since the content of these catalogs can change at any time, Elyra now includes refresh buttons to allow for complete palette reload (1) or partial palette reload (2) in the 'Component Catalogs' panel.


Metadata CLI: export runtimes, runtime images, code snippets, and component catalogs

Elyra stores information about runtimes, runtime images, code snippets, and component catalogs in metadata files. You can export this metadata (e.g. to create a backup) to a local directory using the new export command of the elyra-metadata CLI. To learn more, run elyra-metadata export -h or take a look at this example.

Support for metadata import will be added in a future release. (#2500)

Metadata CLI: install command replaced by create and update

The elyra-metadata CLI supports two new commands, elyra-metadata create (to add a new runtime, code snippet, runtime image, etc) and elyra-metadata update (to update an existing runtime, code snippet, runtime image, etc). These commands replace the elyra-metadata install command.
If you are currently using the legacy command in your automation scripts or custom Dockerfiles, please migrate by replacing elyra-metadata install with elyra-metadata create and elyra-metadata install --replace with elyra-metadata update. The command parameters have not changed. To learn more about the two new commands, run elyra-metadata create --help or elyra-metadata update --help, respectively.
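When migrating scripts mechanically, the `--replace` form must be rewritten before the plain form so that `elyra-metadata install --replace` is not matched prematurely as a plain `install`. A minimal sketch of that rewrite (the function and file names are illustrative):

```python
def migrate_commands(script: str) -> str:
    """Rewrite legacy `elyra-metadata install` invocations to create/update."""
    # Order matters: handle the --replace form before the plain form.
    script = script.replace(
        "elyra-metadata install --replace", "elyra-metadata update"
    )
    return script.replace("elyra-metadata install", "elyra-metadata create")
```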

Breaking change in Elyra 4.0: The elyra-metadata install command is no longer supported. (#2580)

Pipeline CLI: export pipelines to runtime-specific format

Elyra uses a proprietary JSON format to store pipeline files, which cannot be used to natively run pipelines on Kubeflow Pipelines or Apache Airflow. Prior to version 3.7, only the pipeline editor supported exporting Elyra pipeline files to Kubeflow Pipelines and Apache Airflow native formats. The new elyra-pipeline export command extends this capability to the CLI. To learn more, run elyra-pipeline export --help or take a look at this example.

Pipeline CLI: monitor the status of Elyra pipelines running on Kubeflow Pipelines

The elyra-pipeline submit command was extended to allow for monitoring of the submitted pipeline on Kubeflow Pipelines. To learn more, run elyra-pipeline submit --help or take a look at this example.

Prebuilt extensions

The Elyra extensions are now published as prebuilt extensions. This eliminates the need to rebuild JupyterLab after running pip install, reducing the installation time. (Installation doc)

Refreshed system-owned runtime image configurations

Elyra utilizes runtime images to run pipeline nodes on container-based platforms, such as Kubeflow Pipelines or Apache Airflow. Every Elyra installation has a small set of runtime images pre-configured, which are system-owned and cannot be removed. Many of those images were outdated and have been replaced with more current versions:


Breaking change in Elyra 4.0: Elyra installations will no longer pre-configure system-owned runtime images. (#2476)

What's Changed

New Features

Bug Fixes


v3.6.0

09 Feb 22:40

New feature highlights

Duplicate runtime configurations, runtime images, code snippets, and more

Quickly duplicate existing runtime configurations, runtime images, code snippets, and component catalogs with a single click in the JupyterLab GUI:

Duplicate metadata instances in the GUI

Add Airflow operators from Airflow built distributions to the pipeline editor

Out of the box, the visual pipeline editor for Apache Airflow pipelines only includes operators that allow for the execution of Jupyter notebooks, Python scripts, and R scripts. Starting with this release of Elyra you can add operators from your Apache Airflow built distribution to the palette:

  • Create a catalog connector for Apache Airflow.
  • Configure the connector to use the download link for the Apache Airflow built distribution that you have installed in your Airflow cluster.
  • Open the pipeline editor and start using the imported operators.

Add Apache Airflow built-in operators to the pipeline editor palette

Resources:

Add Airflow operators from Airflow provider packages to the pipeline editor

Starting with this release of Elyra you can add operators from provider packages to the palette:

  • Create a provider package catalog connector for Apache Airflow.
  • Configure the connector to use the download link for the Apache Airflow provider package that you have installed in your Airflow cluster.
  • Open the pipeline editor and start using the imported operators.

Add Apache Airflow provider package operators to the pipeline editor palette

Resources:

What's Changed

Full Changelog: v3.5.0...v3.6.0

v3.5.3

14 Feb 17:16

New feature highlights

Support GitLab as DAG repository for Apache Airflow

Elyra now supports GitLab as a DAG repository in Apache Airflow runtime configurations. This new feature is only enabled if the optional gitlab dependency is installed.

Configure GitLab as DAG repository

Resources:

Attach pipeline node comments to generated pipelines

Comment nodes provide you with the ability to add embedded documentation to pipelines. Starting with this release these comments are passed through to the target runtime environment.

Kubeflow Pipelines

For Kubeflow Pipelines, node comments are attached to the Kubernetes pods as elyra/node-user-doc annotations:

Pipeline node comments are embedded in the generated
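Given a pod's manifest (e.g. fetched via the Kubernetes API), the comment can be read back from that annotation. A minimal sketch; the manifest shape is standard Kubernetes, and only the annotation key comes from this release note:

```python
def node_comment(pod_manifest: dict):
    """Return the Elyra node comment attached to a pipeline pod, or None."""
    annotations = pod_manifest.get("metadata", {}).get("annotations", {})
    return annotations.get("elyra/node-user-doc")
```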


Tip: If your pipeline includes a description, it is rendered in the Kubeflow Pipelines Dashboard when you open the pipeline:

Pipeline descriptions are rendered in the Kubeflow Pipelines GUI

Apache Airflow

For Apache Airflow, node comments are attached to the task instance and can be accessed in the task details view:

Pipeline node comments are embedded in the generated DAG

Tip: If your pipeline includes a description, it is rendered in the Apache Airflow GUI when you open the DAG:

Pipeline descriptions are rendered in the Airflow GUI