v0.1.0a2
eterna2 committed Mar 1, 2020
1 parent 6d04943 commit e61ea8d
Showing 22 changed files with 672 additions and 308 deletions.
2 changes: 1 addition & 1 deletion .pylintrc
@@ -65,7 +65,7 @@ confidence=
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use "--disable=all --enable=classes
# --disable=W"
disable=import-outside-toplevel,bad-continuation,long-suffix,standarderror-builtin,indexing-exception,delslice-method,unichr-builtin,dict-view-method,parameter-unpacking,unicode-builtin,cmp-builtin,intern-builtin,round-builtin,backtick,nonzero-method,xrange-builtin,coerce-method,raw_input-builtin,old-division,filter-builtin-not-iterating,old-octal-literal,input-builtin,map-builtin-not-iterating,buffer-builtin,basestring-builtin,zip-builtin-not-iterating,using-cmp-argument,unpacking-in-except,old-raise-syntax,coerce-builtin,dict-iter-method,hex-method,range-builtin-not-iterating,useless-suppression,cmp-method,print-statement,reduce-builtin,file-builtin,long-builtin,getslice-method,execfile-builtin,no-absolute-import,metaclass-assignment,oct-method,reload-builtin,import-star-module-level,suppressed-message,apply-builtin,raising-string,next-method-called,setslice-method,old-ne-operator,arguments-differ,wildcard-import,locally-disabled
disable=line-too-long,import-outside-toplevel,bad-continuation,long-suffix,standarderror-builtin,indexing-exception,delslice-method,unichr-builtin,dict-view-method,parameter-unpacking,unicode-builtin,cmp-builtin,intern-builtin,round-builtin,backtick,nonzero-method,xrange-builtin,coerce-method,raw_input-builtin,old-division,filter-builtin-not-iterating,old-octal-literal,input-builtin,map-builtin-not-iterating,buffer-builtin,basestring-builtin,zip-builtin-not-iterating,using-cmp-argument,unpacking-in-except,old-raise-syntax,coerce-builtin,dict-iter-method,hex-method,range-builtin-not-iterating,useless-suppression,cmp-method,print-statement,reduce-builtin,file-builtin,long-builtin,getslice-method,execfile-builtin,no-absolute-import,metaclass-assignment,oct-method,reload-builtin,import-star-module-level,suppressed-message,apply-builtin,raising-string,next-method-called,setslice-method,old-ne-operator,arguments-differ,wildcard-import,locally-disabled


[REPORTS]
8 changes: 6 additions & 2 deletions Makefile
@@ -35,7 +35,7 @@ check:
pipenv run pylint kfx
pipenv run flake8
pipenv run mypy kfx
# pydocstyle
pipenv run pydocstyle
pipenv run bandit -r kfx -x *_test.py

test: check
@@ -51,4 +51,8 @@ test-ci: test-all
pipenv run coveralls

schema: force_reload
PYTHONPATH=${PWD} ./scripts/generate_schemas.py
PYTHONPATH=${PWD} ./scripts/generate_schemas.py

commit: docs requirements force_reload
git add .
git commit
1 change: 1 addition & 0 deletions Pipfile
@@ -31,6 +31,7 @@ m2r = "==0.2.*"
mistune = "==0.8.*"
twine = ">=3.*"
coveralls = "*"
recommonmark = "*"

[pipenv]
allow_prereleases = true
47 changes: 31 additions & 16 deletions Pipfile.lock

Some generated files are not rendered by default.

136 changes: 103 additions & 33 deletions README.md
@@ -8,71 +8,141 @@
[![Downloads](https://pepy.tech/badge/kfx/month)](https://pepy.tech/project/kfx/month)

`kfx` is a python package with the namespace `kfx`. Currently, it provides the
following sub-packages:
- `kfx.lib.vis`: Data models and helpers to generate ui metadata object for rendering vis in kubeflow pipeline UI.
- `kfx.lib.utils`: Helpers to extend kubeflow pipeline tasks/containerOps.
following sub-packages

> Documentation: [https://kfx.readthedocs.io](https://kfx.readthedocs.io).
> Repo: [https://github.com/e2fyi/kfx](https://github.com/e2fyi/kfx)
- `kfx.lib.dsl` - Extensions to the kubeflow pipeline dsl.

- `kfx.lib.vis` - Data models and helpers to help generate the `mlpipeline-ui-metadata.json` required to render visualization in the kubeflow pipeline UI. See [Visualize Results in the Pipelines UI](https://www.kubeflow.org/docs/pipelines/sdk/output-viewer/)

> - Documentation: [https://kfx.readthedocs.io](https://kfx.readthedocs.io).
> - Repo: [https://github.com/e2fyi/kfx](https://github.com/e2fyi/kfx)
## Quick start

Installation

```bash
pip install kfx
```

## Usage

Generating ui metadata artifacts for kubeflow pipeline UI to render visualizations.
```py
import kfx.lib.vis as kfxvis
import kfx.lib.utils as kfxutils
Example: Using `ArtifactLocationHelper` and `get_artifact_uri` to determine the
uri of your data artifact generated by the kubeflow pipeline task.

> `kfx.dsl.ArtifactLocationHelper` is a helper to modify the kubeflow pipeline task
> so that you can use `kfx.dsl.get_artifact_uri` to get the uri of your data
> artifact that will be generated inside the task.
```python
import kfp.components
import kfp.dsl
import kfx.dsl


# creates the helper that has the argo configs (tells you how artifacts will be stored)
# see https://github.com/argoproj/argo/blob/master/docs/workflow-controller-configmap.yaml
helper = kfx.dsl.ArtifactLocationHelper(
scheme="minio", bucket="mlpipeline", key_prefix="artifacts/"
)

@kfp.components.func_to_container_op
def test_op(
mlpipeline_ui_metadata_file: OutputTextFile(str), markdown_file: OutputTextFile(str)
mlpipeline_ui_metadata: OutputTextFile(str), markdown_data_file: OutputTextFile(str)
):
"A test kubeflow pipeline task."

import kfx.lib.utils as kfxutils
import kfx.lib.vis as kfxvis
import kfx.dsl
import kfx.vis

# write the markdown to the `markdown-data` artifact
markdown_data_file.write("### hello world")

# note that artifact name is `markdown` instead of `markdown_file`
# `_file` and `_path` suffix are removed.
markdown_file.write("### hello world")
markdown_src = kfxutils.get_artifact_uri("markdown")

# creates the ui metadata object
mlpipeline_ui_metadata = kfxvis.kfp_ui_metadata(
[kfxvis.markdown(source=markdown_src)]
# creates a ui metadata object
ui_metadata = kfx.vis.kfp_ui_metadata(
# Describes the vis to generate in the kubeflow pipeline UI.
# In this case, a markdown vis from a markdown artifact.
[kfx.vis.markdown(kfx.dsl.kfp_artifact("markdown_data_file"))]
# `kfp_artifact` provides the reference to data artifact created
# inside this task
)
# note that artifact name is `mlpipeline-ui-metadata` and not
# `mlpipeline_ui_metadata_file`.
mlpipeline_ui_metadata_file.write(kfxvis.asjson(mlpipeline_ui_metadata))

# prints the artifact uri that will be saved by kfp to the artifactory.
print(mlpipeline_ui_metadata.outputs[0].source)
# writes the ui metadata object as the `mlpipeline-ui-metadata` artifact
mlpipeline_ui_metadata.write(kfx.vis.asjson(ui_metadata))

# prints the uri to the markdown artifact
print(ui_metadata.outputs[0].source)

# helper to decorate the task so that `kfx.lib.utils.get_artifact_uri` can be
# used to infer the uri of the artifact.
helper = kfxutils.ArtifactLocationHelper(
scheme="minio", bucket="mlpipeline", key_prefix="artifacts/"
)

@kfp.dsl.pipeline()
def test_pipeline():
"""Test pipeline."""
"A test kubeflow pipeline"

op: kfp.dsl.ContainerOp = test_op()
# setup the required image and env vars, so that `kfx.lib.utils.get_artifact_uri`
# can be used to infer artifact uri.

# modify kfp operator with artifact location metadata through env vars
op.apply(helper.set_envs())

```
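
A minimal sketch of how the pipeline above could then be compiled or submitted with the standard `kfp` SDK. The output path and client host below are placeholders, and the exact `kfp` API may vary across SDK versions:

```python
import kfp
import kfp.compiler

# compile the decorated pipeline above into a workflow spec (output path is arbitrary)
kfp.compiler.Compiler().compile(test_pipeline, "test_pipeline.yaml")

# or submit it directly to a kubeflow pipelines deployment
# (the host below is a placeholder for your own endpoint)
client = kfp.Client(host="http://localhost:8080")
client.create_run_from_pipeline_func(test_pipeline, arguments={})
```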

Example: Using [pydantic](https://pydantic-docs.helpmanual.io/) data models to generate
[`mlpipeline_ui_metadata`](https://www.kubeflow.org/docs/pipelines/sdk/output-viewer/).

> `kfx.vis` has helper functions (with corresponding hints) to describe and create a
> [`mlpipeline_ui_metadata.json`](https://www.kubeflow.org/docs/pipelines/sdk/output-viewer/)
> file (required by kubeflow pipeline UI to render any visualizations).
```python
import kfp.components
import kfx.vis

# `OutputTextFile` comes from the kfp SDK and is used as the output-file annotation below
from kfp.components import OutputTextFile
from kfx.vis.enums import KfpStorage


@kfp.components.func_to_container_op
def some_op(mlpipeline_ui_metadata: OutputTextFile(str)):
    "kfp operator that provides metadata for visualizations."

    # use a local name different from the `mlpipeline_ui_metadata` output file to avoid shadowing it
    ui_metadata = kfx.vis.kfp_ui_metadata(
        [
            # creates a confusion matrix vis
            kfx.vis.confusion_matrix(
                source="gs://your_project/your_bucket/your_cm_file",
                labels=["True", "False"],
            ),
            # creates a markdown with inline source
            kfx.vis.markdown(
                "# Inline Markdown: [A link](https://www.kubeflow.org/)",
                storage="inline",
            ),
            # creates a markdown with a remote source
            kfx.vis.markdown(
                "gs://your_project/your_bucket/your_markdown_file",
            ),
            # creates a ROC curve with a remote source
            kfx.vis.roc(
                "gs://your_project/your_bucket/your_roc_file",
            ),
            # creates a Table with a remote source
            kfx.vis.table(
                "gs://your_project/your_bucket/your_csv_file",
                header=["col1", "col2"],
            ),
            # creates a tensorboard viewer
            kfx.vis.tensorboard(
                "gs://your_project/your_bucket/logs/*",
            ),
            # creates a custom web app from a remote html file
            kfx.vis.web_app(
                "gs://your_project/your_bucket/your_html_file",
            ),
        ]
    )

    # write ui metadata so that kubeflow pipelines UI can render visualizations
    mlpipeline_ui_metadata.write(kfx.vis.asjson(ui_metadata))
```
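
For reference, the serialized `mlpipeline-ui-metadata` artifact is a JSON document following the kubeflow pipelines [output-viewer](https://www.kubeflow.org/docs/pipelines/sdk/output-viewer/) schema. The sketch below shows the rough shape for the inline markdown entry only, written as a Python dict; the exact fields emitted by `kfx.vis.asjson` may differ:

```python
# Illustrative shape of the ui metadata payload for the inline markdown vis above
# (per the kubeflow pipelines output-viewer docs; not the exact kfx output).
expected_shape = {
    "outputs": [
        {
            "type": "markdown",
            "storage": "inline",
            "source": "# Inline Markdown: [A link](https://www.kubeflow.org/)",
        }
    ]
}
```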

## Developer guide

This project used:
1 change: 0 additions & 1 deletion kfx/components/__init__.py

This file was deleted.

59 changes: 58 additions & 1 deletion kfx/dsl/__init__.py
@@ -1 +1,58 @@
"""Extension to kfp dsl."""
"""Extension to kfp dsl.
::
import kfp.components
import kfp.dsl
import kfx.dsl
helper = kfx.dsl.ArtifactLocationHelper(
scheme="minio", bucket="mlpipeline", key_prefix="artifacts/"
)
@kfp.components.func_to_container_op
def test_op(
mlpipeline_ui_metadata: OutputTextFile(str), markdown_data_file: OutputTextFile(str)
):
"A test kubeflow pipeline task."
import kfx.dsl
import kfx.vis
# write the markdown to the `markdown-data` artifact
markdown_data_file.write("### hello world")
# creates an ui metadata object
ui_metadata = kfx.vis.kfp_ui_metadata(
# Describes the vis to generate in the kubeflow pipeline UI.
# In this case, a markdown vis from a markdown artifact.
[kfx.vis.markdown(kfx.dsl.kfp_artifact("markdown_data_file"))]
# `kfp_artifact` provides the reference to data artifact created
# inside this task
)
# writes the ui metadata object as the `mlpipeline-ui-metadata` artifact
mlpipeline_ui_metadata.write(kfx.vis.asjson(ui_metadata))
# prints the uri to the markdown artifact
print(ui_metadata.outputs[0].source)
@kfp.dsl.pipeline()
def test_pipeline():
"A test kubeflow pipeline"
op: kfp.dsl.ContainerOp = test_op()
# modify kfp operator with artifact location metadata through env vars
op.apply(helper.set_envs())
"""
from kfx.dsl._artifact_location import (
WorkflowVars,
ArtifactLocationHelper,
kfp_artifact,
set_workflow_env,
set_pod_metadata_envs,
)