Commit f978586 (parent 1539585), committed Mar 5, 2020

Update version to 0.14.0

23 files changed: +53 −58 lines

README.md (+6 −10)

````diff
@@ -4,10 +4,6 @@ Cortex is an open source platform for deploying machine learning models as produ
 
 <br>
 
-<!-- Delete on release branches -->
-<!-- CORTEX_VERSION_README_MINOR -->
-[install](https://cortex.dev/install)[tutorial](https://cortex.dev/iris-classifier)[docs](https://cortex.dev)[examples](https://github.com/cortexlabs/cortex/tree/0.13/examples)[we're hiring](https://angel.co/cortex-labs-inc/jobs)[email us](mailto:hello@cortex.dev)[chat with us](https://gitter.im/cortexlabs/cortex)<br><br>
-
 <!-- Set header Cache-Control=no-cache on the S3 object metadata (see https://help.github.com/en/articles/about-anonymized-image-urls) -->
 ![Demo](https://d1zqebknpdh033.cloudfront.net/demo/gif/v0.13_2.gif)
 
@@ -33,7 +29,7 @@ Cortex is designed to be self-hosted on any AWS account. You can spin up a clust
 <!-- CORTEX_VERSION_README_MINOR -->
 ```bash
 # install the CLI on your machine
-$ bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/0.13/get-cli.sh)"
+$ bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/0.14/get-cli.sh)"
 
 # provision infrastructure on AWS and spin up a cluster
 $ cortex cluster up
@@ -140,8 +136,8 @@ The CLI sends configuration and code to the cluster every time you run `cortex d
 ## Examples of Cortex deployments
 
 <!-- CORTEX_VERSION_README_MINOR x5 -->
-* [Sentiment analysis](https://github.com/cortexlabs/cortex/tree/0.13/examples/tensorflow/sentiment-analyzer): deploy a BERT model for sentiment analysis.
-* [Image classification](https://github.com/cortexlabs/cortex/tree/0.13/examples/tensorflow/image-classifier): deploy an Inception model to classify images.
-* [Search completion](https://github.com/cortexlabs/cortex/tree/0.13/examples/pytorch/search-completer): deploy Facebook's RoBERTa model to complete search terms.
-* [Text generation](https://github.com/cortexlabs/cortex/tree/0.13/examples/pytorch/text-generator): deploy Hugging Face's DistilGPT2 model to generate text.
-* [Iris classification](https://github.com/cortexlabs/cortex/tree/0.13/examples/sklearn/iris-classifier): deploy a scikit-learn model to classify iris flowers.
+* [Sentiment analysis](https://github.com/cortexlabs/cortex/tree/0.14/examples/tensorflow/sentiment-analyzer): deploy a BERT model for sentiment analysis.
+* [Image classification](https://github.com/cortexlabs/cortex/tree/0.14/examples/tensorflow/image-classifier): deploy an Inception model to classify images.
+* [Search completion](https://github.com/cortexlabs/cortex/tree/0.14/examples/pytorch/search-completer): deploy Facebook's RoBERTa model to complete search terms.
+* [Text generation](https://github.com/cortexlabs/cortex/tree/0.14/examples/pytorch/text-generator): deploy Hugging Face's DistilGPT2 model to generate text.
+* [Iris classification](https://github.com/cortexlabs/cortex/tree/0.14/examples/sklearn/iris-classifier): deploy a scikit-learn model to classify iris flowers.
````
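The diffs in this commit are a mechanical `master` → `0.14.0`/`0.14` substitution across constants, Docker image tags, and branch-pinned docs links. A hypothetical sketch of that kind of scripted rewrite (illustrative only, not Cortex's actual release tooling):

```python
# Hypothetical sketch of the find-and-replace behind a version-bump
# commit like this one (not the project's actual release script).
import re

VERSION = "0.14.0"  # full version: image tags and version constants
MINOR = "0.14"      # minor version: branch-pinned GitHub/docs links

def bump(line: str) -> str:
    # shell/Python version constants, e.g. CORTEX_VERSION=master
    line = line.replace("CORTEX_VERSION=master", f"CORTEX_VERSION={VERSION}")
    # docker image tags, e.g. cortexlabs/operator:master
    line = re.sub(r"(cortexlabs/[a-z-]+):master", rf"\1:{VERSION}", line)
    # GitHub links pinned to a branch, e.g. .../cortex/tree/master/...
    line = line.replace("cortexlabs/cortex/tree/master",
                        f"cortexlabs/cortex/tree/{MINOR}")
    return line

print(bump("image_operator: cortexlabs/operator:master"))
# -> image_operator: cortexlabs/operator:0.14.0
```

The `CORTEX_VERSION` HTML comments visible throughout the diffs (`<!-- CORTEX_VERSION_MINOR -->` and friends) mark exactly the lines such a script needs to touch.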

build/build-image.sh (+1 −1)

````diff
@@ -19,7 +19,7 @@ set -euo pipefail
 
 ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")"/.. >/dev/null && pwd)"
 
-CORTEX_VERSION=master
+CORTEX_VERSION=0.14.0
 
 dir=$1
 image=$2
````

build/cli.sh (+1 −1)

````diff
@@ -19,7 +19,7 @@ set -euo pipefail
 
 ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")"/.. >/dev/null && pwd)"
 
-CORTEX_VERSION=master
+CORTEX_VERSION=0.14.0
 
 arg1=${1:-""}
 upload="false"
````

build/push-image.sh (+1 −1)

````diff
@@ -17,7 +17,7 @@
 
 set -euo pipefail
 
-CORTEX_VERSION=master
+CORTEX_VERSION=0.14.0
 
 image=$1
 
````
docs/cluster-management/config.md (+21 −21)

````diff
@@ -43,28 +43,28 @@ instance_volume_size: 50
 log_group: cortex
 
 # whether to use spot instances in the cluster (default: false)
-# see https://cortex.dev/v/master/cluster-management/spot-instances for additional details on spot configuration
+# see https://cortex.dev/v/0.14/cluster-management/spot-instances for additional details on spot configuration
 spot: false
 
 # docker image paths
-image_python_serve: cortexlabs/python-serve:master
-image_python_serve_gpu: cortexlabs/python-serve-gpu:master
-image_tf_serve: cortexlabs/tf-serve:master
-image_tf_serve_gpu: cortexlabs/tf-serve-gpu:master
-image_tf_api: cortexlabs/tf-api:master
-image_onnx_serve: cortexlabs/onnx-serve:master
-image_onnx_serve_gpu: cortexlabs/onnx-serve-gpu:master
-image_operator: cortexlabs/operator:master
-image_manager: cortexlabs/manager:master
-image_downloader: cortexlabs/downloader:master
-image_request_monitor: cortexlabs/request-monitor:master
-image_cluster_autoscaler: cortexlabs/cluster-autoscaler:master
-image_metrics_server: cortexlabs/metrics-server:master
-image_nvidia: cortexlabs/nvidia:master
-image_fluentd: cortexlabs/fluentd:master
-image_statsd: cortexlabs/statsd:master
-image_istio_proxy: cortexlabs/istio-proxy:master
-image_istio_pilot: cortexlabs/istio-pilot:master
-image_istio_citadel: cortexlabs/istio-citadel:master
-image_istio_galley: cortexlabs/istio-galley:master
+image_python_serve: cortexlabs/python-serve:0.14.0
+image_python_serve_gpu: cortexlabs/python-serve-gpu:0.14.0
+image_tf_serve: cortexlabs/tf-serve:0.14.0
+image_tf_serve_gpu: cortexlabs/tf-serve-gpu:0.14.0
+image_tf_api: cortexlabs/tf-api:0.14.0
+image_onnx_serve: cortexlabs/onnx-serve:0.14.0
+image_onnx_serve_gpu: cortexlabs/onnx-serve-gpu:0.14.0
+image_operator: cortexlabs/operator:0.14.0
+image_manager: cortexlabs/manager:0.14.0
+image_downloader: cortexlabs/downloader:0.14.0
+image_request_monitor: cortexlabs/request-monitor:0.14.0
+image_cluster_autoscaler: cortexlabs/cluster-autoscaler:0.14.0
+image_metrics_server: cortexlabs/metrics-server:0.14.0
+image_nvidia: cortexlabs/nvidia:0.14.0
+image_fluentd: cortexlabs/fluentd:0.14.0
+image_statsd: cortexlabs/statsd:0.14.0
+image_istio_proxy: cortexlabs/istio-proxy:0.14.0
+image_istio_pilot: cortexlabs/istio-pilot:0.14.0
+image_istio_citadel: cortexlabs/istio-citadel:0.14.0
+image_istio_galley: cortexlabs/istio-galley:0.14.0
 ```
````

docs/cluster-management/install.md (+2 −2)

````diff
@@ -12,7 +12,7 @@ See [cluster configuration](config.md) to learn how you can customize your clust
 <!-- CORTEX_VERSION_MINOR -->
 ```bash
 # install the CLI on your machine
-$ bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/master/get-cli.sh)"
+$ bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/0.14/get-cli.sh)"
 
 # provision infrastructure on AWS and spin up a cluster
 $ cortex cluster up
@@ -38,7 +38,7 @@ your cluster is ready!
 
 ```bash
 # clone the Cortex repository
-git clone -b master https://github.com/cortexlabs/cortex.git
+git clone -b 0.14 https://github.com/cortexlabs/cortex.git
 
 # navigate to the TensorFlow iris classification example
 cd cortex/examples/tensorflow/iris-classifier
````

docs/cluster-management/update.md (+1 −1)

````diff
@@ -22,7 +22,7 @@ cortex cluster update
 cortex cluster down
 
 # update your CLI
-bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/master/get-cli.sh)"
+bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/0.14/get-cli.sh)"
 
 # confirm version
 cortex version
````

docs/deployments/onnx.md (+2 −2)

````diff
@@ -67,7 +67,7 @@ You can log information about each request by adding a `?debug=true` parameter t
 An ONNX Predictor is a Python class that describes how to serve your ONNX model to make predictions.
 
 <!-- CORTEX_VERSION_MINOR -->
-Cortex provides an `onnx_client` and a config object to initialize your implementation of the ONNX Predictor class. The `onnx_client` is an instance of [ONNXClient](https://github.com/cortexlabs/cortex/tree/master/pkg/workloads/cortex/lib/client/onnx.py) that manages an ONNX Runtime session and helps make predictions using your model. Once your implementation of the ONNX Predictor class has been initialized, the replica is available to serve requests. Upon receiving a request, your implementation's `predict()` function is called with the JSON payload and is responsible for returning a prediction or batch of predictions. Your `predict()` function should call `onnx_client.predict()` to make an inference against your exported ONNX model. Preprocessing of the JSON payload and postprocessing of predictions can be implemented in your `predict()` function as well.
+Cortex provides an `onnx_client` and a config object to initialize your implementation of the ONNX Predictor class. The `onnx_client` is an instance of [ONNXClient](https://github.com/cortexlabs/cortex/tree/0.14/pkg/workloads/cortex/lib/client/onnx.py) that manages an ONNX Runtime session and helps make predictions using your model. Once your implementation of the ONNX Predictor class has been initialized, the replica is available to serve requests. Upon receiving a request, your implementation's `predict()` function is called with the JSON payload and is responsible for returning a prediction or batch of predictions. Your `predict()` function should call `onnx_client.predict()` to make an inference against your exported ONNX model. Preprocessing of the JSON payload and postprocessing of predictions can be implemented in your `predict()` function as well.
 
 ## Implementation
 
@@ -133,6 +133,6 @@ requests==2.22.0
 ```
 
 <!-- CORTEX_VERSION_MINOR x2 -->
-The pre-installed system packages are listed in the [onnx-serve Dockerfile](https://github.com/cortexlabs/cortex/tree/master/images/onnx-serve/Dockerfile) (for CPU) or the [onnx-serve-gpu Dockerfile](https://github.com/cortexlabs/cortex/tree/master/images/onnx-serve-gpu/Dockerfile) (for GPU).
+The pre-installed system packages are listed in the [onnx-serve Dockerfile](https://github.com/cortexlabs/cortex/tree/0.14/images/onnx-serve/Dockerfile) (for CPU) or the [onnx-serve-gpu Dockerfile](https://github.com/cortexlabs/cortex/tree/0.14/images/onnx-serve-gpu/Dockerfile) (for GPU).
 
 If your application requires additional dependencies, you can [install additional Python packages](../dependency-management/python-packages.md) or [install additional system packages](../dependency-management/system-packages.md).
````
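The ONNX Predictor interface described in this doc can be sketched as below. The class shape (`__init__` receiving `onnx_client` and `config`, `predict()` receiving the JSON payload and calling `onnx_client.predict()`) follows the doc text; the `"input"` payload key and the label lookup are illustrative assumptions:

```python
# Minimal sketch of an ONNX Predictor implementation (illustrative).
class ONNXPredictor:
    def __init__(self, onnx_client, config):
        self.client = onnx_client              # wraps an ONNX Runtime session
        self.labels = config.get("labels", [])

    def predict(self, payload):
        model_input = [payload["input"]]               # preprocess JSON payload
        prediction = self.client.predict(model_input)  # inference via ONNX Runtime
        index = int(prediction[0])                     # postprocess model output
        return self.labels[index] if self.labels else index
```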

docs/deployments/python.md (+1 −1)

````diff
@@ -171,6 +171,6 @@ xgboost==0.90
 ```
 
 <!-- CORTEX_VERSION_MINOR x2 -->
-The pre-installed system packages are listed in the [python-serve Dockerfile](https://github.com/cortexlabs/cortex/tree/master/images/python-serve/Dockerfile) (for CPU) or the [python-serve-gpu Dockerfile](https://github.com/cortexlabs/cortex/tree/master/images/python-serve-gpu/Dockerfile) (for GPU).
+The pre-installed system packages are listed in the [python-serve Dockerfile](https://github.com/cortexlabs/cortex/tree/0.14/images/python-serve/Dockerfile) (for CPU) or the [python-serve-gpu Dockerfile](https://github.com/cortexlabs/cortex/tree/0.14/images/python-serve-gpu/Dockerfile) (for GPU).
 
 If your application requires additional dependencies, you can [install additional Python packages](../dependency-management/python-packages.md) or [install additional system packages](../dependency-management/system-packages.md).
````

docs/deployments/tensorflow.md (+2 −2)

````diff
@@ -68,7 +68,7 @@ You can log information about each request by adding a `?debug=true` parameter t
 A TensorFlow Predictor is a Python class that describes how to serve your TensorFlow model to make predictions.
 
 <!-- CORTEX_VERSION_MINOR -->
-Cortex provides a `tensorflow_client` and a config object to initialize your implementation of the TensorFlow Predictor class. The `tensorflow_client` is an instance of [TensorFlowClient](https://github.com/cortexlabs/cortex/tree/master/pkg/workloads/cortex/lib/client/tensorflow.py) that manages a connection to a TensorFlow Serving container via gRPC to make predictions using your model. Once your implementation of the TensorFlow Predictor class has been initialized, the replica is available to serve requests. Upon receiving a request, your implementation's `predict()` function is called with the JSON payload and is responsible for returning a prediction or batch of predictions. Your `predict()` function should call `tensorflow_client.predict()` to make an inference against your exported TensorFlow model. Preprocessing of the JSON payload and postprocessing of predictions can be implemented in your `predict()` function as well.
+Cortex provides a `tensorflow_client` and a config object to initialize your implementation of the TensorFlow Predictor class. The `tensorflow_client` is an instance of [TensorFlowClient](https://github.com/cortexlabs/cortex/tree/0.14/pkg/workloads/cortex/lib/client/tensorflow.py) that manages a connection to a TensorFlow Serving container via gRPC to make predictions using your model. Once your implementation of the TensorFlow Predictor class has been initialized, the replica is available to serve requests. Upon receiving a request, your implementation's `predict()` function is called with the JSON payload and is responsible for returning a prediction or batch of predictions. Your `predict()` function should call `tensorflow_client.predict()` to make an inference against your exported TensorFlow model. Preprocessing of the JSON payload and postprocessing of predictions can be implemented in your `predict()` function as well.
 
 ## Implementation
 
@@ -128,6 +128,6 @@ tensorflow==2.1.0
 ```
 
 <!-- CORTEX_VERSION_MINOR -->
-The pre-installed system packages are listed in the [tf-api Dockerfile](https://github.com/cortexlabs/cortex/tree/master/images/tf-api/Dockerfile).
+The pre-installed system packages are listed in the [tf-api Dockerfile](https://github.com/cortexlabs/cortex/tree/0.14/images/tf-api/Dockerfile).
 
 If your application requires additional dependencies, you can [install additional Python packages](../dependency-management/python-packages.md) or [install additional system packages](../dependency-management/system-packages.md).
````
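The TensorFlow Predictor interface this doc describes can be sketched as below; `tensorflow_client.predict()` forwards inference to a TensorFlow Serving container over gRPC. The `"text"` payload key, `"score"` output field, and threshold are illustrative assumptions, not part of the documented API:

```python
# Minimal sketch of a TensorFlow Predictor implementation (illustrative).
class TensorFlowPredictor:
    def __init__(self, tensorflow_client, config):
        self.client = tensorflow_client                # gRPC proxy to TF Serving
        self.threshold = config.get("threshold", 0.5)

    def predict(self, payload):
        result = self.client.predict(payload["text"])  # inference via TF Serving
        score = float(result["score"])                 # postprocess model output
        return "positive" if score > self.threshold else "negative"
```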

docs/packaging-models/tensorflow.md (+1 −1)

````diff
@@ -1,7 +1,7 @@
 # TensorFlow
 
 <!-- CORTEX_VERSION_MINOR -->
-Export your trained model and upload the export directory, or a checkpoint directory containing the export directory (which is usually the case if you used `estimator.train_and_evaluate`). An example is shown below (here is the [complete example](https://github.com/cortexlabs/cortex/blob/master/examples/tensorflow/sentiment-analyzer)):
+Export your trained model and upload the export directory, or a checkpoint directory containing the export directory (which is usually the case if you used `estimator.train_and_evaluate`). An example is shown below (here is the [complete example](https://github.com/cortexlabs/cortex/blob/0.14/examples/tensorflow/sentiment-analyzer)):
 
 ```python
 import tensorflow as tf
````

docs/summary.md (+1 −1)

````diff
@@ -4,7 +4,7 @@
 * [Install](cluster-management/install.md)
 * [Tutorial](../examples/sklearn/iris-classifier/README.md)
 * [GitHub](https://github.com/cortexlabs/cortex)
-* [Examples](https://github.com/cortexlabs/cortex/tree/master/examples) <!-- CORTEX_VERSION_MINOR -->
+* [Examples](https://github.com/cortexlabs/cortex/tree/0.14/examples) <!-- CORTEX_VERSION_MINOR -->
 * [Chat with us](https://gitter.im/cortexlabs/cortex)
 * [Email us](mailto:hello@cortex.dev)
 * [We're hiring](https://angel.co/cortex-labs-inc/jobs)
````

examples/tensorflow/image-classifier/inception.ipynb (+1 −1)

````diff
@@ -204,7 +204,7 @@
 },
 "source": [
 "<!-- CORTEX_VERSION_MINOR -->\n",
-"That's it! See the [example on GitHub](https://github.com/cortexlabs/cortex/tree/master/examples/tensorflow/image-classifier) for how to deploy the model as an API."
+"That's it! See the [example on GitHub](https://github.com/cortexlabs/cortex/tree/0.14/examples/tensorflow/image-classifier) for how to deploy the model as an API."
 ]
 }
 ]
````

examples/tensorflow/iris-classifier/tensorflow.ipynb (+1 −1)

````diff
@@ -289,7 +289,7 @@
 },
 "source": [
 "<!-- CORTEX_VERSION_MINOR -->\n",
-"That's it! See the [example on GitHub](https://github.com/cortexlabs/cortex/tree/master/examples/tensorflow/iris-classifier) for how to deploy the model as an API."
+"That's it! See the [example on GitHub](https://github.com/cortexlabs/cortex/tree/0.14/examples/tensorflow/iris-classifier) for how to deploy the model as an API."
 ]
 }
 ]
````

examples/tensorflow/sentiment-analyzer/bert.ipynb (+1 −1)

````diff
@@ -1000,7 +1000,7 @@
 },
 "source": [
 "<!-- CORTEX_VERSION_MINOR -->\n",
-"That's it! See the [example on GitHub](https://github.com/cortexlabs/cortex/tree/master/examples/tensorflow/sentiment-analyzer) for how to deploy the model as an API."
+"That's it! See the [example on GitHub](https://github.com/cortexlabs/cortex/tree/0.14/examples/tensorflow/sentiment-analyzer) for how to deploy the model as an API."
 ]
 }
 ]
````

examples/tensorflow/text-generator/gpt-2.ipynb (+2 −2)

````diff
@@ -346,7 +346,7 @@
 },
 "source": [
 "<!-- CORTEX_VERSION_MINOR x2 -->\n",
-"We also need to upload `vocab.bpe` and `encoder.json`, so that the [encoder](https://github.com/cortexlabs/cortex/blob/master/examples/tensorflow/text-generator/encoder.py) in the [Predictor](https://github.com/cortexlabs/cortex/blob/master/examples/tensorflow/text-generator/predictor.py) can encode the input text before making a request to the model."
+"We also need to upload `vocab.bpe` and `encoder.json`, so that the [encoder](https://github.com/cortexlabs/cortex/blob/0.14/examples/tensorflow/text-generator/encoder.py) in the [Predictor](https://github.com/cortexlabs/cortex/blob/0.14/examples/tensorflow/text-generator/predictor.py) can encode the input text before making a request to the model."
 ]
 },
 {
@@ -376,7 +376,7 @@
 },
 "source": [
 "<!-- CORTEX_VERSION_MINOR -->\n",
-"That's it! See the [example on GitHub](https://github.com/cortexlabs/cortex/tree/master/examples/tensorflow/text-generator) for how to deploy the model as an API."
+"That's it! See the [example on GitHub](https://github.com/cortexlabs/cortex/tree/0.14/examples/tensorflow/text-generator) for how to deploy the model as an API."
 ]
 }
 ]
````

examples/xgboost/iris-classifier/xgboost.ipynb (+1 −1)

````diff
@@ -237,7 +237,7 @@
 },
 "source": [
 "<!-- CORTEX_VERSION_MINOR -->\n",
-"That's it! See the [example](https://github.com/cortexlabs/cortex/tree/master/examples/xgboost/iris-classifier) for how to deploy the model as an API."
+"That's it! See the [example](https://github.com/cortexlabs/cortex/tree/0.14/examples/xgboost/iris-classifier) for how to deploy the model as an API."
 ]
 }
 ]
````

get-cli.sh (+1 −1)

````diff
@@ -16,7 +16,7 @@
 
 set -e
 
-CORTEX_VERSION_BRANCH_STABLE=master
+CORTEX_VERSION_BRANCH_STABLE=0.14.0
 
 case "$OSTYPE" in
 darwin*) parsed_os="darwin" ;;
````

manager/install.sh (+1 −2)

````diff
@@ -16,13 +16,12 @@
 
 set -e
 
-CORTEX_VERSION=master
+CORTEX_VERSION=0.14.0
 EKSCTL_TIMEOUT=45m
 
 arg1="$1"
 
 function ensure_eks() {
-  # Cluster statuses: https://github.com/aws/aws-sdk-go/blob/master/service/eks/api.go#L2785
   set +e
   cluster_info=$(eksctl get cluster --name=$CORTEX_CLUSTER_NAME --region=$CORTEX_REGION -o json)
   cluster_info_exit_code=$?
````

pkg/consts/consts.go (+2 −2)

````diff
@@ -17,8 +17,8 @@ limitations under the License.
 package consts
 
 var (
-	CortexVersion      = "master" // CORTEX_VERSION
-	CortexVersionMinor = "master" // CORTEX_VERSION_MINOR
+	CortexVersion      = "0.14.0" // CORTEX_VERSION
+	CortexVersionMinor = "0.14"   // CORTEX_VERSION_MINOR
 
 	MaxClassesPerTrackerRequest = 20 // cloudwatch.GeMetricData can get up to 100 metrics per request, avoid multiple requests and have room for other stats
 )
````

pkg/workloads/cortex/client/cortex/client.py (+1 −1)

````diff
@@ -44,7 +44,7 @@ def __init__(self, aws_access_key_id, aws_secret_access_key, operator_url):
         self.aws_access_key_id = aws_access_key_id
         self.aws_secret_access_key = aws_secret_access_key
         self.headers = {
-            "CortexAPIVersion": "master",  # CORTEX_VERSION
+            "CortexAPIVersion": "0.14.0",  # CORTEX_VERSION
             "Authorization": "CortexAWS {}|{}".format(
                 self.aws_access_key_id, self.aws_secret_access_key
             ),
````

pkg/workloads/cortex/client/setup.py (+1 −1)

````diff
@@ -16,7 +16,7 @@
 
 setup(
     name="cortex",
-    version="master",  # CORTEX_VERSION
+    version="0.14.0",  # CORTEX_VERSION
     description="",
     author="Cortex Labs",
     author_email="dev@cortexlabs.com",
````

pkg/workloads/cortex/consts.py (+1 −1)

````diff
@@ -12,4 +12,4 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-CORTEX_VERSION = "master"
+CORTEX_VERSION = "0.14.0"
````
