fix: update torchserve readme links (kubeflow#2394)
* fix: update readme doc

Signed-off-by: Jagadeesh J <jagadeeshj@ideas2it.com>

* Update torchserve transformer doc

Signed-off-by: Jagadeesh J <jagadeeshj@ideas2it.com>

* fix: website url

Signed-off-by: Jagadeesh J <jagadeeshj@ideas2it.com>

Signed-off-by: Jagadeesh J <jagadeeshj@ideas2it.com>
Co-authored-by: Dan Sun <dsun20@bloomberg.net>
Jagadeesh J and yuzisun authored Aug 25, 2022
1 parent ba77c30 commit 94b1fab
Showing 16 changed files with 17 additions and 17 deletions.
2 changes: 1 addition & 1 deletion docs/samples/logger/basic/README.md
@@ -46,7 +46,7 @@ Let's apply this YAML:
kubectl create -f sklearn-logging.yaml
```

-We can now send a request to the sklearn model. Check the README [here](https://kserve.github.io/website/get_started/first_isvc/#3-determine-the-ingress-ip-and-ports)
+We can now send a request to the sklearn model. Check the README [here](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports)
to learn how to determine the INGRESS_HOST and INGRESS_PORT used in curling the InferenceService.

```
2 changes: 1 addition & 1 deletion docs/samples/logger/knative-eventing/README.md
@@ -102,7 +102,7 @@ Let's apply this YAML:
kubectl create -f sklearn-logging.yaml
```

-We can now send a request to the sklearn model. Check the README [here](https://kserve.github.io/website/get_started/first_isvc/#3-determine-the-ingress-ip-and-ports)
+We can now send a request to the sklearn model. Check the README [here](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports)
to learn how to determine the INGRESS_HOST and INGRESS_PORT used in curling the InferenceService.

```
2 changes: 1 addition & 1 deletion docs/samples/v1beta1/custom/prebuilt-image/README.md
@@ -20,7 +20,7 @@ inferenceservice.serving.kserve.io/custom-prebuilt-image
```

## Run a prediction
-The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/get_started/first_isvc/#3-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

This example uses the [codait/max-object-detector](https://github.com/IBM/MAX-Object-Detector) image. The Max Object Detector api server expects a POST request to the `/model/predict` endpoint that includes an `image` multipart/form-data and an optional `threshold` query string.

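For orientation, the step the updated links describe boils down to exporting the ingress address before curling the service. Below is a minimal sketch, assuming a Knative/Istio setup where the gateway is the `istio-ingressgateway` Service in `istio-system` with a LoadBalancer IP; the image filename and threshold in the request are illustrative placeholders, not values from the sample.

```bash
# Look up the ingress address and port; adjust the Service name/namespace if
# your cluster uses a different ingress gateway.
export INGRESS_HOST=$(kubectl -n istio-system get service istio-ingressgateway \
  -o jsonpath='{.status.loadBalancer.ingress[0].ip}')
export INGRESS_PORT=$(kubectl -n istio-system get service istio-ingressgateway \
  -o jsonpath='{.spec.ports[?(@.name=="http2")].port}')

# Resolve the InferenceService hostname and send the kind of multipart request
# the Max Object Detector sample describes; "example.jpg" and the threshold
# value are placeholders.
SERVICE_HOSTNAME=$(kubectl get inferenceservice custom-prebuilt-image \
  -o jsonpath='{.status.url}' | cut -d "/" -f 3)
curl -v -H "Host: ${SERVICE_HOSTNAME}" \
  -F "image=@example.jpg" \
  "http://${INGRESS_HOST}:${INGRESS_PORT}/model/predict?threshold=0.7"
```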
2 changes: 1 addition & 1 deletion docs/samples/v1beta1/custom/torchserve/README.md
@@ -35,7 +35,7 @@ $inferenceservice.serving.kubeflow.org/torchserve-custom created

## Run a prediction

-The first step is to [determine the ingress IP and ports](../../../../../README.md#determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

Download input image:

@@ -95,7 +95,7 @@ $inferenceservice.serving.kubeflow.org/torchserve-bert created

## Run a prediction

-The first step is to [determine the ingress IP and ports](../../../../../../README.md#determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

```bash
MODEL_NAME=torchserve-bert
2 changes: 1 addition & 1 deletion docs/samples/v1beta1/custom/torchserve/docs/autoscaling.md
@@ -35,7 +35,7 @@ $inferenceservice.serving.kserve.io/torchserve-custom created

## Run a prediction

-The first step is to [determine the ingress IP and ports](../../../../../../README.md#determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

### Steps

2 changes: 1 addition & 1 deletion docs/samples/v1beta1/custom/torchserve/docs/canary.md
@@ -53,7 +53,7 @@ $inferenceservice.serving.kserve.io/torchserve-custom created

## Run a prediction

-The first step is to [determine the ingress IP and ports](../../../../../../README.md#determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

```bash
MODEL_NAME=torchserve-custom
2 changes: 1 addition & 1 deletion docs/samples/v1beta1/custom/torchserve/docs/metrics.md
@@ -100,7 +100,7 @@ $inferenceservice.serving.kserve.io/torchserve-custom created

## Run a prediction

-The first step is to [determine the ingress IP and ports](../../../../../../README.md#determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

## Inference

2 changes: 1 addition & 1 deletion docs/samples/v1beta1/onnx/README.md
@@ -18,7 +18,7 @@ $ inferenceservice.serving.kserve.io/style-sample configured
## Run a sample inference

1. Setup env vars
-The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/get_started/first_isvc/#3-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

```
export MODEL_NAME=style-sample
2 changes: 1 addition & 1 deletion docs/samples/v1beta1/sklearn/v1/README.md
@@ -53,7 +53,7 @@ Expected Output
$ inferenceservice.serving.kserve.io/sklearn-iris created
```
## Run a prediction
-The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/get_started/first_isvc/#3-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

```
MODEL_NAME=sklearn-iris
4 changes: 2 additions & 2 deletions docs/samples/v1beta1/torchserve/autoscaling/README.md
@@ -2,7 +2,7 @@
KServe supports the implementation of Knative Pod Autoscaler (KPA) and Kubernetes’ Horizontal Pod Autoscaler (HPA).
The features and limitations of each of these Autoscalers are listed below.

-IMPORTANT: If you want to use Kubernetes Horizontal Pod Autoscaler (HPA), you must install [HPA extension](https://knative.dev/docs/install/any-kubernetes-cluster/#optional-serving-extensions)
+IMPORTANT: If you want to use Kubernetes Horizontal Pod Autoscaler (HPA), you must install [HPA extension](https://knative.dev/docs/install/yaml-install/serving/install-serving-with-yaml/#install-optional-serving-extensions)
after you install Knative Serving.
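The HPA extension referenced by the updated link ships as an optional Knative Serving release asset. A minimal install sketch follows; the release tag is illustrative and should match the Knative Serving version already installed on the cluster.

```bash
# Illustrative version tag; substitute the release that matches your
# Knative Serving installation.
kubectl apply -f https://github.com/knative/serving/releases/download/knative-v1.8.0/serving-hpa.yaml
```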

Knative Pod Autoscaler (KPA)
@@ -66,7 +66,7 @@ $inferenceservice.serving.kserve.io/torchserve created

## Run inference with concurrent requests

-The first step is to [determine the ingress IP and ports](../../../../../README.md#determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

Install hey load generator
```bash
2 changes: 1 addition & 1 deletion docs/samples/v1beta1/torchserve/canary/README.md
@@ -58,7 +58,7 @@ torchserve-predictor-default-kxp96 torchserve-predictor-default torchserve-p

## Run a prediction

-The first step is to [determine the ingress IP and ports](../../../../../README.md#determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

```bash
MODEL_NAME=mnist
2 changes: 1 addition & 1 deletion docs/samples/v1beta1/torchserve/metrics/README.md
@@ -57,7 +57,7 @@ $inferenceservice.serving.kserve.io/torch-metrics created

## Run a prediction

-The first step is to [determine the ingress IP and ports](../../../../../README.md#determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

## Inference

2 changes: 1 addition & 1 deletion docs/samples/v1beta1/torchserve/v1/README.md
@@ -30,7 +30,7 @@ Expected Output
$inferenceservice.serving.kserve.io/torchserve created
```

-The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/get_started/first_isvc/#3-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

```bash
MODEL_NAME=mnist
2 changes: 1 addition & 1 deletion docs/samples/v1beta1/torchserve/v2/README.md
@@ -29,7 +29,7 @@ $inferenceservice.serving.kserve.io/torchserve-mnist-v2 created

## Inference with V2 REST Protocol

-The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/get_started/first_isvc/#3-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

```bash
MODEL_NAME=mnist
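# Illustrative continuation, assuming the KServe V2 REST protocol path and a
# placeholder payload file name; substitute the sample's own input JSON.
SERVICE_HOSTNAME=$(kubectl get inferenceservice torchserve-mnist-v2 \
  -o jsonpath='{.status.url}' | cut -d "/" -f 3)
curl -v -H "Host: ${SERVICE_HOSTNAME}" \
  -d @./mnist_v2.json \
  "http://${INGRESS_HOST}:${INGRESS_PORT}/v2/models/${MODEL_NAME}/infer"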
@@ -108,7 +108,7 @@ inferenceservice.serving.kserve.io/torchserve-transformer created

## Run a prediction

-The first step is to [determine the ingress IP and ports](../../../../../README.md#determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`
+The first step is to [determine the ingress IP and ports](https://kserve.github.io/website/master/get_started/first_isvc/#4-determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`

```bash
SERVICE_NAME=torchserve-transformer
