Describe the bug
Although the S3 endpoint URL is configurable in the S3 deployment option, it is hardcoded in workflow-controller-configmap. This causes issues in regions where the global S3 endpoint does not work, such as GovCloud, which requires the regional endpoint s3.us-gov-west-1.amazonaws.com.
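For reference, one possible workaround (a sketch only, assuming a default Kubeflow install where the Argo workflow-controller-configmap lives in the kubeflow namespace and carries an artifactRepository key) is to patch the endpoint to the regional one, for example with the Kubernetes Python client:

```python
import yaml
from kubernetes import client, config

REGIONAL_ENDPOINT = "s3.us-gov-west-1.amazonaws.com"  # GovCloud regional endpoint

config.load_kube_config()
core = client.CoreV1Api()

# Read the Argo workflow controller config and rewrite the S3 endpoint.
# Assumption: the configmap exposes an "artifactRepository" data key, as in a default install.
cm = core.read_namespaced_config_map("workflow-controller-configmap", "kubeflow")
repo = yaml.safe_load(cm.data["artifactRepository"])
repo["s3"]["endpoint"] = REGIONAL_ENDPOINT  # replaces the hardcoded global endpoint
cm.data["artifactRepository"] = yaml.safe_dump(repo)

core.patch_namespaced_config_map("workflow-controller-configmap", "kubeflow", cm)
```

The workflow-controller pod may need to be restarted afterwards so it picks up the new configuration.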
Steps To Reproduce
Run the data passing sample pipeline in a GovCloud region.
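A minimal data-passing pipeline of the kind described (a hypothetical sketch using the KFP v1 SDK, not the exact sample pipeline shipped with Kubeflow) would be:

```python
from kfp import dsl
from kfp.components import InputPath, OutputPath, create_component_from_func


def produce(text_path: OutputPath(str)):
    # Write an artifact that the workflow controller uploads to the artifact store.
    with open(text_path, "w") as f:
        f.write("hello from step one")


def consume(text_path: InputPath(str)):
    # Read the artifact downloaded from the artifact store by the next step.
    with open(text_path) as f:
        print(f.read())


produce_op = create_component_from_func(produce)
consume_op = create_component_from_func(consume)


@dsl.pipeline(name="data-passing-repro")
def data_passing_pipeline():
    consume_op(produce_op().output)
```

Compiling this with kfp.compiler.Compiler().compile(data_passing_pipeline, "data_passing.yaml") and running it should exercise the configured artifact store, since OutputPath/InputPath artifacts are uploaded and downloaded through the S3 endpoint in workflow-controller-configmap.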
Expected behavior
Successfully pass data between pipeline steps
Environment
Kubernetes version
Using EKS (yes/no), if so version?
Kubeflow version: 1.4
AWS build number: 1.0.0
AWS service targeted (S3, RDS, etc.): S3
Screenshots
Additional context
Original issue:
Hi everyone! @Suraj Kota I'm relatively new to Kubeflow, currently stuck, and looking to see if anyone else has encountered a similar issue. I've successfully run an example Kubeflow pipeline locally on minikube, and I'm trying to bring it up on our new Kubeflow v1.4 installation in AWS EKS, but now it appears I can't pass artifacts between stages. When passing artifacts between stages on minikube, it implicitly used MinIO by default for InputPath/OutputPath types. Now, when running the same pipeline, I'm getting:
This step is in Error state with this message: Error (exit code 1): failed to put file: The AWS Access Key Id you provided does not exist in our records.
I have followed these instructions to create a Kubernetes Secret and attach AWS credentials. It's not clear to me whether, when passing artifacts between stages with InputPath/OutputPath, the Kubeflow pipeline is still using MinIO or AWS S3. In either case the credentials should be there (I've verified that the MinIO credentials are present and unchanged from the default values).
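One way to check whether artifacts are going to MinIO or to S3, and which credentials are actually in use, is to inspect the workflow-controller-configmap and the secret it references. A hypothetical debugging sketch (the configmap key layout and the kubeflow namespace are assumptions based on a default install):

```python
import base64
import yaml
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

# Assumption: the configmap exposes an "artifactRepository" key with an s3 section.
cm = core.read_namespaced_config_map("workflow-controller-configmap", "kubeflow")
s3_cfg = yaml.safe_load(cm.data["artifactRepository"])["s3"]
print("endpoint:", s3_cfg["endpoint"])  # MinIO service address vs. an S3 endpoint
print("bucket:", s3_cfg["bucket"])

# The artifact repository points at a secret holding the access keys.
ref = s3_cfg["accessKeySecret"]
secret = core.read_namespaced_secret(ref["name"], "kubeflow")
access_key_id = base64.b64decode(secret.data[ref["key"]]).decode()
print("access key id:", access_key_id)  # compare against your AWS or MinIO key
```

If the endpoint points at S3 while the referenced secret still holds the default MinIO keys, that would explain an "Access Key Id ... does not exist in our records" error like the one above.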
Sample pipeline from Ryan McCaffrey: