Document custom inference servers #1673
Conversation
Awesome!! @adriangonz if you add the link to the index in readme.md it should be good to land
Is it possible to specify a custom inference server without modifying the global configmap?
@RafalSkolasinski that's an interesting point. I think we could abstract this eventually, but we should create a ticket for that, as I agree it's something worth exploring, i.e. how users could add their own prepackaged servers without modifying the global config. Perhaps this could even be a namespaced configmap that overrides the global configmap.
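For context on the mechanism discussed above: prepackaged servers are registered in the seldon-config ConfigMap under the predictor_servers key. Below is a rough sketch of what adding a custom entry might look like. The CUSTOM_SERVER name, images, and versions are illustrative assumptions, not taken from this PR, and the exact field layout may differ between Seldon Core versions.

```yaml
# Sketch only: a custom prepackaged server entry added to the global seldon-config
# ConfigMap. Keys other than the stock SKLEARN_SERVER example are hypothetical.
apiVersion: v1
kind: ConfigMap
metadata:
  name: seldon-config
  namespace: seldon-system
data:
  predictor_servers: |-
    {
      "SKLEARN_SERVER": {
        "rest": { "image": "seldonio/sklearnserver_rest", "defaultImageVersion": "0.2" },
        "grpc": { "image": "seldonio/sklearnserver_grpc", "defaultImageVersion": "0.2" }
      },
      "CUSTOM_SERVER": {
        "rest": { "image": "example-org/custom-inference-server-rest", "defaultImageVersion": "0.1" },
        "grpc": { "image": "example-org/custom-inference-server-grpc", "defaultImageVersion": "0.1" }
      }
    }
```

A SeldonDeployment would then select the new server with `implementation: CUSTOM_SERVER` in its predictor spec. Note that the namespaced override suggested above is a proposal, not an existing feature at the time of this PR.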
/lgtm
/approve
@adriangonz It may be a good idea to link to the custom prepackaged servers page from the MLflow server doc page, what do you think?
Thanks for the comments @RafalSkolasinski @axsaucedo. Both should be added now!
Awesome stuff! |
[APPROVALNOTIFIER] This PR is APPROVED. Approval requirements bypassed by manually added approval. This pull request has been approved by: RafalSkolasinski
failed to trigger Pull Request pipeline
Fixes #1416
Changelog