fix to remove on-prem introducing-private-hub.md #1166

Closed · wants to merge 1 commit
12 changes: 1 addition & 11 deletions introducing-private-hub.md
@@ -138,16 +138,6 @@ With the Private Hub, data scientists can seamlessly work with [Transformers](ht
- [Inference API](https://huggingface.co/inference-api): every private model created on the Private Hub is deployed for inference in your own infrastructure via simple API calls (see the sketch after this list).
- [PRs and Discussions](https://huggingface.co/blog/community-update): support peer reviews on models, datasets, and spaces to improve collaboration across teams.
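
To give a rough idea of what the "simple API calls" in the Inference API bullet look like, here is a minimal sketch of querying a private model over HTTP. The base URL, model id, and token are placeholders: a Private Hub deployment would expose its own endpoint, so treat this as an illustrative assumption rather than the exact API surface.

```python
import requests

# Hypothetical endpoint and credentials: a Private Hub deployment exposes its own
# base URL; the model id and token below are placeholders for illustration only.
API_URL = "https://api-inference.huggingface.co/models/my-org/my-private-model"
HEADERS = {"Authorization": "Bearer hf_xxx"}  # user access token with read scope


def query(payload: dict) -> dict:
    """Send a JSON payload to the Inference API and return the parsed response."""
    response = requests.post(API_URL, headers=HEADERS, json=payload)
    response.raise_for_status()
    return response.json()


# Example: run a text-classification model on a single input.
print(query({"inputs": "This deployment option works well for us."}))
```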

From research to production, your data never leaves your servers. The Private Hub runs in your own compliant environment and provides enterprise security features like security scans, audit trails, SSO, and access control to keep your models and data secure.

We provide flexible options for deploying your Private Hub in your private, compliant environment, including:

- **Managed Private Hub (SaaS)**: runs in segregated virtual private clouds (VPCs) owned by Hugging Face. You can enjoy the full Hugging Face experience on your own private Hub without having to manage any infrastructure.

- **On-cloud Private Hub**: runs in a customer-owned cloud account on AWS, Azure, or GCP. This deployment option gives you full administrative control of the underlying cloud infrastructure and lets you achieve stronger security and compliance.

- **On-prem Private Hub**: an on-premise deployment of the Hugging Face Hub on your own infrastructure, for customers with strict compliance requirements and/or workloads that they don't want to, or are not allowed to, run on a public cloud.

Now that we have covered the basics of what the Private Hub is, let's go over how companies are using it to accelerate their ML development.

## 3. How Are Companies Using the Private Hub to Accelerate Their ML Roadmap?
@@ -282,4 +272,4 @@ But building machine learning is still hard. Many ML projects are rushed and nev

At Hugging Face, we believe there is a better way to build machine learning. And this is why we created the [Private Hub](https://huggingface.co/platform). We think that providing a unified set of tools for every step of machine learning development, together with the right tools to collaborate, will lead to better ML work, bring more ML solutions to production, and help ML teams spark innovation.

Interested in learning more? [Request a demo](https://huggingface.co/platform#form) to see how you can leverage the Private Hub to accelerate ML development within your organization.