diff --git a/docs/inference-providers/_toctree.yml b/docs/inference-providers/_toctree.yml
index cbd2f4cb6..a0bae1c91 100644
--- a/docs/inference-providers/_toctree.yml
+++ b/docs/inference-providers/_toctree.yml
@@ -28,6 +28,17 @@
 - local: guides/vscode
   title: VS Code with GitHub Copilot
+- title: Integrations
+  sections:
+    - local: integrations/index
+      title: Integrations Overview
+    - local: integrations/adding-integration
+      title: Add Your Integration
+    - local: integrations/macwhisper
+      title: MacWhisper
+    - local: integrations/open-code
+      title: OpenCode
+
 - title: Inference Tasks
   sections:
diff --git a/docs/inference-providers/integrations/adding-integration.md b/docs/inference-providers/integrations/adding-integration.md
new file mode 100644
index 000000000..b035b30ca
--- /dev/null
+++ b/docs/inference-providers/integrations/adding-integration.md
@@ -0,0 +1,47 @@
+# Add Your Integration
+
+Building a tool that works with Hugging Face Inference Providers? We'd love to feature it in our integrations directory!
+
+## Requirements
+
+To be listed, your integration should:
+
+- Work with Inference Providers
+- Be actively maintained with recent commits or releases
+- Have clear documentation showing how to connect to Inference Providers
+
+## How to Submit
+
+1. Test your integration with Hugging Face Inference Providers.
+2. Fork the repository at [github.com/huggingface/hub-docs](https://github.com/huggingface/hub-docs).
+3. Update the index in `docs/inference-providers/integrations/index.md` to add your tool to the table with a link to your integration docs.
+4. Submit a Pull Request with your changes.
+5. (Optional) Create a dedicated integration page in `docs/inference-providers/integrations/your-tool-name.md` with detailed setup instructions.
+
+## Integration Page Template
+
+Create a file named `your-tool-name.md` with this structure:
+
+```
+# Your Tool Name
+
+Brief description of what your tool does.
+
+## Overview
+
+How your tool integrates with Hugging Face Inference Providers.
+
+## Prerequisites
+
+- Your tool installed
+- HF account with [API token](https://huggingface.co/settings/tokens)
+
+## Configuration
+
+Step-by-step setup instructions with code examples.
+
+## Resources
+
+- [Your Tool Documentation](https://yourtool.com/docs)
+- [HF Integration Guide](link-to-your-guide)
+```
diff --git a/docs/inference-providers/integrations/index.md b/docs/inference-providers/integrations/index.md
new file mode 100644
index 000000000..c04b692e6
--- /dev/null
+++ b/docs/inference-providers/integrations/index.md
@@ -0,0 +1,68 @@
+# Integrations Overview
+
+Hugging Face Inference Providers works with a growing ecosystem of developer tools, frameworks, and platforms. These integrations let you use state-of-the-art models in your existing workflows and development environments.
+
+If a tool doesn't have explicit support for Inference Providers, it can often still connect through the OpenAI-compatible API that Inference Providers exposes. Check your tool's documentation to see whether it can be configured to use a custom OpenAI-style endpoint.
+
+## Why Use Integrations?
+
+- **Keep your existing tools**: Use Inference Providers with tools you already know
+- **Access dozens of providers**: Switch between providers without changing your code
+- **Zero markup pricing**: Get the same rates as going direct to providers
+- **Single API token**: One HF token for all providers and models
+
+## Overview
+
+This table lists _some_ tools, libraries, and applications that work with Hugging Face Inference Providers. For detailed setup instructions, follow the links in the Documentation column.
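Even when a tool isn't listed below, anything that accepts a custom OpenAI-style endpoint can usually point at the Inference Providers router directly. Here is a minimal sketch using only Python's standard library; the router base URL and the example model ID are assumptions to verify against the current docs, and the token is read from the `HF_TOKEN` environment variable:

```python
import json
import os
import urllib.request

# Assumption: Inference Providers exposes an OpenAI-compatible REST API at
# this base URL (check the docs if it has moved).
BASE_URL = "https://router.huggingface.co/v1"


def build_chat_request(model: str, prompt: str, token: str) -> urllib.request.Request:
    """Build an OpenAI-style POST /chat/completions request for the router."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__" and "HF_TOKEN" in os.environ:
    # "openai/gpt-oss-20b" is only an example model ID.
    req = build_chat_request("openai/gpt-oss-20b", "Hello!", os.environ["HF_TOKEN"])
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
```

Most OpenAI SDK clients take the same `base_url` and bearer token, so this maps directly onto tools that expose those two settings.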
+
+| Integration | Description | Documentation |
+| --- | --- | --- |
+| [Continue](https://continue.dev/) | AI code assistant for VS Code and JetBrains | [Integration docs](https://docs.continue.dev/customize/model-providers/more/huggingfaceinferenceapi#hugging-face) |
+| [Haystack](https://haystack.deepset.ai/) | Open-source LLM framework for building production applications | [Integration docs](https://docs.haystack.deepset.ai/docs/huggingfaceapichatgenerator) |
+| [Inspect](https://inspect.aisi.org.uk/) | AI safety and evaluation framework | [Integration docs](https://inspect.aisi.org.uk/providers.html#hugging-face) |
+| [LangChain](https://www.langchain.com/) | LLM application framework | [Integration docs](https://docs.langchain.com/oss/python/integrations/providers/huggingface#huggingfaceendpoint) |
+| [LiteLLM](https://docs.litellm.ai/docs/providers/huggingface) | Unified interface for 100+ LLMs | [Integration docs](https://docs.litellm.ai/docs/providers/huggingface) |
+| [LlamaIndex](https://developers.llamaindex.ai/python/examples/llm/huggingface/#use-a-model-via-inference-providers) | Data framework for LLM applications | [Integration docs](https://developers.llamaindex.ai/python/examples/llm/huggingface/#use-a-model-via-inference-providers) |
+| [MacWhisper](https://goodsnooze.gumroad.com/l/macwhisper) | Speech-to-text application for macOS | [HF guide](./macwhisper) |
+| [OpenCode](https://opencode.ai/) | AI coding agent built for the terminal | [Integration docs](https://opencode.ai/docs/providers#hugging-face) / [HF guide](./open-code) |
+| [Roo Code](https://docs.roocode.com/) | AI-powered code generation and refactoring | [Integration docs](https://docs.roocode.com/providers/huggingface) |
+
+## Integrations by Category
+
+### API Clients
+
+Client libraries and gateways for simplified LLM access.
+
+- [LiteLLM](https://docs.litellm.ai/docs/providers/huggingface) - Unified interface for calling 100+ LLMs with the same format
+
+### Applications
+
+End-user applications and interfaces powered by LLMs.
+
+- [MacWhisper](https://goodsnooze.gumroad.com/l/macwhisper) - Speech-to-text application for macOS ([HF setup guide](./macwhisper))
+
+### Developer Tools
+
+AI-powered coding assistants and development environments.
+
+- [Continue](https://docs.continue.dev/customize/model-providers/more/huggingfaceinferenceapi#hugging-face) - AI code assistant for VS Code, JetBrains, and the terminal
+- [OpenCode](https://opencode.ai/docs/providers#hugging-face) - AI coding agent built for the terminal
+- [Roo Code](https://docs.roocode.com/providers/huggingface) - AI-powered code generation and refactoring
+
+### Evaluation Frameworks
+
+Tools for assessing and ensuring AI safety and performance.
+
+- [Inspect](https://inspect.aisi.org.uk/providers.html#hugging-face) - AI safety and evaluation framework
+
+### LLM Frameworks
+
+LLM application frameworks and orchestration platforms.
+
+- [Haystack](https://docs.haystack.deepset.ai/docs/huggingfaceapichatgenerator) - Open-source framework for building production-ready LLM applications
+- [LangChain](https://docs.langchain.com/oss/python/integrations/providers/huggingface#huggingfaceendpoint) - Popular framework for developing LLM applications
+- [LlamaIndex](https://developers.llamaindex.ai/python/examples/llm/huggingface/#use-a-model-via-inference-providers) - Data framework for connecting custom data to LLMs
+
diff --git a/docs/inference-providers/integrations/macwhisper.md b/docs/inference-providers/integrations/macwhisper.md
new file mode 100644
index 000000000..eef51250c
--- /dev/null
+++ b/docs/inference-providers/integrations/macwhisper.md
@@ -0,0 +1,44 @@
+# MacWhisper
+
+[MacWhisper](https://goodsnooze.gumroad.com/l/macwhisper) is a native macOS application that lets you transcribe audio using Whisper models locally on your Mac, with no technical setup required.
+
+## Overview
+
+MacWhisper includes AI services that can process your transcriptions: for example, to clean up dictation, summarize content, or translate it into another language. You can configure these AI services to use Hugging Face Inference Providers, giving you access to a wide range of open models from multiple providers at zero markup pricing.
+
+## Prerequisites
+
+- MacWhisper installed
+- A Hugging Face account with an [API token](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained) (needs "Make calls to Inference Providers" permission)
+
+## Configuration
+
+1. **Get a Hugging Face token**
+   Create a token at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained) with the "Make calls to Inference Providers" permission.
+
+2. **Open MacWhisper settings**
+   Go to **Settings → AI → Service**.
+
+3. **Select Hugging Face Inference Providers**
+   Choose "Hugging Face Inference Providers" from the service dropdown.
+
+4. **Enter your API token**
+   Paste your Hugging Face token in the API token field.
+
+5. **Choose a model**
+   Enter the model ID you want to use (e.g., `openai/gpt-oss-20b`).
+
+That's it! MacWhisper will now use Hugging Face Inference Providers for AI processing of your transcriptions.
+
+## Common Use Cases
+
+- **Clean up dictation**: Remove filler words and improve grammar in voice-to-text
+- **Summarization**: Generate summaries of meeting transcriptions or long audio files
+- **Translation**: Translate transcriptions into other languages
+- **Custom processing**: Apply custom prompts to transform transcriptions for your workflow
+
+## Resources
+
+- [MacWhisper product page](https://goodsnooze.gumroad.com/l/macwhisper)
+- [Hugging Face Inference Providers documentation](https://huggingface.co/docs/inference-providers)
+- [Available models](https://huggingface.co/models?pipeline_tag=text-generation&inference_provider=all&sort=trending)
diff --git a/docs/inference-providers/integrations/open-code.md b/docs/inference-providers/integrations/open-code.md
new file mode 100644
index 000000000..913871996
--- /dev/null
+++ b/docs/inference-providers/integrations/open-code.md
@@ -0,0 +1,66 @@
+# OpenCode
+
+[OpenCode](https://opencode.ai/) is an AI coding agent built for the terminal that helps with code review, refactoring, testing, and general development tasks.
+
+## Overview
+
+OpenCode natively supports Hugging Face Inference Providers, giving you access to open models from 17+ providers through a single interface.
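If you want to browse what's available before picking a model inside OpenCode, the Inference Providers catalog can be queried directly. A small sketch, assuming the router exposes an OpenAI-style model listing at `https://router.huggingface.co/v1/models` (verify the endpoint against the current Inference Providers docs):

```python
import json
import urllib.request

# Assumption: the Inference Providers router serves an OpenAI-style
# {"data": [{"id": ...}, ...]} listing at this URL.
MODELS_URL = "https://router.huggingface.co/v1/models"


def model_ids(listing: dict) -> list[str]:
    """Extract model IDs from an OpenAI-style model listing payload."""
    return [entry["id"] for entry in listing.get("data", [])]


def print_available_models() -> None:
    """Fetch the live catalog and print one model ID per line."""
    with urllib.request.urlopen(MODELS_URL) as resp:
        listing = json.load(resp)
    for mid in model_ids(listing):
        print(mid)
```

Calling `print_available_models()` hits the live endpoint; any ID it prints can then be selected via OpenCode's `/models` command.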
+
+## Prerequisites
+
+- OpenCode installed ([installation guide](https://opencode.ai/docs))
+- A Hugging Face account with [API token](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained) (needs "Make calls to Inference Providers" permission)
+
+## Configuration
+
+### Quick Setup
+
+1. Create a Hugging Face token with Inference Providers permissions at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained)
+
+2. Run `opencode auth login` and select **Hugging Face**:
+
+```bash
+$ opencode auth login
+
+┌ Add credential
+│
+◆ Select provider
+│ ● Hugging Face
+│ ...
+└
+```
+
+3. Enter your Hugging Face token when prompted:
+
+```bash
+┌ Add credential
+│
+◇ Select provider
+│ Hugging Face
+│
+◇ Enter your API key
+│ hf_...
+└
+```
+
+4. Run the `/models` command in OpenCode to select a model.
+
+Once configured, OpenCode will use your selected model for all operations. You can switch models anytime using the `/models` command in the OpenCode TUI (Terminal User Interface).
+
+## GitHub Actions Integration
+
+OpenCode can also be used to run open models in GitHub Actions via Inference Providers. See our [GitHub Actions guide](../guides/github-actions-code-review) for setting up automated PR reviews.
+
+## Resources
+
+- [OpenCode Documentation](https://opencode.ai/docs)
+- [OpenCode Provider Configuration](https://opencode.ai/docs/providers/#hugging-face)
+- [GitHub Actions Integration Guide](../guides/github-actions-code-review)