31 commits
c2645e7
feat/integrations: add initial documentation for available integrations
davanstrien Nov 3, 2025
91e4a4f
feat(open-code): add initial documentation for OpenCode integration w…
davanstrien Nov 3, 2025
fbcc4cf
fix(open-code): clarify GitHub Actions integration usage in documenta…
davanstrien Nov 3, 2025
e89f905
feat(integrations): add Inference Provider Integrations section to do…
davanstrien Nov 3, 2025
794cece
feat(integrations): add OpenCode section to Inference Provider Integr…
davanstrien Nov 3, 2025
55ce637
fix(integrations): correct spelling and formatting in Integrations se…
davanstrien Nov 3, 2025
ed4587f
try different organization
davanstrien Nov 3, 2025
21e1e21
temp example intergration page
davanstrien Nov 3, 2025
1113a5d
fix(integrations): update OpenCode logo URLs to use absolute paths
davanstrien Nov 3, 2025
7c14a24
feat(integrations): add MacWhisper integration documentation
davanstrien Nov 3, 2025
7175c29
fix(integrations): comment out OpenCode logo HTML for clarity
davanstrien Nov 3, 2025
c7afa96
refactor(integrations): enhance overview and structure of integration…
davanstrien Nov 3, 2025
5c86e6f
typo
davanstrien Nov 4, 2025
1d96831
feat(integrations): add initial integrations overview documentation
davanstrien Nov 4, 2025
65a509a
simple index version
davanstrien Nov 6, 2025
44513d4
feat(integrations): add simple version of integrations overview
davanstrien Nov 6, 2025
9c79d19
feat(integrations): update integrations section and remove outdated d…
davanstrien Nov 6, 2025
45caf3d
tidy poc
davanstrien Nov 6, 2025
0af6158
first pass
davanstrien Nov 14, 2025
569c6c2
second pass
davanstrien Nov 14, 2025
76dcade
feat(integrations): implement hybrid table + categories structure
davanstrien Nov 14, 2025
79dbac3
fix(macwhisper): update prerequisites and model ID examples for clarity
davanstrien Nov 14, 2025
de18917
fix(adding-integration): simplify requirements and submission steps f…
davanstrien Nov 14, 2025
109fa93
fix(adding-integration): update code block formatting for consistency
davanstrien Nov 14, 2025
13c0c8a
fix(adding-integration): correct terminology for Inference Providers
davanstrien Nov 14, 2025
c44e8c9
fix(adding-integration): remove emphasis formatting from submission s…
davanstrien Nov 14, 2025
59d6270
fix(adding-integration): remove unnecessary questions section for cla…
davanstrien Nov 14, 2025
b6075ea
fix(adding-integration): add clarification on compatibility with Open…
davanstrien Nov 14, 2025
e3f5c89
fix(adding-integration): update documentation to clarify connection t…
davanstrien Nov 14, 2025
e23f893
simpler
davanstrien Nov 14, 2025
c6dbff4
Update docs/inference-providers/integrations/index.md
davanstrien Nov 14, 2025
11 changes: 11 additions & 0 deletions docs/inference-providers/_toctree.yml
@@ -28,6 +28,17 @@
- local: guides/vscode
title: VS Code with GitHub Copilot

- title: Integrations
sections:
- local: integrations/index
title: Integrations Overview
- local: integrations/adding-integration
title: Add Your Integration
- local: integrations/macwhisper
title: MacWhisper
- local: integrations/open-code
title: OpenCode

- local: tasks/index
title: Inference Tasks
sections:
47 changes: 47 additions & 0 deletions docs/inference-providers/integrations/adding-integration.md
@@ -0,0 +1,47 @@
# Add Your Integration

Building a tool that works with Hugging Face Inference Providers? We'd love to feature it in our integrations directory!

## Requirements

To be listed, your integration should:

- Work with Inference Providers
- Be actively maintained with recent commits or releases
- Have clear documentation showing how to connect to Inference Providers

## How to Submit

1. Test your integration with Hugging Face Inference Providers
2. Fork the repository at [github.com/huggingface/hub-docs](https://github.com/huggingface/hub-docs)
3. Update `docs/inference-providers/integrations/index.md` to add your tool to the overview table, with a link to your integration docs (see the example row below).
4. Submit a Pull Request with your changes
5. (Optional) Create a dedicated integration page in `docs/inference-providers/integrations/your-tool-name.md` with detailed setup instructions.
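
As an illustration, a new row in the overview table could look like the following (the tool name, description, and URLs are placeholders):

```
| [Your Tool](https://yourtool.com/) | One-line description of what your tool does | [Integration docs](https://yourtool.com/docs) |
```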

## Integration Page Template

Create a file named `your-tool-name.md` with this structure:

```
# Your Tool Name

Brief description of what your tool does.

## Overview

How your tool integrates with Hugging Face Inference Providers.

## Prerequisites

- Your tool installed
- HF account with [API token](https://huggingface.co/settings/tokens)

## Configuration

Step-by-step setup instructions with code examples.

## Resources

- [Your Tool Documentation](https://yourtool.com/docs)
- [HF Integration Guide](link-to-your-guide)
```
68 changes: 68 additions & 0 deletions docs/inference-providers/integrations/index.md
@@ -0,0 +1,68 @@
# Integrations Overview

Hugging Face Inference Providers works with a growing ecosystem of developer tools, frameworks, and platforms. These integrations let you use state-of-the-art models in your existing workflows and development environments.

If a tool doesn't list Inference Providers explicitly, it is often still compatible: Inference Providers exposes an OpenAI-compatible API, so any tool that lets you configure a custom OpenAI-compatible endpoint can usually connect to it. Check your tool's documentation for custom endpoint or base URL settings.
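
As a rough illustration, here is a minimal sketch of pointing an OpenAI-compatible client at the Inference Providers router (it assumes the `openai` Python package is installed, a Hugging Face token is exported as `HF_TOKEN`, and uses `openai/gpt-oss-20b` purely as an example model):

```python
import os

from openai import OpenAI

# Point a standard OpenAI-compatible client at the Inference Providers router.
client = OpenAI(
    base_url="https://router.huggingface.co/v1",  # OpenAI-compatible endpoint for Inference Providers
    api_key=os.environ["HF_TOKEN"],               # Hugging Face token with Inference Providers permission
)

completion = client.chat.completions.create(
    model="openai/gpt-oss-20b",  # example model ID; any model available via Inference Providers works
    messages=[{"role": "user", "content": "Say hello from Inference Providers."}],
)

print(completion.choices[0].message.content)
```

Many of the tools below use exactly this mechanism under the hood, even when they don't mention Hugging Face by name.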

## Why Use Integrations?

- **Keep your existing tools**: Use Inference Providers with tools you already know
- **Access dozens of providers**: Switch between providers without changing your code
- **Zero markup pricing**: Get the same rates as going direct to providers
- **Single API token**: One HF token for all providers and models

## Overview

This table lists _some_ tools, libraries, and applications that work with Hugging Face Inference Providers. For detailed setup instructions, follow the links in the Documentation column.

| Integration | Description | Documentation |
| ------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------- |
| [Continue](https://continue.dev/) | AI code assistant for VS Code and JetBrains | [Integration docs](https://docs.continue.dev/customize/model-providers/more/huggingfaceinferenceapi#hugging-face) |
| [Haystack](https://haystack.deepset.ai/) | Open-source LLM framework for building production applications | [Integration docs](https://docs.haystack.deepset.ai/docs/huggingfaceapichatgenerator) |
| [Inspect](https://inspect.aisi.org.uk/) | AI safety and evaluation framework | [Integration docs](https://inspect.aisi.org.uk/providers.html#hugging-face) |
| [LangChain](https://www.langchain.com/) | LLM application framework | [Integration docs](https://docs.langchain.com/oss/python/integrations/providers/huggingface#huggingfaceendpoint) |
| [LiteLLM](https://docs.litellm.ai/docs/providers/huggingface) | Unified interface for 100+ LLMs | [Integration docs](https://docs.litellm.ai/docs/providers/huggingface) |
| [LlamaIndex](https://developers.llamaindex.ai/python/examples/llm/huggingface/#use-a-model-via-inference-providers) | Data framework for LLM applications | [Integration docs](https://developers.llamaindex.ai/python/examples/llm/huggingface/#use-a-model-via-inference-providers) |
| [MacWhisper](https://goodsnooze.gumroad.com/l/macwhisper)                                                             | Speech-to-text application for macOS                            | [HF Guide](./macwhisper)                                                                                                    |
| [OpenCode](https://opencode.ai/)                                                                                      | AI coding agent built for the terminal                          | [Integration docs](https://opencode.ai/docs/providers#hugging-face) / [HF Guide](./open-code)                               |
| [Roo Code](https://docs.roocode.com/) | AI-powered code generation and refactoring | [Integration docs](https://docs.roocode.com/providers/huggingface) |

## Integrations by Category

### API Clients

Client libraries and gateways for simplified LLM access.

- [LiteLLM](https://docs.litellm.ai/docs/providers/huggingface) - Unified interface for calling 100+ LLMs with the same format

### Applications

End-user applications and interfaces powered by LLMs.

- [MacWhisper](https://goodsnooze.gumroad.com/l/macwhisper) - Speech-to-text application for macOS ([HF setup Guide](./macwhisper))

### Developer Tools

AI-powered coding assistants and development environments.

- [Continue](https://docs.continue.dev/customize/model-providers/more/huggingfaceinferenceapi#hugging-face) - AI code assistant for VS Code, JetBrains, and the terminal
- [OpenCode](https://opencode.ai/docs/providers#hugging-face) - AI coding agent built for the terminal
- [Roo Code](https://docs.roocode.com/providers/huggingface) - AI-powered code generation and refactoring

### Evaluation Frameworks

Tools for assessing and ensuring AI safety and performance.

- [Inspect](https://inspect.aisi.org.uk/providers.html#hugging-face) - AI safety and evaluation framework

### LLM Frameworks

LLM application frameworks and orchestration platforms.

- [Haystack](https://docs.haystack.deepset.ai/docs/huggingfaceapichatgenerator) - Open-source framework for building production-ready LLM applications
- [LangChain](https://docs.langchain.com/oss/python/integrations/providers/huggingface#huggingfaceendpoint) - Popular framework for developing LLM applications
- [LlamaIndex](https://developers.llamaindex.ai/python/examples/llm/huggingface/#use-a-model-via-inference-providers) - Data framework for connecting custom data to LLMs

<!-- ## Add Your Integration

Building something with Inference Providers? [Let us know](./adding-integration) and we'll add it to the list. -->
44 changes: 44 additions & 0 deletions docs/inference-providers/integrations/macwhisper.md
@@ -0,0 +1,44 @@
# MacWhisper

[MacWhisper](https://goodsnooze.gumroad.com/l/macwhisper) is a native macOS application that lets you transcribe audio using Whisper models locally on your Mac, with no technical setup required.

## Overview

MacWhisper includes AI services that can process your transcriptions—for example, to clean up dictation, summarize content, or translate to another language. You can configure these AI services to use Hugging Face Inference Providers, giving you access to a wide range of open models from multiple providers at zero markup pricing.

## Prerequisites

- MacWhisper installed
- A Hugging Face account with an [API token](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained) (needs "Make calls to Inference Providers" permission)

## Configuration

1. **Get a Hugging Face token**
Create a token at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained) with "Make calls to Inference Providers" permission.

2. **Open MacWhisper settings**
Go to **Settings → AI → Service**

3. **Select Hugging Face Inference Providers**
Choose "Hugging Face Inference Providers" from the service dropdown.

4. **Enter your API token**
Paste your Hugging Face token in the API token field.

5. **Choose a model**
Enter the model ID you want to use (e.g., `openai/gpt-oss-20b`).

That's it! MacWhisper will now use Hugging Face Inference Providers for AI processing of your transcriptions.

## Common Use Cases

- **Clean up dictation**: Remove filler words and improve grammar in voice-to-text
- **Summarization**: Generate summaries of meeting transcriptions or long audio files
- **Translation**: Translate transcriptions to other languages
- **Custom processing**: Apply custom prompts to transform transcriptions for your workflow

## Resources

- [MacWhisper product page](https://goodsnooze.gumroad.com/l/macwhisper)
- [Hugging Face Inference Providers documentation](https://huggingface.co/docs/inference-providers)
- [Available models](https://huggingface.co/models?pipeline_tag=text-generation&inference_provider=all&sort=trending)
66 changes: 66 additions & 0 deletions docs/inference-providers/integrations/open-code.md
@@ -0,0 +1,66 @@
# OpenCode

<!--
<div class="flex justify-center">
<a href="https://opencode.ai/" target="_blank">
<img class="block dark:hidden" src="https://opencode.ai/_build/assets/preview-opencode-wordmark-light-nzmKQT2r.png" alt="OpenCode">
<img class="hidden dark:block" src="https://opencode.ai/_build/assets/preview-opencode-wordmark-dark-tZ1Y3VXe.png" alt="OpenCode"/>
</a>
</div> -->

[OpenCode](https://opencode.ai/) is an AI coding agent built for the terminal that helps with code review, refactoring, testing, and general development tasks.

## Overview

OpenCode natively supports Hugging Face Inference Providers, giving you access to open models from 17+ providers through a single interface.

## Prerequisites

- OpenCode installed ([installation guide](https://opencode.ai/docs))
- A Hugging Face account with an [API token](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained) (needs "Make calls to Inference Providers" permission)

## Configuration

### Quick Setup

1. Create a Hugging Face token with the "Make calls to Inference Providers" permission at [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens/new?ownUserPermissions=inference.serverless.write&tokenType=fineGrained)

2. Run `opencode auth login` and select **Hugging Face**:

```bash
$ opencode auth login

┌ Add credential
◆ Select provider
│ ● Hugging Face
│ ...
```

3. Enter your Hugging Face token when prompted:

```bash
┌ Add credential
◇ Select provider
│ Hugging Face
◇ Enter your API key
│ hf_...
```

4. Run the `/models` command in OpenCode to select a model.

Once configured, OpenCode will use your selected model for all operations. You can switch models anytime using the `/models` command in the OpenCode TUI (Terminal User Interface).

## GitHub Actions Integration

You can also use OpenCode to run open models in GitHub Actions via Inference Providers. See our [GitHub Actions guide](../guides/github-actions-code-review) to set up automated PR reviews.

## Resources

- [OpenCode Documentation](https://opencode.ai/docs)
- [OpenCode Provider Configuration](https://opencode.ai/docs/providers/#hugging-face)
- [GitHub Actions Integration Guide](../guides/github-actions-code-review)