Commit ef16068: fix doc

shreemaan-abhishek committed Sep 23, 2024
1 parent 1f4528d commit ef16068
Showing 1 changed file with 24 additions and 3 deletions.
27 changes: 24 additions & 3 deletions docs/en/latest/plugins/ai-content-moderation.md
@@ -31,6 +31,8 @@

The `ai-content-moderation` plugin processes the request body to check for toxicity and rejects the request if the toxicity score exceeds the configured threshold.

**_This plugin should only be used on routes that proxy requests to LLMs._**

## Plugin Attributes

| **Field** | **Required** | **Type** | **Description** |
@@ -41,10 +43,11 @@
| provider.aws_comprehend.endpoint | No | String | AWS Comprehend service endpoint. Must match the pattern `^https?://` |
| moderation_categories | No | Object | Key-value pairs mapping a moderation category to its threshold. Keys must be one of: PROFANITY, HATE_SPEECH, INSULT, HARASSMENT_OR_ABUSE, SEXUAL, VIOLENCE_OR_THREAT |
| toxicity_level | No | Number | Threshold for overall toxicity detection. Range: 0 - 1. Default: 0.5 |
| llm_provider | Yes | String | Name of the LLM provider that this route will proxy requests to. |
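
For reference, a minimal sketch of a plugin configuration that combines per-category thresholds with the overall `toxicity_level`; the credential values and threshold numbers below are illustrative placeholders:

```json
{
  "provider": {
    "aws_comprehend": {
      "access_key_id": "some_access_key",
      "secret_access_key": "some_secret_key",
      "region": "us-east-1"
    }
  },
  "moderation_categories": {
    "PROFANITY": 0.5,
    "VIOLENCE_OR_THREAT": 0.7
  },
  "toxicity_level": 0.6,
  "llm_provider": "openai"
}
```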

## Example usage

Create a route with the `ai-content-moderation` and `ai-proxy` plugins like so:

```shell
curl "http://127.0.0.1:9180/apisix/admin/routes/1" -X PUT \
@@ -55,13 +58,29 @@
"ai-content-moderation": {
"provider": {
"aws_comprehend": {
"access_key_id": "access",
"secret_access_key": "ea+secret",
"access_key_id": "some_access_key",
"secret_access_key": "some_secret_key",
"region": "us-east-1"
}
},
"moderation_categories": {
"PROFANITY": 0.5
},
"llm_provider": "openai"
},
"ai-proxy": {
"auth": {
"header": {
"Authorization": "Bearer token"
}
},
"model": {
"provider": "openai",
"name": "gpt-4",
"options": {
"max_tokens": 512,
"temperature": 1.0
}
}
}
},
@@ -74,6 +93,8 @@
}'
```

The `ai-proxy` plugin is used here because it simplifies access to LLMs. Alternatively, you can point the route's upstream directly at the LLM server, as sketched below.
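
For illustration, a sketch of the upstream-based alternative; the route ID, URI, `X-API-KEY` admin header, and OpenAI hostname below are placeholder assumptions, not part of the original example:

```shell
curl "http://127.0.0.1:9180/apisix/admin/routes/2" -X PUT \
  -H "X-API-KEY: ${ADMIN_API_KEY}" \
  -d '{
    "uri": "/v1/chat/completions",
    "plugins": {
      "ai-content-moderation": {
        "provider": {
          "aws_comprehend": {
            "access_key_id": "some_access_key",
            "secret_access_key": "some_secret_key",
            "region": "us-east-1"
          }
        },
        "llm_provider": "openai"
      }
    },
    "upstream": {
      "type": "roundrobin",
      "scheme": "https",
      "pass_host": "node",
      "nodes": {
        "api.openai.com:443": 1
      }
    }
  }'
```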

Now send a request:

```shell
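# A sketch of a test request. Assumptions: the route above matches the
# URI "/post", and the body uses the OpenAI-style chat format that
# ai-proxy expects; adjust both to your actual route configuration.
curl "http://127.0.0.1:9080/post" -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      { "role": "user", "content": "some potentially profane text" }
    ]
  }'
```

If the content scores above the configured PROFANITY threshold, the plugin rejects the request; otherwise the request is forwarded to the LLM.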
