Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

feat(plugins): ai-prompt-template plugin #12340

Merged
merged 4 commits into from
Jan 24, 2024
Merged

Conversation

tysoekong
Copy link
Contributor

Summary

This plugin adds the ability to provide tuned AI prompts to users, who only need to fill in the blanks with variable placeholders (in moustache format: {{variable}} )

It is to be used in conjunction with ai-proxy plugin, as with other AI Family plugins.

AI-Proxy plugin

This allows prompt tuning to be started, or performed, by the experience LLM users in an organisation, and then consumed by anyone.

When activated, it looks for template references in the following forms:

Chat:

{
	"messages": "{template://developer-chat}",
	"properties": {
		"language": "python",
		"program": "flask web server"
	}
}

Prompt:

{
	"prompt": "{template://developer-prompt}",
	"properties": {
		"language": "python",
		"program": "flask web server"
	}
}

Based on this gateway configuration:

_format_version: "3.0"

services:
- name: default
  host: localhost
  path: "/"
  port: 9900
  protocol: https
  routes:
  - name: openai-chat
    paths:
    - "~/openai/chat$"
    methods:
    - POST
    plugins:
    - name: ai-proxy
      config:
        route_type: "llm/v1/chat"
        auth:
          header_name: "Authorization"
          header_value: "Bearer REDACT"
        logging:
          log_statistics: true
          log_payloads: false
        model:
          provider: "openai"
          name: "gpt-3.5-turbo"
          options:
            max_tokens: 512
            temperature: 1.0
    plugins:
    - name: ai-prompt-template
      config:
        allow_untemplated_requests: true
        templates:
        - name: "developer-chat"
          template:  |-
            {
              "messages": [
                {
                  "role": "system",
                  "content": "You are a {{program}} expert, in {{language}} programming language."
                },
                {
                  "role": "user",
                  "content": "Write me a {{program}} program."
                }
              ]
            }
        - name: "developer-prompt"
          template:  |-
            {
              "prompt": "You are a {{language}} programming language expert. Write me a {{program}} program."
            }

It has been designed to work with other formats as soon as they are supported.

It also sanitises string inputs to ensure that json control characters are escaped, preventing arbitrary prompt injection.

Checklist

  • The Pull Request has tests
  • A changelog file has been created under changelog/unreleased/kong or skip-changelog label added on PR if changelog is unnecessary. README.md
  • There is a user-facing docs PR against https://github.com/Kong/docs.konghq.com - docs PR for all four "AI Family" pluigins is still being worked on separately.

Issue reference

Internal project.

@github-actions github-actions bot added chore Not part of the core functionality of kong, but still needed schema-change-noteworthy labels Jan 12, 2024
@ttyS0e ttyS0e mentioned this pull request Jan 12, 2024
3 tasks
@tysoekong tysoekong force-pushed the feat/ai_prompt_template_plugin branch from 54c6c44 to 635dad9 Compare January 12, 2024 12:02
@RobSerafini RobSerafini requested a review from a team January 17, 2024 16:42
@flrgh
Copy link
Contributor

flrgh commented Jan 18, 2024

An http client is required to have knowledge of all of the configured templates and their respective parameters in order to make a request. What convenience is provided by having Kong store and render the templates rather than the http client?

kong/plugins/ai-prompt-template/handler.lua Outdated Show resolved Hide resolved
kong/plugins/ai-prompt-template/handler.lua Outdated Show resolved Hide resolved
kong/plugins/ai-prompt-template/handler.lua Show resolved Hide resolved
kong/plugins/ai-prompt-template/handler.lua Outdated Show resolved Hide resolved
@flrgh flrgh added the cherry-pick kong-ee schedule this PR for cherry-picking to kong/kong-ee label Jan 19, 2024
@flrgh flrgh added this to the 3.6.0 milestone Jan 19, 2024
@tysoekong tysoekong force-pushed the feat/ai_prompt_template_plugin branch 3 times, most recently from 154afd2 to 81db037 Compare January 23, 2024 09:56
@tysoekong
Copy link
Contributor Author

An http client is required to have knowledge of all of the configured templates and their respective parameters in order to make a request. What convenience is provided by having Kong store and render the templates rather than the http client?

@flrgh It is designed this way, not so much for restricting exactly what human-callers can do, but so our Kong admins can turn a complex LLM backend into a simple API for consumption from something else.

@tysoekong tysoekong force-pushed the feat/ai_prompt_template_plugin branch 2 times, most recently from 33a6ef0 to 755f89f Compare January 24, 2024 14:47
@tysoekong tysoekong force-pushed the feat/ai_prompt_template_plugin branch from 755f89f to 746c22b Compare January 24, 2024 14:49
@flrgh flrgh merged commit d289c8c into master Jan 24, 2024
23 checks passed
@flrgh flrgh deleted the feat/ai_prompt_template_plugin branch January 24, 2024 16:59
@team-gateway-bot
Copy link
Collaborator

Cherry-pick failed for master, because it was unable to cherry-pick the commit(s).

Please cherry-pick the changes locally.

git remote add upstream https://github.com/kong/kong-ee
git fetch upstream master
git worktree add -d .worktree/cherry-pick-12340-to-master-to-upstream upstream/master
cd .worktree/cherry-pick-12340-to-master-to-upstream
git checkout -b cherry-pick-12340-to-master-to-upstream
ancref=$(git merge-base 14cc90fbe83b6fb04e4ef832519bc10204f0d4bc 746c22b6316def683a6e58fb6f0453ef2a0c48ec)
git cherry-pick -x $ancref..746c22b6316def683a6e58fb6f0453ef2a0c48ec

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
cherry-pick kong-ee schedule this PR for cherry-picking to kong/kong-ee chore Not part of the core functionality of kong, but still needed schema-change-noteworthy size/XL
Projects
None yet
Development

Successfully merging this pull request may close these issues.

6 participants