Refine system message settings #14877

Merged (7 commits, Feb 19, 2025)
20 changes: 11 additions & 9 deletions packages/ai-openai/README.md
@@ -23,12 +23,12 @@ You can configure the end points via the `ai-features.openAiCustom.customOpenAiM

```ts
{
model: string
url: string
id?: string
apiKey?: string | true
apiVersion?: string | true
supportsDeveloperMessage?: boolean
model: string,
url: string,
id?: string,
apiKey?: string | true,
apiVersion?: string | true,
developerMessageSettings?: 'user' | 'system' | 'developer' | 'mergeWithFollowingUserMessage' | 'skip',
enableStreaming?: boolean
}
```
@@ -37,7 +37,9 @@ You can configure the end points via the `ai-features.openAiCustom.customOpenAiM
- `id` is an optional attribute which is used in the UI to refer to this configuration
- `apiKey` is either the key to access the API served at the given URL or `true` to use the global OpenAI API key. If not given 'no-key' will be used.
- `apiVersion` is either the api version to access the API served at the given URL in Azure or `true` to use the global OpenAI API version.
- `supportsDeveloperMessage` is a flag that indicates whether the model supports the `developer` role or not. `true` by default.
- `developerMessageSettings` controls how system messages are handled: `user`, `system`, and `developer` are used directly as the message role, `mergeWithFollowingUserMessage` prefixes the
following user message with the system message (or converts the system message to a user message if the next message is not a user message), and `skip` removes the system message entirely.
Defaults to `developer`.
- `enableStreaming` is a flag that indicates whether the streaming API shall be used or not. `true` by default.

### Azure OpenAI
@@ -49,7 +51,7 @@ Requests to an OpenAI model hosted on Azure need an `apiVersion`. To configure a
Note that if you don't configure an `apiVersion`, the default `OpenAI` object is used for initialization and a connection to an Azure hosted OpenAI model will fail.

An OpenAI model version deployed on Azure might not support the `developer` role. In that case it is possible to configure whether the `developer` role is supported or not via the
`supportsDeveloperMessage` option, which defaults to `true`.
`developerMessageSettings` option, e.g. setting it to `system` or `user`.

The following snippet shows a possible configuration to access an OpenAI model hosted on Azure. The `AZURE_OPENAI_API_BASE_URL` needs to be given without the `/chat/completions`
path and without the `api-version` parameter, e.g. _`https://<my_prefix>.openai.azure.com/openai/deployments/<my_deployment>`_
@@ -64,7 +66,7 @@ path and without the `api-version` parameter, e.g. _`https://<my_prefix>.openai.
"id": "azure-deployment",
"apiKey": "<AZURE_OPENAI_API_KEY>",
"apiVersion": "<AZURE_OPENAI_API_VERSION>",
"supportsDeveloperMessage": false
"developerMessageSettings": "system"
}
],
"ai-features.agentSettings": {
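To make the README changes above more concrete, here is a minimal sketch of two custom model entries that use the new `developerMessageSettings` option, written as a TypeScript literal matching the interface shown in the README diff. All model names, URLs, ids, and keys below are placeholders chosen for illustration and are not part of this PR.

```ts
// Hypothetical entries for the custom OpenAI models setting described above.
// Everything here is a placeholder; only the attribute names follow the documented interface.
const customOpenAiModels = [
    {
        model: 'my-vllm-model',                 // required: model name served by the endpoint
        url: 'http://localhost:8000/v1',        // required: OpenAI-compatible endpoint
        id: 'local-vllm',                       // optional: identifier shown in the UI
        developerMessageSettings: 'system',     // send the system message with the `system` role
        enableStreaming: true
    },
    {
        model: 'my-reasoning-model',
        url: 'https://example.com/v1',
        apiKey: true,                           // reuse the global OpenAI API key
        // for an endpoint that accepts neither the `system` nor the `developer` role,
        // merge the system message into the following user message instead
        developerMessageSettings: 'mergeWithFollowingUserMessage'
    }
];
```

In a real setup these entries go into the custom OpenAI models preference shown at the top of the README diff; the choice between `system`, `user`, and `mergeWithFollowingUserMessage` depends on which roles the target endpoint accepts.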
@@ -92,7 +92,7 @@ export class OpenAiFrontendApplicationContribution implements FrontendApplicatio
model.url === newModel.url &&
model.apiKey === newModel.apiKey &&
model.apiVersion === newModel.apiVersion &&
model.supportsDeveloperMessage === newModel.supportsDeveloperMessage &&
model.developerMessageSettings === newModel.developerMessageSettings &&
model.supportsStructuredOutput === newModel.supportsStructuredOutput &&
model.enableStreaming === newModel.enableStreaming));

@@ -117,7 +117,7 @@ export class OpenAiFrontendApplicationContribution implements FrontendApplicatio
model: modelId,
apiKey: true,
apiVersion: true,
supportsDeveloperMessage: !openAIModelsNotSupportingDeveloperMessages.includes(modelId),
developerMessageSettings: openAIModelsNotSupportingDeveloperMessages.includes(modelId) ? 'user' : 'developer',
enableStreaming: !openAIModelsWithDisabledStreaming.includes(modelId),
supportsStructuredOutput: !openAIModelsWithoutStructuredOutput.includes(modelId),
defaultRequestSettings: modelRequestSetting?.requestSettings
@@ -143,7 +143,7 @@ export class OpenAiFrontendApplicationContribution implements FrontendApplicatio
url: pref.url,
apiKey: typeof pref.apiKey === 'string' || pref.apiKey === true ? pref.apiKey : undefined,
apiVersion: typeof pref.apiVersion === 'string' || pref.apiVersion === true ? pref.apiVersion : undefined,
supportsDeveloperMessage: pref.supportsDeveloperMessage ?? true,
developerMessageSettings: pref.developerMessageSettings ?? 'developer',
supportsStructuredOutput: pref.supportsStructuredOutput ?? true,
enableStreaming: pref.enableStreaming ?? true,
defaultRequestSettings: modelRequestSetting?.requestSettings
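The hunks above determine the effective `developerMessageSettings` in two different ways: well-known OpenAI models consult a hard-coded list of models that do not accept developer messages and fall back to the `user` role, while user-configured models simply default to `developer` when the preference is unset. The following is a condensed sketch of that resolution; the list content is an assumption made here purely for illustration, the real list lives elsewhere in the Theia sources.

```ts
type DeveloperMessageSettings = 'user' | 'system' | 'developer' | 'mergeWithFollowingUserMessage' | 'skip';

// Assumed example content; the actual list is maintained in the Theia code base.
const openAIModelsNotSupportingDeveloperMessages = ['o1-preview'];

function resolveDeveloperMessageSettings(
    modelId: string,
    configured?: DeveloperMessageSettings
): DeveloperMessageSettings {
    // an explicit preference value always wins
    if (configured) {
        return configured;
    }
    // known models without developer-message support fall back to the plain `user` role,
    // everything else defaults to `developer`
    return openAIModelsNotSupportingDeveloperMessages.includes(modelId) ? 'user' : 'developer';
}
```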
49 changes: 28 additions & 21 deletions packages/ai-openai/src/browser/openai-preferences.ts
@@ -45,23 +45,26 @@ on the machine running Theia. Use the environment variable `OPENAI_API_KEY` to s
type: 'array',
title: AI_CORE_PREFERENCES_TITLE,
markdownDescription: nls.localize('theia/ai/openai/customEndpoints/mdDescription',
'Integrate custom models compatible with the OpenAI API, for example via `vllm`. The required attributes are `model` and `url`.\n\
\n\
Optionally, you can\
\n\
- specify a unique `id` to identify the custom model in the UI. If none is given `model` will be used as `id`.\
\n\
- provide an `apiKey` to access the API served at the given url. Use `true` to indicate the use of the global OpenAI API key.\
\n\
- provide an `apiVersion` to access the API served at the given url in Azure. Use `true` to indicate the use of the global OpenAI API version.\
\n\
- specify `supportsDeveloperMessage: false` to indicate that the developer role shall not be used.\
\n\
- specify `supportsStructuredOutput: false` to indicate that structured output shall not be used.\
\n\
- specify `enableStreaming: false` to indicate that streaming shall not be used.\n\
\n\
Refer to [our documentation](https://theia-ide.org/docs/user_ai/#openai-compatible-models-eg-via-vllm) for more information.'),
'Integrate custom models compatible with the OpenAI API, for example via `vllm`. The required attributes are `model` and `url`.\
\n\
Optionally, you can\
\n\
- specify a unique `id` to identify the custom model in the UI. If none is given `model` will be used as `id`.\
\n\
- provide an `apiKey` to access the API served at the given url. Use `true` to indicate the use of the global OpenAI API key.\
\n\
- provide an `apiVersion` to access the API served at the given url in Azure. Use `true` to indicate the use of the global OpenAI API version.\
\n\
- set `developerMessageSettings` to one of `user`, `system`, `developer`, `mergeWithFollowingUserMessage`, or `skip` to control how the developer message is\
included (`user`, `system`, and `developer` are used directly as the message role, `mergeWithFollowingUserMessage` prefixes the following user message with the system\
message or converts the system message to a user message if the next message is not a user message, and `skip` removes the system message entirely).\
Defaults to `developer`.\
\n\
- specify `supportsStructuredOutput: false` to indicate that structured output shall not be used.\
\n\
- specify `enableStreaming: false` to indicate that streaming shall not be used.\
\n\
Refer to [our documentation](https://theia-ide.org/docs/user_ai/#openai-compatible-models-eg-via-vllm) for more information.'),
default: [],
items: {
type: 'object',
@@ -88,10 +91,14 @@ Refer to [our documentation](https://theia-ide.org/docs/user_ai/#openai-compatib
title: nls.localize('theia/ai/openai/customEndpoints/apiVersion/title',
'Either the version to access the API served at the given url in Azure or `true` to use the global OpenAI API version'),
},
supportsDeveloperMessage: {
type: 'boolean',
title: nls.localize('theia/ai/openai/customEndpoints/supportsDevMessage/title',
'Indicates whether the model supports the `developer` role. `true` by default.'),
developerMessageSettings: {
type: 'string',
enum: ['user', 'system', 'developer', 'mergeWithFollowingUserMessage', 'skip'],
default: 'developer',
title: nls.localize('theia/ai/openai/customEndpoints/developerMessageSettings/title',
'Controls the handling of system messages: `user`, `system`, and `developer` are used directly as the message role, `mergeWithFollowingUserMessage` prefixes\
the following user message with the system message or converts the system message to a user message if the next message is not a user message, and\
`skip` removes the system message entirely. Defaults to `developer`.')
},
supportsStructuredOutput: {
type: 'boolean',
@@ -15,6 +15,7 @@
// *****************************************************************************
export const OPENAI_LANGUAGE_MODELS_MANAGER_PATH = '/services/open-ai/language-model-manager';
export const OpenAiLanguageModelsManager = Symbol('OpenAiLanguageModelsManager');

export interface OpenAiModelDescription {
/**
* The identifier of the model which will be shown in the UI.
@@ -41,9 +42,12 @@ export interface OpenAiModelDescription {
*/
enableStreaming: boolean;
/**
* Flag to configure whether the OpenAPI model supports the `developer` role. Default is `true`.
     * Controls how the system message is handled for this model. Setting this property to 'user', 'system', or 'developer' uses that string as the role of the system message.
     * Setting it to 'mergeWithFollowingUserMessage' prefixes the following user message with the system message, or converts the system message to a user message if the
     * following message is not a user message. 'skip' removes the system message altogether.
     * Defaults to 'developer'.
*/
supportsDeveloperMessage: boolean;
developerMessageSettings?: 'user' | 'system' | 'developer' | 'mergeWithFollowingUserMessage' | 'skip';
/**
* Flag to configure whether the OpenAPI model supports structured output. Default is `true`.
*/
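Since the same behavior is described in the doc comment above, in the README, and in the preference description, here is a minimal, self-contained sketch of how a request's system message could be rewritten for each `developerMessageSettings` value. The message shape and function are illustrative only; they are not Theia's actual request types nor necessarily the helper this PR uses, and the separator used when merging is an assumption.

```ts
// Illustrative only: a simplified message shape, not Theia's actual request types.
interface ChatMessage {
    role: 'system' | 'user' | 'assistant' | 'developer';
    content: string;
}

type DeveloperMessageSettings = 'user' | 'system' | 'developer' | 'mergeWithFollowingUserMessage' | 'skip';

function applyDeveloperMessageSettings(messages: ChatMessage[], settings: DeveloperMessageSettings): ChatMessage[] {
    const systemIndex = messages.findIndex(m => m.role === 'system');
    if (systemIndex < 0) {
        return messages; // nothing to do without a system message
    }
    const system = messages[systemIndex];
    const rest = messages.filter((_, i) => i !== systemIndex);

    switch (settings) {
        case 'skip':
            // remove the system message entirely
            return rest;
        case 'mergeWithFollowingUserMessage': {
            const next = rest[systemIndex]; // the message that followed the system message
            if (next && next.role === 'user') {
                // prefix the following user message with the system message
                return rest.map((m, i) => i === systemIndex ? { ...m, content: `${system.content}\n${m.content}` } : m);
            }
            // no user message follows: convert the system message to a user message in place
            return messages.map((m, i) => i === systemIndex ? { role: 'user' as const, content: m.content } : m);
        }
        default:
            // 'user' | 'system' | 'developer': use the configured value as the role
            return messages.map((m, i) => i === systemIndex ? { ...m, role: settings } : m);
    }
}
```

Whether the real implementation merges with a newline, trims content, or handles multiple system messages is up to the PR's implementation; the sketch only mirrors the documented behavior.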
2 changes: 2 additions & 0 deletions packages/ai-openai/src/node/openai-backend-module.ts
@@ -19,6 +19,7 @@ import { OPENAI_LANGUAGE_MODELS_MANAGER_PATH, OpenAiLanguageModelsManager } from
import { ConnectionHandler, RpcConnectionHandler } from '@theia/core';
import { OpenAiLanguageModelsManagerImpl } from './openai-language-models-manager-impl';
import { ConnectionContainerModule } from '@theia/core/lib/node/messaging/connection-container-module';
import { OpenAiModelUtils } from './openai-language-model';

export const OpenAiModelFactory = Symbol('OpenAiModelFactory');

@@ -32,5 +33,6 @@ const openAiConnectionModule = ConnectionContainerModule.create(({ bind, bindBac
});

export default new ContainerModule(bind => {
bind(OpenAiModelUtils).toSelf().inSingletonScope();
bind(ConnectionContainerModule).toConstantValue(openAiConnectionModule);
});