This repository has been archived by the owner on Sep 16, 2024. It is now read-only.

max tokens is always 3071 #246

Closed
Dual-0 opened this issue Nov 6, 2023 · 3 comments · Fixed by #248

Comments

@Dual-0

Dual-0 commented Nov 6, 2023

Hello,

No matter which model I use, e.g. gpt-4-32k-0613 or the new gpt-4-1106-preview with its 128k context window, I always get:

```
Mon, 06 Nov 2023 21:44:24 GMT [ERROR] [OpenAI-API Error: Error: Prompt is too long. Max token count is 3071, but prompt is 3539 tokens long.]
TypeError: Cannot read properties of undefined (reading 'response')
     at CommandHandler.onMessage (file:///opt/matrix_gpt_bot/3.1.4/dist/handlers.js:132:86)
     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
```

thanks in advance

@jame25

jame25 commented Nov 6, 2023

gpt-4-1106-preview is working perfectly for me with v3.1.4; I did, however, also update npm to the latest version.

@Dual-0
Author

Dual-0 commented Nov 6, 2023

I think we need the options maxContextTokens and maxPromptTokens from use-client.js in clientOptions. My test went well with:

```
maxContextTokens: 128000,
maxPromptTokens: 32000,
```
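For context, a minimal sketch of a clientOptions object carrying those two settings. The option names match the @waylaidwanderer/chatgpt-api client, but the surrounding shape and the `responseBudget` calculation are illustrative, not the bot's actual code:

```javascript
// Illustrative sketch: token limits passed to the ChatGPT client via clientOptions.
const clientOptions = {
  modelOptions: {
    model: 'gpt-4-1106-preview',
  },
  maxContextTokens: 128000, // full context window of gpt-4-1106-preview
  maxPromptTokens: 32000,   // cap on the prompt portion of each request
};

// Whatever the prompt does not use is left over for the model's reply.
const responseBudget = clientOptions.maxContextTokens - clientOptions.maxPromptTokens;
console.log(responseBudget); // 96000
```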

> gpt-4-1106-preview is working perfectly for me with v3.1.4, I did however also update npm to latest version.

Have you made an input with over 3071 tokens?

@Dual-0
Author

Dual-0 commented Nov 7, 2023

Thanks @max298

I raised maxContextTokens to 128000. I don't use maxPromptTokens at the moment, so it falls back to the default value. Works well so far. Any advice for maxPromptTokens?

bertybuttface added a commit that referenced this issue Nov 8, 2023
add option to increase token size; closes #246
spantaleev added a commit to spantaleev/matrix-docker-ansible-deploy that referenced this issue Feb 22, 2024
The new version is very broken. It has at least 2 issues.

The first one is:

```
Error: maxPromptTokens + max_tokens (3097 + 1024 = 4121) must be less than or equal to maxContextTokens (4097)
    at ChatGPTClient.setOptions (file:///usr/src/app/node_modules/@waylaidwanderer/chatgpt-api/src/ChatGPTClient.js:72:19)
    at new ChatGPTClient (file:///usr/src/app/node_modules/@waylaidwanderer/chatgpt-api/src/ChatGPTClient.js:23:14)
    at main (file:///usr/src/app/dist/index.js:62:21)
    at file:///usr/src/app/dist/index.js:94:1
    at ModuleJob.run (node:internal/modules/esm/module_job:218:25)
    at async ModuleLoader.import (node:internal/modules/esm/loader:329:24)
    at async loadESM (node:internal/process/esm_loader:28:7)
    at async handleMainPromise (node:internal/modules/run_main:113:12)
```
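The invariant the library enforces here is simply that the prompt budget plus the requested completion size must fit in the context window. A re-creation of that check (a sketch under that assumption, not the library's actual code):

```javascript
// Sketch of the budget check that produces the error above:
// maxPromptTokens + max_tokens must not exceed maxContextTokens.
function checkTokenBudget(maxPromptTokens, maxTokens, maxContextTokens) {
  if (maxPromptTokens + maxTokens > maxContextTokens) {
    throw new Error(
      `maxPromptTokens + max_tokens (${maxPromptTokens} + ${maxTokens} = ` +
      `${maxPromptTokens + maxTokens}) must be less than or equal to ` +
      `maxContextTokens (${maxContextTokens})`
    );
  }
}

// The failing defaults from the log: 3097 + 1024 = 4121 > 4097, so this throws.
try {
  checkTokenBudget(3097, 1024, 4097);
} catch (e) {
  console.log(e.message);
}
```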

Likely related to:

- matrixgpt/matrix-chatgpt-bot#246
- matrixgpt/matrix-chatgpt-bot#248

It can be worked around by overriding some default environment
variables (`roles/custom/matrix-bot-chatgpt/templates/env.j2`):

```
CHATGPT_MAX_CONTEXT_TOKENS=4097
CHATGPT_MAX_PROMPT_TOKENS=2500
```
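A quick sanity check that these overrides satisfy the constraint from the error above, assuming the default max_tokens of 1024 is left unchanged:

```javascript
// Verify the overridden values fit the context window (assumes max_tokens stays at 1024).
const maxContextTokens = 4097; // CHATGPT_MAX_CONTEXT_TOKENS
const maxPromptTokens = 2500;  // CHATGPT_MAX_PROMPT_TOKENS
const maxTokens = 1024;        // default completion budget from the error above

// 2500 + 1024 = 3524, which is within the 4097-token context window.
console.log(maxPromptTokens + maxTokens <= maxContextTokens); // true
```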

This leads us to another issue:

```
node:internal/process/promises:289
            triggerUncaughtException(err, true /* fromPromise */);
            ^
[Error: Failed to deserialize or serialize a JSON value missing field `version` at line 1 column 6704] {
  code: 'GenericFailure'
}
Node.js v20.11.1
error Command failed with exit code 1.
```

... whatever that means.
KarolosLykos pushed a commit to KarolosLykos/matrix-docker-ansible-deploy that referenced this issue Mar 5, 2024
ignyx pushed a commit to Tawkie/matrix-docker-ansible-deploy that referenced this issue Jun 20, 2024