[BYOM]: Allow setting a custom max context size for custom models #41167
Labels
browser-ai
feature-request
good first issue
OS/Desktop
priority/P3
QA Pass-Win64
QA/Yes
release-notes/include
Platforms
Linux, macOS, Windows
Description
BYOM currently uses a fixed context size for all custom models. Make the context size editable in the model details cards.
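A minimal sketch of what a per-model setting could look like, assuming the custom model config is represented as a plain object in the settings UI layer. All names here (`CustomModel`, `maxContextSize`, `DEFAULT_CONTEXT_SIZE`, the bounds) are illustrative placeholders, not existing identifiers in the codebase:

```ts
// Hypothetical per-model max context size with validation and a fallback.
// Constants are placeholder values, not the current BYOM defaults.
const DEFAULT_CONTEXT_SIZE = 4096
const MIN_CONTEXT_SIZE = 1024
const MAX_CONTEXT_SIZE = 2_000_000

interface CustomModel {
  label: string
  modelRequestName: string
  endpoint: string
  apiKey?: string
  // New optional field: when unset, the existing fixed default applies.
  maxContextSize?: number
}

// Clamp the value entered in the model details card to a sane range,
// falling back to the default when the field is empty or invalid.
function resolveContextSize(model: CustomModel): number {
  const raw = model.maxContextSize
  if (raw === undefined || !Number.isFinite(raw) || raw <= 0) {
    return DEFAULT_CONTEXT_SIZE
  }
  return Math.min(Math.max(Math.trunc(raw), MIN_CONTEXT_SIZE), MAX_CONTEXT_SIZE)
}

// Example: a local model whose server supports a larger context window.
const model: CustomModel = {
  label: 'Local Llama',
  modelRequestName: 'llama3',
  endpoint: 'http://localhost:11434/v1/chat/completions',
  maxContextSize: 8192,
}
console.log(resolveContextSize(model)) // 8192
```

Keeping the field optional would let existing custom models keep today's behavior, with the new input only overriding the default when a user explicitly sets it.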