Releases: jupyterlab/jupyter-ai
v2.28.4
🎁 Merry Christmas and happy holidays to all! We have worked with Santa to bring you some enhancements & fixes for Jupyter AI. Notably, some embedding models now support a configurable base URL, and the reliability of `/generate` has been significantly improved. Thank you @srdas for contributing these changes!
Note to contributors: This is planned to be the last v2 release from the `main` branch. After the first v3 pre-release, `main` will track Jupyter AI v3, while Jupyter AI v2 will continue to be maintained from the `2.x` branch.
Bugs fixed
- Update `/generate` to not split classes & functions across cells #1158 (@srdas)
- Fix code output format in IPython #1155 (@divyansshhh)
Documentation improvements
- Improve user messaging and documentation for Cross-Region Inference on Amazon Bedrock #1134 (@srdas)
Contributors to this release
(GitHub contributors page for this release)
@divyansshhh | @dlqqq | @krassowski | @mlucool | @srdas | @Zsailer
v2.28.3
This release notably fixes a major bug where updated model fields were not used until after a server restart, and fixes a bug with Ollama in the chat. Thank you for your patience as we continue to improve Jupyter AI! 🤗
Bugs fixed
- Fix install step in CI #1139 (@dlqqq)
- Update completion model fields immediately on save #1137 (@dlqqq)
- Fix JSON serialization error in Ollama models #1129 (@JanusChoi)
- Update model fields immediately on save #1125 (@dlqqq)
- Downgrade spurious 'error' logs #1119 (@ctcjab)
Contributors to this release
(GitHub contributors page for this release)
@ctcjab | @dlqqq | @JanusChoi | @krassowski | @pre-commit-ci | @srdas
v2.28.2
v2.28.1
v2.28.0
Release summary
This release notably includes the following changes:
- Models from the `Anthropic` and `ChatAnthropic` providers are now merged in the config UI, so all Anthropic models are shown in the same place in the "Language model" dropdown.
- Anthropic Claude v1 LLMs have been removed, as the models are retired and no longer available from the API.
- The chat system prompt has been updated to encourage the LLM to express dollar quantities in LaTeX, i.e. the LLM should prefer returning `\(\$100\)` instead of `$100`. For the latest LLMs, this generally fixes a rendering issue when multiple dollar quantities are given literally in the same sentence.
  - Note that the issue may still persist in older LLMs, which do not respect the system prompt as frequently.
- `/export` has been fixed to include streamed replies, which were previously omitted.
- Calling non-chat providers with history has been fixed to behave properly in magics.
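To see why two literal dollar amounts in one sentence can break rendering, here is a minimal sketch (assuming, as is common, that the renderer treats `$...$` as inline math delimiters; the text and regex are illustrative, not Jupyter AI's actual parser):

```python
import re

# A naive inline-math scanner pairs the first "$" with the next "$",
# so the text *between* two literal dollar amounts is parsed as math.
text = "Revenue grew from $100 to $250 this year."
math_spans = re.findall(r"\$(.+?)\$", text)
print(math_spans)  # → ['100 to ']
```

Writing the amounts as `\(\$100\)` sidesteps this, since no bare `$` is left for the scanner to pair.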
Enhancements made
- Remove retired models and add new `Haiku-3.5` model in Anthropic #1092 (@srdas)
- Reduced padding in cell around code icons in code toolbar #1072 (@srdas)
- Merge Anthropic language model providers #1069 (@srdas)
- Add examples of using Fields and EnvAuthStrategy to developer documentation #1056 (@alanmeeson)
Bugs fixed
- Continue to allow `$` symbols to delimit inline math in human messages #1094 (@dlqqq)
- Fix `/export` by including streamed agent messages #1077 (@mcavdar)
- Fix magic commands when using non-chat providers w/ history #1075 (@alanmeeson)
- Allow `$` to literally denote quantities of USD in chat #1068 (@dlqqq)
Documentation improvements
- Improve installation documentation and clarify provider dependencies #1087 (@srdas)
- Added Ollama to the providers table in user docs #1064 (@srdas)
Contributors to this release
(GitHub contributors page for this release)
@alanmeeson | @dlqqq | @krassowski | @mcavdar | @srdas
v2.27.0
v2.26.0
This release notably includes the addition of a "Stop streaming" button, which takes over the "Send" button when a reply is streaming and the chat input is empty. While Jupyternaut is streaming a reply to a user, the user has the option to click the "Stop streaming" button to interrupt Jupyternaut and stop it from streaming further. Thank you @krassowski for contributing this feature! 🎉
Enhancements made
- Support Quarto Markdown in `/learn` #1047 (@dlqqq)
- Update requirements contributors doc #1045 (@JasonWeill)
- Remove clear_message_ids from RootChatHandler #1042 (@michaelchia)
- Migrate streaming logic to `BaseChatHandler` #1039 (@dlqqq)
- Unify message clearing & broadcast logic #1038 (@dlqqq)
- Learn from JSON files #1024 (@jlsajfj)
- Allow users to stop message streaming #1022 (@krassowski)
Bugs fixed
- Always use `username` from `IdentityProvider` #1034 (@krassowski)
Maintenance and upkeep improvements
- Support `jupyter-collaboration` v3 #1035 (@krassowski)
- Test Python 3.9 and 3.12 on CI, test minimum dependencies #1029 (@krassowski)
Documentation improvements
- Update requirements contributors doc #1045 (@JasonWeill)
Contributors to this release
(GitHub contributors page for this release)
@dlqqq | @JasonWeill | @jlsajfj | @krassowski | @michaelchia | @pre-commit-ci
v2.25.0
Enhancements made
- Export context hooks from NPM package entry point #1020 (@dlqqq)
- Add support for optional telemetry plugin #1018 (@dlqqq)
- Add back history and reset subcommand in magics #997 (@akaihola)
Contributors to this release
(GitHub contributors page for this release)
@akaihola | @dlqqq | @jtpio | @pre-commit-ci
v2.24.1
Enhancements made
- Make path argument required on `/learn` #1012 (@andrewfulton9)
v2.24.0
This release notably introduces a new context command `@file:<file-path>` to the chat UI, which includes the content of the target file with your prompt when sent. This allows you to ask questions like:
What does @file:src/components/ActionButton.tsx do?
Can you refactor @file:src/index.ts to use async/await syntax?
How do I add an optional dependency to @file:pyproject.toml?
The context command feature also includes an autocomplete menu UI to help navigate your filesystem with fewer keystrokes.
Thank you @michaelchia for developing this feature!
Enhancements made
- Migrate to `ChatOllama` base class in Ollama provider #1015 (@srdas)
- Add `metadata` field to agent messages #1013 (@dlqqq)
- Add OpenRouter support #996 (@akaihola)
- Framework for adding context to LLM prompt #993 (@michaelchia)
- Adds unix shell-style wildcard matching to `/learn` #989 (@andrewfulton9)
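The unix shell-style wildcard matching mentioned above follows the same convention as Python's stdlib `fnmatch` (a sketch of the matching semantics; the file names and patterns here are hypothetical examples, not taken from the PR):

```python
from fnmatch import fnmatch

# Shell-style wildcards: "*" matches any run of characters,
# "?" matches exactly one character.
paths = ["utils.py", "data_1.csv", "data_10.csv", "notes.txt"]
matches = [p for p in paths if fnmatch(p, "*.py") or fnmatch(p, "data_?.csv")]
print(matches)  # → ['utils.py', 'data_1.csv']
```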
Bugs fixed
- Run mypy on CI, fix or ignore typing issues #987 (@krassowski)
Maintenance and upkeep improvements
Contributors to this release
(GitHub contributors page for this release)
@akaihola | @andrewfulton9 | @dlqqq | @ellisonbg | @hockeymomonow | @krassowski | @michaelchia | @srdas