
Releases: jupyterlab/jupyter-ai

v2.28.4

24 Dec 22:34

2.28.4

πŸŽ„ Merry Christmas and happy holidays to all! We have worked with Santa to bring you some enhancements & fixes for Jupyter AI. Notably, some embedding models now support a configurable base URL, and the reliability of /generate has been significantly improved. Thank you @srdas for contributing these changes!

Note to contributors: This is planned to be the last v2 release from the main branch. After the first v3 pre-release, main will track Jupyter AI v3, while Jupyter AI v2 will continue to be maintained from the 2.x branch.
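
For context on the configurable base URL mentioned above: the new field maps onto the base_url setting of the underlying LangChain embedding clients. Below is a minimal sketch of the equivalent direct usage, assuming a local Ollama server and an OpenAI-compatible endpoint; the URLs, model names, and key are placeholders, and whether Jupyter AI passes the field exactly like this is an assumption.

    # Illustration only; Jupyter AI constructs these clients for you when you
    # fill in the base API URL field in the settings UI (assumed behavior).
    from langchain_community.embeddings import OllamaEmbeddings
    from langchain_openai import OpenAIEmbeddings

    ollama_embeddings = OllamaEmbeddings(
        base_url="http://localhost:11434",  # same value you would enter in the new field
        model="nomic-embed-text",           # placeholder model name
    )

    openai_embeddings = OpenAIEmbeddings(
        base_url="https://my-gateway.example.com/v1",  # hypothetical OpenAI-compatible proxy
        api_key="sk-placeholder",                      # placeholder key
        model="text-embedding-3-small",
    )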

(Full Changelog)

Enhancements made

  • Add base API URL field for Ollama and OpenAI embedding models #1136 (@srdas)

Maintenance and upkeep improvements

  • Trigger update snapshots based on commenter's role #1160 (@dlqqq)

Documentation improvements

  • Improve user messaging and documentation for Cross-Region Inference on Amazon Bedrock #1134 (@srdas)

Contributors to this release

(GitHub contributors page for this release)

@divyansshhh | @dlqqq | @krassowski | @mlucool | @srdas | @Zsailer

v2.28.3

05 Dec 23:48

2.28.3

This release notably fixes a major bug where updated model fields were not applied until after a server restart, as well as a bug with Ollama in the chat. Thank you for your patience as we continue to improve Jupyter AI! πŸ€—

(Full Changelog)

Enhancements made

  • Removes outdated OpenAI models and adds new ones #1127 (@srdas)

Contributors to this release

(GitHub contributors page for this release)

@ctcjab | @dlqqq | @JanusChoi | @krassowski | @pre-commit-ci | @srdas

v2.28.2

18 Nov 19:25

2.28.2

(Full Changelog)

Bugs fixed

  • Bump LangChain minimum versions #1109 (@dlqqq)
  • Catch error on non plaintext files in @file and reply gracefully in chat #1106 (@srdas)
  • Fix rendering of code blocks in JupyterLab 4.3.0+ #1104 (@dlqqq)

Contributors to this release

(GitHub contributors page for this release)

@dlqqq | @srdas

v2.28.1

11 Nov 19:55

2.28.1

(Full Changelog)

Contributors to this release

(GitHub contributors page for this release)

@dlqqq

v2.28.0

07 Nov 00:31

2.28.0

(Full Changelog)

Release summary

This release notably includes the following changes:

  • Models from the Anthropic and ChatAnthropic providers are now merged in the config UI, so all Anthropic models are shown in the same place in the "Language model" dropdown.

  • Anthropic Claude v1 LLMs have been removed, as the models are retired and no longer available from the API.

  • The chat system prompt has been updated to encourage the LLM to express dollar quantities in LaTeX, i.e., the LLM should prefer to return \(\$100\) instead of $100. For the latest LLMs, this generally fixes a rendering issue that occurs when multiple dollar quantities appear literally in the same sentence (see the illustration after this list).

    • Note that the issue may still persist with older LLMs, which do not follow the system prompt as consistently.
  • /export has been fixed to include streamed replies, which were previously omitted.

  • Magic commands that call non-chat providers with history now behave properly.
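
As a concrete illustration of the system prompt change above (the exact prompt wording is not reproduced here, and rendering depends on the Markdown/math configuration of the chat UI), here is the kind of output the prompt discourages versus the form it encourages, written as LaTeX-style inline math:

    % Problematic: a math-aware renderer may treat "100 to " (the span between
    % the two dollar signs) as inline math and garble the sentence.
    The price rose from $100 to $120 overnight.

    % Preferred: each amount is written as inline math with an escaped dollar
    % sign, so it renders as a literal dollar amount.
    The price rose from \(\$100\) to \(\$120\) overnight.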

Enhancements made

  • Remove retired models and add new Haiku-3.5 model in Anthropic #1092 (@srdas)
  • Reduced padding in cell around code icons in code toolbar #1072 (@srdas)
  • Merge Anthropic language model providers #1069 (@srdas)
  • Add examples of using Fields and EnvAuthStrategy to developer documentation #1056 (@alanmeeson)
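
The last enhancement above adds developer documentation examples for Fields and EnvAuthStrategy. As a rough sketch of the pattern those examples cover (class, key, and variable names here are hypothetical, and the exact import paths may differ between Jupyter AI versions), a custom provider can declare an environment-variable API key and extra settings fields roughly like this:

    # Illustrative sketch only; see the Jupyter AI developer docs for the
    # authoritative example. MyProvider, MY_SERVICE_API_KEY, and my_api_base
    # are hypothetical names.
    from jupyter_ai_magics import BaseProvider
    from jupyter_ai_magics.providers import EnvAuthStrategy, TextField  # assumed import path
    from langchain_community.llms import FakeListLLM

    class MyProvider(BaseProvider, FakeListLLM):
        id = "my-provider"          # provider ID shown in the settings UI
        name = "My Provider"
        models = ["my-model"]
        model_id_key = "model"

        # Read the API key from an environment variable rather than a settings field
        auth_strategy = EnvAuthStrategy(name="MY_SERVICE_API_KEY")

        # Additional user-editable fields rendered in the settings UI
        fields = [
            TextField(key="my_api_base", label="Base API URL (optional)", format="text"),
        ]

        def __init__(self, **kwargs):
            kwargs["responses"] = ["Hello from MyProvider!"]
            super().__init__(**kwargs)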

Bugs fixed

  • Continue to allow $ symbols to delimit inline math in human messages #1094 (@dlqqq)
  • Fix /export by including streamed agent messages #1077 (@mcavdar)
  • Fix magic commands when using non-chat providers w/ history #1075 (@alanmeeson)
  • Allow $ to literally denote quantities of USD in chat #1068 (@dlqqq)

Documentation improvements

  • Improve installation documentation and clarify provider dependencies #1087 (@srdas)
  • Added Ollama to the providers table in user docs #1064 (@srdas)

Contributors to this release

(GitHub contributors page for this release)

@alanmeeson | @dlqqq | @krassowski | @mcavdar | @srdas

v2.27.0

29 Oct 19:16

2.27.0

(Full Changelog)

Documentation improvements

  • Added Developer documentation for streaming responses #1051 (@srdas)

Contributors to this release

(GitHub contributors page for this release)

@dlqqq | @pre-commit-ci | @srdas

v2.26.0

21 Oct 23:02

2.26.0

This release notably adds a "Stop streaming" button, which replaces the "Send" button while a reply is streaming and the chat input is empty. While Jupyternaut is streaming a reply, the user can click "Stop streaming" to interrupt Jupyternaut and stop it from streaming further. Thank you @krassowski for contributing this feature! πŸŽ‰

(Full Changelog)

Contributors to this release

(GitHub contributors page for this release)

@dlqqq | @JasonWeill | @jlsajfj | @krassowski | @michaelchia | @pre-commit-ci

v2.25.0

07 Oct 23:52

2.25.0

(Full Changelog)

Enhancements made

  • Export context hooks from NPM package entry point #1020 (@dlqqq)
  • Add support for optional telemetry plugin #1018 (@dlqqq)
  • Add back history and reset subcommand in magics #997 (@akaihola)
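
The last item above restores conversation history and the reset subcommand in the IPython magics. A minimal usage sketch, assuming the jupyter_ai_magics package is installed (the behavior described in the comments is inferred from the PR title and may differ in detail):

    # Load the Jupyter AI magics extension in a notebook or IPython session
    %load_ext jupyter_ai_magics

    # ...run some %%ai cells here; the magics keep the exchange history and
    # pass it back to chat models on subsequent calls...

    # Clear the accumulated history and start a fresh conversation
    %ai reset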

Contributors to this release

(GitHub contributors page for this release)

@akaihola | @dlqqq | @jtpio | @pre-commit-ci

v2.24.1

04 Oct 22:21

2.24.1

(Full Changelog)

Contributors to this release

(GitHub contributors page for this release)

@andrewfulton9 | @dlqqq | @hockeymomonow

v2.24.0

26 Sep 20:55

2.24.0

(Full Changelog)

This release notably introduces a new context command @file:<file-path> to the chat UI, which includes the content of the target file with your prompt when the message is sent. This allows you to ask questions like:

  • What does @file:src/components/ActionButton.tsx do?
  • Can you refactor @file:src/index.ts to use async/await syntax?
  • How do I add an optional dependency to @file:pyproject.toml?

The context command feature also includes an autocomplete menu UI to help navigate your filesystem with fewer keystrokes.

Thank you @michaelchia for developing this feature!

Maintenance and upkeep improvements

  • Upgrade to actions/upload-artifact@v4 in workflows #992 (@dlqqq)

Contributors to this release

(GitHub contributors page for this release)

@akaihola | @andrewfulton9 | @dlqqq | @ellisonbg | @hockeymomonow | @krassowski | @michaelchia | @srdas