chore(deps): update container image docker.io/localai/localai to v2.10.0@5cd0285 by renovate #19391
This PR contains the following updates: docker.io/localai/localai, v2.9.0 -> v2.10.0.
Warning: Some dependencies could not be looked up. Check the Dependency Dashboard for more information.
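For context, a minimal Compose-style sketch of how the updated tag could be consumed. This is not part of the PR: the service name, port mapping, and models volume are illustrative assumptions, and the image digest is omitted rather than guessed from the truncated @5cd0285 reference.

```yaml
# Hypothetical docker-compose.yml excerpt (not from this repository);
# it only illustrates pinning the updated tag. Add a full image digest
# as well if you need an immutable reference.
services:
  localai:
    image: docker.io/localai/localai:v2.10.0
    ports:
      - "8080:8080"               # LocalAI's default HTTP port
    volumes:
      - ./models:/build/models    # assumed models path; adjust to your setup
```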
Release Notes
mudler/LocalAI (docker.io/localai/localai)
v2.10.0
Compare Source
LocalAI v2.10.0 Release Notes
Excited to announce the release of LocalAI v2.10.0! This version introduces significant changes, including breaking changes, numerous bug fixes, exciting new features, dependency updates, and more. Here's a summary of what's new:
Breaking Changes 🛠
The trust_remote_code setting in the model's YAML config file is now consumed also by the AutoGPTQ and transformers backends as an enhanced security measure, thanks to @dave-gray101's contribution (#1799). If your model relied on the old behavior and you are sure of what you are doing, set trust_remote_code: true in the YAML config file, as sketched below.
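A minimal sketch of a model YAML config that opts back into the old behavior, assuming a typical LocalAI model definition file; the model name, backend, and Hugging Face model id are placeholders, and the top-level placement of the key simply follows the wording of the note above.

```yaml
# Hypothetical model config, e.g. models/my-model.yaml (names are placeholders).
# Only enable trust_remote_code if you trust the model's remote code.
name: my-model
backend: transformers
parameters:
  model: some-org/some-model   # placeholder Hugging Face model id
trust_remote_code: true        # opt back into the pre-2.10.0 behavior
```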
Bug Fixes 🐛
The finish_reason fields have been fixed for better compatibility with the OpenAI API by @mudler (#1745).
An issue with default.metallib has been resolved, which should now allow running the llama-cpp backend on Apple arm64, fixed by @dave-gray101 (#1837).
Exciting New Features 🎉
Support for running the transformer backend also on Intel GPUs, implemented by @mudler (#1746).
Support for stream: true! This feature was introduced by @golgeek (#1749).
Dependency Updates 👒
Updates to ggerganov/llama.cpp, donomii/go-rwkv.cpp, mudler/go-stable-diffusion, and others, ensuring that LocalAI is built on the latest and most secure libraries.
Other Changes
Details of What's Changed
Breaking Changes 🛠
trust_remote_code by @dave-gray101 in https://github.com/mudler/LocalAI/pull/1799
Bug fixes 🐛
Exciting New Features 🎉
👒 Dependencies
Other Changes
New Contributors
Thank you to all contributors and users for your continued support and feedback, making LocalAI better with each release!
Full Changelog: mudler/LocalAI@v2.9.0...v2.10.0
Configuration
📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR has been generated by Renovate Bot.