
build(deps): bump dependabot/fetch-metadata from 2.0.0 to 2.1.0 #2186

@dependabot dependabot bot commented on behalf of github Apr 29, 2024

Bumps dependabot/fetch-metadata from 2.0.0 to 2.1.0.

Release notes

Sourced from dependabot/fetch-metadata's releases.

v2.1.0

What's Changed

New Contributors

Full Changelog: dependabot/fetch-metadata@v2.0.0...v2.1.0

Commits
  • 5e5f996 Merge pull request #518 from dependabot/bump-to-v2.1.0
  • 63415e5 v2.1.0
  • 76b7fe9 Merge pull request #509 from dependabot/switch-to-monthly-release-cadence
  • 7c323d5 Switch to monthly release cadence
  • 5c7b450 Merge pull request #450 from HealthengineAU/handle-branches-with-hyphens
  • a44a9df Handle branch names containing hyphen separators
  • 518993c Relax engine-strict=true (#510)
  • See full diff in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.
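For reference, `dependabot/fetch-metadata` is typically wired into an auto-merge workflow along these lines (a minimal sketch based on the action's documented `update-type` output; the workflow and step names are illustrative and not taken from this repository):

```yaml
name: Dependabot auto-merge
on: pull_request

permissions:
  contents: write
  pull-requests: write

jobs:
  dependabot:
    runs-on: ubuntu-latest
    # Only run for PRs opened by Dependabot itself
    if: github.actor == 'dependabot[bot]'
    steps:
      - name: Fetch Dependabot metadata
        id: metadata
        uses: dependabot/fetch-metadata@v2.1.0
        with:
          github-token: "${{ secrets.GITHUB_TOKEN }}"
      - name: Enable auto-merge for minor updates
        if: steps.metadata.outputs.update-type == 'version-update:semver-minor'
        run: gh pr merge --auto --squash "$PR_URL"
        env:
          PR_URL: ${{ github.event.pull_request.html_url }}
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```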


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [dependabot/fetch-metadata](https://github.com/dependabot/fetch-metadata) from 2.0.0 to 2.1.0.
- [Release notes](https://github.com/dependabot/fetch-metadata/releases)
- [Commits](dependabot/fetch-metadata@v2.0.0...v2.1.0)

---
updated-dependencies:
- dependency-name: dependabot/fetch-metadata
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added dependencies github_actions Pull requests that update GitHub Actions code labels Apr 29, 2024
@github-actions github-actions bot enabled auto-merge (squash) April 29, 2024 18:58

netlify bot commented Apr 29, 2024

Deploy Preview for localai canceled.

| Name | Link |
|---|---|
| 🔨 Latest commit | c961c4d |
| 🔍 Latest deploy log | https://app.netlify.com/sites/localai/deploys/662fedac76f45400082b4782 |

@github-actions github-actions bot merged commit 53c3842 into master Apr 29, 2024
52 checks passed
@github-actions github-actions bot deleted the dependabot/github_actions/dependabot/fetch-metadata-2.1.0 branch April 29, 2024 21:12
truecharts-admin referenced this pull request in truecharts/public May 5, 2024
…4.0 by renovate (#21605)

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.13.0-cublas-cuda11-ffmpeg-core` -> `v2.14.0-cublas-cuda11-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.13.0-cublas-cuda11-core` -> `v2.14.0-cublas-cuda11-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.13.0-cublas-cuda12-ffmpeg-core` -> `v2.14.0-cublas-cuda12-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.13.0-cublas-cuda12-core` -> `v2.14.0-cublas-cuda12-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.13.0-ffmpeg-core` -> `v2.14.0-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.13.0` -> `v2.14.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency Dashboard for more information.

---

### Release Notes

<details>
<summary>mudler/LocalAI (docker.io/localai/localai)</summary>

###
[`v2.14.0`](https://togithub.com/mudler/LocalAI/releases/tag/v2.14.0)

[Compare
Source](https://togithub.com/mudler/LocalAI/compare/v2.13.0...v2.14.0)

##### 🚀  AIO Image Update: llama3 has landed!

We're excited to announce that our AIO image has been upgraded with the
latest LLM model, llama3, enhancing our capabilities with more accurate
and dynamic responses. Behind the scenes it uses
https://huggingface.co/NousResearch/Hermes-2-Pro-Llama-3-8B-GGUF, which
is ready for function calling, yay!

##### 💬 WebUI enhancements: Updates in Chat, Image Generation, and TTS

| Chat | TTS | Image gen |
|------------|----------------|-------------|
| ![chatui](https://togithub.com/mudler/LocalAI/assets/2420543/ff71ad02-841d-48a9-99a7-30f024ae3331) | ![ttsui](https://togithub.com/mudler/LocalAI/assets/2420543/0c137ba5-cb35-426d-ae5d-390679432cf0) | ![image](https://togithub.com/mudler/LocalAI/assets/2420543/88f8ef30-e06a-454f-b01a-08fcd6917188) |

Our interfaces for Chat, Text-to-Speech (TTS), and Image Generation have
finally landed. Enjoy streamlined and simple interactions thanks to the
efforts of our team, led by
[@&#8203;mudler](https://togithub.com/mudler), who have worked
tirelessly to enhance your experience. The WebUI serves as a
quick way to debug and assess models loaded in LocalAI - there is much
to improve, but we now have a small, hackable interface!

##### 🖼️ Many new models in the model gallery!

| ![local-ai-gallery](https://togithub.com/mudler/LocalAI/assets/2420543/06a06d3c-b91a-472b-892a-a1b69ddc8c56) |
|------------|

The model gallery has received a substantial upgrade with numerous new
models, including Einstein v6.1, SOVL, and several specialized Llama3
iterations. These additions are designed to cater to a broader range of
tasks, making LocalAI more versatile than ever. Kudos to
[@&#8203;mudler](https://togithub.com/mudler) for spearheading these
exciting updates - now you can select the model you like with a couple
of clicks!

##### 🛠️ Robust Fixes and Optimizations

This update brings a series of crucial bug fixes and security
enhancements to ensure our platform remains secure and efficient.
Special thanks to
[@&#8203;dave-gray101](https://togithub.com/dave-gray101),
[@&#8203;cryptk](https://togithub.com/cryptk), and
[@&#8203;fakezeta](https://togithub.com/fakezeta) for their diligent
work in rooting out and resolving these issues 🤗

##### ✨ OpenVINO and more

We're introducing OpenVINO acceleration, and many OpenVINO models in the
gallery. You can now enjoy fast-as-hell speed on Intel CPU and GPUs.
Applause to [@&#8203;fakezeta](https://togithub.com/fakezeta) for the
contributions!

##### 📚 Documentation and Dependency Upgrades

We've updated our documentation and dependencies to keep you equipped
with the latest tools and knowledge. These updates ensure that LocalAI
remains a robust and dependable platform.

##### 👥 A Community Effort

A special shout-out to our new contributors,
[@&#8203;QuinnPiers](https://togithub.com/QuinnPiers) and
[@&#8203;LeonSijiaLu](https://togithub.com/LeonSijiaLu), who have
enriched our community with their first contributions. Welcome aboard,
and thank you for your dedication and fresh insights!

Each update in this release not only enhances our platform's
capabilities but also ensures a safer and more user-friendly experience.
We are excited to see how our users leverage these new features in their
projects - feel free to drop us a line on Twitter or any other social
network, we'd be happy to hear how you use LocalAI!

##### 📣 Spread the word!

First off, a massive thank you (again!) to each and every one of you
who've chipped in to squash bugs and suggest cool new features for
LocalAI. Your help, kind words, and brilliant ideas are truly
appreciated - more than words can say!

And to those of you who've been heroes, giving up your own time to help
out fellow users on Discord and in our repo, you're absolutely amazing.
We couldn't have asked for a better community.

Just so you know, LocalAI doesn't have the luxury of big corporate
sponsors behind it. It's all us, folks. So, if you've found value in
what we're building together and want to keep the momentum going,
consider showing your support. A little shoutout on your favorite social
platforms using @&#8203;LocalAI_OSS and @&#8203;mudler_it or joining our
sponsors can make a big difference.

Also, if you haven't yet joined our Discord, come on over! Here's the
link: https://discord.gg/uJAeKSAGDy

Every bit of support, every mention, and every star adds up and helps us
keep this ship sailing. Let's keep making LocalAI awesome together!

Thanks a ton, and.. exciting times ahead with LocalAI!

##### What's Changed

##### Bug fixes 🐛

- fix: `config_file_watcher.go` - root all file reads for safety by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2144](https://togithub.com/mudler/LocalAI/pull/2144)
- fix: github bump_docs.sh regex to drop emoji and other text by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2180](https://togithub.com/mudler/LocalAI/pull/2180)
- fix: undefined symbol: iJIT_NotifyEvent in import torch
#[#&#8203;2153](https://togithub.com/mudler/LocalAI/issues/2153) by
[@&#8203;fakezeta](https://togithub.com/fakezeta) in
[https://github.com/mudler/LocalAI/pull/2179](https://togithub.com/mudler/LocalAI/pull/2179)
- fix: security scanner warning noise: error handlers part 2 by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2145](https://togithub.com/mudler/LocalAI/pull/2145)
- fix: ensure GNUMake jobserver is passed through to whisper.cpp build
by [@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2187](https://togithub.com/mudler/LocalAI/pull/2187)
- fix: bring everything onto the same GRPC version to fix tests by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2199](https://togithub.com/mudler/LocalAI/pull/2199)

##### Exciting New Features 🎉

- feat(gallery): display job status also during navigation by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2151](https://togithub.com/mudler/LocalAI/pull/2151)
- feat: cleanup Dockerfile and make final image a little smaller by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2146](https://togithub.com/mudler/LocalAI/pull/2146)
- fix: swap to WHISPER_CUDA per deprecation message from whisper.cpp by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2170](https://togithub.com/mudler/LocalAI/pull/2170)
- feat: only keep the build artifacts from the grpc build by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2172](https://togithub.com/mudler/LocalAI/pull/2172)
- feat(gallery): support model deletion by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2173](https://togithub.com/mudler/LocalAI/pull/2173)
- refactor(application): introduce application global state by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2072](https://togithub.com/mudler/LocalAI/pull/2072)
- feat: organize Dockerfile into distinct sections by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2181](https://togithub.com/mudler/LocalAI/pull/2181)
- feat: OpenVINO acceleration for embeddings in transformer backend by
[@&#8203;fakezeta](https://togithub.com/fakezeta) in
[https://github.com/mudler/LocalAI/pull/2190](https://togithub.com/mudler/LocalAI/pull/2190)
- chore: update go-stablediffusion to latest commit with Make jobserver
fix by [@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2197](https://togithub.com/mudler/LocalAI/pull/2197)
- feat: user defined inference device for CUDA and OpenVINO by
[@&#8203;fakezeta](https://togithub.com/fakezeta) in
[https://github.com/mudler/LocalAI/pull/2212](https://togithub.com/mudler/LocalAI/pull/2212)
- feat(ux): Add chat, tts, and image-gen pages to the WebUI by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2222](https://togithub.com/mudler/LocalAI/pull/2222)
- feat(aio): switch to llama3-based for LLM by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2225](https://togithub.com/mudler/LocalAI/pull/2225)
- feat(ui): support multilineand style `ul` by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2226](https://togithub.com/mudler/LocalAI/pull/2226)

##### 🧠 Models

- models(gallery): add Einstein v6.1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2152](https://togithub.com/mudler/LocalAI/pull/2152)
- models(gallery): add SOVL by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2154](https://togithub.com/mudler/LocalAI/pull/2154)
- models(gallery): add average_normie by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2155](https://togithub.com/mudler/LocalAI/pull/2155)
- models(gallery): add solana by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2157](https://togithub.com/mudler/LocalAI/pull/2157)
- models(gallery): add poppy porpoise by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2158](https://togithub.com/mudler/LocalAI/pull/2158)
- models(gallery): add Undi95/Llama-3-LewdPlay-8B-evo-GGUF by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2160](https://togithub.com/mudler/LocalAI/pull/2160)
- models(gallery): add biomistral-7b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2161](https://togithub.com/mudler/LocalAI/pull/2161)
- models(gallery): add llama3-32k by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2183](https://togithub.com/mudler/LocalAI/pull/2183)
- models(gallery): add openvino models by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2184](https://togithub.com/mudler/LocalAI/pull/2184)
- models(gallery): add lexifun by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2193](https://togithub.com/mudler/LocalAI/pull/2193)
- models(gallery): add suzume-llama-3-8B-multilingual-gguf by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2194](https://togithub.com/mudler/LocalAI/pull/2194)
- models(gallery): add guillaumetell by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2195](https://togithub.com/mudler/LocalAI/pull/2195)
- models(gallery): add wizardlm2 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2209](https://togithub.com/mudler/LocalAI/pull/2209)
- models(gallery): Add Hermes-2-Pro-Llama-3-8B-GGUF by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2218](https://togithub.com/mudler/LocalAI/pull/2218)

##### 📖 Documentation and examples

- ⬆️ Update docs version mudler/LocalAI by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2149](https://togithub.com/mudler/LocalAI/pull/2149)
- draft:Update model-gallery.md with correct gallery file by
[@&#8203;QuinnPiers](https://togithub.com/QuinnPiers) in
[https://github.com/mudler/LocalAI/pull/2163](https://togithub.com/mudler/LocalAI/pull/2163)
- docs: update gallery, add rerankers by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2166](https://togithub.com/mudler/LocalAI/pull/2166)
- docs: enhance and condense few sections by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2178](https://togithub.com/mudler/LocalAI/pull/2178)
- \[Documentations] Removed invalid numberings from `troubleshooting
mac` by [@&#8203;LeonSijiaLu](https://togithub.com/LeonSijiaLu) in
[https://github.com/mudler/LocalAI/pull/2174](https://togithub.com/mudler/LocalAI/pull/2174)

##### 👒 Dependencies

- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2150](https://togithub.com/mudler/LocalAI/pull/2150)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2159](https://togithub.com/mudler/LocalAI/pull/2159)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2176](https://togithub.com/mudler/LocalAI/pull/2176)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2177](https://togithub.com/mudler/LocalAI/pull/2177)
- update go-tinydream to latest commit by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2182](https://togithub.com/mudler/LocalAI/pull/2182)
- build(deps): bump dependabot/fetch-metadata from 2.0.0 to 2.1.0 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2186](https://togithub.com/mudler/LocalAI/pull/2186)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2189](https://togithub.com/mudler/LocalAI/pull/2189)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2188](https://togithub.com/mudler/LocalAI/pull/2188)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2203](https://togithub.com/mudler/LocalAI/pull/2203)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2213](https://togithub.com/mudler/LocalAI/pull/2213)

##### Other Changes

- Revert ":arrow_up: Update docs version mudler/LocalAI" by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2165](https://togithub.com/mudler/LocalAI/pull/2165)
- Issue-1720: Updated `Build on mac` documentations by
[@&#8203;LeonSijiaLu](https://togithub.com/LeonSijiaLu) in
[https://github.com/mudler/LocalAI/pull/2171](https://togithub.com/mudler/LocalAI/pull/2171)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2224](https://togithub.com/mudler/LocalAI/pull/2224)

##### New Contributors

- [@&#8203;QuinnPiers](https://togithub.com/QuinnPiers) made their first
contribution in
[https://github.com/mudler/LocalAI/pull/2163](https://togithub.com/mudler/LocalAI/pull/2163)
- [@&#8203;LeonSijiaLu](https://togithub.com/LeonSijiaLu) made their
first contribution in
[https://github.com/mudler/LocalAI/pull/2171](https://togithub.com/mudler/LocalAI/pull/2171)

**Full Changelog**:
mudler/LocalAI@v2.13.0...v2.14.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Enabled.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.
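The schedule, automerge, and rebasing behaviour described above is driven by Renovate configuration. A minimal `renovate.json` enabling automerge for non-major updates might look like this (illustrative only; the actual truecharts configuration is not shown in this PR):

```json
{
  "extends": ["config:recommended"],
  "packageRules": [
    {
      "matchUpdateTypes": ["minor", "patch"],
      "automerge": true
    }
  ]
}
```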

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check this box

---

This PR has been generated by [Renovate
Bot](https://togithub.com/renovatebot/renovate).
