fixes #1051: handle openai presence and request penalty parameters #1817
Conversation
blob42 force-pushed from f0f7f17 to 72a4585 (Compare)
@mudler please do not merge yet. I noticed that PresencePenalty is not handled on the API side but is present in the llama backend; I will try to add it as well.
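For readers following along: "handled on the API side" here means reading the OpenAI-style `presence_penalty` (and `frequency_penalty`) fields from the incoming request and forwarding them to the backend's sampling options. The Go sketch below only illustrates that mapping; `OpenAIRequest`, `PredictionOptions`, and `mergePenalties` are placeholder names for this example, not LocalAI's actual types or functions.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// OpenAIRequest models just the penalty-related fields of an
// OpenAI-compatible completion request (illustrative, not LocalAI's type).
type OpenAIRequest struct {
	PresencePenalty  *float64 `json:"presence_penalty,omitempty"`
	FrequencyPenalty *float64 `json:"frequency_penalty,omitempty"`
}

// PredictionOptions stands in for the sampling options handed to the llama backend.
type PredictionOptions struct {
	PresencePenalty  float64
	FrequencyPenalty float64
}

// mergePenalties copies the request penalties into the backend options,
// leaving the backend defaults untouched when the client omitted a field.
func mergePenalties(req OpenAIRequest, opts *PredictionOptions) {
	if req.PresencePenalty != nil {
		opts.PresencePenalty = *req.PresencePenalty
	}
	if req.FrequencyPenalty != nil {
		opts.FrequencyPenalty = *req.FrequencyPenalty
	}
}

func main() {
	body := []byte(`{"presence_penalty": 0.6, "frequency_penalty": 0.3}`)

	var req OpenAIRequest
	if err := json.Unmarshal(body, &req); err != nil {
		panic(err)
	}

	opts := PredictionOptions{} // backend defaults
	mergePenalties(req, &opts)
	fmt.Printf("%+v\n", opts) // {PresencePenalty:0.6 FrequencyPenalty:0.3}
}
```

Using pointer fields makes "parameter omitted" distinguishable from "parameter set to 0", which matters when merging client values over backend defaults.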
@blob42 please also mind the merge conflicts - this can't be merged until it is rebased on master.
@mudler I am done with the modifications on this PR; let me know if I should add anything to the docs or tests.
looking good, thanks!
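As a follow-up to the docs/tests question above, a request exercising the new parameters end to end could look like the sketch below. It assumes a LocalAI instance serving the OpenAI-compatible `/v1/chat/completions` endpoint on `localhost:8080` and a model named `gpt-3.5-turbo`; both are assumptions to adjust for your setup.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Assumed host and model name; change both to match your LocalAI setup.
	payload := []byte(`{
		"model": "gpt-3.5-turbo",
		"messages": [{"role": "user", "content": "List three colors."}],
		"presence_penalty": 0.6,
		"frequency_penalty": 0.3
	}`)

	resp, err := http.Post(
		"http://localhost:8080/v1/chat/completions",
		"application/json",
		bytes.NewReader(payload),
	)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Print the raw JSON response; with the penalties applied, repeated
	// tokens should be discouraged relative to a request without them.
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body))
}
```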
…0.1 by renovate (#19487)

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | patch | `v2.10.0-cublas-cuda11-ffmpeg-core` -> `v2.10.1-cublas-cuda11-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | patch | `v2.10.0-cublas-cuda11-core` -> `v2.10.1-cublas-cuda11-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | patch | `v2.10.0-cublas-cuda12-ffmpeg-core` -> `v2.10.1-cublas-cuda12-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | patch | `v2.10.0-cublas-cuda12-core` -> `v2.10.1-cublas-cuda12-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | patch | `v2.10.0-ffmpeg-core` -> `v2.10.1-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | patch | `v2.10.0` -> `v2.10.1` |

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency Dashboard for more information.

### Release Notes

mudler/LocalAI (docker.io/localai/localai): [`v2.10.1`](https://togithub.com/mudler/LocalAI/releases/tag/v2.10.1) ([Compare Source](https://togithub.com/mudler/LocalAI/compare/v2.10.0...v2.10.1))

##### Bug fixes 🐛

- fix(llama.cpp): fix eos without cache by [@mudler](https://togithub.com/mudler) in [mudler/LocalAI#1852](https://togithub.com/mudler/LocalAI/pull/1852)
- fix(config): default to debug=false if not set by [@mudler](https://togithub.com/mudler) in [mudler/LocalAI#1853](https://togithub.com/mudler/LocalAI/pull/1853)
- fix(config-watcher): start only if config-directory exists by [@mudler](https://togithub.com/mudler) in [mudler/LocalAI#1854](https://togithub.com/mudler/LocalAI/pull/1854)

##### Exciting New Features 🎉

- deps(whisper.cpp): update, fix cublas build by [@mudler](https://togithub.com/mudler) in [mudler/LocalAI#1846](https://togithub.com/mudler/LocalAI/pull/1846)

##### Other Changes

- fixes [#1051](https://togithub.com/mudler/LocalAI/issues/1051): handle openai presence and request penalty parameters by [@blob42](https://togithub.com/blob42) in [mudler/LocalAI#1817](https://togithub.com/mudler/LocalAI/pull/1817)
- fix(make): allow to parallelize jobs by [@cryptk](https://togithub.com/cryptk) in [mudler/LocalAI#1845](https://togithub.com/mudler/LocalAI/pull/1845)
- fix(go-llama): use llama-cpp as default by [@mudler](https://togithub.com/mudler) in [mudler/LocalAI#1849](https://togithub.com/mudler/LocalAI/pull/1849)
- ⬆️ Update docs version mudler/LocalAI by [@localai-bot](https://togithub.com/localai-bot) in [mudler/LocalAI#1847](https://togithub.com/mudler/LocalAI/pull/1847)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://togithub.com/localai-bot) in [mudler/LocalAI#1848](https://togithub.com/mudler/LocalAI/pull/1848)
- test/fix: OSX Test Repair by [@dave-gray101](https://togithub.com/dave-gray101) in [mudler/LocalAI#1843](https://togithub.com/mudler/LocalAI/pull/1843)

**Full Changelog**: mudler/LocalAI@v2.10.0...v2.10.1

### Configuration

- 📅 **Schedule**: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
- 🚦 **Automerge**: Enabled.
- ♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.
- 🔕 **Ignore**: Close this PR and you won't be reminded about these updates again.

This PR has been generated by [Renovate Bot](https://togithub.com/renovatebot/renovate).
Description
Notes for Reviewers
Signed commits