Codestral (Mistral code suggestion) #12519
Additionally, it'd be amazing if we could use this for
Don't be too excited. Codestral is terrible at doing FIM. I have switched to asking Sonnet 3.5 to just fill in the marked part, and it does the job 10x better, even though it is a chat model and not tuned for FIM at all. Codestral can't even match the parentheses right.
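For the curious, the "fill in the marked part" trick is roughly the following. A minimal sketch assuming the Anthropic Python SDK; the marker string, prompt wording, and example snippet are my own, not from any plugin:

```python
# Minimal sketch of FIM-via-chat: mark the hole, ask a chat model to fill it.
# The marker and prompt wording are illustrative, not a standard.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

MARKER = "<FILL_HERE>"
code = """\
def load_config(path):
    with open(path) as f:
        <FILL_HERE>
    return config
"""

resp = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=256,
    system="Replace the <FILL_HERE> marker with code that fits the context. "
           "Reply with only the replacement code, no explanation.",
    messages=[{"role": "user", "content": code}],
)
print(code.replace(MARKER, resp.content[0].text))
```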
I could use the Codestral model with private-gpt (a fork of zylon-ai's private-gpt) in chat mode, running in Docker with NVIDIA GPU support. So it would be cool if we could get it to work with
I did a basic implementation that works: #15573. A few outstanding questions remain, as I don't know this codebase very well.
FTR, my settings for codestral:

```json
{
  "language_models": {
    "openai": {
      "version": "1",
      "api_url": "https://codestral.mistral.ai/v1",
      "available_models": [
        { "custom": { "name": "codestral-latest", "max_tokens": 131072 } }
      ]
    }
  },
  "assistant": {
    "version": "2",
    "default_model": {
      "provider": "openai",
      "model": "codestral-latest"
    }
  },
  ...
}
```
Note the different endpoint from regular Mistral models.
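For reference, a raw infill request against that endpoint looks roughly like this. A sketch following Mistral's published FIM API; the environment variable name is my own, and the response field access is based on my reading of the docs:

```python
# Sketch of a fill-in-the-middle request against the Codestral-specific
# endpoint (codestral.mistral.ai rather than api.mistral.ai).
import os
import requests

resp = requests.post(
    "https://codestral.mistral.ai/v1/fim/completions",
    headers={"Authorization": f"Bearer {os.environ['CODESTRAL_API_KEY']}"},
    json={
        "model": "codestral-latest",
        "prompt": "def fibonacci(n: int) -> int:\n    ",  # text before the cursor
        "suffix": "\nprint(fibonacci(10))",               # text after the cursor
        "max_tokens": 64,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```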
Can you also use Codestral as an Ollama pull?
I don't have the hardware.
Codestral is too large for my machine. I'm on an M1 Mac mini with 16GB of RAM. However, other, smaller Ollama pulls work.
Codestral Fill In The Middle (FIM) works like a charm in VS Code with the continue.dev plugin. Local Ollama models such as starcoder are also light and interesting. Currently Zed does not support other models or Ollama models for code completion. Is this feature planned, or does it depend on commercial agreements with AI providers?
I have found that any model that can be pulled on Ollama works on Zed. The limitation is the user's computer: memory and processor. Codestral is slow on my machine because I only have 16GB on my M1. Has anyone tried building a Linux AI server with tons of RAM and a GPU that you can remote into using the gateway method on Zed?
@kanelee they do work for the assistant, but how do you use a custom code completion (Copilot) model?
This is not "custom"; they are the only option available in Zed at the moment.
I am also interested in this feature, to run FIM with a local model. Qwen2.5-Coder also does a good job at inline completion.
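For local FIM through Ollama, the generate endpoint accepts a `suffix` field for models whose template defines an infill format. A rough sketch, assuming Ollama is running locally and the model has already been pulled:

```python
# Rough sketch of local FIM via Ollama's /api/generate; "suffix" is honored
# for models whose template supports infill (e.g. qwen2.5-coder, codestral).
# Assumes the model is available locally: ollama pull qwen2.5-coder
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "qwen2.5-coder",
        "prompt": "def add(a, b):\n    ",   # text before the cursor
        "suffix": "\n\nprint(add(2, 3))",   # text after the cursor
        "stream": False,
    },
    timeout=60,
)
print(resp.json()["response"])
```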
Apologies for resurrecting this issue discussion. Has there been any movement on adding Mistral alongside OpenAI and friends? #15573 seems to have done a lot of the heavy lifting on this already.
Describe the feature
Support Codestral from MistralAI as an equivalent of OpenAI. Codestral supports infill, and VS Code plugins are already available.
https://mistral.ai/news/codestral/
Thanks!