
Codestral (Mistral code suggestion) #12519

Open
1 task done
Solido opened this issue May 31, 2024 · 20 comments
Labels
ai (Improvement related to Assistant, Copilot, or other AI features) · assistant (AI feedback for Assistant, inline or panel) · enhancement [core label]

Comments

@Solido

Solido commented May 31, 2024

Check for existing issues

  • Completed

Describe the feature

Support Codestral from MistralAI as an equivalent of OpenAI.

Codestral supports infill (fill-in-the-middle), and VS Code plugins are already available.

https://mistral.ai/news/codestral/

Thanks!

If applicable, add mockups / screenshots to help present your vision of the feature

No response
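
For context, Mistral serves Codestral fill-in-the-middle through a dedicated FIM route rather than the chat completions one. A sketch of the request body, POSTed to /v1/fim/completions on the Codestral endpoint with a Bearer API key (field names follow Mistral's public FIM API docs; treat the exact shape as an assumption):

{
  "model": "codestral-latest",
  "prompt": "def fib(n):\n    ",
  "suffix": "\n    return result",
  "max_tokens": 64,
  "temperature": 0
}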

@Solido added the admin read (Pending admin review), enhancement [core label], and triage (Maintainer needs to classify the issue) labels on May 31, 2024
@JosephTLyons added the ai (Improvement related to Assistant, Copilot, or other AI features) and assistant (AI feedback for Assistant, inline or panel) labels, and removed the triage and admin read labels, on May 31, 2024
@universalmind303

Additionally, it'd be amazing if we could use this for inline_completions.

@NightMachinery

NightMachinery commented Jul 3, 2024

Don't be too excited. Codestral is terrible at doing FIM. I have switched to asking Sonnet 3.5 to just fill in the marked part, and it does the job 10x better, even though it is a chat model and not tuned for FIM at all. Codestral can't even match the parentheses right.

@neofob

neofob commented Jul 10, 2024

I was able to use the Codestral model with private-gpt (a fork of zylon-ai's private-gpt) in chat mode, running in Docker with NVIDIA GPU support. So it would be cool if we could get it working with Zed locally.

@seddonm1

I did a basic implementation that works: #15573

A few outstanding questions remain, as I don't know this code base very well.

@bersace

bersace commented Aug 12, 2024

FTR, my settings for codestral:

{
  "language_models": {
    "openai": {
      "version": "1",
      "api_url": "https://codestral.mistral.ai/v1",
      "available_models": [
        { "custom": { "name": "codestral-latest", "max_tokens": 131072 } }
      ]
    }
  },
  "assistant": {
    "version": "2",
    "default_model": {
      "provider": "openai",
      "model": "codestral-latest"
    }
  },
...

@bersace

bersace commented Aug 12, 2024

Note the different endpoint from regular mistral models.
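
For comparison, the regular Mistral chat models live at a different base URL, and the same settings shape should work there too. A sketch (the api.mistral.ai URL and model name are assumptions based on Mistral's public docs, not verified in Zed):

{
  "language_models": {
    "openai": {
      "version": "1",
      "api_url": "https://api.mistral.ai/v1",
      "available_models": [
        { "custom": { "name": "mistral-large-latest", "max_tokens": 131072 } }
      ]
    }
  }
}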

@kanelee

kanelee commented Aug 13, 2024

Note the different endpoint from regular mistral models.

Can you also use codestral as an Ollama pull?

@bersace

bersace commented Aug 14, 2024

Note the different endpoint from regular mistral models.

Can you also use codestral as an Ollama pull?

I don't have the hardware.
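
For anyone who does have the hardware: Codestral is published in the Ollama model library, so running "ollama pull codestral" should fetch it, after which Zed's Ollama provider can point the assistant at the local server. A sketch of the settings (the exact ollama provider schema here is an assumption — check Zed's Ollama docs for the real shape):

{
  "language_models": {
    "ollama": {
      "api_url": "http://localhost:11434"
    }
  },
  "assistant": {
    "version": "2",
    "default_model": {
      "provider": "ollama",
      "model": "codestral:latest"
    }
  }
}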

@kanelee

kanelee commented Aug 14, 2024 via email

@vlebert

vlebert commented Aug 15, 2024

Codestral fill-in-the-middle (FIM) works like a charm in VS Code with the continue.dev plugin.
Local Ollama models such as StarCoder are also lightweight and interesting.

Currently, Zed does not support other models, or Ollama models, for code completion. Is this feature planned, or does it depend on commercial agreements with AI providers?

@kanelee

kanelee commented Aug 15, 2024 via email

@vlebert

vlebert commented Aug 15, 2024

@kanelee they do work for the assistant, but how do you use a custom code completion (Copilot-style) model?

@kanelee

This comment was marked as off-topic.

@kanelee

This comment was marked as off-topic.

@vlebert

vlebert commented Aug 16, 2024

This is not "custom"; they are the only options available in Zed at the moment.
My point is to use Codestral for code completion.

@kanelee

This comment was marked as off-topic.

@vlebert

This comment was marked as off-topic.

@kanelee

This comment was marked as off-topic.

@tbocek

tbocek commented Oct 19, 2024

I am also interested in this feature, to run FIM with a local model. Qwen2.5-Coder also does a good job at inline completion.

@RoryLawless

Apologies for resurrecting this issue discussion. Has there been any movement on adding Mistral alongside OpenAI and friends? #15573 seems to have done a lot of the heavy lifting on this already.
