
[Enhancement]: More configurations for GitHub Copilot chat (language, model, etc.) #612

Open
jdcola opened this issue Dec 5, 2024 · 8 comments
Labels: enhancement (New feature or request), investigating, partly implemented (Not fully implemented)

jdcola commented Dec 5, 2024

Before Reporting

  • I have checked the FAQ, and there is no solution to my issue
  • I have searched the existing issues, and there is no existing issue for mine

What happened?

  1. Select the "Chinese" option for "Feature -> Chat -> Reply in language"
  2. The chat window always replies in English

How to reproduce the bug.

It just happened!

Relevant log output

No response

macOS version

15.1.1

Xcode version

16.1

Copilot for Xcode version

0.35.2

jdcola added the bug label on Dec 5, 2024
intitni (Owner) commented Dec 5, 2024

I guess you are using GitHub Copilot Chat? Language, temperature, and the additional system prompt don't work for GitHub Copilot chat.

intitni (Owner) commented Dec 5, 2024

Looks like they have a parameter for that now; I will give it a try in the next release.

userLanguage: T.Optional(T.String()),
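
For anyone curious, here is a minimal sketch of how that option might be passed when a chat conversation is created. The conversation/create method name and the surrounding fields are assumptions read from the obfuscated language server, not a documented API; only the userLanguage key is the field quoted above.

// Hypothetical sketch: everything except the userLanguage key is an assumption.
const createConversationParams = {
  turns: [{ request: "Explain this function." }],
  userLanguage: "zh-CN", // ask Copilot chat to reply in Chinese
};

// With a JSON-RPC connection to the Copilot language server, roughly:
// connection.sendRequest("conversation/create", createConversationParams);
console.log(JSON.stringify(createConversationParams, null, 2));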

jdcola (Author) commented Dec 5, 2024

> Looks like they have a parameter for that now; I will give it a try in the next release.

Thanks! (I'm also using Zed's chat feature, where GitHub Copilot automatically answers in the same language as the question.)

By the way, is there an option to choose the underlying model of GitHub Copilot? Claude is better than the OpenAI models. (This is also available in Zed's chat panel.)

intitni (Owner) commented Dec 5, 2024

I wish I knew. There is no documentation, so we need to dig into the obfuscated source code to find out.

intitni added the enhancement and investigating labels and removed the bug label on Dec 5, 2024
intitni (Owner) commented Dec 5, 2024

OK, since Zed is open source, we can copy their implementation: zed-industries/zed@6f06558

Glancing at the code, I think they are calling the endpoint directly. I am not sure that is the right way to go, since some other repositories were blocked for doing that.
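
For context, "calling the endpoint directly" would look roughly like the sketch below. The endpoint URL, headers, and token handling are assumptions about what OpenAI-compatible Copilot clients appear to do, not a documented API; treat every detail as a guess.

// Unofficial sketch: a real client first exchanges a GitHub OAuth token for a
// short-lived Copilot token; here the token is simply read from an environment
// variable, and the endpoint URL itself is an assumption.
async function askCopilotChat(prompt) {
  const response = await fetch("https://api.githubcopilot.com/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.COPILOT_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o", // or another family the account has access to
      messages: [{ role: "user", content: prompt }],
      stream: false,
    }),
  });
  if (!response.ok) {
    throw new Error(`Copilot chat request failed: ${response.status}`);
  }
  const data = await response.json();
  return data.choices[0].message.content;
}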

intitni (Owner) commented Dec 5, 2024

But Zed is apparently more popular than this project, so if Zed survives, I think it would be safe to just call the API directly?

I don't have time for it right now, though; if anyone wants to contribute, let me know.

intitni changed the title from '[Bug]: "Replay in language:" option "Chinese" does not effect' to '[Enhancement]: More configurations for GitHub Copilot chat (language, model, etc.)' on Dec 5, 2024
intitni (Owner) commented Dec 5, 2024

After digging into the language server code, I can't find anything that allows users to specify the model for GitHub Copilot chat.

For anyone interested: the chat feature calls getBestChatModelConfig to get the model configuration. Its first argument is a string array of model family names, which the function uses to pick the correct model configuration, but that list of names is hardcoded as far as I can tell. On top of that, getFirstMatchingModelMetadata seems to return nothing, so only the fallback values are used.

If you really want to use Claude in GitHub Copilot chat right now, you can open language-server.js in VS Code, format the code, and search for something like

case "gpt-4o":
  return {
    modelId: i.id,
    uiName: i.name,
    modelFamily: r,
    maxRequestTokens: await $St(this.ctx, i),
    maxResponseTokens: 4096,
    baseTokensPerMessage: 3,
    baseTokensPerName: 1,
    baseTokensPerCompletion: 3,
    tokenizer: "o200k_base",
    isExperimental: (c = i.isExperimental) != null ? c : !1,
  };

and change the model id to claude-3.5-sonnet. Use it at your own risk, though: the system prompt will not be updated accordingly, so the chat will still say it's using gpt-x.
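
To make that edit concrete, the change would be along these lines (illustrative only; which identifier to touch may differ between releases of the obfuscated code):

modelId: "claude-3.5-sonnet", // was: modelId: i.id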

Update:

I am not entirely sure, but if the code doesn't reach the fallback value in your case, you can try editing the model family list by searching for code that looks like:

function Yo(e) {
  switch (e) {
    case "user":
    case "inline":
      return ["gpt-4o", "gpt-4-turbo", "gpt-4"];
    case "meta":
    case "suggestions":
    case "synonyms":
      return ["gpt-4o-mini", "gpt-3.5-turbo"];
  }
}

Then add claude-3.5-sonnet to the start of the first array. The function name may be different in your copy.
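
For example, the edited list might end up looking like this (illustrative only; the obfuscated function name and the exact arrays will differ between releases):

function Yo(e) {
  switch (e) {
    case "user":
    case "inline":
      // claude-3.5-sonnet added to the front so it is picked first
      return ["claude-3.5-sonnet", "gpt-4o", "gpt-4-turbo", "gpt-4"];
    case "meta":
    case "suggestions":
    case "synonyms":
      return ["gpt-4o-mini", "gpt-3.5-turbo"];
  }
}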

intitni (Owner) commented Dec 23, 2024

The language option will now take effect in GitHub Copilot chat in 0.35.3.
