Zed Extension #18
After a few tries, lsp-ai starts working, but completions are not shown. In the server logs (RPC) it seems to be working, so I guess it's an editor problem.
Thanks for creating an issue! I haven't tested it in Zed yet. Can you share your configuration for Zed so I can try to recreate it and fix these bugs?
Here is extension source: https://github.com/bajrangCoder/zed-lsp-ai
Got it, thank you! I will look at this tonight.
I spent some time playing around with this. As you saw, I did see the completion requests being fulfilled by LSP-AI, but Zed doesn't always show them. I think this might be a bug in Zed, but I'm not sure. It would be really awesome to have a good integration with their editor. Maybe create an issue on their GitHub? If you do, can you tag me in it or send me the link? I would love to see what they say. Also, how do you find Codegemma's results? I was not able to get it to produce good outputs. I found llama3-8b with some decent prompting greatly outperforms it, but maybe I am prompting it incorrectly.
Does Zed show completions sometimes, as you mentioned, but not always? 😳 I had already shared this in their Discord; anyway, I will create an issue for it on their GitHub.
Currently, Ollama fails to install any model on my system. Previously, I had Codestral and Llama3, but when I tried to install a new model, I ran into issues. I thought it was a problem with my system, so I reinstalled Ollama, and now I have lost all my installed models. Earlier, I had downloaded the Codegemma binary (.gguf) just for testing and added it locally to try out LSP-AI. That's why I'm stuck using Codegemma. (Yep, it's not good for coding, even though its name suggests otherwise.)
There may be a rather large quality loss when converting it to gguf. I was testing it outside of LSP-AI, using Ollama's Python library directly. Thanks for creating the issue! Excited to see what the Zed team says.
I noticed that Zed already supports Ollama and a few other model execution frameworks/services: https://github.com/zed-industries/zed/tree/main/crates/language_model/src/provider I understand that lsp-ai supports a different range of frameworks and is flexible in different ways, so it's still exciting to see a proper Zed extension, but some users may find it already unnecessary for their use cases.
That's the code assistant: a chat-like AI feature built into the editor. It's not like Copilot.
Yes, that's exactly why it's cool: it's an alternative to tools like Copilot or Codeium.
Maybe, but many people want other tools instead of Copilot.
I am trying to use this in Zed via the LSP extension. I installed lsp-ai using Cargo, bound it to different file types, and passed the following initialization options:
(Note: Codegemma is installed on my system using Ollama)
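A rough sketch of what such initialization options can look like for an Ollama-served model, based on lsp-ai's documented configuration shape (the field values here are illustrative, not my exact config):

```json
{
  "memory": {
    "file_store": {}
  },
  "models": {
    "model1": {
      "type": "ollama",
      "model": "codegemma"
    }
  },
  "completion": {
    "model": "model1",
    "parameters": {
      "max_context": 1024,
      "options": {
        "num_predict": 32
      }
    }
  }
}
```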
However, when I tested it, I didn't get any completions. Here are the LSP logs of lsp-ai:
Server Logs:
Server Logs (RPC):
Is this an issue with the editor, lsp-ai, or is it my fault?