Zed Extension #18

Open
bajrangCoder opened this issue Jun 11, 2024 · 10 comments
Labels
enhancement New feature or request

Comments

@bajrangCoder

I am trying to use this in Zed via an LSP extension. I installed lsp-ai with Cargo, bound it to different file types, and passed the following initialization options:

(Note: Codegemma is installed on my system using Ollama)

{
    "memory": {
        "file_store": {}
    },
    "models": {
        "model1": {
            "type": "ollama",
            "model": "codegemma"
        }
    },
    "completion": {
        "model": "model1",
        "parameters": {
            "fim": {
                "start": "<|fim_begin|>",
                "middle": "<|fim_hole|>",
                "end": "<|fim_end|>"
            },
            "max_context": 2000,
            "options": {
                "num_predict": 32
            }
        }
    }
}

However, when I tested it, I didn't get any completions. Here are the LSP logs of lsp-ai:

Server Logs:

stderr: ERROR lsp_ai::memory_worker: error in memory worker task: Error getting rope slice
stderr: ERROR lsp_ai::transformer_worker: generating response: channel closed
stderr: ERROR lsp_ai::memory_worker: error in memory worker task: Error getting rope slice
stderr: ERROR lsp_ai::transformer_worker: generating response: channel closed
stderr: ERROR lsp_ai::memory_worker: error in memory worker task: Error getting rope slice
stderr: ERROR lsp_ai::transformer_worker: generating response: channel closed
stderr: ERROR lsp_ai::memory_worker: error in memory worker task: Error getting rope slice
stderr: ERROR lsp_ai::transformer_worker: generating response: channel closed
stderr: ERROR lsp_ai::memory_worker: error in memory worker task: Error getting rope slice
stderr: ERROR lsp_ai::transformer_worker: generating response: channel closed

Server Logs (RPC):

// Send:
{"jsonrpc":"2.0","id":8,"method":"textDocument/completion","params":{"textDocument":{"uri":"file:///home/raunak/Documents/zed-lsp-ai/test.py"},"position":{"line":1,"character":5}}}
// Receive:
{"jsonrpc":"2.0","id":8,"error":{"code":-32603,"message":"channel closed"}}
// Send:
{"jsonrpc":"2.0","id":9,"method":"textDocument/completion","params":{"textDocument":{"uri":"file:///home/raunak/Documents/zed-lsp-ai/test.py"},"position":{"line":2,"character":5}}}
// Receive:
{"jsonrpc":"2.0","id":9,"error":{"code":-32603,"message":"channel closed"}}

Is this an issue with the editor, lsp-ai, or is it my fault?

@bajrangCoder
Author

After a few tries, lsp-ai starts working, but completions are not shown. In the server logs (RPC), I get something like this (after a long delay):

.....
// Receive:
{"jsonrpc":"2.0","id":11,"result":{"isIncomplete":false,"items":[{"filterText":"def a","kind":1,"label":"ai - b: int ->int(x,y)\n\nreturn x + y\n\n\nfunction c <| fim begin ||> d  |<|{print(\"Hello World","textEdit":{"newText":"b: int ->int(x,y)\n\nreturn x + y\n\n\nfunction c <| fim begin ||> d  |<|{print(\"Hello World","range":{"end":{"character":5,"line":1},"start":{"character":5,"line":1}}}}]}}
// Receive:
{"jsonrpc":"2.0","id":13,"result":{"isIncomplete":false,"items":[{"filterText":"def add","kind":1,"label":"ai -  (a: int, b :int) ->  | fim begin | >result < |end match result do if true then return(add_(1","textEdit":{"newText":" (a: int, b :int) ->  | fim begin | >result < |end match result do if true then return(add_(1","range":{"end":{"character":7,"line":1},"start":{"character":7,"line":1}}}}]}}

Seems like it's working, so I guess it's an editor problem.

@SilasMarvin
Owner

Thanks for creating an issue! I haven't tested it in Zed yet. Can you share your configuration for Zed so I can try to recreate it and fix these bugs?

@SilasMarvin SilasMarvin added the bug Something isn't working label Jun 11, 2024
@bajrangCoder
Author

bajrangCoder commented Jun 11, 2024

Thanks for creating an issue! I haven't tested it in Zed yet. Can you share your configuration for Zed so I can try to recreate it and fix these bugs?

Here is the extension source: https://github.com/bajrangCoder/zed-lsp-ai

In Zed, we can't configure an arbitrary/unknown language server directly; it has to be added through an extension. But we can configure the LSP options from the settings, as sketched below.
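
For example, with the extension installed, passing those initialization options through Zed's settings.json would look roughly like this (a minimal sketch; "lsp-ai" as the server name is an assumption here, it should be whatever name the extension registers):

{
    "lsp": {
        "lsp-ai": {
            "initialization_options": {
                "memory": {
                    "file_store": {}
                },
                "models": {
                    "model1": {
                        "type": "ollama",
                        "model": "codegemma"
                    }
                },
                "completion": {
                    "model": "model1",
                    "parameters": {
                        "fim": {
                            "start": "<|fim_begin|>",
                            "middle": "<|fim_hole|>",
                            "end": "<|fim_end|>"
                        },
                        "max_context": 2000,
                        "options": {
                            "num_predict": 32
                        }
                    }
                }
            }
        }
    }
}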

@SilasMarvin
Owner

Got it, thank you! I will look at this tonight.

@SilasMarvin
Owner

I spent some time playing around with this. As you saw, I did see the completion requests being fulfilled by LSP-AI, but Zed doesn't always show them. I think this might be a bug in Zed, but I'm not sure. It would be really awesome to have a good integration with their editor. Maybe create an issue on their GitHub? If you do, can you tag me in it or send me the link? I would love to see what they say.

Also, how do you find codegemma's results? I was not able to get it to produce good outputs. I found llama3-8b with some decent prompting greatly outperforms it, but maybe I am prompting it incorrectly.

@bajrangCoder
Author

bajrangCoder commented Jun 12, 2024

but Zed doesn't always show them

Does Zed show completions sometimes, as you mentioned, but not always? 😳

I had already shared this in their Discord; btw, I will create an issue for it on their GitHub.

Also, how do you find codegemma's results?.......

Currently, Ollama fails to install any model on my system. Previously, I had Codestral and Llama3, but when I tried to install a new model, I faced some issues. I thought it was a problem with my system, so I reinstalled Ollama. Now, I have lost all installed models.

Previously, I downloaded the Codegemma binary (.gguf) just for testing and added it locally to try out LSP-AI. That is why I am compelled to use Codegemma. (Yep, it's not good for coding, even though its name suggests otherwise.)

@SilasMarvin
Owner

SilasMarvin commented Jun 12, 2024

There may be a rather large quality loss when converting it to GGUF. I was testing it outside of LSP-AI using Ollama's Python library directly.
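
For reference, a direct test along those lines might look like this (a minimal sketch, assuming the ollama Python package is installed and codegemma has been pulled locally; the FIM token order is taken from the config above and is an assumption, so check the model card for the exact tokens):

import ollama

# Build a fill-in-the-middle prompt by hand, using the same tokens as the
# lsp-ai config above (an assumption: verify against the model card which
# FIM tokens the model was actually trained with).
prefix = "def add(a: int, b: int) -> int:\n    "
suffix = "\n\nprint(add(1, 2))\n"
prompt = f"<|fim_begin|>{prefix}<|fim_hole|>{suffix}<|fim_end|>"

# ollama.generate sends a completion request to the local Ollama server.
response = ollama.generate(
    model="codegemma",
    prompt=prompt,
    options={"num_predict": 32},  # same cap as in the config above
)

# The generated text is what should fill the hole between prefix and suffix.
print(response["response"])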

Thanks for creating the issue! Excited to see what the Zed team says.

@SilasMarvin SilasMarvin changed the title LSP issue: Error getting rope slice Zed Extension Jun 12, 2024
@SilasMarvin SilasMarvin added enhancement New feature or request and removed bug Something isn't working labels Jun 12, 2024
@jokeyrhyme

I noticed that Zed does already support Ollama and a few other model execution frameworks/services: https://github.com/zed-industries/zed/tree/main/crates/language_model/src/provider

I understand that lsp-ai has a different range of supported frameworks and is flexible in different ways, so it's still exciting to see a proper Zed extension, but some users may find it is already not necessary for their use cases.

@bajrangCoder
Author

I noticed that Zed does already support Ollama and a few other model execution frameworks/services: https://github.com/zed-industries/zed/tree/main/crates/language_model/src/provider

That's the code assistant: chat-style AI features built into the editor. It's not like Copilot.

@bajrangCoder
Author

I understand that lsp-ai has a different range of supported frameworks and is flexible in different ways, so it's still exciting to see a proper Zed extension,

Yes, that's why it's cool: it's an alternative to tools like Copilot or Codeium.

but some users may find it is already not necessary for their use cases

Maybe, but many people want other tools instead of Copilot.
