Ollama provider returns no results when ollama is running behind an https proxy on a remote server #64
I am also behind a proxy and I get the following error with Hugging Face:

And I have the following config:

```lua
{
  'gsuuon/model.nvim',
  cmd = { 'M', 'Model', 'Mchat' },
  init = function()
    vim.filetype.add({
      extension = {
        mchat = 'mchat',
      }
    })
  end,
  ft = 'mchat',
  -- keys = {
  --   { '<C-m>d', ':Mdelete<cr>', mode = 'n' },
  --   { '<C-m>s', ':Mselect<cr>', mode = 'n' },
  --   { '<C-m><space>', ':Mchat<cr>', mode = 'n' }
  -- },
  -- To override defaults, add a config field and call setup()
  config = function()
    require('model').setup({
      prompts = require('model.util').module.autoload('prompt_library'),
      chats = {
        ['hf:starcoder'] = {
          provider = require('model.providers.huggingface'),
          options = {
            model = 'bigcode/starcoder'
          },
          builder = function(input)
            return { inputs = input }
          end
        },
      },
    })
  end
}
```

I'm not sure if it is the proxy or my config. Usually curl uses the configured …
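Since the suspicion here is proxy handling, one quick check (my own sketch, not from the thread) is whether the shell environment actually carries the standard proxy variables that curl honors — a curl process spawned from neovim may not inherit the same environment as your interactive shell:

```shell
# List the standard proxy-related environment variables curl honors.
# If only the fallback message prints, curl is not being told about
# any proxy via the environment.
env | grep -i -E '^(https?_proxy|all_proxy|no_proxy)=' \
  || echo "no proxy environment variables set"
```

If the variables are set in your shell but the plugin's requests still fail, the difference in environment between the two curl invocations is a good place to look next.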
@zbindenren That would be a different type of problem. It looks like you may have put a completion prompt where you needed a chat prompt. I think the chat prompts require a …
Are you able to directly curl that endpoint? You can set …

@zbindenren I think @FlippingBinary is right here. I should improve the docs with respect to the distinction between chat prompts and completion prompts (or maybe unify the interface).
@gsuuon When I curl the URL in the console, I get "Ollama is running." I also tried sending a prompt to the URL plus `/api/generate`:

```shell
curl https://ollama.local/api/generate -d '{
  "model": "llama3.1",
  "prompt": "Why is the sky blue?"
}'
```

This gave me a response as a stream that looks valid.

Hmm.. I tried enabling the debugging messages and got two errors instead of the curl args when calling …
Aside from a single blank space, there was no text after …
Are you using a graphical neovim client? There may be something going awry with the dialog event, but the …

Pipe the body into curl with the args, so something like …

Do you see this issue with any of the other providers?
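For reference, a sketch of what piping the body into curl can look like (the hostname `ollama.local` and the JSON body are placeholders taken from earlier in the thread — substitute your own values):

```shell
# Build the request body separately, then hand it to curl on stdin with
# -d @- so the body and the curl args can be inspected independently.
BODY='{"model": "llama3.1", "prompt": "Why is the sky blue?"}'
printf '%s' "$BODY" | curl -sS --connect-timeout 5 -d @- \
  https://ollama.local/api/generate \
  || echo "request failed: check DNS, TLS, and proxy settings"
```

Keeping the body in a variable also makes it easy to diff what the plugin sends against what a known-good manual request sends.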
Well, this is weird. I was sure neither the chat nor the completion was working before. Now the error message I reported above appears to be a problem with Git, not the plugin. When I run …
... followed by the big usage message. I'm not sure why that is, but it doesn't seem to be related to this plugin. I'll close this issue for now because it looks like I just didn't configure the plugin correctly. Thanks for the tips.
No problem! Hope it works out.
It's not clear to me if the server is returning an empty result or if there is some other connectivity problem, but that issue is covered by #35.
Here is the configuration I'm using:
Has anyone else tried using this plugin with Ollama behind an https reverse proxy?