
Failed to connect to the server... Seems like you are using the custom OpenAI provider... #471

Open
digitalenterprises opened this issue Nov 17, 2024 · 8 comments
Labels
bug Something isn't working

Comments

@digitalenterprises

Describe the bug
After installing, when I go to the URL http://localhost:3000/ I get two errors:
"Failed to connect to the server. Please try again later."
and
"Seems like you are using the custom OpenAI provider, please open the settings and configure the API key and base URL"

To Reproduce
Steps to reproduce the behavior:

  1. Go to the URL http://localhost:3000/

Expected behavior
I expected the Perplexica screen to load.

Screenshots
Screenshot from 2024-11-16 15-51-50
Screenshot from 2024-11-16 23-18-23
Screenshot from 2024-11-16 23-18-55
Screenshot from 2024-11-16 23-19-11

Additional context
Ollama is not installed in Docker but is running and working at http://127.0.0.1:11434/
SearXNG is running in Docker and working at http://127.0.0.1:32768/
I am running Pop!_OS (Ubuntu-based).
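A likely cause worth noting: since Perplexica runs in Docker while Ollama runs on the host, 127.0.0.1 inside the container refers to the container itself, not the host. One possible fix on Linux (a sketch; the service name below is illustrative, not necessarily Perplexica's actual compose entry) is to map Docker's host gateway:

```yaml
# docker-compose.yaml fragment (service name is illustrative)
services:
  perplexica:
    extra_hosts:
      # On Linux, Docker does not provide host.docker.internal by default;
      # this maps it to the host's gateway IP.
      - "host.docker.internal:host-gateway"
```

With this mapping, the Ollama base URL in the settings would be http://host.docker.internal:11434 instead of a 127.0.0.1 address.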

@digitalenterprises digitalenterprises added the bug Something isn't working label Nov 17, 2024
@ItzCrazyKns
Owner

@digitalenterprises
Author

Thanks for the response.
I have previously tried your suggestions from reading similar issues here.

- Linux: use http://<private_ip_of_host>:11434
- Inside /etc/systemd/system/ollama.service, I added Environment="OLLAMA_HOST=0.0.0.0"

I know the port isn't blocked since I'm using Open WebUI in Docker and it connects to Ollama at http://127.0.0.1:11434
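For reference, the OLLAMA_HOST change can also be applied as a systemd drop-in rather than editing ollama.service in place, and then verified over the LAN address (a sketch of the commonly documented steps; adjust to your setup):

```shell
# Create a drop-in override instead of editing the unit file directly
sudo systemctl edit ollama
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Reload systemd and restart Ollama so the new environment takes effect
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Verify Ollama answers on the host's LAN address, not just loopback
curl http://<private_ip_of_host>:11434
```

If the last command prints "Ollama is running", the service is listening on all interfaces and should be reachable from inside a container.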

Screenshot from 2024-11-17 08-07-43
Screenshot from 2024-11-17 07-54-26

@ItzCrazyKns
Owner

Please remove the / from the end of the URL. If that doesn't work, can you show the output of a curl request to the same URL?

@ItzCrazyKns ItzCrazyKns reopened this Nov 17, 2024
@digitalenterprises
Author

Removing the slash didn't work. Here is the result of the curl request:

```
$ curl -i http://192.168.111.3:11434
HTTP/1.1 200 OK
Content-Type: text/plain; charset=utf-8
Date: Sun, 17 Nov 2024 14:05:47 GMT
Content-Length: 17

Ollama is running(base)
```

@ItzCrazyKns
Owner

Did you press the blue Save button after removing the / from the URL?

@digitalenterprises
Author

> Did you press the blue Save button after removing the / from the URL?

Yes, I did save it.

@SCharan24

SCharan24 commented Nov 21, 2024

Screenshot 2024-11-21 225537
Had a similar issue, but on Windows. After changing the "Chat Model provider" it worked. Hope this helps.

@digitalenterprises
Author

> Screenshot 2024-11-21 225537
> Had a similar issue, but on Windows. After changing the "Chat Model provider" it worked. Hope this helps.

Thank you for the response. However, I have no options for the chat model provider, only the default.
