Add Ollama-webui package and service for Mixtral #275448
Conversation
This pull request has been mentioned on NixOS Discourse. There might be relevant details there: https://discourse.nixos.org/t/how-to-package-static-single-page-nodejs-webapp-ollama-webui/37074/1
Force-pushed from a4d4021 to 13003f5.
Force-pushed from 9bb3767 to e07509d.
Hi @malteneuss, thanks for packaging Ollama Web UI. 💙 In the meantime, Ollama has been added as a service (but I haven't tested the service yet). What's missing for bringing the Ollama Web UI to Nixpkgs? 😊
As-is this does not build, but I have been running this PR with a few changes to get it to build locally for a few days and it's been working well. I have not looked at what would be required to land it in nixpkgs.
To make open-source large language models (LLMs) accessible, there are projects like Ollama that make it almost trivial to download and run them locally on a consumer computer. We already have Ollama in Nixpkgs, but that can only be run conveniently in a terminal (and doesn't store previous chats). What's missing is a web UI, e.g. Ollama-WebUI, which mimics ChatGPT's frontend and integrates nicely with Ollama.
Make it convenient to set up Ollama-Webui on a server to run large language models (LLMs).
Force-pushed from e07509d to d48979f.
@trzpiot I wanted to add an automatic NixOS test (since ollama-webui needs to closely follow ollama; otherwise it broke a few times), but I haven't had time to learn how to set this up. Maybe a manual test would suffice for now to get it merged. Thanks for redirecting me to the existing ollama service (someone was faster ;) but I still would like to add some more knobs to make Ollama deployable to a home-lab server). @mschwaig Thanks for testing it out. I pushed fixes similar to yours into the MR. I will try to test the setup during the weekend. Maybe you could do the same?
I have been running this branch since yesterday. Two things that I noticed were:
Overall it seems to be running fine. When you have added the extra config options that you still wanted to add, I think you could remove the draft flag from this PR, and at that point we can get someone to take a look who feels confident in reviewing the systemd service configs.
ollama-webui-package = mkPackageOption pkgs "ollama-webui" { };
host = mkOption {
this host isn't used anywhere, is that intentional?
in {
  ExecStart = "${cfg.ollama-webui-package}/bin/ollama-webui --port ${toString cfg.port} ${cors-arg}";
  DynamicUser = "true";
  Type = "simple";
just for information, the default type for services is simple. I'm not sure how I feel about having it written explicitly; if you prefer it that way, feel free to keep it.
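For illustration, a minimal sketch of the same unit with Type left out (systemd falls back to simple when ExecStart is set and no BusName is given); the option names come from this PR, the rest is assumed:

```nix
systemd.services.ollama-webui = {
  wantedBy = [ "multi-user.target" ];
  serviceConfig = {
    # Type defaults to "simple" when only ExecStart is set, so it can be omitted.
    ExecStart = "${cfg.ollama-webui-package}/bin/ollama-webui --port ${toString cfg.port} ${cors-arg}";
    DynamicUser = true;
  };
};
```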
npmDepsHash = "sha256-SI2dPn1SwbGwl8093VBtcDsA2eHSxr3UUC+ta68w2t8=";
# We have to bake in the default URL it will use for the ollama webserver here,
# but it can be overridden in the UI later.
when you say it can be overridden later, does that mean inside the module?
just wondering if you still need to add the override.
Optional: This module is configured to run locally, but can be served from a (home) server,
ideally behind a secured reverse-proxy.
Look at <https://nixos.wiki/wiki/Nginx> or <https://nixos.wiki/wiki/Caddy>
on how to set up a reverse proxy.
do you think referencing the Caddyfile of the original repo would be a good idea?
https://github.com/ollama-webui/ollama-webui/blob/main/Caddyfile.localhost
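As an illustration of the reverse-proxy note above, a minimal nginx sketch; the virtual host name is a placeholder, TLS/ACME is omitted, and only the port option is taken from this module:

```nix
services.nginx = {
  enable = true;
  virtualHosts."ollama.example.com" = {
    # Placeholder host name; add forceSSL/enableACME as needed.
    locations."/" = {
      proxyPass = "http://127.0.0.1:${toString config.services.ollama-webui.port}";
      proxyWebsockets = true; # the chat UI streams responses
    };
  };
};
```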
PUBLIC_API_BASE_URL = "http://localhost:11434/api";
# The path '/ollama/api' will be redirected to the specified backend URL
OLLAMA_API_BASE_URL = PUBLIC_API_BASE_URL;
the env var only seems to be set after the build has succeeded:
https://github.com/ollama-webui/ollama-webui/blob/main/Dockerfile#L22
does the build fail without it?
if not, this is probably something that should be set in the service, not in the package.
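If the frontend really did read the variable at runtime (which is not a given for a statically built SPA), moving it out of the package could look roughly like this sketch; the value is just the default already used in this PR:

```nix
# Sketch only: assumes the built frontend reads this at runtime rather than at build time.
systemd.services.ollama-webui.environment = {
  OLLAMA_API_BASE_URL = "http://localhost:11434/api";
};
```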
mkdir -p $out/lib
cp -R ./build/. $out/lib
mkdir -p $out/bin
you don't need those last instructions.
I could be wrong, but if this is a standard node project, you just need to package the build directory;
then in the service you can use node directly to just run the build directory.
here are some references of how we did this with lemmy:
name = "lemmy-ui";
cfg = config.services.lemmy;
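A rough sketch of that lemmy-style approach: it assumes the SvelteKit build output installed into $out/lib can be run by node directly (as with adapter-node, which emits an index.js); the option name is reused from this PR, the rest is illustrative:

```nix
# Sketch: run the packaged build directory with node from the service,
# instead of shipping a bin/ollama-webui wrapper in the package.
systemd.services.ollama-webui.serviceConfig.ExecStart =
  "${pkgs.nodejs}/bin/node ${cfg.ollama-webui-package}/lib";
```

This keeps the package free of runtime assumptions and leaves configuration to the module.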
second point, and this is entirely optional.
rather than making a completely separate service for ollama-webui, how about including it in the original ollama service?
I don't think it makes sense to have ollama-webui as a standalone for now.
(you can take inspiration from the lemmy module for what we did if you find stuff that you like).
just a bit more context on not providing an executable for a node package.
the main reason is that without several environment variables set, the binary is just not runnable. So most of the time you package the build directory, and in the service you provide everything necessary to run it.
if you prefer to provide a binary, I respect your decision; in that case, you should probably use makeWrapper. You can look at jellyseerr for an example (you'll find many more if you don't like this particular one).
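If a bin/ollama-webui entry point is still preferred, a makeWrapper sketch in that spirit could look roughly like this; the wrapped paths and the default environment variable are assumptions for illustration, not this PR's actual code:

```nix
# In the package derivation (sketch):
nativeBuildInputs = [ makeWrapper ];

postInstall = ''
  # Wrap node so the required environment travels with the "binary".
  makeWrapper ${nodejs}/bin/node $out/bin/ollama-webui \
    --add-flags "$out/lib" \
    --set-default OLLAMA_API_BASE_URL "http://localhost:11434/api"
'';
```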
> second point, and this is entirely optional.
> rather than making a completely separate service for ollama-webui, how about including it in the original ollama service?
> I don't think it makes sense to have ollama-webui as a standalone for now.
> (you can take inspiration from the lemmy module for what we did if you find stuff that you like).
Since ollama-webui is a community project and not officially associated with ollama (see the top of their README.md), I tend to think it does not make sense to have the ollama-webui service live under ollama as ollama.ui or ollama.web-ui.
There are a bunch of other frontends that people might want to use with the ollama service, such as oterm.
hey, I've provided a couple of comments, but all in all, this is very nice! Thank you for your contribution. I'm here to help, so let me know if you need anything!
Hi, I just used your branch, and I noticed two things:
Thanks for creating this PR!
@malteneuss are you still interested in/do you still have time to move this forward? Would it be OK for someone else to pick it up? For those who are interested in this functionality, I have a branch with a rough update of this branch to a more recent version of what's now called open-webui: https://github.com/mschwaig/nixpkgs/tree/open-webui To me it looks like those more recent versions also require packaging some Python (I think mostly for RAG), which blows up the scope a bit, especially since some dependencies have not landed in nixpkgs yet.
@happysalada @mschwaig Thanks for the review comments and for pushing this topic further. I've been busy with my full-time job and a two-year-old, and have little to no time to move this forward in the next months. Would your branch be stable and usable enough to be merged and improved in smaller steps?
HOME = "%S/ollama"; | ||
OLLAMA_MODELS = "%S/ollama/models"; |
Is this still ongoing?
As far as I remember, the chat functionality is working on my branch (linked above). I do not have a lot of time to look into this in the next two weeks. I would recommend that you use my branch as a starting point.
Already done with #316248
Description of changes
Related to #273556:
Background:
To make open-source large language models (LLMs) accessible, there are projects like Ollama that make it almost trivial to download and run them locally on a consumer computer.
We already have Ollama in Nixpkgs, but that can only be run conveniently in a terminal (and doesn't store previous chats). What's missing is a web UI, e.g. Ollama-WebUI, which mimics ChatGPT's frontend and integrates nicely with Ollama.
Things done
- Is sandboxing enabled in nix.conf? (See Nix manual)
  - sandbox = relaxed
  - sandbox = true
- Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
- Tested basic functionality of all binary files (usually in ./result/bin/)

Add a 👍 reaction to pull requests you find important.