Docker and WebUI #38
base: main
Conversation
Does not work. Results in a
http://localhost:8000/docs#/ is working for me. What do the logs say? `docker logs janus`
Thanks for sharing the Docker script. It sets up OK, but now I need to interface with it. After installing the Docker container, the test page at http://localhost:8000/ goes to: The output from the janus logs is:
You should be good to go, as that means the server is up. Try http://localhost:8000/docs
You are right, thank you. Going to /docs gives me the FastAPI documentation page, which even I can use to get results now. Thank you!
Ok, super impressed so far, and thanks for making it this easy to use. This is going to sound like a stupid question, but which size model is the Docker image using?
Good question, as the current quality isn't very high. I'll look into the possibility of running the Pro model. Currently it's Janus 1.3B. Perhaps I'll make this settable as an environment variable.
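Making the model settable via an environment variable could look like the minimal sketch below. The `MODEL_NAME` variable name comes from the `docker run` command later in this thread; the default model id and the function are assumptions, not the repo's actual startup code.

```python
# Sketch only: env-var model selection at server startup.
# "deepseek-ai/Janus-1.3B" is assumed to be the current default id.
import os

DEFAULT_MODEL = "deepseek-ai/Janus-1.3B"

def resolve_model_name() -> str:
    """Pick the Hugging Face model id, honoring the MODEL_NAME env var."""
    return os.environ.get("MODEL_NAME", DEFAULT_MODEL)
```

The server would then pass `resolve_model_name()` wherever it currently hard-codes the 1.3B checkpoint, so `-e MODEL_NAME=...` in `docker run` swaps models without an image rebuild.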
The low quality has been great to explore, but I'm hoping the Pro one will give the cats fewer legs. An environment variable would be great.
Pull the latest image, and try:

```shell
docker run -it --rm \
  -p 8000:8000 \
  -d \
  -v huggingface:/root/.cache/huggingface \
  -w /app \
  --gpus all \
  --name janus \
  -e MODEL_NAME=deepseek-ai/Janus-Pro-7B \
  julianfl0w/janus:latest
```

And let me know how it goes. The model is too big for my humble RTX 3060.
…ts to the same 1.3B model
looks like compiled PNG data. Either FastAPI needs to handle the generated image better or you need a better Web UI |
Pull the latest image, run it, and navigate to:
It works
Does it need a lot from the graphics card? Is running it on an NVIDIA T550 possible? I got this at startup of the container:
Not enough memory on your GPU. I get this error when running the Pro-7B model on my NVIDIA RTX 3060. I can run
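The out-of-memory reports above line up with a quick back-of-envelope estimate: at half precision, the weights alone cost about 2 bytes per parameter, before counting activations or caches. A small sketch of that arithmetic (the precision assumption is mine; quantized builds would need less):

```python
# Rough VRAM lower bound for model weights alone, assuming fp16/bf16
# weights (2 bytes per parameter); activations and caches add more.
def min_weight_vram_gib(params_billion: float, bytes_per_param: int = 2) -> float:
    return params_billion * 1e9 * bytes_per_param / 2**30

# Janus-Pro-7B: ~13 GiB of weights, already above a 12 GiB RTX 3060.
# Janus 1.3B: ~2.4 GiB, which fits comfortably on much smaller cards.
```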
Thinking about running it locally with a single command on lots of common hardware, I wonder if there is a llamafile for it out there somewhere?
Hi, does anyone know of a project with an OpenAI-like API to replace DALL-E with Janus Pro? It would pair nicely with open-webui.
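For context on what such an adapter would need to accept: a sketch of the OpenAI Images-style request body that a client like open-webui sends. The Janus server in this thread does not necessarily expose this route; the field names follow the public OpenAI Images API, and the default size here is just an illustration.

```python
# Hypothetical: the JSON body for an OpenAI-compatible
# /v1/images/generations endpoint. An adapter in front of Janus would
# parse this and return base64-encoded images in its response.
import json

def images_generations_body(prompt: str, n: int = 1, size: str = "384x384") -> str:
    """Serialize an OpenAI Images-API-style request body."""
    return json.dumps({"prompt": prompt, "n": n, "size": size})
```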
This Docker implementation makes it easy for people to run the server and a helpful web UI with a single command:
You can make sure it's running by navigating to
http://localhost:8000/webui
or,
`docker logs janus`
NOTE: You will need NVIDIA Container Runtime or equivalent