UOS-AI support to use ollama local models #7714
marcogutama
started this conversation in
Feature Requests & Ideas | 特性请求 & 头脑风暴 (Feature Requests & Brainstorming)
Replies: 1 comment
-
cc @meiyixiang
-
It would be great if UOS AI added support for using local Ollama models. That way, people who have no internet access, or whose workplaces block AI sites (as in my case), could still use AI in Deepin.
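For context, Ollama serves a local REST API on port 11434 by default, so UOS AI would only need to issue HTTP requests against it. The sketch below (my own illustration, not UOS AI code) builds a non-streaming request for Ollama's `/api/generate` endpoint; the model name `llama3` is just an example of whatever model the user has pulled locally.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: default port 11434)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server."""
    payload = json.dumps({
        "model": model,       # a model previously pulled with `ollama pull`
        "prompt": prompt,
        "stream": False,      # ask for a single JSON response, not a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Example: the request UOS AI could send; actually calling
# urllib.request.urlopen(req) requires a running Ollama server.
req = build_request("llama3", "Hello from UOS AI")
print(req.full_url)
```

Since no API key or external network is involved, this would work entirely offline once a model is downloaded.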