diff --git a/README.md b/README.md
index 491cee7..f250a40 100644
--- a/README.md
+++ b/README.md
@@ -1,7 +1,7 @@

AIGC Application Platform: Lingmengcan AI

English | 中文

-Lingmengcan AI is an AI system based on large models.Currently, it provides functions such as large language model dialogue, model management, text-to-image generation, and background role management. The technology stack used includes Stable Diffusion, OpenAI, ChatGPT, LangChainJS as the AI layer, Vue 3, Naive UI, and Tailwind CSS for the UI layer, and NestJS, LangChainJS, and MySQL for the service layer, with ChromaDB as the vector database. This project is a knowledge base enhancement solution that can achieve **fully localized** reasoning and also provides AIGC functions, focusing on solving the pain points of enterprises regarding data security protection and private domain deployment.
+Lingmengcan AI is an AI system based on large models.Currently, it provides functions such as large language model dialogue, model management, text-to-image generation, and background role management. The technology stack used includes Stable Diffusion, OpenAI, ChatGPT, LangChainJS as the AI layer, Vue 3, Naive UI, and Tailwind CSS for the UI layer, and NestJS, LangChainJS, Ollama, and MySQL for the service layer, with ChromaDB as the vector database. This project is a knowledge base enhancement solution that can achieve **fully localized** reasoning and also provides AIGC functions, focusing on solving the pain points of enterprises regarding data security protection and private domain deployment.

## Community

@@ -52,6 +52,12 @@ Ensure that your development environment meets the following requirements:

### If You Have a Large Language Model Locally: Load the Model from Local

+#### Ollama
+
+Reference for Deploying Models Locally with Ollama [https://github.com/ollama/ollama](https://github.com/ollama/ollama)
+
+#### ChatGLM3
+
Please refer to [THUDM/ChatGLM3#Load the Model from Local](https://github.com/THUDM/ChatGLM3#从本地加载模型)

```bash
diff --git a/README.zh-CN.md b/README.zh-CN.md
index 0591ae6..2dcc710 100644
--- a/README.zh-CN.md
+++ b/README.zh-CN.md
@@ -1,7 +1,7 @@

大模型 AI 应用平台 Lingmengcan AI

English | 中文

-lingmengcan-ai 是一个基于大模型的 ai 系统,目前提供大语言模型对话、模型管理、文生图和后台角色管理等等功能。使用的技术栈,包括 stable deffusion、openai、chatgpt、LangChainJS 作为 ai 层,Vue 3、Naive UI 和 Tailwind CSS 构建 UI 层,以及 NestJS、LangChainJS、MySQL 为服务层,chromadb 为向量数据库。该项目是一个可以实现**完全本地化**推理的知识库增强方案,同时提供 AIGC 功能, 重点解决数据安全保护,私域化部署的企业痛点。
+lingmengcan-ai 是一个基于大模型的 ai 系统,目前提供大语言模型对话、模型管理、文生图和后台角色管理等等功能。使用的技术栈,包括 stable deffusion、openai、chatgpt、LangChainJS 作为 ai 层,Vue 3、Naive UI 和 Tailwind CSS 构建 UI 层,以及 NestJS、LangChainJS、Ollama、MySQL 为服务层,chromadb 为向量数据库。该项目是一个可以实现**完全本地化**推理的知识库增强方案,同时提供 AIGC 功能, 重点解决数据安全保护,私域化部署的企业痛点。

## 社区

@@ -52,6 +52,12 @@ lingmengcan-ai 是一个基于大模型的 ai 系统,目前提供大语言模

### 如果本地已有大语言模型:从本地加载模型

+#### Ollama
+
+通过 Ollama 本地部署模型参考[https://github.com/ollama/ollama](https://github.com/ollama/ollama)
+
+#### ChatGLM3
+
请参考 [THUDM/ChatGLM3#从本地加载模型](https://github.com/THUDM/ChatGLM3#从本地加载模型)

```bash
diff --git a/service/package.json b/service/package.json
index d68fb29..439ea19 100644
--- a/service/package.json
+++ b/service/package.json
@@ -1,6 +1,6 @@
 {
   "name": "lingmengcan-service",
-  "version": "1.0.0.20240911",
+  "version": "1.0.1.20240926",
   "description": "lingmengcan-ai",
   "author": "lingmengcan",
   "private": true,
diff --git a/web/package.json b/web/package.json
index 83d54ec..de03baa 100644
--- a/web/package.json
+++ b/web/package.json
@@ -1,6 +1,6 @@
 {
   "name": "lingmengcan-web",
-  "version": "1.0.0.20240911",
+  "version": "1.0.1.20240926",
   "private": true,
   "type": "module",
   "scripts": {
diff --git a/web/src/views/draw/generate.vue b/web/src/views/draw/generate.vue
index eb561af..88d727b 100644
--- a/web/src/views/draw/generate.vue
+++ b/web/src/views/draw/generate.vue
@@ -11,10 +11,10 @@
-
+
-
+
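
For reference, a minimal sketch of what the Ollama addition to the service layer enables: calling a locally deployed model from LangChainJS, as the updated README describes. This is not the project's actual wiring; the `@langchain/ollama` package, the `qwen2.5` model name, and the default base URL are assumptions for illustration.

```typescript
// Hypothetical sketch — not lingmengcan-ai's actual service code.
// Assumes an Ollama daemon is running locally and a model has been pulled,
// e.g. `ollama pull qwen2.5`.
import { ChatOllama } from "@langchain/ollama";

async function main() {
  const chat = new ChatOllama({
    baseUrl: "http://localhost:11434", // Ollama's default local endpoint
    model: "qwen2.5",                  // illustrative; use any locally pulled model
    temperature: 0.7,
  });

  // Single-turn invocation; inference stays fully local, no data leaves the machine.
  const reply = await chat.invoke("用一句话介绍 lingmengcan-ai。");
  console.log(reply.content);
}

main().catch(console.error);
```

A model must be pulled locally (for example with `ollama pull`) before the service can reach it at the base URL above.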