Issues: OpenBMB/MiniCPM
[Feature Request]: mlcllm support for minicpm3
feature · #265 · opened Nov 19, 2024 by lin-calvin
[Bug]: return_dict=False is not working in minicpm3
bug, triage · #263 · opened Nov 18, 2024 by TianmengChen
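For readers hitting the same behaviour, a minimal reproduction of the return_dict report above might look like the sketch below. It assumes the standard Hugging Face transformers loading path and the openbmb/MiniCPM3-4B checkpoint; neither detail is taken from the issue itself.

    # Hypothetical minimal repro for #263 (return_dict=False); the checkpoint
    # name and transformers usage are assumptions, not taken from the issue.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    name = "openbmb/MiniCPM3-4B"
    tok = AutoTokenizer.from_pretrained(name, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        name, torch_dtype=torch.float16, trust_remote_code=True
    )

    inputs = tok("hello", return_tensors="pt")
    # With return_dict=False a transformers forward pass is expected to return
    # a plain tuple rather than a CausalLMOutputWithPast object.
    outputs = model(**inputs, return_dict=False)
    print(type(outputs))  # the issue reports this does not behave as expected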
[Feature Request]: History checkpoints of 1B model
feature · #262 · opened Nov 16, 2024 by ZeguanXiao
[Feature Request]: Script used to convert MiniCPM to Llama format
feature · #261 · opened Nov 14, 2024 by ZeguanXiao
[Feature Request]: post pre-training
feature · #260 · opened Nov 13, 2024 by zhangzhixun1999
[Feature Request]: Suggest changing --include localhost:1 to --include localhost:0 in the .sh files under finetune
feature · #255 · opened Oct 16, 2024 by wsstudent
[Feature Request]: Roughly how many CPU cores and how much memory are needed to run the 4B model on CPU only?
feature · #254 · opened Oct 14, 2024 by zaojiahua
[Bad Case]: error loading model architecture: unknown model architecture: 'minicpm3' — time=2024-10-12T20:12:01.292+08:00 level=ERROR source=sched.go:456 msg="error loading llama server" error="llama runner process has terminated: this model is not supported by your version of Ollama. You may need to upgrade"
badcase · #253 · opened Oct 14, 2024 by Hoyxxx
[Bug]: MiniCPM3 throws an error during GPTQ quantization
bug, triage · #252 · opened Oct 11, 2024 by vokkko
Due to FlashAttention, inference cannot be performed on V100
badcase · #249 · opened Sep 28, 2024 by jiangchengchengark
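FlashAttention kernels require Ampere-class GPUs (compute capability 8.0+), which is why they fail on a V100. One possible workaround, sketched below under the assumption that the model is loaded through Hugging Face transformers and that the remote modeling code honours the attn_implementation flag, is to fall back to the eager attention path:

    # Sketch of a fallback for pre-Ampere GPUs such as the V100; the checkpoint
    # name and the attn_implementation flag being honoured by the remote code
    # are assumptions, not confirmed by the issue.
    import torch
    from transformers import AutoModelForCausalLM

    model = AutoModelForCausalLM.from_pretrained(
        "openbmb/MiniCPM3-4B",
        torch_dtype=torch.float16,
        trust_remote_code=True,
        attn_implementation="eager",  # avoid FlashAttention kernels
    )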
[Feature Request]: Does MiniCPM3 still use the "coarse-quality data in the Stable stage + high-quality data in the Decay stage" scheme?
feature · #248 · opened Sep 28, 2024 by huyiwen
[Question]: When running convert_hf_to_gguf for llama.cpp deployment, why does modify_tensors need to transform some of the weights?
#247 · opened Sep 25, 2024 by FdyCN
[Feature Request]: Will the base weights of MiniCPM3-4B be open-sourced?
feature · #246 · opened Sep 24, 2024 by Rookie-Kai
[Bug]: RuntimeError
bug, triage · #245 · opened Sep 24, 2024 by WoNiuHu
[Feature Request]: Need LoRA model in .gguf format
feature · #243 · opened Sep 20, 2024 by bioinformatist
[Feature Request]: Can tool calling support the OpenAI API function call format?
feature · #239 · opened Sep 16, 2024 by lonngxiang
ImportError: cannot import name 'SamplerOutput' from 'vllm.sequence' (/root/miniconda3/lib/python3.11/site-packages/vllm/sequence.py)
bug, triage · #236 · opened Sep 14, 2024 by badarrrr
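The ImportError above usually points to a vLLM version mismatch: SamplerOutput was importable from vllm.sequence in older releases and was relocated in later ones. The sketch below shows a version-tolerant import; the fallback module path is an assumption about newer vLLM layouts rather than something stated in the issue.

    # Version-tolerant import sketch; the second path is an assumed location
    # in newer vLLM releases, not confirmed by this issue.
    try:
        from vllm.sequence import SamplerOutput  # older vLLM releases
    except ImportError:
        from vllm.model_executor.layers.sampler import SamplerOutput  # assumed newer layout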