This repository was archived by the owner on Jul 4, 2025. It is now read-only.

Description
Motivation
- Do we package the CUDA toolkit with the engine?
  - Yes? Then we will have to do the same for llamacpp, tensorrt-llm, and onnx.
  - No? Then it will be downloaded separately.
- Folder structure (e.g. if a user has llamacpp and tensorrt installed at the same time)?
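One possible answer to the folder-structure question, sketched purely as a strawman (all paths and names here are hypothetical, not an agreed layout): keep engine binaries and shared dependencies in separate trees, so that engines installed side by side can reference a single CUDA install instead of each bundling its own copy.

```
engines/
├── llamacpp/          # engine binaries only
├── tensorrt-llm/
└── onnx/
deps/
└── cuda-<version>/    # shared CUDA toolkit, downloaded once
```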
Resources
Llamacpp release
Currently we download the CUDA toolkit dependency from https://catalog.jan.ai/dist/cuda-dependencies/<version>/<platform>/cuda.tar.gz
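As a reference for the "download separately" option, here is a minimal sketch of how an engine could build the dependency URL above and unpack the archive. The function names, the destination path, and the example version/platform values are illustrative assumptions, not the actual implementation:

```python
import tarfile
import urllib.request
from pathlib import Path

BASE = "https://catalog.jan.ai/dist/cuda-dependencies"

def cuda_dependency_url(version: str, platform: str) -> str:
    # Follows the URL pattern documented in this issue.
    return f"{BASE}/{version}/{platform}/cuda.tar.gz"

def fetch_cuda_toolkit(version: str, platform: str, dest: Path) -> None:
    # Hypothetical helper: download cuda.tar.gz and extract it into dest.
    dest.mkdir(parents=True, exist_ok=True)
    archive = dest / "cuda.tar.gz"
    urllib.request.urlretrieve(cuda_dependency_url(version, platform), archive)
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)

# Example values are placeholders, not a statement of supported versions:
print(cuda_dependency_url("12.4", "linux-amd64"))
# → https://catalog.jan.ai/dist/cuda-dependencies/12.4/linux-amd64/cuda.tar.gz
```

Sharing one extracted toolkit per version under a common `deps` folder would also answer the folder-structure question for users who install several engines.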
cc @vansangpfiev @nguyenhoangthuan99 @dan-homebrew
Update sub-tasks: