This package is essentially a wrapper around the llama.cpp library; the purpose of this repo is to provide a Swiftier API for Swift developers.
.package(url: "https://github.com/ShenghaiWang/SwiftLlama.git", from: "0.3.0")
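In a full `Package.swift`, that dependency might be wired up as follows (the app target name and platform versions here are illustrative assumptions; the product name `SwiftLlama` is assumed to match the package name):

```swift
// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "MyLlamaApp",                      // hypothetical app target
    platforms: [.macOS(.v13), .iOS(.v16)],   // assumed minimum platforms
    dependencies: [
        .package(url: "https://github.com/ShenghaiWang/SwiftLlama.git", from: "0.3.0")
    ],
    targets: [
        .executableTarget(
            name: "MyLlamaApp",
            dependencies: [.product(name: "SwiftLlama", package: "SwiftLlama")]
        )
    ]
)
```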
let swiftLlama = try SwiftLlama(modelPath: path)
let response: String = try await swiftLlama.start(for: prompt)
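Wrapped in a complete call site, the single-response API might look like this sketch (the model path and prompt are placeholder assumptions):

```swift
let modelPath = "/path/to/model.gguf"  // placeholder: point at a local gguf file

Task {
    do {
        let swiftLlama = try SwiftLlama(modelPath: modelPath)
        // Await the full response as one String.
        let response: String = try await swiftLlama.start(for: "Tell me a joke")
        print(response)
    } catch {
        print("SwiftLlama error: \(error)")
    }
}
```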
for try await value in await swiftLlama.start(for: prompt) {
result += value
}
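The streaming variant above can be put together in a small command-line sketch (the surrounding function and error handling are assumptions about the caller's code, not part of the library):

```swift
// Streams partial output to stdout as the model generates tokens.
func streamResponse(swiftLlama: SwiftLlama, prompt: String) async {
    var result = ""
    do {
        for try await value in await swiftLlama.start(for: prompt) {
            result += value                    // accumulate the full answer
            print(value, terminator: "")       // show each piece as it arrives
        }
        print()
    } catch {
        print("Generation failed: \(error)")
    }
}
```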
await swiftLlama.start(for: prompt)
.sink { _ in
    } receiveValue: { [weak self] value in
        self?.result += value
    }.store(in: &cancellable)
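In a Combine-based view model, the publisher variant might be hosted like this (the `ChatViewModel` type, its `result` property, and the `cancellable` set are assumptions about the caller's code):

```swift
import Combine

final class ChatViewModel: ObservableObject {
    @Published var result = ""                       // accumulated model output
    private var cancellable = Set<AnyCancellable>()
    private let swiftLlama: SwiftLlama

    init(swiftLlama: SwiftLlama) {
        self.swiftLlama = swiftLlama
    }

    func ask(_ prompt: String) async {
        await swiftLlama.start(for: prompt)
            .sink { _ in
                // completion (finished or failure) could be surfaced to the UI here
            } receiveValue: { [weak self] value in
                self?.result += value                // append each streamed piece
            }.store(in: &cancellable)
    }
}
```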
This video shows the command line app running with the Llama 3 model.
To use it in an iOS or macOS app, please refer to the TestProjects folder.
In theory, it should support all the models that llama.cpp supports. However, the prompt format might need to be updated for some models.
If you want to test it out quickly, please use this model: codellama-7b-instruct.Q4_K_S.gguf.
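One way to fetch that model file is with `curl`; this sketch assumes the file is hosted in TheBloke's CodeLlama-7B-Instruct-GGUF repository on Hugging Face, which may change:

```shell
# Assumed hosting location; verify the repository before downloading.
curl -L -o codellama-7b-instruct.Q4_K_S.gguf \
  "https://huggingface.co/TheBloke/CodeLlama-7B-Instruct-GGUF/resolve/main/codellama-7b-instruct.Q4_K_S.gguf"
```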