24 changes: 24 additions & 0 deletions .github/workflows/rusttest.yml
@@ -0,0 +1,24 @@
name: Rust-test

on:
pull_request:
paths:
- 'sdk/rust/**'
push:
paths:
- 'sdk/rust/**'
branches:
- main
workflow_dispatch:

jobs:
check:
runs-on: ubuntu-22.04
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Update toolchain
run: rustup update --no-self-update stable && rustup default stable
- name: Run Unit Tests
working-directory: sdk/rust
run: cargo test
24 changes: 15 additions & 9 deletions sdk/rust/README.md
@@ -1,6 +1,6 @@
# Foundry Local Rust SDK

A Rust SDK for interacting with the Microsoft Foundry Local service. This SDK allows you to manage and use AI models locally on your device.
A Rust SDK for interacting with the Microsoft Foundry Local service. This SDK allows you to manage and use AI models locally on your device. See [Foundry Local](http://aka.ms/foundrylocal) for more information.

## Features
- Start and manage the Foundry Local service
@@ -17,17 +17,20 @@ use anyhow::Result;

#[tokio::main]
async fn main() -> Result<()> {
// Create a FoundryLocalManager instance with default options
let manager = FoundryLocalManager::new("phi-3.5-mini", true).await?;
// Create a FoundryLocalManager instance with the option to automatically download and start the service and a model
let manager = FoundryLocalManager::builder()
.alias_or_model_id("phi-3.5-mini")
.bootstrap(true)
.build()
.await?;

// Use the OpenAI compatible API to interact with the model
let client = reqwest::Client::new();
let response = client.post(&format!("{}/chat/completions", manager.endpoint()))
.header("Content-Type", "application/json")
.header("Authorization", format!("Bearer {}", manager.api_key()))
let response = client
.post(format!("{}/chat/completions", manager.endpoint()?))
.json(&serde_json::json!({
"model": manager.get_model_info("phi-3.5-mini").await?.id,
"messages": [{"role": "user", "content": "What is the golden ratio?"}],
"model": model_info.id,
Contributor: It does not look as though the model_info has been declared before being used.

"messages": [{"role": "user", "content": prompt}],
}))
.send()
.await?;
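As the contributor's comment notes, the updated snippet uses `model_info` and `prompt` without declaring them. Below is a minimal sketch of the missing bindings, not part of the PR itself, assuming `get_model_info` keeps the call shape shown in the earlier version of the example and reusing that version's model alias and prompt:

```rust
// Hypothetical completion (not in the PR): bindings the updated snippet
// appears to rely on, reusing the model alias and prompt from the
// earlier version of the example.
let model_info = manager.get_model_info("phi-3.5-mini").await?;
let prompt = "What is the golden ratio?";
```

With those two bindings placed before the request is built, the `json!` body above would compile as written.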
@@ -50,7 +53,10 @@ foundry-local = "0.1.0"

## Requirements

- Foundry Local must be installed and available on the PATH
- Foundry Local must be installed. On Windows, you can run the following to install the latest version:
```powershell
winget install Microsoft.FoundryLocal
```
- Rust 1.70.0 or later

## License