Sync experiments branch with main 0.18.2 #186

Merged: 22 commits, Jan 31, 2024
2 changes: 1 addition & 1 deletion async-openai/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "async-openai"
version = "0.18.0"
version = "0.18.2"
authors = [
"Himanshu Neema"
]
17 changes: 15 additions & 2 deletions async-openai/README.md
@@ -118,9 +118,22 @@ async fn main() -> Result<(), Box<dyn Error>> {

## Contributing

-Thank you for your time to contribute and improve the project, I'd be happy to have you!
+Thank you for taking the time to contribute and improve the project. I'd be happy to have you!

-A good starting point would be existing [open issues](https://github.com/64bit/async-openai/issues).
+All forms of contributions, such as new features requests, bug fixes, issues, documentation, testing, comments, [examples](../examples) etc. are welcome.
+
+A good starting point would be to look at existing [open issues](https://github.com/64bit/async-openai/issues).
+
+To maintain quality of the project, a minimum of the following is a must for code contribution:
+- **Documented**: Primary source of doc comments is description field from OpenAPI spec.
+- **Tested**: Examples are primary means of testing and should continue to work. For new features supporting example is required.
+- **Scope**: Keep scope limited to APIs available in official documents such as [API Reference](https://platform.openai.com/docs/api-reference) or [OpenAPI spec](https://github.com/openai/openai-openapi/). Other LLMs or AI Providers offer OpenAI-compatible APIs, yet they may not always have full parity. In such cases, the OpenAI spec takes precedence.
+- **Consistency**: Keep code style consistent across all the "APIs" that library exposes; it creates a great developer experience.
+
+This project adheres to [Rust Code of Conduct](https://www.rust-lang.org/policies/code-of-conduct)
+
+## Complimentary Crates
+- [openai-func-enums](https://github.com/frankfralick/openai-func-enums) provides procedural macros that make it easier to use this library with OpenAI API's tool calling feature. It also provides derive macros you can add to existing [clap](https://github.com/clap-rs/clap) application subcommands for natural language use of command line tools. It also supports openai's [parallel tool calls](https://platform.openai.com/docs/guides/function-calling/parallel-function-calling) and allows you to choose between running multiple tool calls concurrently or own their own OS threads.

## Complimentary Crates
- [openai-func-enums](https://github.com/frankfralick/openai-func-enums) provides procedural macros that make it easier to use this library with OpenAI API's tool calling feature. It also provides derive macros you can add to existing [clap](https://github.com/clap-rs/clap) application subcommands for natural language use of command line tools. It also supports openai's [parallel tool calls](https://platform.openai.com/docs/guides/function-calling/parallel-function-calling) and allows you to choose between running multiple tool calls concurrently or own their own OS threads.
22 changes: 22 additions & 0 deletions async-openai/src/embedding.rs
@@ -30,6 +30,7 @@ impl<'c, C: Config> Embeddings<'c, C> {
#[cfg(test)]
mod tests {
use crate::{types::CreateEmbeddingRequestArgs, Client};
+use crate::types::{CreateEmbeddingResponse, Embedding};

#[tokio::test]
async fn test_embedding_string() {
@@ -105,4 +106,25 @@ mod tests {

assert!(response.is_ok());
}

+#[tokio::test]
+async fn test_embedding_with_reduced_dimensions() {
+let client = Client::new();
+let dimensions = 256u32;
+let request = CreateEmbeddingRequestArgs::default()
+.model("text-embedding-3-small")
+.input("The food was delicious and the waiter...")
+.dimensions(dimensions)
+.build()
+.unwrap();
+
+let response = client.embeddings().create(request).await;
+
+assert!(response.is_ok());
+
+let CreateEmbeddingResponse { mut data, ..} = response.unwrap();
+assert_eq!(data.len(), 1);
+let Embedding { embedding, .. } = data.pop().unwrap();
+assert_eq!(embedding.len(), dimensions as usize);
+}
}
3 changes: 2 additions & 1 deletion async-openai/src/file.rs
@@ -83,8 +83,9 @@ mod tests {
//assert_eq!(openai_file.purpose, "fine-tune");

//assert_eq!(openai_file.status, Some("processed".to_owned())); // uploaded or processed
let query = [("purpose", "fine-tune")];

let list_files = client.files().list().await.unwrap();
let list_files = client.files().list(&query).await.unwrap();

assert_eq!(list_files.data.into_iter().last().unwrap(), openai_file);

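The file.rs change above reflects a signature change: `Files::list` now takes a query argument that is serialized into URL query parameters, and the test passes `[("purpose", "fine-tune")]`. Below is a minimal illustrative sketch of how a caller might use it, not part of the diff; it assumes `OPENAI_API_KEY` is set in the environment, a tokio runtime, and that the returned file entries expose `id` and `filename` fields as in the OpenAPI spec.

```rust
use async_openai::{error::OpenAIError, Client};

#[tokio::main]
async fn main() -> Result<(), OpenAIError> {
    // Client::new() picks up OPENAI_API_KEY from the environment.
    let client = Client::new();

    // Any serializable set of key/value pairs works as the query;
    // here we filter uploaded files by purpose, mirroring the updated test.
    let query = [("purpose", "fine-tune")];
    let files = client.files().list(&query).await?;

    for file in files.data {
        println!("{} {}", file.id, file.filename);
    }
    Ok(())
}
```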
2 changes: 1 addition & 1 deletion async-openai/src/types/chat.rs
@@ -1,8 +1,8 @@
use std::collections::HashMap;

use derive_builder::Builder;
-use serde::{Deserialize, Serialize};
use crate::client::OpenAIEventStream;
+use serde::{Deserialize, Serialize};

use crate::error::OpenAIError;

2 changes: 1 addition & 1 deletion async-openai/src/types/completion.rs
@@ -1,8 +1,8 @@
use std::collections::HashMap;

use derive_builder::Builder;
-use serde::{Deserialize, Serialize};
use crate::client::OpenAIEventStream;
+use serde::{Deserialize, Serialize};

use crate::error::OpenAIError;

4 changes: 4 additions & 0 deletions async-openai/src/types/embedding.rs
@@ -46,6 +46,10 @@ pub struct CreateEmbeddingRequest {
/// to monitor and detect abuse. [Learn more](https://platform.openai.com/docs/usage-policies/end-user-ids).
#[serde(skip_serializing_if = "Option::is_none")]
pub user: Option<String>,

+/// The number of dimensions the resulting output embeddings should have. Only supported in text-embedding-3 and later models.
+#[serde(skip_serializing_if = "Option::is_none")]
+pub dimensions: Option<u32>
}

/// Represents an embedding vector returned by embedding endpoint.
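Because of the `skip_serializing_if = "Option::is_none"` attribute, the new `dimensions` field is strictly opt-in: a request built without it serializes exactly as before, so models that predate the parameter are unaffected. A small illustrative sketch of what the builder produces, assuming `serde_json` is available as a dependency; the model names and inputs are only examples:

```rust
use async_openai::types::CreateEmbeddingRequestArgs;

fn main() {
    // With dimensions set, the field appears in the JSON body.
    let with_dims = CreateEmbeddingRequestArgs::default()
        .model("text-embedding-3-small")
        .input("hello world")
        .dimensions(256u32)
        .build()
        .unwrap();
    println!("{}", serde_json::to_string_pretty(&with_dims).unwrap());

    // Without it, Option::is_none keeps `dimensions` out of the body,
    // so the serialized request is identical to what older versions sent.
    let without_dims = CreateEmbeddingRequestArgs::default()
        .model("text-embedding-ada-002")
        .input("hello world")
        .build()
        .unwrap();
    println!("{}", serde_json::to_string_pretty(&without_dims).unwrap());
}
```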
2 changes: 1 addition & 1 deletion async-openai/src/types/fine_tune.rs
@@ -1,6 +1,6 @@
use derive_builder::Builder;
-use serde::{Deserialize, Serialize};
use crate::client::OpenAIEventStream;
+use serde::{Deserialize, Serialize};

use crate::error::OpenAIError;
