Conversation

@trrwilson
Member

This PR is a non-exhaustive demonstration of using a custom pipeline transport (a concept from System.ClientModel) to facilitate use of the official OpenAI library from https://github.com/openai/openai-dotnet. It's not intended to merge as-is, but rather to provide a reference for official client integration options and approaches.

Only the non-streaming path is implemented initially -- but streaming shouldn't be too different, especially with the implementation sources available.

Program.cs is an adapted HelloFoundryLocalSdk, but uses the included FoundryLocalChatClient to route chat calls through the OpenAI library.

Note

Integrated into Foundry Local, client retrieval wouldn't require the new public type, as something like the following would "just work" to provide a client:

// Internally: return new FoundryLocalChatClient(this)
ChatClient client = model.GetChatClient();

The guts of the derived client lie in the FoundryLocalPipelineTransport and its nested classes. This transport replaces standard HTTP traffic with calls into Foundry Local CoreInterop, stubbing the substantial portions of functionality that aren't applicable to the scenario.
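For context, consuming a transport like this from the official library is a matter of setting the Transport property that OpenAIClientOptions inherits from System.ClientModel's ClientPipelineOptions. A minimal sketch, assuming the PR's FoundryLocalPipelineTransport takes a model handle (the constructor arguments and model alias below are illustrative, not the PR's actual shape):

```csharp
using System.ClientModel;
using OpenAI;
using OpenAI.Chat;

// Sketch: route the official OpenAI ChatClient through the custom
// transport. OpenAIClientOptions derives from ClientPipelineOptions, so
// its Transport property accepts any PipelineTransport implementation.
var options = new OpenAIClientOptions
{
    // Constructor arguments are illustrative; see the PR for the real shape.
    Transport = new FoundryLocalPipelineTransport(model),
};

// The credential is required by the client surface but never leaves the
// process, since the transport short-circuits HTTP entirely.
ChatClient client = new ChatClient(
    "local-model-alias",
    new ApiKeyCredential("not-used-locally"),
    options);

ChatCompletion completion = client.CompleteChat("Hello from Foundry Local!");
```

Because the substitution happens at the transport layer, everything above it -- serialization, the ChatClient surface, result types -- remains the stock OpenAI library behavior.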

As not all CoreInterop capabilities have external visibility, this use relies on reflection via an encapsulated FoundryLocaInteropWrapper type at the end of the custom transport's nested classes.
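As a rough illustration of what such an encapsulated reflection shim involves -- every assembly, type, and member name below is a hypothetical placeholder, not the actual CoreInterop surface:

```csharp
using System;
using System.Reflection;

// Hypothetical sketch of a reflection wrapper over internal interop members.
// All names here are placeholders; the real wrapper type in the PR targets
// the actual CoreInterop types.
internal static class InteropReflectionSketch
{
    private static readonly MethodInfo s_complete = ResolveCompleteMethod();

    private static MethodInfo ResolveCompleteMethod()
    {
        // Resolve the internal type by assembly-qualified name...
        Type interopType = Type.GetType(
            "Contoso.Local.CoreInterop, Contoso.Local")
            ?? throw new InvalidOperationException("Interop type not found.");

        // ...then bind to its non-public static entry point.
        return interopType.GetMethod(
            "Complete", BindingFlags.NonPublic | BindingFlags.Static)
            ?? throw new InvalidOperationException("Interop method not found.");
    }

    // Typed facade so the rest of the transport never touches reflection.
    public static string Complete(string requestJson)
        => (string)s_complete.Invoke(null, new object[] { requestJson })!;
}
```

Encapsulating the reflection in one type keeps the rest of the transport strongly typed and makes it easy to delete the shim if the interop members later gain public visibility.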

@vercel

vercel bot commented Dec 19, 2025

@trrwilson is attempting to deploy a commit to the MSFT-AIP Team on Vercel.

A member of the Team first needs to authorize it.

