See releases for more.
Bumps support range to include OpenAI Python SDK v1.54.
Bumps support range to include OpenAI Python SDK v1.51.
OpenAI Python SDK minimum version is now v1.50.
Bumps support range to include OpenAI Python SDK v1.47.
Bumps support range to include OpenAI Python SDK v1.45 and includes the o1 family of models in the fetch models response.
Bumps support range to include OpenAI Python SDK v1.43.
Adds support for moderations endpoint.
Bumps support range to include OpenAI Python SDK v1.42. Also adds tox environments for testing against Pydantic v1 and v2.
Adds support for structured outputs and the `beta.chat.completions` route.
Misc:
- Adds a working example of how to use this library to test `langchain-openai` (asked about in #58: Would `openai-responses-python` work with `langchain_openai`?)
Bumps support range to include OpenAI Python SDK v1.40.
Bumps support range to include OpenAI Python SDK v1.38. Also removes the dependency on `faker`.
Bumps support range to include OpenAI Python SDK v1.37.
Bumps support range to include OpenAI Python SDK v1.36.
Bumps support range to include OpenAI Python SDK v1.35.
Breaking change that removes the `event` method on the event stream classes and requires the user to explicitly define and construct the generated events.
This will help cut down on feature drift over time, since this relies on OpenAI's types instead of custom code that tried to make things easier for the user.
Pulls in breaking changes from OpenAI Python SDK v1.32+ and pins version support to `>=1.32,<1.35`.
Pins OpenAI SDK version to `>=1.25,<1.32`.
Adds vector store responses
Closed:
- #24: feat: support vector store endpoints
- #47: feat: support vector store file batch endpoint
- #49: feat: create vector store from assistant and thread create
Adds routes for models and retrieving file content.
Adds setter for `OpenAIMock` state store.
**Important:** Breaking change

**Note:** ✨ Streaming support is here
Tons of changes:
- New `EventStream` and `AsyncEventStream` objects to create mock event streams
- `OpenAIMock` class now exposes state store through `state` property
- Updated and more organized API
- Replacement of `calls` property on routes in favor of `route` property
- New examples:
  - Create run with streaming
  - Create run with streaming (async)
  - Exporting router as transport to use as `http_client` in OpenAI client constructor
- Plus some small bug fixes and QoL enhancements
Overriding the base URL was not working properly for Azure endpoints. Thanks @mapohjola for pointing this out. This moves the version prefix (i.e. `/v1`) from the OpenAI routes to the default base URL. Also added an example using Azure endpoints.
Fixes incorrect partial type definition for run step tool calls.
Adds `router` property on the `OpenAIMock` class to expose the instance of `respx.MockRouter`. This allows the user to add additional routes to mock (e.g. non-OpenAI API calls) or to enable a call to a route to pass through to the external service.
Usage example can be found here.
Fixes:
- #31: feat: add support for null args on create_and_run
- #32: feat: add support for parameters in client.beta.threads.messages.list
Thanks @pietroMonta42 for finding these issues.
**Important:** Breaking change
Introducing an all-new API that is both simpler to use and much more flexible. See docs for more.
In addition to a new API, this release closed these issues:
- #1: feat: ability to raise exceptions
- #9: feat: base url override
- #28: feat: automatically share state between chained mocks
Additional notes:
- Removes token estimation. It is now the user's responsibility to provide the mock token count
- Adds more example files
- Still not completely happy with the current state of mocking run steps. Will likely change in the near future.
**Warning:** Deprecated
Fixes issue where messages included in run create params (using `additional_messages`) were ignored.
**Caution:** Yanked
Migrates assistant endpoints to Assistants V2
**Caution:** Yanked
Fixes some issues with chat completions and other stateless mocks
**Caution:** Yanked
Initial release with minimally useful support for what I needed.