[AI Evaluation] Support response caching for safety evaluators #6260

@shyamnamboodiripad

Description

Unlike the Quality evaluators (which talk to an LLM and cache the LLM responses using ResponseCachingChatClient), the Safety evaluators, which talk to the Azure AI Content Safety service, currently do not cache the evaluation responses returned by the service. This issue tracks the task of adding response caching support for the Safety evaluators as well.
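
For context, a minimal sketch of how response caching is enabled for the Quality evaluators today, assuming the `DiskBasedReportingConfiguration.Create` API from the Microsoft.Extensions.AI.Evaluation.Reporting package (exact parameter names and signatures are assumptions based on the package's public surface, and `GetChatClient` is a hypothetical helper):

```csharp
using Microsoft.Extensions.AI;
using Microsoft.Extensions.AI.Evaluation.Quality;
using Microsoft.Extensions.AI.Evaluation.Reporting;
using Microsoft.Extensions.AI.Evaluation.Reporting.Storage;

// Hypothetical helper that returns an IChatClient for the evaluation LLM.
IChatClient chatClient = GetChatClient();

// Quality evaluators talk to an LLM, and the reporting layer can wrap the
// IChatClient so that repeated runs reuse previously cached LLM responses
// (enableResponseCaching below). Safety evaluators call the Azure AI
// Content Safety service directly, so their responses currently bypass
// this cache; this issue tracks adding an equivalent caching layer for them.
ReportingConfiguration reportingConfig = DiskBasedReportingConfiguration.Create(
    storageRootPath: "./eval-results",
    evaluators: [new CoherenceEvaluator()],
    chatConfiguration: new ChatConfiguration(chatClient),
    enableResponseCaching: true);
```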

Labels

area-ai-eval (Microsoft.Extensions.AI.Evaluation and related)
