This repository was archived by the owner on Jul 16, 2025. It is now read-only.

Description
Given this user configuration:

```yaml
llm_chain:
    embedder:
        default:
            platform: 'llm_chain.platform.mistral'
            model:
                name: 'Embeddings'
                version: 'mistral-embed'
```
If the embeddings model name is "Embeddings" (as it is for OpenAI, Mistral, and Google), the model object injected into the `Embedder` object will always be `PhpLlm\LlmChain\Platform\Bridge\OpenAI\Embeddings`, due to https://github.com/php-llm/llm-chain-bundle/blob/main/src/DependencyInjection/LlmChainExtension.php#L462
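As a rough illustration (a simplified, hypothetical sketch of name-based resolution, not the actual extension code), matching on the model name alone behaves roughly like this:

```php
<?php
// Hypothetical sketch: resolving the bridge class from the model name only.
// Because several platforms all name their embeddings model 'Embeddings',
// a pure name match cannot tell them apart.
$bridgeClass = match ($modelName) {
    'Embeddings' => \PhpLlm\LlmChain\Platform\Bridge\OpenAI\Embeddings::class,
    // Mistral's and Google's 'Embeddings' models hit the same arm above,
    // so the OpenAI bridge class is always selected regardless of platform.
    default => \PhpLlm\LlmChain\Platform\Model::class,
};
```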
I see 3 solutions to fix this issue:
- Force the Embeddings model name to be prefixed or suffixed with the platform name (e.g. `MistralEmbeddings`, `OpenAIEmbeddings`)
- Check the platform name to determine which Embeddings class to set in the `Definition`
- Rename the `name` config entry to `className` and allow users to pass `PhpLlm\LlmChain\Platform\Bridge\Mistral\Embeddings` as the value. In `LlmChainExtension` we could then check that the class exists and that it extends `PhpLlm\LlmChain\Platform\Model`.
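For the third option, the check in the extension could look something like this (a minimal sketch, assuming the config key is renamed to `className`; variable names and the exception type are illustrative, not the actual `LlmChainExtension` code):

```php
<?php
// Hypothetical sketch of option 3: validate a user-supplied class name
// before building the service Definition for the model.
use Symfony\Component\DependencyInjection\Definition;

// e.g. 'PhpLlm\LlmChain\Platform\Bridge\Mistral\Embeddings' from config.
$className = $config['model']['className'];

// Reject values that are not loadable classes or do not extend the
// base Model class from llm-chain.
if (!class_exists($className)
    || !is_subclass_of($className, \PhpLlm\LlmChain\Platform\Model::class)
) {
    throw new \InvalidArgumentException(sprintf(
        'Model class "%s" must exist and extend "%s".',
        $className,
        \PhpLlm\LlmChain\Platform\Model::class,
    ));
}

// The platform-specific class is now chosen explicitly by the user,
// so no name-based guessing is needed.
$modelDefinition = new Definition($className);
```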
@chr-hertel @OskarStark you probably have a clearer idea of what the best solution is here, or maybe there is another one?
Backported in symfony/ai#59