feat(ollama): add retry template integration to OllamaChatModel #1852
base: main
Conversation
apappascs
commented
Dec 2, 2024
- Integrated RetryTemplate for API calls in OllamaChatModel.
- Ensured consistent retry logic across all model classes.
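To picture what wiring a RetryTemplate into the blocking call path buys, here is a minimal plain-Java sketch of the retry pattern it encapsulates. The class and method names below are illustrative only, not the Spring AI or Spring Retry API:

```java
import java.util.concurrent.Callable;

// Illustrative sketch of the retry loop that a RetryTemplate-style
// helper wraps around a blocking API call.
final class SimpleRetry {

    // Invoke the call, retrying on exception up to maxAttempts times;
    // rethrow the last failure if every attempt fails.
    static <T> T execute(Callable<T> call, int maxAttempts) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.call();
            }
            catch (Exception e) {
                last = e;
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        int[] failuresLeft = { 2 }; // simulate two transient failures, then success
        String result = execute(() -> {
            if (failuresLeft[0]-- > 0) {
                throw new RuntimeException("transient error");
            }
            return "ok";
        }, 3);
        System.out.println(result); // prints "ok" on the third attempt
    }
}
```

In the PR, this loop is not hand-rolled; the configured RetryTemplate supplies the attempt count and backoff policy for each blocking model call.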
Thanks for your contribution!
@@ -198,7 +213,8 @@ public Flux<ChatResponse> stream(Prompt prompt) {

 	observation.parentObservation(contextView.getOrDefault(ObservationThreadLocalAccessor.KEY, null)).start();

-	Flux<OllamaApi.ChatResponse> ollamaResponse = this.chatApi.streamingChat(request);
+	Flux<OllamaApi.ChatResponse> ollamaResponse = this.retryTemplate
When in streaming mode, we don't use the RetryTemplate, as it won't work with reactive streams. There is a pending design discussion on how to configure retries for streaming; you can follow this issue for updates: #1193
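For context on why a blocking RetryTemplate doesn't fit here: in Reactor, retries are expressed as operators on the stream itself. A hedged sketch of that style, assuming the project's Reactor dependency and Spring AI's `TransientAiException`; this is only an illustration, not the design that #1193 will settle on:

```java
// Illustrative only: retry the reactive stream with Reactor's own
// operators instead of wrapping it in a blocking RetryTemplate.
Flux<OllamaApi.ChatResponse> ollamaResponse = this.chatApi.streamingChat(request)
	.retryWhen(Retry.backoff(3, Duration.ofSeconds(2))
		.filter(ex -> ex instanceof TransientAiException));
```

The key difference from the blocking path is that the retry policy lives inside the reactive pipeline, so a re-subscription replays the request rather than blocking a thread between attempts.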
Thank you @ThomasVitale for pointing this out.
Adding some context to this issue: a RetryTemplate is used to add retry capabilities to the blocking model calls. At first, the same strategy was used for the streaming calls. After realising that it wasn't working as expected in a reactive stream, the usage of RetryTemplate in streaming calls was removed. In this issue, we would need to come up with a design for a retry feature to adopt in all streaming implementations across the different model providers.
It's still in use in the code, though: https://github.com/search?q=repo%3Aspring-projects%2Fspring-ai+%22Flux%3CChatCompletionChunk%3E%22+this.retryTemplate+execute&type=code
Should I open a PR removing it?
@@ -124,6 +125,7 @@ public OllamaChatModel ollamaChat(OllamaApi ollamaApi) {
 	return OllamaChatModel.builder()
 		.withOllamaApi(ollamaApi)
 		.withDefaultOptions(OllamaOptions.create().withModel(MODEL).withTemperature(0.9))
+		.withRetryTemplate(RetryUtils.DEFAULT_RETRY_TEMPLATE)
Is this needed? If I read correctly, RetryUtils.DEFAULT_RETRY_TEMPLATE is provided by default if no explicit RetryTemplate object is provided, so there shouldn't be a need for adding this line.
Same question for all the other places where this line was added.
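If the builder does fall back to the default template, the two configurations below would be equivalent. A hedged sketch, assuming that fallback behaviour; this is the builder usage quoted in the diff above, not a new API:

```java
// Explicit, as added in this PR:
OllamaChatModel.builder()
	.withOllamaApi(ollamaApi)
	.withDefaultOptions(OllamaOptions.create().withModel(MODEL).withTemperature(0.9))
	.withRetryTemplate(RetryUtils.DEFAULT_RETRY_TEMPLATE)
	.build();

// Equivalent if the builder defaults to RetryUtils.DEFAULT_RETRY_TEMPLATE:
OllamaChatModel.builder()
	.withOllamaApi(ollamaApi)
	.withDefaultOptions(OllamaOptions.create().withModel(MODEL).withTemperature(0.9))
	.build();
```

Keeping the explicit line is harmless, but redundant if the default applies; the question is whether the tests need to document the retry policy explicitly.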
Unlike the other Ollama tests, the model here is configured inside the test itself: https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-ollama/src/test/java/org/springframework/ai/ollama/OllamaChatModelIT.java#L244
Force-pushed 3a28241 to 729b807
Force-pushed 729b807 to 55d38fd