
feat(ollama): add retry template integration to OllamaChatModel #1852

Status: Open. Wants to merge 1 commit into base: main.

Conversation

@apappascs (Contributor):

  • Integrated RetryTemplate for API calls in OllamaChatModel.
  • Ensured consistent retry logic across all model classes.
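To illustrate the retry behavior this PR adds around the blocking model calls, here is a minimal, hand-rolled sketch of what a retry template does. The names are hypothetical stand-ins; the actual PR uses Spring Retry's `RetryTemplate`, not this code.

```java
import java.util.function.Supplier;

// Illustrative only: a hand-rolled stand-in for wrapping a blocking
// chat call in retry logic. Class and method names are hypothetical.
public class RetrySketch {

    static <T> T executeWithRetry(Supplier<T> call, int maxAttempts) {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.get(); // success ends the loop immediately
            }
            catch (RuntimeException ex) {
                last = ex; // treat as transient and try again
            }
        }
        throw last; // all attempts exhausted
    }

    public static void main(String[] args) {
        int[] failures = { 2 }; // simulate an API that fails twice, then succeeds
        String result = executeWithRetry(() -> {
            if (failures[0]-- > 0) {
                throw new IllegalStateException("transient API error");
            }
            return "chat response";
        }, 3);
        System.out.println(result);
    }
}
```

With three attempts allowed and two simulated failures, the call eventually succeeds and prints `chat response`.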

@apappascs apappascs marked this pull request as ready for review December 2, 2024 15:15
@ThomasVitale (Contributor):

Thanks for your contribution!

@@ -198,7 +213,8 @@ public Flux<ChatResponse> stream(Prompt prompt) {

observation.parentObservation(contextView.getOrDefault(ObservationThreadLocalAccessor.KEY, null)).start();

-Flux<OllamaApi.ChatResponse> ollamaResponse = this.chatApi.streamingChat(request);
+Flux<OllamaApi.ChatResponse> ollamaResponse = this.retryTemplate
@ThomasVitale (Contributor):
In streaming mode, we don't use the RetryTemplate because it won't work with reactive streams. There is a pending design discussion on how to configure retries for streaming. You can follow this issue for updates: #1193
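The pitfall described above can be sketched without Reactor, using a lazy `java.util.stream.Stream` in place of a `Flux`. The names below are hypothetical and this is not Spring AI code: a blocking retry wrapper only retries *building* the stream, so a failure that occurs while the stream is consumed happens after the retry block has already returned and is never seen by the retry logic.

```java
import java.util.function.Supplier;
import java.util.stream.Stream;

// Sketch: blocking retry around a lazy stream retries only the assembly
// step, not the consumption where streaming errors actually surface.
public class StreamingRetryPitfall {

    static <T> T executeWithRetry(Supplier<T> call, int maxAttempts) {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.get();
            }
            catch (RuntimeException ex) {
                last = ex;
            }
        }
        throw last;
    }

    public static void main(String[] args) {
        // Building the lazy stream never throws, so the "retry" succeeds
        // on the first attempt...
        Stream<String> chunks = executeWithRetry(
                () -> Stream.of("a", "b").<String>map(s -> {
                    throw new IllegalStateException("mid-stream failure");
                }),
                3);
        try {
            chunks.forEach(System.out::println); // ...the failure happens here
        }
        catch (IllegalStateException ex) {
            System.out.println("uncaught by retry: " + ex.getMessage());
        }
    }
}
```

This is why retries for reactive streams need a different mechanism that participates in the subscription itself, which is the open design question tracked in #1193.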

@apappascs (Contributor, Author):

Thank you @ThomasVitale for pointing this out.

@apappascs (Contributor, Author):

Adding some context to this issue: a RetryTemplate is used to add retry capabilities to the blocking model calls. At first, the same strategy was used for the streaming calls. After realising that it wasn't working as expected in a reactive stream, the use of RetryTemplate for streaming calls was removed. In this issue, we need to come up with a design for a retry feature that can be adopted in all streaming implementations across the different model providers.

It's still in use in the code, though: https://github.com/search?q=repo%3Aspring-projects%2Fspring-ai+%22Flux%3CChatCompletionChunk%3E%22+this.retryTemplate+execute&type=code

Should I open a PR removing it?

@@ -124,6 +125,7 @@ public OllamaChatModel ollamaChat(OllamaApi ollamaApi) {
return OllamaChatModel.builder()
.withOllamaApi(ollamaApi)
.withDefaultOptions(OllamaOptions.create().withModel(MODEL).withTemperature(0.9))
+.withRetryTemplate(RetryUtils.DEFAULT_RETRY_TEMPLATE)
@ThomasVitale (Contributor):

Is this needed? If I read correctly, RetryUtils.DEFAULT_RETRY_TEMPLATE is provided by default if no explicit RetryTemplate object is provided, so there shouldn't be a need to add this line.

Same question for all the other places where this line was added.
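The builder-default pattern the comment refers to can be sketched as follows. The names are hypothetical and this is not the actual `OllamaChatModel` builder: when no retry template is supplied, the builder falls back to a shared default, which is why setting the default explicitly is redundant.

```java
// Sketch of a builder that falls back to a default when a dependency
// is not explicitly provided (hypothetical names, strings standing in
// for the real RetryTemplate type).
public class BuilderDefaultSketch {

    static final String DEFAULT_RETRY_TEMPLATE = "DEFAULT_RETRY_TEMPLATE";

    static class Builder {
        private String retryTemplate; // null means "use the default"

        Builder withRetryTemplate(String retryTemplate) {
            this.retryTemplate = retryTemplate;
            return this;
        }

        String build() {
            // Fall back to the shared default when nothing was set.
            return retryTemplate != null ? retryTemplate : DEFAULT_RETRY_TEMPLATE;
        }
    }

    public static void main(String[] args) {
        System.out.println(new Builder().build());
        System.out.println(new Builder().withRetryTemplate("custom").build());
    }
}
```

Calling `build()` without `withRetryTemplate(...)` yields the default, so an explicit `.withRetryTemplate(RetryUtils.DEFAULT_RETRY_TEMPLATE)` call changes nothing.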

@apappascs apappascs force-pushed the feature/add-retry-to-ollama-model branch from 3a28241 to 729b807 Compare December 5, 2024 10:19
@apappascs apappascs force-pushed the feature/add-retry-to-ollama-model branch from 729b807 to 55d38fd Compare December 10, 2024 11:55