Implement conversation context and streaming with OllamaSharp #310
Conversation
@kborowinski Thanks for making improvements to the ollama agent!
```diff
  if (Process.GetProcessesByName("ollama").Length is 0)
  {
-     host.RenderFullResponse("Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
+     host.MarkupWarningLine($"[[{Name}]]: Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
```
Suggested change:

```diff
- host.MarkupWarningLine($"[[{Name}]]: Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
+ host.WriteErrorLine($"[{Name}]: Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
```
Also, given that the prompt will be `@ollama` when the agent is in use, maybe we don't need to include the `[{Name}]` part in the error message.
BTW, I think we should check if `ollama` is running only when the `Endpoint` is set to localhost. This will enable support for remote `ollama` endpoints.
```csharp
if (IsLocalHost().IsMatch(_client.Uri.Host) && Process.GetProcessesByName("ollama").Length is 0)
{
    host.WriteErrorLine("Please be sure the Ollama is installed and server is running. Check all the prerequisites in the README of this agent are met.");
    return false;
}
```
and
```csharp
/// <summary>
/// Defines a generated regular expression to match localhost addresses
/// "localhost", "127.0.0.1" and "[::1]" with case-insensitivity.
/// </summary>
[GeneratedRegex("^(localhost|127\\.0\\.0\\.1|\\[::1\\])$", RegexOptions.IgnoreCase)]
internal partial Regex IsLocalHost();
```
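For illustration, here is a minimal standalone sketch of how this generated regex behaves; the class and program structure here are hypothetical and not part of the agent:

```csharp
using System;
using System.Text.RegularExpressions;

// Hypothetical demo type; [GeneratedRegex] requires a partial method in a partial type.
internal static partial class LocalHostDemo
{
    [GeneratedRegex("^(localhost|127\\.0\\.0\\.1|\\[::1\\])$", RegexOptions.IgnoreCase)]
    internal static partial Regex IsLocalHost();

    private static void Main()
    {
        // Only exact localhost forms match (case-insensitively); remote hosts do not.
        foreach (var host in new[] { "localhost", "LOCALHOST", "127.0.0.1", "[::1]", "example.com" })
        {
            Console.WriteLine($"{host} -> {IsLocalHost().IsMatch(host)}");
        }
    }
}
```

With this anchored pattern, a remote host such as `example.com` fails the match, so the local process check is skipped and remote endpoints keep working.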
shell/nuget.config
```xml
<configuration>
  <packageSources>
    <clear />
    <add key="PowerShell_PublicPackages" value="https://pkgs.dev.azure.com/powershell/PowerShell/_packaging/powershell/nuget/v3/index.json" />
    <add key="NuGet" value="https://api.nuget.org/v3/index.json" />
```
I have saved the `OllamaSharp` package to the MS feed (we're required to use it only), so you can revert the changes in this file.
`OllamaSharp` was just updated to 4.0.8, and you do not have this version on the MS feed yet, so I will keep it for the moment. I guess in the future you will pull it automatically to the MS feed.
@daxian-dbw Thank you very much for your time and the review. I did implement the changes you required. If you have any additional suggestions, please let me know.
@daxian-dbw Apologies for the repeated pings. Please take a look whenever you have a chance.
Sorry for the delay. I’m out of town this week and hopefully will be back next Tuesday. I will get to the PR right away after I'm back.
This is AWESOME @kborowinski ! Thanks for the contributions! Will definitely make the Ollama agent better 😄 I will work to modify the agent creation instructions to either comply with these changes or keep the old code since it may serve better as example, thanks again!
All looks good! I made a couple of minor changes to the style, nothing big. Thanks again @kborowinski!
PR Summary
This PR refactors the `AIShell.Ollama.Agent` to use the `OllamaSharp` library, enhancing functionality and maintainability. Key updates include support for conversation context, streaming, improved settings management, and better error handling.

**Important:** If you choose this PR, then close "Implement settings file for Ollama agent".
PR Context

This update improves the `AIShell.Ollama.Agent` by:

**Switching to OllamaSharp:** the agent now talks to the Ollama server through the `OllamaApiClient`.

**Introducing Context Support:** the conversation context is preserved across messages.

**Enhancing Configuration Management:** settings live in `ollama.config.json`, with real-time updates picked up using a `FileSystemWatcher`.
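The real-time reload described above can be sketched roughly as follows; the file name comes from the PR, but the directory and the handler body are assumptions, not the agent's actual code:

```csharp
using System;
using System.IO;

// Assumed location of the settings file; the agent's real path may differ.
var configDir = AppContext.BaseDirectory;

using var watcher = new FileSystemWatcher(configDir, "ollama.config.json")
{
    NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.FileName,
    EnableRaisingEvents = true
};

// Re-read the settings whenever the file changes (debouncing omitted for brevity).
watcher.Changed += (_, e) =>
    Console.WriteLine($"'{e.FullPath}' changed; reloading settings...");

Console.ReadLine(); // keep this standalone sketch alive so events can fire
```

A real implementation would typically debounce the `Changed` event, since editors often raise it more than once per save.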
**Improving Chat Interaction:** validates the settings (`Model` and `Endpoint`) to ensure proper functionality.

This PR also aligns with prerequisites outlined in issue #155 and improves the user experience with clearer feedback and enhanced features.
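For reference, a minimal sketch of the streaming, context-aware chat pattern that `OllamaSharp` enables; the endpoint and model name are placeholders, and this is not the agent's actual code:

```csharp
using System;
using OllamaSharp;

// Placeholder endpoint and model; substitute your own.
var ollama = new OllamaApiClient(new Uri("http://localhost:11434"))
{
    SelectedModel = "llama3"
};

// Chat keeps the conversation context between messages,
// and SendAsync streams the response back token by token.
var chat = new Chat(ollama);
await foreach (var token in chat.SendAsync("Why is the sky blue?"))
{
    Console.Write(token);
}
```

Because the `Chat` instance accumulates the message history, a follow-up `SendAsync` call sees the earlier exchange, which is the conversation-context behavior this PR adds.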