
Ollama provider onboarding #55

Merged: 6 commits into main on Nov 12, 2024

Conversation

SunnyYasser (Member):
Users can now employ any locally deployed ollama model. Plus some other infra changes for easier provider onboarding.

const bool json_response) {
    auto provider = GetProviderType(model_details.provider_name);
    switch (provider) {
    case FLOCKMTL_OPENAI:
Member:
I don't like this true/false return; why not raise an exception?

SunnyYasser (Member Author):

The exception is raised earlier, so the default case will never execute. I had to add it to prevent the compiler from generating a warning.

Member:

Then let's remove the true/false.
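One way to satisfy both comments is a sketch like the following (the enum values besides FLOCKMTL_OPENAI and the function name are assumptions for illustration, not FlockMTL's actual API): throwing in the default case documents the "unreachable" invariant and silences the compiler warning without returning a sentinel true/false.

```cpp
#include <stdexcept>
#include <string>

// Hypothetical provider enum; FLOCKMTL_OPENAI appears in the quoted diff,
// the others are placeholders.
enum SupportedProviders { FLOCKMTL_OPENAI, FLOCKMTL_AZURE, FLOCKMTL_OLLAMA };

std::string GetProviderEndpoint(SupportedProviders provider) {
    switch (provider) {
    case FLOCKMTL_OPENAI:
        return "https://api.openai.com/v1/chat/completions";
    case FLOCKMTL_OLLAMA:
        return "http://localhost:11434/api/chat";
    default:
        // Reached only if provider validation earlier in the pipeline was
        // bypassed; throwing avoids a dummy return value.
        throw std::runtime_error("unsupported provider");
    }
}
```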


inline SupportedModels GetModelType(std::string model, std::string provider) {
    std::transform(provider.begin(), provider.end(), provider.begin(),
                   [](unsigned char c) { return std::tolower(c); });
    if (provider == "ollama")
Member:

I don't like the use of strings here and everywhere else. Let's make these constants somewhere so we change them in one place.

SunnyYasser (Member Author):

Fixed
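A minimal sketch of the kind of fix the reviewer asked for (the namespace and constant names here are assumptions): the provider-name strings live in one header, so a rename touches a single place.

```cpp
#include <string>

// Hypothetical constants header centralizing provider-name strings.
namespace flockmtl_providers {
inline constexpr const char *OLLAMA = "ollama";
inline constexpr const char *OPENAI = "openai";
} // namespace flockmtl_providers

// Call sites compare against the constant instead of a string literal:
bool IsOllamaProvider(const std::string &provider) {
    return provider == flockmtl_providers::OLLAMA;
}
```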

auto res = available_model.find(user_model_name) != std::string::npos;

if (res)
return true;
Member:

This should return res, with res initially set to false. It shouldn't return the literals true and false.

SunnyYasser (Member Author):

Fixed
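The fixed pattern might look like this sketch (the function name is hypothetical; available_model and user_model_name come from the quoted diff): compute the result once and return it, instead of branching into literal true/false returns.

```cpp
#include <string>

// Hypothetical wrapper around the substring check in the quoted diff.
bool ModelIsAvailable(const std::string &available_model,
                      const std::string &user_model_name) {
    // res starts as the comparison result; a single return, no literals.
    bool res = available_model.find(user_model_name) != std::string::npos;
    return res;
}
```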

OllamaModelManager &operator=(OllamaModelManager &&) = delete;

nlohmann::json CallComplete(const nlohmann::json &json, const std::string &contentType = "application/json") {
    std::string url = "http://localhost:11434/api/chat";
Member:

Should the URL be provided as part of the input? It is unclear what the port will be. It is also not necessarily a local server; it can be a remote one with a different URL.

SunnyYasser (Member Author):

Fixed
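The agreed direction could be sketched as follows (this constructor signature is an assumption, not the merged code): the manager takes the chat endpoint URL as input instead of hard-coding http://localhost:11434/api/chat, so a non-default port or a remote Ollama server works too.

```cpp
#include <string>
#include <utility>

// Hypothetical manager that stores a caller-supplied endpoint URL.
class OllamaModelManager {
public:
    explicit OllamaModelManager(std::string url) : url_(std::move(url)) {}

    // Requests would be sent to this URL rather than a hard-coded one.
    const std::string &Url() const { return url_; }

private:
    std::string url_;
};
```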

queryproc merged commit 2164511 into main on Nov 12, 2024
queryproc deleted the feat/ollama_onboard branch on November 12, 2024 at 19:08
queryproc mentioned this pull request on Nov 19, 2024
dorbanianas pushed a commit that referenced this pull request on Jan 12, 2025