
feat: add support for OpenAI #202

Merged
merged 58 commits into from
Oct 28, 2024

Conversation

GregoMac1
Collaborator

No description provided.

@GregoMac1 GregoMac1 self-assigned this Sep 30, 2024
@GregoMac1 GregoMac1 linked an issue Sep 30, 2024 that may be closed by this pull request
Owner

@fmaclen fmaclen left a comment


I haven't had time to do a lot of deep analysis, but directionally I think it's a great start 👍

Left you some comments related to how I think we should choose one strategy over the other.

Another minor thing is that we can probably drop the strategy naming convention, at least on the file names.
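For context, the strategy split being discussed might look roughly like the sketch below. All names here (ChatStrategy, getStrategy, the endpoint constants) are illustrative assumptions, not the PR's actual API:

```typescript
// Hypothetical sketch of one shared interface for both servers; the real
// code in src/lib/chat/ may differ.
type ServerType = 'ollama' | 'openai';

interface ChatStrategy {
  // Endpoint used to list the server's available models.
  modelsPath: string;
}

const ollama: ChatStrategy = { modelsPath: '/api/tags' };
const openai: ChatStrategy = { modelsPath: '/v1/models' };

// A single lookup replaces per-call branching on server type, and no
// "*Strategy" suffix is needed in the file names.
function getStrategy(server: ServerType): ChatStrategy {
  return server === 'ollama' ? ollama : openai;
}
```

With a shape like this, callers only ever depend on the interface, so adding a third provider later is a new object rather than new branches.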

src/lib/chat/chatStrategy.ts
src/lib/chat/index.ts
src/lib/localStorage.ts
@GregoMac1
Collaborator Author

Thanks for the feedback! I'm working on it 🫡

@GregoMac1
Collaborator Author

@fmaclen Just to be sure: should we get rid of all the 'connected' or 'disconnected' logic? For example, in the settings page, should the status badge be removed?

@fmaclen
Owner

fmaclen commented Oct 5, 2024

should we get rid of all the 'connected' or 'disconnected' logic?

@GregoMac1 let's keep that functionality only in the Settings view; it's still useful to know if we are correctly connected to an Ollama server.

@GregoMac1
Collaborator Author

GregoMac1 commented Oct 7, 2024

Remaining tasks:

  • Match types between both APIs (that should fix the linting errors).
  • Check if the OpenAI strategy works correctly with a real API key.
  • Add missing translations for es, ja, and tr.
  • Fix existing tests.
  • Add new tests.
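The first task above (matching types between both APIs) could be sketched as a pair of adapters that normalize each provider's model-list response into one shared shape. The type and function names below are assumptions for illustration, not the PR's actual types:

```typescript
// Hypothetical shared model type; the PR's real types may differ.
interface Model {
  name: string;
  api: 'ollama' | 'openai';
}

// Ollama lists models at GET /api/tags as { models: [{ name: ... }] }.
function fromOllama(res: { models: { name: string }[] }): Model[] {
  return res.models.map((m) => ({ name: m.name, api: 'ollama' as const }));
}

// OpenAI lists models at GET /v1/models as { data: [{ id: ... }] }.
function fromOpenAI(res: { data: { id: string }[] }): Model[] {
  return res.data.map((m) => ({ name: m.id, api: 'openai' as const }));
}
```

Once both responses funnel into the same Model[], the rest of the UI (and the linter) only has to deal with one type.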

@GregoMac1 GregoMac1 requested a review from fmaclen October 7, 2024 22:02
Owner

@fmaclen fmaclen left a comment


I wasn't able to try the functionality: when visiting a Session I kept getting a "Can't connect to Ollama server" error, even though my Ollama server was running.

Left you several comments, but primarily I think we should limit the number of calls we make to the API.
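One common way to limit repeated API calls is to cache the model list and only refetch after a TTL expires. This is a minimal sketch of that idea, not necessarily the approach the PR took; all names here are illustrative:

```typescript
// Hypothetical sketch: wrap a fetcher so repeated calls within ttlMs reuse
// the cached result instead of hitting the API again.
type Fetcher = () => Promise<string[]>;

function cachedModels(fetcher: Fetcher, ttlMs: number, now = () => Date.now()) {
  let cached: string[] | null = null;
  let fetchedAt = 0;
  return async (): Promise<string[]> => {
    // Serve from cache while the entry is still fresh.
    if (cached && now() - fetchedAt < ttlMs) return cached;
    cached = await fetcher();
    fetchedAt = now();
    return cached;
  };
}
```

Injecting `now` keeps the cache testable without real timers; a UI could also expose a manual "refresh" action that simply resets `fetchedAt`.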

src/i18n/es/index.ts
src/i18n/es/index.ts
src/lib/chat/index.ts
src/routes/settings/OpenAI.svelte
src/routes/sessions/[id]/+page.svelte
src/routes/sessions/[id]/+page.svelte
src/routes/sessions/[id]/Prompt.svelte
src/routes/settings/OpenAI.svelte
src/routes/settings/OpenAI.svelte
@GregoMac1 GregoMac1 requested a review from fmaclen October 9, 2024 16:25

Deploying hollama with Cloudflare Pages

Latest commit: 330658e
Status: ✅  Deploy successful!
Preview URL: https://18201545.hollama.pages.dev
Branch Preview URL: https://94-support-openai-api-format.hollama.pages.dev


@fmaclen fmaclen marked this pull request as ready for review October 25, 2024 14:26
Owner

@fmaclen fmaclen left a comment


I pushed some changes for updating the MOTD and README.
I also merged the latest set of changes from main and now there are 3 test failures.

Everything else looks good AFAIK; let's get those tests fixed and we'll get this out 🚀

@fmaclen
Owner

fmaclen commented Oct 26, 2024

@GregoMac1 I get these 2 tests failing:

openai.test.ts:56:2 › OpenAI Integration › models list is sorted correctly
session.test.ts:720:2 › Session › handles errors when fetching models

I'm not entirely sure why this one is failing; it looks like a race condition (it wasn't, here's why it fails):
trace.zip

1) openai.test.ts:56:2 › OpenAI Integration › models list is sorted correctly ────────────────────

    Test timeout of 5000ms exceeded.

    Error: expect(locator).toHaveText(expected)

    Locator: locator('div[role="option"]').first()
    Expected string: "gpt-3.5-turbo"
    Received string: " codegemma:latest 9B  "
    Call log:
      - expect.toHaveText with timeout 5000ms
      - waiting for locator('div[role="option"]').first()
      -   locator resolved to <div role="option" id="yfbQhZM0We" aria-selected="fal…>…</div>
      -   unexpected value " codegemma:latest 9B  "
      -   locator resolved to <div role="option" id="yfbQhZM0We" aria-selected="fal…>…</div>
      -   unexpected value " codegemma:latest 9B  "

      61 |
      62 | 		const modelOptions = page.locator('div[role="option"]');
    > 63 | 		await expect(modelOptions.nth(0)).toHaveText('gpt-3.5-turbo');
         | 		                                  ^
      64 | 		await expect(modelOptions.nth(1)).toHaveText('gpt-4');
      65 | 	});
      66 |

        at /Users/winston/projects/hollama/tests/openai.test.ts:63:37

This next test actually broke with the changes we made to Sessions:
trace.zip

I suspect that when you were running the tests locally you didn't have an Ollama server running at the same time; if you do, you'll see the error I'm getting in the flash alert.

I think the issue is that we were mocking **/api/tags before, and now we need to mock **/api/chat.

  2) session.test.ts:720:2 › Session › handles errors when fetching models ─────────────────────────

    Test timeout of 5000ms exceeded.

    Error: expect(locator).toBeVisible()

    Locator: locator('ol[data-sonner-toaster] li').filter({ hasText: 'Can\'t connect to Ollama server' })
    Expected: visible
    Received: hidden
    Call log:
      - expect.toBeVisible with timeout 5000ms
      - waiting for locator('ol[data-sonner-toaster] li').filter({ hasText: 'Can\'t connect to Ollama server' })


      741 | 		await expect(
      742 | 			page.locator('ol[data-sonner-toaster] li', { hasText: "Can't connect to Ollama server" })
    > 743 | 		).toBeVisible();
          | 		  ^
      744 | 	});
      745 |
      746 | 	test('can navigate out of session during completion', async ({ page }) => {

        at /Users/winston/projects/hollama/tests/session.test.ts:743:5
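The mock swap suggested above might look roughly like this. The Route type is a simplified stand-in so the sketch is self-contained; in the real suite the handlers would be registered with Playwright's page.route('**/api/chat', handler) instead of mocking '**/api/tags':

```typescript
// Simplified stand-in for Playwright's Route, for illustration only.
type Route = {
  abort(): void;
  fulfill(opts: { status: number; body: string }): void;
};

// For the error test: abort /api/chat so the "Can't connect to Ollama server"
// toast appears even when a local Ollama server happens to be running.
const mockChatFailure = (route: Route): void => route.abort();

// For happy-path tests: Ollama streams /api/chat as newline-delimited JSON.
const mockChatSuccess = (route: Route): void =>
  route.fulfill({
    status: 200,
    body:
      JSON.stringify({ message: { role: 'assistant', content: 'Hello!' }, done: false }) +
      '\n' +
      JSON.stringify({ message: { role: 'assistant', content: '' }, done: true }) +
      '\n'
  });
```

Mocking the endpoint the test actually exercises keeps the result independent of whatever servers are running on the developer's machine.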

tests/openai.test.ts
tests/openai.test.ts
Owner

@fmaclen fmaclen left a comment


This was certainly a tough one but I think we are good to go 👍
Let's get it out!

@GregoMac1 GregoMac1 merged commit 6b52e23 into main Oct 28, 2024
2 checks passed
@GregoMac1 GregoMac1 deleted the 94-support-openai-api-format branch October 28, 2024 15:14
@fmaclen
Owner

fmaclen commented Oct 28, 2024

🎉 This PR is included in version 0.20.0 🎉

The release is available on GitHub.

Your semantic-release bot 📦🚀

Successfully merging this pull request may close these issues:

Support OpenAI API format