Use new LLM async support to get models other than just Anthropic #13
Comments
In working on this I found and fixed a separate issue.
The plugin currently uses Anthropic's assistant pre-fill technique (datasette-query-assistant/datasette_query_assistant/__init__.py, lines 71 to 80 in a777a80).
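The pre-fill trick seeds the conversation with the opening of the assistant's reply, so the model continues straight into SQL. Here's a minimal sketch of the idea using the Anthropic Python SDK; the prompt, model ID and parameters are illustrative, not the plugin's actual values:

```python
import anthropic

client = anthropic.Anthropic()

# Seed the assistant turn with the start of a SQL code block, so the
# completion begins mid-block and runs until the closing fence.
message = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Write a SQL query counting rows in the repos table.",
        },
        # Pre-filled assistant turn: the model continues from here.
        {"role": "assistant", "content": "```sql"},
    ],
)
# Everything before the closing fence is the generated SQL.
sql = message.content[0].text.split("```")[0].strip()
```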
LLM doesn't yet have a neat mechanism for building a fake conversation like that, and other models don't support pre-fill. Instead I'm going to tell the model to wrap its answer in a ```sql fenced code block and then extract the SQL with a regex.
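A minimal sketch of that extraction, assuming the model cooperates and emits a single fenced block; the exact pattern the plugin ends up using may differ:

```python
import re

# Match the contents of the first ```sql ... ``` fenced block.
SQL_BLOCK_RE = re.compile(r"```sql\s+(.*?)\s*```", re.DOTALL | re.IGNORECASE)

def extract_sql(response_text: str) -> str | None:
    match = SQL_BLOCK_RE.search(response_text)
    return match.group(1).strip() if match else None

# extract_sql("Sure:\n```sql\nselect count(*) from repos\n```")
# -> "select count(*) from repos"
```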
The new default model is dramatically cheaper. I just saw an example of this in the debug logs, and honestly, even without calculating the 50% discount on those cached tokens, that's wildly inexpensive.
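To make that arithmetic concrete, here's a back-of-envelope sketch; every number in it is hypothetical, a stand-in for the real figures from the debug logs and the provider's pricing page:

```python
# All numbers below are hypothetical, for illustration only.
input_tokens = 2_000
cached_tokens = 1_500            # portion of the input served from the prompt cache
output_tokens = 200
input_price = 0.15 / 1_000_000   # $ per input token (hypothetical)
output_price = 0.60 / 1_000_000  # $ per output token (hypothetical)

# Cached input tokens are billed at a 50% discount.
cost = (
    (input_tokens - cached_tokens) * input_price
    + cached_tokens * input_price * 0.5
    + output_tokens * output_price
)
print(f"${cost:.6f} per query")  # a small fraction of a cent
```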
This plugin currently hard-codes the use of Claude 3 Haiku and the Anthropic client library:
datasette-query-assistant/datasette_query_assistant/__init__.py, line 64 in a777a80
datasette-query-assistant/datasette_query_assistant/__init__.py, lines 133 to 135 in a777a80
Now that LLM has async support, I want to switch to that in order to get multiple models supported at once.
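A rough sketch of what that switch could look like using LLM's async API; the generate_sql() helper, the prompt wording and the example model ID are hypothetical, not the plugin's actual code:

```python
import asyncio
import llm

async def generate_sql(model_id: str, question: str) -> str:
    # Any async-capable model registered with LLM works here,
    # not just Anthropic's.
    model = llm.get_async_model(model_id)
    response = model.prompt(
        "Answer with a SQL query wrapped in a ```sql fenced code block.\n\n"
        + question
    )
    return await response.text()

# Example (model ID is a placeholder):
# print(asyncio.run(generate_sql("gpt-4o-mini", "How many rows are in repos?")))
```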