feat!: change type of keep_alive to text (#251)
The `keep_alive` parameter to the `ai.ollama_embed`,
`ai.ollama_generate`, and `ai.ollama_chat_complete` functions now takes
a `text` value instead of a `float8` value. This makes for a more
flexible API, as `keep_alive` can now be specified using human-readable
duration values like `'10m'`, in addition to numeric (text) values like
`'120'`.
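For example, both of the following calls are now valid (a minimal sketch; the model name `nomic-embed-text` and the input text are illustrative):

```sql
-- Duration-style value: keep the model loaded for ten minutes.
select ai.ollama_embed('nomic-embed-text', 'hello world', keep_alive => '10m');

-- Numeric value passed as text: keep the model loaded for 120 seconds.
select ai.ollama_embed('nomic-embed-text', 'hello world', keep_alive => '120');
```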

This is a breaking change. The impact of this change is twofold:

Users who passed `keep_alive => <some floating point value>` as an
argument to one of `ai.ollama_embed`, `ai.ollama_generate`, or
`ai.ollama_chat_complete` must cast that value to `text`, i.e.
`keep_alive => <some floating point value>::text`.
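Concretely, a call that previously passed a floating point value can be migrated with an explicit cast (a sketch; the model name and input are illustrative):

```sql
-- Before the upgrade this matched the keep_alive float8 parameter:
--   select ai.ollama_embed('nomic-embed-text', 'hello', keep_alive => 120.0);
-- After the upgrade, cast the value to text:
select ai.ollama_embed('nomic-embed-text', 'hello', keep_alive => 120.0::text);
```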

Users who have called one of `ai.ollama_embed`, `ai.ollama_generate`,
or `ai.ollama_chat_complete` from a SQL function written in the
`sql_body` form (see [CREATE FUNCTION]), or from a view, must drop
those functions and views, upgrade the pgai extension, and then
re-create them.
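A minimal sketch of that sequence, assuming a hypothetical view `document_embeddings` over a `documents` table, with the extension installed under the name `ai`:

```sql
-- 1. Drop the dependent view (its stored definition pins the old float8 signature).
drop view if exists document_embeddings;

-- 2. Upgrade the extension.
alter extension ai update;

-- 3. Re-create the view against the new text-typed keep_alive parameter.
create view document_embeddings as
select id,
       ai.ollama_embed('nomic-embed-text', body, keep_alive => '5m') as embedding
from documents;
```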

[CREATE FUNCTION]: https://www.postgresql.org/docs/current/sql-createfunction.html
JamesGuthrie authored Dec 6, 2024
1 parent 1ab5c16 commit 0c74741
Showing 5 changed files with 24 additions and 21 deletions.
6 changes: 3 additions & 3 deletions projects/extension/sql/idempotent/002-ollama.sql
@@ -99,7 +99,7 @@ create or replace function ai.ollama_embed
( model text
, input_text text
, host text default null
-, keep_alive float8 default null
+, keep_alive text default null
, embedding_options jsonb default null
) returns @extschema:vector@.vector
as $python$
@@ -125,7 +125,7 @@ create or replace function ai.ollama_generate
, prompt text
, host text default null
, images bytea[] default null
-, keep_alive float8 default null
+, keep_alive text default null
, embedding_options jsonb default null
, system_prompt text default null
, template text default null
@@ -175,7 +175,7 @@ create or replace function ai.ollama_chat_complete
( model text
, messages jsonb
, host text default null
-, keep_alive float8 default null
+, keep_alive text default null
, chat_options jsonb default null
) returns jsonb
as $python$
@@ -0,0 +1,3 @@
+drop function if exists ai.ollama_embed(text, text, text, float8, jsonb);
+drop function if exists ai.ollama_generate(text, text, text, bytea[], float8, jsonb, text, text, int[]);
+drop function if exists ai.ollama_chat_complete(text, jsonb, text, float8, jsonb);
6 changes: 3 additions & 3 deletions projects/extension/tests/contents/output16.expected
@@ -34,9 +34,9 @@ CREATE EXTENSION
function ai.indexing_diskann(integer,text,integer,integer,double precision,integer,integer,boolean)
function ai.indexing_hnsw(integer,text,integer,integer,boolean)
function ai.indexing_none()
-function ai.ollama_chat_complete(text,jsonb,text,double precision,jsonb)
-function ai.ollama_embed(text,text,text,double precision,jsonb)
-function ai.ollama_generate(text,text,text,bytea[],double precision,jsonb,text,text,integer[])
+function ai.ollama_chat_complete(text,jsonb,text,text,jsonb)
+function ai.ollama_embed(text,text,text,text,jsonb)
+function ai.ollama_generate(text,text,text,bytea[],text,jsonb,text,text,integer[])
function ai.ollama_list_models(text)
function ai.ollama_ps(text)
function ai.openai_chat_complete_simple(text,text,text)
6 changes: 3 additions & 3 deletions projects/extension/tests/contents/output17.expected
@@ -34,9 +34,9 @@ CREATE EXTENSION
function ai.indexing_diskann(integer,text,integer,integer,double precision,integer,integer,boolean)
function ai.indexing_hnsw(integer,text,integer,integer,boolean)
function ai.indexing_none()
-function ai.ollama_chat_complete(text,jsonb,text,double precision,jsonb)
-function ai.ollama_embed(text,text,text,double precision,jsonb)
-function ai.ollama_generate(text,text,text,bytea[],double precision,jsonb,text,text,integer[])
+function ai.ollama_chat_complete(text,jsonb,text,text,jsonb)
+function ai.ollama_embed(text,text,text,text,jsonb)
+function ai.ollama_generate(text,text,text,bytea[],text,jsonb,text,text,integer[])
function ai.ollama_list_models(text)
function ai.ollama_ps(text)
function ai.openai_chat_complete_simple(text,text,text)
24 changes: 12 additions & 12 deletions projects/extension/tests/privileges/function.expected
@@ -220,18 +220,18 @@
f | bob | execute | no | ai | indexing_none()
f | fred | execute | no | ai | indexing_none()
f | jill | execute | YES | ai | indexing_none()
-f | alice | execute | YES | ai | ollama_chat_complete(model text, messages jsonb, host text, keep_alive double precision, chat_options jsonb)
-f | bob | execute | no | ai | ollama_chat_complete(model text, messages jsonb, host text, keep_alive double precision, chat_options jsonb)
-f | fred | execute | no | ai | ollama_chat_complete(model text, messages jsonb, host text, keep_alive double precision, chat_options jsonb)
-f | jill | execute | YES | ai | ollama_chat_complete(model text, messages jsonb, host text, keep_alive double precision, chat_options jsonb)
-f | alice | execute | YES | ai | ollama_embed(model text, input_text text, host text, keep_alive double precision, embedding_options jsonb)
-f | bob | execute | no | ai | ollama_embed(model text, input_text text, host text, keep_alive double precision, embedding_options jsonb)
-f | fred | execute | no | ai | ollama_embed(model text, input_text text, host text, keep_alive double precision, embedding_options jsonb)
-f | jill | execute | YES | ai | ollama_embed(model text, input_text text, host text, keep_alive double precision, embedding_options jsonb)
-f | alice | execute | YES | ai | ollama_generate(model text, prompt text, host text, images bytea[], keep_alive double precision, embedding_options jsonb, system_prompt text, template text, context integer[])
-f | bob | execute | no | ai | ollama_generate(model text, prompt text, host text, images bytea[], keep_alive double precision, embedding_options jsonb, system_prompt text, template text, context integer[])
-f | fred | execute | no | ai | ollama_generate(model text, prompt text, host text, images bytea[], keep_alive double precision, embedding_options jsonb, system_prompt text, template text, context integer[])
-f | jill | execute | YES | ai | ollama_generate(model text, prompt text, host text, images bytea[], keep_alive double precision, embedding_options jsonb, system_prompt text, template text, context integer[])
+f | alice | execute | YES | ai | ollama_chat_complete(model text, messages jsonb, host text, keep_alive text, chat_options jsonb)
+f | bob | execute | no | ai | ollama_chat_complete(model text, messages jsonb, host text, keep_alive text, chat_options jsonb)
+f | fred | execute | no | ai | ollama_chat_complete(model text, messages jsonb, host text, keep_alive text, chat_options jsonb)
+f | jill | execute | YES | ai | ollama_chat_complete(model text, messages jsonb, host text, keep_alive text, chat_options jsonb)
+f | alice | execute | YES | ai | ollama_embed(model text, input_text text, host text, keep_alive text, embedding_options jsonb)
+f | bob | execute | no | ai | ollama_embed(model text, input_text text, host text, keep_alive text, embedding_options jsonb)
+f | fred | execute | no | ai | ollama_embed(model text, input_text text, host text, keep_alive text, embedding_options jsonb)
+f | jill | execute | YES | ai | ollama_embed(model text, input_text text, host text, keep_alive text, embedding_options jsonb)
+f | alice | execute | YES | ai | ollama_generate(model text, prompt text, host text, images bytea[], keep_alive text, embedding_options jsonb, system_prompt text, template text, context integer[])
+f | bob | execute | no | ai | ollama_generate(model text, prompt text, host text, images bytea[], keep_alive text, embedding_options jsonb, system_prompt text, template text, context integer[])
+f | fred | execute | no | ai | ollama_generate(model text, prompt text, host text, images bytea[], keep_alive text, embedding_options jsonb, system_prompt text, template text, context integer[])
+f | jill | execute | YES | ai | ollama_generate(model text, prompt text, host text, images bytea[], keep_alive text, embedding_options jsonb, system_prompt text, template text, context integer[])
f | alice | execute | YES | ai | ollama_list_models(host text)
f | bob | execute | no | ai | ollama_list_models(host text)
f | fred | execute | no | ai | ollama_list_models(host text)
