
[python] Created get_parameters API #668

Merged
merged 1 commit into main from pr668 on Jan 2, 2024
Conversation

rossdanlm
Contributor

@rossdanlm rossdanlm commented Dec 29, 2023

[python] Created get_parameters API

Disclaimer: I know this is 400+ lines changed, but ~300 of those are just adding/refactoring automated tests!!!

We already had get_prompt_parameters, but this isn't good enough if parameters aren't defined locally and we want to bubble up to the aiconfig defaults. See comment in #661 (comment).

This is important so that we can ensure we have valid params if not defined in the correct section of our AIConfig schema. This is similar in theme to #600

The only callsite we have today for this is in Python, in params.py, so I updated that callsite too. I don't know why it doesn't exist in TypeScript (cc @rholinshead), but I also added this functionality in TypeScript in #669 (I'm not as familiar with automated testing there, so will ask Ryan for a code pointer to help get me started)

Test plan

Added a bunch of automated tests, which you can run by going to the aiconfig/python dir and running pytest


Stack created with Sapling. Best reviewed with ReviewStack.


assert prompt is None or isinstance(prompt, Prompt)
if prompt is None or not prompt.metadata or not prompt.metadata.parameters:
return self.get_global_parameters()
Contributor

It's confusing to return the global parameters when the requested prompt doesn't have any parameters, but otherwise returning only the prompt parameters if they exist.

In the parameter resolution we actually use a union of global parameters & local parameters (overriding global if they match). Shouldn't this do the same?

Contributor Author

It's confusing to return the global parameters when the requested prompt doesn't have any parameters, but otherwise returning only the prompt parameters if they exist.

If the prompt parameters do not exist, it makes sense to check if any are defined in the global params. It's the same concept as #600

In the parameter resolution we actually use a union of global parameters & local parameters (overriding global if they match)

Tbh I feel like "check the value closest in scope (prompt), then check the next closest value in scope (aiconfig)" is more intuitive because it's something a real human would do, but yeah, it doesn't matter; I can also implement it this way

Contributor

Rename to resolve_parameters or something?

Contributor

Maybe get_available_parameters or similar? That avoids the potential for confusion about which parameters this refers to; resolve_parameters to me means to resolve the parameters that are being used within the prompt input.

And I think this should do the same logic as in resolve_prompt_string (less reference/input params) which:

  • gets config-level params first
  • gets current prompt params, overwriting matching config-level param names with prompt-level value

Otherwise it's pretty confusing to have different behaviours and removes most of the benefits you've listed about reusability / testability.

Contributor Author

And I think this should do the same logic as in resolve_prompt_string:

  • gets config-level params first
  • gets current prompt params, overwriting matching config-level param names with prompt-level value

Yea sure, I can do that as well as change the name to get_available_parameters! I will fix forward to avoid merge conflicts: #695
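For illustration, the agreed-upon union semantics could be sketched like this (a minimal standalone sketch, not the actual AIConfig implementation; the function and argument names here are assumptions):

```python
# Sketch of the union semantics agreed on above for get_available_parameters:
# start from the config-level (global) params, then overlay the prompt-level
# params so that matching names take the prompt-level value.
def get_available_parameters(global_params, prompt_params):
    merged = dict(global_params or {})  # config-level defaults first
    merged.update(prompt_params or {})  # prompt-level values win on conflict
    return merged

# Prompt-level "city" overrides the config-level one; "n" bubbles up.
print(get_available_parameters({"city": "Toronto", "n": 3}, {"city": "NYC"}))
# → {'city': 'NYC', 'n': 3}
```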

@rholinshead
Contributor

We already had get_prompt_parameters, but this isn't good enough if parameters aren't defined locally and we want to bubble up to the aiconfig defaults. See comment in #661 (comment).

More generally, is this change actually needed? The bubbling up already happens when running the prompt, right? So the editor implementation doesn't need to explicitly get the local + global params because it will be run with the params set

@rossdanlm
Contributor Author

rossdanlm commented Dec 29, 2023

More generally, is this change actually needed? The bubbling up already happens when running the prompt, right? So the editor implementation doesn't need to explicitly get the local + global params because it will be run with the params set

You don't need to pass in params anymore from the client, this abstracts it away. You still can pass it in explicitly into the run() command as a method arg if you want, but this simplifies it a lot

The bubbling up already happens when running the prompt, right?

This is also done INSIDE of the run() implementation, which means the logic is tightly coupled to it. This way we have separated the logic out into its own functionality, making it:

  1. Easier to re-use --> Can be used anywhere as a separate unit of functionality
  2. Easier to test --> Don't need to test the entire run-flow, can be unit tested much more easily (like what I already did this diff)
  3. Easier to extend --> Ex: if we ever want to extend this functionality to be get_XYZ, not just get_parameters (similar to the issue fixed by #600 for getting the default model), this is much easier to do now since we only need to change a method, not touch anything that could affect or break the entire run implementation

or default_return_value
# pylint: enable=W0102

def _get_prompt_parameters_exact(
Contributor

Just FYI, {} is falsey in expressions like self._get_global_parameters_exact() or default_return_value, just like None is.
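A quick standalone illustration of that falsiness (the names here are hypothetical, not the actual AIConfig code):

```python
default_return_value = {"city": "Toronto"}

# Both None and {} are falsy, so `or` falls through to the default in
# either case -- an empty parameters dict behaves like a missing one.
assert ({} or default_return_value) == {"city": "Toronto"}
assert (None or default_return_value) == {"city": "Toronto"}
assert ({"city": "NYC"} or default_return_value) == {"city": "NYC"}
```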



@jonathanlastmileai
Contributor

ac to unblock

@rossdanlm rossdanlm merged commit 6bb4f7c into main Jan 2, 2024
3 checks passed
@rossdanlm rossdanlm deleted the pr668 branch January 2, 2024 16:23
rossdanlm added a commit that referenced this pull request Jan 2, 2024
Create update_model endpoint




TSIA, backend implementation of `update_model()`. The other changes were
from running the auto-formatter:
```
fd --glob '*.py'  python/src/aiconfig/editor/server | xargs python -m 'scripts.lint' --mode=fix --files
```

## Test Plan
Follow the dev README to set up the local editor:
https://github.com/lastmile-ai/aiconfig/tree/main/python/src/aiconfig/editor#dev,
then run this command
```
curl http://localhost:8080/api/update_model -d '{"prompt_name":"get_activities", "model_name": "gpt-4", "settings": {"top_p": 0.9}}' -X POST -H 'Content-Type: application/json'
```
Notice that the model name is updated to `gpt-4` and `top_p` is now `0.9`


https://github.com/lastmile-ai/aiconfig/assets/151060367/3031fa59-8925-495c-a5eb-e56ec65b7fba

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/688).
* #693
* #692
* #691
* #690
* __->__ #688
* #670
* #668
rossdanlm added a commit that referenced this pull request Jan 2, 2024
Create delete_parameter endpoint







TSIA

## Test Plan
Follow the dev README to set up the local editor:
https://github.com/lastmile-ai/aiconfig/tree/main/python/src/aiconfig/editor#dev,
then run this command
```
curl http://localhost:8080/api/delete_parameter -d '{"prompt_name":"get_activities", "parameter_name": "city"}' -X POST -H 'Content-Type: application/json'
```
Notice that the parameters are now deleted

Prompt-level


https://github.com/lastmile-ai/aiconfig/assets/151060367/866db51d-6b84-438d-ac4e-483d0cfbaa40

AIConfig-level


https://github.com/lastmile-ai/aiconfig/assets/151060367/dcd47f62-875c-40ac-9e81-efcd8cf9b45a

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/691).
* #693
* #692
* __->__ #691
* #690
* #688
* #670
* #668
rossdanlm added a commit that referenced this pull request Jan 2, 2024
Create set_name endpoint


TSIA

## Test Plan
Follow the dev README to set up the local editor:
https://github.com/lastmile-ai/aiconfig/tree/main/python/src/aiconfig/editor#dev,
then run this command
```
curl http://localhost:8080/api/set_name -d '{"name":"cool_new_name"}' -X POST -H 'Content-Type: application/json'
```


https://github.com/lastmile-ai/aiconfig/assets/151060367/efcb7ceb-e390-4208-93b3-3effd5d66f8b

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/692).
* #693
* __->__ #692
* #691
* #690
* #688
* #670
* #668
rossdanlm added a commit that referenced this pull request Jan 2, 2024
Create set_description endpoint


TSIA

## Test Plan
Follow the dev README to set up the local editor:
https://github.com/lastmile-ai/aiconfig/tree/main/python/src/aiconfig/editor#dev,
then run this command
```
curl http://localhost:8080/api/set_description -d '{"description":"yo this is a cool new description get rekt dont @ me bro"}' -X POST -H 'Content-Type: application/json'
```


https://github.com/lastmile-ai/aiconfig/assets/151060367/0db278b3-8273-4178-8010-380538dcd0ca

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/693).
* __->__ #693
* #692
* #691
* #690
* #688
* #670
* #668