Chat variables API #212995
With the latest 1.90.0 and the chatVariableResolver API, from my usage there appear to be a few oddities.
In general, I would like to have the uri present whenever possible. For participant-provided variables it would be helpful to have the following:
In my particular use case I have many variables (as many as 200-300) per workspace that I want to make available, but the compute effort to provide them is high, so using the chat completions is preferable to an interactive selector. I am hopeful that I can build a user experience where the user effectively "points" to some element (per above) and issues commands.
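For context, here is a minimal sketch of how that resolver side might look under the chatVariableResolver proposal. The registration signature has changed across proposal iterations, so the parameter list shown here, the `dslSymbols` index, and the variable names are assumptions; the .d.ts linked in this issue is the source of truth.

```ts
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
    // Hypothetical index of DSL symbols; stands in for the 200-300 per-workspace variables.
    const dslSymbols: Map<string, vscode.Uri> = buildSymbolIndex();

    for (const [name, uri] of dslSymbols) {
        context.subscriptions.push(
            // Signature shown as in earlier iterations of the proposal: (name, description, resolver).
            vscode.chat.registerChatVariableResolver(name, `DSL symbol ${name}`, {
                resolve(_name, _context, _token) {
                    // Return a Uri rather than a pre-rendered string, so the consumer
                    // can decide how (and whether) to render the content.
                    return [{
                        level: vscode.ChatVariableLevel.Full,
                        value: uri,
                        description: `DSL symbol ${name}`
                    }];
                }
            })
        );
    }
}

// Placeholder for the expensive symbol computation described above.
function buildSymbolIndex(): Map<string, vscode.Uri> {
    return new Map();
}
```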
If
Yeah, it does have this fallback to
I think I will change selection to actually report a Location instead of just a string
Same here
Not really - the reference in the prompt includes the range where it appears in the prompt, and you can use that to replace it or do whatever you want. VS Code is responsible for parsing those parts of the prompt.
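To illustrate, a participant handler can splice resolved values back into the prompt text using the ranges VS Code reports on each `ChatPromptReference`. A rough sketch; the rendering choices below are just placeholders:

```ts
import * as vscode from 'vscode';

// Sketch of a chat request handler that substitutes variable references
// back into the prompt text using the range VS Code provides for each one.
const handler: vscode.ChatRequestHandler = async (request, _context, stream, _token) => {
    let prompt = request.prompt;

    // Process references from the end of the prompt so earlier ranges stay valid while splicing.
    const refs = [...request.references]
        .filter(ref => ref.range)
        .sort((a, b) => b.range![0] - a.range![0]);

    for (const ref of refs) {
        const [start, end] = ref.range!;
        const rendered =
            ref.value instanceof vscode.Uri ? ref.value.fsPath :
            ref.value instanceof vscode.Location ? `${ref.value.uri.fsPath}:${ref.value.range.start.line + 1}` :
            String(ref.value);
        prompt = prompt.slice(0, start) + rendered + prompt.slice(end);
    }

    stream.markdown(`Resolved prompt: ${prompt}`);
};
```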
We will need to add this to the chat sample extension as well. Interesting use case, thanks. Is this for data from an existing extension? Would you want your variable to be available to any chat participant, or only your own chat participant?
It is for an existing in-house extension that manages a custom DSL. There are a couple of different focused participants (development aids, audit and quality) that we want to explore. The participants rely heavily on the original extension for model content. The variables are really only useful if you are aware of the additional context, so there is little value in cluttering the other participants with a large symbol set. We have played with having the participant prime the conversation with the @workspace participant, since we can make better relevance selections.

Exposure to other participants would be useful, but only if we could delay value rendering until the point of consumption and know which participant we were rendering for, so that either generic or specific content could be provided. Right now it is a bit frustrating that we have to supply a populated value prior to actual use. At one point it looked like the API was going to defer resolution to a fully fleshed-out result, but that currently isn't the case. Perhaps I can fake it with a getter, but it doesn't feel quite right.

I will keep an eye on the selection issue and if I see a clear pattern I will create an issue. Yeah, I missed the range pointing at the prompt content itself - my bad. Thanks for the work so far, very promising stuff.
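On the "fake it with a getter" idea: since the variable value is a plain object, a property getter can defer the expensive render until something actually reads `value`. A small sketch under that assumption - nothing in the proposed API promises deferred resolution, so this is only a workaround, and the helper name is made up:

```ts
import * as vscode from 'vscode';

// Lazily resolved variable value: `render` only runs the first time `value` is read.
function lazyVariableValue(render: () => string): vscode.ChatVariableValue {
    let cached: string | undefined;
    return {
        level: vscode.ChatVariableLevel.Full,
        get value() {
            return cached ??= render();
        }
    };
}
```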
@rvanider thanks for sharing more details. Can you share the company name? If you do not want to disclose it publicly, you can also reach out to inikolic@microsoft.com
The chat sample had a
There is no way to know what data from a variable is actually sent to the LLM. While the UI will probably expose this better in the future, would it be possible to simply add logs in the meantime, as suggested in #1251?
The chat participant can reply with a
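Until the UI surfaces this, a participant can also log what it is about to send on its own. A rough sketch using a `LogOutputChannel`; the channel name is illustrative, and it assumes the request carries a model, as in recent API versions:

```ts
import * as vscode from 'vscode';

const log = vscode.window.createOutputChannel('My Chat Participant', { log: true });

const handler: vscode.ChatRequestHandler = async (request, _context, stream, token) => {
    // Log exactly what will be forwarded to the model.
    for (const ref of request.references) {
        log.info(`reference ${ref.id}: ${JSON.stringify(ref.value)}`);
    }
    log.info(`prompt: ${request.prompt}`);

    const messages = [vscode.LanguageModelChatMessage.User(request.prompt)];
    const response = await request.model.sendRequest(messages, {}, token);
    for await (const chunk of response.text) {
        stream.markdown(chunk);
    }
};
```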
For now, the recommended direction for extensions that want this sort of thing is tools.
@roblourens It's great that we can now add chat variables using tools! However, it's hard to find out how to do this (at least that was my experience). AFAIK it's undocumented:
I found out the hard way how to get chat variables working, by browsing VS Code's sources: when adding a tool in
Only if the first key/value is added is the chat variable usable in the chat (highlighted in blue, tab completion works). Please correct me if I missed something somewhere in the docs. Otherwise, it would be great if the documentation could be updated accordingly.
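For reference, here is roughly what the tool route looks like end to end. The contribution shown in the comment below is an abridged assumption based on the `contributes.languageModelTools` schema (presumably `canBeReferencedInPrompt` and `toolReferenceName` are the two keys being discussed, the former making the tool addressable as a #-variable); the tool id `myExt_projectInfo` and its payload are made up:

```ts
import * as vscode from 'vscode';

/*
 * Assumed package.json contribution (abridged):
 *
 * "contributes": {
 *   "languageModelTools": [{
 *     "name": "myExt_projectInfo",
 *     "displayName": "Project Info",
 *     "modelDescription": "Returns information about the current project",
 *     "canBeReferencedInPrompt": true,
 *     "toolReferenceName": "projectInfo"
 *   }]
 * }
 */
export function activate(context: vscode.ExtensionContext) {
    context.subscriptions.push(
        vscode.lm.registerTool('myExt_projectInfo', {
            async invoke(_options, _token) {
                // Placeholder payload; a real tool would compute this from the workspace.
                return new vscode.LanguageModelToolResult([
                    new vscode.LanguageModelTextPart('Example project info payload')
                ]);
            }
        })
    );
}
```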
@swvanbuuren thank you for your feedback.
@isidorn @swvanbuuren Yes, please create an issue in the vscode-docs repo. Appreciate the feedback!
Proposal dts: https://github.com/microsoft/vscode/blob/roblou/chat-references/src/vscode-dts/vscode.proposed.chatVariableResolver.d.ts
Sample: https://github.com/microsoft/vscode-extension-samples/tree/main/chat-sample
Docs: https://code.visualstudio.com/api/extension-guides/chat#variables
Extension authors can subscribe to this issue to get updates about the proposed Variables API. There may still be breaking changes coming, and we will post here whenever a breaking change is made.
We are very interested in feedback about how you might use this API.