A chat endpoint/UI for talking about your project: feed it your project source, then have a ChatGPT-like conversation about the project.
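To make the idea concrete, here is a minimal sketch of what a request to such a project-aware chat endpoint might look like. The endpoint path, field names, and message format are all assumptions for illustration, not Tabby's actual API:

```python
import json

# Hypothetical request body for a project-aware chat endpoint
# (field names are made up for this sketch -- not Tabby's real API).
payload = {
    # ChatGPT-style conversation turns:
    "messages": [
        {
            "role": "user",
            "content": "How should I refactor the database layer across these files?",
        }
    ],
    # Relevant project files supplied as context, like pasting them into ChatGPT:
    "context_files": ["src/db.rs", "src/models.rs"],
}

# The client would POST this JSON to something like /v1/chat on the Tabby server.
print(json.dumps(payload, indent=2))
```

The key difference from a code-completion request is that the response would be free-form text that can discuss multiple files at once, rather than a single inline suggestion.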
Additional context
Since the model already has the context of your source code, offering only code completion feels like a missed opportunity. 99% of the time I want AI help with my project, I'm feeding GPT-4 the contents of a few relevant files and then asking for its advice on a bigger solution. The answer might affect several files in one response, and some of the most useful parts of the response might not even be code.
All of the other active projects I've found that even come close rely on cloud models, and those can't help with much beyond boilerplate because feeding them the context of an already established project is slow, expensive, and sometimes hard-limited.
Tabby solves that with local models. I'd love to see that potential tapped into for big-picture help and not just isolated code completions.
Could this be a possible future feature for Tabby?
Please reply with a 👍 if you want this feature.
Thank you for your feedback. This is actually something that has been on our minds for a long time, and it is also the main reason I switched Tabby's underlying implementation from Triton FasterTransformer to CTranslate2 (which provides a much easier way to implement new ops like the MQA used in StarCoder / SantaCoder).
We are planning to release something with StarCoder around mid-Q3. Please stay tuned for updates!
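For readers unfamiliar with the MQA op mentioned above: multi-query attention shares a single key/value head across all query heads, which shrinks the KV cache and speeds up inference. A minimal NumPy sketch of the forward pass (shapes and naming are my own, not CTranslate2's implementation):

```python
import numpy as np

def multi_query_attention(x, w_q, w_k, w_v, n_heads):
    """Multi-query attention: many query heads, ONE shared key/value head.

    x:   (seq, d_model) input activations
    w_q: (d_model, n_heads * d_head) -- per-head query projection
    w_k, w_v: (d_model, d_head) -- a single K/V projection shared by all
              query heads, which is what distinguishes MQA from standard MHA.
    """
    seq, _ = x.shape
    d_head = w_k.shape[1]
    q = (x @ w_q).reshape(seq, n_heads, d_head)  # per-head queries
    k = x @ w_k                                  # shared keys   (seq, d_head)
    v = x @ w_v                                  # shared values (seq, d_head)

    # Every query head attends over the SAME keys: (n_heads, seq_q, seq_k)
    scores = np.einsum("qhd,kd->hqk", q, k) / np.sqrt(d_head)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys

    out = np.einsum("hqk,kd->qhd", weights, v)   # (seq, n_heads, d_head)
    return out.reshape(seq, n_heads * d_head)
```

Because K and V are computed once instead of per head, the per-token cache is `n_heads` times smaller, which matters for serving models like StarCoder locally.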