Code Completion Quality - Ideas #2674
Agent:
IDE / Extensions:
Completion preview:
Under construction
Since the release of version 0.13, the core functionality of code completion has become largely stable. This marks an opportune moment to consolidate the current implementation and begin exploring further enhancements.
Prompt construction
Tabby generates code completion requests using the following information:
- prefix (the code before the cursor)
- suffix (the code after the cursor)
- LSP definitions
- recently changed content
- repository-level relevant code snippets
It is important to note that the prefix and suffix are guaranteed to be included in the LLM inference request, provided they exist. However, LSP definitions, recently changed content, and repository-level relevant code snippets are subject to the limitations of the context window, which is currently set to 1536 tokens. These elements are filled in the priority order listed above.
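The priority-ordered filling described above can be sketched roughly as follows. This is a minimal illustration, not Tabby's actual implementation: it assumes a naive whitespace tokenizer and a hypothetical `build_context` helper, with the 1536-token budget taken from the text.

```python
TOKEN_BUDGET = 1536  # context window limit mentioned above

def count_tokens(text: str) -> int:
    # Naive stand-in for a real tokenizer.
    return len(text.split())

def build_context(prefix: str, suffix: str, extras: list[str]) -> list[str]:
    """Select optional snippets in priority order under the token budget.

    `extras` is assumed to be pre-sorted by priority: LSP definitions
    first, then recently changed content, then repo-level snippets.
    The prefix and suffix are always counted against the budget, since
    they are guaranteed to be included.
    """
    used = count_tokens(prefix) + count_tokens(suffix)
    selected = []
    for snippet in extras:
        cost = count_tokens(snippet)
        if used + cost > TOKEN_BUDGET:
            continue  # this snippet would overflow the budget; skip it
        selected.append(snippet)
        used += cost
    return selected
```

For example, a very large "recently changed content" block would be skipped while smaller, lower-priority snippets that still fit are kept.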
To ensure low latency, we are currently using a relatively conservative maximum of 64 decoding tokens.
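A back-of-the-envelope calculation shows why a small decoding limit matters for latency. The 10 ms per-token figure is a hypothetical assumption for illustration only; real per-token latency depends on the model and hardware.

```python
PER_TOKEN_MS = 10  # hypothetical decoding cost per generated token

def decode_latency_ms(max_tokens: int) -> int:
    # Decoding is sequential, so latency grows linearly with the
    # number of generated tokens.
    return max_tokens * PER_TOKEN_MS
```

Under this assumption, a 64-token cap keeps worst-case decoding around 640 ms, whereas a 256-token cap would roughly quadruple it.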
Ideas
Server
Agent
IDE / Extensions