PR #9285 introduced new APIs for the document mapping service that don't allocate for their outputs and try not to allocate for intermediate state. We should go through existing callers of the old API and consider moving them over where appropriate. If all a caller does is map LSP types and send the results back over LSP, migrating doesn't matter; but if the mapped results feed internal state, moving to the new API would be worthwhile.
Semantic tokens was migrated in that same PR, as an example.
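For a concrete picture of the two API shapes, here is a minimal sketch. All of the names (`LspRange`, `LineSpan`, `IDocumentMappingService`, `MapToHostDocumentRange`, `TryMapToHostDocumentRange`) are stand-ins rather than the actual Razor API surface; the only assumption being illustrated is that the old overloads return heap-allocated LSP classes while the new ones report struct results:

```csharp
// Stand-in for the LSP protocol's Range: a class, so every instance is a
// heap allocation.
public sealed class LspRange
{
    public int StartLine, StartCharacter, EndLine, EndCharacter;
}

// Stand-in struct result type: lives on the stack, no allocation.
public readonly record struct LineSpan(int StartLine, int StartCharacter, int EndLine, int EndCharacter);

// Hypothetical service surface showing the old and new method shapes.
public interface IDocumentMappingService
{
    // Old shape: the output is a class, so every successful call allocates.
    LspRange? MapToHostDocumentRange(LspRange generatedRange);

    // New shape: the result is a struct delivered via an out parameter, so
    // a successful mapping allocates nothing for its output.
    bool TryMapToHostDocumentRange(LineSpan generatedSpan, out LineSpan hostSpan);
}
```

This is also why "LSP in, LSP out" callers gain little: they have to materialize an LSP `Range` class at the protocol boundary regardless, so the struct overloads only pay off when the mapped spans feed further internal work.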
Follow-up to #9280
This creates a struct-based API for document mapping and moves semantic
tokens to it. I did it in a source-compatible way so we can upgrade
existing features as necessary. I suspect most won't see the benefit
that semantic tokens does, as most features just ferry ranges and
positions around without processing them much. Logged
#9284 to follow up, though.
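The reason semantic tokens benefits where most features would not is that it maps and post-processes many spans per request, so one output allocation per span adds up quickly. A hypothetical caller-side sketch, reusing the stand-in types from the sketch in the issue above (this is not the actual semantic tokens code):

```csharp
using System;

// Hypothetical token shape, for illustration only.
public readonly record struct GeneratedToken(LineSpan Span, int Kind);

public static class SemanticTokensStyleCaller
{
    // One mapping call per token: with the class-based API each success is
    // a heap allocation, so a request with thousands of tokens produces
    // thousands of short-lived LspRange objects. The struct overload keeps
    // the whole loop allocation-free.
    public static int CountMappableTokens(
        IDocumentMappingService mappingService,
        ReadOnlySpan<GeneratedToken> generatedTokens)
    {
        var count = 0;
        foreach (var token in generatedTokens)
        {
            if (mappingService.TryMapToHostDocumentRange(token.Span, out var hostSpan))
            {
                // hostSpan is a stack-resident struct; downstream sorting and
                // encoding can consume it without intermediate Range objects.
                count++;
            }
        }
        return count;
    }
}
```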
Reviewing commit-at-a-time might be easiest.
Results for semantic tokens are pretty good, though:
![image](https://github.com/dotnet/razor/assets/754264/7316e5db-0e90-4b32-a807-d4ee5d40741a)
I looked; the only endpoint that calls the service purely for internal data is the spell-checking endpoint, and that usage is only for verification. Because the code uses `var`, it was updated automatically when the API was changed.
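The `var` observation is plain C# type inference: a local declared with `var` takes whatever type the method currently returns, so a source-compatible change to the return type flows through the call site without edits. A contrived illustration, again reusing the stand-in `LineSpan` from above (not the actual spell-checking code), assuming the old and new return types expose the same members the caller reads:

```csharp
using System.Diagnostics;

public static class SpellCheckStyleCaller
{
    // Hypothetical: suppose this method's return type changed from the
    // class-based LspRange to the LineSpan struct in the API update.
    public static LineSpan MapSpan(LineSpan generated) => generated;

    public static void Verify(LineSpan generated)
    {
        // Because the local is declared with var, its type is inferred from
        // whatever MapSpan currently returns, so this call site recompiled
        // unchanged when the return type changed. An explicit LspRange
        // annotation here would have been a compile error after the change.
        var mapped = MapSpan(generated);
        Debug.Assert(mapped.StartLine <= mapped.EndLine); // verification only
    }
}
```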