Fix syntax fixup producing invalid punctuation #19252
Conversation
This also means we silently ignore unbalanced parentheses in macros. Not sure that's ideal 😅 But it might be better to record them as errors somehow while constructing the token tree.
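For illustration, a minimal sketch (not from the PR; the macro name `m` and the snippet contents are made up) of the kind of input this refers to:

```rust
fn main() {
    // Hypothetical example input: a macro call whose argument tokens contain
    // an unbalanced `(` because the user is still typing. Under this change,
    // the stray delimiter is silently dropped while the token tree is built
    // rather than being recorded as an error.
    let _incomplete_source = r#"
fn f() {
    m!(foo(a);
}
"#;
}
```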
Force-pushed from ba609e8 to 7f09cda
spacing: _,
})) => {
    let found_expected_delimiter =
        builder.expected_delimiters().enumerate().find(|(_, delim)| match delim.kind {
Shouldn't we only check the last expected one here as well (and panic otherwise if it doesn't match), given that syntax fixup should produce valid pairs?
Basically what this does is move responsibility for handling unbalanced delimiters from syntax fixup to convert_tokens. I think that makes sense if we want to disallow convert_tokens from producing invalid token trees (i.e. disallow '(' etc. as Punct instead of delimiters, which is what it does right now if it encounters unbalanced delimiters). But I'm not 100% sure of the implications.
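A self-contained sketch of the trade-off being discussed, using made-up types and functions (`Delim`, `find_matching`, `check_innermost`) rather than rust-analyzer's actual ones: searching every open delimiter for a match versus checking only the innermost one and treating a mismatch as a bug, on the assumption that fixup always emits balanced pairs.

```rust
#[derive(Debug, PartialEq, Clone, Copy)]
enum Delim {
    Paren,
    Brace,
}

// Current approach: scan the whole stack of open delimiters for a match,
// so an unbalanced closer can still pair up with an outer open delimiter.
fn find_matching(open_stack: &[Delim], closing: Delim) -> Option<usize> {
    open_stack.iter().rposition(|&d| d == closing)
}

// Suggested alternative: only the innermost open delimiter may match;
// anything else is treated as a bug in syntax fixup.
fn check_innermost(open_stack: &[Delim], closing: Delim) -> usize {
    let innermost = open_stack
        .len()
        .checked_sub(1)
        .expect("closing delimiter without any open delimiter");
    assert_eq!(open_stack[innermost], closing, "fixup emitted a mismatched pair");
    innermost
}

fn main() {
    let stack = [Delim::Brace, Delim::Paren];
    assert_eq!(find_matching(&stack, Delim::Paren), Some(1));
    assert_eq!(check_innermost(&stack, Delim::Paren), 1);
}
```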
crates/hir-expand/src/fixup.rs (Outdated)
  if tt.delimiter.close.anchor.ast_id == FIXUP_DUMMY_AST_ID
-     || tt.delimiter.open.anchor.ast_id == FIXUP_DUMMY_AST_ID
+     && tt.delimiter.open.anchor.ast_id == FIXUP_DUMMY_AST_ID
That is because the syntax bridge will remove these elements, right? Might be good to add that to the comment, since we should never leak the FIXUP_DUMMY_AST_IDs.
Hmm no, it was because I made the ArgList fixup add the closing parenthesis in the foo(a case. I changed it back to ||, so we instead rely on convert_tokens to keep trees balanced (but we ensure we don't leak the fake AST ID). The parser should already complain about that, I think?
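A simplified, self-contained sketch of the foo(a case being described; the token representation, the helper `fix_unclosed_arg_list`, and the dummy-id value are stand-ins, not hir-expand's real definitions:

```rust
// Fixup appends a synthetic `)` tagged with a dummy anchor, and the check
// discussed above is what keeps that dummy id from leaking out again.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Anchor {
    ast_id: u32,
}

// Placeholder standing in for hir-expand's real FIXUP_DUMMY_AST_ID.
const FIXUP_DUMMY_AST_ID: u32 = u32::MAX;

fn fix_unclosed_arg_list(mut tokens: Vec<(String, Anchor)>) -> Vec<(String, Anchor)> {
    // `foo(a` is missing its `)`: append one tagged with the dummy anchor so
    // later stages can recognise it as synthetic.
    tokens.push((")".to_string(), Anchor { ast_id: FIXUP_DUMMY_AST_ID }));
    tokens
}

fn main() {
    let real = Anchor { ast_id: 1 };
    let fixed = fix_unclosed_arg_list(vec![
        ("foo".to_string(), real),
        ("(".to_string(), real),
        ("a".to_string(), real),
    ]);
    let (last_tok, last_anchor) = fixed.last().unwrap();
    assert_eq!(last_tok.as_str(), ")");
    // Downstream code must detect the dummy anchor and never leak it.
    assert_eq!(last_anchor.ast_id, FIXUP_DUMMY_AST_ID);
}
```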
crates/hir-expand/src/fixup.rs (Outdated)
@@ -571,6 +559,17 @@ mod tests {
         parse.syntax_node()
     );

+    // the fixed-up tree should not contain braces as punct
+    // FIXME: should probably instead check that it's a valid punctuation character
+    for x in tt.iter() {
Perhaps it's worth gating it under cfg(debug_assertions) for perf?
Also note that this needs to iterate over flat_tokens() instead; as written it won't validate grandchildren (and it will also be a bit slower; iterating over flat tokens is okay here since we only look for leaves).
This is just in the test, so it should be fine. You're right about flat_tokens, oops.
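A rough sketch of what the adjusted test check could look like; the `flat_tokens()` call and the `tt` pattern names follow this thread's snippets, but the exact signatures are assumptions:

```rust
// Walk the flattened token stream so leaves inside nested subtrees are
// validated too, and reject delimiter characters that slipped through as
// plain `Punct` tokens.
for token in tt.flat_tokens() {
    if let tt::TokenTree::Leaf(tt::Leaf::Punct(punct)) = token {
        assert!(
            !matches!(punct.char, '(' | ')' | '{' | '}' | '[' | ']'),
            "fixed-up tree contains a delimiter as Punct: {punct:?}"
        );
    }
}
```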
Force-pushed from 7f09cda to b1fac5e
Force-pushed from b1fac5e to da87f56
I'll hold off on merging until tomorrow for the new release.
Fixes #19206.
Fixes #18244.
r? @Veykril
It's a bit hacky, since we keep emitting these invalid Puncts from syntax fixup but let convert_tokens deal with them.