Adjust close delim span #65966
Conversation
r? @eddyb (rust_highfive has picked a reviewer for you, use r? to override)
Also adding @estebank for review.
@@ -0,0 +1,3 @@
// ignore-tidy-trailing-newlines
// error-pattern: aborting due to 2 previous errors
fn main((ؼ
Note that this file needs no trailing newline ("no-newline"), otherwise it won't crash on previous versions.
Should we create some mechanism in the code to verify that this file has no newline at the end of the file?
I've noticed that some editors might add a newline at the end by accident.
Add this as a comment to the test, on a line right before fn main(, so that people reading the file understand why the ignore rule is there.
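A sketch of what the test could look like with such a comment (the wording below is only a suggestion; the file still must not end in a newline):

    // ignore-tidy-trailing-newlines
    // error-pattern: aborting due to 2 previous errors
    // This file must not end with a newline: the crash from #62524 (fixed by
    // this PR) only reproduces when there is no newline at the end of the file.
    fn main((ؼ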
r? @petrochenkov cc @dtolnay
Please run
        span.with_hi(span.lo() + BytePos(delim.len() as u32))
    };
    TokenTree::token(token::OpenDelim(delim), open_span)
}

 /// Returns the closing delimiter as a token tree.
 pub fn close_tt(span: Span, delim: DelimToken) -> TokenTree {
-    let close_span = if span.is_dummy() {
+    let close_span = if span.is_dummy() || span.lo() == span.hi() {
         span
     } else {
         span.with_lo(span.hi() - BytePos(delim.len() as u32))
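Assembled from the hunk above, the adjusted close_tt reads roughly as follows. The diff does not show the closing lines of the function; the final TokenTree::token call mirrors the one visible in open_tt, so treat this as an inferred sketch rather than the exact committed code:

    /// Returns the closing delimiter as a token tree.
    pub fn close_tt(span: Span, delim: DelimToken) -> TokenTree {
        let close_span = if span.is_dummy() || span.lo() == span.hi() {
            // Dummy or empty spans (lo == hi) cannot be shrunk to just the
            // closing delimiter, so keep them as-is. This avoids the ICE from
            // #62524, where an unclosed delimiter ends up with an empty span.
            span
        } else {
            // Keep only the trailing `delim.len()` bytes of the span, i.e. the
            // closing delimiter itself.
            span.with_lo(span.hi() - BytePos(delim.len() as u32))
        };
        TokenTree::token(token::CloseDelim(delim), close_span)
    }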
This may be trickier than it looks at first glance.
If span is arbitrary (e.g. created by a proc macro), then span + 1 / span - 1 can always step into something bad in the source map.
The only way to know for sure is to check whether span_to_source returns an error or not, but that's expensive.
If we want to check this without accessing the source map, then we need to understand when exactly this span is used for indexing into it (only when reporting "un-closed delimiter", or somewhere else?) and whether it can contain spans produced by proc macros at that point, or only lexer-produced spans.
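Purely as an illustration of the "check against the source map" option: something along these lines would confirm that the shrunk span maps to real source text, at the cost of a source-map lookup per delimiter. The helper name shrink_to_close_delim is hypothetical, and span_to_snippet is used here as the public wrapper around the span_to_source check mentioned above:

    // Hypothetical helper, not part of this PR: validate the shrunk span
    // against the source map before trusting it. span_to_snippet returns Err
    // for spans that do not map to real source text (e.g. some
    // proc-macro-produced spans) -- the expensive check mentioned above.
    fn shrink_to_close_delim(sm: &SourceMap, span: Span, delim_len: u32) -> Span {
        let candidate = span.with_lo(span.hi() - BytePos(delim_len));
        match sm.span_to_snippet(candidate) {
            Ok(_) => candidate, // maps to actual source, safe to use
            Err(_) => span,     // arbitrary span, fall back to the original
        }
    }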
The tokenstream would insert an un-closed delimiter with a span whose lo equals hi. I'm not sure about proc macros; I haven't had time to look into that part yet.
@petrochenkov I think it would be reasonable to commit this PR with a FIXME comment containing pretty much the same text as your comment, as well as a ticket for that work. Although this won't catch every single ICE that could be caused by proc_macro span shenanigans, it will catch one that is relatively easy to hit. We could also add a check for whether the span has a macro backtrace and use span directly in those cases.
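A sketch of that additional check, assuming Span exposes a from_expansion (or equivalent macro-backtrace) predicate; the exact method name would need to be verified against the compiler sources:

    // Only shrink spans that do not come from a macro expansion; for
    // expansion-produced spans, use `span` directly as suggested above.
    let close_span = if span.is_dummy() || span.lo() == span.hi() || span.from_expansion() {
        span
    } else {
        span.with_lo(span.hi() - BytePos(delim.len() as u32))
    };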
I'd rather add some asserts instead of the FIXME; it's quite possible that this is only called with spans from the lexer. I'll try to look into this tomorrow.
I'd also inline open_tt and close_tt into the caller, since they are only called once and cannot work with arbitrary arguments.
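If the assumption "only lexer-produced spans reach this point" holds, the assert could be as simple as the following sketch (a debug_assert, so release builds are unaffected):

    // Sketch of the suggested assertion: a real (non-dummy, non-empty) span
    // reaching close_tt must be at least as long as the delimiter, so the
    // `span.hi() - delim.len()` arithmetic cannot escape the span.
    debug_assert!(
        span.is_dummy()
            || span.lo() == span.hi()
            || span.hi() - span.lo() >= BytePos(delim.len() as u32)
    );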
Per estebank's comment, r? @estebank
Superseded by #66054.
Fixes #62524
The only thing that's not quite perfect is that I need to update two existing unit tests... If I remove the trailing newline in those two files, then the "^" position looks reasonable. Any hints on how to adjust that? Or do you think this fix is a good start? (At least it fixes the ICE.)