error: expected identifier when building time-macros-impl v0.1.1 #72545
@rustbot ping cleanup

Hey Cleanup Crew ICE-breakers! This bug has been identified as a good candidate for minimization. cc @AminArria @chrissimpkins @contrun @DutchGhost @elshize @ethanboxx @h-michael @HallerPatrick @hdhoang @hellow554 @imtsuki @jakevossen5 @kanru @KarlK90 @LeSeulArtichaut @MAdrianMattocks @matheus-consoli @mental32 @nmccarty @Noah-Kennedy @pard68 @PeytonT @pierreN @Redblueflame @RobbieClarken @RobertoSnap @robjtede @SarthakSingh31 @senden9 @shekohex @sinato @spastorino @turboladen @woshilapin @yerke
Reverting #72388 fixes this, which implicates that PR. cc @Aaron1011.
Minimized:

```rust
pub(crate) struct Time;

use proc_macro_hack::proc_macro_hack;

macro_rules! impl_macros {
    ($($name:ident : $type:ty),* $(,)?) => {
        $(
            #[proc_macro_hack]
            #[allow(clippy::unimplemented)]
            pub fn $name(input: proc_macro::TokenStream) -> proc_macro::TokenStream {
                panic!()
            }
        )*
    };
}

impl_macros! {
    time: Time,
}
```
Minimized even further:

```rust
// src/lib.rs
use proc_macro::TokenStream;

#[proc_macro_attribute]
pub fn print_input(_attr: TokenStream, tokens: TokenStream) -> TokenStream {
    println!("Input: {:?}", tokens);
    tokens
}
```

```rust
// src/main.rs
use proc_rules::print_input;

macro_rules! with_ident {
    ($name: ident) => {
        #[print_input]
        pub fn $name() {}
    }
}

with_ident!(foo);

fn main() {}
```
This regression is caused by the fact that #72388 causes us to pass the original `TokenStream` to the proc macro. I'm not sure what the proper solution is here. While #72388 is causing breakage, the breakage is due to us passing the correct `TokenStream`.
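To make the difference concrete, here is a minimal sketch (not from the thread) using the `proc_macro2` crate so it can run outside a real macro expansion: after #72388, a matched fragment such as a `$type:ty` capture of `Time` reaches the proc macro wrapped in an invisible `Delimiter::None` group, whereas the old stringify-and-reparse path produced a bare identifier.

```rust
// Requires the `proc_macro2` crate as a dependency.
use proc_macro2::{Delimiter, Group, Ident, Span, TokenStream, TokenTree};

// Render the top-level shape of a token stream, making invisible groups visible.
fn shape(stream: &TokenStream) -> Vec<String> {
    stream
        .clone()
        .into_iter()
        .map(|tree| match tree {
            TokenTree::Group(g) if g.delimiter() == Delimiter::None => {
                format!("None-delimited group({})", g.stream())
            }
            other => other.to_string(),
        })
        .collect()
}

fn main() {
    // What macro authors effectively saw before: a plain identifier.
    let reparsed: TokenStream = "Time".parse().unwrap();

    // What the macro may receive once the captured tokens are preserved:
    // the same identifier, wrapped in an invisible `Delimiter::None` group.
    let ident = TokenTree::Ident(Ident::new("Time", Span::call_site()));
    let captured: TokenStream =
        TokenTree::Group(Group::new(Delimiter::None, ident.into())).into();

    println!("reparsed: {:?}", shape(&reparsed)); // ["Time"]
    println!("captured: {:?}", shape(&captured)); // ["None-delimited group(Time)"]
}
```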
I think rustc's behavior is fine*. I published a fix in proc-macro-hack 0.5.16.

* This is a separate issue, but actually I think […]
In case anyone is running into this here and not on the time-rs/time repo, please run […]
It looks like this same error is affecting […]
#72388 is turning out to have an unexpectedly large amount of fallout. While I believe the change itself is definitely correct (the alternative is to pass the unspanned re-parsed tokens to proc-macros), several crates seem to be running into problems due to differences in the captured vs. reparsed tokens. I see several options here: […]
I suppose seemingly no-op groups with no delimiter can be useful because they provide additional span information. In #72608 (comment) the group has the span of the […]. When APIs like […]. On the other hand, this is breakage of code that previously ran on Stable. I think doing a Crater run in any case would be good, to get a sense of how widespread this breakage is. Based on that we can decide to flatten delimiter-less groups, or do some effort to work around it at least in […].
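As a rough illustration of the "flatten delimiter-less groups" option (my own sketch using `proc_macro2`, not the compiler's code), a macro crate can strip invisible groups from its input before parsing; the trade-off, as noted above, is that the extra span information carried by those groups is thrown away.

```rust
use proc_macro2::{Delimiter, Group, TokenStream, TokenTree};

/// Recursively replaces `Delimiter::None` groups with their contents,
/// discarding the extra span information those groups carry.
fn flatten_none_groups(stream: TokenStream) -> TokenStream {
    stream
        .into_iter()
        .flat_map(|tree| match tree {
            TokenTree::Group(g) if g.delimiter() == Delimiter::None => {
                flatten_none_groups(g.stream()).into_iter().collect::<Vec<_>>()
            }
            TokenTree::Group(g) => {
                // Keep real delimiters, but flatten inside them too.
                let mut group = Group::new(g.delimiter(), flatten_none_groups(g.stream()));
                group.set_span(g.span());
                vec![TokenTree::Group(group)]
            }
            other => vec![other],
        })
        .collect()
}

fn main() {
    let inner: TokenStream = "Time".parse().unwrap();
    let wrapped: TokenStream =
        TokenTree::Group(Group::new(Delimiter::None, inner)).into();
    assert_eq!(flatten_none_groups(wrapped).to_string(), "Time");
}
```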
I think the best course of action is to perform a crater run, and if there is significant breakage beyond what's already been noted, revert the PR until everything has been figured out. I'm not familiar with the technical aspects of what was changed and why, but it does sound like it was the correct technical decision. As such, eventually merging it back in once regressions are fixed seems sane.
The priority is not critical anymore, but still pretty high, because we need to re-land #72388 in some form sooner rather than later; otherwise more macros using hygiene holes caused by stringification (#72622 (comment)) will be written.
@petrochenkov yeah, that was my impression too. Going to label this as […]
Re-landing is currently blocked on tesaguri/oauth1-request-rs#3 - the […]
I'll write a short version instead. The root cause of the regression with […] exists for a very long time and was stabilized together with macros 1.1 (custom derives), but I discovered it only when auditing the implementation for macros 1.2 (end of 2018). The hack exists because we pass items (and other fragments) to attribute and derive macros as nonterminal tokens (with dummy spans specifically):

```rust
#[my_attr]
struct S; // Passed to `my_attr` as `TokenStream(NtItem(struct S;))`
```

This is only necessary to... improve pretty-printing of token streams ¯\_(ツ)_/¯. If we pass the fragments as […]
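A tiny, self-contained way to see the pretty-printing trade-off being described here (using `proc_macro2` outside the compiler; not code from the thread): once a fragment is just a stream of tokens, stringification goes token by token rather than through the AST pretty-printer.

```rust
use proc_macro2::TokenStream;

fn main() {
    let item: TokenStream = "struct S;".parse().unwrap();
    // Rendering a token stream goes token by token, so this typically prints
    // `struct S ;` rather than the nicely formatted `struct S;`.
    println!("{}", item);
}
```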
proc_macro: Stop flattening groups with dummy spans

Reduce the scope of the hack described in rust-lang#72545 (comment). We still pass AST fragments to attribute and derive macros as single nonterminal tokens rather than as token streams, but now use a precise flag instead of the span-based heuristic that could lead to incorrect behavior in unrelated cases. rust-lang#73345 attempts to fully resolve this issue, but there are some compatibility issues to be addressed.
expand: Stop using nonterminals for passing tokens to attribute and derive macros

Make one more step towards fully token-based expansion and fix the issues described in rust-lang#72545 (comment). Now `struct S;` is passed to `foo!(struct S;)` and to `#[foo] struct S;` in the same way - as a token stream `struct S ;`, rather than a single non-terminal token `NtItem` which is then broken into parts later.

The cost is making pretty-printing of token streams less pretty. Some of the pretty-printing regressions will be recovered by keeping jointness with each token, which we will need to do anyway. Unfortunately, this is not exactly the same thing as rust-lang#73102.

One more observable effect is how `$crate` is printed in the attribute input. Inside `NtItem` it was printed as `crate` or `that_crate`; now, as part of a token stream, it's printed as `$crate` (there are good reasons for these differences, see rust-lang#62393 and related PRs). This may break old proc macros (custom derives) written before the main portion of the proc macro API (macros 1.2) was stabilized: those macros did `input.to_string()` and reparsed the result, and now that result can contain `$crate`, which cannot be reparsed.

So, I think we should do this regardless, but we need to run crater first.

r? @Aaron1011
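For context, here is a hedged sketch of the old `input.to_string()`-and-reparse pattern mentioned above. The macro name `MyDerive` is hypothetical, this is not code from any affected crate, and it would have to live in a crate with `proc-macro = true`.

```rust
use proc_macro::TokenStream;

#[proc_macro_derive(MyDerive)]
pub fn my_derive(input: TokenStream) -> TokenStream {
    // Old pattern: round-trip the input through a string instead of walking
    // the token stream it was given.
    let source = input.to_string();

    // A real macro would now parse `source` back into an item (e.g. with `syn`)
    // and generate an impl from it. That re-parse is the step that breaks once
    // the rendered input can contain `$crate`, which is not valid in ordinary
    // Rust source.
    let _reparsed: TokenStream = source
        .parse()
        .expect("stringified input could not be re-lexed");

    // Generating the actual impl is out of scope for this sketch.
    TokenStream::new()
}
```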
Closing since the regression was fixed by reverting #72388, and there are no immediate plans to change the treatment of […]
…d, r=petrochenkov

Re-land PR rust-lang#72388: Recursively expand `TokenKind::Interpolated` in `probably_equal_for_proc_macro`

PR rust-lang#72388 allowed us to preserve the original `TokenStream` in more cases during proc-macro expansion, but had to be reverted due to a large number of regressions (see rust-lang#72545 and rust-lang#72622). These regressions fell into two categories:

1. Missing handling for `Group`s with `Delimiter::None`, which are inserted during `macro_rules!` expansion (but are lost during stringification and re-parsing). A large number of these regressions were due to `syn` and `proc-macro-hack`, but several crates needed changes to their own proc-macro code.
2. Legitimate hygiene issues that were previously being masked by stringification. Some of these were relatively benign (e.g. [a compilation error](paritytech/parity-scale-codec#210) caused by misusing `quote_spanned!`). However, two crates had intentionally written unhygienic `macro_rules!` macros, which were able to access identifiers that were not passed as arguments (see rust-lang#72622 (comment)).

All but one of the Crater regressions have now been fixed upstream (see https://hackmd.io/ItrXWRaSSquVwoJATPx3PQ?both). The remaining crate (which has a PR pending at sammhicks/face-generator#1) is not on `crates.io`, and is a Yew application that seems unlikely to have any reverse dependencies.

As @petrochenkov mentioned in rust-lang#72545 (comment), not re-landing PR rust-lang#72388 allows more crates to write unhygienic `macro_rules!` macros, which will eventually stop compiling. Since there is only one Crater regression remaining, and since additional crates could write unhygienic `macro_rules!` macros in the time it takes that PR to be merged, it seems better to re-land this change now.
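For readers who haven't followed the hygiene discussion, here is a small, self-contained sketch (my own illustration, not code from the affected crates) of the `macro_rules!` hygiene guarantee that the stringify-and-reparse path was accidentally erasing: identifiers introduced inside a macro expansion are distinct from identically spelled identifiers at the call site unless they are passed in explicitly.

```rust
macro_rules! shadow_x {
    // Declares its own `x`. Because of hygiene, this binding is distinct from
    // any `x` spelled at the call site, even though the names look the same.
    () => {
        let x = 0;
        let _ = x;
    };
}

fn main() {
    let x = 42;
    shadow_x!();
    // The caller's `x` is untouched: the macro's `x` was a separate binding.
    assert_eq!(x, 42);

    // The reverse direction is the "hygiene hole" mentioned above: a macro body
    // that mentions an `x` it never received as an argument, e.g.
    //     macro_rules! read_x { () => { x + 1 }; }
    // normally fails with error[E0425] (cannot find value `x`), but it could
    // appear to work when a proc macro in the middle stringified and re-parsed
    // the tokens, discarding the hygiene information.
    println!("x is still {}", x);
}
```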
Steps to reproduce:
Meta
Very recent regression; versions <= nightly-2020-05-24 are not affected.