
Serialize additional data for procedural macros #63269

Merged: 1 commit from the feature/proc-macro-data branch into rust-lang:master, Aug 18, 2019

Conversation

Aaron1011
Member

Split off from #62855

This PR serializes the declaration `Span` and attributes for all
procedural macros. This allows Rustdoc to properly render doc comments
and source links when inlining procedural macros across crates.
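
As a rough illustration (not code from this PR), here is the kind of proc-macro definition whose doc comment and definition span this change records in crate metadata, so rustdoc can render them when documenting a downstream crate; the macro itself is made up for the example:

```rust
// Hypothetical proc-macro crate, for illustration only.
extern crate proc_macro;
use proc_macro::TokenStream;

/// Derives a no-op `Hello` impl.
///
/// Without the extra metadata, a downstream crate's rustdoc could not show
/// this doc comment or link back to this definition when inlining the macro.
#[proc_macro_derive(Hello)]
pub fn derive_hello(input: TokenStream) -> TokenStream {
    let _ = input;
    TokenStream::new()
}
```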

@rust-highfive
Collaborator

r? @zackmdavis

(rust_highfive has picked a reviewer for you, use r? to override)

@rust-highfive rust-highfive added the S-waiting-on-review Status: Awaiting review from the assignee but also interested parties. label Aug 4, 2019
@Aaron1011
Member Author

r? @eddyb

@rust-highfive rust-highfive assigned eddyb and unassigned zackmdavis Aug 4, 2019
src/librustc/hir/lowering.rs (outdated review thread, resolved)
@eddyb
Member

eddyb commented Aug 9, 2019

r? @petrochenkov

@rust-highfive rust-highfive assigned petrochenkov and unassigned eddyb Aug 9, 2019
@petrochenkov
Contributor

Oh no.
@eddyb, I specifically wanted you to review the metadata part (#62855 (comment)).

@petrochenkov petrochenkov assigned eddyb and unassigned petrochenkov Aug 9, 2019

/// Whether or not this crate should be consider a private dependency
/// for purposes of the 'exported_private_dependencies' lint
pub private_dep: bool
}

pub struct FullProcMacro {
Member

Maybe call this ProcMacroDef?

Member Author

There's already a struct called ProcMacroDef in src/libsyntax_ext/proc_macro_harness.rs. I didn't want to have two different structs with the same name.

/// This needs to come before 'def_path_table',
/// as we need use data from it when decoding `def_path_table.
/// This also needs to come at the very end of the 'Lazy/LazySeq' data,
/// as we need all of the other data in order to deserialize it
Member

The last two lines here don't make sense: a Lazy/LazySeq is just an offset into the file (hence "lazy"), so the order doesn't matter at all.

Member Author

I meant to refer to the encoding order, which does matter. However, now that I lazily deserialize the proc macros, it doesn't matter.
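
For context, a minimal sketch of the point being made, using a simplified stand-in rather than rustc's real `Lazy`/`LazySeq` types: a lazy value is essentially a byte offset into the metadata blob, so decoding can happen in any order, while encoding order only matters because a payload must be written before its offset can be recorded.

```rust
use std::marker::PhantomData;

// Simplified stand-in for rustc's Lazy<T>: just a position (and here a length)
// into the metadata blob.
struct Lazy<T> {
    position: usize,
    len: usize,
    _marker: PhantomData<T>,
}

// Encoding: the offset is only known after everything before it has been written.
fn encode_str(blob: &mut Vec<u8>, s: &str) -> Lazy<String> {
    let position = blob.len();
    blob.extend_from_slice(s.as_bytes());
    Lazy { position, len: s.len(), _marker: PhantomData }
}

// Decoding: jump straight to the recorded offset; nothing else needs to be read first.
fn decode_str(blob: &[u8], lazy: &Lazy<String>) -> String {
    String::from_utf8(blob[lazy.position..lazy.position + lazy.len].to_vec()).unwrap()
}

fn main() {
    let mut blob = Vec::new();
    let a = encode_str(&mut blob, "proc_macro_data");
    let b = encode_str(&mut blob, "def_path_table");
    // Decode in the opposite order from encoding; the offsets make this a non-issue.
    assert_eq!(decode_str(&blob, &b), "def_path_table");
    assert_eq!(decode_str(&blob, &a), "proc_macro_data");
}
```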

/// This also needs to come at the very end of the 'Lazy/LazySeq' data,
/// as we need all of the other data in order to deserialize it
pub proc_macro_data: Option<LazySeq<ProcMacroData>>,
pub def_path_table: Lazy<hir::map::definitions::DefPathTable>,
Member

There's no need to move this.

// that's *almost* complete - it's just missing 'proc_macros' and 'def_path_table'.
// We then deserialize 'proc_macros' and 'def_path_table', using (cmeta, self.sess)
// as our deserializer. Then, we replace the dummy 'proc_macros' and 'def_path_table'
// with the data we just deserialized.
Member

The correct solution is to make loading the proc macros lazy.

@@ -581,8 +597,10 @@ impl<'a> CrateLoader<'a> {
/// implemented as dynamic libraries, but we have a possible future where
/// custom derive (and other macro-1.1 style features) are implemented via
/// executables and custom IPC.
fn load_derive_macros(&mut self, root: &CrateRoot<'_>, dylib: Option<PathBuf>, span: Span)
-> Vec<(ast::Name, Lrc<SyntaxExtension>)> {
fn load_derive_macros(&mut self, root: &CrateRoot<'_>, dylib: Option<PathBuf>, span: Span,
Member

Heh, this should be load_proc_macros nowadays.

@@ -203,11 +203,6 @@ provide! { <'tcx> tcx, def_id, other, cdata,
DefId { krate: def_id.krate, index }
})
}
proc_macro_decls_static => {
cdata.root.proc_macro_decls_static.map(|index| {
Member

I don't understand, this was supposed to be used to compute the symbol name, what happened to that?

#[derive(Clone, RustcEncodable, RustcDecodable, Debug)]
pub struct ProcMacroInfo {
    pub span: Span,
    pub attrs: Vec<Attribute>
Member

@petrochenkov This is the part I was r?-ing you for but it's simpler than I initially thought.

proc_macro_infos.borrow_mut().push(ProcMacroInfo {
    span: cd.span,
    attrs: cd.raw_attrs.clone()
});
Member

@petrochenkov I guess this is what I want to make sure you have nothing against.

Contributor

This is something I have a lot against, actually.

Aren't fn items encoded into the metadata already (including spans and attributes)?
As I understand it, a compiled proc macro crate has two views - the "regular" one with the original fn items and everything, and the "proc-macro facade" one masking all the original items and making it look like the crate only exposes macros.

So we just need to link from the proc-macro view (CrateMetadata::proc_macros) to the real items somehow, instead of cloning some data from those items, keeping them in the AST, and encoding them a second time.

Member

Ah yeah, we'd just need to map the DefId's.

Member Author

Would it make sense to unconditionally decode and store the 'real' def_path_table, but continue to use a 'fake' table for most operations on proc macro crates?

Member

I suppose, yeah. At this point I'm not sure why we don't use the original DefIds directly, and just change what def_kind returns?
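
A hypothetical sketch of the alternative being discussed here, with made-up names and simplified stand-in types (not rustc's actual definitions): each entry in the proc-macro "facade" would record which real `fn` item backs it, and spans, attributes, and `def_kind` would be looked up through that link instead of being cloned out of the AST and encoded a second time.

```rust
// Illustrative stand-ins only.
struct DefIndex(u32);

struct ProcMacroEntry {
    name: String,
    /// The underlying `fn` item in the crate's regular metadata. Span,
    /// attributes, deprecation, etc. would be read through this on demand
    /// rather than re-encoded next to the macro itself.
    host_fn: DefIndex,
}
```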

@@ -427,7 +422,7 @@ impl cstore::CStore {
pub fn load_macro_untracked(&self, id: DefId, sess: &Session) -> LoadedMacro {
let data = self.get_crate_data(id.krate);
if let Some(ref proc_macros) = data.proc_macros {
return LoadedMacro::ProcMacro(proc_macros[id.index.to_proc_macro_index()].1.clone());
return LoadedMacro::ProcMacro(proc_macros[id.index.to_proc_macro_index()].ext.clone());
Member

@petrochenkov Does this need to be cached, or could the ext be loaded here?

Contributor

Resolver already has a cache for this (macro_map in fn get_macro_by_def_id), so load_macro_untracked can just load the extension from metadata.
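
A rough sketch of the caching pattern being referred to, with simplified stand-in types (the real code uses rustc's `DefId`, `Lrc`, and `SyntaxExtension`): the resolver-side map is checked first, so the metadata decode only happens on the first request for a given definition.

```rust
use std::collections::HashMap;
use std::rc::Rc;

#[derive(PartialEq, Eq, Hash, Clone, Copy)]
struct DefId(u32);      // stand-in for rustc's DefId
struct SyntaxExtension; // stand-in for rustc's SyntaxExtension

struct Resolver {
    macro_map: HashMap<DefId, Rc<SyntaxExtension>>,
}

impl Resolver {
    fn get_macro_by_def_id(&mut self, def_id: DefId) -> Rc<SyntaxExtension> {
        self.macro_map
            .entry(def_id)
            .or_insert_with(|| Rc::new(load_macro_from_metadata(def_id)))
            .clone()
    }
}

// Placeholder for the comparatively expensive metadata decode that the cache avoids.
fn load_macro_from_metadata(_def_id: DefId) -> SyntaxExtension {
    SyntaxExtension
}
```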

return Lrc::new([]);
return Lrc::from(
self.proc_macros.as_ref().unwrap()[node_id.to_proc_macro_index()].attrs
.clone()
Member

You could have attributes be lazy, like a regular Entry.

macros.len(), decls.len());
}

let extensions = macros.zip(decls.iter()).map(|(proc_macro, &decl) | {
Member

I think CrateMetadata should hold onto decls (as a &'static [ProcMacro]) rather than the FullProcMacros.

@bors
Contributor

bors commented Aug 12, 2019

☔ The latest upstream changes (presumably #63469) made this pull request unmergeable. Please resolve the merge conflicts.

@rust-highfive
Collaborator

The job x86_64-gnu-llvm-6.0 of your PR failed (raw log). Through arcane magic we have determined that the following fragments from the build log may contain information about the problem.

2019-08-13T23:49:48.9863276Z ##[command]git remote add origin https://github.com/rust-lang/rust
2019-08-13T23:49:49.0042814Z ##[command]git config gc.auto 0
2019-08-13T23:49:49.0090370Z ##[command]git config --get-all http.https://github.com/rust-lang/rust.extraheader
2019-08-13T23:49:49.0137032Z ##[command]git config --get-all http.proxy
2019-08-13T23:49:49.0275049Z ##[command]git -c http.extraheader="AUTHORIZATION: basic ***" fetch --force --tags --prune --progress --no-recurse-submodules --depth=2 origin +refs/heads/*:refs/remotes/origin/* +refs/pull/63269/merge:refs/remotes/pull/63269/merge
---
2019-08-13T23:50:23.8100287Z do so (now or later) by using -b with the checkout command again. Example:
2019-08-13T23:50:23.8100320Z 
2019-08-13T23:50:23.8100551Z   git checkout -b <new-branch-name>
2019-08-13T23:50:23.8100605Z 
2019-08-13T23:50:23.8100661Z HEAD is now at 48edf7fda Merge 6ff269308d88711ac3efbd4b0e73700605159ca9 into 60960a260f7b5c695fd0717311d72ce62dd4eb43
2019-08-13T23:50:23.8277050Z ##[section]Starting: Collect CPU-usage statistics in the background
2019-08-13T23:50:23.8280226Z ==============================================================================
2019-08-13T23:50:23.8280290Z Task         : Bash
2019-08-13T23:50:23.8280355Z Description  : Run a Bash script on macOS, Linux, or Windows
---
2019-08-13T23:56:02.1266583Z    Compiling serde_json v1.0.40
2019-08-13T23:56:06.2489768Z    Compiling tidy v0.1.0 (/checkout/src/tools/tidy)
2019-08-13T23:56:14.4673315Z     Finished release [optimized] target(s) in 1m 25s
2019-08-13T23:56:14.4747165Z tidy check
2019-08-13T23:56:14.5868535Z tidy error: /checkout/src/librustc_metadata/cstore_impl.rs:442: line longer than 100 chars
2019-08-13T23:56:16.3325619Z some tidy checks failed
2019-08-13T23:56:16.3326563Z 
2019-08-13T23:56:16.3326563Z 
2019-08-13T23:56:16.3327763Z command did not execute successfully: "/checkout/obj/build/x86_64-unknown-linux-gnu/stage0-tools-bin/tidy" "/checkout/src" "/checkout/obj/build/x86_64-unknown-linux-gnu/stage0/bin/cargo" "--no-vendor"
2019-08-13T23:56:16.3328461Z 
2019-08-13T23:56:16.3328673Z 
2019-08-13T23:56:16.3332873Z failed to run: /checkout/obj/build/bootstrap/debug/bootstrap test src/tools/tidy
2019-08-13T23:56:16.3333223Z Build completed unsuccessfully in 0:01:28
2019-08-13T23:56:16.3333223Z Build completed unsuccessfully in 0:01:28
2019-08-13T23:56:17.7762566Z ##[error]Bash exited with code '1'.
2019-08-13T23:56:17.7794105Z ##[section]Starting: Checkout
2019-08-13T23:56:17.7795846Z ==============================================================================
2019-08-13T23:56:17.7795920Z Task         : Get sources
2019-08-13T23:56:17.7795962Z Description  : Get sources from a repository. Supports Git, TfsVC, and SVN repositories.

I'm a bot! I can only do what humans tell me to, so if this was not helpful or you have suggestions for improvements, please ping or otherwise contact @TimNN. (Feature Requests)

@Aaron1011
Member Author

@eddyb: I've made the changes you requested.

@petrochenkov petrochenkov added the S-waiting-on-author Status: This is awaiting some action (such as code changes or more information) from the author. label Aug 16, 2019
@Aaron1011 Aaron1011 force-pushed the feature/proc-macro-data branch from 463147c to 73d7719 Compare August 16, 2019 23:53
@Aaron1011
Member Author

@petrochenkov: Squashed

@petrochenkov
Contributor

@bors r=eddyb,petrochenkov

@bors
Contributor

bors commented Aug 17, 2019

📌 Commit 73d7719 has been approved by eddyb,petrochenkov

@bors bors added S-waiting-on-bors Status: Waiting on bors to run and complete tests. Bors will change the label on completion. and removed S-waiting-on-author Status: This is awaiting some action (such as code changes or more information) from the author. labels Aug 17, 2019
Centril added a commit to Centril/rust that referenced this pull request Aug 17, 2019
…r=eddyb,petrochenkov

Serialize additional data for procedural macros

Split off from rust-lang#62855

This PR serializes the declaration `Span` and attributes for all
procedural macros. This allows Rustdoc to properly render doc comments
and source links when inlining procedural macros across crates
@Centril
Contributor

Centril commented Aug 17, 2019

Failed in #63653 (comment), @bors r-

(Known by process of elimination, since #63655 excluded this PR and is working.)

@bors bors added S-waiting-on-author Status: This is awaiting some action (such as code changes or more information) from the author. and removed S-waiting-on-bors Status: Waiting on bors to run and complete tests. Bors will change the label on completion. labels Aug 17, 2019
Split off from rust-lang#62855

This PR deserializes the declaration `Span` and attributes for all
procedural macros from their underlying function definitions.
This allows Rustdoc to properly render doc comments
and source links when inlining procedural macros across crates
@Aaron1011 Aaron1011 force-pushed the feature/proc-macro-data branch from 73d7719 to 64f867a Compare August 17, 2019 17:14
@Aaron1011
Member Author

@petrochenkov: I had accidentally removed this check. The build should now pass.

@petrochenkov
Contributor

@bors r=eddyb,petrochenkov

@bors
Contributor

bors commented Aug 17, 2019

📌 Commit 64f867a has been approved by eddyb,petrochenkov

@bors bors added S-waiting-on-bors Status: Waiting on bors to run and complete tests. Bors will change the label on completion. and removed S-waiting-on-author Status: This is awaiting some action (such as code changes or more information) from the author. labels Aug 17, 2019
@bors
Contributor

bors commented Aug 18, 2019

⌛ Testing commit 64f867a with merge 71e2882...

bors added a commit that referenced this pull request Aug 18, 2019
…rochenkov

Serialize additional data for procedural macros

Split off from #62855

This PR serializes the declaration `Span` and attributes for all
procedural macros. This allows Rustdoc to properly render doc comments
and source links when inlining procedural macros across crates
@eddyb
Member

eddyb commented Aug 18, 2019

@petrochenkov in #63269 (comment):

One big remaining question: why do we need the static _DECLS at all?

I've ranted about this before but I'm not sure if in this thread or on Discord.
To summarize, what we can do is maybe have one static proc_macro::bridge::client::Client per function (or even one static per crate plus the functions themselves), but not get rid of them completely.

If you look at what that Client contains, it's not just the proc macro fn pointer (which, yes, we'd be able to get via gensym), which would go into the f: F field, but also two other fn pointers, which need to come from the proc_macro library that the proc macro crate was linked against - so in a sense, the Client type encapsulates the ABI of a proc macro expander function.

/// A client-side "global object" (usually a function pointer),
/// which may be using a different `proc_macro` from the one
/// used by the server, but can be interacted with compatibly.
///
/// N.B., `F` must have FFI-friendly memory layout (e.g., a pointer).
/// The call ABI of function pointers used for `F` doesn't
/// need to match between server and client, since it's only
/// passed between them and (eventually) called by the client.
#[repr(C)]
#[derive(Copy, Clone)]
pub struct Client<F> {
    pub(super) get_handle_counters: extern "C" fn() -> &'static HandleCounters,
    pub(super) run: extern "C" fn(Bridge<'_>, F) -> Buffer<u8>,
    pub(super) f: F,
}
// FIXME(#53451) public to work around `Cannot create local mono-item` ICE,
// affecting not only the function itself, but also the `BridgeState` `thread_local!`.
pub extern "C" fn __run_expand1(
    mut bridge: Bridge<'_>,
    f: fn(crate::TokenStream) -> crate::TokenStream,
) -> Buffer<u8> {
    // The initial `cached_buffer` contains the input.
    let mut b = bridge.cached_buffer.take();
    panic::catch_unwind(panic::AssertUnwindSafe(|| {
        bridge.enter(|| {
            let reader = &mut &b[..];
            let input = TokenStream::decode(reader, &mut ());
            // Put the `cached_buffer` back in the `Bridge`, for requests.
            Bridge::with(|bridge| bridge.cached_buffer = b.take());
            let output = f(crate::TokenStream(input)).0;
            // Take the `cached_buffer` back out, for the output value.
            b = Bridge::with(|bridge| bridge.cached_buffer.take());
            // HACK(eddyb) Separate encoding a success value (`Ok(output)`)
            // from encoding a panic (`Err(e: PanicMessage)`) to avoid
            // having handles outside the `bridge.enter(|| ...)` scope, and
            // to catch panics that could happen while encoding the success.
            //
            // Note that panics should be impossible beyond this point, but
            // this is defensively trying to avoid any accidental panicking
            // reaching the `extern "C"` (which should `abort` but may not
            // at the moment, so this is also potentially preventing UB).
            b.clear();
            Ok::<_, ()>(output).encode(&mut b, &mut ());
        })
    }))
    .map_err(PanicMessage::from)
    .unwrap_or_else(|e| {
        b.clear();
        Err::<(), _>(e).encode(&mut b, &mut ());
    });
    b
}
impl Client<fn(crate::TokenStream) -> crate::TokenStream> {
    pub const fn expand1(f: fn(crate::TokenStream) -> crate::TokenStream) -> Self {
        Client {
            get_handle_counters: HandleCounters::get,
            run: __run_expand1,
            f,
        }
    }
}
// FIXME(#53451) public to work around `Cannot create local mono-item` ICE,
// affecting not only the function itself, but also the `BridgeState` `thread_local!`.
pub extern "C" fn __run_expand2(
    mut bridge: Bridge<'_>,
    f: fn(crate::TokenStream, crate::TokenStream) -> crate::TokenStream,
) -> Buffer<u8> {
    // The initial `cached_buffer` contains the input.
    let mut b = bridge.cached_buffer.take();
    panic::catch_unwind(panic::AssertUnwindSafe(|| {
        bridge.enter(|| {
            let reader = &mut &b[..];
            let input = TokenStream::decode(reader, &mut ());
            let input2 = TokenStream::decode(reader, &mut ());
            // Put the `cached_buffer` back in the `Bridge`, for requests.
            Bridge::with(|bridge| bridge.cached_buffer = b.take());
            let output = f(crate::TokenStream(input), crate::TokenStream(input2)).0;
            // Take the `cached_buffer` back out, for the output value.
            b = Bridge::with(|bridge| bridge.cached_buffer.take());
            // HACK(eddyb) Separate encoding a success value (`Ok(output)`)
            // from encoding a panic (`Err(e: PanicMessage)`) to avoid
            // having handles outside the `bridge.enter(|| ...)` scope, and
            // to catch panics that could happen while encoding the success.
            //
            // Note that panics should be impossible beyond this point, but
            // this is defensively trying to avoid any accidental panicking
            // reaching the `extern "C"` (which should `abort` but may not
            // at the moment, so this is also potentially preventing UB).
            b.clear();
            Ok::<_, ()>(output).encode(&mut b, &mut ());
        })
    }))
    .map_err(PanicMessage::from)
    .unwrap_or_else(|e| {
        b.clear();
        Err::<(), _>(e).encode(&mut b, &mut ());
    });
    b
}
impl Client<fn(crate::TokenStream, crate::TokenStream) -> crate::TokenStream> {
    pub const fn expand2(
        f: fn(crate::TokenStream, crate::TokenStream) -> crate::TokenStream
    ) -> Self {
        Client {
            get_handle_counters: HandleCounters::get,
            run: __run_expand2,
            f,
        }
    }
}
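
To make "one static Client per function" concrete, here is a hypothetical sketch under the assumption that a proc-macro crate constructs such a static itself (the names are made up, this is not what the proc-macro harness actually emits, and `proc_macro::bridge` is an unstable internal module that would require a nightly feature gate):

```rust
// Hypothetical only: packaging one expander function into a per-function static.
extern crate proc_macro;
use proc_macro::bridge::client::Client;
use proc_macro::TokenStream;

fn my_bang(input: TokenStream) -> TokenStream {
    input
}

// `expand1` bundles `my_bang` with the two `proc_macro`-side fn pointers
// (`get_handle_counters` and `run`) described above.
static MY_BANG_CLIENT: Client<fn(TokenStream) -> TokenStream> = Client::expand1(my_bang);
```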

@bors
Contributor

bors commented Aug 18, 2019

☀️ Test successful - checks-azure
Approved by: eddyb,petrochenkov
Pushing 71e2882 to master...

@bors bors added the merged-by-bors This PR was explicitly merged by bors. label Aug 18, 2019
@bors bors merged commit 64f867a into rust-lang:master Aug 18, 2019
Centril added a commit to Centril/rust that referenced this pull request Aug 27, 2019
Propagate spans and attributes from proc macro definitions

Thanks to rust-lang#63269 we now have spans and attributes from proc macro definitions available in metadata.

However, that PR didn't actually put them into use! This PR finishes that work.

Attributes `rustc_macro_transparency`, `allow_internal_unstable`, `allow_internal_unsafe`, `local_inner_macros`, `rustc_builtin_macro`, `stable`, `unstable`, `rustc_deprecated`, `deprecated` now have effect when applied to proc macro definition functions.
Of those attributes, only `deprecated` is both stable and intended to be used in new code.
(`#![staged_api]` still cannot be used in proc macro crates for unrelated reasons though.)

`Span::def_site` from the proc macro API now returns the correct location of the proc macro definition.

Also, I made a mistake in rust-lang#63269 (comment): loaded proc macros didn't actually use the resolver cache.
This PR fixes the caching issue; proc macros now go through the `Resolver::macro_map` cache as well.

(Also, the first commit turns `proc_macro::quote` into a regular built-in macro to reduce the number of places where `SyntaxExtension`s need to be manually created.)
Labels: merged-by-bors, S-waiting-on-bors
7 participants