
Rollup of 7 pull requests #37382

Merged · 19 commits · Oct 25, 2016
0d3bdc6  Fix a error of 'book/deref-coercions.html' (loggerhead, Oct 17, 2016)
9ec7c65  Add missing urls in collections module (GuillaumeGomez, Oct 20, 2016)
1fadd86  Improve E0277 help message (GuillaumeGomez, Oct 21, 2016)
e6aa92c  trans: Make names of internal symbols independent of CGU translation … (michaelwoerister, Oct 21, 2016)
6f3edb0  type_id: Make result of std::intrinsics::type_id() endian-independent. (michaelwoerister, Oct 21, 2016)
c52836c  debuginfo: Use TypeIdHasher to create global type identifiers for deb… (michaelwoerister, Oct 21, 2016)
e46c1ad  Adapt codegen test to new naming scheme for generated symbols. (michaelwoerister, Oct 21, 2016)
7ef418b  Make ArchIndependentHasher publicly visible. (michaelwoerister, Oct 21, 2016)
cac3e5a  Run rustfmt on metadata folder - (1/2) (srinivasreddy, Oct 23, 2016)
27dbfff  Link to PathBuf from the Path docs (Oct 24, 2016)
025b27d  debuginfo: Erase regions when creating debuginfo for statics. (michaelwoerister, Oct 24, 2016)
992203b  Adapt rmake-test to new naming scheme for internal symbols. (michaelwoerister, Oct 24, 2016)
91c7a82  Rollup merge of #37228 - loggerhead:patch-1, r=steveklabnik (Oct 24, 2016)
855f3e7  Rollup merge of #37304 - GuillaumeGomez:collections_url, r=frewsxcv (Oct 24, 2016)
050499c  Rollup merge of #37324 - GuillaumeGomez:trait_error_message, r=jonath… (Oct 24, 2016)
e7da619  Rollup merge of #37328 - michaelwoerister:stable-local-symbol-names, … (Oct 24, 2016)
691ab94  Rollup merge of #37336 - michaelwoerister:debuginfo-type-ids, r=eddyb (Oct 24, 2016)
59b7ea4  Rollup merge of #37349 - srinivasreddy:meta_1, r=nikomatsakis (Oct 24, 2016)
e948cf1  Rollup merge of #37372 - vtduncan:pathbuf-docs-link, r=steveklabnik (Oct 24, 2016)
2 changes: 1 addition & 1 deletion src/doc/book/deref-coercions.md
@@ -69,7 +69,7 @@ foo(&counted);
All we’ve done is wrap our `String` in an `Rc<T>`. But we can now pass the
`Rc<String>` around anywhere we’d have a `String`. The signature of `foo`
didn’t change, but works just as well with either type. This example has two
conversions: `Rc<String>` to `String` and then `String` to `&str`. Rust will do
conversions: `&Rc<String>` to `&String` and then `&String` to `&str`. Rust will do
this as many times as possible until the types match.
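The two-step coercion described above can be sketched as a standalone program (the `foo` function mirrors the book's example; the exact body is illustrative):

```rust
use std::rc::Rc;

// `foo` only asks for a string slice.
fn foo(s: &str) -> usize {
    s.len()
}

fn main() {
    let owned = "Hello".to_string();
    let counted = Rc::new(owned);
    // `&Rc<String>` -> `&String` -> `&str`: Rust applies `Deref`
    // repeatedly until the argument type matches the parameter type.
    assert_eq!(foo(&counted), 5);
}
```

No explicit conversion appears at the call site; both `Deref` steps are inserted by the compiler.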

Another very common implementation provided by the standard library is:
6 changes: 4 additions & 2 deletions src/librustc/traits/error_reporting.rs
@@ -445,8 +445,10 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
let mut err = struct_span_err!(self.tcx.sess, span, E0277,
"the trait bound `{}` is not satisfied",
trait_ref.to_predicate());
err.span_label(span, &format!("trait `{}` not satisfied",
trait_ref.to_predicate()));
err.span_label(span, &format!("the trait `{}` is not implemented \
for `{}`",
trait_ref,
trait_ref.self_ty()));
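For context, here is a sketch of user code that would hit this diagnostic path (names are illustrative; the failing call is left commented out so the example compiles):

```rust
use std::fmt::Display;

// A generic function with a trait bound a caller's type may fail to meet.
fn describe<T: Display>(value: T) -> String {
    format!("value = {}", value)
}

fn main() {
    // Satisfies the bound: i32 implements Display.
    assert_eq!(describe(42), "value = 42");

    // Vec<i32> does not implement Display. With this change, uncommenting
    // the next line labels the span roughly as: "the trait
    // `std::fmt::Display` is not implemented for `std::vec::Vec<i32>`"
    // (error E0277), instead of the older "trait ... not satisfied" wording.
    // describe(vec![1, 2, 3]);
}
```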

// Try to report a help message

57 changes: 32 additions & 25 deletions src/librustc/ty/util.rs
@@ -392,27 +392,30 @@ impl<'a, 'gcx, 'tcx> TyCtxt<'a, 'gcx, 'tcx> {
}
}

// When hashing a type this ends up affecting properties like symbol names. We
// want these symbol names to be calculated independent of other factors like
// what architecture you're compiling *from*.
//
// The hashing just uses the standard `Hash` trait, but the implementations of
// `Hash` for the `usize` and `isize` types are *not* architecture independent
// (e.g. they has 4 or 8 bytes). As a result we want to avoid `usize` and
// `isize` completely when hashing. To ensure that these don't leak in we use a
// custom hasher implementation here which inflates the size of these to a `u64`
// and `i64`.
struct WidenUsizeHasher<H> {
/// When hashing a type this ends up affecting properties like symbol names. We
/// want these symbol names to be calculated independent of other factors like
/// what architecture you're compiling *from*.
///
/// The hashing just uses the standard `Hash` trait, but the implementations of
/// `Hash` for the `usize` and `isize` types are *not* architecture independent
/// (e.g. they hash 4 or 8 bytes). As a result we want to avoid `usize` and
/// `isize` completely when hashing. To ensure that these don't leak in we use a
/// custom hasher implementation here which inflates the size of these to a `u64`
/// and `i64`.
///
/// The same goes for endianness: We always convert multi-byte integers to little
/// endian before hashing.
pub struct ArchIndependentHasher<H> {
inner: H,
}

impl<H> WidenUsizeHasher<H> {
fn new(inner: H) -> WidenUsizeHasher<H> {
WidenUsizeHasher { inner: inner }
impl<H> ArchIndependentHasher<H> {
pub fn new(inner: H) -> ArchIndependentHasher<H> {
ArchIndependentHasher { inner: inner }
}
}

impl<H: Hasher> Hasher for WidenUsizeHasher<H> {
impl<H: Hasher> Hasher for ArchIndependentHasher<H> {
fn write(&mut self, bytes: &[u8]) {
self.inner.write(bytes)
}
@@ -425,44 +428,44 @@ impl<H: Hasher> Hasher for WidenUsizeHasher<H> {
self.inner.write_u8(i)
}
fn write_u16(&mut self, i: u16) {
self.inner.write_u16(i)
self.inner.write_u16(i.to_le())
}
fn write_u32(&mut self, i: u32) {
self.inner.write_u32(i)
self.inner.write_u32(i.to_le())
}
fn write_u64(&mut self, i: u64) {
self.inner.write_u64(i)
self.inner.write_u64(i.to_le())
}
fn write_usize(&mut self, i: usize) {
self.inner.write_u64(i as u64)
self.inner.write_u64((i as u64).to_le())
}
fn write_i8(&mut self, i: i8) {
self.inner.write_i8(i)
}
fn write_i16(&mut self, i: i16) {
self.inner.write_i16(i)
self.inner.write_i16(i.to_le())
}
fn write_i32(&mut self, i: i32) {
self.inner.write_i32(i)
self.inner.write_i32(i.to_le())
}
fn write_i64(&mut self, i: i64) {
self.inner.write_i64(i)
self.inner.write_i64(i.to_le())
}
fn write_isize(&mut self, i: isize) {
self.inner.write_i64(i as i64)
self.inner.write_i64((i as i64).to_le())
}
}
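The effect of this wrapper can be demonstrated outside the compiler with a minimal stand-in (the `PortableHasher` name and its two overridden methods are illustrative, not rustc's actual type):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Minimal sketch of the ArchIndependentHasher idea: widen usize to a
// fixed-width integer and byte-swap to little endian before forwarding
// to the inner hasher.
struct PortableHasher<H>(H);

impl<H: Hasher> Hasher for PortableHasher<H> {
    fn finish(&self) -> u64 {
        self.0.finish()
    }
    fn write(&mut self, bytes: &[u8]) {
        self.0.write(bytes)
    }
    fn write_u32(&mut self, i: u32) {
        self.0.write_u32(i.to_le()) // same byte order on every host
    }
    fn write_usize(&mut self, i: usize) {
        self.0.write_u64((i as u64).to_le()) // fixed width: always 8 bytes
    }
}

fn main() {
    let mut h = PortableHasher(DefaultHasher::new());
    1234usize.hash(&mut h); // dispatches to write_usize
    // The bytes fed to the inner hasher are now identical on 32-bit and
    // 64-bit, big- and little-endian hosts (given the same inner hasher).
    println!("{:x}", h.finish());
}
```

The remaining `write_*` methods would be wrapped the same way; only `write` and `finish` are required by the `Hasher` trait.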

pub struct TypeIdHasher<'a, 'gcx: 'a+'tcx, 'tcx: 'a, H> {
tcx: TyCtxt<'a, 'gcx, 'tcx>,
state: WidenUsizeHasher<H>,
state: ArchIndependentHasher<H>,
}

impl<'a, 'gcx, 'tcx, H: Hasher> TypeIdHasher<'a, 'gcx, 'tcx, H> {
pub fn new(tcx: TyCtxt<'a, 'gcx, 'tcx>, state: H) -> Self {
TypeIdHasher {
tcx: tcx,
state: WidenUsizeHasher::new(state),
state: ArchIndependentHasher::new(state),
}
}

@@ -493,6 +496,10 @@ impl<'a, 'gcx, 'tcx, H: Hasher> TypeIdHasher<'a, 'gcx, 'tcx, H> {
pub fn def_path(&mut self, def_path: &ast_map::DefPath) {
def_path.deterministic_hash_to(self.tcx, &mut self.state);
}

pub fn into_inner(self) -> H {
self.state.inner
}
}

impl<'a, 'gcx, 'tcx, H: Hasher> TypeVisitor<'tcx> for TypeIdHasher<'a, 'gcx, 'tcx, H> {
27 changes: 14 additions & 13 deletions src/librustc_metadata/astencode.rs
@@ -30,7 +30,7 @@ use rustc_serialize::Encodable;
pub struct Ast<'tcx> {
id_range: IdRange,
item: Lazy<InlinedItem>,
side_tables: LazySeq<(ast::NodeId, TableEntry<'tcx>)>
side_tables: LazySeq<(ast::NodeId, TableEntry<'tcx>)>,
}

#[derive(RustcEncodable, RustcDecodable)]
@@ -39,7 +39,7 @@ enum TableEntry<'tcx> {
NodeType(Ty<'tcx>),
ItemSubsts(ty::ItemSubsts<'tcx>),
Adjustment(ty::adjustment::AutoAdjustment<'tcx>),
ConstQualif(ConstQualif)
ConstQualif(ConstQualif),
}

impl<'a, 'tcx> EncodeContext<'a, 'tcx> {
@@ -48,7 +48,7 @@ impl<'a, 'tcx> EncodeContext<'a, 'tcx> {
match ii {
InlinedItemRef::Item(_, i) => id_visitor.visit_item(i),
InlinedItemRef::TraitItem(_, ti) => id_visitor.visit_trait_item(ti),
InlinedItemRef::ImplItem(_, ii) => id_visitor.visit_impl_item(ii)
InlinedItemRef::ImplItem(_, ii) => id_visitor.visit_impl_item(ii),
}

let ii_pos = self.position();
@@ -58,27 +58,27 @@ impl<'a, 'tcx> EncodeContext<'a, 'tcx> {
let tables_count = {
let mut visitor = SideTableEncodingIdVisitor {
ecx: self,
count: 0
count: 0,
};
match ii {
InlinedItemRef::Item(_, i) => visitor.visit_item(i),
InlinedItemRef::TraitItem(_, ti) => visitor.visit_trait_item(ti),
InlinedItemRef::ImplItem(_, ii) => visitor.visit_impl_item(ii)
InlinedItemRef::ImplItem(_, ii) => visitor.visit_impl_item(ii),
}
visitor.count
};

self.lazy(&Ast {
id_range: id_visitor.result(),
item: Lazy::with_position(ii_pos),
side_tables: LazySeq::with_position_and_length(tables_pos, tables_count)
side_tables: LazySeq::with_position_and_length(tables_pos, tables_count),
})
}
}

struct SideTableEncodingIdVisitor<'a, 'b:'a, 'tcx:'b> {
struct SideTableEncodingIdVisitor<'a, 'b: 'a, 'tcx: 'b> {
ecx: &'a mut EncodeContext<'b, 'tcx>,
count: usize
count: usize,
}

impl<'a, 'b, 'tcx, 'v> Visitor<'v> for SideTableEncodingIdVisitor<'a, 'b, 'tcx> {
@@ -114,10 +114,11 @@ pub fn decode_inlined_item<'a, 'tcx>(cdata: &CrateMetadata,

let cnt = ast.id_range.max.as_usize() - ast.id_range.min.as_usize();
let start = tcx.sess.reserve_node_ids(cnt);
let id_ranges = [ast.id_range, IdRange {
min: start,
max: ast::NodeId::new(start.as_usize() + cnt)
}];
let id_ranges = [ast.id_range,
IdRange {
min: start,
max: ast::NodeId::new(start.as_usize() + cnt),
}];

let ii = ast.item.decode((cdata, tcx, id_ranges));
let ii = ast_map::map_decoded_item(&tcx.map,
@@ -129,7 +130,7 @@ pub fn decode_inlined_item<'a, 'tcx>(cdata: &CrateMetadata,
let item_node_id = match ii {
&InlinedItem::Item(_, ref i) => i.id,
&InlinedItem::TraitItem(_, ref ti) => ti.id,
&InlinedItem::ImplItem(_, ref ii) => ii.id
&InlinedItem::ImplItem(_, ref ii) => ii.id,
};
let inlined_did = tcx.map.local_def_id(item_node_id);
tcx.register_item_type(inlined_did, tcx.lookup_item_type(orig_did));
84 changes: 46 additions & 38 deletions src/librustc_metadata/cstore.rs
@@ -54,7 +54,7 @@ pub struct ImportedFileMap {
/// The end of this FileMap within the codemap of its original crate
pub original_end_pos: syntax_pos::BytePos,
/// The imported FileMap's representation within the local codemap
pub translated_filemap: Rc<syntax_pos::FileMap>
pub translated_filemap: Rc<syntax_pos::FileMap>,
}

pub struct CrateMetadata {
@@ -141,21 +141,23 @@ impl CStore {
self.metas.borrow_mut().insert(cnum, data);
}

pub fn iter_crate_data<I>(&self, mut i: I) where
I: FnMut(CrateNum, &Rc<CrateMetadata>),
pub fn iter_crate_data<I>(&self, mut i: I)
where I: FnMut(CrateNum, &Rc<CrateMetadata>)
{
for (&k, v) in self.metas.borrow().iter() {
i(k, v);
}
}

/// Like `iter_crate_data`, but passes source paths (if available) as well.
pub fn iter_crate_data_origins<I>(&self, mut i: I) where
I: FnMut(CrateNum, &CrateMetadata, Option<CrateSource>),
pub fn iter_crate_data_origins<I>(&self, mut i: I)
where I: FnMut(CrateNum, &CrateMetadata, Option<CrateSource>)
{
for (&k, v) in self.metas.borrow().iter() {
let origin = self.opt_used_crate_source(k);
origin.as_ref().map(|cs| { assert!(k == cs.cnum); });
origin.as_ref().map(|cs| {
assert!(k == cs.cnum);
});
i(k, &v, origin);
}
}
@@ -167,10 +169,12 @@ impl CStore {
}
}

pub fn opt_used_crate_source(&self, cnum: CrateNum)
-> Option<CrateSource> {
self.used_crate_sources.borrow_mut()
.iter().find(|source| source.cnum == cnum).cloned()
pub fn opt_used_crate_source(&self, cnum: CrateNum) -> Option<CrateSource> {
self.used_crate_sources
.borrow_mut()
.iter()
.find(|source| source.cnum == cnum)
.cloned()
}

pub fn reset(&self) {
@@ -182,19 +186,17 @@ impl CStore {
self.statically_included_foreign_items.borrow_mut().clear();
}

pub fn crate_dependencies_in_rpo(&self, krate: CrateNum) -> Vec<CrateNum>
{
pub fn crate_dependencies_in_rpo(&self, krate: CrateNum) -> Vec<CrateNum> {
let mut ordering = Vec::new();
self.push_dependencies_in_postorder(&mut ordering, krate);
ordering.reverse();
ordering
}

pub fn push_dependencies_in_postorder(&self,
ordering: &mut Vec<CrateNum>,
krate: CrateNum)
{
if ordering.contains(&krate) { return }
pub fn push_dependencies_in_postorder(&self, ordering: &mut Vec<CrateNum>, krate: CrateNum) {
if ordering.contains(&krate) {
return;
}

let data = self.get_crate_data(krate);
for &dep in data.cnum_map.borrow().iter() {
@@ -215,20 +217,25 @@ impl CStore {
// In order to get this left-to-right dependency ordering, we perform a
// topological sort of all crates putting the leaves at the right-most
// positions.
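The comment above describes the standard reverse-postorder trick, which can be sketched independently of rustc's types (plain `u32` crate numbers and a `HashMap` dependency graph stand in for `CrateNum` and the real metadata):

```rust
use std::collections::HashMap;

// Sketch of crate_dependencies_in_rpo / push_dependencies_in_postorder:
// a DFS post-order visits leaves first, so reversing it yields a
// topological order with each crate before the crates it depends on.
fn push_postorder(deps: &HashMap<u32, Vec<u32>>, ordering: &mut Vec<u32>, krate: u32) {
    if ordering.contains(&krate) {
        return;
    }
    for &dep in deps.get(&krate).into_iter().flatten() {
        push_postorder(deps, ordering, dep);
    }
    ordering.push(krate); // pushed only after all dependencies
}

fn main() {
    // Hypothetical graph: 1 depends on 2 and 3; 2 depends on 3.
    let mut deps = HashMap::new();
    deps.insert(1, vec![2, 3]);
    deps.insert(2, vec![3]);
    let mut ordering = Vec::new();
    push_postorder(&deps, &mut ordering, 1);
    ordering.reverse(); // reverse post-order = topological order
    assert_eq!(ordering, vec![1, 2, 3]);
}
```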
pub fn do_get_used_crates(&self, prefer: LinkagePreference)
pub fn do_get_used_crates(&self,
prefer: LinkagePreference)
-> Vec<(CrateNum, Option<PathBuf>)> {
let mut ordering = Vec::new();
for (&num, _) in self.metas.borrow().iter() {
self.push_dependencies_in_postorder(&mut ordering, num);
}
info!("topological ordering: {:?}", ordering);
ordering.reverse();
let mut libs = self.used_crate_sources.borrow()
let mut libs = self.used_crate_sources
.borrow()
.iter()
.map(|src| (src.cnum, match prefer {
LinkagePreference::RequireDynamic => src.dylib.clone().map(|p| p.0),
LinkagePreference::RequireStatic => src.rlib.clone().map(|p| p.0),
}))
.map(|src| {
(src.cnum,
match prefer {
LinkagePreference::RequireDynamic => src.dylib.clone().map(|p| p.0),
LinkagePreference::RequireStatic => src.rlib.clone().map(|p| p.0),
})
})
.collect::<Vec<_>>();
libs.sort_by(|&(a, _), &(b, _)| {
let a = ordering.iter().position(|x| *x == a);
@@ -243,9 +250,7 @@ impl CStore {
self.used_libraries.borrow_mut().push((lib, kind));
}

pub fn get_used_libraries<'a>(&'a self)
-> &'a RefCell<Vec<(String,
NativeLibraryKind)>> {
pub fn get_used_libraries<'a>(&'a self) -> &'a RefCell<Vec<(String, NativeLibraryKind)>> {
&self.used_libraries
}

@@ -255,13 +260,11 @@ impl CStore {
}
}

pub fn get_used_link_args<'a>(&'a self) -> &'a RefCell<Vec<String> > {
pub fn get_used_link_args<'a>(&'a self) -> &'a RefCell<Vec<String>> {
&self.used_link_args
}

pub fn add_extern_mod_stmt_cnum(&self,
emod_id: ast::NodeId,
cnum: CrateNum) {
pub fn add_extern_mod_stmt_cnum(&self, emod_id: ast::NodeId, cnum: CrateNum) {
self.extern_mod_crate_map.borrow_mut().insert(emod_id, cnum);
}

@@ -273,8 +276,7 @@ impl CStore {
self.statically_included_foreign_items.borrow().contains(&id)
}

pub fn do_extern_mod_stmt_cnum(&self, emod_id: ast::NodeId) -> Option<CrateNum>
{
pub fn do_extern_mod_stmt_cnum(&self, emod_id: ast::NodeId) -> Option<CrateNum> {
self.extern_mod_crate_map.borrow().get(&emod_id).cloned()
}

@@ -288,14 +290,20 @@ impl CStore {
}

impl CrateMetadata {
pub fn name(&self) -> &str { &self.root.name }
pub fn hash(&self) -> Svh { self.root.hash }
pub fn disambiguator(&self) -> &str { &self.root.disambiguator }
pub fn name(&self) -> &str {
&self.root.name
}
pub fn hash(&self) -> Svh {
self.root.hash
}
pub fn disambiguator(&self) -> &str {
&self.root.disambiguator
}

pub fn is_staged_api(&self) -> bool {
self.get_item_attrs(CRATE_DEF_INDEX).iter().any(|attr| {
attr.name() == "stable" || attr.name() == "unstable"
})
self.get_item_attrs(CRATE_DEF_INDEX)
.iter()
.any(|attr| attr.name() == "stable" || attr.name() == "unstable")
}

pub fn is_allocator(&self) -> bool {