
Commit 1685c92

Auto merge of #42727 - alexcrichton:allocators-new, r=eddyb
rustc: Implement the #[global_allocator] attribute

This PR is an implementation of [RFC 1974] which specifies a new method of
defining a global allocator for a program. This obsoletes the old
`#![allocator]` attribute and also removes support for it.

[RFC 1974]: rust-lang/rfcs#1974

The new `#[global_allocator]` attribute solves many issues encountered with
the `#![allocator]` attribute such as composition and restrictions on the
crate graph itself. The compiler now has much more control over the ABI of
the allocator and how it's implemented, allowing much more freedom in terms
of how this feature is implemented.

cc #27389
2 parents 4d526e0 + 695dee0 commit 1685c92

File tree

115 files changed (+2828 / -1169 lines)


src/Cargo.lock

+18
Some generated files are not rendered by default.
@@ -0,0 +1,7 @@
# `allocator_internals`

This feature does not have a tracking issue, it is an unstable implementation
detail of the `global_allocator` feature not intended for use outside the
compiler.

------------------------

src/doc/unstable-book/src/language-features/allocator.md

-119
This file was deleted.
@@ -0,0 +1,71 @@
# `global_allocator`

The tracking issue for this feature is: [#27389]

[#27389]: https://github.com/rust-lang/rust/issues/27389

------------------------

Rust programs may need to change the allocator that they're running with from
time to time. This use case is distinct from an allocator-per-collection (e.g. a
`Vec` with a custom allocator) and instead is more related to changing the
global default allocator, e.g. what `Vec<T>` uses by default.

Currently Rust programs don't have a specified global allocator. The compiler
may link to a version of [jemalloc] on some platforms, but this is not
guaranteed. Libraries, however, like cdylibs and staticlibs are guaranteed
to use the "system allocator", which means something like `malloc` on Unixes and
`HeapAlloc` on Windows.

[jemalloc]: https://github.com/jemalloc/jemalloc

The `#[global_allocator]` attribute, however, allows configuring this choice.
You can use this to implement a completely custom global allocator to route all
default allocation requests to a custom object. Defined in [RFC 1974], usage
looks like:

[RFC 1974]: https://github.com/rust-lang/rfcs/pull/1974

```rust
#![feature(global_allocator, heap_api)]

use std::heap::{Alloc, System, Layout, AllocErr};

struct MyAllocator;

unsafe impl<'a> Alloc for &'a MyAllocator {
    unsafe fn alloc(&mut self, layout: Layout) -> Result<*mut u8, AllocErr> {
        System.alloc(layout)
    }

    unsafe fn dealloc(&mut self, ptr: *mut u8, layout: Layout) {
        System.dealloc(ptr, layout)
    }
}

#[global_allocator]
static GLOBAL: MyAllocator = MyAllocator;

fn main() {
    // This `Vec` will allocate memory through `GLOBAL` above
    let mut v = Vec::new();
    v.push(1);
}
```

And that's it! The `#[global_allocator]` attribute is applied to a `static`
which implements the `Alloc` trait in the `std::heap` module. Note, though,
that the implementation is defined for `&MyAllocator`, not just `MyAllocator`.
You may wish, however, to also provide `Alloc for MyAllocator` for other use
cases.

A crate can only have one instance of `#[global_allocator]`, and this instance
may be loaded through a dependency. For example, `#[global_allocator]` above
could have been placed in one of the dependencies loaded through `extern crate`.

Note that `Alloc` itself is an `unsafe` trait, with much documentation on the
trait itself about usage and for implementors. Extra care should be taken when
implementing a global allocator, as the allocator may be called from many
portions of the standard library, such as the panicking routine. As a result it
is highly recommended to not panic during allocation and to work in as many
situations with as few dependencies as possible.
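As an aside on the closing paragraphs of that document, here is a minimal sketch (not part of the commit) of also providing `Alloc for MyAllocator` in addition to the `&MyAllocator` impl that `#[global_allocator]` relies on, assuming the unstable `std::heap` API and feature gates shown in the example above:

```rust
#![feature(global_allocator, heap_api)]

use std::heap::{Alloc, AllocErr, Layout, System};

struct MyAllocator;

// Sketch: implementing `Alloc` for the type itself as well lets `MyAllocator`
// be used directly (e.g. passed by value to code expecting an allocator), not
// only through the `#[global_allocator]` static.
unsafe impl Alloc for MyAllocator {
    unsafe fn alloc(&mut self, layout: Layout) -> Result<*mut u8, AllocErr> {
        System.alloc(layout)
    }

    unsafe fn dealloc(&mut self, ptr: *mut u8, layout: Layout) {
        System.dealloc(ptr, layout)
    }
}

// The `&MyAllocator` impl is still required: as the document notes, the
// global-allocator machinery calls the allocator through a shared reference.
unsafe impl<'a> Alloc for &'a MyAllocator {
    unsafe fn alloc(&mut self, layout: Layout) -> Result<*mut u8, AllocErr> {
        System.alloc(layout)
    }

    unsafe fn dealloc(&mut self, ptr: *mut u8, layout: Layout) {
        System.dealloc(ptr, layout)
    }
}

#[global_allocator]
static GLOBAL: MyAllocator = MyAllocator;

fn main() {
    // Allocates through `GLOBAL`, exactly as in the example above.
    let mut v = Vec::new();
    v.push(1);
}
```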

src/liballoc/allocator.rs

+21 / -2
@@ -13,7 +13,7 @@
             slightly, especially to possibly take into account the \
             types being stored to make room for a future \
             tracing garbage collector",
-            issue = "27700")]
+            issue = "32838")]
 
 use core::cmp;
 use core::fmt;

@@ -73,6 +73,7 @@ impl Layout {
     /// * `size`, when rounded up to the nearest multiple of `align`,
     /// must not overflow (i.e. the rounded value must be less than
     /// `usize::MAX`).
+    #[inline]
     pub fn from_size_align(size: usize, align: usize) -> Option<Layout> {
         if !align.is_power_of_two() {
             return None;

@@ -96,13 +97,28 @@ impl Layout {
             return None;
         }
 
-        Some(Layout { size: size, align: align })
+        unsafe {
+            Some(Layout::from_size_align_unchecked(size, align))
+        }
+    }
+
+    /// Creates a layout, bypassing all checks.
+    ///
+    /// # Unsafety
+    ///
+    /// This function is unsafe as it does not verify that `align` is a power of
+    /// two nor that `size` aligned to `align` fits within the address space.
+    #[inline]
+    pub unsafe fn from_size_align_unchecked(size: usize, align: usize) -> Layout {
+        Layout { size: size, align: align }
     }
 
     /// The minimum size in bytes for a memory block of this layout.
+    #[inline]
     pub fn size(&self) -> usize { self.size }
 
     /// The minimum byte alignment for a memory block of this layout.
+    #[inline]
     pub fn align(&self) -> usize { self.align }
 
     /// Constructs a `Layout` suitable for holding a value of type `T`.

@@ -135,6 +151,7 @@ impl Layout {
     ///
     /// Panics if the combination of `self.size` and the given `align`
     /// violates the conditions listed in `from_size_align`.
+    #[inline]
     pub fn align_to(&self, align: usize) -> Self {
         Layout::from_size_align(self.size, cmp::max(self.align, align)).unwrap()
     }

@@ -155,6 +172,7 @@ impl Layout {
     /// to be less than or equal to the alignment of the starting
     /// address for the whole allocated block of memory. One way to
     /// satisfy this constraint is to ensure `align <= self.align`.
+    #[inline]
     pub fn padding_needed_for(&self, align: usize) -> usize {
         let len = self.size();
 
@@ -556,6 +574,7 @@ pub unsafe trait Alloc {
     /// However, for clients that do not wish to track the capacity
     /// returned by `alloc_excess` locally, this method is likely to
     /// produce useful results.
+    #[inline]
     fn usable_size(&self, layout: &Layout) -> (usize, usize) {
         (layout.size(), layout.size())
     }
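As a brief illustration of the constructors touched here (not part of the commit), a sketch exercising `Layout::from_size_align` and the new `Layout::from_size_align_unchecked`; the `heap_api` gate name is borrowed from the unstable-book example above, and the exact gate required may differ on a given nightly:

```rust
#![feature(heap_api)] // gate name taken from the example above; assumption

use std::heap::Layout;

fn main() {
    // Checked constructor: rejects a non-power-of-two alignment (and sizes
    // that would overflow once rounded up to `align`).
    assert!(Layout::from_size_align(16, 3).is_none());

    let layout = Layout::from_size_align(16, 8).unwrap();
    assert_eq!(layout.size(), 16);
    assert_eq!(layout.align(), 8);

    // Unchecked constructor: the caller promises both conditions hold,
    // which is why it is `unsafe`.
    let same = unsafe { Layout::from_size_align_unchecked(16, 8) };
    assert_eq!(same.align(), 8);
}
```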

src/liballoc/arc.rs

+6 / -4
@@ -23,7 +23,6 @@ use core::sync::atomic::Ordering::{Acquire, Relaxed, Release, SeqCst};
 use core::borrow;
 use core::fmt;
 use core::cmp::Ordering;
-use core::mem::{align_of_val, size_of_val};
 use core::intrinsics::abort;
 use core::mem;
 use core::mem::uninitialized;

@@ -34,7 +33,8 @@ use core::marker::Unsize;
 use core::hash::{Hash, Hasher};
 use core::{isize, usize};
 use core::convert::From;
-use heap::deallocate;
+
+use heap::{Heap, Alloc, Layout};
 
 /// A soft limit on the amount of references that may be made to an `Arc`.
 ///

@@ -503,7 +503,7 @@ impl<T: ?Sized> Arc<T> {
 
         if self.inner().weak.fetch_sub(1, Release) == 1 {
             atomic::fence(Acquire);
-            deallocate(ptr as *mut u8, size_of_val(&*ptr), align_of_val(&*ptr))
+            Heap.dealloc(ptr as *mut u8, Layout::for_value(&*ptr))
         }
     }
 
@@ -1007,7 +1007,9 @@ impl<T: ?Sized> Drop for Weak<T> {
         // ref, which can only happen after the lock is released.
         if self.inner().weak.fetch_sub(1, Release) == 1 {
             atomic::fence(Acquire);
-            unsafe { deallocate(ptr as *mut u8, size_of_val(&*ptr), align_of_val(&*ptr)) }
+            unsafe {
+                Heap.dealloc(ptr as *mut u8, Layout::for_value(&*ptr))
+            }
         }
     }
 }
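To show the pattern `Arc` now uses, a hedged sketch (not part of the commit) of allocating and freeing through the `Heap` handle with a `Layout` recovered via `Layout::for_value`, assuming the same unstable `std::heap` API as above; the gate name is again borrowed from the earlier example:

```rust
#![feature(heap_api)] // gate name taken from the doc example above; assumption

use std::heap::{Alloc, Heap, Layout};

fn main() {
    unsafe {
        // Allocate room for a u64 through the default `Heap` handle.
        let layout = Layout::new::<u64>();
        let ptr = Heap.alloc(layout.clone()).expect("allocation failed");

        // `Layout::for_value` recovers the layout from a reference, which is
        // what `Arc` relies on in the diff above, since `T: ?Sized` has no
        // statically known size or alignment.
        let value = &mut *(ptr as *mut u64);
        *value = 42;
        assert_eq!(Layout::for_value(value).size(), layout.size());

        // Free with the matching layout, mirroring `Heap.dealloc(...)` above.
        Heap.dealloc(ptr, layout);
    }
}
```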

0 commit comments
