Lazily allocate TypedArena's first chunk #36592

Merged: 1 commit, merged on Sep 22, 2016
93 changes: 55 additions & 38 deletions src/libarena/lib.rs
@@ -15,9 +15,8 @@
//! of individual objects while the arena itself is still alive. The benefit
//! of an arena is very fast allocation; just a pointer bump.
//!
//! This crate has two arenas implemented: `TypedArena`, which is a simpler
//! arena but can only hold objects of a single type, and `Arena`, which is a
//! more complex, slower arena which can hold objects of any type.
//! This crate implements `TypedArena`, a simple arena that can only hold
//! objects of a single type.

#![crate_name = "arena"]
#![unstable(feature = "rustc_private", issue = "27812")]
@@ -51,16 +50,19 @@ use std::ptr;
use alloc::heap;
use alloc::raw_vec::RawVec;

/// A faster arena that can hold objects of only one type.
/// An arena that can hold objects of only one type.
pub struct TypedArena<T> {
/// The capacity of the first chunk (once it is allocated).
first_chunk_capacity: usize,

/// A pointer to the next object to be allocated.
ptr: Cell<*mut T>,

/// A pointer to the end of the allocated area. When this pointer is
/// reached, a new chunk is allocated.
end: Cell<*mut T>,

/// A vector arena segments.
/// A vector of arena chunks.
chunks: RefCell<Vec<TypedArenaChunk<T>>>,

/// Marker indicating that dropping the arena causes its owned
@@ -69,7 +71,7 @@ pub struct TypedArena<T> {
}

struct TypedArenaChunk<T> {
/// Pointer to the next arena segment.
/// The raw storage for the arena chunk.
storage: RawVec<T>,
}

@@ -117,26 +119,26 @@ impl<T> TypedArenaChunk<T> {
const PAGE: usize = 4096;

impl<T> TypedArena<T> {
/// Creates a new `TypedArena` with preallocated space for many objects.
/// Creates a new `TypedArena`.
#[inline]
pub fn new() -> TypedArena<T> {
// Reserve at least one page.
let elem_size = cmp::max(1, mem::size_of::<T>());
TypedArena::with_capacity(PAGE / elem_size)
}

/// Creates a new `TypedArena` with preallocated space for the given number of
/// objects.
/// Creates a new `TypedArena`. Each chunk used within the arena will have
/// space for at least the given number of objects.
#[inline]
pub fn with_capacity(capacity: usize) -> TypedArena<T> {
unsafe {
let chunk = TypedArenaChunk::<T>::new(cmp::max(1, capacity));
TypedArena {
ptr: Cell::new(chunk.start()),
end: Cell::new(chunk.end()),
chunks: RefCell::new(vec![chunk]),
_own: PhantomData,
}
TypedArena {
first_chunk_capacity: cmp::max(1, capacity),
Member:

If with_capacity isn't used, I think it'd be worth just not having first_chunk_capacity around at all.

Contributor Author:

Good suggestion. I'll file a follow-up PR to remove with_capacity once this one lands.

Member:

Well, this PR would be simpler if it also did that change, I'm saying. I'd r+ it immediately and this PR will have to wait at least half a day more before getting merged, so you have time now.

// We set both `ptr` and `end` to 0 so that the first call to
// alloc() will trigger a grow().
ptr: Cell::new(0 as *mut T),
end: Cell::new(0 as *mut T),
chunks: RefCell::new(vec![]),
_own: PhantomData,
}
}

@@ -171,29 +173,37 @@ impl<T> TypedArena<T> {
fn grow(&self) {
unsafe {
let mut chunks = self.chunks.borrow_mut();
let prev_capacity = chunks.last().unwrap().storage.cap();
let new_capacity = prev_capacity.checked_mul(2).unwrap();
if chunks.last_mut().unwrap().storage.double_in_place() {
self.end.set(chunks.last().unwrap().end());
let (chunk, new_capacity);
if let Some(last_chunk) = chunks.last_mut() {
if last_chunk.storage.double_in_place() {
self.end.set(last_chunk.end());
return;
} else {
let prev_capacity = last_chunk.storage.cap();
new_capacity = prev_capacity.checked_mul(2).unwrap();
}
} else {
let chunk = TypedArenaChunk::<T>::new(new_capacity);
self.ptr.set(chunk.start());
self.end.set(chunk.end());
chunks.push(chunk);
new_capacity = self.first_chunk_capacity;
}
chunk = TypedArenaChunk::<T>::new(new_capacity);
self.ptr.set(chunk.start());
self.end.set(chunk.end());
chunks.push(chunk);
}
}
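The reworked `grow` above can be sketched with a simplified, Vec-backed model (the `LazyArena` name and its fields are illustrative, not rustc's types): the chunk list starts empty, the first allocation creates a chunk of the configured capacity, and each subsequent chunk doubles the previous capacity.

```rust
use std::cell::RefCell;

// Hypothetical simplified model of the lazy-first-chunk scheme; this is a
// sketch of the idea in the PR, not the rustc implementation.
struct LazyArena<T> {
    first_chunk_capacity: usize,
    chunks: RefCell<Vec<Vec<T>>>,
}

impl<T> LazyArena<T> {
    fn new(first_chunk_capacity: usize) -> Self {
        LazyArena {
            first_chunk_capacity: first_chunk_capacity.max(1),
            // Like the patched TypedArena, no chunk exists until first alloc.
            chunks: RefCell::new(Vec::new()),
        }
    }

    fn alloc(&self, value: T) {
        let mut chunks = self.chunks.borrow_mut();
        let full = chunks.last().map_or(true, |c| c.len() == c.capacity());
        if full {
            // First chunk gets the configured capacity; later ones double.
            let cap = chunks
                .last()
                .map_or(self.first_chunk_capacity, |c| c.capacity() * 2);
            chunks.push(Vec::with_capacity(cap));
        }
        chunks.last_mut().unwrap().push(value);
    }
}

fn main() {
    let arena = LazyArena::new(4);
    // An unused arena owns no chunks at all.
    assert!(arena.chunks.borrow().is_empty());
    for i in 0..5 {
        arena.alloc(i);
    }
    // Five allocations: the first chunk (capacity 4) fills up, then a
    // doubled chunk is created for the fifth value.
    assert_eq!(arena.chunks.borrow().len(), 2);
}
```

This mirrors the control flow of `grow`: the `None` arm of the chunk-list check plays the role of the new `else` branch that falls back to `first_chunk_capacity`.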
/// Clears the arena. Deallocates all but the longest chunk which may be reused.
pub fn clear(&mut self) {
unsafe {
// Clear the last chunk, which is partially filled.
let mut chunks_borrow = self.chunks.borrow_mut();
let last_idx = chunks_borrow.len() - 1;
self.clear_last_chunk(&mut chunks_borrow[last_idx]);
// If `T` is ZST, code below has no effect.
for mut chunk in chunks_borrow.drain(..last_idx) {
let cap = chunk.storage.cap();
chunk.destroy(cap);
if let Some(mut last_chunk) = chunks_borrow.pop() {
self.clear_last_chunk(&mut last_chunk);
// If `T` is ZST, code below has no effect.
for mut chunk in chunks_borrow.drain(..) {
let cap = chunk.storage.cap();
chunk.destroy(cap);
}
chunks_borrow.push(last_chunk);
}
}
}
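Because the chunk list can now be empty, `clear` guards with `if let Some(..)` instead of indexing the last chunk unconditionally. The same pop/drain/push shape can be sketched on a Vec-backed model (the `clear_chunks` helper is hypothetical):

```rust
// Sketch of the guarded clear: pop the last (largest) chunk, drop its
// contents but keep its allocation, free every earlier chunk, then push
// the last chunk back for reuse. On an empty chunk list this is a no-op.
fn clear_chunks<T>(chunks: &mut Vec<Vec<T>>) {
    if let Some(mut last) = chunks.pop() {
        last.clear(); // drop elements, retain capacity
        chunks.clear(); // deallocate all earlier chunks
        chunks.push(last);
    }
}

fn main() {
    // No chunks yet (the arena was never used): clear must not panic.
    let mut chunks: Vec<Vec<i32>> = Vec::new();
    clear_chunks(&mut chunks);
    assert!(chunks.is_empty());

    // Two chunks: only the last survives, emptied but with capacity intact.
    chunks = vec![Vec::with_capacity(4), Vec::with_capacity(8)];
    chunks[1].push(7);
    clear_chunks(&mut chunks);
    assert_eq!(chunks.len(), 1);
    assert!(chunks[0].is_empty());
    assert!(chunks[0].capacity() >= 8);
}
```

The empty-list case is exactly what the old code would have hit with `chunks_borrow.len() - 1` underflowing on a never-used arena.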
@@ -230,13 +240,14 @@ impl<T> Drop for TypedArena<T> {
unsafe {
// Determine how much was filled.
let mut chunks_borrow = self.chunks.borrow_mut();
let mut last_chunk = chunks_borrow.pop().unwrap();
// Drop the contents of the last chunk.
self.clear_last_chunk(&mut last_chunk);
// The last chunk will be dropped. Destroy all other chunks.
for chunk in chunks_borrow.iter_mut() {
let cap = chunk.storage.cap();
chunk.destroy(cap);
if let Some(mut last_chunk) = chunks_borrow.pop() {
// Drop the contents of the last chunk.
self.clear_last_chunk(&mut last_chunk);
// The last chunk will be dropped. Destroy all other chunks.
for chunk in chunks_borrow.iter_mut() {
let cap = chunk.storage.cap();
chunk.destroy(cap);
}
}
// RawVec handles deallocation of `last_chunk` and `self.chunks`.
}
@@ -260,6 +271,12 @@ mod tests {
z: i32,
}

#[test]
pub fn test_unused() {
let arena: TypedArena<Point> = TypedArena::new();
assert!(arena.chunks.borrow().is_empty());
}

#[test]
fn test_arena_alloc_nested() {
struct Inner {