
Supporting references #64

Open
szunami opened this issue Nov 30, 2020 · 5 comments

Comments

@szunami commented Nov 30, 2020

Hi! I recently picked up cached and it enabled me to very quickly set up some caches for hot functions. I think this project is awesome.

As I began to optimize the rest of my codebase, it quickly became clear that cloning cached values was a bottleneck (mostly the memory allocation; I was storing Vec<String>). After some thought, I realized that my use case could live off of references to the cached values.

I believe that cached doesn't support this sort of behavior; is it something you would consider adding? I'm happy to help with the implementation / flesh out the proposal a bit if it seems in scope for this project.

Thanks for building this in the first place!

@jaemk (Owner) commented Dec 7, 2020

This is tricky since the caches generated by the macros will always need to be at the global / static level. I'll need to sit on this for a little bit while I think about it - a possible solution could be to have a version/option of the macro which requires the cached-function to always be passed an explicit cache instance to use, something looking like:

#[cached(use_argument_cache="my_cache")]
fn calculate(my_cache: &mut SizedCache<&str, usize>, input: &str) -> usize {
    input.len()
}
let mut cache = SizedCache::with_size(200);
let res = calculate(&mut cache, "big");

But this feels like something that should be done with just a helper function that copies the macro's inner logic:

fn calculate(input: &str) -> usize {
    input.len()
}

let mut cache = SizedCache::with_size(200);
let res = cached::with_cache(&mut cache, calculate, "input_str");
// except I guess it's gotta be a macro to support variadic args
let res = cached::with_cache!(&mut cache, calculate, "input_str");

Each of these would also require relaxing the constraints on the K type so that it no longer requires Clone.
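For illustration, the helper-function idea above could be sketched as a plain generic function. This is a hypothetical `with_cache` (not part of cached's current API), using a plain HashMap in place of one of cached's store types, and it still clones the value out on a hit, which is exactly the cost being discussed:

```rust
use std::collections::HashMap;
use std::hash::Hash;

// Hypothetical helper: check the cache before running `f`, store the result
// on a miss. Note it returns a clone of the cached value; avoiding that
// clone is the hard part this issue is about.
fn with_cache<K, V, F>(cache: &mut HashMap<K, V>, f: F, key: K) -> V
where
    K: Eq + Hash,
    V: Clone,
    F: Fn(&K) -> V,
{
    if let Some(v) = cache.get(&key) {
        return v.clone();
    }
    let v = f(&key);
    cache.insert(key, v.clone());
    v
}

fn calculate(input: &String) -> usize {
    input.len()
}

fn main() {
    let mut cache: HashMap<String, usize> = HashMap::new();
    let first = with_cache(&mut cache, calculate, "input_str".to_string()); // miss: runs calculate
    let second = with_cache(&mut cache, calculate, "input_str".to_string()); // hit: served from cache
    assert_eq!(first, 9);
    assert_eq!(second, 9);
}
```

Because the cache instance is passed in rather than being `static`, the borrow checker could in principle allow handing out references tied to the cache's lifetime, which is what makes this shape interesting compared to the macro.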

@szunami (Author) commented Dec 7, 2020

Gotcha, yeah I figured there were good reasons for the current constraints / approach.

Can you clarify what challenges arise because the cache is static? I don't mean to disagree; I just don't have much experience working with static variables in Rust.

@jaemk (Owner) commented Dec 10, 2020

The macros basically expand to

lazy_static! {
    static ref CACHE: Arc<Mutex<HashMap<K, V>>> = ...;
}

fn func(input: K) -> V {
    if let Some(v) = CACHE.lock().unwrap().get(&input) {
        return v.clone()
    }
    ... run logic & set value
}

So since the cache will outlive the function inputs and outputs, they need to be things that the cache can "own" and hand out .clone()s of. The Clone requirement is necessary for the cached macros, but it's not strictly necessary for the actual cache stores. Two of the store types do list Clone as a requirement, but I believe we can remove them (partly thanks to #59) so you can use the stores directly (without the macros) more freely.
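A runnable version of that expansion makes the ownership issue concrete. This sketch uses std's OnceLock in place of lazy_static and fixes the key/value types to String/usize, since a hand-written example can't stay generic over K and V:

```rust
use std::collections::HashMap;
use std::sync::{Mutex, OnceLock};

// Roughly what the macro expands to, written by hand with OnceLock instead
// of lazy_static. The cache is 'static, so it must own its keys and values
// and hand out clones of them.
fn cache() -> &'static Mutex<HashMap<String, usize>> {
    static CACHE: OnceLock<Mutex<HashMap<String, usize>>> = OnceLock::new();
    CACHE.get_or_init(|| Mutex::new(HashMap::new()))
}

fn func(input: String) -> usize {
    if let Some(v) = cache().lock().unwrap().get(&input) {
        return v.clone(); // value must be cloned out of the static cache
    }
    let v = input.len(); // ... run the real logic
    cache().lock().unwrap().insert(input, v.clone()); // ... and set the value
    v
}

fn main() {
    assert_eq!(func("hello".to_string()), 5);
    assert_eq!(func("hello".to_string()), 5); // second call is a cache hit
}
```

Returning `&V` instead would mean handing out a reference into the `'static` map while the Mutex guard has already been dropped, which the borrow checker rightly rejects; that's the core tension behind this issue.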

@Pzixel commented Jan 14, 2021

You can use Arc<RwLock<HashMap<K, Arc<V>>>> to represent references. It's not ideal, but it will work. Also, RwLock is probably a better choice than a Mutex.
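A minimal sketch of that suggestion, with the value type fixed to Vec<String> (the type from the opening comment) and a hypothetical `get_or_compute` helper: a cache hit only bumps the Arc's refcount instead of deep-cloning the vector.

```rust
use std::collections::HashMap;
use std::sync::{Arc, RwLock};

// Store Arc<V> in the map so lookups hand out cheap refcount clones
// instead of deep copies of the value.
type Cache = RwLock<HashMap<String, Arc<Vec<String>>>>;

fn get_or_compute(cache: &Cache, key: &str) -> Arc<Vec<String>> {
    // Fast path: shared read lock; Arc::clone is just a refcount bump.
    if let Some(v) = cache.read().unwrap().get(key) {
        return Arc::clone(v);
    }
    // Slow path: compute the value (stand-in logic), insert under the write lock.
    let v = Arc::new(vec![key.to_string(), key.to_uppercase()]);
    cache.write().unwrap().insert(key.to_string(), Arc::clone(&v));
    v
}

fn main() {
    let cache = Cache::default();
    let a = get_or_compute(&cache, "hi");
    let b = get_or_compute(&cache, "hi");
    assert!(Arc::ptr_eq(&a, &b)); // both handles share the same cached allocation
}
```

The RwLock lets many readers take the fast path concurrently, which suits the read-heavy access pattern of a cache better than a Mutex.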

@NobodyXu commented

@jaemk Maybe T: 'static can be used, which requires T to be either owned or to contain only references to 'static data.
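A small sketch of what that bound accepts, for readers unfamiliar with it: owned values and `&'static` references satisfy `T: 'static`, while borrows of short-lived locals do not (the `store` function here is purely illustrative).

```rust
// Illustrative only: `T: 'static` means T owns its data or borrows only
// from 'static data, so a value of type T can safely outlive any stack frame.
fn store<T: 'static>(value: T) -> T {
    value
}

fn main() {
    let owned: Vec<String> = vec!["a".to_string()];
    let lit: &'static str = "literal"; // string literals live for 'static

    let kept = store(owned); // owned data satisfies 'static
    let kept_lit = store(lit); // &'static str satisfies 'static
    assert_eq!(kept.len(), 1);
    assert_eq!(kept_lit, "literal");

    // let local = String::from("x");
    // store(&local); // would NOT compile: &local does not live for 'static
}
```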
