Simplified function memoization #1837

```diff
@@ -6,6 +6,7 @@
 package javaslang;
 
 import java.io.Serializable;
+import java.util.Map;
 
 /**
  * This is a general definition of a (checked/unchecked) function of unknown parameters and a return type R.
@@ -74,5 +75,17 @@ default boolean isMemoized() {
      * Zero Abstract Method (ZAM) interface for marking functions as memoized using intersection types.
      */
     interface Memoized {
+        static <T extends Tuple, R> R of(Map<T, R> cache, T key, Function1<T, R> tupled) {
+            synchronized (cache) {
+                if (cache.containsKey(key)) {
+                    return cache.get(key);
+                } else {
+                    final R value = tupled.apply(key);
+                    cache.put(key, value);
+                    return value;
+                }
+            }
+        }
+
     }
 }
```

Inline review comment on `Memoized.of`:

I think this is a good solution, we can do that. I currently don't have a better solution for the synchronization problem, and it is hard to test. I'm not that familiar with proper cache invalidation (weak/soft references), but we need to keep the tuples in the cache as long as the memoized function is referenced.

`new WeakHashMap<>()` to avoid memory leaks? @danieldietrich, @ruslansennov, @zsolt-donca?
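
To make that trade-off concrete, here is a rough, standalone sketch of how a weak-key cache would behave (hypothetical names, not the PR code):

```java
import java.util.Map;
import java.util.WeakHashMap;
import java.util.function.Function;

public class WeakCacheSketch {

    // WeakHashMap drops an entry as soon as its key is only weakly reachable.
    // FunctionN.memoized() creates the Tuple key internally and no caller keeps
    // a reference to it, so entries could be collected at any GC and the
    // function would silently be recomputed for the same arguments.
    static <K, V> Function<K, V> memoizeWeak(Function<K, V> f) {
        final Map<K, V> cache = new WeakHashMap<>();
        return k -> {
            synchronized (cache) {
                return cache.computeIfAbsent(k, f);
            }
        };
    }

    public static void main(String[] args) {
        final Function<Integer, Double> memo = memoizeWeak(i -> Math.random());
        final double first = memo.apply(1000);  // boxed key is only referenced by the map
        final double second = memo.apply(1000);
        // Usually true, but after a GC the entry may have been dropped,
        // in which case Math.random() runs again and returns a different value.
        System.out.println(first == second);
    }
}
```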
What about this scenario?

Yes, that's a concern, as weak references can be eliminated at any time. The problem is even more pronounced for `FunctionN` where `N ≥ 2`, as the key (tuple) is built internally, immediately discarding any other reference to it.

The above problem with the GC prematurely dropping the data could be eliminated by using soft references; for details, see this:
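
A rough sketch of what a soft-reference variant could look like (hypothetical, standalone code, not part of this PR). Values are wrapped in `SoftReference`, so they survive until the JVM runs low on memory, though they can still be cleared and recomputed:

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class SoftCacheSketch {

    // Values are held via SoftReference: the JVM keeps them as long as memory
    // allows and only clears them when it is close to running out of memory.
    // Unlike weak references this does not drop entries on every GC, but a
    // cleared value is still recomputed, so non-deterministic functions
    // (e.g. Math::random) could still return a different value later.
    static <K, V> Function<K, V> memoizeSoft(Function<K, V> f) {
        final Map<K, SoftReference<V>> cache = new HashMap<>();
        return k -> {
            synchronized (cache) {
                final SoftReference<V> ref = cache.get(k);
                V value = (ref == null) ? null : ref.get();
                if (value == null) {              // missing, cleared, or f returned null
                    value = f.apply(k);
                    cache.put(k, new SoftReference<>(value));
                }
                return value;
            }
        };
    }
    // Note: cleared entries leave stale keys behind; a real implementation
    // would purge them, e.g. via a ReferenceQueue.
}
```

Note that this only addresses the memory-leak side; it does not give the "never compute twice" guarantee discussed below.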

@paplorinc @zsolt-donca I follow @ruslansennov that the function should not compute a value twice (imagine `Math::random`). Not sure how we can ensure that a memoized function is GC'ed if it is not referenced any more but one of its cached values is still in use 🤔
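
The contract being defended here, sketched against the public javaslang API (assuming `Function0.of(...).memoized()` as in javaslang 2.x; the assertion is the point, not the exact API):

```java
import javaslang.Function0;

public class MemoContractSketch {

    public static void main(String[] args) {
        // Memoizing a non-deterministic function: the first call fixes the value.
        final Function0<Double> random = Function0.of(Math::random).memoized();

        final double first = random.apply();
        final double second = random.apply();

        // With a strongly referenced cache this always holds. With a weak cache
        // the entry could be collected between the calls, Math.random() would
        // run again, and the two values would differ.
        System.out.println(first == second); // true
    }
}
```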

I think an out-of-memory error is worse, but I'm ok with both :)

I will merge this version in. Memory leaks may also appear in other cases, e.g. when capturing the outer object in a lambda. Optimizations need to be done carefully. Actually we have a working version, which is great. Thx!
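
As an aside on the lambda-capture leak mentioned above, a minimal illustration (hypothetical names, not related to this PR): a lambda that reads an instance field captures `this` and keeps the whole outer object reachable.

```java
import java.util.function.Supplier;

public class CaptureSketch {

    static class Holder {
        final byte[] big = new byte[10_000_000]; // expensive state we would rather not retain

        Supplier<Integer> sizeSupplier() {
            // Reads the field through the instance, so the lambda captures 'this'
            // and keeps the whole Holder (including 'big') alive.
            return () -> big.length;
        }

        Supplier<Integer> detachedSizeSupplier() {
            // Copying the value first avoids capturing 'this'.
            final int length = big.length;
            return () -> length;
        }
    }

    public static void main(String[] args) {
        final Supplier<Integer> s = new Holder().sizeSupplier();
        // As long as 's' is reachable, the Holder instance cannot be collected.
        System.out.println(s.get());
    }
}
```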