feat(cache): add `memoize()` and `LruCache` #4725
base: main
Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files:

```
@@            Coverage Diff             @@
##             main    #4725      +/-   ##
==========================================
+ Coverage   92.04%   92.05%   +0.01%
==========================================
  Files         487      489       +2
  Lines       41535    41671     +136
  Branches     5365     5393      +28
==========================================
+ Hits        38230    38360     +130
- Misses       3247     3252       +5
- Partials       58       59       +1
```

☔ View full report in Codecov by Sentry.
This sounds fine to me.
```ts
set: (key: K, val: V) => unknown;
delete: (key: K) => unknown;
```
How about aligning these types with `Map<K, V>`?
```diff
- set: (key: K, val: V) => unknown;
- delete: (key: K) => unknown;
+ set: (key: K, val: V) => this;
+ delete: (key: K) => boolean;
```
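For context, a short sketch of how such an interface interacts with the built-in `Map` (the interface name here is hypothetical, not an export from the PR):

```typescript
// Hypothetical minimal cache interface. The loose `unknown` return type on
// `set` still accepts Map's `set(): this`, and `delete: boolean` matches
// Map's `delete` exactly.
interface MapAlignedCache<K, V> {
  get: (key: K) => V | undefined;
  has: (key: K) => boolean;
  set: (key: K, val: V) => unknown; // Map's set() returns this, assignable to unknown
  delete: (key: K) => boolean;      // same signature as Map's delete()
}

// A plain Map satisfies the interface structurally; no adapter is needed.
const cache: MapAlignedCache<string, number> = new Map<string, number>();
cache.set("a", 1);
console.log(cache.delete("a")); // true
```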
Or maybe how about using a Map-compatible type for the `cache` option, like `Map<unknown, ReturnType<Fn>>`? Then users could get the cache size via `memoized.cache.size`.
> Or maybe how about using a Map-compatible type for the `cache` option, like `Map<unknown, ReturnType<Fn>>`? Then users could get the cache size via `memoized.cache.size`.
`Map<unknown, ReturnType<Fn>>` is probably too restrictive, as I can't think of any reason to require things like `forEach`, `clear`, etc. to be implemented given they're not called internally, nor for implementation details like `set` returning `this`. It'd also place an unnecessary burden on consumers using non-Map-inherited cache implementations to implement new methods if the spec/TS types change.

Good point about `size` though, I'll try adding another type param to `memoize` to allow for accessing `size` and other props/methods where Map-inherited caches are used.
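To illustrate the point about looser signatures: with `unknown` return types, a non-Map-inherited cache only has to implement the methods actually called internally. A sketch (the `LooseCache` and `ObjectCache` names are illustrative, not the PR's exports):

```typescript
// Minimal cache contract using loose return types, so implementers need not
// mimic Map implementation details like `set(): this` or `delete(): boolean`.
interface LooseCache<K, V> {
  get: (key: K) => V | undefined;
  has: (key: K) => boolean;
  set: (key: K, val: V) => unknown;
  delete: (key: K) => unknown;
}

// A non-Map-inherited implementation backed by a plain object. Its `set`
// and `delete` return void, which is fine against `unknown`.
class ObjectCache<V> implements LooseCache<string, V> {
  private store: Record<string, V> = Object.create(null);
  get(key: string) { return this.store[key]; }
  has(key: string) { return key in this.store; }
  set(key: string, val: V) { this.store[key] = val; }
  delete(key: string) { delete this.store[key]; }
}
```

Under the stricter `Map<unknown, ReturnType<Fn>>` option, `ObjectCache` would have to grow `clear`, `forEach`, iterators, etc. even though nothing calls them.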
Fair enough. The `MemoizationCache` type makes sense to me now.
Types for `memoizedFn.cache` are now correct, but the internal type logic is kinda disgusting, and it's now compulsory to explicitly supply type params for `LruCache`, which is less than ideal:
```ts
const fn = (x: 1) => x;

// good (cache is Map<string, 1>)
memoize(fn);

// good (cache is Map<string, 1>)
memoize(fn, {});

// good (cache is Map<2, 1>)
memoize(fn, { getKey() { return 2 as const; } });

// ok-ish, but cache is now Map<any, any>
memoize(fn, {
  cache: new Map(),
});

// TS error: "Type 'LruCache<unknown, unknown>' is not assignable to type 'MemoizationCache<string, 1>'."
memoize(fn, {
  cache: new LruCache(99),
});

// ok but more verbose
memoize(fn, {
  cache: new LruCache<string, 1>(99),
});
```
Any ideas on how to fix this without breaking type inference elsewhere would be much appreciated. Fixing it is like whack-a-mole: everything I've tried seems to either result in the `Key` type becoming `string` where it should be something else (i.e. `getKey`'s return type is something other than `string`, yet `Key` is inferred as `string`), types getting widened to `any`, `unknown`s refusing to narrow to something more specific, `LruCache<K, V>`s getting widened to `MemoizationCache<K, V>`, etc.
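One possible shape, sketched below, is to make the cache itself a type parameter with a `Map` default so `memoized.cache` keeps the concrete type the caller supplies. This is an assumption about the design, not the PR's actual code; `memoizeSketch` and `MemoizationCacheSketch` are hypothetical names:

```typescript
// Hypothetical loose cache contract (mirrors the discussion above).
type MemoizationCacheSketch<K, V> = {
  get: (key: K) => V | undefined;
  has: (key: K) => boolean;
  set: (key: K, val: V) => unknown;
  delete: (key: K) => unknown;
};

// `C` defaults to Map<Key, ReturnType<Fn>>, so `memoized.cache` is a real
// Map (with `.size` etc.) unless the caller passes a custom cache, in which
// case `cache` keeps that concrete type.
function memoizeSketch<
  Fn extends (...args: any[]) => any,
  Key = string,
  C extends MemoizationCacheSketch<Key, ReturnType<Fn>> = Map<Key, ReturnType<Fn>>,
>(
  fn: Fn,
  opts: { cache?: C; getKey?: (...args: Parameters<Fn>) => Key } = {},
): ((...args: Parameters<Fn>) => ReturnType<Fn>) & { cache: C } {
  const cache = (opts.cache ?? new Map()) as unknown as C;
  const getKey =
    opts.getKey ?? ((...args: Parameters<Fn>) => JSON.stringify(args) as unknown as Key);
  const memoized = (...args: Parameters<Fn>): ReturnType<Fn> => {
    const key = getKey(...args);
    if (cache.has(key)) return cache.get(key) as ReturnType<Fn>;
    const result = fn(...args);
    cache.set(key, result);
    return result;
  };
  return Object.assign(memoized, { cache });
}

// Usage: the second call is served from the cache.
let calls = 0;
const double = memoizeSketch((x: number) => { calls++; return x * 2; });
double(3);
double(3); // calls === 1; double.cache is Map<string, number>
```

Whether this avoids the `LruCache<unknown, unknown>` inference failure depends on how TypeScript resolves the default against a constructor call with no explicit type arguments, so it may just move the whack-a-mole elsewhere.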
feat(cache): add `memoize()` and `LruCache`

Closes #4608

Some questions:

- Which operations should count as a "use" for `LruCache`? Currently, all three of `get`, `set`, and `has` are tracked. In other words, when deleting the "least-recently-used" entry, it means deleting the entry that was least-recently touched by any of `get`, `set`, or `has`.
- `memoize(fn)(arg)` to be memoized, with the following gotcha:
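The eviction policy from the first question above can be sketched with a `Map`, whose insertion order makes "least recently touched" cheap to track. This is illustrative only (the PR's actual `LruCache` may differ), with `get`, `set`, and `has` all refreshing recency:

```typescript
// LRU sketch: deleting and re-inserting a key moves it to the
// most-recently-used end of the Map's insertion order.
class LruCacheSketch<K, V> {
  private map = new Map<K, V>();
  constructor(private capacity: number) {}

  private touch(key: K): void {
    if (!this.map.has(key)) return;
    const val = this.map.get(key) as V;
    this.map.delete(key);
    this.map.set(key, val); // re-insert at the MRU position
  }

  get(key: K): V | undefined {
    this.touch(key);
    return this.map.get(key);
  }

  has(key: K): boolean {
    this.touch(key); // `has` also counts as a "use"
    return this.map.has(key);
  }

  set(key: K, val: V): void {
    this.map.delete(key);
    this.map.set(key, val);
    if (this.map.size > this.capacity) {
      // evict the least-recently-touched entry (first in insertion order)
      const lru = this.map.keys().next().value as K;
      this.map.delete(lru);
    }
  }

  get size(): number {
    return this.map.size;
  }
}
```

Under this policy, even a `has` probe protects an entry from eviction, which is the behavior the question asks about.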