
feat(cache): add memoize() and LruCache #4725

Open · wants to merge 9 commits into main
Conversation

@lionel-rowe (Contributor) commented May 14, 2024

Closes #4608

Some questions:

  • Which methods should count as accessing/using an entry stored in an LruCache? Currently, all three of get, set, and has are tracked. In other words, when deleting the "least-recently-used", it means deleting the entry that was least-recently touched by any of get, set, or has.
  • Is "on-the-fly" usage even worth supporting? It adds a small amount of complexity to the implementation and allows memoize(fn)(arg) to be memoized, with the following gotcha:
const fn = (n: number) => n + 1

// ✅ reference to function
const fn1 = memoize(fn)
const result1 = fn1(42)

// ✅ function literal
const fn2 = memoize((n: number) => n + 1)
const result2 = fn2(42)

// ✅ on-the-fly with reference to function
const result3 = memoize(fn)(42)

// ⚠️ on-the-fly with function literal - fails to memoize
const result4 = memoize((n: number) => n + 1)(42)
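The on-the-fly gotcha follows from how such reuse could plausibly work: the cache has to be keyed by function identity, and a fresh function literal is a new identity on every evaluation. A minimal sketch of this idea, assuming a module-level `WeakMap` (a hypothetical implementation for illustration, not the PR's actual code):

```typescript
// Sketch: "on-the-fly" memoization keyed by function identity.
// A module-level WeakMap reuses one cache per function object, so
// memoizeSketch(fn)(x) hits the cache across calls when fn is the
// same reference, but never when fn is a fresh function literal.
const cachesByFn = new WeakMap<Function, Map<string, unknown>>();

function memoizeSketch<A extends unknown[], R>(
  fn: (...args: A) => R,
): (...args: A) => R {
  let cache = cachesByFn.get(fn) as Map<string, R> | undefined;
  if (!cache) {
    cache = new Map();
    cachesByFn.set(fn, cache);
  }
  const c = cache;
  return (...args: A): R => {
    const key = JSON.stringify(args); // naive key derivation for the sketch
    if (c.has(key)) return c.get(key)!;
    const result = fn(...args);
    c.set(key, result);
    return result;
  };
}

// Same function reference across both calls: the second is a cache hit.
let calls = 0;
const addOne = (n: number) => (calls++, n + 1);
memoizeSketch(addOne)(42);
memoizeSketch(addOne)(42);
console.log(calls); // 1
```

Under this model, `memoize((n: number) => n + 1)(42)` allocates a brand-new cache each time the expression runs, which is exactly the failure mode flagged above.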

@lionel-rowe lionel-rowe requested a review from kt3k as a code owner May 14, 2024 08:19

codecov bot commented May 14, 2024

Codecov Report

Attention: Patch coverage is 95.58824%, with 6 lines in your changes missing coverage. Please review.

Project coverage is 92.05%. Comparing base (860cf0e) to head (83ee466).

File                Patch %    Lines missing coverage
cache/lru_cache.ts  92.10%     3 Missing ⚠️
cache/memoize.ts    96.93%     2 Missing, 1 Partial ⚠️
@@            Coverage Diff             @@
##             main    #4725      +/-   ##
==========================================
+ Coverage   92.04%   92.05%   +0.01%     
==========================================
  Files         487      489       +2     
  Lines       41535    41671     +136     
  Branches     5365     5393      +28     
==========================================
+ Hits        38230    38360     +130     
- Misses       3247     3252       +5     
- Partials       58       59       +1     


[Four resolved review threads on cache/memoize.ts]
@kt3k (Member) commented May 14, 2024

> Currently, all three of get, set, and has are tracked.

This sounds fine to me.
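For illustration, here is a minimal sketch of an LRU cache in which `get`, `set`, and `has` all refresh an entry's recency, exploiting `Map`'s insertion-order iteration. This is a standalone sketch of the semantics under discussion, not the PR's `LruCache` implementation:

```typescript
// Minimal LRU sketch: Map preserves insertion order, so re-inserting an
// entry on every access moves it to the "most recently used" end, and
// the first key in iteration order is the least-recently-touched one.
class LruSketch<K, V> {
  private map = new Map<K, V>();
  private maxSize: number;

  constructor(maxSize: number) {
    this.maxSize = maxSize;
  }

  private touch(key: K): void {
    if (!this.map.has(key)) return;
    const value = this.map.get(key)!;
    this.map.delete(key);
    this.map.set(key, value); // re-insert at the MRU end
  }

  get(key: K): V | undefined {
    this.touch(key);
    return this.map.get(key);
  }

  has(key: K): boolean {
    this.touch(key); // has() also counts as "using" the entry
    return this.map.has(key);
  }

  set(key: K, value: V): this {
    this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      // evict the least-recently-touched entry (first in iteration order)
      const lru = this.map.keys().next().value!;
      this.map.delete(lru);
    }
    return this;
  }

  get size(): number {
    return this.map.size;
  }
}

const cache = new LruSketch<string, number>(2);
cache.set("a", 1).set("b", 2);
cache.has("a");    // touches "a"; "b" is now least recently used
cache.set("c", 3); // evicts "b", not "a"
console.log(cache.get("b")); // undefined
console.log(cache.get("a")); // 1
```

Note that with these semantics even a negative `has()` lookup is a no-op on recency, while a positive one promotes the entry, which is the behavior the question above is asking about.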

[Resolved review thread on cache/deno.json]
Comment on lines +6 to +7
set: (key: K, val: V) => unknown;
delete: (key: K) => unknown;
@kt3k (Member) commented May 20, 2024

How about aligning these types with Map<K, V>?

Suggested change:
- set: (key: K, val: V) => unknown;
- delete: (key: K) => unknown;
+ set: (key: K, val: V) => this;
+ delete: (key: K) => boolean;

kt3k (Member) replied:

Or maybe how about using Map compatible type for cache option, like Map<unknown, ReturnType<Fn>>? Then the users can get cache size like memoized.cache.size

lionel-rowe (Contributor, Author) replied:

> Or maybe how about using Map compatible type for cache option, like Map<unknown, ReturnType<Fn>>? Then the users can get cache size like memoized.cache.size

Map<unknown, ReturnType<Fn>> is probably too restrictive, as I can't think of any reason to require things like forEach, clear, etc. to be implemented given they're not called internally, nor for implementation details like set returning this. It'd also place an unnecessary burden on consumers using non-Map-inherited cache implementations to implement new methods if the spec/TS types change.

Good point about size though, I'll try adding another type param to memoize to allow for accessing size and other props/methods where Map-inherited caches are used.

kt3k (Member) replied:

Fair enough. The MemoizationCache type makes sense to me now.
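In rough terms, the cache contract being converged on is the minimal subset of `Map` that `memoize` actually calls, with loose return types so custom caches need not mirror `Map`'s implementation details. A sketch inferred from the suggested-change diff above (`get` and `has` are assumptions from the surrounding discussion; only `set` and `delete` appear verbatim in the diff):

```typescript
// Sketch of the minimal cache contract under discussion. Return types
// for set/has/delete are deliberately loose (unknown) so that both a
// plain Map and custom caches satisfy it structurally without needing
// to return `this` or `boolean` specifically.
interface MemoizationCacheSketch<K, V> {
  get: (key: K) => V | undefined;
  set: (key: K, val: V) => unknown;
  has: (key: K) => unknown;
  delete: (key: K) => unknown;
}

// A plain Map satisfies the contract structurally, with no adapter:
const asCache: MemoizationCacheSketch<string, number> = new Map<
  string,
  number
>();
asCache.set("x", 1);
console.log(asCache.get("x")); // 1
```

The design trade-off named in the thread is visible here: widening the return types keeps third-party caches cheap to implement, at the cost of losing `Map`-only conveniences like `size` from the interface itself.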

lionel-rowe (Contributor, Author) replied:

Types for memoizedFn.cache are now correct, but the internal type logic is kinda disgusting, and it's now compulsory to explicitly supply type params for LruCache, which is less-than-ideal:

const fn = (x: 1) => x;

// good (cache is Map<string, 1>)
memoize(fn);

// good (cache is Map<string, 1>)
memoize(fn, {});

// good (cache is Map<2, 1>)
memoize(fn, { getKey() { return 2 as const } });

// ok-ish but cache is now Map<any, any>
memoize(fn, {
  cache: new Map(),
});

// TS error "Type 'LruCache<unknown, unknown>' is not assignable to type 'MemoizationCache<string, 1>'."
memoize(fn, {
  cache: new LruCache(99),
});

// ok but more verbose
memoize(fn, {
  cache: new LruCache<string, 1>(99),
});

Any ideas on how to fix this without breaking type inference elsewhere would be much appreciated. Fixing it is like whack-a-mole: everything I've tried seems to result in the Key type becoming string where it should be something else (i.e. getKey's return type is something other than string, yet Key is inferred as string), types getting widened to any, unknowns refusing to narrow to something more specific, LruCache<K, V> getting widened to MemoizationCache<K, V>, etc.

cache/memoize.ts Outdated Show resolved Hide resolved
cache/memoize.ts Show resolved Hide resolved
@iuioiua iuioiua changed the title feat(cache): add memoize and LruCache feat(cache): add memoize() and LruCache May 22, 2024
Merging this pull request may close: "suggestion: add cache package" (#4608)