
replace lru-cache dependency #697

Closed · wants to merge 8 commits

Conversation

@mbtools (Contributor) commented Apr 11, 2024

This change removes the dependency on the lru-cache package. It is replaced by a comparable, but much simpler class in this package. The cache is still set at max 1000 entries and works the same way as before.

Performance looks very similar. However, this needs a realistic test case that pushes the limits of the cache; I'll leave that judgment to those with more expertise.
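The replacement class itself isn't shown in this thread. A minimal sketch of what such a Map-based LRU cache could look like (hypothetical; the PR's actual class may differ in details) is:

```javascript
// Hypothetical sketch of a Map-based LRU cache. A JavaScript Map iterates
// keys in insertion order, so the first key is always the least recently
// used entry.
class LRUCache {
  constructor () {
    this.max = 1000 // fixed limit, matching the lru-cache setting it replaces
    this.map = new Map()
  }

  get (key) {
    const value = this.map.get(key)
    if (value === undefined) {
      return undefined
    }
    // Re-insert to mark this entry as most recently used.
    this.map.delete(key)
    this.map.set(key, value)
    return value
  }

  set (key, value) {
    this.map.delete(key)
    if (this.map.size >= this.max) {
      // Evict the least recently used entry.
      const oldest = this.map.keys().next().value
      this.map.delete(oldest)
    }
    this.map.set(key, value)
    return this
  }
}

const cache = new LRUCache()
cache.set('1.x', 'parsed range')
console.log(cache.get('1.x')) // 'parsed range'
```

Relying on Map's insertion-order iteration is what makes the class so much smaller than the full lru-cache package, which supports TTLs, size calculation, and disposal hooks that semver never used.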

References

#447, #448
#695
npm/cli#7350

@mbtools requested a review from a team as a code owner on April 11, 2024, 19:38
@wraithgar (Member)

Normally we try not to ask too much of contributors in the way of commits, and this is one of the reasons why we typically don't accept dependency-change PRs. For those, we'd prefer that the removal of lru-cache be a discrete "deps: remove lru-cache" commit so that it shows up in the changelog.

If it's not too much to ask, could you make that package.json change a single commit with an appropriate commit message? Everything else can go in a second commit with a "fix:" prefix.

@wraithgar changed the title from "chore: replace lru-cache dependency" to "replace lru-cache dependency" on Apr 11, 2024
@wraithgar (Member)

Our PR linter will pass if the PR subject or every commit has a valid conventional commit message. I've purposely made the PR subject a non-conventional one to reflect the fact that we intend to rebase-merge this PR with the deps and fix commits.

(Two review comments on classes/lrucache.js: outdated, resolved)
@mbtools mentioned this pull request on Apr 11, 2024
@wraithgar (Member)

Sorry for the confusion on the commit messages. If this is more than you want to do, go ahead and get it all in here however you want; then we can get this PR green, and we'll branch off of it and get the commits right.

(Review comment on internal/lrucache.js: outdated, resolved)
@mbtools (Contributor, Author) commented Apr 12, 2024

Back to fully working. I will let you create a new PR with two commits as required. Thx!

@wraithgar (Member)

Thank you! We'll let this sit for a bit for the community to weigh in, and we need some time to properly review it ourselves (we are in the middle of a large refactor in npm itself).

@lukekarrys (Member) left a comment

I'm in favor of this PR and converting our lru-cache to an internal dependency.

If we do land this, I think we should remove any functionality we aren't currently using in semver. Looking at the code, that is only cache.set() and cache.get() (and set will call delete()). All my specific comments are about removing things. This is just my opinion, and I'll let others weigh in first, so no need for anyone to make all these changes immediately 😄

(Five review comments on internal/lrucache.js: outdated, resolved)
@wraithgar (Member)

I'm in favor of the same, @lukekarrys. I would like to see us get to a point where we're not testing semver's internal cache directly, but only as a side effect; that is, only testing it exactly how it's used during normal semver operations. Unit testing the module itself isn't that useful: all it does is show we HAVE a working cache, not that semver uses it properly.

@lukekarrys changed the title from "replace lru-cache dependency" to "feat: replace lru-cache dependency" on Apr 15, 2024
@wraithgar changed the title from "feat: replace lru-cache dependency" to "replace lru-cache dependency" on Apr 15, 2024
@wraithgar (Member) commented Apr 15, 2024

@lukekarrys the title of this PR is intentionally not conventional so that we remember to rebase it into two commits: one for the deps change and one for the feat

@mbtools (Contributor, Author) commented Apr 15, 2024

Sounds good. We can certainly reduce the cache functions to a minimum. I just had them in there to prove to myself that it's working like lru-cache.

If we're convinced the cache is fine, I suggest removing entries(), capacity(), the snapshots, and the max type check. But keep has() and size() so that the basic operations test still completes and we have 100% coverage.

The cache is transparent outside of range parsing. Showing that semver uses it properly is, or should be, visible only through runtime measurements. I'm not sure how you'd prefer to implement those.

@wraithgar (Member) commented Apr 16, 2024

> The cache is transparent to the outside of range parse. Showing that semver uses it properly, is / should be visible only using runtime measurements. Not sure how you prefer to implement those.

Here's a proof of concept showing how we could verify that subsequent new Range instances get back the same range set.

const Semver = require('.')
const cached = Symbol('cached')
const r1 = new Semver.Range('1.0.0')
r1.set[0][cached] = true
const r2 = new Semver.Range('1.0.0')
console.log(r1.set[0][cached]) // Will be true
console.log(r2.set[0][cached]) // Will be true, showing it's cached.

We can test this all inside a t.test('cache') inside the range class tests.

@mbtools (Contributor, Author) commented Apr 17, 2024

  • Removed unnecessary cache methods & tests
  • Kept basic cache test to ensure 100% coverage
  • Added a test to range class as suggested

@wraithgar (Member)

> Kept basic cache test to ensure 100% coverage

This still runs the risk that we have code in there we never use and don't need. If we aren't using the code from range, we don't need to test it.

I think the cache size is the only thing left, right? Can we loop through 1001 versions to test that the first one dropped out of the cache? Can the max size be hard-set in lru-cache for now? There's no need to pass it in if we never set it to anything else.
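The 1001-version eviction check suggested above could be sketched like this. It is illustrative and self-contained, using a bare Map in place of the PR's cache class; a real test would drive semver's Range parsing instead:

```javascript
// Illustrative eviction check: fill a cache capped at 1000 entries with
// 1001 version strings and confirm the oldest entry was dropped.
const MAX = 1000
const cache = new Map()

const cacheSet = (key, value) => {
  cache.delete(key)
  if (cache.size >= MAX) {
    // Maps iterate in insertion order: the first key is the oldest.
    cache.delete(cache.keys().next().value)
  }
  cache.set(key, value)
}

for (let i = 0; i <= MAX; i++) {
  cacheSet(`${i}.0.0`, `range for ${i}.0.0`)
}

console.log(cache.has('0.0.0')) // false: the first version was evicted
console.log(cache.has('1.0.0')) // true: still within the last 1000
console.log(cache.size) // 1000
```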

@lukekarrys (Member)

@wraithgar We still use a coverage map in this repo. I think that will require testing the cache directly to get full coverage, right?

@wraithgar (Member)

We can edit that file.

@wraithgar (Member)

Here's an example of how we did it in npm itself when we wanted to test a component of all the commands in a new test file:

https://github.com/npm/cli/blob/21b823edf7b3095ce1f53da7befda2b41296afe4/test/coverage-map.js

@wraithgar (Member)

I think that we should fix the tests but also not block this PR on "perfect" code. I'm good with this as-is (pending an actual review of the code at this point). We can clean it up on our own time.

@wraithgar (Member)

We have not forgotten about this PR. The big npm display layer refactor is almost done and then we can come back to things like this PR.

@wraithgar (Member)

#704

@wraithgar closed this on May 1, 2024
4 participants