pathUtils | fix memory leak #418

Open · wants to merge 4 commits into base: main

Conversation

@Galcarmi commented Apr 30, 2024

Closes #414

linux-foundation-easycla bot commented Apr 30, 2024

CLA Not Signed

package.json
@@ -14,6 +14,7 @@
 },
 "dependencies": {
 "graceful-fs": "^4.2.4",
+"lru-cache": "^10.2.2",
Member

I don't think we need this big package here. You can rewrite it and put it inside the project, because we only use the max value.
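
(For context, a minimal sketch of how lru-cache looks when only the max bound is used, which is the part the reviewer suggests reimplementing in-project. The cache size and the key format below are illustrative assumptions, not values taken from this PR.)

const { LRUCache } = require("lru-cache");

// Bounded cache: once `max` entries are stored, the least recently
// used entry is evicted instead of the cache growing forever.
const joinCache = new LRUCache({ max: 2048 });

joinCache.set("/project|./lib/index.js", "/project/lib/index.js");
joinCache.get("/project|./lib/index.js"); // "/project/lib/index.js"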

Author

I'll implement it and update.

@@ -6,6 +6,7 @@
 "use strict";
 
 const path = require("path");
+const { LRUCache } = require("lru-cache");
Member

Let's use your implementation here and remove this package.

Member

@alexander-akait left a comment

Also, here is a simple and fast example:

class LRU {
    constructor(max = 2048) {
        this.max = max;
        this.cache = new Map();
    }

    get(key) {
        let item = this.cache.get(key);
        if (item !== undefined) {
            // refresh key
            this.cache.delete(key);
            this.cache.set(key, item);
        }
        return item;
    }

    set(key, val) {
        // refresh key
        if (this.cache.has(key)) this.cache.delete(key);
        // evict oldest
        else if (this.cache.size === this.max) this.cache.delete(this.first());
        this.cache.set(key, val);
    }

    first() {
        return this.cache.keys().next().value;
    }
}
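
(A small usage sketch of the class above; the max of 2 and the keys are made up for illustration.)

const lru = new LRU(2);

lru.set("a", 1);
lru.set("b", 2);
lru.get("a");     // refreshes "a", so "b" is now the oldest entry
lru.set("c", 3);  // cache is full, so the oldest entry "b" is evicted
lru.get("b");     // undefined
lru.get("a");     // 1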

@alexander-akait
Member

@vankop What do you think about it? Yeah, we have a memory leak here: if you run your application in development for a long time, our cache will keep growing, so I think using an LRU cache is a good idea, but I don't know what the best number for max would be.
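
(As an illustration of where such a bound would apply: a sketch of a memoized join backed by the LRU class above. The function and variable names here are assumptions based on the issue title, not the actual pathUtils source.)

const path = require("path");

// Hypothetical module-level cache for join results, bounded so it
// cannot grow without limit during a long-running dev session.
const joinCache = new LRU(2048);

function memoizedJoin(rootPath, request) {
    const cacheKey = `${rootPath}|${request}`;
    let result = joinCache.get(cacheKey);
    if (result === undefined) {
        result = path.join(rootPath, request);
        joinCache.set(cacheKey, result);
    }
    return result;
}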

@vankop
Member

vankop commented May 8, 2024

I personally think we don't need a cache here. Maybe in the first implementation there was some specific usage for resolver.join, but right now it is used in a lot of cases, sometimes even on each plugin call (see JoinPlugins), so the cache hit rate is probably very low. Hits basically happen only when resolving the same thing again, which is quite rare on incremental builds.

@vankop
Member

vankop commented May 8, 2024

On cold builds it does not make sense either, since the cache hit rate is very low.

@Galcarmi
Author

Galcarmi commented May 9, 2024

@alexander-akait Sorry, I'm a bit busy; I'll address it over the weekend.

@vankop So do you recommend just removing it instead?

@alexander-akait
Member

@vankop Can you test it on some repo and check it?

@vankop
Member

vankop commented May 29, 2024

We have a metrics repo; we just need to set up cache hit tracking there...
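
(A minimal sketch of the kind of instrumentation that could measure the hit rate; the counters and the reporting on process exit are assumptions, not an existing metrics setup.)

const path = require("path");

const joinCache = new Map();
let hits = 0;
let misses = 0;

// Hypothetical instrumented join: count hits and misses so the value
// of keeping (and bounding) the cache can be judged from real builds.
function measuredJoin(rootPath, request) {
    const cacheKey = `${rootPath}|${request}`;
    let result = joinCache.get(cacheKey);
    if (result !== undefined) {
        hits++;
        return result;
    }
    misses++;
    result = path.join(rootPath, request);
    joinCache.set(cacheKey, result);
    return result;
}

process.on("exit", () => {
    const total = hits + misses;
    const rate = total === 0 ? 0 : ((hits / total) * 100).toFixed(1);
    console.log(`join cache: ${hits} hits / ${misses} misses (${rate}% hit rate)`);
});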

Projects
Status: Priority - High
Development

Successfully merging this pull request may close these issues.

pathUtils | joinCache memory leak
4 participants