
Use LRU cache in useragent.lookup so rarely used useragents will not be in the memory forever. #22

Closed
tellnes opened this issue Jan 10, 2013 · 6 comments

@tellnes commented Jan 10, 2013

Great library.

You can easily replace the dictionary hash used by useragent.lookup with eg. lru-cache to resolve this. Great if the options for lru-cache are exposed to configuration in useragent.
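The suggested swap can be sketched with a small LRU built on Node's insertion-ordered Map, standing in for the lru-cache module. Note this is illustrative only: `parse`, `makeLookup`, and the limit are placeholder names, not useragent's actual API.

```javascript
// Minimal LRU cache sketch built on Map, which iterates in insertion order.
function createLRU(limit) {
  const cache = new Map();
  return {
    get(key) {
      if (!cache.has(key)) return undefined;
      const value = cache.get(key);
      // Re-insert to mark this entry as most recently used.
      cache.delete(key);
      cache.set(key, value);
      return value;
    },
    set(key, value) {
      if (cache.has(key)) cache.delete(key);
      else if (cache.size >= limit) {
        // Evict the least recently used entry (the first key in the Map).
        cache.delete(cache.keys().next().value);
      }
      cache.set(key, value);
    },
    get size() { return cache.size; }
  };
}

// Hypothetical lookup wrapper: cache parse results so rarely seen
// useragent strings eventually get evicted instead of living forever.
function makeLookup(parse, limit) {
  const lru = createLRU(limit);
  return function lookup(ua) {
    let result = lru.get(ua);
    if (result === undefined) {
      result = parse(ua);
      lru.set(ua, result);
    }
    return result;
  };
}
```

With this shape, exposing the cache limit through useragent's configuration would be a matter of passing `limit` through from the library's options.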

@3rd-Eden (Owner) commented:

Ha, the same request was made against my hashring lib today (3rd-Eden/node-hashring#8). I'll see what I can do. What would be a reasonable cache limit, 10K? (These strings can differ a lot.)


@tellnes (Author) commented Jan 10, 2013

I do not know; you should probably run some tests. The important thing is that the number is not infinite.

@3rd-Eden (Owner) commented:

I have been thinking about it, and I think an LFU might be more suitable for this, due to the wide range of different user agent strings that could invalidate the cache if the limit is set too low. It comes with a somewhat higher performance penalty than an LRU, but it might be worth it.
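For comparison, here is a minimal LFU sketch (illustrative only, not a proposal for the library's actual implementation): each entry tracks a hit count, and on overflow the least-hit entry is evicted. This naive version pays a linear scan on eviction, which is part of the performance penalty mentioned above.

```javascript
// Minimal LFU cache sketch: entries carry a hit counter, and the entry
// with the fewest hits is evicted when the cache is full.
function createLFU(limit) {
  const cache = new Map(); // key -> { value, hits }
  return {
    get(key) {
      const entry = cache.get(key);
      if (!entry) return undefined;
      entry.hits++;
      return entry.value;
    },
    set(key, value) {
      if (!cache.has(key) && cache.size >= limit) {
        // Linear scan to find and evict the least frequently used entry.
        let victim, min = Infinity;
        for (const [k, e] of cache) {
          if (e.hits < min) { min = e.hits; victim = k; }
        }
        cache.delete(victim);
      }
      cache.set(key, { value, hits: 1 });
    }
  };
}
```

The tradeoff being weighed in the thread: an LRU can thrash when the variety of user agent strings exceeds the limit, while an LFU keeps hot entries at the cost of slower bookkeeping.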

@hankmander commented:

This would be a very good feature. I would use it :)

@3rd-Eden (Owner) commented:

LRU caching greatly reduces the performance of the lookup:

Executed benchmark against node module: "useragent.lookup"
Count (11266), Cycles (3), Elapsed (5.405), Hz (221424.00861640708)

VS with LRU:

Executed benchmark against node module: "useragent.lookup"
Count (1064), Cycles (2), Elapsed (5.421), Hz (16679.68311019655)

Which is quite a big overhead, but it's still faster than a regular parse.

@3rd-Eden (Owner) commented:

Fixed in 2.0.2. I'm assuming that people would rather see memory leak prevention than increased performance.

3 participants