How to cache possibly-huge data? (100MBs) #7285
justingrant asked this question in Q&A (unanswered).
Replies: 0 comments
Our React web app is a B2B SaaS tool that analyzes datasets that can sometimes be 100+ megabytes in size. (Our customers are downloading over gigabit internet connections.)
We'd like to cache these datasets, but under no circumstances is it OK to crash the browser tab because the react-query cache is holding onto too much data. Instead, we'd like to evict this data from the cache when our app is under memory pressure. Kinda like what a WeakMap does, so users with gobs of RAM can have a fast experience and users with less RAM at least won't crash their browser tabs.
Is there a way to do this with TanStack query, i.e. to provide a best-effort level of memory caching?
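For context on what would have to be built by hand: TanStack Query evicts inactive queries by time (`gcTime`, formerly `cacheTime`), not by memory size, so a byte budget needs custom bookkeeping. A hedged sketch of that bookkeeping, with illustrative names (`Entry`, `evictOverBudget`, the 200 MB budget is an assumption), which in a real app would feed `queryClient.removeQueries({ queryKey })` for each evicted key:

```typescript
interface Entry {
  key: string;      // serialized query key
  bytes: number;    // approximate payload size
  lastUsed: number; // timestamp of last access
}

const MAX_BYTES = 200 * 1024 * 1024; // assumed 200 MB budget

// Returns the keys to evict, least-recently-used first, until the
// remaining entries fit within the byte budget.
function evictOverBudget(entries: Entry[], maxBytes = MAX_BYTES): string[] {
  const sorted = [...entries].sort((a, b) => a.lastUsed - b.lastUsed);
  let total = entries.reduce((sum, e) => sum + e.bytes, 0);
  const evicted: string[] = [];
  for (const e of sorted) {
    if (total <= maxBytes) break;
    evicted.push(e.key);
    total -= e.bytes;
  }
  return evicted;
}
```

This still isn't true memory-pressure-driven eviction (there is no standard cross-browser memory-pressure event), but it at least caps the worst case.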
Related question: how does cache persistence relate to RAM caching? Will data be held either in RAM or in the persisted cache, but not both? Or is data held in RAM and persisted? If the latter, when is it kicked out of RAM?
Finally, if you had to cache multi-hundred-megabyte datasets in a browser app, how would you do it? It may be that a RAM cache is simply the wrong tool for this job.