Currently, joblib caching allows numpy arrays to be accessed as numpy.memmap objects.
However, these numpy arrays first need to be created. In order to work with larger-than-memory arrays, I tried writing and returning a numpy.memmap object from my function. This worked even after I deleted my memmap file. However:
- I am not sure if joblib caching loads the entire array into memory anyway.
- My data is written once to my memmap file and again to the joblib cache, which reduces performance and wears down storage.
Is there some clean way to allow joblib to just copy that memmap file to its cache or otherwise efficiently cache this type of function?
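For concreteness, the pattern I have in mind looks roughly like this; the cache directory, file paths, and `build_big_array` are placeholders for illustration, not real project code:

```python
import os
import tempfile

import numpy as np
from joblib import Memory

# Illustrative cache location; mmap_mode="r" makes cached arrays
# reload as read-only numpy.memmap objects.
cache_dir = tempfile.mkdtemp()
memory = Memory(cache_dir, mmap_mode="r", verbose=0)

@memory.cache
def build_big_array(n):
    # Write the result through a temporary memmap file so it never
    # has to fit in memory all at once.
    path = os.path.join(tempfile.mkdtemp(), "big.dat")
    arr = np.memmap(path, dtype="float64", mode="w+", shape=(n,))
    arr[:] = np.arange(n)
    arr.flush()
    # joblib also serializes the returned array into its own cache,
    # so the data ends up written to disk twice.
    return arr

first = build_big_array(1000)   # miss: data written to big.dat AND the cache
second = build_big_array(1000)  # hit: reloaded from the joblib store
print(np.array_equal(second, np.arange(1000)))
```

This reproduces both concerns above: the second call is served from joblib's copy (which is why deleting the original memmap file doesn't break anything), and every result is written to storage twice.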