
Prevent spurious cache clearing when calling a cached func inside a Parallel call #1093

Merged: 12 commits into joblib:master on Aug 4, 2020
Conversation

@pierreglaser (Contributor) commented Jul 29, 2020

Fixes #1035

cc @ogrisel - I need to test a few more cases, but it should be reviewable.
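For context, here is a minimal reproducer in the spirit of #1035. This is a hedged sketch, not a test from this PR; it assumes the cached function is defined interactively (e.g. in a Jupyter notebook or __main__), which is what makes source introspection unreliable in the worker processes.

# Hedged sketch of the scenario behind #1035, not code from this PR.
from joblib import Memory, Parallel, delayed

memory = Memory("./joblib_cache", verbose=0)

@memory.cache
def costly(x):
    # Stand-in for an expensive computation.
    return x ** 2

# First call populates the cache in the parent process.
costly(3)

# Calling the cached function inside Parallel runs it in worker processes.
# Before this fix, failed source introspection in a worker could look like
# a changed function definition and spuriously clear the cache.
results = Parallel(n_jobs=2)(delayed(costly)(i) for i in range(4))
print(results)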

joblib/memory.py (outdated review comment, resolved)
@codecov codecov bot commented Aug 2, 2020

Codecov Report

Merging #1093 into master will decrease coverage by 0.61%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #1093      +/-   ##
==========================================
- Coverage   94.52%   93.91%   -0.62%     
==========================================
  Files          47       47              
  Lines        6910     6955      +45     
==========================================
  Hits         6532     6532              
- Misses        378      423      +45     
Impacted Files Coverage Δ
joblib/func_inspect.py 91.01% <ø> (-0.60%) ⬇️
joblib/memory.py 95.56% <100.00%> (+0.16%) ⬆️
joblib/test/test_memory.py 98.41% <100.00%> (-0.08%) ⬇️
joblib/backports.py 44.73% <0.00%> (-39.48%) ⬇️
joblib/test/test_store_backends.py 91.42% <0.00%> (-5.72%) ⬇️
joblib/_memmapping_reducer.py 94.33% <0.00%> (-2.27%) ⬇️
joblib/test/test_memmapping.py 97.33% <0.00%> (-1.91%) ⬇️
joblib/pool.py 86.17% <0.00%> (-1.63%) ⬇️
joblib/disk.py 90.47% <0.00%> (-1.59%) ⬇️
joblib/_parallel_backends.py 94.92% <0.00%> (-1.57%) ⬇️
... and 4 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update bc8c8cf...4438013.

@pierreglaser (Contributor, Author) commented:

@ogrisel feel free to take a look, this is reviewable :)

@ogrisel (Contributor) left a comment:

LGTM!

# an environment where the introspection utilities that get_func_code
# relies on do not work (typically, in joblib child processes).
# See #1035 for more info
# TODO (pierreglaser): do the same with get_func_name?
Contributor reply:
That's a good question. It does not seem to be required to fix #1035 though (I tried manually with the reproducer in an interactive ipython session).

Contributor reply:
I also tried to stop and restart the ipython session and the cache survived.
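As background for the snippet discussed above: get_func_code relies on source introspection, which can fail for interactively defined functions inside worker processes. Below is a rough sketch of the guard idea the comment describes, using a hypothetical helper name and the standard library rather than joblib's actual memory.py code.

# Hedged illustration of the guard idea only; not the code merged in
# joblib/memory.py, and the helper name is hypothetical.
import inspect

def is_cache_still_valid(func, cached_source):
    try:
        current_source = inspect.getsource(func)
    except (OSError, TypeError):
        # Source introspection does not work here (typically in a joblib
        # child process running an interactively defined function).
        # Treat the cache entry as valid instead of clearing it.
        return True
    return current_source == cached_source

The key point is the except branch: an introspection failure is not treated as evidence that the function changed, so the cache is left untouched.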

joblib/memory.py (outdated review comment, resolved)
@ogrisel ogrisel merged commit 43cfb4d into joblib:master Aug 4, 2020
@ogrisel (Contributor) commented Aug 4, 2020

Thank you very much @pierreglaser. Actually, I forgot to update the changelog prior to merging this. I will do it now.

Successfully merging this pull request may close these issues.

Using Memory and Parallel with cached function defined inside Jupyter notebook results in not using the cache