Run with size 30, then change to 40, run `dvc status`, then run `dvc repro` again. It's not running the pipeline, saying this:

```
Stage 'data_ingestion' is cached - skipping run, checking out outputs
Updating lock file 'dvc.lock'

To track the changes with git, run:

    git add dvc.lock

To enable auto staging, run:

    dvc config core.autostage true

Use `dvc push` to send your updates to remote storage.
```
Note that this only happens if the param name is `size` or `nfiles`. 😅
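For illustration, a minimal setup that triggers the behavior might look like the following. This is a hypothetical sketch, not the contents of the linked repo; the stage name, script name, and output path are made up — the only essential detail is that the parameter is named `size`:

```yaml
# params.yaml (hypothetical)
size: 30

# dvc.yaml (hypothetical)
stages:
  data_ingestion:
    cmd: python ingest.py
    params:
      - size
    outs:
      - data/raw
```

Renaming the parameter to anything other than `size` or `nfiles` avoids the false cache hit.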
**skshetry** changed the title *Pipeline is not executed even if parameter dep chaged* → *Pipeline is not executed for parameter with name size or nfiles* on Feb 9, 2024
We are recursively excluding `nfiles` and `size` before "hashing" for the stage cache, which is incorrect. But I have to think through what impact this can have. Most likely, we'll be able to remove `size` and `nfiles` only from outputs that are not parameter dependencies.
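The failure mode described above can be sketched in a few lines. This is a simplified model, not DVC's actual implementation: the point is only that if a recursive key filter meant for output metadata (`size`, `nfiles`) also runs over parameter values, two runs that differ only in a parameter named `size` hash to the same stage-cache key:

```python
import hashlib
import json


def _strip_keys(obj, excluded=("size", "nfiles")):
    """Recursively drop excluded keys from a nested structure.

    Intended for output metadata, but applied indiscriminately it
    also strips parameters that happen to be named 'size' or 'nfiles'.
    """
    if isinstance(obj, dict):
        return {k: _strip_keys(v, excluded)
                for k, v in obj.items() if k not in excluded}
    if isinstance(obj, list):
        return [_strip_keys(v, excluded) for v in obj]
    return obj


def stage_cache_key(stage_entry):
    """Hash the filtered stage entry to form a cache key (simplified)."""
    filtered = _strip_keys(stage_entry)
    payload = json.dumps(filtered, sort_keys=True).encode()
    return hashlib.md5(payload).hexdigest()


# Two runs differing only in the `size` parameter...
run_a = {"cmd": "python ingest.py", "params": {"size": 30}}
run_b = {"cmd": "python ingest.py", "params": {"size": 40}}

# ...collide to the same cache key, so the second run is wrongly
# treated as cached and skipped.
assert stage_cache_key(run_a) == stage_cache_key(run_b)
```

With a parameter named anything else, the values survive the filter and the keys differ, which matches the observation that only `size` and `nfiles` are affected.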
Bug Report
Description
See https://stackoverflow.com/questions/77962532/dvc-using-cached-run-although-parameter-changed
Reproduce
Use this repo: https://github.com/shcheklein/test-dvc-so-77962532
Run with size 30, then change to 40, run `dvc status`, then run `dvc repro` again. It's not running the pipeline (see the log above). File size stays the same.
Logs
Expected
Running the stage.
Environment information