gha Cache not Used With Tagged Releases #433
Comments
@nik-humphries Might be your scope key:

```yaml
- name: Build and push
  uses: docker/build-push-action@v2
  with:
    context: .
    builder: ${{ steps.buildx.outputs.name }}
    push: true
    tags: ${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:${{ steps.get_version.outputs.VERSION }}, ${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:latest
    cache-from: type=gha,scope=dev-deploy-acr
    cache-to: type=gha,mode=max,scope=dev-deploy-acr
```

Can you also post your Dockerfile?
It was nice of GitHub to die as soon as your comment came through yesterday! I didn't copy enough of the log; you are correct. This is from building v0.1.8 twice, where the log is from the second time (it reads from the cache).

Proof of cache logs
I have changed the workflow step to be as described above and am still getting the same issue. I'm tempted to make a public repo with a few parts of the Dockerfile removed and see if the issue persists. Below is the Dockerfile.

```dockerfile
# Base image https://hub.docker.com/u/rocker/
FROM rocker/shiny:4.0.3

# system libraries of general use
## install debian packages
RUN apt-get update -qq && apt-get -y --no-install-recommends install \
    libxml2-dev \
    libcairo2-dev \
    libsqlite3-dev \
    libmariadbd-dev \
    libpq-dev \
    libssh2-1-dev \
    unixodbc-dev \
    libcurl4-openssl-dev \
    libssl-dev \
    libv8-dev \
    libudunits2-dev

RUN apt-get -y update && apt-get install -y \
    libmysqlclient-dev \
    libudunits2-dev libgdal-dev libgeos-dev libproj-dev

## update system libraries
RUN apt-get update && \
    apt-get upgrade -y && \
    apt-get clean

# install renv & restore packages
COPY /app/packages.R /tmp/
RUN Rscript /tmp/packages.R

## app folder
RUN mkdir /root/Shiny
COPY /app /root/Shiny

# Install gen packages
RUN ["chmod", "o+w", "/usr/local/lib/R/site-library"]
COPY gen-packages gen-packages
RUN Rscript gen-packages/install_all_pkg.R
RUN ["R", "-e", "install.packages('https://***.tar.gz', repos = NULL , type = 'source')"]

# expose port
EXPOSE 3838

# Run the app
CMD ["R", "-e", "options('shiny.port'=3838,shiny.host='0.0.0.0');shiny::runApp('/root/Shiny')"]
```
I've produced a repo demonstrating the behaviour: https://github.com/nik-humphries/buildx-cache-test/actions. First run - v0.0.1 builds from scratch.
Sorry to bump this, but do you have any ideas about this, @crazy-max? Has this been tested with tagged releases before, or is it something that isn't supported?
@nik-humphries Will try to repro and keep you posted. Thanks.
I am running into the same issue: tagged builds don't cache. Repeated builds on the same tag do cache properly. I can provide a repo with build logs if needed.
Ok, it looks like the token provided when the workflow is triggered somehow does not have the same scope depending on the branch/tag, and so it can't find the relevant blobs because they do not exist for the current "branch" (i.e. your tag). I think if we don't find the relevant cache for the triggering branch/tag, we should check whether the parent or default branch cache exists and use it, the same way actions/cache does (cc @tonistiigi).
@crazy-max The token already has multiple scopes, with one being writable and the others readable. Our behavior should match that, so we pull cache from all scopes. E.g. in PRs you also get cache imported from the master branch. I haven't double-checked, but it seems unlikely that the same would work with release tags. What semantics would GitHub use to understand which branch a tag relates to?
Makes sense, I will compare the behaviors of both implementations and keep you posted.
https://github.com/tonistiigi/go-actions-cache/blob/master/cache.go#L196-L198 can be used to show what scopes GitHub provides access to.
Any update on this, or a workaround?
I can confirm caching in tagged releases doesn't work for some reason, however it works with […]. Our workflow for tagged releases:

```yaml
name: Docker

on:
  release:
    types:
      - published

jobs:
  php_fpm:
    name: PHP-FPM
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2.4.0
      - name: Compute API version
        id: api-version
        run: echo "::set-output name=version::$(echo $GIT_TAG | cut -f1,2 -d'.')"
        env:
          GIT_TAG: ${{ github.event.release.tag_name }}
      - name: Setup Docker Buildx
        uses: docker/setup-buildx-action@v1.6.0
      - name: Login to registry
        uses: docker/login-action@v1.12.0
        with:
          registry: cr.yandex/aaa/api-php-fpm
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
      - name: Build and push
        uses: docker/build-push-action@v2.7.0
        with:
          context: .
          file: php-fpm-prod.dockerfile
          push: true
          tags: |
            cr.yandex/aaa/api-php-fpm:latest
            cr.yandex/aaa/api-php-fpm:${{ github.event.release.tag_name }}
          build-args: |
            APP_VERSION=${{ github.event.release.tag_name }}
            API_VERSION=${{ steps.api-version.outputs.version }}
          cache-from: type=gha,scope=php-fpm
          cache-to: type=gha,scope=php-fpm

  nginx:
    name: NGINX
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2.4.0
      - name: Setup Docker Buildx
        uses: docker/setup-buildx-action@v1.6.0
      - name: Login to registry
        uses: docker/login-action@v1.12.0
        with:
          registry: cr.yandex/bbb/api-nginx
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
      - name: Build and push
        uses: docker/build-push-action@v2.7.0
        with:
          context: .
          file: nginx/Dockerfile
          push: true
          tags: |
            cr.yandex/bbb/api-nginx:latest
            cr.yandex/bbb/api-nginx:${{ github.event.release.tag_name }}
          cache-from: type=gha,scope=nginx
          cache-to: type=gha,scope=nginx
```
I also encountered this issue; any recommendation for a fix or a workaround?
Same here, caching with tagged releases is not working at all. Any update?
If GitHub doesn't provide correct scopes for these tags then it should be reported to them (same for actions/cache). Scopes can be checked with #433 (comment) or by decoding the token manually.
We have a deploy flow where anything tagged with stag-* gets deployed to staging. I found a workaround: I set up a separate 'build' workflow. This one triggers on every push, or whatever you'd like, just not on tags (you could even use tags-ignore for that); then in the deploy step I use https://github.com/marketplace/actions/wait-on-check to wait on the 'build' workflow. This way we're doing way more building (every commit instead of every tag), which we kinda don't need, but build times are far lower.

A small example, for when you have a separate workflow called build that follows the caching suggestions in this repo and that you'd like to wait on from the workflow below:

```yaml
name: deploy-staging

on:
  push:
    tags:
      - "stag-*"

jobs:
  wait-on-build:
    name: wait on build
    runs-on: ubuntu-latest
    steps:
      - name: Wait for build to succeed
        uses: lewagon/wait-on-check-action@v1.1.2
        with:
          ref: ${{ github.ref }}
          check-name: build
          repo-token: ${{ secrets.GITHUB_TOKEN }}
          wait-interval: 10
```
Original comment (resolved): Does this have anything to do with cache content? I am not using tags at present, but in my testing, if I force-push to rewrite history on a PR, I'm still seeing the cache load successfully, but I have this output from a […]. Even though I use the "path" context ([…]), when I do this I notice the cache is no longer respected until I change the cache key with another update, and then it will start to use the cache again until history changes. I can't think of any other reason that the restored cache is being ignored :\

EDIT: I thought my cache was being ignored due to some action or implicit metadata, but it turned out to be an early […].
@alextes This workaround doesn't allow you to access the repository checkout from the […].
Seeing the same issue when using the bake action as well. I was racking my brain as to what I was doing wrong until I stumbled on this ticket. Our current workflow is set to only build and publish images on tags, so this is currently a big blocker for us. Is there any update on the status of this, or another possible workaround? @alextes, how does your workaround manage to still hit the cache? Doesn't the tag workflow use a different context for the cache, which would miss even if it was built on 'main'?
@rturrado If you mean "what has been built", no, you'd have to upload and download it yourself. In our case we build and push a Docker image; later steps just send commands to a Kubernetes cluster that downloads the image. If you mean the "fresh" repository checkout, that is pretty trivial; it might happen automatically, but I'm confident there are actions for it too. @Nithos I don't quite remember, but I think my point was that you do everything that depends on a cache in the build step, and then in the tag step you only do things that don't require the cache. Like in our case, pull and run the built image on a Kubernetes cluster.
@alextes Thanks for the quick answer. I think it addressed my question, but, just to be completely sure, I'll rephrase it. At the moment I have this workflow/job organization: […]

I was wondering if, by using the workaround you mentioned in this thread, I could simplify job 2 to: […]

If this last scenario weren't possible, I understand a feasible/quite clean/interesting option would be: […]
#707 should help to figure out scope permissions.
@alextes Thank you for the clarification. So your workflow is similar to ours: make a […]. Originally I thought about using sha tags so that I can make sure which image is being re-tagged; however, that means a significant number of images would be pushed up to the registry, needing a separate maintenance process to clean things up. Will give your approach a try in the meantime, as it is a cleaner solution for the time being.
I'm having some issues with using the gha cache and don't know if there's something I'm missing. It is apparently exporting the cache.
Export Cache
But then, when trying to build again, it doesn't appear to use it. Note that the cache was exported by tag v0.1.7 and the one trying to use the cache is v0.1.8. They are built from the exact same commit.
Doesn't use cache
If I reuse the same tag (e.g. release tag v0.1.7, then delete it and re-release tag v0.1.7), it grabs from the cache as intended.
Does use the cache