gha Cache not Used With Tagged Releases #433

Closed
nik-humphries opened this issue Aug 10, 2021 · 23 comments · Fixed by #707
Labels
area/cache, kind/upstream (Changes need to be made on upstream project)

Comments

@nik-humphries commented Aug 10, 2021

I'm having some issues using the gha cache and don't know if there's something I'm missing. It does appear to be exporting the cache.

Export Cache
#18 exporting to image
#18 pushing layers 9.3s done
#18 pushing manifest for ***/nr_drainage:v0.1.8@sha256:293d8187614fe5ae73af1fa30e18ec276bb37772b8e0442e5d86bb3a7a00616b
#18 pushing manifest for ***/nr_drainage:v0.1.8@sha256:293d8187614fe5ae73af1fa30e18ec276bb37772b8e0442e5d86bb3a7a00616b 1.8s done
#18 pushing layers 0.6s done
#18 pushing manifest for ***/nr_drainage:latest@sha256:293d8187614fe5ae73af1fa30e18ec276bb37772b8e0442e5d86bb3a7a00616b
#18 pushing manifest for ***/nr_drainage:latest@sha256:293d8187614fe5ae73af1fa30e18ec276bb37772b8e0442e5d86bb3a7a00616b 0.7s done
#18 DONE 56.3s

#22 exporting cache
#22 preparing build cache for export done
#22 writing layer sha256:0f0203ecafcf0ac029c2198191ae8028ca7ae7230dbb946a307cff31753583bd
#22 writing layer sha256:0f0203ecafcf0ac029c2198191ae8028ca7ae7230dbb946a307cff31753583bd 4.6s done
#22 writing layer sha256:10e6159c56c084c858f5de2416454ac0a49ddda47b764e4379c5d5a147c9bf5f
#22 writing layer sha256:10e6159c56c084c858f5de2416454ac0a49ddda47b764e4379c5d5a147c9bf5f 0.2s done
#22 writing layer sha256:192ba9b3221fa4b50acfd5f0d1410a085379b00a5c7c63f1af5c1990897acce4
#22 writing layer sha256:192ba9b3221fa4b50acfd5f0d1410a085379b00a5c7c63f1af5c1990897acce4 2.2s done
#22 writing layer sha256:39e80151150276578c2b94a27bafb5b4b78025702699b428e7b4d14df909393e
#22 writing layer sha256:39e80151150276578c2b94a27bafb5b4b78025702699b428e7b4d14df909393e 0.2s done
#22 writing layer sha256:3a38a5065324eb257788446643418385bc807cd7c4379f6bace227e9745e82d5
#22 writing layer sha256:3a38a5065324eb257788446643418385bc807cd7c4379f6bace227e9745e82d5 0.2s done
#22 writing layer sha256:51b56d12332dedcd8ee37b12bea1f414dada566781a0c1ee6175ec54e5c403d9
#22 writing layer sha256:51b56d12332dedcd8ee37b12bea1f414dada566781a0c1ee6175ec54e5c403d9 2.3s done
#22 writing layer sha256:565a55e28edd0cd645d1ee09a2eb1174eb90056cd9e62046ec131c92951ff783
#22 writing layer sha256:565a55e28edd0cd645d1ee09a2eb1174eb90056cd9e62046ec131c92951ff783 5.2s done
#22 writing layer sha256:668caffbdcc129d34ace9aaa01f52844b50aa81ec38dd28369f0d57b7ae8c0c8
#22 writing layer sha256:668caffbdcc129d34ace9aaa01f52844b50aa81ec38dd28369f0d57b7ae8c0c8 0.2s done
#22 writing layer sha256:7b65d78f479465d24844da2bd0898bddcea6d27d2bd3a6964f88cced87604f84
#22 writing layer sha256:7b65d78f479465d24844da2bd0898bddcea6d27d2bd3a6964f88cced87604f84 0.2s done
#22 writing layer sha256:838b2dcfb9e4aa0a53a8f968f696b8437bc7451b79813a33b101552d8957d588
#22 writing layer sha256:838b2dcfb9e4aa0a53a8f968f696b8437bc7451b79813a33b101552d8957d588 0.2s done
#22 writing layer sha256:99ea233aafd83fc32f3c32cbec66cfd60ed781d5c26fd74c33c4320ea44b5669
#22 writing layer sha256:99ea233aafd83fc32f3c32cbec66cfd60ed781d5c26fd74c33c4320ea44b5669 0.2s done
#22 writing layer sha256:a70d879fa5984474288d52009479054b8bb2993de2a1859f43b5480600cecb24
#22 writing layer sha256:a70d879fa5984474288d52009479054b8bb2993de2a1859f43b5480600cecb24 1.8s done
#22 writing layer sha256:b50df580e5e95d436d9bc707840266404d5a20c079f0873bd76b4cece327cf0d
#22 writing layer sha256:b50df580e5e95d436d9bc707840266404d5a20c079f0873bd76b4cece327cf0d 0.2s done
#22 writing layer sha256:c4394a92d1f8760cf7d17fee0bcee732c94c5b858dd8d19c7ff02beecf3b4e83
#22 writing layer sha256:c4394a92d1f8760cf7d17fee0bcee732c94c5b858dd8d19c7ff02beecf3b4e83 0.2s done
#22 writing layer sha256:d614cfe64e795d7cd4437846ccc4b8e7da7eac49597a10b8b46b5c0ced4b2c19
#22 writing layer sha256:d614cfe64e795d7cd4437846ccc4b8e7da7eac49597a10b8b46b5c0ced4b2c19 2.6s done
#22 writing layer sha256:d74d771661688e157d0402fa439f318240dcb070f26632407c20669d70dd1e9c
#22 writing layer sha256:d74d771661688e157d0402fa439f318240dcb070f26632407c20669d70dd1e9c 0.2s done
#22 writing layer sha256:dfc8455ab52d21e8800fb4aa291af841849410a129649971bd8296b817fab489
#22 writing layer sha256:dfc8455ab52d21e8800fb4aa291af841849410a129649971bd8296b817fab489 1.7s done
#22 DONE 23.1s

But when trying to build again, it doesn't appear to use it. Note that the cache was exported by tag v0.1.7 and the build that should use the cache is v0.1.8. Both are constructed from the exact same commit.

Doesn't use cache
/usr/bin/docker buildx build --tag ***/nr_drainage:v0.1.8 --tag ***/nr_drainage:latest --iidfile /tmp/docker-build-push-ORC2bc/iidfile --cache-from type=gha, mode=max, scope=Dev Deploy to ACR --cache-to type=gha, mode=max, scope=Dev Deploy to ACR --push .
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 1.27kB done
#1 DONE 0.0s

#2 [internal] load .dockerignore
#2 transferring context: 2B done
#2 DONE 0.0s

#3 [internal] load metadata for docker.io/rocker/shiny:4.0.3
#3 DONE 0.8s

#8 [internal] load build context
#8 DONE 0.0s

#17 [ 1/12] FROM docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a
#17 resolve docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a done
#17 DONE 0.0s

#4 importing cache manifest from gha:3101350370151987365
#4 DONE 0.2s

#17 [ 1/12] FROM docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a
#17 ...

#8 [internal] load build context
#8 transferring context: 187.66kB 0.0s done
#8 DONE 0.0s

#17 [ 1/12] FROM docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a
#17 sha256:10e6159c56c084c858f5de2416454ac0a49ddda47b764e4379c5d5a147c9bf5f 0B / 187B 0.2s
#17 sha256:0f0203ecafcf0ac029c2198191ae8028ca7ae7230dbb946a307cff31753583bd 6.29MB / 214.86MB 0.2s
#17 sha256:d74d771661688e157d0402fa439f318240dcb070f26632407c20669d70dd1e9c 0B / 21.29kB 0.2s
#17 sha256:565a55e28edd0cd645d1ee09a2eb1174eb90056cd9e62046ec131c92951ff783 0B / 287.72MB 0.2s
#17 sha256:10e6159c56c084c858f5de2416454ac0a49ddda47b764e4379c5d5a147c9bf5f 187B / 187B 0.3s done

If I reuse the same tag (e.g. release tag v0.1.7, then delete it and re-release tag v0.1.7), it grabs from the cache as intended.

Does use the cache
/usr/bin/docker buildx build --tag ***/nr_drainage:v0.1.8 --tag ***/nr_drainage:latest --iidfile /tmp/docker-build-push-mwHmsK/iidfile --cache-from type=gha, mode=max, scope=Dev Deploy to ACR --cache-to type=gha, mode=max, scope=Dev Deploy to ACR --push .
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 1.27kB done
#1 DONE 0.0s

#2 [internal] load .dockerignore
#2 transferring context: 2B done
#2 DONE 0.0s

#3 [internal] load metadata for docker.io/rocker/shiny:4.0.3
#3 DONE 0.7s

#8 [internal] load build context
#8 DONE 0.0s

#17 [ 1/12] FROM docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a
#17 resolve docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a done
#17 DONE 0.0s

#4 importing cache manifest from gha:6286612600847900197
#4 DONE 0.3s

#8 [internal] load build context
#8 transferring context: 187.65kB 0.0s done
#8 DONE 0.0s

Behaviour

Steps to reproduce this issue

  1. Deploy using a tag trigger
  2. Deploy using a new tag

Expected behaviour

Second deploy should use cache from previous one

Actual behaviour

It doesn't use the cache

deploy yml
# Dev deployment (all tags get pushed)

name: Dev Deploy to ACR



# Controls when the workflow will run
on:
  push:
    tags:
    # Limits to all versions! Imagine that.
      - v*.*.*
      - v*.*.*-*

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "build"
  build:
    # The type of runner that the job will run on
    runs-on: ubuntu-20.04
    # Specify environment (dev/test/etc)
    environment: dev
    env:
      docker_repo_name: nr_drainage

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
    - name: Get the version
      id: get_version
      run: echo ::set-output name=VERSION::$(echo $GITHUB_REF | cut -d / -f 3)
      
      # Get release version
    - name: Checkout triggered release
      uses: actions/checkout@v2
      with:
        ref: '${{ github.ref }}'
        
      # Put all packages here. Could even make your own github actions to handle this.
    - name: Checkout and tarball up gen azurestorefuns
      uses: actions/checkout@v2
      with:
        repository: '***'
        ssh-key: '${{ secrets.AZURESTOREFUNS_READ_KEY }}'
        path: 'tmppkg'
        ref: 'v0.1.6-1'
    - run: mkdir -p gen-packages
    - run: tar -czvf gen-packages/azurestorefuns.tar.gz ./tmppkg
    - run: rm -rf ./tmppkg
    # End packages
    
    # Connect into ACR
    - name: Connect to ACR
      uses: azure/docker-login@v1
      with:
        login-server: ${{ secrets.REGISTRY_LOGIN_SERVER }}
        username: ${{ secrets.REGISTRY_USERNAME }}
        password: ${{ secrets.REGISTRY_PASSWORD }}

    # This is a separate action that sets up the buildx runner
    - name: Set up Docker Buildx
      id: buildx
      uses: docker/setup-buildx-action@v1
      with:
        version: v0.6.1
        
    - name: Build and push
      uses: docker/build-push-action@v2
      with:
        context: .
        builder: ${{ steps.buildx.outputs.name }}
        push: true
        tags: ${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:${{ steps.get_version.outputs.VERSION }}, ${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:latest
        cache-from: type=gha, mode=max, scope=${{ github.workflow }}
        cache-to: type=gha, mode=max, scope=${{ github.workflow }}

@crazy-max (Member) commented Aug 10, 2021

@nik-humphries It might be your scope key ${{ github.workflow }}, which contains spaces (Dev Deploy to ACR). Also, mode=max only works with cache-to. And I don't see any CACHED step in your "Does use the cache" logs.

    - name: Build and push
      uses: docker/build-push-action@v2
      with:
        context: .
        builder: ${{ steps.buildx.outputs.name }}
        push: true
        tags: ${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:${{ steps.get_version.outputs.VERSION }}, ${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:latest
        cache-from: type=gha,scope=dev-deploy-acr
        cache-to: type=gha,mode=max,scope=dev-deploy-acr

Can you also post your Dockerfile?

@nik-humphries (Author) commented:

It was nice of github to die as soon as your comment came through yesterday!

I didn't copy enough of the log, you are correct. This is from building v0.1.8 twice, where the log is from the second time (it reads from cache).

Proof of cache logs
/usr/bin/docker buildx build --tag ***/nr_drainage:v0.1.8 --tag ***/nr_drainage:latest --iidfile /tmp/docker-build-push-mwHmsK/iidfile --cache-from type=gha, mode=max, scope=Dev Deploy to ACR --cache-to type=gha, mode=max, scope=Dev Deploy to ACR --push .
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 1.27kB done
#1 DONE 0.0s

#2 [internal] load .dockerignore
#2 transferring context: 2B done
#2 DONE 0.0s

#3 [internal] load metadata for docker.io/rocker/shiny:4.0.3
#3 DONE 0.7s

#8 [internal] load build context
#8 DONE 0.0s

#17 [ 1/12] FROM docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a
#17 resolve docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a done
#17 DONE 0.0s

#4 importing cache manifest from gha:6286612600847900197
#4 DONE 0.3s

#8 [internal] load build context
#8 transferring context: 187.65kB 0.0s done
#8 DONE 0.0s

#11 [ 7/12] RUN mkdir /root/Shiny
#11 CACHED

#6 [ 3/12] RUN apt-get -y update && apt-get install -y      libmysqlclient-dev     libudunits2-dev libgdal-dev libgeos-dev libproj-dev
#6 CACHED

#7 [ 4/12] RUN apt-get update &&     apt-get upgrade -y &&     apt-get clean
#7 CACHED

I have changed the workflow step to be as described above and am still getting the same issue. I'm tempted to make a public repo with a few parts of the Dockerfile removed and see if the issue persists.

Below is the Dockerfile.

Dockerfile
# Base image https://hub.docker.com/u/rocker/
FROM rocker/shiny:4.0.3

# system libraries of general use
## install debian packages
RUN apt-get update -qq && apt-get -y --no-install-recommends install \
    libxml2-dev \
    libcairo2-dev \
    libsqlite3-dev \
    libmariadbd-dev \
    libpq-dev \
    libssh2-1-dev \
    unixodbc-dev \
    libcurl4-openssl-dev \
    libssl-dev \
    libv8-dev \
    libudunits2-dev

RUN apt-get -y update && apt-get install -y  \
    libmysqlclient-dev \
    libudunits2-dev libgdal-dev libgeos-dev libproj-dev

## update system libraries
RUN apt-get update && \
    apt-get upgrade -y && \
    apt-get clean


# install renv & restore packages
COPY /app/packages.R /tmp/
RUN Rscript /tmp/packages.R

## app folder
RUN mkdir /root/Shiny
COPY /app /root/Shiny


# Install gen packages
RUN ["chmod", "o+w", "/usr/local/lib/R/site-library"]
COPY gen-packages gen-packages
RUN Rscript gen-packages/install_all_pkg.R

RUN ["R", "-e", "install.packages('https://***.tar.gz', repos = NULL , type = 'source')"]


#expose port 
EXPOSE 3838

#Run the app
CMD ["R", "-e", "options('shiny.port'=3838,shiny.host='0.0.0.0');shiny::runApp('/root/Shiny')"]

@nik-humphries (Author) commented:

I've produced a repo demonstrating the behaviour.

https://github.com/nik-humphries/buildx-cache-test/actions

First run - v0.0.1 builds from scratch
Second run - v0.1.2 builds from scratch, but shouldn't
Third run - v0.1.2 builds from cache

@nik-humphries (Author) commented:

Sorry to bump this, but do you have any ideas about this, @crazy-max? Has this been tested with tagged releases before, or is it something that isn't supported?

@crazy-max (Member) commented:

@nik-humphries Will try to repro and keep you posted. Thanks.

@jisensee commented Sep 7, 2021

I am running into the same issue: tagged builds don't cache, while repeated builds on the same tag do cache properly. I can provide a repo with build logs if needed.

@crazy-max (Member) commented:

Ok, it looks like the token provided when the workflow is triggered does not have the same scope depending on the branch/tag, so it can't find the relevant blobs because they don't exist for the current "branch" (i.e. your tag). I think if we don't find the relevant cache for the triggering branch/tag, we should check whether the parent or default branch cache exists and use it, the same way actions/cache does (cc @tonistiigi).

crazy-max added the kind/upstream (Changes need to be made on upstream project) label on Oct 5, 2021
@tonistiigi (Member) commented:

@crazy-max The token already has multiple scopes, one writable and the others readable. Our behavior should match that, so we pull cache from all scopes; e.g. in PRs you also get cache imported from the master branch. I haven't double-checked, but it seems unlikely the same would work with release tags. What semantics would GitHub use to understand that v0.1.7 is a parent of v0.1.8 (it can't look at the commits)? If the scope isn't provided by GitHub, then we can't access any other scope because we don't have a token for it. It should be the same behavior as actions/cache.

@crazy-max (Member) commented:

Makes sense, I will compare the behavior of both implementations and keep you posted.

@tonistiigi (Member) commented:

https://github.com/tonistiigi/go-actions-cache/blob/master/cache.go#L196-L198 can be used to show which scopes GitHub provides access to.
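
For anyone who wants to check this on their own workflow, here is a minimal sketch (not from this thread) of a job that dumps the cache scopes granted to a run. It assumes the runtime token is exposed to run steps via crazy-max/ghaction-github-runtime and that, as in go-actions-cache, the scopes sit in the token's JWT payload; the action version and exact claim layout are assumptions to verify, and the raw token must never be printed since it is a credential.

name: dump-cache-scopes

on: [push]

jobs:
  scopes:
    runs-on: ubuntu-latest
    steps:
      # Assumed helper: exposes ACTIONS_RUNTIME_TOKEN / ACTIONS_CACHE_URL to run steps.
      - uses: crazy-max/ghaction-github-runtime@v2
      - name: Decode token payload
        run: |
          # The token is a JWT: header.payload.signature. Decode only the payload,
          # which contains the scope claims; never echo the token itself.
          payload=$(echo "$ACTIONS_RUNTIME_TOKEN" | cut -d. -f2 | tr '_-' '/+')
          # Restore base64 padding so base64 -d accepts the payload.
          while [ $(( ${#payload} % 4 )) -ne 0 ]; do payload="${payload}="; done
          echo "$payload" | base64 -d | jq .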

@Ashniu123 commented:

Any update on this or a workaround?

@flaksp commented Dec 21, 2021

I can confirm that caching in tagged releases doesn't work for some reason; it does work with on: push workflows, however.

Our workflow for tagged releases:

name: Docker

on:
  release:
    types:
      - published

jobs:
  php_fpm:
    name: PHP-FPM
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v2.4.0

      - name: Compute API version
        id: api-version
        run: echo "::set-output name=version::$(echo $GIT_TAG | cut -f1,2 -d'.')"
        env:
          GIT_TAG: ${{ github.event.release.tag_name }}

      - name: Setup Docker Buildx
        uses: docker/setup-buildx-action@v1.6.0

      - name: Login to registry
        uses: docker/login-action@v1.12.0
        with:
          registry: cr.yandex/aaa/api-php-fpm
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Build and push
        uses: docker/build-push-action@v2.7.0
        with:
          context: .
          file: php-fpm-prod.dockerfile
          push: true
          tags: |
            cr.yandex/aaa/api-php-fpm:latest
            cr.yandex/aaa/api-php-fpm:${{ github.event.release.tag_name }}
          build-args: |
            APP_VERSION=${{ github.event.release.tag_name }}
            API_VERSION=${{ steps.api-version.outputs.version }}
          cache-from: type=gha,scope=php-fpm
          cache-to: type=gha,scope=php-fpm

  nginx:
    name: NGINX
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v2.4.0

      - name: Setup Docker Buildx
        uses: docker/setup-buildx-action@v1.6.0

      - name: Login to registry
        uses: docker/login-action@v1.12.0
        with:
          registry: cr.yandex/bbb/api-nginx
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Build and push
        uses: docker/build-push-action@v2.7.0
        with:
          context: .
          file: nginx/Dockerfile
          push: true
          tags: |
            cr.yandex/bbb/api-nginx:latest
            cr.yandex/bbb/api-nginx:${{ github.event.release.tag_name }}
          cache-from: type=gha,scope=nginx
          cache-to: type=gha,scope=nginx

@avibash commented Jan 12, 2022

I also encountered this issue. Any recommendation for a fix or a workaround?

@antonioparraga commented:

Same here, caching with tagged releases is not working at all. Any update?

@tonistiigi (Member) commented:

If GitHub doesn't provide the correct scopes for these tags, then it should be reported to them (same for actions/cache). Scopes can be checked with #433 (comment) or by decoding the token manually.

@alextes
Copy link
Contributor

alextes commented Jul 7, 2022

We have a deploy flow where anything tagged with stag- or prod- is deployed to those respective environments. That means that every time we need to build an image to deploy, there is a tag, and thus no cache 😔 .

I found a workaround. I set up a separate 'build' workflow that triggers on every push, or whatever you'd like, just not on tags (you could even use tags-ignore for that). Then, in the deploy workflow, I use https://github.com/marketplace/actions/wait-on-check to wait on the 'build' workflow. This way we're doing far more building (every commit instead of every tag), which we don't strictly need, but build times are far lower.

A small example for when you have a separate workflow called build that follows the caching suggestions in this repo and that you'd like to wait on, as in the deploy workflow below:

name: deploy-staging

on:
  push:
    tags:
      - "stag-*"

jobs:
  wait-on-build:
    name: wait on build
    runs-on: ubuntu-latest
    steps:
      - name: Wait for build to succeed
        uses: lewagon/wait-on-check-action@v1.1.2
        with:
          ref: ${{ github.ref }}
          check-name: build
          repo-token: ${{ secrets.GITHUB_TOKEN }}
          wait-interval: 10
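
For completeness, a minimal sketch of what the companion build workflow could look like. The registry, image name and action versions below are illustrative rather than taken from this thread; the cache-from/cache-to lines follow the usual type=gha suggestions, with a fixed scope so every branch build reads and writes the same cache.

name: build

on:
  push:
    # A branches filter means tag pushes do not trigger this workflow
    # (tags-ignore would work too).
    branches:
      - "**"

jobs:
  build:
    # The job name is what wait-on-check-action matches via check-name.
    name: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2

      - name: Login to registry
        uses: docker/login-action@v2
        with:
          registry: registry.example.com
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Build and push
        uses: docker/build-push-action@v3
        with:
          context: .
          push: true
          tags: registry.example.com/my-app:latest
          cache-from: type=gha,scope=build
          cache-to: type=gha,mode=max,scope=build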

@polarathene commented Aug 22, 2022

Original comment (resolved)

Does this have anything to do with cache content? I am not using tags at present, but in my testing, if I force-push to rewrite history on a PR, I still see the cache load successfully, yet build-push-action ignores it.

I have this output from a du -bch /path/to/cache/* command for the directory the cache is restored to:

329M	/tmp/.buildx-cache/blobs/sha256
329M	/tmp/.buildx-cache/blobs
246	/tmp/.buildx-cache/index.json
4.0K	/tmp/.buildx-cache/ingest

Even though I use the "path" context (context: .) instead of the Git context, I get the impression that some metadata is associated with the earlier run or commit history. I am only modifying my workflow on the main branch and rebasing the PR that triggers it onto that update.

When I do this, I notice the cache is no longer respected until I change the cache key with another update; it then uses the cache again until history changes.

I can't think of any other reason that restored cache is being ignored :\

EDIT: I thought my cache was being ignored due to some action or implicit metadata, but it turned out to be an early ARG directive that invalidated the cache when its value changed (even if not explicitly used). The Dockerfile reference docs explain that an ARG is implicitly used by subsequent RUN instructions, invalidating those layers.

@rturrado commented Oct 17, 2022

@alextes This workaround doesn't let you access the repository checkout from the build workflow in the deploy-staging workflow, does it?

@Nithos commented Oct 18, 2022

Seeing the same issue when using the bake action as well. I was racking my brain as to what I was doing wrong until I stumbled on this ticket. Our current workflow is set to only build and publish images on tags, so this is currently a big blocker for us. Is there any update on the status of this, or another possible workaround?

@alextes How does your workaround manage to still hit the cache? Doesn't the tag workflow still use a different context for the cache, which would miss even if the image was built on 'main'?

@alextes (Contributor) commented Oct 19, 2022

@rturrado If you mean "what has been built", no, you'd have to upload and download it yourself. In my case we build and push a Docker image; later steps just send commands to a Kubernetes cluster that downloads the image. If you mean a "fresh" repository checkout, that is pretty trivial; it might happen automatically, but I'm confident there are actions for it as well.

@Nithos I don't quite remember, but I think my point was that you do everything that depends on a cache in the build step, and then in the tag step you only do things that don't require the cache. Like in our case: pull and run the built image on a Kubernetes cluster.

@rturrado commented:

@alextes Thanks for the quick answer. I think it addressed my question, but, just to be completely sure, I'll rephrase it.

At the moment I have this workflow/job organization:

  • User:
    • Git pushes on branch B.
    • Git pushes tag B-vx.y.z.
  • GitHub Actions workflow A / job 1:
    • Triggered by a push on branch B.
    • Checks out MyRepository.
    • Builds it.
    • Runs some tests.
  • GitHub Actions workflow B / job 2:
    • Triggered by a pushed tag 'B-v**'.
    • Checks out MyRepository.
    • Builds it.
    • Builds a Docker image and publishes it (the Dockerfile copies some build output binaries from the repository checkout into the image).

I was wondering if, by using the workaround you mentioned in this thread, I could simplify job 2 to:

  • GitHub Actions workflow B / job 2':
    • Triggered by a pushed tag 'B-v**'.
    • Waits for workflow A / job 1 to finish.
    • Builds a Docker image and publishes it (the Dockerfile has access to the repository checked out by job 1).

If this last scenario weren't possible, I understand a feasible/quite clean/interesting option would be:

  • Job 1 uploads an artifact with the build output binaries needed by job 2.
  • Job 2 waits for job 1 to finish and downloads the artifact.
  • The Dockerfile copies the artifact into the image.

@crazy-max (Member) commented:

#707 should help to figure out scope permissions.

@Nithos commented Oct 20, 2022

@alextes Thank you for the clarification. So your workflow is similar to ours: a build workflow that builds the image and pushes it as latest (where the cache works), then a follow-up workflow that triggers on tag to pull and re-tag it with the version number. The piece I missed originally is that you added the wait-on to remedy the possibility of both workflows being triggered at the same time, with the re-tag pulling an older version of the image before the current build run finishes.

Originally I thought about using SHA tags so that I could be sure which image is being re-tagged; however, that means a significant number of images being pushed to the registry, which would need a separate maintenance process to clean up.

Will give your approach a try in the meantime, as it is a cleaner solution for the time being.
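
A rough sketch of what such a tag-triggered re-tag workflow could look like, in case it helps anyone else. The registry and image name are illustrative; docker buildx imagetools create copies the existing manifest (including multi-arch images) to a new tag without rebuilding, and the wait step reuses the check-name idea from the earlier example.

name: release-retag

on:
  push:
    tags:
      - "v*.*.*"

jobs:
  retag:
    runs-on: ubuntu-latest
    steps:
      # Make sure the build for this commit has finished before re-tagging.
      - name: Wait for build to succeed
        uses: lewagon/wait-on-check-action@v1.1.2
        with:
          ref: ${{ github.sha }}
          check-name: build
          repo-token: ${{ secrets.GITHUB_TOKEN }}
          wait-interval: 10

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2

      - name: Login to registry
        uses: docker/login-action@v2
        with:
          registry: registry.example.com
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Re-tag latest with the release version
        run: |
          # Copies the manifest already pushed by the build workflow under a new tag.
          docker buildx imagetools create \
            --tag registry.example.com/my-app:${GITHUB_REF_NAME} \
            registry.example.com/my-app:latest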
