Regression using ubuntu linux/amd64 host with linux/386 container #7695
Comments
Hey @molinav. Thank you for reporting. We will investigate it.
Hey @molinav. We updated some underlying infrastructure that may relate to this issue. Could you try running your workflow again?
Hi @vpolikarpov-akvelon. Unfortunately the problem is still triggered (Runner Image Provisioner is now 2.0.226.1), see below:
I also tested on my personal computer (Windows 10 Pro x64, WSL with Debian 11) to compare the behaviour outside the GitHub runners. On Windows with Docker Desktop + Linux containers:
On WSL with Docker CLI:
To keep this alive, I have been re-running the same workflows whenever new runner images became available, and the exact same problem persists in all of them (last runner version tested was …).
Hey, @molinav. I have carefully investigated the information you provided once more. I noticed that there were only three successful builds, on May 4 and May 5. Two weeks later, on May 18, the Docker image itself was updated.

In any case, the runner pulls the image using a plain `docker pull` call. Regarding your local PC, the reason you can pull the image without explicitly specifying the platform may be due to the Docker version; the Docker version on GitHub-hosted runners may differ from the one your Docker Desktop provides.

Considering all this information, I don't believe it is related to the runner image update. If there is something I overlooked, please let us know in the comments.
Thanks for the feedback, @vpolikarpov-akvelon! The Docker image update on May 18 should be related to a rebuild of the same Dockerfile with the latest Python versions built from source (very likely the Python patch versions were different for the still-supported Python versions).

As you indicated, the runner issues a plain `docker pull` with no platform argument, so the host platform (`linux/amd64`) is assumed by default, and the container initialisation is aborted because no amd64 images are found in the registry.

I hope that I could clarify a bit better the behaviour that I was seeing at the beginning of May with respect to the behaviour that I see since June. Could it be that the Docker version has changed, and the latest Docker version in the runner images has a different behaviour on what to do in these multi-platform cases? My tests on Windows and WSL2 were done with Docker Desktop, which is providing Docker 24.0.2 at the moment.

It seems that it is possible to bypass the default Docker platform used when pulling through the `DOCKER_DEFAULT_PLATFORM` environment variable, but I could not figure out how to do this (if it is even possible), because my exported environment variables are only being set after the creation of the job container.
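Editor's note: since the job container is created before any workflow `env` is exported, a possible way around this limitation (a sketch for illustration only, not proposed in the thread; the build command inside the container is hypothetical) is to drop the job-level `container:` key and run the 32-bit image from a regular step, where `--platform` can be passed explicitly:

```yaml
# Sketch only: run the linux/386 image from a step instead of the
# job-level `container:` key, so the platform can be forced on pull/run.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Build inside a linux/386 container
        run: |
          docker run --rm --platform linux/386 \
            -v "$PWD:/src" -w /src \
            pylegacy/x86-python:3.6-debian-4 \
            python setup.py bdist_wheel   # hypothetical build command
```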
Well, I tried to revert the moby-engine upgrade that took place here on a VM created from the runner image, and it helped indeed. It looks like the old behaviour was preserved until that version.

I didn't find any related changes in the moby-engine changelog, but I think it may be caused by the update of a dependent package. We can't pin the version of the moby-engine package, therefore the only way for you to restore the functionality you have lost is to request it in the runner repo. I think this feature may be re-requested taking into account the new information and recent spec updates.

As a workaround I can suggest pinning the image digests, like this:

```yaml
build-geos:
  strategy:
    matrix:
      image:
        - "pylegacy/x64-python:3.6-debian-4@sha256:41f8377e5294575bae233cc2370c6d4168c4fa0b24e83af4245b64b4940d572d"
        - "pylegacy/x86-python:3.6-debian-4@sha256:91bc1c1b2e60948144cc32d5009827f2bf5331c51d43ff4c4ebfe43b0b9e7843"
```

It's quite dumb, I know, but I don't see any other options for now.
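Editor's note: for completeness (not part of the original comment), the digest that a tag currently resolves to can be looked up with standard Docker tooling before pinning it in the matrix, for example from a local shell or a throwaway workflow step:

```yaml
# Hypothetical helper step: print the manifest(s) and digest that the
# tag currently resolves to, so the sha256 can be copied into the matrix.
- name: Show image digest
  run: docker buildx imagetools inspect pylegacy/x86-python:3.6-debian-4
```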
Thanks for your detailed analysis, @vpolikarpov-akvelon. I am currently inspecting other sources of issues, since my naive rebuild of the Docker images (which you pointed out yesterday) could also have caused some impact which I was not aware of. It seems that recent BuildKit releases changed the default build output, and I could find similar issues and pull requests from the last weeks (docker/buildx#1533, open-policy-agent/opa#6052, freelawproject/courtlistener#2830 (comment)). I am currently rebuilding my Docker images with the `--provenance=false` flag to see if this restores the old behaviour.
@vpolikarpov-akvelon I think I can confirm the source of the issue, and it is not related to any change in the runner images themselves.
With the old Docker versions, amd64 hosts can run i386 images if the pulled tag only offers a `linux/386` manifest.

The old BuildKit generates single-platform images with a plain image manifest, while the new BuildKit attaches provenance attestations by default, which changes the pushed artifact into a manifest list that the updated engine refuses to match on an amd64 host. The current workaround that solves my problem is to force the old single-manifest output: I rebuilt my Docker images with the `--provenance=false` flag, and the workflows are able to create the `linux/386` job containers again.

Thanks for your effort and time, @vpolikarpov-akvelon!
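Editor's note: the rebuild described above can be sketched as a workflow step like the following (the `--provenance=false` flag comes from the linked docker/buildx#1533 discussion; the tag and build context are illustrative):

```yaml
# Sketch: disable the default provenance attestations so that buildx
# pushes a plain single-platform manifest instead of a manifest list.
- name: Rebuild 32-bit image without provenance
  run: |
    docker buildx build \
      --platform linux/386 \
      --provenance=false \
      --tag pylegacy/x86-python:3.6-debian-4 \
      --push .
```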
@molinav, thank you for the solution and detailed explanation. As the problem seems to be resolved now, I'm closing this thread. Feel free to reach out again if you have other problems or questions.
Description
I observed that one of the project workflows I maintain is no longer able to build 32-bit packages on 64-bit GNU/Linux hosts, and the only thing that has changed is the GitHub runner image version: `ubuntu-latest` was able to use a container based on a pulled image with only `linux/386` as available arch, and this was allowing a 32-bit isolated environment to build 32-bit libraries on a 64-bit host. So a `linux/amd64` host was able to handle the use of `linux/386` when the Docker registry was not offering a `linux/amd64` image.

A `linux/amd64` host now only allows pulling `linux/amd64` images, so when a user now provides a registry with only `linux/386` images, the runner will complain because it cannot find any `linux/amd64` image to pull (and it will not try to pull the `linux/386` image).

Passing the `--platform` option together with the container setup is not an option, because this option and its argument are not passed to the `docker pull` call during the container preparation, and an issue pointing to this problem was closed long ago (actions/runner#648).

Platforms affected
Runner images affected
Image version and build link
Before (working, 20230426.1): https://github.com/matplotlib/basemap/actions/runs/4884953600/jobs/8718596379
Now (failing, 20230517.1): https://github.com/matplotlib/basemap/actions/runs/5218138554/jobs/9418704735
Is it regression?
Yes, because with runner image version 20230426.1 it was working.
Expected behavior
The `ubuntu-latest` 64-bit runners should be able to run `linux/386` containers as before.

Actual behavior
The `ubuntu-latest` 64-bit runners are failing because they do not identify `linux/386` as a valid architecture.

Repro steps
The workflow below reproduces the bug:
https://github.com/matplotlib/basemap/blob/v1.3.7/.github/workflows/basemap-for-manylinux.yml
In particular, the following job is enough, it does not even start because the container cannot be created:
https://github.com/matplotlib/basemap/blob/v1.3.7/.github/workflows/basemap-for-manylinux.yml#LL78-L125
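The failing shape can be distilled into a minimal sketch (editor's illustration, not from the thread; the image tag is taken from the discussion above and the step body is illustrative):

```yaml
# Minimal reproduction sketch: the tag below only publishes a
# linux/386 manifest, so the amd64 runner fails while creating
# the job container, before any step runs.
jobs:
  build-geos:
    runs-on: ubuntu-latest
    container: "pylegacy/x86-python:3.6-debian-4"
    steps:
      - run: uname -m   # never reached; container creation aborts
```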