Build arguments not replaced #735

Open
nicolabovolato opened this issue Mar 22, 2024 · 9 comments

Labels: triage Investigation required

Comments

@nicolabovolato

Expected Behaviour
Given a Dockerfile that declares build arguments, I expected them to be substituted when passed via withBuildArgs() on GenericContainerBuilder.

For example, building a Dockerfile for a Node.js monorepo:

const container = await GenericContainer.fromDockerfile()
  .withBuildArgs({
    WORKSPACE_NAME: ...,
    WORKSPACE_PATH: ...,
  })
  .build();

Dockerfile:

FROM node:20-alpine AS base
ENV PNPM_HOME=/root/.local/share/pnpm
ENV PATH=$PATH:$PNPM_HOME
RUN npm i -g pnpm@8.x.x
RUN apk add --no-cache libc6-compat
RUN pnpm i -g turbo

WORKDIR /usr/app
ARG WORKSPACE_NAME
ARG WORKSPACE_PATH

FROM base as pruned
COPY . .
RUN turbo prune --scope $WORKSPACE_NAME --docker

FROM base as builder
COPY --from=pruned /usr/app/out/json ./
COPY --from=pruned /usr/app/out/pnpm-lock.yaml ./
RUN pnpm install --frozen-lockfile
COPY turbo.json turbo.json
COPY --from=pruned /usr/app/out/full .
RUN pnpm run build --filter=$WORKSPACE_NAME

FROM base as installer-prod
COPY --from=builder /usr/app ./
RUN pnpm install --frozen-lockfile --prod --filter=$WORKSPACE_NAME

FROM base as prod
COPY --from=installer-prod /usr/app ./
WORKDIR /app/$WORKSPACE_PATH

ENV NODE_ENV "production"
CMD ["pnpm", "start"]

Actual Behaviour
Looking at the source code, build arguments are only substituted if they appear in the FROM clause.
Also, that regex doesn't look right.

const buildArgRegex = /\${([^{]+)}/g;

export async function getDockerfileImages(dockerfile: string, buildArgs: BuildArgs): Promise<ImageName[]> {
  try {
    return (await parseImages(dockerfile))
      .map((line) => line.replace(buildArgRegex, (_, arg) => buildArgs[arg] ?? ""))
      .map((line) => ImageName.fromString(line.trim()));
  } catch (err) {
    log.error(`Failed to read Dockerfile "${dockerfile}": ${err}`);
    throw err;
  }
}

async function parseImages(dockerfile: string): Promise<string[]> {
  return Array.from(
    (await fs.readFile(dockerfile, "utf8"))
      .split(/\r?\n/)
      .filter((line) => line.toUpperCase().startsWith("FROM"))
      .map((line) => line.split(" ").filter(isNotEmptyString)[1])
      .reduce((prev, next) => prev.add(next), new Set<string>())
      .values()
  );
}
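
For what it's worth, a quick sketch (hypothetical values) of what that regex does and does not substitute - it only handles the braced ${VAR} form:

const buildArgRegex = /\${([^{]+)}/g;
const buildArgs: Record<string, string> = { WORKSPACE_NAME: "bench-http-fastify" };

// Helper for illustration only
const substitute = (line: string) =>
  line.replace(buildArgRegex, (_: string, arg: string) => buildArgs[arg] ?? "");

substitute("FROM ${WORKSPACE_NAME}");       // "FROM bench-http-fastify"
substitute("FROM $WORKSPACE_NAME");         // unchanged - the brace-less form is not matched
substitute("FROM ${WORKSPACE_NAME:-node}"); // "FROM " - the default-value form is not recognized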

Steps to Reproduce
Try to programmatically build an image that uses a build arg anywhere other than in the FROM clause.
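
For example, a minimal sketch of such a reproduction (hypothetical image tag and arg name; assumes an ESM context with top-level await):

import { mkdtempSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";
import { GenericContainer } from "testcontainers";

// Hypothetical two-stage Dockerfile: the ARG is declared in the first stage
// and consumed in a later stage.
const context = mkdtempSync(join(tmpdir(), "buildargs-repro-"));
writeFileSync(
  join(context, "Dockerfile"),
  [
    "FROM alpine:3.19 AS base",
    "ARG MESSAGE",
    "",
    "FROM base AS final",
    'RUN echo "MESSAGE=$MESSAGE" && test -n "$MESSAGE"',
    'CMD ["sleep", "infinity"]',
  ].join("\n")
);

// The RUN step sees an empty $MESSAGE even though a value is passed here,
// so the build fails at `test -n "$MESSAGE"`.
await GenericContainer.fromDockerfile(context)
  .withBuildArgs({ MESSAGE: "hello" })
  .build("buildargs-repro");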

Environment Information

  • Operating System: macOS Sonoma
  • Docker Version: v25.0.3
  • Node version: v18.16.0
  • Testcontainers version: v10.7.2
@cristianrgreco
Collaborator

Hi @nicolabovolato,

Looking at the source code, build arguments are only substituted if they appear in the FROM clause.
Also, that regex doesn't look right.

The code link you shared is unrelated. That code parses a list of Docker image names which are later used for authentication. We need the build arguments there just in case the image name is a build arg.

The build arguments relevant for you are sent directly to the Docker daemon here:

await client.image.build(this.context, {
  t: imageName.string,
  dockerfile: this.dockerfileName,
  buildargs: this.buildArgs,
  pull: this.pullPolicy ? "true" : undefined,
  nocache: !this.cache,
  registryconfig: registryConfig,
  labels,
  target: this.target,
});

Could you please share what exactly is going wrong with your implementation, perhaps share the logs as well?

@cristianrgreco cristianrgreco added the triage Investigation required label Mar 22, 2024
@nicolabovolato
Author

My bad @cristianrgreco, I should have looked a bit further into that function.

Anyway, I'm running containerized benchmarks inside a monorepo.
The Dockerfile is the same as above, and the source code is an iteration of this logic:

const workspaceName = `bench-${group}-${pkg}`;
const workspacePath = path.relative(
  rootDir,
  path.join(__dirname, group, pkg)
);

logger.info(
  `Building dockerfile with workspace ${workspaceName} and path ${workspacePath}...`
);
const container = (
  await GenericContainer.fromDockerfile(rootDir, dockerfilePath)
    .withBuildArgs({
      WORKSPACE_NAME: workspaceName,
      WORKSPACE_PATH: workspacePath,
    })
    .build(workspaceName)
).withResourcesQuota({
  cpu: containerCpu,
  memory: containerMemory,
});

const startedContainer = await container.start();

logger.info("Running bench for package " + pkg);
const results = await benchmark(group, startedContainer);

await startedContainer.stop();

Logs are as follows:

{"level":30,"time":1711102181390,"pid":76924,"hostname":"Mac-di-Nicola-Bovolato","msg":"Running bench for group http"}
  testcontainers [DEBUG] Testing container runtime strategy "TestcontainersHostStrategy"... +0ms
  testcontainers [DEBUG] Testing container runtime strategy "ConfigurationStrategy"... +0ms
  testcontainers [DEBUG] Testing container runtime strategy "UnixSocketStrategy"... +0ms
{"level":30,"time":1711102181391,"pid":76924,"hostname":"Mac-di-Nicola-Bovolato","msg":"Building dockerfile with workspace bench-http-fastify and path packages/benchmarks/http/fastify..."}
  testcontainers [DEBUG] Container runtime strategy "UnixSocketStrategy" works +663ms
  testcontainers [DEBUG] Acquiring lock file "/var/folders/h2/71hmhq8n5p59f79w4k8_ksk80000gp/T/testcontainers-node.lock"... +4ms
  testcontainers [DEBUG] Acquired lock file "/var/folders/h2/71hmhq8n5p59f79w4k8_ksk80000gp/T/testcontainers-node.lock" +3ms
  testcontainers [DEBUG] Listing containers... +0ms
  testcontainers [DEBUG] Listed containers +3ms
  testcontainers [DEBUG] Creating new Reaper for session "9b34cb0b281d" with socket path "/var/run/docker.sock"... +0ms
  testcontainers [DEBUG] Checking if image exists "testcontainers/ryuk:0.5.1"... +1ms
  testcontainers [DEBUG] Checked if image exists "testcontainers/ryuk:0.5.1" +3s
  testcontainers [DEBUG] Image "testcontainers/ryuk:0.5.1" already exists +0ms
  testcontainers [DEBUG] Creating container for image "testcontainers/ryuk:0.5.1"... +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Created container for image "testcontainers/ryuk:0.5.1" +68ms
  testcontainers [INFO] [9f55b6bb9277] Starting container for image "testcontainers/ryuk:0.5.1"... +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Starting container... +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Started container +152ms
  testcontainers [INFO] [9f55b6bb9277] Started container for image "testcontainers/ryuk:0.5.1" +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Inspecting container... +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Inspected container +3ms
  testcontainers [DEBUG] [9f55b6bb9277] Fetching container logs... +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Demuxing stream... +2ms
  testcontainers [DEBUG] [9f55b6bb9277] Demuxed stream +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Fetched container logs +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Waiting for container to be ready... +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Waiting for log message "/.+ Started!/"... +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Fetching container logs... +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Demuxing stream... +1ms
  testcontainers [DEBUG] [9f55b6bb9277] Demuxed stream +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Fetched container logs +0ms
  testcontainers:containers [9f55b6bb9277] 2024/03/22 10:09:45 Pinging Docker... +0ms
  testcontainers:containers [9f55b6bb9277] 2024/03/22 10:09:45 Docker daemon is available! +1ms
  testcontainers:containers [9f55b6bb9277] 2024/03/22 10:09:45 Starting on port 8080... +0ms
  testcontainers [DEBUG] [9f55b6bb9277] Log wait strategy complete +11ms
  testcontainers [INFO] [9f55b6bb9277] Container is ready +1ms
  testcontainers [DEBUG] [9f55b6bb9277] Connecting to Reaper (attempt 1) on "localhost:55000"... +0ms
  testcontainers:containers [9f55b6bb9277] 2024/03/22 10:09:45 Started! +4ms
  testcontainers [DEBUG] [9f55b6bb9277] Connected to Reaper +22ms
  testcontainers [DEBUG] Releasing lock file "/var/folders/h2/71hmhq8n5p59f79w4k8_ksk80000gp/T/testcontainers-node.lock"... +0ms
  testcontainers [DEBUG] Released lock file "/var/folders/h2/71hmhq8n5p59f79w4k8_ksk80000gp/T/testcontainers-node.lock" +1ms
  testcontainers [DEBUG] Executing Docker credential provider "docker-credential-desktop" +1ms
  testcontainers [DEBUG] Executing Docker credential provider "docker-credential-desktop" +1ms
  testcontainers:containers [9f55b6bb9277] 2024/03/22 10:09:45 New client connected: 192.168.65.1:56213 +25ms
  testcontainers:containers [9f55b6bb9277] 2024/03/22 10:09:45 Adding {"label":{"org.testcontainers.session-id=9b34cb0b281d":true}} +0ms
  testcontainers [DEBUG] No credential found for registry "https://index.docker.io/v1/" +59ms
  testcontainers [DEBUG] No registry auth locator found for registry "https://index.docker.io/v1/" +0ms
  testcontainers [DEBUG] No credential found for registry "https://index.docker.io/v1/" +2ms
  testcontainers [DEBUG] No registry auth locator found for registry "https://index.docker.io/v1/" +1ms
  testcontainers [INFO] Building Dockerfile "/Users/nicola/personal-projects/be-framework/packages/benchmarks/Dockerfile" as image "bench-http-fastify:latest"... +0ms
  testcontainers [DEBUG] Building image "bench-http-fastify:latest" with context "/Users/nicola/personal-projects/be-framework"... +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 1/31 : FROM node:20-alpine AS base"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"status":"Pulling from library/node","id":"20-alpine"} +1s
  testcontainers:build [bench-http-fastify:latest] {"status":"Digest: sha256:bf77dc26e48ea95fca9d1aceb5acfa69d2e546b765ec2abfb502975f1a2d4def"} +9ms
  testcontainers:build [bench-http-fastify:latest] {"status":"Status: Image is up to date for node:20-alpine"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e e0a5cd56bd9c\n"} +3ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 2/31 : ENV PNPM_HOME=/root/.local/share/pnpm"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e Using cache\n"} +1ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e 0ddbde311661\n"} +1ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 3/31 : ENV PATH=$PATH:$PNPM_HOME"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e Using cache\n"} +2ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e 46a872561aab\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 4/31 : RUN npm i -g pnpm@8.x.x"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e Using cache\n"} +1ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e 2f5731bd9a7c\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 5/31 : RUN apk add --no-cache libc6-compat"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e Using cache\n"} +1ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e 86a0f2e52204\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 6/31 : RUN pnpm i -g turbo"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e Using cache\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e ab7238a029bf\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 7/31 : WORKDIR /usr/app"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +1ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e Using cache\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e ab17aac04be7\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 8/31 : ARG WORKSPACE_NAME"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e Using cache\n"} +4ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e 5055c62eb735\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 9/31 : ARG WORKSPACE_PATH"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e Using cache\n"} +1ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e c85c7acf2ee0\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"aux":{"ID":"sha256:c85c7acf2ee0f50afa5e858cb36b5f23983658a34294462bc9391ebb8d43d2bb"}} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 10/31 : FROM base as pruned"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e c85c7acf2ee0\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 11/31 : COPY . ."} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e 6f1a8316f184\n"} +31s
  testcontainers:build [bench-http-fastify:latest] {"stream":"Step 12/31 : RUN turbo prune --scope ${WORKSPACE_NAME} --docker"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\n"} +0ms
  testcontainers:build [bench-http-fastify:latest] {"stream":" ---\u003e Running in 7d232810005f\n"} +59ms
  testcontainers:build [bench-http-fastify:latest] {"stream":"\u001b[91m ERROR  a value is required for '--scope \u003cSCOPE\u003e' but none was supplied\n\nFor more information, try '--help'.\n\n\u001b[0m"} +283ms
  testcontainers:build [bench-http-fastify:latest] {"errorDetail":{"code":1,"message":"The command '/bin/sh -c turbo prune --scope ${WORKSPACE_NAME} --docker' returned a non-zero code: 1"},"error":"The command '/bin/sh -c turbo prune --scope ${WORKSPACE_NAME} --docker' returned a non-zero code: 1"} +600ms
  testcontainers [DEBUG] Built image "bench-http-fastify:latest" with context "/Users/nicola/personal-projects/be-framework" +51s
  testcontainers [DEBUG] Checking if image exists "bench-http-fastify:latest"... +1ms
  testcontainers [DEBUG] Checked if image exists "bench-http-fastify:latest" +5ms
/Users/nicola/personal-projects/be-framework/node_modules/.pnpm/testcontainers@10.7.2/node_modules/testcontainers/build/generic-container/generic-container-builder.js:68
            throw new Error("Failed to build image");

@cristianrgreco
Collaborator

@nicolabovolato I don't necessarily see an issue with the build arguments here. What I do see is this error:

testcontainers:build [bench-http-fastify:latest] {"stream":"\u001b[91m ERROR a value is required for '--scope \u003cSCOPE\u003e' but none was supplied\n\nFor more information, try '--help'.\n\n\u001b[0m"} +283ms

Address this issue first and then we'll see if there's an issue with the build arguments.

@nicolabovolato
Author

nicolabovolato commented Mar 22, 2024

@cristianrgreco building directly with the docker command works.

[Screenshot: 2024-03-22 at 11.29.21]

You can see from the logs above that the same build arguments are passed to the testcontainer.

@nicolabovolato
Author

Simpler reproduction

https://stackblitz.com/edit/stackblitz-starters-ab6hdh?file=index.ts

It looks like args in multi-stage builds are not propagated correctly; perhaps this is unrelated to testcontainers.

@silh
Contributor

silh commented Mar 30, 2024

Hey @nicolabovolato, the Docker documentation states that build args last until the end of the stage in which they were declared - see https://docs.docker.com/reference/dockerfile/#scope. You declare WORKSPACE_NAME in the first stage and use it in later stages, which should not be supported (according to the documentation), so I am not sure why it works with the docker CLI client.
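
For reference, the documented way to make this work without BuildKit is to re-declare the ARG in every stage that uses it, e.g. for the pruned stage above (a sketch - the other stages would need the same treatment):

FROM base as pruned
ARG WORKSPACE_NAME
COPY . .
RUN turbo prune --scope $WORKSPACE_NAME --docker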

@silh
Contributor

silh commented Mar 30, 2024

It looks like globally scoped build args are a Docker BuildKit feature - https://medium.com/@sujaypillai/globally-scoped-platform-args-in-docker-buildkit-787b9010bbb8 - so this is related to #571.

@nicolabovolato
Author

Hi @silh, thanks for the info. I always assumed this behavior was supported by Docker.

We should close this in favor of #571.

@cristianrgreco
Collaborator

Thanks for looking into this @silh, indeed the lack of BuildKit support seems to be the issue.
