This repository has been archived by the owner on Jan 8, 2024. It is now read-only.

Templates' UID Directive Ignored When Deploying Waypoint Application to Nomad #4731

Open
zboralski opened this issue May 18, 2023 · 3 comments


@zboralski

Issue Description:
I have encountered an issue where the UID (user ID) directive specified within the templates of a Nomad job is ignored when the job is deployed to Nomad through Waypoint. Surprisingly, deploying the same job directly to Nomad, without Waypoint, applies the UID directive correctly.

Details:

I have a Nomad job with template blocks that include the uid directive, intended to set the UID of specific files generated within the container.

When deploying the job directly to Nomad using the nomad run command or through the Nomad UI, the UID directive functions as expected. The file ownership and permissions are set correctly within the container.

However, when I deploy the same job through Waypoint using the waypoint deploy command to Nomad, the UID directive appears to be ignored. The file ownership and permissions are not set according to the specified UID, resulting in unexpected behavior.

I have verified that the UID directive is correctly included in the Waypoint deployment configuration.

I have also attempted disabling the entrypoint in the Waypoint configuration, but it did not resolve the issue.
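To narrow this down, one diagnostic sketch (the `gubernator` job name is from my jobspec below; the capitalized field names follow Nomad's JSON job representation and are my assumption) is to compare the template block that Nomad actually registered after each deployment path:

```shell
# Deploy once via Waypoint and once directly, then dump the registered
# template blocks from Nomad's API. If Uid comes back null for the
# Waypoint-submitted job but "1000" for the direct one, the directive
# is being dropped before submission rather than at render time.
nomad job inspect gubernator \
  | jq '.Job.TaskGroups[].Tasks[].Templates[] | {DestPath, Perms, Uid, Gid}'
```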

Expected Behavior:
When deploying a Nomad job through Waypoint, the UID directive specified within the template blocks should be respected, and the file ownership and permissions should be set accordingly.

Actual Behavior:
The UID directive specified within the templates of a Nomad job is ignored when deploying the job through Waypoint. The file ownership and permissions are not set as expected, resulting in unexpected behavior.

Additional Information:

All components (Nomad, Waypoint, etc.) were up to date as of this report.

I have invested several hours investigating and troubleshooting this issue, but I have been unable to find a resolution.

I have confirmed that the Nomad version used by Waypoint is the same as the one used for direct deployments, ruling out any version compatibility issues.

The issue occurs consistently across different Nomad jobs and templates.

This issue significantly impacts my workflow, as I rely on Waypoint for managing deployments. I would greatly appreciate any assistance or insights into resolving this problem.

@zboralski zboralski added the new label May 18, 2023
@evanphx evanphx removed the new label May 24, 2023
@evanphx
Contributor

evanphx commented May 24, 2023

Hi @zboralski! Are you using the nomad jobspec plugin then? Could you provide the waypoint.hcl and nomad jobspec that you're using?

@zboralski
Author

waypoint.hcl

project = "gubernator"

app "gubernator" {
  build {
    use "docker" {
      buildkit           = true
      platform           = "linux/amd64"
      disable_entrypoint = true
    }

    registry {
      use "docker" {
        image = "xxx/gubernator"
        tag   = "latest"
        local = false
      }
    }
  }

  deploy {
    use "nomad-jobspec" {
      jobspec = templatefile("${path.app}/gubernator.nomad.hcl", {
        hostname   = var.hostname
        datacenter = var.datacenter
      })
    }
  }

  release {}

  url {
    auto_hostname = false
  }
}

variable "hostname" {
  type    = string
  default = "localhost"
}

variable "datacenter" {
  type    = string
  default = "dc1"
}

gubernator.nomad.hcl

job "gubernator" {
  datacenters = ["xxx"]
  group "gubernator" {
    network {
      port "http" {
        host_network = "tailscale"
        static       = 9080
        to           = 9080
      }

      port "grpc" {
        host_network = "tailscale"
        static       = 9081
        to           = 9081
      }
    }

    service {
      provider = "nomad"
      name     = "gubernator"
      port     = "grpc"

      check {
        type     = "http"
        path     = "/v1/HealthCheck"
        port     = "http"
        interval = "10s"
        timeout  = "1m"
      }
    }

    vault {
      policies = ["nomad-gubernator"]
    }

    task "gubernator" {
      driver = "docker"
      config {
        image = "belua/gubernator:latest"

        ports = ["grpc", "http"]
        force_pull = true
      }

      env {
        GUBER_ADVERTISE_ADDRESS = "${NOMAD_HOST_ADDR_grpc}"
        GUBER_GRPC_ADDRESS      = "0.0.0.0:${NOMAD_HOST_PORT_grpc}"
        GUBER_HTTP_ADDRESS      = "0.0.0.0:${NOMAD_HOST_PORT_http}"
        // GUBER_TLS_CA = "${NOMAD_ALLOC_DIR}/ca.pem"
        // GUBER_TLS_CERT = "${NOMAD_SECRETS_DIR}/server.pem"
        // GUBER_TLS_CLIENT_AUTH_SERVER_NAME = "gubernator"
        // GUBER_TLS_KEY = "${NOMAD_SECRETS_DIR}/server-key.pem"
      }

//       template {
//         data        = <<EOH
// {{ with secret "pki_int/issue/gubernator" "common_name=gubernator" "format=pem" }}{{ .Data.certificate }}{{ end }}
//         EOH
//         destination = "${NOMAD_SECRETS_DIR}/server.pem"
//         change_mode = "noop"
//         uid         = "1000"
//       }

//       template {
//         data        = <<EOH
// {{ with secret "pki_int/issue/gubernator" "common_name=gubernator" "format=pem" }}{{ .Data.private_key }}{{ end }}
//         EOH
//         destination = "${NOMAD_SECRETS_DIR}/server-key.pem"
//         change_mode = "restart"
//         perms       = "0400"
//         uid         = "1000"
//       }

//       template {
//         data        = <<EOH
// {{ with secret "pki_int/issue/gubernator" "common_name=gubernator" "format=pem" }}{{ .Data.issuing_ca }}{{ end }}
//         EOH
//         destination = "${NOMAD_ALLOC_DIR}/ca.pem"
//         change_mode = "noop"
//         uid         = "1000"
//       }

      template {
        data        = <<EOH
REDIS_HOST="{{ env "NOMAD_IP_grpc" }}:6379"
{{ with secret "keydb/creds/gubernator" }}REDIS_USERNAME={{ .Data.username }}
REDIS_PASSWORD={{ .Data.password }}{{ end }}
EOH
        destination = "${NOMAD_SECRETS_DIR}/gubernator.env"
        change_mode = "restart"
        env         = true
        perms       = "0400"
        uid         = "1000"
      }
    }
  }
}

@zboralski
Author

If I build the image and deploy the exact same Nomad job directly with nomad, the uid is set correctly.
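The ownership difference can be confirmed inside the running allocation (a sketch; `<alloc-id>` must be filled in from the job status output):

```shell
# Find an allocation for the job, then list the rendered file with
# numeric ownership. With uid = "1000" honored, ls -ln shows owner 1000;
# when the directive is dropped, the file falls back to the default owner.
nomad job status gubernator          # note an allocation ID
nomad alloc exec <alloc-id> ls -ln /secrets/gubernator.env
```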
