# prefect-community
Adam
Hi all, happy Thursday! Looking for some help with KubernetesJobEnvironment `job_spec_file`. It doesn’t seem to include my environment variables. I had a look at the source code and docs and it definitely seems like it should. Would someone mind looking at my scripts and letting me know what I’m doing wrong? It could be the unusual way I’m doing the builds (multi-flow storage). Code in the thread.
import uuid
from os import environ, path

import docker
from prefect.environments import KubernetesJobEnvironment
from prefect.environments.storage import Docker

from customer_balances import flow as customer_balances
from deleted_customers import flow as deleted_customers

FLOWS = [customer_balances, deleted_customers]

this_dir = path.dirname(path.realpath(__file__))
job_spec_file_path = path.join(this_dir, "job_spec.yaml")
registry_url = "gcr.io/xxx/xxx"
image_tag = uuid.uuid4().hex
tls_config = None
base_url = None

# Special Docker-in-Docker configuration for CircleCI
if environ.get("CI"):
    print("Running on CI")
    tls_config = docker.tls.TLSConfig(
        client_cert=(
            path.join(environ.get("DOCKER_CERT_PATH", ""), "cert.pem"),
            path.join(environ.get("DOCKER_CERT_PATH", ""), "key.pem"),
        ),
        verify=False,
    )
    base_url = environ.get("DOCKER_HOST")


# Configure the storage object
storage = Docker(
    image_name="xxx",
    registry_url=registry_url,
    image_tag=image_tag,
    base_url=base_url,  # required for CircleCI
    tls_config=tls_config,  # required for CircleCI
    files={
        path.join(this_dir, "customer_balances.py"): "/files/customer_balances.py",
        path.join(this_dir, "deleted_customers.py"): "/files/deleted_customers.py",
        path.join(this_dir, "job_spec.yaml"): "/files/job_spec.yaml",
    },
    env_vars={"PYTHONPATH": "${PYTHONPATH}:/files"},
    python_dependencies=[
        "pandas",
        "prefect[google,kubernetes,postgres]",
        "requests",
        "synapsepy",
        "sqlalchemy",
    ],
)

# Add the flows to the Docker storage
for flow in FLOWS:
    storage.add_flow(flow)

# Build the Docker image
storage_ref = storage.build()

# Assign the flow storage to Docker and register
for flow in FLOWS:
    flow.environment = KubernetesJobEnvironment(job_spec_file=job_spec_file_path)
    flow.storage = storage_ref
    flow.register(project_name="prefect-test-1", build=False)
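A quick sanity check before registering is to confirm the spec file on disk really contains the env entries, since (as far as I can tell) KubernetesJobEnvironment loads `job_spec_file` when the environment object is created. This is a hypothetical, stdlib-only helper, not part of Prefect — a line-based scan rather than a full YAML parse:

```python
import re

def env_names_in_spec(spec_text: str) -> list:
    """Collect `- name: X` entries that appear under an `env:` key.

    Hypothetical debugging helper: a naive line scan, good enough for a
    quick pre-registration check without pulling in a YAML library.
    """
    names = []
    in_env = False
    env_indent = None
    for line in spec_text.splitlines():
        stripped = line.strip()
        indent = len(line) - len(line.lstrip())
        if stripped == "env:":
            in_env = True
            env_indent = indent
            continue
        if in_env:
            # Indentation falling back to the `env:` level means we left the block
            if stripped and indent <= env_indent:
                in_env = False
                continue
            m = re.match(r"- name:\s*(\S+)", stripped)
            if m:
                names.append(m.group(1))
    return names
```

e.g. `print(env_names_in_spec(open(job_spec_file_path).read()))` right before the registration loop should list MY_ENV, POSTGRES_HOST, etc.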
Dylan
Hey @Adam, everything looks correct to me
This is all running without errors?
Adam
Unfortunately not @Dylan. The Jobs are created without any of my modifications; they should have the environment variables from job_spec.yaml.
Job spec looks like this:
apiVersion: batch/v1
kind: Job
metadata:
  name: prefect-job
  labels:
    identifier: ""
spec:
  template:
    metadata:
      labels:
        identifier: ""
    spec:
      restartPolicy: Never
      containers:
        - name: flow-container
          image: ""
          command: []
          args: []
          env:
            - name: MY_ENV
              value: foo
            - name: POSTGRES_HOST
              valueFrom:
                configMapKeyRef:
                  name: globals
                  key: POSTGRES_HOST
            - name: POSTGRES_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: postgres
                  key: password
            - name: POSTGRES_USER
              valueFrom:
                secretKeyRef:
                  name: postgres
                  key: username
            - name: SYNAPSE_CLIENT_ID
              valueFrom:
                secretKeyRef:
                  name: synapse
                  key: client-id
            - name: SYNAPSE_CLIENT_SECRET
              valueFrom:
                secretKeyRef:
                  name: synapse
                  key: client-secret
          resources:
            limits:
              cpu: "2"
              memory: 4G
            requests:
              cpu: "0.5"
              memory: 1G
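For what it’s worth, my expectation is that the environment overlays these container settings onto the Prefect-generated Job, so entries like MY_ENV survive into the launched pod. Roughly, with plain dicts standing in for the parsed YAML (just illustrating what I expect to happen, not Prefect’s actual code — the PREFECT__CLOUD__AUTH_TOKEN default and image tag are made-up values):

```python
def merge_container_env(base_container: dict, custom_container: dict) -> dict:
    """Sketch of the expected merge: the generated container keeps its
    image, while env entries from the user's job spec are appended to
    (not replaced by) the defaults Prefect injects."""
    merged = dict(base_container)
    base_env = {e["name"]: e for e in base_container.get("env", [])}
    for entry in custom_container.get("env", []):
        base_env[entry["name"]] = entry  # custom values win on collision
    merged["env"] = list(base_env.values())
    return merged

# The container Prefect generates for the Job (illustrative values)
prefect_container = {
    "image": "gcr.io/xxx/xxx:abc123",
    "env": [{"name": "PREFECT__CLOUD__AUTH_TOKEN", "value": "***"}],
}

# The container from job_spec.yaml above (trimmed to two entries)
custom_container = {
    "env": [
        {"name": "MY_ENV", "value": "foo"},
        {"name": "POSTGRES_HOST",
         "valueFrom": {"configMapKeyRef": {"name": "globals",
                                           "key": "POSTGRES_HOST"}}},
    ],
}

merged = merge_container_env(prefect_container, custom_container)
```

Instead, the Jobs I see look like only the generated container made it through, with my env entries dropped.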
Dylan
@Marvin open “Kubernetes Job Spec Not Respected”
Dylan
Hi @Adam, this looks like a bug to me. I’ve opened the above issue to track. Thanks for letting us know!
Adam
Thanks @Dylan, would love to get this resolved soon, as it prevents us from running any jobs in production 😞