  • Hawkar Mahmod

    2 years ago
    Hey folks. I've got a Prefect flow set up. For execution I am using the Fargate Agent, itself hosted on an ECS cluster. In the flow I've configured Docker Storage to use a custom ECR repository to host the image for the flow, and I then register the flow in the same module (.py file). I am, however, unsure how to coordinate the deployment of the agent and new flows. How are people registering flows in production and deploying their agent? Is it all part of one build process, or are they separate? If so, how do you manage it, and what tools are you using? I am considering AWS CodeBuild since we already use it on other projects.
    6 replies
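    For reference, a minimal sketch of the registration side described above, assuming Prefect 0.x with Docker storage (the import path varies by release: prefect.storage.Docker on newer 0.x, prefect.environments.storage.Docker on older ones). The registry URL, image name, and project name are placeholders:

        from prefect import Flow, task
        from prefect.storage import Docker

        @task
        def say_hello():
            print("hello")

        with Flow("my-flow") as flow:
            say_hello()

        # Building and pushing the image to the custom ECR repository happens as part of
        # flow.register() when Docker storage is attached.
        flow.storage = Docker(
            registry_url="123456789012.dkr.ecr.eu-west-1.amazonaws.com",  # placeholder ECR registry
            image_name="my-flow",
        )

        if __name__ == "__main__":
            flow.register(project_name="my-project")

    A CI job (for example AWS CodeBuild) could run this module on merge, while the agent is deployed separately and only redeployed when the agent itself changes.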
  • Lukas N.

    2 years ago
    Hi everyone, I'm running Prefect in Kubernetes and I'm trying to make it talk to an existing Vault. We use the Kubernetes agent to deploy jobs; by default the job doesn't talk to the Vault. I would need to add some annotations to the job to make it work:
        vault.security.banzaicloud.io/vault-addr:
        vault.security.banzaicloud.io/vault-role:
    Looking at the code of the agent, the job_spec is hardwired and I cannot modify it. I've also checked the KubernetesJobEnvironment, which seems like a way to go for a custom job_spec.yaml file. But in this case, the environment values specified on the Prefect Kubernetes agent (prefect agent start kubernetes --env NAME=value) don't get passed to the custom job. They only get passed to the first Job that creates the custom job. Is there another way to have both custom annotations on Jobs and environment values passed from the Prefect Kubernetes agent?
    2 replies
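    For context, a hedged sketch of the KubernetesJobEnvironment route mentioned above, assuming Prefect 0.x; the YAML path and annotation values are placeholders, and the --env propagation behaviour is exactly the open question:

        from prefect import Flow, task
        from prefect.environments import KubernetesJobEnvironment

        # job_spec.yaml (referenced below) would carry the Vault annotations under
        # metadata.annotations, e.g.:
        #   vault.security.banzaicloud.io/vault-addr: "https://vault.example:8200"
        #   vault.security.banzaicloud.io/vault-role: "prefect"

        @task
        def work():
            ...

        with Flow("vault-flow") as flow:
            work()

        # Attach the custom job spec so the flow-run Job carries the annotations.
        flow.environment = KubernetesJobEnvironment(job_spec_file="job_spec.yaml")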
  • Adam

    2 years ago
    Hello friends, trust you’re all having a lovely day! I’m having some issues with building my docker image due to pickling and I’m hoping someone can lend a hand. Error in the thread
    20 replies
  • Adam

    2 years ago
    Hi all, does anyone have any examples of patterns for bulk loading data into Postgres from Prefect? Currently I'm mapping over my dataset and passing each individual record into the PostgresExecute task, which runs an insert statement. In addition to the thousands of INSERTs, the task also creates and terminates a Postgres connection each time, which I'd prefer to avoid. Thoughts? My current implementation is in the thread.
    12 replies
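    One possible pattern (a sketch, not necessarily what the thread settled on): a single task that opens one connection and bulk-inserts every record with psycopg2's execute_values. The table, columns, and connection details are placeholders:

        import psycopg2
        from psycopg2.extras import execute_values
        from prefect import task

        @task
        def bulk_insert(records):
            # records: a list of (col_a, col_b) tuples collected upstream
            conn = psycopg2.connect(host="localhost", dbname="mydb", user="me", password="secret")
            try:
                with conn, conn.cursor() as cur:
                    execute_values(
                        cur,
                        "INSERT INTO my_table (col_a, col_b) VALUES %s",
                        records,
                        page_size=1000,  # rows per generated statement
                    )
            finally:
                conn.close()

    This keeps the connection count at one per flow run instead of one per record.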
  • Riley Hun

    2 years ago
    Hi everyone, was wondering what would be the best way to submit a Spark job to a Dataproc cluster through Prefect? The way I'm doing it right now (even though I haven't yet executed it) is as follows. I have these items baked into the Dockerfile for the Dask workers:
        • gcloud command tools
        • spark-snowflake credentials JSON file
        • a copy of the main Spark job Python file
    Is this the correct approach?
    1 reply
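    A hedged sketch of one way to wire that up, matching the gcloud-in-the-image approach described above: a ShellTask that shells out to gcloud dataproc jobs submit pyspark. The cluster name, region, and file path are placeholders, and the google-cloud-dataproc Python client would be an alternative:

        from prefect import Flow
        from prefect.tasks.shell import ShellTask

        submit_job = ShellTask(name="submit-dataproc-pyspark-job")

        with Flow("dataproc-submit") as flow:
            # Runs inside the Dask worker image, which already has gcloud and the job file baked in.
            submit_job(
                command=(
                    "gcloud dataproc jobs submit pyspark /opt/jobs/main_spark_job.py "
                    "--cluster=my-cluster --region=us-central1"
                )
            )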
  • Adam

    2 years ago
    Hopefully my last question for the day: I'm having some trouble building the Docker image for my flow when one of my tasks calls another function (non-task) in my file. It fails the health check because of cloudpickle_deserialization_check. If I stop calling that function it works again. Am I missing something? Can I not call other methods from within a task?
    7 replies
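    A sketch, not a confirmed diagnosis: calling a plain helper from a task is generally fine, but the helper has to be importable inside the storage image when the flow is unpickled. If the helper lives in a separate local module, one option is to copy it into the image via Docker storage's files/env_vars arguments; the paths and registry below are placeholders:

        from prefect import Flow, task
        from prefect.storage import Docker

        def normalise(record):
            # plain (non-task) helper called from inside a task
            return record.strip().lower()

        @task
        def clean(record):
            return normalise(record)

        with Flow("helper-example") as flow:
            clean("  SOME VALUE  ")

        flow.storage = Docker(
            registry_url="123456789012.dkr.ecr.eu-west-1.amazonaws.com",  # placeholder
            files={"/local/path/helpers.py": "/pipeline/helpers.py"},     # only needed for separate modules
            env_vars={"PYTHONPATH": "$PYTHONPATH:/pipeline"},
        )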
  • George Coyne

    2 years ago
    A number of flows in Prefect Cloud are stuck in Submitted and the Zombie Killer is not putting them to bed. Any suggestions on getting them processed?
    45 replies
  • George Coyne

    2 years ago
    Logs are clean, nothing untoward
  • Amit

    2 years ago
    Hi team, I was looking at the Slack integration for notifications. The docs (https://docs.prefect.io/core/advanced_tutorials/slack-notifications.html#using-your-url-to-get-notifications) say to save the Slack URL in the ~/.prefect/config.toml file. Is there another way, like an environment variable?
    5 replies
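    A sketch of the environment-variable route, assuming Prefect 0.x config overrides (any config key can be set as PREFECT__<SECTION>__<KEY> with double-underscore nesting); the secret name follows the linked tutorial and the webhook URL is a placeholder:

        import os

        # Equivalent to putting
        #   [context.secrets]
        #   SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/..."
        # in ~/.prefect/config.toml. In practice this would be exported in the agent or
        # flow-run environment rather than set in code:
        #   export PREFECT__CONTEXT__SECRETS__SLACK_WEBHOOK_URL="https://hooks.slack.com/services/XXX"
        os.environ["PREFECT__CONTEXT__SECRETS__SLACK_WEBHOOK_URL"] = "https://hooks.slack.com/services/XXX"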