• Jessica Smith

    4 months ago
    Can flow schedules have names? I have one flow that has 4 schedules (two different clocks and two sets of parameters) and there doesn't seem to be a way to differentiate them.
    1 reply
  • Cole Murray

    4 months ago
    Hi all, running into an issue where a task exits with a bad exit code but stays stuck in Running (using Orion).
    21:24:17.733 | ERROR   | prefect.flow_runner.subprocess - Subprocess for flow run '3919d4eb-17ed-4847-bca9-577d8d31d702' exited with bad code: -6
    I see this emitted in the agent's log, but the task stays stuck in Running. Thoughts?
    2 replies
  • Danilo Drobac

    4 months ago
    Hi all, coming from an Airflow background and trying to make sense of what a full deployment looks like for Prefect. In Airflow we have VMs that run the different services (webserver, scheduler, etc.). In Prefect (2.0 is what I'm looking at), I have a Prefect Cloud account that runs the UI (the Orion server). For deployment, do I need a VM that connects to this via a service account (or API key?) and creates the work queues and agents? Additionally, what is the recommended CI/CD development cycle for when somebody creates new flows and pushes them to a repo?
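    A rough sketch of the agent-VM side, assuming an early Prefect 2.0 CLI; the URL, key, and queue name are placeholders, and exact command shapes varied between 2.0 betas:

```shell
# On the agent VM: point the local Prefect client at Cloud
# (URL and key values are placeholders)
prefect config set PREFECT_API_URL="https://api.prefect.cloud/api/accounts/<account-id>/workspaces/<workspace-id>"
prefect config set PREFECT_API_KEY="<api-key>"

# Create a work queue, then start an agent polling it
prefect work-queue create my-queue
prefect agent start <work-queue-id>
```

    The Cloud account only hosts the API and UI; the VM (or container) running the agent is what actually picks up and executes flow runs.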
    Florian Guily
    4 replies
  • Jonathan Mathews

    4 months ago
    Hi all! Sometimes I'm a bit confused about how to set task args when using built-in dbt tasks (such as the dbt task below). Is this the correct syntax for setting a tag on the task? (Reason I ask is I can't see the tag in Prefect Cloud when I click into the task.)
    dbt_run = dbt(
        command=dbt_run_command,
        task_args={"name": "dbt run", "tags": ["dbt-limit-1"]},
        upstream_tasks=[dbt_deps],
        dbt_kwargs=snowflake_credentials,
    )
    Anna Geller
    5 replies
  • Alexis He

    4 months ago
    Hello, I am evaluating Prefect Orion for the purpose of running workflows of containers in a Kubernetes cluster; in particular, I intend to mount a volume for the containers to access (think k8s persistent volume). What are my options?
    • As far as I have seen, KubernetesFlowRunner does not allow me that level of customisation.
    • I think Prefect 1.0 allows this use case.
    Thanks!
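    For the Prefect 1.0 route mentioned above, a custom job template can declare the volume; a sketch with hypothetical volume, claim, and mount names, which would be passed to a Kubernetes run config:

```python
# Sketch of a Kubernetes job template mounting a persistent volume claim.
# Volume, claim, and mount path names are hypothetical.
job_template = {
    "apiVersion": "batch/v1",
    "kind": "Job",
    "spec": {
        "template": {
            "spec": {
                "containers": [
                    {
                        "name": "flow",
                        "volumeMounts": [
                            {"name": "data", "mountPath": "/mnt/data"}
                        ],
                    }
                ],
                "volumes": [
                    {
                        "name": "data",
                        "persistentVolumeClaim": {"claimName": "my-pvc"},
                    }
                ],
            }
        }
    },
}

# In Prefect 1.0 this would be attached via the run config, e.g.:
# from prefect.run_configs import KubernetesRun
# flow.run_config = KubernetesRun(job_template=job_template)
```

    Every container in the job spec then sees the claim mounted at the given path.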
    Anna Geller
    3 replies
  • Felix Horvat

    4 months ago
    I want to use PostgresExecute. How do I pass the password to this task? It only seems to be possible through the run method.
    Anna Geller
    9 replies
  • Oliver Mannion

    4 months ago
    Hiya, when using Prefect Cloud we've now experienced more than once a
    No heartbeat detected
    error reported by the Zombie Killer, but the task state is never set to Failed. Has anyone else experienced this?
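    One knob that has helped with flaky heartbeats in Prefect 1.0 is switching the Cloud heartbeat from its default subprocess to thread mode via an environment variable on the flow's run config; a sketch, assuming the flow uses a run config:

```python
# Sketch: run the Cloud heartbeat in a thread instead of a subprocess,
# which can avoid spurious "No heartbeat detected" zombie kills when the
# heartbeat subprocess gets starved or killed.
env = {"PREFECT__CLOUD__HEARTBEAT_MODE": "thread"}

# Attached to a flow via its run config, e.g.:
# from prefect.run_configs import UniversalRun
# flow.run_config = UniversalRun(env=env)
```

    This only changes how the heartbeat is emitted; whether the zombie state transition sticks is a separate question for the thread.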
    Anna Geller
    12 replies
  • Kevin

    4 months ago
    Is there a way to troubleshoot a flow that is stuck in 'Pending' and won't run?
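    A common cause of runs stuck before execution (in Prefect 1.0) is a label mismatch: no agent is polling with labels that match the flow run's labels, so nothing ever picks it up. A sketch of the check, with a placeholder label:

```shell
# Compare the labels shown on the stuck flow run's page in the UI with
# the labels the agent was started with ("etl" is a placeholder label).
prefect agent local start --label etl
```

    If the flow run carries a label the agent does not have (or vice versa for required labels), restarting the agent with matching labels usually unsticks it.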
    1 reply
  • Andres

    4 months ago
    Hi guys, I need help with a dbtShellTask. I have a state handler that sends notifications with details of failing flows, but when the dbt task fails with return code 1, all I get is 'Command failed with exit code 1'. Is there a way to capture the logs of a failed shell task in a variable, so I can process them and include them in the handler's notification?
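    One workaround (not built into dbtShellTask) is to shell out yourself inside a task, capture stdout/stderr, and raise with the tail of the output, so the failure message the state handler sees contains the actual logs. A minimal pure-Python sketch:

```python
import subprocess


def run_with_captured_logs(command: str, tail_lines: int = 20) -> str:
    """Run a shell command; on failure raise with the last lines of output."""
    proc = subprocess.run(command, shell=True, capture_output=True, text=True)
    output = proc.stdout + proc.stderr
    if proc.returncode != 0:
        # Include the tail of the combined output in the exception message
        tail = "\n".join(output.splitlines()[-tail_lines:])
        raise RuntimeError(
            f"Command failed with exit code {proc.returncode}:\n{tail}"
        )
    return output
```

    Wrapped in a `@task`, the `RuntimeError` message becomes the task's failure message, which the state handler can forward in its notification.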
    alex
    4 replies
  • Sumant Agnihotri

    4 months ago
    Hi all, I'm learning the basics of Prefect and have a quick question. When I write this and register my flow to the cloud:
    with Flow("flow-a") as flow_a:
        a()
        b()

    flow = Flow("flow-a-b", tasks=[a, b])
    flow.register(project_name="tester")
    Is the code written in functions a and b saved on the cloud, or does the cloud refer to my system every time it needs to run the flow on an agent? If it does refer to my system, will it make a difference if the agent is running on a different system?
    Kevin Kho
    2 replies