• Saurabh Indoria
    7 months ago
    Hi all, in our Prefect deployment we see task logs like "Task finished with final status Pending". Does the Lazarus process handle this? I have seen tasks stuck like this for hours.
    Saurabh Indoria, Kevin Kho · 21 replies
  • Sergi de Pablos
    7 months ago
    Hi everyone. I have a process that submits multiple jobs to an external service, and I have to wait for all of them to finish before proceeding. I found how to make a task wait for a single external process to finish (https://github.com/PrefectHQ/prefect/discussions/3944), but I have no idea how to wait for multiple ones. Thanks
    Sergi de Pablos, Kevin Kho · 4 replies
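    One pattern that may help here (a sketch against the Prefect 1.x API; check_job_status stands in for whatever the external service's client provides): map a polling task over the job ids and raise RETRY until each job reports done. The downstream task depends on the whole map, so it only starts once every job has finished.

        from datetime import timedelta

        from prefect import Flow, Parameter, task
        from prefect.engine.signals import RETRY

        def check_job_status(job_id):
            # hypothetical stand-in for the external service's status API
            return True

        @task(max_retries=30, retry_delay=timedelta(minutes=1))
        def wait_for_job(job_id):
            if not check_job_status(job_id):
                raise RETRY(f"Job {job_id} not finished yet")
            return job_id

        @task
        def proceed(finished_ids):
            print(f"All {len(finished_ids)} jobs are done")

        with Flow("wait-for-all-jobs") as flow:
            job_ids = Parameter("job_ids", default=["a", "b", "c"])
            finished = wait_for_job.map(job_ids)
            proceed(finished)  # runs only after every mapped child succeeds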
  • massumo
    7 months ago
    Hello everybody, I want to do this: https://github.com/PrefectHQ/prefect/issues/5297, but I didn't see any docs about it. Can I do that? I want to create a dynamic workflow from an input.
    massumo · 1 reply
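    For what it's worth, one common way to get an input-driven dynamic workflow in Prefect 1.x is to map a task over a Parameter, so the number of children is decided at runtime. A minimal sketch, with process as a placeholder task:

        from prefect import Flow, Parameter, task

        @task
        def process(item):
            return item * 2

        with Flow("dynamic-from-input") as flow:
            # how many mapped children run is decided by the input at runtime
            items = Parameter("items", default=[1, 2, 3])
            results = process.map(items)

    Passing a different "items" value when triggering a run (e.g. flow.run(parameters={"items": [4, 5]})) changes the shape of the run without re-registering.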
  • Andreas
    7 months ago
    Hi everyone! I've set up a Prefect server on a Google Compute Engine instance, but I still have problems grasping some concepts. I have the server and a local agent running on the GCE instance, and the UI is working. I can register and run a flow that resides on the GCE machine without errors.
    Now I want to set up GitLab as the code repository. My initial intuition was that I could configure a connection to my GitLab repo in a config file and all flows would be synced from there, so that when I commit something, it would be updated in Prefect as well. That does not seem to be the case. I'm trying to wrap my head around how to use the Storage options, but I really don't understand them. On my server, do I have to create Python files for each of my flows that reside in GitLab? What sense would the repository make then? Thanks for your help! I'm really lost here...
    Andreas, Kevin Kho · 5 replies
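    For reference, Prefect 1.x ships a GitLab storage class: each flow is still registered once (registration stores metadata with the server), but the flow's code lives in the repo and the agent pulls the file at run time, so commits to that file are picked up on the next run. A rough sketch, with the repo path, file path, and branch as placeholders:

        from prefect import Flow, task
        from prefect.storage import GitLab

        @task
        def say_hello():
            print("hello")

        with Flow("gitlab-example") as flow:
            say_hello()

        # The agent fetches this file from GitLab when the flow runs, so code
        # changes on the given ref take effect without rebuilding anything
        # (structural changes to the flow still need a re-register).
        flow.storage = GitLab(
            repo="my-group/my-project",
            path="flows/gitlab_example.py",
            ref="main",
        )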
  • shijas km
    7 months ago
    Hi, I am facing an issue. I have created an extraction flow and run it locally, and it works fine. The flow contains one task, and inside that task it calls a function from a different Python module, e.g.:

        from prefect import task

        import abc  # our own module, not the stdlib abc

        @task
        def fun():
            abc.fun1()

    It works locally, but when I register the flow in Cloud it fails, saying abc is not found. How can we resolve this? Should I write all the logic inside the task function itself?
    shijas km, Kevin Kho · 3 replies
  • Suresh R
    7 months ago
    Hi! In Prefect Cloud, does a flow name have to be unique across the team, or only within a project?
    Suresh R, Anna Geller · 3 replies
  • Eli Treuherz
    7 months ago
    I’ve got a task mapping set up, but the child tasks sometimes have to retry until the external data they read is ready. When one of them enters the Retrying state, it seems to hold up the entire flow: the parent doesn’t produce any more tasks until the retrying one is complete. Is there a way around this behaviour? Ideally I’d like the parent to spawn all its child tasks right away and for Prefect to identify which ones can run, rather than one retry blocking everyone else.
    Eli Treuherz, Kevin Kho · 3 replies
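    For context, the setup being described looks roughly like this (a sketch; try_fetch is a hypothetical reader for the external data):

        from datetime import timedelta

        from prefect import Flow, task

        def try_fetch(source):
            # hypothetical: returns None until the external data is ready
            return None

        @task(max_retries=10, retry_delay=timedelta(minutes=2))
        def read_when_ready(source):
            data = try_fetch(source)
            if data is None:
                raise RuntimeError(f"{source} not ready yet")  # triggers a retry
            return data

        with Flow("mapped-with-retries") as flow:
            read_when_ready.map(["a", "b", "c"])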
  • Kevin Mullins
    7 months ago
    Can tasks accept a Callable argument from another task? For example, I have an upstream task that needs to decide whether a downstream task should use function a or function b (essentially choosing a strategy function or factory function). Could the downstream task properly get the function to execute from the other one? Hopefully this makes sense.
    Kevin Mullins, Anna Geller +1 · 9 replies
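    This seems workable as long as the chosen function is picklable (e.g. defined at module level), since task results are passed between tasks as ordinary Python objects. A minimal sketch with placeholder strategies:

        from prefect import Flow, task

        # module-level functions, so the returned callable can be pickled
        def strategy_a(x):
            return x + 1

        def strategy_b(x):
            return x * 2

        @task
        def choose_strategy(use_a):
            # a task result can be any picklable object, including a function
            return strategy_a if use_a else strategy_b

        @task
        def apply_strategy(fn, value):
            return fn(value)

        with Flow("callable-between-tasks") as flow:
            fn = choose_strategy(True)
            result = apply_strategy(fn, 10)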
  • Matthew Webster
    7 months ago
    Hi, I’m looking for some guidance on how to use the same flows across multiple projects. For context: we do ETL for customers, and some flows are identical. We’re trying the ECS Agent with Prefect Cloud, a custom Docker image, and S3 storage. I currently register the flow with the CLI for multiple customers and pass in customer-specific information as Parameters when scheduling the run. This breaks down in a few places:
    1. We're having some problem with Storage and unpickling flows. Is there project-specific info stored there, so that when a flow is registered/pickled for one project it can’t be unpickled by another?
    2. Using Secrets. We have some customer-specific API keys that need to be stored as Secrets, and these appear to only be set globally in Prefect Cloud (or AWS SSM); there are no per-project secrets that I can pass in. One option I can think of would be a service that takes a Parameter and returns a secret that then gets passed to a task, but maybe there’s a better way?
    I’m still pretty new to Prefect but couldn’t find answers to these. Hoping for some help from the community!
    Matthew Webster, Kevin Kho · 21 replies
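    On the Secrets point, one way to keep a single flow and resolve customer-specific keys at runtime is to look the Secret up inside a task, using a name derived from a Parameter. A sketch, assuming a hypothetical naming convention like CUSTOMER_<NAME>_API_KEY for the secrets stored in Prefect Cloud:

        from prefect import Flow, Parameter, task
        from prefect.client import Secret

        @task
        def get_customer_key(customer):
            # secret name derived from the parameter; the convention is an assumption
            return Secret(f"CUSTOMER_{customer.upper()}_API_KEY").get()

        @task
        def call_api(api_key):
            ...  # use the key here; avoid logging it

        with Flow("per-customer-etl") as flow:
            customer = Parameter("customer")
            api_key = get_customer_key(customer)
            call_api(api_key)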