  • Enda Peng

    8 months ago
    Is the feature that loads the api_key from `~/.prefect/config.toml` deprecated? I tried `prefect agent local start -t <my-token>`, which succeeds; however, after I save the key under config.toml and restart without `-t`, it complains about a missing API key. This is my config.toml file:
    [xxx]# cat ~/.prefect/config.toml 
    [cloud]
    api_key = "*******"
    Kevin Kho
    +2
    20 replies
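    For reference, a minimal sketch of the layout the question describes (the `[cloud]` section and `api_key` name are taken from the message itself). Note that whether `config.toml` is still consulted for authentication depends on the Prefect 1.x version: recent releases expect the key to be stored via `prefect auth login --key <KEY>` (which writes to `~/.prefect/auth.toml`), which may explain why the agent no longer picks it up from config.toml.

    ```toml
    # ~/.prefect/config.toml — the layout shown in the question
    [cloud]
    api_key = "<YOUR-API-KEY>"
    ```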
  • Anh Nguyen

    8 months ago
    I want to activate flow A, but flow B is in progress and has the same function as flow A. So I want flow A to complete first; then flow B will become available automatically.
    Anh Nguyen
    Kevin Kho
    3 replies
  • Sean Leakehe

    8 months ago
    Can someone help me figure out what causes these `UnixHTTPConnectionPool` errors in Prefect Cloud? They happen seemingly at random. I have this particular flow set to retry 3 times, and I got this on all 3 retries.
    Sean Leakehe
    Kevin Kho
    17 replies
  • chelseatroy

    8 months ago
    Hi folks! We want to run an incremental flow in Prefect, i.e., we move some data from Snowflake to an AWS SageMaker feature store. We have this part working. However, what we want to do is run the flow on a regular schedule and, each time, only move the rows in Snowflake that were not already moved to AWS in the previous run. Is there a good way to schedule incremental runs like this in Prefect itself? If someone knows of an example to point us to of how this is done, we'd be grateful 🙂
    chelseatroy
    Kevin Kho
    3 replies
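    One common pattern here (independent of Prefect) is a high-water mark: persist the largest key or timestamp moved so far, and have each scheduled run select only rows above it. A minimal sketch — the `id` field and the in-memory row list are hypothetical stand-ins for Snowflake and a durable state store:

    ```python
    # High-water-mark incremental loads: each run moves only rows whose
    # key is greater than the watermark recorded by the previous run.

    def incremental_batch(rows, watermark):
        """Return the rows not yet moved, plus the new watermark."""
        new_rows = [r for r in rows if r["id"] > watermark]
        new_watermark = max((r["id"] for r in new_rows), default=watermark)
        return new_rows, new_watermark

    # First run: everything is new.
    rows = [{"id": 1}, {"id": 2}, {"id": 3}]
    batch, wm = incremental_batch(rows, watermark=0)
    assert [r["id"] for r in batch] == [1, 2, 3] and wm == 3

    # Next scheduled run: only rows added since the last watermark.
    rows.append({"id": 4})
    batch, wm = incremental_batch(rows, wm)
    assert [r["id"] for r in batch] == [4] and wm == 4
    ```

    In a real flow the watermark would live somewhere durable (a KV store, an S3 object, or a control table in Snowflake), and the filter would be a `WHERE id > :watermark` clause in the extraction query rather than a Python list comprehension.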
  • Christoph Deil

    8 months ago
    Are flow objects mostly similar or fundamentally different in Prefect Core vs Prefect Orion? I'd be specifically interested in why, in Prefect Core, you decided to make Flow a context manager and not a decorator like it is now in Prefect Orion. Was that a more or less arbitrary choice, or is there a deeper reason why a decorator wasn't possible or desirable in the old API? There's the statement at https://www.prefect.io/blog/announcing-prefect-orion/ that flows aren't DAGs any more, but I couldn't find an explanation of what they are now. I presume the prefect.flow decorator still does some analysis and generates a DAG-like data structure that represents task dependencies? Is this change in what a flow object is related to the change from context manager to decorator?
    https://docs.prefect.io/core/concepts/flows.html
    https://orion-docs.prefect.io/api-ref/prefect/flows/
    Christoph Deil
    Jeremiah
    +1
    5 replies
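    To illustrate one reason a context manager suits ahead-of-time DAG building — this is a toy sketch, not Prefect's actual implementation — the `with` block gives task calls a place to register themselves as they are *declared*, without executing anything; a decorated function's body, by contrast, only reveals its call graph when it actually runs:

    ```python
    # Toy illustration: a context manager lets task declarations register
    # themselves on the active "flow", deferring all execution.

    _active_flow = None

    class ToyFlow:
        def __init__(self, name):
            self.name = name
            self.tasks = []

        def __enter__(self):
            global _active_flow
            _active_flow = self
            return self

        def __exit__(self, *exc):
            global _active_flow
            _active_flow = None

    class ToyTask:
        def __init__(self, fn):
            self.fn = fn

        def __call__(self, *args):
            # "Calling" the task adds a node to the active flow's graph
            # instead of running fn — execution is deferred.
            _active_flow.tasks.append(self.fn.__name__)
            return self  # placeholder standing in for a future result

    @ToyTask
    def extract():
        pass

    @ToyTask
    def load(data):
        pass

    with ToyFlow("etl") as flow:
        data = extract()
        load(data)

    # The DAG was captured at definition time; nothing has executed.
    print(flow.tasks)  # ['extract', 'load']
    ```

    As I understand the Orion announcement, the `@flow` decorator instead discovers task calls while the function body actually runs, which is why Orion flows no longer need to be static DAGs known up front.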
  • Tom Shaffner

    8 months ago
    I've got a case where I start a local flow with `num_workers` in a `LocalDaskExecutor` set to 6, but the flow seems to start many more tasks than that. For example, it is currently showing 15 tasks running simultaneously. Is there something else needed to limit the number of simultaneous tasks? It overloads the machine as is.
    Tom Shaffner
    Kevin Kho
    32 replies
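    For comparison, this is the behavior a worker-pool cap is expected to enforce — plain Python, not Prefect: a pool of 6 workers never runs more than 6 tasks at once, which is what `num_workers=6` should impose. If more tasks than that run concurrently, one possibility worth checking is that the executor or scheduler setting in effect is not the one configured.

    ```python
    # Measure peak concurrency under a 6-worker pool: it never exceeds 6.
    import threading
    import time
    from concurrent.futures import ThreadPoolExecutor

    peak = 0
    current = 0
    lock = threading.Lock()

    def task(_):
        global peak, current
        with lock:
            current += 1
            peak = max(peak, current)   # record how many run at once
        time.sleep(0.02)                # simulate work
        with lock:
            current -= 1

    with ThreadPoolExecutor(max_workers=6) as pool:
        list(pool.map(task, range(30)))

    print(peak)  # at most 6, however many tasks are submitted
    ```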
  • An Hoang

    8 months ago
    Hi there, `task_run_name` templated with input names can be very helpful in debugging mapped tasks. Is it only available on a Prefect backend (Cloud and Server) and not in Core? If I'm doing local debugging with `flow_result = flow.run()`, let's say I have a flow with `task_a` mapped 1000 times, and `task_a[25]` failed but the other indices succeeded. What's the quickest way to find out which input caused it to fail? I don't think I can access the result via `flow_result.result[task_a].result[25]`.
    Kevin Kho
    2 replies
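    On the general question of locating the failing input, one pattern that works with any per-index list of outcomes is to pair inputs with results and filter for failures. A toy sketch — simple `(ok, value)` tuples stand in for Prefect's child task states here; in Prefect 1.x the mapped parent state similarly holds one child state per index that can be iterated the same way:

    ```python
    # Pair each mapped input with its outcome and report the failures.

    def run_mapped(fn, inputs):
        """Run fn over inputs, capturing (ok, value-or-exception) per index."""
        states = []
        for x in inputs:
            try:
                states.append((True, fn(x)))
            except Exception as e:
                states.append((False, e))
        return states

    def failed_inputs(inputs, states):
        """Return (index, input) pairs for every failed child."""
        return [(i, x)
                for i, (x, (ok, _)) in enumerate(zip(inputs, states))
                if not ok]

    inputs = list(range(1000))
    states = run_mapped(lambda x: 1 / (x - 25), inputs)  # index 25 raises
    print(failed_inputs(inputs, states))  # [(25, 25)]
    ```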
  • Fina Silva-Santisteban

    8 months ago
    Hi everyone, I'm looking for suggestions on how to use Prefect together with a Spark cluster running on AWS. Is there an equivalent to Airflow's SparkSubmitOperator? I've only found a DataBricksTask in the docs, but I'm not interested in getting a Databricks subscription at this point.
    Fina Silva-Santisteban
    Anna Geller
    +1
    7 replies
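    Absent a dedicated operator, a common pattern in Prefect 1.x is to shell out to `spark-submit` from a task (e.g. via a shell task or `subprocess.run`). A sketch of a hypothetical command builder — the flags below are standard `spark-submit` options, but the helper itself and the paths are illustrative:

    ```python
    # Assemble a spark-submit invocation as an argument list that a task
    # can pass to subprocess.run(cmd, check=True).

    def build_spark_submit(app_path, master, deploy_mode="cluster", conf=None):
        cmd = ["spark-submit", "--master", master, "--deploy-mode", deploy_mode]
        for key, value in (conf or {}).items():
            cmd += ["--conf", f"{key}={value}"]
        cmd.append(app_path)
        return cmd

    cmd = build_spark_submit(
        "s3://my-bucket/jobs/etl.py",        # hypothetical application path
        master="yarn",
        conf={"spark.executor.memory": "4g"},
    )
    print(" ".join(cmd))
    ```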
  • Tom Klein

    8 months ago
    What's the proper way to make a Kubernetes agent (and a Kubernetes run) use S3 storage for the flows? We tried the k8s `service account`, but it seems insufficient. We noticed a warning in the docs about that, but couldn't really decipher what it would mean for us, since we don't use these methods to define permissions:
    Tom Klein
    Anna Geller
    20 replies
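    For context, S3 storage means the flow-run job itself must be able to read the flow from S3, which a Kubernetes service account alone does not grant unless it is bound to an IAM role (e.g. EKS IRSA). One alternative is injecting AWS credentials into the job from a Kubernetes Secret via a custom job template — a sketch, with the secret and container names as placeholders to adapt to the actual template in use:

    ```yaml
    # Fragment of a Kubernetes job template: give the flow container AWS
    # credentials from a k8s Secret so it can pull the flow from S3.
    apiVersion: batch/v1
    kind: Job
    spec:
      template:
        spec:
          containers:
            - name: flow                     # must match the template's container name
              env:
                - name: AWS_ACCESS_KEY_ID
                  valueFrom:
                    secretKeyRef:
                      name: aws-creds        # placeholder secret name
                      key: access-key-id
                - name: AWS_SECRET_ACCESS_KEY
                  valueFrom:
                    secretKeyRef:
                      name: aws-creds
                      key: secret-access-key
    ```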
  • John Muehlhausen

    8 months ago
    I'm on "account settings" -> "account" and I don't see a way to see my past payment activity. Where do I see a recap of each credit card charge?
    Anna Geller
    3 replies