• Varun Joshi

    1 year ago
    I'm not able to see the logs for my flows, which I used to be able to see previously. Yes, my log_stdout is set to True in the task settings. Any reason why this is happening?
    18 replies
  • Joseph Ellis

    1 year ago
    Has anyone got any recommendations on how best to deploy the Prefect agent in AWS? We’re leveraging the ECS agent to execute flow runs within our ECS cluster (great, that all works fine), but we’re contemplating whether the agent service itself is best run on EC2 or as a long-running container in Fargate. Any help appreciated 🙂.
    7 replies
  • Brent Bateman

    1 year ago
    Howdy, does Prefect have any recommended SIs familiar with how to implement Prefect with Snowflake (dbt a plus)?
    4 replies
  • Berty

    1 year ago
    👋 I need a little help with the cloud UI
    2 replies
  • dh

    1 year ago
    Howdy, when an Agent triggers a run of a registered flow, is there some standard way to ask Agent to pass some arguments into the flow? e.g. Agent would do:
    registered_flow.run(**runtime_args_passed_by_agent)
    Or is this disallowed by design (i.e., a registered flow should be self-sufficient for reproducibility)? Context: we have a flow that depends on a value that changes quite often (e.g. a dependency package version number). We don’t want to register a new flow each time we update the version number; rather, we’d like one flow, with the agent running it on the expectation that the package version will be provided at run time. We thought about using an env var, but not sure if it’s the best way…
    12 replies
  • Rob Fowler

    1 year ago
    damn, seems Dask thread executors are broken again, back to processes and everything is good
    5 replies
  • Brian Keating

    1 year ago
    I've written a ResourceManager to create and terminate EC2 instances. I test it out with a bare-bones workflow:
    import prefect
    from prefect import Flow, task

    @task
    def do_something_on_instance(instance_id):
        prefect.context.get('logger').info(f'Do something on instance {instance_id}')

    with Flow('hello-ec2') as flow:
        with EC2Instance('t2.micro') as instance_id:
            do_something_on_instance(instance_id)  # instance_id is a string
    This works correctly when using GitHub storage, but when I switch to S3, the flow fails with TypeError: cannot pickle 'SSLContext' object. Anyone know what's going on here? Note that the value returned by EC2Instance.setup is a str.
    8 replies
  • Joe McDonald

    1 year ago
    Anyone had this show up when using the ECS agent? We are running an ECS cluster with three agents, and Prefect Server in a separate ECS cluster running three instances of all services. The deprecated Fargate agent could apparently create a new family for each run, but the ECS agent doesn’t do that: it reuses the same family for the task definition and increments the revision, so when running a lot of flows at the same time we run into this:
    An error occurred (ClientException) when calling the RegisterTaskDefinition operation: Too many concurrent attempts to create a new revision of the specified family.
    2 replies
  • xyzz

    1 year ago
    I'm a bit confused by the information on https://docs.prefect.io/orchestration/faq/dataflow.html#gotchas-and-caveats , which claims to be an exhaustive list of the data that might end up in the Prefect database. It doesn't mention anything about flow metadata like the names of flows and tasks, or their configuration such as run_config, schedules, or storage. So I assume this list isn't actually exhaustive?
    3 replies
  • Noah Holm

    1 year ago
    I’m using S3 storage for flows running with an ECS agent. By default the flow’s tasks use S3Result, as outlined here. Is there any way to disable task results for individual tasks, or for all tasks in the flow, while keeping the S3 storage on the flow?
    10 replies