  • Nikola Lusic
    1 year ago
    Does the ECSRun configuration combined with LocalDaskExecutor(scheduler="processes", num_workers=4) support parallel execution of mapped tasks? Currently I'm unable to get the ECS task to spawn any additional processes - all tasks run in sequence (first image). When running the same flow in the local Prefect environment, the tasks all run in parallel (second image). If I use LocalDaskExecutor(scheduler="threads", num_workers=4), the flow tasks are executed in parallel, but a threaded flow only covers part of our use cases.
    ciaran +1
    20 replies
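    For reference, a minimal sketch of the setup being asked about, assuming Prefect 0.14+ and an existing ECS cluster/task definition (the flow name and resource sizes are placeholders):
    # Hypothetical example: a mapped flow configured for ECS with a process-based executor
    from prefect import Flow, task
    from prefect.executors import LocalDaskExecutor
    from prefect.run_configs import ECSRun

    @task
    def double(x):
        return x * 2

    with Flow("ecs-parallel-map") as flow:
        double.map([1, 2, 3, 4])

    # ECSRun controls where the flow runs; the executor controls how tasks run inside it
    flow.run_config = ECSRun(cpu="1 vcpu", memory="2 GB")
    flow.executor = LocalDaskExecutor(scheduler="processes", num_workers=4)
    In Prefect 0.14.x the executor is attached to the flow object and picked up from the flow's storage at run time, so it has to be set on the flow before registration rather than configured on the agent.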
  • Howard Cornwell
    1 year ago
    Hey, I'm receiving this error when running a Docker flow on a k8s cluster. The flow runs fine locally, but fails every time on the cluster:
    Failed to load and execute Flow's environment: TypeError("default() got an unexpected keyword argument 'default_scopes'")
    I tried out some already-deployed flows and they run fine, but if I re-deploy them they start raising the same error. Running 0.13.19 in the container and on the server. Any advice would be great!
    1 reply
  • Zach Schumacher
    1 year ago
    Hey, I'm on a paid standard account and have been running into the issue below semi-frequently. Note this is from a task run last week.
    Zach Schumacher
    1 reply
  • Carlos Gutierrez
    1 year ago
    Hi all 🙂 I have a question regarding the use of flow.serialized_hash() for flow change detection in automated flow registration processes. I found that whenever a flow is registered with a particular task, let's say task_A, and I then update the values of the parameters passed to the task (for instance task_A(var='typo_string') --> task_A(var='correct_string')), the serialized_hash() remains invariant, so the flow does not pick up the latest changes because a new version is never bumped on the server. I'd like to know if there is a better way to do this, or whether I'm using the wrong approach.
    Carlos Gutierrez
    Kevin Kho
    +1
    10 replies
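    For context, a minimal sketch of the registration pattern serialized_hash() is typically paired with, assuming Prefect 0.14+ (the flow and project name are placeholders):
    # Hypothetical example: only bump a new flow version when the serialized flow changes
    from prefect import Flow, task

    @task
    def task_A(var):
        return var

    with Flow("my-flow") as flow:
        task_A(var="correct_string")

    # serialized_hash() is computed from the serialized flow structure; as described
    # above, changing a plain literal passed to a task may leave the hash unchanged,
    # in which case the idempotency key suppresses a new version on the server
    flow.register(
        project_name="example",
        idempotency_key=flow.serialized_hash(),
    )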
  • Raúl Mansilla
    1 year ago
    Hello friends! I'm trying to run a flow on an ECS cluster… the thing is that I can see the task go pending and then running, but the flow never finishes because an error is raised:
    Failed to load and execute Flow's environment: ModuleNotFoundError("No module named 'ecs_test'")
    Raúl Mansilla
    Kevin Kho
    36 replies
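    For reference, a common cause of this error is that the module defining the flow (ecs_test here) is not present in the image the ECS task runs, so the flow cannot be re-imported at run time. A minimal sketch of baking the module into Docker storage, assuming Prefect 0.14+ (registry URL and paths are placeholders):
    # Hypothetical example: copy the flow's module into the image and make it importable
    from prefect import Flow
    from prefect.storage import Docker

    with Flow("ecs_test") as flow:
        ...

    flow.storage = Docker(
        registry_url="registry.example.com",                         # placeholder registry
        files={"/local/path/ecs_test.py": "/modules/ecs_test.py"},   # module copied into image
        env_vars={"PYTHONPATH": "$PYTHONPATH:/modules"},             # make it importable
    )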
  • Shea O'Rourke
    1 year ago
    Hey all, I work at a Boston startup that uses Prefect very heavily, and I have a question about Prefect's API rate limiting. In the past we had no cap on the number of concurrent calls we made to Prefect, and we started getting rate limited because our requests came in large bursts that may have been overloading Prefect. We then allowed only one concurrent Prefect call at a time to reduce the number of requests lost to rate limiting, but that proved too low, so we boosted it to five concurrent calls. We're now finding that too small as well and are looking to raise the number further. Our question is whether there are specific limits we should be aware of and work around to prevent the API rate limiting that has hurt us in the past.
    Shea O'Rourke
    nicholas
    3 replies
  • Alex Furrier
    1 year ago
    I'm trying to understand how to create a Flow of Flows (dependent flows) from flows I've been running locally. Currently I can run the flows in succession using a Makefile, where each file invoked by Python contains a Flow and a flow.run() call wrapped with CLI parameter inputs:
    test-flow-of-flows:
	@echo 'Running flow A'
	@python flows/flow_a.py \
		param1=foo \
		param2=bar
	@python flows/flow_b.py \
		param1=foo \
		param2=bar
	@python flows/flow_c.py \
		param1=baz \
		param2=bar
    I would like to combine those into a single Flow run with shared parameters passed to the flow runs. That seems to be what's described in this documentation, which mentions registering flows to specific projects using the orchestration API; so far I've been running flows without that. Is there any way to create a flow of flows by importing the flows locally, or do they have to be registered with the orchestration API?
    Kevin Kho
    3 replies
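    For reference, a minimal sketch of the orchestrator pattern from that documentation using StartFlowRun, which does assume the child flows are registered with the backend (the project name is a placeholder):
    # Hypothetical example: a parent flow that runs three registered flows in sequence
    from prefect import Flow, Parameter
    from prefect.tasks.prefect import StartFlowRun

    # wait=True makes each child flow run block until it finishes,
    # so each downstream flow only starts after the previous one completes
    start_a = StartFlowRun(flow_name="flow_a", project_name="example", wait=True)
    start_b = StartFlowRun(flow_name="flow_b", project_name="example", wait=True)
    start_c = StartFlowRun(flow_name="flow_c", project_name="example", wait=True)

    with Flow("flow-of-flows") as parent:
        param1 = Parameter("param1", default="foo")
        param2 = Parameter("param2", default="bar")

        a = start_a(parameters={"param1": param1, "param2": param2})
        b = start_b(parameters={"param1": param1, "param2": param2}, upstream_tasks=[a])
        c = start_c(parameters={"param1": "baz", "param2": param2}, upstream_tasks=[b])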
  • Nathan Atkins
    1 year ago
    Catch-up CronClock: I wanted to use the CronClock to run a flow at the same time every day, and I also want it to catch up if the start date and after date are before today. With start_date=5/25/2021, after_date=5/30/2021 and today=6/2/2021, I would get a schedule that runs on 5/31, 6/1 and 6/2; these would all be processed ASAP, and then the flow would wait until 6/3 to run again. I'm having two problems with this: 1. I don't see how to pass after into the schedule as part of flow.run(); I have hacked flow._run() to support this. 2. Something past my Python knowledge is causing CronClock.events() to do something weird when the yield returns. This causes execution to drop directly out of the while loop and exit the method, and each new call to schedule.next() winds up creating a new croniter and running the same event start_date again. If I build the clock directly and call next() on the iterator returned by events(), it works as I would expect.
    Nathan Atkins
    Chris White
    6 replies
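    For reference, a minimal sketch of driving the clock directly, which matches the behaviour described as working (the cron string, dates and timezone are placeholders):
    # Hypothetical example: iterate one CronClock generator so the croniter state is kept
    from itertools import islice

    import pendulum
    from prefect.schedules.clocks import CronClock

    clock = CronClock(
        cron="0 9 * * *",                                     # daily at 09:00
        start_date=pendulum.datetime(2021, 5, 25, tz="UTC"),
    )

    # events() returns a generator; holding on to it (instead of calling events() again)
    # means each next() advances the same underlying croniter rather than starting over
    events = clock.events(after=pendulum.datetime(2021, 5, 30, tz="UTC"))
    for event in islice(events, 5):
        print(event.start_time)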
  • Ben Collier
    1 year ago
    Hi there Prefect people! Just a quick question. We have our agent running behind a firewall and have added the firewall's root cert to the bundle at /etc/ssl/certs/ca-certificates.crt. However, we're still getting [SSL: CERTIFICATE_VERIFY_FAILED] when trying to authenticate with Prefect Cloud. I've set the env var SSL_CERT_FILE to this location, in case Python picks that up. Could you confirm where you pick up root certs from?
  • Ben Collier
    1 year ago
    QUICK UPDATE: Realised that requests in Python uses its own CA bundle, and set REQUESTS_CA_BUNDLE. All good now.
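    For reference, a minimal sketch of pointing both Python's ssl module and the requests library at the same bundle before the agent starts (the path is the one mentioned above):
    # Hypothetical example: make both ssl and requests trust the firewall's root cert
    import os

    bundle = "/etc/ssl/certs/ca-certificates.crt"
    os.environ["SSL_CERT_FILE"] = bundle        # read by Python's ssl module
    os.environ["REQUESTS_CA_BUNDLE"] = bundle   # read by the requests library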