  • Matt Harvey

    3 years ago
    Good morning! Apologies if this is answered in the docs, I can't seem to find it. Prefect always seems to run in UTC, and so the `CronClock` needs to be set in UTC. Not a big deal, but is it possible to override the timezone? 🕐
    5 replies
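    The thread's answer isn't captured here, but one stdlib workaround, assuming the cron expression really must be written in UTC: convert the intended local fire time to UTC first. A sketch in plain Python, not Prefect's API (`local_hour_to_utc` is a made-up helper name):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

def local_hour_to_utc(hour: int, tz: str) -> int:
    """UTC hour matching `hour` o'clock in `tz` on a fixed winter date.

    Caveat: the UTC offset changes across DST transitions, so a cron
    expression derived this way drifts by an hour for part of the year.
    """
    local = datetime(2021, 1, 15, hour, 0, tzinfo=ZoneInfo(tz))
    return local.astimezone(timezone.utc).hour
```

    For example, 9 AM in New York in winter (UTC-5) maps to hour 14 in the cron string.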
  • Stewart Webb

    3 years ago
    Hi All, I'm trying to figure out what the best practice would be within Prefect if, within a task, I need to send a job to a 3rd-party appliance which cannot scale, so I need to avoid sending too many concurrent jobs to it. All I can find in relation to this is the Queued state I could put a given Task in. I assume then I'd be looking at building my own resource-queuing system (using Redis or the like) to queue these jobs.
    3 replies
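    One generic pattern for this (a sketch, not a Prefect feature; all names below are made up): cap concurrency on the client side with a semaphore, so no more than N calls reach the appliance at once regardless of how many tasks run in parallel.

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT_JOBS = 2
_slots = threading.BoundedSemaphore(MAX_CONCURRENT_JOBS)
_lock = threading.Lock()
_in_flight = 0
peak_in_flight = 0  # highest number of simultaneous appliance calls seen

def send_job(job_id: int) -> int:
    """Stand-in for the call to the non-scaling appliance."""
    global _in_flight, peak_in_flight
    with _slots:  # blocks until one of the N slots frees up
        with _lock:
            _in_flight += 1
            peak_in_flight = max(peak_in_flight, _in_flight)
        try:
            time.sleep(0.01)  # simulate the appliance doing work
            return job_id * 10
        finally:
            with _lock:
                _in_flight -= 1

# Eight workers compete, but at most two jobs are in flight at once.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(send_job, range(8)))
```

    An external queue (Redis or the like) becomes necessary only when the limit must hold across several worker processes or machines; within one process, a semaphore is enough.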
  • Aakarsh Nadella

    3 years ago
    I am trying to run a Prefect flow on a Dask cluster using Kubernetes and AWS. I was able to import "prefect" and also print its version. But when I execute the flow, it says `ModuleNotFoundError: No module named 'prefect'`. I have installed all the dependencies needed by the Dask worker and Dask Jupyter using a YAML file.
    19 replies
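    Not the thread's resolution, but the usual cause of this symptom: `prefect` is installed where the flow is built (the Jupyter pod) and missing from the image the Dask workers run, so the workers fail when deserializing tasks. A quick environment check in plain stdlib Python (in Dask you could run something like this on every worker, e.g. via `client.run`):

```python
import importlib.util

def has_module(name: str) -> bool:
    """True if `name` is importable in this interpreter's environment."""
    return importlib.util.find_spec(name) is not None
```

    If the check passes in the notebook but fails on the workers, the worker image's dependency list is the place to fix.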
  • Argemiro Neto

    3 years ago
    Hello, everyone! I'm new to Prefect and to Python as well. Correct me if I'm wrong: tasks inside a flow will run in sequence, and if a task fails, the downstream ones will not be executed. In my tests, though, the execution order is not deterministic. In the code below I expect the following output:
    ==== START ====
    === starting the flow
    == generate called
    = status: [1, 2, 3]
    == task runner called with 1
    == task runner called with 2
    == task runner called with 3
    ==== END ====
    Code:
    from datetime import timedelta

    from prefect import Flow, task


    @task
    def generate_task_list() -> list:
        print("== generate called")
        return [1, 2, 3]
    
    
    @task(max_retries=3, retry_delay=timedelta(seconds=0))
    def task_runner(x: int) -> None:
        print('== task runner called with {}'.format(x))
    
    
    @task
    def print_status(text: any) -> None:
        print('= status: {}'.format(text))
    
    
    with Flow("random-mapping") as f:
        print_status('=== starting the flow')
        values = generate_task_list()
        print_status(values)
        final = task_runner.map(x=values)
    
    print('==== START ====')
    f.run()
    print('==== END ====')
    By running this multiple times I get different results but never what I expected. What am I missing? Thank you!
    9 replies
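    For anyone puzzling over the same behavior: Prefect 1.x orders tasks by data dependencies, not by their position in the `with Flow(...)` block, so `print_status('=== starting the flow')` has no upstream edge and may run at any point. The stdlib `graphlib` module (Python 3.9+) illustrates the idea with hypothetical task names:

```python
from graphlib import TopologicalSorter

# Edges exist only where data flows. "print_start" takes a literal
# string, so nothing forces it to run before (or after) anything else.
deps = {
    "print_start": set(),
    "generate": set(),
    "print_values": {"generate"},
    "runner": {"generate"},
}
ts = TopologicalSorter(deps)
ts.prepare()
ready = set(ts.get_ready())  # every task whose dependencies are all met
```

    Both `print_start` and `generate` are immediately runnable, so a scheduler may legitimately execute them in either order; forcing an order requires declaring an explicit upstream dependency.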
  • Brett Naul

    3 years ago
    is there a simple way to use a secret in a custom task? so far I’ve been using a `RemoteEnvironment` and my existing cluster already loads `GOOGLE_APPLICATION_CREDENTIALS` from a k8s secret; for a `DaskKubernetesEnvironment` I guess I need to pull it from a Prefect `Secret` instead. do I just need to add `os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = Secret(...).get()` everywhere or is there an easier way?
    14 replies
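    One generic pattern (plain Python; `fetch_secret` stands in for however the Prefect `Secret` is actually read): export the variable once per process instead of repeating the `os.environ[...] = ...` line in every task.

```python
import os

def export_secret(env_var: str, fetch_secret) -> str:
    """Set `env_var` from `fetch_secret()` unless it is already set.

    Libraries that read GOOGLE_APPLICATION_CREDENTIALS (or any other
    variable) then pick it up without per-task boilerplate.
    """
    if env_var not in os.environ:
        os.environ[env_var] = fetch_secret()
    return os.environ[env_var]
```

    Calling this at the top of each task (or once at worker start-up) is idempotent: the first call wins and later calls are no-ops.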
  • Argemiro Neto

    3 years ago
    Is it possible to run `tasks` inside `tasks` and `flows` inside `tasks`? I was able to call a `flow` inside a task but got a `ValueError: Could not infer an active Flow context` error every time I tried to send parameters to the `flows` from the `tasks`. I got the same error when calling a `task` within a `task`.
    7 replies
  • Alex Lopes

    3 years ago
    Hi guys, good morning. 🙂 I'm starting to use Prefect, and have a quick newbie question: in what kind of scenario would I use the Task class instead of the task decorator? Just trying to understand it better.
    6 replies
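    A rough rule of thumb, sketched in plain Python rather than Prefect's actual classes: the decorator suits stateless logic, while a class earns its keep when the task needs constructor-time configuration you want to reuse. (`AddN` and the minimal `task` shim below are made up for illustration.)

```python
def task(fn):
    """Minimal stand-in for a @task decorator: tag a plain function."""
    fn.is_task = True
    return fn

@task
def add_one(x: int) -> int:
    # Stateless logic: a decorator is the shortest path.
    return x + 1

class AddN:
    """Class-based task: configure once, instantiate many variants."""
    def __init__(self, n: int):
        self.n = n

    def run(self, x: int) -> int:
        return x + self.n

# Two differently-configured instances of the same task logic.
add_two = AddN(2)
add_ten = AddN(10)
```

    With only the decorator, the two variants would need separate functions or an extra argument threaded through every call site.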
  • Jerry Thomas

    3 years ago
    I would like to try running my prefect flow on a dask-kubernetes cluster. I saw that it is possible but I am not able to grasp how. Can you share an example of running a prefect flow on a dask-kubernetes cluster? Should I create a custom worker (pod/docker) for the cluster with the appropriate libraries? How does my prefect flow end up executing on the dask-worker if the worker pod/docker does not contain my code?
    4 replies
  • Christopher Stokes

    3 years ago
    Hello folks. I feel like I'm missing something pretty basic. I have the following code that runs the `CreateRidAlert` task twice and throws warnings:
    create_alert = CreateAlert()
    create_rid_alert = CreateRidAlert()
    
    with Flow('Alert Flow') as flow:
        alert_json = Parameter(name="alert_json", required=True)
        alert = create_alert(alert_json)
        rid_alert = create_rid_alert(alert)
        ifelse(has_rid(alert), rid_alert, None)
        rid = extract_rid(rid_alert)
    I'd like to remove the `rid_alert = create_rid_alert(alert)` line, but then I don't know how to wire up the `ifelse` line with a result to pass to `extract_rid` for data flow. This may not be a clear question.
    10 replies
  • Markus Binsteiner

    3 years ago
    Quick question about scheduling: is there a way to gracefully cancel a running (infinite) flow that uses a schedule? Also, I need to manage several of them in my application, so I plan to write some sort of management class for runs, unless something like this already exists. Does anybody know?
    3 replies