  • Raymond Yu

    6 months ago
    Heya Prefectionists, I've built a fairly simple S3 sensor, but it appears to get stuck whenever it attempts to retry. Any suggestions or ideas as to why it could hang on retry? I'll include a snippet of the code in the comments; all the upstream tasks have their result set to PrefectResult().
    8 replies · Raymond Yu, Kevin Kho +1
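    A minimal sketch of what such an S3 sensor task with retries might look like, assuming Prefect 1.x and boto3; the bucket/key names are illustrative, not taken from the original snippet:

    from datetime import timedelta

    import boto3
    from botocore.exceptions import ClientError
    from prefect import Flow, task
    from prefect.engine.results import PrefectResult

    @task(max_retries=10, retry_delay=timedelta(minutes=1), result=PrefectResult())
    def wait_for_key(bucket: str, key: str) -> str:
        # head_object raises ClientError while the key is absent, which sends the
        # task into a Retrying state rather than blocking inside the task itself.
        try:
            boto3.client("s3").head_object(Bucket=bucket, Key=key)
        except ClientError as exc:
            raise RuntimeError(f"{key} not yet in {bucket}") from exc
        return key

    with Flow("s3-sensor") as flow:
        wait_for_key("my-bucket", "incoming/data.csv")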
  • Ayah Safeen

    6 months ago
    Hi all, I'm using Prefect Core, and when I run prefect server start I get the following error:
    Pulling postgres ... error
    Pulling hasura   ... error
    Pulling graphql  ... error
    Pulling apollo   ... error
    Pulling towel    ... error
    Pulling ui       ...
    2 replies · Ayah Safeen, Anna Geller
  • richard-social 812

    6 months ago
    Hello, is this the correct way to ensure a task only runs once a week, in a flow scheduled to run daily?
    @task(result=LocalResult(dir=f'{os.getcwd()}/storage/pre_delinquency_models',
                             serializer=H2OModelSerializer()),
          checkpoint=True, target='{date:%Y}-Week {date:%U}', log_stdout=True)
    def train_new_model(data: pd.DataFrame):
    I have it running on prefect cloud with a run_config that looks like below:
    run_config = LocalRun(env={'PREFECT__FLOWS__CHECKPOINTING':os.environ.get('PREFECT__FLOWS__CHECKPOINTING', 'true')})
    However, looking at the results folder I see that the task result file is created anew with each daily run. What am I missing?
    4 replies · richard-social 812, emre +1
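    A minimal sketch of target-based caching keyed by week, assuming Prefect 1.x; the directory and flow names are illustrative. One thing worth checking is that checkpointing is enabled in the environment where the flow actually runs (i.e. passed through the run_config/agent to the flow run), not only where it is registered, otherwise the target is never consulted:

    import os

    from prefect import Flow, task
    from prefect.engine.results import LocalResult

    @task(
        result=LocalResult(dir=os.path.join(os.getcwd(), "storage", "models")),
        checkpoint=True,
        target="{date:%Y}-Week {date:%U}",  # identical target for every run within a given week
    )
    def train_new_model() -> str:
        # With checkpointing on, Prefect skips this body whenever a result file
        # matching the rendered target already exists.
        return "trained model artifact"

    with Flow("weekly-training") as flow:
        train_new_model()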
  • Dekel R

    6 months ago
    Hey, I have a flow running using VertexRun - this is my run config:
    flow.run_config = VertexRun(scheduling={'timeout': '3600s'},
                                machine_type='n2-highcpu-80', labels=["ml"],
                                service_account=PREFECT_SERVICRE_ACCOUNT)
    It works without the scheduling parameter - I added it following this documentation: https://docs.prefect.io/orchestration/flow_config/run_configs.html#vertexrun Now when registering to Prefect Cloud and running I get this error:
    Parameter to MergeFrom() must be instance of same class: expected google.protobuf.Duration got str.
    Am I missing something? Thanks
    3 replies · Dekel R, Anna Geller
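    The error suggests that the Vertex API expects a protobuf Duration where the run config currently passes the string "3600s". One possible workaround, sketched under the assumption that VertexRun forwards the scheduling dict to the Vertex client unchanged (not confirmed by the docs linked above):

    from google.protobuf.duration_pb2 import Duration

    from prefect.run_configs import VertexRun

    run_config = VertexRun(
        scheduling={"timeout": Duration(seconds=3600)},  # Duration object instead of "3600s"
        machine_type="n2-highcpu-80",
        labels=["ml"],
    )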
  • Emma Rizzi

    6 months ago
    Hello! I have a flow that uses DateTimeParameters. I would like to schedule it with one date defaulting to the day of the scheduled run, or the first day of the month, for example. Is there a way to do this through schedules, or should I handle it in my code?
    3 replies · Emma Rizzi, Stéphan Taljaard +1
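    One way to handle this in code rather than in the schedule, assuming Prefect 1.x: make the DateTimeParameter optional and fall back to the scheduled start time from prefect.context inside a task. Names here are illustrative:

    import prefect
    from prefect import Flow, task
    from prefect.core.parameter import DateTimeParameter

    @task
    def resolve_date(dt):
        # When no value is supplied, default to the date of the scheduled run.
        return dt or prefect.context.get("scheduled_start_time")

    with Flow("datetime-default") as flow:
        run_date = DateTimeParameter("run_date", required=False)
        resolve_date(run_date)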
  • Patrick.H

    6 months ago
    hello
  • Hugo Polloli

    6 months ago
    Hi, I have chained tasks that are simply database operations; they don't return anything, but they have to be run in a specific order. So far I've simply returned True and passed it, unused, to the next task. Is that OK, or did I miss a more elegant way to do this in the docs? (I don't find it particularly bad, and since I suffix it with "_done" I think it's explicit, but I was still wondering.)
    t1_done = task1()
    t2_done = task2(t1_done)
    ....
    3 replies · Hugo Polloli, Sylvain Hazard +1
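    One alternative to passing dummy booleans, assuming Prefect 1.x: the functional API accepts an upstream_tasks keyword, which creates a pure state dependency with no data exchanged. A minimal sketch with illustrative task names:

    from prefect import Flow, task

    @task
    def create_table():
        ...  # database DDL

    @task
    def load_rows():
        ...  # must run after create_table, but needs nothing from it

    with Flow("db-ops") as flow:
        t1 = create_table()
        load_rows(upstream_tasks=[t1])  # ordering only, no return value passed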
  • Chris Reuter

    6 months ago
    Hi all! It's our first ever launch week and to celebrate, in case you don't follow the #announcements channel... we're giving away free pizza starting at 12pm Eastern! https://prefect-community.slack.com/archives/CKNSX5WG3/p1647264267554719
  • Nico Neumann

    6 months ago
    Hi, is it possible to have a callback function that is called every time a flow changes its state, but which runs on the system where I kick off the flow rather than on the system actually running the flow? I tried Flow(..., state_handlers=[…]) but the callback is called on the system where the flow is executed. My idea is to use StartFlowRun, e.g. on my local computer, which starts the flow in AWS cloud. And every time the state changes (submitted, running, canceled, etc.) a local callback function is called where I see the state and flow_run_id.
    3 replies · Nico Neumann, Kevin Kho +1
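    One way to get a locally executed callback, assuming Prefect 1.x against Cloud/Server: kick off the run from the local machine and poll its state with the Client, calling a local function on every transition. This is polling rather than a true push callback, and it assumes StartFlowRun returns the new flow run id when wait=False; flow/project names are illustrative:

    import time

    from prefect.client import Client
    from prefect.tasks.prefect import StartFlowRun

    def on_state_change(flow_run_id, state):
        # Runs on the machine doing the polling, not on the agent executing the flow.
        print(f"{flow_run_id} -> {type(state).__name__}")

    flow_run_id = StartFlowRun(project_name="my-project", wait=False).run(flow_name="my-flow")

    client = Client()
    last_state = None
    while True:
        state = client.get_flow_run_info(flow_run_id).state
        if type(state) is not type(last_state):
            on_state_change(flow_run_id, state)
            last_state = state
        if state.is_finished():
            break
        time.sleep(10)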
  • Tyndyll

    6 months ago
    Hi, I have what is probably a fairly basic question, but I can't find the right sequence of words to get an answer in the docs. Say I have a flow that takes 23 hours to run, and I kick it off at midnight each night. Is it possible to block today's run if yesterday's is still running?
    5 replies · Tyndyll, emre +2
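    One do-it-yourself option, assuming Prefect 1.x with Cloud/Server: make the first task of the flow query the GraphQL API for other Running runs of the same flow and raise SKIP if any exist, so the rest of the run is skipped. (Prefect Cloud also offers flow-run concurrency limits, which may be the simpler route.) The GraphQL shape below follows the Hasura-style schema and is a sketch, not a verified query:

    import prefect
    from prefect import Flow, task
    from prefect.client import Client
    from prefect.engine.signals import SKIP

    @task
    def ensure_no_other_run():
        flow_id = prefect.context.flow_id
        this_run = prefect.context.flow_run_id
        query = """
        query($flow_id: uuid) {
          flow_run(where: {flow_id: {_eq: $flow_id}, state: {_eq: "Running"}}) {
            id
          }
        }
        """
        runs = Client().graphql(query, variables={"flow_id": flow_id})["data"]["flow_run"]
        others = [r["id"] for r in runs if r["id"] != this_run]
        if others:
            raise SKIP(f"Earlier run(s) still in progress: {others}")

    with Flow("nightly-23h-job") as flow:
        guard = ensure_no_other_run()
        # make the real work depend on the guard, e.g. some_task(upstream_tasks=[guard])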