• Braun Reyes

    2 years ago
    Hey there... wondering about 'requires result handlers for all flows with keyed edges that are registered with Prefect Cloud' on the master branch. I rebased our fork and now some flows are failing to register. Is there a good way to understand this change? Is this so that flow runs can pick up from the last known successful task?
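    A minimal sketch of the change the question above seems to refer to, assuming the Prefect Core API of that era (result handlers in prefect.engine.result_handlers, attachable at the flow level); the bucket name is a placeholder:
    from prefect import task, Flow
    from prefect.engine.result_handlers import S3ResultHandler

    @task
    def extract():
        return [1, 2, 3]

    @task
    def load(data):
        print(len(data))

    # Keyed edges are the edges that pass data between tasks; attaching a
    # result handler at the flow level covers all of them at once.
    with Flow("etl", result_handler=S3ResultHandler(bucket="my-results-bucket")) as flow:
        load(extract())

    flow.register()  # registration with Prefect Cloud should now pass the check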
  • Braun Reyes

    2 years ago
    noticed that Docker storage did not get the sensible defaults
  • Braun Reyes

    2 years ago
    which makes sense, since the underlying storage is ephemeral
  • Braun Reyes

    2 years ago
    I think we will just force S3 storage on our flows until we get S3 storage working with the Fargate agent
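    A quick sketch of what forcing S3 storage on a flow could look like, assuming the S3 storage class under prefect.environments.storage; the bucket name is a placeholder:
    from prefect import task, Flow
    from prefect.environments.storage import S3

    @task
    def say_hello():
        print("hello")

    with Flow("s3-stored-flow") as flow:
        say_hello()

    # Store the serialized flow in an S3 bucket rather than baking it into an
    # ephemeral Docker image.
    flow.storage = S3(bucket="my-flow-storage-bucket")
    flow.register()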
  • Braun Reyes

    2 years ago
    though it would be interesting to understand the reasoning behind forcing persistent storage between tasks. I am all for having one proper way to do things... it would just be good to understand.
  • Braun Reyes

    2 years ago
    An example of how to migrate would be nice, as I feel this will break a decent number of flows once it comes out
  • Braun Reyes

    2 years ago
    Final note... I do understand that master is 'not prod ready' and we are good with adjusting on our end.
  • alexandre kempf

    2 years ago
    Hello guys! I have a weird bug when I'm using Prefect. I reduced the example to the most minimal one I could find. This works:
    from prefect import task, Parameter, Flow
    
    @task
    def load_data(c, b):
        return c
    
    @task
    def bugtask(c):
        return c
    
    with Flow("training") as flowModel:
        init = Parameter("c")
        data = load_data(init, b={"f": 4})
        # data = load_data(init, b={"f": bugtask})
    
    state_model = flowModel.run(c=5)
    Now if you just use the commented-out line instead of the first
    load_data
    call (basically, if you have a task in your arguments, even nested in other structures and not executed, there is an error). Is this expected? I have the feeling that it tries to run all the tasks, even if they are not called! @josh @Dylan, this is a problem for subflows, since I must give the configuration of the subflow as an argument to my run_flow task 😒
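    One possible workaround for the example above, sketched under the assumption that the error comes from Prefect turning any Task instance it finds inside call arguments into a dependency: pass a plain callable in the nested dict instead of the Task object, so no hidden edge is created (bug_callable below is just an undecorated stand-in for bugtask):
    from prefect import task, Parameter, Flow

    def bug_callable(c):
        # plain function, not decorated with @task, so Prefect treats it as an
        # ordinary constant when it shows up inside the dict below
        return c

    @task
    def load_data(c, b):
        return c

    with Flow("training") as flowModel:
        init = Parameter("c")
        # the nested dict now holds a plain callable, so nothing extra is wired
        # into the flow and nothing unexpected runs
        data = load_data(init, b={"f": bug_callable})

    state_model = flowModel.run(c=5)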
  • Kamil Okáč

    2 years ago
    I have trouble working with tasks declared in external files (using the Dask executor). Is this how it's supposed to work? main.py:
    from prefect import Flow
    from prefect.engine.executors.dask import DaskExecutor
    import mytask
    
    mt = mytask.MyTask()
    with Flow("Flow") as flow:
        t1 = mt(1)
    
    executor = DaskExecutor(address='tcp://....:8786')
    flow.run(executor=executor)
    mytask.py:
    from prefect import Task
    
    class MyTask(Task):
        def run(self, x):
            return x
    This leads to an error on the worker: "ModuleNotFoundError: No module named 'mytask'". If I use the @task decorator instead of subclassing, there's no problem.
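    One common fix for this kind of ModuleNotFoundError, sketched as a general Dask workaround rather than an official Prefect recipe: the workers have to be able to import mytask themselves, so either install it as a package on every worker or ship the file to them with Client.upload_file before running the flow.
    from dask.distributed import Client

    # Point at the same scheduler the DaskExecutor uses, then ship mytask.py to
    # every worker so `import mytask` succeeds when the task is deserialized there.
    client = Client('tcp://....:8786')
    client.upload_file("mytask.py")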
  • John Ramirez

    2 years ago
    When you use
    task.copy()
    the upstream dependencies are removed, but are the task results copied as well?
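    For context, a minimal sketch of the call the question refers to, assuming Prefect 1.x where Task.copy() returns a fresh task object that can be wired into a flow independently of the original:
    from prefect import task, Flow

    @task
    def add_one(x):
        return x + 1

    with Flow("copy-example") as flow:
        first = add_one(1)
        # per the question above, the copy starts without the original's
        # upstream dependencies and has to be bound to its own inputs
        second = add_one.copy()(2)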