  • Italo Barros

    1 year ago
    Hi everyone, sometimes I receive the following errors from the CloudTaskRunner (I'm running an "always-on" Local Agent on Windows Server 2019):
    ConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host
    
    urllib3.exceptions.ProtocolError: ("Connection broken: ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None)", ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None))
    If the Task or Flow tries to retry the execution, the error pops up again and only goes away if the Task/Flow is Cancelled and executed again. Is there any way to bypass that?
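    For context, task-level retries in Prefect 1.x are configured like this (a sketch; the task name is illustrative, and per the report above retries alone did not clear this error, so a longer retry_delay is only a guess at a mitigation):

    from datetime import timedelta
    from prefect import task

    # a delay between attempts gives the remote host time to recover
    # before the connection is re-established
    @task(max_retries=3, retry_delay=timedelta(minutes=5))
    def fetch_data():
        ...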
    1 reply
  • Wilson Bilkovich

    1 year ago
    I don’t have any errors in my apollo pod logs, but I see this in the Agent logs:
    urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='prefect-server-initial-apollo.prefect', port=4200): Max retries exceeded with url: /graphql (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x...>: Failed to establish a new connection: [Errno 111] Connection refused'))
    (prefect is my namespace, prefect-server-initial-apollo is the name of the service)
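    A hedged way to narrow down what that error points at, run from any pod in the same cluster: probe DNS resolution and the service port directly. The hello query is an assumption about a cheap health-check query on the Apollo endpoint; the host and port are taken from the error above.

    import requests

    url = "http://prefect-server-initial-apollo.prefect:4200/graphql"
    try:
        resp = requests.post(url, json={"query": "{ hello }"}, timeout=5)
        print(resp.status_code, resp.text[:200])
    except requests.exceptions.ConnectionError as exc:
        # DNS failure or connection refused shows up here
        print("service unreachable:", exc)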
    15 replies
  • Konstantin

    1 year ago
    How can I understand what it is missing?
    18 replies
  • Bastian Röhrig

    1 year ago
    Hey everyone, is there a way to use a FilterTask and rely on the output of two different tasks in the filter_func? I built a workaround where I combine the data I want to filter with the additional information, but that feels really clunky. I added some pseudo code to illustrate my workaround:
    from prefect import Flow, unmapped
    from prefect.tasks.control_flow import FilterTask

    with Flow("filtering") as flow:
        a_s = get_a_s()  # ["a1", "a2", "a3"]
        b = get_b()  # "b2"
        a_s_and_b = combine_a_s_and_b.map(a_s=a_s, b=unmapped(b))  # [("a1", "b2"), ("a2", "b2"), ("a3", "b2")]
        FilterTask(filter_func=a_and_b_end_on_the_same_character)(a_s_and_b)  # [("a2", "b2")]
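    A sketch of what that filter_func might look like, assuming each mapped item arrives as an (a, b) tuple; the function name comes from the pseudo code above, and the body is hypothetical:

    def a_and_b_end_on_the_same_character(pair):
        # each item produced by combine_a_s_and_b.map is an (a, b) tuple
        a, b = pair
        return a[-1] == b[-1]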
    1 reply
  • Evan Curtin

    1 year ago
    Hey folks, just trying out Prefect for the first time after using some other workflow systems before, loving the UX! Quick question: I’m using the ExecuteNotebook task and it’s spilling the cell content to stdout during normal execution; is there an option to suppress that? I’m using a target file to cache the output, and that’s working fine.
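    One hedged workaround, assuming the stdout noise comes from the underlying papermill call: wrap papermill directly in a custom task and leave log_output=False (papermill's default). This sidesteps ExecuteNotebook rather than configuring it; the task and parameter names here are illustrative.

    import papermill as pm
    from prefect import task

    @task
    def run_notebook(input_path: str, output_path: str) -> str:
        # log_output=False keeps executed cell output out of stdout
        pm.execute_notebook(input_path, output_path, log_output=False)
        return output_path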
    18 replies
  • Jean Da Rolt

    1 year ago
    Folks, is it possible to specify the number of threads for a specific task when running LocalDaskExecutor?
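    For reference, the knob I know of is flow-level rather than per-task: LocalDaskExecutor takes num_workers for the whole flow run (a sketch, assuming Prefect 1.x; whether a per-task thread count exists is exactly the open question here).

    from prefect import Flow
    from prefect.executors import LocalDaskExecutor

    # num_workers applies to the whole flow run, not to an individual task
    with Flow("example", executor=LocalDaskExecutor(scheduler="threads", num_workers=8)) as flow:
        ...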
    7 replies
  • Danny Vilela

    1 year ago
    Hi all! Is there a documented reason as to why checkpointing is disabled by default on local Prefect flows? That is, why is the behavior opt-in when other frameworks typically have it as a requirement (or opt-out)? https://docs.prefect.io/core/concepts/results.html#pipeline-persisted-results
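    For context, opting in looks roughly like this in Prefect 1.x (a sketch; the result directory is illustrative): checkpointing is switched on globally, and the task is given a Result type to persist to.

    # export PREFECT__FLOWS__CHECKPOINTING=true
    from prefect import task
    from prefect.engine.results import LocalResult

    @task(checkpoint=True, result=LocalResult(dir="/tmp/prefect-results"))
    def compute():
        return 42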
    3 replies
  • Blake List

    1 year ago
    Hi there, I have a flow of flows (a, b, and parent flow p). Flows a and b both query different databases, build a dataframe, do some processing, and then store them elsewhere. I want to make a third flow, c, that merges these dataframes. Is it possible to input the dataframes into flow c without having to load them from the new db or read them from a csv? In other words, can I pass them into flow c, e.g. using the parent flow?
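    One hedged sketch of how the parent flow p could surface a child flow's dataframe without re-reading it from a database or csv, assuming Prefect 1.x and checkpointed results in flows a and b; the flow names and task slugs here are illustrative:

    from prefect import Flow
    from prefect.tasks.prefect import create_flow_run, get_task_run_result

    with Flow("p") as parent:
        run_a = create_flow_run(flow_name="a")
        run_b = create_flow_run(flow_name="b")
        # pulls the persisted result of a task inside each child flow run
        df_a = get_task_run_result(run_a, task_slug="build_dataframe-1")
        df_b = get_task_run_result(run_b, task_slug="build_dataframe-1")
        # df_a and df_b can then feed the merge step flow c would perform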
    2 replies