Tim Helfensdörfer 08/08/2022, 11:16 AM
It runs slower by a factor of 2–10x. This is our test setup:
We can't share any code from inside
```python
import sys

from prefect import flow


def run_flow():
    calculate_something()


@flow(
    name=FLOW_NAME,
    task_runner=get_default_task_runner(),
    version=get_file_hash(__file__),
    timeout_seconds=get_default_timeout(),
)
def run_prefect_flow():
    global USE_PREFECT_LOGGER
    USE_PREFECT_LOGGER = True
    run_flow()


if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "--no-prefect":
        # Normal performance
        run_flow()
    else:
        # Bad performance
        run_prefect_flow()
```
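Since the thread mentions a pstats profile, here is a minimal sketch of how the two entry points could be profiled for comparison with `cProfile`/`pstats`. The `calculate_something` body below is a hypothetical stand-in, since the real workload is not shared in the thread; in practice you would profile `run_flow()` and `run_prefect_flow()` the same way and diff the cumulative times:

```python
import cProfile
import io
import pstats


def calculate_something():
    # Hypothetical stand-in workload; the real flow body is not shown.
    total = 0
    for i in range(100_000):
        total += i * i
    return total


def profile_report(fn):
    """Run fn under cProfile and return the top entries by cumulative time."""
    profiler = cProfile.Profile()
    profiler.enable()
    fn()
    profiler.disable()
    stream = io.StringIO()
    stats = pstats.Stats(profiler, stream=stream).sort_stats("cumulative")
    stats.print_stats(10)  # top 10 entries
    return stream.getvalue()


report = profile_report(calculate_something)
print(report)
```

Comparing the two reports side by side should show whether the extra time is spent inside the workload itself or in the orchestration layer around it.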
- Are there any circumstances you know of where this might happen? What overhead does `@flow` bring into play? Does it analyze HTTP or DB requests for debugging purposes, which might explain the performance degradation? What I can offer is a pstats profile/graph in a DM, because it may also contain sensitive data. *currently = for as long as we can remember using Prefect 2.