Morning,
I'm trying to register a flow from another Python script (as described in
https://github.com/PrefectHQ/prefect/discussions/4042).
My registration code:
import os.path as p
from flows.xxx import testflow
from prefect.storage import S3

flows = [testflow]
root = p.dirname(p.realpath(__file__))
storage = S3(stored_as_script=True, key='testflow.py', bucket='test')

if __name__ == '__main__':
    for flow_file in flows:
        flow = flow_file.flow
        print(f"Registering flow {flow.name} from {flow_file}")
        storage.add_flow(flow)
        flow.register(
            project_name='test',
            idempotency_key=flow.serialized_hash()
        )
The flow itself:
from prefect import Flow
from prefect.executors import LocalDaskExecutor
from prefect.run_configs import KubernetesRun
from prefect.storage import S3

name = "testflow"
executor = LocalDaskExecutor()
storage = S3(stored_as_script=True, key='testflow.py', bucket='test')
run_config = KubernetesRun(
    job_template_path='https://XXX/job_template/k8s_job_template.yaml')

with Flow(name=name, executor=executor, storage=storage, run_config=run_config) as flow:
    ....
The error I get is the same as the one you get when storage isn't added to the flow at all:
Failed to load and execute flow run: ValueError('Flow is not contained in this Storage')
What am I missing?
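One guess on my part (not sure if this is the intended pattern): should the registration script assign its storage back onto the flow before registering, so the registered flow points at the storage it will actually be loaded from? Roughly like this, with flow.storage being my assumption about what the agent uses at run time:

    for flow_file in flows:
        flow = flow_file.flow
        flow.storage = storage      # guess: override the storage defined in the flow file
        storage.add_flow(flow)      # add the flow to the storage it should be loaded from
        flow.register(
            project_name='test',
            idempotency_key=flow.serialized_hash()
        )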