Michał Augoff  03/30/2022, 1:43 PM

    def get_flow(...):
        with Flow(...) as f:
            tasks
        return f
then I use it in another file for deployment and set some env-specific configuration:

    from module_x import get_flow

    f = get_flow()
    f.result = S3Result(bucket=dev or prod bucket here)

    # no results with this storage configuration
    f.storage = S3(bucket & key, stored_as_script=True, local_script_path=<this file>)

    # results work with this storage
    f.storage = Docker(...)

    f.register()
With S3 storage I don't get any results saved in S3; everything works fine when using Docker storage. I wonder if this has anything to do with Docker using serialization and S3 using the "as script" approach.

Anna Geller
you asked about task results - Results are an entirely different concept than Storage in Prefect 1.0. Docker storage won't store any results, but you can use the S3Result class with both S3 and Docker storage.

Michał Augoff  03/30/2022, 1:47 PM
this comment was probably related to my incomplete example, but just to clarify - I set S3Result in both cases; with S3 storage it somehow isn't getting picked up at registration, while it works fine if I use Docker storage.
Kevin Kho
flow.register() gets metadata and pushes it to Prefect Cloud. The metadata can be found here. Notice that executor and results are not part of it, so we don't store those - I think because the result can contain sensitive info like the bucket name. So the result gets loaded in from the flow file, and you need to define it there.

Michał Augoff
03/30/2022, 2:19 PM
so settings applied outside of get_flow are not picked up. Would this reasoning be correct?

Kevin Kho
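[Editor's sketch] A toy model of the mechanism discussed above, in plain Python - the class names and behaviour here are illustrative assumptions, not the real Prefect 1.0 API. It shows why the reasoning holds: registration pushes only metadata, and with script-based storage the agent re-executes the flow script at run time, so a result attached only in the deployment file is lost.

```python
# Toy model (NOT the real Prefect API): registration sends metadata
# only, and with stored_as_script=True the agent re-executes the flow
# script at run time, so attributes set outside the script are lost.

FLOW_SCRIPT = """
class Flow:
    def __init__(self, name):
        self.name = name
        self.result = None   # no result configured inside the script

flow = Flow("etl")
"""

def register(flow):
    # registration serializes metadata only; result/executor are
    # deliberately excluded, mirroring the serialized-flow schema
    return {"name": flow.name}

def load_from_script(script):
    # roughly what an agent does with script-based storage
    namespace = {}
    exec(script, namespace)
    return namespace["flow"]

# "deployment file": build the flow, then attach a result afterwards
namespace = {}
exec(FLOW_SCRIPT, namespace)
flow = namespace["flow"]
flow.result = "S3Result(dev-bucket)"   # set outside the script body
register(flow)                         # result never leaves the process

runtime_flow = load_from_script(FLOW_SCRIPT)
print(runtime_flow.result)             # -> None: the assignment was lost
```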
Michał Augoff
03/30/2022, 2:35 PM
I set stored_as_script=False and the results work now.
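[Editor's sketch] For reference, the other way out that Kevin pointed at: define the result inside get_flow, parameterised per environment, so it is present whenever the script is re-executed. The Flow and S3Result classes and the RESULTS_BUCKET variable below are hypothetical stand-ins, not the real Prefect 1.0 API.

```python
# Stand-in sketch (NOT the real Prefect classes): the result is built
# inside get_flow, so it survives a re-load of the script; the bucket
# comes from a hypothetical RESULTS_BUCKET environment variable.
import os

class S3Result:
    def __init__(self, bucket: str):
        self.bucket = bucket

class Flow:
    def __init__(self, name: str, result=None):
        self.name = name
        self.result = result

def get_flow() -> Flow:
    # env-specific config resolved inside the flow-building function,
    # so the same script works for dev and prod deployments
    bucket = os.environ.get("RESULTS_BUCKET", "dev-bucket")
    return Flow("my-flow", result=S3Result(bucket=bucket))

flow = get_flow()
print(flow.result.bucket)   # dev-bucket when RESULTS_BUCKET is unset
```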