Claire Herdeman
10/14/2021, 5:32 PM
Claire Herdeman
10/14/2021, 5:35 PM
I have a flow_<agency_name>.py file with the main flow; I run and register the flow from this file. Additionally, I have a tasks/<agency_name>.py file where I have written the tasks for this flow; these are imported in the flow file. My question is: when I register the flow, are the tasks imported into the flow file included in the object stored in S3?
Claire Herdeman
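As background on the question above, here is an editor's sketch (stdlib only; Prefect itself serializes flows with cloudpickle, whose behavior differs in some cases) showing that Python's built-in pickle stores functions from imported modules by reference (module name plus qualified name), not by value, so unpickling only works where the defining module is importable:

```python
import math
import pickle

# Pickling a function from an imported module stores only a
# reference (module name + qualified name), not the code itself.
data = pickle.dumps(math.sqrt)

# The payload is tiny: it names "math" and "sqrt" rather than
# embedding the implementation.
assert b"math" in data and b"sqrt" in data

# Loading succeeds only if `math` is importable in this process.
restored = pickle.loads(data)
print(restored(9.0))  # 3.0
```

This is the general pickle behavior, not a statement about Prefect's exact semantics for task modules.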
10/14/2021, 5:35 PM
Anna Geller
To use S3 storage, first install Prefect with the aws subpackage:
pip3 install "prefect[aws]"
Then you have several options to configure S3 storage.
1. You can either choose the default pickle-based storage - this will store your flow in serialized form
2. ...or you can choose script-based storage - this assumes your flow is stored on S3 as a normal .py file → you need to pass stored_as_script=True to configure that.
3. If you went for script-based storage, you can additionally let Prefect upload your local flow file to S3 → just pass local_script_path as an argument. Otherwise, you can let your flows be pushed to S3 as part of a CI/CD pipeline.
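For contrast with the script-based example below, a minimal sketch of option 1 (the default pickle-based storage) might look like this; the bucket and key names are placeholders, and this assumes the Prefect 1.x S3 storage API:

```python
from prefect.storage import S3

# Default pickle-based storage: without stored_as_script=True,
# Prefect serializes the flow object and uploads it at registration.
pickle_storage = S3(
    bucket="prefect-datasets",    # placeholder bucket name
    key="flows/s3_storage_demo",  # optional; auto-generated if omitted
)
```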
Here is an example with a local agent:
from prefect.storage import S3
from prefect.run_configs import LocalRun
from prefect import task, Flow

FLOW_NAME = "s3_storage_demo"
STORAGE = S3(
    bucket="prefect-datasets",
    key=f"flows/{FLOW_NAME}.py",
    stored_as_script=True,
    # if you add local_script_path, Prefect will upload the local
    # flow script to S3 during registration
    local_script_path=f"{FLOW_NAME}.py",
)


@task(log_stdout=True)
def hello_world():
    print("hello world")


with Flow(
    FLOW_NAME, storage=STORAGE, run_config=LocalRun(labels=["s3"])
) as flow:
    hello_world()
Before you start the agent, make sure that your AWS CLI is configured with permissions to use S3. Then start the agent with:
prefect agent local start --label s3
Let me know if you have any issues with it.
Claire Herdeman
10/14/2021, 6:19 PM
Claire Herdeman
10/14/2021, 6:20 PM
Anna Geller
Claire Herdeman
10/14/2021, 6:24 PM