10/12/2020, 9:26 AM
Hey all, I have a question regarding S3 storage. I would like to store my flow (as a Python script) as an S3 object, so that it can be easily retrieved and run from another agent (e.g. a colleague's computer, the cloud, etc.). I noticed that the flow script is not stored in S3 when I set
stored_as_script = True
which I tracked down to the implementation of the S3 `build()` function: it returns the storage object before uploading it to S3 when this variable is set to True.
def build(self) -> "Storage":
    """
    Build the S3 storage object by uploading Flows to an S3 bucket. This will upload
    all of the flows found in `storage.flows`. If there is an issue uploading to the
    S3 bucket an error will be logged.

    Returns:
        - Storage: an S3 object that contains information about how and where
            each flow is stored

    Raises:
        - botocore.ClientError: if there is an issue uploading a Flow to S3
    """
    if self.stored_as_script:
        if not self.key:
            raise ValueError(
                "A `key` must be provided to show where flow `.py` file is stored in S3."
            )
        return self
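To illustrate the behavior being described, here is a minimal stand-in class (not the real Prefect `S3` storage, just a sketch of its control flow): when `stored_as_script=True`, `build()` returns early, so nothing is ever sent to the bucket, which is why no S3 object for the script appears.

```python
# Stand-in mimicking the early-return logic of the S3 storage build() above.
# The real class uses boto3; here the "upload" is just recorded in a list.

class FakeS3Storage:
    def __init__(self, key=None, stored_as_script=False):
        self.key = key
        self.stored_as_script = stored_as_script
        self.uploaded = []  # records what would have been uploaded to S3

    def build(self):
        if self.stored_as_script:
            if not self.key:
                raise ValueError(
                    "A `key` must be provided to show where flow `.py` file is stored in S3."
                )
            # Early return: the script itself is never uploaded.
            return self
        # Only the pickle-based path reaches the upload step.
        self.uploaded.append(self.key or "pickled-flow")
        return self

storage = FakeS3Storage(key="flows/my_flow.py", stored_as_script=True).build()
print(storage.uploaded)  # [] -- nothing was uploaded
```

This is exactly the gap the PR mentioned below addresses: making the script path actually upload the file.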
Also, I can register the flow with S3 storage and
stored_as_script = False
, but even though it appears in the UI and has a corresponding S3 object, flow runs are not executed.
Nice, I managed to run the flow on a different agent by providing a label in the flow.register() call 🙂
:upvote: 2


10/12/2020, 6:54 PM
Hey @Julian, I just submitted a PR that will actually upload the script, just as you intended on your initial attempt!


10/12/2020, 8:25 PM
Nice, thank you 😁