Julian 10/12/2020, 9:26 AM
which I tracked down to the implementation of the S3 `build()` function, which returns the storage object before uploading it to S3 when this variable is set to True:

```python
stored_as_script = True
```

```python
def build(self) -> "Storage":
    """
    Build the S3 storage object by uploading Flows to an S3 bucket.

    This will upload all of the flows found in `storage.flows`. If there
    is an issue uploading to the S3 bucket an error will be logged.

    Returns:
        - Storage: an S3 object that contains information about how and
          where each flow is stored

    Raises:
        - botocore.ClientError: if there is an issue uploading a Flow to S3
    """
    self.run_basic_healthchecks()

    if self.stored_as_script:
        if not self.key:
            raise ValueError(
                "A `key` must be provided to show where flow `.py` file is stored in S3."
            )
        return self
    ..
```

Also, I can register the flow with S3 storage and

```python
stored_as_script = False
```

, but even though it appears in the UI and has a corresponding S3 object, flow_runs are not executed.
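For context, here is a minimal, self-contained sketch of the early-return behavior described above (this is illustrative only, not Prefect's actual class; `FakeS3Storage` and the `uploaded` flag are invented stand-ins): when `stored_as_script` is True, `build()` returns before the upload step ever runs, so nothing is written to the bucket.

```python
# Illustrative sketch of the early-return path in a build() method.
# FakeS3Storage and `uploaded` are stand-ins, not Prefect's real API.

class FakeS3Storage:
    def __init__(self, key=None, stored_as_script=False):
        self.key = key
        self.stored_as_script = stored_as_script
        self.uploaded = False  # tracks whether the upload step ran

    def build(self):
        if self.stored_as_script:
            if not self.key:
                raise ValueError("A `key` must be provided.")
            # Early return: the upload step below is never reached.
            return self
        self.uploaded = True  # stands in for the real upload to S3
        return self

script_storage = FakeS3Storage(key="flows/my_flow.py", stored_as_script=True).build()
pickle_storage = FakeS3Storage(stored_as_script=False).build()
print(script_storage.uploaded, pickle_storage.uploaded)  # False True
```

This matches the symptom above: with `stored_as_script = True` the storage object comes back without an upload having happened, while with `stored_as_script = False` the upload path runs.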
josh 10/12/2020, 6:54 PM
Julian 10/12/2020, 8:25 PM