# ask-community
j
When registering a flow that is stored in an S3 bucket, is it possible to use a different AWS_PROFILE for deploying than the one used when it is executed by Cloud? My machine uses a "production" profile to access the AWS account, but inside AWS the profile is "default". Any ideas? Thanks.
k
Hi @James Gibbard, sorry I’m a bit confused here. What is the difference between deploying and being executed by Cloud?
j
Hi @Kevin Kho. Sorry for the confusion. When running prefect register I'd like to use an AWS profile called "production", and when Prefect Cloud runs the flow, it should download the flow from S3 using the "default" AWS profile.
Currently I'm able to set the AWS profile used by adding it to the Flow parameters:
from prefect import Flow
from prefect.executors import DaskExecutor
from prefect.storage import S3

with Flow(
    "flow1",
    executor=DaskExecutor(),
    storage=S3(
        bucket="<bucket>",
        stored_as_script=True,
        client_options={"region_name": "eu-west-1", "profile_name": "production"},
    ),
) as flow:
    ...
I'd like to be able to use "production" when registering the flow and "default" when the flow gets executed. Is that possible?
k
For this specific case I don’t think it can be done, because storage is evaluated at build time while any Parameters are inserted at run time. I think you would need to set an environment variable (AWS_PROFILE, for example) where your code is running, and then load it in for the profile name.
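Roughly something like this where the storage is built (an untested sketch; the bucket and region are just the placeholders from your snippet):
import os
from prefect.storage import S3

# Pick the profile up from the environment wherever this script is executed;
# fall back to "default" if AWS_PROFILE isn't set.
profile = os.environ.get("AWS_PROFILE", "default")

storage = S3(
    bucket="<bucket>",
    stored_as_script=True,
    client_options={"region_name": "eu-west-1", "profile_name": profile},
)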
👍 1
This one can’t be done with Parameters.
j
Thanks @Kevin Kho - I'll give that a go 🙂
k
It still might not work, because the profile name might be evaluated (set to “production”) when you register, so the value won’t change once the flow is already registered.
Or are you looking for something like this?
with Flow("flow1") as flow:
    task()  # your task(s) here

flow.storage = S3("prod")  # set by default

if __name__ == "__main__":
    flow.storage = S3("default")  # overridden only when you run this script directly
j
The env var does seem to work. I was able to register and upload using the S3 storage. But that code example would be nice also. I have another issue now. I'm running the prefecthq/prefect:latest-python3.8 container in AWS ECS Fargate and have given the ECS Task permission to read the S3 bucket that the flow script is in. I was hoping that the Prefect boto3 authentication would use the auto-discovered method of auth, assume the task role, and access the flow script. But I'm getting the following error:
Error downloading Flow from S3: An error occurred (AccessDenied) when calling the GetObject operation: Access Denied
The AWS docs say that: "If your container instance is using at least version 1.11.0 of the container agent and a supported version of the AWS CLI or SDKs, then the SDK client will see that the AWS_CONTAINER_CREDENTIALS_RELATIVE_URI variable is available, and it will use the provided credentials to make calls to the AWS APIs." Are the Prefect container and boto3 auth able to make use of this method? Thanks
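For reference, a quick way to check which credential source boto3 ends up resolving inside the container would be something like this (untested sketch):
import boto3

# Untested sketch: inspect which provider the default credential chain used.
# If the ECS task role is picked up via AWS_CONTAINER_CREDENTIALS_RELATIVE_URI,
# this should report something like "container-role"; "env" or
# "shared-credentials-file" would mean the credentials came from elsewhere.
creds = boto3.Session().get_credentials()
print(creds.method if creds else "no credentials resolved")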
k
I don’t think so. You’ll need to pass in the Task Role ARN when you start your agent.
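If it's easier to do per flow instead of on the agent, something along these lines with an ECSRun run config might also work (just a sketch; the ARN is a placeholder):
from prefect.run_configs import ECSRun

# Sketch: attach the ECS task role to the flow's run config so the flow run's
# container can assume it (and so boto3 can reach the S3 storage bucket).
flow.run_config = ECSRun(
    task_role_arn="arn:aws:iam::123456789012:role/<prefect-flow-task-role>",
)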