Luke Orland

11/23/2020, 9:40 PM
If I have an AWS_CREDENTIALS secret in Prefect Cloud, should S3Result and S3Download use those credentials automatically, unless I specify something different for the boto_kwargs argument?
I think I'm seeing different behavior between the legacy 0.9.x and new 0.13.x Prefect versions.

Kyle Moon-Wright

11/23/2020, 9:47 PM
Hey @Luke Orland, just to clarify since we are in #prefect-server: the credentials you store as Secrets in Cloud should populate that variable at runtime if you have them set as PrefectSecret tasks, but they won't carry over to your Server instance and populate there (unless they are additionally stored in your execution environment).

Luke Orland

11/23/2020, 9:48 PM
that will get the values from Prefect Cloud?
(I don't roll my own Prefect Server)
(is #prefect-ui the new #prefect-cloud?)

Kyle Moon-Wright

11/23/2020, 10:03 PM
Haha, I think that one is for UI-specific things, but Cloud stuff is just #prefect-community now. Also, I'd definitely recommend updating from 0.9.x if you are able. That line grabs the value from context, which is populated with things like environment variables. To capitalize on secrets you've stored in Cloud, I think you'll want to use a PrefectSecret task; otherwise the variable will be populated by whatever is in your local/execution context (usually env vars), which is probably not defined if you have them as Cloud Secrets.
Though with older versions, there very well could be something else afoot.
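[Editor's note: a minimal sketch of the pattern Kyle describes, assuming Prefect 0.13.x APIs and the AWS_CREDENTIALS secret from the question; a PrefectSecret task resolves the stored value at runtime and can be passed to downstream tasks.]
from prefect import Flow, task
from prefect.tasks.secrets import PrefectSecret

@task
def use_credentials(creds: dict):
    # The secret value is resolved at runtime (from Cloud, or from local
    # context if local secrets are enabled) and arrives like any task input.
    print(sorted(creds.keys()))

with Flow("secret-demo") as flow:
    aws_creds = PrefectSecret("AWS_CREDENTIALS")
    use_credentials(aws_creds)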

Luke Orland

11/23/2020, 10:13 PM
ok, I think my issue is that I don't see a way of specifying credentials for the S3Result, since it's defined outside the context of a flow.

Kyle Moon-Wright

11/23/2020, 10:32 PM
Hmm, the S3Result will populate its values at runtime, so you could use the boto3_kwargs argument on the S3Result, which should grab those secret values from Cloud like this:
from prefect.engine.results import S3Result
from prefect.tasks.secrets import PrefectSecret

my_access_key = PrefectSecret("MY_CLOUD_SECRET_1")
my_secret_key = PrefectSecret("MY_CLOUD_SECRET_2")
my_boto3_creds = dict(
    ACCESS_KEY=my_access_key,
    SECRET_ACCESS_KEY=my_secret_key,
)
my_result = S3Result(
    bucket="my-bucket",  # placeholder name; S3Result requires a bucket
    # this kwarg only takes a dictionary
    boto3_kwargs=my_boto3_creds,
)
It has to be a dict, so you could also do:
export PREFECT__CONTEXT__SECRETS__AWS_CREDENTIALS=${AWS_CREDENTIALS}
and then pass AWS_CREDENTIALS to the kwarg to grab it from local context.
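[Editor's note: a sketch of that local-context route, assuming local secrets are enabled (prefect.config.cloud.use_local_secrets) and the exported value is a JSON string.]
# Assumes this was exported before starting the agent / flow run:
#   export PREFECT__CONTEXT__SECRETS__AWS_CREDENTIALS='{"ACCESS_KEY": "...", "SECRET_ACCESS_KEY": "..."}'
from prefect.client import Secret

# With local secrets enabled, Secret.get() reads prefect.context.secrets,
# which Prefect populates from PREFECT__CONTEXT__SECRETS__* env vars
# (string values are JSON-decoded when possible).
creds = Secret("AWS_CREDENTIALS").get()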

Luke Orland

11/24/2020, 3:14 AM
Adding the secret to the Storage was the ticket, thanks!
storage = Docker(
    ...,
    secrets=["AWS_CREDENTIALS"],
)
Then the S3Result and S3Download objects authenticate using that secret without having to set a value for their boto3_kwargs arguments.
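[Editor's note: for completeness, a sketch of the working setup Luke describes under Prefect 0.13.x, with hypothetical bucket, key, and registry names; with AWS_CREDENTIALS attached to the storage, the S3 objects pick up the secret at runtime without boto3_kwargs.]
from prefect import Flow, task
from prefect.engine.results import S3Result
from prefect.environments.storage import Docker
from prefect.tasks.aws.s3 import S3Download

result = S3Result(bucket="my-results-bucket")    # hypothetical bucket
download = S3Download(bucket="my-data-bucket")   # hypothetical bucket

@task
def process(data: str) -> str:
    return data.upper()

with Flow("s3-flow", result=result) as flow:
    raw = download(key="path/to/input.txt")      # hypothetical key
    process(raw)

flow.storage = Docker(
    registry_url="my-registry",                  # hypothetical registry
    secrets=["AWS_CREDENTIALS"],                 # makes the Cloud Secret available in the container
)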

Saulius Beinorius

11/24/2020, 8:07 AM
Personally, my favorite way is to use the default boto3 credential storage mechanisms, by adding files in ~/.aws locally and using EC2 / ECS roles after deployment. That way you never have to specify the credentials in code at all 🙂
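[Editor's note: a sketch of that approach, assuming the flow runs somewhere the default boto3 credential chain can find credentials (~/.aws files locally, or an EC2/ECS role after deployment); the bucket name is hypothetical.]
from prefect.engine.results import S3Result

# No boto3_kwargs and no Prefect Secret here: when neither is supplied,
# boto3 falls back to its default credential chain (~/.aws/credentials,
# environment variables, or the instance/task role) at runtime.
result = S3Result(bucket="my-results-bucket")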