# ask-community
p
How to authenticate for S3Result() if my AWS_CREDENTIALS are stored in prefect cloud secrets?
k
Hi @Prashob Nair, you can add it to the Storage of the Flow using Storage(secrets=["AWS_CREDENTIALS"]). You can find more in the docstring here.
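(A minimal sketch of what that looks like end to end, with a placeholder bucket name; the secrets keyword is the one shown above:)
from prefect import Flow, task
from prefect.engine.results import S3Result
from prefect.storage import S3

# The result writes to S3 at runtime; the bucket name is a placeholder.
@task(result=S3Result(bucket="my-bucket"), checkpoint=True)
def abc():
    return 1

with Flow("s3_test") as flow:
    abc()

# Listing the secret name on the Storage makes Prefect pull "AWS_CREDENTIALS"
# from Cloud into context when the flow runs.
flow.storage = S3(bucket="my-bucket", secrets=["AWS_CREDENTIALS"])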
p
I'm getting botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied. Seems like it can't find "AWS_CREDENTIALS".
k
Did you register with S3 Storage?
I think this error means the credentials are pulled but the role doesn't have access? Having no credentials would lead to Unable to locate credentials instead.
Do you have other credentials on the machine under ~/.aws that may be getting pulled?
p
No, I don't have any credentials stored locally
Actually, I wanted to use S3Result, which works if I pass access_key and secret_access_key to boto3_kwargs as a dict. I want to understand how to make S3Result read from the default AWS_CREDENTIALS secret as mentioned in the documentation.
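(A sketch of the workaround described above, assuming boto3's standard client keyword names and using placeholder values rather than real credentials:)
from prefect import Flow, task
from prefect.engine.results import S3Result

# Passing credentials explicitly works, but hard-codes secrets in the flow code.
result = S3Result(
    bucket="my-bucket",  # placeholder
    boto3_kwargs={
        "aws_access_key_id": "key",         # placeholder
        "aws_secret_access_key": "secret",  # placeholder
    },
)

@task(result=result, checkpoint=True)
def abc():
    return 1

with Flow("s3_test") as flow:
    abc()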
k
I went through the code and it does here. Did you add the secret to the Storage? That will get added to the context, and then this would pull it automatically.
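(Roughly, the lookup being referred to works like this; an illustrative sketch only, not the library's actual code, using the secret keys shown later in the thread:)
import boto3
import prefect

def get_s3_client():
    # If the flow's Storage listed "AWS_CREDENTIALS", the secret value is
    # available under prefect.context.secrets at runtime.
    creds = prefect.context.get("secrets", {}).get("AWS_CREDENTIALS")
    if creds:
        return boto3.client(
            "s3",
            aws_access_key_id=creds["ACCESS_KEY"],
            aws_secret_access_key=creds["SECRET_ACCESS_KEY"],
        )
    # Otherwise boto3 falls back to its default credential chain
    # (environment variables, ~/.aws, instance roles, ...).
    return boto3.client("s3")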
p
Yes, I tried STORAGE = S3(bucket="bucket-name", secrets=["AWS_CREDENTIALS"]) and it gives the error botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied.
k
Will give it a shot one sec
So if I have no credentials on my machine, I get botocore.exceptions.NoCredentialsError: Unable to locate credentials, so your Flow really is pulling something. Is this running on ECS?
p
The agent runs locally on my Mac; orchestration is by Prefect Cloud.
Sorry, there was a .aws/ folder in root, and after removing that it now gives the same botocore.exceptions.NoCredentialsError: Unable to locate credentials.
k
Ok that sounds better, I am less confused. Are you doing flow.run() or a registered run? Could you show me the syntax of your secret?
p
A registered run: flow.register(project_name="name"). The secret is { "ACCESS_KEY": "key", "SECRET_ACCESS_KEY": "secret" }
k
ok trying this
Ok so I was successful. To make sure we fetch the secret from Cloud, do export PREFECT__CLOUD__USE_LOCAL_SECRETS=false.
Then, to make sure the Secret is formatted correctly, run something like this:
from prefect import Flow, task
from prefect.tasks.secrets import PrefectSecret
import prefect

@task
def my_task(credentials):
    prefect.context.logger.info(credentials)

with Flow("example") as flow:
    my_secret = PrefectSecret("AWS_CREDENTIALS")
    res = my_task(credentials=my_secret)

flow.run()
This was the output for my secret:
[2021-11-09 16:06:54-0500] INFO - prefect.my_task | {'ACCESS_KEY': 'AKI......IL4U', 'SECRET_ACCESS_KEY': 'Sz5..........r08'}
It comes out as a dict
And this is my working flow (it was necessary to attach the secret to the Storage, which I think you already have):
from prefect import Flow, task
from prefect.engine.results import S3Result
from prefect.storage import Local

@task(result=S3Result(bucket="coiled-prefect"), checkpoint=True)
def abc():
    return 1

with Flow("s3_test") as flow:
    abc()

flow.storage = Local(secrets=["AWS_CREDENTIALS"])
flow.register("bristech")
p
Even after running the export command I got ValueError: Local Secret "AWS_CREDENTIALS" was not found. Do I check in config.toml?
k
Did you restart your agent after setting the env variable? You can also set it in config.toml:
[cloud]
use_local_secrets = false
p
Restarting the agent and the export command didn't seem to work, but manually setting the variable in config.toml worked.
k
Uhh that’s weird. Were you able to upload the result now?
p
Yes, it worked. Thanks a lot! If I change the storage from Local to S3 in the example above, it again gives me botocore.exceptions.NoCredentialsError: Unable to locate credentials. Any idea why that's happening? Here's the flow:
from prefect import Flow, task
from prefect.engine.results import S3Result
from prefect.storage import S3

@task(result=S3Result(bucket="coiled-prefect"), checkpoint=True)
def abc():
    return 1

with Flow("s3_test") as flow:
    abc()

flow.storage = S3(bucket="coiled-prefect", secrets=["AWS_CREDENTIALS"])
flow.register("bristech")
k
Is that still the same machine executing it? I can try using S3 storage one sec
p
yes
k
Ah ok, I understand. That's because the secret is not pulled during registration, so your local environment needs to be configured with credentials for boto3 to upload the flow to S3 Storage.
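(One way around that, a sketch with placeholder values: give the registering machine credentials through boto3's standard environment variables so the upload of the flow to S3 Storage can succeed.)
import os
from prefect import Flow, task
from prefect.engine.results import S3Result
from prefect.storage import S3

# boto3 reads these through its default credential chain; values are placeholders.
os.environ["AWS_ACCESS_KEY_ID"] = "key"
os.environ["AWS_SECRET_ACCESS_KEY"] = "secret"

@task(result=S3Result(bucket="coiled-prefect"), checkpoint=True)
def abc():
    return 1

with Flow("s3_test") as flow:
    abc()

# Registration uploads the flow to the S3 Storage bucket, so the machine running
# flow.register() needs working credentials even though the "AWS_CREDENTIALS"
# secret covers the flow run itself.
flow.storage = S3(bucket="coiled-prefect", secrets=["AWS_CREDENTIALS"])
flow.register("bristech")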