Jacob Bedard
05/25/2023, 9:45 PM
Serina
05/25/2023, 11:30 PM
Jacob Bedard
05/25/2023, 11:32 PM
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
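A minimal sketch of the same listing call outside Prefect, assuming placeholder credentials and bucket name; ListObjectsV2 needs the s3:ListBucket permission on the bucket ARN itself, not on the object ARNs:
import boto3

# Placeholder credentials and bucket name - substitute the values the failing
# flow uses. ListObjectsV2 requires s3:ListBucket on arn:aws:s3:::<bucket>,
# which is separate from s3:GetObject on the objects inside it.
s3 = boto3.client(
    "s3",
    aws_access_key_id="<key id>",
    aws_secret_access_key="<secret>",
)
resp = s3.list_objects_v2(Bucket="<top-level bucket>", MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"])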
Serina
05/25/2023, 11:44 PM
Jacob Bedard
05/25/2023, 11:46 PM
Serina
05/26/2023, 12:08 AM
Jacob Bedard
05/26/2023, 12:10 AM
raise ProfileNotFound(profile=profile_name)
botocore.exceptions.ProfileNotFound: The config profile (data-eng-bot) could not be found
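A quick sketch for checking whether that profile is resolvable at all (the profile name comes from the traceback; everything else is a placeholder):
import boto3
from botocore.exceptions import ProfileNotFound

# ProfileNotFound means botocore found no [data-eng-bot] section in
# ~/.aws/credentials and no [profile data-eng-bot] section in ~/.aws/config.
try:
    session = boto3.Session(profile_name="data-eng-bot")
    print(session.client("sts").get_caller_identity()["Arn"])
except ProfileNotFound as exc:
    print(f"profile not configured locally: {exc}")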
Christopher Boyd
05/26/2023, 12:14 AM
Jacob Bedard
05/26/2023, 12:15 AM
Christopher Boyd
05/26/2023, 12:15 AM
Jacob Bedard
05/26/2023, 12:16 AM
Christopher Boyd
05/26/2023, 12:16 AM
Jacob Bedard
05/26/2023, 12:20 AM
aws_credentials_block = AwsCredentials.load("<my stored aws bot creds>")
s3_bucket = S3Bucket(
    bucket_name="<top-level bucket>",
    aws_credentials=aws_credentials_block
)
test_test = s3_bucket.read_path(path="<a folder one level below my default bucket>/<subfolder>/test_json_file.json")
s3_bucket.list_objects()
s3_bucket.download_folder_to_path('email_scan_bot', 'attachments')
with open("test_response.json", "wb") as f:
    s3_bucket.download_object_to_file_object("<a folder one level below my default bucket>/<subfolder>/test_json_file.json", f)
with open("test_response.json", "wb") as f:
    s3_bucket.download_object_to_file_object("<subfolder>/test_json_file.json", f)
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
with open("test_response.json", "wb") as f:
    s3_bucket.download_object_to_file_object("<subfolder>/test_json_file.json", f)
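One way to tell whether that 403 is a permissions problem or a wrong key is to call HeadObject directly with the same credentials; a sketch with placeholder names is below. Note that S3 returns 403 Forbidden instead of 404 Not Found for a missing key when the caller lacks s3:ListBucket on the bucket.
import boto3
from botocore.exceptions import ClientError

# Placeholder values; the key should be relative to the bucket root, so it
# should not include the bucket name itself as a leading "folder".
s3 = boto3.client(
    "s3",
    aws_access_key_id="<key id>",
    aws_secret_access_key="<secret>",
)
try:
    s3.head_object(Bucket="<top-level bucket>", Key="<subfolder>/test_json_file.json")
    print("key exists and is readable")
except ClientError as exc:
    # "404" = key not found; "403" = denied, or key missing without s3:ListBucket
    print(exc.response["Error"]["Code"])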
Christopher Boyd
05/26/2023, 12:22 AM
c3cb88d5-f75a-40ed-b01c-… │ S3 │ tests3 │ s3/tests3
Jacob Bedard
05/26/2023, 12:30 AM
Christopher Boyd
05/26/2023, 12:37 AM
from prefect import task, flow
from prefect import get_run_logger
from prefect_aws import AwsCredentials
from prefect_aws.s3 import S3Bucket
def this_is_not_a_task(logger):
    logger.info("I am not a task context")

@task
def log_platform_info():
    logger = get_run_logger()
    logger.info("hello world")
    this_is_not_a_task(logger)

@flow(log_prints=True)
def hello_world():
    logger = get_run_logger()
    # log_platform_info()
    aws_creds = AwsCredentials(
        aws_access_key_id="<removed>",
        aws_secret_access_key="removed"
    )
    s3_bucket_block = S3Bucket(
        bucket_name="<bucket name>",
        aws_credentials=aws_creds,
        basepath=""
    )
    s3_bucket_block.list_objects("storage")
    s3_bucket_block.upload_from_path("deployment.py", "storage/deployment.py")

if __name__ == "__main__":
    hello_world()
20:36:49.060 | INFO | Flow run 'magnificent-chimpanzee' - Listing objects in bucket storage.
20:36:49.292 | WARNING | prefect._internal.concurrency.timeouts - Overriding existing alarm handler <function _alarm_based_timeout.<locals>.sigalarm_to_error at 0x10b804af0>
20:36:49.624 | INFO | Flow run 'magnificent-chimpanzee' - Uploaded from '/Users/christopherboyd/all_the_things/s3_community/deployment.py' to the bucket '<bucket name>' path 'storage/deployment.py'.
20:36:49.750 | INFO | Flow run 'magnificent-chimpanzee' - Finished in state Completed()
20:36:49.932 | INFO | prefect._internal.concurrency.threads - Exiting worker thread 'APILogWorkerThread'
(prefect2) (base) christopherboyd@Christophers-MacBook-Pro ~/all_the_things/s3_issue_community
$ aws sts get-caller-identity
An error occurred (InvalidClientTokenId) when calling the GetCallerIdentity operation: The security token included in the request is invalid.
(prefect2) (base) ✘ christopherboyd@Christophers-MacBook-Pro ~/all_the_things/s3_issue_community
$
There's nothing valid in my /.aws/ folder, to rule out any possible conflicts.
I used the exact code you saw there in a fresh venv - I used my actual access key + secret (just redacted for sharing).
For the bucket, I have the name of the bucket and a single folder at the root named “storage” (so /storage/deployment.py is where this file went).
Jacob Bedard
05/26/2023, 12:41 AM
Christopher Boyd
05/26/2023, 12:48 AM
Jacob Bedard
05/26/2023, 12:51 AM
Christopher Boyd
05/26/2023, 12:52 AM
Jacob Bedard
05/26/2023, 12:55 AM
Christopher Boyd
05/26/2023, 12:57 AM
$ aws s3 ls <bucket> --profile <the one for my credentials>
PRE storage/
2023-04-12 08:40:42 223 Y2lzY29zcGFyazovL3VzL01FU1NBR0UvM2E1NjJkOTAtZDkyZi0xMWVkLTg2MjEtZjkyYmY3YzMyMjVj
2022-11-07 14:54:41 223 Y2lzY29zcGFyazovL3VzL01FU1NBR0UvMDIyYTZhZTAtNWVkNi0xMWVkLWI0ZTAtYTU5MWY4YjhmZTY4
2023-04-10 13:35:56 223 Y2lzY29zcGFyazovL3VzL01FU1NBR0UvMjM1NDMxMTAtZDdjNi0xMWVkLWFlMmMtMTFiNTI1NWMxZjJj
2022-11-18 12:22:04 223 Y2lzY29zcGFyazovL3VzL01FU1NBR0UvODI0ZGFkODAtNjc2NS0xMWVkLWJiMTktMWRiNjE1ODBmMDU2
2022-11-18 12:22:05 223 Y2lzY29zcGFyazovL3VzL01FU1NBR0UvODM0YzU4ZDAtNjc2NS0xMWVkLWE3MWMtMjkyYzQzYjZhNzdi
2022-11-14 09:50:28 223 Y2lzY29zcGFyazovL3VzL01FU1NBR0UvYWI0NjVkMDAtNjQyYi0xMWVkLTk0YmQtNDM2ZTdiYzNiYjZl
Jacob Bedard
05/26/2023, 1:06 AM
Christopher Boyd
05/26/2023, 1:07 AM
Jacob Bedard
05/26/2023, 1:07 AM
Christopher Boyd
05/26/2023, 1:08 AM
from prefect import flow, get_run_logger
from prefect_aws import AwsCredentials
from prefect_aws.s3 import S3Bucket
@flow(log_prints=True)
def check_s3():
    logger = get_run_logger()
    aws_creds = AwsCredentials(
        aws_access_key_id="",
        aws_secret_access_key=""
    )
    s3_bucket_block = S3Bucket(
        bucket_name="my_bucket",
        aws_credentials=aws_creds,
        basepath=""
    )
    s3_bucket_block.list_objects("storage")
    s3_bucket_block.upload_from_path("deployment.py", "storage/deployment.py")
    print(s3_bucket_block.list_objects("storage"))

if __name__ == "__main__":
    check_s3()
Run aws configure to set up a profile - you'll need your key id, access key, region, and output format. Then run aws s3 ls <your bucket> --profile <the profile name you just created>
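A rough Python equivalent of that check, assuming a placeholder profile and bucket name; if listing works through the CLI profile but not through the Prefect block, the credentials stored in the block likely differ from the profile's:
import boto3

# Placeholder profile and bucket names - use the profile created with
# `aws configure` and the same bucket the S3Bucket block points at.
session = boto3.Session(profile_name="<the profile name you just created>")
s3 = session.client("s3")
resp = s3.list_objects_v2(Bucket="<your bucket>", MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"])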
Jacob Bedard
05/26/2023, 1:26 AM
Christopher Boyd
05/26/2023, 1:27 AM
Jacob Bedard
05/26/2023, 1:28 AM