Tibs
11/28/2022, 4:19 PM

Ryan Peden
11/28/2022, 4:55 PM
s3fs depends on aiobotocore, which pins itself to a very specific version of botocore. They do this so they can validate and ensure compatibility with whatever version of botocore they support, but the downside is that aiobotocore doesn't always officially support the latest botocore.
You mentioned prefect_aws, and that might be a good solution here. It contains an S3Bucket class that accesses S3 without using s3fs. You'd need to run the following commands to make the block and its credentials block show up in your Prefect UI:
prefect block register -m prefect_aws.credentials
prefect block register -m prefect_aws.s3
Then you would use an S3Bucket block in your deployments instead of the S3 block you are using now.
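As a rough sketch, assuming prefect-aws is installed, it might look something like the following. The bucket name, credential values, block names, and flow import are placeholders, and the S3Bucket field names can differ between prefect-aws versions:

from prefect.deployments import Deployment
from prefect_aws.credentials import AwsCredentials
from prefect_aws.s3 import S3Bucket

from my_project.flows import my_flow  # placeholder: your own flow

# Create and save the credentials block (this can also be done from the UI).
aws_creds = AwsCredentials(
    aws_access_key_id="YOUR_ACCESS_KEY_ID",          # placeholder
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",  # placeholder
)
aws_creds.save("my-aws-creds", overwrite=True)

# Create and save the S3Bucket block that replaces the S3 block.
# Note: older prefect-aws releases use bucket_path/aws_credentials instead
# of bucket_name/credentials.
s3_bucket = S3Bucket(
    bucket_name="my-prefect-bucket",  # placeholder
    credentials=aws_creds,
)
s3_bucket.save("my-s3-bucket", overwrite=True)

# Point the deployment's storage at the S3Bucket block.
deployment = Deployment.build_from_flow(
    flow=my_flow,
    name="my-deployment",
    storage=S3Bucket.load("my-s3-bucket"),
)
deployment.apply()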
Alternatively, you could try spinning up a Dask cluster with worker nodes that have a newer version of botocore installed, and then use a DaskTaskRunner from the prefect-dask collection to connect to the cluster and run your tasks. This should work as long as your tasks don't use your S3 block.
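Connecting to an existing cluster would look roughly like this sketch; the scheduler address and task body are placeholders:

from prefect import flow, task
from prefect_dask import DaskTaskRunner


@task
def process_item(item: int) -> int:
    # Placeholder work that runs on the Dask workers,
    # which have the newer botocore installed.
    return item * 2


# Point the task runner at your existing Dask scheduler (address is a placeholder).
@flow(task_runner=DaskTaskRunner(address="tcp://dask-scheduler:8786"))
def my_flow():
    return [process_item.submit(i) for i in range(10)]


if __name__ == "__main__":
    my_flow()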
Of the two options, I recommend the prefect_aws approach.

Tibs
11/29/2022, 6:42 AM