# ask-community
j
Hello, has anybody been able to make .deploy() functions work properly with S3Bucket blocks?
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/prefect/deployments/steps/core.py", line 154, in run_steps
    step_output = await run_step(step, upstream_outputs)
  File "/usr/local/lib/python3.9/site-packages/prefect/deployments/steps/core.py", line 125, in run_step
    result = await from_async.call_soon_in_new_thread(
  File "/usr/local/lib/python3.9/site-packages/prefect/_internal/concurrency/calls.py", line 326, in aresult
    return await asyncio.wrap_future(self.future)
  File "/usr/local/lib/python3.9/site-packages/prefect/_internal/concurrency/calls.py", line 388, in _run_async
    result = await coro
  File "/usr/local/lib/python3.9/site-packages/prefect/deployments/steps/pull.py", line 195, in pull_with_block
    await storage.pull_code()
  File "/usr/local/lib/python3.9/site-packages/prefect/runner/storage.py", line 542, in pull_code
    await self._block.get_directory(local_path=str(self.destination))
  File "/usr/local/lib/python3.9/site-packages/prefect_aws/s3.py", line 508, in get_directory
All my attempts following the documentation lead to this.
k
could you share both the stack trace and some of your code?
j
The stack trace is already there
Here's my code
s3_bucket_block = S3Bucket(
    bucket_name=f"{deployment_variables.storage_bucket_name}",
    bucket_folder="prefect2",
)
s3_bucket_block.save(deployment_variables.default_storage_block_name, overwrite=True)

flow.from_source(
    source=S3Bucket.load(deployment_variables.default_storage_block_name),
    entrypoint="./5041a44f9487/prefect2/flows/dataplatformfoundation/ratlab_flow/ratlab_flow.py:ratlab_flow",
).deploy(
    name=f"testjmprovencher-{deployment_variables.get_deployment_name(label=None)}",
    work_pool_name="kubernetes-dev-us-east-1",
    build=False,
    push=False,
    schedules=[get_schedule_every_x_minutes_with_fixed_anchor(frequency_in_minutes=60)],  # 1 hour
    image="1234.dkr.ecr.us-east-1.amazonaws.com/prefect2-base:5041a44f9487",
)
k
ah I think the end of the stack trace is cut off
j
Traceback (most recent call last):
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/prefect2/flows/dataplatformfoundation/ratlab_flow/deployment.py", line 58, in <module>
    deploy2()
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/prefect2/flows/dataplatformfoundation/ratlab_flow/deployment.py", line 47, in deploy2
    flow.from_source(source=S3Bucket.load(deployment_variables.default_storage_block_name),
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/venv/lib/python3.9/site-packages/prefect/utilities/asyncutils.py", line 259, in coroutine_wrapper
    return call()
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/venv/lib/python3.9/site-packages/prefect/_internal/concurrency/calls.py", line 431, in __call__
    return self.result()
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/venv/lib/python3.9/site-packages/prefect/_internal/concurrency/calls.py", line 317, in result
    return self.future.result(timeout=timeout)
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/venv/lib/python3.9/site-packages/prefect/_internal/concurrency/calls.py", line 178, in result
    return self.__get_result()
  File "/Users/jmprovencher1/.pyenv/versions/3.9.11/lib/python3.9/concurrent/futures/_base.py", line 391, in __get_result
    raise self._exception
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/venv/lib/python3.9/site-packages/prefect/_internal/concurrency/calls.py", line 388, in _run_async
    result = await coro
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/venv/lib/python3.9/site-packages/prefect/flows.py", line 937, in from_source
    await storage.pull_code()
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/venv/lib/python3.9/site-packages/prefect/runner/storage.py", line 542, in pull_code
    await self._block.get_directory(local_path=str(self.destination))
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/venv/lib/python3.9/site-packages/prefect_aws/s3.py", line 508, in get_directory
    bucket = self._get_bucket_resource()
  File "/Users/jmprovencher1/PycharmProjects/data-platform-workflows/venv/lib/python3.9/site-packages/prefect_aws/s3.py", line 475, in _get_bucket_resource
    params_override = self.credentials.aws_client_parameters.get_params_override()
AttributeError: 'dict' object has no attribute 'get_params_override'
Should be better now
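(For anyone landing here later: the final `AttributeError` can be illustrated with a self-contained sketch. `FakeAwsClientParameters` below is a hypothetical stand-in, not the real `prefect_aws` class; the point is that the block round-trip left `aws_client_parameters` stored as a plain dict, and a dict doesn't have the method `prefect_aws` calls.)

```python
# Minimal, self-contained illustration of the failure (not actual Prefect code):
# prefect_aws expects `credentials.aws_client_parameters` to be a model with a
# `get_params_override()` method, but the loaded block holds a plain dict,
# so the attribute lookup fails.
from dataclasses import dataclass


@dataclass
class FakeAwsClientParameters:  # hypothetical stand-in for prefect_aws's model
    verify: bool = True
    use_ssl: bool = True

    def get_params_override(self) -> dict:
        # Simplified: the real model returns only explicitly-set overrides.
        return {"verify": self.verify, "use_ssl": self.use_ssl}


good = FakeAwsClientParameters()
assert good.get_params_override() == {"verify": True, "use_ssl": True}

stale = {"config": None, "verify": True, "use_ssl": True}  # shape stored in the block
try:
    stale.get_params_override()
except AttributeError as exc:
    print(exc)  # 'dict' object has no attribute 'get_params_override'
```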
k
if you load and print your block, what does it look like? be sure to remove any sensitive data
I saw you replied to the other thread I was in about this, and it sounds like the shape of the block is wrong/outdated in some way
j
Copy code
S3Bucket(bucket_name='xxxxx-xxxx-prefect-storage', credentials=AwsCredentials(region_name=None, profile_name=None, aws_access_key_id=None, aws_session_token=None, aws_client_parameters={'config': None, 'verify': True, 'use_ssl': True, 'api_version': None, 'endpoint_url': None, 'verify_cert_path': None}, aws_secret_access_key=None), bucket_folder='prefect2')
Yes, and I don't understand why, @Kevin Grismore 😕
I'm running these versions of prefect and prefect-aws
k
it is strange. even an empty credentials field on an S3Bucket block should construct an instance of AwsCredentials that uses the AwsClientParameters constructor as the default value for aws_client_parameters
if you made an S3 bucket block from the UI, then loaded and inspected it in python, same issue?
j
for me creating it from the UI is not really an option in my development workflow
but it seems to do the same thing
From the UI it works, so it must be a bug with the S3Bucket resource, right?
calling .save() from the Python SDK or from the UI does not actually behave the same