# ask-community
s
is there a way to change a block_documents block_type?
n
hi @Sean Brady - not automatically, but you can often use an existing block document to create a new one quite easily if needed. do you have an example you need help with?
s
so say I have a flow run with its results in S3. I want to change that block type to be in another Block like the Local file system for example.
n
hmm, it sounds like you'd want to reference a new block in your `result_storage` for your flow, like this maybe?
```python
@flow(result_storage=S3.load("my-s3-block"))
```
to
```python
@flow(result_storage=LocalFileSystem.load("my-local-fs-block"))
```
which would be a new block document, of a different type
s
yes, but I don’t want to re-run the flow
I want to migrate the PersistentResults from S3 to something else
I don’t see an obvious way to do it
n
@Marvin isn't there some aws cli command to cp from an s3 bucket to any other place on disk?
m
Oh, the tedium of moving data around... Yes, the AWS Command Line Interface (CLI) provides a convenient command to copy files from an S3 bucket to a local filesystem (and vice versa). The command you're looking for is:
```sh
aws s3 cp s3://your-bucket-name/path/to/object /local/path
```
Remember to replace `your-bucket-name` with the name of your S3 bucket, `path/to/object` with the key (path) of the object in the bucket you wish to copy, and `/local/path` with the path on your local disk where you want to place the copy. Also, ensure you have the necessary permissions and your AWS CLI is configured with the appropriate credentials. Life's hard enough without access issues complicating things further.
n
what about something like that?
s
sure, I copy the block_document to the disk or wherever. But the actual block document attached to the flow_run still points at s3
n
the block document lives in the prefect db, it just holds the config that allows you to talk to s3, it doesn't contain the data in the bucket
> block document attached to the flow_run
in what way do you mean attached here / why does that matter in your case?
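That distinction can be pictured with a plain dataclass: a block document is roughly a typed config record in Prefect's database, while the result bytes themselves live in the bucket. This is purely illustrative and not Prefect's actual schema; the `BlockDocument` class and its fields here are hypothetical stand-ins.

```python
from dataclasses import dataclass

@dataclass
class BlockDocument:
    """Illustrative sketch of what a block document row roughly holds:
    a block type plus connection config -- never the stored data itself."""
    block_type: str
    data: dict

# the block document only knows how to reach the bucket
s3_block_doc = BlockDocument(
    block_type="s3",
    data={"bucket_name": "my-results-bucket"},
)
# the actual result payloads live in the bucket, not in this record
```

Swapping `block_type` on such a record would be meaningless on its own, since the `data` config is shaped for the old storage type.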
s
something like this:
```python
result = flow_run.state.result()
the_data_from_s3 = await result.get(client=self.prefect_client)
```
I want that to not be from s3
that result has a `storage_block_id` which points to a `block_document`, which has a type of s3
I tried to use `update_block_document` in the sdk, but it doesn't seem to do what I want
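Since the type on an existing block document can't be swapped, one workaround is to migrate the payloads themselves: read each persisted result through the old storage and write it through the new one. Below is a minimal sketch of that idea, assuming storage objects that expose Prefect-style async `read_path`/`write_path` methods; `DictStorage`, `migrate_results`, and all the paths are hypothetical stand-ins, not Prefect APIs.

```python
import asyncio

class DictStorage:
    """Hypothetical stand-in for a storage block exposing
    Prefect-style async read_path/write_path methods."""

    def __init__(self):
        self._objects: dict[str, bytes] = {}

    async def read_path(self, path: str) -> bytes:
        return self._objects[path]

    async def write_path(self, path: str, content: bytes) -> None:
        self._objects[path] = content

async def migrate_results(old_storage, new_storage, paths):
    """Copy each persisted result payload from old_storage to new_storage."""
    for path in paths:
        payload = await old_storage.read_path(path)
        await new_storage.write_path(path, payload)

# example: move two fake result payloads between backends
old = DictStorage()
new = DictStorage()
asyncio.run(old.write_path("results/run-1", b"pickled-result-1"))
asyncio.run(old.write_path("results/run-2", b"pickled-result-2"))
asyncio.run(migrate_results(old, new, ["results/run-1", "results/run-2"]))
```

Even after copying the payloads, each result's `storage_block_id` would still need to point at the new storage, which is the tricky part discussed in this thread.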
n
hmm i see, yeah you wouldn't be able to directly swap out the type on an existing block document. this would be sort of tricky without re-running the flow. can you say more about what you're trying to do at a high level? i can come back to this tomorrow / toss some ideas around
s
sure. Basically we want to migrate our results out of s3 to something else. We’ve started to use redis to store results. I have got something to work, it's a bit hacky:
```python
class S3RedisStorage(S3):

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

    async def read_path(self, path: str) -> bytes:
        # get stuff from redis instead of s3
        ...
```
etc.
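The override pattern above can be sketched end to end without Prefect or Redis installed: a plain base class stands in for the `S3` block and an in-memory dict stands in for a Redis connection, just to show how `read_path` gets redirected. Everything here (`FakeS3Storage`, `fake_redis`) is a hypothetical stand-in, not the real prefect_aws or redis API.

```python
import asyncio

class FakeS3Storage:
    """Hypothetical stand-in for the S3 storage block."""

    async def read_path(self, path: str) -> bytes:
        raise NotImplementedError("would fetch from S3 here")

# in-memory dict standing in for a Redis connection
fake_redis: dict[str, bytes] = {"results/run-1": b"payload-from-redis"}

class S3RedisStorage(FakeS3Storage):
    """Keeps the S3 block's interface, but serves reads
    from the Redis stand-in instead of the bucket."""

    async def read_path(self, path: str) -> bytes:
        return fake_redis[path]

storage = S3RedisStorage()
data = asyncio.run(storage.read_path("results/run-1"))
```

The design point is that the flow-run code calling `result.get(...)` never changes; only the block's `read_path` is rerouted, which is why the subclass approach works despite being hacky.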