Hi everyone. I need to copy a set of large files from S3 to Azure Blob Storage.
It looks like both the S3 and the Azure Blob tasks in the library read the full object into memory.
I tried rewriting them to stream the data instead. I got it working, but the machine becomes unresponsive when transferring large files. This is running in a container on an AWS Linux instance (DockerRun).
Any suggestions on the best way to stream files like this without reading them into memory? Thanks!
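For context, here's a minimal sketch of the pattern I'm after, bypassing the library's tasks and feeding boto3's `StreamingBody` straight into `azure-storage-blob`'s `upload_blob`. The bucket, key, container, and connection-string values are placeholders, and this is a rough illustration rather than my exact code:

```python
import boto3
from azure.storage.blob import BlobClient

def stream_s3_to_blob(bucket: str, key: str, conn_str: str, container: str) -> None:
    """Copy one S3 object into Azure Blob Storage without buffering it all in memory."""
    s3 = boto3.client("s3")
    resp = s3.get_object(Bucket=bucket, Key=key)
    body = resp["Body"]  # botocore StreamingBody: file-like and non-seekable

    blob = BlobClient.from_connection_string(
        conn_str,
        container_name=container,
        blob_name=key,
        max_block_size=4 * 1024 * 1024,       # stage 4 MiB blocks at a time
        max_single_put_size=4 * 1024 * 1024,  # force a chunked block upload, never one big PUT
    )
    # With the default max_concurrency=1 the SDK reads the non-seekable stream
    # sequentially, so roughly one block is resident in memory at any time.
    blob.upload_blob(
        body,
        blob_type="BlockBlob",
        length=resp["ContentLength"],
        overwrite=True,
    )

# Placeholder values -- substitute your own bucket, key, and connection string.
stream_s3_to_blob("my-source-bucket", "big-file.dat",
                  "<azure-connection-string>", "my-container")
```

My understanding is that lowering `max_block_size` trades throughput for a smaller memory footprint, which might matter here since the container shares the instance with everything else, but I'd welcome corrections if there's a better approach.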