with query results; 3. GZIP the file and upload it to S3; 4. Dump the uploaded file content into a Redshift table (using `COPY` with the correct settings). So I have a few questions:
• I noticed that there is a Prefect S3 task for uploading, but I guess I can't pass gzipped content to it, right? So I built a custom Task that handles the gzipping + uploading for me. Am I using the S3Upload Task wrong, or is this what I should be doing for my purposes?
• Is there a way to generate dynamic tasks/flows? For example, with Airflow I made both a DAG factory and a Task factory using a file as reference. My tasks are pretty repetitive and I only need to change a few fields (source table name, destination schema/table, schedule time, and incremental key). How could I achieve this with Prefect? Could you point me to some directions or a documentation link? Thank you very much, and congratulations on the awesome work.
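For the second question, here is a minimal sketch of what a flow factory could look like in Prefect, assuming Prefect 0.x; the `TABLES` config, the task bodies, and `build_flow` are illustrative, not a documented Prefect pattern:

```python
from prefect import Flow, task
from prefect.schedules import CronSchedule

# Hypothetical config -- in practice this could be loaded from a file,
# mirroring the Airflow factory approach described above.
TABLES = [
    {"source": "orders",    "schema": "analytics", "dest": "orders",    "cron": "0 2 * * *", "key": "updated_at"},
    {"source": "customers", "schema": "analytics", "dest": "customers", "cron": "0 3 * * *", "key": "updated_at"},
]

@task
def extract(source_table: str, incremental_key: str) -> str:
    # Query the source table incrementally; return the results as CSV text.
    ...

@task
def load(csv_data: str, dest_schema: str, dest_table: str) -> None:
    # Gzip, upload to S3, then COPY into the Redshift table.
    ...

def build_flow(cfg: dict) -> Flow:
    """Build one flow per table config -- the Prefect analogue of a DAG factory."""
    with Flow(f"load-{cfg['source']}", schedule=CronSchedule(cfg["cron"])) as flow:
        data = extract(cfg["source"], cfg["key"])
        load(data, cfg["schema"], cfg["dest"])
    return flow

flows = [build_flow(cfg) for cfg in TABLES]
```

Because Prefect flows are plain Python objects, looping over a config list like this is enough; no special factory API is needed.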
parameter to the S3 tasks would be an excellent opportunity to get your hands dirty! In the meantime, that's exactly what you should be doing (you found a task to use as a template and tweaked it to suit your needs 💯).
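For reference, a minimal sketch of the kind of custom task described above, assuming Prefect 0.x's `Task` base class and `boto3`; the class name `GzipToS3` and its parameters are illustrative:

```python
import gzip

import boto3
from prefect import Task


class GzipToS3(Task):
    """Gzip a string payload and upload it to an S3 bucket (sketch only)."""

    def __init__(self, bucket: str, **kwargs):
        self.bucket = bucket
        super().__init__(**kwargs)

    def run(self, data: str, key: str) -> str:
        # Compress the payload in memory before uploading.
        compressed = gzip.compress(data.encode("utf-8"))
        s3 = boto3.client("s3")
        s3.put_object(
            Bucket=self.bucket,
            Key=key,
            Body=compressed,
            ContentEncoding="gzip",
        )
        return f"s3://{self.bucket}/{key}"
```

Used inside a flow it behaves like any built-in task, e.g. `GzipToS3(bucket="my-bucket")(data=csv_text, key="exports/orders.csv.gz")` (bucket and key here are placeholders).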
on the tasks. Sorry for the blurring, I'm just testing stuff and did not use Secrets ):
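For anyone reading along, a minimal sketch of pulling credentials from a Prefect Secret instead of hardcoding them, assuming Prefect 0.x's `Secret` class and a secret named `AWS_CREDENTIALS` holding the two keys shown (the secret name and key names are assumptions):

```python
import boto3
from prefect.client import Secret

# Resolve credentials via Prefect's Secret machinery rather than inlining them;
# locally this reads from context/environment, with Cloud from stored secrets.
creds = Secret("AWS_CREDENTIALS").get()  # assumed to return a dict

s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["ACCESS_KEY"],
    aws_secret_access_key=creds["SECRET_ACCESS_KEY"],
)
```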