Gabriel Montañola
09/03/2020, 7:08 PM
… .csv with query results;
3. GZIP the .csv and upload it to S3;
4. Dump the uploaded file content into a Redshift table (using COPY with the correct settings).
So I have a few questions:
• I noticed that there is a Prefect S3 task for uploading, but I guess I can't pass gzipped content to it, right? So I built a custom task that handles the gzipping + uploading for me. Am I using the S3Upload task wrong, or is this what I should be doing for my purposes?
• Is there a way to generate dynamic tasks/flows? Example: with Airflow I made both a DAG and a task factory using a .yml file as reference. My tasks are pretty repetitive and I only need to change a few fields (source table name, destination schema/table, schedule time, and incremental key). How could I achieve this with Prefect? Could you point me in the right direction or to a documentation link?
Thank you very much, and congratulations on the awesome work.

Dylan
Adding a gzip or compress parameter to the S3 tasks would be an excellent opportunity to get your hands dirty! In the meantime, that's exactly what you should be doing (you found a task to use as a template and tweaked it to suit your needs 💯).
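The custom gzip + upload task the two are discussing can be sketched roughly as below. This is a hedged illustration, not Prefect's actual S3Upload implementation: the in-memory gzip step uses only the standard library, and the boto3 call, bucket, and key names are assumptions.

```python
import gzip
import io


def gzip_bytes(data: bytes) -> bytes:
    """Gzip raw bytes in memory, so no temporary file is needed."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
        gz.write(data)
    return buf.getvalue()


def upload_gzipped_csv(csv_text: str, bucket: str, key: str) -> None:
    """Gzip CSV text and upload it to S3.

    The boto3 put_object call is standard; credentials are assumed to
    come from the environment. Wrap this function with Prefect's @task
    decorator to use it inside a flow.
    """
    import boto3  # assumed available in the flow's environment

    body = gzip_bytes(csv_text.encode("utf-8"))
    s3 = boto3.client("s3")
    s3.put_object(Bucket=bucket, Key=key, Body=body, ContentEncoding="gzip")
```

Redshift's COPY can then read the object directly with its GZIP option, which is why compressing before upload is worthwhile.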
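The YAML-driven factory Gabriel describes from his Airflow setup translates naturally to Prefect: loop over config entries and build one flow spec per table. A minimal sketch, with the config inlined as a dict (in real use it would come from yaml.safe_load on the .yml file) and with the bucket name, IAM role, and COPY settings as placeholder assumptions:

```python
# Inlined stand-in for a .yml file listing repetitive table loads.
CONFIG = {
    "tables": [
        {"source": "public.orders", "dest_schema": "analytics",
         "dest_table": "orders", "schedule": "0 3 * * *",
         "incremental_key": "updated_at"},
        {"source": "public.users", "dest_schema": "analytics",
         "dest_table": "users", "schedule": "0 4 * * *",
         "incremental_key": "updated_at"},
    ]
}


def build_copy_statement(entry: dict, s3_uri: str) -> str:
    """Render the Redshift COPY for one table (role/settings are placeholders)."""
    return (
        f"COPY {entry['dest_schema']}.{entry['dest_table']} "
        f"FROM '{s3_uri}' "
        "IAM_ROLE 'arn:aws:iam::...:role/redshift-copy' "  # placeholder role
        "GZIP CSV IGNOREHEADER 1;"
    )


def build_flows(config: dict) -> list:
    """One flow spec per config entry; each would become a Prefect Flow."""
    return [
        {
            "name": f"load_{e['dest_table']}",
            "schedule": e["schedule"],
            "copy_sql": build_copy_statement(
                e, f"s3://my-bucket/{e['dest_table']}.csv.gz"  # assumed bucket
            ),
        }
        for e in config["tables"]
    ]
```

Because Prefect flows are plain Python, the factory is just a loop; only the fields Gabriel lists (source, destination, schedule, incremental key) vary per entry.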
Gabriel Montañola
09/03/2020, 8:39 PM

Dylan
Gabriel Montañola
09/03/2020, 8:53 PM

Dylan
sales at prefect dot io is the best email for demos, or jump into intercom on https://www.prefect.io whenever you're ready.

Gabriel Montañola
09/03/2020, 10:04 PM
… .map on the tasks. Sorry for the blurring, I'm just testing stuff and did not use Secrets ):

Gabriel Montañola
09/03/2020, 10:05 PM

Dylan
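The .map Gabriel mentions is Prefect's task mapping: calling task.map(iterable) fans the task out into one run per element, which is how a repetitive per-table pipeline avoids hand-writing each task. A toy stand-in for the behavior in plain Python (not Prefect's actual runtime, and upload_table is a hypothetical task):

```python
def upload_table(table: str) -> str:
    """Hypothetical stand-in task: pretend to extract and upload one table."""
    return f"uploaded {table}"


def task_map(task_fn, items):
    """Toy version of Prefect's task.map: one task run per input element."""
    return [task_fn(item) for item in items]


results = task_map(upload_table, ["orders", "users", "events"])
# results holds one output per mapped input, in order
```

In a real flow, each mapped run is tracked and retried independently, which is the advantage over a single task looping internally.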