I think I have a challenge. I made a significant change to an existing pipeline on the 18th, which was last Friday. The day after the change, my Prefect Cloud dashboard showed a new version. Fast forward to today: according to my schedule the flow ran today, but the csv file carries Friday's date, i.e.
18-03-2022.csv
instead of
21-03-2022.csv
Like I have said before, my code is on an EC2 instance that a Lambda starts and stops at a set time each day. What could be amiss?
Kevin Kho
03/21/2022, 7:28 PM
Couple of things.
What storage are you using? Was the flow updated in storage?
When you have a Flow with a Schedule that is on, the next 10 runs are scheduled. Could it be you have other runs scheduled already that are still running?
Hedgar
03/21/2022, 7:41 PM
I didn't indicate storage. And like I said, the csv file was supposed to take on today's system date,
21-03-2022.csv
but took on
18-03-2022.csv
thereby overwriting Friday's data.
Hedgar
03/21/2022, 7:42 PM
Can this happen?
Kevin Kho
03/21/2022, 7:43 PM
Is that filename made with a Parameter? If so, can you show me how the parameter was defined?
Hedgar
03/21/2022, 7:47 PM
No! Just a simple datetime format:
%d%m%Y
Kevin Kho
03/21/2022, 7:51 PM
Can I see how you defined that in the Flow?
Kevin Kho
03/21/2022, 7:54 PM
Did you do something like:
with Flow(...) as flow:
    save_to_file(f"{datetime.datetime.today()}")
?
Hedgar
03/22/2022, 8:34 PM
I think my silly mistake of defining the variable
output = f"{today}.csv"
outside the task could be responsible for the error. The first time I ran it, it ran properly with the correct date,
18-03-2022,
but unfortunately that is the date every subsequent run takes.
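For illustration, a minimal sketch of that mistake, assuming Prefect 1.x with pickle-based storage (where the built flow is serialized once at registration); save_to_file and the flow name are hypothetical:

from datetime import datetime
from prefect import Flow, task

# Evaluated once, when the flow is built and registered (e.g. on the
# 18th), then frozen into the serialized flow.
today = datetime.today()
output = f"{today:%d-%m-%Y}.csv"

@task
def save_to_file(filename: str):
    print(f"writing {filename}")  # stand-in for the real save step

with Flow("daily-export") as flow:
    save_to_file(output)  # every scheduled run reuses 18-03-2022.csv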
Kevin Kho
03/22/2022, 8:36 PM
Ah yeah, you want the f-string to be a task output so it's evaluated on each run.
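A minimal sketch of that fix, again assuming Prefect 1.x (build_filename and the flow name are illustrative):

from datetime import datetime
from prefect import Flow, task

@task
def build_filename() -> str:
    # Executed inside each flow run, so the date is current every time.
    return f"{datetime.today():%d-%m-%Y}.csv"

@task
def save_to_file(filename: str):
    print(f"writing {filename}")  # stand-in for the real save step

with Flow("daily-export") as flow:
    save_to_file(build_filename())

Because build_filename is a task, the f-string is evaluated during each scheduled run rather than once at registration, so a run on the 21st writes 21-03-2022.csv.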
Hedgar
03/23/2022, 6:27 AM
Correct. I have moved the variable into the task, hoping it reflects the current date going forward… fingers crossed 🤞