Rainer Schülke
08/18/2022, 9:40 AM
@task(log_stdout=True)
def etl(
    connector,
    exec_file,
    environment: Optional[str] = None
):
    master_data = MasterDataSet(connector, exec_file)
    master_data.get_dataset(environment)
    master_data.load_dataset()
My flow looks like this:
.... as flow:
    source = SOME_SOURCE
    destination = SOME_DESTINATION
    file_list = ['SOME/PATH/TO/FILE', 'ANOTHER/PATH/TO/FILE']
    master_dataset = etl.map(
        connector=unmapped(source),
        exec_file=file_list
    )
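For context, `etl.map(connector=unmapped(source), exec_file=file_list)` runs `etl` once per entry in `file_list`, passing the same `source` connector to every run. A plain-Python sketch of that mapping semantics (no Prefect required; the body of `etl` here is a hypothetical stand-in for the real `MasterDataSet` pipeline):

```python
def etl(connector, exec_file, environment=None):
    # Hypothetical stand-in for the real MasterDataSet logic in the question.
    return f"{connector}:{exec_file}"

def map_task(fn, unmapped_kwargs, mapped_kwargs):
    """Mimic task.map semantics: call fn once per element of each mapped
    argument, repeating the unmapped arguments unchanged for every call."""
    lengths = {len(v) for v in mapped_kwargs.values()}
    assert len(lengths) == 1, "all mapped arguments must have equal length"
    results = []
    for i in range(lengths.pop()):
        kwargs = dict(unmapped_kwargs)
        kwargs.update({k: v[i] for k, v in mapped_kwargs.items()})
        results.append(fn(**kwargs))
    return results

source = "SOME_SOURCE"
file_list = ["SOME/PATH/TO/FILE", "ANOTHER/PATH/TO/FILE"]
# Two task runs, one per file, each receiving the same connector.
master_dataset = map_task(etl, {"connector": source}, {"exec_file": file_list})
```

One consequence of this shape: each mapped child is its own task run with its own state, while the parent task only reports the aggregate `Mapped` state, which is why the logs below show `Mapped` rather than `Success`.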
The cloud logs look like this:
Task 'etl_master_data': Starting task run...
Task 'etl_master_data': Finished task run for task with final state: 'Mapped'
Flow run FAILED: some reference tasks failed.
There is no failed reference task.
Does anybody have any idea why this is not working in the cloud but works locally? It is configured like all the other flows, and the class-based data handling is already in use by other flows. I really have absolutely no clue why 😄
return master_data.load_dataset()
Bianca Hoch
08/18/2022, 6:55 PM
Rainer Schülke
08/19/2022, 7:24 AM
Bianca Hoch
08/19/2022, 1:49 PM