Nic (12/22/2022, 10:26 AM)

Nic (12/22/2022, 10:38 AM)

Mohit Singhal (12/22/2022, 12:46 PM)

Mohit Singhal (12/22/2022, 12:48 PM)

Justin Trautmann (12/22/2022, 2:05 PM)

Daniel Komisar (12/22/2022, 2:39 PM)
apply_map
and what I'd really like to do is have each of those be the body of a loop. It looks like you can only loop one single task, though. Am I missing something? Thanks!

Aram Karapetyan
(12/22/2022, 3:11 PM)
PREFECT_DEBUG_MODE
PREFECT_LOGGING_LEVEL
PREFECT_LOGGING_SERVER_LEVEL
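For reference, these Prefect 2 settings can be supplied as environment variables or persisted in the active profile with the `prefect config set` CLI command; a minimal sketch (the DEBUG values are example choices, not recommendations):

```shell
# Environment variables read by Prefect at startup (example values)
export PREFECT_DEBUG_MODE=true
export PREFECT_LOGGING_LEVEL=DEBUG
export PREFECT_LOGGING_SERVER_LEVEL=DEBUG

# Or persist a setting in the active Prefect profile instead
prefect config set PREFECT_LOGGING_LEVEL=DEBUG
```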
None helps.

Deepanshu Aggarwal
(12/22/2022, 4:01 PM)

Kristian Andersen Hole (12/22/2022, 5:19 PM)

Laraib Siddiqui
(12/22/2022, 5:45 PM)
    from prefect.tasks.shell import ShellTask

Laraib Siddiqui (12/22/2022, 5:47 PM)
    from prefect.tasks.shell import ShellTask
throws an error:
    ModuleNotFoundError: No module named 'prefect.tasks.shell'; 'prefect.tasks' is not a package
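Context for the error above: `prefect.tasks.shell.ShellTask` is a Prefect 1 import path, and in Prefect 2 shell support moved to the separate prefect-shell collection, so the old path no longer exists. As a hedged stdlib alternative for simply running a script or command without any extra package, `subprocess` works; a sketch (not the Prefect-recommended pattern, and `run_script` is an invented helper name):

```python
import subprocess
import sys

def run_script(code: str) -> str:
    """Run a snippet of Python in a subprocess and return its stdout."""
    result = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True,
        text=True,
        check=True,  # raise CalledProcessError on a non-zero exit code
    )
    return result.stdout

print(run_script("print('hello from a subprocess')"))
```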
I know I can use prefect-shell to overcome this and run Python scripts, but I couldn't find any resolution for the above error.

Uday
(12/22/2022, 7:25 PM)

Kristian Andersen Hole (12/22/2022, 8:03 PM)

Javier Ochoa
(12/22/2022, 8:22 PM)
    with Flow(
        FLOW_NAME,
        run_config=UniversalRun(labels=LABELS),
        terminal_state_handler=workflow_terminal_state_handler,
    ) as flow:
        file_list = list_unprocessed_files(var)
        dataframes = get_dataframes.map(file_list)
        dataframes = filter_dataframe.map(
            dataframes, resource_type_name=unmapped(var)
        )
        # this is the final task
        print_process_summary_log()
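For readers unfamiliar with Prefect 1 mapping as used in the flow above: `.map` fans a task out over a list, one run per element, while `unmapped(...)` passes the same value to every mapped run. A plain-Python sketch of roughly those semantics (the function bodies and data here are illustrative stand-ins, not Prefect APIs):

```python
def get_dataframe(path):
    # stand-in for a task that loads one file
    return {"path": path}

def filter_dataframe(df, resource_type_name):
    # stand-in for a task that receives one mapped item
    # plus one "unmapped" constant shared by every call
    return {**df, "resource": resource_type_name}

file_list = ["a.csv", "b.csv"]
# .map(file_list) ~ one call per element
dataframes = [get_dataframe(p) for p in file_list]
# .map(dataframes, resource_type_name=unmapped(var)) ~ constant keyword arg
dataframes = [filter_dataframe(df, resource_type_name="events") for df in dataframes]
print(dataframes)
```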
Billy McMonagle (12/23/2022, 3:40 AM)

Amruth VVKP (12/23/2022, 12:52 PM)

Chris Gunderson (12/23/2022, 3:29 PM)

Robert Esteves (12/23/2022, 6:55 PM)

Pekka (12/24/2022, 2:06 PM)

Kyle McEntush
(12/24/2022, 7:19 PM)
    @task
    def f(val):
        return val * 2, val * 4
then the call in my flow looks like
    a, b = f.map([1, 2, 3])
but I get ValueError: not enough values to unpack (expected 2, got 1)
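Outside any orchestrator, the usual pure-Python way to turn a tuple-returning mapped function into two sequences is to transpose the results with `zip(*...)`; a sketch of the intended semantics, using the same function body without the decorator:

```python
def f(val):
    # same body as the task above, minus the @task decorator
    return val * 2, val * 4

# mapping produces one (doubled, quadrupled) tuple per input...
results = [f(v) for v in [1, 2, 3]]
# ...and zip(*...) transposes them into two tuples
a, b = zip(*results)
print(a)  # (2, 4, 6)
print(b)  # (4, 8, 12)
```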
I have confirmed that the code works as expected when mapping over the function without the @task decorator (that is, two outputs are returned).

Kyle McEntush (12/24/2022, 7:20 PM)
nout as a task arg, but I can't find the equivalent in the docs...

Shubham Wani
(12/24/2022, 7:39 PM)
cursor.executemany(query, values), I run the
    connection.commit()
to transfer all data from the buffer to the table in the dataset.
Error faced in the following command:
    connection.commit()
Error:
00:12:33.556 | ERROR | Task run 'load_records-41eb504c-0' - Encountered exception during execution:
Traceback (most recent call last):
File "/home/shubhamwani/nice/lib/python3.8/site-packages/prefect/engine.py", line 1346, in orchestrate_task_run
result = await run_sync(task.fn, *args, **kwargs)
File "/home/shubhamwani/nice/lib/python3.8/site-packages/prefect/utilities/asyncutils.py", line 69, in run_sync_in_worker_thread
return await anyio.to_thread.run_sync(call, cancellable=True)
File "/home/shubhamwani/nice/lib/python3.8/site-packages/anyio/to_thread.py", line 31, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "/home/shubhamwani/nice/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
return await future
File "/home/shubhamwani/nice/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 867, in run
result = context.run(func, *args)
File "/home/shubhamwani/nice/prefect_flows/load/prefect2_tasks.py", line 166, in load_records
(
TypeError: cannot unpack non-iterable NoneType object
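Note that the traceback ends in a tuple unpack at line 166 of load_records, so the `None` is most likely coming from whatever expression is being unpacked there (e.g. a `fetchone()` that returned no row), not from `commit()` itself. For comparison, a minimal stdlib `sqlite3` sketch of the executemany-then-commit pattern (the table and column names are made up for illustration):

```python
import sqlite3

# in-memory database for illustration
connection = sqlite3.connect(":memory:")
cursor = connection.cursor()
cursor.execute("CREATE TABLE records (id INTEGER, name TEXT)")

values = [(1, "a"), (2, "b"), (3, "c")]
# executemany buffers one INSERT per tuple...
cursor.executemany("INSERT INTO records VALUES (?, ?)", values)
# ...and commit() flushes the transaction to the table
connection.commit()

cursor.execute("SELECT COUNT(*) FROM records")
(count,) = cursor.fetchone()  # fetchone() returns None when there is no row!
print(count)  # 3
```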
I've tried using different SQL connectors (mysqlclient, mysql-connector, pymysql), but the error persists.
Is this somehow related to Prefect?
Anyone else facing this issue?
Any help is welcome, thanks.

Shruti Hande
(12/26/2022, 4:40 AM)
prefect.exceptions.PrefectHTTPStatusError: Server error '500 Internal Server Error' for url '<https://api.prefect.cloud/api/accounts/2f48cb6f-0049-4569-879d-7123d9113e31/workspaces/9da885d9-9d9f-4ee9-9b42-84bfb4a3d9aa/work_queues/name/queue_5>'
Response: {'exception_message': 'Internal Server Error'}
Getting this error in Prefect Cloud; the code got stuck with this issue.
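A 500 response is a server-side failure, so client code can only wait and retry; a minimal hedged sketch of exponential backoff around an arbitrary call (`flaky` is an invented stand-in for the failing request, not anything in Prefect):

```python
import time

def call_with_retries(fn, attempts=4, base_delay=0.1):
    """Retry fn() with exponential backoff on any exception."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the original error
            time.sleep(base_delay * 2 ** attempt)

# invented stand-in for a request that fails twice, then succeeds
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise RuntimeError("500 Internal Server Error")
    return "ok"

print(call_with_retries(flaky))  # ok
```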
#prefect-community #prefect-cloud

Aditya Sharma (12/26/2022, 8:03 AM)

Aditya Sharma (12/26/2022, 8:03 AM)

Aditya Sharma (12/26/2022, 8:04 AM)

Jelle Vegter (12/26/2022, 1:32 PM)

Jelle Vegter (12/26/2022, 1:25 PM)

Ollie Sellers (12/26/2022, 9:33 PM)

Rikimaru Yamaguchi (12/27/2022, 12:58 AM)