Dylan
05/09/2022, 9:32 PM
Matt O'Brien
05/09/2022, 10:27 PM
Michelle Brochmann
05/09/2022, 10:42 PM
`.fn` returns a coroutine? I tried this:
from prefect_aws.s3 import s3_upload
...
with prefect_test_harness():
    my_upload = s3_upload.fn(bucket=S3_BUCKET_NAME, key='B5_key', data=b'55555', aws_credentials=AwsCredentials())
    asyncio.run(my_upload)
But it fails with this runtime error:
E RuntimeError: There is no active flow or task run context.
../valo-prefect-poc/.venv/lib/python3.7/site-packages/prefect/logging/loggers.py:91: RuntimeError
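For what it's worth, a toy model of this failure (pure Python, not Prefect's actual internals; all names here are invented for illustration) shows why calling the raw function outside the engine raises: the logger lookup needs an ambient run context that only the engine sets up.

```python
import asyncio
from contextvars import ContextVar

# Toy stand-in for Prefect's run context (names invented for illustration).
run_context: ContextVar[str] = ContextVar("run_context")

def get_run_logger_name():
    # Mirrors a logging helper that requires an active run context.
    try:
        return run_context.get()
    except LookupError:
        raise RuntimeError("There is no active flow or task run context.")

async def upload_fn():
    # Stand-in for a task's underlying async function (like s3_upload.fn).
    return get_run_logger_name()

async def engine_runs_task():
    # The engine sets the context before invoking the task body.
    run_context.set("task-run-123")
    return await upload_fn()

result = asyncio.run(engine_runs_task())   # works: context was set first
# asyncio.run(upload_fn())                 # raises the RuntimeError above
```

In other words, `.fn` hands back the undecorated function's coroutine, so the surrounding engine setup (including the logging context) never happens; running the task inside a flow avoids this.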
kushagra kumar
05/10/2022, 7:14 AM
pip install -U "prefect==2.0b1"
and I am encountering the below error on my machine (Ubuntu 20.04.4 LTS).
kushagra kumar
05/10/2022, 7:21 AM
Praveen Chaudhary
05/10/2022, 7:39 AM
kushagra kumar
05/10/2022, 8:01 AM
prefect 2.0. Facing the below error:
Traceback (most recent call last):
  File "/home/kku/work/prefect_poc/env/lib/python3.8/site-packages/prefect/engine.py", line 467, in orchestrate_flow_run
    result = await run_sync_in_worker_thread(flow_call)
  File "/home/kku/work/prefect_poc/env/lib/python3.8/site-packages/prefect/utilities/asyncio.py", line 52, in run_sync_in_worker_thread
    return await anyio.to_thread.run_sync(context.run, call, cancellable=True)
  File "/home/kku/work/prefect_poc/env/lib/python3.8/site-packages/anyio/to_thread.py", line 28, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(func, *args, cancellable=cancellable,
  File "/home/kku/work/prefect_poc/env/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 818, in run_sync_in_worker_thread
    return await future
  File "/home/kku/work/prefect_poc/env/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 754, in run
    result = context.run(func, *args)
  File "car_linearregression.py", line 104, in do_regression
    X,y = get_feat_and_target(df_car,target)
TypeError: cannot unpack non-iterable PrefectFuture object
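The unpack failure can be reproduced with a minimal stand-in (not Prefect code): a task call inside a flow returns a single future object, so a tuple-returning task has to be unwrapped with `.result()` before unpacking.

```python
# Minimal stand-in for a PrefectFuture (illustration only, not the real class).
class FakeFuture:
    def __init__(self, value):
        self._value = value

    def result(self):
        return self._value

def get_feat_and_target(df, target):
    # Pretend this is a @task call inside a flow: it returns one future,
    # not the (X, y) tuple the task body computes.
    return FakeFuture(("features", "target"))

future = get_feat_and_target("df_car", "price")
# X, y = future          # TypeError: cannot unpack non-iterable FakeFuture object
X, y = future.result()   # unwrap first, then unpack
```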
It's a simple serial execution where a flow function calls different task functions serially, very similar to the below tutorial on the official website.
import requests
from prefect import flow, task

@task
def call_api(url):
    response = requests.get(url)
    print(response.status_code)
    return response.json()

@task
def parse_fact(response):
    print(response["fact"])
    return

@flow
def api_flow(url):
    fact_json = call_api(url)
    parse_fact(fact_json)
    return
So far I have tried creating a new virtual env and installing the minimal packages required to run the ML model, but had no luck. Could you please help me with this?
Jan Domanski
05/10/2022, 8:25 AM
  File "/venv/lib/python3.8/site-packages/prefect/orion/orchestration/rules.py", line 534, in __aexit__
    await self.after_transition(*exit_context)
  File "/venv/lib/python3.8/site-packages/prefect/orion/database/dependencies.py", line 112, in async_wrapper
    return await fn(*args, **kwargs)
  File "/venv/lib/python3.8/site-packages/prefect/orion/orchestration/core_policy.py", line 190, in after_transition
    cache_key = validated_state.state_details.cache_key
AttributeError: 'NoneType' object has no attribute 'state_details'
Any idea how to debug/interpret this?
Ben Muller
05/10/2022, 9:36 AM
Nacho Rodriguez
05/10/2022, 9:55 AM
Elio
05/10/2022, 10:06 AM
Bhupesh Kemar Singh
05/10/2022, 10:35 AM
Danilo Drobac
05/10/2022, 10:39 AM
ModuleNotFoundError: No module named 'markupsafe'
Arthur Jacquemart
05/10/2022, 11:17 AM
Tony
05/10/2022, 1:36 PM
(flow.storage and flow.run_config) and registration flows for my enterprise. Recently we wanted to duplicate all Prefect Cloud UI logging to CloudWatch.
Inside an individual flow I can add this code to get the logs there, but I was wondering if there was a way I could do this through a central utility?
with Flow("My First Flow") as flow:
    logger = context.get("logger")
    logger.addHandler(
        watchtower.CloudWatchLogHandler(
            log_group_name="prefect-logs",
        )
    )
Aka, would something like this work?
from prefect.utilities.storage import extract_flow_from_file
flow = extract_flow_from_file("path")
flow.logger.addHandler()?
. . .
flow.register()
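One way to sketch the "central utility" idea with the standard logging module alone (watchtower is swapped for a stand-in handler here, and the function name is invented):

```python
import logging

def attach_central_handler(logger, handler):
    # Attach a handler exactly once, so the utility can be called from
    # every flow-registration path without duplicating log output.
    if handler not in logger.handlers:
        logger.addHandler(handler)
    return logger

# Stand-in for watchtower.CloudWatchLogHandler: just collects messages.
class CollectingHandler(logging.Handler):
    def __init__(self):
        super().__init__()
        self.messages = []

    def emit(self, record):
        self.messages.append(record.getMessage())

flow_logger = logging.getLogger("my-flow")
handler = CollectingHandler()
attach_central_handler(flow_logger, handler)
attach_central_handler(flow_logger, handler)  # second call is a no-op
flow_logger.warning("hello from the flow")
```

The idempotency check matters because registration utilities tend to run more than once per process; in the real setup the stand-in would be replaced by `watchtower.CloudWatchLogHandler(log_group_name="prefect-logs")`.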
Jan Domanski
05/10/2022, 2:36 PM
Jason
05/10/2022, 2:37 PM
Bob Colner
05/10/2022, 2:39 PM
prefect-gcp authentication issue/question. I'm trying to follow the example docs, but getting an error importing GcpCredentials: NameError: name 'SecretManagerServiceClient' is not defined. FYI, the prefect 1.0 GCP/BigQuery tasks are working fine in my environment. Any advice?
Benny Warlick
05/10/2022, 3:59 PM
prefect cloud login --key <MY_KEY> -w <MY_WORKSPACE>
rand="GCS_"$(cat /dev/urandom | tr -cd 'a-f0-9' | head -c 16)
output=$(printf '%s\n' 2 <MY_BUCKET> <MY_PROJECT> $rand | prefect storage create)
storage_id=$(echo $output | grep -oP "(?<=identifier \').+?(?=\')")
prefect storage set-default $storage_id
prefect deployment create my_flow.py
output=$(prefect work-queue create my_queue | grep -oP "(?<=UUID\(\').+?(?=\'\))")
prefect agent start $output
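The trickiest line in the script above is the grep extraction; it can be checked in isolation against a canned line (the sample text and ID below are made up, and `-oP` assumes GNU grep built with PCRE support):

```shell
# Canned stand-in for `prefect storage create` output (invented for the demo).
output="Registered storage 'gcs-demo' with identifier '1234-abcd'."
# The lookbehind/lookahead pair grabs everything between "identifier '" and
# the next single quote, i.e. just the identifier itself.
storage_id=$(echo "$output" | grep -oP "(?<=identifier \').+?(?=\')")
echo "$storage_id"
```

The lazy `.+?` is what stops the match at the first closing quote instead of swallowing the rest of the line.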
Bob Colner
05/10/2022, 4:14 PM
prefect-gcp issue using the bigquery_insert_stream task. I'm not able to pass Timestamp data types; getting: TypeError: Object of type Timestamp is not JSON serializable
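A common workaround is to convert timestamp values to ISO 8601 strings before the rows are JSON-encoded for the streaming insert. A sketch, using datetime as a stand-in for pandas Timestamp (the helper name and sample row are invented):

```python
import json
from datetime import datetime, timezone

def jsonable(row):
    # Replace datetime-like values with ISO 8601 strings, which BigQuery
    # accepts for TIMESTAMP columns in streaming inserts.
    return {
        key: value.isoformat() if isinstance(value, datetime) else value
        for key, value in row.items()
    }

row = {"id": 1, "ts": datetime(2022, 5, 10, 16, 14, tzinfo=timezone.utc)}
payload = json.dumps(jsonable(row))  # no TypeError now
```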
Josephine Douglas
05/10/2022, 5:42 PM
create_flow_run and wait_for_flow_run (see previous thread). The child flow takes a few hours to run, and in the meantime the parent flow decides that it must have failed and reports that the whole parent flow failed. Is there a way to extend the timeout period for wait_for_flow_run?
Billy McMonagle
05/10/2022, 5:44 PM
Jake
05/10/2022, 6:05 PM
Raviraja Ganta
05/10/2022, 6:08 PM
Bob Colner
05/10/2022, 6:47 PM
alex
05/10/2022, 7:00 PM
Kathryn Klarich
05/10/2022, 7:03 PM
Dylan
05/10/2022, 7:21 PM
Jason
05/10/2022, 7:33 PM
Is there a way to set the name of a SnowflakeQuery task, similar to @task, to give it a human-meaningful name? I didn't see it in https://docs.prefect.io/api/latest/tasks/snowflake.html
Sander
05/10/2022, 7:53 PM