eddy davies
12/14/2022, 5:44 PM
I have custom blocks defined in a package with a pyproject.toml and installed through Poetry into the other projects I want to use them in. It all seems to be working, except I get a warning:
UserWarning: Block document has schema checksum sha256:3867078ba2512b836bbfbc1ee1831b44fe3e1dc4ea93da04e96ed0be9e1ef1e5 which does not match the schema checksum for class 'GCPServiceAccount'. This indicates the schema has changed and this block may not load.
return cls._from_block_document(block_document)
Should I worry about this?
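This warning usually means the installed library now defines a newer schema for GCPServiceAccount than the one stored with the saved block document. Re-registering the block types (and re-saving the block) refreshes the stored checksum — a sketch, assuming the block comes from the prefect-gcp collection:

```shell
# Refresh the registered block schemas from the installed prefect-gcp version,
# then re-save the existing block so its document picks up the new checksum.
prefect block register -m prefect_gcp
```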
Maikel Penz
12/15/2022, 2:25 AM
I'm creating a KubernetesJob block and applying it to my deployment through Prefect's Python API.
When creating the block through the Prefect Cloud UI I can get it to work fine (pull the image from ECR and run it on an EKS pod). However, when creating it through the Python API my flow fails to start, without any logging. Any ideas?
This is what I am doing:

project = "prefect20"

k8s_job_block = KubernetesJob(
    image="<AWS-ACCOUNT>.dkr.ecr.eu-west-1.amazonaws.com/kubernetes-job-block:latest"
)
k8s_job_block.save(project, overwrite=True)

# Create flow deployment
s3_block_loaded = S3.load(project)
k8s_job_block_loaded = KubernetesJob.load(project)

deployment = Deployment.build_from_flow(
    flow=log_flow,
    name=project,
    storage=s3_block_loaded,
    infrastructure=k8s_job_block_loaded,
    work_queue_name="dev_queue",
)
deployment.apply()
In my mind it must be credential-related, but I'm unsure how/where to set them up. If I go to the UI and edit the block (change any setting), my flow works.
Something must be done under the hood in the UI.
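One guess, if the difference really is credentials: the pod created from the Python-built block can't pull from ECR. KubernetesJob accepts a customizations JSON-patch list, so an imagePullSecrets entry can be attached when the block is built in Python. A sketch, assuming a registry secret named ecr-pull-secret already exists in the namespace (that name is hypothetical):

```json
[
  {
    "op": "add",
    "path": "/spec/template/spec/imagePullSecrets",
    "value": [{"name": "ecr-pull-secret"}]
  }
]
```

Passed as KubernetesJob(image=..., customizations=<that list>) before save(). Comparing the block's JSON in the UI before and after the manual edit should show exactly which field the UI fills in.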
Maikel Penz
12/15/2022, 3:17 AM
Docker(
    registry_url=registry_url,
    image_name=image_name,
    image_tag=image_tag,
    files=workflow_files,
    env_vars=env_vars,
    extra_dockerfile_commands=pip_commands,
).build(push=True)
I understand the new concept in 2.0 where we reference a block, which references an image. But by any chance are there utilities to help with building and pushing workflow base images?
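Not an authoritative answer, but I'm not aware of a 2.0 equivalent of the 1.0 Docker storage builder; plain docker (or docker-py) covers the build-and-push step the old class did. A minimal sketch with placeholder registry/image names:

```shell
# Build and push a workflow base image; registry and tag are placeholders.
docker build -t "<AWS-ACCOUNT>.dkr.ecr.eu-west-1.amazonaws.com/flow-base:latest" .
docker push "<AWS-ACCOUNT>.dkr.ecr.eu-west-1.amazonaws.com/flow-base:latest"
```

The extra_dockerfile_commands from 1.0 become ordinary RUN lines in the Dockerfile.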
Tim-Oliver
12/15/2022, 7:39 AM
I got an httpx.WriteError. After running for about 1 h and completing 371 tasks, the flow entered a crashed state with the following trace:
Arnoldas Bankauskas
12/15/2022, 8:04 AM
Can't connect to Orion API at http://172.18.0.3:4200/api. Check that it's accessible from your machine.
How do I check which IP address I need to use for Orion? I think this is more related to Docker than to Prefect, but maybe someone can help me with this :)
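If Orion runs in a Docker container, its current address can be read from Docker itself. A sketch — "orion" is a placeholder for the actual container name; alternatively, publishing the port with -p 4200:4200 lets you use the host's address instead of the container IP:

```shell
# Print the container's IP on its attached network(s).
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' orion
```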
Vio
12/15/2022, 8:13 AM

Andy Yeung
12/15/2022, 8:40 AM

Deepanshu Aggarwal
12/15/2022, 8:50 AM

Aleksandr Liadov
12/15/2022, 10:16 AM
graph = {
    "feature_1": [],
    "feature_2": [],
    "feature_3": ["feature_1", "feature_2"],
    "feature_4": ["feature_3"],
    "feature_5": ["feature_4"],
}
So how can I express that, on the first pass, feature_1 and feature_2 can run in parallel, but not the others? feature_3 can run only once feature_1 and feature_2 have completed successfully, and it takes the results of feature_1 and feature_2 as input parameters.
P.S. In production this should run with the DaskTaskRunner. Could that cause any difficulty?
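The dependency dict above can drive exactly that scheduling. Below is a plain-Python sketch of the "run in waves" idea using only the standard library; with Prefect 2 itself you'd get the same effect by calling feature_1.submit() / feature_2.submit() and passing the returned futures into feature_3 — the task runner then parallelizes whatever has no pending dependencies:

```python
from concurrent.futures import ThreadPoolExecutor

# The dependency graph from the message: key = task, value = prerequisites.
graph = {
    "feature_1": [],
    "feature_2": [],
    "feature_3": ["feature_1", "feature_2"],
    "feature_4": ["feature_3"],
    "feature_5": ["feature_4"],
}

def run_in_waves(graph, run_task):
    """Run each task as soon as all of its prerequisites have finished.

    run_task(name, dep_results) is called with the results of the task's
    prerequisites, in the order they are listed in the graph.
    """
    results, done = {}, set()
    with ThreadPoolExecutor() as pool:
        while len(done) < len(graph):
            # Everything whose prerequisites are all satisfied may run in parallel now.
            ready = [t for t in graph
                     if t not in done and all(d in done for d in graph[t])]
            if not ready:
                raise ValueError("dependency cycle detected")
            futures = {t: pool.submit(run_task, t, [results[d] for d in graph[t]])
                       for t in ready}
            for t, fut in futures.items():
                results[t] = fut.result()  # wave barrier: wait for the whole wave
            done.update(ready)
    return results
```

With the DaskTaskRunner the main caveat is that task inputs and outputs must be serializable for the Dask workers; otherwise the same future-passing pattern applies unchanged.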
Mike O'Connor
12/15/2022, 11:56 AM
11:32:09.162 | INFO | prefect.infrastructure.kubernetes-job - Job 'prefect-standard-kubernetes-jobdmmq2': Pod has status 'Pending'.
11:32:11.003 | INFO | prefect.infrastructure.kubernetes-job - Job 'prefect-standard-kubernetes-jobdmmq2': Pod has status 'Running'.
11:32:21.064 | ERROR | prefect.infrastructure.kubernetes-job - Job 'prefect-standard-kubernetes-jobdmmq2': Job did not complete.
pod snippet:
prefect-standard-kubernetes-jobdmmq2-cpfz4 0/1 OOMKilled 0 23m
Is there some configuration I need to do here, or is this unsupported, or a bug?
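OOMKilled means Kubernetes killed the container for exceeding its memory limit, so this reads as a resource setting rather than a Prefect bug. One way to raise the flow-run pod's memory is via the KubernetesJob block's customizations JSON patch — the values below are examples, not recommendations:

```json
[
  {
    "op": "add",
    "path": "/spec/template/spec/containers/0/resources",
    "value": {
      "requests": {"memory": "1Gi"},
      "limits": {"memory": "2Gi"}
    }
  }
]
```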
Vishy ganesh
12/15/2022, 1:08 PM

Joël Luijmes
12/15/2022, 2:52 PM

jack
12/15/2022, 4:14 PM

Michael Cody
12/15/2022, 4:39 PM

Ashley Felber
12/15/2022, 11:00 PM

Arnoldas Bankauskas
12/15/2022, 11:06 PM
RuntimeError: Cannot create flow run. Failed to reach API at http://127.0.0.1:4200/api/.
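127.0.0.1 from inside a container (or from an agent on another machine) points at that container/machine itself, not at the Orion server. Pointing the client profile at the server's reachable address usually resolves this — a sketch; substitute the real host or container IP:

```shell
prefect config set PREFECT_API_URL="http://<orion-host>:4200/api"
```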
Jarvis Stubblefield
12/15/2022, 11:42 PM

Aaron Goebel
12/15/2022, 11:52 PM
prefect.exceptions.MissingFlowError: Flow 'tool' not found in script 'flows/tool.py'. Found the following flows: 'monolithic-tool-flow'. Check to make sure that your flow function is decorated with `@flow`.
I have the flow defined as:

flows/tool.py
-------
@flow(name="monolithic-tool-flow")
def process_tool(): ...

and deploy like:

prefect deployment build flows/tool.py:process_tool -n process_tool -q test -ib docker-container/local --override image=sha256:4e3346168cs3a3c6c0fb256d6e705c6629a41f0f6f23405c33a0bbc3deb964690 --apply
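One guess at the mismatch: the error searches for a flow named 'tool', which suggests the deployment stored on the server still references an old flow name from an earlier build, while the script now exposes 'monolithic-tool-flow'. Rebuilding and re-applying after the rename should bring them back in sync (same command as above; add your --override image=... flag as before):

```shell
prefect deployment build flows/tool.py:process_tool -n process_tool -q test \
  -ib docker-container/local --apply
```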
Muhammad Qasim Sheikh
12/16/2022, 8:36 AM

Deepanshu Aggarwal
12/16/2022, 8:37 AM/usr/local/lib/python3.9/runpy.py:127: RuntimeWarning: 'prefect.engine' found in sys.modules after import of package 'prefect', but prior to execution of 'prefect.engine'; this may result in unpredictable behaviour
warn(RuntimeWarning(msg))
Simon Macklin
12/16/2022, 10:30 AM

Simon Macklin
12/16/2022, 10:30 AM

Jarvis Stubblefield
12/16/2022, 3:59 PM
MySQLdb.OperationalError: (1205, 'Lock wait timeout exceeded; try restarting transaction')
… any thoughts or ideas on what might be causing the additional delay?
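Error 1205 means another transaction was holding the row locks longer than innodb_lock_wait_timeout, and MySQL can show which one. Two standard diagnostics (INNODB_TRX lives in information_schema on MySQL 5.7; on 8.0 the lock tables moved to performance_schema):

```sql
-- Transactions currently open, with start time and running statement
SELECT trx_id, trx_started, trx_query
FROM information_schema.INNODB_TRX;

-- Full lock diagnostics, including the latest detected deadlock
SHOW ENGINE INNODB STATUS;
```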
Nils
12/16/2022, 4:18 PM
I want to set persist_result to False, since the object the task returns is not compatible with the pickle serializer. However, I'm receiving the following error. I'm running on 2.7.2.
Flow could not be retrieved from deployment.
Traceback (most recent call last):
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/opt/prefect/main.py", line 6, in <module>
from steps.parse.parser import parse
File "/opt/prefect/steps/parse/parser.py", line 9, in <module>
from .html_parser import process_html
File "/opt/prefect/steps/parse/html_parser.py", line 22, in <module>
@task(persist_result=False)
TypeError: task() got an unexpected keyword argument 'persist_result'
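This TypeError suggests (my guess) that the image actually running the flow has an older Prefect 2 release than the 2.7.2 used locally, from before the persist_result keyword existed. Pinning the image's Prefect to match the build environment usually clears it — e.g. in the image's pip install step:

```shell
pip install "prefect==2.7.2"
```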
Devansh Doshi
12/16/2022, 6:00 PM

Zachary Lee
12/16/2022, 6:07 PM

Deepanshu Aggarwal
12/16/2022, 6:21 PM

Andrew
12/16/2022, 6:43 PM
Add .git to the default .prefectignore

Sean Davis
12/16/2022, 7:32 PM

Patrick Tan
12/16/2022, 10:04 PM