Kiley Roberson

about 2 years ago
@Marvin is it possible to set nodeSelector and tolerations on a work pool so that all flows defined in my prefect.yaml that are associated with that work pool will be scheduled on a specific node pool?
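A rough sketch of one approach, assuming the work pool's base job template has been extended with custom node_selector and tolerations variables (they are not in the default Kubernetes worker template); the pool, image, and selector values are placeholders:

from prefect import flow

@flow
def my_flow():
    ...

if __name__ == "__main__":
    my_flow.deploy(
        name="gpu-deployment",
        work_pool_name="k8s-pool",             # placeholder work pool
        image="my-registry/my-image:latest",   # placeholder, pre-built image
        build=False,
        push=False,
        job_variables={
            # assumed custom variables wired into the pod spec of the
            # work pool's base job template
            "node_selector": {"nodegroup": "gpu"},
            "tolerations": [
                {"key": "gpu", "operator": "Exists", "effect": "NoSchedule"}
            ],
        },
    )

If the values are instead hard-coded into the base job template's pod spec, every deployment on the pool inherits them without any per-deployment configuration.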
Kevin Takano

over 2 years ago
Hello, suppose I have a Prefect server on EC2 (AWS). Can I run a flow in a SageMaker processing job and see the flow run on the EC2 server?
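A minimal sketch, assuming the EC2-hosted API is reachable from the SageMaker job's network; the image URI, role ARN, host, and script name are placeholders. Pointing PREFECT_API_URL at the server makes the flow run report back to it:

from sagemaker.processing import ScriptProcessor

processor = ScriptProcessor(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/prefect-flow:latest",  # placeholder
    command=["python3"],
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder
    instance_count=1,
    instance_type="ml.m5.xlarge",
    # the flow run reports to the EC2-hosted server via this URL
    env={"PREFECT_API_URL": "http://<ec2-host>:4200/api"},
)
processor.run(code="flow_script.py")  # flow_script.py calls the @flow function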
Tomás Emilio Silva Ebensperger

about 3 years ago
For some reason, the create_markdown artifact is rendering the markdown but ignoring the images. Has anyone had this issue?
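A small sketch with Prefect 2's create_markdown_artifact, on the assumption that the images fail because the UI fetches them client-side: only absolute, publicly reachable URLs render, while local file paths are silently ignored. The key and URL are placeholders:

from prefect import flow
from prefect.artifacts import create_markdown_artifact

@flow
def report():
    create_markdown_artifact(
        key="daily-report",  # placeholder key
        # the image must be an absolute URL the browser can reach;
        # a path on the worker's filesystem will not render
        markdown="# Results\n\n![chart](https://example.com/chart.png)",
    )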
Jasono

almost 5 years ago
Hi, a newbie question… If the trigger for all tasks in a flow is “all successful” and one of the upstream tasks fails, all of the downstream tasks are “trigger failed” (even if the failed upstream task is restarted and succeeds). This forces me to restart the trigger-failed tasks one by one. Is this the intended behavior, or am I missing something? Should I change the trigger to “all finished” to avoid manually restarting the downstream tasks when an upstream task fails at least once?
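For reference, a minimal Prefect 1 sketch of the trigger the question mentions: all_finished lets a downstream task run once its upstream tasks finish in any state, so a restarted-and-successful upstream no longer leaves it in TriggerFailed. Task names are placeholders:

from prefect import task, Flow
from prefect.triggers import all_finished

@task
def extract():
    return 1

@task(trigger=all_finished)  # runs once upstream finishes, success or failure
def load(x):
    print(x)

with Flow("example") as flow:
    load(extract())

Note that with all_finished the downstream task also runs after a genuine upstream failure, so it may need to handle a failed upstream result.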
Lee Mendelowitz

almost 2 years ago
We’re having trouble using Blocks with SecretStr on the latest Prefect (2.14.5) with pydantic 2 (2.5.1). When I paste the example from the Prefect docs into an IPython session, I get an error:
from typing import Optional

from prefect.blocks.core import Block
from pydantic import SecretStr

class AWSCredentials(Block):
    aws_access_key_id: Optional[str] = None
    aws_secret_access_key: Optional[SecretStr] = None
    aws_session_token: Optional[str] = None
    profile_name: Optional[str] = None
    region_name: Optional[str] = None
RuntimeError: no validator found for <class 'pydantic.types.SecretStr'>, see `arbitrary_types_allowed` in Config
Anyone have any ideas? I know pinning pydantic<2 will make the error go away, but since Prefect supports pydantic 2, I’m trying to understand if this is a Prefect bug or if we need to do something differently with our blocks on pydantic 2.
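A sketch of one workaround from that era, assuming Prefect 2.14.x still validates Block fields with pydantic v1 under the hood even when pydantic 2 is installed: importing SecretStr from the pydantic.v1 compatibility namespace gives the Block a type it knows how to validate:

from typing import Optional

from prefect.blocks.core import Block
from pydantic.v1 import SecretStr  # v1-compatible type from pydantic 2's compat namespace

class AWSCredentials(Block):
    aws_access_key_id: Optional[str] = None
    aws_secret_access_key: Optional[SecretStr] = None
    aws_session_token: Optional[str] = None
    profile_name: Optional[str] = None
    region_name: Optional[str] = None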
Amey Desai

almost 3 years ago
Has anybody used Prefect to write data into Salesforce?
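One possible pattern, sketched with the third-party simple-salesforce client wrapped in a task; the credentials and record fields are placeholders:

from prefect import flow, task
from simple_salesforce import Salesforce

@task
def create_contact(record: dict):
    sf = Salesforce(
        username="user@example.com",  # placeholder credentials
        password="***",
        security_token="***",
    )
    sf.Contact.create(record)  # write a record to the Contact object

@flow
def load_to_salesforce():
    create_contact({"LastName": "Doe", "Email": "doe@example.com"})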
juana

about 1 year ago
@Marvin how do I clean up Docker containers after a flow run when using the Docker worker?
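A sketch of one option, assuming the prefect-docker worker's auto_remove job variable is what is wanted: the worker then deletes each container when its flow run exits. Pool and image names are placeholders:

from prefect import flow

@flow
def my_flow():
    ...

if __name__ == "__main__":
    my_flow.deploy(
        name="docker-deployment",
        work_pool_name="docker-pool",  # placeholder pool
        image="my-image:latest",       # placeholder, pre-built image
        build=False,
        push=False,
        # ask the worker to remove the container once the run finishes
        job_variables={"auto_remove": True},
    )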
syakesaba

about 1 year ago
Hi all, after upgrading from Prefect 2 to 3, the run graph always says "This flow run did not generate any task or subflow runs" in the "Runs" pane for deployments on a Kubernetes worker. Somehow, when I execute the same flow on another workload, it works. Why? The run graph shows up for local execution but not for the Kubernetes worker.
Arthur

over 1 year ago
@Marvin is it possible to upload a file as the input to a flow?
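A minimal sketch of a common workaround, assuming there is no file-upload widget in the UI: pass a path or URL (or the raw contents) as an ordinary flow parameter and read it inside the flow. The URL is a placeholder:

import urllib.request

from prefect import flow

@flow
def process_file(file_url: str):
    # fetch the "uploaded" file from wherever the caller staged it
    with urllib.request.urlopen(file_url) as resp:
        data = resp.read()
    return len(data)

if __name__ == "__main__":
    process_file("https://example.com/input.csv")  # placeholder URL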
Joish

over 1 year ago
@Marvin how do I get the return value of a flow using the flow ID?
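A sketch using the Prefect client, assuming "flow ID" means the flow run ID and that the flow persisted its result (e.g. persist_result=True); the ID below is a placeholder:

import asyncio

from prefect import get_client

async def fetch_result(flow_run_id: str):
    async with get_client() as client:
        flow_run = await client.read_flow_run(flow_run_id)
    # pulls the persisted return value out of the run's final state
    return await flow_run.state.result(fetch=True)

print(asyncio.run(fetch_result("<flow-run-id>")))  # placeholder ID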

Prefect Community

Bring your towel and join one of the fastest growing data communities. Welcome to our second-generation open source orchestration platform, a completely rethought approach to dataflow automation.
