Alex Cano

almost 6 years ago
Is there any way to limit concurrency when mapping over a task? I want to send a bunch of API requests, but the endpoint I'm hitting is pretty fragile. I want the exact map functionality (feed in a list and have each item execute), but with a limited execution rate: either sequentially or capped at N concurrent requests. I've achieved similar functionality by arbitrarily breaking the requests into chunks, but then each chunk succeeds or fails as a group, which I'm hoping to avoid.
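In current Prefect releases, tag-based concurrency limits are one way to cap how many mapped task runs execute at once. A minimal sketch, assuming Prefect 2.x/3.x (where `prefect concurrency-limit create <tag> <n>` is available) and a hypothetical `call_api` task:

```python
from prefect import flow, task

@task(tags=["fragile-api"])  # runs carrying this tag share one concurrency limit
def call_api(item):
    ...  # hypothetical request logic

@flow
def send_requests(items: list):
    # With `prefect concurrency-limit create fragile-api 5` set up beforehand,
    # at most 5 of these mapped runs execute at a time.
    return call_api.map(items)
```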

Nash Taylor

10 months ago
I'm nearly there on getting my first successful flow run with an ECS work pool, but I keep getting a Crashed run with this error:
Flow run infrastructure exited with non-zero status code:
 Exited with non 0 code. (Error Code: 1)
This may be caused by attempting to run an image with a misspecified platform or architecture.
I've been following this guide (https://docs.prefect.io/3.0/deploy/infrastructure-examples/serverless) and the `prefect work-pool create` step went fine. What am I missing here?
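The "misspecified platform or architecture" hint often points at an image built for the wrong CPU architecture (for example, an arm64 image built on Apple Silicon running on amd64 Fargate). One possible fix is pinning the build platform at deploy time; a sketch, assuming Prefect 3.x's `DockerImage` (which forwards extra kwargs such as `platform` to the image build) and hypothetical flow/registry names:

```python
from prefect import flow
from prefect.docker import DockerImage

@flow
def my_flow():
    ...  # hypothetical flow body

if __name__ == "__main__":
    my_flow.deploy(
        name="ecs-deployment",           # hypothetical names throughout
        work_pool_name="my-ecs-pool",
        image=DockerImage(
            name="my-registry/my-image",
            tag="latest",
            platform="linux/amd64",  # match the ECS task's CPU architecture
        ),
    )
```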

Denis

11 months ago
Please help.

Jonathan Mathews

about 3 years ago
Hi! I want to debug a flow-of-flows on my local machine, but the flow runs created using `create_flow_run` seem to be executing on my production Prefect Cloud environment. How do I ensure that these flow runs are all executed on my local machine?
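Since this is the Prefect 1.x `create_flow_run` task, which always schedules runs through the backend, one local-debugging approach is to bypass orchestration and call the child flows directly. A minimal sketch, assuming Prefect 1.x (where `flow.run()` executes in-process without creating a backend run) and hypothetical flow objects:

```python
# Hypothetical child flows living in a hypothetical module.
from my_flows import child_flow_a, child_flow_b

for child in (child_flow_a, child_flow_b):
    state = child.run()  # runs locally, never touches Prefect Cloud
    assert state.is_successful(), f"{child.name} failed: {state}"
```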

Michal

about 1 year ago
@Marvin, please show me an example of a base job template for the Docker work pool type, where custom env vars, image, and volumes are customized.
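For reference, `prefect work-pool get-default-base-job-template --type docker` prints the full default template to start from. A trimmed sketch of what a customized template might look like; the exact keys come from the installed Docker worker version, so treat the fields below as illustrative:

```json
{
  "job_configuration": {
    "image": "{{ image }}",
    "env": "{{ env }}",
    "volumes": "{{ volumes }}"
  },
  "variables": {
    "type": "object",
    "properties": {
      "image": {
        "type": "string",
        "default": "my-registry/my-image:latest"
      },
      "env": {
        "type": "object",
        "default": {"MY_VAR": "value"}
      },
      "volumes": {
        "type": "array",
        "default": ["/host/data:/container/data"]
      }
    }
  }
}
```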

Revanth

over 2 years ago
Hi, is there a way to deploy multiple data pipelines (one pipeline daily, another weekly, etc.)? Would appreciate the help, thanks in advance.
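One way to do this in recent Prefect releases is to create one deployment per schedule. A minimal sketch, assuming Prefect 2.x+ with `serve` and hypothetical pipeline flows:

```python
from prefect import flow, serve

@flow
def daily_pipeline():
    ...  # hypothetical daily work

@flow
def weekly_pipeline():
    ...  # hypothetical weekly work

if __name__ == "__main__":
    # Each deployment carries its own cron schedule.
    serve(
        daily_pipeline.to_deployment(name="daily", cron="0 6 * * *"),
        weekly_pipeline.to_deployment(name="weekly", cron="0 6 * * 1"),
    )
```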

Mia

over 3 years ago
Hi, when I run Prefect server with my custom docker image, I get the following error:
```
Failed to load and execute flow run: ModuleNotFoundError("No module named 'pandas'")
```
I don't think it's coming from a missing pandas module: when I log into my docker image, I have no problem importing pandas. I've installed all the required packages in the base environment. I've also tried installing all the packages in a conda environment and activating that environment when the docker image loads. Is this some Python path issue? What I don't understand is that I've done this several times before and didn't have this issue until now.
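One way to confirm which interpreter the flow actually runs under (base env vs. conda env) is to log it from inside a task. A minimal diagnostic sketch in the Prefect 1.x style of this era:

```python
import sys
import prefect
from prefect import Flow, task

@task
def report_environment():
    # Shows whether the run uses the interpreter pandas was installed into.
    logger = prefect.context.get("logger")
    logger.info("executable: %s", sys.executable)
    logger.info("sys.path: %s", sys.path)

with Flow("env-check") as flow:
    report_environment()
```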

KG

over 1 year ago
@Marvin I have a long-running task that loops through a long list. Occasionally this task fails during the loop. How would you recommend I save the last entry processed so that, on a retry, the task does not iterate over the records already processed in the previous task run?
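One simple pattern is to persist a cursor after each record and resume from it on retry. A minimal sketch using a local checkpoint file and hypothetical helpers; note it assumes retries run on the same filesystem, otherwise persist the cursor to object storage or a Prefect variable instead:

```python
import json
from pathlib import Path
from prefect import task

CHECKPOINT = Path("checkpoint.json")  # hypothetical checkpoint location

@task(retries=3)
def process_records(records: list):
    # Resume from the last saved position, if any.
    start = json.loads(CHECKPOINT.read_text())["next"] if CHECKPOINT.exists() else 0
    for i in range(start, len(records)):
        handle_record(records[i])  # hypothetical per-record work
        CHECKPOINT.write_text(json.dumps({"next": i + 1}))
    CHECKPOINT.unlink(missing_ok=True)  # clean up after a full pass
```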

Mars

about 3 years ago
Hi all, I'm having trouble diagnosing a GitHub storage problem. I've created a trivial testing flow similar to the example script-based workflow for GitHub. I've deployed a k8s agent using `prefect k8s agent install`. I've uploaded my flow to a private GitHub repo and registered it with Prefect. And I've added a Cloud Secret called `GITHUB_ACCESS_TOKEN` that holds a valid GitHub personal access token. When I run my flow, the agent's GitHub storage gives me an `UnknownObjectException(404, 'Not Found')` error. If I change the flow to use a different Cloud Secret key for the PAT, such as `access_token_secret='MYKEY'`, then the agent tells me `ValueError('Local Secret "MYKEY" was not found.')`. How can I introspect the kubernetes agent to verify that the GitHub PAT secret is being loaded from Prefect Cloud correctly?
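The "Local Secret ... was not found" message suggests the agent's flow runs are resolving secrets locally rather than from Cloud. One way to probe this from inside the same runtime is a tiny flow that resolves the secret itself; a sketch in the Prefect 1.x API:

```python
from prefect import Flow, task
from prefect.client import Secret

@task
def check_secret():
    # Resolves from Prefect Cloud unless local secrets are enabled
    # (prefect.config.cloud.use_local_secrets).
    token = Secret("GITHUB_ACCESS_TOKEN").get()
    return len(token)  # log only the length, never the token itself

with Flow("secret-check") as flow:
    check_secret()
```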

Aaron Gonzalez

about 2 years ago
Hello everybody! I've been playing around with the idea of self-hosting Prefect server on a VM instance using docker compose. The following posts have gotten me pretty much all the way to a working self-hosted option:
• https://dev.to/calvinqc/the-easiest-docker-docker-compose-setup-on-compute-engine-1op1
• https://medium.com/the-prefect-blog/how-to-self-host-prefect-orion-with-postgres-using-docker-compose-631c41ab8a9f
I am able to just set my local `PREFECT_API_URL` env var to the external IP address of my VM instance (not the nicest solution, but I can clean this up later), and then I can immediately push my blocks and deployments to the server! 👍 The problem comes when I launch a run of one of my test deployments that uses a Cloud Run Job: it never gets out of the "Pending" status in the Prefect UI. I'm pretty sure it's something with my networking and probably not an actual issue with Prefect, but I can't seem to make any progress. I've tried setting up Serverless VPC Access connectors to allow the Cloud Run Job to connect to the VM that hosts the server, but maybe I've set it up incorrectly 🤷. Has anyone else managed to get a Cloud Run Job deployment to work with a VM-hosted Prefect server?
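For reference, a stripped-down compose file along the lines those posts describe might look like this. A sketch, assuming Prefect 2.x-era images; the "Pending" symptom usually means the Cloud Run Job side cannot reach the API, so whatever address goes in `PREFECT_API_URL` must be routable from the job's network:

```yaml
# a minimal sketch, assuming Prefect 2.x; ports/addresses are illustrative
services:
  prefect-server:
    image: prefecthq/prefect:2-latest
    command: prefect server start --host 0.0.0.0
    environment:
      # Address the UI (and remote clients) use to reach the API;
      # must be reachable from the Cloud Run Job's network.
      PREFECT_API_URL: http://<vm-external-ip>:4200/api
    ports:
      - "4200:4200"
```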