is this "Flow run parameters cannot exceed 512kb in size" still a rule for when you pass parameters to a Deployment?
Marvin
04/26/2024, 10:20 PM
The current Prefect documentation does not explicitly state the 512kb limit for flow run parameters when passing them to a Deployment. However, there is a mention of a JSON representation size limit for specific parameter fields related to Databricks job tasks, such as `jar_params` and `spark_submit_params`, which cannot exceed 10,000 bytes, but this is specific to Databricks and not a general Prefect deployment limitation.
For the 512kb limit itself, it's advisable to check the latest documentation or reach out to the Prefect maintainers for the most up-to-date information, as my searches did not turn up such a limit.
For specific details on Databricks flows, you can refer to the Databricks Prefect documentation here: Prefect Databricks Integration.
For general concerns about rate limits and other limitations in Prefect Cloud, this might be useful: Prefect Cloud Rate Limits.
Nate
04/26/2024, 10:22 PM
is this "Flow run parameters cannot exceed 512kb in size" still a rule for when you pass parameters to a Deployment?
short answer, yes!
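(For reference, a minimal sketch of the call that limit applies to, assuming Prefect 2.x; the deployment name and parameter values below are hypothetical, and the JSON-serialized parameters dict is what counts toward the cap:)
```python
# Minimal sketch, assuming Prefect 2.x: trigger a run of an existing deployment
# with parameters. "my-flow/my-deployment" and the parameter values are
# hypothetical placeholders.
from prefect.deployments import run_deployment

flow_run = run_deployment(
    name="my-flow/my-deployment",
    # The serialized parameters are what the 512kb limit applies to,
    # so a very large list here can be rejected.
    parameters={"customer_ids": [101, 102, 103]},
)
print(flow_run.id)
```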
Nate
04/26/2024, 10:22 PM
why do you ask?
Jack P
04/26/2024, 10:23 PM
I just wanted to see if there were a few extra 0s at the end of 512 for Deployments, since I was trying to pass a giant list of IDs to fetch, but I think this indicates I need to fetch those IDs within the deployment.
Thank you for confirming 🙏 Lemme know if you can sneak some 0s into the 512 😉
Nate
04/26/2024, 10:24 PM
"fetch those IDs within deployment"
yes! our rec is to pass a reference and load inside when possible
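(A minimal sketch of that pattern, assuming Prefect 2.x; `fetch_ids_for` and the "all-customers" reference are hypothetical stand-ins for whatever lookup the real flow would do:)
```python
# Minimal sketch, assuming Prefect 2.x, of "pass a reference and load inside":
# only a small reference string is sent as a flow run parameter, and the full
# ID list is resolved inside the flow run. fetch_ids_for and "all-customers"
# are hypothetical stand-ins for a real database / object-store lookup.
from prefect import flow, task


@task
def fetch_ids_for(reference: str) -> list[int]:
    # Hypothetical lookup: in practice this would run a query or read an
    # object-store key identified by the small reference string.
    return list(range(1_000_000)) if reference == "all-customers" else []


@task
def process(ids: list[int]) -> int:
    # Placeholder work: just count the IDs.
    return len(ids)


@flow
def process_records(reference: str) -> int:
    # The giant ID list never passes through deployment parameters, so the
    # 512kb parameter limit only ever sees the short reference string.
    ids = fetch_ids_for(reference)
    return process(ids)
```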