# marvin-in-the-wild
o
@Marvin is there a limit to how much data I can pass into the "parameters" property for a deployment?
m
The Prefect documentation doesn't explicitly mention a size limit for the `parameters` property of a deployment in the main deployment docs I found here. However, there is a reference specific to deploying with Databricks, which states that the JSON representation of the `spark_submit_params` field cannot exceed 10,000 bytes here. If you're working with a specific integration like Databricks, it's worth checking that integration's documentation for any unique limitations. Otherwise, the general docs suggest that `parameters` can simply be a `dict` with no stated size constraints. It's still good practice to keep the amount of data reasonable to avoid potential performance issues. If you're hitting specific errors with large parameters, you could search the Prefect GitHub issues or consider reducing the size of the data you're passing.
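For reference, a minimal sketch of setting `parameters` on a deployment, assuming the `.serve()` API available in recent Prefect releases; the deployment name and values are illustrative:

```python
from prefect import flow

@flow
def my_flow(threshold: int = 10, labels: list | None = None):
    ...

if __name__ == "__main__":
    # `parameters` is just a dict of default parameter values for the deployment
    my_flow.serve(
        name="example-deployment",  # hypothetical name
        parameters={"threshold": 42, "labels": ["a", "b"]},
    )
```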
n
yes
well for cloud, yes
o
Is the limit for cloud 10k bytes?
n
no, it's higher - what are you passing in though?
often passing tons of data in `parameters` is not the best approach - we typically recommend passing a reference (e.g. a path to blob storage) so you can load your data inside your flow
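A minimal sketch of that pattern, assuming `fsspec` plus the relevant filesystem extra (e.g. `s3fs`) is installed; the bucket and key are hypothetical:

```python
import json

import fsspec
from prefect import flow

@flow
def process(data_ref: str):
    # The parameter stays tiny: just a string reference like
    # "s3://my-bucket/inputs/run-123.json" (hypothetical path).
    # The actual payload is loaded here, inside the flow run.
    with fsspec.open(data_ref, "r") as f:
        data = json.load(f)
    # ... work with `data` ...
    return len(data)
```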
o
I might just do that then. I have to compute these values anyway and wanted to do that before the deployment. I guess I'll generate a JSON file or something and include it as part of the image.
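If it helps, one way that could look; the file path inside the image and the compute step are hypothetical:

```python
import json

from prefect import flow

# At build time, before building the image, compute the values and write
# them to a file that the Dockerfile COPYs into the image:
#
#     values = compute_values()  # placeholder for the real computation
#     with open("precomputed.json", "w") as f:
#         json.dump(values, f)

@flow
def my_flow(config_path: str = "/app/precomputed.json"):  # hypothetical path in the image
    # At run time, load the baked-in file instead of passing it via `parameters`
    with open(config_path) as f:
        values = json.load(f)
    return values
```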