The Prefect documentation doesn't explicitly mention a size limit for a deployment's `parameters` property in the main deployment documentation I found here. However, there is a Databricks-specific reference stating that the JSON representation of the `spark_submit_params` field cannot exceed 10,000 bytes, documented here.
If you're working with a specific integration like Databricks, it's worth checking that integration's documentation for any unique limitations. Otherwise, the general docs describe `parameters` simply as a `dict`, with no stated size constraints. Even so, it's good practice to keep the payload reasonably small to avoid potential performance issues.
If you're experiencing specific issues or errors with large parameters, you could search the Prefect GitHub issues for relevant reports, or consider trimming the amount of data you pass and referencing larger payloads indirectly (for example, by passing a storage path instead of the data itself).
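As a quick sanity check before registering a deployment, you can measure the JSON-serialized size of your parameters dict yourself. This is a minimal sketch: the helper name is mine, and the 10,000-byte budget is borrowed from the Databricks `spark_submit_params` limit mentioned above purely as an illustrative threshold, since plain Prefect deployments document no such limit.

```python
import json

# Illustrative budget only: taken from the Databricks spark_submit_params
# limit; core Prefect deployments document no explicit parameters size cap.
MAX_PARAM_BYTES = 10_000


def parameters_size_ok(parameters: dict, limit: int = MAX_PARAM_BYTES) -> bool:
    """Return True if the JSON representation of `parameters` fits the budget."""
    encoded = json.dumps(parameters).encode("utf-8")
    return len(encoded) <= limit


# A small payload fits comfortably; a large inline blob would not.
params = {"dataset": "s3://bucket/path", "batch_size": 512}
print(parameters_size_ok(params))
```

Running a check like this in CI (or just before calling your deployment code) makes oversized-parameter failures surface early, instead of at submission time inside the integration.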