Omar Khudeira (05/02/2024, 5:02 PM)

Marvin (05/02/2024, 5:02 PM)
The `spark_submit_params` field cannot exceed 10,000 bytes here.
If you're working with a specific integration like Databricks, it's worth checking that integration's documentation for any unique limitations. Otherwise, the general docs suggest that `parameters` can simply be a dict, with no size constraints mentioned. It is still good practice, however, to keep the amount of data reasonable to avoid potential performance issues.
If you're experiencing specific issues or errors with large parameters, you could search the Prefect GitHub issues for relevant reports, or consider reducing the size of the data you are passing.

Nate (05/02/2024, 5:30 PM)

Nate (05/02/2024, 5:30 PM)

Omar Khudeira (05/02/2024, 5:41 PM)

Nate (05/02/2024, 6:01 PM)

Nate (05/02/2024, 6:02 PM)

Omar Khudeira (05/02/2024, 7:15 PM)
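The 10,000-byte limit on `spark_submit_params` mentioned by Marvin above can be guarded against before a run is ever submitted. A minimal sketch follows; the helper name, the constant, and the JSON-based size measurement are assumptions for illustration, not part of any Prefect or Databricks API (only the limit value itself comes from the thread):

```python
import json

# Assumed constant: 10,000 bytes, the limit cited in the thread for
# Databricks' `spark_submit_params` field.
SPARK_SUBMIT_PARAMS_LIMIT = 10_000


def check_spark_submit_params(params: list[str],
                              limit: int = SPARK_SUBMIT_PARAMS_LIMIT) -> int:
    """Return the serialized size in bytes, raising if it exceeds `limit`.

    Hypothetical helper: measures the JSON-serialized payload as a rough
    proxy for what the API would receive.
    """
    size = len(json.dumps(params).encode("utf-8"))
    if size > limit:
        raise ValueError(
            f"spark_submit_params serializes to {size} bytes; limit is {limit}"
        )
    return size


# Small payloads pass; oversized ones fail locally, before any API call.
check_spark_submit_params(["--class", "org.apache.spark.examples.SparkPi"])
```

Running a check like this client-side turns an opaque API rejection into an immediate, descriptive error, which is one way to "optimize the size of the data" as suggested above.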