Hi, I have some flows that deal with geospatial data and accept as input an area of interest expressed as GeoJSON, a JSON-based format for encoding geo-referenced shapes. Those shapes can be very complex, which makes for extremely verbose JSON parameters when running deployments. I'm working with a Prefect Server installed on-premise.
In principle this should not be an issue, except that there are various places in the UI where multiple flow runs are retrieved through the API along with their parameters. When that happens, the server becomes extremely slow, which affects both dashboard users and flow run execution through workers.
Now I see two possible explanations:
1. the Prefect Server and UI are not optimized to handle very large JSON parameters, so the best practice is to save the GeoJSON to a file, pass the file path as a parameter, and let the flow parse it (see the sketch after this list)
2. it is just a matter of resources and my server should simply be better equipped - i.e. if I were working on Prefect Cloud everything would probably be quite smooth
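For what it's worth, here is a minimal sketch of what I mean by option 1, assuming Prefect 2.x and a filesystem (or mounted object store) reachable by the workers. The names `process_area_of_interest`, `load_geojson`, and `aoi_path` are just illustrative, not from my actual flows:

```python
# Sketch of option 1: pass a small path string through the Prefect API
# instead of the full GeoJSON body, and parse the file inside the run.
import json
from pathlib import Path

from prefect import flow, task


@task
def load_geojson(aoi_path: str) -> dict:
    # Only this short string travels through the API as a parameter;
    # the (potentially huge) GeoJSON is read at execution time.
    return json.loads(Path(aoi_path).read_text())


@flow
def process_area_of_interest(aoi_path: str) -> None:
    aoi = load_geojson(aoi_path)
    # ... downstream geospatial work on `aoi` ...
    print(f"Loaded {aoi['type']} with {len(aoi.get('features', []))} features")


if __name__ == "__main__":
    # Hypothetical path on storage shared between the caller and the workers.
    process_area_of_interest("/shared/aoi/region_42.geojson")
```

This keeps the flow run's stored parameters tiny, at the cost of needing storage that both the triggering side and the workers can reach.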
I believe similar situations must arise in other domains as well, whenever big objects need to be passed to flows or subflows. The solution probably lies somewhere between those two bullet points; I'm just curious how others deal with such cases, with the aim of deriving some best practices :)