Ah, ok, I think this just became clear to me, but I'll rephrase just to make sure I'm understanding:
• I can call flows from my Python code all I want without creating Deployments. In this case it would also be my API code that's responsible for invoking them. It sounds like if we wanted to keep this appropriately decoupled from our webapp backend, we'd write a thin wrapper API that kicks off the flows in this manner (rough sketch after this list).
• If I want to use the Orion API to trigger the flows, then they need to be turned into Deployments, but there is a mechanism to do this via an API (second sketch below). (Though I imagine doing this for ad hoc queries could dirty up the system over time, so maybe it doesn't make the most sense.)
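For the first point, here's a minimal sketch of what I mean by a thin wrapper, assuming FastAPI and a made-up `etl_flow` standing in for one of our real DAGs; the flow function is just called directly, so no Deployment is involved:

```python
from fastapi import FastAPI
from prefect import flow, task

app = FastAPI()

@task
def extract(source: str) -> list[str]:
    # placeholder for real extraction logic
    return [source.upper()]

@flow
def etl_flow(source: str) -> list[str]:
    # hypothetical flow; a real one would chain more tasks
    return extract(source)

@app.post("/runs")
def kick_off_run(source: str):
    # calling the flow function runs it in-process; the run still
    # shows up in the Orion UI because the decorator reports it
    result = etl_flow(source)
    return {"result": result}
```

(In practice we'd probably hand this off to a background task so the HTTP request doesn't block for the whole flow run, but the shape is the same.)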
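And for the second point, this is roughly how I understand the Deployment route would work. Going off the Prefect 2.x `prefect.deployments` helpers; the names here are illustrative and the beta API surface may differ:

```python
from prefect import flow
from prefect.deployments import Deployment, run_deployment

@flow
def etl_flow(source: str = "default") -> str:
    # same hypothetical flow as above
    return source.upper()

# register the flow as a Deployment so the Orion API can trigger it
deployment = Deployment.build_from_flow(flow=etl_flow, name="adhoc")
deployment.apply()

# later, any client can trigger it through the API by name;
# an agent must be polling the work queue to actually execute it
flow_run = run_deployment(name="etl-flow/adhoc")
print(flow_run.id)
```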
I saw some good blog posts on Prefect 1.0 code organization recommendations, and I think that would be a valuable thing to do eventually for Orion as well. Getting it up and running locally was super easy, and I love how things I'm running in the console magically show up in the server. It became less clear to me how I'd manage large codebases of tasks and flows, what the entrypoints would look like, and how these get packed into Docker containers. Anyway, I think this just comes down to trying to prototype one of our current Airflow DAGs.
Not intending to derail this thread, though. I really appreciate the help!