Hi,
Maybe someone can suggest the best option in a case like this:
1) We have a Linux Docker image running as a Prefect Agent, with mounted S3 storage acting as the primary storage for Flows, configs, and some resources used by tasks.
2) Each Flow is executed in a separate container using DockerRun(image="prefecthq/prefect:0.15.13-python3.8", host_config=host_config), so we have isolated Flow containers inside the Agent container :). host_config is used to mount the Agent's S3 into the Flow runtime, so that the Flow can pick up the configs and resources mounted to the Agent.
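For context, here is a minimal sketch of that setup. The mount path and image tag are illustrative assumptions; the Prefect-specific part is shown in comments because it only matters at deploy time:

```python
# Sketch of the setup above; the paths are assumptions, not our real ones.
# host_config is the dict the Prefect Docker agent forwards to docker-py's
# APIClient.create_host_config(), so docker-py keys like "binds" apply.
host_config = {
    "binds": [
        # hypothetical mount point where the Agent exposes its S3 storage
        "/mnt/s3:/mnt/s3:ro",
    ],
}

# With Prefect 0.15.x this would be attached to the Flow roughly like:
#
#   from prefect.run_configs import DockerRun
#   flow.run_config = DockerRun(
#       image="prefecthq/prefect:0.15.13-python3.8",
#       host_config=host_config,
#   )
```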
The tricky thing is how to pass connection strings into the Flow container.
1) I thought we could pass them as arguments via DockerRun(..., environment=environment) according to https://docker-py.readthedocs.io/en/stable/api.html#module-docker.api.container, but we would need to prepare the list ourselves, which means the values end up in plain text. Not good.
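What I mean by "prepare the list" is something like the sketch below. The secret names are hypothetical; reading the values from the Agent's own environment at least keeps them out of the Flow source, but the plain-text concern remains (anyone who can run `docker inspect` on the Flow container still sees them):

```python
import os

# Hypothetical secret names; the values would come from the Agent's
# environment at deploy time rather than being hardcoded in the Flow file.
SECRET_NAMES = ["DB_CONN_STRING", "API_TOKEN"]

def build_environment(names):
    """Build the KEY=VALUE list that docker-py accepts for `environment`."""
    return [f"{name}={os.environ[name]}" for name in names if name in os.environ]

# This list would then be passed as
#   DockerRun(..., environment=build_environment(SECRET_NAMES))
# Caveat: the values are still visible in plain text via `docker inspect`.
```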
2) Another option is to define a dictionary in the Flow and have the CI/CD tool replace the variables during deployment, but then the values would be stored in plain text in the serialized Flow on storage. This might be an option if the serialized Prefect Flow could be encrypted.
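A rough sketch of that placeholder approach, assuming the CI/CD tool does simple template substitution (the connection names and variables are made up for illustration):

```python
from string import Template

# Hypothetical placeholder dict as it would sit in the Flow source;
# the CI/CD tool substitutes the real values at deployment time.
CONNECTIONS = {
    "warehouse": "$WAREHOUSE_CONN_STRING",
    "queue": "$QUEUE_CONN_STRING",
}

def render(connections, values):
    """Substitute CI/CD-provided values into the placeholder dict."""
    return {k: Template(v).safe_substitute(values) for k, v in connections.items()}

# After substitution, the real strings get serialized with the Flow,
# which is exactly the plain-text-on-storage concern described above.
```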
3) ??? maybe some other ideas?