
Jacob Blanco

04/19/2022, 4:23 AM
We're trying to ship logs from our containerized flows running on EC2 to DataDog, and I'm trying to figure out the best approach. As far as I can tell we can:
a) Use the DataDog agent already on the EC2 instance to collect logs from the containers
b) Add the DataDog agent to the flow images and write a custom logger to ship the logs to statsd
c) Something else I haven't thought of
I think a) has the advantage that we don't need any custom coding in each flow and we get all the logs for all flow runs for free. Anyone have experience with this approach? According to https://docs.datadoghq.com/agent/docker/log/ all we seem to need is to enable Docker log collection on the agent and maybe configure logging in the Dockerfile.
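For reference, a sketch of what option a) looks like on the EC2 host, based on the DataDog Docker log collection docs linked above. This is a config fragment, not something to run as-is: the API key placeholder, image tag, and the `source`/`service` values in the label are assumptions to adapt.

```shell
# Run the DataDog agent on the EC2 host with Docker log collection enabled.
docker run -d --name dd-agent \
  -e DD_API_KEY=<YOUR_API_KEY> \
  -e DD_LOGS_ENABLED=true \
  -e DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true \
  -v /var/run/docker.sock:/var/run/docker.sock:ro \
  -v /var/lib/docker/containers:/var/lib/docker/containers:ro \
  gcr.io/datadoghq/agent:latest

# Optionally, tag the flow containers' logs via a Docker label so DataDog
# parses them as Python logs (values here are placeholders):
docker run --label com.datadoghq.ad.logs='[{"source": "python", "service": "prefect-flows"}]' \
  my-flow-image
```

With `DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL=true` the agent picks up stdout/stderr from every container on the instance, which is what makes this option "free" per flow.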
a

Anna Geller

04/19/2022, 9:31 AM
This GitHub discussion may be helpful
From an architecture perspective, you're right that a) seems the easiest and most straightforward, but I'm not sure whether it will collect all Prefect logs rather than just system metrics. b) is the most realistic option in the sense that you would need to configure an extra logger. The Docker option could work if you are using a Docker agent: you could add those environment variables to the DockerRun run config and test it out.
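A minimal sketch of what "add those environment variables to the DockerRun run config" could look like, assuming Prefect 1.x (where `prefect.run_configs.DockerRun` lives). The variable names and values below are assumptions about a setup where the DataDog agent runs on the EC2 host; check your agent's networking before relying on them.

```python
# Environment variables the flow container would use to reach DataDog.
# Values are placeholders, not verified defaults:
datadog_env = {
    "DD_AGENT_HOST": "172.17.0.1",  # assumption: host agent reachable via the default docker bridge
    "DD_SERVICE": "prefect-flows",  # hypothetical service tag
}

# With Prefect 1.x installed, attach the env vars to the flow's run config:
# from prefect.run_configs import DockerRun
# flow.run_config = DockerRun(image="my-flow-image", env=datadog_env)
```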
j

Jacob Blanco

04/21/2022, 12:00 AM
Thanks for all the info. I guess we should just bite the bullet now and introduce the new logger. Better do it earlier rather than later to avoid having to make changes in 100s of flows.
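A stdlib-only sketch of what such an extra logger could look like. The handler name and JSON field names are hypothetical; in a real setup `emit()` would forward each payload to DataDog (via the agent or the HTTP logs intake) instead of buffering it.

```python
import json
import logging


class DatadogShipHandler(logging.Handler):
    """Hypothetical handler: renders each log record as a JSON payload.

    Buffering stands in for the real transport; swap self.payloads.append
    for an actual send to DataDog.
    """

    def __init__(self):
        super().__init__()
        self.payloads = []  # stand-in for the real transport

    def emit(self, record):
        self.payloads.append(json.dumps({
            "message": record.getMessage(),
            "status": record.levelname.lower(),
            "logger_name": record.name,
        }))


# Attach it once to a shared logger (e.g. "prefect") so every flow
# picks it up without per-flow changes:
handler = DatadogShipHandler()
logger = logging.getLogger("prefect")
logger.setLevel(logging.INFO)
logger.addHandler(handler)
logger.info("flow run started")
```

Registering the handler in one shared place is what avoids touching hundreds of flows later.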