# prefect-server
So, I have Prefect Server running on some dev instances and a uat/qa instance, and then production uses Prefect Cloud. I want ALL logging to go to a Graylog server via graypy. I've found that when I run the flow via the command line in the Docker container, it sends all the logging to my Graylog server. However, when I run the flow from the UI (Server or Cloud), it doesn't set my handler. Where's the appropriate place to put this:
```python
import os

import graypy
import prefect

logger = prefect.context.get('logger')
# host redacted in the original snippet
handler = graypy.GELFTLSHandler('', 12345, fqdn=False, localname=os.environ.get('PREFECT_ENV'))
logger.addHandler(handler)
```
Hi @Corris Randall, I think this sounds like the handler is being added in the Flow block. The content of the Flow block is not deferred; it is executed eagerly and builds the DAG. Adding the handler there only attaches it at registration time, not at Flow run time. You can try using `stored_as_script=True` in the Flow storage so that the Flow is not serialized and the script is executed during runtime. You can also put this inside a task (but you might need to add it in every task if you do it that way). This works because task execution is deferred, so attaching the handler happens during the flow run.
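The eager-vs-deferred distinction above can be sketched with plain stdlib logging (no Prefect or graypy needed; `ListHandler` here is just a stand-in for `graypy.GELFTLSHandler`, and the two functions are stand-ins for the Flow block body and a task body):

```python
import logging

captured = []

class ListHandler(logging.Handler):
    """Stand-in for graypy.GELFTLSHandler: collects messages in a list."""
    def emit(self, record):
        captured.append(record.getMessage())

logger = logging.getLogger("flow-demo")
logger.setLevel(logging.INFO)

def build_flow():
    # Runs eagerly, like the body of a `with Flow(...)` block: anything
    # here executes at registration/build time, not at flow-run time.
    logger.info("built")  # no handler attached yet, so nothing is captured

def run_task():
    # Runs deferred, like a task at flow-run time: attaching the handler
    # here means it exists in the process that actually executes the flow.
    logger.addHandler(ListHandler())
    logger.info("ran")

build_flow()
run_task()
print(captured)  # only the message logged after the handler was attached
```

Only `"ran"` lands in `captured`: the message logged before the handler was attached is lost, which mirrors why a handler added in the Flow block never sees flow-run logs.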
Ah, ok, I'll give that a shot. Putting it in a task doesn't really work for me, because that only captures the logs from that task; I'd like the whole execution's logs to be sent to Graylog.
That worked. The agent is actually running in the Docker container; it seems most people do it the other way around. Anyway, here's my Local flow storage for dev/qa:
```python
storage = Local(
    stored_as_script=True,
    path="/flows/" + os.path.basename(inspect.stack()[1][1]),
)
```
Argh, Slack formatting is wack. Anyway, the stack inspection is there because I'm using a setup function in a custom module to define consistent flow parameters/setup. Thanks again.
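A rough sketch of that setup-function pattern (the helper name `make_storage_path` is illustrative, not from the thread): `inspect.stack()[1][1]` is the source file of the caller, so a helper in a shared module can name the storage path after whichever flow script called it:

```python
import inspect
import os

def make_storage_path(base="/flows/"):
    # Hypothetical helper living in a shared setup module. Frame [1] of the
    # stack is the caller's frame record, and element [1] of that record is
    # the caller's source file path -- i.e. the flow script that called us.
    caller_file = inspect.stack()[1][1]
    return base + os.path.basename(caller_file)

# Called from /flows/my_flow.py this returns "/flows/my_flow.py", which can
# then be used as path= for Local(stored_as_script=True, ...).
```

This keeps each flow script free of boilerplate: every script calls the same shared setup function, and the storage path is derived automatically from the script's own filename.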
Ah ok. Sounds good!