# prefect-cloud
m
Hey team - I am running Prefect jobs within a Docker container, and I still cannot seem to get the logging to the API to turn off when I am just running locally. Is there a better way to do this? Nothing seems to prevent the runs from showing up in the cloud and triggering our alarms for failed runs.
```python
if __name__ == '__main__':
    import os
    os.environ["PREFECT_LOGGING_ORION_ENABLED"] = "False"
    # with temporary_settings(set_defaults={PREFECT_LOGGING_TO_API_ENABLED: False}):
    with use_profile('prod'):
        run_my_flow(x)
```
Oh and we are using Prefect 2.10.20
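For context on the snippet above: in Prefect 2.x the `PREFECT_LOGGING_ORION_ENABLED` setting was renamed to `PREFECT_LOGGING_TO_API_ENABLED`, and setting it in the shell before the interpreter starts avoids it being picked up too late. A minimal sketch (the script name is a placeholder, not from this thread):

```shell
# Current 2.x name for the old PREFECT_LOGGING_ORION_ENABLED setting;
# export it before launching Python so it applies from process start.
export PREFECT_LOGGING_TO_API_ENABLED=false
python run_my_flow.py  # placeholder script name
```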
c
Hey Mitch - to clarify, are you trying to only disable logging or are you trying to avoid sending all run state updates to the Cloud API?
m
trying to disable sending run states to the API just for local testing. I guess it would be fine if it logged to the API, but when I am testing new applications we get a bunch of notifications due to the runs failing. In Prefect 1, runs executed locally didn't log to the cloud API. We have an Enterprise subscription for Prefect
c
great, that makes sense - the most robust way to do this is through profiles. For example, I have a profile that points my API URL at a locally running instance of `prefect server start` (so it's not talking to Cloud at all), so I'll do:
```shell
prefect profile use local
# python run-my-flow.py or whatever
```
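For reference, a profile like that lives in `~/.prefect/profiles.toml`. A minimal sketch of what the two profiles might look like (the account/workspace IDs and key are placeholders, not from this thread):

```toml
# ~/.prefect/profiles.toml (sketch)
[profiles.local]
# Points at a locally running `prefect server start` instance
PREFECT_API_URL = "http://127.0.0.1:4200/api"

[profiles.prod]
# Points at Prefect Cloud; IDs and key are placeholders
PREFECT_API_URL = "https://api.prefect.cloud/api/accounts/<account-id>/workspaces/<workspace-id>"
PREFECT_API_KEY = "<your-api-key>"
```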
if you are heavily relying on Cloud features within the flow itself, this is where a separate workspace comes in handy (i.e. a "Sandbox" workspace where anything goes), and then `prefect profile use sandbox` to switch APIs
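Wiring up such a sandbox profile could look like this, assuming the Prefect 2.x CLI (the workspace handle is a placeholder):

```shell
# Create and switch to a profile for the sandbox workspace
prefect profile create sandbox
prefect profile use sandbox

# Point the active profile at the sandbox workspace
# (account/workspace handle is a placeholder)
prefect cloud workspace set --workspace "<account-handle>/<sandbox-workspace>"
```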
m
Yeah that makes sense - we rely on workspace-specific blocks such as secrets in many of the runs, however. Hmm
c
gotcha, yeah that makes sense; we could probably expose a flag for disabling all create/update activity and instead make it a "read-only" run that never gets an ID. I'm not immediately sure if there are edge cases or "gotchas" in that approach, but I'd be game to look into it.