# prefect-integrations
k
Hello! I am dealing with a logger problem. While I am using the pyspark-kafka module with a custom StreamingQueryListener, I can't see any outputs/logs in the FlowRun area for my run. Do you have any ideas how to add them? Thanks!
n
hi @Kacper K - have you tried setting `PREFECT_LOGGING_EXTRA_LOGGERS=pyspark-kafka`? that will work if `pyspark-kafka` defines a custom python logger and that's what you want to capture
k
Can I set `PREFECT_LOGGING_EXTRA_LOGGERS` to 2 values?
Then I would add both the custom logger and `pyspark-kafka`
n
yes! it can be a comma delimited string
```
» PREFECT_LOGGING_EXTRA_LOGGERS=foo,bar ipython

In [1]: from prefect.settings import PREFECT_LOGGING_EXTRA_LOGGERS

In [2]: PREFECT_LOGGING_EXTRA_LOGGERS.value()
Out[2]: ['bar', 'foo']
```
k
Should I restart the worker to see results?
n
if it's a process worker then yes, otherwise it shouldn't matter
you'll want to set this where your flow code is running, which is not the actual worker process (unless it's a process worker)
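e.g. a minimal sketch - passing the env var through the deployment's job variables so it lands in the flow-run environment rather than on the worker itself (the work pool name, image, and logger names below are placeholders, and the exact job variable shape depends on your work pool type and Prefect version):
```python
from prefect import flow


@flow(log_prints=True)
def spark_streaming_flow():
    ...  # flow code that registers your StreamingQueryListener


if __name__ == "__main__":
    # Sketch: set PREFECT_LOGGING_EXTRA_LOGGERS in the flow run's environment
    # via the deployment's job variables, so it takes effect where the flow
    # code actually runs. Pool/image/logger names here are placeholders.
    spark_streaming_flow.deploy(
        name="spark-streaming",
        work_pool_name="my-docker-pool",
        image="my-registry/spark-flow:latest",
        job_variables={
            "env": {"PREFECT_LOGGING_EXTRA_LOGGERS": "pyspark-kafka,my-custom-logger"}
        },
    )
```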
k
it is quite weird, I can see logs on the worker screen, but not in the UI, should I enable some function?
This is a warning:
```
logger.py:53: UserWarning: Logger 'ds-data-logger' attempted to send logs to the API without a flow run id. The API log handler can only send logs within flow run contexts unless the flow run id is manually provided.
```
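That warning usually means the logger is being called outside the flow run context (Spark listener callbacks fire on their own thread), so the API log handler can't infer which run the logs belong to. One possible workaround, going by the "unless the flow run id is manually provided" part of the warning, is to capture the flow run id inside the flow and attach it to each log call via `extra`. A rough sketch, assuming a listener class along these lines (the callback name is illustrative, not the real pyspark hook):
```python
import logging

from prefect import flow
from prefect.context import get_run_context


class FlowRunAwareListener:
    """Sketch of a listener that tags every log record with the flow run id,
    so Prefect's API log handler can attach the logs to the run even though
    the callbacks fire outside the flow run context."""

    def __init__(self, flow_run_id):
        self.flow_run_id = flow_run_id
        self.logger = logging.getLogger("ds-data-logger")

    def on_progress(self, event):  # illustrative callback, not the real pyspark hook
        self.logger.info(
            "streaming progress: %s",
            event,
            extra={"flow_run_id": self.flow_run_id},  # manually provide the flow run id
        )


@flow
def streaming_flow():
    # capture the id while we are still inside the flow run context
    flow_run_id = get_run_context().flow_run.id
    listener = FlowRunAwareListener(flow_run_id)
    ...  # register the listener with spark.streams.addListener(...) etc.
```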