# prefect-community
Question for logs: I'm running a job in a container (I'm using this example: ). I removed the last part and added flow.register(). In the UI, in the logs section, I can see
May 10th 2020 at 4:48:48pm | prefect.CloudTaskRunner
Task 'GetContainerLogs': finished task run for task with final state: 'Success'
which is cool 🙂 but I don't see any print output from the job itself. However, if I check the log file (in the .prefect/results folder) in the console, I can see all the prints made by my job. Do you know how I can get them in the UI?
Hi @Adrien Boutreau - if you want to log stdout such as print(), you can set log_stdout to True on the task. For example:

from prefect import task, Flow

@task(log_stdout=True)
def my_task():
    print("This will be logged!")

flow = Flow("log-stdout", tasks=[my_task])
You can see more about logging in the docs here:
Hope that helps!
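Roughly speaking, log_stdout works by redirecting the task's stdout to an object that forwards each printed line to a logger. Here is a stdlib-only sketch of that idea; this is illustrative, not Prefect's actual implementation, and LoggerWriter is a made-up name:

```python
import contextlib
import io
import logging

class LoggerWriter(io.TextIOBase):
    """File-like object that forwards each complete printed line to a logger.

    Hypothetical sketch of the idea behind log_stdout, not Prefect's code.
    """
    def __init__(self, logger):
        super().__init__()
        self.logger = logger
        self._buffer = ""

    def write(self, message):
        self._buffer += message
        # Flush one log record per completed line
        while "\n" in self._buffer:
            line, self._buffer = self._buffer.split("\n", 1)
            if line:
                self.logger.info(line)
        return len(message)

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("task")

# Anything printed inside this block is routed to the logger instead of stdout
with contextlib.redirect_stdout(LoggerWriter(logger)):
    print("This print ends up in the log output")
```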
from datetime import timedelta

from prefect import Flow
from prefect.schedules import IntervalSchedule
from prefect.tasks.docker import (
    CreateContainer,
    StartContainer,
    GetContainerLogs,
    WaitOnContainer,
)
from prefect.triggers import always_run

container = CreateContainer()  # image name omitted in the original snippet
start = StartContainer()
logs = GetContainerLogs(trigger=always_run)
status_code = WaitOnContainer()

schedule = IntervalSchedule(interval=timedelta(hours=1))
#schedule = Schedule(clocks=[CronClock("0 7 * * *")])

with Flow("flow", schedule) as flow:
    start_container = start(container_id=container)
    code = status_code(container_id=container, upstream_tasks=[start_container])
    collect_logs = logs(container_id=container, upstream_tasks=[code])

## run flow and print logs
So my task runs inside a container - how can I do that?
@Adrien Boutreau The GetContainerLogs task in the task library returns a result that looks like this:
api_result = client.logs(container=container_id).decode()
Since the logs are retrieved as a string, you should add a downstream task that does something with that result: filter, save, output, etc.
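For example, the downstream task's body can be a plain function that filters the decoded log text. This is a sketch: the name filter_log_lines and the "ERROR" keyword are illustrative, and in your flow you would wrap this logic with @task so it can receive the result of GetContainerLogs:

```python
def filter_log_lines(log_text, keyword="ERROR"):
    """Keep only the log lines containing keyword.

    A downstream task could wrap this with @task and take the
    decoded string returned by GetContainerLogs as its argument.
    """
    return [line for line in log_text.splitlines() if keyword in line]

# Pretend this string came from client.logs(container=container_id).decode()
sample = "INFO starting\nERROR disk full\nINFO done\n"
print(filter_log_lines(sample))  # → ['ERROR disk full']
```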