Mohamed Zaki
08/26/2025, 7:00 AM
autoupdate-worker:
  image: prefect-workpool
  depends_on:
    server:
      condition: service_healthy
    services:
      condition: service_started
    create-deployment:
      condition: service_completed_successfully
  environment:
    PREFECT_API_URL: http://server:4200/api
    DOCKER_HOST: tcp://host.docker.internal:2375
    PREFECT_LOCAL_STORAGE_PATH: /logs/storage
And here's a task whose result I stored:
@task(name="persist_result", persist_result=True, result_serializer="json")
async def persist_result_task(result: dict):
    """Persist the result of the flow."""
    logger = get_run_logger()
    logger.info(f"Storing result: {result}")
    return result
The output is a gibberish filename and gibberish content.

Marvin
08/26/2025, 7:06 AM
from prefect import task, get_run_logger
from prefect.serializers import JSONSerializer

@task(
    name="persist_result",
    persist_result=True,
    result_serializer=JSONSerializer(),
    result_storage_key="persist_result.json",  # readable base name; suffix added to avoid collisions
)
async def persist_result_task(result: dict):
    logger = get_run_logger()
    logger.info(f"Storing result: {result}")
    return result
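A minimal way to exercise this (a sketch, assuming Prefect 3.x; the flow name and sample payload are illustrative):
import asyncio
from prefect import flow

@flow
async def demo_persist():
    # Calling the async task inside an async flow runs it and returns
    # the (persisted) result once it completes
    value = await persist_result_task({"status": "ok"})
    print(value)  # the rehydrated dict, e.g. {'status': 'ok'}

if __name__ == "__main__":
    asyncio.run(demo_persist())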
Ensure the path is actually on disk where you expect
- You set PREFECT_LOCAL_STORAGE_PATH=/logs/storage in the worker container. Make sure that:
  - The directory exists and is writable inside the container
  - You mount it to your host if you want to inspect files from the host. In docker-compose for your worker (see the fuller sketch after this list):
    volumes:
      - ./logs/storage:/logs/storage
  - The flows/tasks are actually running in that same container. If they’re running in separate containers (e.g., via Docker infrastructure), those containers also need the same env var and volume mount.
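Putting those together, a sketch of the worker service (service and image names taken from your compose file above; the host path ./logs/storage is an assumption):
autoupdate-worker:
  image: prefect-workpool
  environment:
    PREFECT_API_URL: http://server:4200/api
    PREFECT_LOCAL_STORAGE_PATH: /logs/storage
  volumes:
    # mount the container's result dir to the host so files are inspectable
    - ./logs/storage:/logs/storage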
If you need to read a compressed result you already wrote
- If the file is compressed JSON, it will look like binary. You can read it like this:
import lzma, json

with open("/logs/storage/path/to/file", "rb") as f:
    # decompress the LZMA/XZ payload, then parse the JSON inside it
    data = json.loads(lzma.decompress(f.read()).decode())
print(data)
Programmatic inspection during a run
- Prefect will rehydrate results for you when you call .result() on a task future, so you usually don’t need to open the files manually (see the sketch after this list).
- If you want to confirm which serializer/key were used, you can inspect the task’s state after it completes (happy to provide an exact snippet if you share your Prefect version).
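For example (a sketch, assuming Prefect 3.x; the task here is a stand-in for your own):
from prefect import flow, task

@task(persist_result=True, result_serializer="json")
def make_result() -> dict:
    return {"foo": "bar"}

@flow
def inspect_result():
    future = make_result.submit()
    # .result() waits for the task to finish and rehydrates the persisted
    # value, so you don't need to open the files on disk yourself
    print(future.result())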
A couple of clarifying questions to make sure we nail this:
1) Which Prefect version are you on (2.x or 3.x)? I’ll assume 3.x unless you say otherwise.
2) Are the tasks running in the same “autoupdate-worker” container, or in separate containers?
3) Can you share a sample filename and the first few bytes of the file? If it starts with FD 37 7A 58 5A 00 (b'\xfd7zXZ\x00'), it’s LZMA-compressed content (a quick check snippet follows these questions).
4) After switching to JSONSerializer() and setting result_storage_key="persist_result.json", does the file content become plain JSON?
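For question 3, a quick way to check those bytes (a sketch; substitute a real file for the placeholder path):
path = "/logs/storage/<your-result-file>"  # placeholder path
with open(path, "rb") as f:
    head = f.read(6)
print(head.hex(" "))            # fd 37 7a 58 5a 00 -> LZMA/XZ container
print(head == b"\xfd7zXZ\x00")  # True if it's LZMA/XZ-compressed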
Mohamed Zaki
08/26/2025, 7:31 AM

Marvin
08/26/2025, 7:32 AM
import json
from prefect import flow, task
from prefect.artifacts import create_markdown_artifact, create_table_artifact

@task(persist_result=True, result_serializer="json")
def persist_result_task(result: dict):
    return result

@flow
def demo():
    data = persist_result_task({"foo": "bar", "n": 42})
    # Markdown artifact (pretty-printed JSON)
    create_markdown_artifact(
        key="persist-result-preview",  # artifact keys allow only lowercase letters, numbers, and dashes
        markdown=f"## Task result\n```json\n{json.dumps(data, indent=2)}\n```",
        description="Human-readable preview of the task result",
    )
    # Table artifact (if you want a quick table)
    create_table_artifact(
        key="persist-result-table",
        table=[{"key": k, "value": v} for k, v in data.items()],
        description="Result as a table",
    )

if __name__ == "__main__":
    demo()
Where to look in the UI
- Open the flow run → Artifacts tab for those items.
- Global Artifacts page aggregates artifacts across runs.
Docs
- Artifacts concept: Artifacts (concepts)
- How-to with examples: Artifacts (how-to)
- Results (persistence/serialization): Results (advanced)
Extra tips
- If you want a clickable link to the persisted file, use a Link artifact that points at a URL reachable by your browser (e.g., a presigned S3 URL); local container paths will not be accessible from the UI (see the sketch below).
- For small values you can also log a summary with `get_run_logger()`; logs are visible on the run page.
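For example (a sketch; the URL is a placeholder you would generate for your own storage):
from prefect.artifacts import create_link_artifact

create_link_artifact(
    key="persist-result-file",  # keys allow only lowercase letters, numbers, and dashes
    link="https://example.com/presigned/persist_result.json",  # placeholder, e.g. a presigned S3 URL
    description="Link to the persisted task result",
)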
If you share where your results are stored (local path vs. cloud storage), I can suggest a simple Link artifact pattern for your setup.