# ask-marvin
s
@Marvin How can I get the serialized (string) output using the REST API endpoint `/api/flow_run_states/{id}`, where I already have the flow run state id?
m
thought for 352.4 seconds
Short answer: call GET on that endpoint and read the State’s `data` field. If `data` is embedded JSON, that’s your serialized output. If `data` is a “persisted result” reference, you’ll need to fetch the payload from the configured result storage using the `storage_key` (and `storage_block_id`) shown in `data`. The Prefect REST API does not dereference and return persisted result bytes for you.

Links:
- Read flow run state (REST)
- Results and persistence in Prefect 3

How to do it

1) Fetch the flow run state by id

- Prefect Server:
```bash
curl -s -H "Authorization: Bearer $PREFECT_API_KEY" \
  "$PREFECT_API_URL/api/flow_run_states/$STATE_ID"
```
- Prefect Cloud (note account/workspace scoping):
```bash
curl -s -H "Authorization: Bearer $PREFECT_API_KEY" \
  "https://api.prefect.cloud/api/accounts/$ACCOUNT_ID/workspaces/$WORKSPACE_ID/flow_run_states/$STATE_ID"
```
You’ll get a JSON response like:
```
{
  "id": "f2f3…",
  "type": "COMPLETED",
  "name": "Completed",
  "timestamp": "2025-09-17T12:34:56.789Z",
  "message": null,
  "state_details": { ... },
  "data": <either inline JSON OR a persisted-result metadata object>
}
```
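The same fetch works from Python if you’d rather script it; a minimal sketch using `requests`, assuming the env var convention from the curl above (i.e., `$PREFECT_API_URL` is the server root, without the `/api` suffix):

```python
import os
import requests

# Hypothetical env vars, matching the curl example above
api_url = os.environ["PREFECT_API_URL"]
api_key = os.environ.get("PREFECT_API_KEY", "")
state_id = os.environ["STATE_ID"]

resp = requests.get(
    f"{api_url}/api/flow_run_states/{state_id}",
    headers={"Authorization": f"Bearer {api_key}"},
)
resp.raise_for_status()
state = resp.json()  # dict with "type", "name", "data", ...
print(state["type"], state["name"])
```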
2) Case A: Inline/embedded result

If `data` is a plain JSON value (string, number, array, object), that is the serialized result already. For example:
```
{ ..., "data": "hello world" }
```
You can print it as a string with `jq`:
```
curl -s ... | jq -r '.data'
```
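If you’re scripting this instead, a rough way to tell Case A from Case B, using the `state` dict from the sketch above (the heuristic just checks for the `storage_key` field described in Case B below):

```python
# `state` comes from the fetch sketch above
data = state["data"]

# A dict carrying "storage_key" is persisted-result metadata (Case B);
# anything else is the inline serialized value (Case A).
if isinstance(data, dict) and "storage_key" in data:
    print("persisted result at:", data["storage_key"])
elif data is None:
    print("no stored result for this state")
else:
    print(data)
```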
3) Case B: Persisted result (stored externally)

If `data` is an object with fields like `storage_key`, `storage_block_id`, `expiration`, and `serializer`, Prefect persisted the result to external storage. Example shape:
```
"data": {
  "storage_key": "results/2d5e.../return.json",
  "storage_block_id": "b1a2c3d4-...-....",
  "serializer": { "type": "json", ... },
  "expiration": "2025-09-18T12:34:56.789Z",
  "prefect_version": "3.x.y"
}
```
What to do next:
- Identify the storage from `storage_block_id` (e.g., local filesystem, S3, GCS, Azure). Fetch the block document via the Blocks API (the Cloud/Server path depends on your deployment) to see its configuration (e.g., basepath, bucket/container, credentials).
- Download the object at `storage_key` using that storage’s client/CLI:
  - Local filesystem block: the payload will be at `<basepath>/<storage_key>` (see the sketch after this list)
  - S3 block: `aws s3 cp s3://<bucket>/<storage_key> -`
  - GCS block: `gsutil cat gs://<bucket>/<storage_key>`
  - Azure block: `az storage blob download --container-name <container> --name <storage_key> --file -`
- Decode according to `serializer.type`:
  - If `json` (the default for JSON-serializable values), the file content is already JSON; print it directly.
  - If `pickle`/`cloudpickle`, the payload is a Python pickle; there isn’t a REST-only way to turn that into a human-readable string, so you’d need Python to unpickle it.
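For the local-filesystem case, the “download” step is just a file read; a minimal sketch reusing the `data` dict from the earlier sketch, where `basepath` is hypothetical and must be taken from the block document:

```python
# `data` is the persisted-result metadata from Case B
basepath = "/basepath/from/the/block/document"  # hypothetical
storage_key = data["storage_key"]

with open(f"{basepath}/{storage_key}", "rb") as f:
    payload = f.read()

# For a json serializer this is printable text; pickle payloads need
# Python-side unpickling instead.
print(payload.decode(errors="replace"))
```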
Notes and gotchas
- If `data` is null, there is no stored result for that state. This can happen if results weren’t persisted; see Results and persistence for enabling persistence or configuring result storage.
- Make sure you’re querying the terminal “Completed” state (or whichever state actually carried the result) for the flow run. Non-terminal states won’t have a final result.
- There is no dedicated REST endpoint that returns the persisted result bytes; dereferencing must be done via the storage indicated in `data`.

If you can share whether you’re on Prefect Cloud or Server and which result storage block you use (e.g., local, S3, GCS, Azure), I can give exact curl/CLI commands to fetch and print the serialized output.
s
@Marvin Sure. This is the REST API response:
```
"data": {
    "storage_key": "<my_local_path>/.prefect/storage/<key>",
    "expiration": null,
    "serializer": {
      "type": "pickle",
      "picklelib": "cloudpickle",
      "picklelib_version": null
    },
    "prefect_version": "3.4.1",
    "storage_block_id": null
  }
```
m
thought for 344.3 seconds
Thanks, that tells us the result was persisted locally using a pickle/cloudpickle serializer. In this case:
- The REST API won’t return the actual result payload; it only returns the metadata you pasted.
- The file lives on the machine that executed the flow, at `<my_local_path>/.prefect/storage/<key>`.
- Because it’s pickled, you’ll need Python in that environment to load and print it; there’s no REST-only way to turn pickle bytes into a string.

One-off: print the result as a string on the machine that has the file
```bash
python - "<my_local_path>/.prefect/storage/<key>" <<'PY'
import sys, cloudpickle

p = sys.argv[1]
with open(p, "rb") as f:
    obj = cloudpickle.load(f)

# If it's a string, this will print it; otherwise you'll see its repr
print(obj)
PY
```
If you just need the raw serialized bytes as a string (base64), you can do:
```bash
python - "<my_local_path>/.prefect/storage/<key>" <<'PY'
import sys, base64

p = sys.argv[1]
print(base64.b64encode(open(p, "rb").read()).decode())
PY
```
Prefer JSON for future runs (so you can read the result directly via REST or from storage):
- Use a JSON serializer and enable persistence at the flow (or task) level:
```python
from prefect import flow, task
from prefect.serializers import JSONSerializer

@flow(result_serializer=JSONSerializer(), persist_result=True)
def my_flow():
    return "hello world"
```
- Or set settings so all results use JSON and persist by default:
  - `PREFECT_RESULTS_DEFAULT_SERIALIZER=json`
  - `PREFECT_RESULTS_PERSIST_BY_DEFAULT=true`
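Once a run has persisted its result as JSON, the file is plain text you can read without Prefect at all; a minimal sketch, reusing the placeholder path from your example (keep your real key):

```python
import json
from pathlib import Path

# json.loads will fail loudly if the payload is not actually JSON
path = Path("<my_local_path>/.prefect/storage/<key>")
print(json.loads(path.read_text()))
```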
Either way, the persisted file is then plain JSON, and the REST `data` may be embedded JSON or point to a JSON file you can read with `curl`/`jq`. If you need to fetch results from another machine, consider a remote result storage block (S3/GCS/Azure) instead of local storage so you can `curl`/CLI the object directly.

Docs:
- Results and persistence
- Read flow run state (REST)

If you’d like, tell me whether the executing environment is accessible to you and I can give exact commands to print or copy that file; or if you want to switch to S3/GCS/Azure, I can share the minimal config changes.
s
@Marvin Sorry, but I couldn't unpickle the file at `<my_local_path>/.prefect/storage/<key>`. It seems the contents are just several lines of encrypted-looking text, so the cloudpickle library complains upon encountering the first line in the file:
```
ValueError: invalid literal for int() with base 10: 'XXXXXXXX'
```
m
thought for 111.1 seconds
Good clue: that error usually means you’re trying to unpickle a file that isn’t a raw pickle stream. Prefect doesn’t store “just the pickle”; it stores a ResultRecord that wraps your value plus metadata. So the on-disk file at `<my_local_path>/.prefect/storage/<key>` isn’t a plain pickle file, and `cloudpickle.load()` will fail. Use Prefect’s ResultStore to read and deserialize it for you.

Option 1: Read with your current Prefect profile (same machine/env that wrote the file)
```python
from prefect.results import get_result_store

store = get_result_store()
# Use the exact storage_key you got from the API
record = store.read("<my_local_path>/.prefect/storage/<key>")
print(record.result)  # this is your original Python value
```
Option 2: If running from a different base path, point ResultStore at local storage explicitly

If your `storage_key` is relative to a basepath, do this:
```python
from prefect.filesystems import LocalFileSystem
from prefect.results import ResultStore

store = ResultStore(result_storage=LocalFileSystem(basepath="<my_local_path>"))
record = store.read(".prefect/storage/<key>")  # note: key relative to basepath
print(record.result)
```
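Whichever option you use, it’s worth sanity-checking what you loaded; a short sketch with the `record` from either option:

```python
# Confirm the record matches the state's metadata before trusting it
print(record.metadata.serializer)   # should show pickle/cloudpickle
print(record.metadata.storage_key)  # should match the API's storage_key
print(type(record.result), record.result)
```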
If your `storage_key` is absolute (as shown in the API output), you can still pass it as-is to `store.read()` as in Option 1.

Tips
- You can confirm what you’re reading by inspecting `record.metadata.serializer` and `record.metadata.storage_key`, as in the sketch above.
- If you get a not-found error, verify the path exists and that you’re running with the same user/PREFECT_HOME that created it.
- If the value is a custom class, ensure compatible versions of Python and the class definitions are importable when unpickling.

For future runs: switch to JSON so the payload is plain text and easier to read via REST or from disk
```python
from prefect import flow
from prefect.serializers import JSONSerializer

@flow(result_serializer=JSONSerializer(), persist_result=True)
def my_flow():
    return "hello world"
```
Or set:
- `PREFECT_RESULTS_DEFAULT_SERIALIZER=json`
- `PREFECT_RESULTS_PERSIST_BY_DEFAULT=true`

Docs:
- Results and persistence: https://docs-3.prefect.io/v3/advanced/results
- Read flow run state (REST): https://docs-3.prefect.io/v3/api-ref/rest-api/server/flow-run-states/read-flow-run-state

If you paste the exact `storage_key` shape (absolute vs relative) and confirm whether you’re on the same machine/profile, I can tailor the exact `ResultStore` call you need.