# ask-community
k
Hey there! I'm currently exploring Prefect as a batch inference solution. I've set up a flow and deployed it. Now I'm eager to fetch the inference results using a REST API call. I've managed to run the deployment successfully using `/deployments/{id}/create_flow_run`, and I can track the run status using `/flow_runs/{id}`. However, I'm struggling to find a good way to retrieve the inference JSON results from the run. Any insights or suggestions on how I can accomplish this?
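For context, a minimal sketch of that flow against a self-hosted Prefect server at `http://127.0.0.1:4200/api` (the base URL, deployment ID, and lack of auth headers are assumptions; Prefect Cloud would also need an API key header):

```python
import time
import requests

API = "http://127.0.0.1:4200/api"  # assumption: local Prefect server, no auth
DEPLOYMENT_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

# Kick off a run of the deployment via the REST API
resp = requests.post(f"{API}/deployments/{DEPLOYMENT_ID}/create_flow_run", json={})
resp.raise_for_status()
flow_run_id = resp.json()["id"]

# Poll the flow run until it reaches a terminal state
while True:
    run = requests.get(f"{API}/flow_runs/{flow_run_id}").json()
    state_type = run["state"]["type"]
    if state_type in ("COMPLETED", "FAILED", "CRASHED", "CANCELLED"):
        break
    time.sleep(2)

# The state's `data` field (if any) holds a reference to the result, not the value itself
print(state_type, run["state"].get("data"))
```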
n
hi @KyuWoo Choi ! when you say the inference json result from the run, do you mean a python object that you have as a `return` value of a task?
k
oh yeah. My initial approach was to persist the Python dict flow result with the JSON serializer. So far what I've got is a JSON file stored on disk, plus the file path from the status API.
I'd really appreciate being able to receive JSON results directly through a REST API in any way possible.
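For reference, a minimal sketch of that persist-result setup, assuming Prefect 2.x; the flow name and return value are just examples:

```python
from prefect import flow

# Persist the flow's return value to result storage as JSON
# instead of the default pickle serializer.
@flow(persist_result=True, result_serializer="json")
def batch_inference():
    # placeholder for the real inference logic
    return {"predictions": [0.1, 0.9], "model": "example"}

if __name__ == "__main__":
    batch_inference()
```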
I ended up opting for a different approach using artifacts. Since I couldn't obtain the result directly through the REST API, I now store it in an artifact as a table and retrieve that via the REST API. Suggestions for a better solution are welcome.
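A rough sketch of that artifact approach, assuming Prefect 2.10+ and a local server; the artifact key, table contents, and filter body shape are illustrative assumptions:

```python
import json

import requests
from prefect import flow
from prefect.artifacts import create_table_artifact


@flow
def batch_inference():
    results = [{"input": "a", "score": 0.9}, {"input": "b", "score": 0.1}]
    # Publish the results as a table artifact under a known key
    create_table_artifact(key="inference-results", table=results)
    return results


# Elsewhere, fetch the artifact back over the REST API by key
API = "http://127.0.0.1:4200/api"  # assumption: local Prefect server
resp = requests.post(
    f"{API}/artifacts/filter",
    json={"artifacts": {"key": {"any_": ["inference-results"]}}},
)
resp.raise_for_status()
for artifact in resp.json():
    data = artifact["data"]
    # table data may come back JSON-encoded as a string
    table = json.loads(data) if isinstance(data, str) else data
    print(artifact["key"], table)
```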
n
sorry I didn't get back to you on this! glad you found artifacts. if you want to be able to fetch the JSON result directly from the API, then artifacts sound like the way to go. generally the API only knows about the location of `Results`, not the actual values (which are serialized and stored somewhere on your disk)
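A hedged sketch of what that distinction means in practice, using the Prefect Python client rather than the raw REST API; the flow-run ID is a placeholder, and this assumes the process running it can read the same result storage the flow wrote to:

```python
import asyncio

from prefect.client.orchestration import get_client


async def fetch_result(flow_run_id: str):
    # The API only returns a reference to a persisted result; resolving it into
    # a value requires access to the underlying storage (local disk, S3, ...).
    async with get_client() as client:
        flow_run = await client.read_flow_run(flow_run_id)
        return await flow_run.state.result(fetch=True)


if __name__ == "__main__":
    print(asyncio.run(fetch_result("<flow-run-id>")))
```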
k
Thanks for the reply @Nate. Now I know that using artifacts isn't wrong after all.
b
@Nate Is it possible for a flow to store JSON as a result in S3, and then retrieve it through the Prefect API? It seems like basic functionality, but I'm having trouble finding documentation for it.