Follow-up question: let's say every time my flow runs, I want to grab a set of 10K records from Salesforce. Generally this requires several API calls and managing pagination to pull the records in sub-batches. Is there a recommended way to store large datasets like this so I only have to fetch the data once and reuse it across multiple executions?