API to Bulk Export data in batches of 25,000, 50,000, or 100,000

Hi everyone,

API requests that return multiple items, such as listing a resource, are paginated with a default page size of 25 items per page and a maximum of 100 items per page.

Is there any option/endpoint that can return data in bulk for such tables?

For example, if I have millions of invoice records, is there an API that can export the data in bulk, perhaps in batches of 25,000, 50,000, or 1 million?

Thanks
Swatantra

Hi Swatantra,

That’s a great question. Handling large datasets can certainly be a challenge, especially when working with APIs that have pagination limits.

Regarding your specific use case with invoices, many APIs do provide an option to export data in bulk, typically referred to as a “bulk export” or “data dump” endpoint. This allows you to retrieve larger chunks of data at once, often in a format like CSV or JSON, rather than being limited to the standard pagination limits.

The exact implementation and availability of such an endpoint can vary across different APIs and platforms. Some common approaches include:

  • Dedicated Bulk Export Endpoint: The API may provide a specific endpoint (e.g., /invoices/export) that allows you to download the entire dataset (or a significant portion of it) in a single request.
  • Streaming/Chunked Responses: The API may support streaming responses or chunked data transfers, where you can page through the data in larger batches (e.g., 50,000 or 100,000 records at a time) instead of being capped at the standard page size.
  • Background Jobs/Asynchronous Exports: Some APIs may offer the ability to request a bulk export as a background job, where the data is prepared and made available for you to download later, rather than waiting for the entire export to complete in a single request (see the sketch just after this list).
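
To make that last pattern concrete, here is a rough sketch of what the asynchronous-export flow can look like from the client side. The endpoints (/invoices/export, /exports/{id}), the field names (export_id, status, download_url), and the base URL are placeholders rather than any real provider's API, so treat this as an illustration of the flow, not something to run against a specific service as-is.

```python
import time

import requests

BASE_URL = "https://api.example.com/v2"       # placeholder base URL
HEADERS = {"Authorization": "Bearer <api_key>"}


def request_bulk_export(resource="invoices"):
    """Kick off an asynchronous export job and return its id (hypothetical endpoint)."""
    resp = requests.post(f"{BASE_URL}/{resource}/export", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["export_id"]


def wait_for_export(export_id, poll_seconds=30):
    """Poll the job until the provider marks it complete, then return the download URL."""
    while True:
        resp = requests.get(f"{BASE_URL}/exports/{export_id}", headers=HEADERS)
        resp.raise_for_status()
        job = resp.json()
        if job["status"] == "completed":
            return job["download_url"]
        if job["status"] == "failed":
            raise RuntimeError(f"Export {export_id} failed")
        time.sleep(poll_seconds)


def download_export(url, path="invoices_export.csv"):
    """Stream the prepared file to disk so a multi-million-row export never sits in memory."""
    with requests.get(url, stream=True) as resp:
        resp.raise_for_status()
        with open(path, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=1 << 20):  # 1 MiB chunks
                fh.write(chunk)


if __name__ == "__main__":
    export_id = request_bulk_export()
    download_export(wait_for_export(export_id))
```

The key design point is that the heavy lifting happens on the provider's side: your client only submits the job, polls for completion, and streams down a file that was prepared for you.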

I’d recommend checking the API documentation or reaching out to the API provider to see if they offer any bulk export or large dataset handling capabilities that could suit your needs. Many modern APIs are designed with these types of use cases in mind.

If the API you’re using doesn’t provide a dedicated bulk export option, you may need to implement some client-side pagination logic to retrieve the data in manageable chunks and stitch it together programmatically. This can be a bit more involved, but is certainly doable with the right approach.
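
If you do end up doing it client-side, here is a minimal sketch of that pagination-and-stitch loop, assuming a cursor-style listing endpoint. The base URL, the `limit`/`offset`/`next_offset` parameters, and the `list` field in the response body are all assumptions for illustration; substitute whatever your provider actually documents.

```python
import csv

import requests

BASE_URL = "https://api.example.com/v2"       # placeholder base URL
HEADERS = {"Authorization": "Bearer <api_key>"}


def fetch_all_invoices(page_size=100):
    """Yield every invoice by walking the paginated /invoices listing.

    Assumes the endpoint accepts a `limit` parameter (max 100 per page) and
    returns records under `list` plus a `next_offset` cursor while more
    pages remain -- these names are assumptions to adapt to your API.
    """
    params = {"limit": page_size}
    while True:
        resp = requests.get(f"{BASE_URL}/invoices", headers=HEADERS, params=params)
        resp.raise_for_status()
        body = resp.json()
        yield from body["list"]
        next_offset = body.get("next_offset")
        if not next_offset:
            return
        params["offset"] = next_offset


def export_to_csv(path="invoices.csv", log_every=25_000):
    """Stitch all pages into one CSV, logging progress every `log_every` rows."""
    writer = None
    with open(path, "w", newline="") as fh:
        for count, record in enumerate(fetch_all_invoices(), start=1):
            if writer is None:
                # Column set is taken from the first record returned.
                writer = csv.DictWriter(fh, fieldnames=sorted(record),
                                        extrasaction="ignore", restval="")
                writer.writeheader()
            writer.writerow(record)
            if count % log_every == 0:
                print(f"Exported {count} invoices so far...")


if __name__ == "__main__":
    export_to_csv()
```

If you specifically want the output in batches of 25,000 or 50,000 rows, rotate to a new output file each time the counter crosses that threshold instead of writing everything into a single CSV.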

Let me know if you have any other questions! I’m happy to provide further guidance or suggestions on handling large datasets with APIs.