Power Automate: How to Get More Than 5000 Items from SharePoint

Here’s the SharePoint list I’ll use to demonstrate the solution. It contains more than 7000 records, which makes it a good fit for showing how to work around the 5000-item limit in Power Automate.




As shown above, this dataset is well beyond the threshold, which makes it ideal for walking through how to retrieve all items successfully.

In this blog, you’ll see three different methods to retrieve more than 5000 items from a SharePoint list using Power Automate.

Method 1: Enable Pagination

  • Create a new flow → In Power Automate, start with an empty flow and choose Manually trigger a flow.
  • Add the “Get items” action → Select the SharePoint connector and use the Get items action, which is generally used to fetch data from a SharePoint list.
  • Choose the site and list → From the dropdowns, select the appropriate Site Address and List Name where your data is stored.
  • Set Top Count → In the action, set the Top Count value to 5000.

  • Enable Pagination → Open the “Get items” action’s Settings, turn on Pagination, and set the Threshold value to 100000. Click Done to save.
  • Add a Compose action → After the “Get items” step, insert a Compose action and paste the following expression to get the count of items retrieved:
    length(outputs('Get_items')?['body/value'])
  • Save and run the flow → Once all the steps are configured, save the flow and trigger it manually.
  • Check the run history → Open the flow’s run history. In the output of the Compose action, you’ll see the total count of items retrieved from the SharePoint list.
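If you also want a quick sanity check on the data itself, not just the count, you could add one more Compose action with an expression like the one below. This is just an illustrative sketch; it assumes your list has a Title column and returns the Title of the first retrieved item:

    first(outputs('Get_items')?['body/value'])?['Title']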

✅ When to use

  • List contains up to 100,000 items
  • You need a quick and simple way to fetch all records
  • Suitable for small to medium datasets

⚠️ Drawbacks

  • Can slow down or time out with large datasets
  • Lacks fine-grained performance control

Method 2: Do Until Loop with Get Items

  • Create a new flow → In Power Automate, start with an empty flow and choose Manually trigger a flow.
  • Initialize variables → Add four Initialize variable actions. Set them up as per the details provided in the table below.

  Variable Name   | Type    | Default Value | Comments
  varListItems    | Array   | None          | Stores retrieved list items
  varLastRecordID | Integer | 0             | Keeps track of the last processed ID
  varBatchSize    | Integer | 5000          | Defines fetch size per loop
  varRecordCount  | Integer | 0             | Counts total records retrieved


  • Add a “Do Until” loop → Insert a Do Until control after the variables are initialized.
  • Set the condition → Configure the loop condition: varRecordCount is less than varBatchSize.
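If you prefer to configure this condition in advanced mode, it would look roughly like the expression below (a sketch; the basic condition editor works just as well). Keep in mind that a Do Until loop stops once its condition evaluates to true, i.e. once the latest batch returns fewer items than varBatchSize:

    @less(variables('varRecordCount'), variables('varBatchSize'))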


  • Add the “Get items” action → Select the SharePoint connector and use the Get items action, which is generally used to fetch data from a SharePoint list.
  • Choose the site and list → From the dropdowns, select the appropriate Site Address and List Name where your data is stored.
  • Set Top Count → In the action, set the Top Count value to 5000. In the Filter Query field, add the following condition to ensure you only fetch records greater than the last retrieved ID:
    ID gt @{variables('varLastRecordID')}
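To illustrate, on the first iteration varLastRecordID is still 0, so the query resolves to ID gt 0; if that batch ends at ID 5000 (assuming roughly contiguous IDs), the next iteration would run with something like:

    ID gt 5000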


  • Insert a Set variable action and update the variable varRecordCount with the following expression:
    length(outputs('Get_items')?['body/value'])
  • Next, add a Compose action and enter this expression to merge the retrieved items with the array variable (a Compose is needed here because a Set variable action can’t reference the variable it is updating):
    union(variables('varListItems'), body('Get_items')?['value'])


  • Add a Set variable action to update varListItems with the Outputs of the Compose action from the previous step.
  • Finally, add another Set variable action to update varLastRecordID with this expression:
    last(body('Get_items')?['value'])?['ID']


  • After the Do Until loop, add a Compose action and use this expression to get the total count of items retrieved:
    length(variables('varListItems'))
  • At this point, the varListItems variable holds the complete dataset fetched from SharePoint, combining all the batches retrieved in the loop.

Here’s a snapshot of the complete flow for your reference.


Snip of the Do Until loop:

  • Save and run the flow → Once all the steps are configured, save the flow and trigger it manually.
  • Check the run history → Open the flow’s run history. In the output of the Compose action outside the loop, you’ll see the total count of items retrieved from the SharePoint list.

Logic:

  • Do Until condition → The loop keeps running until the number of items fetched in the latest batch is less than the defined batch size (meaning no more items are left).
  • Fetch a batch of records → The Get items action retrieves the next set of items after the last record ID.
  • Count items in this batch → The flow calculates how many records were retrieved in the current run.
  • Merge with collected data → These new items are appended to the master variable holding all records.
  • Update last record ID → The ID of the last item in the batch is stored, so the next loop continues from there.

✅ When to use

  • List contains more than 100,000 items
  • You want better control over batch retrieval

⚠️ Drawbacks

  • Flow becomes a bit complex with extra variables and logic
  • Slower execution compared to pagination for smaller datasets

Method 3: Do Until Loop with SharePoint HTTP Request

  • Create a new flow → In Power Automate, start with an empty flow and choose Manually trigger a flow.
  • Initialize variables → Add three Initialize variable actions. Set them up as per the details provided in the table below.


  Variable Name   | Type    | Default Value
  varListItems    | Array   | None
  varLastRecordID | Integer | 0
  varHasMoreItems | Boolean | None


  • Add a “Do Until” loop → Insert a Do Until control after the variables are initialized.
  • Set the condition → Configure the loop condition: varHasMoreItems is equal to false.


Inside the Do Until loop, add a Send an HTTP request to SharePoint action.

  • Select your Site Address.
  • Choose GET as the method.
  • In the URI field, paste the following expression:

      
    concat('_api/web/lists/GetByTitle(', '''7k records''', ')/items?$select=Id,Title,field_0,field_2,field_3&$top=5000&$filter=Id gt ', variables('varLastRecordID'))
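On the first iteration, with varLastRecordID still at 0, this expression resolves to a URI along these lines:

    _api/web/lists/GetByTitle('7k records')/items?$select=Id,Title,field_0,field_2,field_3&$top=5000&$filter=Id gt 0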
      
      

Here’s what the URI does:

  • _api/web/lists/GetByTitle('7k records')/items → points to the list named 7k records.
  • $select=Id,Title,field_0,field_2,field_3 → specifies which columns to retrieve (you’ll need to use the internal names of the fields in your list).
  • $top=5000 → fetches up to 5000 items in one go.
  • $filter=Id gt varLastRecordID → ensures only items with an ID greater than the last retrieved record are fetched, enabling paging through the dataset.
  • Don’t forget to add a header to the request with the key Accept and the value application/json;odata=nometadata. This makes the response cleaner and easier to handle in Power Automate.


After the HTTP action, add a Parse JSON action.

  • For Content, use the body output of the HTTP request.
  • To generate the schema, the easiest way is to run the HTTP action once (with $top=2 for quick testing), copy the sample response, and then use the “Generate from sample” option in Parse JSON.
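With the Accept header set to application/json;odata=nometadata, the response body keeps a simple shape, so a trimmed sample like the one below is enough for Generate from sample. The values and types here are only placeholders, and the field names are simply this list’s columns; use a real response from your own list to get accurate types:

    {
      "value": [
        {
          "Id": 1,
          "Title": "Sample item",
          "field_0": "Sample value",
          "field_2": "Sample value",
          "field_3": "Sample value"
        }
      ]
    }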


Next, add an Append to array variable action.

  • Select varListItems as the target variable.
  • For the Value, pass the body output from the Parse JSON action.

👉 This step keeps adding each fetched batch of records into varListItems, so by the end of the loop you’ll have the complete dataset in one place.


Next, update two variables inside the loop:

  • Set variable varHasMoreItems → Use this expression:
    if(less(length(body('Parse_JSON')?['value']), 5000), false, true)
  • Set variable varLastRecordID → Use this expression:
    last(body('Parse_JSON')?['value'])?['Id']
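To see how the loop ends, suppose the final batch returns 2,000 items: less(2000, 5000) evaluates to true, so the if() returns false, varHasMoreItems becomes false, the Do Until condition (varHasMoreItems is equal to false) is met, and the loop exits.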
        

  • At this point, the varListItems variable holds the complete dataset fetched from SharePoint, combining all the batches retrieved in the loop.

Here’s a snapshot of the complete flow for your reference.

Snip of the Do Until loop:

  • Save and run the flow → Once all the steps are configured, save the flow and trigger it manually.
  • Check the run history → Open the flow’s run history. In the last iteration of the Do Until loop, the variable varLastRecordID will hold the ID of the last item in the SharePoint list. This confirms that all items have been successfully retrieved.

Logic:

  • Do Until condition → The loop continues running as long as varHasMoreItems is true, meaning there are still items left to fetch.
  • Request items in batches → Each iteration uses the HTTP request to pull up to 5000 items from SharePoint, starting after the last retrieved ID.
  • Parse the response → The raw JSON response is parsed so individual records can be accessed and used in later steps.
  • Append to master list → The parsed items are appended into the varListItems array variable, gradually building the complete dataset.
  • Update flags for next run → varLastRecordID is updated with the most recent item’s ID, and varHasMoreItems is set based on whether the batch contained fewer than 5000 items.

✅ When to use

  • List contains very large datasets (hundreds of thousands of items or more)
  • You want maximum control over data retrieval and filtering
  • Ideal when you need to work with specific columns via internal names for efficiency

⚠️ Drawbacks

  • Setup is a bit more complex compared to the other methods
