Resident Rockstar

Another method for making a collection from SharePoint and SQL larger than 2,000 items. Bye-bye delegation!

Maybe I'm slow and just now finding this out, but I thought I would share it with you.  I have been working with some data sets in SQL and lists in SharePoint that exceed 2,000 items.  There is one method I summarized a while back that gets up to 4,000 items.  Via Flow, I have found a way to collect many more: I have successfully retrieved a SQL table of over 20,000 rows with this method.  It also works with SharePoint, but it is much slower and has some other hangups.  I'll try to outline them here.

The basic method is to create a JSON array from the data, and then pass that back to PowerApps via a Response action.  This is how it works:

Sample flow, "GetSQLData"

1. The entire flow is only three steps:


2. You go into the "Settings" in the Get rows action, turn on Pagination, and set a max number in the Limit field.


3. The trick is the "JSON Schema".  First, run the flow with only steps one and two.  Limit the output to a single row (set "Top Count" in the advanced options to 1.  No quotes).  Copy (highlight, then Ctrl-C) the output, but ONLY what is in the square brackets, including the brackets.
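For reference, with Top Count set to 1, the part you copy will look something like this (the column names here are made up for illustration; yours will match your table):

```json
[
  {
    "ID": 1,
    "PartNumber": "A-1001",
    "Description": "Widget bracket",
    "OnHand": 42,
    "PostedTimestamp": "2019-03-26T14:05:00Z"
  }
]
```

Note that the square brackets are part of what you copy.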


4. Open the Response action.  Select "Use sample payload to generate schema", paste what you copied in step 3, and pick Done; the system will generate the schema.


5. In PowerApps, use your command in OnStart or OnVisible:
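A minimal sketch of that command, assuming the flow is named GetSQLData and you want the rows in a collection called colSQLData (both names are placeholders for your own):

```
ClearCollect(colSQLData, GetSQLData.Run())
```

ClearCollect empties the collection first, so re-running it in OnVisible simply reloads the data.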


6. After this, you should have all of the table data available in your collection.  No delegation issues and very fast response.  (Don't forget to remove the limit in "Top Count" if you did that!)


SharePoint modification:

SharePoint is just a little different animal...  You have to remap the columns, since there is something strange in the JSON output from SharePoint that PowerApps just does not like.  So, add a Select action; you will have to map each column.  KEEP IN MIND the pitfall note below: if you change anything in the Select columns, you will have to make a new schema.  Your sample payload will come from the Select output when you do a test run, instead of the SharePoint output.  You will also have to remove the data source, and re-add the flow in PowerApps to reset it.
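As a sketch, the Select action's Map for a hypothetical SharePoint list might look like this in code view (each key on the left becomes a column in the output; the item() expressions pull the matching list fields — column names here are illustrative):

```json
{
  "Title": "@item()?['Title']",
  "MD_Project": "@item()?['MD_Project']",
  "MD_IsActive": "@item()?['MD_IsActive']"
}
```

The Select output then becomes the sample payload you paste into the Response schema generator.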




Accelerating retrieval of thousands of items!

You can increase the speed of retrieval by a factor of how many "lanes" you create in parallel.  You basically divide the indexes into as many ranges as you have lanes.  In this example, I split the index into four ranges and put them in parallel actions.  Yes, this cuts retrieval time to nearly 1/4 that of a single range!
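As a sketch of the range split, assuming an integer ID column and roughly 22,000 rows, each parallel Get rows action gets its own OData Filter Query (the boundary values here are illustrative):

```
Lane 1:  ID ge 1     and ID le 5500
Lane 2:  ID gt 5500  and ID le 11000
Lane 3:  ID gt 11000 and ID le 16500
Lane 4:  ID gt 16500
```

Each lane still needs Pagination turned on, and the four outputs are joined back into a single array (for example with union() in a Compose action) before the Response action returns them.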




Update data after initial load:


You may wish to update your data instead of loading the entire dataset again.  This method takes your existing dataset, looks for updates from your sources, then updates the data in your current collections.  While the initial load of my data can take nearly 60 seconds, a quick update check and update takes only 6-8 seconds.  This keeps your data from going stale.  You can also use timers to refresh data at regular intervals.


// Data Refresh Routine v1.0.0 3/26/2019

   // Reset the timer flag and set the status indicator back to green
   UpdateContext({TimerReset: false, Circle1_Color: Green});

   // Refresh the SharePoint sources
   Refresh('Engineering Change Notice');
   Refresh('Engineering Change Request');
   Refresh(Drawing_Vault);

   // Collect only rows posted since the last update (adjusted to local time
   // with TimeZoneOffset), then flag whether anything actually changed
   If(CountRows(ClearCollect(UpdatedAcct,
       Filter(SortByColumns('[dbo].[tbINVTransactionDetail]', "PostedTimestamp", Descending),
           DateTimeValue(Text(PostedTimestamp)) > DateAdd(LastUpdate, -DateDiff(Now(), DateAdd(Now(), TimeZoneOffset(), Minutes), Hours), Hours)))) > 0,
       UpdateContext({No_Acct_Updates: false}),
       UpdateContext({No_Acct_Updates: true}));

   If(CountRows(ClearCollect(UpdatedAcctProd,
       Filter(SortByColumns('[dbo].[VIEW_POWERAPPS_BASIC_ACCTIVATE_DATA]', "UpdatedDate", Descending),
           DateTimeValue(Text(UpdatedDate)) > DateAdd(LastUpdate, -DateDiff(Now(), DateAdd(Now(), TimeZoneOffset(), Minutes), Hours), Hours)))) > 0,
       UpdateContext({No_AcctProd_Updates: false}),
       UpdateContext({No_AcctProd_Updates: true}))








          MD_Project: MD_Project, 
          MD_LatestRevNumber: MD_LatestRevNumber, 
          MD_EngrgRelDate: MD_EngrgRelDate, 
          Title: Title, 
          MD_PDM_Link: MD_PDM_Link, 
          MD_LinkToDrawing: MD_LinkToDrawing, 
          MD_IsActive: MD_IsActive, 
          MD_Data_Flagged: MD_Data_Flagged,  
          MD_Latest_SW_DocumentID: MD_Latest_SW_DocumentID, 
          OnHand: LookUp(UpdatedAcctData,Acct_PartNumber=MD_PartNumber2,OnHand), 
          Location_Acct: LookUp(UpdatedAcctData,Acct_PartNumber=MD_PartNumber2,Location), 
          Discontinued_Acct: LookUp(UpdatedAcctData,Acct_PartNumber=MD_PartNumber2,Discontinued), 
          Available: LookUp(UpdatedAcctData,Acct_PartNumber=MD_PartNumber2,Available)






**PITFALLS!!** (PLEASE read to reduce frustrations...)

  1. Changing anything in the SQL table or the flow REQUIRES you to disconnect the flow from PowerApps (View->Data Sources->select flow from list) and reconnect it (Action->Flows->select flow from list).  Sometimes, even that is not enough!  On more than one occasion I had to do a "Save As" on the flow to a new name and add it back fresh into PowerApps.  If all you get is a "Value" data value in your collection, then something has gone awry.  I spent hours trying to find the problem with this.
  2. The JSON Schema... the "Use sample payload to generate schema" option is NOT perfect!!!  If you get a red "Register error" when you try to add your flow to PowerApps, check your schema.  There are two problems that I have found:
    1. The "type" for a column is empty.  If you see "{}" after your column name, this is a problem, and you need to add the type manually.
      Copy what is between the curly brackets from another column definition and paste it in, making sure the type is correct.  Allowable types are "integer", "string", "object", and "boolean".  Date columns are converted to strings, so you have to use DateTimeValue() to convert the text back if you need it as a date/time value.
    2. Sometimes, the schema generator interprets the type wrong.  Again, if you get the "Register error" when you add the flow, it's likely that an "integer" was interpreted as a "string".  Update the schema manually as needed.
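For example, an empty-type column and its manually repaired version might look like this (the column name is illustrative):

```
Before (causes the Register error):
    "PostedTimestamp": {},

After (type added manually):
    "PostedTimestamp": { "type": "string" },
```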


Flow execution time for 22,000 rows (ten columns of data) was a total of 29 seconds.  I did a test run with a basic table object and a counter; PowerApps shows the table in just about that amount of time.  Not bad if you are in a situation where you can pre-load all of your data, and it eliminates delegation issues.


Resident Rockstar

@jcutrin ,

Where are you getting this error?  Are you pushing a flow command from PowerApps?  I believe your Flow has 2-3 minutes to run if you are using a Respond action.  Let me know how you are doing this.
