Maybe I'm slow and just now finding this out, but I thought I would share it with you. I have been working with some data sets in SQL and lists in SharePoint that exceed 2,000 items. A while back I summarized a method that could get up to 4,000 items. Via Flow, I have found a way to collect many more. I have successfully retrieved a SQL table of over 20,000 rows with this method. It also works with SharePoint, but it is much slower and has some other hangups. I'll try to outline them here.
The basic method is to create a JSON array from the data, and then pass that back to PowerApps via a Response action. This is how it works:
Sample flow, "GetSQLData"
1. The entire flow is only three steps: the PowerApps trigger, Get rows, and Response.
2. You go into "Settings" on the Get rows action, turn on Pagination, and set a max number in the Limit field.
3. The trick is the "JSON Schema". You have to run the flow with steps one and two only first. Limit the output to a single row (set "Top Count" in advanced options to 1, no quotes). Copy (highlight, then Ctrl-C) the output, but ONLY what is in the square brackets, including the brackets (see the sample payload after this list).
4. Open the Response action. Select "Use sample payload to generate schema" and paste what you copied in step 3. Click Done, and the system will generate the schema.
5. In PowerApps, call the flow in OnStart or OnVisible:
ClearCollect(TableData,GetSQLData.Run())
6. After this, you should have all of the table data available in your collection. No delegation issues and a very fast response. (Don't forget to remove the limit in "Top Count" if you set it!)
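For reference, what you copy in step 3 is just a JSON array holding a single record. A minimal sketch of what it might look like (these column names are made up; yours will match your table):

[
  {
    "ID": 1,
    "Title": "Widget A",
    "Modified": "2019-03-26T14:05:00Z"
  }
]

Paste that whole array, brackets included, into "Use sample payload to generate schema" and Flow infers the schema for every row.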
SharePoint modification:
SharePoint is just a little different animal... You have to remap the columns, since there is something strange in the JSON output from SharePoint that PowerApps just does not like. So, add a Select action and map each column. KEEP IN MIND the pitfall note below: if you change anything in the Select columns, you will have to generate a new schema. Your sample payload will come from the Select output when you do a test run, instead of the SharePoint output. You will also have to remove the data source and re-add the flow in PowerApps to reset it.
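To make the mapping concrete, here is a sketch of the Select action's "Map" for a hypothetical three-column list (the column names are placeholders; the right-hand side uses the standard item() expression syntax):

ID        item()?['ID']
Title     item()?['Title']
Modified  item()?['Modified']

Each row in the Map becomes a clean key in the JSON output that PowerApps can digest.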
Accelerating retrieval of thousands of items!
You can increase the speed of retrieval by a factor of how many "lanes" you create in parallel. You basically divide the indexes into as many ranges as you have lanes. In this example, I split the index into four ranges and put them in parallel actions. Yes, this cuts retrieval time to nearly 1/4 that of a single range!
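To sketch the split, assuming an integer ID column and about 20,000 rows, each lane's Get rows action gets its own Filter Query (OData syntax; the ranges here are just an example):

Lane 1: ID ge 1 and ID le 5000
Lane 2: ID ge 5001 and ID le 10000
Lane 3: ID ge 10001 and ID le 15000
Lane 4: ID ge 15001 and ID le 20000

Then combine the lane outputs back into one array (a Compose with union() is one way) before the Response action.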
Update data after initial load:
You may wish to update your data instead of loading the entire dataset again. This method takes your existing dataset, looks for updates from your sources, then patches those updates into your current collections. While the initial load of my data can take nearly 60 seconds, a quick check and update takes only 6-8 seconds, so your data doesn't go stale. You can also use timers to refresh data at regular intervals (see the timer sketch after the routine below).
// Data Refresh Routine v1.0.0 3/26/2019
// Show the spinner and reset the status circles
UpdateContext({LoadDataSpinner:true, Circle1_Color:Red, Circle2_Color:Red, Circle3_Color:Red});
Concurrent(
    // Clear the staging collections
    Clear(UpdatedFromMDL), Clear(UpdatedAcct), Clear(UpdatedAcctProd), Clear(UpdatedAcctData), Clear(UpdatedMDL),
    UpdateContext({TimerReset:false, Circle1_Color:Green})
);
Concurrent(
    // Refresh the sources, then collect only records modified since LastUpdate
    Refresh('Engineering Change Notice'),
    Refresh('Engineering Change Request'),
    Refresh(Drawing_Vault),
    If(CountRows(ClearCollect(UpdatedMDL,
            Filter(SortByColumns(Master_Drawing_List, "Modified", Descending), Modified > LastUpdate))) > 0,
        UpdateContext({No_MDL_Updates:false}),
        UpdateContext({No_MDL_Updates:true})),
    If(CountRows(ClearCollect(UpdatedAcct,
            Filter(SortByColumns('[dbo].[tbINVTransactionDetail]', "PostedTimestamp", Descending),
                DateTimeValue(Text(PostedTimestamp)) > DateAdd(LastUpdate, -DateDiff(Now(), DateAdd(Now(), TimeZoneOffset(), Minutes), Hours), Hours)))) > 0,
        UpdateContext({No_Acct_Updates:false}),
        UpdateContext({No_Acct_Updates:true})),
    If(CountRows(ClearCollect(UpdatedAcctProd,
            Filter(SortByColumns('[dbo].[VIEW_POWERAPPS_BASIC_ACCTIVATE_DATA]', "UpdatedDate", Descending),
                DateTimeValue(Text(UpdatedDate)) > DateAdd(LastUpdate, -DateDiff(Now(), DateAdd(Now(), TimeZoneOffset(), Minutes), Hours), Hours)))) > 0,
        UpdateContext({No_AcctProd_Updates:false}),
        UpdateContext({No_AcctProd_Updates:true}))
);
Collect(UpdatedAcct, UpdatedAcctProd);
UpdateContext({Circle2_Color:Green});
Concurrent(
    // Build the list of changed part numbers and pull the matching accounting rows
    ClearCollect(UpdatePartNumbers,
        UpdatedMDL.MD_PartNumber,
        RenameColumns(UpdatedAcct, "ProductID", "MD_PartNumber").MD_PartNumber,
        RenameColumns(UpdatedAcctProd, "ProductID", "MD_PartNumber").MD_PartNumber),
    ForAll(UpdatedAcct.ProductID,
        Collect(UpdatedAcctData,
            LookUp(RenameColumns('[dbo].[VIEW_POWERAPPS_BASIC_ACCTIVATE_DATA]', "ProductID", "Acct_PartNumber"),
                Acct_PartNumber = ProductID)))
);
UpdateContext({Circle3_Color:Green});
ForAll(Distinct(UpdatePartNumbers, MD_PartNumber).Result,
    Collect(UpdatedFromMDL, LookUp(Master_Drawing_List, MD_PartNumber = Result)));
// Patch the changed records into the main collection
ForAll(RenameColumns(UpdatedFromMDL, "MD_PartNumber", "MD_PartNumber2"),
    Patch(MDL_All, LookUp(MDL_All, MD_PartNumber = MD_PartNumber2),
        {
            MD_Project: MD_Project,
            MD_LatestRevNumber: MD_LatestRevNumber,
            MD_EngrgRelDate: MD_EngrgRelDate,
            Title: Title,
            MD_PDM_Link: MD_PDM_Link,
            MD_LinkToDrawing: MD_LinkToDrawing,
            MD_IsActive: MD_IsActive,
            MD_Data_Flagged: MD_Data_Flagged,
            MD_Latest_SW_DocumentID: MD_Latest_SW_DocumentID,
            LastCost: LookUp(UpdatedAcctData, Acct_PartNumber = MD_PartNumber2, LastCost),
            OnHand: LookUp(UpdatedAcctData, Acct_PartNumber = MD_PartNumber2, OnHand),
            Location_Acct: LookUp(UpdatedAcctData, Acct_PartNumber = MD_PartNumber2, Location),
            Discontinued_Acct: LookUp(UpdatedAcctData, Acct_PartNumber = MD_PartNumber2, Discontinued),
            Available: LookUp(UpdatedAcctData, Acct_PartNumber = MD_PartNumber2, Available)
        }
    )
);
Set(LastUpdate, Now() - .002);
UpdateContext({TotalDwgCount:CountRows(MDL_All), LoadDataSpinner:false});
ClearCollect(StockQuote, GetKIQStockQuote.Run());
UpdateContext({LastStockUpdate:Now()})
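For the timed refresh mentioned above, a minimal sketch using a Timer control (the control name RefreshTimer, the five-minute interval, and the RefreshDataButton it triggers are all assumptions for illustration):

RefreshTimer.Duration:   300000                      // milliseconds, so 5 minutes
RefreshTimer.Repeat:     true                        // restart automatically when it ends
RefreshTimer.AutoStart:  true
RefreshTimer.OnTimerEnd: Select(RefreshDataButton)   // fire the button holding the refresh routine

Select() just triggers the button's OnSelect, so the refresh routine only has to live in one place.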
**PITFALLS!!** (PLEASE read to reduce frustrations...)
Flow execution time for 22,000 rows (ten columns of data) was a total of 29 seconds. I did a test run with a basic table control and a counter, and PowerApps showed the table in just about that amount of time. Not bad if you are in a situation where you can pre-load all of your data, and it eliminates delegation issues.
@jcutrin ,
Where are you getting this error? Are you pushing a flow command from PowerApps? I believe your Flow has 2-3 minutes to run if you are using a Respond action. Let me know how you are doing this.