Super User

Re: ClearCollect for more than 2000 records

 

 

🙂

 

New Member

Re: ClearCollect for more than 2000 records

Hi @dinusc ,

 

I'd like to use Option 2 that you mention but have no idea how to implement it. I've used Collect(dsfinal, ds1, ds2) as shown below by Drrickryp; however, my collections only retrieve the first 2000 rows each, and my data is over 6000 rows.

 

How would you collect 1-500, 501-1000, etc.? What would the formula look like? I'm not sure how that would work and cannot find any examples.

 

I've also tried Collect(dsfinal(If(CustomersList.Custgrp = GalleryAgencies.Selected.Custgrp)2000) but without success. I was hoping the Agency would be the factor in collecting fewer than 2000 items at a time, but it's just not working.

 

Thanks in advance.

Resident Rockstar

Re: ClearCollect for more than 2000 records

@Bren0074  I built a solution for this last week because I was having the exact same issues. A warning though, it's not elegant - but it does work 🙂

 

The solution is to use SharePoint ID numbers.

1. First, sort by ID Descending, collect that into collection1 (col1)

2. Then, sort by ID Ascending, collect that into collection2 (col2)

3. Build an array of 'missing ID numbers' as a collection - call it colTarget. These are the IDs sitting in the gap between Max(col2, ID) and Min(col1, ID)

4. Use ForAll() with Collect() to get the missing ID rows one by one into collection3 (col3)

5. [optional] Collect col1, col2 & col3 into finalCol

 

Step 4 can take a while if 'the gap' between Min/Max is large, so just be mindful of that and maybe think about whether you need all the data in your app. I put a label in my Dev App and set it to CountRows(col3) so you can see it tallying up as data gets collected - it gives you an idea of data collection speed.
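
(If it helps, that progress label is nothing fancy - its Text property is just set to something like the line below. The wording and the vMaxRows comparison are illustrative only.)

// Text property of a progress label (illustrative)
CountRows(col3) & " of " & vMaxRows & " missing rows collected"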

 

You could always build a smaller array (Step 3) of missing IDs that gets tacked onto col1 if you don't need all of the data, e.g. (Min ID of col1) - 500 instead of (Min ID of col1) - (Max ID of col2). There's an illustrative variant of this after the Step 3 code below.

 

Apologies for all the blah above, here's my code solution:

Note: my Step 3 is adjusted from @Prem_ddsl's cool code here Dynamic Loops

 

Step 1:

ClearCollect(col1,SortByColumns(yourSPList,"ID",Descending))

 

Step 2:

ClearCollect(col2,SortByColumns(yourSPList,"ID",Ascending))

 

Step 3:

//specify target rowcount

Set(vMaxRows,Min(col1,ID)-Max(col2,ID))

 

//define a base collection from which you will generate your target collection. A colBase of 60 will be able to collect up to 60x60 = 3600 items. Set colBase's size to meet your needs.
ClearCollect(colBase,[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60]);


//setup collections to be used as inner and outer counters
ClearCollect(colOuter,colBase);ClearCollect(colInner,colBase);Set(vOuterMax,CountRows(colOuter));


//generate array of missing ID numbers 
ForAll(colOuter,ForAll(colInner,Collect(colTarget,{RowId:(Max(col2,ID)+colInner[@Value])+(vOuterMax*(colOuter[@Value]-1))})));

 

RemoveIf(colTarget,RowId>=Min(col1,ID));  // remove unwanted ID numbers
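
To sanity-check the arithmetic with made-up numbers (say Max(col2, ID) = 2000 and the 60-row colBase, so vOuterMax = 60), the nested ForAll above generates:

// outer row 1:  RowIds 2000 + 1 .. 2000 + 60            -> 2001 to 2060
// outer row 2:  RowIds 2000 + 1 + 60 .. 2000 + 60 + 60  -> 2061 to 2120
// ...
// outer row 60: RowIds up to 2000 + 60*60               -> 5600
// and the RemoveIf above then drops anything at or above Min(col1, ID)

// The 'smaller array' variant mentioned earlier would just trim harder, e.g. keep only
// the 500 IDs immediately below Min(col1, ID) (illustrative):
RemoveIf(colTarget, RowId < Min(col1, ID) - 500)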

 

Step 4:

Clear(col3);

ForAll(colTarget,Collect(col3,Filter(yourSPList,ID=RowId)));

 

Step 5:

ClearCollect(finalCol, col1, col2, col3)
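
One hedged caveat on Step 5: col1 and col2 will overlap if the list holds fewer than 4000 rows in total (the top 2000 and bottom 2000 IDs then share records). If that's a possibility, an illustrative way to avoid duplicate rows is:

Set(vMinCol1ID, Min(col1, ID));
ClearCollect(finalCol, col1);
Collect(finalCol, Filter(col2, ID < vMinCol1ID));   // only the part of col2 below col1
Collect(finalCol, col3)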

 

As I say, not elegant but effective. Hope this helps 🙂

 

Cheers

 

Resident Rockstar

Re: ClearCollect for more than 2000 records

Oops, the 'sad face' emoji in the Step 3 formula above should be:

: ( 

 

i.e.

ForAll(colOuter,ForAll(colInner,Collect(colTarget,{RowId : ( Max(colSiteInduct2,ID)+colInner[@Value])+(vOuterMax*(colOuter[@Value]-1))})))

Frequent Visitor

Re: ClearCollect for more than 2000 records

@Eelman Thanks for this - really excellent work. One thing I wondered about is whether it's possible to modify this to include a filter on the data source to remove unnecessary items. For example, I have a list with 40,000 items, but I only need to pull in ~7000 items, based on a status text field, so I'd like to filter out the 33,000 items which are in a "Completed" status. Is this achievable using this same method?

Helper I

Re: ClearCollect for more than 2000 records

Hi there,

 

I have a data source of about 16,000 rows, so I've tried to work around the 2000-row limit by implementing your code, @dinusc:

 

ClearCollect(DummyCollection;Filter(DataDedo;"BatchID1" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID2" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID3" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID4" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID5" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID6" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID7" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID8" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID9" in BatchID))

But I'm stuck, because my DummyCollection only contains 2000 rows...

Have I misunderstood your proposal?

Thanks

 

Microsoft

Re: ClearCollect for more than 2000 records

@GeekAlf_Pro, I suspect your multiple "Collect" formulas do not pull any new data (since they're not delegable), and this is the reason you can't collect them all. Here is how non-delegable functions work:

1. You define your search criteria which uses non-delegable functions.

2. The function pulls the unfiltered batch of records to the client side (up to 2K).

3. The function applies the filtering on the client side (since the function cannot be delegated to the back end).

If none of the records in the retrieved batch match the search criteria, the search result is empty.

My suggestion was to use sub-collections with delegable functions (something like ID > 5000 && ID < 10000) to pull all the records to the client and then apply the desired filtering. Please note that this will cause your client app to use a significant amount of memory (which is one of the main reasons for the record limitation in the first place), so you should not do this if your application will be used on older mobile devices with limited memory and/or slow network connections.
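
To make the sub-collection idea concrete, here is a minimal sketch. It assumes a data source that can delegate numeric comparisons on ID (for example a SharePoint list called yourSPList) and a text column called Status - both names are placeholders - and each Filter stays within the per-call record limit:

ClearCollect(colAll, Filter(yourSPList, ID >= 1 && ID <= 2000));
Collect(colAll, Filter(yourSPList, ID > 2000 && ID <= 4000));
Collect(colAll, Filter(yourSPList, ID > 4000 && ID <= 6000));
// ...repeat until the highest ID is covered...
// with everything on the client, the non-delegable filtering can now be applied locally
ClearCollect(colOpen, Filter(colAll, Status <> "Completed"))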

Helper I

Re: ClearCollect for more than 2000 records

Hi @dinusc,

 

thank you very much for your quick response.

 

My comments below

 

 


@dinusc wrote:

@GeekAlf_Pro, I suspect your multiple "Collect" formulas do not pull any new data (since they're not delegable), and this is the reason you can't collect them all. Here is how non-delegable functions work: [GeekAlf_Pro] Right! It's a OneDrive Excel sheet.

1. You define your search criteria which uses non-delegable functions.

2. The function pulls the unfiltered batch of records to the client side (up to 2K). [GeekAlf_Pro] That's the part I don't understand. I thought that filtering on the BatchID would let me enrich my local collection, since each filter keeps me below the threshold.
If I understand you correctly, you're saying that this filtering is useless, right?

3. The function applies the filtering on the client side (since the function cannot be delegated to the back end).

If none of the records in the retrieved batch match the search criteria, the search result is empty.

My suggestion was to use sub-collections with delegable functions (something like ID > 5000 && ID < 10000) to pull all the records to the client and then apply the desired filtering. Please note that this will cause your client app to use a significant amount of memory (which is one of the main reasons for the record limitation in the first place), so you should not do this if your application will be used on older mobile devices with limited memory and/or slow network connections. [GeekAlf_Pro] I'll give it a try and report back. Thanks again.




Microsoft

Re: ClearCollect for more than 2000 records

To provide more information about #2 above: filtering with BatchID is no different than any other filtering. The idea of "sectioning" data into batches is about retrieving smaller batches using simple, delegable functions. In your example, you cannot apply a non-delegable filter to 16K records. So the idea is to pull all 16K records to the client using simpler, delegable functions and then apply the desired filter locally (on the client). This is not always straightforward to implement because, as mentioned in the original post, you may need to re-design your backend storage (for example by adding a unique numeric index column - a batch ID).
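
Purely as an illustration of that re-design (assuming the data lives in a source that supports delegation of numeric comparisons, and assuming a hypothetical numeric BatchNo column that numbers each block of up to 2000 rows 1, 2, 3, ...), the 16K rows could then be pulled batch by batch and filtered locally. Shown with comma separators - swap in your locale's ; / ;; syntax as in your own formulas:

// each Filter below is a delegable equality test on the numeric BatchNo column
ClearCollect(colAllRows, Filter(DataDedo, BatchNo = 1));
Collect(colAllRows, Filter(DataDedo, BatchNo = 2));
Collect(colAllRows, Filter(DataDedo, BatchNo = 3));
// ...and so on up to the last batch...
// once all rows are on the client, any non-delegable filter works as expected
ClearCollect(colFiltered, Filter(colAllRows, "BatchID1" in BatchID))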
