Kudo Kingpin

500-item limit in CDM entity search filter (need to switch to an app if this persists)

The 500-item limit in the CDM entity search filter makes it very difficult to use for any business scenario (export, data analysis),

because I have 50k records, and the search filter may sometimes return 5k or 20k of them, and I need to analyze this data (so, export it).

Currently it only exports the first 500 items, which does not meet any business criteria (imagine doing a Google search and it returns only 3 results). Sadly, if this is a permanent limitation like the SharePoint list 5k limit, I will have to report it to our project sponsors, and since it does not meet the business need to filter and export, we will most likely have to build an app, which we did not want to do.

I will at least need a good workaround. One thing I observed is that there is an export data link on the CDM screen (can you give me a workaround based on that?).
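One workaround used while waiting on a fix is to pull the data down in blocks that each stay under the 500-record limit, by filtering on ranges of a sequential ID. Numeric comparisons delegate to CDM, so each block comes back complete. A minimal Power Fx sketch, assuming an entity named Instruments with a sequential numeric RecordId column (both names are illustrative) and a Power Apps version that supports Sequence:

Clear(CachedInstruments);
ForAll(
    Sequence(100),  // 100 blocks x 500 = up to 50,000 records
    Collect(
        CachedInstruments,
        Filter(Instruments, RecordId > (Value - 1) * 500 && RecordId <= Value * 500)
    )
)

After this runs, CachedInstruments holds the full set locally, so further filtering and export can operate on the collection without hitting the limit again.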

100 REPLIES

@mr-dang, @Meneghino, @v-yamao-msft, hpkeong

One thing I noticed is that the "in" operator works great when you have the data already loaded in a collection!

I have the data loaded in CachedInstruments4, and I am using a checkbox's check/uncheck to filter records based on the countries selected.

On check

Collect(CachedInstruments4, Filter(CachedInstruments3, Country in ThisItem.Country))


On uncheck

Remove(CachedInstruments4, Filter(CachedInstruments4, Country = ThisItem.Country))

Hi @AmitLoh-Powerap

Yes, indeed that is the case for the in operator as a test of membership.

We really need to get this operator properly delegated when used as a test of membership.

I have written this post, but have had no luck in getting clarification yet:

Community Champion

Below are some metrics with my new method. The internet connection used is not as quick as the school internet my users use.


5 entities = 10,803 records total = 34 iterations of up to 500 records = about 22 real groups of 500


Elapsed time trials: 44s, 39s, 40s, 44s, 42s


  • Average elapsed time: 41.8s
  • Average number of records loaded per second: 258 records
  • Average elapsed time per loop (iteration): 1.22s/loop
  • Average elapsed time to cache 500 records: 1.9s/cache of 500

These tests were run on the same network as a previous test I ran. Below are some comparisons:

  • There were 65% more records this time.
  • There were 24 loops last time. The new test sample has 42% more iterations.
  • The average time elapsed this time is about the same as the fastest time last time (41s).

Therefore, the changes in PowerApps version 2.0.590 and this method combine for up to a 42-65% improvement in performance. Since I no longer need to write a column identifying which block of 500 a record belongs to, there are also improvements in writing. That represents about 50% less writing time, since I previously wrote new records twice: once for the record with the main information, and a second time to calculate n as RoundUp(PrimaryId/500,0).


Although there is a performance improvement, I would still judge 40+ seconds of caching too long for end-users, even if it's a one-time block at the start. If it is 40s right now, then with more usage of the app the time will gradually increase to unsustainable levels. If Sum() or CountRows() could operate on any one of the entities in its entirety, or if there were a one-click Cache() function, I could easily cut out at least 10s or more.
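For readers following along, the retired block-numbering scheme mentioned above worked roughly like this (MyEntity, CachedRecords, and CurrentBlock are illustrative names; n is the stored block-number column, precomputed on write because RoundUp itself would not delegate at query time):

// On write: store the block number alongside each new record
// n = RoundUp(PrimaryId / 500, 0)

// Caching loop body (e.g. in a Timer's OnTimerEnd), one block of <=500 per pass:
Collect(CachedRecords, Filter(MyEntity, n = CurrentBlock));
UpdateContext({CurrentBlock: CurrentBlock + 1})

The equality filter on the stored column delegates, which is what made each pass return its full block; the cost was the second write per record that the newer method eliminates.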

Microsoft Employee
Community Champion

Hi @mr-dang

Thanks for the update, very useful.

Yesterday I added this idea, please upvote:


Not sure it will help, but at least they cannot say we did not ask.




I am using an Excel file as a database in PowerApps, and I am trying your logic, but it is not working: PowerApps doesn't read the content where RecordId is more than 500. It only shows me 500 records. Please help! Thanks.

Excel is not a delegable data source (yet), so you will only obtain the first 500 records.
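If restructuring the workbook is an option, one way around this is to split the Excel source into several tables of fewer than 500 rows each and combine them into one collection. A sketch, assuming the workbook has been split into tables named Instruments1 through Instruments3 (names are illustrative):

ClearCollect(AllInstruments, Instruments1, Instruments2, Instruments3)

Since each source table stays under the limit, nothing is truncated, and any further filtering can run on the in-memory AllInstruments collection.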

See here for some ideas:


Hi, Thanks for the information.


So I have to use an Azure SQL DB as a database for better results?



You would get much better results.

Continued Contributor

Hoping for some help here. I have a SP list with just over 1000 items. OnStart, I would like to filter to pull in only the items whose Date field is in this year or last year. Is this possible?

The best way to have your cake and eat it (i.e. filter and have the filter operation delegated) would be to create a calculated column in SharePoint as Year([Date]), then use this column (let's call it DateYear) to filter. So you can have this expression in OnStart:

ClearCollect(MyFilteredItems, Filter(MySpList, DateYear=Year(Today()) || DateYear=Year(Today())-1))

If this is not delegated because of the indeterminacy of Today(), then try this in the OnStart:

Set(CurrentYear, Year(Today()));
ClearCollect(MyFilteredItems, Filter(MySpList, DateYear=CurrentYear || DateYear=CurrentYear-1))

Please let me know how you get on.
