
Fix 500 Record Limitation


  • At most 500 records can be returned at a time, whether filtered or not.
  • OneDrive, Dropbox, and Google Sheets can only read from the first 500 records.
  • CDS can read from all records, but can only return 500 records on a Filter.
  • It is not possible to Collect all records of an unfiltered datasource.


Suggestion 1:

Could you create a function dedicated to pre-loading and reloading entity data into a temporary collection? It would do the same as this, but it could pull everything in without the 500 record limit:


It would import either all records or none, so if you wanted to filter, you would have to do it after the temporary collection has been created. Furthermore, to reduce drag on the network, you could restrict this specific function with a queue or a quota, such as 100 runs per second.
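As a rough sketch of how such a function might read in a formula (`LoadAll` is a hypothetical function name, and `MyEntity`, `TempCache`, `OpenItems`, and `Status` are placeholder names, not existing identifiers):

```powerfx
// Hypothetical: LoadAll() would pull every record, ignoring the 500 limit
ClearCollect(TempCache, LoadAll(MyEntity));

// Filtering then happens locally, after the temporary collection exists
ClearCollect(OpenItems, Filter(TempCache, Status = "Open"))
```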



Rationale: If we can collect the entire contents of a datasource to a temporary collection, we can operate on the temporary collection instead. This would allow for all aggregate functions and other non-delegable functions to work on the temporary collection. Then from time to time, we can reload the datasource to have the most updated data in the temporary collection.



Suggestion 2:

Could you create a function that would pull in 500 records at a time until all are pulled in? It would work like my solution, but it would be automated with fewer lines of code.
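For reference, the manual version of this pattern, as it can be written in current versions of the formula language, looks roughly like the sketch below (assuming a numeric `ID` column; `MyEntity` and `TempCache` are placeholder names). A built-in function would collapse all of this into one call:

```powerfx
// Collect records in 500-row slices keyed on a numeric ID column.
// Sequence(10) yields rows 1..10, so this covers up to 5,000 records.
Clear(TempCache);
ForAll(
    Sequence(10),
    Collect(
        TempCache,
        Filter(MyEntity, ID > (Value - 1) * 500 && ID <= Value * 500)
    )
)
```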





I am suggesting these functions as temporary workarounds until all functions are delegable.


Please consider this idea.

Status: Completed
Power Apps

Yeah, I'm sorry. This is my first assignment on the team -- to fix this. Let me ask you some questions about what you would consider an acceptable solution. Some background: the 500 record limit is actually a cache limit, and we'll soon be allowing users to set this globally for the application, up to a limit of 2000 records (it will be in the experimental section). The issue is that when we allow it to get much higher than that, everything slows down. How many records are in the Excel source you're dealing with? Would that work? Alternatively, what about the idea of caching the entire Excel table temporarily in a data source like the Common Data Model? In that kind of scenario, we would read the entire table from the Excel sheet in question when the app loads, and then you can party on, since CDS supports delegation. Thoughts?

Power Apps

I was responding to the last question (which mentioned me) and just saw the suggestion about the temporary cache -- which is the same as my question back to you guys. But I have another clarifying question. When you pull data in from Excel, is it only for read purposes? Or are you expecting to write back? Single user, or multi-user?

Community Champion



The 500 record limit is actually a cache limit. And, we'll soon be allowing users to globally set this for the application up to a limit of 2000 records (it will be in the experimental section.)


Are the settings in the experimental section scoped to the individual app, or do they need to be changed for all apps?


The issue is that when we allow it to get much higher than that, everything slows down.

Can you clarify what exactly slows down? Is it client-side or also server-side? How bad is it?



How many records are in the Excel source you're dealing with? Would that work? Alternatively, what about the idea of caching the entire Excel table temporarily in a data source like the Common data model.

I use a variety of solutions depending on need.


For small tables of <20 records that are used for style or navigation and are not expected to change, I use Table() or embed them as static data. This can inflate the .msapp file size -- I'm not sure what effect this has on loading the app itself.


I cache an entity that stores information about users (a known quantity of <500 records); minor details can be Patched back using ForAll(Patch()). The data is re-cached afterwards, but changes aren't expected in high volume, or at all, so this is not a burden.
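That write-back-and-re-cache pattern, sketched with placeholder names (`Users`, `CachedUsers`, `EditedUsers`, and the `ID`/`Phone` columns are all stand-ins):

```powerfx
// Push each locally edited record back to the entity...
ForAll(
    EditedUsers,
    Patch(
        Users,
        LookUp(Users, ID = EditedUsers[@ID]),
        {Phone: EditedUsers[@Phone]}
    )
);
// ...then re-cache a fresh copy
ClearCollect(CachedUsers, Users)
```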




When you pull data in from Excel is it only for read purposes? Or are you expecting to write back? Single user, multi-user?


The 500 limit mostly affects datasources of mine that have thousands of records and require aggregate functions and all CRUD operations. These are the kinds of datasources that receive records written regularly by multiple users.


If I run a Sum over a Filter, PowerApps behaves as if only the first 500 records exist. So I cache the data: in the OnVisible property of a screen, I invoke a Refresh and then cache the new data.
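Concretely, that OnVisible pattern looks something like this (`Orders`, `CachedOrders`, and the columns are placeholder names; note the ClearCollect itself is still subject to the 500-record cap):

```powerfx
// Screen.OnVisible: refresh the source, then re-cache it locally
Refresh(Orders);
ClearCollect(CachedOrders, Orders);

// Aggregates then run over the local copy rather than the live source
Sum(Filter(CachedOrders, Region = "West"), Amount)
```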


But the Sum becomes outdated when other users write data, and it remains outdated until new data is re-cached. Multiple users are not reading the same instance of the data. Every create/update/delete operation is written to the original datasource, but its formula also has to be duplicated so that the change is written to the cached collection as well. PowerApps becomes a high-code reality in a low-code environment.
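For example, every create ends up written twice (placeholder names again):

```powerfx
// Once to the real datasource...
Patch(Orders, Defaults(Orders), {Region: "West", Amount: 120});
// ...and once to the cached collection, or local aggregates go stale
Collect(CachedOrders, {Region: "West", Amount: 120})
```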


In one app I decided to write the calculated Sum back to the datasource, since the information was readily available as cached data anyway. It is fast to recall a single record detailing the Sum, so it works as a solution, but it quickly becomes inaccurate when other users write to the datasource. So I wrote a function that cross-checks and rewrites the Sum in the datasource in each user's instance whenever they perform actions on the datasource.
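A sketch of that cross-check, with placeholder names (`RunningTotals` is a hypothetical one-record-per-total datasource):

```powerfx
// Recompute the Sum from the cached data and rewrite the stored total
// whenever this user performs an action on the datasource
Patch(
    RunningTotals,
    LookUp(RunningTotals, Key = "WestTotal"),
    {Total: Sum(Filter(CachedOrders, Region = "West"), Amount)}
)
```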


These extra functions to double-check sums and re-cache data impact the user experience. There is more waiting involved when navigating to screens with re-caching formulas and when writing data with cross-checking.


Can you share how badly things will slow down with the projected changes in place?

Power Apps

1. It will be per app. 

2. Server side. Evidently they previously (a long while back) allowed you to bring large numbers of records down and it made the app unusable. I don't know how bad it was. So, they cut the number down to 500. Bottom line, PowerApps itself is not a true data store. We can cache some records locally, but we're not optimized for that. That's what we use CDS and other stores for.

3. and 4. A file data source does not provide the same mechanics as a true multi-user database. For the narrow scenario where only a few people use the app, you need to read in (say) 50K records, and you rarely update -- or a scenario where you never update from PowerApps, you're just using the data as reference data, and the data is refreshed externally on a daily basis -- then a caching scheme where we bring the entire 50K records into a store like CDS would work. These are both valid scenarios (the second one especially). But your app would need to be responsible for updating your (say) CDS cache on a frequent enough basis so that things are not out of date. By the way, there are "in Technical Preview" features you could use to do this today if you have the right licenses (you need the data integrator that lights up with a Dynamics 365 license).


I want to be very careful here because we can't make Excel a multi-user database by ourselves. It's true that the online version of Excel supports multi-user editing, but that is different from the fuller set of services offered by a database (including native query support, server-side query/delegation, performance balancing, etc.). Excel has inched closer to the multi-user database story, but there are still differences.


Given that, I want to be clear on whether or not your scenario fits into one of the two narrow scenarios I described.

Community Champion

The ability to cache 2000 records in one operation (as opposed to 500 currently) would solve 90% of my problems and those of the users I regularly help.  The ability to cache 5000 would resolve 98% of problems.

Caching and delegation are two separate, although related, issues.

Community Champion

PS @LanceDelano, when you say 'soon' in regards to the ability to cache 2000 records, when is that planned/expected?

Power Apps

Re: "soon". Based on the current deployment train, I expect it within two to three weeks. Please note this is an estimate and things can happen with deployments. But, in that timeframe.


Re: "5000 record caching". As I mentioned in an earlier post, allowing large numbers made the system very slow. I don't think we're going to go above 2000 any time soon -- and if we do, it will likely be via a scheme where we "cache" in something like CDS instead of the way we do it now.


Re: caching v. delegation. Yes, they are different, but highly intertwined in our case when the number of records is large. We can't do efficient caching of very large sets of records -- particularly in the case where we bring in a flat file, which can easily have hundreds of thousands of records. We don't do well above 2000 records. So, in these kinds of cases I think we need a server-side story, and thus delegation comes into play.





Community Champion

Thanks @LanceDelano, raising the limit to 2,000 records will be a game changer for me. Look forward to it.

Community Champion

2000 will make a big difference. Thank you @LanceDelano


I am confused about the difference between data stored in a Collection and caching records. Are they not the same thing -- basically data stored in temporary memory for retrieval? Reading the thread, it sounds like there is no 500 record limit once the data is cached/collected in memory. But I received the confirmation below that non-delegable commands cannot be used on a collection beyond 500 records.


Can someone please help me understand this a little bit better?