mr-dang
Level 10

Overcoming the 500 record limit locally

UPDATE 2/19/17: I have written a new solution for this method here. It works more efficiently for CDS.

____________________

 

A few days ago I shared the solution I have been using to pull in all records of an Entity into a temporary collection to overcome the 500 record limitation. You can read about it here.

 

Now that I've had some time to play with ForAll, I have a more elegant solution for pulling in 500 records at a time for reading. I still do not have a good solution for writing, though. For this to work, you will still need a column in your Entity that records which set of 500 each record belongs to.

 

Big idea: 

  1. Find out what the maximum n value is that describes how many sets of 500 you have. The formula I included also repairs the datasource if the last entry did not correctly have an n recorded.
  2. Create a dummy collection that includes whole numbers that are less than or equal to the n value you found in step 1.
  3. Use the dummy collection from step 2 as an argument in ForAll--"For each n in the dummy table, collect the 500 records from the datasource whose n equals that number."

 

UpdateIf(datasource,IsBlank(n),
	{n: RoundDown(Value(PrimaryId)/500,0)+1
	}
);
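
As a quick worked example of that numbering: a record with PrimaryId 1200 gets n = RoundDown(1200/500,0)+1 = 2+1 = 3, so it will be pulled in with the third set of 500.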

UpdateContext({maxn: First(Sort(datasource,PrimaryId,Descending))});

ClearCollect(iter,
	Distinct(Filter(HundredChart,Num<=maxn.n),Num)
);

Clear(datasource_temp);

ForAll(iter,
	Collect(datasource_temp,
		Filter(datasource,n=Result)
	)
)

This can be done in a Button, Toggle, Timer, or whatever you want to trigger.

 

The only requirement is that you create a Table of whole numbers in a column [1,2,3,4,5, etc.] from which to pull your dummy collection. You can connect it to PowerApps as static data. I just used an existing "Hundred Chart" from a datasource I already connected. I do not know another way of making a collection with such whole number sets.
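
One untested alternative would be to hand-build a small collection of whole numbers yourself, though that gets tedious if you expect many sets of 500. A minimal sketch, using the same Num column name as my Hundred Chart:

ClearCollect(HundredChart,
	{Num: 1}, {Num: 2}, {Num: 3}, {Num: 4}, {Num: 5},
	{Num: 6}, {Num: 7}, {Num: 8}, {Num: 9}, {Num: 10}
)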

 

EDIT: Unfortunately, you will need to create an n value in your entity. I tried the following formula to work around writing a column for n, but it runs into service limitations:

 

ForAll(iter,
	Collect(datasource_temp,
		Filter(datasource,(RoundDown(Value(PrimaryId)/500,0)+1)=Result)
	)
)

 

EDIT2:

Since UpdateIf does not delegate, you will not be able to fix all n that are blank. Instead, when writing a record, write n as 0 instead of blank so you can fix it. The change below can only fix the last record and may miss any others that do not have an n.

UpdateContext({maxn: First(Sort(datasource,PrimaryId,Descending))});

If(IsBlank(maxn.n) || maxn.n=0,
	Patch(datasource,First(Filter(datasource,PrimaryId=maxn.PrimaryId)),
		{n: RoundDown(Value(maxn.PrimaryId)/500,0)+1
		}
	);
	UpdateContext({maxn: First(Sort(datasource,PrimaryId,Descending))})
);

ClearCollect(iter,
	Distinct(Filter(HundredChart,Num<=maxn.n),Num)
);

Clear(datasource_temp);

ForAll(iter,
	Collect(datasource_temp,
		Filter(datasource,n=Result)
	)
)
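
For reference, a minimal sketch of that write pattern: stamp n with 0 at creation time so the fix above can find and repair it later (Defaults() and the omission of your other column values are just simplifications for the example):

Patch(datasource, Defaults(datasource),
	{n: 0}
)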

 

 

@Meneghino @hpkeong @AmitLoh-Powerap: I think you would be interested in this.

Microsoft Employee
@8bitclassroom
Super User

Re: Overcoming the 500 record limit locally

Hi Brian @mr-dang

 

Thanks so much for the great effort in sharing this work on delegation, which is so crucial to most of us dealing with bulk data, especially when using non-delegable formulas.

 

I shall also check on the actual performance based on your sharing.

 

Once again, my appreciation.

 

Have a nice day, my friend.

hpkeong

Re: Overcoming the 500 record limit locally

Thank you for your help!

mr-dang
Level 10

Re: Overcoming the 500 record limit locally

The savings on loading time are worthwhile:

 

Entity size: 4 entities with a total of 24 blocks of 500

 

Timer method: 1:03 - 1:10

ForAll method: 0:41 - 0:50

 

Tests were conducted on a stronger laptop, so YMMV. I expect Atom devices and Chromebooks should be much slower.

 

This represents a 42% reduction in loading time at best and 21% at worst. This is because ForAll can run concurrently. 

 

(Screenshot attachment: 2017-01-16.png)

Microsoft Employee
@8bitclassroom
mr-dang
Level 10

Re: Overcoming the 500 record limit locally

I ran some new tests on a faster network:

 

Writing 24 sets of 500 records to temporary collections

 

Surface device (Chrome browser)

Timer method: 0:28 - 0:29

ForAll method: 0:17 - 0:18

 

Surface device (Edge browser)

Timer method: 0:30 - 0:37

ForAll method: 0:17 - 0:19 (Also 1:51 - 1:54)

 

Chromebook

Timer method: 0:31 - 0:36 

ForAll method: 0:16 - 0:17 (Also 1:31 - 1:39)

 

It put my Surface to shame when the Chromebook kept up to speed using the ForAll method. However, my tests were done in a mostly blank app. The Chromebook will definitely choke when Galleries and other calculations depend on the temporary collection you are building; that's where the computer needs CPU and RAM to keep up. To avoid that, I put a condition on the Items property of my Galleries so that while I am loading in data, the gallery does not need to recalculate its items for each set of 500 that is pulled in. It does need to recalculate at the end, but that's fine. I confirmed that I could still achieve around 0:32 - 0:36 on the Chromebook with this workaround, but the load ran minutes longer than necessary without the condition on the Items properties (I quit waiting).
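
A minimal sketch of that Items condition, assuming a context variable I'll call loading (the name is mine), which the Button sets to true with UpdateContext before the ForAll and back to false when the load finishes:

If(loading, Blank(), datasource_temp)

While loading is true, the gallery shows nothing and skips recalculating; when it flips back to false, the gallery binds to the finished datasource_temp collection.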

 

In my testing there was a time when everything dramatically slowed on both devices; I listed those times in parentheses as well. You could tell that things were going to lag when the first set of 500 records went nowhere--normally I see it pull in within 3s. (This was around 3:30 PM PST, if that means anything.) At first, I thought it was a Microsoft Edge issue, but retesting showed consistent results with Chrome on Windows. The Chromebook also returned to about 0:17 again. So the inconsistency in writing to CDS is either a problem with my network or on the backend of CDS.

 

Caveat: this testing was done under ideal conditions. My students were gone for the day, so I was likely the only one on the network, and I only tested one device at a time. In real scenarios, the load is unbearably slow for my students' Chromebooks--I do not get the 0:17. When all students are on, loading in the entities has taken as long as 3-5 minutes using the Timer method. It's hard to justify using the app with that amount of delay (students work on something else while it is loading). That's why I'm running these tests and trying to find faster methods.

 

Last comment: I made a critical error in my formulas. It turns out that UpdateIf cannot fix all n that are blank. In PowerApps, UpdateIf cannot update records outside of the first 500 if the argument is "IsBlank(n)"--it doesn't delegate. No blue delegation warning popped up, which led me to believe that it had worked. The obvious fix is to go into Excel and recalculate the n column in one shot, but solving the problem with a formula is a better idea. I have a way of fixing the n for the last record, but not everything in between:

UpdateContext({maxn: First(Sort(datasource,PrimaryId,Descending))});

Patch(datasource,First(Filter(datasource,PrimaryId=maxn.PrimaryId)),
{n: RoundDown(Value(maxn.PrimaryId)/500,0)+1
}
)

Unfortunately, ForAll cannot delegate to fix all blank values of n either.

Microsoft Employee
@8bitclassroom

Re: Overcoming the 500 record limit locally

Hi Dang,

 

On a button click, how do you bring in the next 500? With a Timer I understand it pulls the next set each time the duration elapses, but with a button click, how do you bring in all the sets of 500 (24 times)?

mr-dang
Level 10

Re: Overcoming the 500 record limit locally

 

One Button press is able to pull in everything because you identified the total number of sets of 500:

UpdateContext({maxn: First(Sort(datasource,PrimaryId,Descending))});

If(IsBlank(maxn.n) || maxn.n=0,
	Patch(datasource,First(Filter(datasource,PrimaryId=maxn.PrimaryId)), {n: RoundDown(Value(maxn.PrimaryId)/500,0)+1 } );
	UpdateContext({maxn: First(Sort(datasource,PrimaryId,Descending))})
);

 

The "iter" collection below identifies how many times you will pull in 500 by creating an ordered set of whole numbers [1,2,3,4,5,... maxn.n]:

 

ClearCollect(iter,
	Distinct(Filter(HundredChart,Num<=maxn.n),Num)
);

 

ForAll is able to repeat collecting 500 records for each record you have in the "iter" collection above. It means, "For All the whole numbers you collected in iter, collect the set of 500 records which have n equal to that whole number." So collect all 500 records with n=1, then all records with n=2... finally all records with n=maxn.n.

 

Since iterations are built into the function itself, you only need to click the button once.

Clear(datasource_temp);

ForAll(iter,
	Collect(datasource_temp,
		Filter(datasource,n=Result)
	)
)

 

Microsoft Employee
@8bitclassroom
Super User

Re: Overcoming the 500 record limit locally

Hi Dang:

 

I will explore this modified function after CNY.

You mentioned Two Buttons, which include:

- UpdateContext(...); If..(Patch)..., then

- ClearCollect....

- ForAll

...Are they different actions, or?

 

The first argument UpdateContext({maxn: ...}), and then If(IsBlank(maxn.n) || ...

Are they referring to the same maxn or maxn.n?

 

Anyway, maybe I can't really figure it out just by looking at it, but I will try it then.

 

Glad to have new solutions exactly on the first day of Lunar New Year.

Thanks.

hpkeong
mr-dang
Level 10

Re: Overcoming the 500 record limit locally

@hpkeong wrote:

You mentioned Two Buttons, which include:

- UpdateContext(...); If..(Patch)..., then

- ClearCollect....

- ForAll

...Are they different actions, or?

The three actions are all part of one Button.OnSelect. I just split them up in my post to explain each part of it.

@hpkeong wrote:

 

The first argument UpdateContext({maxn: ...}), and then If(IsBlank(maxn.n) || ...

Are they referring to the same maxn or maxn.n?

 



 

The first UpdateContext({maxn: ...}) is to see what the n of the last record in the datasource is. Sometimes it can be blank or 0 if PowerApps closed on a user who was in the middle of Patching. So I fix the record with Patch, then find out what maxn is again. Ideally I could just UpdateIf on the datasource and fix all n, but it cannot delegate.

Microsoft Employee
@8bitclassroom
Super User

Re: Overcoming the 500 record limit locally

Thanks Brian

 

I will seriously look into this and see if there is any workaround, though I rarely deal with large amounts of data.

 

Have a nice day.

hpkeong
