Re: 500 item limit in CDM entity search filter(need to switch to asp.net app if this exists for long

I was able to figure out multi-select without using the IN filter, but unselect does not work. I will share this solution over the weekend, as I am close to going live.

Meneghino
Level 10

Re: 500 item limit in CDM entity search filter(need to switch to asp.net app if this exists for long

OK, please mention me in your post so I can have a look.

Thanks.

mr-dang
Level 10

Re: 500 item limit in CDM entity search filter(need to switch to asp.net app if this exists for long

The latest version of PowerApps can handle more conditions in Filter, and it can use <= and >= more easily.

 

This makes it possible to pull in records without needing an n column. I have confirmed with CountRows() and my old method that each iteration does add up. I will post shortly. This is big news!

Microsoft Employee
@8bitclassroom
hpkeong
Level 10

Re: 500 item limit in CDM entity search filter(need to switch to asp.net app if this exists for long

Hi

Excellent. Hope Excel is coming.

hpkeong
mr-dang
Level 10

Re: 500 item limit in CDM entity search filter(need to switch to asp.net app if this exists for long

OK, this is the best solution for pulling in all records right now. It is only possible if you have the latest version of PowerApps (2.0.590), since Filter seems to handle more conditions.

 

This solution makes it possible to avoid adding a column that calculates which block of 500 a record belongs to (I previously used a column called n). I based my formulas on a CDS entity and performed calculations on the default RecordId field, which is a Big Integer. I was previously using PrimaryId, since its values are small numbers starting at 1, but the problem is that it is handled as text. RecordId, at least, is a numeric value.

 

 

UpdateContext({firstrecord: First(datasource)});
UpdateContext({lastrecord: First(Sort(datasource,RecordId,Descending))});
UpdateContext({maxiter: RoundUp((lastrecord.RecordId-firstrecord.RecordId)/500,0)});

ClearCollect(iter,
	AddColumns(AddColumns(Filter(HundredChart,Number<=maxiter),"min",(Number-1)*500),"max",Number*500)
);

Clear(datasource_temp);
ForAll(iter,
	Collect(datasource_temp,
		Filter(datasource,RecordId>=firstrecord.RecordId+min && RecordId<firstrecord.RecordId+max)
	)
)

 

There are three parts:

 

1. Determine the first record (firstrecord) and the last record (lastrecord). Subtracting their RecordId values and dividing the difference by 500 determines how many iterations you need to perform (maxiter). The firstrecord's RecordId will be used as a reference point for pulling in records later.

 

UpdateContext({firstrecord: First(datasource)});
UpdateContext({lastrecord: First(Sort(datasource,RecordId,Descending))});
UpdateContext({maxiter: RoundUp((lastrecord.RecordId-firstrecord.RecordId)/500,0)});
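
For example (with made-up numbers): if firstrecord.RecordId were 1,000,000 and lastrecord.RecordId were 1,001,350, then maxiter would be RoundUp((1001350-1000000)/500,0) = RoundUp(2.7,0) = 3, so three passes of up to 500 records each cover the whole range.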

 

2. Make a static table of whole numbers [1, 2, 3, ... 100 or whatever you want]. Filter it to use as the argument in ForAll later. It gives ForAll instructions on how many times to "loop." So if, in step 1, you determined that your number of iterations (maxiter) was 3, then the formula would Filter down to all whole numbers less than or equal to 3.

 

Note: my formula below is messy. I am using an existing table that only has a column of whole numbers, so I had to add the lower and upper bounds of each block of 500 using AddColumns(). You may opt to store those columns in your table so they do not need to be calculated every time, although it is a small calculation either way. (If you do not already have a numbers table, see the sketch after the formula below.)

 

ClearCollect(iter,
	AddColumns(AddColumns(Filter(HundredChart,Number<=maxiter),"min",(Number-1)*500),"max",Number*500)
);
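
If you do not already have a numbers table, one minimal way to build it is below (a sketch that assumes the newer Sequence function is available in your version; HundredChart is just the name used in the formula above):

// Build a table of whole numbers 1..100 in a column named Number
ClearCollect(HundredChart,
	RenameColumns(Sequence(100),"Value","Number")
);

Alternatively, a one-column Excel table of whole numbers imported as static data works the same way.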

 

3. The last part is where the formula pulls in the records. First it clears the temporary collection (datasource_temp) used for holding them. ForAll then pulls in up to 500 records at a time for each whole number you filtered into iter in step 2. So if you have 3 whole numbers in the iter collection (maxiter=3), then ForAll will pull in:

  • all records with RecordId>=firstrecord.RecordId+0 and RecordId<firstrecord.RecordId+500
  • all records with RecordId>=firstrecord.RecordId+500 and RecordId<firstrecord.RecordId+1000
  • all records with RecordId>=firstrecord.RecordId+1000 and RecordId<firstrecord.RecordId+1500
  • then it will stop because there are no other whole numbers in the "iter" Collection.

 

Clear(datasource_temp);
ForAll(iter,
	Collect(datasource_temp,
		Filter(datasource,RecordId>=firstrecord.RecordId+min && RecordId<firstrecord.RecordId+max)
	)
)
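
Once datasource_temp is filled, you can point your galleries and other formulas at the local collection instead of the entity, so the 500 record limit no longer gets in the way. For example, in the Items property of a gallery (SearchBox and the Title field are just placeholder names for this sketch):

// Items property of a gallery searching the cached collection
Filter(datasource_temp, StartsWith(Title, SearchBox.Text))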

 

 

 

Microsoft Employee
@8bitclassroom
hpkeong
Level 10

Re: 500 item limit in CDM entity search filter(need to switch to asp.net app if this exists for long

Hi

 

Thanks for sharing. I will test it on my side and hopefully get the results as explained.

One thing I wonder: what is Number <= EndPocket?

 

TQ

hpkeong
mr-dang
Level 10

Re: 500 item limit in CDM entity search filter(need to switch to asp.net app if this exists for long

@hpkeong,

EndPocket is a typo--that's the name of my own variable. I forgot to rename it to a generic name when typing up my post, but I have fixed it now. It should be maxiter there.

Microsoft Employee
@8bitclassroom

Re: 500 item limit in CDM entity search filter(need to switch to asp.net app if this exists for long

Thanks mr-dang!

For multi-select I used a gallery technique with checkboxes. I will try your new suggestion.

[Screenshot attached: gallery.png]

Meneghino
Level 10

Re: 500 item limit in CDM entity search filter(need to switch to asp.net app if this exists for long

Hi @mr-dang

Good solution, I will try it.

Do you know whether RecordId is always sequential in the absence of record deletions (i.e. whether records start from a certain number and then increase by one)?

I imagine that if a record is deleted then its Id will disappear. If a source has many deletions, then with each Collect you could be pulling in significantly fewer than 500 records, and this may affect performance; but I agree that it is much simpler than the nested variant I proposed earlier.
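
Once the cache is built, a rough way to quantify how sparse the Ids are (just a sketch, reusing the firstrecord/lastrecord variables from the formulas above and running against the local collection, so delegation is not a concern):

// Fraction of the RecordId span that actually contains records
UpdateContext({fillratio: CountRows(datasource_temp) / (lastrecord.RecordId - firstrecord.RecordId + 1)});

A low fillratio would mean that many of the 500-wide blocks are mostly gaps.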

In any case, I think we are providing work-arounds for something that should be resolved in PowerApps itself: the ability to cache large data sources with one simple command.

@Anonymous proposed this idea for caching, and I supported it.

 

PS: Does anyone know why the CDS RecordId field works the way it does? What determines the starting number?

mr-dang
Level 10

Re: 500 item limit in CDM entity search filter(need to switch to asp.net app if this exists for long

@Meneghino, you are correct--RecordId and PrimaryId both keep counting, even past deleted records. Also, when you pull in a range of 500, you may not get 500 records if some in between have been removed. In one of my entities, I had removed the first 3500 sequential records, so my old formula for caching the data had to skip 7 loops; otherwise, PowerApps would spend time querying the entity and pulling in 0 records, which is not pleasing to the end user.

 

I am not sure how PowerApps determines the number at which RecordId begins. It's a huge number, and I'm not sure what being a "Big Integer" entails about its properties either.

 

I agree that a single command would be the best option for those who need to pull in everything at once. I'm surprised there are not more Kudos on the idea, since there's a new thread about the 500 record limitation every day 🙂

Microsoft Employee
@8bitclassroom
