500 item limit in CDM entity search filter (need to switch to ASP.NET app if this exists for long)

500 item limit in CDM entity search filter; this makes it very difficult to use for any business scenario (export, data analysis).

Because I have 50k records and the search filter may sometimes return 5k or 20k, I need to analyze this data (so, export it).

Currently it only exports the first 500 items, which does not meet any business criteria (imagine you are doing a Google search and it returns only 3 items). Sadly, if this is a permanent issue like the SharePoint list 5k limit, I will have to inform our project sponsors, and since it does not meet the business need to filter and export, we will most likely have to build the ASP.NET app we did not want to build.

I will at least need some good workaround. One thing I observed is that there is an export data link in the CDM screen (can you give me some workaround based on that?).

mr-dang
Level 10

Re: 500 item limit in CDM entity search filter (need to switch to ASP.NET app if this exists for long)

I have a temporary solution until delegation works. It is impractical and can introduce flaws when writing data. It is also inefficient in some parts, yet has benefits in others. But if you need those records, this works.

 

Big idea:

  • Import the entire Entity to a temporary collection.
  • Read from the collection instead of the original Entity.
  • If you need to write any data back to the datasource, write it to the temporary collection too.
  • If an entry already exists, look up the original record to modify.

I use @hpkeong's model of a timer. The timer imports 500 records at a time (the limit in PowerApps). To achieve this, every entity must have a field identifying "which 500" each record belongs to (e.g. record 766 belongs to the second block of 500 records). I just call my field "n" and set its type to "number."
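A minimal sketch of how n can be derived, assuming a numeric PrimaryId field (this is the same formula used in the Writing section below; "record" is just a placeholder here):

// e.g. record 766: RoundDown(766/500, 0) + 1 = 2, the second block
RoundDown(Value(record.PrimaryId) / 500, 0) + 1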

 

Import

 

Create a button that activates a timer which will import the next 500 records each time it ends/repeats.

 

 

Button1.OnSelect:
// Read the highest n from the newest record (Max(datasource,n) is not delegable)
UpdateContext({maxn: First(Sort(datasource, PrimaryId, Descending)).n});
// Reset the temporary collection and start the import loop
Clear(tempdata);
UpdateContext({import: true, iter: 0})

 

Set the Timer1.OnTimerEnd to:

 

 

If(import,
	If(iter<maxn,
		// Advance to the next block and pull its 500 records
		UpdateContext({iter: iter+1});
		Collect(tempdata,
			Filter(datasource, n=iter)
		),

		// All blocks imported: stop the timer
		UpdateContext({import: false})
	)
)

 

Then set the Timer's properties:

 

Timer1.Duration: 1
Timer1.Start: import
Timer1.Repeat: import
Timer1.AutoStart: import

 

How does this work?

The variable "maxn" is determined ahead of time--it looks for the a value that describes how many sets of 500 you have. So if you have 10,000 records, and you correctly programmed the way your n field is written, then you would expect maxn to be 20. Unfortunately you can't simply use Max(datasource,n) since delegation is not supported.

 

Click Button1 to begin importing. The "import" variable activates the timer properties. As long as import is true, the timer checks how many iterations of importing by n you have done so far. Since iter starts at 0, which is less than maxn (the expected number of groups of 500 you want to import), the timer imports the nth group of 500. This repeats until iter equals maxn, at which point it deactivates the timer.

 

Flaw 1: if the last record in the datasource has a blank n, then nothing will load. This can easily happen if the app times out your session while you are in the middle of writing.

 

I previously had the Timer check whether the next n existed, but that was very slow. Foolproof, but slow:

 

If(import,
	// Only continue if the next block actually has records
	If(!IsEmpty(Filter(datasource, n=iter+1)),
		UpdateContext({iter: iter+1});
		Collect(tempdata,
			Filter(datasource, n=iter)
		),

		UpdateContext({import: false})
	)
)

 

Flaw 2: The problem with the method above is that there was a version of PowerApps in which filtering the datasource by "n=iter+1" hit service limitations: iter+1 was too complicated for the formula, so I had to map iter to Timer1.Text:

 

Timer1.Text: iter

Timer1.OnTimerEnd:
If(import,
	If(!IsEmpty(Filter(datasource, n=Timer1.Text+1)),
		UpdateContext({iter: iter+1});
		Collect(tempdata,
			Filter(datasource, n=Timer1.Text)
		),

		UpdateContext({import: false})
	)
)

 

Writing

You need to do the writing in 3 steps: 

  1. Write a new record normally, but copy the new record to a variable.
  2. Take the new record and calculate a correct value for which set of 500 it belongs to. 
  3. Write the same record to the temporary collection that you are using instead of the original datasource.

 

// Step 1: write the new record and keep a copy in a variable
UpdateContext({temprecord:
	Patch(datasource, Defaults(datasource),
		{field1: data,
			field2: data
		}
	)
});

// Step 2: now that the server has assigned a PrimaryId, calculate n
UpdateContext({temprecord:
	Patch(datasource, First(Filter(datasource, PrimaryId=temprecord.PrimaryId)),
		{n: RoundDown(Value(temprecord.PrimaryId)/500, 0)+1
		}
	)
});

// Step 3: mirror the record in the temporary collection
Collect(tempdata, temprecord)

 

This gets more complicated if you want to update an existing record, or to check that one exists. Either way, the n value (which set of 500 the record belongs to) can only be calculated once the record's unique PrimaryId has been assigned, since n is based on that value. This explains why you need to Patch twice; a sketch of the update case follows.
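For updating an existing record, a minimal sketch along the same lines ("selected" and "data" are hypothetical placeholders for the record being edited and the new value; UpdateIf keeps the temporary collection in sync):

// Patch the original datasource first
UpdateContext({temprecord:
	Patch(datasource, First(Filter(datasource, PrimaryId=selected.PrimaryId)),
		{field1: data}
	)
});
// Then mirror the change in the temporary collection
UpdateIf(tempdata, PrimaryId=temprecord.PrimaryId, {field1: temprecord.field1})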

 

I originally collected the temprecord as it was written, but I came across some writing errors. I've listed it below for reference, but YMMV:

 

UpdateContext({temprecord:
	Patch(datasource, Defaults(datasource),
		{field1: data,
			field2: data
		}
	)
});

Collect(tempdata,
	Patch(datasource, First(Filter(datasource, PrimaryId=temprecord.PrimaryId)),
		{n: RoundDown(Value(temprecord.PrimaryId)/500, 0)+1
		}
	)
)

 

Flaw 3: Writing needs to access the datasource twice. This could be instant sometimes, but it can also be very slow. 

 

Flaw 4: as mentioned before, if the app closes in the middle of writing, then the n value might not be written. This could break your importing unless you use the slower importing method (see Flaws 1 and 2 of Importing).

 

You could arguably calculate all the n values later using UpdateIf or ForAll, but then you would need to figure out a new way of importing when n has not yet been calculated; a sketch follows.
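A minimal sketch of that back-fill with UpdateIf (note that UpdateIf is not delegable either, so it only reaches the records the app can see):

// Fill in n wherever it is blank, using the same block formula as above
UpdateIf(datasource, IsBlank(n),
	{n: RoundDown(Value(PrimaryId)/500, 0)+1}
)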

 

Final Thoughts

This method is only as good as your ability to keep data current. If you have multiple users accessing the same data, or even just multiple instances in the web player, I do not yet have a reasonable way to keep the data in sync. In my heaviest entity, reloading about 20 iterations of 500 could take 2 minutes.

 

The ability to open multiple instances is a blessing; I would not trade it for anything. However, it does open the possibility of data inaccuracy if you use this method: other users might make changes to the original datasource, and you would never know it.

 

The ForAll function came out recently, but I have not had much time to play with it yet. I imagine you could work out a way to import or write multiple records with it; a rough sketch is below.
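A hedged sketch of what a ForAll import might look like, assuming the newer Sequence function and As operator are available (ForAll offers no guaranteed order, but order does not matter when collecting blocks):

// Import every block 1..maxn in one formula instead of a timer loop
Clear(tempdata);
ForAll(Sequence(maxn) As block,
	Collect(tempdata, Filter(datasource, n=block.Value))
)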

Microsoft Employee
@8bitclassroom
Community Support Team

Re: 500 item limit in CDM entity search filter (need to switch to ASP.NET app if this exists for long)

Hi AmitLoh-Powerap,

 

When I create an app based on a custom entity that has more than 10,000 items, only 500 items are shown in the app, even with no Filter/Search function applied.

 

There is an article about "Import or export data from the Common Data Service"; is this "export" the same as the "export data link in CDM screen" you mentioned?
https://powerapps.microsoft.com/en-us/tutorials/data-platform-export-data/


Best regards,
Mabel Mao


Re: 500 item limit in CDM entity search filter (need to switch to ASP.NET app if this exists for long)

thanks all!

 

I just need export for now. I will try the timer example, but it looks a little complicated.
https://powerapps.microsoft.com/en-us/tutorials/data-platform-export-data/
This export data link in the CDM screen only works when you have less data. If I have 100k records in an entity, it does not export anything. There is a bug in this export.

 

Re: 500 item limit in CDM entity search filter (need to switch to ASP.NET app if this exists for long)

Hi!
I was able to overcome the 500 item limit without using an N column and without modifying any existing schema!!
Thanks mr-dang and hpkeong for your suggestions. I used a timer to do this with ForAll, without adding any additional N column.

mr-dang
Level 10

Re: 500 item limit in CDM entity search filter (need to switch to ASP.NET app if this exists for long)

@AmitLoh-Powerap, can you share your solution with a timer and ForAll?

Microsoft Employee
@8bitclassroom

Re: 500 item limit in CDM entity search filter (need to switch to ASP.NET app if this exists for long)

I will update over the weekend; it's kind of slow but works. It could be fast, but I was not able to update a variable with ForAll (ForAll does not allow updating a variable).

mr-dang
Level 10

Re: 500 item limit in CDM entity search filter (need to switch to ASP.NET app if this exists for long)

I have a workaround for ForAll not being able to update a variable, but it doesn't always make sense, since ForAll does not necessarily run in the order you may want.

 

If you want a Count of things that are finished by ForAll, then you can use:

 

ForAll(datasource,
	[your other actions go here];
	// Increment the counter record in the universalvariables collection
	Patch(universalvariables, First(universalvariables),
		{var: First(universalvariables).var+1
		}
	)
)

 

Then you can reference the variable as First(universalvariables).var.
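For this to work, universalvariables needs a starting record. A minimal sketch of the one-time setup (the placement in Screen.OnVisible is an assumption):

// Hypothetical setup: a single record holding the counter
ClearCollect(universalvariables, {var: 0})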

 

Or you can use Collect and rely on CountRows instead. This requires you to clear the count first:

Clear(countcompleted);

ForAll(datasource,
	[your other actions go here];
	// Add one row per completed iteration
	Collect(countcompleted,
		{Value: 1
		}
	)
)

In this case, you would reference this count as CountRows(countcompleted). ClearCollect does not work.
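As a usage sketch (Label1 is a hypothetical control), the progress can be surfaced anywhere in the app:

// Shows how many ForAll iterations have completed so far
Label1.Text: "Completed: " & CountRows(countcompleted)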

This is only useful for getting a count of items done by ForAll. I do not yet have a way to update a "variable" for Text or Boolean.

 

Microsoft Employee
@8bitclassroom

Re: 500 item limit in CDM entity search filter (need to switch to ASP.NET app if this exists for long)

Hi Dang,

 

Please see my completely different solution!! This works, but I need your help!

Please see the Flow below.

The Flow has the steps below. For now it is based on Recurrence, for testing:

  • Recurrence
  • Get the number of items you want to iterate over. I created a count CDM table with 20 (500*20, so you get 10k items fast!)
  • Apply to each
  • Inside it, get the actual entity you want to export (this will run 20 times). Important: add a filter with RecordID ge lastcountcdmtable
  • In the same loop, use '@last(@{outputs('Compose')})' to get the last (500th) record. Right now I am trying "@parameters('recordid')" (returns the 500th/last id as a string) to get the recordid
  • Once you get the last recordid, store it in the lastcount CDM table

 

This gives you 20 loops of 500 records each, but now I need help with the compose step (previous compose + compose, 20 times) and then converting the result to CSV. I noticed this process is way faster than the timer-based loop.

(Screenshot: 500.png)

 

For getting the last record, refer to this thread:

https://powerusers.microsoft.com/t5/Flow-Forum/how-to-get-last-row-from-odata-filter-result/m-p/2107...

 

 

Re: 500 item limit in CDM entity search filter (need to switch to ASP.NET app if this exists for long)

My other solution, based on a timer in PowerApps, works but is super slow: 2 minutes to export 5k records, while this one takes around 30 seconds to get 10k! I need 2 final things: compose all twenty 500-record sets into a single entity, and convert to CSV using either JSON or a custom API Azure Function, and we are done!!
