JonPowerApps
Frequent Visitor

Application user account entitlement limits

Hi -

 

We have a logic app pulling data nightly and pushing about 100K records into Dataverse (currently with interactive user accounts). 

 

We are hitting limits. If we switch to application accounts, what are the Dataverse limits? The service protection limits make sense, but we are unclear whether entitlement/request limits also apply to application accounts.

 

If there are limits to the application accounts, is there any way to increase them through licensing?

 

Might dataflows help us overcome these limits too?

 

Best,

Jon


9 REPLIES
dpoggemann
Super User

Hi @JonPowerApps,

 

What limit are you actually hitting? This article reviews the throughput limits for Logic Apps at this bookmark: https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-limits-and-config#throughput-limits

 

If you enable "High Throughput Mode" you should be able to increase from the default 100K limit to the 300K limit. The article discusses how to accomplish this; it is a "preview" feature from what I understand.

 

Hope this helps. Please mark as accepted if it answers your question, or Like if it provided some assistance.


Thanks,


Drew

ChrisPiasecki
Super User

Hi @JonPowerApps,

 

The request limits for application users are documented here. It's a base amount at the tenant level + more based on how many licenses you have (similar to how storage capacity is determined). This will be higher than the API limits for regular users, which are limited based on the type of license they have. 

 

For ingesting very large volumes of data, Logic Apps and Power Automate are not the best choices, as they don't batch the requests. You would either need to manually construct the HTTP requests to send to the Dataverse Web API in batches, or put this logic into an Azure Function and call the function from your flow.
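
To give a sense of what "manually construct the HTTP requests" involves, here is a rough sketch of building and sending a Dataverse Web API $batch payload (Python with the requests library; the org URL, token, the "accounts" table, the sample records, and the batch size of 100 are placeholders you would swap for your own environment):

# Rough sketch, not production code: build one $batch payload containing a single
# change set of create requests, then POST it to the Dataverse Web API.
import json
import uuid
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"       # placeholder environment URL
API = f"{ORG_URL}/api/data/v9.2"
TOKEN = "<access token for your application user>"  # placeholder; acquired via your app registration

def build_batch(records, table="accounts"):
    """Return (batch boundary, multipart body) for one change set of creates."""
    batch_id = f"batch_{uuid.uuid4()}"
    changeset_id = f"changeset_{uuid.uuid4()}"
    lines = [f"--{batch_id}",
             f"Content-Type: multipart/mixed; boundary={changeset_id}",
             ""]
    for i, record in enumerate(records, start=1):
        lines += [f"--{changeset_id}",
                  "Content-Type: application/http",
                  "Content-Transfer-Encoding: binary",
                  f"Content-ID: {i}",
                  "",
                  f"POST {API}/{table} HTTP/1.1",
                  "Content-Type: application/json",
                  "",
                  json.dumps(record)]
    lines += [f"--{changeset_id}--", f"--{batch_id}--", ""]
    return batch_id, "\r\n".join(lines)

def send_in_batches(records, chunk_size=100):
    """Send the records as a series of $batch calls, chunk_size creates per call."""
    for start in range(0, len(records), chunk_size):
        batch_id, body = build_batch(records[start:start + chunk_size])
        resp = requests.post(
            f"{API}/$batch",
            headers={"Authorization": f"Bearer {TOKEN}",
                     "Content-Type": f"multipart/mixed; boundary={batch_id}",
                     "OData-Version": "4.0",
                     "Accept": "application/json"},
            data=body)
        resp.raise_for_status()

# Example usage with placeholder data: 1,000 creates become 10 HTTP requests of 100 each.
send_in_batches([{"name": f"Sample account {n}"} for n in range(1000)])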

 

Instead, you may want to look at Azure Data Factory or SQL Server Integration Services, or a third-party ETL tool that supports out-of-the-box batching for operations, such as KingswaySoft.

 

More information on API protection limits:

https://docs.microsoft.com/en-us/powerapps/developer/data-platform/api-limits

 

---
Please click Accept as Solution if my post answered your question. This will help others find solutions to similar questions. If you like my post and/or find it helpful, please consider giving it a Thumbs Up.

 

Thank you, Drew. 

 

We are not hitting limits on Logic Apps. We are concerned with limits on CDS.

dpoggemann
Super User

Hi @JonPowerApps,

 

I don't know of any limit you would be hitting within Dataverse except maybe storage. You can purchase more storage space in Dataverse if that is what you are looking for.

 

Thanks,


Drew

Hope this helps. Please accept if it answers your question, or Like if it helps in any way.

Hi Chris,

 

Great info. A few points I'm still trying to figure out.

Batch vs single requests - the docs suggest single requests but will that kill our usage?

"You would either need to manually construct the HTTP requests to send to the Dataverse web api in batches, or put this logic into an Azure function and call the function in your flow." - Would these batches overcome API limits?

 

I had seen a "batch" operation on the Power Automate roadmap at one point. However, when looking at the Microsoft data integration docs, I found this blurb that says "avoid batching":

 

https://docs.microsoft.com/en-us/powerapps/developer/data-platform/api-limits#avoid-batching

 

Licensing - can we pay for more?

Also, it is unclear from this link if we can get more than the 100K requests. The article does mention "After base request capacity is exhausted, customers can increase this capacity by purchasing a Power Apps and Power Automate capacity add-on." However, I have tried to use that license in the past and although you can purchase it, I do not see any place we can apply this license once purchased.

 

There is also an open issue on GitHub to clarify the entitlement limits: https://github.com/MicrosoftDocs/powerapps-docs/issues/1930

 

Any thoughts?

Jon

Hi @JonPowerApps,

 

In general, you should avoid batching in your apps, as it can lead to a poor experience for users. Data integration / ETL is a different story: batching is recommended there to optimize throughput and help you stay within limits. Batching will overcome request limits because, instead of sending, for example, 1,000 single HTTP requests to create 1,000 records, you could send 10 HTTP requests of 100 record creations each. That alleviates request limits, but of course there are still CPU cycles and other processing time where limits could come into play, depending on how intensive the batched operations are. Usually simple create actions aren't too bad, depending on the number of columns, lookups, etc.
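
To put rough numbers on that against the ~100K records per night from the original post (the batch size of 100 is just an assumption; tune it for your payload sizes):

import math

nightly_records = 100_000   # nightly load mentioned in the original post
batch_size = 100            # creates per $batch call (assumption)

single_calls = nightly_records                           # one HTTP request per record
batched_calls = math.ceil(nightly_records / batch_size)  # one HTTP request per batch
print(single_calls, batched_calls)                       # 100000 vs 1000 requests per night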

 

If you purchase add-on capacity, I don't think you would assign it to a resource, as the 100K requests are at the TENANT level, so you have to take care not to hog all the capacity for your entire organization.

 

---
Please click Accept as Solution if my post answered your question. This will help others find solutions to similar questions. If you like my post and/or find it helpful, please consider giving it a Thumbs Up. 

Good info, Chris. We are doing ETL. Pulling thousands of records from remote APIs and dumping them into CDS.

 

I'm OK with stitching in an Azure Function to do the batch update as the last step. However, do you know if dataflows would also be a good choice? We can dump a batch of insert-only records, ready to go, into an Azure Blob. If we had a dataflow pick those files up from Azure Blob Storage, would dataflows bulk insert into CDS?

 

Best,

Jon

 

Hi @JonPowerApps,

 

I'm not overly experienced with Dataflows, but from what I have seen from others it seems to fall over at larger data volumes, so for 100K+ records it might not be the best choice. That said, it's always worth investigating for yourself; the platform is always evolving and improvements are made often, so it may be better now.

 

---
Please click Accept as Solution if my post answered your question. This will help others find solutions to similar questions. If you like my post and/or find it helpful, please consider giving it a Thumbs Up.

I also wouldn't recommend dataflows for large volumes of data (100K+) and complex data models. Although Microsoft is constantly making performance improvements, they are quite slow...

Instead of going the custom route with Functions / ADF / SSIS, why not purchase more "Power Apps and Power Automate capacity add-ons" if performance is not a big deal? Even though that means more licensing cost, it will probably cost less once you factor in build time (and maintenance) effort.

Cheers!

 
