New Member

Load SQL data into CDS using Data Gateway - API limits problem

Hello Powers!!

Need some help.


I have been looking at some posts here, but none offer a concrete solution.


Here is my problem:


I installed the On-Premises Data Gateway to query a SQL database. Within PowerApps I can connect to the database and read the data, but I cannot load that data into a CDS entity. The table has 105k rows.


I keep hitting a wall with this error: "code":"0x80072322","message":"Number of requests exceeded the limit of 6000 over time window of 300 seconds."


However, the PowerApps documentation says this operation supports up to 500k rows. Training videos, documentation, etc. show people loading huge amounts of data, and I can't get past 12k.




I have a Microsoft PowerApps Plan 2 license.

Community Support

Hi @Gestis_Marcelo ,

We limit the number of API requests made by each user, per organization instance, within a five minute sliding window. Additionally, we limit the number of concurrent requests that may come in at one time. When one of these limits is exceeded, an exception will be thrown by the platform.
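The sliding window described above can be modeled client-side to avoid hitting the platform limit in the first place. Here is a minimal sketch in Python; the figures 6000 and 300 come from the error message in this thread, while the `SlidingWindowLimiter` class itself is an illustrative assumption, not part of any Microsoft SDK:

```python
import collections
import time

class SlidingWindowLimiter:
    """Client-side throttle modeled on the service's sliding window:
    at most `max_requests` calls within any `window_seconds` span."""

    def __init__(self, max_requests=6000, window_seconds=300):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.timestamps = collections.deque()  # times of recent requests

    def acquire(self, now=None):
        """Sleep until a request is allowed, then record it.
        `now` can be injected for testing; defaults to the monotonic clock."""
        now = time.monotonic() if now is None else now
        # Discard timestamps that have fallen out of the window.
        while self.timestamps and now - self.timestamps[0] >= self.window_seconds:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_requests:
            # Wait until the oldest request ages out of the window.
            time.sleep(self.window_seconds - (now - self.timestamps[0]))
            now = time.monotonic()
            while self.timestamps and now - self.timestamps[0] >= self.window_seconds:
                self.timestamps.popleft()
        self.timestamps.append(now)
```

Calling `acquire()` before each API request keeps the client under the limit instead of relying on the platform's exception.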

So I think the problem you are hitting is caused by this limit.

You may want to read the documentation about API limits in PowerApps.

I'd recommend splitting your imports into smaller batches.
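Splitting the import into batches could look like the following sketch in Python. The `load_batch` callback and the batch size and pause values are assumptions for illustration; `load_batch` stands in for whatever actually writes a chunk of rows to CDS:

```python
import time

def import_in_batches(rows, load_batch, batch_size=500, pause_seconds=30):
    """Split `rows` into chunks of `batch_size`, call `load_batch` on each,
    and pause between chunks so the request rate stays well under the
    service's sliding-window limit."""
    for start in range(0, len(rows), batch_size):
        load_batch(rows[start:start + batch_size])
        if start + batch_size < len(rows):  # no pause after the last chunk
            time.sleep(pause_seconds)
```

For a 105k-row table this turns one burst of writes into a steady stream of small batches spread over time.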

The documentation also has some tips on how to handle the case where your application exceeds the limit.
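One common way to handle a throttled call is to retry it with exponential backoff and jitter. A minimal sketch in Python; `ThrottledError` is a placeholder for however your client surfaces the platform's "number of requests exceeded" response, not a real SDK type:

```python
import random
import time

class ThrottledError(Exception):
    """Placeholder for the platform's 'number of requests exceeded' error."""

def call_with_backoff(request, max_attempts=5, base_delay=2.0):
    """Call `request()`; on a ThrottledError, wait an exponentially growing
    delay (plus jitter) and retry, up to `max_attempts` attempts."""
    for attempt in range(max_attempts):
        try:
            return request()
        except ThrottledError:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            delay = base_delay * (2 ** attempt) + random.random() * base_delay
            time.sleep(delay)
```

The jitter spreads retries out so many clients throttled at once do not all retry at the same instant.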




Best regards,

Community Support Team _ Phoebe Liu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

This is a problem I've run into recently while investigating CDS for our company as we look to adopt the Power Platform and Dynamics.


I just find it shocking that Microsoft's own dataflow tool, taking data from its own data gateway (and its own SQL Server), simply crashes out when it hits these limits.

The right solution would be to slow it down. In many cases users will be looking to do overnight updates/refreshes of data. Failing the entire process because it runs too fast, without giving the user the option to reduce the speed or pay for higher flow rates, is simply a painful experience to work with.


Setting up dataflows is a long-winded task as it is; hitting these problems on top of that makes the entire platform something I'm beginning to dislike. Not to mention the annoying bug in dataflows where date columns are randomly treated as 01-01-0001 values for no valid reason, even when the dataset contains no such date!

Continued Contributor
