Level: Power Up

Load SQL data into CDS using Data Gateway - API limits problem

Hello Powers!!

Need some help.


I have been looking at some posts here, but found none with a concrete solution.


Here's my problem:


I installed the On-premises Data Gateway to query a SQL database. Within PowerApps I am able to connect to the database and read the data, but I am not able to load it into a CDS entity. The table has 105k rows.


I keep hitting a wall with the error: "code":"0x80072322","message":"Number of requests exceeded the limit of 6000 over time window of 300 seconds."


However, the PowerApps documentation says this operation supports up to 500k rows. Training videos, documentation, etc. show people loading huge amounts of data, and I can't get past 12k.




I have a Microsoft PowerApps Plan 2 license.

Community Support Team

Re: Load SQL data into CDS using Data Gateway - API limits problem

Hi @Gestis_Marcelo ,

We limit the number of API requests made by each user, per organization instance, within a five minute sliding window. Additionally, we limit the number of concurrent requests that may come in at one time. When one of these limits is exceeded, an exception will be thrown by the platform.

So I think the problem you are running into is caused by this limit.
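To put the limit in perspective, here is a quick back-of-the-envelope sketch. The row count comes from your post; the other numbers come straight from the error message:

```python
# Quick arithmetic: with a hard cap of 6000 requests per 300-second
# sliding window, loading one record per request puts a floor on the
# total load time for a 105k-row table.

ROWS = 105_000        # rows in the source table (from the post above)
LIMIT = 6_000         # requests allowed per window
WINDOW_SECONDS = 300  # window length in seconds

min_windows = ROWS / LIMIT                        # 17.5 windows
min_minutes = min_windows * WINDOW_SECONDS / 60   # 87.5 minutes

print(f"At best, the load takes about {min_minutes:.1f} minutes.")
```

So even when nothing goes wrong, a one-request-per-row load of that table cannot finish in under roughly an hour and a half.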

You may want to read the documentation about API limits in PowerApps:

I'd recommend splitting your imports into smaller batches.
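As a sketch of what pacing the load on the client side could look like — this is hypothetical illustration only, and `create_record` is a stand-in for whatever call actually writes one row to the CDS entity, not a real API:

```python
import time

LIMIT_REQUESTS = 6_000   # allowed requests per window (from the error)
WINDOW_SECONDS = 300     # window length in seconds
SAFETY_FACTOR = 0.8      # only use 80% of the allowed rate, for headroom

# Minimum spacing between requests to stay safely under the limit.
MIN_INTERVAL = WINDOW_SECONDS / (LIMIT_REQUESTS * SAFETY_FACTOR)  # 0.0625 s

def load_rows(rows, create_record):
    """Write rows one at a time, pacing requests under the rate limit."""
    for row in rows:
        start = time.monotonic()
        create_record(row)  # hypothetical: writes one row to the entity
        elapsed = time.monotonic() - start
        if elapsed < MIN_INTERVAL:
            time.sleep(MIN_INTERVAL - elapsed)
```

The safety factor matters because the window is sliding: running at exactly the advertised rate leaves no room for retries or other requests made by the same user.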

There are also some tips on how to handle the case where your application exceeds the limit:
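One common tip is to retry throttled requests with exponential backoff rather than failing outright. A minimal sketch — `send` and `ThrottledError` are hypothetical stand-ins for the real request function and the real "limit exceeded" error:

```python
import random
import time

class ThrottledError(Exception):
    """Stand-in for whatever error signals 'request limit exceeded'."""

def send_with_backoff(send, payload, max_retries=5, base_delay=1.0):
    """Call send(payload), retrying with exponential backoff on throttling."""
    delay = base_delay
    for attempt in range(max_retries + 1):
        try:
            return send(payload)
        except ThrottledError:
            if attempt == max_retries:
                raise  # out of retries; let the caller see the error
            # Sleep, adding jitter so retries from many clients don't align.
            time.sleep(delay + random.uniform(0, delay / 2))
            delay *= 2
```

Doubling the delay after each throttled attempt lets the sliding window drain, and the random jitter avoids many clients retrying in lockstep.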




Best regards,

Community Support Team _ Phoebe Liu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Level: Powered On

Re: Load SQL data into CDS using Data Gateway - API limits problem

This is a problem I've run into recently while investigating CDS for our company, as we look at moving to the Power Platform and Dynamics.


I just find it shocking that Microsoft's own dataflow tool, taking data from its own data gateway (and its own SQL Server), just crashes out when it hits these limits.

The right solution would be to slow it down. In many cases users will be looking to do overnight updates/refreshes of data. Failing the entire process because it's running too fast, without giving the user the option to reduce the speed or pay for higher flow rates, is simply painful to work with.


Setting up dataflows is a long-winded task as it is, and hitting these problems on top of that makes the entire platform something I'm beginning to dislike. Not to mention the annoying dataflows bug where date columns are randomly treated as 01-01-0001 values for no valid reason, even when the dataset has no such date in it!
