Need some help.
I have been looking at some posts here, but none offers a concrete solution.
Here's my problem:
I installed the On-premises Data Gateway to query an SQL database. Within PowerApps I can connect to the database and read the data, but I cannot load it into a CDS entity. The table has 105k rows.
I keep hitting a wall with this error: "code":"0x80072322","message":"Number of requests exceeded the limit of 6000 over time window of 300 seconds."
However, the PowerApps documentation says this operation supports up to 500k rows. Training videos, documentation, etc. show people loading huge amounts of data, yet I can't get past 12k.
(https://docs.microsoft.com/en-us/power-platform/admin/data-integrator and https://docs.microsoft.com/fr-fr/powerapps/maker/common-data-service/data-platform-cds-newentity-pq)
I have a Microsoft PowerApps Plan 2 license.
Hi @Gestis_Marcelo ,
We limit the number of API requests made by each user, per organization instance, within a five minute sliding window. Additionally, we limit the number of concurrent requests that may come in at one time. When one of these limits is exceeded, an exception will be thrown by the platform.
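To make the sliding-window behavior described above concrete, here is a minimal sketch of how such a limiter works. This is not the platform's actual implementation, just an illustration; the class name and the numbers (taken from the error message: 6000 requests per 300 seconds) are assumptions:

```python
import time
from collections import deque

class SlidingWindowLimiter:
    """Illustrative sliding-window rate limiter: allows at most
    `max_requests` calls within any `window_seconds` span."""

    def __init__(self, max_requests=6000, window_seconds=300):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.timestamps = deque()  # times of recent accepted requests

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        # Drop timestamps that have fallen out of the window
        while self.timestamps and now - self.timestamps[0] >= self.window_seconds:
            self.timestamps.popleft()
        if len(self.timestamps) < self.max_requests:
            self.timestamps.append(now)
            return True
        return False  # over the limit -> the platform throws an exception here
```

Once the window slides past the oldest requests, capacity frees up again, which is why pacing an import over time avoids the error.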
So I think the problem you are seeing is caused by this limit.
You may want to read the documentation about API limits in PowerApps:
I'd recommend splitting your imports into smaller batches.
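One way to batch is to pace the import so it never exceeds the per-window request budget. This is only a sketch: `load_batch` is a hypothetical callable standing in for whatever pushes one batch of rows through your connector, and the batch size is an arbitrary example value.

```python
import time

REQUESTS_PER_WINDOW = 6000   # from the error message
WINDOW_SECONDS = 300
BATCH_SIZE = 1000            # hypothetical: rows per import batch

def import_in_batches(rows, load_batch):
    """Split `rows` into batches and pace them to stay under the
    per-user API limit. `load_batch` is a hypothetical callable that
    pushes one batch and returns the number of API calls it made."""
    calls_in_window = 0
    window_start = time.monotonic()
    for i in range(0, len(rows), BATCH_SIZE):
        batch = rows[i:i + BATCH_SIZE]
        now = time.monotonic()
        if now - window_start >= WINDOW_SECONDS:
            # A fresh window has started; reset the counter
            window_start, calls_in_window = now, 0
        if calls_in_window >= REQUESTS_PER_WINDOW:
            # Budget exhausted: wait out the rest of the window
            time.sleep(WINDOW_SECONDS - (now - window_start))
            window_start, calls_in_window = time.monotonic(), 0
        calls_in_window += load_batch(batch)
```

The key idea is tracking how many calls you have spent in the current window and sleeping before you cross the threshold, rather than letting the platform reject you.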
There are also some tips on how to deal with your application exceeding the limit:
This is a problem I've come to recently while investigating using CDS for our company as we look to update the Power Platform and Dynamics.
I just find it shocking that Microsoft's own dataflow tool, pulling data from its own data gateway (and its own SQL Server), simply crashes out when it hits these limits.
The right solution would be to slow it down. In many cases users will want to run overnight updates/refreshes of data. Failing the entire process because it hits the limit too fast, without giving the user any option to reduce the speed or pay for higher flow rates, is simply a painful experience to work with.
Setting up dataflows is a long-winded task as it is; hitting these problems on top of that makes the entire platform something I'm beginning to dislike. Not to mention the annoying bug in dataflows where date columns are randomly treated as 01-01-0001 values for no valid reason, even when the dataset contains no such date!
Please upvote my idea to remove the throttling limits in Dataflows.