Dataflows are a great addition for loading data into CDS and Azure Data Lake. However, there is a limit of 6000 API requests per 300 seconds, and users get the error below when they exceed it.
Number of requests exceeded the limit of 6000, measured over time window of 300 seconds.
The above limit may sound reasonable for third-party API integrations, but it is not suitable for dataflows, which deal with importing large chunks of data. Since dataflows are an inherent feature of the product, they should not be subject to throttling limits in the first place. And because there is no source code available to tweak, we can neither handle the throttling exceptions nor delay execution to stay within the API limits.
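To illustrate the kind of pacing that is impossible with dataflows: when calling an API directly from custom code, one can track requests in a sliding window and wait before exceeding the quota. The sketch below is a hypothetical illustration, not part of the dataflows product; the class name and parameters are my own, and the 6000/300 defaults mirror the limit quoted above.

```python
import time
from collections import deque


class SlidingWindowLimiter:
    """Allow at most `max_requests` calls per `window_seconds` (e.g. 6000 per 300 s)."""

    def __init__(self, max_requests=6000, window_seconds=300):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.timestamps = deque()  # monotonic times of recent calls

    def wait_time(self, now=None):
        """Seconds to wait before the next call is allowed (0.0 if allowed now)."""
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window.
        while self.timestamps and now - self.timestamps[0] >= self.window_seconds:
            self.timestamps.popleft()
        if len(self.timestamps) < self.max_requests:
            return 0.0
        # The oldest call must age out of the window before the next is allowed.
        return self.window_seconds - (now - self.timestamps[0])

    def acquire(self):
        """Block until a call slot is free, then record the call."""
        delay = self.wait_time()
        if delay > 0:
            time.sleep(delay)
        self.timestamps.append(time.monotonic())
```

A caller would invoke `acquire()` before each request; since dataflows expose no such extension point, the throttling errors simply surface to the user instead.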
Please upvote, and let's see what Microsoft has to say about it.