
Remove Service Protection API throttling limits for Dataflows

Dataflows are a nice addition for loading data into CDS and Azure Data Lake. However, there is a limit of 6,000 requests per 300 seconds. Users get the error below when exceeding the limit.

 

Number of requests exceeded the limit of 6000, measured over time window of 300 seconds.

 

The above limit may sound reasonable for third-party API integrations, but it is not suitable for dataflows, which deal with importing large chunks of data. Since dataflows are an inherent feature of the product, there should not be any throttling limits on them in the first place. As there is no source code available to tweak, we can neither handle the throttling exceptions nor delay execution to stay within the API limits.
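For context, if we did control the importer's code, honoring the limit would be straightforward: pace requests to the documented rate and back off when throttled. A minimal sketch of that idea (the `send` callback and function names here are hypothetical illustrations, not a Dataverse API; a real client would read the `Retry-After` header from a 429 response):

```python
import time

# Service Protection limit reported by the error: 6000 requests per
# 300-second sliding window. A client that controls its own requests
# could pace itself; dataflows don't expose this, which is the point
# of this idea.
LIMIT = 6000
WINDOW_SECONDS = 300

def required_delay(requests_in_window: int) -> float:
    """Seconds to wait before the next request to stay under the limit.

    At the cap, the minimum safe average spacing is
    WINDOW_SECONDS / LIMIT = 0.05 s per request.
    """
    if requests_in_window < LIMIT:
        return 0.0
    return WINDOW_SECONDS / LIMIT

def send_with_retry(send, max_retries: int = 5):
    """Call `send()` and honor a throttling response by waiting.

    `send` is assumed to return (status_code, retry_after_seconds);
    a real client would issue the HTTP request and read Retry-After
    from the 429 response headers.
    """
    status = None
    for _ in range(max_retries):
        status, retry_after = send()
        if status != 429:
            return status
        time.sleep(retry_after)  # back off as instructed by the service
    return status
```

This is exactly the kind of pacing the dataflow engine itself would need to apply internally, since end users cannot inject it.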

 

Please upvote and let's see what Microsoft has to say about it.

Status: New
Comments
New Member

I completely agree. I just ran into the same API limitations with dataflows. I recommended that my customer use this for importing a large amount of data from Excel (1,000,000+ records). We can live with the limitation that we have to divide the import into several files, since we can only import 500,000 records per import. It's a silly limitation, but OK, we can live with it. What we didn't know was that these API limitations actually prevent us from using the tool at all, since we have no control over the number of API calls per 5 minutes. I just tried importing a file containing 100,000 accounts. It failed after an hour with this error message. So now I don't trust dataflows for importing data at all. 😞

 

I'm so tired of constantly running into these kinds of problems with half-baked solutions from MS. Things seem perfect on the surface, but when you start using the technology in real customer scenarios, it really isn't usable at all.

 

Responsive Resident

@thomasrath Thanks for upvoting it. I can feel what you are going through. :) I believe Microsoft will provide some solution to this issue.

 

New Member

@Satish-Reddy I'm really frustrated. I just tried importing 300,000 records into the account entity in my production environment. The documentation gave me hope that it might succeed due to the greater resources of production instances. It's a very simple import with no lookup fields included. It also failed due to API limitations, after managing to import approximately 298,000 records. Dataflows are basically useless if they don't fix this.

 

When you say you believe MS will provide a solution, is that because you are in dialogue with them?

I'm considering creating a service request. A basic design flaw that renders the tool useless is, in my view, a bug.

Responsive Resident

@ThomasRath_DK I am not in touch with Microsoft. There is a chance that Microsoft could do something about it, as it sounds like a legitimate request. I advise you to raise a service ticket.

 

I hope you are aware of the classic data import; you can try that for the time being.

New Member

I'm aware of the classic import tool. It's just not a very good tool for importing millions of records. You have to split the data into many files, and the error handling is just not very good. Also, you cannot update records. It's an almost 10-year-old tool that is basically outdated. That's why I turned to dataflows, which promised a much more modern tool. I suspect this is also the direction MS wants us to go, so that at some point they can kill the classic import tool.

 

I have created a service request with MS. Let's see if I can get them to fix it. It's a great tool, so it's a pity that something like this ruins it.

Advocate I

Having the same issue. I would be OK if my dataflow just slowed down to compensate, but finishing a dataflow after an hour of importing only to get an error about API throttling is really concerning. And it's next to impossible to figure out which rows didn't get imported.

New Member

A quick update on this issue. I created a Service Request and the product team has just told me that they are working on a solution to the API limitations. They also fixed another bug I reported that the status of the refresh was not shown properly. The fixes should be available within the next couple of days.

Responsive Resident

@ThomasRath_DK That's great news. Thanks for the update. I'm interested to know whether our feedback had any impact on their decision, or whether it was planned long before by the product team. Anyway, we are all happy to know that the issue is being resolved.

Microsoft

@Satish-Reddy The Ideas forum is actively monitored by product teams, and this forum and product team specifically by me 🙂

 

Indeed, a change enabling dataflow refresh to react to throttling responses from CDS is on its way.

 

In general, Ideas with higher votes get the most attention, so thank you for creating this one!

Ben