bondy_07
Frequent Visitor

Apply for each is very slow

Hello,

 

I am using an Apply to each loop to iterate through records, with some conditions inside the loop. My observation is that the flow gets stuck in the Apply to each and takes 30 minutes to loop through 150 records.

Any help with improving the performance of the flow is appreciated.


Hi @bondy_07, the amount of time the looping takes will depend on the actions contained within the loop. When I'm working on my Flows, I always try to limit the number of steps in each loop.

 

Another consideration is concurrency. By default the loop will process sequentially, but assuming you're using an Apply to each loop, you can change the concurrency to allow side-by-side iterations of the loop. To do this:

  1. Click on the ellipsis (...) in the top right corner of your loop
  2. Select Settings
  3. Switch the concurrency control on (toggle switch)
  4. Either leave the degree of parallelism as it is (default 20) or increase it as you see fit

Give that a go and see if that brings the run time down.
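For reference, this is roughly what that setting looks like if you open Peek code on the loop. This is a sketch, not your exact definition: the action names and the source of the items here are just examples.

  "Apply_to_each": {
    "type": "Foreach",
    "foreach": "@outputs('Get_items')?['body/value']",
    "actions": {
      "Compose": {
        "type": "Compose",
        "inputs": "@item()"
      }
    },
    "runtimeConfiguration": {
      "concurrency": {
        "repetitions": 50
      }
    }
  }

The "repetitions" value is the degree of parallelism you set with the slider; 50 is the maximum.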





Did my reply help? Please give it a thumbs up.
Did I answer your question? Please mark my post as a solution!

Proud to be a Flownaut!

Community Leader: Black Country PowerApps & Flow User Group

Hi @MattWeston365, this didn't help...

Hi @bondy_07, can you please post screenshots of your Flow? One that outlines the overall Flow, and one that shows the execution times for each of the steps, please?






Anonymous
Not applicable

I agree with this. I am inserting 13,000 records into a SharePoint list; the entire payload converted to JSON is 10 MB. With parallelism set to the maximum of 50, it takes 30 minutes to complete. That is not an acceptable amount of time. A couple of minutes, maybe.

DenisMolodtsov
Kudo Kingpin

I have the same issue. I'm testing flows while developing them. I need to iterate through a small array of 20 items that was already retrieved, and it takes more than 10 minutes to get through. I've never seen it this slow before.

 
Alan_Sanchez
Advocate I

I'm experiencing a similar speed issue. I have a loop with 8 actions contained within it that is meant to loop through several hundred CSV values and store them in an array on their way to JSON. It takes ~30 minutes sequentially and upwards of 1.25 hours at full concurrency. I understand the reasoning to keep the loop as simple as possible, but what stumps me is that concurrency is slower than sequential. That doesn't make much sense to me...

 

[Screenshots: flow editor view and run history]

bouillons
Helper IV

Can anybody explain the root cause of these delays? Seconds I can understand; minutes is just baffling.

Stephane

AIUYM19
Helper V

Yes, the unexplainable slowness of Power Automate flows is something that's rankled me as well.

While it's not a solution, per se, one way to mitigate the time it takes to complete an "apply to each" loop is to limit the number of iterations it needs to complete. I do this by always using a "filter" action immediately before the loop, and then feeding the loop with the result of that filter. You will often need to get creative with your filter conditions, but it can significantly reduce the amount of time it takes for your flow to complete.

Another trick I've found is to break up the loop into multiple loops (again, using the filter action to break up the data). If, for example, you use a condition inside your loop to do one thing to some data and another thing to the rest of the data, perform this data separation outside of the loop using a pair of filter actions, and then run two loops in parallel, each taking the result of one of the filter actions. This avoids the use of a conditional inside the loop, which can halve the time to complete.

Something like this...

[Screenshot: two filter actions feeding parallel loops, 30 April 2021]
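In flow-definition terms, the shape is roughly this. It's a sketch with made-up action and column names; the Filter array action is a Query action under the hood, and the two loops run in parallel because neither depends on the other.

  "Filter_active": {
    "type": "Query",
    "inputs": {
      "from": "@outputs('Get_items')?['body/value']",
      "where": "@equals(item()?['Status'], 'Active')"
    },
    "runAfter": { "Get_items": [ "Succeeded" ] }
  },
  "Filter_inactive": {
    "type": "Query",
    "inputs": {
      "from": "@outputs('Get_items')?['body/value']",
      "where": "@not(equals(item()?['Status'], 'Active'))"
    },
    "runAfter": { "Get_items": [ "Succeeded" ] }
  },
  "Loop_active": {
    "type": "Foreach",
    "foreach": "@body('Filter_active')",
    "actions": { "Compose_A": { "type": "Compose", "inputs": "@item()" } },
    "runAfter": { "Filter_active": [ "Succeeded" ] }
  },
  "Loop_inactive": {
    "type": "Foreach",
    "foreach": "@body('Filter_inactive')",
    "actions": { "Compose_B": { "type": "Compose", "inputs": "@item()" } },
    "runAfter": { "Filter_inactive": [ "Succeeded" ] }
  }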

 

bouillons
Helper IV

Whenever something is slow, it makes me wonder if, between certain steps, the workflow engine is doing transactional writes of data to persistent storage, in case it needs to restart due to some failure. At least, that's what I learned from my BizTalk years. There should be a way to tell the engine not to bother with that and just restart from the saved state before the apply to each.

Or it could be Microsoft being...well...Microsoft, and not bothering to implement anything correctly. >_>

Seriously, I've seen better code come out of an eight-year-old. Microsoft's dev teams ought to be ashamed of themselves.

vargasjo
Frequent Visitor

Today I'm getting random slowdowns on Apply to each, on a flow that was always working OK. I'm guessing right now it's something Microsoft related. To speed up the process I'm manually cancelling the stuck flows and resubmitting.

carudev
Helper II

Hi guys! 

 

I have a flow that makes a REST request to update items in a SharePoint list. I have about 5k items on the list and my flow is taking more than a day to execute: one day and 18 hours on average. Is there any way to decrease this time?

 

PS: I have already set the flow's parallelism to the maximum. 😕

Thank you all! 

Hi, @carudev!

When you say you've "put the flow parallelism at maximum," do you mean you've set the concurrency of an apply to each block to 50, or that you've created hundreds (whatever the maximum is) of parallel branches within the flow?

Since a REST call (I assume you're using the send HTTP request action here) can address only one item at a time, there's no way to speed up the action itself. However, you could try running more than one of them simultaneously. In your flow, use filters to break up your array of items into multiple sets: say, the first 100 items in one set, the next 100 in another, and so on. Then, in parallel branches, send each set through an apply to each action with the REST call inside. While the concurrency limit of each apply to each action will still keep things slow, you'll be running more than one of them at any time, which could speed things up. Remember to set the concurrency of the apply to each blocks to 50, and it's fine for all your branches to be identical and performing the same task; what we're trying to accomplish is tricking Power Automate into thinking it's doing different things with each branch.
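A quick sketch of the set-splitting expressions, assuming the items come from a Get items action (names illustrative). Each Compose feeds one parallel branch's apply to each:

  Set 1: @take(body('Get_items')?['value'], 100)
  Set 2: @take(skip(body('Get_items')?['value'], 100), 100)
  Set 3: @take(skip(body('Get_items')?['value'], 200), 100)

take() and skip() are standard expression functions, so this avoids having to write filter conditions against the data itself.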

For a diagram of how this might work, see my screenshot above from 30 April 2021.

Another issue might be how quickly SharePoint is responding, especially if you're sending it 5k+ REST requests.

jhfin
New Member

Hi @carudev,


I have also struggled with slow operations against SharePoint. Currently I believe batching might be the solution.

 

I have tested this with the Graph API using PowerShell. I have also written a blog post about it: https://jb1c.blog/2021/05/18/using-graphapi-for-exporting-importing-editing-gigantic-sharepoint-list...

 

Microsoft also has a nice tutorial on doing something similar with a flow: Create a Microsoft Graph JSON Batch Custom Connector for Power Automate - Microsoft Graph | Microsof...

 

That requires using a registered Azure AD application, but fortunately it is now possible to have one with reasonable permissions: https://developer.microsoft.com/en-us/graph/blogs/controlling-app-access-on-specific-sharepoint-site...

 

I'm currently planning to try this out with a flow, but building the batches from JSON needs some more design work. Anyway, my experience with PowerShell is promising!
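For anyone who hasn't seen it, a Graph JSON batch is a single POST to https://graph.microsoft.com/v1.0/$batch carrying up to 20 sub-requests in the body. A minimal sketch (the site and list IDs are placeholders):

  {
    "requests": [
      {
        "id": "1",
        "method": "POST",
        "url": "/sites/{site-id}/lists/{list-id}/items",
        "headers": { "Content-Type": "application/json" },
        "body": { "fields": { "Title": "Item 1" } }
      },
      {
        "id": "2",
        "method": "POST",
        "url": "/sites/{site-id}/lists/{list-id}/items",
        "headers": { "Content-Type": "application/json" },
        "body": { "fields": { "Title": "Item 2" } }
      }
    ]
  }

One round trip then creates every item in the batch instead of one HTTP action per item.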

 

Hope this helps!

Paulie78
Super User

Hi @carudev 

 

I have just done a blog post and video which cover using Power Automate to delete items from a SharePoint list using the Batch API. It is much faster than the standard method.

 

I am also going to do a separate video on creating and updating items, but the concept will be similar for all cases. Might be a good place for you to start.

 

Blog: tachytelic.net

YouTube: https://www.youtube.com/c/PaulieM/videos

If I answered your question, please accept it as a solution 😘

Robhcc
Advocate II
Advocate II

Hi,

 

I have found that using variables inside Apply to each loops slows them down considerably. This is because the flow has to "lock" the variable while it is in use, and this adds time to the run. Concurrency will make the situation worse in these cases, as each concurrent iteration is raised as a separate "mini flow" when run, and each "mini flow" has to wait for the variable to become unlocked, lock it for its own run, do its actions, and then unlock it again.

 

I have found that minimising the use of variables can significantly reduce run times. Compose actions are far quicker, although you may need to evaluate this depending on your needs.

 

I have also found that table actions tend to be much faster as well. One example: I needed to update records in SharePoint in bulk. I used a Select action to build a JSON input of Graph bulk commands and then ran a single bulk Graph command to do my inputs (up to 20 at a time), instead of using an apply to each. This reduced my import times from approximately 4 hours per 1,000 records to about 10 minutes.
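Roughly, that Select pattern looks like this (a sketch; the action, site, and column names are mine, not a definitive recipe). The Select output array becomes the "requests" property of a single $batch call, with take() and skip() used to slice the records into groups of 20 first.

  "Build_batch_requests": {
    "type": "Select",
    "inputs": {
      "from": "@body('Get_items')?['value']",
      "select": {
        "id": "@item()?['ID']",
        "method": "PATCH",
        "url": "/sites/{site-id}/lists/{list-id}/items/@{item()?['ID']}",
        "headers": { "Content-Type": "application/json" },
        "body": { "fields": { "Title": "@item()?['Title']" } }
      }
    }
  }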

 

Have a look at the blog here; there is some really good info about Flow performance improvements:
https://sharepains.com/2018/10/15/microsoft-flow-improve-your-flows-performance-in-a-few-easy-steps/

 


@Robhcc wrote:

...each concurrent action is raised as a separate "mini flow" when being run and each "mini flow" has to wait for the variable to become unlocked and then has to lock it for its run, do its actions and then unlock it again.

While this is technically true, it's a very niche case. The variable in question is "locked" only while a CRUD operation is being actively applied to it, not for the entire run of that particular instance. That takes almost no time at all, and concurrent "mini flows" use the same variable space simultaneously while running. That's why a tooltip appears when you add a "Set variable" control within an "Apply to each" container, telling you to set the concurrency to 1: multiple concurrent runs of that container, depending on what happens within it, will use and/or overwrite the variable while another instance is still using it. So the time impact of accessing variables within an "Apply to each" container is negligible. The real time hit comes from more compute-heavy operations like file I/O, complex calculations, or branched and nested conditional statements (conditionals are an interesting case in themselves, since every branch needs to be computed whether or not it's actually executed).

Hi @AIUYM19, that's not my experience: almost every time I have tried to use a variable in an apply to each, my flow grinds to a halt and gains huge amounts of run time.

In the example given above by Alan Sanchez, every action in the loop is acting on a variable, so whilst the delay seems negligible for a single action, when you have a loop with 4 variable actions and a loop inside that with 3 variable actions, all of a sudden those negligible delays become huge. Even 1 extra second per iteration can add hours to the run if you have thousands of iterations: 3,600 records with 1 extra second each will add an hour to the run time.

 

All I am saying is: try to reduce the number of variable actions, try using a Compose action if you don't need to make amendments, and try using array actions instead of loops, as they can do thousands of edits in a second. I am not saying that variables are bad or useless, just expect your flow to take longer to run. And yes, there are 10 other things that can slow a flow down, but in the example above they are not using conditions, file I/O, complex calcs (it's all string amendments) or heavy computational stuff; they are just using variables to amend JSON strings. My answer was specific to the example given, and my response was based not on the technicality but on my experience with using variables in Flow.

 

@Alan_Sanchez you may have better luck using this guide to convert your CSV into JSON. It uses a Select / Join action instead of variables and a loop. I have it in use in one of my flows and it regularly does 10,000+ records (CSV to JSON) in 20-30 seconds. You may need to modify the replace commands to meet your needs; if you need help, let me know.
https://sharepains.com/2020/03/09/read-csv-files-from-sharepoint/
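The core of that approach, as a rough sketch (the newline trick and a two-column CSV are my assumptions for illustration): split the CSV text into lines, drop the header row, then let a single Select map every line to an object in one shot.

  "Rows_to_JSON": {
    "type": "Select",
    "inputs": {
      "from": "@skip(split(outputs('CSV_text'), decodeUriComponent('%0A')), 1)",
      "select": {
        "Title": "@split(item(), ',')[0]",
        "Status": "@split(item(), ',')[1]"
      }
    }
  }

No loop and no variables: the whole file is transformed by one action.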

 

Thanks for the tip @Robhcc. I'll try this on my next flow where this is needed and share my feedback.
