Hello,
I've had a look around and not found an answer to this one yet.
I've created a flow that copies all the files from a Box folder to a Dropbox folder every x hours. The flow works fine; however, the Apply to each action seems to have a 100-iteration limit: I only see 100 files in the Dropbox folder, and 100 actions on review.
The issue is that we have 150+ (and potentially growing) files in Box. Is there a way to increase this limit or a workaround that will copy all the files?
Thanks.
Hi @Anonymous,
Please check the article below for the limits under Microsoft Flow:
Limits and configuration in Microsoft Flow
So I don't think the issue here is caused by the Apply to Each limitation.
Would you please share some details about the flow configuration? Are you using the Box -> List files and folders in a folder action?
In addition, please verify the Body output of the Box -> List files and folders in a folder action and confirm that it has all the files loaded.
Regards,
Michael
Hi @v-micsh-msft,
Thanks for looking at this. Yes, I'm using the List files and folders action. I just checked, and it looks like this might be the culprit, as I can only see 100 items in the List action results.
Here's a copy of the flow as it is now. Do you know why the List action might only be pulling 100 items?
An answer to this would be great, as I am looking at a similar issue.
I'm currently having this issue with a SharePoint list, using Apply to each on every item in the list. It does not iterate beyond 100.
I believe this was fixed in a recent update.
The advanced options under 'Get items' include a maximum get count, which you can manually set to whatever you want.
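For reference, the designer exposes this as the Top Count field under the action's advanced options, and there is also a Pagination toggle under the action's Settings for retrieving more than one page of results. Below is a rough sketch of how the pagination setting shows up in an exported flow definition, assuming a SharePoint Get items action named Get_items and a 5,000-item threshold (both are examples, and the exact JSON shape can vary by connector):

```json
{
  "Get_items": {
    "type": "OpenApiConnection",
    "inputs": {
      "note": "connection, site and list parameters omitted from this sketch"
    },
    "runtimeConfiguration": {
      "paginationPolicy": {
        "minimumItemCount": 5000
      }
    }
  }
}
```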
I see that now; it appears to have fixed it. Thank you!
FYI, my existing flows have defaulted to 100 items instead of 512.
I just checked, and this has been resolved for my flow too. I didn't have to change anything; it just updated.
Thanks All.
I have the same issue today. I had 157 items in the SharePoint list and used an OData filter string to get the relevant 119 records, yet only 100 records are read by "Get items".
[Screenshot: start of the flow]
[Screenshot: part 2 of the flow]
[Screenshot: the condition where the Apply to each is configured]
See also:
https://powerusers.microsoft.com/t5/Building-Flows/get-items-only-returns-100-records/td-p/136883
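For anyone landing here from search: the OData filter only narrows which records qualify; how many of them are actually returned is still controlled by the Top Count / Pagination settings mentioned above. A hypothetical filter string of the kind described in this post, with a made-up column name and value purely for illustration:

```
Status eq 'Relevant'
```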
I do not see an advanced settings option... is there another workaround? Thanks so much for your help!
Never mind, I answered my own question. The pagination setting is on the Get items action, not on the Apply to each... Thanks!
I am having this issue as well. I do not see an advanced setting for adding more iterations. I have a folder that receives 300+ files a day and a flow that moves the files out of the received folder into a processing folder, but it is only moving 100 files. This is creating a major bottleneck in file processing.
I am facing an issue where Get items takes 3 ms, whereas Apply to each takes 5 days.
I have been iterating over a large list with 15k+ items, which already shows the threshold warning since it has crossed the 5k limit.
As a workaround, I tried breaking the items into batches of fewer than 2,500 each, based on ID, and running parallel Power Automate instances (each instance runs separately on its own).
Any idea how we can tune this Apply to each loop?
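In case it helps anyone trying the same split: partitioning by ID is typically done with one OData filter query per flow instance. A sketch using the 2,500-item ranges from this post (the boundaries and the ID column are illustrative, not a confirmed setup):

```
Instance 1:  ID ge 1    and ID le 2500
Instance 2:  ID ge 2501 and ID le 5000
Instance 3:  ID ge 5001 and ID le 7500
...
```

Each instance then runs its own Apply to each over a smaller collection.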
@prashanthspark Following this thread for any suggestions/solutions as well.
I found that iterating through more than 1,000 records causes the flow to run disproportionately slower (e.g. 500 records take 1.5 hours, but 1,500 records take 24+ hours). When I break them up manually into separate instances, it runs much faster. I would love to know how to optimize this.
Or does anyone know how to build a flow that can automatically break the Apply to each loop into batches of 500 records (e.g. for 2,100 records, break it into 5 loops: 500 + 500 + 500 + 500 + 100)?
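One possible way to do the automatic chunking, sketched with standard workflow expressions (the 500-item chunk size and the action name Get_items are assumptions, not something confirmed in this thread): run an outer Apply to each over range() with one iteration per chunk, and slice the Get items output with skip() and take().

```
Outer "Apply to each" input (one iteration per 500-item chunk):
  range(0, div(add(length(body('Get_items')?['value']), 499), 500))

Inside that loop, the current slice of up to 500 items:
  take(skip(body('Get_items')?['value'], mul(item(), 500)), 500)
```

The slice still has to be processed by something (a nested loop, a Select, or a child flow), and the outer loop's concurrency can be raised under its Settings; both of those details are left out of this sketch.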
Can you use Apply to each to update all the rows in an Excel table? Essentially this would reset the table so that all internal columns are updated. I have tried Update a row, but the flow tends to get very long, with one action for each person.
I am having the exact same problem. When I filter the results down to 500 records or so, the flow runs in a few minutes. But I have to run the flow over 8,000 records, so it runs for days.
Did you ever find a solution to split the Apply to each into 500-record batches?