11-27-2021 16:55 PM - last edited 05-24-2023 15:45 PM
Batch update with external data template flow.
Explanation Video: https://youtu.be/l0NuYtXdcrQ
Compared to the basic Apply to each, Get items, & Update item approach on large amounts of data, this method requires a small fraction of the API calls towards the daily action limit and drastically reduces flow run-times.
It's currently set to take data from any Excel sheet and update records in SharePoint with corresponding data. It works with any Power Automate data source, including HTTP API GET requests, but Excel provides a simple example.
Part of it works like a VLOOKUP function: it identifies whether a row of updated data in Excel has certain set values in it, matches it to the SharePoint record with those set values, and retrieves that record's SharePoint ID. Then it uses the batch update method to update those SharePoint records.
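The matching step described above can be sketched outside of Power Automate too. Here is a minimal Python sketch of that VLOOKUP-style join, assuming a hypothetical shared key column named "EmployeeID" (your key column name will differ):

```python
# Sketch of the VLOOKUP-style matching step: pair each updated Excel row
# with the SharePoint item that shares the same primary-key value.
# The "EmployeeID" key column name is a hypothetical example.
def match_updates_to_sharepoint(excel_rows, sp_items, key="EmployeeID"):
    # Index SharePoint items by their key column for O(1) lookups
    sp_index = {item[key]: item["ID"] for item in sp_items}
    matched = []
    for row in excel_rows:
        sp_id = sp_index.get(row[key])
        if sp_id is not None:
            # Carry the SharePoint ID forward for the batch PATCH call
            matched.append({**row, "ID": sp_id})
    return matched
```

Rows with no matching SharePoint record are simply skipped, which mirrors how the flow only batch-updates records it can pair up by the key column.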
The flow can easily expand to any size of SharePoint list by adding more Get items actions and batch update parallel branches. If speed is a concern for anyone, there are ways to make the Get items actions all load in parallel too (up to 50 branches). It's really only limited by how much effort people want to put into their flows & lists.
Version 1.5
This version makes it easier to handle cases where the list name may have changed since its creation and moves a few of the primary key column matching inputs to the 'settings' compose action so users don't have to look through & edit the more complicated expressions to set up the flow.
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10p7EB730xsEj-azVYuTIuu8dS0w-AflR/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1BVGoeM5mykYlMobAyFkhuLRh3r7jMSLz/view?usp=sharing
Version 1
Download The Template Batch Update Flow
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10gFkycdx6zpRfrI-s_jCDwIK6dpyyDqk/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1e6-INUykIT22ppVh5m4kxz8us_7qXy7q/view?usp=sharing
***LISTS WITH UNDERSCORES _ IN THE TITLE MAY CAUSE ERRORS WITH EARLIER VERSIONS OF BATCH ACTIONS.***
If you have a list with underscores in the name, then in TachyTelic.Net's batch create you will want to go to the __metadata line on the GenerateSPData Select action and substitute the expression there with
json(concat('{"type":"SP.Data.', replace(replace(outputs('settings')['listName'], '_', '_x005f_'), ' ', '_x0020_'), 'ListItem"}'))
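To see what that expression produces, here is a small Python sketch that mirrors the same replace logic (underscores encoded first as _x005f_, then spaces as _x0020_, matching the nesting order of the expression above):

```python
# Mirror of the flow expression: build the SP.Data __metadata type string
# for a list whose title contains underscores and/or spaces.
def sp_listitem_type(list_name: str) -> str:
    # Inner replace runs first (underscores), then spaces, so the
    # underscores introduced by _x0020_ are not double-encoded.
    encoded = list_name.replace("_", "_x005f_").replace(" ", "_x0020_")
    return '{"type":"SP.Data.' + encoded + 'ListItem"}'
```

For example, a list titled "My_Test List" yields the type string SP.Data.My_x005f_Test_x0020_ListListItem.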
TachyTelic.Net Blog & Videos
Batch Create Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-flow-batch-create-sharepoint-list-items/
Video: https://youtu.be/2dV7fI4GUYU
Batch Delete Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/
Video: https://www.youtube.com/watch?v=2ImkuGpEeoo
See @DamoBird365 go through a join / merge example similar to the Reformat JSON pieces of this template here: https://youtu.be/R_bD8nAoJ8Y
Formulas For Random Number Columns
SharePoint Rand1To50 Column Calculated Default Value Formula:
=INT(INT(RIGHT(NOW(),2))/2)
Excel Random Column Formula:
=ROUNDDOWN(((Rand()*100)+1)/2, 0)
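Both formulas aim to scatter rows across roughly 50 buckets so each parallel branch can filter on one bucket value. A Python sketch of the same arithmetic, assuming NOW() has been rendered as text on the SharePoint side:

```python
import math
import random

# SharePoint side: take the last two digits of NOW() as text, halve,
# and floor -> a bucket number in the range 0-49.
def sp_rand1to50(now_text: str) -> int:
    return int(now_text[-2:]) // 2

# Excel side: =ROUNDDOWN(((RAND()*100)+1)/2, 0) -> roughly 0-50.
def excel_random_bucket() -> int:
    return math.floor((random.random() * 100 + 1) / 2)
```

The exact bucket values matter less than the even spread; the batch branches just need each row to land in a predictable, filterable partition.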
If the batch call succeeds, but individual rows fail to load to SharePoint, then the flow will not throw an error. I personally change out the Results Compose actions with Append to variable actions and use an expression that returns the results text if it identifies an error. Then at the end of the flow I have a condition checking the length of that variable. If it has some text/length to it, then something failed and it sends me an email with a link to the flow run.
Expression to conditionally return results on failure:
if(
or(
or(
or(
or(
or(
or(
or(
contains(base64ToString(body('sendBatch')['$content']), '400 Bad Request'),
contains(base64ToString(body('sendBatch')['$content']), '401 Unauthorized')),
contains(base64ToString(body('sendBatch')['$content']), '403 Forbidden')),
contains(base64ToString(body('sendBatch')['$content']), '404 Not Found')),
contains(base64ToString(body('sendBatch')['$content']), '408 Request Timeout')),
contains(base64ToString(body('sendBatch')['$content']), '409 Conflict')),
contains(base64ToString(body('sendBatch')['$content']), '500 Internal Server Error')),
contains(base64ToString(body('sendBatch')['$content']), '502 Bad Gateway')),
base64ToString(body('sendBatch')['$content']),
'')
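The same check can be sketched in Python: decode the base64 batch response and return its text only when one of the per-row HTTP error markers appears, otherwise return an empty string (so a length check at the end of the flow detects failures). This is a hypothetical helper, not part of the flow itself:

```python
import base64

# Status strings that indicate an individual row failed inside an
# otherwise "successful" batch response.
ERROR_MARKERS = (
    "400 Bad Request", "401 Unauthorized", "403 Forbidden", "404 Not Found",
    "408 Request Timeout", "409 Conflict", "500 Internal Server Error",
    "502 Bad Gateway",
)

def failed_results_text(send_batch_content_b64: str) -> str:
    """Return the decoded batch response if any row failed, else ''."""
    text = base64.b64decode(send_batch_content_b64).decode("utf-8")
    return text if any(marker in text for marker in ERROR_MARKERS) else ""
```

An empty return means every row succeeded; any non-empty return is the raw response text you can append to a variable and email to yourself.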
Hi @takolota
I tried fetching user ID via email and used it to update the column. So far it's functioning as expected. Thank you for the suggestion.
Hi.
I am trying to build on this flow to keep our SharePoint lists 100% in sync with the data in Dataverse, including deleting, creating, and updating any changes.
However, I am having difficulty wrapping my head around getting it to remove any items from SharePoint that are no longer in Dataverse.
Can you point me in the right direction on how to achieve this?
I’m guessing you already know of @Paulie78’s SharePoint batch delete resource.
https://www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/?amp#h-flow-detail
And if the dataset were small enough to just delete each SharePoint record whenever a Dataverse record was deleted, then you probably would have used that method.
So I’m taking this as you need a recurring batch deletion synchronization.
For that you can probably take a few things from my Excel Batch Delete V2B file from this page: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Delete/td-p/1634375
It uses several actions with Union( ) & Intersect( ) expressions to get to only the records existing in one dataset that are not in another dataset.
Unless @Paulie78 knows some simpler method.
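The union/intersect idea above boils down to a set difference: delete any SharePoint row whose key no longer exists in Dataverse. A Python sketch, with a hypothetical shared key column named "RecordKey":

```python
# Sketch of the recurring-sync deletion logic: find SharePoint items whose
# key value no longer exists in the Dataverse dataset.
# The "RecordKey" column name is a hypothetical example.
def items_to_delete(sp_items, dv_rows, key="RecordKey"):
    dv_keys = {row[key] for row in dv_rows}
    # Keep only SharePoint items with no matching Dataverse record
    return [item for item in sp_items if item[key] not in dv_keys]
```

The resulting list is what you would feed into the batch delete flow.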
This is an awesome post. Got it working using an on-prem SQL instead of Excel file and works as expected. However found some limitations or issues that had to workaround:
1. Not sure how you are able to read 100,000 records at a time. I get the threshold error: "The attempted operation is prohibited because it exceeds the list view threshold." My list is under 15K, so I set the Top Count to 5000, Pagination to 5000, modified the "do until update", and added extra threads. How did you get 100K items at once?
2. I had to lower the batch size to under 1000 because I hit another limitation: "The maximum number of bytes allowed to be read from the stream has been exceeded. After the last read operation, a total of 1063857 bytes has been read from the stream however a maximum of 1048576 bytes is allowed". I was updating several more columns than your example.
Cheers
1. I think it’s a licensing thing. I’m usually working with an E3, E5, or premium per user license.
2. If I had to guess you either have a large number of columns / fields in each row, you have some multiline text columns with a lot of characters in them, or some mix of those. Yes, reducing the batch size should help in that case.
Good job taking the initiative & figuring out whatever you could on your own.
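One way to reason about that byte limit is to pack rows into batches by serialized size rather than a fixed row count. A Python sketch of that idea (the 1,048,576-byte budget comes from the error message quoted above; the packing helper itself is hypothetical, not part of the template):

```python
import json

# Byte limit quoted in the "maximum number of bytes" error message.
MAX_BATCH_BYTES = 1_048_576

def pack_batches(rows, max_bytes=MAX_BATCH_BYTES, max_rows=1000):
    """Group rows into batches whose serialized size stays under max_bytes."""
    batches, current, current_size = [], [], 0
    for row in rows:
        row_size = len(json.dumps(row).encode("utf-8"))
        # Start a new batch when adding this row would bust either budget
        if current and (current_size + row_size > max_bytes
                        or len(current) >= max_rows):
            batches.append(current)
            current, current_size = [], 0
        current.append(row)
        current_size += row_size
    if current:
        batches.append(current)
    return batches
```

Rows with many columns or long multiline text values consume the budget faster, which is why a fixed batch size of 1000 can overflow on wide tables.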
First, thank you for sharing. I used your flow and it works as intended.
Would you respond to the following issue as well?
I'm extracting a table from SQL where the ID column is a string (a GUID) instead of an integer (example: d3f9dff6-7625-4f23-b1bc-3a9652a5f866).
The flow throws an error when I use this column as the ID, because the HTTP request URL line is not formatted correctly.
Error:
Thank you for the quick response.
- I tried the string() method and it did not work; the URI was still the same, without the apostrophes.
- I changed the 'batchTemplate' code to add apostrophes around |ID|, and it did create the following:
PATCH https://<YOURWEBSITE >.sharepoint.com/sites/zTest_for_support/_api/web/lists/getByTitle('2testdatabase')/items('5D8FC382-4177-4571-9DB4-000FE75BC572') HTTP/1.1
However, now I get the following error:
"value": "Input string was not in a correct format."
- I also tried using base64() on the dynamic ID code listed under the 'Select Keys from Get Item 1' step.
Any thoughts? I am under the impression that for non-integer ID columns I just need to add apostrophes around the string.
Hi @novice1508,
If that's coming from SharePoint, then the IDs are integers, not GUIDs. GUIDs are more common in database environments like Dataverse.
If ever you try to fetch a specific record from your SharePoint table that holds this GUID, then you should use filter predicates instead. For example:
https://<YOURWEBSITE >.sharepoint.com/sites/zTest_for_support/_api/web/lists/getByTitle('2testdatabase')/items?$filter=myGUIDcol eq '5D8FC382-4177-4571-9DB4-000FE75BC572'
My 2 cents
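Fred_S's suggestion amounts to building a filter-predicate URL instead of an items(ID) lookup. A Python sketch of assembling that request URL, with the site URL, list title, and column name taken as placeholders from the thread:

```python
from urllib.parse import quote

# Sketch of building a filter-predicate request for a GUID column.
# site_url, list_title, and guid_col are placeholders from the thread.
def guid_filter_url(site_url, list_title, guid_col, guid_value):
    filter_expr = f"{guid_col} eq '{guid_value}'"
    return (
        f"{site_url}/_api/web/lists/getByTitle('{list_title}')"
        f"/items?$filter={quote(filter_expr)}"
    )
```

URL-encoding the filter expression (spaces and apostrophes) avoids the malformed-URI errors that come from pasting the predicate in raw.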
hi @Fred_S,
thank you for the response.
I tried your suggestion, which sounds great, but I noticed new errors because it creates an array in the URL section.
I tried the following:
- /items?$filter=
- /items?$top=1&$filter=
- /items?$select=<column_name>&$top=1
again thank you for the suggestion.
I finally realized how to make the original flow work:
In the GenerateSPData step, I just needed to choose the ID number from the destination database list.
I originally confused this field with the SharePointKeyColumnName.
thank you both for responding, and thank you for sharing a fantastic flow.