Update & Create Excel Records 50-100x Faster
I was able to develop an Office Script to update rows and another Office Script to create rows from Power Automate array data. So instead of the flow making a separate action API call for each individual row update or creation, it can send a single array of new data, and the Office Scripts will match up primary key values, update each row they find, then create the rows they don't find.
And these Scripts do not require manually entering or changing any column names in the Script code.
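To illustrate the upsert idea described above, here is a minimal sketch of the key-matching logic in plain TypeScript arrays rather than the ExcelScript table API. This is a hypothetical simplification for illustration, not the actual script code; the `upsert` function name, the `Row` type, and the `keyColumn` parameter are all my own labels.

```typescript
// Each row is modeled as a simple object keyed by column name.
type Row = Record<string, string | number>;

// Update rows whose primary-key value matches an existing row;
// create (append) the rows whose key is not found.
function upsert(tableRows: Row[], incomingRows: Row[], keyColumn: string): Row[] {
  // Index existing rows by primary-key value for O(1) lookups.
  const index = new Map<string | number, number>();
  tableRows.forEach((row, i) => index.set(row[keyColumn], i));

  for (const incoming of incomingRows) {
    const at = index.get(incoming[keyColumn]);
    if (at !== undefined) {
      tableRows[at] = { ...tableRows[at], ...incoming }; // matched: update in place
    } else {
      tableRows.push(incoming); // no match: create a new row
    }
  }
  return tableRows;
}
```

Because the whole incoming array is matched in one pass, one script run replaces what would otherwise be one flow action per row.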
• In testing with batches of 1,000 updates or creates, it performs ~2,500 row updates or creates per minute, about 50x faster than the standard Excel create row or update row actions at max 50 concurrency. And it accomplished all the creates or updates with fewer than 25 actions, only 2.5% of the 1,000 action API calls the standard method would require.
• The Run Script code for processing data has 2 modes: the Mode 2 batch method, which saves & updates a new in-memory instance of the table before posting batches of table ranges back to Excel, & the Mode 1 method, which updates the Excel table row by row.
The Mode 2 batch processing method activates for creates & updates on tables with fewer than 1 million cells. It encounters more errors on larger tables because it loads & works with the entire table in memory.
Shoutout to Sudhi Ramamurthy for this great batch processing addition to the template!
Code Write-Up: https://docs.microsoft.com/en-us/office/dev/scripts/resources/samples/write-large-dataset
Video: https://youtu.be/BP9Kp0Ltj7U
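The core idea behind that write-large-dataset pattern is to build the full result in memory and then post it back to the sheet in fixed-size slices instead of one giant range write. Here is a minimal sketch of just the slicing step, assuming a generic row array; the `chunkRows` helper and its `chunkSize` parameter are my own illustration, not code from Sudhi Ramamurthy's sample.

```typescript
// Split an array of rows into consecutive slices of at most chunkSize rows,
// so each slice can be written back to the worksheet as its own range.
function chunkRows<T>(rows: T[], chunkSize: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < rows.length; i += chunkSize) {
    chunks.push(rows.slice(i, i + chunkSize));
  }
  return chunks;
}
```

In the real sample each slice would then be written with a range `setValues` call, which keeps any single write well under the payload limits that a whole-table write can hit.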
The Mode 1 row-by-row method activates for Excel tables with more than 1 million cells. But it is still limited by batch file size, so updates & creates on larger tables will need to run with smaller cloud flow batch sizes of less than 1,000 in a Do until loop.
The Mode 1 row-by-row method is also used whenever the ForceMode1Processing field is set to Yes.
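The mode selection described above boils down to a simple rule. This is a hypothetical sketch of that decision, using my own `pickMode` function name; the 1-million-cell threshold and the ForceMode1Processing override come from the post itself.

```typescript
// Decide which processing mode to use for a table of the given size.
// Mode 2: whole-table in-memory batch method (faster, memory-hungry).
// Mode 1: row-by-row updates against the live Excel table.
function pickMode(rowCount: number, columnCount: number, forceMode1: boolean): 1 | 2 {
  const CELL_LIMIT = 1_000_000;
  if (forceMode1) return 1;                           // explicit override
  if (rowCount * columnCount >= CELL_LIMIT) return 1; // too big to hold in memory
  return 2;                                           // small enough for the batch method
}
```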
Be aware that some characters in column names, like \ / - _ . : ; ( ) & $, may cause errors when processing the data. Backslashes \ in the data itself, which are normally used to escape characters in strings, may also cause errors when processing the JSON.
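To see why raw backslashes break the JSON step, consider a Windows path in a cell value. The sketch below is a hypothetical illustration of the failure and one possible pre-escaping fix; the `escapeBackslashes` helper is my own, not part of the template.

```typescript
// A raw backslash starts a JSON escape sequence, so "C:\Temp" contains the
// invalid escape \T and JSON.parse rejects the whole payload.
// Doubling every backslash before parsing turns it into a valid JSON string.
function escapeBackslashes(raw: string): string {
  return raw.replace(/\\/g, "\\\\");
}

// The string below actually contains a single backslash: {"path": "C:\Temp"}
const payload = '{"path": "C:\\Temp"}';
```

Calling `JSON.parse(payload)` directly throws a SyntaxError, while `JSON.parse(escapeBackslashes(payload))` succeeds and preserves the original path.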
Office Script Code
(Also included in a Compose action at the top of the template flow)
Batch Update Script Code: https://drive.google.com/file/d/1kfzd2NX9nr9K8hBcxy60ipryAN4koStw/view?usp=sharing
Batch Create Script Code: https://drive.google.com/file/d/13OeFdl7em8IkXsti45ZK9hqDGE420wE9/view?usp=sharing
You can download Version 5 of this template attached to this post, copy the Office Script code into an online Excel instance, & try it out for yourself.
-Open an online Excel workbook, go to the Automate tab, select New Script, then copy & paste the Office Script code into the code editor. Do this for both the Batch Update and the Batch Create script code. You may want to name them BatchUpdateV6 & BatchCreateV5 accordingly.
-Once you get the template flow into your environment, follow the notes in the flow to change the settings to your data sources, data, & Office Scripts.
If you need just a batch update, then you can remove the batch create scope.
If you need just a batch create, then you can replace the Run script Batch update rows action with the Run script Batch create rows action, delete the update script action, and remove the remaining batch create scope below it. Then any data sent to the 1st Select GenerateUpdateData action will simply be created; the script won't check for rows to update.
(ExcelBatchUpsertV5 is the core piece, ExcelBatchUpsertV5b includes a Do until loop set-up if you plan on updating and/or creating more than 1000 rows on large tables.)
Anyone facing issues with the standard zip file import package method can check this post for an alternative method of importing the flow: Re: Excel Batch Create, Update, and Upsert - Page 33 - Power Platform Community (microsoft.com)
Or this one: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Create-Update-and-Upsert/m-p...
Thanks for any feedback, & please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86)
I have over 99 columns. Where can I expand the limit beyond 50?
@rsalam
What version are you using? The newest versions should not have any column limits. If there is still a Select action with a line for each potential column, then that is an earlier version.
Thanks,
I just wanted to ask before I start mapping it out. I am using the newest version, and I will map it out tonight. I am getting the data from Power BI and exporting it to an Excel template; however, I have six different sheets that I need to update. What is the best way to update all the sheets in the same flow?
@rsalam
You may want to use parallel branches for that. But you will probably want to put the two scopes into another larger scope & copy that across branches so you don't have to re-write connections between actions each time you create & copy a new branch.
Thanks for your help. I really appreciate it. I tried putting the two scopes into another scope, but it looks like for some of the actions I have to rewrite the connections.
Looks like the minus sign does throw an error. I have replaced the Select arrays with yours and it works perfectly fine.
Thanks for this amazing tool. Unfortunately I am facing an error which I don't understand. My source file is 1,500 rows long and my destination file is 23,000 rows long; both have 35 columns.
When running the flow I have this error:
It successfully ran on smaller files of the same type.
I tried decreasing BatchSize to 100 with no success.
I haven't dealt with that error before. HTTP documentation: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/502
But if it was working with smaller table sizes, then started to error on larger tables, try switching ForceMode1Processing to Yes.
Thanks for the answer !
I just tried with ForceMode1Processing to Yes, but ran into another error message: