Update & Create Excel Records 50-100x Faster
I was able to develop an Office Script to update rows and an Office Script to create rows from Power Automate array data. So instead of the flow making a separate action API call for each individual row update or creation, it can just send an array of new data and the Office Scripts will match up primary key values, update each row they find, then create the rows they don't find.
And these Scripts do not require manually entering or changing any column names in the Script code.
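For anyone curious how that matching works conceptually, here is a simplified sketch (not the template's actual script; the parameter names TableName, PrimaryKeyColumnName, & UpdatedDataJSON and the JSON-string input are illustrative assumptions) of an Office Script upsert that matches on a primary key column, updates the rows it finds, & appends the rest:

```typescript
// Simplified upsert sketch -- illustrative parameter names, not the template's code.
interface RowData {
  [columnName: string]: string | number | boolean;
}

function main(
  workbook: ExcelScript.Workbook,
  TableName: string,            // table to upsert into, passed from the flow
  PrimaryKeyColumnName: string, // column used to match incoming records to rows
  UpdatedDataJSON: string       // array of records from the flow, serialized as JSON
) {
  const updatedData: RowData[] = JSON.parse(UpdatedDataJSON);

  const table = workbook.getTable(TableName);
  if (!table) {
    throw new Error(`Table "${TableName}" was not found in this workbook.`);
  }

  // Read the headers once so no column names ever need to be hard-coded.
  const headers = table.getHeaderRowRange().getValues()[0] as string[];
  const keyIndex = headers.indexOf(PrimaryKeyColumnName);

  const bodyRange = table.getRangeBetweenHeaderAndTotal();
  const rows = bodyRange.getValues();

  // Index existing primary key values so each incoming record can be matched quickly.
  const keyToRow: { [key: string]: number } = {};
  rows.forEach((row, i) => {
    keyToRow[String(row[keyIndex])] = i;
  });

  const newRows: (string | number | boolean)[][] = [];
  updatedData.forEach(record => {
    const rowIndex = keyToRow[String(record[PrimaryKeyColumnName])];
    if (rowIndex !== undefined) {
      // Match found: overwrite only the columns present in the incoming record.
      headers.forEach((header, c) => {
        if (record[header] !== undefined) {
          rows[rowIndex][c] = record[header];
        }
      });
    } else {
      // No match: stage a new row in the table's column order.
      newRows.push(headers.map(h => (record[h] !== undefined ? record[h] : "")));
    }
  });

  // Write all updated values back in one range operation, then append the new rows.
  bodyRange.setValues(rows);
  if (newRows.length > 0) {
    table.addRows(-1, newRows);
  }
}
```

The real template scripts layer batching, error handling, & the Mode 1 / Mode 2 switching described below on top of this basic idea.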
• In testing with batches of 1000 updates or creates, it handles ~2500 row updates or creates per minute, about 50x faster than the standard Excel create row or update row actions at their max of 50 concurrency. And it accomplished all the creates or updates in less than 25 actions, only 2.5% of the 1000 action API calls the standard approach would take.
• The Run script code for processing data has 2 modes: Mode 2, a batch method that saves & updates a new in-memory instance of the table before posting batches of table ranges back to Excel, and Mode 1, a row-by-row update that writes directly to the Excel table.
The Mode 2 batch processing method activates for creates & updates on tables with fewer than 1 million cells. It encounters more errors on larger tables because it loads & works with the entire table in memory. A simplified sketch of the chunked-write pattern it relies on follows the links below.
Shoutout to Sudhi Ramamurthy for this great batch processing addition to the template!
Code Write-Up: https://docs.microsoft.com/en-us/office/dev/scripts/resources/samples/write-large-dataset
Video: https://youtu.be/BP9Kp0Ltj7U
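Here is a minimal sketch of that chunked-write idea (the writeInChunks helper name and the 5000-row batch size are illustrative assumptions, not the template's actual code or settings):

```typescript
// Illustrative chunked-write helper -- assumed name & batch size, not the template's.
function writeInChunks(
  sheet: ExcelScript.Worksheet,
  startRow: number,
  startColumn: number,
  values: (string | number | boolean)[][]
) {
  const CHUNK_ROWS = 5000; // illustrative batch size; tune to stay under payload limits
  for (let offset = 0; offset < values.length; offset += CHUNK_ROWS) {
    const chunk = values.slice(offset, offset + CHUNK_ROWS);
    // Each setValues call posts one contiguous block of rows back to the worksheet,
    // instead of one call per row.
    sheet
      .getRangeByIndexes(startRow + offset, startColumn, chunk.length, chunk[0].length)
      .setValues(chunk);
  }
}
```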
The Mode 1 row-by-row method activates for Excel tables with more than 1 million cells. It is still limited by batch file size, so updates & creates on larger tables will need to run with smaller cloud flow batch sizes of less than 1000 in a Do until loop.
The Mode 1 row-by-row method is also used when the ForceMode1Processing field is set to Yes.
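For contrast, a minimal sketch of the row-by-row (Mode 1) idea, writing each matched row back to the table individually instead of rebuilding the whole table in memory (the helper name & signature are illustrative assumptions):

```typescript
// Illustrative row-by-row (Mode 1) helper -- assumed name & signature.
function updateRowByRow(
  table: ExcelScript.Table,
  rowIndex: number,                        // 0-based index within the table body
  newValues: (string | number | boolean)[]
) {
  // One small write per matched row keeps memory use low on huge tables,
  // at the cost of many more calls than the Mode 2 batch writes.
  table.getRangeBetweenHeaderAndTotal().getRow(rowIndex).setValues([newValues]);
}
```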
Office Script Code
(Also included in a Compose action at the top of the template flow)
Batch Update Script Code: https://drive.google.com/file/d/1kfzd2NX9nr9K8hBcxy60ipryAN4koStw/view?usp=sharing
Batch Create Script Code: https://drive.google.com/file/d/13OeFdl7em8IkXsti45ZK9hqDGE420wE9/view?usp=sharing
You can download Version 5 of this template attached to this post, copy the Office Script code into an online Excel instance, & try it out for yourself.
-Open an online Excel workbook, go to the Automate tab, select New Script, then copy & paste the Office Script code into the code editor. Do this for both the Batch Update and the Batch Create script code. You may want to name them BatchUpdateV6 & BatchCreateV5 accordingly.
-Once you get the template flow into your environment, follow the notes in the flow to change the settings to your data sources, data, & Office Scripts.
If you need just a batch update, then you can remove the batch create scope.
If you need just a batch create, then you can replace the Run script Batch update rows action with the Run script Batch create rows action, delete the update script action, and remove the remaining batch create scope below it. Then any update data sent to the 1st Select GenerateUpdateData action will just be created; it won't check for rows to update.
(ExcelBatchUpsertV5 is the core piece; ExcelBatchUpsertV5b includes a Do until loop set-up if you plan on updating and/or creating more than 1000 rows on large tables.)
Anyone facing issues with the standard zip file import package method can check this post for an alternative method of importing the flow: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Create-Update-and-Upsert/m-p...
Thanks for any feedback!
Video: https://www.youtube.com/watch?v=HiEU34Ix5gA
Maybe this picture clarifies things?
This is what both the "base" file and the destination file look like.
The base file has 91 columns, but I only need 45 of them in my destination file.
Currently I have a table of 5569 rows, but only 5512 of them are filled in (apart from the first "ID" column, which is just a running number: =A9+1, =A10+1, etc.).
The goal is for the destination file to get updated once a day, and after that the file should be sent as an e-mail attachment.
The main file is changed daily; nobody uses the destination file, as it only acts as a "template".
If you need any more information please let me know.
Unless it’s messed up by the double row header, I don’t see much to go on with that picture.
If your data is sensitive, can you come up with a fake example line?
Hi,
I get this error for a dataset with over 5k rows. I have the flow working for other datasets and it works fine; most of the others have less than 2k rows.
The rest of the error is just the data from the filter array. Let me know if the pic below helps. Thanks for your help!!!
Alright, please share the filter array output or copy & paste it into something like this JSON validator so we can try to find where the JSON formatting is off.
Hi @takolota
Thank you for sharing the batch create/update flow.
I believe I am having the same issue another user is having with line 8 of the script.
This is the error I am getting in PA:
"We were unable to run the script. Please try again. Runtime error: Line 16: Cannot read property 'getRange' of undefined clientRequestId: "
And when I try to run the script in excel, I get:
"Line 8: Workbook getTable: The argument is invalid or missing or has an incorrect format."
I didn't make any modifications to your script; I just created sample source and destination tables (table1 in both files) with 4 columns and <10 rows to test.
Any guidance you can provide would be greatly appreciated. Thanks.
Is the script referencing your Excel files correctly?
Are the names of your Excel tables correct in the script action, including case sensitivity?
The only part that would fail at line 8 is the reference to the Table name input in the action.
Line 16 references the Primary Key column name you enter in the action.
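For illustration only (this is not the template's actual code), the table name & primary key column name arrive as script parameters that the Run script action fills in, roughly like this:

```typescript
// Illustrative only -- not the template's actual code.
function main(
  workbook: ExcelScript.Workbook,
  TableName: string,            // typed into the Run script action, e.g. "Table1"
  PrimaryKeyColumnName: string  // typed into the Run script action, e.g. "ID"
) {
  // Roughly what fails around line 8 if the Table name input doesn't match a table
  // in the workbook (names are case sensitive):
  const table = workbook.getTable(TableName);
  if (!table) {
    throw new Error(`Table "${TableName}" was not found in this workbook.`);
  }

  // Roughly what fails around line 16 if the Primary Key column name input doesn't
  // match a column header in that table:
  const headers = table.getHeaderRowRange().getValues()[0] as string[];
  const keyColumnIndex = headers.indexOf(PrimaryKeyColumnName);
  console.log(`Found table ${table.getName()}; key column index: ${keyColumnIndex}`);
}
```

So nothing needs to be hard-coded in the script body; the inputs in the action just have to match the workbook's table & column names exactly.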
Thanks for the quick reply.
So I didn't modify the text in the script to look for the specific table name.
I thought the code "workbook.getTable(TableName)" was reading the specific table name from the file.
Do I need to replace all of the "TableName" references in the script to my specific table name ("Table1") in my scenario?
Thanks