06-13-2022 09:31 AM - last edited 11-18-2023 10:59 AM
Update & Create Excel Records 50-100x Faster
I was able to develop an Office Script to update rows and another Office Script to create rows from Power Automate array data. So instead of the flow making a separate API call action for each individual row update or creation, it can send a single array of new data and the Office Scripts will match up primary key values, update each row they find, then create the rows they don't find.
And these scripts do not require you to manually enter or change any column names in the script code.
• In testing with batches of 1,000 updates or creates, it handles ~2,500 row updates or creates per minute, 50x faster than the standard Excel Create row or Update row actions at their maximum concurrency of 50. And it accomplished all the creates or updates in fewer than 25 actions, only 2.5% of the 1,000 API calls the standard actions would need.
• The Run script code for processing data has 2 modes: the Mode 2 batch method, which saves & updates a new instance of the table in memory before posting batches of table ranges back to Excel, and the Mode 1 method, which updates the Excel table row by row.
The Mode 2 batch processing method activates for creates & updates on tables with fewer than 1 million cells. It encounters more errors on larger tables because it loads & works with the entire table in memory.
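The mode selection described above can be sketched in plain TypeScript. This is a minimal illustration, not the template's actual code: the threshold constant and function names are assumptions, and in the real Office Script the row & column counts would come from the ExcelScript table object.

```typescript
// Decide between batch (Mode 2) and row-by-row (Mode 1) processing.
// The 1-million-cell threshold mirrors the behavior described above;
// all names here are illustrative.
const MAX_CELLS_FOR_BATCH = 1_000_000;

function chooseMode(rowCount: number, columnCount: number, forceMode1: boolean): 1 | 2 {
  const totalCells = rowCount * columnCount;
  // Mode 1 (row by row) for very large tables or when explicitly forced;
  // Mode 2 (in-memory batch) otherwise.
  return forceMode1 || totalCells >= MAX_CELLS_FOR_BATCH ? 1 : 2;
}
```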
Shoutout to Sudhi Ramamurthy for this great batch processing addition to the template!
Code Write-Up: https://docs.microsoft.com/en-us/office/dev/scripts/resources/samples/write-large-dataset
The Mode 1 row-by-row method activates for Excel tables with more than 1 million cells. But it is still limited by the batch file size, so updates & creates on larger tables will need to run with smaller cloud flow batch sizes of less than 1000 in a Do until loop.
The Mode 1 row-by-row method is also used when the ForceMode1Processing field is set to Yes.
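The Do until loop effectively feeds the script smaller slices of the source data on each run. A minimal TypeScript sketch of that slicing (the function name and the 1000-row limit are illustrative; in the flow this is done with Power Automate expressions rather than script code):

```typescript
// Split an array of rows into chunks so each Run script call
// stays under the batch-size limit (e.g. 1000 rows per call).
function chunkRows<T>(rows: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}
```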
Be aware that some characters in column names, like \ / - _ . : ; ( ) & $, may cause errors when processing the data. Backslashes \ in the data itself, which are usually used to escape characters in strings, may also cause errors when processing the JSON.
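If your data may contain backslashes, one workaround is to double-escape them before the values are serialized into the JSON sent to the script. A minimal sketch under that assumption (the helper name is illustrative and not part of the template):

```typescript
// Double any backslashes so they survive JSON parsing on the script side.
// Without this, a lone "\" in a cell value can break JSON processing.
function escapeBackslashes(value: string): string {
  return value.replace(/\\/g, "\\\\");
}
```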
Office Script Code
(Also included in a Compose action at the top of the template flow)
Batch Update Script Code: https://drive.google.com/file/d/1kfzd2NX9nr9K8hBcxy60ipryAN4koStw/view?usp=sharing
Batch Create Script Code: https://drive.google.com/file/d/13OeFdl7em8IkXsti45ZK9hqDGE420wE9/view?usp=sharing
You can download Version 5 of this template attached to this post, copy the Office Script code into an online Excel instance, & try it out for yourself.
-Open an online Excel workbook, go to the Automate tab, select New Script, then copy & paste the Office Script code into the code editor. Do this for both the Batch Update and the Batch Create script code. You may want to name them BatchUpdateV6 & BatchCreateV5 accordingly.
-Once you get the template flow into your environment, follow the notes in the flow to change the settings to your data sources, data, & Office Scripts.
If you need just a batch update, then you can remove the batch create scope.
If you need just a batch create, then you can replace the Run script Batch update rows action with the Run script Batch create rows action, delete the update script action, and remove the remaining batch create scope below it. Then any data sent to the 1st Select GenerateUpdateData action will simply be created; the flow won't check for rows to update.
(ExcelBatchUpsertV5 is the core piece, ExcelBatchUpsertV5b includes a Do until loop set-up if you plan on updating and/or creating more than 1000 rows on large tables.)
Anyone facing issues with the standard zip file import package method can check this post for an alternative method of importing the flow: Re: Excel Batch Create, Update, and Upsert - Page 33 - Power Platform Community (microsoft.com)
Thanks for any feedback, & please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86)
I'm able to update on long alpha-numeric keys...
Are you finding each of those SKUs in the "result" output which is an array of all the primary key values not found in the column you designated as the primary key in the Run script action?
Are any of your Available Qty values blank or null?
These are the script code pieces that...
Create a flattened array from the column identified as the primary key in the destination table.
Then it searches that array for each primary key value from the PrimaryKey column of the data passed to the script.
If it finds the value, it updates that row in a copy of the table held in the script's memory. Otherwise, it adds the primary key value to the "result" array.
After it goes through every row of the data passed to the script, later code pastes the edited in-memory copy of the table back to the real table in the file.
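The steps above can be sketched in plain TypeScript, with string arrays standing in for the ExcelScript table so the logic is visible on its own. This is an illustrative sketch, not the template's actual script: the function name, row layout, and key-column handling are assumptions.

```typescript
interface UpsertOutcome {
  table: string[][]; // in-memory copy of the destination table after edits
  result: string[];  // primary key values that were not found
}

// destTable: in-memory copy of the destination table rows.
// keyColumnIndex: which column holds the primary key (same layout assumed
// for the incoming rows). updates: the rows passed to the script.
function upsertRows(
  destTable: string[][],
  keyColumnIndex: number,
  updates: string[][]
): UpsertOutcome {
  // 1. Flatten the destination primary key column into a lookup array.
  const keyColumn = destTable.map(row => row[keyColumnIndex]);
  const result: string[] = [];
  for (const update of updates) {
    // 2. Search that array for each incoming primary key value.
    const rowIndex = keyColumn.indexOf(update[keyColumnIndex]);
    if (rowIndex >= 0) {
      // 3a. Found: update the row in the in-memory table copy.
      destTable[rowIndex] = update;
    } else {
      // 3b. Not found: record the key in the "result" array.
      result.push(update[keyColumnIndex]);
    }
  }
  // 4. The real script then pastes the edited copy back to the worksheet;
  //    the create script handles the keys reported in "result".
  return { table: destTable, result };
}
```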
You can try changing the ForceMode1Processing input in the Power Automate Run script action to Yes and see if that gets better results. That uses a different, shorter piece of code that posts each row of updates to the real destination table one by one.
If that gets the same errors you have mentioned, then I suspect it's not a problem with the script.
@takolota It's really bizarre, I can't make sense of why it's happening. I did try setting ForceMode1Processing to Yes and it seems to have fixed the issue. I get the full load the first time and my flow completes in under 30 seconds on 800 rows.
As of now I have tested my full use case and am happy with the result.
Can you please dial in on the Batch B version configuration, specifically the batchSize counts and the UpdatedData values as well? I am so close to getting this working, no errors, but it is not batching through my file correctly either, so something is incorrect. Thanks! Jess
It looks like the Run Script action is prompting you with the separate column/line inputs by default. You need to select the "Switch to input entire array" button so the input changes to an array input where you can input the ComposeDestinationJSON dynamic content...
I got that, and it is cycling through my file, however not the entire entity. It seems not to be picking up the SKUs that begin with alpha values; last time I had to split the data up to process it all. I assume the loop is not picking up. I have only made changes to the required sections, and then attempted to increase the batchSize variable, but it is still not cycling through them all. Do I need to make an adjustment to get the loop going? The alpha rows are not listed in the body/results either.
Are the rows that aren't getting updated present in the outputs of your ComposeDestinationJSON before the Run script Update Excel rows action?
Where is your data coming from?
Is that action pulling all the data?
Is the flow completing all the loop runs? Or is it terminating & being marked as failed?
Not all of the rows are being processed. The source file is pretty large, however the flow is not cycling through all of it and is also not giving an error; it completes successfully. I assumed it was something with the TopCount set at 2000, but changing it has not made a difference. It's almost acting the same as the non-batch process, except it's not looking at the rows starting with the alpha characters. In that earlier attempt I separated out those rows and was able to successfully process them in batches of 2k, hence why I picked up this batch process instead. I can email you the file if possible for you to look at, in case there is something that can be added to the script to make it work.