Update & Create Excel Records 50-100x Faster
I was able to develop an Office Script to update rows and another Office Script to create rows from Power Automate array data. So instead of the flow making a separate API call for each individual row update or creation, it can send a single array of new data and the Office Scripts will match up primary key values, update each row they find, then create the rows they don't find.
And these Scripts do not require manually entering or changing any column names in the Script code.
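The core matching logic can be sketched in plain TypeScript. This is a simplified, hypothetical standalone version for illustration only; the real scripts run against the ExcelScript API and read the table from the workbook:

```typescript
// Sketch of the upsert matching step: match each incoming record's PK against
// the table's primary key column, update matched rows in the in-memory copy,
// and collect unmatched PKs (these fall through to the create script).
type Row = (string | number)[];

function upsertMatch(
  tableData: Row[],        // row 0 is the header row, as in the script's getRange()
  pkColumnIndex: number,
  updates: { PK: string; values: Partial<Record<number, string | number>> }[]
): string[] {
  // Build the PK lookup the same way the script does: one array of key values.
  const pkArray = tableData.map(row => String(row[pkColumnIndex]));
  const notFound: string[] = [];
  for (const record of updates) {
    const rowNum = pkArray.indexOf(record.PK);
    // rowNum > 0 also skips the header row at index 0.
    if (rowNum > 0) {
      for (const [col, value] of Object.entries(record.values)) {
        tableData[rowNum][Number(col)] = value as string | number;
      }
    } else {
      notFound.push(record.PK);
    }
  }
  return notFound;
}
```

The returned array of unmatched keys mirrors the `ArrayPKErrors` result the update script passes back to the flow.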
• In testing with batches of 1000 updates or creates, it handles roughly 2,500 row updates or creates per minute, about 50x faster than the standard Excel update row or create row actions at their maximum concurrency of 50. And it completed all the creates or updates in fewer than 25 actions, only 2.5% of the 1000 API calls the standard actions would need.
• The Run Script code for processing data has 2 modes: the Mode 2 batch method, which saves the updates to a new in-memory instance of the table before posting batches of table ranges back to Excel, and the Mode 1 method, which updates the Excel table row by row.
The Mode 2 batch processing method activates for creates & updates on tables with fewer than 1 million cells. It encounters more errors on larger tables because it loads & works with the entire table in memory.
Shoutout to Sudhi Ramamurthy for this great batch processing addition to the template!
Code Write-Up: https://docs.microsoft.com/en-us/office/dev/scripts/resources/samples/write-large-dataset
Video: https://youtu.be/BP9Kp0Ltj7U
The Mode 1 row by row method activates for Excel tables with more than 1 million cells. It is still limited by batch file size, so updates & creates on larger tables will need to run with smaller cloud flow batch sizes of less than 1000 in a Do until loop.
The Mode 1 row by row method is also used when the ForceMode1Processing field is set to Yes.
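The mode selection described above boils down to a single check. A minimal sketch (not the script itself), mirroring the 1,000,000-cell threshold and the ForceMode1Processing override from the posted script code:

```typescript
// Mode 2 batch-writes the whole table from memory; Mode 1 writes row by row.
// The script falls back to Mode 1 when the table is too large to safely hold
// in memory, or when the flow explicitly forces it.
const TABLE_SIZE_BATCH_LIMIT = 1_000_000;

function useMode2BatchProcessing(
  rowCount: number,
  columnCount: number,
  forceMode1: string   // the script receives "Yes"/"No" from the flow
): boolean {
  const tableSize = rowCount * columnCount;
  return tableSize < TABLE_SIZE_BATCH_LIMIT && forceMode1 !== "Yes";
}
```

Note the consequence called out in the script comments: a table that grows past the threshold silently flips the flow to Mode 1, which needs the smaller cloud flow batch sizes.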
Office Script Code
(Also included in a Compose action at the top of the template flow)
Batch Update Script Code: https://drive.google.com/file/d/1kfzd2NX9nr9K8hBcxy60ipryAN4koStw/view?usp=sharing
Batch Create Script Code: https://drive.google.com/file/d/13OeFdl7em8IkXsti45ZK9hqDGE420wE9/view?usp=sharing
You can download Version 5 of this template attached to this post, copy the Office Script code into an online Excel instance, & try it out for yourself.
-Open an online Excel workbook, go to the Automate tab, select New Script, then copy & paste the Office Script code into the code editor. Do this for both the Batch Update and the Batch Create script code. You may want to name them BatchUpdateV5 & BatchCreateV5 accordingly.
-Once you get the template flow into your environment, follow the notes in the flow to change the settings to your data sources, data, & Office Scripts.
If you need just a batch update, then you can remove the batch create scope.
If you need just a batch create, then you can replace the Run script Batch update rows action with the Run script Batch create rows action, delete the update script action, and remove the remaining batch create scope below it. Then any update data sent to the 1st Select GenerateUpdateData action will just be created; it won't check for rows to update.
(ExcelBatchUpsertV5 is the core piece, ExcelBatchUpsertV5b includes a Do until loop set-up if you plan on updating and/or creating more than 1000 rows on large tables.)
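The Do until set-up in the b version relies on simple batching arithmetic: slice the source rows into chunks of at most 1000 so each Run script call stays under the Mode 1 payload limit. A sketch with illustrative names (not taken from the template):

```typescript
// Split source rows into batches no larger than batchSize, the way a Do until
// loop would page through them one Run script call at a time.
function toBatches<T>(rows: T[], batchSize = 1000): T[][] {
  const batches: T[][] = [];
  for (let start = 0; start < rows.length; start += batchSize) {
    batches.push(rows.slice(start, start + batchSize));
  }
  return batches;
}
```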
Anyone facing issues with the standard zip file import package method can check this post for an alternative method of importing the flow: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Create-Update-and-Upsert/m-p...
Also be aware that some characters in column names, like \ / - _ . : ; ( ) & $, may cause errors when processing the data. Backslashes \ in the data, which are usually used to escape characters in strings, may also cause errors when processing the JSON.
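One way to work around the backslash issue is to double any raw backslashes before the data is serialized for the script, so JSON parsing on the other side doesn't treat them as escape sequences. An illustrative sketch, not part of the template:

```typescript
// Pre-clean a value before it goes into a JSON payload: double each raw
// backslash so JSON.parse round-trips it instead of erroring or mangling it.
function escapeBackslashes(value: string): string {
  return value.replace(/\\/g, "\\\\");
}
```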
Thanks for any feedback!
Video: https://www.youtube.com/watch?v=HiEU34Ix5gA
Back-up import method for anyone facing issues with the standard import method.
For this, you'll need to copy the code below to your clipboard. It contains the code for a Power Automate Scope action that holds the entire flow. Then go to a new action in any flow, open the "My clipboard" tab of the new action menu, and press Ctrl+V to paste it into that tab so you can select it to add it to the flow. The extra hurdle with this method is that the longer expressions in the Select actions usually get mangled & greyed out.
You will need to go to each Select action and repair those expressions: copy the expression text shown on the action, remove it, use the map button on the right side of the action to toggle to the table view & back to the single textbox view, click in the textbox to bring up the content menu & expression tab, then paste the copied expression text back into the expression editor & remove any leading or trailing characters that don't belong in the expression input, like the starting '@' or ending '}'.
Excel Batch Upsert V5 Scope Code to Copy & Paste Into the "My clipboard" Tab of a New Action Menu
{"id":"31a439ac-9e40-43ec-a958-a88c5d3221e2","brandColor":"#8C3900","connectionReferences":{"shared_excelonlinebusiness":{"connection":{"id":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness/connections/39f5641011a343e8aac23b8d01bee065"}}},"connectorDisplayName":"Control","icon":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMzIiIGhlaWdodD0iMzIiIHZlcnNpb249IjEuMSIgdmlld0JveD0iMCAwIDMyIDMyIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPg0KIDxwYXRoIGQ9Im0wIDBoMzJ2MzJoLTMyeiIgZmlsbD0iIzhDMzkwMCIvPg0KIDxwYXRoIGQ9Im04IDEwaDE2djEyaC0xNnptMTUgMTF2LTEwaC0xNHYxMHptLTItOHY2aC0xMHYtNnptLTEgNXYtNGgtOHY0eiIgZmlsbD0iI2ZmZiIvPg0KPC9zdmc+DQo=","isTrigger":false,"operationName":"Excel_Batch_Upsert_V5.2","operationDefinition":{"type":"Scope","actions":{"Office_Script_Batch_Update":{"type":"Compose","inputs":"function main(workbook: ExcelScript.Workbook,\n TableName: string,\n PrimaryKeyColumnName: string,\n ForceMode1Processing: string,\n UpdatedData: updateddata[],\n) {\n //If it is a blank table, add an empty row to prevent errors.\n if (workbook.getTable(TableName).getRangeBetweenHeaderAndTotal().getRowCount() < 2) {\n workbook.getTable(TableName).addRow();\n }\n let table = workbook.getTable(TableName);\n let RowNum: number;\n let TableRange = table.getRange()\n let TableRangeNoHeader = table.getRangeBetweenHeaderAndTotal()\n let TableData = TableRange.getValues()\n let ArrayPK = table.getColumn(PrimaryKeyColumnName).getRange().getValues().join(\"#|#\").split(\"#|#\")\n let ArrayPKErrors = new Array(\"\")\n let ColumnCount = TableRange.getColumnCount()\n let TableSize = (TableRange.getRowCount()) * (ColumnCount)\n let TableSizeBatchProcLimit = 1000000\n\n console.log(`Table size is ${TableSize} cells.`);\n if (TableSize > TableSizeBatchProcLimit) { console.log(`You have exceeded the ${TableSizeBatchProcLimit} total table cell limit for processing larger batches on larger destination tables in the office script, please either reduce your destination table size or use a 
batch size of 1000 or less in the cloud flow.`) }\n // If the table is less than 1 million cells & not something likely big enough to make errors in the V2 batch processing method then use the batch processing, else use the V1 row by row update method that isn't as affected by table size, but does have a lower cloud flow batch size limit.\n // So if someone generally uses large batch sizes, but then their table grows past 1 million cells, then this will revert to the V1 row by row processing with the smaller batch file size limit and the cloud flow will start to error and they will need to switch their flow settings back to smaller batch sizes as well.\n if (TableSize < TableSizeBatchProcLimit && ForceMode1Processing != \"Yes\") {\n\n //Iterate through each object item in the array from the flow\n for (let i = 0; i < UpdatedData.length; i++) {\n //If the record's Primary Key value is found continue, else post to error log\n if (ArrayPK.indexOf(UpdatedData[i].PK) > 0) {\n //Get the row number for the line to update by matching the foreign key from the other datasource to the primary key in Excel\n RowNum = ArrayPK.indexOf(UpdatedData[i].PK)\n\n //Iterate through each item or line of the current object\n for (let j = 0; j < Object.keys(UpdatedData[i]).length - 1; j++) {\n //Update each value for each item or column given\n TableData[RowNum][Number(Object.keys(UpdatedData[i])[j])] = UpdatedData[i][Number(Object.keys(UpdatedData[i])[j])]\n }\n }\n //Post PK not found value to ArrayPKErrors\n else { ArrayPKErrors.push(UpdatedData[i].PK) };\n }\n //Get array of 1st row formulas to re-apply to columns after posting the updates to the table\n let FirstRowFormulas = [\"\"]\n for (let c = 0; c < ColumnCount; c++) {\n FirstRowFormulas.push(TableRangeNoHeader.getColumn(c).getRow(0).getFormula());\n }\n FirstRowFormulas.shift();\n\n // If the entire table is smaller than 50,000 cells, then just post to the table. 
Else batch update.\n if (TableSize < 50000) {\n //Post table in memory to the Excel table\n TableRange.setValues(TableData);\n }\n else {\n\n // Run The Batch Update - (Batch update code source: https://docs.microsoft.com/en-us/office/dev/scripts/resources/samples/write-large-dataset)\n const CELLS_IN_BATCH = 15000;\n\n console.log(`Calling update range function...`);\n const updated = updateRangeInBatches(TableRange.getCell(0, 0), TableData, 10000);\n if (!updated) {\n console.log(`Update did not take place or complete. Check and run again.`);\n }\n\n function updateRangeInBatches(\n startCell: ExcelScript.Range,\n values: (string | boolean | number)[][],\n cellsInBatch: number\n 😞 boolean {\n const startTime = new Date().getTime();\n console.log(`Cells per batch setting: ${cellsInBatch}`);\n\n // Determine the total number of cells to write.\n const totalCells = values.length * values[0].length;\n console.log(`Total cells to update in the target range: ${totalCells}`);\n if (totalCells <= cellsInBatch) {\n console.log(`No need to batch -- updating directly`);\n updateTargetRange(startCell, values);\n return true;\n }\n\n // Determine how many rows to write at once.\n const rowsPerBatch = Math.floor(cellsInBatch / values[0].length);\n console.log(\"Rows per batch: \" + rowsPerBatch);\n let rowCount = 0;\n let totalRowsUpdated = 0;\n let batchCount = 0;\n\n // Write each batch of rows.\n for (let i = 0; i < values.length; i++) {\n rowCount++;\n if (rowCount === rowsPerBatch) {\n batchCount++;\n console.log(`Calling update next batch function. 
Batch#: ${batchCount}`);\n updateNextBatch(startCell, values, rowsPerBatch, totalRowsUpdated);\n\n // Write a completion percentage to help the user understand the progress.\n rowCount = 0;\n totalRowsUpdated += rowsPerBatch;\n console.log(`${((totalRowsUpdated / values.length) * 100).toFixed(1)}% Done`);\n }\n }\n console.log(`Updating remaining rows -- last batch: ${rowCount}`)\n if (rowCount > 0) {\n updateNextBatch(startCell, values, rowCount, totalRowsUpdated);\n }\n let endTime = new Date().getTime();\n console.log(`Completed ${totalCells} cells update. It took: ${((endTime - startTime) / 1000).toFixed(6)} seconds to complete. ${((((endTime - startTime) / 1000)) / cellsInBatch).toFixed(8)} seconds per ${cellsInBatch} cells-batch.`);\n return true;\n }\n /**\n * A helper function that computes the target range and updates. \n */\n function updateNextBatch(\n startingCell: ExcelScript.Range,\n data: (string | boolean | number)[][],\n rowsPerBatch: number,\n totalRowsUpdated: number\n ) {\n const newStartCell = startingCell.getOffsetRange(totalRowsUpdated, 0);\n const targetRange = newStartCell.getResizedRange(rowsPerBatch - 1, data[0].length - 1);\n console.log(`Updating batch at range ${targetRange.getAddress()}`);\n const dataToUpdate = data.slice(totalRowsUpdated, totalRowsUpdated + rowsPerBatch);\n try {\n targetRange.setValues(dataToUpdate);\n } catch (e) {\n throw `Error while updating the batch range: ${JSON.stringify(e)}`;\n }\n return;\n }\n /**\n * A helper function that computes the target range given the target range's starting cell\n * and selected range and updates the values.\n */\n function updateTargetRange(\n targetCell: ExcelScript.Range,\n values: (string | boolean | number)[][]\n ) {\n const targetRange = targetCell.getResizedRange(values.length - 1, values[0].length - 1);\n console.log(`Updating the range: ${targetRange.getAddress()}`);\n try {\n targetRange.setValues(values);\n } catch (e) {\n throw `Error while updating the whole range: 
${JSON.stringify(e)}`;\n }\n return;\n }\n }\n //Re-apply the formulas from the 1st row to the entire columns after the update\n for (let f = 0; f < ColumnCount; f++) {\n if (FirstRowFormulas[f].toString().startsWith(\"=\")) {\n TableRangeNoHeader.getColumn(f).getRow(0).setFormula(FirstRowFormulas[f])\n TableRangeNoHeader.getColumn(f).getRow(0).autoFill(table.getRangeBetweenHeaderAndTotal().getColumn(f).getAddress(), ExcelScript.AutoFillType.fillDefault)\n }\n }\n }\n // Update row by row if the table is too large\n else {\n //Iterate through each object item in the array from the flow\n for (let i = 0; i < UpdatedData.length; i++) {\n //If the record's Primary Key value is found continue, else post to error log\n if (ArrayPK.indexOf(UpdatedData[i].PK) > 0) {\n //Get the row number for the line to update by matching the foreign key from the other datasource to the primary key in Excel\n RowNum = ArrayPK.indexOf(UpdatedData[i].PK)\n\n //Iterate through each item or line of the current object\n for (let j = 0; j < Object.keys(UpdatedData[i]).length - 1; j++) {\n //Update each value for each item or column given\n TableRange.getCell(RowNum, Number(Object.keys(UpdatedData[i])[j])).setValue(UpdatedData[i][Number(Object.keys(UpdatedData[i])[j])])\n }\n }\n //Post PK not found value to ArrayPKErrors\n else { ArrayPKErrors.push(UpdatedData[i].PK) }\n }\n }\n //Post ArrayPKErrors to flow results\n console.log(\"Any primary key values not found are listed in the result array.\")\n ArrayPKErrors.shift();\n return ArrayPKErrors;\n}\n//The 1st few column indexes are provided incase anyone edits this for something else later. 
But the current flow & scripts should work fine with only the PK column here.\ninterface updateddata {\n '0': (string | undefined),\n '1': (string | undefined),\n '2': (string | undefined),\n '3': (string | undefined),\n '4': (string | undefined),\n '5': (string | undefined),\n 'PK': (string | undefined)\n}","runAfter":{},"description":"Go to an online Excel table, go to the Automate tab, select New Script, then replace the placeholder script code by pasting all this in the code editor. You may want to name the script BatchUpdateV5.","metadata":{"operationMetadataId":"7922f637-e66f-45ad-a5e1-c69106b4a210"}},"Office_Script_Batch_Create":{"type":"Compose","inputs":"function main(workbook: ExcelScript.Workbook,\n TableName: string,\n PrimaryKeyColumnName: string,\n ForceMode1Processing: string,\n CreateData: createdata[],\n) {\n //If it is a blank table, add an empty row to prevent errors.\n if (workbook.getTable(TableName).getRangeBetweenHeaderAndTotal().getRowCount() < 2) {\n workbook.getTable(TableName).addRow();\n }\n let table = workbook.getTable(TableName);\n let TableRange = table.getRange();\n let TableRangeNoHeader = table.getRangeBetweenHeaderAndTotal();\n let TableData = TableRange.getValues();\n let PKColumnIndex = table.getColumn(PrimaryKeyColumnName).getIndex();\n CreateData = JSON.parse(JSON.stringify(CreateData).split('\"PK\"').join(`\"${PKColumnIndex}\"`))\n let ColumnCount = TableRange.getColumnCount();\n let InitialNumberOfTableRows = TableRange.getRowCount();\n let TableSize = (InitialNumberOfTableRows) * (ColumnCount);\n let TableSizeBatchProcLimit = 1000000;\n //let BlankLine: string [];\n //BlankLine = new Array(ColumnCount);\n let RowNum: number;\n RowNum = InitialNumberOfTableRows;\n //-1 to 0 index the RowNum\n RowNum = RowNum - 1;\n\n console.log(`Table size is ${TableSize} cells.`);\n if (TableSize > TableSizeBatchProcLimit) { console.log(`You have exceeded the ${TableSizeBatchProcLimit} total table cell limit for processing larger batches 
on larger destination tables in the office script, please either reduce your destination table size or use a batch size of 1000 or less in the cloud flow.`) }\n // If the table is less than 1 million cells & not something likely big enough to make errors in the V2 batch processing method then use the batch processing, else use the V1 row by row create method that isn't as affected by table size, but does have a lower cloud flow batch size limit.\n // So if someone generally uses large batch sizes, but then their table grows past 1 million cells, then this will revert to the V1 row by row processing with the smaller batch file size limit and the cloud flow will start to error and they will need to switch their flow settings back to smaller batch sizes as well.\n if (TableSize < TableSizeBatchProcLimit && ForceMode1Processing != \"Yes\") {\n //Iterate through each object item in the array from the flow\n for (let i = 0; i < CreateData.length; i++) {\n //Create an empty row at the end of the 2D table array & increase RowNum by 1 for the next line updating that row\n TableData.push(new Array(ColumnCount))\n //Earlier less efficient code --- TableData.push(\",\".repeat(ColumnCount - 1).split(\",\"));\n RowNum++;\n\n //Iterate through each item or line of the current object\n for (let j = 0; j < Object.keys(CreateData[i]).length; j++) {\n //Create each value for each item or column given\n TableData[RowNum][Number(Object.keys(CreateData[i])[j])] = CreateData[i][Number(Object.keys(CreateData[i])[j])];\n }\n }\n //Get array of 1st row formulas to re-apply to columns after posting the updates to the table\n let FirstRowFormulas = [\"\"]\n for (let c = 0; c < ColumnCount; c++) {\n FirstRowFormulas.push(TableRangeNoHeader.getColumn(c).getRow(0).getFormula());\n }\n FirstRowFormulas.shift();\n\n // Run The Batch Create - (Batch update code source: https://docs.microsoft.com/en-us/office/dev/scripts/resources/samples/write-large-dataset)\n const CELLS_IN_BATCH = 15000;\n\n 
console.log(`Calling update range function...`);\n const updated = updateRangeInBatches(TableRange.getCell(0, 0), TableData, 10000);\n if (!updated) {\n console.log(`Update did not take place or complete. Check and run again.`);\n }\n\n function updateRangeInBatches(\n startCell: ExcelScript.Range,\n values: (string | boolean | number)[][],\n cellsInBatch: number\n 😞 boolean {\n const startTime = new Date().getTime();\n console.log(`Cells per batch setting: ${cellsInBatch}`);\n\n // Determine the total number of cells to write.\n const totalCells = values.length * values[0].length;\n console.log(`Total cells to update in the target range: ${totalCells}`);\n if (totalCells <= cellsInBatch) {\n console.log(`No need to batch -- updating directly`);\n updateTargetRange(startCell, values);\n return true;\n }\n\n // Determine how many rows to write at once.\n const rowsPerBatch = Math.floor(cellsInBatch / values[0].length);\n console.log(\"Rows per batch: \" + rowsPerBatch);\n let rowCount = 0;\n let totalRowsUpdated = 0;\n let batchCount = 0;\n\n // Write each batch of rows.\n for (let i = 0; i < values.length; i++) {\n rowCount++;\n if (rowCount === rowsPerBatch) {\n batchCount++;\n console.log(`Calling update next batch function. Batch#: ${batchCount}`);\n updateNextBatch(startCell, values, rowsPerBatch, totalRowsUpdated);\n\n // Write a completion percentage to help the user understand the progress.\n rowCount = 0;\n totalRowsUpdated += rowsPerBatch;\n console.log(`${((totalRowsUpdated / values.length) * 100).toFixed(1)}% Done`);\n }\n }\n console.log(`Updating remaining rows -- last batch: ${rowCount}`)\n if (rowCount > 0) {\n updateNextBatch(startCell, values, rowCount, totalRowsUpdated);\n }\n let endTime = new Date().getTime();\n console.log(`Completed ${totalCells} cells update. It took: ${((endTime - startTime) / 1000).toFixed(6)} seconds to complete. 
${((((endTime - startTime) / 1000)) / cellsInBatch).toFixed(8)} seconds per ${cellsInBatch} cells-batch.`);\n return true;\n }\n /**\n * A helper function that computes the target range and updates. \n */\n function updateNextBatch(\n startingCell: ExcelScript.Range,\n data: (string | boolean | number)[][],\n rowsPerBatch: number,\n totalRowsUpdated: number\n ) {\n const newStartCell = startingCell.getOffsetRange(totalRowsUpdated, 0);\n const targetRange = newStartCell.getResizedRange(rowsPerBatch - 1, data[0].length - 1);\n console.log(`Updating batch at range ${targetRange.getAddress()}`);\n const dataToUpdate = data.slice(totalRowsUpdated, totalRowsUpdated + rowsPerBatch);\n try {\n targetRange.setValues(dataToUpdate);\n } catch (e) {\n throw `Error while updating the batch range: ${JSON.stringify(e)}`;\n }\n return;\n }\n /**\n * A helper function that computes the target range given the target range's starting cell\n * and selected range and updates the values.\n */\n function updateTargetRange(\n targetCell: ExcelScript.Range,\n values: (string | boolean | number)[][]\n ) {\n const targetRange = targetCell.getResizedRange(values.length - 1, values[0].length - 1);\n console.log(`Updating the range: ${targetRange.getAddress()}`);\n try {\n targetRange.setValues(values);\n } catch (e) {\n throw `Error while updating the whole range: ${JSON.stringify(e)}`;\n }\n return;\n }\n //Re-apply the formulas from the 1st row to the entire columns after the update\n for (let f = 0; f < ColumnCount; f++) {\n if (FirstRowFormulas[f].toString().startsWith(\"=\")) {\n TableRangeNoHeader.getColumn(f).getRow(0).setFormula(FirstRowFormulas[f])\n TableRangeNoHeader.getColumn(f).getRow(0).autoFill(table.getRangeBetweenHeaderAndTotal().getColumn(f).getAddress(), ExcelScript.AutoFillType.fillDefault)\n }\n }\n }\n // Create row by row if the table is too large\n else {\n let Blank2dArray = new Array(CreateData.length).fill(null).map(() => new Array(ColumnCount).fill(null));\n let 
TableData = Blank2dArray;\n //Iterate through each object item in the array from the flow\n for (let i = 0; i < CreateData.length; i++) {\n //i is also the row number\n //Iterate through each item or line of the current object\n for (let j = 0; j < Object.keys(CreateData[i]).length; j++) {\n //Create each value for each item or column given\n TableData[i][Number(Object.keys(CreateData[i])[j])] = CreateData[i][Number(Object.keys(CreateData[i])[j])];\n }\n }\n //Append the in-memory TableData to the Excel table\n //Split the in-memory TableData into batches that addRows can handle without error & loop through to add each batch in addRows\n let BatchSize = 200;\n //length - 1 prevents a later slice(n, n) or slice(1000, 1000) which would return a null array & error the addRows\n let NumBatches = Math.floor((TableData.length - 1) / BatchSize);\n for (let k = 0; k <= NumBatches; k++) {\n let BatchRows = TableData.slice(k * BatchSize, (k + 1) * BatchSize)\n table.addRows(-1, BatchRows);\n }\n }\n}\n//The 1st few column indexes are provided incase anyone edits this for something else later. But the current flow & scripts should work fine with only the PK column here.\ninterface createdata {\n '0': (string | undefined),\n '1': (string | undefined),\n '2': (string | undefined),\n '3': (string | undefined),\n '4': (string | undefined),\n '5': (string | undefined),\n 'PK': (string | undefined)\n}","runAfter":{"Office_Script_Batch_Update":["Succeeded"]},"description":"Go to an online Excel table, go to the Automate tab, select New Script, then replace the placeholder script code by pasting all this in the code editor. 
You may want to name the script BatchCreateV5.","metadata":{"operationMetadataId":"f8ccc556-b345-429e-8c51-af7bb6b42452"}},"List_rows_Sample_source_data_-Placeholder-":{"type":"OpenApiConnection","inputs":{"host":{"connectionName":"shared_excelonlinebusiness","operationId":"GetItems","apiId":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness"},"parameters":{"source":"me","drive":"b!y2rxipfTvUi7ALZMiyxLa89aQqkbMORMksjFtKI_dfKgJA8V37mjQICJm9mYgy-T","file":"01JWBUU4HB4HVGZLNYHNDLYN3M3UGHB2UT","table":"{00000000-000C-0000-FFFF-FFFF00000000}","$top":2000,"dateTimeFormat":"ISO 8601"},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{"Office_Script_Batch_Create":["Succeeded"]},"description":"Pagination set to 25000. Initial top count set to a 2000 row batch.","runtimeConfiguration":{"paginationPolicy":{"minimumItemCount":25000}},"limit":{"timeout":"PT5M"},"metadata":{"01PCEUDVFYX74MYI46EJCZF6AUHSUSPNMU":"/SourceData.xlsx","operationMetadataId":"408aee2b-05c0-44e2-8468-78c78fb0530d","tableId":"{00000000-000C-0000-FFFF-FFFF00000000}","01JWBUU4EEP2QHP4DFINBKWWCSYPOMRAA5":"/SourceData.xlsx","01JWBUU4HB4HVGZLNYHNDLYN3M3UGHB2UT":"/SourceTable.xlsx"}},"Match_new_and_existing_data_key_values_then_batch_update":{"type":"Scope","actions":{"List_rows_Get_header_sample":{"type":"OpenApiConnection","inputs":{"host":{"connectionName":"shared_excelonlinebusiness","operationId":"GetItems","apiId":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness"},"parameters":{"source":"me","drive":"b!y2rxipfTvUi7ALZMiyxLa89aQqkbMORMksjFtKI_dfKgJA8V37mjQICJm9mYgy-T","file":"01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5","table":"{00000000-000C-0000-FFFF-FFFF00000000}","$top":2},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{"SelectGenerateUpdateData":["Succeeded"]},"description":"Input the Location, Doc 
Library, File, & Table info to get the table you want to update. Not meant to get the data, only headers, so it is limited to the 1st few rows. If your table is blank with no rows, you need to add a blank row to run the flow.","metadata":{"01OVBUQJMSNAHSGEXJCNFISQKOJWJ2ZY3N":"/Book.xlsx","operationMetadataId":"64ff543c-8881-4e6c-b49e-7bbadd6fbdd2","tableId":"{00000000-000C-0000-FFFF-FFFF00000000}","01PCEUDVFYX74MYI46EJCZF6AUHSUSPNMU":"/Book.xlsx","01PCEUDVCCZ7IYLT5RUBC3N6VXZWCYFNNI":"/DestinationData.xlsx","01JWBUU4DRJEFUSQGBG5E2QSRSXUIHVZNB":"/DestinationData.xlsx","01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5":"/DestinationTable.xlsx"}},"SelectGenerateUpdateData":{"type":"Select","inputs":{"from":"@outputs('List_rows_Sample_source_data_-Placeholder-')?['body/value']","select":{"PrimaryKey":"@item()?['SourceC1']","DestinationC2":"@item()?['SourceC2']","DestinationC3":"@item()?['SourceC3']","DestinationC4":"@item()?['SourceC4']","DestinationC5":"@item()?['SourceC5']","DestinationC6":"@item()?['SourceC6']","DestinationC17":"@null","DestinationC18":"","Destination-C 27":"1","DestinationC28":"@if(empty(item()?['SourceC6']), 0, 1)"}},"runAfter":{},"description":" Except PrimaryKey, input EXACT Excel column names on the left & updated values from other sources on the right. Set the PrimaryKey row /w matching key values from the other source. 
Blank '' values don't alter data, null values update cells to empty.","metadata":{"operationMetadataId":"b221bdd9-0a32-43e4-8498-a8116e6b0a83"}},"Run_script_Update_Excel_rows":{"type":"OpenApiConnection","inputs":{"host":{"connectionName":"shared_excelonlinebusiness","operationId":"RunScriptProd","apiId":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness"},"parameters":{"source":"me","drive":"b!y2rxipfTvUi7ALZMiyxLa89aQqkbMORMksjFtKI_dfKgJA8V37mjQICJm9mYgy-T","file":"01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5","scriptId":"ms-officescript%3A%2F%2Fonedrive_business_itemlink%2F01JWBUU4CI62ZOXSKDIJD2KRGYGN2R4NX4","ScriptParameters/TableName":"Table1","ScriptParameters/PrimaryKeyColumnName":"GeneratePK","ScriptParameters/ForceMode1Processing":"No","ScriptParameters/UpdatedData":"@outputs('ComposeDestinationJSON')"},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{"ComposeDestinationJSON":["Succeeded"]},"description":"Change Location, Doc Library, File, Script, TableName, PrimaryKeyColumnName, & KeepFormula fields to match your use-case. Make sure the primary key column listed is the column you want to be matched with the \"Generate update data\" PrimaryKey values.","limit":{"timeout":"PT15M"},"metadata":{"01PCEUDVFYX74MYI46EJCZF6AUHSUSPNMU":"/Book.xlsx","operationMetadataId":"713f593f-c5e4-43b6-ad08-3a71051bbf87","tableId":null,"01PCEUDVCCZ7IYLT5RUBC3N6VXZWCYFNNI":"/DestinationData.xlsx","01JWBUU4DRJEFUSQGBG5E2QSRSXUIHVZNB":"/DestinationData.xlsx","01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5":"/DestinationTable.xlsx"}},"SelectDestinationColumnNumbersByName":{"type":"Select","inputs":{"from":"@range(0, sub(length(body('SelectReformatHeaders')), 1))","select":{"@{split(body('SelectReformatHeaders')[item()], '\":')?[0]}":"@item()"}},"runAfter":{"SelectReformatHeaders":["Succeeded"]},"description":"This action will error if the table you want to update has 0 rows. 
You will need to add a blank row to your table so the List rows Get header sample action can return the header column name values.","metadata":{"operationMetadataId":"87bfb21b-ef06-4f23-99dc-20b0b95e4f0c"}},"ReformatHeaderObject":{"type":"Compose","inputs":"@json(replace(replace(replace(string(body('SelectDestinationColumnNumbersByName')), '[{', '{'), '}]', '}'), '},{', ','))","runAfter":{"SelectDestinationColumnNumbersByName":["Succeeded"]},"description":"Convert the array of JSON objects to a single searchable JSON object","metadata":{"operationMetadataId":"69b1d189-d56f-45fc-8dbb-a49740832a64"}},"SelectArrayOfArraysUpdateData":{"type":"Select","inputs":{"from":"@body('SelectAddRecordBreakToEach')","select":"@join(json(replace(\r\nreplace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(\r\nconcat('[', skip(take(string(item()), sub(length(string(item())), 1)), 1), ']'),\r\n'\":\"', '\",\":\",\"'), '\":\",\":\",', '\",\":\",\":'),\r\n'\":-', '\",\":\",-'), '\",\":\",\",\":\",-', '\",\":\",\":-'),\r\n'\":0', '\",\":\",0'), '\",\":\",\",\":\",0', '\",\":\",\":0'),\r\n'\":1', '\",\":\",1'), '\",\":\",\",\":\",1', '\",\":\",\":1'),\r\n'\":2', '\",\":\",2'), '\",\":\",\",\":\",2', '\",\":\",\":2'),\r\n'\":3', '\",\":\",3'), '\",\":\",\",\":\",3', '\",\":\",\":3'),\r\n'\":4', '\",\":\",4'), '\",\":\",\",\":\",4', '\",\":\",\":4'),\r\n'\":5', '\",\":\",5'), '\",\":\",\",\":\",5', '\",\":\",\":5'),\r\n'\":6', '\",\":\",6'), '\",\":\",\",\":\",6', '\",\":\",\":6'),\r\n'\":7', '\",\":\",7'), '\",\":\",\",\":\",7', '\",\":\",\":7'),\r\n'\":8', '\",\":\",8'), '\",\":\",\",\":\",8', '\",\":\",\":8'),\r\n'\":9', '\",\":\",9'), '\",\":\",\",\":\",9', '\",\":\",\":9'),\r\n'\":[', '\",\":\",['), '\",\":\",\",\":\",[', '\",\":\",\":['),\r\n'\":{', '\",\":\",{'), 
'\",\":\",\",\":\",{', '\",\":\",\":{'),\r\n'\":null', '\",\":\",\"null\"'), '\",\":\",\",\":\",\"null\"', '\",\":\",\":\"null\"'),\r\n'\":true', '\",\":\",\"true\"'), '\",\":\",\",\":\",\"true\"', '\",\":\",\":\"true\"'),\r\n'\":false', '\",\":\",\"false\"'), '\",\":\",\",\":\",\"false\"', '\",\":\",\":\"false\"'),\r\n',\":\",', ','), '_#-', '_#_')), '_#-')"},"runAfter":{"SelectAddRecordBreakToEach":["Succeeded"]},"description":"And filter out the JSON colon : items","metadata":{"operationMetadataId":"a5f08622-a062-4a8a-9c34-b09b0bd9b895"}},"SelectAddRecordBreakToEach":{"type":"Select","inputs":{"from":"@body('SelectGenerateUpdateData')","select":"@removeProperty(\r\naddProperty(\r\naddProperty(item(), 'PK', item()['PrimaryKey']),\r\n'_|-', ''),\r\n'PrimaryKey')"},"runAfter":{"ReformatHeaderObject":["Succeeded"]},"metadata":{"operationMetadataId":"33f058fd-c61a-4580-8957-7c263906bd72"}},"ComposeDestinationJSON":{"type":"Compose","inputs":"@json(concat('[{',\r\nskip(\r\ntake(\r\nreplace(join(body('FilterOnlyKeyValueStrings'), ','), ',\"_|-\":\"\",', '},{'),\r\nsub(length(replace(join(body('FilterOnlyKeyValueStrings'), ','), ',\"_|-\":\"\",', '},{')), 9)),\r\n0),\r\n'}]'))","runAfter":{"FilterOnlyKeyValueStrings":["Succeeded"]},"description":"Exceed the 210,000 ProcessAllCells limit & this may error & not parse. 
Either reduce your batch size or add more SelectDestinationIndexCells # actions to the ProcessAllCells Scope & FilterOnlyKeyValueStrings.","metadata":{"operationMetadataId":"9fc0353c-c586-4590-9366-c6df90dda58a"}},"FilterOnlyKeyValueStrings":{"type":"Query","inputs":{"from":"@json(replace(replace(concat(\r\nstring(body('SelectDestinationIndexCells')), '-#-',\r\nstring(body('SelectDestinationIndexCells_2')), '-#-',\r\nstring(body('SelectDestinationIndexCells_3')), '-#-',\r\nstring(body('SelectDestinationIndexCells_4')), '-#-',\r\nstring(body('SelectDestinationIndexCells_5'))\r\n), '-#-[]', ''), ']-#-[', ','))","where":"@not(equals(item(), '_||'))"},"runAfter":{"ProcessAllCells":["Succeeded"]},"description":"Combine all processed cell arrays & filter blanks","metadata":{"operationMetadataId":"d6b0a71e-6b59-42b5-ac8e-771cf48d6814"}},"ProcessAllCells":{"type":"Scope","actions":{"SelectDestinationIndexCells":{"type":"Select","inputs":{"from":"@range(0, max(Min(100000, length(outputs('CustomDelimitedCells'))), 0))","select":"@if(equals(mod(item(), 2), 0),\r\nif(equals(outputs('CustomDelimitedCells')[add(item(), 1)], '\"\"'), '_||',\r\nif(or(equals(outputs('CustomDelimitedCells')[item()], 'PK'), equals(outputs('CustomDelimitedCells')[item()], '_|-')),\r\nconcat('\"', outputs('CustomDelimitedCells')[item()], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"'),\r\nconcat('\"', outputs('ReformatHeaderObject')[outputs('CustomDelimitedCells')[item()]], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"')\r\n)\r\n),\r\n'_||'\r\n)"},"runAfter":{"CustomDelimitedCells":["Succeeded"]},"description":"1st 100000 of flattened array. 
Will error if the Generate update data mapping doesn't exactly match the destination columns.","metadata":{"operationMetadataId":"e1692a16-ad6c-4d26-a285-12522d144839"}},"CustomDelimitedCells":{"type":"Compose","inputs":"@split(Join(body('SelectArrayOfArraysUpdateData'), '_#-'), '_#-')","runAfter":{},"metadata":{"operationMetadataId":"39e5ede0-3b02-469c-b6f7-15221fee5668"}},"SelectDestinationIndexCells_2":{"type":"Select","inputs":{"from":"@range(100000, max(min(sub(length(outputs('CustomDelimitedCells')), 100000), 100000), 0))","select":"@if(equals(mod(item(), 2), 0),\r\nif(equals(outputs('CustomDelimitedCells')[add(item(), 1)], '\"\"'), '_||',\r\nif(or(equals(outputs('CustomDelimitedCells')[item()], 'PK'), equals(outputs('CustomDelimitedCells')[item()], '_|-')),\r\nconcat('\"', outputs('CustomDelimitedCells')[item()], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"'),\r\nconcat('\"', outputs('ReformatHeaderObject')[outputs('CustomDelimitedCells')[item()]], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"')\r\n)\r\n),\r\n'_||'\r\n)"},"runAfter":{"CustomDelimitedCells":["Succeeded"]},"description":"2nd 100000 of flattened array. 
Will error if the Generate update data mapping doesn't exactly match the destination columns.","metadata":{"operationMetadataId":"e1692a16-ad6c-4d26-a285-12522d144839"}},"SelectDestinationIndexCells_3":{"type":"Select","inputs":{"from":"@range(200000, max(min(sub(length(outputs('CustomDelimitedCells')), 200000), 100000), 0))","select":"@if(equals(mod(item(), 2), 0),\r\nif(equals(outputs('CustomDelimitedCells')[add(item(), 1)], '\"\"'), '_||',\r\nif(or(equals(outputs('CustomDelimitedCells')[item()], 'PK'), equals(outputs('CustomDelimitedCells')[item()], '_|-')),\r\nconcat('\"', outputs('CustomDelimitedCells')[item()], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"'),\r\nconcat('\"', outputs('ReformatHeaderObject')[outputs('CustomDelimitedCells')[item()]], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"')\r\n)\r\n),\r\n'_||'\r\n)"},"runAfter":{"CustomDelimitedCells":["Succeeded"]},"description":"3rd 100000 of flattened array. 
Will error if the Generate update data mapping doesn't exactly match the destination columns.","metadata":{"operationMetadataId":"e1692a16-ad6c-4d26-a285-12522d144839"}},"SelectDestinationIndexCells_4":{"type":"Select","inputs":{"from":"@range(300000, max(min(sub(length(outputs('CustomDelimitedCells')), 300000), 100000), 0))","select":"@if(equals(mod(item(), 2), 0),\r\nif(equals(outputs('CustomDelimitedCells')[add(item(), 1)], '\"\"'), '_||',\r\nif(or(equals(outputs('CustomDelimitedCells')[item()], 'PK'), equals(outputs('CustomDelimitedCells')[item()], '_|-')),\r\nconcat('\"', outputs('CustomDelimitedCells')[item()], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"'),\r\nconcat('\"', outputs('ReformatHeaderObject')[outputs('CustomDelimitedCells')[item()]], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"')\r\n)\r\n),\r\n'_||'\r\n)"},"runAfter":{"CustomDelimitedCells":["Succeeded"]},"description":"4th 100000 of flattened array. 
Will error if the Generate update data mapping doesn't exactly match the destination columns.","metadata":{"operationMetadataId":"e1692a16-ad6c-4d26-a285-12522d144839"}},"SelectDestinationIndexCells_5":{"type":"Select","inputs":{"from":"@range(400000, max(min(sub(length(outputs('CustomDelimitedCells')), 400000), 100000), 0))","select":"@if(equals(mod(item(), 2), 0),\r\nif(equals(outputs('CustomDelimitedCells')[add(item(), 1)], '\"\"'), '_||',\r\nif(or(equals(outputs('CustomDelimitedCells')[item()], 'PK'), equals(outputs('CustomDelimitedCells')[item()], '_|-')),\r\nconcat('\"', outputs('CustomDelimitedCells')[item()], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"'),\r\nconcat('\"', outputs('ReformatHeaderObject')[outputs('CustomDelimitedCells')[item()]], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"')\r\n)\r\n),\r\n'_||'\r\n)"},"runAfter":{"CustomDelimitedCells":["Succeeded"]},"description":"5th 100000 of flattened array. Will error if the Generate update data mapping doesn't exactly match the destination columns.","metadata":{"operationMetadataId":"e1692a16-ad6c-4d26-a285-12522d144839"}}},"runAfter":{"SelectArrayOfArraysUpdateData":["Succeeded"]},"description":"The Range( ) expression is required for this & has a limit of 100,000 items. This breaks things up if someone wants to load large tables up to 210,000 cells, ex: 7 columns & 30000 rows. 
Exceed 210,000 & the later JSON action will error & not parse.","metadata":{"operationMetadataId":"fec90f32-16ae-41f9-9b14-934fd19a3df3"}},"SelectReformatHeaders":{"type":"Select","inputs":{"from":"@skip(split(\r\nreplace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(\r\nstring(first(outputs('List_rows_Get_header_sample')?['body/value'])),\r\n'\":0', '\":\"0'),\r\n'\":1', '\":\"1'),\r\n'\":2', '\":\"2'),\r\n'\":3', '\":\"3'),\r\n'\":4', '\":\"4'),\r\n'\":5', '\":\"5'),\r\n'\":6', '\":\"6'),\r\n'\":7', '\":\"7'),\r\n'\":8', '\":\"8'),\r\n'\":9', '\":\"9'),\r\n'\":[', '\":\"['),\r\n'\":{', '\":\"{'),\r\n'\":null', '\":\"null\"'),\r\n'\":true', '\":\"true\"'),\r\n'\":false', '\":\"false\"'),\r\n'\":\"'), 2)","select":"@last(split(item(), ',\"'))"},"runAfter":{"List_rows_Get_header_sample":["Succeeded"]},"metadata":{"operationMetadataId":"dfd7c59f-8346-4a15-8f41-b5fe6200c746"}}},"runAfter":{"List_rows_Sample_source_data_-Placeholder-":["Succeeded"]},"description":"Fill out the Generate update data, List rows Get header sample, & Run script Update Excel rows actions with your data, workbook, & script information.","metadata":{"operationMetadataId":"175119c4-e881-4899-81d6-14aaa7f741f8"}},"Get_records_not_in_found_in_table_then_batch_create":{"type":"Scope","actions":{"Filter_array_Get_records_not_found_in_table":{"type":"Query","inputs":{"from":"@outputs('ComposeDestinationJSON')","where":"@contains(outputs('Run_script_Update_Excel_rows')?['body/result'], item()['PK'])"},"runAfter":{},"description":"Get the records from the batch DestinationJSON action where the record primary key is in the list of primary keys not found in the Run script Update Excel 
action","metadata":{"operationMetadataId":"c340f3a3-1696-4ebb-9d2b-3b0463ae3773"}},"Run_script_Create_Excel_rows":{"type":"OpenApiConnection","inputs":{"host":{"connectionName":"shared_excelonlinebusiness","operationId":"RunScriptProd","apiId":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness"},"parameters":{"source":"me","drive":"b!y2rxipfTvUi7ALZMiyxLa89aQqkbMORMksjFtKI_dfKgJA8V37mjQICJm9mYgy-T","file":"01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5","scriptId":"ms-officescript%3A%2F%2Fonedrive_business_itemlink%2F01JWBUU4F2LHYL4XTPENFZJI3WGXKF6ZWM","ScriptParameters/TableName":"Table1","ScriptParameters/PrimaryKeyColumnName":"GeneratePK","ScriptParameters/ForceMode1Processing":"No","ScriptParameters/CreateData":"@body('Filter_array_Get_records_not_found_in_table')"},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{"Filter_array_Get_records_not_found_in_table":["Succeeded"]},"description":"Change Location, Doc Library, File, Script, TableName, PrimaryKeyColumnName, & KeepFormulas fields to match your use-case. Make sure the primary key column listed is the column you want to be matched with the \"Generate update data\" PrimaryKey values.","limit":{"timeout":"PT5M"},"metadata":{"01PCEUDVFYX74MYI46EJCZF6AUHSUSPNMU":"/Book.xlsx","operationMetadataId":"713f593f-c5e4-43b6-ad08-3a71051bbf87","tableId":null,"01PCEUDVCCZ7IYLT5RUBC3N6VXZWCYFNNI":"/DestinationData.xlsx","01JWBUU4DRJEFUSQGBG5E2QSRSXUIHVZNB":"/DestinationData.xlsx","01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5":"/DestinationTable.xlsx"}}},"runAfter":{"Match_new_and_existing_data_key_values_then_batch_update":["Succeeded"]},"description":"Fill out the Run script Create Excel rows action with your data, workbook, & script information.","metadata":{"operationMetadataId":"9e2a036b-c6aa-461a-81f3-efed57aa2f71"}}},"runAfter":{}}}
New action menu tab to paste the code into...
New Scope in Your Flow...
Likely errors in the Select action expressions...
Again, for each Select action: copy the expression text listed on the action, then remove it from the action. Use the map button on the right side of the action to toggle to the table view & back to the single-textbox view. Click in the textbox to bring up the content menu & Expression tab. Then paste the copied expression text back into the expression editor, removing any leading or trailing characters that do not belong in the expression input, such as a starting '@' or an ending '}'.
@RobFol
If you see the option to select a new non-OpExOptimization resource on your screen, then go there to select a new account for the resource. Otherwise, if the standard import method is not letting you do anything, you can try the alternate import method: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Create-Update-and-Upsert/m-p...
Takolota, thank you for helping me import the flow.
I was able to successfully modify the flow to use my Excel source and destination files, so I went a step further and switched to a SharePoint list as my source, and I am running into a problem with the ComposeDestinationJSON step.
I am getting the error below. I have only shown the first few records, but looking at the entire error message, I noticed it contains 20 complete records, and the 21st record is cut off in the middle of the 13th column of data. The data I am working with is a list of questions for a quiz app, so the questions and their response choices can be quite wordy. I counted the characters returned in the error message from the first record to the last, and it was about 8,009 characters. Is there a limit to the number of characters that can be in this JSON?
The error:
Unable to process template language expressions in action 'ComposeDestinationJSON' inputs at line '0' and column '0': 'The template language function 'json' parameter is not valid. The provided value '[{"1":"Which of the following protocol is used for remote terminal connection service?","2":"(A) RARP|(B) UDP|(C) FTP|(D) TELNET","3":"(D) TELNET","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:44","13":"2021-08-02 13:29:37","14":"1","PK":"1"},{"1":"Which of the following terms is just the collection of networks that can be joined together?","2":"(A) Intranet|(B) Extranet|(C) Internet|(D) LAN","3":"(A) Intranet","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:46","13":"2021-08-02 13:29:38","14":"2","PK":"2"},{"1":"Bluetooth is an example of","2":"(A) Wide area network|(B) Virtual private network|(C) Local area network|(D) Personal area network","3":"(D) Personal area network","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:47","13":"2021-08-02 13:29:38","14":"3","PK":"3"},
I did a test to see if I could load the first 20 records by setting the Pagination limit on the Get items step to 20. I also changed CELLS_IN_BATCH to 1000 in both the BatchCreate and BatchUpdate scripts. This allowed the flow to work, and it exported 100 records. I am not sure how it came up with that number. There are over 7,000 records in the SharePoint list and 15 columns, two of which are multiline columns.
I have seen in previous posts you mention limiting the batch size. Are you referring to the CELLS_IN_BATCH setting?
Any help you could give to understand the issue with the Compose DestinationJSON step would be greatly appreciated.
Error messages have a relatively small character limit, but that shouldn't affect the JSON itself. There is likely another issue.
And you shouldn’t need to change anything in the script. Anything that may need to be changed has an input on the script action in the flow.
I suspect your alterations may be to the table size limit, so you may just be forcing the script into Mode 1 processing, which you can do right from the script action inputs themselves.
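(For reference on how that batch sizing works: in the write-large-dataset sample, the script caps each setValues() write at roughly a fixed number of cells, so rows per batch are derived from the column count. A rough TypeScript sketch, with the constant's value assumed rather than taken from the template:)

```typescript
// Assumed cap on cells written per setValues() call, in the style of the
// write-large-dataset sample's CELLS_IN_BATCH constant (value is illustrative).
const CELLS_IN_BATCH = 15000;

function rowsPerBatch(columnCount: number): number {
  // At least one row per batch, even for extremely wide tables.
  return Math.max(1, Math.floor(CELLS_IN_BATCH / columnCount));
}

// A 15-column table would be written 1000 rows at a time.
const batchRows = rowsPerBatch(15);
```

Lowering the constant trades fewer cells per write (and so more API round trips) for smaller, safer payloads.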
Also, if you are having an issue with the ComposeDestinationJSON action, please screenshot the action in both the flow editor & the flow run.
Thanks
I've done some tests here where I started with a Pagination setting of 100 and increased it by 100 after each successful run. When I got to 700 it ran fine, but if I set it to 701 or greater it failed. I then wanted to test whether the 701st record was the issue, so I set the filter on the Get items step to ID ne ‘1’ to skip the first record and pull in the 701st record, and set the pagination threshold back to 700. The flow ran fine.
json(concat('[{',
skip(
take(
replace(join(body('FilterOnlyKeyValueStrings'), ','), ',"_|-":"",', '},{'),
sub(length(replace(join(body('FilterOnlyKeyValueStrings'), ','), ',"_|-":"",', '},{')), 9)),
0),
'}]'))
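If it helps to see what that expression is doing, here is a small TypeScript sketch of the same transformation on made-up key/value strings: join the cell strings, turn each `,"_|-":"",` record marker into a `},{` record break, drop the trailing 9-character marker, and wrap the result in `[{` … `}]` before parsing.

```typescript
// Hypothetical output of FilterOnlyKeyValueStrings: one '"key":"value"'
// string per cell, with a '"_|-":""' marker ending each record.
const keyValueStrings: string[] = [
  '"1":"Question A"', '"PK":"1"', '"_|-":""',
  '"1":"Question B"', '"PK":"2"', '"_|-":""',
];

// join(body('FilterOnlyKeyValueStrings'), ',')
const joined = keyValueStrings.join(",");
// replace(..., ',"_|-":"",', '},{') -- split/join replaces every occurrence
const withBreaks = joined.split(',"_|-":"",').join("},{");
// take(..., sub(length(...), 9)) drops the trailing 9-character ',"_|-":""'
const trimmed = withBreaks.slice(0, withBreaks.length - 9);
// json(concat('[{', ..., '}]'))
const destinationJson = JSON.parse("[{" + trimmed + "}]");
```

Anything that leaves that concatenated string invalid as JSON (an unescaped character inside a value, a mismatched marker) will make the final parse step fail.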
Start of error message.
Unable to process template language expressions in action 'ComposeDestinationJSON' inputs at line '0' and column '0': 'The template language function 'json' parameter is not valid. The provided value '[{"1":"Which of the following protocol is used for remote terminal connection service?","2":"(A) RARP|(B) UDP|(C) FTP|(D) TELNET","3":"(D) TELNET","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:44","13":"2021-08-02 13:29:37","14":"1","PK":"1"},{"1":"Which of the following terms is just the collection of networks that can be joined together?","2":"(A) Intranet|(B) Extranet|(C) Internet|(D) LAN","3":"(A) Intranet","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:46","13":"2021-08-02 13:29:38","14":"2","PK":"2"},{"1":"Bluetooth is an example of","2":"(A) Wide area network|(B) Virtual private network|(C) Local area network|(D) Personal area network","3":"(D) Personal area network","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:47","13":"2021-08-02 13:29:38","14":"3","PK":"3"},{"1":"The application layer of the OSI model is","2":"(A) Four layer|(B) Five layer|(C) Six layer|(D) Seven layer","3":"(D) Seven layer","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:48","13":"2021-08-02 13:29:39","14":"4","PK":"4"},{"1":"FDDI is","2":"(A) Bus based network|(B) Mesh network|(C) Star network|(D) Ring network","3":"(D) Ring network","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:49","13":"2021-08-02 13:29:39","14":"5","PK":"5"},{"1":"Which of the following is the fastest type of computer?","2":"(A) Personal computer|(B) Workstation|(C) 
Notebook|(D) Laptop","3":"(D) Laptop","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:50","13":"2021-08-02 13:29:40","14":"6","PK":"6"},{"1":"Which can read data and convert them to a form that a computer can use?","2":"(A) Control|(B) Input device|(C) Storage|(D) Logic","3":"(B) Input device","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:51","13":"2021-08-02 13:29:40","14":"7","PK":"7"},{"1":"A device which can be connected to a network without using cable is called","2":"(A) Distributed device|(B) Centralized device|(C) Open-source device|(D) Wireless device","3":"(D) Wireless device","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:52","13":"2021-08-02 13:29:41","14":"8","PK":"8"},{"1":"The vast network of computers that connects millions of people all over the world is called","2":"(A) Internet|(B) Hypertext|(C) LAN|(D) Web","3":"(A) Internet","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:54","13":"2021-08-02 13:29:41","14":"9","PK":"9"},{"1":"The cheapest modems can transmit","2":"(A) 300 bits per second|(B) 1,200 bits per second|(C) 2,400 bits per second|(D) None of these","3":"(A) 300 bits per second","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:55","13":"2021-08-02 13:29:42","14":"10","PK":"10"},{"1":"Computers connected to a LAN (Local Area Network) can","2":"(A) Share information and/or share peripheral equipment|(B) E-mail|(C) Go on line|(D) Run faster","3":"(A) Share information and/or share peripheral 
equipment","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:56","13":"2021-08-02 13:29:42","14":"11","PK":"11"},{"1":"Most important advantage of an IC is its","2":"(A) Low power consumption|(B) Extremely high reliability|(C) Reduced cost|(D) Easy replacement in case of circuit failure","3":"(B) Extremely high reliability","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:58","13":"2021-08-02 13:29:43","14":"12","PK":"12"},{"1":"The data rate of the ISDN basic access B channel is","2":"(A) 32 Kbps|(B) 64 Kbps|(C) 144 Kbps|(D) 192 Kbps","3":"(B) 64 Kbps","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:58","13":"2021-08-02 13:29:43","14":"13","PK":"13"},{"1":"Null modems are type of","2":"(A) Modem|(B) Modem eliminator|(C) Protocol converter|(D) Multiplexer","3":"(B) Modem eliminator","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:28:59","13":"2021-08-02 13:29:44","14":"14","PK":"14"},{"1":"The geostationary satellite used for communications systems","2":"(A) is positioned over equator|(B) remains stationery relative to the earth|(C) rotates with the earth|(D) all of these","3":"(D) all of these","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:29:01","13":"2021-08-02 13:29:44","14":"15","PK":"15"},{"1":"Which of the following is not a disadvantage of wireless LAN?","2":"(A) Slower data transmission|(B) Higher error rate|(C) Interference of transmissions from different computers|(D) All of the above","3":"(D) All of the above","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert 
Foletta","12":"2021-12-14 03:29:02","13":"2021-08-02 13:29:45","14":"16","PK":"16"},{"1":"Networks are monitored by security personnel and supervised by _____ who set(s) up accounts and passwords for authorized network users.","2":"(A) Password administrators|(B) Network administrators|(C) The government|(D) IT managers","3":"(B) Network administrators","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:29:03","13":"2021-08-02 13:29:45","14":"17","PK":"17"},{"1":"______ is a technique that is used to send more than one call over a single line.","2":"(A) Digital transmission|(B) Infrared transmission|(C) Digitizing|(D) Multiplexing","3":"(D) Multiplexing","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:29:04","13":"2021-08-02 13:29:47","14":"18","PK":"18"},{"1":"The portion of server memory set aside to temporary hold data packets from workstations is called","2":"(A) Dirty packet buffer|(B) Packet block|(C) Packet receive buffer|(D) Packet cache buffer","3":"(C) Packet receive buffer","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:29:05","13":"2021-08-02 13:29:47","14":"19","PK":"19"},{"1":"What frequency range is used for FM radio transmission?","2":"(A) Very high frequency : 30MHz to 300 MHz|(B) High frequency : 3 MHz to 30 MHz|(C) Low frequency : 30kHz to 300 kHz|(D) Very low frequency : 3kHz to 30 kHz","3":"(A) Very high frequency : 30MHz to 300 MHz","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:29:06","13":"2021-08-02 13:29:47","14":"20","PK":"20"},{"1":"Band limited signal is","2":"(A) Simultaneous transmission of data to a number of stations|(B) Transmission of signals without modulation|(C) A signal all of whose 
energy is contained within a finite frequency range|(D) All of these","3":"(C) A signal all of whose energy is contained within a finite frequency range","4":"Networking","5":"SD","6":"USS","7":"Generic","8":"Technical","9":"","10":"SingleSelect","11":"Robert Foletta","12":"2021-12-14 03:29:07","13":"2021-08-0
End of error message.
Please let me know if you need anything else to isolate this issue. Thanks.
Two of the columns in the source data are multiline text columns.
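That multiline detail may well be the culprit: the DestinationJSON string is assembled by concatenation, and raw control characters such as newlines are illegal inside JSON string literals, so the json() function can fail on them even when quote characters have been sanitized. A minimal TypeScript sketch with a made-up value:

```typescript
// Made-up multiline value, like text from a multiline SharePoint column.
const multilineValue = "Line one\nLine two";

// Hand-built JSON string: the raw newline is an illegal control
// character inside a JSON string literal, so parsing throws.
const handBuilt = '[{"1":"' + multilineValue + '","PK":"1"}]';
let parseFailed = false;
try {
  JSON.parse(handBuilt);
} catch {
  parseFailed = true; // e.g. "Bad control character in string literal"
}

// Properly escaped JSON round-trips fine: stringify emits "\n".
const safe = JSON.stringify([{ "1": multilineValue, PK: "1" }]);
const roundTripped = JSON.parse(safe);
```

That would explain why the failure tracks which records are included (some rows contain line breaks, some don't) rather than a fixed character limit.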
By the way, this is my third attempt to reply here. My previous attempts appeared to work, but shortly after posting I saw that the replies disappeared. I hope this one stays.
I believe 2 records were deleted from the list, so if I pull 700 records it will include record IDs 701 and 702. When I skipped record 1, I got record 703 as my 700th record.
Does that answer your question?
The records are questions for an assessment app. There is a Response column that has 4 responses separated by a pipe character.
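For what it's worth, pipe characters need no escaping in JSON, so the pipe-delimited Response column by itself shouldn't be what breaks the json() call. A quick TypeScript check with a sample cell:

```typescript
// Sample Response cell: four choices separated by pipes.
const responseCell = "(A) RARP|(B) UDP|(C) FTP|(D) TELNET";

// Pipes are plain characters inside a JSON string, so this parses fine.
const parsed = JSON.parse('{"Response":"' + responseCell + '"}');
const choices: string[] = parsed["Response"].split("|");
```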