cfoenki
Helper II

My flow seems to be stuck in a loop with SharePoint Update item

Hi everybody,

 

I'm struggling with an issue in my flow.

Basically, my flow takes a JSON table as input that I need to pass to a SharePoint list, either as an update when the element already exists or as a new item when it is new.

As I cannot determine which items changed since the last call on my JSON table, I need to update all items each time.


It works well when I have a loop of around 270 items (completed in 7 minutes).

cfoenki_0-1678985749667.png

When I look at my other loop, with a little over 1000 items, it is taking forever (still ongoing).

cfoenki_1-1678985793608.png

I took a look at the associated SharePoint list and it seems nothing has happened for the last 20 minutes (8 hours ago was my previous run).

cfoenki_2-1678985845639.png

I would be extremely grateful if you have a solution.

 

Thank you in advance!

 

1 ACCEPTED SOLUTION


Hopefully this will speed things up significantly. It checks whether you have new items and adds them, and checks whether any existing items were updated, updating only those. So it will only loop over new or updated items.

 

For this example, I've got the following Excel file with three worksheets. The rows highlighted in yellow are an item that's been updated and a new item.

grantjenkins_1-1679195176975.png

grantjenkins_2-1679195210451.png

grantjenkins_3-1679195238072.png

 

And the following SharePoint Lists. Note that I've renamed the Title field to Reference in this example.

grantjenkins_4-1679195327215.png

 

grantjenkins_5-1679195351108.png

 

grantjenkins_6-1679195373546.png

 

See the full flow below (for one parallel branch). I'll go into each of the actions. Note that this includes all the actions for the Deviations branch; the other two branches would be essentially the same.

grantjenkins_0-1679195068187.png

grantjenkins_18-1679196673559.png

grantjenkins_19-1679196690807.png

 

Run script is the same as what you already have.

grantjenkins_7-1679195486764.png

 

Get items Deviation returns all the items from the list. I've set Top Count to 5000, Pagination On, and a Threshold of 10000. This will retrieve items in batches of 5000 (instead of the default 100 which is slower), up to 10000 items.

grantjenkins_8-1679195576831.png

 

Filter array Deviation is the same as what you have where it filters out the Deviation object from the Office Script results.

grantjenkins_9-1679195622909.png

 

Select Excel Deviation is the same as what you have to extract out the properties. The only change is that I'm stripping out the first row (headers) within the From expression.

//From
skip(first(body('Filter_array_Deviation'))?['data'], 1)

//Reference, Process, Service, Creator
item()[0]
item()[1]
item()[2]
item()[3]

grantjenkins_10-1679195724337.png
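As a plain-code illustration (TypeScript, not part of the flow; the names are taken from the example columns above), the skip-plus-Select step amounts to dropping the header row and mapping each positional row to a named object:

```typescript
// The row shape mirrors the Reference/Process/Service/Creator columns above.
interface DeviationRow {
  Reference: string;
  Process: string;
  Service: string;
  Creator: string;
}

function selectExcelDeviation(data: string[][]): DeviationRow[] {
  // skip(..., 1) in the flow == data.slice(1) here: strips the header row.
  return data.slice(1).map((row) => ({
    Reference: row[0],
    Process: row[1],
    Service: row[2],
    Creator: row[3],
  }));
}

const sample = [
  ["Reference", "Process", "Service", "Creator"],
  ["DEV-001", "P1", "S1", "Alice"],
];
// Prints the single mapped row (the header row is gone).
console.log(selectExcelDeviation(sample));
```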

 

Parse JSON Deviation is the same as what you have. The Schema used is:

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Reference": {
                "type": "string"
            },
            "Process": {
                "type": "string"
            },
            "Service": {
                "type": "string"
            },
            "Creator": {
                "type": "string"
            }
        },
        "required": [
            "Reference",
            "Process",
            "Service",
            "Creator"
        ]
    }
}

grantjenkins_11-1679195793829.png

 

Items to be added

Firstly, I work out which items need to be added (those that don't exist in the list). To do this, I get all the References from the list, then check which items in the Excel results are not contained within that set - those are the new items we need to add.

 

Select Deviation References retrieves a simple array of all References from the list.

grantjenkins_12-1679195932879.png

 

Filter array Deviation To Be Added filters out the data so only items that need to be added are retrieved. From is using the output from Parse JSON Deviation. Output comes from Select Deviation References. Reference gets the Reference from the current item.

//From
body('Parse_JSON_Deviation')

//Output
body('Select_Deviation_References')

//Reference
item()['Reference']

grantjenkins_13-1679196137542.png
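Outside Power Automate, the same "to be added" logic can be sketched in TypeScript (names are illustrative, not actions from the flow):

```typescript
// Keep only the Excel rows whose Reference does not already exist in the
// SharePoint list - the equivalent of the "does not contain" filter above.
function itemsToBeAdded<T extends { Reference: string }>(
  excelRows: T[],
  listReferences: string[],
): T[] {
  const existing = new Set(listReferences);
  return excelRows.filter((row) => !existing.has(row.Reference));
}
```

The `listReferences` array plays the role of the Select Deviation References output; the Set just makes each membership check cheap.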

 

Items to be updated

Next, I work out which items have been updated (a difference in either Process, Service or Creator). I'm using some XPath expressions here.

 

XML Deviation converts our data from Select Excel Deviation to XML so we can use XPath.

xml(json(concat('{"root": { "value": ', body('Select_Excel_Deviation'), '}}')))

grantjenkins_14-1679196270321.png

 

Select Combined Deviation combines the existing data in our SharePoint List Items with the corresponding data from Excel based on the Reference (Title field in my example). It uses the following XPath expressions.

//ProcessXL
xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference="', item()?['Title'], '"]/Process/text())'))

//ServiceXL
xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference="', item()?['Title'], '"]/Service/text())'))

//CreatorXL
xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference="', item()?['Title'], '"]/Creator/text())'))

grantjenkins_15-1679196391006.png
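For intuition: each XPath lookup finds the Excel row with a given Reference and reads one of its columns. In plain TypeScript that is a Map lookup (an illustrative sketch, not part of the flow):

```typescript
interface ExcelRow {
  Reference: string;
  Process: string;
  Service: string;
  Creator: string;
}

// Build a Reference -> row index once; each lookup is then O(1) instead of
// scanning the XML document per field, as the XPath expressions do.
function buildIndex(rows: ExcelRow[]): Map<string, ExcelRow> {
  return new Map(rows.map((r) => [r.Reference, r]));
}

const index = buildIndex([
  { Reference: "A", Process: "P", Service: "S", Creator: "C" },
]);
console.log(index.get("A")?.Process); // → P
```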

 

Filter array Deviation To Be Updated uses the output from Select Combined Deviation and the following filter expression, which will check if any of the values have been updated in Excel.

@or(
    not(equals(item()?['Process'], item()?['ProcessXL'])),
    not(equals(item()?['Service'], item()?['ServiceXL'])),
    not(equals(item()?['Creator'], item()?['CreatorXL']))
)

grantjenkins_16-1679196478420.png
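The update check can be sketched the same way in TypeScript: an item needs an update when any list/Excel pair of values differs, which is exactly what the or(...) over the three not(equals(...)) does (illustrative names again):

```typescript
// The combined record carries both the list values (Process, ...) and the
// Excel values (ProcessXL, ...); keep the rows where any pair differs.
interface CombinedDeviation {
  ID: number;
  Reference: string;
  Process: string; Service: string; Creator: string;
  ProcessXL: string; ServiceXL: string; CreatorXL: string;
}

function itemsToBeUpdated(rows: CombinedDeviation[]): CombinedDeviation[] {
  return rows.filter(
    (r) =>
      r.Process !== r.ProcessXL ||
      r.Service !== r.ServiceXL ||
      r.Creator !== r.CreatorXL,
  );
}
```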

 

Parse JSON Deviation To Be Updated uses the output from Filter array Deviation To Be Updated, and the following Schema.

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "ID": {
                "type": "integer"
            },
            "Reference": {
                "type": "string"
            },
            "Process": {
                "type": "string"
            },
            "Service": {
                "type": "string"
            },
            "Creator": {
                "type": "string"
            },
            "ProcessXL": {
                "type": "string"
            },
            "ServiceXL": {
                "type": "string"
            },
            "CreatorXL": {
                "type": "string"
            }
        },
        "required": [
            "ID",
            "Reference",
            "Process",
            "Service",
            "Creator",
            "ProcessXL",
            "ServiceXL",
            "CreatorXL"
        ]
    }
}

grantjenkins_17-1679196560444.png

 

Apply to each Add Deviation iterates over each of the items from Filter array Deviation To Be Added. And within the loop, it uses Create item Deviation to add each of the new items.

grantjenkins_21-1679196825241.png

 

Apply to each Update Deviation iterates over each of the items from Parse JSON Deviation To Be Updated. And within the loop, it uses Update item Deviation to update each of the items that have changed.

grantjenkins_22-1679196909352.png

 

In my example, both loops will only iterate once, since I've only got one item that was updated and one item that was added. All going well, I would hope the entire flow (with all your data) completes within a couple of minutes.

 

The output in my Deviations Table Dashboard List after the flow runs is below.

grantjenkins_23-1679197355714.png


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.

View solution in original post

16 REPLIES

Hi @cfoenki ,

Maybe you can try enabling the concurrency setting on the [Apply to each] action.

This can improve the execution efficiency of [Apply to each] and reduce the execution time.

vchengfenmsft_0-1679021222409.png

 

 

Best Regards

Cheng Feng

cfoenki
Helper II

@v-chengfen-msft Thank you for the response. Unfortunately turning on Concurrency control made things worse 😞

grantjenkins
Super User

Are you able to show what you have in your Condition within the loop?

 

Also, does your List and JSON data have a unique property that you can match on (a way to identify which JSON object relates to which List item)?

 

If you could show a sample of your JSON data and the List item, that would help.


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
cfoenki
Helper II
Helper II

Sure, here is what's inside my loop:

cfoenki_0-1679043866683.png

I have 3 parallel loops for 3 different SharePoint lists. Could that cause the issue?

I tried saving my flow as a new one, keeping only one branch, and it works fine.

I'd suggest explaining exactly what your flow is trying to achieve (including the three SharePoint Lists), and providing a sample of the JSON data and the List data. This would help in perhaps redesigning the flow so it's more efficient.

 

Also a screenshot(s) of the entire flow as you have it now.


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
cfoenki
Helper II
Helper II

Thank you for taking the time to answer. I redesigned the flow and now it is working, but maybe you could help me make it more efficient, as mine takes about half an hour.

 

I have an Excel file in a SharePoint document library. There are 3 tables of interest in that Excel file: Deviation, CAPA and Other CAPA Plan.

My Excel file has no Table objects (and I cannot change that).

 

From the Excel file, I want to create 3 SharePoint lists that mirror the Excel table data. I then have 3 lists:

- Deviation Table Dashboard --> Should contain data from the "Deviation" tab of my Excel file

- CAPA Plan Table Dashboard --> Should contain data from the "Other CAPA Plan" tab of my Excel file

- CAPA Table Dashboard --> Should contain data from the "CAPA" tab of my Excel file

 

I want my SharePoint lists to be updated regularly with the Excel file updates.

 

My Sharepoint lists are linked to a PowerApps application.

 

I built an Office Script to extract the data for Power Automate.

Here is the script:

 

/**
 * This script returns the values from the used ranges on each worksheet.
 */
function main(workbook: ExcelScript.Workbook): WorksheetData[] {
  // Create an array to hold the data from each worksheet.
  let worksheetInformation: WorksheetData[] = [];

  // Get the data from every worksheet, one at a time.
  workbook.getWorksheets().forEach((sheet) => {
    // Skip the "Audit Trail" sheet.
    if (sheet.getName() !== "Audit Trail") {
      let values = sheet.getUsedRange()?.getValues();
      worksheetInformation.push({
        name: sheet.getName(),
        data: values as string[][]
      });
    }
  });

  return worksheetInformation;
}

// An interface to pass the worksheet name and cell values through a flow.
interface WorksheetData {
  name: string;
  data: string[][];
}

 

 

I use sheet.getUsedRange() to detect the range of data in each sheet.

 

In Power Automate, I use the Run script action to retrieve the output.

cfoenki_0-1679065791916.png

 

The resulting array has this kind of format (I trimmed the data a lot so it is not too long):

 

 

{
	"body": [
		{
			"name": "Deviation",
			"data": [
				[
					"Reference",
					"Process",
					"Service",
					"Creator",
				],
				[
					"#####",
					"ZZZZ",
					"YYYYY",
					"XXXX"
				],
				[
					"#####2",
					"ZZZZ2",
					"YYYYY2",
					"XXXX3"
				]
			]
		},
		{
			"name": "CAPA",
			"data": [
				[
					"ReferenceCAPA",
					"ProcessCAPA",
					"ServiceCAPA",
					"CreatorCAPA"
				],
				[
					"#####",
					"ZZZZ",
					"YYYYY",
					"XXXX"
				],
				[
					"#####2",
					"ZZZZ2",
					"YYYYY2",
					"XXXX3"
				]
			]
		},
		{
			"name": "Other CAPA Plan",
			"data": [
				[
					"ReferenceCAPAPlan",
					"ProcessCAPAPlan",
					"ServiceCAPAPlan",
					"CreatorCAPAPlan"
				],
				[
					"#####",
					"ZZZZ",
					"YYYYY",
					"XXXX"
				],
				[
					"#####2",
					"ZZZZ2",
					"YYYYY2",
					"XXXX3"
				]
			]
		}
	]
}

 

 

As I cannot use it like this, I first use a Filter array action to select the Deviation table results:

cfoenki_1-1679065862460.png

And then a Select action to convert the result to a JSON array

cfoenki_3-1679065953396.png

The output is like this (with my previous example):

 

 

{
	"body": [
		{
			"Reference": "Reference",
			"Process": "Process",
			"Service": "Service",
			"Creator": "Creator"
		},
		{
			"Reference": "#####",
			"Process": "ZZZZ",
			"Service": "YYYYY",
			"Creator": "XXXX"
		},
		{
			"Reference": "#####2",
			"Process": "ZZZZ2",
			"Service": "YYYYY2",
			"Creator": "XXXX2"
		},
		{
			"Reference": "#####3",
			"Process": "ZZZZ3",
			"Service": "YYYYY3",
			"Creator": "XXXX3"
		}
	]
}

 

 

To remove the header row, I use a Compose action with the skip function:

cfoenki_4-1679066271931.png

Then I use a Parse JSON action to easily retrieve my fields:

cfoenki_5-1679066323404.png

Then I retrieve all the current items in the SharePoint list (using pagination, as I may have a lot of items):

cfoenki_6-1679066439066.png

And finally I loop through all items of my JSON table.

 

cfoenki_7-1679066564495.png

 

In the loop, I first filter the output of Get items to check whether the reference is found in the existing list.

cfoenki_8-1679066663527.png

If the length is equal to 0, the item is new, so I use a Create item action to add a new line; if it is > 0, I use an Update item action to update the existing line.

cfoenki_11-1679067034040.pngcfoenki_12-1679067041088.png

The formula inside each field checks whether the value of the field is "N/A" or empty; if so, the field is set to null, otherwise it uses the corresponding Current item property.

 

If(or(equals(items('Loop_through_all_deviation_lines')['Reference'],''),equals(items('Loop_through_all_deviation_lines')['Reference'],'N/A')),null,items('Loop_through_all_deviation_lines')['Reference'])
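As a sketch of what that expression does (TypeScript, illustrative only): empty strings and the literal "N/A" become null, anything else passes through unchanged.

```typescript
// Mirrors If(or(equals(x, ''), equals(x, 'N/A')), null, x) from the flow.
function normalizeField(value: string): string | null {
  return value === "" || value === "N/A" ? null : value;
}

console.log(normalizeField("N/A")); // → null
```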

 

The formula in the Id field of Update Item is related to the previous filter function:

 

first(body('Filter_on_the_current_deviation_item'))?['ID']

 

 

And that is the end, except that this is repeated 3 times, as I have 3 tables.

 

Here is the screenshot of the whole flow for 1 list:

cfoenki_13-1679067329257.png

cfoenki_14-1679067348078.png

 

 

 

If you have an idea on how to optimize this please let me know 🙂

 

Thank you in advance.

What you have is pretty well constructed given what you're working with (an Excel file with three tables that aren't defined as Table objects, etc.). I'm assuming the main bottleneck is the loops, due to the Condition and Add/Update actions.

 

A few questions.

  1. How many items would be in the three tables (approximately).
  2. How many items would be in the existing SharePoint Lists?
  3. How many would be new items vs. existing items that needed to be updated (approximately)?
  4. Are the Reference values always unique (what you are using in your Condition)?
  5. Do you have access to Premium connectors?

----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
cfoenki
Helper II
Helper II

@grantjenkins You are right, the bottleneck is the Update/Create item loop.

 

After some tests on my existing flow: the flow takes 30-40 min to run when triggered manually, but with a Schedule trigger and a recurrence of once per day, the flow takes forever (as I write this, it is still ongoing after 7 hours...).

 

Here are the answers to the questions:

  1. How many items would be in the three tables (approximately)?

In Deviation I have roughly 300 items

In Other CAPA Plans 50 items

In CAPA 1000 items

 

  2. How many items would be in the existing SharePoint Lists?

Same as in Excel (mirror)

  3. How many would be new items vs. existing items that needed to be updated (approximately)?

New items for Deviation: approximately 5 per week; updates: 50 items per week

New items for Other CAPA Plan: 2 per week; updates: 1 item per week

New items for CAPA: 20 per week; updates: 200 per week

  4. Are the Reference values always unique (what you are using in your Condition)?

Yes, all references are unique

  5. Do you have access to Premium connectors?

Yes

RADical
Resolver I

@cfoenki You might be hitting an infinite loop in the action. You can try changing that connection to use a different user, or create a condition that controls when the list is updated after the flow is triggered.

 

Preventing Infinite Loop in MS Power Automate (MS Flow) - YouTube

cfoenki
Helper II

Hi @grantjenkins ,

 

Thank you for this step-by-step explanation.

I followed your explanation and scaled it to my real data model (48 columns).

I got an error on the step "Select Combined Deviation":

The execution of template action 'Select_Combined_Deviation' failed: The evaluation of 'query' action 'where' expression '{
  "ID": "@item()?['ID']",
  "Reference": "@item()?['Title']",
  "Process": "@item()?['Process']",
  "Service": "@item()?['Service']",
  "Creator": "@item()?['DeviationCreator']",
  "Detection date": "@item()?['Detectiondate']",
  "Creation date": "@item()?['DeviationCreationdate']",
  "Responsible": "@item()?['DeviationResponsible']",
  "Type": "@item()?['DeviationType']",
  "Procedure Reference": "@item()?['ProcedureReference']",
  "Description": "@item()?['Description']",
  "Immediate actions": "@item()?['Immediateactions']",
  "Transmitted to Compliance on": "@item()?['TransmittedtoComplianceon']",
  "Initial Regulatory risk": "@item()?['InitialRegulatoryrisk']",
  "Initial Regulatory risk comment": "@item()?['InitialRegulatoryriskcomment']",
  "Initial Product/Patient risk": "@item()?['InitialProduct_x002f_Patientrisk']",
  "Initial Product/Patient risk comment": "@item()?['InitialProduct_x002f_Patientrisk0']",
  "Initial Deviation level": "@item()?['InitialDeviationlevel']",
  "Initial Justification": "@item()?['InitialJustification']",
  "Date PR (for critical)": "@item()?['DatePR_x0028_forcritical_x0029_']",
  "Date EUQPPV (for critical)": "@item()?['DateEUQPPV_x0028_forcritical_x00']",
  "Evaluation Date": "@item()?['EvaluationDate']",
  "Due Date": "@item()?['DueDate']",
  "Investigation Conclusion": "@item()?['InvestigationConclusion']",
  "Root cause identified": "@item()?['Rootcauseidentified']",
  "Root cause analysis": "@item()?['Rootcauseanalysis']",
  "Root cause category": "@item()?['Rootcausecategory']",
  "Recurrent Deviation": "@item()?['RecurrentDeviation']",
  "Recurrence analysis": "@item()?['Recurrenceanalysis']",
  "Perimeter extension": "@item()?['Perimeterextension']",
  "New Perimeter of impact": "@item()?['NewPerimeterofimpact']",
  "Perimeter extension comment": "@item()?['Perimeterextensioncomment']",
  "Investigation conclusion date": "@item()?['Investigationconclusiondate']",
  "Final Regulatory risk": "@item()?['FinalRegulatoryrisk']",
  "Final Regulatory risk comment": "@item()?['FinalRegulatoryriskcomment']",
  "Final Product/Patient risk": "@item()?['FinalProduct_x002f_Patientrisk']",
  "Final Product/Patient risk comment": "@item()?['FinalProduct_x002f_Patientriskco']",
  "Final Deviation level": "@item()?['FinalDeviationlevel']",
  "Final Justification": "@item()?['FinalJustification']",
  "Final PR Notification date": "@item()?['FinalPRNotificationdate']",
  "Final EUQPPV notification date": "@item()?['FinalEUQPPVnotificationdate']",
  "Deviation closure date": "@item()?['Deviationclosuredate']",
  "CAPA needed": "@item()?['CAPAneeded']",
  "CAPA Plan reference": "@item()?['CAPAPlanreference']",
  "CAPA Plan Creation Date": "@item()?['CAPAPlanCreationDate']",
  "CAPA Plan Objective": "@item()?['CAPAPlanObjective']",
  "CAPA Plan Responsible": "@item()?['CAPAPlanResponsible']",
  "CAPA Plan due date": "@item()?['CAPAPlanduedate']",
  "CAPA Plan closure date": "@item()?['CAPAPlanclosuredate']",
  "ProcessXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Process/text())'))",
  "ServiceXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Service/text())'))",
  "CreatorXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Creator/text())'))",
  "Detection dateXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Detection date/text())'))",
  "Creation dateXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Creation date/text())'))",
  "ResponsibleXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Responsible/text())'))",
  "TypeXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Type/text())'))",
  "Procedure ReferenceXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Procedure Reference/text())'))",
  "DescriptionXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Description/text())'))",
  "Immediate actionsXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Immediate actions/text())'))",
  "Transmitted to Compliance onXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Transmitted to Compliance on/text())'))",
  "Initial Regulatory riskXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Initial Regulatory risk/text())'))",
  "Initial Regulatory risk commentXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Initial Regulatory risk comment/text())'))",
  "Initial Product/Patient riskXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Initial Product/Patient risk/text())'))",
  "Initial Product/Patient risk commentXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Initial Product/Patient risk comment/text())'))",
  "Initial Deviation levelXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Initial Deviation level/text())'))",
  "Initial JustificationXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Initial Justification/text())'))",
  "Date PR (for critical)XL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Date PR (for critical)/text())'))",
  "Date EUQPPV (for critical)XL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Date EUQPPV (for critical)/text())'))",
  "Evaluation DateXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Evaluation Date/text())'))",
  "Due DateXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Due Date/text())'))",
  "Investigation ConclusionXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Investigation Conclusion/text())'))",
  "Root cause identifiedXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Root cause identified/text())'))",
  "Root cause analysisXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Root cause analysis/text())'))",
  "Root cause categoryXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Root cause category/text())'))",
  "Recurrent DeviationXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Recurrent Deviation/text())'))",
  "Recurrence analysisXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Recurrence analysis/text())'))",
  "Perimeter extensionXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Perimeter extension/text())'))",
  "New Perimeter of impactXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/New Perimeter of impact/text())'))",
  "Perimeter extension commentXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Perimeter extension comment/text())'))",
  "Investigation conclusion dateXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Investigation conclusion date/text())'))",
  "Final Regulatory riskXL": "@xpath(ou

Are you able to help me on that one? I am wondering if it could be a matter of the spaces in my initial JSON table column names.

Yes, it looks like it's because you've got spaces in your Field Names that you're using in your XPath expressions. You'll need to use the converted field names for any that have spaces.

 

//If you had the following field name:
First Name

//It would be converted to:
First_x0020_Name

//Each space is replaced with:
_x0020_
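To illustrate the encoding above, here's a minimal TypeScript sketch (the function name `encodeFieldName` is just for illustration, not part of Power Automate — the `_x0020_` replacement is the point):

```typescript
// Illustrative helper: encode a field name the way SharePoint/Power Automate
// internal names encode spaces, so the result can be used as an element name
// inside an xpath() expression.
function encodeFieldName(name: string): string {
  return name.replace(/ /g, "_x0020_");
}

console.log(encodeFieldName("First Name"));            // First_x0020_Name
console.log(encodeFieldName("Initial Justification")); // Initial_x0020_Justification
```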

----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
cfoenki
Helper II

Thanks I will test that

cfoenki
Helper II

I ended up removing all spaces in my column names but still get the error (here is the simplified Select action I used for testing):

 

 

InvalidTemplate. The execution of template action 'Select' failed: The evaluation of 'query' action 'where' expression '{ "ID": "@item()?['ID']", "Reference": "@item()?['Title']", "Process": "@item()?['Process']", "ProcessXL": "@xpath(outputs('XML_Deviation'), concat('string(//root/value[Reference=\"', item()?['Title'], '\"]/Process/text())'))" }' failed: 'The template language function 'xpath' failed to parse the provided XML.'.

 

 

 

cfoenki_0-1679320294241.png

 

Edit: I noticed that, despite the checkmark, the XML creation step is returning an error in its output:

cfoenki_0-1679322089609.png

 

After further investigation, the above error is due to a special character:

cfoenki_1-1679323486508.png

 

cfoenki
Helper II

Hi @grantjenkins ,

 

I finally figured out what to do with the special characters embedded in the Excel text. I modified my Office Script to replace all special characters with a space, and now, after several attempts, it is WORKING using your method!!
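As a sketch of that kind of cleanup (the function name and the exact character set are assumptions, not the poster's actual Office Script), a script could replace characters that are invalid in XML 1.0 with a space before the flow converts the JSON to XML:

```typescript
// Sketch: replace characters that XML 1.0 does not allow (control characters
// below U+0020, except tab U+0009, LF U+000A and CR U+000D) with a space, so
// the later JSON-to-XML conversion in the flow can parse the text.
function sanitizeForXml(text: string): string {
  return text.replace(/[\u0000-\u0008\u000B\u000C\u000E-\u001F]/g, " ");
}

console.log(sanitizeForXml("Deviation\u0007 report")); // "Deviation  report"
```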


Now my flow takes only 20 minutes, and moreover it runs at the same speed with the recurrence trigger.

 

Thank you so much for the help!!

takolota
Multi Super User

Could this all be done in a simpler & faster way using SharePoint batch updates?

https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List/td-p/136541...
