Hello,
I would like to have a Flow that copies new rows from an online Excel file to a SharePoint list. Rows are being added to the Excel file, and I would like only those new rows to be added to the SharePoint list.
I tried adding an 'Excel get rows' action, a 'SharePoint get items' action, and a condition: an 'apply to each' over the 'Excel get rows' output, a second 'apply to each' over the 'SharePoint get items' output nested inside it, and within the second loop a check that IDexcel is not equal to IDsharepoint, followed by a 'SharePoint create item' action.
This approach seems to be slow and does not produce the desired result. I also need the Flow to add as many records to the SharePoint list as there are newly added rows.
If I copy all the Excel rows every time I need to update the SharePoint list, how do I avoid generating duplicates?
Is there a better trigger for Excel than 'when a file is modified'?
Thanks for the help.
Hello,
As far as I know there is only one real trigger for Excel, and that is the 'For a selected row' trigger. I suspect it would be easy to set up a flow that transfers a selected row from Excel to a SharePoint list. Maybe this is worth considering?
But there is currently no trigger for new rows added to an Excel file.
Hello @Anonymous,
Thanks for your input. I thought about that too, and it seems like a good solution. My users, however, will likely have multiple rows to input, so I may want to consider this option at a later stage. Thanks.
With this flow, I just get the entire table repeated in the SharePoint list once for every row in the table. I only need the newly added rows.
The following simple Flow will do the job, in the sense of adding new lines by replacing the old SharePoint list with a new one based on the current Excel table. However, I have another flow downstream of this one that notifies users of items assigned to them, and they would receive the same notification over and over again every time a new table is created.
Any suggestions welcome. Thanks.
Hi, did you ever manage to find a solution? I am currently trying to do the same thing, but cannot seem to find a solution online that avoids creating duplicates.
@Marius-Swart
Alright, I built a system to automatically update a SharePoint list with registration data from an external site. It required figuring out which rows were new and needed to be created in a list of 100,000+ rows. For this to work, you just need a set of values unique to each Excel and SharePoint record that can serve as a primary and foreign key. For example, if each record in your Excel table and SharePoint list has a distinct email address, you could use the email address as the unique id/key. My example will use names. Just make sure whatever you use as the unique id is cleaned wherever you use it (e.g. use the trim function on email values to remove excess spaces, or use the replace function on things like phone numbers to remove any spaces, parentheses, dashes, etc.).
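To illustrate the "clean your unique key" advice above, here is a rough Python analogue (the function names and sample values are made up for this sketch, not part of any flow): trimming an email and stripping formatting characters from a phone number so the same record produces the same key on both the Excel and SharePoint side.

```python
import re

def normalize_email(value: str) -> str:
    """Analogue of Power Automate's trim() plus toLower() on an email key."""
    return value.strip().lower()

def normalize_phone(value: str) -> str:
    """Remove spaces, parentheses, and dashes, as the post suggests."""
    return re.sub(r"[ ()\-]", "", value)

print(normalize_email("  Jane.Doe@Example.com "))  # jane.doe@example.com
print(normalize_phone("(555) 123-4567"))           # 5551234567
```

If the keys are not normalized identically on both sides, matching records will look different and get duplicated.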
First, use the Get items action on the SP list and turn pagination on with the maximum 100,000 setting to get all the records currently in your list.
If your list has more than 100,000 records, then you may need to use some of the methods I describe in this video to link together several Get items actions and pull several hundred thousand items.
https://youtu.be/l0NuYtXdcrQ
Then create the rest of this flow.
The Select action is just pulling the unique id/key values from SharePoint so they can be compared to the values in Excel.
Then the flow gets all the records from Excel. You may need to turn pagination on and set it to 100,000 there too if you have a large table.
Then the Filter array action looks through every record from the Excel output and selects only the records whose unique values are not already in the list of SharePoint unique values. The body from the Select action of unique SharePoint values must be reformatted as a string for the condition to work. I have a more efficient way to compare array values, but it's more complicated and harder to explain.
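The Select + Filter array pattern described above can be sketched in Python (the column name "Index" and the sample data are invented for this sketch): the SharePoint keys are serialized to one string, and each Excel row is kept only if its key does not appear in that string, mirroring the `string(body('Select'))` does-not-contain condition.

```python
import json

# Output of the Select action: just the unique-key column from SharePoint.
select_body = [{"Index": 1}, {"Index": 2}, {"Index": 3}]

# string(body('Select')) in the Filter array condition.
sharepoint_keys_string = json.dumps(select_body)

excel_rows = [
    {"Index": 2, "Name": "Bob"},
    {"Index": 3, "Name": "Carol"},
    {"Index": 4, "Name": "Dave"},  # new row, not in SharePoint yet
]

# Filter array: keep only rows whose key is absent from the string.
# Note: plain substring matching can false-positive (e.g. key "1" is a
# substring of "12"), which is one reason a stricter array comparison
# is more robust, as the post hints.
new_rows = [row for row in excel_rows
            if str(row["Index"]) not in sharepoint_keys_string]

print(new_rows)  # [{'Index': 4, 'Name': 'Dave'}]
```

Only the rows surviving this filter get created in SharePoint, which is what prevents the duplicates from the earlier copy-everything approach.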
Once the Filter array action holds only the Excel records whose unique values are not in SharePoint, you can either use a Parse JSON action on the Filter array output to pull values into dynamic content, or use the body of the Filter array action as the source for an Apply to each loop and hand-code item()?['ColumnName'] wherever you need values from certain columns.
And if you want to get more advanced and much faster with things like this, you can try learning the batch create and batch update SharePoint methods.
Batch Create: https://www.tachytelic.net/2021/06/power-automate-flow-batch-create-sharepoint-list-items/
Batch Update: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List-With-Extern...
Thank you very much @takolota .
My flow needs to compare an Excel table with a Google Sheets table and update the Google Sheets one with the rows from Excel that are not yet in Google Sheets.
I tried to build a flow similar to yours, but it didn't do anything: it ran successfully, yet it didn't update the Google Sheets file. Can you perhaps assist me with how to make the changes to use Google Sheets instead of SharePoint? I would really appreciate your assistance.
Can you send pictures of the action results?
What do your actions after the Filter array action look like?
Do you have new rows created in Google Sheets for each new record in Excel, e.g. an Apply to each on the Parse JSON outputs creating a new row in Google Sheets?
Thank you for your response @takolota .
The "Index" column is the column with the unique value, which is just a sequential number counting as each new entry is added.
That Filter array looks like it's checking whether the item's Excel index does not contain that same item's Excel index, so it's never going to return anything.
You need to set the left side of that Filter array condition to string(body(Select)).
Then you should add a Parse JSON action after the Filter array: run the flow to get some example data, copy the output of the Filter array action, and paste it into the example schema of the Parse JSON action.
Then use the Parse JSON output body for the Apply to each containing the Google Sheets action, and replace all the Excel column dynamic content with dynamic content from the Parse JSON.
Also, why do you have the 1st Apply to each? You shouldn't need it, and it will cause everything to repeat for every row in Excel, duplicating the new records dozens of times.
Thanks @takolota, I appreciate your patience.
When I type the expression for the string as in your post above as:
string(body('Select_Get_list_of_unique_values')) Does not contain item()?['Index']
it seems to be invalid for me. I'm not sure where my mistake is; I copied from what you added in that note.
Can I use the Select action's automatically created Output on the left side of the filter?
I am still new to Power Automate and have no experience with Parse JSON, which is also the reason for my seemingly silly mistakes; sorry for that. I appreciate your help immensely. Any chance you can direct me to an applicable video or step-by-step guide on setting up that next Parse JSON step, please?
No, my Select action was named Select_Get_list_of_unique_values, but your Select action is just named Select, so you need to enter string(body('Select')).
Then test-run the flow and grab the output from the Filter array. Go back to edit mode, put the Parse JSON action after the Filter array, use the Filter array output body as the content of the Parse JSON action, click generate schema from example, and paste the Filter array outputs from the test run into the example schema box.
That should give you the contents of the Filter array under Parse JSON in dynamic content.
I finally managed to build it successfully according to your instructions. Thank you for your patience with my query, I really appreciate it immensely.