RecFi_Analytics
Regular Visitor

Append XLSX file in SP folder to SP List [title edited as scope changed]

All,

 

Thank you in advance for your help!

 

I need help fixing/creating a flow.

 

Currently I receive an XLSX file via email with the previous day's applications (anywhere from ~50-200 rows).

I have to add these to a master XLSX that has ~50,000+ rows but is formatted the same.

I understand that the data must be in tables for many of the flow processes to work so I have turned my master XLSX into a table.

I have written a flow that saves the attachment from the Email and adds it to a folder.

So, in my flow I would like this event triggered by a file arriving in this folder, which I am working on below.

I think one of my problems is that the daily XLSX file is not in table form; to make it into a table I believe I would need a range, and the number of rows can vary greatly.

 

Currently I have:

RecFi_Analytics_0-1655990253657.png

 

Please let me know if you can help me; I am stuck and would greatly appreciate it.

 

Thank you!

 


6 REPLIES
eliotcole
Super User

Hi, @RecFi_Analytics, funnily enough I've just been speaking about this same thing with @takolota over in this thread, which I believe is similar. You may even wish to use takolota's scripts!

 

I have just one question ... is the incoming excel sheet always just one sheet of data?

 

If so ... I believe that we can do what you need doing with just the original email flow, no need for a secondary flow.

 

This will be very reliant on the incoming data being the right format, of course, but you've said that it is.

 

Essentially, though, this could be done within a few actions on your email flow, yes:

  1. Save file to temporary location (SP or OD is fine)
  2. Get range
  3. Check headers
  4. Add rows to table

Steps 2 & 4 are both special calls which we'll have to get right (that could be a bit annoying - but not impossible), and step 3 is the validation (or is it verification? 😅) that it's in the right format.
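To make steps 2 and 4 concrete, here is a minimal sketch (an illustration, not a definitive implementation) of what those "special calls" can look like as Microsoft Graph workbook requests, run from a script or an HTTP action. The token, drive item ID, worksheet name ("Sheet1"), table name ("MasterTable"), and header names below are all placeholders:

// Minimal sketch of steps 2-4 as Microsoft Graph workbook calls (TypeScript, using fetch).
// TOKEN, DRIVE_ITEM_ID, "Sheet1", "MasterTable" and the header names are placeholders.
const GRAPH = "https://graph.microsoft.com/v1.0";
const TOKEN = "<access token>"; // supplied by your HTTP action or app registration
const WORKBOOK = `${GRAPH}/me/drive/items/<DRIVE_ITEM_ID>/workbook`;

async function appendDailyFileToMaster(): Promise<void> {
  // Step 2: read the used range of the incoming sheet (works for any row count).
  const rangeRes = await fetch(`${WORKBOOK}/worksheets/Sheet1/usedRange(valuesOnly=true)`, {
    headers: { Authorization: `Bearer ${TOKEN}` },
  });
  const rows: (string | number | boolean)[][] = (await rangeRes.json()).values;

  // Step 3: a simple header check before writing anything.
  const expected = ["Date", "ApplicationId", "Status"]; // placeholder column names
  const headers = rows[0] as string[];
  if (expected.some((name, i) => headers[i] !== name)) {
    throw new Error("Incoming sheet headers do not match the master table.");
  }

  // Step 4: append everything below the header row to the master table.
  await fetch(`${WORKBOOK}/tables/MasterTable/rows/add`, {
    method: "POST",
    headers: { Authorization: `Bearer ${TOKEN}`, "Content-Type": "application/json" },
    body: JSON.stringify({ index: null, values: rows.slice(1) }),
  });
}

In an email-triggered flow this would point at the file saved in step 1, and if any of these requests fails, that's where the failure-only parallel branches mentioned below come in.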

 

You can add parallel branches which will only run when one of these steps fails, to indicate that something went wrong, and you're all good.

 

---

 

My main question to you is ... should all this work be done within an Excel sheet? Is there a better place to keep this data: a SharePoint list, a Dataverse table, or something else?

 

---

 

Anyway, yes, I'll be back with a rough look at this, but it should work, irrespective of headers. If there are formulas, then it would get more complicated.

@eliotcole 

You are amazing!

 

First I would like to acknowledge your concern about this all being done in an Excel sheet, because I share a similar concern.

 

I needed real-time Power BI dashboards ASAP and there was no data infrastructure present. To buy myself time I created transformations and relations in Power BI and was manually appending the XLSX files in SP. In order to free up time to work on the cloud database I was planning on automating these processes, which is where my request originates.


I believe a SP list may be a better intermediate. I am going to convert my master sheet to a SP list and try to connect Power BI to see if this is feasible. If so, I am assuming appending the new daily sheets to the SP list will be easier?

 

Thank you so much for your help!

Much easier, @RecFi_Analytics  ... also, you can probably just have the data feed directly into the SharePoint list, and/or take it when you need from the BI source.

 

Also, with SharePoint you can do away with reports entirely, and just either make:

  • Multiple list views of the data for any given purpose.
  • The list views PLUS bespoke pages to frame them.

This way, any time someone needs to see the 'report' they can just look at the page/list view. Plus you can make views for specific people, too, which are restricted just to them.

 

---

 

If you still need to use Excel for whatever reason, takolota's scripts are a ready-made approach. With regard to the graph calls, they're a bit fiddly, but they also provide all of the data from the sheets.
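As an illustration of the shape such a script can take (this is not takolota's actual script), an Office Script run from the "Run script" action can hand the flow everything on a sheet without needing a table or a fixed row count; reading from the first worksheet is an assumption here:

// Illustrative Office Script: return everything on the first worksheet to the calling flow.
function main(workbook: ExcelScript.Workbook): (string | number | boolean)[][] {
  const sheet = workbook.getWorksheets()[0]; // assumption: the data is on the first sheet
  const used = sheet.getUsedRange();
  if (!used) {
    return []; // empty file: nothing to hand back
  }
  // Header row plus every data row, however many there happen to be.
  return used.getValues();
}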

 

If you do want a hand with those, though, we'll need some sample data, which I would assume we can handle with just one set, since it'll be in the same format on the main sheet.

@eliotcole 

 

From what I have researched regarding SP Lists it seems this is the most efficient way to move forward, thank you for the recommendation!

 

I understand this is getting off topic so it may be best if I start a new forum post?

 

Unfortunately, when I create a list from the Excel file (~50,000 rows) it only allows me to add a maximum of 20,000 rows. I split the data, then went into edit mode and tried to paste the remainder, but it simply loads forever (I waited ~30 mins twice).

 

To combat this I decided to create a one-time flow but it keeps timing out.

Would you mind taking a look?

 

*Of note: I increased the "List rows present in a table" value to 40,000 (from the default 256).

*After researching I do have some concern that the date may not be pulled in correctly even if I get this method to work. 

 

RecFi_Analytics_0-1656012972230.png

RecFi_Analytics_2-1656013153808.png

RecFi_Analytics_3-1656013175927.png

 

 

I have attached a screenshot of a few lines of data. Let me know if you need a file to better check what I am doing.

 

RecFi_Analytics_1-1656013044499.png

 

Thank you so much again for all of your help.

 

There are concurrency options that allow a flow to perform multiple actions at once: tap the menu dots on the "Apply to each", then take the concurrency number up to whatever its limit is.

 

---

 

Alternatively, if you've got individual identifiers for each row in the table, and a representative column in SharePoint, then you can use this to narrow down what already exists in the SP list, so you don't need to add as much from the Excel file.

You can still make that work in retrospect, as long as you're only adding stuff you know hasn't already gone in.
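As a sketch of that idea (the ApplicationId column and field names below are placeholders, not your actual schema), the filtering amounts to keeping only the rows whose identifier isn't already in the list:

// Sketch of the "only add what's new" idea. ApplicationId is a placeholder
// for whatever unique identifier column the rows actually carry.
interface ExcelRow {
  ApplicationId: string;
  [column: string]: string | number | boolean;
}

// existingIds would come from something like a "Get items" call against the SP list.
function rowsToAdd(excelRows: ExcelRow[], existingIds: string[]): ExcelRow[] {
  const seen = new Set(existingIds);
  return excelRows.filter(row => !seen.has(row.ApplicationId));
}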

 

But the concurrency stuff should help the most.

It's a little late for me now ... but poke me again tomorrow sometime, perhaps, and I'll take another look.

@eliotcole 

Thank you again! 

 

I went ahead and fixed up the concurrency options as well as the timeout limit. It turns out the process was actually still running outside of the test page. After nearly 5 hours it completed and worked perfectly. Now I will use a similar flow to bring in the daily data, which will be significantly smaller. The only hump to get around is that the data won't come in as a table and will have varying row counts, but I'm sure that won't be too much to figure out.
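For that remaining hump, one possible approach (a sketch, assuming the data sits on the first worksheet and starts with a header row; the table name "DailyTable" is made up) is an Office Script step that wraps whatever range the daily file contains in a table before "List rows present in a table" runs:

// Illustrative Office Script: wrap the daily file's used range in a named table
// so the flow can read it regardless of how many rows arrived.
function main(workbook: ExcelScript.Workbook): void {
  const sheet = workbook.getWorksheets()[0]; // assumption: data is on the first sheet
  const used = sheet.getUsedRange();
  if (!used || workbook.getTable("DailyTable")) {
    return; // nothing to convert, or it has already been converted
  }
  const table = workbook.addTable(used, true); // true = first row is headers
  table.setName("DailyTable");
}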

 

For completion I will add the final flow for the one-time transfer:

  • Of note again: it only processed 256 rows initially, which had to be manually increased in the "List rows present in a table" settings. I increased it to 40,000, which worked, to my surprise.
  • Creating the columns in an empty SP list first fixed my problem of everything being crammed into 1 column on my first flow test.
  • In the "Create item" portion some of the column names weren't showing up automatically, so I used an expression to reference them: item()?['Column Name'].
  • I was worried about the date not pulling correctly, but I changed the date/time setting in "List rows present in a table" to "ISO 8601" and it worked perfectly.
  • Edit: I changed the name of the 2nd "Apply to each" to "Apply to each row" in case that causes confusion for people looking back on this.

RecFi_Analytics_0-1656070336237.png

 
