Super User

Fastest way to get data into Dataverse



I have an Excel worksheet which is released every day from our data warehouse. It comes to me via email only. It contains roughly 6k to 7k rows per day, and I need these in Dataverse as quickly as possible. I have no access to the data warehouse itself, but am using Power Automate to list the rows in the Excel table and import them using "Add a New Row" action into Dataverse. For 6.5k rows (today's file), this took just over 1hr 15mins to import all rows. 


Is there a quicker way to mass-import this data each day? The data is only available via email as an attached Excel file. Would a dataflow be significantly faster, if I were to configure one?




Super User

I have always been advised by colleagues in the community that the options rank in the following order for performance (fastest first):
1) Azure Data Factory

2) Power Automate

3) Dataflows


Power Automate is very flexible in terms of triggers and manipulation of steps, and its performance can be improved with concurrency and branching. In this specific scenario I would use a Power Automate flow triggered on receipt of the email, with a condition that checks the Excel file is attached.

Then I would split the work into 6 parallel branches, each listing 1,000 rows: branch 1 starts from the first row and lists 1,000, branch 2 skips the first 1,000 rows and lists the next 1,000, and so on. I would make sure the last branch starts from row 5,001 with no row limit, so it picks up whatever remains.
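The branch layout above boils down to a skip/top (offset/limit) pair per branch. A minimal Python sketch of that split, assuming the rows arrive as a list and that six branches with 1,000-row chunks is the chosen layout:

```python
def branch_ranges(total_rows, branch_count=6, chunk=1000):
    """Compute (skip, top) pairs mirroring the 6-branch split:
    each branch lists `chunk` rows from its offset; the last
    branch has no limit (top=None) so it catches the remainder."""
    ranges = []
    for i in range(branch_count):
        skip = i * chunk
        top = chunk if i < branch_count - 1 else None
        ranges.append((skip, top))
    return ranges

# Example: a 6.5k-row daily file
for skip, top in branch_ranges(6500):
    print(f"branch: skip={skip}, top={top}")
```

Each (skip, top) pair maps onto the skip/top settings of a "List rows present in a table" action in its branch; only the last branch is open-ended, so a day with 7k rows is still fully covered.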


Then, for the Add a new row action inside the Apply to each under each branch, I would increase concurrency to maximise speed.


In my head the logic of this approach makes sense, and I can probably test it to compare performance differences when I am back in the office the day after tomorrow. Give it a go and see if you can get your flow moving even faster.

Dataflows are still a very powerful tool, but I won't call them fast; I usually run them at night.



Solution Sage

The Loops are generally where Flow gets really slow.

If you have nested loops, see if you can reduce them. In some cases you can use array filters instead; they are a lot quicker, particularly when your loop only exists to build a list filtered on some criteria.
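The loop-vs-filter distinction is the same one you see in any language: iterating row by row to collect matches versus applying one filtering expression over the whole array. A small Python analogy (the `orders` data and `status` field are made up for illustration):

```python
# Sample rows, standing in for the output of a "List rows" action
orders = [
    {"id": 1, "status": "open"},
    {"id": 2, "status": "closed"},
    {"id": 3, "status": "open"},
]

# Loop approach: one iteration per row, like an Apply to each
matched = []
for order in orders:
    if order["status"] == "open":
        matched.append(order)

# Filter approach: one expression over the array, no per-row loop,
# analogous to Power Automate's Filter array action
filtered = [o for o in orders if o["status"] == "open"]

print(matched == filtered)  # same result, far fewer engine steps
```

In a flow, each Apply to each iteration is a separately scheduled action, so replacing a collect-only loop with a single Filter array action removes thousands of scheduled steps.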

If you are not using Variables (that change) inside the Loops, then increase the Concurrency (under settings on the loop action node)

If you are using variables (that change) inside the loops, then try to make them arrays/objects, then increase the concurrency. The issue with concurrency and changing variables inside the loop is that each concurrent iteration is effectively a new thread, so setting a variable inside the loop gets overwritten by a different concurrent thread, whereas appending to an array/object preserves the entries from the other concurrent threads.
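The overwrite-vs-append behaviour can be demonstrated with plain threads. This is a Python sketch of the idea, not Power Automate itself: the dict slot stands in for a "Set variable" inside a concurrent loop, and the list stands in for "Append to array variable":

```python
import threading
from concurrent.futures import ThreadPoolExecutor

last_value = {"v": None}   # like a single Flow variable set in the loop
results = []               # like an array variable appended to in the loop
lock = threading.Lock()

def worker(i):
    last_value["v"] = i    # every thread overwrites the same slot
    with lock:
        results.append(i)  # appending preserves each thread's entry

with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(worker, range(8)))

print(last_value["v"])     # only one of the eight writes survives
print(sorted(results))     # all eight entries were kept
```

The scalar ends up holding whichever thread happened to write last, while the array keeps one entry per iteration, which is exactly why arrays/objects are the safe choice once concurrency is turned up.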


One of the issues with Flow is that it doesn't have batch operations, so each insert/update is a separate web service call, whereas a console app or the KingswaySoft SSIS toolkit (among others) does support batch operations.
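For comparison, the Dataverse Web API does expose a `$batch` endpoint that lets a console app wrap many inserts into one HTTP call (Dataverse caps a batch at 1,000 operations). The sketch below only assembles the multipart body; the entity set name `new_warehouserows`, the column `new_name`, and the org URL in the comment are hypothetical placeholders, and real use needs an OAuth bearer token:

```python
import json
import uuid

def build_dataverse_batch(entity_set, records, api_path="/api/data/v9.2"):
    """Assemble a multipart/mixed OData $batch body wrapping one POST
    per record in a single changeset, so e.g. 6,500 inserts become a
    handful of HTTP calls instead of 6,500 separate ones."""
    batch_id = f"batch_{uuid.uuid4()}"
    changeset_id = f"changeset_{uuid.uuid4()}"
    lines = [f"--{batch_id}",
             f"Content-Type: multipart/mixed;boundary={changeset_id}", ""]
    for i, record in enumerate(records, start=1):
        lines += [f"--{changeset_id}",
                  "Content-Type: application/http",
                  "Content-Transfer-Encoding: binary",
                  f"Content-ID: {i}", "",
                  f"POST {api_path}/{entity_set} HTTP/1.1",
                  "Content-Type: application/json", "",
                  json.dumps(record)]
    lines += [f"--{changeset_id}--", f"--{batch_id}--"]
    return batch_id, "\r\n".join(lines)

rows = [{"new_name": "Row 1"}, {"new_name": "Row 2"}]
boundary, body = build_dataverse_batch("new_warehouserows", rows)
# Would be sent as: POST https://<org>.crm.dynamics.com/api/data/v9.2/$batch
# with header: Content-Type: multipart/mixed;boundary=<boundary>
```

Because all the POSTs sit in one changeset, they also execute transactionally: if one row fails, the whole changeset rolls back, which may or may not be what you want for a daily load.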
