Power Participant

Exporting large amounts of data

Hi All,


I would like to export data from one of our marketing systems (via API) to a data source that I can then connect to from Power BI.

This will be a daily export, and the amount of data can be up to 100k records.


Due to the large amount of data, I'd prefer not to do this with an Apply to each, and I'm also not sure what the best database solution for this is.

I'm thinking: could I just dump the data as a CSV file to Azure Blob Storage and then connect Power BI from there? If so, how does Power BI connect to multiple CSVs? From what I can see, Power BI makes you choose a specific CSV to work from when you pick Blob Storage as your data source.
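For reference, a minimal sketch of the daily-dump idea, assuming a hypothetical paginated API and a date-stamped file name (one CSV per day in the same container); the actual upload to Blob Storage via the azure-storage-blob SDK would replace the comment at the end:

```python
import csv
import io
from datetime import date

def fetch_records():
    # Stand-in for the marketing system's paginated API (hypothetical).
    for i in range(2500):
        yield {"id": i, "email": f"user{i}@example.com"}

def export_daily_csv():
    # Build one date-stamped CSV per day, e.g. export-2022-06-28.csv,
    # so all daily files can live side by side in one blob container.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "email"])
    writer.writeheader()
    count = 0
    for rec in fetch_records():
        writer.writerow(rec)
        count += 1
    blob_name = f"export-{date.today().isoformat()}.csv"
    # upload buf.getvalue() to Blob Storage here (azure-storage-blob SDK)
    return blob_name, count, buf.getvalue()

blob_name, count, data = export_daily_csv()
```

This only sketches the shape of the export; authentication and the real API calls are omitted.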


Bottom line: I'm trying to avoid setting up a dedicated database and a web service to manage the process.


Looking for any and all advice here.



Community Champion

With that amount of data I'd never recommend CSV or non-table storage - Power BI will slow to a crawl as you add more data. With that said, Azure Cosmos DB and Azure Tables are good options for ease of use and cost-effectiveness.

As far as Apply to each goes, that's your only option - that is how records are created. See Understand data operations - Power Automate | Microsoft Docs for more details. That said, you can speed this up considerably with the Concurrency Control and Degree of Parallelism settings.
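As an analogy for what that setting does, a short Python sketch: instead of handling one record per loop pass, a pool keeps several iterations in flight at once (Power Automate's Degree of Parallelism caps this at 50 concurrent iterations):

```python
from concurrent.futures import ThreadPoolExecutor

def create_record(item):
    # Stand-in for the "create row" action inside an Apply to each.
    return item * 2

items = list(range(200))

# max_workers plays the role of Degree of Parallelism:
# up to 50 loop iterations run concurrently instead of serially.
with ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(create_record, items))
```

The record count is unchanged; only the wall-clock time drops, which is exactly the effect of turning on Concurrency Control in the flow.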



If you could provide an expanded screenshot of your Flow and steps, your Flow run history, and of any detailed error messages you're receiving we could likely better assist you. Also, for the best results, you may want to review How to write a good forum post.

If this reply answers your question or solves your issue, please ACCEPT AS SOLUTION ☑️. If you find this reply helpful, please consider giving it a LIKE.

Power Participant

Thanks @Brad_Groux. I've had some discussions with the business owners, and it seems they want the data saved in an existing on-premises SQL database. Is there any quicker way to push large amounts of data to SQL than just an Apply to each?

Solution Supplier

One thing to keep in mind is that Power Automate has a limit of 5,000 loop iterations for an Apply to each block. A premium account raises that to 100,000.
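One common way to stay well under those limits with 100k records is to batch rows before looping - for example, pass 1,000 rows at a time as a JSON array to a SQL stored procedure that unpacks and inserts them, so 100,000 records become 100 loop iterations instead of 100,000. A sketch of the batching step, assuming hypothetical record shapes:

```python
import json

def chunk_as_json(records, size=1000):
    """Split records into fixed-size batches; each batch becomes one
    JSON payload, i.e. one loop iteration instead of `size` of them."""
    for i in range(0, len(records), size):
        yield json.dumps(records[i:i + size])

records = [{"id": i} for i in range(100_000)]
batches = list(chunk_as_json(records))
# 100 payloads of 1,000 records each - 100 iterations, not 100,000
```

On the SQL side, a stored procedure can parse each payload (e.g. with OPENJSON on SQL Server 2016+) and insert the whole batch in one statement.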

Community Champion

@SamPo, a condition with an Apply to each is how the platform works. Power Automate is a no-code/low-code solution; if you want a more programmatic approach, Azure Logic Apps may be a better fit.

I discuss the difference between the two in this blog post - Why My Future is Serverless in the Microsoft Cloud | by Brad Groux | MSFT Engineer.




Power Participant

If anyone is facing something similar, where you're trying to avoid extra maintenance, skill requirements, etc., and you don't expect your datasets to exceed a couple hundred thousand rows, then you can try some batch Excel options:
