Impactful Individual

Exporting large amounts of data

Hi All,


I would like to export data from one of our marketing systems (via its API) to a data source that I can then connect to from Power BI.

This will be a daily export, and the volume of data can be up to 100k records.


Due to the large amount of data, I'd prefer not to do this with an Apply to each loop, and I'm also not sure what the best database solution for this is.

I'm thinking: could I just dump the data as a CSV file to Azure Blob Storage and then connect Power BI from there? If so, how does Power BI connect to multiple CSVs? As far as I can see, in Power BI you must choose a specific CSV to work from when choosing Blob Storage as your data source.
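For what it's worth, the pattern Power BI uses when pointed at a folder or container of files is to read every file and append the rows under one shared header. A rough Python sketch of that idea (the file names and columns here are made up for illustration, using in-memory strings rather than real blobs):

```python
import csv
import io

# Hypothetical daily export files, as a "combine files" step would see them
# in a Blob container (in-memory strings stand in for the real blobs).
daily_exports = {
    "export_2021-05-01.csv": "id,name\n1,Alice\n2,Bob\n",
    "export_2021-05-02.csv": "id,name\n3,Carol\n",
}

def combine_csvs(files):
    """Append rows from CSVs that share a header, keeping a single header."""
    combined = []
    for name in sorted(files):
        reader = csv.DictReader(io.StringIO(files[name]))
        combined.extend(reader)
    return combined

rows = combine_csvs(daily_exports)
print(len(rows))  # 3 rows across both files
```

The takeaway is that one report can consume many daily CSVs as a single table, so a new file per day does not force a new data source per day.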


Bottom line: I'm trying to avoid having to set up a dedicated database and web service to manage the process.


Looking for any and all advice here.



Super User II

With that amount of data I'd never recommend CSV or other non-tabular storage; Power BI would slow to a crawl as you add more data. Azure Cosmos DB and Azure Tables are good options for ease of use and cost effectiveness.

As for Apply to each, that's your only option, since that is how records are created. See Understand data operations - Power Automate | Microsoft Docs for more details. That said, you can make it run much faster with the Concurrency Control and Degree of Parallelism settings.
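As a loose analogy for what Degree of Parallelism buys you (this is plain Python, not how Power Automate is implemented, and `create_record` is a made-up stand-in for the per-record action):

```python
from concurrent.futures import ThreadPoolExecutor

records = list(range(100))  # stand-in for records returned by the API

def create_record(r):
    # Placeholder for the per-record action (e.g. an insert); just echoes here.
    return r * 2

# max_workers plays the role of Degree of Parallelism: iterations run
# concurrently instead of one at a time, like Apply to each with
# Concurrency Control switched on.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(create_record, records))

print(len(results))  # 100
```

With I/O-bound work (API calls, inserts), running N iterations at once cuts wall-clock time roughly by the concurrency factor, which is why those settings matter for 100k records.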



If you could provide an expanded screenshot of your Flow and steps, your Flow run history, and of any detailed error messages you're receiving we could likely better assist you. Also, for the best results, you may want to review How to write a good forum post.

If this reply answers your question or solves your issue, please ACCEPT AS SOLUTION ☑️. If you find this reply helpful, please consider giving it a LIKE.

Impactful Individual

Thanks @Brad_Groux. I've had some discussions with the business owners, and it seems they want the data saved in an existing on-premises SQL database. Is there any quicker way to push large amounts of data to SQL than just an Apply to each?
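One common alternative to a per-row loop is to hand the database a whole batch in one call. A minimal sketch of the batched-insert idea, using Python's built-in sqlite3 purely as a stand-in for the on-prem SQL database (the table and column names are invented; with SQL Server you'd use your driver's equivalent batch call):

```python
import sqlite3

# sqlite3 stands in for the on-prem SQL database for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE marketing (id INTEGER, name TEXT)")

records = [(i, f"contact-{i}") for i in range(100_000)]

# One executemany call sends all rows as a batch, instead of issuing
# 100k separate INSERT round-trips (the per-row Apply to each pattern).
conn.executemany("INSERT INTO marketing (id, name) VALUES (?, ?)", records)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM marketing").fetchone()[0]
print(count)  # 100000
```

The win is per-row round-trip overhead disappearing; 100k rows in one batch is typically orders of magnitude faster than 100k individual inserts.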

Responsive Resident

One thing to keep in mind is that Power Automate limits an Apply to each block to 5,000 loop iterations; a premium account raises that limit to 100,000.
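Given that limit, a 100k-record export has to be split into batches before looping. A small sketch of the batching arithmetic (generic Python, not Power Automate syntax):

```python
def chunk(records, size):
    """Split a record list into batches no larger than `size`."""
    return [records[i:i + size] for i in range(0, len(records), size)]

records = list(range(100_000))      # stand-in for the daily export
batches = chunk(records, 5_000)     # stay within the non-premium loop limit

print(len(batches))      # 20 batches
print(len(batches[-1]))  # 5000 records in the last batch
```

Twenty batches of 5,000 keeps each loop under the default iteration cap; with a premium plan the same export would fit in a single batch.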

@SamPo, a condition inside an Apply to each is simply how the platform works. Power Automate is a no-code/low-code solution; if you want a more programmatic approach, Azure Logic Apps may be a better fit.

I discuss the difference between the two in this blog post - Why My Future is Serverless in the Microsoft Cloud | by Brad Groux | MSFT Engineer.



