tomb2019
New Member

Processing hundreds of thousands of records from SQL

Hi,

 

I have a requirement to use a Flow to query an on-premises SQL Server using the Gateway and, based on the result set that is returned, add the same data to an Azure SQL database.

 

The number of records returned is about 200,000, so it's not feasible to use an apply-to-each loop in Flow to add them one at a time.

 

Has anyone got any suggestions as to the most efficient way to achieve this?

 

Thanks

 

Tom

4 REPLIES
v-bacao-msft
Community Support

 

Hi @tomb2019 ,

 

You could consider creating multiple flows, with each flow responsible for one subset of the records.

Each flow would use a Filter Query to select the records that meet its conditions, and then add those records to the Azure SQL database.
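Outside of Flow itself, the partitioning idea looks roughly like the following. This is a minimal Python sketch using pyodbc, assuming a numeric key column; the connection strings, the dbo.Orders table, and its Id/Name/Amount columns are illustrative placeholders, not anything from this thread.

import pyodbc

# Placeholder connection strings -- substitute your own servers and credentials.
SOURCE = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=onprem;DATABASE=src;Trusted_Connection=yes;"
TARGET = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=example.database.windows.net;DATABASE=dst;UID=user;PWD=pass;"

def copy_partition(lo, hi):
    """Copy rows whose Id falls in [lo, hi) from the source to the target."""
    with pyodbc.connect(SOURCE) as src, pyodbc.connect(TARGET) as dst:
        rows = src.execute(
            "SELECT Id, Name, Amount FROM dbo.Orders WHERE Id >= ? AND Id < ?",
            lo, hi).fetchall()
        cur = dst.cursor()
        cur.fast_executemany = True  # send the whole batch in one round trip
        cur.executemany(
            "INSERT INTO dbo.Orders (Id, Name, Amount) VALUES (?, ?, ?)", rows)
        dst.commit()

# Four slices of the key space -- each slice could just as well be its own flow.
for lo in range(0, 200_000, 50_000):
    copy_partition(lo, lo + 50_000)

The partitions are independent of each other, which is what makes the multiple-flow approach work: each flow owns one slice and none of them contend for the same rows.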

Besides Flow, there are other methods you could use to import data from an on-premises SQL Server into an Azure SQL database:

https://dzone.com/articles/import-local-sql-server-database-to-azure-pick-the

 

Best Regards,

Community Support Team _ Barry
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Hi Barry,

 

I'm not sure what you mean by eligible records. All the records that are returned will be eligible, as the query only retrieves the records we need. The specific issue is processing and inserting a large number of records, i.e. 200,000, in an efficient manner.

 

The options in the article are not suitable; they both seem to be aimed at one-off data migrations.

 

We are not doing a one-time data migration; we are moving data from an on-premises SQL Server to an Azure SQL database continually, on a scheduled basis, and the data that is moved is determined by a SQL query.

 

Thanks

 

Tom

Hi @tomb2019,

 

Indeed, this is a lot of data.

 

I would recommend the following:

  1. Add a "migrated" flag field to the source table.
  2. Build a Power Automate flow with a scheduled trigger. I'll let you define the frequency, but initially you can run it quite frequently and reduce the rate as the number of rows left to migrate shrinks.
  3. In the "Get rows" step, filter also by "migrated = false".
  4. Limit the rows per run to a set number (do some tests to see how many you can push through Power Automate without issues).
  5. Update the "migrated" flag on the source rows you just copied (see the sketch after this list).
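Here is a hedged Python sketch of that loop. In practice the scheduled flow would do this with a Get rows action, an insert, and an update, but the logic is the same. The dbo.Orders table, its columns, the bit column named "migrated", and the batch size are all illustrative assumptions, not details from this thread.

import pyodbc

SOURCE = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=onprem;DATABASE=src;Trusted_Connection=yes;"
TARGET = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=example.database.windows.net;DATABASE=dst;UID=user;PWD=pass;"
BATCH = 2000  # step 4: tune by testing how many rows one run can push reliably

def migrate_one_batch(src_conn, dst_conn):
    """Copy one batch of unmigrated rows, then flag them. Returns rows moved."""
    rows = src_conn.execute(
        "SELECT TOP (?) Id, Name, Amount FROM dbo.Orders "
        "WHERE migrated = 0 ORDER BY Id", BATCH).fetchall()
    if not rows:
        return 0
    cur = dst_conn.cursor()
    cur.fast_executemany = True
    cur.executemany(
        "INSERT INTO dbo.Orders (Id, Name, Amount) VALUES (?, ?, ?)", rows)
    dst_conn.commit()  # commit the target first ...
    src_conn.cursor().executemany(
        "UPDATE dbo.Orders SET migrated = 1 WHERE Id = ?",
        [(r.Id,) for r in rows])
    src_conn.commit()  # ... then mark the source rows as done
    return len(rows)

# A scheduled trigger would run one batch per invocation; this loop just
# drains the backlog for illustration.
src, dst = pyodbc.connect(SOURCE), pyodbc.connect(TARGET)
while migrate_one_batch(src, dst):
    pass

Committing the target before flagging the source means a crash between the two steps can only leave rows copied-but-unflagged, never flagged-but-missing, which is the safer failure mode.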

 

With this, you'll get a system where all the records are gradually migrated. You'll reach a point where you only have to migrate the deltas between runs.

 

But don't try to do it all at the same time. If the flow breaks in the middle, you'll have trouble working out which data was or wasn't migrated.
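One hedged way to make a rerun safe after such a mid-run failure is to upsert into the target rather than plain-insert, so re-copying an already-moved row is a no-op. The MERGE below uses the same illustrative dbo.Orders table and columns as the sketch above.

UPSERT = """
MERGE dbo.Orders AS t
USING (VALUES (?, ?, ?)) AS s (Id, Name, Amount)
ON t.Id = s.Id
WHEN NOT MATCHED THEN
    INSERT (Id, Name, Amount) VALUES (s.Id, s.Name, s.Amount);
"""
# cur.executemany(UPSERT, rows)  -- drop-in replacement for the plain INSERT
# in the batch sketch above, so rerunning a half-finished batch is harmless.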

 

Does all of this make sense? 

 

Migrating data is a sensitive topic, so start slow and grow as you become confident that Power Automate is handling things correctly.

 

If I have answered your question, please mark your post as Solved.
If you like my response, please give it a Thumbs Up.

Cheers
Manuel

takolota
Community Champion
