JoostPielage
Frequent Visitor

Extremely Slow SQL inserts

Hi everyone,

 

I have a flow that takes rows from an Excel table and inserts them into a SQL database in Azure. The inserts into the SQL database are happening, but at an extremely slow rate: inserting 256 rows took 14 minutes. I have a column that records the time each record is created, and there you can see clearly how slow the inserts are. Is there anything that can be done to speed this up?

 

 

 

Id        CreatedTime
22947 2020-04-14 12:30:33.503
22946 2020-04-14 12:30:32.587
22945 2020-04-14 12:30:31.527
22944 2020-04-14 12:30:30.120
22943 2020-04-14 12:30:29.167
22942 2020-04-14 12:30:27.917
22941 2020-04-14 12:30:26.790
22940 2020-04-14 12:30:25.723
22939 2020-04-14 12:30:24.440
22938 2020-04-14 12:30:23.493

 

 

 

 

17 Replies

ChristianAbata
Super User II

Hi @JoostPielage, please try running the same insert directly in your Azure SQL query console. If it takes just as long there, it is probably because your server was created in a region far away from your actual location; in that case your latency is high and you need to recreate your SQL database in a closer region. If the database lives in a VM, the same applies: check that the VM is in a suitable region for you.
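For instance, timing a single handwritten insert from the portal's query editor isolates network latency from any Flow overhead. The table and column names below are placeholders of mine, not from this thread:

-- Run a few of these and compare the elapsed time with the flow's per-row rate.
INSERT INTO dbo.YourTable (CreatedTime) VALUES (SYSUTCDATETIME());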

 

Use this page to see which region has the lowest latency for you: https://www.azurespeed.com/Azure/Latency



v-bacao-msft
Community Support

 

Hi @JoostPielage,

 

Following up on @ChristianAbata's suggestions, you could also try adjusting the Apply to each settings (for example, its concurrency control) to speed this up.

Image reference: 56.PNG

Hope this helps.

 

Best Regards,

Community Support Team _ Barry

@v-bacao-msft Thank you for the help. I did change the setting and it was a bit faster: it took 09:39 to run 256 rows. Still not breaking any speed records, and it became especially slow toward the end...

 

ID         CreatedTime

23203 2020-04-15 09:06:42.103
23202 2020-04-15 09:06:41.010
23201 2020-04-15 09:06:39.590
23200 2020-04-15 09:05:26.903
23199 2020-04-15 09:05:25.483
23198 2020-04-15 09:05:08.123
23197 2020-04-15 09:04:09.747
23196 2020-04-15 09:04:08.437
23195 2020-04-15 09:04:07.323
23194 2020-04-15 09:04:06.130
 
I will move the DB to another region this afternoon; hopefully that helps.
Eventually I need to insert 25k records, so this won't do it. Any other tips to improve the performance?

@JoostPielage sometimes the performance changes depending on whether you create an elastic database or a static (single) database; a static database can have lower performance than an elastic one.




@ChristianAbata Thank you for your help. I think the real problem here is that it creates a connection per insert; that is why it is getting really slow. I don't have any hard proof, but that is the only reasonable explanation I have.

 

In my search for a way to do bulk inserts I found this article: https://garrytrinder.github.io/2019/03/bulk-insert-array-of-json-objects-into-azure-sql-database-usi...

 

He uses a JSON object passed to a stored procedure to do the insert.
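The core of that pattern looks roughly like the sketch below; the procedure name, table, and columns are my assumptions for illustration, not necessarily what the article uses. The flow passes the whole JSON array as a single NVARCHAR(MAX) parameter and OPENJSON shreds it into rows:

CREATE PROCEDURE dbo.usp_BulkInsertRecords
    @json NVARCHAR(MAX)
AS
BEGIN
    SET NOCOUNT ON;
    -- One set-based insert per call instead of one connection per row.
    INSERT INTO dbo.Records (Id, CreatedTime)
    SELECT Id, CreatedTime
    FROM OPENJSON(@json)
    WITH (
        Id          INT       '$.Id',
        CreatedTime DATETIME2 '$.CreatedTime'
    );
END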

 

The search continues....

ACCEPTED SOLUTION

For anybody who is having a similar issue, here is my solution to the problem.

 

  1. Use a Select data operation to select from the Excel output.
  2. Use a Compose action to capture the output from the Select.
  3. Use the output from the Compose to insert into SQL.

It took 6 seconds to insert 256 records.

 

[Screenshot: SQL.png]
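As a rough illustration of how these steps fit together: the Select shapes each Excel row into a JSON object, the Compose captures the resulting array, and a single SQL query action shreds it with OPENJSON. The action name Compose, the Employee table, and its columns below are assumptions of mine, not the exact flow from the screenshot:

-- Hypothetical body of the SQL query action. Power Automate replaces the
-- @{...} expression with the Compose output, a JSON array such as
-- [{"Id":1,"Name":"Alice"},{"Id":2,"Name":"Bob"}], before the query runs.
INSERT INTO dbo.Employee (Id, Name)
SELECT Id, Name
FROM OPENJSON('@{outputs('Compose')}')
WITH (
    Id   INT           '$.Id',
    Name NVARCHAR(100) '$.Name'
);

Because the whole batch arrives as one statement, SQL performs a single set-based insert instead of 256 round trips, which is why the runtime drops from minutes to seconds.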

 


Hi JoostPielage,

Can you advise on how to build the input for the Compose and the SQL query, please? I tried several different approaches, and yours looks promising :).

Hi @kamilkalecinski 

 

The output of the Compose is a pure JSON object, and SQL Server can actually work with JSON. Here you can find an example:

 

https://stackoverflow.com/questions/46323946/how-to-insert-json-object-to-sql-server-2016-as-nvarcha...

 

If you have any questions let me know. 

Hi @JoostPielage,

Many thanks! I figured it out, and it is actually my first experience mixing SQL and JSON, so it was good learning. The only problem now is to go beyond the 5000-row limitation :).

 Thanks again.

Hi @kamilkalecinski, if you go to the settings of your Excel step, you can enable pagination and go past the initial 5000-row limit. The ultimate limit is 100k; I have never tested that, but I have gone up to 52k without a big issue. Good luck with your flow!

Hi @JoostPielage,

 

I am glad I found your topic, as I am also struggling to insert 50k+ rows from Excel to SQL Server using a flow. Could I ask how you use the Compose output in your SQL query to insert? Do you mind providing a quick example or some documentation as a reference? BTW, our SQL Server is on-premises, so I will probably wrap the query in a stored procedure. Thanks for your help in advance!

 

Richard

Hi @RichardW,

 

I basically store the JSON output from the Compose into a SQL table variable:

 

DECLARE @TheTable TABLE (TheJSON NVARCHAR(MAX));
DECLARE @JSON NVARCHAR(MAX);

INSERT INTO @TheTable SELECT '@{outputs('Compose_5')}';
SET @JSON = (SELECT TheJSON FROM @TheTable);

 

After that, you can use the JSON to insert the values into a table. You need to make sure that your column names match up. You can see an example here:

 

https://stackoverflow.com/questions/46323946/how-to-insert-json-object-to-sql-server-2016-as-nvarcha...
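For completeness, that second step could look roughly like this; the target table and column names are assumptions of mine, and the WITH-clause paths must match the JSON property names your Select produced:

-- Shred the captured JSON into the destination table in one statement.
INSERT INTO dbo.Employee (Id, CreatedTime)
SELECT Id, CreatedTime
FROM OPENJSON(@JSON)
WITH (
    Id          INT       '$.Id',
    CreatedTime DATETIME2 '$.CreatedTime'
);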

 

If you have any questions let me know; I'll try and help where I can.

Hi @JoostPielage,

 

Thanks for the examples, really appreciated! I will have a try and see if I can make it work for my flow. Thanks again!

 

Richard

Thanks a lot for this approach. I was able to use it to get the desired output. Previously it took me 1 hour to insert 5k rows; now I am able to insert 5,300 rows within 2 minutes. I take the JSON output from the Compose and use it as the input to a stored procedure, which inserts the records into the destination table using OPENJSON. I used variables and a Do until loop to process more than 5k rows, which is a limitation in MS Flow. I will definitely write a blog about it.
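In SQL terms, each Do until iteration would end with a call along these lines; the procedure name and Compose action are placeholders of mine, in the spirit of the sketch earlier in the thread:

-- Hypothetical per-batch call from the execute stored procedure action;
-- @{outputs('Compose')} expands to the current 5k-row JSON batch.
EXEC dbo.usp_BulkInsertRecords @json = '@{outputs('Compose')}';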

 

Hi @Pranshu27,

Please let me know once you post your blog article. I have exactly the same approach as you described: 5k-row portions that take 2 minutes each to load into SQL. It would be great to compare and see if there is still something to improve in my pipeline.

Hey @kamilkalecinski 

How many rows are you processing?

I had 5k+ rows, so I used a Do until loop and a couple of variables with condition logic to process all records.

I am using the execute SQL procedure operation just after the Compose of the JSON output, so it inserts 5k rows per batch. I am incrementing the value using a SkipCount variable and using that in the advanced properties of the Excel Get list by rows step.

Let me know if you have any questions

Hi, could you give an example of the SQL query to insert the OUTPUT?
I thought of something like:
INSERT INTO Employee
SELECT *
FROM OPENJSON({OUTPUT})
