michaelshparber
Helper III

Save REST API json responses to SQL Server

I am looking for best practice advice.

I am getting data from API calls and I want to save it to SQL Server after parsing.

Is it better to save the JSON results to a blob/folder first and then insert them into SQL?

Should I use an "Apply to each" action to insert row-by-row, or is there a more efficient bulk-loading option?
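For comparison, here is what the two approaches look like in a plain script. This is only a sketch: it uses Python's built-in sqlite3 as a stand-in for SQL Server (with SQL Server you would typically use a driver such as pyodbc), and the table and rows are made up.

```python
import sqlite3

# Hypothetical rows parsed from an API's JSON response.
rows = [(1, "2022-01-01", 9.5), (2, "2022-01-01", 7.25), (3, "2022-01-02", 3.0)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, day TEXT, value REAL)")

# Row-by-row: one statement per row (what an Apply-to-each loop does).
for row in rows:
    conn.execute("INSERT INTO readings VALUES (?, ?, ?)", row)
conn.execute("DELETE FROM readings")  # reset before trying the bulk version

# Bulk: a single batched call, usually much faster for large datasets.
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # 3
```

The bulk path cuts the per-row round-trip overhead, which is where most of the time goes when loading large result sets.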

Can you share some of your best practices, please?

Thank you

2 REPLIES
manuelstgomes
Super User

Hi @michaelshparber 

 

Saving the blob somewhere in a folder has the advantage that you can fetch it later without making another API call. But then you need to keep track of whether you already have a file or still need to perform the HTTP request. It forks your logic, and I would advise against it.

 

It's important, however, to handle errors: if you're loading a lot of information, some inserts will fail. I would build paths in my Power Automate flow for when an insert fails, to retry it or log the failure somewhere to deal with later (either manually or with another Power Automate flow that processes the failures).
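The retry-or-log pattern described above looks like this in script form. This is a hedged sketch, not Power Automate itself: the function and the `failed_rows` dead-letter list are illustrative names.

```python
import logging

logging.basicConfig(level=logging.WARNING)
failed_rows = []  # rows parked for later handling (manually or by another job)

def insert_with_retry(insert_fn, row, attempts=3):
    """Try an insert a few times; log and park the row if it keeps failing."""
    for attempt in range(1, attempts + 1):
        try:
            insert_fn(row)
            return True
        except Exception as exc:
            logging.warning("insert failed (attempt %d/%d): %s", attempt, attempts, exc)
    failed_rows.append(row)  # keep the row instead of silently losing it
    return False

# Demo: an insert that always fails ends up in the dead-letter list.
def always_fails(row):
    raise RuntimeError("connection lost")

ok = insert_with_retry(always_fails, {"id": 1}, attempts=2)
print(ok, failed_rows)  # False [{'id': 1}]
```

The key point is the same as in the flow: a failed row is retried, then recorded, and the load as a whole keeps going.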

 

Also, keep in mind that you want the flow to continue inserting even after a failure, so configure the "Run after" settings on the following actions so the flow keeps going when there are errors.

 

Finally, if you have a big dataset to parse and insert, open the action's settings (the three dots) and enable concurrency. This will make things more efficient and your inserts faster.
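The same idea in script form: split the rows into batches and run several batch inserts at once. A sketch only; the batch size, worker count, and `insert_batch` placeholder are assumptions, and in a real job `insert_batch` would issue a batched INSERT against SQL Server.

```python
from concurrent.futures import ThreadPoolExecutor

def chunked(rows, size):
    """Split rows into batches so each worker handles one batch."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def insert_batch(batch):
    # Placeholder for a real batched INSERT; returns rows "inserted".
    return len(batch)

rows = list(range(10))
batches = chunked(rows, 3)  # 4 batches: 3 + 3 + 3 + 1

# Up to 4 batch inserts in flight at once, like concurrency in Apply to each.
with ThreadPoolExecutor(max_workers=4) as pool:
    inserted = sum(pool.map(insert_batch, batches))

print(inserted)  # 10
```

As in Power Automate, parallelism only helps if the inserts are independent; if order matters, keep the loop sequential.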

 

I hope this helps, and if something is not clear, please let me know.

 

If I have answered your question, please mark your post as Solved.
If you like my response, please give it a Thumbs Up.

Cheers
Manuel

michaelshparber
Helper III

Excellent advice, @manuelstgomes !

Thanks!

I feel I need to save the JSONs as files first.

Then trigger another flow that parses them and inserts into SQL.

This way, in the first flow I can make sure everything finished successfully, and only then trigger the next flow.

The next flow will read each JSON's date, delete the matching rows from SQL if they already exist, and insert the new ones.
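That delete-if-exists-then-insert step is what keeps a re-run safe (idempotent). As a sketch of the idea, with sqlite3 standing in for SQL Server and a hypothetical `readings` table keyed by date:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (day TEXT, value REAL)")
conn.execute("INSERT INTO readings VALUES ('2022-01-01', 1.0)")  # stale data

def load_day(conn, day, rows):
    """Replace all rows for a given date so re-runs don't duplicate data."""
    conn.execute("DELETE FROM readings WHERE day = ?", (day,))
    conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)
    conn.commit()

# Loading the same date twice overwrites instead of duplicating.
load_day(conn, "2022-01-01", [("2022-01-01", 2.0), ("2022-01-01", 3.0)])
load_day(conn, "2022-01-01", [("2022-01-01", 2.0), ("2022-01-01", 3.0)])

count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # 2
```

On SQL Server you could also wrap the delete and insert in one transaction, or use MERGE, to get the same effect.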

Something like this...

What do you think?

 

Also, where do you recommend I save the JSONs? They are rather large for SharePoint...

Maybe Azure Blob Storage? I don't have experience with blobs - is it expensive?

 

Do you also think Power Automate is a good fit for this kind of ETL? It takes 4 hours to run(!)

Maybe a Logic App instead? Data Factory?

I am desperately looking for best practices... 🙂

Thanks a lot! 
