michaelshparber
Helper III

Save REST API JSON responses to SQL Server

I am looking for best practice advice.

I am getting data from API calls and I want to save it to SQL Server after parsing.

Is it better to save the JSON results to a blob folder first and then insert them into SQL?

Should I use an "Apply to each" action to insert row by row, or is there a more efficient way to bulk-load?
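To make the question concrete, here is roughly the difference I mean, sketched in plain Python with pyodbc (the table, columns, and connection string are made-up placeholders, not my real setup):

```python
import pyodbc

# Placeholder connection string and table - not a real setup.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;UID=etl_user;PWD=secret"
)
cursor = conn.cursor()

rows = [(1, "alpha"), (2, "beta")]  # parsed from the API's JSON response

# Row by row (what an "Apply to each" does): one round trip per record.
# for row in rows:
#     cursor.execute("INSERT INTO dbo.ApiData (Id, Name) VALUES (?, ?)", row)

# Bulk: one batched call, far fewer round trips for large payloads.
cursor.fast_executemany = True
cursor.executemany("INSERT INTO dbo.ApiData (Id, Name) VALUES (?, ?)", rows)
conn.commit()
conn.close()
```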

Could you please share some of your best practices?

Thank you

2 REPLIES
manuelstgomes
Community Champion

Hi @michaelshparber 

 

Saving the JSON as a blob in a folder has the advantage that you can fetch it later without making another API call. But then you need to keep track of whether you already have a file or still need to perform the HTTP request. It forks your logic into two paths, and I would advise against it.
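To illustrate the fork I mean, here it is in rough Python terms (the URL and paths are hypothetical):

```python
import json
import os

import requests

API_URL = "https://example.com/api/data"  # hypothetical endpoint
CACHE_PATH = "cache/response.json"        # hypothetical saved-blob location

def get_payload() -> dict:
    # Branch 1: a saved file exists, so reuse it...
    if os.path.exists(CACHE_PATH):
        with open(CACHE_PATH, encoding="utf-8") as f:
            return json.load(f)
    # ...Branch 2: no file yet, so call the API and save the result.
    response = requests.get(API_URL, timeout=60)
    response.raise_for_status()
    with open(CACHE_PATH, "w", encoding="utf-8") as f:
        f.write(response.text)
    return response.json()
```

Every consumer of the data now has to know about both branches, and that is the complexity I would avoid.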

 

It's important, however, to deal with errors: when you're loading a lot of information, some inserts will fail. I would build paths into my Power Automate flow for when an insert fails, so that it retries or logs the failure somewhere to deal with later (either manually or with another flow that handles them).

 

Also, keep in mind that you want to keep inserting after a failure, so be aware of the "Run after" settings so that the flow keeps going even when some actions error out.
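If you were building the same thing in code rather than in a flow, the retry-log-and-continue pattern would look roughly like this (a sketch; dbo.ApiData and the row shape are assumptions):

```python
import time

import pyodbc

def insert_with_retry(cursor, row, attempts: int = 3) -> bool:
    """Try the insert a few times with a simple backoff; report success."""
    for attempt in range(attempts):
        try:
            cursor.execute(
                "INSERT INTO dbo.ApiData (Id, Name) VALUES (?, ?)", row
            )
            return True
        except pyodbc.Error:
            time.sleep(2 ** attempt)  # back off before the next attempt
    return False

def load_rows(cursor, rows) -> list:
    failed = []
    for row in rows:
        # Keep going on failure - the code analogue of "Run after" -
        # and park the bad rows for a later manual or automated pass.
        if not insert_with_retry(cursor, row):
            failed.append(row)
    return failed
```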

 

Finally, if you have a big dataset to parse and insert, open the loop action's "..." menu, go to Settings, and enable concurrency control (parallelisation). This will make things more efficient and your inserts faster.
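In code terms, turning on concurrency is the equivalent of something like this (a sketch; the connection string, table, and worker count are arbitrary examples):

```python
from concurrent.futures import ThreadPoolExecutor

import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;UID=etl_user;PWD=secret"  # placeholder
)

def insert_row(row) -> None:
    # One connection per worker: pyodbc connections should not be
    # shared across threads mid-transaction.
    conn = pyodbc.connect(CONN_STR)
    try:
        conn.execute("INSERT INTO dbo.ApiData (Id, Name) VALUES (?, ?)", row)
        conn.commit()
    finally:
        conn.close()

rows = [(1, "alpha"), (2, "beta"), (3, "gamma")]  # parsed JSON records
with ThreadPoolExecutor(max_workers=8) as pool:   # ~ the concurrency setting
    list(pool.map(insert_row, rows))
```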

 

I hope this helps, and if something is not clear, please let me know.

 

If I have answered your question, please mark your post as Solved.
If you like my response, please give it a Thumbs Up.

Cheers
Manuel

michaelshparber
Helper III

Excellent advice, @manuelstgomes !

Thanks!

I feel I need to save the JSONs as files first.

Then trigger another flow that parses them and inserts into SQL.

This way, the first flow can make sure everything finished successfully, and only then trigger the next flow.

The next flow will recognize each JSON's date, delete the matching rows from SQL if they already exist, and insert the new ones.

Something like this...
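In rough Python terms, the second flow's core step would be (a sketch - dbo.ApiData, SnapshotDate, and the JSON shape are my assumptions):

```python
import datetime
import json

import pyodbc

def reload_snapshot(path: str, conn_str: str) -> None:
    """Replace one date's rows with the contents of one saved JSON file."""
    with open(path, encoding="utf-8") as f:
        payload = json.load(f)
    day = datetime.date.fromisoformat(payload["date"])
    rows = [(day, r["id"], r["name"]) for r in payload["items"]]

    conn = pyodbc.connect(conn_str)
    try:
        cursor = conn.cursor()
        # Delete-then-insert in one transaction makes the reload idempotent:
        # re-running the flow for the same date cannot duplicate rows.
        cursor.execute("DELETE FROM dbo.ApiData WHERE SnapshotDate = ?", day)
        cursor.fast_executemany = True
        cursor.executemany(
            "INSERT INTO dbo.ApiData (SnapshotDate, Id, Name) VALUES (?, ?, ?)",
            rows,
        )
        conn.commit()
    except Exception:
        conn.rollback()
        raise
    finally:
        conn.close()
```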

What do you think?

 

Also, where do you recommend I save the JSONs? They are rather large for SharePoint...

Maybe Azure Blob Storage? I don't have experience with blobs - is it expensive?
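From what I can tell, the upload side with the azure-storage-blob package would be about this simple (the account, container, and blob names below are placeholders):

```python
import json

from azure.storage.blob import BlobClient

payload = {"date": "2020-06-01", "items": []}  # stand-in for an API response

blob = BlobClient.from_connection_string(
    conn_str="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...",
    container_name="api-json",
    blob_name="responses/2020-06-01.json",
)
blob.upload_blob(json.dumps(payload), overwrite=True)
```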

 

Also - do you think Power Automate is a good fit for this kind of ETL? It takes 4 hours to run(!)

Maybe a Logic App instead? Or Data Factory?

I am desperately looking for best practices... 🙂

Thanks a lot! 
