I want to work on data stored in Azure SQL from within Dataverse, which is easy to do using a dataflow. I LOVE dataflows and want to go no-code/low-code on this as much as possible. However, I then want to pass the data back to the Azure SQL DB and update any records that were modified by the dataflow. Is it possible to set up some sort of automated sync between Dataverse and an Azure SQL DB? I know there is the Azure Synapse Link option, but that suggests to me that it simply copies (rather than updates) data.
The datasets in question could range anywhere from 1,000 to 100,000 items, so I need the most performant solution possible.
Any Power Platform gurus out there with ideas?
I suggest going with Microsoft Flow (Power Automate):
1. Create a field in the Azure SQL DB to store unique values.
2. Create a field in Dataverse with the same data type as the field created in Step 1.
3. Create two flows:
a. A flow triggered when a record is created/updated in the Azure SQL DB:
--> Use List Rows and check whether the SQL field value (Step 1) exists in the Dataverse field (Step 2).
--> If a match is found, update the row; otherwise, create it.
b. A flow triggered when a record is updated in Dataverse:
--> Use List Rows to retrieve the matching record from the SQL DB.
--> Then update it.
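The branch logic in flow (a) — look up the row by the shared key, update it if found, otherwise create it — can be sketched in plain Python. This is only an illustration of the upsert decision the connector actions make; `rows`, `sync_key`, and `payload` are hypothetical stand-ins for the Dataverse table, the shared key field from Steps 1–2, and the incoming SQL record:

```python
def upsert(rows: dict, sync_key: str, payload: dict) -> str:
    """Mirror of flow (a): List Rows filtered on the sync key,
    then Update if a matching row exists, else Create."""
    if sync_key in rows:                  # "List rows" found a match
        rows[sync_key].update(payload)    # Update-row action
        return "updated"
    rows[sync_key] = dict(payload)        # Create-row action
    return "created"

table = {}
print(upsert(table, "A-001", {"name": "Contoso"}))      # created
print(upsert(table, "A-001", {"name": "Contoso Ltd"}))  # updated
```

Flow (b) is the same pattern in the opposite direction, with the SQL DB as the target.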
Please mark as Answer if it is helpful and provide Kudos
Hi @arpost ,
You might want to check out the process for taking Dataverse data to Azure Synapse: https://cloudblogs.microsoft.com/powerplatform/2021/05/26/accelerate-time-to-insight-with-azure-syna...
Microsoft also has a Data Export Service, but it requires Dynamics 365 licensing (https://docs.microsoft.com/en-us/power-platform/admin/replicate-data-microsoft-azure-sql-database), and even that article recommends the Synapse approach above 😀
Hope this helps. Please accept if this answers your question, or Like if it helps in any way.
Curious question: is it possible to compare and update records based on filtering, like an upsert operation?
Hi @rampprakash ,
Yes, it handles upsert functionality as well. Please see the following: https://www.blog.allandecastro.com/bringing-your-dataverse-data-to-azure-synapse/
Hopefully this is what you were asking.
Dataflows work one way, bringing data from an external source into Dataverse, and they only handle data; any metadata changes will not be carried over by dataflows.
Azure Synapse Link sends Dataverse data to Synapse for analytics, stores it in a data lake, or pushes it to an external source using ADF. It only works for tables that have change tracking enabled.
Since in your case it has to work both ways, I would recommend:
1. A virtual table, since Azure SQL has an OData provider. It is easy to configure and lets you handle external data in Power Apps without any code. Make sure the Id in the external data source is in GUID format. Link attached for your reference (https://docs.microsoft.com/en-us/powerapps/maker/data-platform/virtual-entity-odata-provider-require...
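If the existing SQL table uses a non-GUID primary key (e.g. an integer), one common workaround for the GUID-Id requirement is to derive a stable GUID from the natural key. A minimal sketch using Python's standard `uuid.uuid5` — the namespace constant and `key_to_guid` helper are assumptions for illustration, not part of any Power Platform API:

```python
import uuid

# Fixed namespace for this data source (here the stdlib DNS
# namespace); it must never change once rows are linked.
NAMESPACE = uuid.NAMESPACE_DNS

def key_to_guid(natural_key: str) -> uuid.UUID:
    """Deterministically map an existing non-GUID key (e.g. an int
    primary key, passed as a string) to a stable GUID that the
    OData provider could expose as the row Id."""
    return uuid.uuid5(NAMESPACE, natural_key)

print(key_to_guid("42"))  # the same input always yields the same GUID
```

Because `uuid5` is deterministic, re-running the mapping (e.g. in a computed column or view on the SQL side) always produces the same Id for the same row.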
2. Or try Azure Data Factory: pull data from Dataverse, push it to Azure SQL, and vice versa (this setup would be a bit more complex). Thinking very broadly here.
If this answers your question, kindly give kudos and accept it as a solution.
Correct me if I am wrong.
You are looking at Azure SQL, not on-premises SQL Server, right? If it is on-premises, then you can try that option. FYI, it is still in preview, so I would not recommend using it in production.
If this answers your question, kindly give kudos and accept it as a solution.