PowerRanger
Resident Rockstar

Multi Environment Data Strategy

Hi all,

 

We are looking for a "best practice" or "recommended" way to handle the following:

 

1. We have one environment called "Organization Data Hub" which gets its data from an SAP system and other systems via Azure Data Factory. (This process is up and running and works great - all tables are filled as they should be.)

2. We have several other environments which are "use-case specific / application specific". Those environments need to use the data from the "Organization Data Hub". For example, the "Organization Data Hub" has a table "Products", and one use-case environment needs to store additional information for "Products" in a table "Product Details" which is only relevant in that specific use case.

 

Questions:

How would you guys set up these environments? As we can't create a lookup from "Product Details" to "Products" (cross-environment), we would store just the ID from "Products" in the "Product Details" table. In PowerApps I would then create a connection to both environments to get the data. Does this make sense? Should we consider using virtual entities for that? Would it make sense to put everything in one environment?
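
Just to make the idea concrete, here is a rough sketch of what we mean by "store just the ID", done directly against the Dataverse Web API rather than from PowerApps. All table/column logical names, environment URLs and app registration details below are made up for illustration:

```python
# Rough sketch only - table/column logical names, URLs and app registration
# details are made up. A single app registration with application users in
# both environments is assumed.
import msal
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret>"

HUB_URL = "https://org-data-hub.crm4.dynamics.com"      # "Organization Data Hub"
USECASE_URL = "https://use-case-env.crm4.dynamics.com"  # use-case environment


def get_token(env_url: str) -> str:
    """Acquire a client-credentials token for one Dataverse environment."""
    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    return app.acquire_token_for_client(scopes=[f"{env_url}/.default"])["access_token"]


def get_rows(env_url: str, query: str) -> list:
    """Run an OData query against one environment's Web API."""
    resp = requests.get(
        f"{env_url}/api/data/v9.2/{query}",
        headers={"Authorization": f"Bearer {get_token(env_url)}",
                 "Accept": "application/json"},
    )
    resp.raise_for_status()
    return resp.json()["value"]


# 1. Read Product Details from the use-case environment. They hold the hub
#    product's GUID in a plain text/GUID column, not a real lookup.
details = get_rows(USECASE_URL,
                   "new_productdetails?$select=new_name,new_hubproductid")

# 2. For each detail row, fetch the referenced product from the hub by that ID.
for d in details:
    products = get_rows(
        HUB_URL,
        f"new_products?$select=new_name&$filter=new_productid eq {d['new_hubproductid']}")
    hub_name = products[0]["new_name"] if products else "<not found in hub>"
    print(d["new_name"], "->", hub_name)
```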

 

Thanks in advance.




Please click Accept as solution if my post helped you solve your issue. This will help others find it more readily. It also closes the item.

If the content was useful in other ways, please consider giving it Thumbs Up.
ChrisPiasecki
Dual Super User

Hi @PowerRanger,

 

This gets into the realm of Master Data Management, and the answer is "it depends". There is no absolute right or wrong approach, and each option will have its pros and cons. Below is not an exhaustive list of considerations and options, but it should hopefully help.

 

In general, an environment for master data that needs to be made available to multiple unrelated downstream environments can be a good design. It reduces the impact of change compared to having a single shared environment for multiple business areas with different needs.

 

Before going further, I have to ask: is there a specific reason the Product Details can't just reside in the same environment as the Products? Is it truly unique to that one use case? If it's possible that it could be needed by other areas and doesn't change often, it may be fine to treat it as master data and keep it close to the products. If it does need to stay separate from the products, then read on.

 

Some considerations to help drive a decision on environment and data integration strategy:

  • Volume of product data
  • Rate of change in product data
  • How is the data loaded into the hub? E.g. delta load or full refresh?
  • How many potential subscribers of the data are there? (You mention currently only one, but what about the near future?)
  • Do the subscribers just need a read-only reference to the data (e.g. to act as a lookup to related data), or can data be changed in dependent environments, requiring a two-way sync?
  • Data consistency: how quickly do subscribers need to be notified of changes to data? E.g. real time or non-real time? Seconds/minutes/hours/days?
  • Are there security requirements that prevent you from replicating data outside of the hub? What about security around who can view the data?
  • Is the remaining available Dataverse storage in your tenant, or the additional cost of storage, an issue?
  • Reporting needs: it will be harder to report across several environments, particularly for operational reporting. That said, Power BI with a DirectQuery connection to Dataverse has now made this possible.

 

Virtual entities are a decent option if:

  • Subscribers only require read-only use of the data
  • There are no security concerns around all users of the subscribing environment having read access
  • Volume of source data is high and storage is a concern
  • Subscribers must see any data changes in the hub in real time
  • You have a developer available

Note that the out-of-the-box virtual OData provider does not have a mechanism for handling access tokens for authentication/authorization against another Dataverse environment. You will need to handle this yourself, either by writing your own data provider or by using a service in between to handle the requests/responses.
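
To illustrate the second route (a service in between), here is a very rough sketch of a relay that accepts an OData read, attaches a client-credentials token for the hub environment, and forwards the request. It's only a sketch under assumptions: the URLs, app registration and Flask/MSAL plumbing are placeholders, and a real relay would also need its own inbound authentication, paging and proper error handling.

```python
# Very rough sketch of the "service in between" option: a small relay that adds
# a client-credentials token for the hub environment and forwards read-only
# OData requests. URLs/credentials are placeholders; a real relay also needs
# its own inbound authentication, paging and proper error handling.
import msal
import requests
from flask import Flask, Response, request

app = Flask(__name__)

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret>"
HUB_URL = "https://org-data-hub.crm4.dynamics.com"

_msal_app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)


def hub_token() -> str:
    # MSAL caches the token in memory and renews it when it expires.
    return _msal_app.acquire_token_for_client(
        scopes=[f"{HUB_URL}/.default"])["access_token"]


@app.route("/odata/<path:entity_path>", methods=["GET"])
def relay(entity_path: str):
    """Forward a read-only OData request (with $select/$filter etc.) to the hub."""
    upstream = requests.get(
        f"{HUB_URL}/api/data/v9.2/{entity_path}",
        params=request.args,  # pass query options straight through
        headers={"Authorization": f"Bearer {hub_token()}",
                 "Accept": "application/json"},
    )
    return Response(upstream.content,
                    status=upstream.status_code,
                    content_type=upstream.headers.get("Content-Type",
                                                      "application/json"))


if __name__ == "__main__":
    app.run(port=5000)
```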

 

Canvas App

  • Good if you don't plan to create Product Detail records from within your model-driven apps. You won't be able to use the embedded canvas app on creation of a new record; it will only show up once the record is saved and the field it's tied to has a value.

 

 

Synchronizing the product data from your hub to any subscribers is a decent option if virtual entities don't satisfy your needs. There are quite a number of ways to achieve this (a rough sketch of the underlying delta-read/upsert pattern follows the options below):

 

Power Automate end to end - event-based trigger

  • Good for low to medium volume or low to medium delta load
  • Good for near real time
  • Good for a single subscriber 

 

Power Automate + Azure Event Grid + Azure Logic Apps 

  • Good for low to medium volume or low to medium delta load
  • Good for near real time
  • Good for multiple subscribers 
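
To make the fan-out part of the option above a bit more concrete: the middle tier (e.g. a flow or Azure Function reacting to the Dataverse change) would publish a change event to an Event Grid custom topic, and each subscriber environment gets its own subscription (Logic App or flow) that applies the change. Below is a rough sketch of the publish side only; the topic endpoint, key and event payload schema are made up.

```python
# Rough sketch of the publish side only: push a "product changed" event from
# the hub to an Event Grid custom topic so several subscriber environments can
# each have their own subscription (Logic App / flow) that applies the change.
# Topic endpoint, key and event payload schema are made up for illustration.
from azure.core.credentials import AzureKeyCredential
from azure.eventgrid import EventGridEvent, EventGridPublisherClient

TOPIC_ENDPOINT = "https://product-changes.westeurope-1.eventgrid.azure.net/api/events"
TOPIC_KEY = "<topic-access-key>"

client = EventGridPublisherClient(TOPIC_ENDPOINT, AzureKeyCredential(TOPIC_KEY))


def publish_product_change(product_id: str, change_type: str, payload: dict) -> None:
    """Publish one change event; subscribers filter on event_type/subject."""
    event = EventGridEvent(
        subject=f"datahub/products/{product_id}",
        event_type=f"DataHub.Product.{change_type}",  # e.g. Created / Updated
        data=payload,                                 # the changed columns
        data_version="1.0",
    )
    client.send(event)


# Example: the flow/function reacting to a Dataverse change calls this with
# the changed record's key columns.
publish_product_change(
    "11111111-2222-3333-4444-555555555555",
    "Updated",
    {"name": "Widget A", "price": 19.99},
)
```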

 

Azure Data Factory

  • Good for high volume or high delta load
  • Good for non-real time
  • Can send to multiple destinations (subscribers) in a single data factory instance. 

Note that if the sync is meant to be one way only, you'll want to ensure appropriate change management is in place and take appropriate measures to lock down the product data in the subscriber environments, e.g. configure security so that only an administrator or the service principal used for data integration can create/modify records.
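
Whichever of the above runs the sync, the core one-way logic is the same: read the products changed in the hub since the last run, then upsert them into each subscriber keyed on the hub's GUID. Below is a rough sketch of that pattern straight against the Dataverse Web API (Power Automate / Logic Apps / Data Factory would do the equivalent with their connectors); table and column names and the watermark handling are simplified and made up.

```python
# Rough sketch of the core one-way sync (delta read from the hub + upsert into
# a subscriber, keyed on the hub's GUID). In practice this logic would live in
# a flow, Logic App or Data Factory pipeline; table/column names and the
# watermark handling are simplified and made up.
from datetime import datetime, timedelta, timezone

import msal
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<sync-service-principal-client-id>"  # the only identity allowed to write
CLIENT_SECRET = "<client-secret>"
HUB_URL = "https://org-data-hub.crm4.dynamics.com"
SUB_URL = "https://use-case-env.crm4.dynamics.com"


def token(env_url: str) -> str:
    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    return app.acquire_token_for_client(scopes=[f"{env_url}/.default"])["access_token"]


def sync_products(last_run_utc: datetime) -> None:
    # 1. Delta read: only products modified in the hub since the last run.
    watermark = last_run_utc.strftime("%Y-%m-%dT%H:%M:%SZ")
    changed = requests.get(
        f"{HUB_URL}/api/data/v9.2/new_products"
        f"?$select=new_productid,new_name,new_description"
        f"&$filter=modifiedon gt {watermark}",
        headers={"Authorization": f"Bearer {token(HUB_URL)}",
                 "Accept": "application/json"},
    ).json()["value"]

    # 2. Upsert into the subscriber: PATCHing the same GUID creates the row if
    #    it doesn't exist yet, otherwise it updates the existing row.
    sub_headers = {"Authorization": f"Bearer {token(SUB_URL)}",
                   "Content-Type": "application/json"}
    for p in changed:
        resp = requests.patch(
            f"{SUB_URL}/api/data/v9.2/new_products({p['new_productid']})",
            headers=sub_headers,
            json={"new_name": p["new_name"],
                  "new_description": p["new_description"]},
        )
        resp.raise_for_status()


# Example: sync everything changed in the last hour.
sync_products(datetime.now(timezone.utc) - timedelta(hours=1))
```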

 

---
Please click Accept as Solution if my post answered your question. This will help others find solutions to similar questions. If you like my post and/or find it helpful, please consider giving it a Thumbs Up.

 


 

EricRegnier
Super User

Hi @PowerRanger,

Apologies for an answer with links, but they explain it well and are the published Microsoft articles about environment strategies. These would be the "official" best practices:

I'd also suggest going through these documents on ALM and governance:

Hope this helps!
