
Detailed Step-By-Step - Power Platform ALM with Azure DevOps



  • Azure DevOps Repositories can be used as Source Control for Power Platform Solutions
  • CI/CD can be implemented using Azure DevOps Pipelines
  • We can use Microsoft Power Platform Build Tools to automate common build and deployment tasks related to apps built on Microsoft Power Platform. There are two versions of the Build Tools: an older one based on PowerShell and the latest one based on the Power Platform CLI


Power Platform ALM with Azure DevOps Process Flow


To implement a CI/CD process with Azure DevOps Pipelines, we store the Power Platform solution in source control. There are two main paths:

  1. Export the unmanaged solution and place it unpacked in the source control system. The build process packs and imports the solution as unmanaged into a temporary build environment, then exports the solution as managed and stores it as a build artifact in your source control system.
  2. Export the solution as unmanaged and also export it as managed, and place both in the source control system.



Fig 1


In this article, we will show the steps to achieve Option 2 above; Fig 1 depicts the process.


1. Set up source and target environments


Create source and target environments; both should have a Dataverse database enabled. Create an unmanaged solution in the source environment.


2. Set up Azure DevOps


  1. Create an Azure DevOps organization
  2. Create a Project within it
  3. Create a Repository to hold the source code
  4. Install Microsoft Power Platform Build Tools into your Azure DevOps organization from the Azure DevOps Marketplace
  5. Request parallelism if you are using Azure DevOps pipelines for the first time
  6. Within the Project, navigate to Project Settings > Repositories > Security tab. Under User permissions, make sure the Contribute permission is set to Allow for Project Collection Service Accounts (under Azure DevOps Groups) and for <ProjectName> Build Service (<OrgName>) (under Users)






3a. Create Azure DevOps Pipelines with Application ID and Client Secret


Create Azure AD App Registration


  1. Go to the Azure portal
  2. Search for App registrations, click New registration
  3. Provide the name, keep the other fields at their default values and click Register
  4. Once the app is created, go to API permissions, click Add a permission > select Dynamics CRM > Add permission > Grant admin consent for <tenant>
  5. Go to Overview > Client credentials > New client secret. Copy the value into a notepad, as it will be needed later and you won't be able to retrieve it once you navigate away from this page.
  6. Come back to Overview and copy the Application (client) ID and Directory (tenant) ID into the same notepad. You will need these three values while creating the service connection.


Add the service principal as an App user in the Power Platform source and destination environments.


  1. Go to Power Platform Admin Center > Environments
  2. Select your source environment
  3. From the right navigation, Users > See all > App users list
  4. Click New app user, search for the app created in the previous step, add it and assign the System Customizer or System Administrator role
  5. Repeat all the steps above for the destination environment


Create Service Connection with Application ID and Client Secret


  1. Go to your Azure DevOps project and click Project Settings.
  2. Under Pipelines, click Service connections > New service connection > select Power Platform
  3. Select the authentication method Application ID and client secret
  4. Open your source environment, go to Settings > Session details, copy the Instance url and paste it under Server Url
  5. Paste the Tenant Id, Application Id and Client Secret saved earlier
  6. Save the service connection with the name “Dev Service Principal”
  7. Follow steps 2 to 6 above again; this time get the destination environment url, create the service connection and save it as “Prod Service Principal”


Create Pipeline – Export from Source


i.  From the left navigation within the Project, click on Pipelines >New Pipeline>Use the Classic Editor

ii.  Select the Source as Azure Repos Git, select your Project, Repository and Branch and click continue



iii.  Under select template, start with Empty job




iv.  Click Agent job 1 and make sure Allow scripts to access the OAuth token is checked




v.  Add the task Power Platform Tool Installer with task version 2




vi.  Add the task Power Platform Export Solution. We are adding the task to export the solution as unmanaged here







For Service Connection, Select Service Principal> Select Dev Service Principal from Dropdown

Provide your Solution Name (not the display name)

Solution output file name $(Build.ArtifactStagingDirectory)\<SolutionName>.zip

Uncheck Export as Managed Solution


vii.  Copy the above task. This time we are exporting the managed solution. Keep all settings the same, only check the box Export as Managed Solution and change the Solution output file name (for example, $(Build.ArtifactStagingDirectory)\<SolutionName>_managed.zip) so it does not overwrite the unmanaged zip


viii.  Add the task Power Platform Unpack Solution




Solution Input File -$(Build.ArtifactStagingDirectory)\<SolutionName>.zip

Target Folder to Unpack Solution - $(Build.SourcesDirectory)\<SolutionName>

Type of Solution – Both


ix.  Add a Command line script task, and paste the script below

echo commit all changes
git config user.email "<email>"
git config user.name "<user name>"
git checkout -B main
git add --all
git commit -m "code commit"
git push --set-upstream origin main


x.  Save and queue the pipeline and wait for it to finish

xi.  Check the repository for the unpacked source code
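For teams preferring YAML over the classic editor, the export pipeline above can be sketched roughly as follows. This is a sketch, not a definitive implementation: the task names and inputs follow the current Power Platform Build Tools (v2) tasks, and "Dev Service Principal" and <SolutionName> are the placeholder names used in this article, so substitute your own values.

```yaml
trigger: none

pool:
  vmImage: 'windows-latest'

steps:
- checkout: self
  persistCredentials: true   # equivalent of "Allow scripts to access the OAuth token"

# Install the Power Platform CLI-based tooling used by the tasks below
- task: PowerPlatformToolInstaller@2

# Export the solution as unmanaged
- task: PowerPlatformExportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'Dev Service Principal'
    SolutionName: '<SolutionName>'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\<SolutionName>.zip'
    Managed: false

# Export the same solution again, this time as managed
- task: PowerPlatformExportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'Dev Service Principal'
    SolutionName: '<SolutionName>'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\<SolutionName>_managed.zip'
    Managed: true

# Unpack the unmanaged zip into the repository working directory
- task: PowerPlatformUnpackSolution@2
  inputs:
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)\<SolutionName>.zip'
    SolutionTargetFolder: '$(Build.SourcesDirectory)\<SolutionName>'
    SolutionType: 'Both'

# Commit the unpacked source back to the repo
- script: |
    git config user.email "<email>"
    git config user.name "<user name>"
    git checkout -B main
    git add --all
    git commit -m "code commit"
    git push --set-upstream origin main
  displayName: 'Commit unpacked solution'
```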





Create Deployment Settings File


  1. Open Visual Studio Code
  2. Install PAC CLI
  3. Run the below command to export the solution to your local machine


pac solution export --name <solutionname> --path .\ --managed false


  4. Run the below command to create the Deployment Settings file


pac solution create-settings --solution-zip .\<SolutionName>.zip --settings-file <SolutionName>.json

  5. Update the values in the Deployment Settings file for the target environment

  6. In the repository, create a folder named Settings, create a file <SolutionName>.json within it, and copy in the text from the Deployment Settings file
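For reference, a deployment settings file generated by pac solution create-settings has roughly this shape. All schema names, connection ids and connector ids below are placeholders for illustration — yours will differ:

```json
{
  "EnvironmentVariables": [
    {
      "SchemaName": "prefix_MyVariable",
      "Value": "<value for the target environment>"
    }
  ],
  "ConnectionReferences": [
    {
      "LogicalName": "prefix_sharedsharepointonline_xxxxx",
      "ConnectionId": "<connection id from the target environment>",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline"
    }
  ]
}
```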




Create Build Pipeline


  1. Create a new pipeline with the Classic Editor > Empty job
  2. Add the task Power Platform Tool Installer
  3. Add the task Power Platform Pack Solution



Source Folder of Solution to Pack -Select Folder by clicking 3 dots

Solution Output File -<SolutionName>.zip

Type of Solution -Both

  4. Add the task Copy Files



Source Folder -Settings

Contents -**

Target Folder - $(Build.ArtifactStagingDirectory)


  5. Add the task Publish Artifact





Path to publish - $(Build.ArtifactStagingDirectory)

Artifact Name – drop


  6. Save and queue the Build pipeline
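The build pipeline above maps to YAML roughly as below. CopyFiles@2 and PublishBuildArtifacts@1 are the standard Azure DevOps tasks behind the classic Copy Files and Publish Artifact steps; the folder and file names are the placeholders used in this article, so treat this as a sketch and adjust paths to your repository layout.

```yaml
steps:
- task: PowerPlatformToolInstaller@2

# Pack the unpacked source back into solution zips (Both = unmanaged + managed)
- task: PowerPlatformPackSolution@2
  inputs:
    SolutionSourceFolder: '<SolutionName>'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\<SolutionName>.zip'
    SolutionType: 'Both'

# Ship the deployment settings alongside the packed solution
- task: CopyFiles@2
  inputs:
    SourceFolder: 'Settings'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

# Publish everything as the 'drop' artifact consumed by the release pipeline
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```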


Create Release Pipeline


  1. From the left navigation, click Releases > New release pipeline > start with an Empty job
  2. Add the artifact created in the Build pipeline



Select your project and build pipeline

Source alias -drop

  3. Under Stages > Stage 1, click the job
  4. Add the task Power Platform Tool Installer
  5. Add the task Power Platform Import Solution


  6. Service Connection – select the service connection you created for Prod
  7. Solution Input File – select the managed zip file for Prod by clicking the 3 dots
  8. Check Use Deployment Settings File and select the deployment settings file by clicking the 3 dots
  9. For Prod, under Advanced, check Import as a Managed solution
  10. Save the pipeline and create a Release
  11. Check that the solution has been deployed properly to Production
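The release stage above can likewise be sketched in YAML. Task names and inputs follow the Power Platform Build Tools (v2) tasks; the paths assume the build published a 'drop' artifact as in this article, and the managed zip and settings file names are placeholders.

```yaml
steps:
- task: PowerPlatformToolInstaller@2

# Import the managed solution into the target environment,
# applying the deployment settings for that environment
- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'Prod Service Principal'
    SolutionInputFile: '$(Pipeline.Workspace)/drop/<SolutionName>_managed.zip'
    UseDeploymentSettingsFile: true
    DeploymentSettingsFile: '$(Pipeline.Workspace)/drop/<SolutionName>.json'
```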


3b. Create Azure DevOps Pipelines with Managed Identity


    1. Create a VM Scale Set and assign a Managed Identity
    2. Create a self-hosted agent pool and point it to the VM Scale Set
    3. Add the managed identity as an App user in the Power Platform source and destination environments
    4. Create a service connection with the managed identity
    5. Create the build pipelines (Export from Source and Build Solution) and the release pipeline (Deploy to Destination), using the previously created self-hosted agent pool
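In a YAML pipeline, pointing a job at the self-hosted agent pool from step 2 is a one-line pool declaration. The pool name below is an assumption for illustration — use the name you gave your pool:

```yaml
pool:
  name: 'PowerPlatformScaleSetPool'   # self-hosted pool backed by the VM Scale Set
```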


Please follow this blog for detailed steps.





Hi Suparna,


Great article. We are trying to set up a DevOps process for the Power Platform in our agency. We are using our on-prem Azure DevOps Server. I was able to export the solution from our dev Power Apps environment and create a build. But when trying to deploy the build artifact to the test environment I am getting the error below.


Error: RetrievePrivilegeForUser: The user with id {GUID} has not been assigned any roles. They need a role with the prvReadAsyncOperation privilege.


We do not see any user with the id {GUID} in our AAD. Not sure how to resolve this issue. Any help or suggestion will be appreciated. 


Thank you,


I could export the solution to the target environment, but I'm not able to play the app. It happens only when I export the solution through pipelines; it works well when exporting manually. Please look into this issue.

@sxm0275  Did you add the Service Principal as an App user to the destination environment with at least the System Customizer role? Please go to the Azure portal > App registrations, search for the service principal you created for this pipeline, and check that its Application (client) ID matches the GUID shown in the error.

@angaravlgs When you export the app using pipelines with a service principal, the service principal (App user) becomes the owner of the app, and any other user will get access denied since the app is not shared with them. You need to log in as an admin and share the app with the users, including the admin. When you import manually, the app owner is the user who is importing, so he/she can access the app.

@suparna-banerje The service principal was missing the sys admin/customizer role in the destination. Adding the roles got it working. Thank you. 

Hi @suparna-banerje 


Your method worked and the pipeline went through without any errors. However, in the target environment, I could not see the connection established from the deployment settings file. I had to manually edit the app and configure the connection. Is there anything I am missing?

This is giving us issues between the unmanaged and managed environment. I am able to package, export, import and deploy but it is not behaving the same way in both environments which is beyond frustrating when trying to move your application through the ALM and out to production.


Also, the Dataflows are not easily migrated and the connections between services that are dev/test/prod are not easily switched. It makes for further headaches when it should be seamless.

@onewabash Can you please provide more details on what is behaving differently in the 2 environments and any other specific issues you are facing?

@Kavya1 Are you having this issue for all connections or only for some? If for some, can you please let me know which connections are having this problem?

@suparna-banerje I tried for SQL connection and got to know there is no option to create an environment variable for SQL database. 

@suparna-banerje It seems that the issue is around the Navigate variables that are being created between screens. The lead product engineer shows in the development environment, but not in test. It is being looked up using a nested gallery (a gallery within a gallery item). So it looks up the order information in a gallery and then brings in the lead product engineer information in a nested gallery within that gallery. Going to try and make it a global variable with the Set command and see if that fixes the issue.


Thanks a lot for the post, @suparna-banerje .


However, I have a question: How do I handle plugin assemblies when packing my solution?

When I unpack, the .dll files are exported as "" which will not work when using the "Pack Solution" task.


I understand I must somehow "map" the .dll file using this option in the Pack Solution task. But how is this done? Do I need to pack both the solution and the plugin component separately and use the "Add Solution Component" afterward for the plugin part?


Currently my "Pack Solution" task is failing because it is trying to find the .dll file for my plugin, but the repository only contains the "" exported from the "Unpack Solution" task.


So my question is, how do I map this .dll file correctly so my "Pack Solution" task works? 🙂

Unfortunately, the documentation I have found isn't really helping me understand this step.





Is it possible to perform the step Create Deployment Settings File as part of the DevOps task to fully automate the process? What changes need to be made to achieve this? Thanks


Yes, you can perform the step "Create Deployment Settings File" as part of the DevOps pipeline. You can use the PowerShell task to achieve it.
For example:

  - powershell: |
      Write-Output "Creating SolutionSettings"
      $env:PATH = $env:PATH + ";" + "$(pacPath)"
      pac solution create-settings --solution-folder $(unpackaged_folder) --settings-file "SolutionSettings/settings$(solutionname)Dev.json"
    displayName: 'Generate SolutionSettings.json'

I also used PowerShell to check whether the solution settings files also exist for UAT and Prod. If they exist, they can be updated via PowerShell. Sadly, you still need to set the properties yourself once. Maybe a Canvas App could help here to be more user friendly. 

I hope this helps. 

I am using Patches in MS Dynamics. Did any of you experience the CI/CD process including patches? 

I would like to know if the opportunities for this use case are limited or not by Microsoft.

Thanks for this article @suparna-banerje 


When committing the unmanaged solution to the repo using a yaml pipeline, I found that you also need to include a git fetch command to bring the branches into the environment and update the branch after the initial commit.

Hi @suparna-banerje , thank you for the article; it's helpful for a lot of us moving forward in automating the deployment. However, I am facing an issue wherein the environment variables and connection references are not getting updated with the new values in the destination. They are still retaining the old values. Any inputs on where I need to check, as I have followed all the steps in the article?

Below are my deployment settings entries. All these variables and connections are still pointing to Dev, even though the solution is imported successfully without any errors.

  "EnvironmentVariables": [
    {
      "SchemaName": "cr5f7_varListName",
      "Value": "NewEmployeeOnboardingUAT"
    },
    {
      "SchemaName": "cr5f7_varSiteURL"
    }
  ],
  "ConnectionReferences": [
    {
      "LogicalName": "cr5f7_sharedsharepointonline_8d82e",
      "ConnectionId": "",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline"
    }
  ]

Hi @Akash_S 


Environment Variables:

Did you remove current value before deployment?


Doing so ensures that the default value is moved over to the destination/target environment. That is, the default value should hold the value meant for the destination/target environment.


Connection Reference:

Did you ensure there is an equivalent connection reference in the destination/target environment before deployment?


Create one if there isn't, or deployment will fail.


Just my two cents.

@JLKM - Yes, the current value is cleared in the source environment for all variables.

For connections as well, there is a connection already created in the destination and I updated the same in the deployment settings file. 


I've imported the solution into the target environment with empty values in the ConnectionID field in the DeploymentSettings.json. However, when I try to pre-populate or update values in the Deployment Settings file for the target environment, I encounter a generic error. 


I attempted to manually add the connection in the target environment and even created a canvas app with a gallery to verify the connection references, but the connection ID is empty.



How can I resolve this? TIA


@eudaimonia_ If you are deploying through service principal, can you please try sharing the connection in the target environment with the service principal, then add the connection id in the deployment settings file and try to deploy?

@Akash_S Are you still facing the issue of environment variables not updating in the target environment? There are some recent product updates on this; can you please try again?

@badovinacs  This worked for me while committing using yaml pipeline


    script: |
      echo commit all changes
      git config user.email "<email>"
      git config user.name "<user name>"
      git checkout -B main
      git add --all
      git commit -m "solution init"
      echo push code to new repo
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push --set-upstream origin main

Hello @suparna-banerje 


I have followed the steps outlined here, but in the Build Solution process (Pack Solution) I am getting this error: "Error: Cannot find required file 'D:\\a\\1\\s\\mysolution\\Other\\Customizations.xml'."




My pipeline tasks are as follows: 



My 1st pipeline completed successfully and added to my repo as shown below.  The Customizations.xml file exists; just not sure what I am doing wrong.  Can you please advise?





@talyrobin1  Why is it searching inside the folder mysolution? You may click the 3 dots and select the source folder. Can you please check the SolutionName variable in your Export and Build pipelines; ideally it should be the name of the solution (not the display name), without .zip

@suparna-banerje , thank you for the reply.

I checked my variable and it is correct.  My issue was that when I unpacked the solution in the PP Unpack Solution task, I used $(SolutionName).zip.  However, I was trying to unpack only $(SolutionName) (no .zip).  I have fixed this and it works.

Thanks so much for pointing me in the right direction!  Much appreciated.

Great article, extremely helpful. Everything runs fine, but I have been trying to achieve something that is not working:

The idea is that multiple developers can push their code to their own branches in Azure DevOps, and then the branches will be merged for deployment.

I have this set of commands in my YAML. The YAML is on my main branch, and I am trying to push the solution to a custom FeatureBranch. It works if I run the pipeline from FeatureBranch but not from the main branch. 

echo commit all changes
git config user.email "<email>"
git config user.name "<user name>"
git checkout -B FeatureBranch
git add --all
git commit -m "solution init"
echo push code to new repo
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push --set-upstream origin

@fannazephyr You may refer to the ALM accelerator yaml for export solution and push to git, it has the code for pushing to a remote branch 

@suparna-banerje Thanks for your response, yes indeed i have been trying to get around it using ALM code but no success yet. But i will keep trying i guess 🙂 

Hello, do we need a paid Azure DevOps subscription for ALM? Our customers are data-sensitive organizations, so I am not confident with the free version of Azure DevOps.

Hello @suparna-banerje 


Great article, and it will definitely be helpful to everyone who works in Power Platform.

Is there a possibility to integrate the same with GitHub, like with Azure Repos?


Please advise! 

Thank you again