Sven_Rahn
New Member

Power Platform Build Tools: Connection reference

Hi!

So we successfully implemented ALM with some Azure DevOps Pipelines and the official Power Platform Build Tools. But there's one problem:

When importing a solution that contains a flow and its connection reference, the connection reference isn't updated correctly and the flow just turns itself off.
It's as if I had imported the solution with the old solution explorer UI. The new Power Apps solution UI asks during import to update the connection references, but the old UI and the Power Platform Build Tools in Azure DevOps just disable the flow.

So after automating everything, we still have to go into the prod environment, update each connection reference, and activate each flow (even if it didn't change). I'd say the process is worse now than without ALM.

Are there any workarounds or possibilities to use the "new UI" import API? Or am I missing something? How should this be handled with ALM?

13 REPLIES
ryanspain
Advocate I

Hi Sven,

 

While not truly automated, I have found that if a flow has already been configured (connections resolved) using the new import solution experience in the target environment, then subsequent solution imports of that flow (manually or via a pipeline) will keep the flow state as 'On'.

 

So in summary:

  • Import the flows solution manually and configure in the target environment
  • Subsequent solution imports with the same flows should keep the flows state as 'On'

This however isn't ideal when you are introducing a new flow to the target environment, for which you would need to follow the above process again.

 

Hope this helps,

Ryan

drivard
Advocate II

Hi,

 

You might want to have a look at this gist from Marc Schweigert of Microsoft.

https://gist.github.com/devkeydet/f31554566b2e53ddd8e7e1db4af555a6 

 

It explains the current limitation and a possible workaround. I haven't tried it yet though.

 

Another possible avenue that I can think of (while not fully automated, and far from ideal, I agree):

- Remove the connection references from your main solution and put them in a separate solution.

- Install the connection references solution manually in the target environment using the new UI and update the connections (these should not change very often)

- Now your automated deployments should not turn off the flows on every release

 

Hope this helps,

 

 

 

Sven_Rahn
New Member

Thanks for the suggestions!

I imported the connection references into the other environment and updated them to point to the right connections. The logical names of the connection references in dev and prod are the same.


But still, when I import the flow with the pipeline, it always disables the flow. I did the first import manually and the second import with the pipeline via an app user. All imports by the app user disable the flow.

 

I am pretty sure this is because the connection used by the flow is owned not by the app user but by the "normal" admin user account. If the app user were the owner of the connection, I guess it would work. But I don't see a way to change the owner of a connection; the connection doesn't seem to be an easily accessible record.

See this https://community.dynamics.com/crm/b/crminthefield/posts/using-connection-references-with-power-auto...


So, the pipeline has to use the same user as the owner of the connection. In the pipeline I can only use an app user because of MFA. But then I need a way to set the owner of the connection.

Is Microsoft aware of this issue?

Edit: I was able to import the flow by impersonating the owner of the connection in the XrmToolBox, and the flow wasn't disabled. So it worked! But sadly I can't see a way to impersonate another user with the Power Platform Build Tools.

The workaround I use for this is a script in my pipeline that impersonates the owner of the connection and turns on all the flows that use that connection.
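For anyone looking for a starting point, a minimal sketch of that approach could look like the snippet below. Everything here is an assumption to adapt: the placeholder org URL, token, owner object ID, and connection reference logical name, and in particular the `contains(clientdata, ...)` filter, which relies on the flow's definition (the workflow table's clientdata column) containing the connection reference logical name as plain text.

```powershell
# Sketch: find cloud flows that reference a given connection reference,
# then activate them while impersonating the connection's owner.
# $token must hold a valid Dataverse bearer token acquired earlier in the pipeline.
$orgUrl  = "https://XXXXXXXX.crmX.dynamics.com"
$token   = "eyJ0eXAi..."
$ownerId = "c85463fb-XXXX-XXXX-XXXX-3c73fc279d03"   # AAD object ID of the connection owner
$connRef = "new_sharedcommondataserviceforapps_b123" # connection reference logical name (example)

$headers = @{
    "Authorization"  = "Bearer $token"
    "CallerObjectId" = $ownerId                      # impersonate the connection owner
    "Content-Type"   = "application/json"
}

# category eq 5 = modern (cloud) flow; statecode eq 0 = draft/off
$query = "$orgUrl/api/data/v9.2/workflows?`$select=workflowid,name" +
         "&`$filter=category eq 5 and statecode eq 0 and contains(clientdata,'$connRef')"
$flows = (Invoke-RestMethod -Uri $query -Headers $headers -Method 'GET').value

foreach ($flow in $flows) {
    # statecode 1 / statuscode 2 = Activated
    Invoke-RestMethod -Uri "$orgUrl/api/data/v9.2/workflows($($flow.workflowid))" `
        -Method 'PATCH' -Headers $headers -Body "{ 'statecode': 1, 'statuscode': 2 }"
    Write-Host "Activated flow: $($flow.name)"
}
```

This requires a live environment and valid credentials, so treat it as a pattern rather than a drop-in script.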

Hi Scott,

Would you please show how we can use a script in the pipeline to impersonate the owner of the connection and turn on all the flows? Can you provide a script snapshot?

Thanks.

Hi Scott,

 

I would like to see the script as well.  Would save a TON of time.

ryanspain
Advocate I

@ScottCostello/@stone_haha,

 

You can activate/switch on cloud flows from an Azure DevOps pipeline using PowerShell while impersonating another user, e.g. the owner of the cloud flows. Below is a sample script for doing this.

 

 

# Impersonate the target user by passing their AAD object ID in the CallerObjectId header
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("CallerObjectId", "c85463fb-XXXX-XXXX-XXXX-3c73fc279d03")
$headers.Add("Authorization", "Bearer eyJ0eXAi...")
$headers.Add("Content-Type", "application/json")

# statecode 1 / statuscode 2 represent an activated workflow
$body = "{ 'statecode': 1, 'statuscode': 2 }"

# PATCH the workflow (cloud flow) record to switch it on
$response = Invoke-WebRequest -Uri 'https://XXXXXXXX.crmX.dynamics.com/api/data/v9.2/workflows(bc9d6d8f-XXXX-XXXX-XXXX-000d3ab8a215)' -Method 'PATCH' -Headers $headers -Body $body

$response.StatusCode
$response.StatusDescription

 

 

Some notes on the above:

  • The CallerObjectId header should be the object ID of the AAD user you want to impersonate - see Impersonate another user
  • You need to pass a Bearer token in the Authorization header - see Use OAuth authentication with Microsoft Dataverse
  • A statecode of 1 and statuscode of 2 represent an activated workflow - see the Workflow table/entity reference
  • You need to update the Uri parameter of Invoke-WebRequest to use your environment's name, region, and the workflow ID.
  • Most of the parameters used in this PowerShell script could be set up as variables in the pipeline using the PowerShell script task.
  • This example only activates a single flow, but the same process applies to activating multiple: you could loop over a list of cloud flows and attempt to activate each one, with a retry pattern for flows that fail to activate.
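The loop-and-retry idea in the last note might be sketched as follows. This reuses the $headers dictionary from the sample above; the flow IDs and retry counts are placeholder assumptions (in practice the IDs would come from pipeline variables or a Dataverse query), and the retry helps when a flow can't activate until its child flows are already on.

```powershell
# Sketch: activate a list of cloud flows, retrying each a few times.
$flowIds = @(
    "bc9d6d8f-XXXX-XXXX-XXXX-000d3ab8a215",
    "a1b2c3d4-XXXX-XXXX-XXXX-000d3ab8a216"
)

foreach ($flowId in $flowIds) {
    for ($attempt = 1; $attempt -le 3; $attempt++) {
        try {
            Invoke-WebRequest -Uri "https://XXXXXXXX.crmX.dynamics.com/api/data/v9.2/workflows($flowId)" `
                -Method 'PATCH' -Headers $headers -Body "{ 'statecode': 1, 'statuscode': 2 }"
            break                                    # success: move on to the next flow
        }
        catch {
            Write-Warning "Attempt $attempt failed for $flowId : $_"
            Start-Sleep -Seconds (5 * $attempt)      # simple backoff before retrying
        }
    }
}
```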

Thanks Ryanspain!

 

I've got almost all of it working, except that from the Invoke-WebRequest I get an error returned saying (400) Bad Request. Unfortunately I don't see any other information.

 

I believe I've authenticated correctly, but something is off.

I think I have the correct workflow ID; I got it from the flow edit URL.

 

Could this be an issue with privileges? 

Hi @ScottCostello,

 

You would receive a 401 Unauthorised if it was a privilege issue. A response code of 400 indicates a bad request, which could mean a problem with the input arguments.

 

For example, using curly braces in the workflow identifier as below would result in a 400 bad request response.

 

.../api/data/v9.2/workflows({bc9d6d8f-XXXX-XXXX-XXXX-000d3ab8a215})

 

 

If everything looks ok on your end, try pasting your snippets (with sensitive info omitted) here and I'd be happy to cast an eye over.

 

Finally, I'm no expert on PowerShell, but I generated most of my supplied sample using a neat feature of Postman that lets me turn an HTTP request into a PowerShell script.

 
