
PowerShell script as an intermediate step

I'd like to be able to send the output of one step into a PowerShell script and make the returned object (and its key/value pairs) available to the next step.


Such power!

Status: New
Power Automate

You can also do this without Azure - Check out this blog on local code execution using Microsoft Flow and the on-premises Data gateway:

Advocate I

Hi Sunay,


I literally tried this using the File System method for running PowerShell, but was not successful. I have a working connection to my file system using the data gateway. But when I use it to just append to or even create a file, I get an error which says


message: "The requested action could not be completed. Check your request parameters to make sure the path '\\\\ServerName\\FolderName\/AuditLogExtract.txt' exists on your file system."


What also seems strange to me is that Flow adds extra slashes in the path for some reason.

I have also ensured that my user is in the local admin group on the server where the file is located, and I have shared that file with my user explicitly. But no luck.
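Not an official explanation, but a plausible one: the doubled backslashes may just be JSON escaping in the error payload rather than the path the connector actually used. Treating the quoted value from the error message as a JSON string and decoding it (a quick Python sketch; ServerName/FolderName are the placeholders from the error) shows the underlying path:

```python
import json

# The quoted path exactly as it appears in the Flow error message.
escaped = r'"\\\\ServerName\\FolderName\/AuditLogExtract.txt"'

# JSON-decode it: each \\ collapses to \ and \/ collapses to /.
actual = json.loads(escaped)
print(actual)  # \\ServerName\FolderName/AuditLogExtract.txt
```

Note that after decoding, the path still mixes a forward slash before the file name with backslashes elsewhere; if the decoding assumption holds, that mixed separator, not the doubled backslashes, would be the thing to investigate.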


New Member

I need to fire a powershell script after a file is dropped into a onedrive folder


Regular Visitor

I absolutely agree with the request. Running PowerShell scripts would significantly simplify workflows and would allow implementing functionality that is cumbersome or next to impossible to implement otherwise, e.g. provisioning sites based on PnP XML templates, automating tenant management tasks, provisioning workflows for new sites, automating disposition and archival processes, etc.


This can be implemented in the following manner:


  • The list of available Power Automate activities should be managed at tenant level. The PowerShell activity can be disabled by default for new tenants, which would trigger an approval process to justify the decision to enable such powerful functionality.
  • PowerShell activity inputs and outputs should be standardized. Output values, $Error contents, and user messages (e.g. console/host output) should appear as part of the output object (e.g. a JSON object that conforms to a Power Automate activity IO schema). Thus, execution results can be handled, reported on, or persisted.
  • PowerShell scripts should run in the context of the user executing the workflow or by passing app credentials. This should allow access to Azure resources like Key Vault, triggering Azure Functions, etc. Having the Azure, SharePoint Online Management Shell, Exchange, and other M365 management modules available would be a bonus. The list of installed modules and their versions can be managed at tenant level.
  • Ideally, a tenant would have its own Cloud Shell instance, which has PowerShell script execution and file storage capabilities. Remote script execution needs to be enabled. In this case, the Power Automate PowerShell activity would be able to utilize remote script execution within the Cloud Shell instance and reuse its scripts and resource files. This can be a prerequisite for enabling the PowerShell activity in the tenant; alternatively, the instance would have to be specified in the PowerShell activity properties.
  • Network traffic, processing, storage, and other resources utilized by the M365/Azure management shell host (Cloud Shell host) should be managed similarly to other Azure resources. And each Power Automate activity request to execute a PowerShell script would also count against the daily Power Platform request limits.
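To illustrate the standardized-IO bullet above, a wrapper around the script could serialize its return values, $Error contents, and host output into a single JSON object for the next step to consume. A minimal Python sketch of what such an output object might look like (all field names here are assumptions, not an existing Power Automate schema):

```python
import json

# Hypothetical standardized output of a "Run PowerShell" activity.
result = {
    "outputs": {"SiteUrl": "https://contoso.sharepoint.com/sites/Demo"},
    "errors": [],                                # serialized $Error entries
    "hostMessages": ["Provisioning complete."],  # Write-Host / console output
    "exitCode": 0,
}

# Serialize for the activity's output; a subsequent Parse JSON action
# could then expose outputs.SiteUrl, errors, etc. as dynamic content.
payload = json.dumps(result)
print(payload)
```

With a shape like this, execution results could be branched on (e.g. a Condition on exitCode), reported, or persisted exactly as the bullet describes.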