I'm trying to copy all of the files in a subfolder on one internal-only site to a subfolder on another, client-facing site. The internal site has this type of folder hierarchy:
Documents
  Year
    Project Name
      Contracts
      Interviews
      Attachments
I only want to copy the files in the Interviews folder to a new folder I'm creating during this new flow.
The link to the original folder lives in our database. The flow is going to be manually triggered to start; from there I plan to use an HTTP request to loop the flow on itself until the second step, which queries our database for the link to the internal site/folder, returns no results. With that link I can get the folder path, and I'm successfully getting the folder metadata using that path.
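To make the looping part concrete, here's roughly the skeleton I have in mind. The step names below are placeholders rather than my actual action names, and I'm assuming the database step returns a set of rows I can check for results:

When a HTTP request is received        (trigger; I kick this off manually the first time)
Query database for next folder link    (placeholder name for the second step)
Condition: the query returned at least one row
  If yes:
    Get folder metadata using path     (path parsed from the returned link)
    ... copy the files from that project's Interviews folder ...
    HTTP                               (POST back to the trigger URL so the flow runs again)
  If no:
    (nothing, so the looping stops)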
The problem is that I can't get Power Automate to recognize any of the files in the Interviews folder. In the Get files (properties only) step, the outputs section just says "Click to download." When I tried Alireza Aliabadi's method instead, the final Copy file action is just greyed out, and when I check the new SharePoint site that the files are supposed to be copied to, the folder is empty.
When I navigate to the current internal site to view the files in the Interviews folder, this is the folder path I see:
Here's what I put in the Get files (properties only) step:
And here's the current flow:
I suspect I'm doing something wrong with the Library Name and the Limit Entries to Folder option, but I can't seem to get the right combination there.
Any help would be greatly appreciated.
In the Get files (properties only) step I tried "Documents/2021/Project Name/Interviews" as the Library Name and got a 404 "list not found" error.
There are hundreds of projects that need to be moved, so the folder containing the desired files is going to change for every run.
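In case it helps to see what I mean, here's the kind of split I've been trying to land on (treat this as a guess on my part; I don't know whether this is the format the action actually expects, and the year/project segments would be swapped out dynamically on every run):

Library Name:              Documents
Limit Entries to Folder:   /Shared Documents/2021/Project Name/Interviews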
When I press "Click to download" in the results for the Get Files (properties only) step I get a JSON file with this at the bottom:
"body": {
"@odata.nextLink": "https://flow-apim-unitedstates-002-westus-01.azure-apim.net/apim/sharepointonline/shared-sharepointonl-b517260e-61b3-4eea-8095-22f6840a832a/datasets/https%253A%252F%252Fmycompany.sharepoint.com%252Fsites%252FInternalSite/tables/Documents/getfileitems?folderPath=%2fShared+Documents%2f2021%2fProjectName%2fInterviews&$skiptoken=Paged%3dTRUE%26RootFolder%3d%252fsites%252fInternalSite%252fShared%2520Documents%252f2021%252fProjectName%252fInterviews%26ix_Paged%3dTRUE%26ix_ID%3d00000",
"value": []
}
When I try to manually visit that URL I get this:
{
    "Message": "Missing Authorization header for a privileged call on connection.",
    "Source": "product policy"
}
OK, so I scrapped the built-in Get files (properties only) action in favor of a Send an HTTP request to SharePoint action, and that at least got me the correct results.
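The call I'm making looks roughly like this (I'm paraphrasing the action config, and the 2021/Project Name part of the path is built dynamically per run, so treat it as an example):

Site Address:  https://mycompany.sharepoint.com/sites/subsite
Method:        GET
Uri:           _api/web/GetFolderByServerRelativeUrl('/sites/subsite/Shared Documents/2021/Project Name/Interviews')/Files
Headers:       Accept: application/json;odata=verbose

Here's a sample of what that returns: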
{
    "body": {
        "d": {
            "results": [{
                "__metadata": {
                    "id": "https://mycompany.sharepoint.com/sites/subsite/_api/Web/GetFileByServerRelativePath(decodedurl='/sites/subsite/Shared Documents/2021/Project Name/Interviews/filename.docx')",
                    "uri": "https://mycompany.sharepoint.com/sites/subsite/_api/Web/GetFileByServerRelativePath(decodedurl='/sites/subsite/Shared%20Documents/2021/Project%20Name/Interviews/Project%20Name%20.docx')",
                    "type": "SP.File"
                },
                "Author": {
                    "__deferred": {
                        "uri": "/Author"
                    }
                },
                "CheckedOutByUser": {
                    "__deferred": {
                        "uri": "/CheckedOutByUser"
                    }
                },
                "EffectiveInformationRightsManagementSettings": {
                    "__deferred": {
                        "uri": "/EffectiveInformationRightsManagementSettings"
                    }
                },
                "InformationRightsManagementSettings": {
                    "__deferred": {
                        "uri": "/InformationRightsManagementSettings"
                    }
                },
                "ListItemAllFields": {
                    "__deferred": {
                        "uri": "/ListItemAllFields"
                    }
                },
                "LockedByUser": {
                    "__deferred": {
                        "uri": "/LockedByUser"
                    }
                },
                "ModifiedBy": {
                    "__deferred": {
                        "uri": "/ModifiedBy"
                    }
                },
                "Properties": {
                    "__deferred": {
                        "uri": "/Properties"
                    }
                },
                "VersionEvents": {
                    "__deferred": {
                        "uri": "/VersionEvents"
                    }
                },
                "Versions": {
                    "__deferred": {
                        "uri": "/Versions"
                    }
                },
                "CheckInComment": "",
                "CheckOutType": 2,
                "ContentTag": "{32-alphaNumericstring},2,5",
                "CustomizedPageStatus": 0,
                "ETag": "\"{32-alphaNumericstring},2\"",
                "Exists": true,
                "IrmEnabled": false,
                "Length": "163026",
                "Level": 1,
                "LinkingUri": "sanitized",
                "LinkingUrl": "",
                "MajorVersion": 1,
                "MinorVersion": 0,
                "Name": "Project Name.docx",
                "ServerRelativeUrl": "/sites/subsite/Shared Documents/2021/Project Name/Interviews/Project Name.docx",
                "TimeCreated": "timestamp",
                "TimeLastModified": "timestamp",
                "Title": "",
                "UIVersion": 512,
                "UIVersionLabel": "1.0",
                "UniqueId": "32-alphaNumericstring"
            }]
        }
    }
}
I sent that through a Parse JSON step and then tried to pass the results into an Apply to each step wrapping the Copy file step. I'm getting an error because I can't seem to get any dynamic content out of the Parse JSON step and into the Copy file step:
From the sample response above, I've tried grabbing the id from the __metadata object and the UniqueId from the very bottom. When I test the flow, neither of those puts any content into the File to Copy field. In the raw output for the Copy file step, the "parameter/sourceFileId" is always null. Any ideas?
When I hover over the id dynamic content pill in the Copy file step, it shows items('Apply_to_each')?['__metadata']?['id']. First of all, is this the proper expression? Second, should I be using a different parameter from the response to my HTTP request to SharePoint?
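For completeness, my guess is that the UniqueId pill I mentioned above resolves to something like the expression below, though I'm not certain that's exactly what the designer generates:

items('Apply_to_each')?['UniqueId']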
When I try adding the Copy file step using a parameter straight from that second Parse JSON step, it automatically creates an Apply to each step and inserts body('Parse_JSON_2')?['body']?['d']?['results'] as the loop input. But that throws the following error:
"The execution of template action 'Apply_to_each' failed: the result of the evaluation of 'foreach' expression '@body('Parse_JSON_2')?['body']?['d']?['results']' is of type 'Null'. The result must be a valid array."
Which is strange, because when I check the Parse JSON 2 step I see this:
So then I tried array(body('Parse_JSON_2')?['body']?['d']?['results']) as the "Select an output from previous steps" value in the Apply to each step, but that's when I started getting the null values for the dynamic content I've been trying to add.
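In case the exact wiring matters, here's roughly how the Copy file step inside that loop is configured right now (the destination values are placeholders here; the real ones come from earlier steps in the flow):

Current Site Address:       https://mycompany.sharepoint.com/sites/subsite
File to Copy:               items('Apply_to_each')?['__metadata']?['id']
Destination Site Address:   <the client-facing site>
Destination Folder:         <the new folder created earlier in the flow>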