Frequent Visitor

Transferring a large number of files

Hi Guys,

 

I have the honor of building a flow to transfer a large amount of data. So far, so good: I created a flow that transfers the files based on their first 4 digits. But there are a large number of files (more than 5,000), so the flow can't read them all.

 

I have an idea of how to do this, but I don't know if it's possible.

I want to copy a folder, for example 05 06 2020, to an empty library.

Then store the folder name in a string to reuse, and copy the folder to a new location based on conditions.

 

I can't figure out a way to extract the folder name so I can reuse it when creating a new folder.

 

The process that I need is this:

 

Copy/move 1 folder from location A to location B.

Get the folder name into a string (Compose?).

Create a new folder in a new location, based on the folder name.

 

If I can get the folder name into a string somehow, then this would be a lot easier, I assume.

 

Many thanks in advance.

 

2020-06-25 13_39_12-.png

 

19 REPLIES
Helper III

Re: Transferring a large number of files

More info!

It looks like you're using SharePoint Online, but are you? Which actions are you using? If you can, paste an image of the significant part of your flow.

It is very hard to understand what you're actually trying to do.

 

What I think you need is something like the List Files action to loop through a folder. I am not sure, but there might be a flag in the response that indicates whether a returned record is a "file" or a "folder". When you find a folder you then use the "Copy Folder" or "Move Folder" action to relocate Source to Target. When this has looped through all the responses, your folders/files will have been moved or copied!

Along the way, each response from the "List Files" action will include the folder name or folder path object, from which you can get the info you require.
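
Very roughly, something like this (just a sketch; I'm assuming the SharePoint connector's "List folder", "Copy folder" and "Move folder" actions, and the "IsFolder" flag is from memory, so check the names in your own flow):

Recurrence
  List folder                  (the source library/folder)
  Apply to Each                (the value returned by "List folder")
      Condition: IsFolder is equal to true
          If yes: Copy folder (or Move folder) from Source to Target
  [End of Apply to Each]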

 

 

 

Frequent Visitor

Re: Transferring a large number of files

These are the steps I need:

 

Recurrence -> copy folder from A to B (wait until the copy is finished). I might need to do this manually, because the library has more than 5,000 items and cannot be listed.

 

Initialize the file name (the filename is the date).

Apply some filters (this part works).

 

I created something, but it gives me an error (I am not sure if this is the right way).

 

 

  2020-06-25 19_32_42-Settings.png

Helper III

Re: Transferring a large number of files

OK. Using Schedule/Recurrence is a good way of overcoming the 5,000-item limit. As you are "moving" the folders, the number will reduce on each run until there are none left to copy. What you need to ensure is that (a) your schedule is timed so that the previous run has completed, otherwise you'll get "folder not found" errors as the two executing instances compete; and (b) your overall flow does not time out before it has finished its batch of 5,000 (or however many) folders. By default, actions and the flow itself have limits on how long (or how many loop iterations) they can execute before they are stopped. This is obviously to prevent runaway and resource-hogging processes.

 

Now, as to the error you've got in your image... the "Initialize variable" action cannot be inside a loop, as you cannot repeatedly initialize a variable. You do this once, then you "set" (or append, increment, etc.) it. So move the "initialize" to after the Recurrence action. You can leave the "initial value" field blank because you're going to set it inside the Apply to each loop.

Once you've moved the "initialize" action, add a "Set variable" action in its place inside the Apply to each, using the same variable name and the same value expression. This will then set/reset the variable accordingly on each loop.
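
So the structure ends up looking roughly like this (a sketch only; "datum" is just a placeholder for whatever you called your variable):

Recurrence
  Initialize variable          (Name: datum, Type: String, Value: left blank)
  Apply to Each
      Set variable             (Name: datum, Value: your expression)
      ... further actions that use variables('datum')
  [End of Apply to Each]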

 

And go from there...

 

Frequent Visitor

Re: Transferring a large number of files

I didn't know the initialize-and-leave-blank trick; it kinda worked. Now I need to split this path:

 

Test/18 06 2020/Naheffingen/

I need to split it and just use the second part.

I know you can use something like last(split(variables(...))) or first(split(...)), but I can't figure out how to write it down.

 

The full path is the only thing I can use, and the folder path keeps getting longer, so I need to strip away the other parts and keep just the date.

 

Frequent Visitor

Re: Transferring a large number of files

I played around with variables and this is the output (the date changes, and the part after the / changes as well). I just need the red-marked part, which has a unique value each time.

 

2020-06-26 09_30_18-Run History _ Power Automate.png

Helper III

Re: Transferring a large number of files

You need to use the split function, as you already suggested, and some array handling to get the correct element out of the split result:

 

split(<your-variable>,'/')   ==>  split('test2/18 06 2020/Naheffingen','/')  ==> returns an array as follows:

 

[ 'test2', '18 06 2020', 'Naheffingen' ]   ==>  <result-array>[1]  ==> '18 06 2020' 
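
So, putting it together in one expression (substitute your own variable name for 'datum-archief', which I'm only using as a placeholder here):

split(variables('datum-archief'), '/')[1]      ==>  '18 06 2020'

(Array indexes start at 0, so [1] picks out the second element.)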

 

Does that make sense?

Frequent Visitor

Re: Transferring a large number of files

I can't manage to split the full path somehow.

 

2020-06-26 10_46_39-Run History _ Power Automate.png

Helper III

Re: Transferring a large number of files

In the expression builder you need to "pick" dynamic values from the "Dynamic content" list; you cannot just type them (unless you use the correct syntax).

So you can't just type "datum-archief".

When you pick it, it will be added to the expression using the appropriate syntax, like:

 

variables('datum-archief')     giving        fx  split(variables('datum-archief'),'/')

Do you follow this explanation?

Frequent Visitor

Re: Transferring a large number of files

It worked like a charm!

 

Thanks for the help!!

Frequent Visitor

Re: Transferring a large number of files

So I got my flow working, but I stumbled on 2 problems:

 

1. It takes ages before the apply to each finishes, and although I have the right license I can't have more than 5,000 items in the apply to each (according to Microsoft's documentation it should be 100,000 with my license).

2. I tried playing around with variables etc., but that doesn't do the trick either.

 

https://imgur.com/a/P1mvKYO

https://imgur.com/a/UL0Yg7f

 

I want to have a condition on a field (name, title); in the output I can see this field, but it won't work and I don't know why.

The expression is:

Helper III

Re: Transferring a large number of files

Hi "Tobias" (?)

 

First of all, "no one" chooses to use Flow for its speed! It's a fantastic tool, but it is almost never faster than "real programming", so you just have to accept that looping through record sets takes as long as it takes.

 

Regarding the 5,000-record limit... this isn't something that has affected me (yet), but I am aware of other questions on this forum about it, and I *think* you need to enable the "Pagination" option, which will be in an action's settings (click the "..." at the top right of the action's title bar). I do suggest you search for "Pagination" or "record limit" on the forum, as I am sure someone else will have answered this.

 

Regarding your error in the attached images, I think your issue is that you're using "Get items" without a corresponding "Get item" action. "Get Items" (in the SharePoint actions) returns a [set] of all the data records that correspond to the request. But many Flow actions cannot act on or process a "set" or list; they need an individual record from the [set]. So you then use "Get item" (no "s") repeatedly to loop through the record set.

 

Get Items

  Apply to Each

     Get Item

     Do stuff

  [End of Apply to Each]

 

So: move the Get Items action outside the Apply to Each block, and add a "Get item" action inside the block, using an "ID" from the "Get Items" dataset, which identifies each individual record on each loop. Then the further actions will be acting on a single record object rather than a set of records, and you shouldn't get that error.
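
For example, the Id field of the inner "Get item" would typically be fed from the loop's current item with an expression something like the one below (the loop name 'Apply_to_each' and the ID column are just what mine happen to be called, so adjust them to your own):

items('Apply_to_each')?['ID']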

 

OK?

Frequent Visitor

Re: Transferring a large number of files

That's a lot of jibber-jabber for me; I am not that advanced in coding.

2020-07-01 15_06_06-Photos.png
2020-07-01 15_05_38-.png

I need to use Get files (I renamed it, and Get files doesn't have a pagination option) and some variables to loop through.

I have a second Get items (files) outside of the apply to each. I want to use a filter on it to match the first 4 characters of the file name, which start with 0202, but that doesn't work either.

 

So I need to fix either the condition and the query, or just the query, to get this going, as I am stuck now.

I didn't know it was this hard...

I contacted Microsoft support, but they gave me the usual blah-blah answer...

Helper III

Re: Transferring a large number of files

I am sorry, but it is not easy to see what you're trying to achieve because I cannot see some of the values in the conditions etc.

 

However, I still think you have an issue with "get items" and "get item"... 

I can see you have used "get items" twice, which will return you a list of records (filenames in your case I think), but you are not using "Get item" to cycle through the list and process each list entry one-by-one.

 

I hope this is not an obvious question, but you do realise that "Get item" and "Get items" are two different actions, and I think you need to use both together!

 

ThePusscat_1-1593796190371.png

 

Frequent Visitor

Re: Transferring a large number of files

Hello,

 

I am not using Get items or Get item. I am using Get files (it's not a list, it's a library).

I just renamed it to Get items for convenience.

 

Adm-tob_1-1593867761807.png

 

 

 

 This is the condition.

Adm-tob_0-1593867693679.png

 

 

 

Helper III

Re: Transferring a large number of files

OK, you'll have to excuse me for getting confused here...

(I am having trouble understanding why you would think it "convenient" to rename an action to the name of a different action, but let that slide...)

 

I have got a bit lost as to what your current issue is... you showed the "condition" action, and I think you mentioned you want to select and process files that start with "0202", in which case your condition should be:

 

substring(  <your name field> , 0, 4)   is equal to   '0202'
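
(If you build that in the Condition's left-hand box via the expression editor, inside an Apply to Each it would look something like the line below; 'Name' is just a placeholder for whatever your file-name field is actually called:)

substring(item()?['Name'], 0, 4)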

 

 

But I still believe you need a different structure here. 

I assume you are using "Get Files Properties (SharePoint)"... but it's not clear, as you have renamed it.

 

Here is a screen print of how I would do just this bit...

"Get Files Properties" is going to return a "LIST" of all the files in the Library you indicated (according to any filters)

As you can see from the comment box, an "Apply to Each" is normally used.... which is what I have been saying. 

You cannot process the returned list of files altogether. You need to process each list item, one by one.

 

You cannot apply the condition you have written to the list.... you need that condition to be *inside* the "Apply to Each" so that it can be evaluated for each List Item as the Apply to Each loop executes.

 

However, you might be able to apply the condition to the "Filter Query":

 

ID eq [{x} VarID]   and   substring(<your-name-field>,0,4) eq '0202'                       (NB: I am *pretty sure* strings in Flow start at 0)

 

In which case the LIST of files returned by "Get File Properties" would now only include the files starting with '0202', so you wouldn't need the condition later on at all.
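
(One caveat: I'm not certain the SharePoint "Filter Query" accepts substring(); if it complains, an OData startswith() filter on the file-name column might do the job instead. 'FileLeafRef' is only my guess at the internal column name, so check what yours is called:)

startswith(FileLeafRef, '0202')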

 

Does that help at all?

 

 

 

Helper III

Re: Transferring a large number of files

**bleep** - sorry!  I forgot the image!

ThePusscat_0-1593894763477.png

 

 

Frequent Visitor

Re: Transferring a large number of files

Hello, that part worked!

 

Now I need to parse the output, but that gives me errors (I copied the output and generated the JSON schema from a sample, but it's not working well).



The above part works now! But the Parse JSON part doesn't work:

 

Adm-tob_0-1594204443745.png

Adm-tob_0-1594204541117.png

 

 

Frequent Visitor

Re: Transferring a large number of files

Never mind, I managed to fix this issue.

Helper III

Re: Transferring a large number of files

Well done... Sorry - I've been busy!
