I had a flow running that copied newly created files from an on-premises file system to a SharePoint list, but it stopped running a couple of weeks ago and now returns an error:
http request failed: the timeout was reached.
If I try to recreate the flow, it appears that I can now only create a flow with file metadata and cannot actually copy the file contents, as "Body" is no longer listed in the dynamic content.
Any ideas how to create a flow that copies new files from the file system to SharePoint document libraries?
Which trigger are you using to start this flow? I'm trying to replicate your scenario in my environment to test some possibilities; could you share it with me?
The trigger that used to work in the past is "When a file is added", and the file name contained "DeliveryTicket".
However, when trying to re-create the flow, the only file system triggers available appear to be metadata-only and don't allow copying the file body.
I have discovered there is an option to create a flow from a template; however, it's not possible to include any conditions, as there is no dynamic content available. There is also no option to select the folder, but I assume this can be overcome by using a connector pointed directly at the folder. THIS TEMPLATE APPEARS TO HAVE DISAPPEARED NOW.
It seems that this is no longer possible. What if you use the OneDrive for Business trigger? You can configure your local folder to sync with a SharePoint library and use this trigger to send files to another list or library.
Hi @douglasromao, thanks for the update; this helped me understand why it was not working when we tried to configure it.
Your OneDrive sync solution may work in a case where we need to push files to a document library during an attended session, where the user is logged in. But in an unattended session, OneDrive may prompt for credentials when the session/cookie expires.
Is there any way we can have some kind of unattended scenario, where any new file added to the file system is pushed automatically? (We don't want to create custom scripts, to reduce the maintenance overhead.)
Appreciate any lead in this case.
Thanks in advance.
I want to apologize for the confusion here. We made an update to the File System connector so that its trigger returns only the metadata of files. This change was made because some files are too big and the trigger was failing (the On-premises Data Gateway has a relatively low limit on file size).
However, you CAN still accomplish your scenario, it just takes one extra step. Now you can do:
- When a file is created or updated
- Get file content (or, Get file content by path)
- Whatever actions you wanted...
With this pattern you can add an error-handling step to cover the case where the file is too big, so you can choose what to do (e.g. send yourself a notification, or ignore the file). That was impossible with the previous pattern.
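To make the suggested pattern concrete, here is a minimal local sketch of the same logic in Python. It is not Power Automate code: the folder path, the `MAX_FILE_SIZE` limit, and the return values are all assumptions used for illustration. It mirrors the three steps above: a name filter (the "DeliveryTicket" condition from the original flow), a content-read step ("Get file content"), and an error-handling branch for files that exceed the size limit.

```python
from pathlib import Path

# Assumed limit for illustration; the real On-premises Data Gateway limit may differ.
MAX_FILE_SIZE = 10 * 1024 * 1024

def process_new_file(path: Path) -> str:
    """Mimic the flow: condition check, get file content, error-handling branch."""
    if "DeliveryTicket" not in path.name:        # condition from the original flow
        return "skipped"
    try:
        if path.stat().st_size > MAX_FILE_SIZE:  # this is where the old trigger failed
            raise ValueError("file too big for the gateway")
        content = path.read_bytes()              # the "Get file content" step
        # ...here you would upload `content` to the SharePoint library...
        return "copied"
    except ValueError:
        # error-handling step: e.g. send yourself a notification, or ignore the file
        return "notified"
```

The point of the two-step pattern is visible in the `try`/`except`: because fetching the content is now a separate action, a too-big file produces a handleable failure instead of silently breaking the trigger.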
Again, sorry for the issue, please let me know if this pattern doesn't work.
Stephen, is there a working resolution to this issue? If so, can you provide more details? I continue to get an error on the first action in my flow.
I solved my issue in this link: