Hello!
I am wondering how to format my filter array step to get the following:
I have two arrays containing the files from two different libraries that are supposed to be "synced" (the same files in both libraries), where one library is considered the "source" and the other just holds copies.
Now I don't want to loop over every item just to check whether it's already synced or not, which is why I'm trying to do it in a Filter array step.
Basically, I want to filter out any files that have the same URL and whose modified date in the source is earlier than in the "mirrored" library (meaning the mirrored library already has the correct version).
Any help is very much appreciated. 😃
Maybe you can guide me in the right direction? @Expiscornovus
Solved! Go to Solution.
I've got an idea of how to do this within a single Filter array, but I'm just off to sleep now (12:30 AM for me at the moment), so I won't be able to get you something until later. If someone else can provide a solution before then, even better 🙂
Are they both Document Libraries that would have identical folder structures/files?
Assuming this would be a scheduled flow that ran daily/weekly?
Hi @StretchFredrik,
Normally I would say, have a look at the Except method described in this blog:
https://pnp.github.io/blog/post/comparing-two-arrays-without-an-apply-to-each/
However, you want to check two things (Url and Modified date time).
The only workaround I can think of at the moment is using the max modified date as the latest sync time. It's not a great workaround, because it doesn't compare against the modified date of the target item itself, just against the max modified date time of the whole collection of items.
But because it is a sync process, that might be ok? 😁
@and(contains(body('Select_-_Target_Paths'), item()['File']), greater(ticks(item()['Modified']), max(body('Select_-_Target_Modified'))))
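A rough Python analogue of the expression above, with made-up sample data (the file names and dates are illustrative only). Python datetime comparison plays the role of comparing ticks(), and the set membership plays the role of the contains() check on the target paths.

```python
from datetime import datetime

source = [
    {"File": "a.docx", "Modified": "2023-02-10T08:00:00Z"},
    {"File": "b.docx", "Modified": "2023-02-20T09:30:00Z"},
]
target = [
    {"File": "a.docx", "Modified": "2023-02-15T12:00:00Z"},
    {"File": "b.docx", "Modified": "2023-02-12T12:00:00Z"},
]

def parse(iso: str) -> datetime:
    return datetime.fromisoformat(iso.replace("Z", "+00:00"))

target_paths = {t["File"] for t in target}
# max modified across the whole target collection, standing in for the
# "latest sync time" in the workaround
last_sync = max(parse(t["Modified"]) for t in target)

# keep source items that exist in the target AND were modified after the
# most recent modification anywhere in the target library
out_of_sync = [
    s for s in source
    if s["File"] in target_paths and parse(s["Modified"]) > last_sync
]
```

As noted above, this only compares against the collection-wide maximum, so a file updated before another file was synced can slip through.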
Thank you for your reply @Expiscornovus. I would need to compare each item's modified date, since each sync might pick up different documents as they are approved or worked on at different times. A file might not be touched for a year while another document is worked on daily. The library in question has around 50,000 documents.
Thank you for your reply @grantjenkins
Are they both Document Libraries that would have identical folder structures/files?
Yes they have the same folder structure and files.
Assuming this would be a scheduled flow that ran daily/weekly?
It will run every 15 or 30 minutes, which is why I want it to only spend time on the documents that are out of sync.
Ok, thanks for clarifying. In that case what I provided isn't sufficient.
Let's wait for the response of @grantjenkins. I am sure he can come up with a great solution 😀
Have you considered using SharePoint (and maybe OneDrive) sync capabilities instead of a Power Automate flow?
Yes, that doesn't work, since the main goal of this is to have one library with ONLY major versions of files and read-only permissions @Chriddle.
Maybe this helps:
I created an array "source" with 500 objects (I hope that's a reasonable number of changed files between two flow runs) and an array "destination" with 50,000 objects.
Maybe the number of destination objects can be reduced by a clever OData filter.
For each object in "source", the Select action "combined" looks up the destination entry with the same name and gets its "created" value with the help of xpath.
You can then filter this output with a date comparison.
With this number of values, the flow runs in roughly 2 minutes.
Of course, this is only a POC; you would have to add times, check what happens when the objects are bigger (because of longer names), and probably more 😉
source (Select):
"inputs": {
"from": "@range(0,500)",
"select": {
"name": "@concat('file-',string(item()))",
"created": "@concat('2023-02-', rand(1, 28))"
}
}
destination (Select):
"inputs": {
"from": "@range(0, 50000)",
"select": {
"name": "@concat('file-', string(item()))",
"created": "@concat('2023-02-', rand(1, 28))"
}
}
destinationXML (Compose):
"inputs": "@xml(json(concat('{\"root\":{\"item\":', body('destination'),'}}')))"
combined (Select):
"inputs": {
"from": "@body('source')",
"select": {
"name": "@item()['name']",
"created": "@item()['created']",
"created_destination": "@first(xpath(outputs('destinationXML'), concat('//item[name=\"',item()['name'],'\"]/created/text()')))"
}
}
Filter array:
"inputs": {
"from": "@body('combined')",
"where": "@greater(item()['created'], item()['created_destination'])"
}
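The POC above can be sketched in Python to show the data flow. A dict keyed by file name plays the role of the destinationXML Compose plus the xpath lookup; the names and dates are mock data, mirroring the range()-generated arrays in the flow.

```python
# mock "source" (500 changed files) and "destination" (50,000 files);
# dates are zero-padded so plain string comparison orders them correctly
source = [{"name": f"file-{i}", "created": f"2023-02-{(i % 28) + 1:02d}"}
          for i in range(500)]
destination = [{"name": f"file-{i}", "created": "2023-02-14"}
               for i in range(50000)]

# "destinationXML" equivalent: index the destination by name for O(1) lookups
dest_by_name = {d["name"]: d["created"] for d in destination}

# "combined" Select: attach the destination's created date to each source item
combined = [
    {"name": s["name"],
     "created": s["created"],
     "created_destination": dest_by_name.get(s["name"], "")}
    for s in source
]

# Filter array: keep items whose source date is newer than the destination's
changed = [c for c in combined if c["created"] > c["created_destination"]]
```

One caveat with the mock data in the flow itself: concat('2023-02-', rand(1, 28)) produces unpadded days like 2023-02-5, which don't compare correctly as strings. That's fine for a timing POC, but real data would need padded dates or a numeric form.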
I think I might have something that will work.
One concern is the number of files in each library (50,000). We can't just apply a Filter Query within our Get files actions, so we would need to return all files from both libraries (100,000+ files) and then apply the filtering. The filtering itself will be quick - it's the initial retrieval of files that takes time.
The other concern is that if a lot of files are out of sync, copying them across could take quite a while, depending on the number and size of the files. One thing I haven't done here is copy the actual metadata (properties) across. That's easy to add, but I'm not sure if your requirement is just the file sync, or the properties too.
If you go with this approach, you may need to run the flow manually a few times to see how long it takes to complete, then schedule the flow accordingly.
See full flow below. I'll go into each of the actions.
Get files Library A and Get files Library B are both using Get files (properties only) actions. They both have the filter FSObjType eq 0 which means only get files (not folders). I've also set the Top Count to 5000 for each of them.
I've also gone into the Settings for Get files Library A and Get files Library B, turned on Pagination, and set the Threshold to 60000 (needs to be a number larger than the number of files you will have over the next couple of years at least). This will take a while to retrieve all your files.
Select extracts out a couple of properties from Get files Library B that we will convert to XML so we can apply XPath within the filter later. The expressions used are:
//FullPath - removes the library name from the full path
join(skip(split(item()?['{FullPath}'], '/'), 1), '/')
//Modified - replaces characters so we are left with a number (required for XPath comparison)
replace(replace(replace(replace(item()?['Modified'], '-', ''), 'T', ''), ':', ''), 'Z', '')
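The replace chain above just strips the separators from an ISO 8601 timestamp so two dates can be compared as plain digit strings in XPath. A small Python sketch of the same normalization:

```python
def normalize(modified: str) -> str:
    # strip '-', 'T', ':' and 'Z' from an ISO 8601 timestamp, leaving a
    # digit string whose lexicographic order matches chronological order
    for ch in "-T:Z":
        modified = modified.replace(ch, "")
    return modified

normalize("2023-02-15T09:30:00Z")  # "20230215093000"
```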
Filter array uses the output from our Select and the following expression to keep the items that have been updated in Library A since being copied to Library B (i.e. in need of updating). It uses an XPath expression to compare both the FullPath (including the filename) and the Modified date; if the number of items returned is greater than 0, the item needs updating.
@greater(
length(
xpath(
xml(json(concat('{"root": { value:', body('Select'), '}}'))),
concat('//root/value[FullPath = "', join(skip(split(item()?['{FullPath}'], '/'), 1), '/'), '" and Modified < "', replace(replace(replace(replace(item()?['Modified'], '-', ''), 'T', ''), ':', ''), 'Z', ''), '"]')
)
),
0
)
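What the XPath filter is doing, sketched in Python with hypothetical items (the paths and dates are made up). A dict keyed by the relative path stands in for the XPath predicate over the XML-converted Select output.

```python
def strip_library(full_path: str) -> str:
    # mirrors join(skip(split(..., '/'), 1), '/'): drop the library segment
    return "/".join(full_path.split("/")[1:])

# mock Library B items, already normalized as the Select action produces them
b_items = [{"FullPath": "Docs/report.docx", "Modified": "20230210120000"}]
b_modified = {i["FullPath"]: i["Modified"] for i in b_items}

# mock Library A items as returned by Get files (properties only)
a_items = [
    {"{FullPath}": "LibraryA/Docs/report.docx", "Modified": "20230215093000"},
    {"{FullPath}": "LibraryA/Docs/old.docx", "Modified": "20230101000000"},
]

needs_update = []
for a in a_items:
    rel = strip_library(a["{FullPath}"])
    b_mod = b_modified.get(rel)
    # keep the item only if B has the same path with an older Modified value
    if b_mod is not None and b_mod < a["Modified"]:
        needs_update.append(a)
```

Note that this logic only matches files that already exist in Library B; a brand-new file in Library A produces no XPath match and is not kept.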
Apply to each iterates over each of the items in our Filter array (files that need to be updated).
Copy file uses the following expressions for File to Copy and Destination folder.
//File to Copy
item()?['{Identifier}']
//Destination Folder - NOTE that you would need to put your Library names here
slice(replace(item()?['{Path}'], 'LibraryA', 'LibraryB'), 0, lastIndexOf(replace(item()?['{Path}'], 'LibraryA', 'LibraryB'), '/'))
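The Destination Folder expression can be illustrated in Python: swap the library name, then trim the file name off the end. 'LibraryA'/'LibraryB' are placeholders for your real library names.

```python
path = "LibraryA/Docs/2023/report.docx"
# replace(..., 'LibraryA', 'LibraryB')
dest = path.replace("LibraryA", "LibraryB")
# slice(..., 0, lastIndexOf(..., '/')) keeps everything before the file name
dest_folder = dest[:dest.rfind("/")]
# dest_folder == "LibraryB/Docs/2023"
```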
----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
I think I've got it close to working; I'll test a bit more tomorrow. But I would need the Filter array to also include any items that don't already exist in the destination. @grantjenkins
Thank you for your help so far 😃
@StretchFredrik Yeah, I was thinking about that and have already started a bit of a redesign, but I'm off to sleep now, so I won't be able to get anything to you until a bit later. I know how to achieve it; I just need to rebuild a part of it.
Also, what if you delete an item from the source - would you also need to delete that from the destination?
Sounds awesome, take your time and sleep well. I've already handled the delete part by comparing URLs with a Filter array step. It loops over the items present in the destination but not in the source and tries to find each file by document ID, to see if it has been moved in the source but not published yet. If it can't find it by document ID, the file gets deleted.
Thank you for your time @grantjenkins
@StretchFredrik Hopefully this works as expected. It should copy over updated files and newly added files. I combined what @Chriddle did with my solution, as it's a lot nicer and makes it easier to handle newly added files.
See full flow below. I'll go into each of the actions.
Get files Library A and Get files Library B are the same as the original solution.
Select B extracts just the Modified date and the Full Path (excluding the Library Name) from Get files Library B. The expression used is:
join(skip(split(item()?['{FullPath}'], '/'), 1), '/')
Select A extracts out the Identifier, Modified date and Path from Get files Library A, plus the matching Modified date from Library B. The expression used to get the Modified date from Library B is:
xpath(
xml(json(concat('{"root": { value:', body('Select_B'), '}}'))),
concat('string(//root/value[FullPath="', join(skip(split(item()?['{FullPath}'], '/'), 1), '/'), '"]/Modified/text())')
)
Filter array uses the output from Select A with the following filter.
//ModifiedB is empty (new file added) or ModifiedA is greater than ModifiedB (file updated in Library A)
@or(
equals(item()?['ModifiedB'], ''),
greater(item()?['ModifiedA'], item()?['ModifiedB'])
)
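The combined filter above, sketched in Python with hypothetical items: a file is copied when it is brand new in Library B (ModifiedB empty) or newer in Library A. Python's short-circuiting or matches the intended evaluation order, so the greater-than comparison is never attempted on an empty ModifiedB.

```python
def should_copy(item: dict) -> bool:
    # empty ModifiedB means the file doesn't exist in Library B yet
    return item["ModifiedB"] == "" or item["ModifiedA"] > item["ModifiedB"]

items = [
    {"ModifiedA": "20230215093000", "ModifiedB": ""},               # new file
    {"ModifiedA": "20230215093000", "ModifiedB": "20230210120000"}, # updated
    {"ModifiedA": "20230210120000", "ModifiedB": "20230215093000"}, # in sync
]
to_copy = [i for i in items if should_copy(i)]
```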
Apply to each iterates over each of the items in our Filter array.
Copy file uses the following expressions to copy the new/updated files from Library A to Library B.
//File to Copy
item()?['Identifier']
//Destination Folder - NOTE that you would need to put your Library names here
slice(replace(item()?['Path'], 'LibraryA', 'LibraryB'), 0, lastIndexOf(replace(item()?['Path'], 'LibraryA', 'LibraryB'), '/'))
----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
Thank you @grantjenkins. The Filter array step fails with "Greater expects all of its parameters to be either integer or decimal numbers. Found invalid parameter type 'Null'". So I'm guessing the filter query fails when ModifiedB is null (empty).
Did you put the filters in the same order that I had? If ModifiedB is empty, then it wouldn't try to evaluate the second filter.
This is what i have, should be the same:
Never mind, I'm stupid, I forgot to change the input of the Filter array @grantjenkins. Will try again! 😃