Unknown123
Advocate II

Improve Performance of a Desktop Flow

Hello Everyone.

 

I had previously created a cloud flow. Now I am making the same flow in Power Automate Desktop.

 

In the cloud flow I used Compose actions, and for the Apply to each loop I used concurrency.

 

So, is concurrency available in PAD?

 

If not, can you suggest some techniques to improve the performance of my flow?

 

Or could I try this: run the same flow on different machines, so that I can process more files at the same time?

 

And is there any logging feature in PAD? In cloud flows I used the result() function to get the inputs and outputs of a particular action and then store them in a database. Is there a way to do that in PAD, and also to log errors?

 

Thanks.

 

@VJR 

@Henrik_M 

@MichaelAnnis 

@Ankesh_49 

@ryule 

@Pstork1 

@ScottShearer 

@Expiscornovus 

 

17 REPLIES
Pstork1
Most Valuable Professional

Why are you trying to recreate a cloud flow as a desktop flow? In general, desktop flows are used when there isn't an available connector or you need to go through the user interface. A desktop flow is always going to be a bit slower than the same thing in a cloud flow, and because it goes through the UI it is also a bit more fragile. I've recreated cloud flows for clients that were originally done as desktop flows, but I've never done it the other way around.

 

1) I am not aware of a way to make loops in Power Automate Desktop run concurrently. Since they interact with the UI, this would be very difficult to accomplish.

2) Running the flow on multiple machines is possible, but that requires licensing for unattended RPA.

3) For handling errors, each desktop action has an advanced section where you can specify what to do in case of an error. I haven't found anything like results(), but you can definitely record a text log on the machine when errors occur. This learning module will walk you through the error handling capabilities in PAD: Configure exception and error handling in Power Automate for desktop - Training | Microsoft Learn



-------------------------------------------------------------------------
If I have answered your question, please mark your post as Solved.
If you like my response, please give it a Thumbs Up.
Unknown123
Advocate II

@Pstork1 

 

I am creating the desktop flow because the files I want to process can't be uploaded to the cloud. They live on a legacy server and an on-premises database.

 

We were originally planning to upload the files to Blob Storage, but that approach was rejected, so I am building a desktop flow instead.

Pstork1
Most Valuable Professional

You can still access files on an on-premises server using an on-premises gateway. But that requires a Premium license, so I understand the rationale for doing it on the desktop instead. The reality, though, is that the desktop is going to be inherently slower and a bit more fragile, because you are going through a UI instead of using an API.



-------------------------------------------------------------------------
If I have answered your question, please mark your post as Solved.
If you like my response, please give it a Thumbs Up.

I feel your pain, @Unknown123, with compliance/security stepping into some of these scenarios 😉

 

@Pstork1 What you have suggested is essentially... copying the file to the cloud, and this can be an issue for legal/compliance/PII/other reasons, especially for us in the EU.

And an on-premises gateway is, well... a direct gateway to the entire data plane, so it is very often restricted.

 

Error handling

@Unknown123 Once you get familiar with PAD's error handling options, you can really craft whatever you require.

I usually use an 'on block error' in Main that catches all errors that reach the top level, and then trigger a subflow dedicated to gracefully exiting the flow, logging, etc.

Then I use 'on error' blocks in specific subflows if I need to be sure they continue to operate when an issue happens.

Very often I enclose all the actions of a loop iteration in error handling, so that I can update the item status, log, and move on to the next iteration.

I also sometimes use error handling on individual actions: set a variable and move to a label or a subflow (a sketch follows below).
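For the per-action flavor, here is a minimal sketch in PAD's copy-paste (Robin) syntax - the ON ERROR retry pattern is lifted verbatim from my example further down this thread, and the 'Write text to file' action with %LogFile% is just a placeholder standing in for whatever action you need to protect:

File.WriteText File: LogFile TextToWrite: $'''Placeholder for a fragile action''' AppendNewLine: True IfFileExists: File.IfFileExists.Append Encoding: File.FileEncoding.Unicode
ON ERROR REPEAT 1 TIMES WAIT 5
END

In the designer you configure the same thing from the action's on-error settings (retry, set a variable, run a subflow, or move to a label).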

 

Logging

I typically log additional info to a txt file.

In my Initialization subflow I set a variable such as %LogFile%, then use 'Write text to file' on %LogFile% whenever I want to store some diagnostics. I've seen people log to Excel instead, but I prefer the simplicity of a text file in this case.
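In copy-paste (Robin) form that initialization boils down to something like this - the path is a made-up placeholder, and the full, timestamped version is in my example further down the thread:

SET LogFile TO $'''C:\\Logs\\MyFlow.log'''
File.WriteText File: LogFile TextToWrite: $'''Flow started''' AppendNewLine: True IfFileExists: File.IfFileExists.Append Encoding: File.FileEncoding.Unicode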

 

When it comes to concurrency - no luck here; PAD handles a single job at a time 🙂

 

Performance-wise:

- I run SQL against Excel files whenever possible, instead of opening the app and automating its UI (see the sketch after this list).

- I almost never use Wait with a fixed time - I use 'wait for X' (UI element, text, whatever) so the flow pauses only for the minimal required time.

- I try to offload work from PAD wherever possible. For example, I have a couple of flows where I get data from a web app; instead of manipulating the data in PAD loops, I paste it into an Excel template where a set of formulas does the job in seconds, and then, if needed, I use SQL to get the data back. It might sound counterproductive, but it actually improved the performance of those flows.
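For the first point, roughly what 'SQL against Excel' looks like. I am writing the 'Execute SQL statement' action from memory here, so treat the parameter serialization and the ACE OLEDB connection string as assumptions to verify - copy the action out of your own designer to see the exact format:

SET ExcelFile TO $'''C:\\Data\\Input.xlsx'''
Database.ExecuteSqlStatement.ConnectAndExecute ConnectionString: $'''Provider=Microsoft.ACE.OLEDB.12.0;Data Source=%ExcelFile%;Extended Properties="Excel 12.0 Xml;HDR=YES"''' Statement: $'''SELECT * FROM [Sheet1$]''' Timeout: 30 Result=> QueryResult

The whole sheet lands in a data table in one action, instead of launching Excel and driving its UI.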

Pstork1
Most Valuable Professional

I am not suggesting copying the file to the cloud. I pointed out that a cloud flow can work directly with files that are stored on-premises. But my point was that a desktop flow is never going to run as fast or as efficiently as a cloud flow. That is simply the reality of how it works.

 

Also, your comments about an on-premises gateway would suggest you don't really understand what on-premises gateways are or how they work.



-------------------------------------------------------------------------
If I have answered your question, please mark your post as Solved.
If you like my response, please give it a Thumbs Up.

Hi @Pstork1 

 

I am not suggesting copying the file to the cloud. I pointed out that a cloud flow can work directly with files that are stored on-premises.

 

 

My understanding is that when you use a gateway to do operations on the local file system, you are pulling data from the local system into the cloud; e.g., if you read data from a file, that data lands in the cloud, even if only for as long as the flow executes (assuming you are not storing it anywhere, even in logs).

That is prohibited in certain scenarios, as it is basically data processing in the public cloud (data is collected on-premises, transferred to a cloud server, and processed in the cloud).

Processing data on US-based servers is a no-go for many scenarios, due to the law that allows the US government to obtain the data - so anyone working in an Environment hosted outside the EU/EEA is stuck and very often needs to look for on-prem solutions.

Welcome to the EU 🙂

 

Unknown123
Advocate II

@momlo @Pstork1 

Thanks for the reply.

Yes, for security reasons we cannot upload any data to the cloud.

 

@momlo 

Can you explain the log file part where you just store diagnostics? What is that? Could you also share a screenshot if possible?

 

For the logging part, I was thinking that wherever I have an important action in my flow, I would make an API call right after it. In the body of that call I would send the variable going into the action (its input) and the variable coming out of it (its output). That's how I was thinking of logging data. For errors, I would run a subflow that stores the inputs and logs them, ends that iteration, and moves on to the next one.

 

Yea, I saw some videos regarding error handling and understood the concept.

 

One more question for everyone. With machines, can I do this: say I have 50 files on the first machine, 50 files on the second, and so on, and the flow runs on all machines at the same time so that all the files get processed. That might improve performance, if I am understanding the concept of machines right.

 

And are there parent and child flows in desktop flows? I was thinking of reusing the same subflows in many other flows, since we can just copy all the actions and paste them into another flow.

 

And I read an article today in which the author mentioned that all flow runs get stored in Dataverse, in the flow sessions table. But when I opened that table, it was last modified a month ago.

 

I have no idea about flows getting stored in Dataverse. Can you share articles or videos about that?

 

Thanks. 

Unknown123
Advocate II

@Pstork1 

Actually, the cloud flow was just API calls: based on the values coming back from one API, I was calling another API. There is no UI involved in the desktop flow; that's why it was pretty easy to convert the cloud flow to a desktop flow.

 

For the logging part, I have to log the input and output of all the important actions in my flow, not just the errors. The errors I will log separately. Both are done with separate API calls.

Pstork1
Most Valuable Professional

Whether you use a cloud flow and a gateway or not, using Power Automate will involve running in the cloud.  Power Automate Desktop is still a web based application.  It does not guarantee that all processing takes place on the desktop.  So there really is no difference in that case between using PAD and using regular Power Automate.  If PAD worked completely on the desktop you would be able to run it while you were offline.  If you unplug your computer from the network I think you will find that PAD doesn't run.



-------------------------------------------------------------------------
If I have answered your question, please mark your post as Solved.
If you like my response, please give it a Thumbs Up.

 

 

Can you explain the log file part where you just store diagnostics? What is that? Could you also share a screenshot if possible?

 

 

 

Simple example:

 

 

DateTime.GetCurrentDateTime.Local DateTimeFormat: DateTime.DateTimeFormat.DateAndTime CurrentDateTime=> CurrentDateTime
Text.ConvertDateTimeToText.FromDateTime DateTime: CurrentDateTime StandardFormat: Text.WellKnownDateTimeFormat.SortableDateTime Result=> FormattedDateTime
Text.Replace Text: FormattedDateTime TextToFind: $''':''' IsRegEx: False IgnoreCase: False ReplaceWith: $'''.''' ActivateEscapeSequences: False Result=> FormattedDateTime
SET LogFile TO $'''C:\\Users\\SECRETUSER\\Desktop\\%FormattedDateTime%.log'''
File.WriteText File: LogFile TextToWrite: $'''Flow Start: %CurrentDateTime%
-----''' AppendNewLine: True IfFileExists: File.IfFileExists.Append Encoding: File.FileEncoding.Unicode
File.WriteText File: LogFile TextToWrite: $'''Starting browser''' AppendNewLine: True IfFileExists: File.IfFileExists.Append Encoding: File.FileEncoding.Unicode
WebAutomation.LaunchEdge.LaunchEdge Url: $'''mediamarkt.pl/rtv-i-telewizory/telewizory/wszystkie-telewizory?limit=50&page=1''' WindowState: WebAutomation.BrowserWindowState.Normal ClearCache: False ClearCookies: False WaitForPageToLoadTimeout: 60 Timeout: 60 BrowserInstance=> Browser
ON ERROR REPEAT 1 TIMES WAIT 5
END
File.WriteText File: LogFile TextToWrite: $'''Browser started''' AppendNewLine: True IfFileExists: File.IfFileExists.Append Encoding: File.FileEncoding.Unicode
File.WriteText File: LogFile TextToWrite: $'''Extracting data''' AppendNewLine: True IfFileExists: File.IfFileExists.Append Encoding: File.FileEncoding.Unicode
WebAutomation.ExtractData.ExtractList BrowserInstance: Browser Control: $'''html > body > div:eq(0) > div:eq(1) > div:eq(3) > div > div > div:eq(5) > div:eq(1) > div:eq(2) > div:eq(0) > div''' ExtractionParameters: {[$'''div:eq(0) > div:eq(0) > div:eq(0) > div:eq(0) > a > h2''', $'''Own Text''', $''''''] } PostProcessData: False TimeoutInSeconds: 60 ExtractedData=> DataFromWebPage
File.WriteText File: LogFile TextToWrite: $'''Starting loop''' AppendNewLine: True IfFileExists: File.IfFileExists.Append Encoding: File.FileEncoding.Unicode
SET Counter TO 0
LOOP FOREACH CurrentItem IN DataFromWebPage
Variables.IncreaseVariable Value: Counter IncrementValue: 1
File.WriteText File: LogFile TextToWrite: $'''Item no %Counter% out of %DataFromWebPage.RowsCount%: %CurrentItem%''' AppendNewLine: True IfFileExists: File.IfFileExists.Append Encoding: File.FileEncoding.Unicode
END

 

 

 

 

For the logging part, I was thinking that wherever I have an important action in my flow, I would make an API call right after it. In the body of that call I would send the variable going into the action (its input) and the variable coming out of it (its output). That's how I was thinking of logging data. For errors, I would run a subflow that stores the inputs and logs them, ends that iteration, and moves on to the next one.

 

 

 

Sounds a bit complicated, but it really depends on what you want to achieve. I store those logs to have some verbose info in case a flow fails, but you can call an API, store to a DB, a file, Excel, etc.

 

 

 

One more question for everyone. With machines, can I do this: say I have 50 files on the first machine, 50 files on the second, and so on, and the flow runs on all machines at the same time so that all the files get processed. That might improve performance, if I am understanding the concept of machines right.

 

 

 

You can call another flow from within one flow, but it executes on the same machine - it is treated like any other action. So you would need to have those files on a network share, for example, and then trigger X flows on X machines to handle them; you just need to build a mechanism so they don't interfere with each other (such as one bot handling one file, or locking a file so the others don't grab it, etc.).

 

 

 

And are there parent and child flows in desktop flows? I was thinking of reusing the same subflows in many other flows, since we can just copy all the actions and paste them into another flow.

 

 

Yes - here is how I use this:

1. I split individual desktop flows into subflows and treat each subflow like a function in programming - it does the specific task it was created for.

2. Then, if I have a subflow that I know I will be using in other flows, I save it as a separate desktop flow, configure input parameters, etc.

3. I call this Desktop flow from any other flow.

 

An example of this is launching a web browser or desktop app and logging in with credentials.

Whenever the logon process changes, you only need to update that one desktop flow, and all the other desktop flows that call it will keep working as expected. Again, I treat some of my flows as functions that are called by other desktop flows.

One issue with these is that they appear in the Power Automate portal as if they were any other flows/bots, so they somewhat clutter the logs; but if you name them like "Shared_Web_App_X_Logon" you can see at a glance that it is a shared building block, not a full bot.

 

 

 

And I read an article today in which the author mentioned that all flow runs get stored in Dataverse, in the flow sessions table. But when I opened that table, it was last modified a month ago.

 

 

Runs that you trigger from the Power Automate portal or from the console are stored in Dataverse.

Runs that you start in the designer are not stored in Dataverse.

Having said that, remember that all variables (including your secret data, if you read it from a file/page etc. into a variable) are stored in the logs, and those logs land in the cloud. To prevent that, mark the variable as a sensitive variable - this replaces the actual value before the log is sent to the cloud.

For the sake of making it a habit, I mark all of my variables as sensitive 😄

 

https://learn.microsoft.com/en-us/power-automate/desktop-flows/manage-variables#sensitive-variables

 

 

 

I have no idea about flows getting stored in Dataverse. Can you share articles or videos about that?

 

 

The main issues I had to resolve when it comes to logs in the cloud:

 

1. Use sensitive variables so secret data does not go to the cloud

2. When capturing an app/web UI, hide secret data behind the recorder window - a screenshot of the entire window lands in the cloud, so if sensitive data is visible, that is an issue

3. Power Automate flow logs are stored for 28 days; Power Automate Desktop logs are stored for as long as the flow exists in the Environment - for this reason I have a cloud flow that runs each day and removes logs older than 14 days

 

Edit:

 

Forgot to add:

When your desktop flow fails, it captures a screenshot of the entire desktop and sends it to the cloud.

Again, if you had an app or web page open with sensitive information, that lands in the cloud too.

I manage this in my Main error handling:

If something goes wrong and I cannot recover with the error handling in subflows or loops etc., the Main error handling calls a subflow named "CleanFail" 😄 where I make sure to close all the apps and web pages - anything that could display sensitive information - and then I open an empty Notepad in full screen covering the entire desktop, in case anything is still open that could be showing data.

 

 

To sum this up - you can build flows that limit the leakage of sensitive data from your on-prem environment to a minimum, if not eliminate it completely.

Key points:
- Sensitive variables

- Secure Inputs/Outputs in PA and PAD

- No sensitive data displayed in the UI/web, or hide it behind the recorder window, so it doesn't get captured in screenshots

- Error handling that manages a failing flow, so it does not end in an unmanaged failure

 

I hope this helps

Hey @Pstork1 

You can limit the data transfer to a bare minimum that does not include sensitive information - I shared some of my techniques above/below, depending on where my reply to the OP lands 🙂

 

The statement below is rather unfortunate and might lead people to a misunderstanding.

 

 So there really is no difference in that case between using PAD and using regular Power Automate

 

 

Pstork1
Most Valuable Professional

Sorry, but I disagree.  Both Power Automate and Power Automate Desktop are based on Web application technologies.  Yes, there are ways to minimize cloud connections.  But to give people the impression that PAD is a better solution because it runs locally is what will lead people to a misunderstanding of the technology.  

 

The point is that this question was about how to improve performance on PAD because they didn't want to upload the files to the cloud.  

1) There is no chance that they will create a PAD flow that runs more efficiently than a cloud flow.

2) There are ways to do it with a cloud flow that do NOT require the user to upload the file to a cloud location.

 

I stand by my statements. If you want to ensure that none of your content ever touches the cloud you can do that, but neither PAD nor Power Automate is the appropriate technology to accomplish that goal.



-------------------------------------------------------------------------
If I have answered your question, please mark your post as Solved.
If you like my response, please give it a Thumbs Up.

Hi @Pstork1 

Apologies if you felt attacked or uncomfortable with my comment; that was not my intention, for sure.

I fully agree with everything you wrote except that PA equals PAD, and that's the beauty of the world 🙂 

 

I'm on the same page here; I never stated otherwise:

A desktop flow will never be as fast/efficient as a cloud flow; in fact, I keep telling people: try whatever is possible not to use PAD or any other RPA - look for a better way, and use RPA as a last resort 🙂

 

Data:

I believe we are looking at the same task from two different angles, as we both lack insight into what the OP is doing with the files he mentioned - so both our views might or might not be correct!

 

From the OP's description, I understand that he does not only want to move/copy files but also to get the files' content (and even store parts of it in a local DB for logging purposes) - hence, to achieve that with the File System connector, he would need to read the file content. That content would then be transferred to the PA servers for processing, which is a no-go when the content is protected/sensitive, etc.

 

From what you wrote, I guess (sorry for my arrogance if this is not correct) you assume the OP does not want to read the content, but only to move/copy files etc. on the local file system without touching the files' content - so no data is transferred into the cloud.

 

So we can both be right or wrong, depending on what the OP wants to achieve 🙂

 

 

 

Pstork1
Most Valuable Professional

Two quick points

1) I understand that the OP wants to read the files and add them to a DB. His comment about why he chose PAD was based on an earlier design where users were going to upload the files to the cloud for processing in PA. That is why he says he went to PAD. My point was that he can access the files on the local file system without needing to go to PAD.

 

2) Your assumption is that reading the file contents using PAD keeps the contents on the local desktop.  I believe you are wrong and that whether it is PA or PAD the actual processing is taking place in the cloud. If that were not true then PAD wouldn't be based on web technology and there would be higher local OS requirements to cover that processing.  Whether you are using PAD or PA the actual content is being processed in the cloud.  So I see no particular security difference between the two.

 

But my original intent was to suggest to the OP that he is basing his tool choice on a misunderstanding of the capabilities of PA and PAD.  And also to suggest that his hope to make the PAD flow as efficient as the Cloud flow was not going to be successful.



-------------------------------------------------------------------------
If I have answered your question, please mark your post as Solved.
If you like my response, please give it a Thumbs Up.

Hi @Pstork1 

 

 

 

 

Your assumption is that reading the file contents using PAD keeps the contents on the local desktop.  I believe you are wrong and that whether it is PA or PAD the actual processing is taking place in the cloud.

 

 

 

 

The above was true before "sensitive variables" were introduced.

Now we can mark any variable - input, output, or just a regular variable - as sensitive, and it will not be stored in the logs, hence will not be transferred to the cloud.

 

You can take a look here for more on sensitive variables:

https://learn.microsoft.com/en-us/power-platform-release-plan/2021wave2/power-automate/sensitive-var...

 

https://learn.microsoft.com/en-us/power-automate/desktop-flows/manage-variables#sensitive-variables

 


 

 

Would you be OK with sharing your solution for reading and manipulating the files' content with PA alone, without transferring data to the cloud? I only see the scenario below, and it requires transferring data to the cloud (File System / Get file content):

 

1. Obtain the list of files for processing (I would use File System / Get files in a folder - no data is transferred to the cloud, just metadata, so we are fine unless the metadata contains sensitive info)

2. Read the file content (File System / Get file content - here we are transferring data from on-prem to the cloud, which is the pain point the OP wants to avoid. We can use "secure inputs/outputs" in PA to prevent the data from being shown in the flow logs, but the data is still transferred to and processed in the cloud at this point)

3. Do some business logic with the file content we have (any action in PA - but this is again data processing in the cloud, which we want to avoid)

4. Update some logs in the local DB, or store the outcome of our logic from (3) in the local DB - so we transfer data from the cloud back on-prem.

I clearly have some gaps here and would love to learn how to achieve this.

 

 

 

 

My point was that he can access the files in the local File System without needing to go to PAD

 

 

 

 

 

 

Pstork1
Most Valuable Professional

Let's just agree to disagree.

 

I agree with you that your solution will keep the content from being logged in the cloud. But I'm not talking about logging; I'm talking about processing. If you have an action in PAD that parses or manipulates the data, that work takes place using cloud computing resources, not local processing resources. But that transfer of content is not what the OP is talking about. His original design, which was rejected, had people uploading the files to the cloud instead of storing them locally. That can be avoided. What can't be avoided in either PAD or PA is having processing take place using cloud resources. Whether it is logged or not, the content will be processed using cloud resources. As I've already pointed out, PAD does not work in an offline mode. It must have access to cloud resources or it doesn't work.



-------------------------------------------------------------------------
If I have answered your question, please mark your post as Solved.
If you like my response, please give it a Thumbs Up.
momlo
Super User

Hi,

I have received confirmation from MS support that my understanding of how PAD processes data (locally) is correct. See below.

 

Question:

Information on where desktop flows process data during runtime/execution

For example:
If we read the file into a secure variable, we manipulate the variable in any way, and we produce other secure variables - is all processing happening on the local machine?
If all variables are set as sensitive, and we do not use output variables - is any data transferred to the cloud for processing?

 

Answer:

Hi XXX.



I hope this email finds you well today and thank you for contacting Microsoft Support.

I'm Harry from the Power Automate EMEA team.



I understand you've got a query regarding the privacy measures around variables in Power Automate Desktop which have been marked as sensitive and how the data is handled.



- Sensitive variables in Power Automate Desktop have their values masked during runtime, so the respective logs do not show any such sensitive value in the run history
- The actual values are only used during execution in the target apps, as specified in the desktop flow actions
- However, please note that this is true so long as these variables are also populated during runtime and do not contain hardcoded data
- Hardcoded data would be included in the cloud, specifically in the flow definition in the respective Dataverse table
- If no output variables are used, then indeed no values are sent back to the portal to be used in following connectors of the cloud flow


For reference, please see the linked documentation below:

- Sensitive variables usage - https://learn.microsoft.com/en-us/power-automate/desktop-flows/manage-variables#sensitive-variables
- Architecture of Power Automate Desktop - Power Automate for desktop architecture - Power Automate | Microsoft Learn
- How Microsoft handles required and optional data in Power Automate - Data Collection in Power Automate - Power Automate | Microsoft Learn
Please let me know if the information provided has helped clarify your query or if you have any additional questions on this topic.



I wish you a great rest of your day and take care.



Kind regards,

XXXXXXXXX

Support Engineer
Power Platform

 
