MichelH
Advocate V

SQL stored procedure on-premises data gateway timeout

I found that I can run a SQL stored procedure on the local SQL server via Flow, which is very handy when the procedures are short-lived. However, when the procedure takes more than 2 minutes to run, the Flow seems to do a couple of extra things that I do not understand.

 

When I manually run the SP that I want to launch via Flow, it takes 5 to 7 minutes: it basically pulls data from a number of sources (linked servers) via a local view and combines them into a local table. This always works as expected; there is no issue with the SP itself.

 

When I run the SP via Flow, the first part works: it drops the existing table. Then things become unclear. After 2 minutes the Flow signals a timeout, and from then on it seems to hammer the SQL server; my SQL Studio has a hard time getting any response. The flow and the server then stay in limbo for about 30 minutes, seemingly doing nothing but jamming each other.

 

The flow is running with my personal credentials, the same as I use in the SQL Studio. 

 

Is there anything I can do to prevent this timeout and retry after 2 minutes?

Is there anything I can do to prevent the explicit cancellation?

Any suggestions to make this work as expected?

 

I did look at the timeout settings and adjusted them, but I think the 'Note' tells me this will not help.

Timeout:  Limit the maximum duration an asynchronous pattern may take.
Note: this does not alter the request timeout of a single request.

 

GatewayTimeout.jpg

 

The error message: 

{
  "error": {
    "code": 504,
    "source": "flow-apim-europe-001-northeurope-01.azure-apim.net",
    "clientRequestId": "bba12345-a123-b456-c789-cf64d495e8d1",
    "message": "BadGateway",
    "innerError": {
      "status": 504,
      "message": "The operation failed due to an explicit cancellation. Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass7_0`1.<<GetNextResponse>b__0>d.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass11_0.<<ExecuteBlockOperation>b__0>d.MoveNext()\r\n inner exception: The operation failed due to an explicit cancellation. Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass7_0`1.<<GetNextResponse>b__0>d.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass11_0.<<ExecuteBlockOperation>b__0>d.MoveNext()\r\n inner exception: A task was canceled.\r\nclientRequestId: bba12345-a123-b456-c789-cf64d495e8d1",
      "source": "sql-ne.azconn-ne.p.azurewebsites.net"
    }
  }
}

 

The stored procedure:

DROP TABLE IF EXISTS [LocalDB].[dbo].[MH_test_dev]

SELECT * INTO [LocalDB].[dbo].[MH_test_dev] FROM [LocalDB].[dbo].[MH_OpenSO_dev] 

  

Thanks for any feedback,

 

Michel

19 REPLIES

AlanPs1
Resident Rockstar

Hi @MichelH

Please see here

 

User is reporting:

  "error": {
    "message": "BadGateway",
    "innerError": {
      "status": 504,
      "message": "The operation failed due to an explicit cancellation

Then goes on to say: "We experienced this issue 8 times this morning. Opening a ticket with Microsoft to resolve. Seems like a Gateway code issue, or perhaps it has an issue connecting to SQL during some timeframes (Backups? DB Maint scripts?) Will post solution when Microsoft engages on the issue. We are collection the basic support info this morning on the case."

 

Looks like they are engaging Microsoft. Is this of any help to you?

Maybe @SmartMeter can be asked for further assistance or information?

 

If you, the reader, have found this post helpful, please help the community by clicking thumbs up.

If this post has solved your problem, please click "Accept as Solution".

Any other questions, just ask.

Thanks, Alan

Thanks for your response @AlanPs1,

 

I did see that post but the issue occurred over a year ago and multiple updates to the gateway software have become available since.

Also I can say that our gateway does work for short-lived jobs.

As long as it doesn't hit the timeout, all is fine.

 

Nevertheless it can't hurt to request feedback from @SmartMeter regarding his issue, so I just did.

 

Cheers,

Michel

I'm more and more convinced that the source of my problem is the hard-coded timeout (after 2 minutes) of a single request.

There are timeout settings I can use in Flow, but they don't apply to a single request.

 

timeout2min.jpg

 

Does anyone have an idea how to work around this?

 

Could I download the flow package file and add something to the JSON to set this timeout to a longer period?

 

 

Thanks,

 

Michel

Anonymous
Not applicable

Any recommendations on this? I am experiencing the same issue consistently.

None other than just making sure the SP responds in less than 2 minutes.

 

For all the rest, I made a Python script of 10 lines that connects to the SQL server and runs the procedures that take longer.

Cameron
Employee

As others have mentioned, there is a 120-second timeout for the SQL Server connector. If you cannot decrease the time taken by the stored procedure, then I suggest using a queueing mechanism on the server. If you are willing to take on some complexity, others have found success using a "fire and forget" mechanism. It assumes that you have access to the SQL server and have permission to create tables, stored procedures, and server-side triggers. The general approach is:

1) Create a stored procedure which performs the desired query (Master). Make sure this procedure also accepts an identity value from the status table.

2) Create a second stored procedure (MasterStart) which can accept the same parameters as the first (sans identity). This procedure should add these parameters to a state table (let's call it RunState) with an additional "status" column which defaults to "pending". This table must have an IDENTITY column AND a ROWVERSION column.

3) Create a server-side SQL trigger which executes when new rows are inserted into the RunState table. When it discovers a new row with status "pending", it should execute the stored procedure created in step 1, using the parameters in the inserted row plus the identity of that row, and alter the status to "running". The stored procedure from step 1 must update the state table (using the provided identity) with meaningful values, most importantly setting the status to "complete" when it has finished.

4) Your Flow should then call the stored procedure created in step 2 (MasterStart). This will return as soon as the state table entry is created in the RunState table.

5) Your second flow step should be a trigger which observes the RunState table for updates. This step should filter out runs and only execute for rows whose status == "complete". If desired, you can include a "results" column in the table to communicate output values. It's also good to subsequently update the status to "closed", meaning you have successfully processed the completed run. I have also seen implementations that remove the state table row on completion after logging to a second table (à la RunHistory).

The advantage of this approach is that the stored procedure can take as long as necessary, and it will not impact Flows or the connector. A sketch of the supporting objects follows below.
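To make the moving parts concrete, here is a minimal T-SQL sketch of the objects described above. The names Master, MasterStart, and RunState come from the post; the column names and the trigger body are assumptions. Note that a plain DML trigger runs synchronously inside the INSERT, so in practice the trigger should hand the work to a background job rather than EXEC the long procedure inline (as later replies in this thread discuss).

-- State table: one row per requested run (column names assumed)
CREATE TABLE dbo.RunState (
    id      INT IDENTITY(1,1) PRIMARY KEY,   -- identity passed to Master
    params  NVARCHAR(MAX) NULL,              -- serialized input parameters
    status  VARCHAR(16) NOT NULL DEFAULT 'pending',
    rv      ROWVERSION,                      -- lets a Flow trigger see updates
    results NVARCHAR(MAX) NULL
);
GO

-- Step 2: the short-lived proc the Flow calls; it returns immediately
CREATE PROCEDURE dbo.MasterStart @params NVARCHAR(MAX)
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.RunState (params) VALUES (@params);
END
GO

-- Step 3: trigger that marks pending rows as running and kicks off Master.
-- CAUTION: calling dbo.Master directly here would still block the INSERT;
-- the asynchronous hand-off (e.g. an Agent job) belongs where the comment is.
CREATE TRIGGER dbo.trg_RunState_Insert ON dbo.RunState
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE rs SET status = 'running'
    FROM dbo.RunState AS rs
    JOIN inserted AS i ON i.id = rs.id
    WHERE rs.status = 'pending';
    -- ... start dbo.Master @id, @params asynchronously here ...
END
GO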

Thanks Cameron,

 

I had an SSIS package on the SQL server allowing me to run timed, and I think also triggered, stored procedures.

I no longer have this package on the new server, and was told to launch stored procedures using flow.

 

Can SQL Server 'by itself' (without additional packages) have a 'server-side SQL trigger'?

I will check out if this is possible, or if there are any additional constraints.

 

That said, I tried a kind of fire-and-forget method, but Flow, after the timeout, actively engages to try and stop the stored procedure (because it thinks it has failed, I assume). If it would just let the stored procedure do its thing, it would work for me too.

Michel, SQL Server side triggers allow you to kick off a process that is not tied to a Flow request. That's really the only way to make sure the process isn't terminated after a timeout. That's the crux of "fire and forget": it uses SQL server-side processing to execute the stored procedure. Anyone else who triggers the stored procedure would have to wait for it to complete, or terminate after some timeout. SQL Server doesn't have an async interface.

If you do pursue an F-and-F approach: SQL Azure supports server-side triggers; Data Warehouse does not. See this reference: https://docs.microsoft.com/en-us/sql/t-sql/statements/create-trigger-transact-sql

The workaround I described above allows you to do something beyond the norm, namely run extra-long stored procedures that cannot otherwise finish within the 120-second limit. To do this, I've described a method which requires some complexity and nuance, and isn't for the faint of heart. If you are limited to using flows, then you probably want to simply reduce the amount of data you are processing at once, and limit the amount of time your stored procedure takes to complete.

Thanks,
-Cameron
Anonymous
Not applicable

I say this is very amateur on Microsoft's behalf. Great thinking outside the box, Cameron. Now there has to be a way to do more than fire and forget? Perhaps a timer to poll a table which keeps state, so that you know when your trigger-started stored procedure finishes.

Yes, you will notice that in my description of the fire-and-forget approach, you can actually set your flow to watch for changes in the results table. You can then set a Flow trigger to fire when the result shows up (and therefore the stored procedure is complete). The most important part of the workaround is the ability to disconnect the flow that triggers the stored proc. Once complete, you can use all of the usual mechanisms you are used to (such as flow triggers).

 

Enjoy,

Cameron 

Hi Cameron,

 

Could you please show how you create a server-side DML trigger? The fire and forget approach sounds like a winner, but I can't figure out how to create the server-side trigger. 

 

Thanks in advance, 

Best,

Javier

 

 

Anonymous
Not applicable

Sorry. It didn't work for me.

Hey there Javier - check out this link,

It has a lot of detail and some good examples.

Hi Cameron,

Thanks for the link. I have no issues creating triggers. When I followed your approach, you mentioned that the "key" was to create a "server-side" trigger, and I can't find how to create a "server-side DML" trigger. In other words, how do you create a DML trigger that fires at the server level instead of the table level? Only DDL and logon triggers are saved either in the "Stored Procedures" section or in the server section.

When I create a DML trigger (table level) to run a stored procedure that takes more than 2 minutes, I don't get the results back until the SP is finished. I think I'm missing something, since I see others were able to follow your steps.

Anyhow, I found a workaround where the DML trigger creates a SQL Agent job that calls the SP. The job is deleted once the SP has executed. It simulates an async operation that is tracked by the status field in the table you mentioned in your approach. A sketch of the idea is below.
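For reference, here is a minimal sketch of that trigger-created, self-deleting Agent job using the standard msdb procedures. The job and procedure names are illustrative, and it assumes the trigger's owner has permission to create and start jobs in msdb:

-- Inside the DML trigger: create a one-shot Agent job that runs the SP
DECLARE @jobname SYSNAME = N'RunLongSP_' + CONVERT(NVARCHAR(36), NEWID());

EXEC msdb.dbo.sp_add_job
    @job_name = @jobname,
    @delete_level = 3;            -- 3 = delete the job once it finishes

EXEC msdb.dbo.sp_add_jobstep
    @job_name = @jobname,
    @step_name = N'Run long stored proc',
    @subsystem = N'TSQL',
    @database_name = N'LocalDB',                   -- placeholder database
    @command = N'EXEC dbo.MyLongRunningProc;';     -- hypothetical proc name

EXEC msdb.dbo.sp_add_jobserver @job_name = @jobname;  -- target the local server

EXEC msdb.dbo.sp_start_job @job_name = @jobname;      -- queues and returns at once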

Thanks again for your reply.

Best,

Javier

 

Hi Javier,

 

Yes, the idea here is to have a server-side trigger which executes the stored procedure using parameters inserted into the state table. When the stored proc finishes, it can update the same table with results and update the status. It can also insert result set(s) into a second table, which you can trigger on from Logic Apps. I think you've got the gist of it.

 

Thanks,

Cameron

 

Anonymous
Not applicable

I retract my statement. 

1. Using a state table to start a stored proc. Check.

2. Updating the state table with "Complete" when finished. Check.

3. Using a flow step to monitor the SQL "Modified" trigger. Trouble ahead!

 

 

Anonymous
Not applicable

4) Your Flow should then call the stored procedure created in step 2 above (MasterStart). This will return as soon as the state table entry is created in the RunState table.

 

Some advice here, please. When I do this, the stored procedure waits for the entry to be created. The entry with status = "Pending" then waits for the Master stored procedure to finish (20 mins), and thus I am still stuck with the same timeout issue.

The implementation here is admittedly tricky. I have an updated recommendation to use the Elastic Job Agent which, while requiring a new Azure feature, is much less error-prone and much more straightforward. It still requires the state table, since jobs cannot accept input or output parameters. This document will be published soon (it is in review), but I provide samples that can easily be built on for any stored procedure. You can also use the agent in on-premises SQL Server or Managed Instance to accomplish the same thing. I will post a link here when the document is published.

Long-running Stored Procedures for Power Platform SQL Connector

 

The SQL Server connector in Power Platform exposes a wide range of backend features that can be accessed easily with the Logic Apps interface, allowing ease of business automation with SQL database tables.  However, the user is still limited to a 2-minute window of execution.  Some stored procedures may take longer than this to fully process and complete. In fact, some long-running processes are coded into stored procedures explicitly for this purpose. Calling them from Logic Apps is problematic because of the 120-second timeout. While the SQL connector itself does not natively support an asynchronous mode, it can be simulated using passthrough native query, a state table, and server-side jobs.

For example, suppose you have a long-running stored procedure like so:

 

CREATE PROCEDURE [dbo].[WaitForIt]
    @delay char(8) = '00:03:00'
AS 
BEGIN  
SET NOCOUNT ON;
    WAITFOR DELAY @delay
END
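Called directly, this procedure simply blocks for the requested delay, for example:

EXEC [dbo].[WaitForIt] @delay = '00:05:00'  -- returns after 5 minutes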

 

Executing this stored procedure from a Logic App will cause a timeout with an HTTP 504 result since it takes longer than 2 minutes to complete. Instead of calling the stored procedure directly, you can use a job agent to execute it asynchronously in the background. We can store inputs and results in a state table that you can target with a Logic App trigger. You can simplify this if you don’t need inputs or outputs, or are already writing results to a table inside the stored proc.

Keep in mind that the asynchronous processing by the agent may retry your stored procedure multiple times in case of failure or timeout. It is therefore critically important that your stored proc be idempotent. You will need to check for the existence of objects before creating them and avoid duplicating output.
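As a sketch of what idempotent means here, using the table names from the original question, a guard like the following makes a create-and-fill step safe to re-run if the agent retries it:

-- Safe to run repeatedly: drop and recreate rather than failing on re-entry
IF OBJECT_ID(N'[dbo].[MH_test_dev]', N'U') IS NOT NULL
    DROP TABLE [dbo].[MH_test_dev];

SELECT * INTO [dbo].[MH_test_dev]
FROM [dbo].[MH_OpenSO_dev];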

 

For SQL Azure

An Elastic Job Agent can be used to create a job which executes the procedure. Full documentation for the Elastic Job Agent can be found here: https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-jobs-overview

You’ll want to create a job agent in the Azure Portal. This will add several stored procedures to a database that will be used by the agent.  This will be known as the “agent database”. You can then create a job which executes your stored procedure in the target database and captures the output when it is completed. You’ll need to configure permissions, groups, and targets as explained in the document above. Some of the supporting tables and procedures will also need to live in the agent database.
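As a sketch of that configuration (run in the agent database; the server and database names are placeholders, and the full steps are in the linked documentation), the target group referenced by the job steps below is created like so:

-- Run in the agent database: define which database(s) the job targets
EXEC jobs.sp_add_target_group @target_group_name = N'DatabaseGroupLongRunning';

EXEC jobs.sp_add_target_group_member
    @target_group_name = N'DatabaseGroupLongRunning',
    @target_type = N'SqlDatabase',
    @server_name = N'yourserver.database.windows.net',  -- placeholder
    @database_name = N'TargetDB';                       -- placeholder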

First, we will create a state table to register parameters meant to invoke the stored procedure. Unfortunately, SQL Agent Jobs do not accept input parameters, so to work around this limitation we will store the inputs in a state table in the target database. Remember that all agent job steps will execute against the target database, but job stored procedures run on the agent database.

 

CREATE TABLE [dbo].[LongRunningState](
       [jobid] [uniqueidentifier] NOT NULL,
       [rowversion] [timestamp] NULL,
       [parameters] [nvarchar](max) NULL,
       [start] [datetimeoffset](7) NULL,
       [complete] [datetimeoffset](7) NULL,
       [code] [int] NULL,
       [result] [nvarchar](max) NULL,
 CONSTRAINT [PK_LongRunningState] PRIMARY KEY CLUSTERED
(      [jobid] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

 

The resulting table will look like this in SSMS:

Cameron_0-1602871250891.png

We will use the job execution id as the primary key, both to ensure good performance, and to make it possible for the agent job to locate the associated record. Note that you can add individual columns for input parameters if you prefer. The schema above can handle multiple parameters more generally if this is desired, but it is limited to the size of NVARCHAR(MAX).

We must create the top-level job on the agent database to run the long-running stored procedure.

 

EXEC jobs.sp_add_job
    @job_name = 'LongRunningJob',
    @description = 'Execute Long-Running Stored Proc',
    @enabled = 1

 

We will also need to add steps to the job that will parameterize, execute, and complete the stored procedure. Job steps have a default timeout value of 12 hours. If your stored procedure will take longer, or you'd like it to time out earlier, you can set the step_timeout_seconds parameter to your preferred value in seconds. Steps also have (by default) 10 retries with a built-in backoff timeout in between. We will use this to our advantage.

 

We will use three steps:

This first step waits for the parameters to be added to the LongRunningState table, which should occur almost immediately after the job has been started. The step merely fails if the jobid hasn't yet been inserted into the LongRunningState table, and the default retry/backoff does the waiting for us. In practice, this step typically runs once and succeeds.

 

EXEC jobs.sp_add_jobstep
       @job_name = 'LongRunningJob',
       @step_name = 'Parameterize WaitForIt',
       @command = N'
              IF NOT EXISTS(SELECT [jobid] FROM [dbo].[LongRunningState]
                           WHERE [jobid] = $(job_execution_id))
                     THROW 50400, ''Failed to locate call parameters (Step1)'', 1',
       @credential_name = 'JobRun',
       @target_group_name = 'DatabaseGroupLongRunning'

 

The second step queries the parameters from the state table and passes them to the stored procedure, executing the procedure in the background. Here we use @callparams to pass the timespan parameter, but this can be extended to pass additional parameters if needed. If your stored procedure does not need parameters, you can simply call it directly.

 

EXEC jobs.sp_add_jobstep
       @job_name = 'LongRunningJob',
       @step_name = 'Execute WaitForIt',
       @command = N'
              DECLARE @timespan char(8)
              DECLARE @callparams NVARCHAR(MAX)
              SELECT @callparams = [parameters] FROM [dbo].[LongRunningState]
                     WHERE [jobid] = $(job_execution_id)
              SET @timespan = @callparams
              EXECUTE [dbo].[WaitForIt] @delay = @timespan',
       @credential_name = 'JobRun',
       @target_group_name = 'DatabaseGroupLongRunning'

 

The third step completes the job and records the results:

 

EXEC jobs.sp_add_jobstep
       @job_name = 'LongRunningJob',
       @step_name = 'Complete WaitForIt',
       @step_timeout_seconds = 43200,
       @command = N'
              UPDATE [dbo].[LongRunningState]
                 SET [complete] = GETUTCDATE(),
                     [code] = 200,
                     [result] = ''Success''
               WHERE [jobid] = $(job_execution_id)',
       @credential_name = 'JobRun',
       @target_group_name = 'DatabaseGroupLongRunning'

 

We will use a passthrough native query to start the job, then immediately push the parameters into the state table for the job to reference. We will use the dynamic data output ‘Results JobExecutionId’ as the input to the ‘jobid’ attribute in the target table. We must add the appropriate parameters for the job to unpackage them and pass them to the target stored procedure.

Here's the native query:

 

DECLARE @jid UNIQUEIDENTIFIER
DECLARE @result INT
EXECUTE @result = jobs.sp_start_job 'LongRunningJob', @jid OUTPUT
IF @result = 0
    SELECT 202 [Code], 'Accepted' [Result], @jid [JobExecutionId]
ELSE
    SELECT 400 [Code], 'Failed' [Result], @result [SQL Result]

 

Here's the Logic App snippet:

Cameron_1-1602871250902.png
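The screenshot is not reproduced here; conceptually, that insert step performs the equivalent of the following, with the JobExecutionId dynamic output from the native query above mapped into [jobid] (the parameters value shown is just an example):

INSERT INTO [dbo].[LongRunningState] ([jobid], [parameters])
VALUES (@JobExecutionId, N'00:05:00')  -- @JobExecutionId = output of sp_start_job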

 

When the job completes, it updates the LongRunningState table so that you can easily trigger on the result. If you don’t need output, or if you already have a trigger watching an output table, you can skip this part.

Cameron_2-1602871250915.png
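That screenshot is also not reproduced; the gist is that the watching flow (the table's rowversion column supports a modified-row trigger) only acts on rows that have finished, i.e. a selection equivalent to:

SELECT [jobid], [code], [result]
FROM [dbo].[LongRunningState]
WHERE [complete] IS NOT NULL AND [code] = 200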

 

 

For SQL Server on-premises or Azure SQL Managed Instance:

SQL Server Agent can be used in a similar fashion. Some of the management details differ, but the fundamental steps are the same.

https://docs.microsoft.com/en-us/sql/ssms/agent/configure-sql-server-agent
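As a rough sketch of the equivalent native query against SQL Server Agent (assuming a job named 'LongRunningJob' has already been created in msdb, much like the trigger-created job shown earlier in this thread): sp_start_job queues the job and returns immediately, so the flow never waits on the stored procedure itself.

-- Native query from the flow: queue the Agent job and return at once
DECLARE @result INT
EXEC @result = msdb.dbo.sp_start_job @job_name = N'LongRunningJob'
IF @result = 0
    SELECT 202 [Code], 'Accepted' [Result]
ELSE
    SELECT 400 [Code], 'Failed' [Result]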

 
