SQL stored procedure on-premises data gateway timeout

I found that I can run a SQL stored procedure on the local SQL Server via Flow, and it is very handy when the procedures are short-lived. However, when the procedure takes longer than 2 minutes to run, the Flow seems to do a couple of extra things that I do not understand.

 

When I manually run the SP that I want to launch via Flow, it takes 5 to 7 minutes to complete: it basically pulls data from a number of sources (linked servers) via a local view and combines them into a local table. This always works as expected; there is no issue with the SP itself.

 

When I run the SP via Flow, it does do the first part, where it drops the existing table, and then things become unclear. After 2 minutes the Flow signals a timeout, and from then on it seems to hammer the SQL Server, since my SQL Server Management Studio has a hard time getting any response. The flow and the server then go into limbo for about 30 minutes and seemingly do nothing but jam each other.

 

The flow runs with my personal credentials, the same ones I use in SQL Server Management Studio.

 

Is there anything I can do to prevent this timeout and retry after 2 minutes?

Is there anything I can do to prevent the explicit cancellation?

Any suggestions to make this work as expected?

 

I did look at and adjust the timeout settings, but I think the 'Note' tells me this will not help.

Timeout:  Limit the maximum duration an asynchronous pattern may take.
Note: this does not alter the request timeout of a single request.

 

[Screenshot: GatewayTimeout.jpg]

 

The error message: 

{
"error": {
"code": 504,
"source": "flow-apim-europe-001-northeurope-01.azure-apim.net",
"clientRequestId": "bba12345-a123-b456-c789-cf64d495e8d1",
"message": "BadGateway",
"innerError": {
"status": 504,
"message": "The operation failed due to an explicit cancellation. Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass7_0`1.<<GetNextResponse>b__0>d.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass11_0.<<ExecuteBlockOperation>b__0>d.MoveNext()\r\n inner exception: The operation failed due to an explicit cancellation. Exception: System.Threading.Tasks.TaskCanceledException: A task was canceled.\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass7_0`1.<<GetNextResponse>b__0>d.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.PowerBI.DataMovement.Pipeline.Common.TDFHelpers.<>c__DisplayClass11_0.<<ExecuteBlockOperation>b__0>d.MoveNext()\r\n inner exception: A task was canceled.\r\nclientRequestId: bba12345-a123-b456-c789-cf64d495e8d1",
"source": "sql-ne.azconn-ne.p.azurewebsites.net"
}
}
}

 

The stored procedure:

DROP TABLE IF EXISTS [LocalDB].[dbo].[MH_test_dev]

SELECT * INTO [LocalDB].[dbo].[MH_test_dev] FROM [LocalDB].[dbo].[MH_OpenSO_dev] 

  

Thanks for any feedback,

 

Michel

19 Replies
Resident Rockstar

Hi @MichelH

Please see here

 

User is reporting:

  "error": {

 

    "message": "BadGateway",

    "innerError": {

      "status": 504,

      "message": "The operation failed due to an explicit cancellation

Then goes on to say: "We experienced this issue 8 times this morning. Opening a ticket with Microsoft to resolve. Seems like a Gateway code issue, or perhaps it has an issue connecting to SQL during some timeframes (Backups? DB Maint scripts?) Will post solution when Microsoft engages on the issue. We are collection the basic support info this morning on the case."

 

Looks like they are engaging Microsoft; is this of any help to you?

Maybe @SmartMeter can be asked for further assistance or information?

 

If you, the reader, have found this post helpful, please help the community by clicking thumbs up.

If this post has solved your problem, please click "Accept as Solution".

Any other questions, just ask.

Thanks, Alan



Thanks for your response @AlanPs1,

 

I did see that post but the issue occurred over a year ago and multiple updates to the gateway software have become available since.

Also I can say that our gateway does work for short-lived jobs.

As long as it doesn't hit the timeout, all is fine.

 

Nevertheless it can't hurt to request feedback from @SmartMeter regarding his issue, so I just did.

 

Cheers,

Michel


I'm more and more convinced the source of my problem is the hardcoded timeout (after 2 minutes) of a single request.

There are timeout settings I can use in flow but they don't apply to a single request.

 

[Screenshot: timeout2min.jpg]

 

Does anyone have an idea how to work around this?

 

Could I download the flow package file and add something to the JSON to set this timeout to a longer period?

 

 

Thanks,

 

Michel


Any recommendations on this? I am experiencing the same issue consistently.


None, other than making sure the SP responds in less than 2 minutes.

 

For everything else I made a Python script of about 10 lines that connects to the SQL Server and runs the procedures that take longer.

Microsoft

As others have mentioned, there is a 120-second timeout for the SQL Server connector. If you cannot decrease the time taken by the stored procedure, then I suggest using a queueing mechanism on the server. If you are willing to take on some complexity, others have found success using a "fire and forget" mechanism. It assumes that you have access to the SQL server and have permission to create tables, stored procedures, and server-side triggers. The general approach is:

1) Create a stored procedure which performs the desired query (Master). Make sure this procedure also accepts an identity value from the status table.

2) Create a second stored procedure (MasterStart) which accepts the same parameters as the first (sans identity). This procedure should add these parameters to a state table (let's call it RunState) with an additional "status" column which defaults to "pending". This table must have an IDENTITY column AND a ROWVERSION column.

3) Create a server-side SQL trigger which executes when new rows are inserted into the RunState table. When it discovers a new row with status "pending", it should execute the stored procedure created in step 1 using the parameters in the inserted row, in addition to the identity of that row, and alter the status to "running". The stored procedure in step 1 must update the state table using the provided identity to meaningful values, but most importantly to "complete" when it has finished.

4) Your Flow should then call the stored procedure created in step 2 above (MasterStart). This will return as soon as the state table entry is created in the RunState table.

5) Your second flow step should be a trigger which observes the RunState table for updates. This step should filter out runs and only execute for rows whose status == "complete". If desired, you can include a "results" column in the table to communicate output values. It's also good to subsequently update the status to "closed", meaning you have successfully processed the completed run. I have also seen implementations that remove the state table row on completion after logging to a second table (a la RunHistory).

The advantage of this approach is that the stored procedure can take as long as necessary, and it will not impact Flows or the connector.
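A minimal sketch of the state table and the MasterStart procedure from steps 1 and 2, assuming a single NVARCHAR parameter column (the column names and bodies here are illustrative, not a verified implementation):

CREATE TABLE [dbo].[RunState](
    [id]         INT IDENTITY(1,1) PRIMARY KEY,          -- identity handed to Master
    [rowversion] ROWVERSION,                              -- required ROWVERSION column
    [parameters] NVARCHAR(MAX) NULL,                      -- inputs for the Master proc
    [status]     VARCHAR(20) NOT NULL DEFAULT 'pending',  -- pending / running / complete / closed
    [results]    NVARCHAR(MAX) NULL                       -- optional output column
)
GO

CREATE PROCEDURE [dbo].[MasterStart]
    @parameters NVARCHAR(MAX)
AS
BEGIN
    SET NOCOUNT ON;
    -- Queue the run; the server-side trigger picks the row up and calls [Master]
    INSERT INTO [dbo].[RunState] ([parameters]) VALUES (@parameters);
END
GO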

Thanks Cameron,

 

I had an SSIS package on the SQL Server that allowed me to run stored procedures on a schedule and, I think, also on a trigger.

I no longer have this package on the new server, and was told to launch stored procedures using flow.

 

Can SQL Server 'by itself' (without additional packages) have a 'server-side SQL trigger'?

I will check out if this is possible, or if there are any additional constraints.

 

That said, I tried a kind of fire-and-forget method, but Flow, after the timeout, actively engages to try and stop the stored procedure (because it thinks it has failed, I assume). If it would just let the stored procedure do its thing, it would work for me too.


Michel, SQL Server side triggers allow you to kick off a process that is not tied to a Flow request. That's really the only way to make sure the process isn't terminated after a timeout. That's the crux of "fire and forget": it uses SQL server-side processing to execute the stored procedure. Anyone else who triggers the stored procedure would have to wait for it to complete, or terminate after some timeout. SQL Server doesn't have an async interface.

If you do pursue an F-and-F approach: SQL Azure supports server-side triggers; Data Warehouse does not. See this reference: https://docs.microsoft.com/en-us/sql/t-sql/statements/create-trigger-transact-sql

The workaround I described above allows you to do something beyond the norm, namely run extra-long stored procedures that cannot otherwise be limited to 120 seconds. To do this, I've described a method which requires some complexity and nuance, and isn't for the faint of heart. If you are limited to using flows, then you probably want to simply reduce the amount of data you are processing at once, and limit the amount of time your stored procedure takes to complete.

Thanks,

-Cameron
Frequent Visitor

I say this is very amateur on Microsoft's behalf. Great thinking outside the box, Cameron. Now, there has to be a way to do more than fire and forget? Perhaps a timer to poll a table which keeps state, so that you know when your trigger-started stored procedure finishes.

Yes, you will notice that in my description of the fire-and-forget approach, you can actually set your flow to watch for changes in the results table. You can then set a Flow trigger to fire when the result shows up (and therefore the stored procedure is complete). The most important part of the workaround is the ability to disconnect the flow that triggers the stored proc. Once complete, you can use all of the usual mechanisms you are used to (such as flow triggers).

 

Enjoy,

Cameron 


Hi Cameron,

 

Could you please show how you create a server-side DML trigger? The fire and forget approach sounds like a winner, but I can't figure out how to create the server-side trigger. 

 

Thanks in advance, 

Best,

Javier

 

 

Frequent Visitor

Sorry. It didn't work for me.


Hey there Javier - check out this link.

It has a lot of detail and some good examples.


Hi Cameron,

Thanks for the link. I have no issues creating the triggers. When I followed your approach, you mentioned that the "key" was to create a "server-side" trigger, and I can't find how to create a "DML server-side" trigger. In other words, how do you create a DML trigger that is "triggered" at the server level instead of the table level? Only DDL and Logon triggers are saved either in the "Stored Procedures" section or at the server level.

When I create a DML trigger (table level) to run a stored procedure that takes more than 2 min, I don't get the results back until the SP is finished. I think I'm missing something since I see others were able to follow your steps. 

Anyhow, I found a workaround where the DML trigger creates a SQL Agent job that calls the SP. The job is deleted once the SP has executed. It simulates an async operation that is tracked by the status field in the table you mentioned in your approach.
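In case it helps others, here is a rough sketch of that idea (hypothetical object names; it assumes SQL Server Agent is running, the trigger's owner has the necessary msdb permissions, and rows are inserted one at a time):

CREATE TRIGGER [dbo].[trg_RunState_Insert] ON [dbo].[RunState]
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @id INT, @jobname SYSNAME, @cmd NVARCHAR(MAX), @db SYSNAME = DB_NAME();
    SELECT @id = [id] FROM inserted;                        -- assumes one row per insert
    SET @jobname = N'RunMaster_' + CAST(@id AS NVARCHAR(10));
    SET @cmd = N'EXEC [dbo].[Master] @stateId = ' + CAST(@id AS NVARCHAR(10));

    -- Create a one-shot Agent job that is deleted automatically when it completes
    EXEC msdb.dbo.sp_add_job @job_name = @jobname, @delete_level = 3;
    EXEC msdb.dbo.sp_add_jobstep @job_name = @jobname, @step_name = N'Run Master',
         @subsystem = N'TSQL', @database_name = @db, @command = @cmd;
    EXEC msdb.dbo.sp_add_jobserver @job_name = @jobname;

    -- sp_start_job returns immediately, so the insert (and the Flow call) is not blocked
    EXEC msdb.dbo.sp_start_job @job_name = @jobname;
END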

Thanks again for your reply.

Best,

Javier

 


Hi Javier,

 

Yes, the idea here is to have a server-side trigger which executes the stored procedure using parameters inserted into the state table. When the stored proc finishes, it can update the same table with results and update the status. It can also insert result set(s) into a second table, which you can trigger on from Logic Apps. I think you've got the gist of it.

 

Thanks,

Cameron

 

Frequent Visitor

I retract my statement. 

1. Using a state table to start a stored proc. Check.

2. Updating the state table with "Complete" when finished. Check.

3. Using a flow trigger to monitor the SQL table for "Modified" rows. Trouble ahead!

 

 

Frequent Visitor

4) Your Flow should then call the stored procedure created in step 2 above (MasterStart). This will return as soon as the state table entry is created in the RunState table.

 

Some advice here, please. When I do this, the stored procedure waits for the entry to be created. The entry to be created (status = "Pending") in turn waits for the Master stored procedure to finish (20 mins), and thus I am still stuck with the same timeout issue.


The implementation here is admittedly tricky. I have an updated recommendation to use the Elastic Job Agent which, while requiring a new Azure feature, is much less error-prone, and much more straightforward. It still requires use of the state table, since jobs cannot accept input or output parameters.  This document will be published soon (it is in review) but I provide samples that can be easily built on for any stored procedure. You can also use the agent in on-premise SQL or Managed Instance to accomplish the same thing. I will post a link here when the document is published.


Long-running Stored Procedures for Power Platform SQL Connector

 

The SQL Server connector in Power Platform exposes a wide range of backend features that can be accessed easily with the Logic Apps interface, allowing ease of business automation with SQL database tables.  However, the user is still limited to a 2-minute window of execution.  Some stored procedures may take longer than this to fully process and complete. In fact, some long-running processes are coded into stored procedures explicitly for this purpose. Calling them from Logic Apps is problematic because of the 120-second timeout. While the SQL connector itself does not natively support an asynchronous mode, it can be simulated using passthrough native query, a state table, and server-side jobs.

For example, suppose you have a long-running stored procedure like so:

 

CREATE PROCEDURE [dbo].[WaitForIt]
    @delay char(8) = '00:03:00'
AS
BEGIN
    SET NOCOUNT ON;
    WAITFOR DELAY @delay
END

 

Executing this stored procedure from a Logic App will cause a timeout with an HTTP 504 result since it takes longer than 2 minutes to complete. Instead of calling the stored procedure directly, you can use a job agent to execute it asynchronously in the background. We can store inputs and results in a state table that you can target with a Logic App trigger. You can simplify this if you don’t need inputs or outputs, or are already writing results to a table inside the stored proc.

Keep in mind that the asynchronous processing by the agent may retry your stored procedure multiple times in case of failure or timeout. It is therefore critically important that your stored proc be idempotent. You will need to check for the existence of objects before creating them and avoid duplicating output.
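For example, a minimal sketch of an idempotent pattern, using the table names from the original post in this thread (illustrative only):

-- Safe to re-run: drop and recreate rather than failing on a second attempt
IF OBJECT_ID(N'[dbo].[MH_test_dev]', N'U') IS NOT NULL
    DROP TABLE [dbo].[MH_test_dev];

SELECT *
INTO [dbo].[MH_test_dev]
FROM [dbo].[MH_OpenSO_dev];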

 

For SQL Azure

An Elastic Job Agent can be used to create a job which executes the procedure. Full documentation for the Elastic Job Agent can be found here: https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-jobs-overview

You’ll want to create a job agent in the Azure Portal. This will add several stored procedures to a database that will be used by the agent.  This will be known as the “agent database”. You can then create a job which executes your stored procedure in the target database and captures the output when it is completed. You’ll need to configure permissions, groups, and targets as explained in the document above. Some of the supporting tables and procedures will also need to live in the agent database.

First, we will create a state table to register parameters meant to invoke the stored procedure. Unfortunately, SQL Agent Jobs do not accept input parameters, so to work around this limitation we will store the inputs in a state table in the target database. Remember that all agent job steps will execute against the target database, but job stored procedures run on the agent database.

 

CREATE TABLE [dbo].[LongRunningState](
       [jobid] [uniqueidentifier] NOT NULL,
       [rowversion] [timestamp] NULL,
       [parameters] [nvarchar](max) NULL,
       [start] [datetimeoffset](7) NULL,
       [complete] [datetimeoffset](7) NULL,
       [code] [int] NULL,
       [result] [nvarchar](max) NULL,
 CONSTRAINT [PK_LongRunningState] PRIMARY KEY CLUSTERED
(      [jobid] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

 

The resulting table will look like this in SSMS:

[Screenshot: Cameron_0-1602871250891.png]

We will use the job execution id as the primary key, both to ensure good performance, and to make it possible for the agent job to locate the associated record. Note that you can add individual columns for input parameters if you prefer. The schema above can handle multiple parameters more generally if this is desired, but it is limited to the size of NVARCHAR(MAX).

We must create the top-level job on the agent database to run the long-running stored procedure.

 

EXEC jobs.sp_add_job
    @job_name = 'LongRunningJob',
    @description = 'Execute Long-Running Stored Proc',
    @enabled = 1

 

We will also need to add steps to the job that will parameterize, execute, and complete the stored procedure. Job steps have a default timeout value of 12 hours. If your stored procedure will take longer, or if you'd like it to time out earlier, you can set the @step_timeout_seconds parameter to your preferred value in seconds. Steps also have (by default) 10 retries with a built-in backoff timeout in between. We will use this to our advantage.

 

We will use three steps:

This first step waits for the parameters to be added to the LongRunningState table, which should occur almost immediately after the job has been started. The first step merely fails if the jobid hasn't been inserted into the LongRunningState table, and the default retry/backoff will do the waiting for us. In practice, this step typically runs once and succeeds.

 

EXEC jobs.sp_add_jobstep
       @job_name = 'LongRunningJob',
       @step_name = 'Parameterize WaitForIt',
       @command = N'
              IF NOT EXISTS(SELECT [jobid] FROM [dbo].[LongRunningState]
                           WHERE [jobid] = $(job_execution_id))
                     THROW 50400, ''Failed to locate call parameters (Step1)'', 1',
       @credential_name = 'JobRun',
       @target_group_name = 'DatabaseGroupLongRunning'

 

The second step queries the parameters from the state table and passes them to the stored procedure, executing the procedure in the background. In this case, we use @callparams to pass the timespan parameter, but this can be extended to pass additional parameters if needed. If your stored procedure does not need parameters, you can simply call the stored proc directly.

 

EXEC jobs.sp_add_jobstep
       @job_name = 'LongRunningJob',
       @step_name = 'Execute WaitForIt',
       @command = N'
              DECLARE @timespan char(8)
              DECLARE @callparams NVARCHAR(MAX)
              SELECT @callparams = [parameters] FROM [dbo].[LongRunningState]
                     WHERE [jobid] = $(job_execution_id)
              SET @timespan = @callparams
              EXECUTE [dbo].[WaitForIt] @delay = @timespan',
       @credential_name = 'JobRun',
       @target_group_name = 'DatabaseGroupLongRunning'
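If you need more than one input, one possible extension (an assumption on my part, not part of the sample above) is to store the values as JSON in the [parameters] column and unpack them with JSON_VALUE inside the step command, for example:

DECLARE @callparams NVARCHAR(MAX)
SELECT @callparams = [parameters] FROM [dbo].[LongRunningState]
       WHERE [jobid] = $(job_execution_id)

-- e.g. [parameters] = N'{"delay":"00:03:00","label":"nightly"}'
DECLARE @timespan char(8)      = JSON_VALUE(@callparams, '$.delay')
DECLARE @label    nvarchar(50) = JSON_VALUE(@callparams, '$.label')

-- pass each unpacked value on to a stored procedure that accepts it
EXECUTE [dbo].[WaitForIt] @delay = @timespan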

 

The third step completes the job and records the results:

 

EXEC jobs.sp_add_jobstep
       @job_name = 'LongRunningJob',
       @step_name = 'Complete WaitForIt',
       @step_timeout_seconds = 43200,
       @command = N'
              UPDATE [dbo].[LongRunningState]
                 SET [complete] = GETUTCDATE(),
                     [code] = 200,
                     [result] = ''Success''
               WHERE [jobid] = $(job_execution_id)',
       @credential_name = 'JobRun',
       @target_group_name = 'DatabaseGroupLongRunning'

 

We will use a passthrough native query to start the job, then immediately push the parameters into the state table for the job to reference. We will use the dynamic data output ‘Results JobExecutionId’ as the input to the ‘jobid’ attribute in the target table. We must add the appropriate parameters for the job to unpackage them and pass them to the target stored procedure.

Here's the native query:

 

DECLARE @jid UNIQUEIDENTIFIER
DECLARE @result int
EXECUTE @result = jobs.sp_start_job 'LongRunningJob', @jid OUTPUT
    IF @result = 0
        SELECT 202[Code], 'Accepted'[Result], @jid[JobExecutionId]
    ELSE
        SELECT 400[Code], 'Failed'[Result], @result[SQL Result]

 

Here's the Logic App snippet:

[Screenshot: Cameron_1-1602871250902.png]
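In T-SQL terms, the insert performed by the Logic App's "Insert row" action amounts to roughly the following (the literal values are placeholders; the jobid comes from the JobExecutionId returned by the native query):

INSERT INTO [dbo].[LongRunningState] ([jobid], [parameters], [start])
VALUES ('00000000-0000-0000-0000-000000000000',  -- JobExecutionId from the native query
        N'00:03:00',                              -- parameters for WaitForIt
        SYSDATETIMEOFFSET())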

 

When the job completes, it updates the LongRunningState table so that you can easily trigger on the result. If you don’t need output, or if you already have a trigger watching an output table, you can skip this part.

[Screenshot: Cameron_2-1602871250915.png]

 

 

For SQL Server on-premises or SQL Azure Managed Instances:

SQL Server Agent can be used in a similar fashion. Some of the management details differ, but the fundamental steps are the same.

https://docs.microsoft.com/en-us/sql/ssms/agent/configure-sql-server-agent
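As a rough sketch (not taken from the linked document), the on-premises equivalent with SQL Server Agent might look like this. Note that SQL Server Agent has no $(job_execution_id) token, so you would generate and pass your own key into the state table:

USE msdb;
GO
EXEC dbo.sp_add_job
     @job_name = N'LongRunningJob',
     @description = N'Execute long-running stored proc',
     @enabled = 1;
EXEC dbo.sp_add_jobstep
     @job_name = N'LongRunningJob',
     @step_name = N'Execute WaitForIt',
     @subsystem = N'TSQL',
     @database_name = N'LocalDB',   -- target database (placeholder name)
     @command = N'EXEC [dbo].[WaitForIt] @delay = ''00:03:00''';
EXEC dbo.sp_add_jobserver @job_name = N'LongRunningJob';
GO

-- The flow's passthrough native query can then start the job and return immediately:
EXEC msdb.dbo.sp_start_job @job_name = N'LongRunningJob';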

 
