
Pass Table parameters through the SQL Connector - Execute a stored procedure action to stored procedures

We would like to be able to pass table-valued parameters (TVPs) through the SQL Connector's "Execute a stored procedure" action to stored procedures.

 

Here is a common use case that can benefit from this capability.

 

Imagine you have a PowerApps screen that allows you to see data about Customers and their Accounts, Orders, and Contacts.  Under the hood you have a SQL Server database with many tables that hold the data.  Currently, the screen in the PowerApp can easily execute a stored procedure that returns data from all the related tables.

 

Now imagine the same screen in the PowerApp also lets you create a new Customer and their Accounts, Orders, and Contacts.  Because the current connector does not support Table parameters, you have to use an approach like the one I describe below.  The Flow is optional, but recommended.  In this case, one Flow is created for each stored procedure.

 

From the PowerApp, call the Flow that uses the SQL Connector to call the stored proc that inserts the Customer into the customer table and returns the ID for the new customer.  Then, from the PowerApp, make multiple subsequent calls to create each Account, Order, and Contact, passing in the information for the individual record each time, along with the customer ID to associate it with in the database.  This is a lot of calls and slows down the app.
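As a sketch of this per-entity approach (the table and procedure names here are hypothetical), each Flow wraps a simple stored procedure like this one:

```sql
-- Hypothetical per-entity procedure: inserts one Customer and returns the new ID.
-- The Flow's "Execute stored procedure" action calls this once per record.
CREATE PROCEDURE dbo.usp_InsertCustomer
    @Name  NVARCHAR(100),
    @Email NVARCHAR(256)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.Customer (Name, Email)
    VALUES (@Name, @Email);

    -- Return the generated identity so the app can pass it to the
    -- Account, Order, and Contact procedures on subsequent calls.
    SELECT SCOPE_IDENTITY() AS NewCustomerId;
END
```

Similar procedures (say, dbo.usp_InsertAccount, dbo.usp_InsertOrder, dbo.usp_InsertContact) would each be called once per child record, which is where the call volume comes from.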

 

A single Flow could also be created that orchestrates calling all the individual Flows over and over, but that's a cumbersome solution architecture to support and debug.  

 

In both approaches described above, database transactions can only be applied at the level of each individual stored proc.  As a result, if any of the insert operations fails, only the failing insert is rolled back.  You are then left having to call compensating stored procs to undo the inserts that succeeded, or tell the user which parts did not work and write a lot of logic to retry the failed inserts.  This isn't ideal or pretty, and it doesn't support a true transaction covering the customer and all the related records submitted by the screen in the PowerApp.  It also clutters the database log, and tables with auto-increment (identity) columns end up with gaps.
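To illustrate the limitation (again with hypothetical names), a transaction inside one of these procedures can only cover its own insert:

```sql
-- Hypothetical child procedure: the transaction protects only this one insert.
-- If a later call (e.g. to an order procedure) fails, this Account row is
-- already committed and must be removed by a separate compensating procedure.
CREATE PROCEDURE dbo.usp_InsertAccount
    @CustomerId  INT,
    @AccountName NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        INSERT INTO dbo.Account (CustomerId, AccountName)
        VALUES (@CustomerId, @AccountName);

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        -- Rolls back this insert only; nothing outside this proc is undone.
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        THROW;
    END CATCH
END
```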

 

Now, imagine a world where you can pass Table parameters to stored procedures.  In this bright, happy world of clean data, minimal code in the PowerApp, simple Flows, and fully transactional data access, we can create a 'master' stored procedure that takes in all the data for the Customer record to create, plus table parameters containing the multiple Accounts, Orders, and Contacts we'd like to create as well.  We can then invoke this stored procedure with the Table params and watch the 'master' procedure insert the Customer record, get the new ID back, then call the other stored procedures with the appropriate Table parameters to create all the associated records in a single database transaction.  If anything goes wrong, we roll back the transaction, return the error to the PowerApp, easily identify what failed, and let the user know so they can try again.  No orphan data will ever exist.
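A minimal sketch of what that 'master' stored procedure could look like, assuming hypothetical table names and user-defined table types:

```sql
-- Hypothetical table types that carry the child rows in a single call.
CREATE TYPE dbo.AccountTableType AS TABLE (AccountName NVARCHAR(100));
CREATE TYPE dbo.OrderTableType   AS TABLE (OrderDate DATE, Amount DECIMAL(18,2));
CREATE TYPE dbo.ContactTableType AS TABLE (ContactName NVARCHAR(100), Email NVARCHAR(256));
GO

CREATE PROCEDURE dbo.usp_CreateCustomerWithChildren
    @Name     NVARCHAR(100),
    @Email    NVARCHAR(256),
    @Accounts dbo.AccountTableType READONLY,
    @Orders   dbo.OrderTableType   READONLY,
    @Contacts dbo.ContactTableType READONLY
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        INSERT INTO dbo.Customer (Name, Email) VALUES (@Name, @Email);
        DECLARE @CustomerId INT = SCOPE_IDENTITY();

        -- Every child row is inserted inside the same transaction.
        INSERT INTO dbo.Account (CustomerId, AccountName)
        SELECT @CustomerId, AccountName FROM @Accounts;

        INSERT INTO dbo.[Order] (CustomerId, OrderDate, Amount)
        SELECT @CustomerId, OrderDate, Amount FROM @Orders;

        INSERT INTO dbo.Contact (CustomerId, ContactName, Email)
        SELECT @CustomerId, ContactName, Email FROM @Contacts;

        COMMIT TRANSACTION;
        SELECT @CustomerId AS NewCustomerId;
    END TRY
    BEGIN CATCH
        -- Any failure rolls back the Customer and all child rows together,
        -- so no orphan data is left behind.
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        THROW;
    END CATCH
END
```

With this shape, the PowerApp (or a single Flow) makes one call that either succeeds completely or leaves the database untouched.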

 

Our current workaround is to live with the fact that we cannot do all the inserts in a single transaction, which, as mentioned above, requires lots of extra code.

 

This same principle applies to updates and deletes as well.

 

Adding this capability would make it much easier to build really powerful PowerApps that interact with relational database schemas in SQL Server.

Status: New