Apologies as this is probably something very easy but I can't for the life of me find it.
I've created a flow which connects to an Azure SQL database successfully. It creates records and updates them. I've added another column to the SQL table, but the column is not showing in the flow. It's just a varchar(50) column. Is there a button to update the schema of the table so it shows as a parameter in the SQL update action in the flow, please? I don't fancy recreating the whole SQL part of the flow every time I add a new column lol.
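For reference, the kind of change that triggers this is an ordinary column addition (table and column names below are made up for illustration):

```sql
-- Hypothetical example: adding a nullable varchar(50) column to an
-- existing Azure SQL table. The table updates fine, but the SQL
-- connector in the flow keeps showing its old cached schema.
ALTER TABLE dbo.Orders
    ADD Notes varchar(50) NULL;
```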
Yes, waiting a day should also refresh the existing connection. The issue is caused by the caching of the connection definition when it is created. If you don't delete the connection and create a new one, there is no way to manually update the cached definition. I suspect the cache clears overnight, and that's why it works the next day. If you can afford to wait, that is the easier fix.
Responding to my previous comment.
"Today, again I had to make some changes to the table (increasing fields from varchar(25) to varchar(50)). The flow didn't reflect the changes immediately; I will wait till tomorrow to see whether they get reflected and will keep you posted."
It is working now.
- Another way to work around the original issue: I created a copy of my table and used that in the flow, and it works as expected (just to avoid creating a new connection every time). Still, this is not a good solution as we progress further, because the original table might be used elsewhere in the application, and duplicating it would have a huge impact across the application.
Yes, but that's nonsense when you are in dev mode and testing various scenarios out, just like @Peter_Groen mentioned above. It is a fix Microsoft should have implemented a long time ago. I would go even one step further and say: it is an utter disgrace that it is taking so long to get a fix. And quite frankly, for developers moving over to Power Automate Cloud, there isn't anything to be gained. In fact, not only do you get less control over the internal details of your app, your productivity drops sharply as well. I'd rather stick to SQL, Python, PowerShell, or C# and get things done in a third of the time it takes me to work around the inane bugs of the SQL actions when using a connection to an on-prem DB. Three years and counting and still no fix... ridiculous.
I created two connections with exactly the same config but two different names. When the main connection used in the flow starts misbehaving, I switch over to the other connection that has not been used in the flow, and all the uncalled-for error messages disappear, i.e. the flow starts running as intended and the dev/debug cycle can go on. Of course, if you need to change a table schema again, or change one of the OData parameters of the action itself, you are in trouble again. Hopefully enough time will have gone by that the previous connection's cache has reset, so switching back to it allows work to go on. If not, you need a third copy of the connection, and so on ad infinitum. Bloody hell, it feels like we are back in the prehistoric days of computing, when MS-DOS ruled the desktop and COBOL was all the rage! 😉
You only have to wait 24 hours if you change the schema of the underlying data source (SQL) and don't want to delete the connection. If you delete and recreate the connection you can do it right now. But the best approach is to make sure the schema of the database is set before you start trying to create a workflow around it.
Yes, I get it, but it is not acceptable that it be so rigid. Even if the schema is set beforehand and never changes throughout the whole dev cycle, every single time you do a DROP TABLE to reset the DB (I know, I should use TRUNCATE instead), you get in trouble again.
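The distinction matters here because TRUNCATE empties the table while leaving its definition in place, so the connector's cached schema stays valid, whereas DROP removes the definition entirely and recreating the table counts as a schema change. A short sketch with a made-up table name:

```sql
-- Keeps the table definition; the cached connector schema remains valid.
TRUNCATE TABLE dbo.TestData;

-- Removes the table entirely; recreating it is a schema change,
-- so the connection's cached definition goes stale again.
DROP TABLE dbo.TestData;
CREATE TABLE dbo.TestData (
    Id   int IDENTITY PRIMARY KEY,
    Name varchar(50) NOT NULL
);
```

Note that TRUNCATE requires that no foreign keys reference the table, so it isn't always a drop-in replacement for DROP-and-recreate.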
And so you are forced to create a test connection so that the dozen other flows that use more than a dozen SQL actions over the same connection are not adversely affected. Imagine if one forgets this: he or she will have to go into each flow and manually change the settings a dozen times, because there is no UNDO button either! Also, when you are developing something, time is usually of the essence, and waiting 24 hours is absolutely bonkers.
And so one is forced to delete and recreate the connection every single time a small mistake is made in the steps above. Anyway, you know it, I know it... but does Microsoft know it?! This is what irks not just me, apparently, but many other users... it seems as if MS doesn't give a fig about improving this. In any case, I'll be going back to the traditional dev environment, as it affords both faster dev/debug cycles and greater productivity. RPA in the cloud is a nice concept, but AFAIC a toy concept not yet ready for prime time (not to mention all the other issues re UI/UX, variable scoping, etc.). Oh, and another thing: customers shouldn't feel forced to start moving their on-prem data sources into the cloud in order to get a more seamless experience. I understand that that's the preferential (lucrative) option for MS, but it still rubs the wrong way.
Addendum: since I started writing the above, the goddamn connection did refresh... we are talking about what, 10–20+ minutes since I left the flow before starting this post... so that's strange: sometimes it takes 24 hours, and sometimes it takes less than 30 minutes.
Addendum 2: Unbelievable! I added a new SQL UPDATE action to the flow, thinking that since the connection had refreshed for the previous UPDATE action, I could continue working by adding the next. But no such luck! The new action refers to the old image of the connection, i.e. its table list is missing tables, whereas the old action, as I mentioned in the previous addendum, suddenly was able to access all the tables.
Talk about not making any sense.
Hey Microsoft! Would it hurt you to add a REFRESH CONNECTION button in the action menu?
Addendum 3: so basically, if you copy to My Clipboard the SQL action that is connected to the refreshed image of the connection, you can then add that copy of the action and get immediate access to all your tables.
Exactly... it's utterly and totally outrageous. Now I am having the same problem with a Form trigger. I had to recreate the Form, deleting the old one. As a result, the Form connection keeps referencing the deleted Form, and there is no way to get it to understand there is a new Form it should connect to.
Since this wasn't a rush job, I let it sit for 24 hours expecting the connection to be reset... but it wasn't. What an awful user experience this is. I feel sorry for all the people who got lured into this no-code gimmick and who don't have any basic computer-science foundation to understand what is really going on. For simple stuff, like email processing for example, it works fine. For anything more complex, where you have to drop down to using Azure Functions and so on, forget it--you'll need a dev to do it properly.
Anyway, I could rant on until I am blue in the face when it comes to Power Automate and all its drawbacks.
The number of times I've felt the urge to throw my computer out the window...!
For basically all SQL actions, I've started using the "Execute a SQL Query" action instead of any of the others. That way I'm not reliant on the refresh and can just string together whatever I want to do as a SQL command. No more wait times.
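For example (table and column names below are hypothetical), an update that would otherwise go through the "Update row" action can be written as plain T-SQL:

```sql
-- Run inside the "Execute a SQL Query" action. Because the statement
-- is plain T-SQL sent to the server, newly added columns are usable
-- immediately -- no cached connector schema is involved.
UPDATE dbo.Orders
SET Notes  = 'Shipped',
    Status = 'Closed'
WHERE OrderId = 12345;
```

The trade-off is that you give up the typed per-column parameters the designer generates, so you need to build and sanitize the statement yourself.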
Because my org has an on-premises SQL Server solution, I unfortunately cannot use that particular SQL action. I have since been able to carefully navigate around these issues. The actions are useful for rapidly deploying workflows that our company has wanted for a long time, but once we get our custom front-to-back ERP developed and deployed, it's bye-bye Power Platform (for the most part).
This problem hasn't been fixed in over two years and is very frustrating. Working with a SQL connection is a pain and has forced us to look into and implement different solutions with other providers.