Our flows use a set of static data that is currently read from a SharePoint list.
It is convenient and separates the 'code' from the 'data', but with the volume of runs we now have it has become a heavy burden: on every run, the flow calls out to the SharePoint list, iterates over the returned items to trim and otherwise clean the entries, and builds the JSON parameter object for the flow to use.
It feels like a waste of resources, both from the user's point of view (number of runs, time spent) and from the system's point of view (extra compute and read operations).
With all the new features shipped recently, I wonder if there is some kind of 'in-memory' storage we could populate once in a while (e.g. once an hour?) and have all the flows that need it reference that source instead of calling the 'reader-from-SharePoint'.
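For context, the per-run overhead described above is essentially this transformation, sketched here in Python (the field names `Title` and `Value` are assumptions for illustration, not the actual list schema):

```python
import json

# Hypothetical raw items, as a "Get items" call against the SharePoint
# parameter list might return them (field names are made up).
raw_items = [
    {"Title": "  ApiTimeout ", "Value": " 30 "},
    {"Title": "RetryCount", "Value": "5  "},
]

def build_parameters(items):
    """Trim each entry and collapse the list into one parameter object."""
    return {item["Title"].strip(): item["Value"].strip() for item in items}

params = build_parameters(raw_items)
print(json.dumps(params))  # {"ApiTimeout": "30", "RetryCount": "5"}
```

Repeating this read-and-clean step on every one of the tens-to-hundreds of daily runs is the cost the question is trying to avoid.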
There is no such in-memory storage location in Power Automate flows. If your data is static and always the same, you could hardcode the values into a variable in JSON format.
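For the fully static case, such a hardcoded variable could look something like this (the parameter names here are invented purely for illustration):

```json
{
  "ApiTimeout": "30",
  "RetryCount": "5",
  "NotifyEmail": "ops@example.com"
}
```

Initializing a variable with this JSON once at the top of the flow would remove the SharePoint read entirely, at the cost of needing flow-edit access to change any value.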
Thank you, @abm ,
unfortunately this is not a solution at all.
These are parameters I need other people to be able to change, and they are not the same people who should have access to the flow definition itself. So hard-coded values are not the way to go.
I guess I will have to migrate the flows to a solution and use the environment variables that I heard are available there.
Yes, you could try the environment variables feature.
Why not keep these values in a SharePoint list, one entry per person, and have users run the flow against their selected entry? They could change the values in the list before running as well.
I've built a whole system of flows, all triggered either by an external application's webhook or by other flows. None is initiated by a user in SharePoint or anywhere else in the Office 365 environment, and we have tens if not hundreds of runs a day.
All totally hands-off.
The parameters are in a couple of SharePoint lists now, and all the externally triggered flows read from these lists.