
Removing user ability to access data source without using the app

Problem:

One of the biggest issues in PowerApps right now is that we cannot protect our data. If we connect any data source (Excel, SharePoint list, SQL Server connection) to PowerApps, it has to be shared with all users for them to be able to use the app.

This creates a problem: users have direct access to the data source and can bypass the app, both to modify the data and to see information not meant for them. This matters if your app was built to limit users' access to some data, for example:

  1. Showing users only their own vacation requests and hiding other users' requests
  2. Showing users only their own travel requests and hiding other users' trips

In all such cases, the sharing requirement means all users can still see all the data, and they can modify it without leaving any trace.

 

In the case of an Excel file on OneDrive, we need to give users access to the file, which means a user can simply go to OneDrive, find the Excel file, and edit it directly.

In the case of a SharePoint list, the user needs Edit rights to that list, so they can simply find it on the SharePoint site and edit it there.

In the case of a SQL Server connection, the user can open PowerApps, click to create a new app, open Data Sources, and the shared SQL connection will be listed there for them to connect to. This gives the user edit rights to every table in that SQL connection.

 

Idea:

I believe the best way to fix this, and the change that would make PowerApps a truly powerful tool capable of replacing most organizational applications, is to give the app itself write access to the Excel sheets, SharePoint lists, or SQL connection, rather than the user. The user would then have no direct access to the files, lists, or connection, and the only way to interact with the data would be through the app.

Status: Planned

This work is now planned. All of the connection types you discuss above are "implicit" connections. We are currently in architecture design discussions to address this.

Comments
omal
Regular Visitor

@LanceDelano 

I wish this could be implemented very soon. Any update on the ETA? This will definitely increase the usage and encourage developers to use PowerApps. 

TedBabcock
Advocate II

The flow-calling-a-flow method to get around this works perfectly, and I've become so used to it that I plan to keep using it, even after this fix is implemented. I never particularly cared for Power Apps Forms, and I can basically design my own forms. And once the ability to parse JSON comes to Power Apps this summer, I will be able to handle almost all of my data operations -- I use SQL Server -- with exactly two flows. The process is described in an earlier post in this thread.

AIUYM19
Advocate V

@TedBabcock, PowerApps already has the ability to parse JSON. See this example in one of my apps...

 

App.OnStart: Set(JustOpened, true);

[Title screen].OnVisible:

If(JustOpened,
    UpdateContext({TempReturned: ITAuditAppUserProfileRequester.Run(JSON(UserParms))});
        //That function, "ITAuditAppUser...", takes a JSON object and returns a JSON object.
    If(TempReturned.ResultCode <> 200, //ResultCode is an element in the returned JSON object.
        UpdateContext({
            ErrorNotification: true,
            ErrorMessage: TempReturned.ResultCode & " " & TempReturned.Message
                //Message is another JSON element.
        }),
        Set(UserProfile, TempReturned.Content) //Content is another JSON element (itself a JSON object).
    );
    Set(Programs, ITAuditAppProgramsRequester.Run());
        //This is another PowerAutomate function that returns a JSON object.
    Set(JustOpened, false) //Reset the flag so this block doesn't re-run if the screen is revisited.
);

 

TedBabcock
Advocate II

Thanks, @AIUYM19 , but I don't understand. Where does ITAuditAppUserProfileRequester come from?

 

I've seen other methods of parsing JSON in Power Apps. Mr Dang uses Split() and some regex. He even provides an app that will help you design the schema of your collection.

 

https://www.youtube.com/watch?v=L4pPCDXoL7c

 

April Dunnam also has a regex-based method. I could never get those to work.

 

https://www.youtube.com/watch?v=2xWiFkBf294

 

Shane Young had a video on copying and pasting from Excel into a Power Apps collection. With some tweaking, I was able to get it to (mostly) work, using Split() and an HTML table returned from a flow. However, I never got very far with it, and when it was announced earlier this year that they would release the Power Apps function to parse JSON this summer, I didn't pursue it further.

 

https://www.youtube.com/watch?v=foGaOrBzVLc

 

AIUYM19
Advocate V

@TedBabcock, ITAuditAppUserProfileRequester and ITAuditAppProgramsRequester are functions I created in PowerAutomate. I designed both of them to return JSON objects containing the requested information along with some other metadata. In both functions, the actual data requested is returned as a nested JSON object in the Content element. Both functions are designed such that even if there is an error (in the argument or in the PowerAutomate process), a valid JSON object is still returned.

 

One of the biggest reasons I designed my PowerApps like this is that it handles JSON parsing almost identically to how JavaScript handles it. That is, you take a valid JSON object and just start referencing its elements. And since PowerApps script is dynamically typed, you (usually) don't have to worry about the variable type when referencing those elements (just like in JavaScript... mostly).

 

In fact, PowerApps handles the JSON parsing so well that in this particular app, literally all the data the app uses is either stored in or returned in JSON (user data, permissions data, user preferences, etc.). Of course, all this data is stored on a secure server (mostly in SharePoint), which brings us back to the point of this thread: we need a way to limit access to this data only to the PowerApps app without the need to use the circuitous and resource-hungry flow-calling-a-flow method.

TedBabcock
Advocate II

Thanks, @AIUYM19 ! Sorry for the basic questions. I'm nothing but an experienced amateur here.

 

How is your JSON object returned to Power Apps from the flow? Using the Response action? Or is it a serialized object returned as a string? It's the latter I'm looking to use, because then I don't need to specify the JSON schema. With the Response action, it does need to be specified, which means I need a separate flow for every distinct set of returned data, because a single flow can have only one JSON schema specified.

AIUYM19
Advocate V

@TedBabcock, hey, no worries! My flows are using the Response action to return defined JSON objects. However, I've had success with returning it as a serialized string, too. If returned as a string, then you'll need to implement something like Dang's or Dunnam's methods within PowerApps to parse the object.

 

Before giving up hope, though, it's not necessarily a bad thing to create multiple flows, one for each JSON schema. PowerAutomate is billed on the number of actions each flow takes within a billing period, and there are limits to how many actions a flow can have. I've found PowerAutomate (and, subsequently, my PowerApp) to run much more quickly (and cheaply!) when using multiple short flows rather than one giant flow.

 

Consider what needs to happen when you have one flow take care of, say, four different return JSON schemas:

  1. The flow needs to determine which schema to use.
  2. The flow needs to define the correct schema.
  3. The flow needs to crosscheck the data with the schema to ensure compliance.
  4. The flow needs to create schema-specific error responses in case of failure.

These four steps alone will lead to dozens of extra actions in your single flow. All but #3 can be avoided entirely if you create a separate (shorter) flow for each schema. Fewer actions mean cheaper and faster runs. Shorter flows mean easier and faster maintenance and bug fixing (and less whisky; that may be a good or a bad thing 😉).
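As a rough sketch of what this looks like on the app side (the two named flows are from earlier in this thread; "CombinedRequester" and its discriminator argument are hypothetical):

```
// One short flow per schema: each call is direct, and each flow body stays small.
Set(UserProfile, ITAuditAppUserProfileRequester.Run(JSON(UserParms)).Content);
Set(Programs, ITAuditAppProgramsRequester.Run().Content);

// One multi-schema flow: the app must pass a discriminator, and the flow must
// branch through steps 1–4 above on every run, even for the simplest request.
Set(UserProfile, CombinedRequester.Run("UserProfile", JSON(UserParms)).Content);
Set(Programs, CombinedRequester.Run("Programs", "").Content);
```

The per-schema calls also keep each flow's Response action tied to exactly one schema, which is what makes the short-flow approach cheap to maintain.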

 

I take the same approach to building flows as I take to coding. Think of a flow as a function: you call functions that do specialized tasks from a function that does a generic task. For example, the ITAuditAppUserProfileRequester and ITAuditAppProgramsRequester flows call many of the same child flows for generic data manipulation, but each has its own schema to return to the PowerApp. This way, if I want an object with the schema from ITAuditAppProgramsRequester, I avoid running any of the unnecessary actions in ITAuditAppUserProfileRequester, and vice versa.

TedBabcock
Advocate II

Thanks, @AIUYM19 ! This is all great stuff.

 

I see the point about flow runs. Currently, however, I have literally dozens of flows. To see my whole list, I have to click "Show More" several times. Admittedly, lots of those flows are unneeded -- for example, I don't need a separate flow to bring back a different set of columns from the same basic data; I can use one flow that calls all the columns of a certain type, and then use ShowColumns() or DropColumns() in Power Apps to get only the particular ones I want. 

 

Even so, such a system gets less efficient as the number of basic data schemas grows. As an alternative (just off the top of my head), if I run into a resource or cost problem with a single parent flow doing too much work, I could clone it into several copies (Flow A, Flow B, Flow C, etc.) and assign different calls to them to distribute the workload. This can scale up indefinitely as usage grows, and since they're all copies of the same flow, nothing else needs to be tracked except each flow's usage.

 

And, back in Power Apps, the call would always be the same, regardless of the particular flow being used.

 

Because I'm using SQL Server, I can handle all CRUD operations there. So I send a serialized JSON string with the parameters. The parent flow -- one for each schema now, but the same (with clones) in the near future -- sends the string to the child flow, which sends it to a Stored Procedure in SQL Server. That SP takes the first parameter to determine which detailed SP to send it to. Then each detailed SP does its thing and ends with a SELECT to return a read of the refreshed data -- as a table now, but as a JSON string when Power Apps ParseJSON() is ready. The child returns the data to the parent, which returns it to the app.

 

Every data operation in Power Apps, then, is something like this:

 

ClearCollect(collection, ShowColumns(Flow.Run(strJSONparams), "ColA", "ColB", "ColC"))

 

This summer, ParseJSON() will be inserted into that statement, and it will be the same flow that gets the JSON string from the data source (whether it's literally a single flow, or copies).
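Based on the announced behavior of ParseJSON() (it returns an untyped object whose values must be explicitly converted), that statement might look roughly like this sketch; the column names, the flow name, and the `result` output property are illustrative, since the actual property name depends on how the flow's response is defined:

```
// Flow returns a serialized JSON string (an array of records) in its response.
ClearCollect(collection,
    ForAll(Table(ParseJSON(Flow.Run(strJSONparams).result)),
        {
            ColA: Text(ThisRecord.Value.ColA),  // untyped values must be converted
            ColB: Value(ThisRecord.Value.ColB), // explicitly: Text(), Value(), etc.
            ColC: Text(ThisRecord.Value.ColC)
        }
    )
)
```

Because the conversion happens in the app, the same flow can return any schema as a plain string, which is exactly what makes the clone-and-distribute plan above workable.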

 

That's the plan, anyway. "Best laid plans..." and all that. I'm looking forward to testing it all.

 

Many thanks for all your help on this -- as well as back when I was first learning about flow-calling-flow to get around the problems of implicit sharing. You and @skylitedave were generous beyond any expectation in helping me grasp that.

drwaz
Helper I

@LanceDelano

Any update or ETA on this?

 

LanceDelano
Power Apps

A feature to secure new SQL Server connections (and all implicitly shared connections) is currently being developed. Old ones can easily be converted (drop and re-add). As you might imagine, it involves several different groups and things have to come together, but code is actively being written. I don't have an exact ETA, but you should see it this fall or before.