Thanks MelindaK. I think I understand the delegation workaround. What I ultimately need to determine for my company is how feasible or advisable it is to use SharePoint lists as a data source (as opposed to the more expensive Dynamics 365, say). I imagine it's a judgement call based on the ultimate size (number of rows) of the planned data tables, but to what extent does the number of COLUMNS/FIELDS also contribute to the inevitable degradation of performance? Are there recommended upper limits (on rows and columns) beyond which SharePoint becomes obviously impractical as a data source for PowerApps?
I'm not sure how PowerApps actually retrieves data from SharePoint Online, but I suspect that somewhere behind the scenes a CAML query is involved in fetching records. Where I work, we have SharePoint lists and libraries with more than 100,000 items; as long as queries are made on indexed columns and each query returns fewer than 5,000 items at a time (the list view threshold set by Microsoft), I have seen only minimal performance degradation.
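Just to illustrate what I mean (this is speculation on my part — I don't know that PowerApps actually issues CAML, and the column names here are made up), a query that stays on the right side of the threshold would look something like this: it filters on a single indexed column and caps the result set below 5,000 items.

```xml
<!-- Hypothetical CAML query against a large list.
     "Status" is assumed to be an indexed text column;
     the RowLimit keeps the batch under the 5,000-item
     list view threshold. -->
<View>
  <Query>
    <Where>
      <Eq>
        <FieldRef Name="Status" />
        <Value Type="Text">Active</Value>
      </Eq>
    </Where>
  </Query>
  <RowLimit>4999</RowLimit>
</View>
```

As long as the `Where` clause hits an indexed column first, SharePoint can serve this quickly even on lists well past 100,000 items.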
Taking this into account, I would think that as long as your query falls within the boundaries set by PowerApps, you shouldn't experience performance issues. The biggest SharePoint list I've worked with from PowerApps so far has been about 700 items, and so far so good, provided the workarounds for delegation are implemented correctly.
The main limitations I have found with using SharePoint as a data source so far have been on the PowerApps side: the 500-item record limit and (even with delegation) the inability to filter on more complex field types (e.g. Person, Date). Workarounds exist (such as the one Mr. Dang has put together), but it would be great if Microsoft could lift these limitations down the road to simplify the process.
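For anyone following along, the kind of workaround I mean looks roughly like this (a sketch in the same spirit as Mr. Dang's approach, not his exact formulas — `MyList`, `colItems`, and the column names are placeholders). The idea is to pull the list into a local collection in chunks that each stay under the record limit, using a delegable filter on the indexed ID column, and then run the non-delegable filtering (e.g. on a Person column) locally:

```
// Pull the list down in ID-range batches, each under the record limit.
// Filter on ID is delegable, so each batch query runs server-side.
ClearCollect(colItems, Filter(MyList, ID >= 1 && ID <= 500));
Collect(colItems, Filter(MyList, ID >= 501 && ID <= 1000));

// Non-delegable operations now run locally against the collection,
// e.g. filtering on a Person field:
Filter(colItems, Author.DisplayName = User().FullName)
```

It works, but you can see why it gets tedious as the list grows — which is why I'd love to see these limitations lifted.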