So I have a customer with a closed system; pretty much all you can do is download a spreadsheet that gives you a snapshot of the data at a point in time. I could use Selenium to automate the download every 5 minutes and PowerShell to update the spreadsheet in Office 365.
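As a rough sketch of that 5-minute loop (in Python, since Selenium comes up later in this thread; `download_snapshot` is a placeholder for the real Selenium download step, not actual code from my setup):

```python
import time

POLL_SECONDS = 5 * 60  # the 5-minute cadence described above


def run_poller(download_fn, iterations, sleep_fn=time.sleep):
    """Call download_fn on a fixed cadence.

    download_fn stands in for the real Selenium download step;
    sleep_fn is injectable so the loop can be tested without waiting.
    """
    results = []
    for i in range(iterations):
        results.append(download_fn())
        if i < iterations - 1:
            sleep_fn(POLL_SECONDS)
    return results
```

In practice you would more likely let Task Scheduler or a cron job drive the cadence and keep the script single-shot, but the loop shows the shape of it.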
My issue was that, using the spreadsheet as a data source, the data is static, so refreshing the data from within the app was not an option. I literally had to open PowerApps, refresh the data source, and republish the app, as it is not "Managed".
I did, however, read an article on using the OneDrive connection and the Refresh function. Once your connector is created, you select the tables you want to work with and bingo - that's that. I did notice, though, that PowerApps created an ID field as an index.
So like I mentioned earlier, the data dumps from the customer's system are what I am working with as a data source. My first thought was just to do a copy and paste from the master file, but I want to be as methodical as possible so as not to harm those indexes created by PowerApps. Should I overwrite the entire range minus the indexes, or should I use a lookup field to find and update only the differences (which could be time-consuming given the number of rows; currently just under 5,000)? I also thought about writing a sync tool from the customer dump to the Excel file in OneDrive, plus a refresh counter in PowerApps for automation.
Does the data dump provide a unique ID (i.e., a primary key) for each record? If yes, you can safely use it to identify the correct rows to update within the Excel file and then overwrite the entire range minus the indexes.
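A minimal sketch of that key-based update, with rows as Python dicts. This assumes the unique ID column is called "ID" and that the dump is authoritative (rows missing from the dump are dropped); adjust the names to whatever your file actually uses:

```python
def upsert_rows(existing, incoming, key, index_field="ID"):
    """Merge the customer dump (incoming) into the Excel rows (existing).

    Rows are dicts. Matching is done on `key` (the dump's unique field);
    the PowerApps-generated `index_field` on existing rows is preserved.
    `index_field` is an assumption about the auto-created column's name.
    """
    by_key = {row[key]: row for row in existing}
    next_index = max((r.get(index_field, 0) for r in existing), default=0) + 1
    merged = []
    for row in incoming:
        if row[key] in by_key:
            updated = dict(row)
            # keep the index PowerApps assigned to this row
            updated[index_field] = by_key[row[key]][index_field]
            merged.append(updated)
        else:
            new = dict(row)
            new[index_field] = next_index  # fresh index for a brand-new row
            next_index += 1
            merged.append(new)
    return merged  # rows absent from the dump are intentionally dropped
```

Writing the merged rows back to the Excel file (e.g. with a library like openpyxl) is a separate step; the point here is only that matching on the unique key lets you update in place without disturbing the existing indexes.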
I noticed you are using the Selenium WebDriver to obtain data from your customer's closed system. I've used Selenium before too and thought it was pretty cool! Do you work full-time as a programmer? Did you also use Selenium + Python?
Please click "Accept as Solution" if my response helped to solve your issue so that others may find it more quickly. If you thought the post was helpful, please give it a "Thumbs Up."
Never wrote in Python, but I've used Selenium with VBScript.
And yes, I have another field called ROE that serves as a unique identifier for each row.
Let me know if you have some time to chat; I have a WebEx link I can send your way.
When using the OneDrive connector, though, I noticed it only pulled in the first 500 rows. I have already set the limit to 2,000 in PowerApps; do I need to switch to collections now?
To me it sounds like you are now problem-solving two issues:
#1 How can I sync the closed system's data dump with the Excel file?
Let's put this question aside for a moment.
#2 How can I work with large datasets within PowerApps?
I can tell you've already started to notice some of the issues encountered when using large datasets. If the data source is expected to grow beyond 500 rows, I recommend storing the data elsewhere. Yes, you can increase the limit to 2,000 rows, but this feature is considered experimental, which means it can radically change or disappear at any time. In my opinion, I would not want to use this in a solution for a client unless they fully understand the risks.
Your next idea was to collect all of the results in order to bypass the 500-row limit. You can load data this way, but I suspect the performance of your app will suffer as a result. This is why PowerApps has a set row limit: to make sure the user experience is good.
My recommendation would be to use a data source that supports delegation, such as a SharePoint list, instead. Delegation allows the data source to do the processing of data, as opposed to PowerApps itself (which is what happens with OneDrive/Excel). Whenever possible we want to do work on the server side (SharePoint), not the client side (PowerApps). How does this help us with the row limit? There are several possibilities: you can use functions like Filter, Search, and LookUp to reduce the data below the 500-row threshold, or you can use pagination to show a smaller set of results and let the user browse to the previous/next page. Some controls, such as the Gallery, even support delegation by loading rows into the app only as they are needed, though most controls don't.
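The pagination idea is language-agnostic; here is the principle sketched in Python (not PowerApps syntax), slicing the data into pages so the app only ever handles one page at a time:

```python
def get_page(rows, page, page_size=500):
    """Return one page of rows; page numbers start at 1.

    page_size=500 mirrors the default row limit discussed above.
    """
    if page < 1:
        raise ValueError("page numbers start at 1")
    start = (page - 1) * page_size
    return rows[start:start + page_size]


def page_count(rows, page_size=500):
    """How many pages the dataset spans (at least 1, even when empty)."""
    return max(1, -(-len(rows) // page_size))  # ceiling division
```

In an app you'd keep the current page number in a variable and wire previous/next buttons to decrement/increment it, fetching only that slice from the data source each time.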
Hopefully sharing these ideas was helpful to you. The main principle here is to load only the minimum amount of data your app needs and let your data source do the heavy lifting.
SharePoint (nightmare). I can sync data from SharePoint to Excel, but not the other way. This is the missing functionality that would take SharePoint from "why do I need to use this?" to "WOW! this is great!!!"
There has got to be a better way.
When it comes to PowerApps and data sources, it all comes down to licensing costs and what the client is willing to pay for. Yes, there are better data sources: an Azure SQL Database comes to mind. But that's considered a premium connector, it's going to cost $10 per user per month, and there will also be consumption-based costs for SQL. To keep costs to a minimum you basically have 3 "free" options: OneDrive/Excel, SharePoint lists, or Common Data Service. Best to start there and work with what's available before considering other options.
Anyways, that's just my opinion. If you do find a better way please make sure to share it with me. I really mean that because I want to build the best apps possible too!