I have had some experience now with moderate-sized databases. I have a situation with a very large list: currently 80,000 rows, and it will grow by 10,000 every year. I want this app to be able to do basic searches on the entire list. Nothing ultra-complex, just lookups by individual ID, by date range, by a match in one column, or by progressive matches across columns.
I have used SQL (on-prem) and SharePoint Online. To me, SQL seems much better suited for large lists, though on-prem can be a challenge. But isn't SQL also more flexible with regard to delegation?
I'm curious to hear some suggestions.
Yes! I was sure there was a way. I found the 'old' tool and ingested 87,000 lines in one shot from the Excel spreadsheet. It took only a couple of hours, with no errors or interruptions! I'll post a how-to here in a bit.
Are there any limitations on the number of rows I can read from Excel in Power Automate?
Let's assume I have 50K rows in Excel (my spreadsheet will have more than this on the first data dump). How will it work, and are there any limitations I will face?
I'm not sure what the ins and outs are with Excel in a flow; I don't use Excel all that often, and when I do, it's usually with a small set of data.
I believe there is a limit; the default is around 255 rows or so. The action can be set to paginate, and the amount per page can be increased, but I'm not sure of the mechanics of it.
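As a toy illustration (not connector code), here is how paging through a large table breaks down into per-page requests. The page size of 256 is an assumption based on the limit mentioned above; the connector's actual default and maximums may differ.

```python
# Hypothetical sketch: reading 50,000 rows in fixed-size pages.
# PAGE_SIZE is an assumed default; check the connector's own settings.
PAGE_SIZE = 256
TOTAL_ROWS = 50_000

def paged_ranges(total_rows, page_size):
    """Yield (start, end) row ranges, one per page request."""
    for start in range(0, total_rows, page_size):
        yield start, min(start + page_size, total_rows)

pages = list(paged_ranges(TOTAL_ROWS, PAGE_SIZE))
print(len(pages))            # number of round trips needed
print(pages[0], pages[-1])   # first and last page ranges
```

Raising the page size reduces the number of round trips proportionally, which is why pagination settings matter for large data dumps.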
I created a sample 80,000-row database and imported it into both SQL and SharePoint. I will say the import went much more smoothly with SharePoint; it ingested the Excel file very readily. I had to wrestle with SQL in many respects. First, there was no fixing the "'Microsoft.ACE.OLEDB.16.0' provider is not registered" issue. This was supposedly resolved here:
I did everything it said, even on multiple PCs and on the server directly. No joy at all. The stand-alone Import/Export Wizard seemed to work, but there was no link to the SQL database... so what's the point?
I ended up fooling with my data file and added an index column, numbered 1-80,000, to keep the data records straight, then did a 'flat file import' of the CSV file. This finally worked, but wow, what a pain in the rump.
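For anyone who wants to automate the same workaround, here is a minimal sketch that prepends a 1-based index column to a CSV before running the flat-file import. The file names and the `RowID` header are placeholders, not anything the wizard requires.

```python
# Prepend a sequential index column to a CSV so each record has a
# stable key before a flat-file import. Paths below are placeholders.
import csv

def add_index_column(src_path, dst_path, index_header="RowID"):
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        header = next(reader)                    # keep the original header
        writer.writerow([index_header] + header)
        for i, row in enumerate(reader, start=1):
            writer.writerow([i] + row)           # 1, 2, 3, ... per record

# Example usage (placeholder file names):
# add_index_column("data.csv", "data_indexed.csv")
```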
Anyway, with a lightly loaded server, I found SharePoint and SQL gave mixed results. Sometimes SharePoint returned results faster, other times SQL did. The notable difference was that SharePoint had much more of an issue with delegation rules: 'in' statements in filters were a problem for some data types, while SQL had no problem at all. So overall, if I had just looked at the delegation rules chart first, I would have settled on SQL for this from the beginning.
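To see why a non-delegable filter bites, here is a toy illustration (plain Python, not Power Apps code) of what happens when a filter can't be pushed to the data source: the app fetches only the first N rows and filters locally, silently missing matches further down the table. The 500-row limit is an assumption based on the app's default non-delegable row limit.

```python
# Toy model of delegation: 80,000 records, only a few of which match.
ROW_LIMIT = 500  # assumed non-delegable row limit

records = [{"id": i, "status": "open" if i % 1000 == 0 else "closed"}
           for i in range(1, 80_001)]

# Delegable: the data source evaluates the filter over all 80,000 rows.
server_side = [r for r in records if r["status"] == "open"]

# Non-delegable: only the first ROW_LIMIT rows come back, then the app
# filters them locally.
client_side = [r for r in records[:ROW_LIMIT] if r["status"] == "open"]

print(len(server_side))  # 80 matches across the full table
print(len(client_side))  # 0 matches -- none fall in the first 500 rows
```

This is why a filter that is delegable against SQL but not against SharePoint can return different (and quietly incomplete) results on the same data.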
However, I just now noticed this Dataverse thing. I have been trying to find its record limits and have not. I am wondering whether it is suited for very large databases; I want a database capable of over 500,000 records.