I have had some experience now with moderate-sized databases. I have a situation where I have a very large list: currently 80,000 rows, and it will grow by 10,000 every year. I want this app to be able to do basic searches on the entire list. Nothing ultra-complex: simple lookups by individual ID, by date range, by a match in one column, or by progressive matches across columns.
I have used SQL (through an on-premises gateway) and SharePoint Online. To me, it seems that SQL is much better suited for large lists. However, the on-prem setup can be a challenge. But isn't SQL also more flexible with regard to delegation?
I'm curious to hear some suggestions.
Delegation is going to be pretty much the same between SQL and SharePoint. There are minor differences, but nothing significant.
What is more significant with SQL is that you can create filtered views of your data and then use those as data sources.
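As a sketch of what such a filtered view might look like (the table and column names here are hypothetical, just to illustrate the idea):

```sql
-- Hypothetical example: expose only the most recent year's records
-- as a view, which the app can then use as its data source.
CREATE VIEW dbo.vw_RecentRecords
AS
SELECT Id, RecordDate, Category, Description
FROM dbo.MasterRecords
WHERE RecordDate >= DATEADD(YEAR, -1, GETDATE());
```

Filtering server-side like this keeps the result set small before the app ever queries it.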
If your filter criteria work against delegable columns, and your result set is under 2,000 records (the same limit applies to any data source), then SharePoint can work fine if you are trying to avoid the on-prem scenario.
Keep in mind that you have to contend with both record limits and delegation. These are two different things. All apps are constrained by the 2,000-record limit: you can never bring back more than 2,000 records at once. However, if you can pre-filter via delegable criteria down to fewer than 2,000 records, then you can post-filter that record set with non-delegable criteria.
So, the above is relatively generic and applies to both SharePoint and SQL.
I hope this is helpful for you.
Actually, those details I did know. What I'm more interested in is which one would give better performance. My gut tells me (as long as I'm operating within the rules you state above) that SQL has the potential for a quicker result. Filtered views are definitely an advantage. In the future, if I decide to archive, I can add a single column to the master table and set it to 0 or 1. Then I can make a view of current vs. archived records. Just a simple example.
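That archive idea could be sketched in T-SQL roughly like this (table, column, and view names are all illustrative assumptions):

```sql
-- Add an archive flag to a hypothetical master table,
-- defaulting existing rows to "current" (0).
ALTER TABLE dbo.MasterRecords
ADD IsArchived BIT NOT NULL DEFAULT 0;

-- A view of only the current records...
CREATE VIEW dbo.vw_CurrentRecords
AS
SELECT Id, RecordDate, Category, Description
FROM dbo.MasterRecords
WHERE IsArchived = 0;

-- ...and a matching view of the archived ones.
CREATE VIEW dbo.vw_ArchivedRecords
AS
SELECT Id, RecordDate, Category, Description
FROM dbo.MasterRecords
WHERE IsArchived = 1;
```

The app would then point at `vw_CurrentRecords` day to day, with the archived view available when needed.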
As far as the ultimate record limits of SharePoint and SQL go, where do we stand?
I actually find that SQL seems a little slower in many of our apps. Just a casual observation though, no definitive proof.
As for the archive aspect, you could achieve this in either SharePoint or SQL as well.
You can have up to 30 million items in a SharePoint list. SQL's limits will depend on the version and edition you are using, but it too is more than ample for your needs.
How about the initial data load, or adding large amounts at once? I find that using SSMS to add tons of data at once is pretty easy. In SharePoint, it seems very slow. I know there is a grid edit mode where you can paste in data, but it seems to work with only a limited number of rows at a time, and it quickly becomes resource-saturated.
I'm going to have several spreadsheets. I know that with either tool I can ingest an Excel file into a new table. I suppose I should do all of my data handling in Excel, then ingest it for the initial load.
It depends on the size of that initial load. SQL is going to be far superior in data management and manipulation.
But if there is an Excel file at the start of this initial load, you could also consider a Power Automate flow to get the data in as needed. It would still be slow to process if there are a lot of rows, but at least it is not a user waiting for it to complete.
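On the SQL side, the initial load can also be done with `BULK INSERT` (or the SSMS Import Wizard), which handles large files quickly. A rough sketch, assuming the spreadsheet has first been saved as a CSV and the path and table name are placeholders:

```sql
-- Hypothetical bulk load of a CSV exported from Excel.
-- Assumes the file is readable from the SQL Server machine.
BULK INSERT dbo.MasterRecords
FROM 'C:\Loads\initial_load.csv'
WITH (
    FORMAT = 'CSV',         -- CSV parsing; SQL Server 2017+ syntax
    FIRSTROW = 2,           -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```

For 80,000 rows this typically completes in seconds, which is hard to match with any row-at-a-time approach into SharePoint.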
If I remember right, there is an import function in SharePoint to create a new list from an Excel file. I'll play with each one and see how they respond. I have 80,000 rows now that I can use to get some practical tests done.
I'll post here for others to enjoy.
That would be valuable information to share with all!!
I would say that is a limitation of the tool, not of SharePoint.