I've searched a lot on Google and can see this error mentioned, but not in relation to Power BI, so I'm wondering how spectacularly I've broken it.
I have a dataflow that pulls in three months' worth of data daily (using incremental refresh) from daily flat files. There are more than a few million rows in each file. This works and updates itself daily.
I pull that into Power BI Desktop, and up to now it has worked with no issues.
I think that because the volume of rows per file keeps increasing, this might be a capacity issue.
I refresh in Power BI Desktop (I'm doing it there to get everything up to date before I publish it and set up scheduled refreshes), and after the refresh completes, all my data is there. However, in the table that imported all that data, I have a few columns that do 'stuff' with DAX for each row. One column is now showing this error when I look at it in the Data view of Power BI Desktop:
A string store or binary store with a compatibility level of '1050' is at the maximum file size of 4 gigabytes. To store additional strings, you can change the StringStoresCompatibilityLevel property of the associated dimension or distinct count measure to '1100' and reprocess. This option is only available on databases with a compatibility level of '1100' or higher. Physical file:. Logical file:.
The DAX is just doing this
The first thing I'd check is to make sure you don't have a really old version of Power BI Desktop. File > About should show what version you're on.
I've seen people having this issue with SSAS but not with Power BI Desktop.
I have just run into this problem of yours... the exact same message after creating a summary DAX table from a very large data table.
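For context, the calculated table that triggered it for me was along these lines (table and column names here are made up for illustration, not my actual model):

```dax
SummaryTable =
SUMMARIZE (
    FactSales,                          -- large imported table (hypothetical name)
    FactSales[Category],                -- group-by column
    "Rows", COUNTROWS ( FactSales ),    -- row count per group
    "Total", SUM ( FactSales[Amount] )  -- aggregated value per group
)
```

Nothing exotic — just a grouped summary over the big table — which is why the 4 GB string store error surprised me.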
It's very frustrating. The computer has tons of memory available, and all my Power BI settings are good.
My Power BI desktop is updated.
If I find a solution, I'll come back here.
Does your Power BI file have any sort of connection to a SQL database? Even though you're currently using Desktop, and the data isn't coming directly from a SQL database... is there a SQL connection anywhere else in your file or dataflow?
I have not solved anything yet, but it appears the issue stems from the way strings are stored in a SQL Server database. I'm not dealing with a SQL database directly; however, some of the associated data comes from a SQL Server, and perhaps the server is doing some of the calculations, even though I had assumed that was all upstream and not relevant.
Here is the Microsoft Doc I am trying to pick apart: https://learn.microsoft.com/en-us/analysis-services/multidimensional-models/configure-string-storage...
It appears that a setting or parameter can be changed on the server to allow for larger strings. Whatever that means. 🙂
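For what it's worth, my reading of that doc is that on an Analysis Services (Multidimensional) instance the fix is to raise the StringStoresCompatibilityLevel property on the affected dimension (or the partition behind the distinct count measure) and reprocess. Something like this XMLA Alter, with placeholder database and dimension IDs — this is only my interpretation of the doc, not a tested fix, and I can't see an equivalent knob in Power BI Desktop itself:

```xml
<Alter ObjectExpansion="ObjectProperties"
       xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>MyOlapDb</DatabaseID>         <!-- placeholder database ID -->
    <DimensionID>MyBigDimension</DimensionID> <!-- placeholder dimension ID -->
  </Object>
  <ObjectDefinition>
    <Dimension>
      <ID>MyBigDimension</ID>
      <Name>MyBigDimension</Name>
      <!-- 1100 switches to the larger string store format; per the error
           message, it requires database compatibility level 1100 or higher -->
      <StringStoresCompatibilityLevel>1100</StringStoresCompatibilityLevel>
    </Dimension>
  </ObjectDefinition>
</Alter>
```

The error message itself says the object then has to be reprocessed for the new store format to take effect.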