08-18-2022 00:05 AM
Excel: read rows without limits
This flow helps you read excel data rows beyond your limit for any plan.
The data is read in chunks, written to temporary text files, then read back in and processed.
In the "loop get all rows" control, set the number of loops by setting the "is equal to" value to your loop count.
The "Change limits" count should match the "is equal to" value. I could not find documentation on this, but it also controls the number of loops.
To come up with the number of loops, you will need to do some math.
For example, I want to read 10,000 rows at 500 each loop, so that makes 20 loops. 20 x 500 = 10,000.
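The loop-count math above can be sketched in a few lines of Python (the variable names here are mine, not part of the flow; the numbers match the example in the post):

```python
import math

total_rows = 10_000  # rows you want to read overall
chunk_size = 500     # rows read per loop iteration (the "top count")

# Number of loop iterations needed to cover every row.
# ceil() rounds up so a final partial chunk still gets its own loop.
loops = math.ceil(total_rows / chunk_size)
print(loops)  # 20
```

If your total is not an exact multiple of the chunk size, rounding up means the last loop simply reads fewer rows.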
You will also need some math in the list rows present action step.
The top count and skip count control which set of data is read on each loop.
Using the same example of 500 rows per loop, set the top count to 500 and set the skip count with a formula that multiplies the loop index by 500.
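Here is a rough Python sketch of what the top/skip pagination does (the function name `page_window` is hypothetical; in the flow itself you would use an expression on the loop counter):

```python
def page_window(loop_index, chunk_size=500):
    """Return (top, skip) for a zero-based loop iteration,
    mirroring the top count / skip count settings in
    the 'list rows present in a table' action."""
    top = chunk_size               # how many rows to read this pass
    skip = loop_index * chunk_size # how many rows to jump over first
    return top, skip

# First three iterations read rows 0-499, 500-999, 1000-1499
for i in range(3):
    print(page_window(i))
```

The key point is that skip grows by one chunk per iteration while top stays fixed, so the windows tile the table without gaps or overlap.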
The JSON action is where you need to set your own columns and types.
You should be able to figure out the column name and data type structure by looking at the Parse JSON action.
Then you can set your columns and data types and even add more columns.
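To make the columns-and-types idea concrete, here is a small Python sketch (the column names and types are hypothetical placeholders; substitute your own, just as you would in the Parse JSON schema):

```python
# Hypothetical column -> type mapping; replace with your own columns.
columns = {"ID": int, "Name": str, "Amount": float}

def cast_row(raw_row):
    """Cast each value of a raw (all-string) Excel row to its
    declared type, keeping only the columns you listed."""
    return {col: typ(raw_row[col]) for col, typ in columns.items()}

print(cast_row({"ID": "7", "Name": "Widget", "Amount": "19.95"}))
```

Adding a new column is just another entry in the mapping, which parallels adding a property to the JSON schema.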
This flow writes each record into SharePoint; the destination can be changed.
Please post any questions.
Anything else we should know:
Note that your hourly, items-per-5-minutes, and other limits cannot be overcome.
That is why this flow does not use parallelism, which helps keep it under those limits; it also can't use parallelism because of the variables used inside the loops.
Yes, the flow can take a long time to run on large data, yet it achieves the goal of automating it for you while you sleep.
So it is probably best suited for plans where you can only read fewer than 5,000 rows, and for files with fewer than 100,000 rows.
My sample run: 50,000 rows written to SharePoint from Excel.
Thank you for your solution, but when I try to import it I see the errors below.
Please help me with this issue.
Hello, download the new zip file. I had some settings set in error on that action, and they are now cleared.
I believe it should import now.
Thank you for your help; it is working for me.
Haven't tested it, but if it does what you say, I love you. Thanks!
Hi, why does my flow get stuck at around 768 items every time I run it?
I got an error message: The property 'content' must be of type JSON in the 'ParseJson' action inputs, but was of type 'application/octet-stream'.
Thanks for your "Automate"!
After importing it into my environment and modifying some configs in this flow, I can now load almost 20,000 rows into my SharePoint list!
Have you figured out why your run is stuck at 768?
I can only think maybe there is something odd in that row of your data.
If it gets stuck and there is no error, it is probably still running.
Have you let it run until it times out or it finishes?
Does the file created in the "loop get all rows" step have the right extension? It may be getting converted to an actual .xlsx file if you are using that extension.
application/octet-stream means it was converted to a binary file, such as an Excel workbook, and cannot be loaded as plain text.
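A quick way to see the difference: a real .xlsx file is a zip archive (its bytes start with "PK"), while the temporary chunk files should be plain text that parses as JSON. A rough Python check (the helper name `looks_like_json` is mine, just for illustration):

```python
import json

def looks_like_json(raw_bytes):
    """Return True if the bytes decode as UTF-8 text and parse as
    JSON; binary content (e.g. a zip-based .xlsx file) fails one
    of those steps and returns False."""
    try:
        json.loads(raw_bytes.decode("utf-8"))
        return True
    except (UnicodeDecodeError, ValueError):
        return False
```

If a check like this fails on your chunk file, the file was converted to binary somewhere, which matches the octet-stream error above.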