Hello,
I followed the blog post here to download 35k records from a SharePoint list. As part of the final "Apply to each", I need to recombine the 5k-record JSON chunks into a single JSON document.
Unfortunately, with Append to String Variable, I hit the size limit! Any ideas for how to overcome this?
That's a lot of JSON 🙂
Maybe you could write an Azure Function to do that final step? I assume you just want to end up with one big .json file?
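Very rough sketch of what I mean, assuming an HTTP-triggered function (Python) that your flow calls once per 5k chunk and that appends the records to an append blob, so the full 35k set never has to sit in a flow variable. The route name, container/blob names, connection setting and the "records" property are just placeholders for whatever you decide to send from the flow:

```python
# Sketch only -- an HTTP-triggered Azure Function (Python v2 programming model).
# The flow POSTs one 5k chunk per call as {"records": [ ... ]} and the function
# appends those records to an append blob. STORAGE_CONNECTION, the container
# and the blob name are placeholders -- point them at your own storage account.
import json
import os

import azure.functions as func
from azure.storage.blob import BlobServiceClient

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="append-chunk", methods=["POST"])
def append_chunk(req: func.HttpRequest) -> func.HttpResponse:
    chunk = req.get_json().get("records", [])

    service = BlobServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION"])
    blob = service.get_blob_client(container="merged-output", blob="records.jsonl")

    # First call creates the append blob; later calls just add to it.
    if not blob.exists():
        blob.create_append_blob()

    # One record per line (JSON Lines). Each append_block call is capped at
    # roughly 4 MB, so split the chunk further if your records are large.
    lines = "".join(json.dumps(record) + "\n" for record in chunk)
    blob.append_block(lines.encode("utf-8"))

    return func.HttpResponse(f"Appended {len(chunk)} records.", status_code=200)
```

Once every chunk has been posted you end up with one big file in blob storage. I've written it as JSON Lines (one record per line) because that's the easiest thing to append safely; if you really need a single JSON array you could add a small "finalize" step afterwards that wraps the lines.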
Hi @Gristy,
Thanks for the idea. I've tried my hand at Azure Functions but haven't gotten the hang of them yet. Any recipes you can think of to get me started?
Hi,
You can try writing the chunks of data to text files on SharePoint.
I see that you are processing the chunks in a loop, an Apply to each?
If you write the chunks out to files, you can read them back in for processing. You would only need to figure out the JSON parsing when reading them back in.
I do this in a flow where I read about 50k Excel rows and write them to text files, then open each file and process it as JSON.
Here I read the rows, which will not fit in either a string append or an array append.
The files are named dynamically and are also read in dynamically.
Here I read the files back in. The tricky part of the JSON was figuring out where the object, the array, and the items are nested, and how to access them. In my JSON, the top level is the array.
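If it helps to picture that nesting outside the flow designer, here is the same idea as a small Python snippet (the shapes and field names are made up, not your actual data): when the document starts with the array you loop over the parsed result directly, and when the array is wrapped in an object, for example under a "value" property, you have to step into that property first.

```python
import json

# Two made-up shapes a chunk file might come back in.
starts_with_array = '[{"Title": "Item 1"}, {"Title": "Item 2"}]'
wrapped_in_object = '{"value": [{"Title": "Item 1"}, {"Title": "Item 2"}]}'

# Top level is the array: iterate the parsed result directly.
for item in json.loads(starts_with_array):
    print(item["Title"])

# Top level is an object: step into the property holding the array first.
for item in json.loads(wrapped_in_object)["value"]:
    print(item["Title"])
```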