This question is an extension of my previous question. I was able to implement my requirement of writing a large dataset through the Graph API using the "Send an HTTP request" action of the "Office 365 Groups" connector. It worked fine until I had to write an Excel file containing 98 columns and 3000+ rows. Normally this would take several seconds, but in this case the request kept timing out (see the attached image), while inserting only 100 or so records worked fine. So my conclusion was that the request body was too large. One solution, I think, would be to divide my JSON data into several parts and send them via several API calls to write to Excel. How can I achieve this? Or is there a better way to accomplish this task?
Any help would be appreciated. Thanks in advance!
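For illustration, here is a minimal Python sketch of the splitting idea described above: slice the rows into fixed-size batches and send each batch as its own request. The Graph URL, group/item identifiers, and table name are hypothetical placeholders, not taken from the original post; substitute your own values.

```python
# Sketch: split a large row set into several smaller Graph API requests.
# The URL template below is a hypothetical placeholder for a workbook
# "rows/add" endpoint -- fill in your own group, item, and table ids.
import json
from urllib import request

GRAPH_URL = (
    "https://graph.microsoft.com/v1.0/groups/{group-id}/drive/items/"
    "{item-id}/workbook/tables/{table-name}/rows/add"
)

def chunk_rows(rows, size=100):
    """Yield successive slices of at most `size` rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

def post_rows_in_batches(rows, token, size=100):
    """POST each chunk as a separate request body instead of one huge one."""
    for batch in chunk_rows(rows, size):
        body = json.dumps({"values": batch}).encode("utf-8")
        req = request.Request(
            GRAPH_URL,
            data=body,
            headers={
                "Authorization": f"Bearer {token}",
                "Content-Type": "application/json",
            },
            method="POST",
        )
        with request.urlopen(req) as resp:
            resp.read()
```

In a flow, the same chunking can be done with a "Select"/"Skip"/"Take" pattern or an Apply to each over pre-sliced arrays; the sketch just shows the batching logic itself.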
Can you get the first 100 or 200 records each time? If so, split the flow into two: a parent flow that retrieves the records in batches and passes them to a child flow, which processes (POSTs) 100 records at a time.
Please see the Timeout section of the documentation.
Thank you for the reply. Currently I fetch all the data with a single GET and store it in a Compose action. How can I retrieve only a subset of records at a time?
Can't you add an extra filter to your GET request?
Append $top=N (where N is the number of records you want to retrieve) to the URL to filter the returned data so that only the first N entries are returned. You may then need to implement a Do while loop to get the next set of records.
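The $top-plus-loop pattern above can be sketched as follows. This is a hedged illustration, not the connector's actual implementation: `fetch_page` stands in for the real HTTP GET, and the `$skip` parameter (a standard OData companion to `$top`) is an assumption added to advance through pages.

```python
# Sketch: page through a large result set with $top/$skip query
# parameters instead of one huge GET. `fetch_page` is a stand-in
# for the real HTTP call; the loop logic is the point.

def build_page_url(base_url, top, skip):
    """Append $top/$skip so each request returns one page of rows."""
    sep = "&" if "?" in base_url else "?"
    return f"{base_url}{sep}$top={top}&$skip={skip}"

def fetch_all(base_url, fetch_page, top=100):
    """Repeat GETs until a page comes back smaller than `top`."""
    rows, skip = [], 0
    while True:
        page = fetch_page(build_page_url(base_url, top, skip))
        rows.extend(page)
        if len(page) < top:  # short page means we hit the end
            break
        skip += top
    return rows
```

In Power Automate terms, the loop body would be a "Send an HTTP request" action inside a Do until, with a variable tracking the current skip offset.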