We use software from a third-party vendor and we need to capture their data on a daily basis.
Their output is exposed over the web, so we can capture it in Flow via HTTP GET. The path is https://api.SERVER.com/DATA/###/flow?start=0&api_token=PASSWORD, where ### covers a static range of IDs from 100 to 999 and PASSWORD is effectively a static constant.
We are able to build a flow that captures data in JSON format from a specific path, e.g. https://api.SERVER.com/DATA/111/flow?start=0&api_token=PASSWORD.
However, we are having trouble converting this into a loop. The issues are:
(1) concatenating/unioning/appending the JSON files,
(2) or, if converting each JSON data batch individually, incrementally appending multiple rows to Excel,
and (3) properly skipping blank data sets (some of the URLs have no records and return errors).
Could you please help us with building the flow?
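For reference, the loop logic we are trying to reproduce in Flow can be sketched in plain Python. This is only an illustration of the intended behavior, not the Flow itself; `fetch_batch` and `collect_all` are hypothetical names, and the URL template and token are placeholders from the post above.

```python
import json
from urllib.error import HTTPError
from urllib.request import urlopen

# Placeholder endpoint from the question; SERVER and the token are not real.
API_TEMPLATE = "https://api.SERVER.com/DATA/{id}/flow?start=0&api_token={token}"

def fetch_batch(data_id, token, opener=urlopen):
    """Fetch one JSON batch; return [] for blank or erroring data sets."""
    url = API_TEMPLATE.format(id=data_id, token=token)
    try:
        with opener(url) as resp:
            payload = json.load(resp)
    except (HTTPError, ValueError):
        return []  # URL has no record (or bad JSON): skip it, per issue (3)
    return payload or []

def collect_all(token, ids=range(100, 1000), fetcher=fetch_batch):
    """Union all batches into a single list of rows, per issue (1)."""
    rows = []
    for data_id in ids:
        rows.extend(fetcher(data_id, token))
    return rows
```

The `fetcher` parameter is injected only so the loop can be exercised without network access; in Flow, the equivalent would be an "Apply to each" over the ID range with an HTTP action inside.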
Hi @rotator,
Could you please share a full screenshot with details of the configuration of your flow?
For your first question, do you want to concatenate/union/append the JSON files? Please try the "Append to array variable" action to meet that need.
For your second question, could you please explain more about converting each JSON data batch? If you want to insert data into Excel, you could use the "Insert row" action.
For your third question, I'm afraid skipping blank data sets is not supported in Microsoft Flow currently. You could post this idea in the Flow Ideas forum so it might be considered for future releases.
Alice, thank you for your reply. Here is the full flow.
Based on your responses above, it sounds like I may need to adjust the construction of the flow. Could you please give me some guidance on how to change/adjust the blocks?
Right now my flow fails on good data. The schema was constructed by pasting in the exact same sample payload that it currently fails on.