Cram08
New Member

JSON to CSV with column arrangement

Hi everyone,

Please help me out. I'm trying to convert JSON to CSV through Power Apps. I am getting the data into CSV, but my problem is: how can I reorder the columns in the CSV based on the JSON? The output is arranged alphabetically.

Thanks

1 REPLY

@Cram08 

 

You can try a Power Automate Flow that outputs a CSV string of multiple rows of data, in any column order you want, from a JSON input. Here is a custom Flow example I made for you, as follows:

 

[Screenshot: pf23-01.png]

This part above, where it says a,c,b,e,d, is very important: it determines the order of the columns. If you change this part, the column order changes. For example, changing it to a,b,c,d,e means the rows will come out in the order a,b,c,d,e instead.
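For illustration (a quick sketch using the sample test data shown further down in this post, and assuming nothing else in the Flow is changed), a HeadersString of a,b,c,d,e should give output along these lines instead:

a,b,c,d,e
"1","2","3","4","test,comma""andquote"
"11","22","33","44","test,comma2""andquote"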

 

[Screenshot: pf23-02.png]

split(variables('HeadersString'),',')

 

Here we turn our "order" specification into an Array, so we can start iterating over it soon in that specific order.
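For example, if HeadersString holds a,c,b,e,d, this expression should evaluate to the following array (this is the value held by HeadersAsArray, used later in this Flow):

["a","c","b","e","d"]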

[Screenshot: pf23-03.png]

concat(variables('HeadersString'),decodeUriComponent('%0A'))

 

This part places the names of the columns themselves as the first row, then follows that by a line feed.

The reason we need decodeUriComponent('%0A') is to force a line feed at this exact point. When a row has finished, we need a line feed for the CSV to work correctly. Alternatively, you can press Enter on your keyboard inside the editor instead of using decodeUriComponent('%0A'), but I prefer to be more explicit about it, so I use decodeUriComponent('%0A').
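So with the example HeadersString, the string built here (which ends up at the start of finalCSVString, the variable holding the final result) should at this point be just the header line, followed by a line feed:

a,c,b,e,d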

[Screenshot: pf23-04.png]

The above initializes a temporary variable used in a later step.

[Screenshot: pf23-05.png]

 

The JSON is test input data containing two elements (so, if everything is working correctly, the Flow will output the header row plus two additional rows):

 

 

[
  {
    "a": 1,
    "c": 3,
    "b": 2,
    "e": "test,comma\"andquote",
    "d": 4
  },
  {
    "a": 11,
    "c": 33,
    "b": 22,
    "e": "test,comma2\"andquote",
    "d": 44
  }
]

 

The strange values for "e", with the escaped quote and the comma, are there to test certain edge cases, such as: what happens if the data has a comma or a quote in it?
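For example, the value test,comma"andquote should end up in the finished CSV as this single cell, with the comma protected by the surrounding quotes and the embedded quote doubled (which is exactly what the escaping steps further below produce):

"test,comma""andquote"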

 

The order of these elements does not matter, by the way; it is the order of the variable HeadersString that matters (see closer to the top of this post).

 

JSON Schema is:

 

 

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "a": {
                "type": "integer"
            },
            "c": {
                "type": "integer"
            },
            "b": {
                "type": "integer"
            },
            "e": {
                "type": "string"
            },
            "d": {
                "type": "integer"
            }
        },
        "required": [
            "a",
            "c",
            "b",
            "e",
            "d"
        ]
    }
}


Note: you might be able to replace the whole "Parse JSON" step above with just a "Compose" action, by the way; this solution does not actually rely on Parse JSON at all!

[Screenshot: pf23-06.png]

Current item, as shown above, is picked from Dynamic Content, under Apply to Each.

The following expression can also be used instead and would be equivalent to the above where it says Current item:

 

items('Apply_to_each')


Body, above, was picked from Dynamic Content: the word 'Body' under 'Parse JSON' in Dynamic Content.

The equivalent expression, instead of Body, would be this:

 

body('Parse_JSON')


If you choose to use Compose instead of Parse JSON, then just pick the Outputs of that specific Compose under Dynamic Content and substitute it in for Body above.

[Screenshot: pf23-07.png]

The above sets the temporary variable back to empty, so that the row about to be built starts from an empty string.

 

Overview of what comes after the above:

-----------------------

The below is a new, inner Apply to Each (Apply_to_each_2), which takes the headers we specified as its input and iterates over them. It builds up the row data for the current JSON element, so that the CSV ultimately has the row data in the exact order we want.

[Screenshot: pf23-08.png]

 

------------------

 

Now, stepping inside Apply_to_each_2:

 

[Screenshot: pf23-09.png]

Current item is picked from Dynamic Content, under Apply to Each 2.

The following expression is also equivalent to the above:

 

items('Apply_to_each_2')

 

This gives us the current "Key" - in the order as given in HeadersString (which was converted to Array with HeadersAsArray). 
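So with the example HeadersString of a,c,b,e,d, each pass of this inner loop should see the keys in this order: a, then c, then b, then e, then d.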

[Screenshot: pf23-10.png]

items('Apply_to_each')[items('Apply_to_each_2')]

 

This gets the specific "value" stored under our "key". Essentially, we take the Current item from the outer Apply to Each (our current JSON element, as an Object), and then, using the square brackets, we pick out the "key" to access, which is the current item of Apply_to_each_2, the inner loop. This way, we always end up iterating the values in the exact order we want for each element.
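For example, when the outer loop is on the first sample element and the inner loop's current item is c, this expression should evaluate to 3; when the inner current item is e, it should evaluate to test,comma"andquote.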

[Screenshot: pf23-11.png]

replace(string(outputs('CurrentValue')),'"','""')

 

The point of this part: what if our data cell contains a comma? That won't matter if we enclose each data element in double quotes (which we do in the next step). However, what if the data itself contains double quotes? The correct way to handle that is to escape each one with another double quote. Here, any existing double quote is replaced with two double quotes for exactly this purpose. This handles edge cases such as data containing a comma, or even a double quote itself.
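For example, applied to the sample value of "e", the replacement should work out like this (the arrow just shows the evaluated result):

replace('test,comma"andquote','"','""')   ->   test,comma""andquote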

[Screenshot: pf23-12.png]

concat('"',outputs('Escape_all_Quotes_in_CurrentValue'),'"')

 

And then the result is wrapped in double quotes, so that the escaped double quotes, and any commas, are treated as contents of the cell rather than as delimiters.
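Continuing the same example, the cell for "e" should now look like this:

"test,comma""andquote"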

[Screenshot: pf23-13.png]

concat(outputs('Then_Surround_in_Quotes'),',')

 

Finally, a comma is appended to the result, since it is possible that another item follows this one.
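So once the inner loop has finished for the first sample element (assuming each result is appended to tempString, which is what the later steps rely on), the row string should look like this, trailing comma included:

"1","3","2","test,comma""andquote","4",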


~~~~~ Now here we are OUTSIDE Apply_to_each_2 (but still inside the "outer" Apply_to_each) ~~~~~

[Screenshot: pf23-14.png]

We must use this block because we cannot set tempString directly with a reference to itself, so we use Compose to make a copy of it, as above.

[Screenshot: pf23-15.png]

substring(outputs('Compose_-_copy_of_this_variable,_or_else_it_cannot_be_changed'),0,add(length(outputs('Compose_-_copy_of_this_variable,_or_else_it_cannot_be_changed')),-1))

 

Because inside Apply_to_each_2 we always append a comma anticipating another data element, we need to take the last one out of the "row string" at this point, since there are no more elements in this row. The expression above removes that trailing comma, returning a new string.
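With the first sample element, that means trimming the row string from

"1","3","2","test,comma""andquote","4",

down to

"1","3","2","test,comma""andquote","4"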

[Screenshot: pf23-16.png]

concat(variables('tempString'),decodeUriComponent('%0A'))

 

And finally, we append our "row string", followed by a line feed, to finalCSVString, which is the String variable holding our final result.
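After the first pass of the outer loop, finalCSVString should therefore contain the header row plus the first data row, each followed by a line feed:

a,c,b,e,d
"1","3","2","test,comma""andquote","4"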


~~~~~~ Now from this point on we are OUTSIDE the outer Apply_to_each ~~~~~

 

[Screenshot: pf23-17.png]

The above is the final result.


Here is how the final result looks when the Flow is run:

[Screenshot: pf23-18-final.png]

 

So we get the following output:

 

 

a,c,b,e,d
"1","3","2","test,comma""andquote","4"
"11","33","22","test,comma2""andquote","44"


If you copy the above exactly into a text file, change the extension to .csv, and then open it in Excel, you should get the right result.

 

Here is how the output looks when I copy and paste it exactly into a file on my desktop and import it into a program that interprets CSV files, with comma set as the delimiter:

 

[Screenshot: pf23-19-fnltst.png]

That output is exactly right. Notice it even interprets the data for column e correctly, even though it has a comma and a quote inside it.

 

Check if the above helps you.

