    Yuhas
    Frequent Visitor

    [Guide] Better Import CSV + Convert to JSON (without Plusmail or any additional apps)

    ‎05-21-2020 03:35 AM

    Disclaimer: I did not fully write this guide. I found it elsewhere on the internet after some intense Google searching, but the guide there was rather poorly written. I was unable to find the original author; I would like to credit them for exploring this black hole.

     

    I played with it for about 5 hours and finally figured it out. It was truly a pain, but this is going to help my team process a weekly report from ServiceNow. I hope this saves some of you a lot of headaches.

     

    Please note that the zip file is empty; follow the guide instead. I also apologize about the formatting, but the forums did not like how my ordered/unordered lists were set up with code snippets in the HTML.

     

    Order of Actions

    • Manually Trigger Flow
    • SharePoint Get file content
    • Compose (just a general Compose); press Enter/Return on your keyboard ONCE. This counts as a valid newline.
    • Filter Array
    • Select
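    To set expectations, here is a tiny illustrative sample (made-up data) showing what the finished flow produces for a two-column CSV like the Firstname/Lastname example used later in this guide:

    Firstname,Lastname
    Bob,Smith

    becomes, after the Select action:

    [ { "Firstname": "Bob", "Lastname": "Smith" } ]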

     

    1. SharePoint - Get File Content
      1. Pick the Site Address
      2. Pick the File Identifier of your CSV
    2. Add a New Action - "Compose" (Data Operation)
      1. In the INPUT, just hit Enter/Return ONCE; it should just make the box bigger. Nothing else should be here. I was unable to get "\n" or "\r" to work here.
        (screenshot: newline.PNG)
      2. Click the three dots on the right of the header for this action, rename it to "Newline" --- without quotes and CASE SENSITIVE.
    3. Add a New Action - "Filter array" (Data Operation)
      1. In the From section, click in the box
      2. The dynamic content/expression pop-up window should appear.
      3. Click on Expression within the pop-up window
      4. Copy and Paste this into the Expression input (where it says fx) - it might look like it only pasted the last line. Don't worry, it got all of it.

     

     

    json(
      uriComponentToString(
        replace(
          replace(
            uriComponent(
              split(
                replace(
                  base64ToString(
                    body('Get_file_content')?['$content']
                  ),
                  ',', ';'),
                outputs('Newline')
              )
            ),
            '%5Cr',
            ''),
          '%EF%BB%BF',
          '')
      )
    )
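    If the nesting is hard to follow, here is the same expression read from the inside out. This is an annotated breakdown only, using the same #COMMENT convention as further down in this guide; paste the block above as-is, not this:

    base64ToString(body('Get_file_content')?['$content'])  #COMMENT: decode the file content into plain CSV text
    replace( ... , ',', ';')                                #COMMENT: swap the comma delimiter for ';'
    split( ... , outputs('Newline'))                        #COMMENT: break the text into an array, one entry per line
    uriComponent( ... )                                     #COMMENT: serialize and URL-encode that array so stray characters are easy to target
    replace( ... , '%5Cr', '')                              #COMMENT: strip carriage returns, which show up as %5Cr after encoding
    replace( ... , '%EF%BB%BF', '')                         #COMMENT: strip the UTF-8 byte order mark, if your file has one
    uriComponentToString( ... )                             #COMMENT: decode back into a normal string
    json( ... )                                             #COMMENT: parse that string into a JSON array of lines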

     

     

    • Press OK. If you are unsure whether it all pasted, just hover your mouse over the expression inside the "From" field; it should show a little pop-up of the code. Alternatively, you can click the three dots in the upper right-hand corner of the action and choose "Peek code"; it should all show there.
    • Click where it says Choose Value
    • The Dynamic Content window should pop up again.
    • Click on Expression inside the pop up window.
    • Type "item()" in the field where it says fx 

     

     

    item()

     

     

    • Click OK
    • Set the logic to "is not equal to"
    • Keep the third value completely blank! This filters out any blank lines; if blank lines get through, they will break the flow in the Select action, which is next.
    • Add A New Action - "Select" (Data Operation)
      1. Click into the From field
      2. The dynamic content and Expression pop-up window should appear.
      3. Click Expression
      4. Copy and Paste this in the fx field. This is so that it skips your header rows.

     

     

    skip(body('Filter_array'),1)

     

     

    • Click OK
    • The Mapping section is where you will set your JSON values. In my case, I uploaded a simple CSV file that has two columns: Firstname and Lastname. So if I wanted the first column:

     

     

    split(item(),';')[0]

     

     

    If I wanted to change columns, I would change the 0 to 1... or 2... or 3... etc.

    • Each column is represented via the code below:

     

     

    split(item(),';')[0]  #COMMENT: This would be column 1 (remember we start from 0 in programming)
    split(item(),';')[1]  #COMMENT: This would be column 2

     

     

    • As for your "Key," you can define that however you like. I like to stick to the column names for clarity (see the mapping sketch just after this list).
      (screenshot: capture3.PNG)

    • The End Result should look something similar to this when testing:

      (screenshot: end_result.PNG)
    • The Full Picture:
      (screenshot: Capture1.PNG)
      (screenshot: capture2.PNG)
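    For the two-column Firstname/Lastname example, my Select mapping ends up with entries along these lines (the keys on the left are just the property names I chose; use whatever fits your data):

    Firstname : split(item(),';')[0]
    Lastname  : split(item(),';')[1]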
    (attachment: no_files.zip, 9 KB)
    Message 1 of 21
    Yuhas
    Frequent Visitor
    In response to ewchris_alaska

    ‎06-12-2020 08:52 AM

    @ewchris_alaska 

     

    You're absolutely sure everything looks like this?

     

    (screenshot: capture.png)


     
    Message 11 of 21
    ewchris_alaska
    Helper I
    In response to Yuhas

    ‎06-12-2020 08:58 AM

    @Yuhas 

     

    Yes, here is my screenshot.

    (screenshot attachment, 48 KB)
    Message 12 of 21
    Yuhas
    Frequent Visitor
    In response to ewchris_alaska

    ‎06-12-2020 09:06 AM

    When you changed the ";" back, did you click Update on the expression?

    I'd advise directly copying and pasting what I have in the guide. I just ran mine and it worked.

     

    Can you create a new flow with the same steps, and just do it for FirstName,LastName like in the example? Maybe your current CSV data doesn't work with this.

     

    Also, sometimes when updating the expressions, the change will not take even if you click the button. Hover your mouse over the expression to check that it changed.

    @ewchris_alaska 

    Message 13 of 21
    ewchris_alaska
    Helper I
    In response to Yuhas

    ‎06-12-2020 09:34 AM

    @Yuhas 

     

    Still no joy. I have no idea what I am doing wrong. Below is the failed run and then the code.

    (screenshot: process.JPG)

    (screenshot: process1.JPG)

    Message 14 of 21
    Yuhas
    Frequent Visitor
    In response to ewchris_alaska

    ‎06-12-2020 08:10 PM

    UPDATE TO ALL INVOLVED WITH THE THREAD!

     

    I have discovered there is a limit to how large your CSV file can be when using this method. The exact number is unknown at the moment, but it is why @ewchris_alaska and I were having a hard time diagnosing his issue today.

     

    Now... I don't know if this is a "cell" limit or a "character" limit. A CSV file that was 700 lines by 5 columns (3,500 cells) or larger did not work for us; anything below that did work with his cleaned data set (confidential info wiped). It would be interesting if someone could test this with their own large CSV data sets. @prodaptiv-c, this may have been slightly related to your issue.

     

    @DimaDima @prodaptiv-c 

    Message 15 of 21
    Anonymous
    Not applicable

    ‎08-04-2020 08:54 AM

    Thanks. This was very useful.
    I only had to adjust the expression a little bit, removing the replacement of ',' with ';', because my source file already uses ';' as its delimiter and some of its data contains ',' in other contexts.
    5*
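    For anyone else whose file already uses ';' as its delimiter, that change is just the original expression with the inner replace of ',' with ';' taken out; roughly this (untested sketch, action names as in the guide):

    json(
      uriComponentToString(
        replace(
          replace(
            uriComponent(
              split(
                base64ToString(
                  body('Get_file_content')?['$content']
                ),
                outputs('Newline')
              )
            ),
            '%5Cr',
            ''),
          '%EF%BB%BF',
          '')
      )
    )

    The Select mappings can keep using split(item(),';'), since ';' is already the file's own delimiter.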

    Message 16 of 21
    Mattw112IG
    Impactful Individual
    In response to Yuhas

    ‎09-20-2020 11:58 AM

    Has anyone found a way to do this with large CSVs?

     

    Terry

    Message 17 of 21
    SZAR1
    Regular Visitor
    In response to Mattw112IG

    ‎10-05-2020 11:46 AM

    Hi All,

     

    Any updates on processing the large CSV file?

    Message 18 of 21
    Fanatic
    Helper I
    In response to SZAR1

    ‎10-24-2020 04:06 PM

    This idea works well, but I don't use the JSON schema. I'm grabbing the data directly with split(item(),<delimiter>) and then formatting it inside the Select.

     

     

     

    (screenshot: Fanatic_0-1603580657005.png)

     

    Message 19 of 21
    jisherwood1
    New Member

    ‎10-28-2020 08:20 PM

    I've found this to be the simplest way to parse a CSV, but this code, as others have mentioned, is still highly dependent on the crazy things CSVs can do. I imagine that's part of why it's a premium feature right now. Anyway, if you're struggling, here are some tips I found that finally made this work for me.

     

    1. My CSV data came from an email attachment. Use the Get attachment built-in functionality to grab its content easily.

    2. My data needed to be pulled from the contentBytes section of the attachment; you can easily target this with outputs('Get_Attachment_(V2)')?['body/contentBytes'], which replaces body('Get_file_content')?['$content'] in the instructions.

    3. My data had double quotes around it and carriage returns/new lines; this caused the data to fail to convert to JSON properly, and I ended up with one big element containing all of my data with double quotes and \'s all over. Check your output and see if there are lots of backslashes and \r\n in yours, as you need to remove them.

      Example

      ["\Name\",\"Email\"\r\n\"Bob Smith\",\"bob.smith@hotmail.com\"\r\n\"] 
      Instead of 
      [ "Bob Smith, bob.smith@hotmail.com", ]

      To deal with that, you need an extra replace( ... ,'"',''). This targets the double quote " and removes it. Look at how the current code replaces commas with semicolons; you're copying that syntax.

      You also need to change the split to deal with those pesky \r\n's. The one included here doesn't do this. And this gets tricky and weird because of a bug in the way expressions work. To split correctly, you need to do this:

      split( ),'                    < this can be entered in your expression but must have this formatting with the hard return
      ')

      OR

      "@split( ),'\r\n')"      < this needs to be entered directly into the text box, not as an expression double quotes and everything.

      Both of these do the same thing; it took me a long time and lots of research to even find out that this is an issue. But basically, if you do a normal split( ), ',') like the original code shows, my CSV added all those nasty \r\n's rather than ignoring them. Using one of the above uses the hard returns as the delimiter instead of the commas. That, in combination with the quote replace, cleaned up my CSV data well enough to then use this.

    Reading this over again, it's a bit scattershot, but I hope someone finds it useful; I really needed to write this down in one place, as it was a nightmare to make this work. MS needs to just add a parse CSV option; it seems so basic.
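    Putting tips 1-3 together with the original expression, the email-attachment variant comes out roughly like this (untested sketch: Get_Attachment_(V2) is the action name from my flow, the extra replace strips the double quotes, and decodeUriComponent('%0D%0A') is one way to get a literal \r\n to split on instead of the hard-return trick; like the quote stripping above, it still assumes no commas inside quoted fields):

    json(
      uriComponentToString(
        replace(
          replace(
            uriComponent(
              split(
                replace(
                  replace(
                    base64ToString(
                      outputs('Get_Attachment_(V2)')?['body/contentBytes']
                    ),
                    '"',
                    ''),
                  ',',
                  ';'),
                decodeUriComponent('%0D%0A')
              )
            ),
            '%5Cr',
            ''),
          '%EF%BB%BF',
          '')
      )
    )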



    Message 20 of 21
