    import csv

    ‎01-08-2020 11:44 AM

     

    Title: Import CSV File

     

I have created a CSV import 2.0. It is smaller and simpler.

Find it here:

https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/import-csv-with-office-script/td-p/13899...

     

Description: This flow allows you to import CSV files into a destination table.

I have actually uploaded two flows. One is a compact flow that simply drops the JSON-accessible dynamic columns.

In the compact version you can still access the columns easily by using their column indexes: [0], [1], [2], etc.

     

Compact Version:

This step is where the main difference lies. We don't recreate the JSON; we simply start writing our data columns by accessing them via column indexes (in the apply-to-each actions). This cuts the run time by another 30% to 40%.

[Screenshot: 32.PNG]
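For anyone who wants to see the idea outside Power Automate, here is a minimal Python sketch of what the compact version does with each data row. The column layout and the asterisk convention are taken from the description above; the field names and values are made up for illustration.

```python
# Minimal sketch (not the actual flow): the compact version splits each CSV
# line on commas and reads fields by position instead of building named
# JSON properties first.
row = "1001,Widget,4*99"            # '*' stands in for a comma protected earlier

columns = row.split(",")            # the apply-to-each actions loop over arrays like this
order_id = columns[0]               # [0] -> first column
product  = columns[1]               # [1] -> second column
cost     = columns[2].replace("*", ",")   # restore the protected comma when writing

print(order_id, product, cost)      # 1001 Widget 4,99
```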

     

Detailed Instructions: 

** If you need to add or remove columns, you can: follow the formatting and pattern in the "append json string items" step and anywhere else columns are accessed by column index, with 0 being the first column -- for example "variables('variable')[0]", "split( ... )[0]", etc.

Import the package attached below into your own environment.

You only need to change the action I get the file content from, for example SharePoint Get file content or OneDrive Get file content.

The file in my example is located on premises.

     

When you change the Get file content action, any references to it are removed; this is where the file content belongs, though.

[Screenshot: 45.PNG]

     

Please read the comments within the flow steps; they explain how to use this flow.

Be sure to watch where your money columns, or other columns containing commas, fall: this flow replaces those commas with asterisks. When you write the data you need to find the money columns and remove the commas, since the value will most likely go into a currency column.

Also check the JSON step carefully, since it is what lets you access your columns dynamically in the following steps.

You will need to modify the JSON schema to match your column names and types; you should be able to see where the column names and types sit within the properties brackets.
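As a rough illustration only (the column names and types here are invented, so swap in your own), a Parse JSON schema for a three-column file could look something like this:

```json
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "OrderId": { "type": "string" },
      "Product": { "type": "string" },
      "Cost": { "type": "string" }
    }
  }
}
```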

     

In this step of your JSON, notice that my values have quotes because mine are all string type, even the cost.

If you have number types, remove the quotes (around the variables) in this step where the items are appended, and you will probably need to strip the comma out of the money value (replace it with nothing).

[Screenshot: 44.PNG]
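Here is a small Python sketch of those quoting rules, purely to illustrate the idea; the flow itself does this with string expressions in the "append json string items" action, and the column names and values below are invented.

```python
# Sketch of the append step's quoting rules (illustrative only).
# String columns keep their surrounding quotes; number columns drop the
# quotes, and money values also drop the thousands separator.
fields = ["1001", "Widget", "note with a * in it", "1*250.00"]  # '*' = protected comma

product = '"' + fields[1] + '"'                      # string type: keep the quotes
note    = '"' + fields[2].replace("*", ",") + '"'    # string type: put the comma back
cost    = fields[3].replace("*", "")                 # number type: no quotes, comma removed

item = '{"Product": %s, "Note": %s, "Cost": %s}' % (product, note, cost)
print(item)   # {"Product": "Widget", "Note": "note with a , in it", "Cost": 1250.00}
```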

     

    This step will control how many records you are going to process.

[Screenshot: 02.PNG]

The flow is set up to process 200 rows per loop, and this should not be changed; due to the nesting of loops it may otherwise go over the limit.

It will detect how many loops it needs to perform, so if you have 5,000 rows it will loop 25 times.

You should change the count, though. Make sure the count is above the number of loops; the count just prevents the flow from looping indefinitely.
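To make the loop arithmetic concrete, here is a quick sketch of how the numbers work out (illustrative only; the flow computes this with its own expressions):

```python
import math

# The flow processes a fixed batch of rows per loop and works out how many
# passes it needs; the do-until count only guards against endless looping.
rows_per_loop  = 200
total_rows     = 5000

loops_needed   = math.ceil(total_rows / rows_per_loop)   # 25 passes for 5,000 rows
do_until_count = loops_needed + 5                         # keep the count above the loop total

print(loops_needed, do_until_count)   # 25 30
```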

Note: This flow can take a long time to run; the more rows with commas in their columns, the longer it takes.

9-25-20 - A new version of the flow is available; it is optimized and should run 40% to 50% faster.

     

    Questions: If you have any issues running it, most likely I can figure it out for you.

     

Anything else we should know: You can easily change the trigger type to be scheduled, manual, or fired when a certain event occurs.

The CSV file must be text based, as in saved as plain text with a .csv extension (or .txt in some cases).

Note: any file with a .csv extension will probably show with an Excel icon.

The best way to find out whether it will work is to right-click it on your computer and choose to open it in WordPad or Notepad. If you see the data, it will work. If you see different characters, it is an Excel-based file and won't work.

An Excel-based file is of type object and can't be read as a string in this flow.

You can easily convert it by saving the Excel file as "CSV UTF-8 (comma delimited)" in the Save As options.

It should also work on an Excel file without table formatting, as long as you convert it to CSV the same way and follow the same extension requirement.

** If the file is on SharePoint, go ahead and save it as CSV UTF-8 and then change the extension to .txt, or SharePoint will force it to open as an actual Excel spreadsheet file.

You may also need to use the .txt extension with other "get file contents" actions besides SharePoint. I know for a fact that on-premises files can keep the .csv extension.
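If you prefer not to eyeball the file in Notepad, here is a rough local check you can run on your own machine (outside Power Automate, purely for diagnosing the file; the file name is hypothetical). It only distinguishes .xlsx workbooks, which are really zip archives, from plain text:

```python
def looks_like_text_csv(path):
    """Rough check: a real .xlsx workbook is a zip archive and starts with
    the 'PK' signature; a plain-text CSV starts with ordinary characters.
    (This does not cover the old binary .xls format.)"""
    with open(path, "rb") as f:
        head = f.read(2)
    return head != b"PK"

print(looks_like_text_csv("orders.csv"))   # hypothetical file name
```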

     

My sample run had 12,000 rows; this sample has 10 columns, but you can have any number of columns.

[Screenshots of the sample run: 10.PNG, 11.PNG, 12.PNG, 13.PNG, 14.PNG, 15.PNG]

     

    csvimport-withjson_09-25-20.zip
    csvimport-compact_10-14-20.zip
Labels:
• Solutions
    Message 1 of 47
    49,541 Views
    9 Kudos
EdHansberry (Kudo Collector)

    ‎01-08-2020 12:24 PM

Can you clarify how to set the connector I open the file from? I have OneDrive for Business set up as a connector, but it doesn't show up in the list. My connector list is blank when I try to set it.

     

    Does it have to be a network path that is accessible through the on-prem gateway?

    Message 2 of 47
    43,836 Views
    0 Kudos
juresti (Continued Contributor)
    In response to EdHansberry

    ‎01-08-2020 12:36 PM

In the step where I open the file from the network, you can replace it with OneDrive.

So I need to update that; it should say "select an action to get the file content with", which could be SharePoint, OneDrive, etc.

You should be able to open the csv file from OneDrive.

     

[Screenshots: 42.PNG, 43.PNG]

    Message 3 of 47
    43,831 Views
    0 Kudos
EdHansberry (Kudo Collector)
    In response to juresti

    ‎01-08-2020 02:04 PM

Right. But I cannot get to the flow. When I import, I have to make two selections in reviewing the package content: the CSV import and the file system connection. The file system connection does nothing. It doesn't recognize OneDrive for Business or any of the SharePoint connectors as a valid file system. Make sense? Or am I totally missing something?

     

[Screenshot: 20200108 13_57_28-.png]

    Message 4 of 47
    43,807 Views
    0 Kudos
JonL (Power Participant)

    ‎01-08-2020 02:25 PM

    Thanks so much for following the format of the example posts! 

    Message 5 of 47
    43,801 Views
    1 Kudo
R3dKap (Community Champion)

    ‎01-09-2020 01:07 AM

    Hi @juresti,

The problem @EdHansberry is facing is the same for me and for everyone who will try to import your flow: you cannot change the resource type used to access the CSV file during the import phase. And since your flow's resource type is a File System connection, anyone who wants to import your flow into their tenant needs to create a File System connection and define a local gateway... which can be a bit harsh.

Maybe if you could provide another version of your flow with a OneDrive or SharePoint connection... 😉

Note to all: attached to this reply you can find a similar flow with a connection to a SharePoint document library where you can put your CSV file.

    Thanks,

    Emmanuel

    ImportCSVtoTABLE_20200109090559.zip
    Message 6 of 47
    43,744 Views
    1 Kudo
juresti (Continued Contributor)
    In response to EdHansberry

    ‎01-09-2020 05:32 AM

I did not realize how the import would work.

I have uploaded a new one using SharePoint as the default.

    Message 7 of 47
    43,726 Views
    0 Kudos
juresti (Continued Contributor)
    In response to R3dKap

    ‎01-09-2020 05:33 AM

OK, I have uploaded a new one using SharePoint as the default start.

Also note that in SharePoint the extension should be .txt after using the CSV UTF-8 Save As, if you had to convert the file.

    Message 8 of 47
    43,725 Views
    0 Kudos
Jsidarta (Advocate I)

    ‎01-09-2020 04:18 PM

    Hi,

     

I have a CSV file that consists of 12 columns; can this script accommodate that?

Update: I received the error below when I tried to upload the csv file, so I guess the answer is no.

    InvalidTemplate. Unable to process template language expressions in action 'append_json_string_items' inputs at line '1' and column '2212': 'The template language expression 'replace(variables('eachDataRow')[3],'*',',')' cannot be evaluated because array index '3' is outside bounds (0, 0) of array. Please see https://aka.ms/logicexpressions for usage details.'.

     

[Screenshot: 200110_114509.png]
    Message 9 of 47
    43,668 Views
    0 Kudos
Accepted Solution

juresti (Continued Contributor)
    In response to Jsidarta

    ‎01-10-2020 05:50 AM

Yes, it can handle any number of columns.

It is not limited to a specific column count.

You just need to build the columns in the "append json string items" step and the JSON schema, and anywhere else columns are read or written.

     

Your "out of bounds" error means that there was nothing to access, such as a blank value or a blank line.
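For anyone hitting the same error: a blank line splits into a single empty field, so an index like [3] does not exist. Here is a Python sketch of the idea (in the flow you would add an equivalent condition before the indexing expressions; the sample rows are invented):

```python
# Sketch: skip blank lines before reading columns by index, which is what
# triggers "array index '3' is outside bounds (0, 0)".
raw_rows = ["1001,Widget,10,4*99", "", "1002,Gadget,5,2*50"]

for row in raw_rows:
    if not row.strip():                      # blank data or trailing newline: nothing to index
        continue
    fields = row.split(",")
    cost = fields[3].replace("*", ",")       # same expression shape as in the error message
    print(fields[0], cost)                   # 1001 4,99 / 1002 2,50
```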

     

    I'll post for you a sample with 12 columns. Or you can provide me a sample.

     

    Message 10 of 47
    43,634 Views
    0 Kudos