Hi,
I have an API which returns text, supposed to be a CSV file, in the form below:
datetime;mois;semaine;joursem;heure;vacance;Text;freQ;Scored Labels
7/15/2017 6:00:00 PM;7;28;7;18;1;29,234;67;148,2313385
10/14/2016 4:00:00 AM;10;42;6;4;0;18,922;0;-9,692166328
02/04/2017 12:00;2;5;7;12;0;9,239;0;39,99219513
05/11/2017 05:00;5;19;5;5;0;17,421;0;1,262338638
10/01/2016 13:00;10;40;7;13;0;22,333;2;-0,870968521
11/20/2016 6:00:00 AM;11;48;1;6;0;11,83;0;-13,13813114
10/18/2016 4:00:00 PM;10;43;3;16;0;20,529;42;46,49481583
2/23/2018 9:00:00 AM;2;8;6;9;0;1,231;0;1,8540411
01/05/2017 05:00;1;1;5;5;1;6,426;0;0,300328046
I don't need to save a CSV file; I need to transform this into a table so I can save 2 specific columns to a SharePoint list.
I tried the Create CSV table action, but I can't make it work.
Thank you for your help.
The short answer is that there is not currently a CSV parsing function in Flow. The somewhat longer answer is that, depending on how reliable your CSV content is, it may be possible. You'd need to write some loops to create an array of JSON objects that contain your data... then you could parse through the JSON more or less normally.
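To make the loop idea concrete, here is a minimal sketch in Python (not Flow expressions; it just illustrates the logic the Flow loops would reproduce). The sample text and column names come from the question above; which two columns go to SharePoint is my assumption, since the question doesn't name them.

```python
# Sketch: turn semicolon-delimited text into an array of JSON-style objects,
# mirroring the "loop over lines, then pair up fields" approach described above.
raw = (
    "datetime;mois;semaine;joursem;heure;vacance;Text;freQ;Scored Labels\n"
    "7/15/2017 6:00:00 PM;7;28;7;18;1;29,234;67;148,2313385\n"
    "10/14/2016 4:00:00 AM;10;42;6;4;0;18,922;0;-9,692166328"
)

lines = raw.strip().split("\n")
headers = lines[0].split(";")

# Outer loop: one record per line. zip() pairs each field with its header,
# producing one dict (a "JSON object") per row.
rows = [dict(zip(headers, line.split(";"))) for line in lines[1:]]

# Pull out two specific columns for the SharePoint list
# (the column choice here is illustrative only).
for row in rows:
    print(row["datetime"], row["Scored Labels"])
```

In Flow the equivalent would be an Apply to each over `split(body, '\n')`, splitting each line on `;` inside the loop. Note this naive split breaks if a field ever contains the delimiter itself.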
Thank you for your help.
Can you give me an example of this loop, and also an example of the code?
Thank you.
All the answers on this topic are so much more complicated than they should be!
We have an Import-Csv command in PowerShell that takes CSV data and makes an object... there should be something that works just as easily in MS Flow/Logic Apps.
So frustrated that MS has not yet fixed this.
You can use the Parse CSV action from the Plumsail Documents connector. It allows you to convert CSV into an array, with variables for each column. Please read this article demonstrating how it works.
Once you have parsed the CSV file, you can iterate through the result array and insert the specific column values into SharePoint as you wanted.
This would probably work, but I don't think it's reasonable to spend over $300 a year for something that should be a base option.
@anton-khrit wrote:You can use Parse CSV action from Plumsail Documents connector. It allows you to convert CSV into an array and variables for each column. Please read this article demonstrating how it works.
Once you have parsed the CSV file, you can iterate through the result array and insert the specific column values into SharePoint as you wanted.
I don't have access to the destination forum, but I need this solution... Could the solution be duplicated in full here?
I still can't believe that Microsoft has not made parsing a CSV part of the basic options/actions available in Power-Automate/Flow. I've been advocating for this for literally years now. C'mon MS!! This is a "standard" OOB function in almost all scripting languages, including PowerShell! This is also one of the oldest and most common "data exchange" formats.
Hi,
I am trying to parse a CSV and filter it with Power Automate in order to distribute the right document to the right person. I am interested in your process. Could you share the flow itself, not just the result, so I can understand which actions you are using within it?
Thanks !
+1
I'm also curious about the solution, but I get hit with permission errors when accessing this link:
https://powerusers.microsoft.com/t5/Flow-Cookbook/Convert-CSV-data-into-json/m-p/162034#M93
I tried viewing it and got the same error.
I’ve done a video on it which you may find helpful:
Thanks, Paulie,
Again, the issue here is not that there is no way to coerce Flow into doing what you need; it's that this functionality should be built into PowerAutomate the same way it's baked into every iteration of PowerShell, i.e. 'Import-Csv'.
It's unconscionable to expect businesses to have to pay a third party for a connector to do this.
Your solution is fine for small CSVs of normalized data, and is fairly similar to what others (me included) have suggested. But what happens if you don't know your column headers? What happens when your data fields could contain the delimiter string itself? Now you have to build all of that conditional logic into the flow. What if, in addition, your data has a thousand, or ten thousand, rows? We're talking about a monstrous amount of time spent in a single PowerAutomate run, just to parse a silly CSV that would take 10 seconds in PowerShell.
MS needs to build this functionality into PowerAutomate.
ok... rant over... 🙂
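For comparison, this is what a real CSV parser buys you. A short Python sketch (the standard `csv` module standing in for the missing built-in action) handles both edge cases mentioned above, unknown headers and a delimiter inside a quoted field, with no conditional logic at all:

```python
import csv
import io

# Sample with headers not known in advance, and a quoted field that
# contains the delimiter itself -- the two failure modes described above.
raw = 'id;comment;score\n1;"hello; world";42\n2;plain;7\n'

reader = csv.DictReader(io.StringIO(raw), delimiter=";")

# Headers are discovered from the first row, not hard-coded.
print(reader.fieldnames)        # ['id', 'comment', 'score']

rows = list(reader)
# The quoted delimiter did not split the field:
print(rows[0]["comment"])       # hello; world
```

Replicating that quoting logic with Flow's `split()` expressions is exactly the "monstrous" conditional work described above.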
Nearly all of your points are totally valid. It makes me think I should do a video on using an Azure function that uses ConvertFrom-Csv.
The method I used made use of no variables, so you can use concurrency to make it much faster than the methods that do use variables. You could set concurrency to 50, but 10,000 rows would still take some time.
This is exactly my point as well. I know it "can be done", but the steps involved are just ludicrous for something that's so simple in most languages (e.g. PowerShell, Python, PHP) and, I'm sure, is an extremely common need.
Another issue with concurrency is that, depending on your subscription level, you're likely to get rate limited. Again, for a simple function like parsing a CSV, this seems backwards... I understand why rate limits exist, but with larger data sets you're very likely to hit them.
Just in case anyone else is browsing this... There is a feature request/idea for this topic... Please vote!!
https://powerusers.microsoft.com/t5/Power-Automate-Ideas/Text-CSV-file-get-rows/idc-p/778837#M23876
Thanks Paulie78 - this definitely helps, as my use case involves consuming multiple CSV files that each have fewer than 50 records.
But I do agree with tutankh: Microsoft should provide this feature as a built-in action.
@pedwin In addition to the video, I've written up the process for parsing the CSV on my blog: How to Parse CSV files with Power Automate
You can download my flow from there and import into your environment to get you started quickly.