Hello there!
I'd like to ask whether any of you have run into this: you work with a CSV file and get content like this:
["��i\u0000d\u0000\t\u0000c\u0000r\u0000e\u0000a\u0000t\u0000e\u0000d\u0000_\u0000t\u0000i\u0000m\u0000e\u0000\t\u0000a\u0000d\u0000_\u0000i\u0000d\u0000\t\u0000a\u0000d\u0000_\u0000n\u0000a\u0000m\u0000e\u0000\t\u0000a\u0000d\u0000s\u0000e\u0000t\u0000_\u0000i\u0000d\u0000\t\u0000a\u0000d\u0000s\u0000e\u0000t\u0000_\u0000n\u0000a\u0000m\u0000e\u0000\t\u0000c\u0000a\u0000m\u0000p\u0000a\u0000i\u0000g\u0000n\u0000_\u0000i\u0000d\u0000\t\u0000c\u0000a\u0000m\u0000p\u0000a\u0000i\u0000g\u0000n\u0000_\u0000n\u0000a\u0000m\u0000e\u0000\t\u0000f\u0000o\u0000r\u0000m\u0000_\u0000i\u0000d\u0000\t\u0000f\u0000o\u0000r\u0000m\u0000_\u0000n\u0000a\u0000m\u0000e\u0000\t\u0000i\u0000s\u0000_\u0000o\u0000r\u0000g\u0000a\u0000n\u0000i\u0000c\u0000\t\u0000p\u0000l\u0000a\u0000t\u0000f\u0000o\u0000r\u0000m\u0000\t\u0000m\u0000e\u0000l\u0000y\u0000i\u0000k\u0000_\u0000p\u0000o\u0000z\u0000�\u0000c\u0000i\u0000�\u0000r\u0000a\u0000_\u0000s\u0000z\u0000e\u0000r\u0000e\u0000t\u0000n\u0000e\u0000_\u0000j\u0000e\u0000l\u0000e\u0000n\u0000t\u0000k\u0000e\u0000z\u0000n\u0000i\u0000?\u0000\t\u0000f\u0000u\u0000l\u0000l\u0000_\u0000n\u0000a\u0000m\u0000e\u0000\t\u0000e\u0000m\u0000a\u0000i\u0000l\u0000\t\u0000p\u0000h\u0000o\u0000n\u0000e\u0000_\u0000n\u0000u\u0000m\u0000b\u0000e\u0000r\u0000","\u0000l\u0000:\u00004\u00001\u00005\u00008\u00004\u00002\u00001\u00009\u00003\u00007\u00006\u00001\u00007\u00004\u00001\u00002\u0000\t\u00002\u00000\u00002\u00001\u0000-\u00001\u00000\u0000-\u00001\u00006\u0000T\u00000\u00009\u0000
I get \u0000 between the letters, and all Hungarian accented characters (áéíóöőúüű) come out as the � symbol.
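For what it's worth, this looks like classic UTF-16 damage: the "��" at the start of the sample is the FF FE byte-order mark, and the \u0000 characters are the high bytes of UTF-16 code units being read as if the file were a single-byte encoding. A minimal Python sketch of the effect (Python here is only to illustrate the mechanics, not part of the flow):

```python
# Minimal illustration: UTF-16 bytes decoded with the wrong codec produce
# exactly the symptoms described above (interleaved NULs, mangled accents).

sample = "id\tfull_name\táéíóöőúüű"

# Bytes as a UTF-16 export writes them, BOM (FF FE) included.
raw = sample.encode("utf-16")

# Misreading the bytes as a single-byte encoding keeps every NUL high byte
# and turns the BOM into two junk characters:
wrong = raw.decode("latin-1")
print("\x00" in wrong)        # True

# Decoding with the correct codec recovers the text, accents and all:
right = raw.decode("utf-16")
print(right == sample)        # True
```

So the fix is always some form of "decode the raw bytes as UTF-16" rather than trying to scrub the damage out of an already-wrongly-decoded string.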
I'd like to use Compose actions to convert the data to JSON, as in this video: https://www.youtube.com/watch?v=sXdeg_6Lr3o
But I can't get past this problem: even replace() doesn't remove the '\u0000' parts.
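One possible reason replace fails (an assumption about what's happening, not a confirmed diagnosis of this flow): typing '\u0000' into the expression matches the six literal characters backslash-u-0-0-0-0, not the actual NUL character, so there is nothing to replace. The same trap is easy to reproduce in Python:

```python
# The string "\\u0000" (backslash, 'u', four zeros) is six characters;
# the actual NUL character is "\x00". Replacing one does nothing to the other.

text = "i\x00d\x00"                   # what the flow actually contains

# Replacing the six-character literal leaves the NULs untouched:
print(repr(text.replace("\\u0000", "")))   # 'i\x00d\x00' unchanged

# Replacing the real NUL character works:
print(text.replace("\x00", ""))            # id
```

In a Power Automate expression, one way to get the real character rather than the literal text is decodeUriComponent('%00'), e.g. replace(..., decodeUriComponent('%00'), '') — a suggestion I haven't tested against this flow, and it still won't repair the � characters, which were lost at decode time.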
This is how my flow looks. I just want the data from the CSV to appear normally, without the � symbols and the '\u0000' parts, but I don't know what I'm doing wrong.
I appreciate any help!
Hello @gaborszollosy ,
I was recently dealing with the same issue, and in the end I used a workaround. Instead of the conversions I stored the .csv as a .txt file in SharePoint, and when I used 'Get file content' on the .txt file it was formatted as expected.
Great idea, thanks! So basically, if I load a CSV from an attachment, create a .txt file on SharePoint with its content, and use that for the rest of the flow, do you think it would work?
Yes, if the .csv is a SharePoint item attachment, then that's exactly how I used it.
Does the more direct base64ToString(Attachment Content) expression not work for this?
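If I understand the failure mode right (my assumption, not something verified against this flow), base64ToString() decodes the attachment bytes as UTF-8, so a UTF-16 file still comes out with the NULs and � symbols. In Python terms:

```python
import base64

# A UTF-16 attachment body, base64-encoded the way a connector hands it over.
payload = base64.b64encode("áé".encode("utf-16"))

# Decoding the bytes as UTF-8 (what a base64ToString-style helper assumes)
# yields replacement characters instead of the accents:
as_utf8 = base64.b64decode(payload).decode("utf-8", errors="replace")
print("\ufffd" in as_utf8)    # True

# Decoding the same bytes as UTF-16 recovers the text:
as_utf16 = base64.b64decode(payload).decode("utf-16")
print(as_utf16 == "áé")       # True
```

Which would explain why the .txt-in-SharePoint workaround helps: 'Get file content' on the text file evidently picks up the encoding correctly, whereas a UTF-8-only base64 decode cannot.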
Also if anyone wants to parse a CSV with commas in the data, feel free to use the template here: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/CSV-to-Dataset/td-p/1508191