I've developed a solution to read in tab-delimited files.
These sources really helped me figure it out, especially how to get a tab character.
Sources:
http://techcarpenter.blogspot.com/2018/02/building-json-objects-in-microsoft-flow.html
Note: If your content is CSV (comma separated) and it has commas inside the fields, you should follow my other post instead (linked in a reply below).
I would also suggest tab delimited over CSV; you will have fewer issues, and Excel also provides an option to save as tab delimited.
This is how I imported the tab-delimited file and wrote it to a SQL table.
1. Get the file content (tab delimited in this example).
In the next two steps I set up my variables for the new line and tab characters, which I use to parse out the data.
Note: to get the new line and tab characters, I typed them in Notepad and copied them into Flow; they will appear as blank space.
2. Set the tab character as a variable of type String and set its value by copying a tab from Notepad.
3. Set the new line ("enter") character as a variable of type String and set its value by copying a return from Notepad.
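If copying the invisible characters out of Notepad proves unreliable, a common alternative (just an option, not part of the original steps) is to set the variables with an expression instead:
decodeUriComponent('%09') for the tab character
decodeUriComponent('%0A') for the new line character
If your file was saved with Windows line endings, you may need decodeUriComponent('%0D%0A') for the new line variable instead.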
The next steps cover setting up the array that will hold all the data rows and the array that will hold the columns of each row.
4. Set a new variable of type Array. This variable will be used to read each column in each row.
5. Set a new variable of type Array. This variable will hold the file content, which we break into rows, and we go ahead and set its value here. This is where we break up the file content into rows.
Notice the value: we use skip to skip the header row if you have one, and we use the new line variable as the split character. This makes each row an item in the array.
So the value looks like this:
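A sketch of that expression would be something like the following (the action name Get file content and the variable name newLineChar are placeholders; use the names you actually gave yours):
skip(split(body('Get_file_content'), variables('newLineChar')), 1)
This splits the whole file into an array of row strings and then drops the first item (the header row).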
Now that this is all set up, we move on to the final steps, where we loop through the rows, get the columns, and write them to a destination (SQL in this example).
6. Set up your apply to each loop.
Note: for this to work you can't use concurrency control; it must be turned off. That means it may take some time to write a large number of rows.
7. In the apply to each we loop through the data rows.
On each data row, we set the value of our "each data row" array by splitting the row into columns, using the tab as the split character.
The split columns value looks like this:
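As a sketch, assuming the tab variable is named tabChar (as mentioned further down) and item() is the current row inside the apply to each, the split expression would be:
split(item(), variables('tabChar'))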
Now this gives us the row broken up into columns which we can now reference to write the data.
8. Write the data to SQL or another destination. SQL is used in this example.
Note: You could easily modify this step to update records.
Now set up your SQL insert columns by referencing each column of the "each data row" array, which is an array of columns.
You access each column by its index, just like in C#, Java, etc.
So the value of each field will look like this:
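Assuming the array variable is named eachDataRow (that name is just a placeholder for whatever you called your "each data row" variable), the field values would be sketched like this:
variables('eachDataRow')[0] for the first column
variables('eachDataRow')[1] for the second column
and so on for each column in the row.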
My sample run quick view:
SQL query to verify the records were written:
Hello,
I'm glad this is helping users.
I was also having a hard time with it.
Thanks to those links I found, I was able to fill in the remaining pieces and document the complete steps.
I will also post how to update a destination database or table from a tab-delimited / CSV file, which is very similar to this one.
Hi, I've posted a solution on handling CSV files for those interested.
This is a complete solution, meaning it handles commas inside of fields.
https://powerusers.microsoft.com/t5/Flow-Cookbook/read-csv-file-and-write-to-destination/m-p/258524
This is fantastic. I have been searching for a long time for a way of doing this; it seems crazy to me that there isn't a canned way of doing it. Microsoft even told me it is not possible to enter data into a table or database unless it was already in a table. I added one step to deal with empty last rows in my CSV files.
The string expression is just string('')
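One way to express that check, as a sketch (assuming the extra step is a condition inside the apply to each and item() is the current row): only run the insert when
not(equals(item(), string('')))
evaluates to true, so empty trailing rows are skipped.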
Will this work with a file that's in a Sharepoint folder?
Struggling to get from step 1 to step 2 with "Get file content from path"
Hello, I did not realize SharePoint returns the content as "encrypted".
What you need to do is first get the file content of the CSV, then create a file with that content as a .txt.
In the third step, open the .txt content with SharePoint's get file content; then you will see the actual content.
(You should create a blank .txt file with the name you like beforehand so it already exists when you build the flow.)
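As an alternative (this is an assumption about why the content looks "encrypted", not something from the original steps): if what comes back is actually base64-encoded, you may be able to decode it directly with an expression along the lines of
base64ToString(body('Get_file_content')?['$content'])
where Get_file_content is whatever your get file content action is named.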
@juresti thank you so much for posting this...it was exactly what I was trying to do with CDS!
Hi @ahall,
I was doing the same thing and found that the condition inside the 'Apply to each' was really killing performance, so I filtered out the empty row(s) using a Filter array action prior to the loop, which let me remove the condition.
The condition inside the filter is:
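As a sketch (assuming the Filter array's From is the rows array and item() is each row), the advanced-mode condition would be something like:
@not(equals(item(), ''))
which keeps only the non-empty rows before the loop ever runs.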
When I go to Choose an Action I do not see Tab Character as an option. Thoughts?
You look for Variable and pick Set variable as the action. Then you can rename it to Tab Character for ease of reading. The actual variable name is tabChar; you could use whatever you want, just change it where needed when you actually use the variable.