Text (CSV) file - get rows

An action to get rows from a text file (CSV) would be handy.

Status: New
Comments
Kudo Collector

We have a "Create CSV Table" action from an array, but no actions to load, read, transform CSV data which many enterprises processes still requires.

 

Actually, why not use the same powerful "Edit SQL data with Power Query" action and create a new tool called "Edit CSV with Power Query"?  I'm going to create a new idea...

Advocate III

Does anyone know how many 'votes' an idea has to get before it gets noticed and addressed (even if it's denied)?

Frequent Visitor

Great idea, but why limit it to CSV? Just make it a "text extract" action with a specifiable delimiter, maybe even a 1-10 character string, just in case of weird legacy formats. Use whitespace as the delimiter and you get "all the words". Also allow grabbing the whole line with no delimiter at all.
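
For what it's worth, here is a rough sketch of what such a generic "text extract" action could do with those three modes (the function name and behavior are purely illustrative):

function extractRows(text, delimiter) {
    return text.split(/\r?\n/).map(function (line) {
        if (delimiter === undefined || delimiter === null) return [line]; // no delimiter: grab the whole line
        if (/^\s*$/.test(delimiter)) return line.trim().split(/\s+/);     // whitespace delimiter: "all the words"
        return line.split(delimiter);                                     // custom 1-10 character delimiter
    });
}

console.log(extractRows('a|b|c\nd|e|f', '|')); // [ [ 'a', 'b', 'c' ], [ 'd', 'e', 'f' ] ]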

Helper V

Hi!

 

You can use the Parse CSV action from the Plumsail Documents connector. It lets you convert a CSV into an array and into variables for each column. Please read this article demonstrating how it works.

Helper V

Yes, I need this, and from reading here, many, many people do. The idea is 2 1/2 years old and I don't see a response. Very disappointing.

For me, I would need two things:

1. A simple record count for a CSV file

2. A way to get a record count based on a filter, such as counting error records by testing field x for a value (something like the sketch below).
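
To illustrate what those two would amount to once the CSV is parsed into an array of row objects (the field names are made up; the parsing itself is the missing piece this idea asks for):

const rows = [
    { id: 1, status: 'OK' },
    { id: 2, status: 'ERROR' },
    { id: 3, status: 'ERROR' }
];

const recordCount = rows.length;                                   // 1. simple record count
const errorCount = rows.filter(r => r.status === 'ERROR').length;  // 2. count of records matching a filter

console.log(recordCount, errorCount); // 3 2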

Anonymous
Not applicable

@akharns You can do this by setting up a custom Azure function. (You get a million free runs per month.) You can then either set that as a connector or just send a direct HTTP request.

I installed the npm module "csv2json" into the function and then used this as the index.js (where "csvString" is the CSV file passed in as a string... I'll attach an image).

module.exports = async function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');

    if (req.body && req.body.csvString) {
        // Convert the incoming CSV string into a JSON array of row objects.
        const csv2json = require('./csv2json.js');
        const csv = req.body.csvString;

        const json = csv2json(csv, { parseNumbers: true });
        context.log(json);

        // status defaults to 200
        context.res = {
            body: json
        };
    } else {
        context.res = {
            status: 400,
            body: "Please pass data into the body."
        };
    }
    // No context.done() here: an async function completes when its promise resolves.
};
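
For the "direct HTTP request" option mentioned above, the function just needs a JSON body with a csvString property, along these lines (the URL is a placeholder for your own function app and function name):

POST https://<your-function-app>.azurewebsites.net/api/<your-function>
Content-Type: application/json

{
    "csvString": "id,city\n1,Austin\n2,Boston"
}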

[Image: installed npm module with cmd tool]

[Images: Overall Flow; HTTP Request to FunctionApp; Parse Data to JSON Object]

You can skip the "Transform Data From Parser" step. In my case one of the fields was in the "City, State" format, so I needed to remove that comma in order for the Azure function to work correctly.
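
If you hit the same problem, the pre-processing I mean is roughly this kind of thing (an illustrative sketch only, not the exact step from the flow above): strip the comma out of quoted fields before handing the string to the function.

function stripCommasInQuotedFields(csvString) {
    // Drop the commas (and surrounding quotes) inside double-quoted fields,
    // so a naive comma-split parser sees one value per column.
    return csvString.replace(/"([^"]*)"/g, function (match, inner) {
        return inner.replace(/,/g, '');
    });
}

console.log(stripCommasInQuotedFields('id,location\n1,"Austin, TX"'));
// id,location
// 1,Austin TX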

 

From there you can use the "Filter Array" built-in action to filter your data and "length(outputs(your_filter_action))" to return the number of rows that match your criteria.

 

But, yeah, this should just be built into Flow. SO MANY things output CSV files.

Helper V

@Anonymous, thanks for the tip. I'll look into that. I'll have to get a programmer involved, I think. At least I have an option. 🙂

Regular Visitor

Agree!

Advocate V

Handy? I think this is a serious fundamental requirement for PA to serve as an integration tool.

Advocate II

Three years since it was suggested and still nothing??? Seriously... 🙄