jameswiseman
New Member

Comma-Delimiter troubles in Dataverse Table export to Data Lake

In Power Apps/Dataverse I can export tables to a Gen2 data lake via Azure Synapse Link. Once configured, the data arrives happily in my data lake in CSV format.


However, I now find myself subject to the age-old problem of delimiter characters appearing in the data. In this instance, if there's a comma in an individual field's data, then any mapping I have, say from a Copy Data activity, is corrupted.
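To illustrate the corruption (a minimal sketch with made-up sample data, not the actual Synapse Link output): a mapping that splits rows on the raw comma miscounts the fields, while a quote-aware CSV parser keeps an embedded comma inside its field.

```python
import csv
import io

# Hypothetical row where the second field contains a comma; a
# compliant CSV export wraps such fields in double quotes.
raw = 'id,name,city\n1,"Smith, John",Leeds\n'

# Naive splitting on commas miscounts the fields...
naive = raw.splitlines()[1].split(',')
print(len(naive))  # 4 fields instead of 3

# ...whereas a quote-aware CSV parser keeps the field intact.
rows = list(csv.reader(io.StringIO(raw)))
print(rows[1])  # ['1', 'Smith, John', 'Leeds']
```

This suggests the mappings break because the consumer is splitting on the literal delimiter rather than honouring the quoting, which is why configuring the quote character on the dataset (where possible) matters.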


There seem to be Data Flow pipelines within ADF that read and infer the fields correctly, but Data Flows cannot be used in conjunction with a self-hosted integration runtime, which is what I require. I have also seen advice that I should transform the data to a Parquet file first and then export from there, but that would require more Azure resources at the inherent extra cost.


This seems absolutely fundamental, and it would break this Microsoft-prescribed pattern and solution for a large proportion of use cases.


The process for the exported files seems to be a bit of a black box, and I can't see a way of stipulating a custom delimiter as part of it.
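One workaround (a sketch only, using in-memory strings rather than real lake files, and assuming the export's quoting is standard) would be a small re-write step that converts the CSV to a delimiter that never appears in the data, such as a pipe, before the Copy Data mapping runs:

```python
import csv
import io

# Hypothetical sample mimicking a Synapse Link CSV export.
src = io.StringIO('id,name\n1,"Smith, John"\n')
dst = io.StringIO()

# Re-write with a pipe delimiter so downstream mappings that
# split on the delimiter are no longer broken by embedded commas.
writer = csv.writer(dst, delimiter='|')
for row in csv.reader(src):
    writer.writerow(row)

print(dst.getvalue())  # id|name / 1|Smith, John
```

Of course, this still needs somewhere to run (e.g. an Azure Function), so it carries the same extra-resource objection as the Parquet route.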

Does anyone know how I might solve this?


0 REPLIES 0