dazzerd
Helper I

Bulk Updating Data in a Dataverse Table

Hi all, I've come up against a problem.

We recently changed the structure of one of our data tables. This worked perfectly fine. We have basically gone from a table with a smallish number of rows but lots of columns to a table with lots of rows but a smallish number of columns. The new table has approx. 500k rows.

I used to update the data in the smaller table by linking Access to it and writing an update query. Doing this on the larger table "works", but I get intermittent memory issues and the update doesn't always complete.

I know you can update using the "import template" in the advanced section of the environment, using a template like this:

[screenshot: import template with a Row Checksum column]

This works in principle, but for it to work I need the Row Checksum for each row in the table. The template only provides the top 20 rows or so, and since I could be updating any of the 500,000 rows this won't work. I was hoping the checksum was an available field in the Dataverse table, but apparently not: I can't see it in the table or pull it out with Power Query.
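Worth noting: the Row Checksum isn't queryable, but every row's primary-key GUID is, and the GUIDs can be pulled in bulk straight from the Dataverse Web API. A rough Python sketch, with a hypothetical custom table cr123_mytable and a bearer token you would acquire separately (e.g. via MSAL):

    import requests

    ENV = "https://yourorg.crm.dynamics.com"          # hypothetical org URL
    HEADERS = {
        "Authorization": "Bearer <access token>",     # acquire via MSAL / Azure AD
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
        "Prefer": "odata.maxpagesize=5000",           # server-side page size
    }

    def all_record_ids(entity_set, id_column):
        """Follow @odata.nextLink pages until every row's GUID has been read."""
        ids = []
        url = f"{ENV}/api/data/v9.2/{entity_set}?$select={id_column}"
        while url:
            page = requests.get(url, headers=HEADERS).json()
            ids.extend(row[id_column] for row in page["value"])
            url = page.get("@odata.nextLink")         # missing on the last page
        return ids

    # for a custom table cr123_mytable the entity set name is plural and the
    # primary-key column is cr123_mytableid
    ids = all_record_ids("cr123_mytables", "cr123_mytableid")

With the Prefer header capping each page at 5,000 rows, 500k rows comes back in roughly 100 GET requests.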

Any ideas? Or am I missing an easier way to update?

1 ACCEPTED SOLUTION
dazzerd
Helper I

Yup, you can do this: use Power Query to extract the GUID of each record, then Dataflows/Power Automate to update, fairly easily and much quicker than using linked tables. Still need to figure out how to write a null value to a date field, but that's a different problem.
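On the null-to-date problem: I'm not sure how Dataflows handles mapping a null, but over the Web API you clear a field by PATCHing a JSON null to the record. A minimal sketch, reusing the ENV/HEADERS setup from the sketch earlier in the thread and a hypothetical date column cr123_duedate:

    import requests

    record_id = "00000000-0000-0000-0000-000000000000"   # GUID pulled via Power Query
    resp = requests.patch(
        f"{ENV}/api/data/v9.2/cr123_mytables({record_id})",
        headers={**HEADERS, "Content-Type": "application/json",
                 "If-Match": "*"},                       # update-only; never create
        json={"cr123_duedate": None},                    # JSON null clears the field
    )
    resp.raise_for_status()                              # expect 204 No Content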

4 REPLIES
a33ik
MVP

Hello,

I would recommend using something like SSIS/ADF/DataFlows.

dazzerd
Helper I

Thanks, dataflows work using the actual GUID, which I can access. The problem is that the dataflow updates one row at a time, when I need to bulk update 1,000+ in one go. There must be some way to update from a spreadsheet and then import in bulk.
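For the bulk side specifically: the Dataverse Web API supports OData $batch, which packs up to 1,000 sub-requests into a single POST, and the updates inside a change set are applied transactionally. A rough Python sketch, again with hypothetical table/column names and the ENV/HEADERS from the earlier sketch:

    import json
    import requests

    def batch_update(entity_set, updates):
        """updates: {record GUID: {column: new value}} -- at most 1,000 entries."""
        batch, cs = "batch_b1", "changeset_c1"
        parts = [f"--{batch}",
                 f"Content-Type: multipart/mixed; boundary={cs}", ""]
        for i, (guid, cols) in enumerate(updates.items(), start=1):
            parts += [f"--{cs}",
                      "Content-Type: application/http",
                      "Content-Transfer-Encoding: binary",
                      f"Content-ID: {i}", "",
                      f"PATCH {ENV}/api/data/v9.2/{entity_set}({guid}) HTTP/1.1",
                      "Content-Type: application/json", "",
                      json.dumps(cols)]
        parts += [f"--{cs}--", f"--{batch}--", ""]
        return requests.post(
            f"{ENV}/api/data/v9.2/$batch",
            headers={**HEADERS,
                     "Content-Type": f"multipart/mixed; boundary={batch}"},
            data="\r\n".join(parts),
        )

    # chunk the GUID list into groups of 1,000 and send one $batch per chunk
    resp = batch_update("cr123_mytables",
                        {g: {"cr123_score": 1} for g in ids[:1000]})
    resp.raise_for_status()

Each $batch is a single HTTP round trip, so 500k rows is roughly 500 calls instead of 500,000.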

ChrisPiasecki
Super User

Hi @dazzerd,

The export to Excel has a limit of 50k (or 100k?) rows, so in your case the volume is a bit too high to effectively leverage the out-of-the-box export/import.

I believe Dataflows have been significantly improved and will handle all the optimization of batching/bulk updating under the hood.

If you want full control over how the data is loaded, you can use Azure Data Factory or SSIS, as @a33ik suggested.
