Helper III

How to improve upload speed for a large number of records?

We currently have a Canvas app that is used to record data and upload it to D365 FO. We store the records in a collection, then use ForAll() and Patch() to upload them to the database. We are doing performance testing, and our customer wants us to measure the time to upload around 3,000 records. We prepared around 3,700 records and it took the app 43 minutes to upload them to the database, so our customer asked if we can do anything to improve the upload speed. Is there any way to optimize the upload process, excluding improving the internet connection and hardware specification?
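
For context, the current upload logic is essentially one Patch() per record inside ForAll(), along these lines (the table and column names below are placeholders, not our real ones):

// Current approach: one server round trip per record (placeholder names).
ForAll(
    colRecordsToUpload,
    Patch(
        'D365 FO Records',                      // target D365 FO entity (placeholder)
        Defaults('D365 FO Records'),            // create a new record
        {
            ItemNumber: ThisRecord.ItemNumber,  // map collection columns to entity columns
            Quantity: ThisRecord.Quantity
        }
    )
)

Each iteration waits on its own round trip to the service, so the total time grows roughly linearly with the record count.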

1 ACCEPTED SOLUTION

Hi @TiagoFreire ,

 

Based on your suggestion, we did some research and found this article:

https://kurthatlevik.com/2020/04/13/d365-importing-json-data-the-hard-way/

 

This basically shows how to import JSON into D365. So we changed our method: we package the collection into JSON and push it to a Flow, and in the Flow we call a D365 FO action that takes the JSON as input and parses it into the D365 FO database. We tested this with 1,800 records and the whole process took only 7 seconds to finish, which is a significant improvement. So I think it would be great to share this finding.
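
For anyone trying to reproduce this, the app side is roughly the following, assuming a flow named 'ImportRecordsToFO' built with a Power Apps trigger that accepts a single text input (the flow and collection names here are illustrative, not the exact ones we used):

// Serialize the whole collection and hand it to the flow in a single call.
Set(
    varPayload,
    JSON(colRecordsToUpload, JSONFormat.Compact)
);
'ImportRecordsToFO'.Run(varPayload)
// Inside the flow, the JSON is passed to the custom D365 FO action,
// which deserializes it and writes the records to the database.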

 

Cheers


6 REPLIES
Super User II

That record volume seems like it should be handled by an ETL tool instead. Otherwise, the app itself will freeze or create an undesirable experience for users.

 

That being said, we can probably enhance things. What table in FO are you writing to? Are there processes that run whenever those records are created (on the F&O side)?

 

How often are these records sets being created?

 

Do you have access to SQL, online or on-premises? Could we stage the records from the app there, and then let an ETL tool process them into FO?

 

The main concern with a long-running process, or high volume of patches, is that it might fail midway and you'll have very little control over resuming or logging.

Continued Contributor

Maybe you should partition the data into batches and use Power Automate to do the saving process.

Sending around 100 records at a time and looping in Power Automate will give you control over the data upload.

 

A possible working model: 

 

Save the data as a collection in the app. 

Select a portion of the data (100 or whatever) as a second collection.

Convert this second collection to a JSON document, send it to Power Automate in one call, and mark these records as "processing".

Power Automate loops over them and saves them to the database. You can have Power Automate return immediately or wait until it completes (a sketch of the app-side formula follows).
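
A minimal sketch of those steps in app formulas, assuming the collection carries an ID column and a Status column for tracking, and a flow named 'SaveBatchFlow' with a Power Apps trigger (all names here are placeholders):

// 1. Take the next batch of (up to) 100 unsent records.
ClearCollect(
    colBatch,
    FirstN(Filter(colRecordsToUpload, Status = "pending"), 100)
);
// 2. Mark those records as "processing" in the main collection.
UpdateIf(colRecordsToUpload, ID in colBatch.ID, {Status: "processing"});
// 3. Serialize the batch and send it to Power Automate in one call.
'SaveBatchFlow'.Run(JSON(colBatch, JSONFormat.Compact))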

 

You can send multiple requisitions to Power Automate in parallel and they will execute separately.

But Power Automate actions are based on the Microsoft Graph API or some third-party API, and these have a maximum number of calls per minute. Be mindful not to send too many at once, or the API will throttle you.

 

It is safer from a data standpoint to send the data in chunks from Power Apps, because then you can just keep using your app while the batches are processed.

 

You can create additional Power Automate flows or Power Apps logic to check the collection chunks marked as "processing", see whether the save to the database has finished, and then update their state in the collection to "saved".
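
For example (colBatchStatus, the 'CheckBatchStatus' flow, and its saved output are placeholders; the flow would need to be built to report whether a batch has landed in the database):

// Snapshot the batches still marked "processing", then flip the ones the flow confirms.
ClearCollect(colToCheck, Filter(colBatchStatus, Status = "processing"));
ForAll(
    colToCheck,
    If(
        'CheckBatchStatus'.Run(Text(ThisRecord.BatchId)).saved = "true",
        UpdateIf(colBatchStatus, BatchId = ThisRecord.BatchId, {Status: "saved"})
    )
)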

 

@MinhPham96 
I wrote an article describing how to PATCH records much faster than FORALL + PATCH. By patching records simultaneously instead of one-by-one as you are doing, I was able to achieve a 10x improvement in speed. Check it out at the link below.

 

Link to PATCH Multiple Records In Power Apps 10x Faster:

https://matthewdevaney.com/patch-multiple-records-in-power-apps-10x-faster/
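
Roughly, the idea is to include the table's primary key in your collection and submit the whole collection in a single Patch call instead of one Patch per ForAll iteration. A sketch with placeholder names (see the article for the exact formula and any connector caveats):

// Bulk update: one Patch call for the whole collection.
// Each record in colRecordsToUpload carries the table's primary key (RecordId here).
Patch(
    'D365 FO Records',
    ShowColumns(colRecordsToUpload, "RecordId", "Quantity")
)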

 

---
Please click "Accept as Solution" if my post answered your question so that others may find it more quickly. If you found this post helpful consider giving it a "Thumbs Up."

Helper III

Hi @mdevaney , we tried to follow your article, but it seems this is only applicable if the record is already in the database and you want to update its value. In our case we want to create new records; is that possible?

 

@TiagoFreire , we followed your suggestion and it worked really well; using Power Automate to handle the data transmission saves time in the Canvas app. The only problem we have is that our app needs to check data integrity: if we upload 3,000 records, we need to make sure the app knows that 3,000 records have been uploaded, which means we still have to wait for the Flow to finish to get the total response back. But if the Flow handles multiple calls at the same time, the processing time gets longer, so the app has to wait even more for the final response. Any suggestions?
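
For context, our integrity check currently looks roughly like this, assuming the flow ends with a "Respond to a Power App or flow" action that returns the number of rows written as a text output (the flow and output names here are illustrative):

// Wait for the flow's response and compare its reported count with what was sent.
Set(
    varResult,
    'ImportRecordsToFO'.Run(JSON(colRecordsToUpload, JSONFormat.Compact))
);
If(
    Value(varResult.uploadedcount) = CountRows(colRecordsToUpload),
    Notify("All records uploaded", NotificationType.Success),
    Notify("Upload incomplete - please check the flow run", NotificationType.Error)
)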

If you need a long-running process and must confirm that the save succeeded, a concession must be made somewhere: either execution time or app complexity.

 

If you concede on app execution time, you also risk putting the app or data in an inconsistent state if the app crashes or is forcibly closed - unless you use some tricks to preserve state client-side.

 

Power Automate can either return after the processing is done, or return immediately.
You can serialize the save actions, waiting for each batch to return before sending the next. 

Or send them in parallel, returning immediately and sending another.

 

If you return immediately, then to meet the data consistency requirement you stated, your app would need a second Power Automate action that polls the database to check whether each batch has been saved.

 

To preserve the app state client-side, you can use SaveData() and LoadData() to keep the collection state in non-volatile memory and restore the app state if the app is closed and reopened. SaveData() and LoadData() do not work in the browser, though; they only work in the dedicated Power Apps player.
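
For example (the file name is arbitrary):

// After each batch is marked, persist the queue locally so it survives the app closing.
SaveData(colRecordsToUpload, "localUploadQueue");

// In App.OnStart (or a screen's OnVisible), restore it if a saved copy exists.
LoadData(colRecordsToUpload, "localUploadQueue", true)   // true = ignore a missing file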

 

With those, it becomes safer to keep the state of multiple batches as "processing" and mark them as "done" as soon as they finish.

Other than that, you can test different batch sizes to find the optimal one.

 

Another thing to check is whether the database you are using is limited to saving one row at a time from Power Automate, or whether it can insert, update, or upsert data in batches as well within the Power Automate <-> DB logic.


You could also use or write a stored procedure in the DB to help speed it up further.

Beyond that, I don't know of other tricks. Lots of data will always require some processing time.

 

Cheers!
