Hello guys, I'm trying to grab, for each URL and each letter on this website (file attached), the information about company name, contact name, contact number, and field or speciality.
I have a flow that worked for me in the past, but this time I can't grab the correct URL (href attribute) to build my flow and handle the pagination.
Note that we have to click on each item, grab the data, and then go back to the list page.
Could someone have a look and help me with my steps?
thank you in advance
Fred
Hi Fred,
I have moved your post from "General Power Automate Discussion" to "Power Automate Desktop".
One good thing is that if you scroll this webpage to the end, it contains all the A-Z links and therefore pagination is not required.
https://sepem.a-p-c-t.net/annuaire/SEPEM Industries Nord-Ouest/Rouen/2022
- Use Extract data from webpage
How to use this:
Drag this action into the PAD editor -> Double-click to open it -> While keeping this action open, go to the website -> You will automatically get a "Live web helper" -> click Advanced Settings
Add all the entries as shown above including the word "Table" in the dropdown.
html > body > div:eq(2) > div > div:eq(1) > div > div
div:eq(1) > a > h4
div:eq(1) > a
On clicking OK it will show a datatable of Names and Links
Click Finish and go to the PAD editor
- On running the process it will give you a Datatable called DataFromWebpage having 378 rows of all the links
After this:
- Use a "For Each" and loop through all rows of the Datatable
Use "Go to webpage" to navigate to each of the URLs in the above table
- There you grab each of the contact name, number etc by creating UI elements for each of them
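The extract-then-loop flow above can be sketched in plain JavaScript. The HTML string below is a mock modeled on the selectors quoted above (div:eq(1) > a > h4 for the name, div:eq(1) > a for the link), and the regex is only an illustration; a real run would rely on PAD's own extractor, which produces the 378-row DataFromWebpage table.

```javascript
// MOCK page modeled on the selectors above -- the real site markup may differ.
const mockPage = `
<div class="panel"><div></div>
  <div><a href="/exposant/acme"><h4>ACME SAS</h4></a></div></div>
<div class="panel"><div></div>
  <div><a href="/exposant/bravo"><h4>Bravo Industrie</h4></a></div></div>
`;

// Quick regex sketch of the "Extract data from webpage" step:
// capture each panel's link (href) and company name (h4 text).
const rowRe = /<a href="([^"]+)"[^>]*><h4>([^<]+)<\/h4>/g;
const rows = [...mockPage.matchAll(rowRe)]
  .map(m => ({ name: m[2], link: m[1] }));

// The "For Each" + "Go to webpage" part: visit each link and scrape
// the contact details there with UI elements.
for (const { name, link } of rows) {
  console.log(`${name} -> ${link}`);
}
```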
Hi @PAuserFromFranc - I started looking at this, but honestly, it is a very big job to get through.
What I did notice is that each "page" has a direct URL - so you could make a datatable of # and A-Z, cycle through it to build each URL, and go there. Then you can use Extract data from web page to get the table and extract the links from each panel - then cycle through THAT, and scrape the page each link lands on.
Now, how to do that last part...? I am not sure. I am relatively new to Power Automate RPA, I've used Foxtrot / Enablesoft / Nintex RPA in the past, and I am just now learning PAD. I know @VJR is a pro though, I wonder if they can help us figure out the best way to do this? ❤️
I answer questions on the forum for 2-3 hours every Thursday!
You're such an amazing friend @VJR, thank you so much. Let me know if I can ever help here in Paris!
I sort of wish there was a "buy a coffee" button for some of these solution providers ... that was a very good post.
@Rhiassuring @VJR Thank you both! I'm not familiar with the English "buy a coffee" expression, but the coffee in Paris is really good!
So I finally finished my flow, lucky to have had your help, but I've run into another issue that I'm trying to figure out and should solve shortly: scrolling down the page to grab all the links, instead of stopping at the 30th occurrence.
Again Thank you!
Fred
@VJR @Rhiassuring I will need your help after all, guys... I feel so dumb! I can only get 30 instances (30 rows) because, I think, of the style height:auto;overflow:auto; on the list. The page doesn't scroll properly, so I can't grab more than 30 rows, even when I use a JavaScript function to call window.scroll(0, 500) (or an even larger value).
I would like my flow to grab all the items (all URLs from the A companies to the Z companies) that appear while scrolling, 30 instances at a time.
Could you share your thoughts or answers, please?
Thanks / Fred (a boring afternoon spent checking for a solution)
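One note on the window.scroll attempt: a box styled height:auto;overflow:auto has its own inner scrollbar, so scrolling the window does not trigger the list's lazy loading; the container element itself has to be scrolled. A sketch of the idea, meant for PAD's "Run JavaScript function on web page" action (the querySelector selector is a placeholder, not the site's real class):

```javascript
// Scrolling the PAGE (window.scroll) does not move an inner
// overflow:auto list -- set the CONTAINER's scrollTop instead.
function scrollContainerToBottom(el) {
  el.scrollTop = el.scrollHeight;  // jump the inner scrollbar to the end
  return el.scrollTop;
}

// On the real page (placeholder selector -- inspect the site):
//   scrollContainerToBottom(document.querySelector(".list-container"));

// Stand-in element so the sketch runs outside a browser too:
const fakeList = { scrollTop: 0, scrollHeight: 1200 };
scrollContainerToBottom(fakeList);
console.log(fakeList.scrollTop);  // 1200
```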
You don't need to navigate 30 records per page for each letter.
As shown above, all the 378 rows from # to Z end up in this datatable that you can loop through.
See the right hand side where it shows 378 rows.
Hi @VJR, sorry, same topic but another website:
https://global-industrie.com/fr/liste-des-exposants
and here the same issue occurs, only 30 records... if you could help, it would be really appreciated!
Since this thread is already closed, please open a new thread with fresh details.
I, or anyone else available, will be able to assist you with it.