Question

Hello, I need to go through multiple pages, extract a table, and then append it to an excel file. Is there a way to use a loop for this? I don't want to manually capture each table. Thank you

  • September 1, 2022
  • 5 replies
  • 371 views


5 replies

  • Cadet | Tier 2
  • 2 replies
  • September 1, 2022

Use an If statement to check whether the "Next" hyperlink still exists. If it does, click Next and recapture the table.


  • Navigator | Tier 3
  • 20 replies
  • September 1, 2022

Hi @Liam O'Dowd​ 

 

You can use a While loop with an Object exists condition so the bot keeps clicking Next until the button is disabled/hidden, and use Get table from Recorder > Capture to extract each page's table.
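Since the Recorder and Loop actions are configured in the bot designer rather than in code, here is a minimal sketch of the same "capture, then click Next until it disappears" logic in Python with Selenium and pandas, purely to illustrate the pattern. The URL, the table locator, and the "Next" link text are placeholder assumptions, not values from this thread.

# Illustrative sketch only: the loop shape described above, shown outside the bot designer.
from io import StringIO

import pandas as pd
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/report/page/1")  # placeholder start page

tables = []
while True:
    # Capture the table on the current page (the Recorder > Capture > Get table step).
    html = driver.find_element(By.TAG_NAME, "table").get_attribute("outerHTML")
    tables.append(pd.read_html(StringIO(html))[0])

    # The "object exists" check: stop when the Next link is missing or disabled.
    next_links = driver.find_elements(By.LINK_TEXT, "Next")
    if not next_links or not next_links[0].is_enabled():
        break
    next_links[0].click()

driver.quit()

# Write everything to one Excel file (the append-to-Excel step from the question).
pd.concat(tables, ignore_index=True).to_excel("output.xlsx", index=False)

In the bot itself, the same shape is a While loop with an Object exists condition on the Next button, a Recorder: Capture (Get table) action inside the loop, and an Excel append action after the capture.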

 

Thanks,

Prankur


  • Author
  • Cadet | Tier 2
  • 7 replies
  • September 5, 2022

Hi @Prankur Joshi​, thank you for responding.

 

When the bot clicks Next, the Recorder action inside the While loop doesn't update the URL, so it keeps capturing the data from the first page only. I don't see a way to edit the link. Any ideas on how to resolve this?


  • Navigator | Tier 3
  • 20 replies
  • September 6, 2022

Yes @Liam O'Dowd​, this is because the URL may be different/dynamic for every page, so you will have to use wildcards like * or variables so that your capture runs against each page.

 

See this thread

https://apeople.automationanywhere.com/s/question/0D52t00000KVVNqCAP/can-we-scrape-data-from-dynamic-window-title-using-web-recorder-in-taskbot
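For illustration only, the idea behind the wildcard is that one pattern, with * in place of the part of the URL or window title that changes, matches every page instead of one fixed value per page. The URLs and pattern below are made up:

# Hypothetical example of wildcard matching across dynamic page URLs.
from fnmatch import fnmatch

urls = [
    "https://example.com/report/page/1",
    "https://example.com/report/page/2",
    "https://example.com/report/page/3",
]
pattern = "https://example.com/report/page/*"  # * stands in for the changing part
print(all(fnmatch(url, pattern) for url in urls))  # True: one pattern covers every page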


  • Author
  • Cadet | Tier 2
  • 7 replies
  • September 6, 2022

Thanks for your help. I got it figured out!

