I am developing an application to collect data, using Python 2.7 with Scrapy and Selenium on Windows 10. I have used this setup successfully on a few web pages before; however, on the site below I cannot select or click a particular button.
The problem is that I cannot click the button labeled "Search Permits/Complaints".
I have inspected the element with Chrome dev tools and copied its XPath, but to no avail.
Here is the relevant code snippet:
import time

import scrapy
from selenium import webdriver


class PermitsSpider(scrapy.Spider):
    name = "atlanta"
    url = "https://aca3.accela.com/Atlanta_Ga/Default.aspx"
    start_urls = ['https://aca3.accela.com/Atlanta_Ga/Default.aspx']

    def __init__(self):
        super(PermitsSpider, self).__init__()
        # Chrome with a generous implicit wait on every lookup.
        self.driver = webdriver.Chrome()
        self.driver.implicitly_wait(20)

    def parse(self, response):
        self.driver.get(self.url)
        # Extra sleep to let the page finish loading before the lookup.
        time.sleep(15)
        search_button = self.driver.find_element_by_xpath('//*[@id="ctl00_PlaceHolderMain_TabDataList_TabsDataList_ctl01_LinksDataList_ctl00_LinkItemUrl"]')
        search_button.click()
Executing this code results in the following error message:
NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="ctl00_PlaceHolderMain_TabDataList_TabsDataList_ctl01_LinksDataList_ctl00_LinkItemUrl"]"}
Despite adding various sleeps and waits, I have not been able to get past this, even though the page should be completely loaded before the lookup runs. I have also tried other locator strategies, such as link text, without success. It is puzzling that this approach works on some pages but not on others. During execution the WebDriver does launch the page, and it appears fully loaded on my screen.
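For reference, one of the explicit-wait variants I tried looked roughly like this (a sketch from memory; `search_link_xpath` and `click_search_link` are just helper names I am using here, and the element ID is the same one from my spider):

```python
SEARCH_LINK_ID = ("ctl00_PlaceHolderMain_TabDataList_TabsDataList_"
                  "ctl01_LinksDataList_ctl00_LinkItemUrl")


def search_link_xpath(element_id):
    # Build the same ID-based XPath I use in the spider.
    return '//*[@id="%s"]' % element_id


def click_search_link(driver, timeout=30):
    # Selenium imports are kept local so the XPath helper above can be
    # exercised on its own, without a browser session.
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC

    # Explicitly wait until the link is clickable, then click it.
    wait = WebDriverWait(driver, timeout)
    link = wait.until(EC.element_to_be_clickable(
        (By.XPATH, search_link_xpath(SEARCH_LINK_ID))))
    link.click()
```

Even this variant fails for me: the wait simply times out, the same way the plain find_element_by_xpath call raises NoSuchElementException.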
Any assistance on resolving this issue would be highly appreciated. Thank you...