Web scraping returns null in Selenium but the page loads in a normal browser

Question:

Hi, I am trying to get the price of a domain with Selenium, but the page is not loading and I don't know why. I have attached a screenshot; my code is:

import time
import chromedriver_autoinstaller
from selenium import webdriver



chromedriver_autoinstaller.install()  # Check if the current version of chromedriver exists
                                      # and if it doesn't exist, download it automatically,
                                      # then add chromedriver to path

driver = webdriver.Chrome()
driver.get('https://in.godaddy.com/domainsearch/find?checkAvail=1&domainToCheck=bjmtuc.club#')

time.sleep(100)

The website loads in a normal browser, but it does not work in Selenium. I just need the price of the domain (₹149.00), or even the page source would work, because I know how to scrape the source with BS4. I don't know what else to add, but if you need any additional information just comment below.
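For reference, once the page does render, handing Selenium's page source to BS4 could look like the following minimal sketch (the [data-cy='pricing-main-price'] selector comes from the answer below and is an assumption about GoDaddy's current markup):

from bs4 import BeautifulSoup

# Parse whatever HTML Selenium has rendered so far.
soup = BeautifulSoup(driver.page_source, 'html.parser')

# Selector is an assumption about GoDaddy's markup; adjust if the page changes.
price_tag = soup.select_one("[data-cy='pricing-main-price']")
if price_tag:
    print(price_tag.get_text(strip=True))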

Asked By: Gaming Fury


Answers:

It seems the site is detecting that you are running a bot.

As a workaround you can do a couple of things:

  1. use a random user agent
  2. disable Blink features (--disable-blink-features=AutomationControlled)
  3. after the page loads, wait a couple of seconds, then refresh the browser and wait a couple of seconds again

code:

import time

from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.common.by import By
from webdriver_manager.chrome import ChromeDriverManager
from fake_useragent import UserAgent

chrome_options = webdriver.ChromeOptions()

# Use a random user agent so the request doesn't carry the default Selenium one.
ua = UserAgent()
user_agent = ua.random
print(user_agent)
chrome_options.add_argument('user-agent={}'.format(user_agent))

# Hide the automation flag set through Blink.
chrome_options.add_argument('--disable-blink-features=AutomationControlled')

driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()), options=chrome_options)
driver.get('https://in.godaddy.com/domainsearch/find?checkAvail=1&domainToCheck=bjmtuc.club#')

# Wait, refresh, and wait again so the price has time to render.
time.sleep(5)
driver.refresh()
time.sleep(5)

print(driver.find_element(By.CSS_SELECTOR, "[data-cy='pricing-main-price']").text)
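If the fixed sleeps turn out to be fragile, an explicit wait for the price element is an alternative. A minimal sketch (the 30-second timeout is an arbitrary assumption):

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Poll for up to 30 seconds until the price element is present in the DOM.
wait = WebDriverWait(driver, 30)
price = wait.until(
    EC.presence_of_element_located((By.CSS_SELECTOR, "[data-cy='pricing-main-price']"))
)
print(price.text)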

You need to install fake_useragent (and webdriver-manager, if you don't already have it) using:

pip install fake_useragent webdriver-manager

[browser snapshot]

Answered By: KunduK