selenium

Accessing multiple anchor elements inside text

Question: I have the following XPath: //a[@class='product-cardstyles__Link-sc-1uwpde0-9 bSQmwP hyperlinkstyles__Link-j02w35-0 coaZwR']. This XPath finds a lot of anchor tags similar to the following HTML sample: <a href="/produto/10669/acucar-refinado-da-barra-pacote-1kg" class="product-cardstyles__Link-sc-1uwpde0-9 bSQmwP hyperlinkstyles__Link-j02w35-0 coaZwR" font-size="16" font-family="primaryMedium">Açúcar Refinado DA BARRA Pacote 1kg</a>. What I want is not to access its href, …

Total answers: 2
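To read each anchor's visible text rather than its href, find_elements (plural) returns every match and each element's .text property gives the label. A minimal Selenium sketch, assuming a placeholder page URL (the class string is the one from the question):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/products")  # placeholder for the real page

# find_elements (plural) returns every anchor matching the class attribute
links = driver.find_elements(
    By.XPATH,
    "//a[@class='product-cardstyles__Link-sc-1uwpde0-9 bSQmwP "
    "hyperlinkstyles__Link-j02w35-0 coaZwR']",
)
for link in links:
    print(link.text)                   # visible text, e.g. the product name
    print(link.get_attribute("href"))  # the href, if that is needed as well

driver.quit()
```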

Python Scraper Won't Complete

Question: I am using this code to scrape emails from Google search results. However, it only scrapes the first 10 results, despite there being 100 search results loaded. Ideally, I would like it to scrape all of the search results. Is there a reason for this? from selenium import webdriver import time import …

Total answers: 3
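One common reason for this behaviour is that Google serves results in pages of 10, so only the first page's DOM is actually parsed. One hedged approach is to iterate over the start= offset parameter and visit each page in turn (the URL shape is an assumption about Google's query string):

```python
from urllib.parse import urlencode

def result_page_urls(query, total=100, per_page=10):
    """Build one search URL per page of results, assuming Google's
    `start=` offset parameter (adjust for other search engines)."""
    urls = []
    for offset in range(0, total, per_page):
        params = urlencode({"q": query, "start": offset})
        urls.append(f"https://www.google.com/search?{params}")
    return urls

# each URL can then be visited with driver.get() and scraped in turn
pages = result_page_urls("site:example.com contact email", total=100)
```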

Finding an element and clicking on it to navigate to another webpage using selenium

Question: I’m trying to use Selenium to open this webpage (https://app.powerbi.com/view?r=eyJrIjoiMzE2ZDIyN2YtODY1Yy00ZGY0LWE4YTktNDcxOTcwYWQyMjM5IiwidCI6IjcyMmVhMGJlLTNlMWMtNGIxMS1hZDZmLTk0MDFkNjg1NmUyNCJ9) and click on the Tram icon to navigate to the page I want to scrape. This is what I have tried so far: from selenium import webdriver driver=webdriver.Chrome() …

Total answers: 1
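Power BI reports render asynchronously, so the usual approach is an explicit wait until the icon is clickable before clicking it. A sketch, assuming the icon exposes an aria-label of "Tram" (a guess; the real attributes must be taken from the report's DOM):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get(
    "https://app.powerbi.com/view?r=eyJrIjoiMzE2ZDIyN2YtODY1Yy00ZGY0LWE4YTkt"
    "NDcxOTcwYWQyMjM5IiwidCI6IjcyMmVhMGJlLTNlMWMtNGIxMS1hZDZmLTk0MDFkNjg1NmUyNCJ9"
)

# wait for the report to finish rendering before clicking;
# the aria-label locator below is an assumption, not taken from the page
wait = WebDriverWait(driver, 30)
icon = wait.until(
    EC.element_to_be_clickable((By.XPATH, "//*[@aria-label='Tram']"))
)
icon.click()
```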

How to avoid click intercept while sending text to email field using Selenium in headless mode

Question: I want to connect to a website. I wrote the following code: from time import sleep from fake_useragent import UserAgent from selenium.webdriver.support.ui import WebDriverWait as W from selenium.webdriver.support import expected_conditions as E from selenium import webdriver options = webdriver.ChromeOptions() options.add_argument("--start-maximized") …

Total answers: 2
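In headless Chrome, --start-maximized has no effect, so the viewport can collapse to a small default size and an overlapping element ends up intercepting the click. A hedged sketch of the usual fixes: fix the window size explicitly, wait for clickability, and scroll the field into view before typing (the URL and field locator are assumptions):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

options = webdriver.ChromeOptions()
options.add_argument("--headless=new")
# --start-maximized is ignored headless; fix the viewport explicitly so
# the page keeps the same layout as in a visible browser
options.add_argument("--window-size=1920,1080")

driver = webdriver.Chrome(options=options)
driver.get("https://example.com/login")  # hypothetical login page

email = WebDriverWait(driver, 20).until(
    EC.element_to_be_clickable((By.NAME, "email"))  # locator is an assumption
)
# scrolling first avoids another element intercepting the interaction
driver.execute_script("arguments[0].scrollIntoView(true);", email)
email.send_keys("user@example.com")
```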

Selenium Python: How to close the overlay by clicking on the svg element

Question: I am looking for a way to click on the svg cross that closes the overlaying welcome window. I managed to get through login and authorization, but this cross is driving me crazy. Code trials: WebDriverWait(driver, 20).until(EC.element_to_be_clickable((By.CLASS_NAME, "jss109"))).click() WebDriverWait(driver, 20).until(EC.element_to_be_clickable((By.XPATH, "//svg[@class='jss109']"))).click() WebDriverWait(driver, …

Total answers: 1
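The likely culprit is that SVG elements live in the SVG XML namespace, so a plain //svg[...] XPath never matches; the standard workaround is *[local-name()='svg'] (or a CSS selector, which is namespace-agnostic). A small helper sketch:

```python
def svg_xpath(class_name):
    """Build an XPath for an <svg> element, working around the SVG
    namespace that makes a plain //svg[...] expression match nothing."""
    return f"//*[local-name()='svg' and @class='{class_name}']"

# plugs into the question's code as, e.g.:
#   WebDriverWait(driver, 20).until(
#       EC.element_to_be_clickable((By.XPATH, svg_xpath("jss109")))).click()
print(svg_xpath("jss109"))
```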

How to find element by XPATH using variable Selenium

Question: I’m trying to parse some elements using XPath. This is my Python code: driver.get("http://watir.com/examples/shadow_dom.html") shadow_host = driver.find_element(By.XPATH, '//*[@id="shadow_host"]') shadow_root1 = shadow_host.shadow_root shadow_host1 = shadow_root1.find_element(By.XPATH, '/span') which raises: selenium.common.exceptions.InvalidArgumentException: Message: invalid argument: invalid locator. I know that I can use CSS selectors, but I want to use it …

Total answers: 1
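That exception is expected: a shadow root is not a full document node, and chromedriver only accepts CSS selectors when searching inside one, so any XPath locator there raises InvalidArgumentException. A sketch of the working CSS variant of the question's code:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("http://watir.com/examples/shadow_dom.html")

# XPath is fine for locating the host element in the main document
shadow_host = driver.find_element(By.XPATH, '//*[@id="shadow_host"]')
shadow_root = shadow_host.shadow_root

# ...but inside the shadow root, only CSS selectors are accepted;
# By.XPATH here is what triggers InvalidArgumentException
span = shadow_root.find_element(By.CSS_SELECTOR, "span")
print(span.text)
```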

How to web scrape Economic Calendar data from TradingView and load it into a Dataframe?

Question: I want to load the Economic Calendar data from the TradingView link below into a Dataframe. Link: https://in.tradingview.com/economic-calendar/ Filter-1: select data for India and the United States. Filter-2: data for this week. Asked By: Rohit || Source Answers: You can request this …

Total answers: 1
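However the calendar rows are obtained (the linked answer requests the data directly), shaping them for a DataFrame is the easy part. A sketch with hypothetical rows and field names, stdlib only, with the pandas step shown in a comment:

```python
# Hypothetical rows, shaped the way a calendar scraper might return them;
# the real field names come from whatever the page (or its data request) serves.
rows = [
    {"date": "2023-01-02", "country": "IN", "event": "Manufacturing PMI",
     "actual": 57.8, "forecast": None, "prior": 55.7},
    {"date": "2023-01-04", "country": "US", "event": "ISM Manufacturing PMI",
     "actual": 48.4, "forecast": 48.5, "prior": 49.0},
    {"date": "2023-01-05", "country": "GB", "event": "Composite PMI",
     "actual": 49.0, "forecast": 49.0, "prior": 49.0},
]

# Filter-1 from the question: keep only India and the United States
filtered = [r for r in rows if r["country"] in ("IN", "US")]

# then: import pandas as pd; df = pd.DataFrame(filtered)
print(filtered)
```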

How to extract only specific elements by combining "find_all" and "find_elements_by_xpath"?

Question: I want to extract all <span> elements with 'data-test-id' matching '^fdbk-item-.*$' from the link. Furthermore, of those I want only the ones whose text contains mirror, tray, or ceramic, in either upper or lower case. source Using find_all(), retrieving 'data-test-id'='^fdbk-item-.*$' worked. from selenium import webdriver from selenium.webdriver.common.by import By from bs4 import BeautifulSoup import time import …

Total answers: 1
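The two conditions combine naturally as a pair of compiled regexes: one anchored pattern for the data-test-id attribute and one case-insensitive pattern for the keywords. The filtering logic, independent of BeautifulSoup, can be sketched as:

```python
import re

ID_PAT = re.compile(r"^fdbk-item-.*$")
KEYWORD_PAT = re.compile(r"mirror|tray|ceramic", re.IGNORECASE)

def wanted(data_test_id, text):
    """True when the span's data-test-id matches ^fdbk-item-.*$ AND its
    text mentions mirror/tray/ceramic in any letter case."""
    return bool(ID_PAT.match(data_test_id)) and bool(KEYWORD_PAT.search(text))

# with BeautifulSoup this plugs in as, e.g.:
#   for tag in soup.find_all("span", attrs={"data-test-id": ID_PAT}):
#       if KEYWORD_PAT.search(tag.get_text()): ...
print(wanted("fdbk-item-3", "Lovely Ceramic tray"))
print(wanted("other-item", "mirror"))
```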

Scrape Table Data with navigation using Beautiful Soup

Question: I tried downloading PDF files from the website, where they are listed in a table with pagination. I can download the PDF files from the first page, but it does not fetch the PDFs from all of the 4000+ pages. When I tried understanding the logic by observing …

Total answers: 1
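The usual fix is to request every paginated page, not just the first, and extract the PDF links from each. A rough stdlib sketch (the page-parameter URL and the regex-based link extraction are assumptions; a real scraper would parse each page with BeautifulSoup instead):

```python
import re

def pdf_links(html, base="https://example.com"):
    """Pull .pdf hrefs out of one page of the table (regex sketch only;
    BeautifulSoup is the robust choice for real markup)."""
    links = re.findall(r'href="([^"]+\.pdf)"', html)
    return [l if l.startswith("http") else base + l for l in links]

# the key fix for pagination: loop over every page, not just the first
# for page in range(1, last_page + 1):
#     html = fetch(f"https://example.com/table?page={page}")  # hypothetical
#     for url in pdf_links(html):
#         download(url)
print(pdf_links('<a href="/docs/a.pdf">A</a> <a href="x.html">x</a>'))
```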