Cannot suppress PySpark warnings
Question: I am having some issues trying to suppress PySpark warnings, specifically from the pandas-on-Spark API. What I currently have:

```python
import warnings
warnings.simplefilter(action='ignore', category=Warning)
warnings.filterwarnings("ignore")

import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
import pyspark.pandas as ps

%%capture
spark = SparkSession.builder \
    .master("local[32]") \
    .config("spark.driver.memory", "150g") \
    .config("spark.driver.maxResultSize", …
```
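One likely reason the filters above don't help: Python's `warnings` filters only silence `warnings.warn()` calls, while much of Spark's noise comes from the JVM-side log4j logger, which has to be quieted separately (e.g. via `spark.sparkContext.setLogLevel("ERROR")`). The sketch below demonstrates the distinction with the standard library only, using Python's `logging` module as a stand-in for logger-style output, so it runs without a Spark installation:

```python
import io
import logging
import warnings

# Suppress Python-level warnings. This affects warnings.warn() calls
# only -- it does nothing to messages emitted through a logging system,
# the analogue of Spark's JVM/log4j output.
warnings.filterwarnings("ignore")

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("ignore")
    warnings.warn("a Python-level warning")  # filtered out, not recorded
python_warning_suppressed = len(caught) == 0

# Logger-style output is unaffected by the warnings filters above.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.warning("a logger-level message")  # still emitted

print(python_warning_suppressed)                 # True
print("logger-level" in stream.getvalue())       # True
```

In a real Spark session the corresponding fix would be to keep the `warnings` filters for the pandas-on-Spark advisories and additionally call `setLogLevel("ERROR")` on the `SparkContext` after building the session; the `"ERROR"` threshold here is an assumption about how much output you want to keep.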