How to set `spark.driver.memory` in client mode – pyspark (version 2.3.1)
Question: I'm new to PySpark and I'm trying to use PySpark (version 2.3.1) on my local computer with Jupyter Notebook. I want to set `spark.driver.memory` to 9 GB by doing this:

```python
spark = SparkSession.builder \
    .master("local[2]") \
    .appName("test") \
    .config("spark.driver.memory", "9g") \
    .getOrCreate()
sc = spark.sparkContext
from pyspark.sql import SQLContext
…
```
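A note on the snippet above: in client mode the driver JVM is already running by the time the builder's `.config()` values are applied, so `spark.driver.memory` set this way is typically ignored. One commonly used workaround is to pass the setting through the `PYSPARK_SUBMIT_ARGS` environment variable before `pyspark` is imported. A minimal sketch (the `9g` value mirrors the question; this is one possible approach, not the only one):

```python
import os

# Must be set BEFORE pyspark is imported: the driver JVM reads these
# arguments at launch time, which is the only point in client mode where
# driver memory can still be changed. The trailing "pyspark-shell" token
# is required by pyspark's gateway launcher.
os.environ["PYSPARK_SUBMIT_ARGS"] = "--driver-memory 9g pyspark-shell"

print(os.environ["PYSPARK_SUBMIT_ARGS"])
# → --driver-memory 9g pyspark-shell
```

After this, building the `SparkSession` as in the question should pick up the 9 GB driver memory; setting the value in `$SPARK_HOME/conf/spark-defaults.conf` achieves the same effect.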