
How to use a new Spark Context

Question: I am currently running a Jupyter notebook on GCP Dataproc and hoping to increase the memory available to Spark via my config. I first stopped my Spark context:

```python
import pyspark

sc = spark.sparkContext
sc.stop()
```

I waited before running the next code block so that sc.stop() could finish, then:

```python
conf = pyspark.SparkConf().setAll([('spark.driver.maxResultSize', '8g')])
sc = …
```
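The snippet is truncated at `sc = …`. A minimal sketch of the usual pattern, assuming the intent is to rebuild the context from the new SparkConf (only `conf` and the property/value come from the question; everything after the truncation is an assumption):

```python
import pyspark
from pyspark.sql import SparkSession

# Stop the context that the Dataproc Jupyter kernel created.
spark.sparkContext.stop()

# Settings carried over from the question.
conf = pyspark.SparkConf().setAll([('spark.driver.maxResultSize', '8g')])

# Assumed completion of the truncated line: start a fresh context
# with the new configuration.
sc = pyspark.SparkContext(conf=conf)

# Rebuild a SparkSession on top of the new context so the DataFrame
# API keeps working in the notebook.
spark = SparkSession(sc)

# Sanity check: the new context should report the updated value.
print(sc.getConf().get('spark.driver.maxResultSize'))  # expected: 8g
```

One caveat worth noting: properties that size the driver JVM itself (for example spark.driver.memory) are only read at JVM launch, so they generally cannot be changed from a running notebook this way; spark.driver.maxResultSize is read by the new context, so this pattern does apply to it.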
