PySpark loading from MySQL ends up loading the entire table?
Question: I am quite new to PySpark (and to Spark in general). I am trying to connect Spark to a MySQL instance I have running on RDS. When I load the table like this, does Spark load the entire table into memory? from pyspark.sql import SparkSession …
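Since the snippet above is truncated, here is a minimal sketch of the kind of JDBC read the question describes. The host, database, table name, credentials, and connector version are placeholders I have assumed, not details from the original question.

```python
from pyspark.sql import SparkSession

# Hypothetical RDS endpoint and database -- placeholders, not from the original question.
jdbc_url = "jdbc:mysql://my-rds-instance.us-east-1.rds.amazonaws.com:3306/mydb"

spark = (
    SparkSession.builder
    .appName("mysql-read")
    # The MySQL Connector/J driver must be available on the classpath;
    # pulling it via spark.jars.packages is one common way to do that.
    .config("spark.jars.packages", "mysql:mysql-connector-java:8.0.33")
    .getOrCreate()
)

# Standard Spark JDBC data source read; "dbtable" can be a table name or a subquery.
df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "my_table")
    .option("user", "my_user")
    .option("password", "my_password")
    .load()
)

df.printSchema()
```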