init Exception: Java gateway process exited before sending its port number
Question:
Code is below
import findspark
findspark.init(r'C:\Users\user\Documents\spark-3.0.0-bin-hadoop2.7')
import pyspark
from pyspark.sql import SparkSession
spark = SparkSession.builder.appName('Basics').getOrCreate()
Got error
Exception: Java gateway process exited before sending its port number
Disclaimer: I have already added JAVA_HOME
to my environment variables. I didn't need to set SPARK_HOME
since I am passing the Spark path to init().
I have already gone through many links about this error but could not resolve it.
Answers:
What's your Java version? Adding JAVA_HOME
to your path won't help if you have an incompatible version of Java. Java 8 is by far the easiest to get running, but the docs say you should be able to get Java 11 running as well.
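As a quick sanity check before debugging further, you can parse the first line of `java -version` output and see whether it is one of the versions Spark 3.0 supports. This is a minimal sketch (the function name and the sample version strings are my own examples, not from the original post):

```python
import re

def is_supported_java(version_output: str) -> bool:
    """Return True if the reported Java major version is 8 or 11,
    the versions Spark 3.0 documents support for."""
    match = re.search(r'version "(\d+)(?:\.(\d+))?', version_output)
    if not match:
        return False
    major = int(match.group(1))
    if major == 1:  # old-style numbering, e.g. "1.8.0_281" means Java 8
        major = int(match.group(2))
    return major in (8, 11)

# First line of `java -version` goes to stderr; sample strings below:
print(is_supported_java('java version "1.8.0_281"'))   # Java 8  -> True
print(is_supported_java('openjdk version "11.0.9"'))   # Java 11 -> True
print(is_supported_java('java version "15.0.1"'))      # Java 15 -> False
```

On Windows you would feed it the output of `java -version` run in a terminal; if it returns False, installing a Java 8 JDK and pointing JAVA_HOME at it is the usual fix.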
For me the fix was to launch the script with spark-submit name_of_spark_file.py.
I believe it would also work with Scala files.