spark-cassandra-connector

Spark + Kafka app, getting "CassandraCatalogException: Attempting to write to C* Table but missing primary key columns: [col1,col2,col3]"

Question: Run env: kafka --ReadStream--> local --WriteStream--> cassandra. The source code is placed on local; kafka, local, and the writeStream target are on different IPs. Table columns are: col1 | col2 | col3 | col4 | col5 | col6 | col7. df.printSchema …

Total answers: 1
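This error means the streaming DataFrame handed to the Cassandra writer does not contain all of the table's primary key columns (here col1, col2, col3), typically because the Kafka value was only partially parsed. A minimal diagnostic sketch (plain Python, with the primary key assumed from the error message) shows the check the connector is effectively performing:

```python
# Sketch, assuming col1..col3 form the table's primary key (taken from the
# error message). The connector refuses the write when any of these is
# absent from the DataFrame schema.
REQUIRED_PK = {"col1", "col2", "col3"}

def missing_pk_columns(df_columns, pk=REQUIRED_PK):
    """Return the primary key columns absent from a DataFrame's schema."""
    return sorted(pk - set(df_columns))

# e.g. if the Kafka value was parsed into only col4..col7:
missing_pk_columns(["col4", "col5", "col6", "col7"])  # -> ['col1', 'col2', 'col3']
```

In the actual PySpark job, the fix is to make sure the `from_json`/`select` step that parses the Kafka value emits every primary key column before calling `writeStream`; comparing `df.columns` against the key as above pinpoints what was dropped.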

Spark with Cassandra python setup

Question: I am trying to use Spark to do some simple computations on Cassandra tables, but I am quite lost. I am trying to follow: https://github.com/datastax/spark-cassandra-connector/blob/master/doc/15_python.md So I'm running the PySpark shell with: ./bin/pyspark --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3 But I am not sure how to set things up from here. How do …

Total answers: 3
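A common stumbling block here is the launch command itself: the copied flag often arrives as an en-dash (`–packages`) rather than two hyphens, so the package is never loaded. A minimal launch sketch, keeping the connector version from the question and assuming Cassandra is reachable on localhost:

```shell
# Start the PySpark shell with the connector on the classpath
# (two hyphens before "packages"; the version/Scala suffix is from
# the question -- newer builds use spark-cassandra-connector_2.12).
./bin/pyspark \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3 \
  --conf spark.cassandra.connection.host=127.0.0.1
```

Once inside the shell, tables are read through the `org.apache.spark.sql.cassandra` data source with `table` and `keyspace` options, as the linked connector doc describes.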

pyspark datastax cassandra connector keeps connecting to localhost

Question: I am trying to connect PySpark to Cassandra using the DataStax driver: conf = SparkConf().setAppName('Test').setMaster('local[4]').set("spark.cassandra.connection.host", "192.168.0.150"); sc = SparkContext(conf=conf); sqlContext = SQLContext(sc); df = sqlContext.read.format("org.apache.spark.sql.cassandra").options(table="test", keyspace="test_keyspace").load() For some reason it keeps connecting to 127.0.0.1:9042 instead of 192.168.0.150. Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried …

Total answers: 1
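Two things commonly cause this fallback to 127.0.0.1: the pasted snippet uses curly quotes, which Python rejects or which silently break the config key when re-typed, and the host property must be set on the SparkConf before the SparkContext is created. A configuration sketch (assumes pyspark, the connector on the classpath, and a Cassandra node at 192.168.0.150; not runnable standalone):

```python
# Sketch: host set on SparkConf *before* SparkContext creation; setting it
# afterwards has no effect and the connector falls back to 127.0.0.1:9042.
from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = (SparkConf()
        .setAppName("Test")
        .setMaster("local[4]")
        .set("spark.cassandra.connection.host", "192.168.0.150"))
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)

df = (sqlContext.read.format("org.apache.spark.sql.cassandra")
      # the host can also be overridden per-read, which takes precedence
      # over the context-level setting:
      .option("spark.cassandra.connection.host", "192.168.0.150")
      .options(table="test", keyspace="test_keyspace")
      .load())
```

If the context-level setting is correct, the per-read `.option(...)` override shown above is a quick way to confirm whether the config is being picked up at all.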