Snowflake table as an external table in the Databricks Hive metastore

Question:

Does anyone know if it is possible to register a Snowflake table as an external table in the Databricks Hive metastore?

I’m working on a project in which we have some tables in Azure Data Lake Gen 2 and we manage them from Databricks. To query these tables from Databricks, they need to be added to an existing database in the Databricks Hive metastore. The syntax is as follows:

CREATE TABLE IF NOT EXISTS <DATABASE>.<TABLE_NAME> USING DELTA LOCATION '<PATH_TO_TABLE>'

Now I need to do the same with some tables we have in Snowflake. I am able to read Snowflake tables into Databricks with the Spark connector:

sfOptions = {
  "sfURL" : "<account>.snowflakecomputing.com",
  "sfUser" : "<user>",
  "sfPassword" : "<password>",
  "sfDatabase" : "<database>",
  "sfRole": "<role>",
  "sfWarehouse" : "<cluster>"
} 

df = (spark.read.format("net.snowflake.spark.snowflake")
      .option("column_mapping", "name")
      .options(**sfOptions)
      .option("dbtable", "<schema>.<table_name>")
      .load())

I am also able to query Snowflake tables from Databricks with the Python Snowflake connector as follows:

import snowflake.connector

# Set options below
sfOptions = {
  "account" : "<account>",
  "user" : "<user>",
  "password" : "<password>",
  "database" : "<database>",
  "role": "<role>",
  "warehouse" : "<warehouse>"
} 

ctx = snowflake.connector.connect(**sfOptions)
cs = ctx.cursor()

# query is any SQL statement string, e.g. "SELECT * FROM <schema>.<table_name>"
cs.execute(query)
rows = cs.fetchall()

But what I need to do is slightly different: I need to expose the Snowflake tables as Databricks external tables, because I want to merge them with Delta Lake tables, querying everything directly from a Databricks notebook.
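
To make this concrete, once the Snowflake table is registered in the metastore, the kind of statement I want to run from a notebook looks roughly like the following (the Delta table name, the Snowflake-backed table name and the join key column id are just placeholders):

# Desired usage once the Snowflake table is registered in the metastore
# (<delta_table_name>, <snowflake_table_name> and the key "id" are placeholders)
spark.sql("""
  MERGE INTO <database>.<delta_table_name> AS target
  USING <database>.<snowflake_table_name> AS source
  ON target.id = source.id
  WHEN MATCHED THEN UPDATE SET *
  WHEN NOT MATCHED THEN INSERT *
""")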

Thanks in advance.

Asked By: Cristian Ispan


Answers:

For now, this is not possible.

Answered By: Cristian Ispan

Now it is possible, at least if you have S3 mounted in Databricks and added as a stage in Snowflake.
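
A rough sketch of what that could look like, where the stage name, the S3 mount path, the file format and the table names are all assumptions: unload the table from Snowflake into the staged S3 location, then register the unloaded files as an external table in Databricks.

import snowflake.connector

# sfOptions as defined in the question (Python connector form)
ctx = snowflake.connector.connect(**sfOptions)
cs = ctx.cursor()

# 1. In Snowflake: unload the table to an external stage that points at the
#    same S3 bucket that is mounted in Databricks (stage and paths are assumptions)
cs.execute("""
  COPY INTO @my_s3_stage/exports/my_table/
  FROM <schema>.<table_name>
  FILE_FORMAT = (TYPE = PARQUET)
  HEADER = TRUE
""")

# 2. In Databricks: register the unloaded Parquet files as an external table
#    over the mounted S3 path
spark.sql("""
  CREATE TABLE IF NOT EXISTS <database>.<table_name>
  USING PARQUET
  LOCATION '/mnt/my-s3-mount/exports/my_table/'
""")

Note that this gives you a snapshot of the Snowflake table as of the unload, not a live external table, so the unload step has to be re-run whenever the Snowflake data changes.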

Answered By: Alexander Zot