Pass parameters to SQL in Databricks (Python)
Question:
I have a field from a DataFrame that I need to use as a parameter in a SQL query. I'm using the code below, and it returns an empty result: no error, just nothing.
d = df_max_dt.select('update_tstmp')
df_tmp = spark.sql("SELECT * FROM invoices WHERE last_upd_tstmp > from_utc_timestamp('{0}', 'MST')".format(d))
However, if I manually substitute the value like this, it does work:
df_tmp = spark.sql("SELECT * FROM invoices WHERE last_upd_tstmp > from_utc_timestamp('{0}', 'MST') ".format('2020-02-25T18:54:48.435+0000'))
What am I missing here?
Answers:
The issue is that d is a DataFrame, not a scalar value. You need to collect the row and pull out the column value first. Try this:
d = df_max_dt.select('update_tstmp').collect()[0].update_tstmp
df_tmp = spark.sql("SELECT * FROM invoices WHERE last_upd_tstmp > from_utc_timestamp('{0}', 'MST')".format(d))
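To see why the original version returns nothing: str.format() calls str() on its argument, so passing a DataFrame embeds the DataFrame's string representation into the SQL text instead of a timestamp. The sketch below illustrates this without needing a Spark cluster, using a hypothetical stand-in class whose repr mimics a one-row Spark DataFrame (the class name and repr string are illustrative, not real PySpark output):

```python
class FakeDataFrame:
    """Hypothetical stand-in for a one-row Spark DataFrame."""
    def __repr__(self):
        return "DataFrame[update_tstmp: timestamp]"
    __str__ = __repr__

d = FakeDataFrame()

# Formatting the DataFrame object itself embeds its repr, not a timestamp,
# so the WHERE clause compares against a nonsense literal and matches no rows.
bad_query = ("SELECT * FROM invoices WHERE last_upd_tstmp > "
             "from_utc_timestamp('{0}', 'MST')").format(d)

# What .collect()[0].update_tstmp would give you: a plain scalar value
# that formats into a valid timestamp literal.
scalar = '2020-02-25T18:54:48.435+0000'
good_query = ("SELECT * FROM invoices WHERE last_upd_tstmp > "
              "from_utc_timestamp('{0}', 'MST')").format(scalar)

print(bad_query)
print(good_query)
```

This is also why the query silently returns empty rather than erroring: the malformed timestamp is still a syntactically valid string literal, so the comparison simply never matches.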