df to table throws error TypeError: __init__() got multiple values for argument 'schema'

Question:

I have a pandas DataFrame, purchase_df, and I want to convert it to a SQL table so I can run SQL queries against it. I tried this method:

purchase_df.to_sql('purchase_df', con=engine, if_exists='replace', index=False)

It throws an error:

TypeError: __init__() got multiple values for argument 'schema'

I have a DataFrame named purchase_df and I need to run SQL queries on it, like this: engine.execute('''select * from purchase_df where condition'''). For that I need to convert the DataFrame into a SQL table, because on our server pandas_sql is not installed and only SQLAlchemy is available.
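
For reference, a minimal, self-contained sketch of that workflow (using an in-memory SQLite engine purely for illustration; the real connection string, the sample data, and the where-condition below are placeholders):

import pandas as pd
from sqlalchemy import create_engine, text

# In-memory SQLite engine for illustration only; swap in the real engine.
engine = create_engine("sqlite://")

# Stand-in data; the real purchase_df comes from elsewhere.
purchase_df = pd.DataFrame({"item": ["a", "b"], "amount": [10, 20]})

# Write the DataFrame out as a SQL table named purchase_df.
purchase_df.to_sql("purchase_df", con=engine, if_exists="replace", index=False)

# Query it back with plain SQL. Note: engine.execute() was removed in
# SQLAlchemy 2.0; use a connection and text() instead.
with engine.connect() as conn:
    rows = conn.execute(text("select * from purchase_df where amount > 15")).fetchall()
print(rows)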

I ran this code locally in PyCharm and it works perfectly fine, but when I tried it in a Databricks notebook it shows this error, even though a week ago it was running fine in the Databricks notebook too. Help me fix this.

note:
pandas version: 1.3.4
SQLAlchemy version: 2.0.0

Asked By: Arpan Ghimire


Answers:

It seems that version 2.0.0 of SQLAlchemy (released on January 26, 2023) is not compatible with earlier versions of pandas.
I suggest you upgrade your pandas version to the latest (version 1.5.3) with:

pip install --upgrade pandas

Or:

conda upgrade pandas
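
After upgrading, it can help to confirm which versions the notebook actually picks up (restart the Python process or cluster first so the upgraded packages are loaded):

import pandas as pd
import sqlalchemy

# Versions resolved at runtime; these should match the upgraded packages.
print(pd.__version__)
print(sqlalchemy.__version__)
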
Answered By: Timeless

I got the same issue in Databricks, and I had to downgrade SQLAlchemy to 1.4.46:

!pip install sqlalchemy==1.4.46

Answered By: Gonzalo Rojas

I experienced exactly the same issue with Databricks on AWS. I tried the solutions above, but nothing worked for me. So I installed the sqlalchemy-databricks library instead of SQLAlchemy, and everything came back to life: https://pypi.org/project/sqlalchemy-databricks/
Please uninstall/purge SQLAlchemy first so it does not conflict with sqlalchemy-databricks.
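
Going by the project's PyPI page, the connection setup would look roughly like the sketch below; the token, host, schema, and http_path values are placeholders, and the exact URL format should be checked against the library's documentation:

import pandas as pd
from sqlalchemy import create_engine

# Placeholder credentials and paths for illustration only.
# The "databricks+connector" dialect is registered by sqlalchemy-databricks.
engine = create_engine(
    "databricks+connector://token:<databricks_token>@<workspace_host>:443/<schema>",
    connect_args={"http_path": "<cluster_http_path>"},
)

# Stand-in DataFrame; the same to_sql call from the question can then
# target this engine.
purchase_df = pd.DataFrame({"item": ["a"], "amount": [10]})
purchase_df.to_sql("purchase_df", con=engine, if_exists="replace", index=False)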

Answered By: tlubenov