Connect to a Cloud SQL DB using a service account with pymysql or mysql.connector

Question:

I have a Cloud SQL instance running in another VPC and an nginx proxy to allow cross-VPC access.
I can access the DB using a built-in user, but how can I access it using a Google Service Account?

import google.auth
import google.auth.transport.requests
import mysql.connector
from mysql.connector import Error
import os

# Get Application Default Credentials and refresh them to obtain an access token
creds, project = google.auth.default()
auth_req = google.auth.transport.requests.Request()
creds.refresh(auth_req)


# Use the access token as the password for the service account user
connection = mysql.connector.connect(host=HOST,
                                     database=DB,
                                     user=SA_USER,
                                     password=creds.token)
if connection.is_connected():
    db_Info = connection.get_server_info()
    print("Connected to MySQL Server version ", db_Info)
    cur = connection.cursor()
    cur.execute("""SELECT now()""")
    query_results = cur.fetchall()
    print(query_results)

When using mysql.connector, I get this error:

DatabaseError: 2059 (HY000): Authentication plugin 'mysql_clear_password' cannot be loaded: plugin not enabled

Then I tried using pymysql

import pymysql
import google.auth
import google.auth.transport.requests
import os

creds, project = google.auth.default()
auth_req = google.auth.transport.requests.Request()
creds.refresh(auth_req)


try:
    conn =  pymysql.connect(host=ENDPOINT, user=SA_USER, passwd=creds.token, port=PORT, database=DBNAME)
    cur = conn.cursor()
    cur.execute("""SELECT now()""")
    query_results = cur.fetchall()
    print(query_results)
except Exception as e:
    print("Database connection failed due to {}".format(e))    
Database connection failed due to (1045, "Access denied for user 'xx'@'xxx.xxx.xx.xx' (using password: YES)"

I guess these errors are all related to the token.
Can anyone suggest the proper way to get an SA token to access a Cloud SQL DB?

PS: Using the Cloud SQL Auth Proxy is not a good option for our architecture.

Asked By: Max

Answers:

The error you have mentioned in the description indicates an issue with authentication. To understand exactly what caused it, try these things:

  • Verify the username and corresponding password.
  • Check the origin of the connection to see if it matches the host from
    which the user is allowed to connect.
  • Check the user’s grant privileges in the database.

As you are trying to access the DB using a Google Service Account, you should use the default service account credentials, which include this authorization token for you. Check out the Client libraries and sample code page for more info. Alternatively, if you prefer to build the requests manually, you can use an OAuth 2.0 token; the Authorizing requests page has more information on how to create these. These access tokens are only valid for 60 minutes, after which they expire. An expired token does not disconnect existing clients, but if a client connection is broken and must reconnect to the instance after more than an hour, a new access token has to be obtained and supplied on that new connection attempt.
For your use case, since you are not interested in the Cloud SQL Auth Proxy, a service account IAM user is the better way to go.
Note that to get an appropriate access token, the scope must be set to the Cloud SQL Admin API.
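
As a minimal sketch (assuming Application Default Credentials are available; names are placeholders), fetching a token with that scope could look like this:

import google.auth
import google.auth.transport.requests

# Request Application Default Credentials scoped to the Cloud SQL Admin API
scopes = ["https://www.googleapis.com/auth/sqlservice.admin"]
creds, project = google.auth.default(scopes=scopes)

# Force a refresh so creds.token holds a current (roughly one-hour) access token
creds.refresh(google.auth.transport.requests.Request())

# creds.token can then be supplied as the password for the service account user
print(creds.token is not None)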

Answered By: Vaidehi Jamankar

It finally works.
I had to enforce an SSL connection.

import pymysql
from google.oauth2 import service_account
import google.auth.transport.requests

scopes = ["https://www.googleapis.com/auth/cloud-platform", "https://www.googleapis.com/auth/sqlservice.admin"] 
credentials = service_account.Credentials.from_service_account_file('key.json', scopes=scopes)
auth_req = google.auth.transport.requests.Request()
credentials.refresh(auth_req)
config = {'user': SA_USER,
            'host': ENDPOINT,
            'database': DBNAME,
            'password': credentials.token,
            'ssl_ca': './server-ca.pem',
            'ssl_cert': './client-cert.pem',
            'ssl_key': './client-key.pem'}
try:
    conn =  pymysql.connect(**config)
    with conn:
        print("Connected")
        cur = conn.cursor()
        cur.execute("""SELECT now()""")
        query_results = cur.fetchall()
        print(query_results)
except Exception as e:
    print("Database connection failed due to {}".format(e))  
Answered By: Max

I’d recommend using the Cloud SQL Python Connector; it should make your life way easier!

It manages the SSL connection for you (no need for cert files!), takes care of the credentials (it uses Application Default Credentials, which you can point at a service account easily), and allows you to log in with Automatic IAM AuthN so that you don’t have to pass the credentials token as a password.

Connecting looks like this:

from google.cloud.sql.connector import Connector, IPTypes
import sqlalchemy
import pymysql

# initialize Connector object
connector = Connector(ip_type=IPTypes.PRIVATE, enable_iam_auth=True,)

# function to return the database connection
def getconn() -> pymysql.connections.Connection:
    conn: pymysql.connections.Connection = connector.connect(
        "project:region:instance",  # your Cloud SQL instance connection name
        "pymysql",
        user="my-user",
        db="my-db-name"
    )
    return conn

# create connection pool
pool = sqlalchemy.create_engine(
    "mysql+pymysql://",
    creator=getconn,
)

# insert statement
insert_stmt = sqlalchemy.text(
    "INSERT INTO my_table (id, title) VALUES (:id, :title)",
)

# interact with Cloud SQL database using connection pool
with pool.connect() as db_conn:
    # insert into database
    db_conn.execute(insert_stmt, id="book1", title="Book One")

    # query database
    result = db_conn.execute("SELECT * from my_table").fetchall()

    # Do something with the results
    for row in result:
        print(row)

Let me know if you run into any issues! There is also an interactive Cloud SQL Notebook that you can check out; it will walk you through things in more detail.
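
As a small follow-up sketch of the surrounding housekeeping (the key file path is a hypothetical placeholder, only needed when running outside GCP where no service account is attached):

import os
from google.cloud.sql.connector import Connector

# Point Application Default Credentials at a specific service account key
# (hypothetical path; on GCP the attached service account is used instead)
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/key.json"

connector = Connector(enable_iam_auth=True)
# ... build the SQLAlchemy pool with connector.connect() as shown above ...

# When shutting down, dispose of the SQLAlchemy pool (pool.dispose())
# and close the connector to release its resources
connector.close()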

Answered By: Jack Wotherspoon