Cloud Functions: How to send a daily email at 10am with an extract of a table?

Question:

I am new to GCP Cloud Functions and want to send a daily email at 10am containing an extract of a specific table in BigQuery.

So far, I have written the code to send an email, but I am not sure how to schedule it to run daily at 10am, or how to include the table extract:

def email(request):
    import os
    from sendgrid import SendGridAPIClient
    from sendgrid.helpers.mail import Mail, Email
    from python_http_client.exceptions import HTTPError

    sg = SendGridAPIClient(os.environ['EMAIL_API_KEY'])

    html_content = "<p>Table XYZ is attached here</p>"

    message = Mail(
        to_emails="[Destination]@email.com",
        from_email=Email('[YOUR]@gmail.com', "Your name"),
        subject="Hello world",
        html_content=html_content
        )
    message.add_bcc("[YOUR]@gmail.com")

    try:
        response = sg.send(message)
        return f"email.status_code={response.status_code}"
        #expected 202 Accepted

    except HTTPError as e:
        return e.message
Asked By: SausageMan123


Answers:

Use Google Cloud Scheduler to trigger your Cloud Function at 10am each day, and use the BigQuery API inside the function to extract the data from the table and include it in the email.

Create a new Cloud Function in the GCP console that queries the table in BigQuery and sends the email. The google-cloud-bigquery and google-auth packages can be used to communicate with BigQuery.

A sample of the code is shown below:

from google.cloud import bigquery
from google.oauth2 import service_account
from sendgrid import SendGridAPIClient
from sendgrid.helpers.mail import Mail, Email
from python_http_client.exceptions import HTTPError
from datetime import datetime

def email(request):
    
    project_id = '<your-project-id>'
    credentials = service_account.Credentials.from_service_account_file('<path-to-your-service-account-key>')

    client = bigquery.Client(project=project_id, credentials=credentials)

    query = '''
        SELECT *
        FROM `<your-project-id>.<your-dataset>.<your-table>`
    '''

    results = client.query(query).result()

    html_content = '<table>'
    for row in results:
        html_content += '<tr>'
        for field in row:
            html_content += f'<td>{field}</td>'
        html_content += '</tr>'
    html_content += '</table>'

    message = Mail(
        to_emails="<destination-email>@example.com",
        from_email=Email('<SausageMan123>@gmail.com', "SausageMan123"),
        subject="Data extract from BigQuery",
        html_content=html_content
    )
    message.add_bcc("<SausageMan123>@gmail.com")

    try:
        sg = SendGridAPIClient('<your-sendgrid-api-key>')
        response = sg.send(message)
        return f"Email sent at {datetime.now()} with status code {response.status_code}"
    except HTTPError as e:
        return e.message

  • Create a Cloud Function that reads the data from your BigQuery
    table and sends an HTML email. To extract the data, use the BigQuery
    API, and to send the email, use a library like Nodemailer.
  • Create a Pub/Sub topic in Cloud Pub/Sub that will be used to trigger
    the Cloud Function.
  • Create a job in Cloud Scheduler that publishes a message to the
    Pub/Sub topic every day at 10am. Any parameters your Cloud Function
    needs, such as the table name or email recipients, can be included
    in the message payload.
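
The HTML-email step above can also be sketched in Python. Below is a minimal, testable helper that renders query rows as an HTML table; the function name and signature are illustrative, not part of any library:

```python
def rows_to_html_table(rows, headers=None):
    """Render an iterable of row sequences as a simple HTML table.

    `rows` is any iterable of sequences (each BigQuery Row is itself
    iterable, so the result of `client.query(...).result()` works);
    `headers` is an optional list of column names.
    """
    parts = ['<table>']
    if headers:
        parts.append('<thead><tr>'
                     + ''.join(f'<th>{h}</th>' for h in headers)
                     + '</tr></thead>')
    parts.append('<tbody>')
    for row in rows:
        parts.append('<tr>'
                     + ''.join(f'<td>{cell}</td>' for cell in row)
                     + '</tr>')
    parts.append('</tbody></table>')
    return ''.join(parts)
```

The returned string can be passed directly as the `html_content` of the SendGrid `Mail` object from the question.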

Sample code (Node.js):

    const {BigQuery} = require('@google-cloud/bigquery');
    const nodemailer = require('nodemailer');

    exports.dailyEmail = async (message, context) => {
      // Parameters arrive as attributes on the Pub/Sub message.
      const {tableId, recipients} = message.attributes;

      const bigquery = new BigQuery();
      const [rows] = await bigquery.query(`SELECT * FROM \`${tableId}\``);

      // Build an HTML table from the rows (column names are examples).
      const htmlTable = '<table><thead><tr><th>Column 1</th><th>Column 2</th></tr></thead><tbody>' +
        rows.map(row => `<tr><td>${row.column1}</td><td>${row.column2}</td></tr>`).join('') +
        '</tbody></table>';

      // Configure the transport for your mail provider (SMTP settings, etc.).
      const transporter = nodemailer.createTransport({ /* transport options */ });
      await transporter.sendMail({
        from: '[email protected]',
        to: recipients,
        subject: 'Daily BigQuery Report',
        html: htmlTable,
      });
    };

Updated code:

const {BigQuery} = require('@google-cloud/bigquery');
const {Storage} = require('@google-cloud/storage');
const sgMail = require('@sendgrid/mail');

exports.extractDataAndSendEmail = async (event, context) => {
  const projectId = 'your-project-id';
  const datasetId = 'your-dataset-id';
  const tableId = 'your-table-id';
  const bucketName = 'your-bucket-name';
  const fileName = 'your-file-name.csv';
  const recipients = ['[email protected]', '[email protected]'];

  const bigquery = new BigQuery({projectId});
  const storage = new Storage({projectId});
  const bucket = storage.bucket(bucketName);

  // Query the table and serialize the rows as CSV (one row per line).
  const query = `SELECT * FROM \`${projectId}.${datasetId}.${tableId}\``;
  const [rows] = await bigquery.query(query);
  const csvData = rows.map(row => Object.values(row).join(',')).join('\n');

  // Save a copy of the extract to Cloud Storage.
  const file = bucket.file(fileName);
  await file.save(csvData);

  // SendGrid expects attachment content as a base64-encoded string,
  // not a stream.
  sgMail.setApiKey(process.env.SENDGRID_API_KEY);
  const msg = {
    to: recipients,
    from: '[email protected]',
    subject: 'Your Daily Extract',
    text: 'Attached is your daily extract',
    attachments: [
      {
        content: Buffer.from(csvData).toString('base64'),
        filename: fileName,
        type: 'text/csv',
        disposition: 'attachment'
      }
    ]
  };
  await sgMail.send(msg);
};
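
Note that joining values with bare commas, as above, produces invalid CSV when a field itself contains a comma, quote, or newline. In Python, the standard-library csv module handles the quoting; a minimal sketch (the helper name is illustrative):

```python
import csv
import io

def rows_to_csv(rows):
    """Serialize an iterable of row sequences as properly quoted CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)  # quotes fields containing commas/quotes/newlines
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()
```

The resulting string can be passed to `blob.upload_from_string(...)` in place of the hand-joined version.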

In Python, you would use a from … import statement to import the bigquery module from the google.cloud package.

Here's an example of how to create a BigQuery client in Python:

    from google.cloud import bigquery

    client = bigquery.Client()


As an example of how you could implement this functionality in Python with the relevant client libraries, consider the following:

import base64
import os

from google.cloud import bigquery
from google.cloud import storage
import sendgrid
from sendgrid.helpers.mail import Mail, Attachment, FileContent, FileName, FileType, Disposition

def extract_data_and_send_email(event, context):
    project_id = 'your-project-id'
    dataset_id = 'your-dataset-id'
    table_id = 'your-table-id'
    bucket_name = 'your-bucket-name'
    file_name = 'your-file-name.csv'
    recipients = ['[email protected]', '[email protected]']

    bigquery_client = bigquery.Client(project=project_id)

    # Query the table and serialize the rows as CSV (one row per line).
    query = f'SELECT * FROM `{project_id}.{dataset_id}.{table_id}`'
    rows = bigquery_client.query(query).result()
    csv_data = '\n'.join(','.join(map(str, row.values())) for row in rows)

    # Save a copy of the extract to Cloud Storage.
    storage_client = storage.Client(project=project_id)
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(file_name)
    blob.upload_from_string(csv_data)

    # SendGrid expects attachment content as a base64-encoded string.
    sendgrid_client = sendgrid.SendGridAPIClient(api_key=os.environ['SENDGRID_API_KEY'])
    message = Mail(from_email='[email protected]',
                   to_emails=recipients,
                   subject='Your Daily Extract',
                   plain_text_content='Attached is your daily extract')
    file_content = base64.b64encode(blob.download_as_bytes()).decode()
    attachment = Attachment(FileContent(file_content),
                            FileName(file_name),
                            FileType('text/csv'),
                            Disposition('attachment'))
    message.attachment = attachment
    sendgrid_client.send(message)
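
The base64 conversion for the attachment is easy to get wrong, so it is worth isolating into a helper that can be verified on its own. A minimal sketch (the function name is illustrative; sendgrid's FileContent takes the resulting string):

```python
import base64

def encode_attachment(data: bytes) -> str:
    """Base64-encode raw bytes for use as a SendGrid attachment body."""
    return base64.b64encode(data).decode('ascii')
```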
Answered By: Robina Mirbahar