Python boto3 AWS Lambda – where was the CSV file written on Windows 11? – Fully working code now

Question:

I am using boto3 to write an AWS Lambda function. The intent here is to read a few attributes of all EC2 instances and write them to a file on the local C: drive.

I am running this code on Windows 11. Where is the /tmp folder on Windows 11?

import boto3
import csv

def lambda_handler(event, context):
    aws_mgmt_console = boto3.session.Session()
    ec2_console_resource = aws_mgmt_console.resource('ec2')
    count = 1

    # Lambda can only write to /tmp; the file is lost when the
    # execution environment is recycled
    with open("/tmp/csv_file.csv", "w", newline="") as csv_object:
        csvwriter_object = csv.writer(csv_object)
        csvwriter_object.writerow(["Instance_Id", "Instance_Type",
                                   "LaunchTime", "count"])

        for eachinstance in ec2_console_resource.instances.all():
            print(eachinstance.instance_id, eachinstance.instance_type,
                  eachinstance.launch_time, count)
            # writerow() takes a single sequence, not separate arguments
            csvwriter_object.writerow([eachinstance.instance_id,
                                       eachinstance.instance_type,
                                       eachinstance.launch_time, count])
            count += 1

    # Copy the file to S3 so it is accessible after the function finishes
    s3 = boto3.resource('s3')
    s3.meta.client.upload_file('/tmp/csv_file.csv', 'mybucketbasam',
                               'EC2instances.csv')
Asked By: Jason


Answers:

AWS Lambda functions run in containers provisioned by the AWS Lambda service, on AWS infrastructure. They have no access to your own computer, so the file is never written to your C: drive.

Lambda functions can store data in the /tmp/ directory, which is local to the Lambda execution environment (512 MB by default).
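A minimal sketch of writing a CSV into /tmp from a handler; the `write_report` helper and the row values are illustrative, not part of the question's code:

```python
import csv
import os

def write_report(rows, path="/tmp/csv_file.csv"):
    """Write instance rows to a CSV under /tmp and return its size in bytes."""
    # newline="" prevents the csv module from emitting blank lines on Windows
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Instance_Id", "Instance_Type", "LaunchTime", "count"])
        writer.writerows(rows)
    return os.path.getsize(path)
```

Anything written this way exists only for the lifetime of the execution environment, which is why the upload step below is needed.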

If you wish to access files produced by a Lambda function, it will be necessary to have the Lambda function copy those files to somewhere that you can access. This is typically done by copying/writing files to an Amazon S3 bucket. You can use the boto3 upload_file() method to upload from the Lambda environment to an Amazon S3 bucket.
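The upload step can be sketched as follows; the bucket name is hypothetical, and boto3 is bundled with the AWS Lambda Python runtime:

```python
def upload_report(local_path="/tmp/csv_file.csv",
                  bucket="my-example-bucket",   # hypothetical bucket name
                  key="EC2instances.csv"):
    """Copy the temp file to S3 so it outlives the Lambda invocation."""
    import boto3  # available by default in the Lambda Python runtime
    s3 = boto3.client("s3")
    # upload_file streams the file and handles multipart uploads automatically
    s3.upload_file(local_path, bucket, key)
```

The Lambda function's execution role must allow `s3:PutObject` on the target bucket for this call to succeed.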

Answered By: John Rotenstein