Lambda function triggered when adding files to an S3 bucket


I am having a problem with the behavior of a Lambda function. Ultimately, I want my Lambda function to read some information from the JSON files that I upload to an S3 bucket. Here is a test function:

import boto3
import json


def LambdaHandler(event, context):
#    bucket = event['Records'][0]['s3']['bucket']['name']
#    key = event['Records'][0]['s3']['object']['key']
    try:
        print('I was triggered')
    except Exception as e:
        raise e

Note the two lines that are commented out. I believe that I set up the correct role and permissions, since the logs on AWS show that the function is triggered each time I upload a file to my S3 bucket (the log shows "I was triggered").

The problem is that, when I uncomment the two lines, the log no longer shows anything. It is as if the function is no longer triggered. Am I missing something?

At this point I am still expecting the log to show that the function is triggered (and to display the message "I was triggered"). Ultimately I want the function to read the JSON files that I'm uploading, so that I can make it take some further action.

Asked By: Blue Moonshine



I suspect that there is an error reading the event dictionary. Lambda should publish logs regardless of whether the print statement runs or not. You will see in those logs if one of the commented lines fails.

As a general Python note, your try/except statement doesn't add much value: the statement you are trying will always succeed, and catching an exception only to re-raise it unchanged is a no-op.

My suggestion is to remove the try/except block entirely and place the print statement above the two statements that read from the event object. That way you know the Lambda function will print a log line regardless, and you can debug your issue further.
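Reordered that way, the handler might look something like this (a sketch; the event shape follows the standard S3 put-notification format, and the return value is only there to make the function easy to test locally):

```python
def lambda_handler(event, context):
    # Print first, so a log line appears even if the event parsing below fails.
    print('I was triggered')

    # S3 notifications arrive as a list of records; each record carries
    # the bucket name and the object key that triggered the event.
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']
    print(f'Bucket: {bucket}, key: {key}')
    return bucket, key
```

If the parsing line raises (say, because the event was not an S3 notification), CloudWatch will still show "I was triggered" followed by the traceback, which tells you exactly which lookup failed.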

Answered By: Lawrence Aiello

Alright, I don't know what happened, but I deleted everything from AWS, recreated the whole setup, and now it works. My Lambda function can read the files that are uploaded to the S3 bucket. Problem fully solved.
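For anyone landing here later, a minimal sketch of a handler that actually reads the uploaded JSON might look like the following. The optional s3 argument is an addition of mine for local testing (Lambda itself only passes event and context), and note that object keys in S3 notifications arrive URL-encoded, so they should be decoded before calling get_object:

```python
import json
from urllib.parse import unquote_plus


def lambda_handler(event, context, s3=None):
    # Log first, so CloudWatch shows an entry even if anything below fails.
    print('I was triggered')

    record = event['Records'][0]['s3']
    bucket = record['bucket']['name']
    # Keys are URL-encoded in the notification (e.g. spaces become '+').
    key = unquote_plus(record['object']['key'])

    # The s3 argument exists only so a fake client can be injected in tests;
    # inside Lambda it is None and a real boto3 client is created.
    if s3 is None:
        import boto3
        s3 = boto3.client('s3')

    body = s3.get_object(Bucket=bucket, Key=key)['Body'].read()
    data = json.loads(body)
    print(f'Read {key} from {bucket}: {data}')
    return data
```

From here, "further action" is just ordinary Python on the data dictionary.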

Answered By: Blue Moonshine