Adding multiple S3 triggers to a single Lambda function for different environments

I need to add multiple S3 triggers to my Lambda function, and the triggers are environment specific. I have dev, test, and prod environments, and I built a single serverless.yml file with a single Lambda function. For that function I need to add 4 S3 triggers with 4 different prefixes for dev, 6 S3 triggers for prod, and 5 for test. Can I deploy this with a single Serverless file, or do I need to create 3 separate files, one per environment?

You can list multiple events for a given function.
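For example, something along these lines (a sketch, not your exact config: the function name `processor`, handler `handler.run`, bucket `my-bucket`, and prefixes are all placeholders, and resolving a whole `events` list from a variable may depend on your Framework version):

```yaml
custom:
  s3Events:
    dev:
      - s3:
          bucket: my-bucket
          event: s3:ObjectCreated:*
          existing: true   # drop this if Serverless should create the bucket
          rules:
            - prefix: dev/uploads/
      - s3:
          bucket: my-bucket
          event: s3:ObjectCreated:*
          existing: true
          rules:
            - prefix: dev/images/
    # ...repeat for the other dev prefixes, and for test (5) and prod (6)

functions:
  processor:
    handler: handler.run
    events: ${self:custom.s3Events.${self:provider.stage}}
```

That keeps one serverless.yml; `sls deploy --stage dev` (or test/prod) picks up the matching event list.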

Post a sample from your .yml and we’ll try and help :slight_smile:

Hmmm. We’ve had to do something similar, for things like Target Groups we were updating.

We changed the code so we could pass in a CSV list, something like the sketch below, and then split the string at the start of the code.
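A minimal sketch of that idea (the variable name `S3_PREFIXES`, the prefix values, and the handler are assumptions, not the original snippet): put the stage-specific prefixes into one comma-separated environment variable in serverless.yml, then split it once in the function code.

```yaml
provider:
  environment:
    S3_PREFIXES: ${self:custom.s3Prefixes.${self:provider.stage}}

custom:
  s3Prefixes:
    dev: dev/uploads/,dev/images/
    test: test/uploads/,test/images/,test/reports/
    prod: prod/uploads/,prod/images/,prod/reports/
```

```python
import os

# Split the CSV once at module load so every invocation reuses the list.
PREFIXES = [p for p in os.environ.get("S3_PREFIXES", "").split(",") if p]

def handler(event, context):
    # PREFIXES is now a plain list, e.g. ["dev/uploads/", "dev/images/"]
    ...
```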

Thanks for the quick reply @TomC. I’ll implement it in my code. Thanks :+1:

@TomC I have another doubt. In the above code I need to store the hardcoded values in AWS Secrets Manager, fetch those secrets into my default.yml, and then use them in my serverless.yml. Is there a way to do this? If you know something about it, could you please advise?

We make the stage part of the path:

```yaml
environment:
  RDS_SECRET_NAME: ${self:custom.RDS_SECRET_NAME.${self:provider.stage}}

# ...

custom:
  RDS_SECRET_NAME:
    dev: project name/rds/goo
    live: project name/rds/boo
```
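If you also need the secret's value inside the function (not just its name), a short sketch using boto3 (the JSON layout of the secret is an assumption; you get back whatever string you stored):

```python
import json
import os

import boto3

secrets = boto3.client("secretsmanager")

def get_rds_secret():
    # RDS_SECRET_NAME is the stage-specific name set in serverless.yml above
    response = secrets.get_secret_value(SecretId=os.environ["RDS_SECRET_NAME"])
    # assumes the secret was stored as JSON, e.g. {"username": "...", "password": "..."}
    return json.loads(response["SecretString"])
```

The Framework can also resolve Secrets Manager values at deploy time via `${ssm:/aws/reference/secretsmanager/<secret name>}`, but note that puts the plaintext value into the compiled CloudFormation template.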