Stage-specific functions not triggering on S3 bucket write events

I’m trying to create a prod/dev deployment in which a Lambda function is triggered by an S3 write event.

The deployment part of this works: when code is pushed to the dev or master branch, the relevant artefacts are created along with the correct S3 buckets, and in the Serverless dashboard the functions are listed with the correct S3 bucket showing in the subscriptions section.

However, when a file with the correct suffix is written to either the prod or dev bucket, no Lambda is triggered. Since the event never fires there are no logs to work from, which makes this very hard to debug, so any help here would be greatly appreciated.
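In case it’s useful, the bucket’s notification configuration can be inspected from the CLI (the bucket name below is a placeholder):

$ aws s3api get-bucket-notification-configuration --bucket my-service-dev-bucket

If the trigger is attached, this should return a LambdaFunctionConfigurations entry pointing at the function’s ARN.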

Below is a minimal version of the serverless.yml:


service: my-service
provider:
  name: aws
  runtime: python3.7
  region: eu-west-1
  stage: ${opt:stage, 'dev'}

  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:ListBucket
      Resource:
        - "arn:aws:s3:::${self:custom.bucket}"
    - Effect: Allow
      Action:
        - s3:ListBucket
        - s3:GetObject
        - s3:GetObjectAcl
        - s3:PutObject
        - s3:GetObjectTagging
        - s3:PutObjectTagging
        - s3:PutObjectAcl
      Resource:
        - "arn:aws:s3:::${self:custom.bucket}/*"
functions:
  my-func:
    name: my-service-${self:provider.stage}-my-func
    handler: handler.handler
    events:
      - s3:
          bucket: ${self:custom.bucket}
          event: s3:ObjectCreated:*
          rules:
            - suffix: .csv
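          # existing: true attaches the trigger to a pre-existing bucket via a custom resource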
          existing: true
    environment:
      BUCKET: ${self:custom.bucket}

custom:
  pythonRequirements:
    dockerizePip: true
    zip: true
    slim: true
    layer: false
  bucket: ${file(environment.yml):stages.${self:provider.stage}.bucket}

The external environment.yml file contains the bucket names for prod/dev, which are created and shown as the subscriptions in the dashboard. As I said, the deployment is working fine; it’s just the function invocation that isn’t taking place.
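For context, the shape implied by the ${file(...)} reference is along these lines (the bucket names here are placeholders):

stages:
  dev:
    bucket: my-service-dev-bucket
  prod:
    bucket: my-service-prod-bucket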

Any advice on what I’m doing wrong or how to debug this would be greatly appreciated.

Many thanks!

Update - this is with the dev bucket only; that is to say, uploading to the prod bucket does trigger the function. I’m wondering if this could be to do with the existing: true property? I’ve read that you can only have one existing bucket per function, although I didn’t think that would be an issue here, as the prod/dev flavours of the stages create their own functions in Serverless.

Is the Dev bucket a pre-existing bucket?

Can you show the contents of environment.yml in case there is something wrong there?

Also, if you check the CloudFormation template inside the .serverless directory where you’re deploying from, you should be able to see exactly what’s being deployed.
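For example, something like this lists the resources in the compiled template (assuming jq is installed; the file name is what recent versions of the framework generate, so adjust if yours differs):

$ jq '.Resources | keys' .serverless/cloudformation-template-update-stack.json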

Hi DrColza,

I actually resolved this - the issue was that AWS had throttled the function for the dev env (but not prod): its reserved concurrency was set to 0. I’m not sure why this was the case, as I hadn’t specified any throttling for the function, but there you go…

Diagnosis:

$ aws lambda get-function --function-name ${FUNCTION_NAME} --output json --query Concurrency

which returned:

{
    "ReservedConcurrentExecutions": 0
}
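A reserved concurrency of 0 means the function can never be invoked, which explains the silent failure. As an alternative to the serverless.yml change below, the setting can also be removed directly from the CLI:

$ aws lambda delete-function-concurrency --function-name ${FUNCTION_NAME}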

Solution:

Manually set the reserved concurrency in the serverless.yml:

functions:
  some_function:
    handler: handler.run
    # any non-zero value lifts the accidental throttle
    reservedConcurrency: 5

This did the job nicely!
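Re-running the earlier get-function query should now show the non-zero value:

$ aws lambda get-function --function-name ${FUNCTION_NAME} --output json --query Concurrency
{
    "ReservedConcurrentExecutions": 5
}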

For anyone else running into these types of issues, this blog helped me greatly: https://aws.amazon.com/blogs/compute/managing-aws-lambda-function-concurrency/

Hi jshhrrsn,

Glad you got it sorted; no idea why the reserved concurrency would be set to zero, very odd.

Thanks for the link too, will give it a read.

DrColza