I’m trying to create a prod/dev deployment that triggers on an S3 write event.
The deployment part works: when code is pushed to the dev/master branch, the relevant artefacts are created along with the correct S3 buckets, and in the Serverless dashboard the functions are listed with the correct S3 bucket showing in the subscriptions section.
However, when a file with the correct suffix is written to either the prod or dev bucket, no Lambda is triggered. Because the event never fires there is nothing to debug from (e.g. no logs), so any help here would be greatly appreciated.
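For what it’s worth, one check is to inspect the notification configuration actually attached to the bucket (the bucket name below is a placeholder):

```shell
# Placeholder bucket name; substitute the real dev/prod bucket from environment.yml
aws s3api get-bucket-notification-configuration --bucket my-bucket-dev
```

An empty response would mean no trigger was attached to the bucket, even though the dashboard shows the subscription.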
Below is a minimal version of the serverless.yml:
```yaml
service: my-service

provider:
  name: aws
  runtime: python3.7
  region: eu-west-1
  stage: ${opt:stage, 'dev'}
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:ListBucket
      Resource:
        - "arn:aws:s3:::${self:custom.bucket}"
    - Effect: Allow
      Action:
        - s3:ListBucket
        - s3:GetObject
        - s3:GetObjectAcl
        - s3:PutObject
        - s3:CopyObject
        - s3:GetObjectTagging
        - s3:PutObjectTagging
        - s3:PutObjectAcl
      Resource:
        - "arn:aws:s3:::${self:custom.bucket}/*"

functions:
  my-func:
    name: my-service-${opt:stage}-my-func
    handler: handler.handler
    events:
      - s3:
          bucket: ${self:custom.bucket}
          event: s3:ObjectCreated:*
          rules:
            - suffix: .csv
          existing: true
    environment:
      BUCKET: ${self:custom.bucket}

custom:
  pythonRequirements:
    dockerizePip: true
    zip: true
    slim: true
    layer: false
  bucket: ${file(environment.yml):stages.${self:provider.stage}.bucket}
```
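For reference, handler.py boils down to something like this (simplified; the real code does more with the CSV):

```python
def handler(event, context):
    """Log the bucket/key for every S3 record in the incoming event."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"Received s3://{bucket}/{key}")
    return {"statusCode": 200}
```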
The external environment.yml file contains the bucket names for prod/dev, which are created and shown as the subscriptions in the dashboard. As I said, the deployment works fine; it’s just the function invocation that isn’t taking place.
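The environment.yml file has this shape (bucket names here are placeholders):

```yaml
stages:
  dev:
    bucket: my-bucket-dev
  prod:
    bucket: my-bucket-prod
```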
Any advice on what I’m doing wrong or how to debug this would be greatly appreciated.
Many thanks!