Calling Lambdas through S3 events from an existing bucket

My goal is to set up a file uploading/notifying/parsing/deleting pipeline based on AWS S3 and Lambda.

A user can upload a file to an S3 bucket, which triggers an event that causes the Lambda function to run.
Apparently, wiring the Lambda to an existing bucket is currently not supported?

So without using that plugin, is there any way to keep a bucket and its contents around? It seems rather pointless to re-create the bucket (and delete all its previous contents) every time I deploy the lambda with slight changes.


Why do you think this is going to happen?

Perhaps the original poster was referring to my example below.

    functions:
      resize:
        handler: transformer_handler.thumb
        events:
          - s3:
              bucket: my-bucket-name-1
              event: s3:ObjectCreated:*

Running sls deploy will create the bucket. If I change the bucket name, the original bucket is deleted. However, I noticed that if there are objects in the bucket, the deploy fails (which is much better than deleting the files in the bucket).
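
For what it's worth, newer versions of the Serverless Framework (1.47+, if I remember right) can attach the S3 event to a bucket that already exists instead of creating one, via the existing flag. A minimal sketch, assuming that version and a pre-created bucket (my-existing-bucket is a placeholder name):

    functions:
      resize:
        handler: transformer_handler.thumb
        events:
          - s3:
              # the bucket must already exist; the framework will not create or delete it
              bucket: my-existing-bucket
              event: s3:ObjectCreated:*
              existing: true

With that flag, the deploy no longer manages the bucket's lifecycle, so redeploying with changes shouldn't touch the bucket or its contents.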