My goal is to set up a file uploading/notifying/parsing/deleting pipeline based on AWS S3 and Lambda.
A user uploads a file to an S3 bucket, which triggers an event that causes the Lambda function to execute.
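For reference, this is roughly the serverless.yml setup I have in mind (service, bucket, and handler names below are just placeholders):

```yaml
service: upload-pipeline

provider:
  name: aws
  runtime: python3.9

functions:
  processUpload:
    handler: handler.process_upload   # placeholder handler
    events:
      - s3:
          bucket: my-upload-bucket    # placeholder bucket name
          event: s3:ObjectCreated:*   # fire on any object upload
```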
Apparently, this is currently not supported?
So without using that plugin, is there any way to keep a bucket and its contents around between deploys? It seems rather pointless to re-create the bucket (deleting all of its previous contents) every time I deploy the Lambda with slight changes.
Running `sls deploy` creates the bucket. If I change the bucket name, the original bucket is deleted. However, I noticed that if there are objects in the bucket, the deploy fails (which is much better than silently deleting the files in the bucket).
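One workaround I've considered (though I'm not sure how it interacts with the s3 event wiring above) is declaring the bucket myself under `resources` with `DeletionPolicy: Retain`, so CloudFormation leaves the bucket and its objects alone when the stack is removed or the resource is replaced:

```yaml
resources:
  Resources:
    UploadBucket:
      Type: AWS::S3::Bucket
      DeletionPolicy: Retain          # keep the bucket and its contents on stack delete/replace
      Properties:
        BucketName: my-upload-bucket  # placeholder bucket name
```

Is that the right approach here, or does it conflict with the bucket that the s3 event definition creates automatically?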