Edit: As embarrassed as I am to admit this, it turns out that the only thing I was doing wrong was using a version of serverless that was too old to support the (relatively new) existing: true option. I was running 1.45.x, and apparently this feature didn't make it in until 1.47.x or 1.48.x. Either way, upgrading serverless to the latest version solved my issue!
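For anyone who hits the same thing: serverless.yml supports a frameworkVersion key that makes a deploy fail fast when the installed CLI is outside a declared semver range, instead of silently ignoring options it doesn't understand. A minimal sketch, assuming 1.47.0 as the floor (the exact release that introduced existing: true may differ):

```yaml
# serverless.yml
# Abort the deploy early if the installed Serverless CLI is too old
# to understand options like `existing: true` (assumed floor: 1.47.0).
frameworkVersion: '>=1.47.0 <2.0.0'
```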
Hello! I’m trying to follow some better practices (best practices might be debatable!) when it comes to splitting up my serverless projects, so that when I want to redeploy lambda functions / api gateway configurations I have fewer tendrils into things like S3 and DynamoDB (basically, data that’s persisted and can’t be safely deleted lives in a different service). I’ve been following https://serverless-stack.com/chapters/s3-as-a-serverless-service.html and it’s been super useful so far. The basic goal is to be able to create an event configuration in one service that listens for events on a bucket set up in a different service.
The problem is that when I deploy the service with the event / lambda / function configuration, I get an error:

An error occurred: S3BucketTeststoreprojectorderimportdev - test-store-test-order-import-dev already exists in stack arn:aws:cloudformation:us-west-2:558299803503:stack/test-store-s3-DEV/b3a3c580-a9ff-11e9-a091-0ae846f1e916.

This is saying that test-store-test-order-import-dev already exists, which is true, it does, but as you can see in the (truncated) yml below, I am using the existing: true option with the understanding that it shouldn’t try to re-create the bucket:
# first serverless.yml with my lambda functions / events...
provider:
  name: aws
  ...
  iamRoleStatements:
    - Effect: Allow
      Action:
        - 's3:*'
      Resource:
        - 'Fn::ImportValue': ${self:custom.alias}-ProjectOrderProcessingBucketArn

functions:
  test-order-import:
    name: store-test-order-import
    handler: src/functions/project/test-order-import.handle
    environment:
      PROJECT_STORE_CODE: Test
    events:
      - s3:
          bucket: orders-${self:custom.lower-case-suffix.${opt:alias,'DEV'}}
          event: s3:ObjectCreated:*
          rules:
            - prefix: import/
            - suffix: .csv
          existing: true
In my (much simpler) s3 / bucket serverless service project, the relevant bits are here:
# second serverless.yml with config for the S3 bucket(s)
resources:
  Resources:
    ProjectOrderProcessingBucket:
      Type: AWS::S3::Bucket
      DeletionPolicy: Retain
      Properties:
        BucketName: ${self:custom.bucket-name-test-order-import}${self:custom.lower-case-suffix.${opt:alias,'DEV'}}
  Outputs:
    ProjectOrderProcessingBucketArn:
      Value:
        Fn::GetAtt:
          - ProjectOrderProcessingBucket
          - Arn
      Export:
        Name: ${self:custom.alias}-ProjectOrderProcessingBucketArn
    ProjectOrderProcessingBucketName:
      Value:
        Ref: ProjectOrderProcessingBucket
      Export:
        Name: ${self:custom.alias}-ProjectOrderProcessingBucket
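As an aside, since the bucket service already exports the bucket name as a stack output, the consuming service could resolve it at deploy time rather than re-deriving the name from custom variables. A sketch using the framework's ${cf:stackName.outputKey} variable, assuming the stack name test-store-s3-DEV from the error message above (it would need to vary per stage):

```yaml
# first serverless.yml -- alternative to hard-coding the bucket name.
# ${cf:...} reads a CloudFormation stack output at deploy time.
functions:
  test-order-import:
    events:
      - s3:
          bucket: ${cf:test-store-s3-DEV.ProjectOrderProcessingBucketName}
          event: s3:ObjectCreated:*
          existing: true
```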
Now it may be worth noting that I get this same error regardless of whether or not I include the iamRoleStatements block:
iamRoleStatements:
  - Effect: Allow
    Action:
      - 's3:*'
    Resource:
      - 'Fn::ImportValue': ${self:custom.alias}-ProjectOrderProcessingBucketArn
Besides this, I’m not doing anything too “crazy”; my service configuration is actually pretty simple.
I’d greatly appreciate any thoughts, input, or ideas, or even steps I can take to debug this. Maybe this is a bug? Looking at https://serverless.com/framework/docs/providers/aws/events/s3#using-existing-buckets, it seems like this should “just work”, but clearly I’m doing something wrong (or, well, there is a bug!).