Serverless Error: S3BucketAttachments - test already exists

Hey guys, I'm trying to deploy a Lambda with an S3 upload event as its trigger. I've got my S3 bucket broken out into its own .yml file, which I bring into serverless.yml on deployment. I've been following the custom bucket configuration example in the docs to the letter (https://serverless.com/framework/docs/providers/aws/events/s3/#custom-bucket-configuration), and I keep getting the same error: the bucket already exists.
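If I'm reading that doc right, the convention is that the framework derives the bucket resource's logical ID as "S3Bucket" plus the title-cased bucket key from the event, and a custom bucket resource has to reuse that exact ID. A minimal sketch of the pattern as I understand it (the function, bucket, and name below are placeholders, not my real config):

functions:
  resize:
    handler: resize.main
    events:
      - s3:
          bucket: photos
          event: s3:ObjectCreated:*

resources:
  Resources:
    # Logical ID must be "S3Bucket" + the title-cased bucket key ("photos" -> "Photos")
    S3BucketPhotos:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: some-globally-unique-name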

I've been trying to debug based on this post: https://github.com/serverless/serverless/issues/3257

I'm not sure what I'm doing wrong here. Am I not following the right naming convention?
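For reference, here's the pairing I believe the convention requires in my case (assuming I've understood how the generated logical IDs are built):

# event side of the pairing
- s3:
    bucket: attachments

# resources side of the pairing
S3BucketAttachments:                     # "S3Bucket" + "Attachments"
  Type: AWS::S3::Bucket
CsvHandlerLambdaPermissionAttachmentsS3: # "{Function}LambdaPermission{Bucket}S3"
  Type: AWS::Lambda::Permission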

serverless.yml
service: farm-app-api

# Use the serverless-webpack plugin to transpile ES6
plugins:
  - serverless-webpack
  - serverless-offline

custom:
  # Stage is based on what is passed in when running serverless
  # commands. Or falls back to what has been set in the provider section.
  stage: ${opt:stage, self:provider.stage}
  # Set the table name here so it can be used while testing locally
  # Change table name to farms
  tableName: ${self:custom.stage}-farms
  # Set DynamoDB throughput for prod, with a default for all other stages.
  tableThroughputs:
    prod: 5
    default: 1
  tableThroughput: ${self:custom.tableThroughputs.${self:custom.stage}, self:custom.tableThroughputs.default}
  # Load webpack config
  webpack:
    webpackConfig: ./webpack.config.js
    includeModules: true
  # Load secret environment variables based on the current stage.
  # Fall back to the default block if the stage isn't defined in env.yml.
  environment: ${file(env.yml):${self:custom.stage}, file(env.yml):default}

provider:
  name: aws
  runtime: nodejs8.10
  stage: dev
  region: us-east-1

  # These environment variables are made available to our functions
  # under process.env.
  environment:
    tableName: ${self:custom.tableName}
    stripeSecretKey: ${self:custom.environment.stripeSecretKey}

  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:DescribeTable
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      # Restrict IAM role permissions to
      # the specific table for the stage
      Resource:
        # Changed to LivestockTable to test building the new DynamoDB table
        - "Fn::GetAtt": [ LivestockTable, Arn ]

functions:
  csvHandler:
    # Defines an S3 bucket event that calls the main function in csvHandler.js
    # Also defines an HTTP API endpoint that calls the same function
    # - path: url path is /farms/{id}
    # - method: POST request
    handler: csvHandler.main
    events:
      - s3:
          bucket: attachments
          event: s3:ObjectCreated:*
          rules:
            - suffix: .csv
          # existing: true
      - http:
          path: farms/{id}
          method: post
          cors: true
          authorizer: aws_iam

resources:
  # API Gateway Errors
  - ${file(resources/api-gateway-errors.yml)}
  # DynamoDB
  - ${file(resources/dynamodb-table.yml)}
  # S3
  - ${file(resources/s3-bucket.yml)}
  # Cognito
  - ${file(resources/cognito-user-pool.yml)}
  - ${file(resources/cognito-identity-pool.yml)}

resources/s3-bucket.yml
Resources:
  S3BucketAttachments:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: test
      # Set the CORS policy
      CorsConfiguration:
        CorsRules:
          -
            AllowedOrigins:
              - '*'
            AllowedHeaders:
              - '*'
            AllowedMethods:
              - GET
              - PUT
              - POST
              - DELETE
              - HEAD
            MaxAge: 3000
  # Configuring lambda handler permissions for s3 event
  CsvHandlerLambdaPermissionAttachmentsS3:
    Type: 'AWS::Lambda::Permission'
    Properties:
      FunctionName:
        'Fn::GetAtt':
          - CsvHandlerLambdaFunction
          - Arn
      Principal: 's3.amazonaws.com'
      Action: 'lambda:InvokeFunction'
      SourceAccount:
        Ref: AWS::AccountId
      SourceArn: 'arn:aws:s3:::test'
# Print out the name of the bucket that is created
#Outputs:
#  AttachmentsBucketName:
#    Value:
#      Ref: S3BucketAttachments
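
One more note: the existing: true line I've commented out in the event config is, as far as I can tell, the alternative for attaching the trigger to a bucket that already exists instead of having the stack create one. If I'm reading the release notes right it needs framework 1.47+, which is why I've left it commented out; for reference it would look like:

events:
  - s3:
      bucket: attachments
      event: s3:ObjectCreated:*
      rules:
        - suffix: .csv
      existing: true # attach to a pre-existing bucket instead of creating one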