Use a cross-stack S3 bucket as an S3 event trigger

I have two stacks deployed with the Serverless Framework on AWS. I want to share an S3 bucket between stack A and stack B, and use that bucket as a Lambda trigger.

On stack A, I have exported the output as follows:

Outputs:
  EvidenceEventsBucketName:
    Value:
      Ref: EvidenceEventsKinesisFirehoseBucket          
    Export:
      Name: ${self:service}-evidences-events-bucket-${opt:stage, self:provider.stage}

And on stack B, I’m trying to import it:

evidenceDataSent:
  handler: evidences.handlers.process_sent_events
  memorySize: 512
  timeout: 900      
  events:
    - s3:
        bucket: 
          Fn::ImportValue: 'twilio-stream-kinesis-evidences-events-bucket-prod'
        event: s3:ObjectCreated:*
        rules:
          - prefix: raw/

Something is wrong, because when I try to deploy it I get the following error:

Type Error ---------------------------------------------

name.replace is not a function

How can I share a bucket between stacks?

The Serverless Framework's `cf` variable is a good way of importing another stack's outputs.

It may look like just an abstraction of the intrinsic function you are already using, but there is an important difference: `Fn::ImportValue` remains an object in the compiled template, while the `cf` variable is resolved at package time into a plain string, which is what the `s3` event expects. That is why you see `name.replace is not a function` — the plugin is calling a string method on an intrinsic-function object.

Worth a try?
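As a sketch, assuming stack A's deployed CloudFormation stack is named `twilio-stream-kinesis-prod` (substitute your actual stack name), stack B could reference the output key directly:

```yaml
# serverless.yml of stack B (sketch — stack name and output key assumed
# to match stack A's deployment; adjust to your setup)
functions:
  evidenceDataSent:
    handler: evidences.handlers.process_sent_events
    memorySize: 512
    timeout: 900
    events:
      - s3:
          # ${cf:stackName.outputKey} resolves to a plain bucket-name string
          # at package time, unlike Fn::ImportValue which stays an object
          bucket: ${cf:twilio-stream-kinesis-prod.EvidenceEventsBucketName}
          event: s3:ObjectCreated:*
          existing: true  # the bucket is created in stack A, not in this stack
          rules:
            - prefix: raw/
```

Note that `${cf:...}` references the output's key (`EvidenceEventsBucketName`), not its export name, and `existing: true` (available since Serverless Framework v1.47) tells the framework to attach the notification to a bucket it does not own instead of trying to create it.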