Package artifact not finding my s3 bucket

I’ve set my serverless.yml with:

package:
  artifact: s3://_a_private_bucket_/_sls_app_.zip

as shown in Serverless Framework Guide - AWS Lambda Guide - Packaging

But when I run sls deploy I get this flabbergasting error:

 Exception -----------------------------------------------

  [OperationalError: ENOENT: no such file or directory, open 's3://_a_private_bucket_/_sls_app_.zip'] {

However, I have no problem accessing it with aws s3 ls s3://_a_private_bucket_/_sls_app_.zip.

Besides, I’ve been able to deploy other, similar projects using my S3 bucket.

I’m trying to deploy a project whose artifact is bigger than 260 MB uncompressed (so could it fail even if I get the S3 reference to work?).
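The ENOENT from open() suggests that, at least in this version of the framework, package.artifact is being opened as a local filesystem path rather than fetched from S3. A possible workaround (paths and names below are the placeholders from above, not verified values) is to download the artifact as a separate step and point serverless.yml at the local copy:

```yaml
# serverless.yml — sketch of a workaround: reference a local copy
# of the artifact instead of an s3:// URI
package:
  artifact: ./_sls_app_.zip
```

with a preceding CI step such as `aws s3 cp s3://_a_private_bucket_/_sls_app_.zip ./_sls_app_.zip` before running `sls deploy`.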

Hi

Did anyone find a resolution to this? I seem to have the exact same problem.

  1. A package >100 MB which needs uploading via S3
  2. It can be uploaded successfully via the AWS console
  3. A GitLab script uploads the package to S3, and aws s3 ls verifies its existence
  4. The same GitLab script is unable to access the package via serverless.yml

package:
  artifact: s3://private-bucket/package.zip

Error --------------------------------------------------

Error: ENOENT: no such file or directory, open 's3://private-bucket/package.zip'

No, I ended up using Docker.

Thanks! I was fortunate in that my package was bloated; after a cleanup, the actual size was much smaller and I no longer needed to use S3.