Usage of -f affecting S3 upload

When I run

‘sls deploy’

my package file (which is about 72 MB) gets uploaded to S3 (“Uploading service .zip file to S3 (72 MB)…”), whereas when I run

‘sls deploy -f function-name’

I get the following error because the package is too big:

Request must be smaller than 69905067 bytes for the UpdateFunctionCode operation

Any help would be extremely appreciated. Does specifying the function to deploy somehow bypass the S3 upload process? Why the difference between “Uploading function: function-name” and “Uploading service .zip file to S3 (72 MB)”?
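
For what it’s worth, my rough mental model of the two paths (unverified, sketched here with boto3; the bucket, key, and artifact path are made up for illustration) is:

```python
# Unverified sketch of the two update paths I think are in play, using boto3
# directly. Bucket, key, and artifact path are hypothetical.
import boto3

lambda_client = boto3.client("lambda")
s3_client = boto3.client("s3")

ZIP_PATH = ".serverless/my-service.zip"      # hypothetical local artifact
BUCKET = "my-serverless-deployment-bucket"   # hypothetical deployment bucket

def update_via_s3(function_name: str) -> None:
    # What a full deploy seems to do: stage the zip in S3, then point the
    # function at the S3 object, so the UpdateFunctionCode request stays small.
    key = "serverless/my-service/code.zip"    # hypothetical S3 key
    s3_client.upload_file(ZIP_PATH, BUCKET, key)
    lambda_client.update_function_code(
        FunctionName=function_name,
        S3Bucket=BUCKET,
        S3Key=key,
    )

def update_inline(function_name: str) -> None:
    # What `sls deploy -f` appears to do: send the zip bytes in the request
    # itself, which is where the 69905067-byte request limit bites.
    with open(ZIP_PATH, "rb") as f:
        lambda_client.update_function_code(
            FunctionName=function_name,
            ZipFile=f.read(),
        )
```

If that’s accurate, a 72 MB package would only fail on the inline path, which matches what I’m seeing.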

Thanks


At the time, we stopped passing the -f arg, and that let us make progress, but we may need to pass -f again and will then run into this problem again. Has anybody else run into this issue?

Any progress on this? I’m experiencing the same issue: a full deploy works, but a function-only deploy fails, which is kind of weird.