SLS deployment fails with EPIPE

Hi Serverless Community,

I have been using the Serverless Framework for a while, and recently I noticed that deployments stop and die with "Recoverable error occurred (Error: write EPIPE" errors. Based on some googling I think the S3 upload fails, as I do not see the artifact in the deployment bucket. I have tried uploading the artifact manually, setting timeouts with export AWS_CLIENT_TIMEOUT=30000, and upgrading to various versions of both sls itself and Node. However, I am not able to overcome the issue. Has anyone else faced a similar issue?

The artifact I am trying to upload is only 7-8 MB.

The exact versions I am using:
Your Environment Information ---------------------------
Operating System: darwin
Node Version: 16.2.0
Framework Version: 2.43.0
Plugin Version: 5.1.3
SDK Version: 4.2.2
Components Version: 3.10.0

The exact error message:
Serverless: Uploading service file to S3 (7.75 MB)…
Serverless: Recoverable error occurred (Error: write EPIPE
at WriteWrap.onWriteComplete [as oncomplete] (node:internal/stream_base_commons:94:16)
----------------------------------------------------------------------------------------------------), sleeping for ~4 seconds. Try 1 of 4
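For anyone hitting the same thing, this is roughly how I confirmed the artifact never reached the bucket. The stack name my-service-dev is a placeholder for your own stack; ServerlessDeploymentBucket is the framework's default logical ID for the deployment bucket:

```shell
# Placeholder stack name below; adjust to your service/stage.
BUCKET=$(aws cloudformation describe-stack-resource \
  --stack-name my-service-dev \
  --logical-resource-id ServerlessDeploymentBucket \
  --query StackResourceDetail.PhysicalResourceId \
  --output text)
# List everything in the deployment bucket; the new zip should appear here.
aws s3 ls "s3://${BUCKET}" --recursive
```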

Many thanks,

Hi All,

Based on further testing and checks, this issue seems to be somehow connected to the M1 Mac I am using. Strangely, the exact same setup on this Mac works with very small (few kB) zip files, and the bigger files with the same config deploy fine from my older Intel-based Mac.
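One thing worth trying on Apple Silicon (I have not fully verified this, so treat it as a sketch) is forcing the deploy to run as x86_64 under Rosetta 2:

```shell
# Install Rosetta 2 if it is not present yet (Apple Silicon only).
softwareupdate --install-rosetta --agree-to-license
# Run the deploy under x86_64 emulation; assumes sls is on PATH.
arch -x86_64 sls deploy
```

You can check which architecture Node is actually running as with node -p process.arch.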

For now I am using a very hacky script to manually upload the generated artifacts and then finish the CloudFormation update:

#!/bin/bash
# Temporary solution for uploading sls artifacts
# and finishing the CloudFormation updates.
# Usage: run from the service root after `sls package`.
# ---------------------------------------------

# Determine artifacts to upload
stackname=$(cat ${PWD}/.serverless/serverless-state.json | jq '.service.provider.stackName' -r)
profile=$(cat ${PWD}/.serverless/serverless-state.json | jq '.service.provider.profile' -r)
artifacts=$(cat ${PWD}/.serverless/cloudformation-template-update-stack.json | jq 'getpath( paths(has("S3Bucket")?) ) | [.S3Bucket,.S3Key] | join("/") ' -r ) 

echo "Executing upload in ${PWD} directory with ${profile} profile and updating stack ${stackname}"
for i in ${artifacts}; do
  file=$(echo "${i}" | awk -F/ '{print $NF}')
  echo "Uploading ${file}:"
  aws s3 cp "${PWD}/.serverless/${file}" "s3://${i}" --profile "${profile}"
done

# If all uploaded then update the stack.
aws cloudformation update-stack \
  --stack-name "${stackname}" \
  --template-body "file://${PWD}/.serverless/cloudformation-template-update-stack.json" \
  --capabilities CAPABILITY_NAMED_IAM \
  --profile "${profile}"
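If it helps anyone, this is roughly how I run it. The file name upload-artifacts.sh is just my choice, not anything the framework expects:

```shell
# Regenerate .serverless/ locally without attempting the S3 upload,
# then run the script above (hypothetical file name) from the service root.
sls package
bash upload-artifacts.sh
```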