'serverless deploy' command (aws python service with dependencies) hangs when downloading files to cache directory

I am trying to deploy a Python 3.6 Lambda service using the ‘serverless deploy’ command. It hangs at this step: “Serverless: Using download cache directory /Users/…/Library/Caches/serverless-python-requirements/downloadCacheslspyc”

If I leave it hanging for long enough, my Mac warns that it is running out of disk space, which leads me to believe there is some loop, or that too many dependencies are being downloaded.

Here is the content of my requirements.txt file:
cycler==0.10.0
kiwisolver==1.1.0
matplotlib==3.1.1
numpy==1.17.0
pandas==0.25.0
pyparsing==2.4.2
python-dateutil==2.8.0
pytz==2019.2
six==1.12.0

Here is what I have added to my serverless.yml file:
provider:
  name: aws
  runtime: python3.6

plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    dockerizePip: non-linux

package:
  exclude:
    - a/** # virtualenv

Why would it hang here? Are my dependencies too large? Could this be related to my Docker settings?


Hey, did you ever get a fix for this?
My serverless deploy command is just hanging as well (at “Running docker …”).

While it looks like the requirements were generated from the right place:
Generated requirements from /Users/mikeb/dev/proj123/greatAPI/requirements.txt

What looks a bit suspect is that the “Installing Requirements from” step points at /Users/mikeb/Library/Caches/serverless-python-requirements/0cf3759736038430b48f8f5894d3500ba566baf01f99172d17defe97148b4952_slspyc/requirements.txt
(but actually, this is just a copy of the requirements.txt file in my project).
I have done everything from within a virtual environment.
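
For anyone else poking at that cache path: the plugin documents a few cache-related options that can be used to move or disable the cache while debugging. This is only a sketch based on my reading of the serverless-python-requirements docs, so double-check the option names against the plugin version you have installed:

custom:
  pythonRequirements:
    dockerizePip: non-linux
    # cache options (verify against your plugin version)
    useDownloadCache: false      # skip the shared pip download cache
    useStaticCache: false        # skip the cached, already-built requirements
    # cacheLocation: ./.requirements-cache   # or point the cache at a disk with more space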

UPDATE: it turns out my libraries and package dependencies were just too large. I had to slim them down under custom > pythonRequirements with some slim globbing (it’s all in the serverless-python-requirements docs).
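
In case it helps, this is roughly what the slimming config looked like. It is only a sketch based on the slim options described in the serverless-python-requirements docs; the extra glob patterns below are just examples and should be adjusted for your own packages:

custom:
  pythonRequirements:
    dockerizePip: non-linux
    slim: true              # strips *.pyc, __pycache__, dist-info, etc. from the package
    slimPatterns:           # additional glob patterns to drop (example patterns, adjust as needed)
      - '**/*.egg-info*'
      - '**/tests/**'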