We are using the Serverless Framework to deploy a fairly simple serverless API: a couple of Lambda functions hooked up via API Gateway, pretty traditional stuff I suppose.
My question is the following:
Whenever I deploy the project from my local machine, using my configured AWS profile, I get a file size of around 25 MB for each of my lambdas.
However, if a partner of mine deploys using the exact same serverless.yml file BUT with his AWS profile, the lambdas come out at around 3 MB each. Is there some way a plugin or config could be set globally on my machine that affects my deployments? To be honest I am not sure why this is happening, and any assistance would be very much appreciated.
Thanks a lot in advance
We had a similar issue, and it turned out to be caused by caching slimmed versions of libraries. We were using the serverless-python-requirements plugin for packaging Python dependencies.
The plugin supports two kinds of caching, and both are currently ENABLED by default. First, a download cache that caches the downloads pip needs to compile the packages. Second, what the plugin calls “static caching”, which caches pip's output after compiling everything for your requirements file. Since requirements.txt files rarely change, the static cache usually gives a large speed improvement. These caches are shared between all your projects unless a custom cacheLocation is specified.
So, what had happened in our case was that the second person had enabled the “slim” parameter and cached the slimmed versions of all the libraries on his local machine. Since the plugin uses the cache by default, his final deployment package kept using those slimmed versions across all his projects. The first person didn't use slim, so his package size was considerably larger.
A way to avoid this confusion is to configure the serverless-python-requirements plugin explicitly in your serverless.yml:
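Here is a sketch of such a configuration. The option names (`slim`, `useStaticCache`, `useDownloadCache`, `cacheLocation`) come from the plugin's README; whether you want `slim` on or off is a team decision, the point is to set it explicitly so every machine packages the same way:

```yaml
plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    slim: true              # make slimming explicit so all machines package identically
    useStaticCache: false   # don't reuse previously compiled requirements
    useDownloadCache: false # don't reuse pip's downloads
    # cacheLocation: ./.requirements-cache  # alternatively, pin a per-project cache
```

With caching disabled (or pinned to a per-project `cacheLocation`) and `slim` set explicitly, both profiles should produce packages of the same size. If a stale global cache is suspected, the plugin's README also documents a `sls requirements cleanCache` command to purge it.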