Step Functions: reduce Lambda code size

I have a Step Function running, but the package for each Lambda is exactly the same size, and I am now getting the error Code storage limit exceeded when I try to deploy it.

My setup is:

➜  ingest git:(feature/ingest) ✗ tree -L 2
├── authorization
│   └── index.js
├── balance
│   └── index.js
├── config.js
├── dynamodb
│   └── index.js
├── execute_steps
│   └── index.js
├── ffprobe
│   └── index.js
├── get_assembly
│   └── index.js
├── index
│   └── index.js
├── Makefile
├── node_modules
│   ├── acorn
│   └── yargs-parser
├── package.json
├── serverless.yml
├── transloadit
│   └── index.js
├── upload
│   └── index.js
├── waiting_user_choice
│   └── index.js
└── yarn.lock

So I have about 8 Lambda functions in this Step Function. What is the correct way to deploy this and reduce the size of each function?

Any advice is much appreciated.

It doesn’t sound like you’re packaging your functions individually.

Keep in mind too that the packaging settings changed in the v1.11 release.

In my project the node_modules directory and package.json are in the root of the project, and when adding:

  individually: true

to the serverless.yml, the deploy command hangs on Uploading function .zip files to S3...:

➜  ingest git:(feature/ingest) ✗ yarn deploy-dev
yarn deploy-dev v0.22.0
$ rm -rf node_modules && yarn install --production && AWS_PROFILE=dev sls deploy --stage dev --region us-east-1 
yarn install v0.22.0
[1/4] Resolving packages...
[2/4] Fetching packages...
[3/4] Linking dependencies...
[4/4] Building fresh packages...
Done in 4.28s.
Serverless: Packaging service...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading function .zip files to S3...

Would I have to exclude node_modules for each function and then individually include the modules needed for that specific function, as per ?

Unfortunately yes, you would.

For more complex functions (i.e. those with many dependencies) I usually have a subdirectory per function, and can then package it by directory.
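As a sketch, that per-directory approach might look like this in serverless.yml (the function name and handler export are hypothetical; the globs follow Serverless v1 packaging semantics):

```yaml
functions:
  ffprobe:
    handler: ffprobe/index.handler   # hypothetical handler export
    package:
      exclude:
        - '**'            # start by excluding everything...
        - '!ffprobe/**'   # ...then re-include only this function's directory,
                          # including its own node_modules
```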

If you are going to do it off a shared node_modules directory, I suggest you use the --legacy-bundling option. From the docs:

The --legacy-bundling argument will cause npm to install the package such that versions of npm prior to 1.4, such as the one included with node 0.8, can install the package. This eliminates all automatic deduping.

This means each module will have all of its dependencies contained in its own directory.

Cool, this works:

    plugins:
      - serverless-step-functions

    package:
      individually: true
      exclude:
        - yarn.lock
        - .git/*

    functions:
      getSignedUrl:
        handler: authorization/index.getSignedUrl
        package:
          exclude:
            - node_modules/**
            - '!node_modules/jsonwebtoken/**'
            - '!node_modules/aws-sdk/**'
            - '!node_modules/moment/**'
            - '!node_modules/uuid/**'
        events:
          - http:
              path: get-upload-url
              method: post
              integration: lambda

For getSignedUrl this will include only:

        - '!node_modules/jsonwebtoken/**'
        - '!node_modules/aws-sdk/**'
        - '!node_modules/moment/**'
        - '!node_modules/uuid/**'

But @rowanu, how do you deal with the included modules' dependencies, as those will also need to be included?

That’s what the --legacy-bundling argument does - it installs a module’s dependencies in the module’s directory.

Another thing you can look at is moving your dev dependencies into a node_modules in the parent directory (the one above your service folder). They'll still be available to Serverless, but they won't be packaged with your code. While it won't give you the smallest possible package for each function, it is a quick win to remove packages like the aws-sdk, compilers/transpilers, test frameworks, and plugins (and their dependencies).

That is a good idea, to move dev dependencies, thanks, and yes, using --legacy-bundling does fix the issue. I am using the aws-sdk in my functions to access DynamoDB and S3 - is there an alternative way to package these?

See the docs

The following libraries are available in the AWS Lambda execution environment, regardless of the supported runtime you use, so you don’t need to include them:

  • AWS SDK – AWS SDK for JavaScript version 2.22.0
  • AWS SDK for Python (Boto 3) version 1.4.4, Botocore version 1.5.13
  • Amazon Linux build of java-1.8.0-openjdk for Java.

If you can limit yourself to the currently included version, then there's no need to include it in the upload package to use it.
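So, assuming you can live with the runtime's bundled SDK version, one option is simply to exclude it from your package (a sketch; the glob assumes aws-sdk sits at the top level of node_modules):

```yaml
package:
  exclude:
    - node_modules/aws-sdk/**   # already provided by the Lambda runtime
```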

Hi, thanks for the replies. I have one issue which I find strange: if I enable individually: true for a specific function, as:

    handler: execute_steps/index.executeSteps
    package:
      individually: true
      exclude:
        - node_modules/**
        - '!node_modules/jsonwebtoken/**'
        - '!node_modules/request/**'
    events:
      - http:
          path: choice-start
          method: post
          integration: lambda

I get an error about missing module:

START RequestId: 09714a20-26cd-11e7-9dc5-4f7f268fbac2 Version: $LATEST
Unable to import module 'execute_steps/index': Error
    at Function.Module._resolveFilename (module.js:469:15)
    at Function.Module._load (module.js:417:25)
    at Module.require (module.js:497:17)
    at require (internal/module.js:20:19)
    at Object.<anonymous> (/var/task/node_modules/jsonwebtoken/node_modules/joi/lib/date.js:7:14)
    at Module._compile (module.js:570:32)
    at Object.Module._extensions..js (module.js:579:10)
    at Module.load (module.js:487:32)
    at tryModuleLoad (module.js:446:12)
    at Function.Module._load (module.js:438:3)
END RequestId: 09714a20-26cd-11e7-9dc5-4f7f268fbac2
REPORT RequestId: 09714a20-26cd-11e7-9dc5-4f7f268fbac2	Duration: 57.48 ms	Billed Duration: 100 ms 	Memory Size: 128 MB	Max Memory Used: 25 MB	

I deploy my function as:

$ rm -rf node_modules && npm install --production --legacy-bundling && AWS_PROFILE=dev sls deploy --stage dev --region us-east-1

Otherwise it works, if I comment out the package block.

I have checked the function code, and I am only using aws-sdk, request and jsonwebtoken within it.

Any advice is much appreciated.