Lambda Layers & Node & require the layer code & sls invoke local

Hi - I'm picking up the new Lambda Layers feature. I've successfully deployed my layer, but I'm struggling to use it from another serverless project.

I’ve read the writeup here: https://serverless.com/blog/publish-aws-lambda-layers-serverless-framework/
But it's not clear how to include/require the layer in your local code…

I don't have a compiled executable like the example they show; I'm trying to use plain old JavaScript in the layer and want to require it into a lambda in another serverless project. Does anyone have a simple working example?

Not sure if this helps you, but I've found that installing the layer-specific modules as development dependencies (npm install --save-dev ...) works for me: the libraries are available locally but excluded from the package/deploy process, and the layer provides them when executing on Lambda.

I don’t know of a better way, but it’s worked for me in my first use-case.
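As a sketch, the consuming service's package.json would then carry the layer's module under devDependencies only (the module name and version here are placeholders), so it resolves locally but is left out of the deployed package:

```json
{
  "name": "my-service",
  "version": "1.0.0",
  "devDependencies": {
    "some-layer-module": "^1.0.0"
  }
}
```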

2 Likes

My post on a similar idea: AWS Lambda Layers and local dev

Thank you @deltafactory, I ended up doing the same. That works locally, but now when I deploy and run I get 'Cannot find module' on my require in the lambda that uses the layer.

If I use node's fs to look at the lambda file system, I can see my module from the layer under /opt/node_modules, but it's not being picked up by the running lambda. Is there some other trick?

For anyone else reading in future, I missed this part: https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html#configuration-layers-path

In your layer, you need to put your local module under a runtime-specific folder such as nodejs/node8/node_modules.

So if your layer looks like:

/
serverless.yml
layer/nodejs/node8/node_modules/myLocalModule

where the layer config in the yml looks like:

layers:
  myLocalModule:
    path: layer
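For completeness, wiring that layer into a function might then look something like this (the commented ARN line shows consuming it from a different service; account, region, and version number are placeholders):

```yaml
functions:
  hello:
    handler: handler.hello
    layers:
      # same service: CloudFormation Ref (layer key TitleCased + "LambdaLayer")
      - { Ref: MyLocalModuleLambdaLayer }
      # another service: reference the published layer version by ARN (placeholder)
      # - arn:aws:lambda:us-east-1:123456789012:layer:myLocalModule:1
```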

Yeah, invoke local is tricky. We’re still figuring out how best to make layers work with that.

1 Like

thanks @dschep - great article, btw; it got me a long way here

Thanks for your helpful comments. How did you import the module after creating this folder structure?
I have a similar issue: /opt/nodejs/node6/node_modules is present at execution time, but I see Cannot find module… any help appreciated!

@mardh take a closer look at this link:
https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html#configuration-layers-path
specifically

To include libraries in a layer, place them in one of the folders supported by your runtime.

Node.js – nodejs/node_modules, nodejs/node8/node_modules (NODE_PATH)
Example AWS X-Ray SDK for Node.js
xray-sdk.zip
└ nodejs/node_modules/aws-xray-sdk

@srg-avai, thanks again. I created a folder structure nodejs/node_modules too (in a new layer version). I can see /opt/nodejs/node_modules is mounted, NODE_PATH includes /opt/nodejs/node_modules but no luck. For context, I’m attempting to access /opt/nodejs/node_modules/puppeteer via require('puppeteer').

I’ve just gone through the process of creating my own custom layer (global application code). To be able to run with sls invoke local I used npm link. Assuming your layer is here:

layers/nodejs/node_modules/some-module

Go into this directory and execute npm link. This will link the module globally.

Then to be able to use this module when running locally go to your home directory and execute npm link some-module. This will create ~/node_modules/some-module, which points to the global symlink, which points back to the module you have in the layers dir.

Doing this puts your layer module in Node's execution path, which means `require('some-module')` will work when running `sls invoke local`.

You might think, “why not run npm link some-module in my application directory?” - if you do this, sls deploy ends up deploying the layer code into the lambda function code package, which is redundant.

Hope this helps.
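The two steps above can be sketched as a self-contained shell session (the module name some-module and all paths are placeholders; npm's global prefix is redirected into a temp dir so the demo never touches your real global folder):

```shell
set -e
work=$(mktemp -d)
export npm_config_prefix="$work/global"   # keep npm's "global" links inside the temp dir

# A stand-in for the layer's module
mkdir -p "$work/layers/nodejs/node_modules/some-module"
cd "$work/layers/nodejs/node_modules/some-module"
printf '{ "name": "some-module", "version": "1.0.0" }\n' > package.json

npm link                        # step 1: register the module globally

# A stand-in for your home directory
mkdir -p "$work/home"
cd "$work/home"
npm link some-module            # step 2: creates node_modules/some-module -> global link

ls node_modules
```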

5 Likes

Hi, any thoughts on how to include/require node_modules in my lambda code locally before deploying through sls? I'm trying to deploy puppeteer and convert-svg-to-png to convert a base64 SVG image to PNG. Any success?

This method works for me: update the NODE_PATH environment variable in the serverless.yml of the lambda,

provider:
  tracing:
    lambda: true
  environment:
    NODE_PATH: "./:/opt/node_modules"

Hey jamiechong,

Thanks for the information.
Was wondering if there has been any progress on the matter?

I'm trying to deploy an AWS lambda using TypeScript (with webpack) that uses another AWS layer I've deployed.
I used your suggestion re the npm link (outside of the application folder), but webpack still bundled the npm package (my layer code) into the lambda's "handler.js".

If I "npm unlink", then sls deploy fails because it can't find the module that is hosted in the layer…
Any ideas?
Any ideas?

Thanks

This is my setup:

layers:
  NodeModules:
    path: nodejs
    compatibleRuntimes:
      - nodejs14.x

functions:
  reports:
    handler: src/index.handler
    environment:
      NODE_PATH: "./:/opt/nodejs/node14/node_modules" # NodeModulesLambdaLayer
    package:
      exclude:
        - package.json
        - node_modules/**
        - nodejs/node_modules/**
    layers:
      - { Ref: NodeModulesLambdaLayer }

If you have trouble with this part of AWS - Layers: Using your layers:

To use a layer with a function in the same service, use a CloudFormation Ref. The name of your layer in the CloudFormation template will be your layer name TitleCased (without spaces) with LambdaLayer appended to the end.

the following code will help you.

  • wrong
layers:
  test:
    path: layer
functions:
  hello:
    handler: handler.hello
    layers:
      - { Ref: TestLambdaLayer }
  • right
layers:
  Test:
    path: layer
functions:
  hello:
    handler: handler.hello
    layers:
      - { Ref: TestLambdaLayer }

This can happen when the layer name is equal to the service name. That's because sls deploy creates a zip file from the layer name (e.g. Test.zip) as well as one from the service name (e.g. test.zip) along with the CloudFormation info.

I wrote up the same issue here.