Hi - I'm picking up the new Lambda Layers feature. I've successfully deployed my layer, but I'm struggling to use it from another serverless project.
I don't have a compiled executable like the examples show; I'm trying to use plain old JavaScript in the layer and want to require it into a Lambda in another serverless project. Does anyone have a simple working example?
Not sure if this helps you, but I've found that installing the layer-provided modules as dev dependencies (npm install --save-dev ...) works for me: the libraries are available locally but excluded during the package/deploy process, and the layer supplies them at execution time on Lambda.
I don’t know of a better way, but it’s worked for me in my first use-case.
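For what it's worth, a minimal sketch of that setup. The module name `some-shared-module` and the layer ARN are placeholders, not from this thread:

```yaml
# serverless.yml of the consuming service
service: consumer

provider:
  name: aws
  runtime: nodejs8.10

package:
  excludeDevDependencies: true # the default; devDependencies stay out of the artifact

functions:
  hello:
    handler: handler.hello
    layers:
      # ARN of the already-published layer version (placeholder)
      - arn:aws:lambda:us-east-1:123456789012:layer:some-shared-module:1
```

Locally, `npm install --save-dev some-shared-module` makes `require('some-shared-module')` resolve during development, while the deployed function gets it from the layer.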
Thank you @deltafactory, I ended up doing the same. That works locally, but now when I deploy and run I get 'Cannot find module' on my require in the Lambda that uses the layer.
If I use Node's fs module to look at the Lambda file system, I can see my module from the layer under /opt/node_modules, but it's not being picked up by the running Lambda. Is there some other trick?
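For anyone debugging the same thing, a throwaway handler along these lines will show what actually lands under /opt (a sketch; nothing here is layer-specific):

```javascript
const fs = require('fs');

exports.handler = async () => {
  // Paths Node will search when resolving require() calls
  console.log('NODE_PATH:', process.env.NODE_PATH);

  // Layers are extracted under /opt; list the candidate module folders
  for (const dir of ['/opt', '/opt/node_modules', '/opt/nodejs/node_modules']) {
    try {
      console.log(dir, fs.readdirSync(dir));
    } catch (err) {
      console.log(dir, 'not present');
    }
  }
};
```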
Thanks for your helpful comments. How did you import the module after creating this folder structure?
I have a similar issue: /opt/nodejs/node6/node_modules is present at execution time, but I still see 'Cannot find module'… any help appreciated!
To include libraries in a layer, place them in one of the folders supported by your runtime.
Node.js – nodejs/node_modules, nodejs/node8/node_modules (NODE_PATH)
Example: AWS X-Ray SDK for Node.js
xray-sdk.zip
└ nodejs/node_modules/aws-xray-sdk
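For anyone else landing here, building and publishing that structure by hand looks roughly like this (a sketch using the plain AWS CLI; the layer name is arbitrary):

```bash
mkdir -p nodejs
npm install --prefix nodejs aws-xray-sdk   # installs into nodejs/node_modules
zip -r xray-sdk.zip nodejs
aws lambda publish-layer-version \
  --layer-name xray-sdk \
  --zip-file fileb://xray-sdk.zip \
  --compatible-runtimes nodejs8.10
```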
@srg-avai, thanks again. I created the nodejs/node_modules folder structure too (in a new layer version). I can see /opt/nodejs/node_modules is mounted and NODE_PATH includes /opt/nodejs/node_modules, but no luck. For context, I'm attempting to access /opt/nodejs/node_modules/puppeteer via require('puppeteer').
I've just gone through the process of creating my own custom layer (global application code). To be able to run it with `sls invoke local` I used `npm link`. Assuming your layer is here:
layers/nodejs/node_modules/some-module
Go into this directory and execute `npm link`. This registers the module globally.
Then, to be able to use this module when running locally, go to your home directory and execute `npm link some-module`. This creates ~/node_modules/some-module, which points to the global symlink, which in turn points back to the module in your layers directory.
Doing this puts your layer module on Node's module resolution path, which means `require('some-module')` will work when running `sls invoke local`.
You might think, "why not just run `npm link some-module` in my application directory?" If you do that, `sls deploy` ends up packaging the layer code into the Lambda function artifact as well as the layer, which is redundant. The full sequence is collected below.
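Collected in one place (with `some-module` standing in for your actual module):

```bash
# 1) Register a global symlink for the module that lives inside the layer
cd layers/nodejs/node_modules/some-module
npm link

# 2) From your HOME directory (not the service directory), link it in
#    so `sls invoke local` can resolve it without it being packaged on deploy
cd ~
npm link some-module   # creates ~/node_modules/some-module -> global link -> layer dir
```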
Hi, any thoughts on how to include/require node_modules in my local Lambda code before deploying through sls? I'm trying to deploy puppeteer and convert-svg-to-png to turn SVGs into base64 PNG images. Any success?
Thanks for the information.
Was wondering if there has been any progress on the matter?
I'm trying to deploy an AWS Lambda using TypeScript (with webpack) that uses another Lambda layer I've deployed.
I used your suggestion re the npm link (outside of the application folder), but webpack still bundled the npm package (my layer code) into the Lambda `handler.js`.
If I `npm unlink`, then sls deploy fails because it can't find the module that is hosted in the layer…
Any ideas?
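Not from this thread, but the standard webpack answer to "don't bundle this module" is to declare it as an external, so the `require` call survives into the bundle and resolves against the layer at runtime. A sketch, with `some-module` as a placeholder for whatever your layer ships:

```javascript
// webpack.config.js (sketch)
module.exports = {
  target: 'node',
  // Leave require('some-module') in the output instead of inlining its source;
  // at runtime Lambda resolves it from the layer via NODE_PATH
  externals: {
    'some-module': 'commonjs some-module',
  },
  // ...the rest of your TypeScript/webpack config
};
```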
To use a layer with a function in the same service, use a CloudFormation Ref. The name of your layer in the CloudFormation template will be your layer name TitleCased (without spaces) with LambdaLayer appended to the end. E.g.:
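A minimal serverless.yml sketch (layer and function names are placeholders):

```yaml
service: test

layers:
  testLayer:
    path: testLayer

functions:
  hello:
    handler: handler.hello
    layers:
      # 'testLayer' TitleCased, with 'LambdaLayer' appended
      - { Ref: TestLayerLambdaLayer }
```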
This can happen when the layer name is equal to the service name: during sls deploy, the framework creates one zip artifact from the layer name (e.g. Test.zip) and another from the service name (e.g. test.zip), and the two clash in the CloudFormation packaging step. The sketch below may help.
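A sketch of the workaround (names are placeholders): give the layer a key that differs from the service name, so the two zip artifacts no longer clash.

```yaml
service: test

layers:
  testDependencies:   # anything other than 'test', so Test.zip vs test.zip can't collide
    path: layer

functions:
  hello:
    handler: handler.hello
    layers:
      - { Ref: TestDependenciesLambdaLayer }
```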