Accessing stage or env variables in lambda

I’ve been going in circles for quite a while trying to set up variables in my serverless.yml and access them from Node.js.

The docs are quite confusing in this respect. I somewhat understand why, since you are trying to create a system that is flexible enough for a huge target audience, but some basic examples would help much of the community get started.

Specifically, I am trying to use a different S3 bucket depending on which stage I’m running. I’m not too picky about how this happens (I don’t think); I just want to get it done.

After reading through the docs and trying everything in them again and again, this is where I’m at:

provider:
  name: aws
  runtime: nodejs4.3
  stage: ${opt:stage, self:custom.default_stage}
  environment:
    bucket: ${self:stage}-bucketname

functions:
  view:
    handler: handler.view
    environment:
      bucket: ${self:stage}-bucketname

From here, I expected to be able to read process.env.bucket, but unfortunately bucket is not in the environment in my handler code.
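For reference, this is roughly what I’m doing on the handler side — a minimal sketch of handler.view (the real handler does more, and the response shape here is just for illustration):

'use strict';

// handler.js — minimal sketch of handler.view
module.exports.view = (event, context, callback) => {
  // I expected the bucket name to show up here, but it's undefined
  const bucket = process.env.bucket;
  console.log('bucket from process.env:', bucket);
  callback(null, {
    statusCode: 200,
    body: JSON.stringify({ bucket: bucket })
  });
};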

No problem, let’s drop environment and try stageVariables instead; maybe that’s the correct way. I followed this guide here in the forums: Make stageVariables clear - #2 by buggy

  stage: ${opt:stage, self:custom.default_stage}
  environment: ${file(env.yml):${self:provider.stage}}

With the variables referenced from an external file, env.yml, the output shows context.stageVariables = null.
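For completeness, my env.yml is keyed by stage so that ${file(env.yml):${self:provider.stage}} resolves to the map for the active stage — something like this (bucket names are placeholders):

dev:
  bucket: dev-bucketname
prod:
  bucket: prod-bucketname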

Then I followed this: Using Environment Variables with the Serverless Framework

Again, nothing shows up in the environment variables. Are any of these approaches supposed to work???

Can you run sls --version and confirm you’re using 1.2.0 or later?

Also I think it should be:

provider:
  name: aws
  runtime: nodejs4.3
  stage: ${opt:stage, self:custom.default_stage}
  environment: 
    bucket: ${self:provider.stage}-bucketname

custom:
  default_stage: dev

functions:
  view:
    handler: handler.view
    environment:
      bucket: ${self:provider.stage}-bucketname

You’re using ${self:stage} instead of ${self:provider.stage}.

Thanks,

I’m using serverless 1.3.0.

I had ${self:stage} because I thought I could reference locally defined YAML variables like that.
After updating with environment set both under the provider and under my function, I still don’t have bucket defined in process.env.

Log in to the AWS console and find your Lambda. When you’re on the code tab for the function, scroll down and look for “Environment variables”. Does it have an environment variable called bucket listed?

If bucket isn’t defined, try hard-coding a value like bucket: "xxx" and deploying again. Once it deploys with a value, it should be easy to make it stage-specific.
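Something along these lines is enough for the test (the value is just a placeholder):

provider:
  name: aws
  runtime: nodejs4.3
  stage: ${opt:stage, self:custom.default_stage}
  environment:
    bucket: "xxx"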

You can reference locally defined variables using self, but stage lives in the provider section, so it’s provider.stage and not just stage. Hence the whole variable is ${self:provider.stage}.

Hi @buggy
I don’t know if the variables should be in context.stageVariables or in process.env, but either way I can confirm that your sample does not work (serverless 1.3.0). The bucket variable is not created, and it does not appear in the AWS console (Lambda environment variables).

I was battling this too. After looking around a lot, I finally found the answer in the sls deploy function reference:

Note: Because this command is only deploying the function code, function properties such as environment variables and events will not be deployed. Those properties are deployed via CloudFormation, which does not execute with this command.

In other words, you gotta use sls deploy (without function) to update environment vars.
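For example (the function name is the one from this thread, and the stage name is just an example):

# only redeploys the function code — environment variable changes are NOT applied
sls deploy function -f view

# full deploy via CloudFormation — environment variables ARE updated
sls deploy --stage dev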
