Hey!
I’m using Serverless with LocalStack. As soon as I add a Python package using serverless-python-requirements, I get this error message:
TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string or an instance of Buffer or URL. Received undefined
I wrote a super simple Lambda function:
import requests

def httprequest(event, context):
    r = requests.get("https://news.ycombinator.com/news")
    return {"content": r.text}
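To sanity-check a handler like this locally before deploying, you can invoke it directly with a fake event. Here is a minimal sketch that stubs out the requests module so it runs even without network access or the real library installed (in a real test suite you would patch requests.get with unittest.mock instead):

```python
import sys
import types

# Stand-in "requests" module so the example runs offline; this is an
# illustration, not how you would ship the function.
fake_requests = types.ModuleType("requests")
fake_requests.get = lambda url: types.SimpleNamespace(text="<html>stub</html>")
sys.modules["requests"] = fake_requests

import requests

def httprequest(event, context):
    r = requests.get("https://news.ycombinator.com/news")
    return {"content": r.text}

# Invoke the handler the way Lambda would, with an event and a context.
result = httprequest({}, None)
assert result == {"content": "<html>stub</html>"}
```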
and here is my serverless file:
service: test
frameworkVersion: ">=2.15.0"

plugins:
  - serverless-localstack
  - serverless-prune-plugin
  - serverless-python-requirements

provider:
  name: aws
  runtime: python3.8
  stage: ${opt:stage, 'local'}
  memorySize: 128
  timeout: 20
  logRetentionInDays: 14
  region: us-east-1
  deploymentPrefix: aws-lambda
  deploymentBucket:
    name: ${self:service}-${opt:stage}-deploy

package:
  exclude:
    - node_modules/**
    - venv/**
    - .pytest_cache/**
    - tests/*

functions:
  httprequest:
    handler: handlers.test.httprequest

# serverless deploy --stage=local
# Local stage configuration: this will only be used when deploying to stage "local".
# Using the stage attribute lets you switch environments easily.
# Make sure LocalStack is started with the following services: s3, lambda,
# cloudformation, sts, iam, cloudwatch, and apigateway (if your Lambda uses
# HTTP methods or WebSockets).
custom:
  pythonRequirements:
    dockerizePip: true
  localstack:
    debug: true # enable or disable debugging
    stages:
      # list of stages for which the plugin should be enabled
      - local
    host: http://localhost # optional - LocalStack host to connect to
    autostart: true # optional - start LocalStack in Docker on Serverless deploy
    endpoints:
      # This section is optional - can be used for customizing the target endpoints
      S3: http://localhost:4566
      DynamoDB: http://localhost:4566
      CloudFormation: http://localhost:4566
      Elasticsearch: http://localhost:4566
      ES: http://localhost:4566
      SNS: http://localhost:4566
      SQS: http://localhost:4566
      Lambda: http://localhost:4566
      Kinesis: http://localhost:4566
    lambda:
      # Enable this flag to improve performance
      mountCode: True
    docker:
      # Enable this flag to run "docker ..." commands as sudo
      sudo: True
Does anybody have an idea of how I can make it work?
Thanks
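For anyone following along: the comment in the serverless.yml above mentions starting LocalStack with a specific set of services. One way to do that is with a docker-compose file like the sketch below (a minimal illustration using LocalStack's standard image and its SERVICES environment variable; adjust the service list and ports for your setup):

```yaml
version: "3.8"
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"  # single edge port used by all the endpoints configured above
    environment:
      - SERVICES=s3,lambda,cloudformation,sts,iam,cloudwatch,apigateway
      - DEBUG=1
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock  # lets LocalStack run Lambda containers
```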
Hey @HugoPelletier I just ran into this issue as well. What fixed it for me was to do the following in my serverless.yml
package:
  individually: true

functions:
  httprequest:
    handler: handlers.test.httprequest
    package:
      exclude:
        - ./**
      include:
        - ./handlers/test.py
Give that a try, but depending on your directory structure you might need to adjust the package include section for the function. Also, if you can share your project directory structure, that would help, since AWS Lambda handler paths don’t follow the same rules as Python module paths.
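For reference, a project layout that matches the include pattern above would look roughly like this (file names are illustrative):

```
.
├── serverless.yml
├── requirements.txt
└── handlers/
    ├── __init__.py
    └── test.py        # contains httprequest(event, context)
```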
@cturner-confluent Same error. And if I try without this part, I get another error message: Cannot read property 'artifact' of undefined

package:
  exclude:
    - ./**
  include:
    - ./handlers/test.py
Running on this version:

serverless --version
Serverless: config.options_stage: undefined
Serverless: serverless.service.custom.stage: undefined
Serverless: serverless.service.provider.stage: {opt:stage, 'local'}
Serverless: config.stage: {opt:stage, 'local'}
Framework Core: 2.15.0
Plugin: 4.3.0
SDK: 2.3.2
Components: 3.4.3
Try changing the function block for httprequest from:

functions:
  httprequest:
    handler: handlers.test.httprequest

to:

functions:
  httprequest:
    handler: handlers/test.httprequest

and keep everything else the same as your original serverless.yml file.
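As a rough mental model (not AWS's actual bootstrap code), the Python runtime turns the handler string into an import path by replacing `/` with `.` and splitting on the last dot, which is why both forms above can resolve to the same function:

```python
def resolve_handler(handler: str):
    """Rough model of how a Lambda handler string maps to (module, function).

    Illustration only: path separators become dots, and the part after
    the last dot is treated as the function name.
    """
    module_path, _, function_name = handler.replace("/", ".").rpartition(".")
    return module_path, function_name

print(resolve_handler("handlers.test.httprequest"))  # ('handlers.test', 'httprequest')
print(resolve_handler("handlers/test.httprequest"))  # ('handlers.test', 'httprequest')
```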
Just tried your suggestion, no luck!
I also removed all the functions from the yaml file and still got the same error…pretty strange.
I’m pasting the yaml file just in case you see something wrong. Obviously, I’m new to serverless!!
Thanks again for your help.
service: test
frameworkVersion: ">=2.15.0"

plugins:
  - serverless-localstack
  - serverless-prune-plugin
  - serverless-python-requirements

provider:
  name: aws
  runtime: python3.8
  stage: ${opt:stage, 'local'}
  memorySize: 128
  timeout: 20
  logRetentionInDays: 14
  region: us-east-1
  deploymentPrefix: aws-lambda
  deploymentBucket:
    name: ${self:service}-${opt:stage, 'local'}-deploy

package:
  individually: true

functions:
  httprequest:
    handler: handlers/test.handler
    package:
      exclude:
        - ./**
      include:
        - ./handlers/test.py

# serverless deploy --stage=local
# Local stage configuration: this will only be used when deploying to stage "local".
# Using the stage attribute lets you switch environments easily.
# Make sure LocalStack is started with the following services: s3, lambda,
# cloudformation, sts, iam, cloudwatch, and apigateway (if your Lambda uses
# HTTP methods or WebSockets).
custom:
  pythonRequirements:
    dockerizePip: true
  localstack:
    debug: true # enable or disable debugging
    stages:
      # list of stages for which the plugin should be enabled
      - local
    host: http://localhost # optional - LocalStack host to connect to
    autostart: true # optional - start LocalStack in Docker on Serverless deploy
    endpoints:
      # This section is optional - can be used for customizing the target endpoints
      S3: http://localhost:4566
      DynamoDB: http://localhost:4566
      CloudFormation: http://localhost:4566
      Elasticsearch: http://localhost:4566
      ES: http://localhost:4566
      SNS: http://localhost:4566
      SQS: http://localhost:4566
      Lambda: http://localhost:4566
      Kinesis: http://localhost:4566
    lambda:
      # Enable this flag to improve performance
      mountCode: True
    docker:
      # Enable this flag to run "docker ..." commands as sudo
      sudo: True
Can you share what your project directory looks like?
I tried multiple things, so please ignore the extra files in the handlers directory.
For your function block, can you try using a handler.py file at the root of your project directory?

package:
  individually: true

functions:
  httprequest:
    handler: handler.<entry point here>  # i.e. main() or some function in the handler.py file
    package:
      exclude:
        - ./**
      include:
        - ./handler.py

Try this and see if it works. If it doesn’t, I’m not sure the issue is with the serverless-python-requirements plugin.