I have several AWS Lambda functions in Serverless, and I found I was writing the same sendEmail function over and over: just setting up the client and composing the SES message with a few minor tweaks. I decided it would be best to abstract this into a single function that is called by the others. My folder structure resembles:
My question is, how do I call this email function from other deployments? Is it as simple as including a fully-qualified path to it in the calling function’s serverless.yml file (which already has the proper SES IAM permissions), or will this need its own serverless.yml file with a trigger instead of an event schedule? Obviously, I’d prefer the former, but I’m confused as to how it should all come together.
Any pointers or suggestions would be greatly appreciated!
Related issue #2199
Perhaps I misunderstand your use case, but rather than have two separate services I would use one, with a single serverless.yml. It would look something like this:
# Commons is therefore implicitly included in all packages
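A minimal single-service sketch of that `serverless.yml` might look like the following. The service, function, and handler names here are assumptions for illustration, not part of the original post; adjust them to your own layout:

```yaml
# Hypothetical single-service layout: both functions live in one service,
# so shared code (Commons) is packaged with every function automatically.
service: my-email-service

provider:
  name: aws
  runtime: python3.9
  iamRoleStatements:
    - Effect: Allow
      Action:
        - ses:SendEmail
      Resource: "*"

functions:
  alarm:
    handler: alarm.your_handler
    events:
      - schedule: rate(1 hour)
  report:
    handler: report.your_handler
```

Because both functions are deployed from the same service, there is no cross-deployment call to worry about: each function simply imports the shared code at build time.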
In the issue you posted (#2199) you said you are using Python.
For Python code requiring a limited number of extra modules (in this case, your Commons) I symlink it directly into the function directory. This is clean and clear to anyone looking at the project structure, and it allows for straightforward Python imports in the Lambda function.
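Concretely, the symlink step might look like this. The directory and file names below are assumptions based on the structure described above, not the poster's actual paths:

```shell
# Hypothetical layout: commons/ holds the shared module,
# alarm/ is one function's directory.
mkdir -p commons alarm
touch commons/commons.py

# Symlink the shared module into the function directory so the
# packaging step picks it up alongside the handler.
ln -sf ../commons/commons.py alarm/commons.py
```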
If you symlinked Commons into the function directory, Alarm.py might look like this:
```python
from commons import send_email

def your_handler(event, context):
    # delegate to the shared helper; arguments depend on your send_email signature
    send_email(event)
```
This method would allow you to keep your two individual functions while sharing the common email code between them.
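For completeness, here is one way the shared Commons module itself could be structured. This is a sketch under assumptions: the original post doesn't show `send_email`'s signature, so every name and parameter below is hypothetical. Splitting message composition from sending also makes the helper testable without touching AWS:

```python
# commons.py -- hypothetical shared SES helper; all names are assumptions.

def build_message(subject, body):
    """Compose the SES Message payload once, instead of in every function."""
    return {
        "Subject": {"Data": subject, "Charset": "UTF-8"},
        "Body": {"Text": {"Data": body, "Charset": "UTF-8"}},
    }

def send_email(sender, recipients, subject, body, client=None):
    """Send an email via SES. A client can be injected for testing."""
    if client is None:
        # Imported lazily so the module loads even where boto3 is absent;
        # inside Lambda, boto3 is always available.
        import boto3
        client = boto3.client("ses")
    return client.send_email(
        Source=sender,
        Destination={"ToAddresses": list(recipients)},
        Message=build_message(subject, body),
    )
```

Each Lambda function then calls `send_email(...)` with its own tweaks, and the SES client setup lives in exactly one place.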