I have several AWS Lambda functions in Serverless, and I found I was writing the same sendEmail function over and over: just setting up the client and composing the SES request with a few minor tweaks. I decided it would be best to abstract this into a single function that is called by the others. My folder structure resembles:
My question is, how do I call this email function from other deployments? Is it as simple as including a fully-qualified path to it in the calling function’s serverless.yml file (which already has the proper SES IAM permissions), or will this need its own serverless.yml file with a trigger instead of an event schedule? Obviously, I’d prefer the former, but I’m confused as to how it should all come together.
Any pointers or suggestions would be greatly appreciated!
Perhaps I misunderstand your use case, but rather than having two separate services I would use one, with a single serverless.yml. It would look something like this:
package:
  individually: true
  exclude:
    - AlarmsManager
    - BackupManager
    # Commons is therefore implicitly included in all packages

functions:
  alarm1:
    handler: AlarmsManager/Alarm1.handler_function
    package:
      include:
        - AlarmsManager
    events:
      ...
  backup1:
    handler: BackupManager/Backup1.handler_function
    package:
      include:
        - BackupManager
    events:
      ...
In the issue you posted (2199) you said you are using Python.
For Python code requiring a limited number of extra modules (in this case, your Commons), I symlink it directly into the function directory. This is clean and clear to anyone looking at the project structure, and it allows for clear Python imports in the Lambda function.
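Sketched concretely (directory names taken from the layout above; the mkdir is only there so the sketch runs standalone):

```shell
# Create the service layout, then symlink Commons into each function
# directory under the name "commons" (the name the Python imports use).
mkdir -p Commons AlarmsManager BackupManager
ln -sfn ../Commons AlarmsManager/commons
ln -sfn ../Commons BackupManager/commons
```

The relative target (`../Commons`) keeps the link valid wherever the repository is checked out.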
If you symlinked Commons into AlarmsManager/ as commons, your Alarm1.py might look like this:
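For instance (a sketch with hypothetical module and function names: in the real layout the helper would live in Commons and be imported through the symlink as `from commons.email_utils import send_email`; both pieces are shown in one file here so the sketch runs standalone):

```python
# Hypothetical shared helper: in the real layout this lives in
# Commons/email_utils.py and is imported via the symlink.
def send_email(subject, body, recipients, sender="alerts@example.com"):
    """Compose the kwargs for boto3's ses_client.send_email(**request).

    Returning the request dict instead of calling SES directly keeps the
    shared helper testable without AWS credentials; the Lambda would do
    boto3.client("ses").send_email(**request).
    """
    return {
        "Source": sender,
        "Destination": {"ToAddresses": list(recipients)},
        "Message": {
            "Subject": {"Data": subject},
            "Body": {"Text": {"Data": body}},
        },
    }


# Hypothetical handler: in the real layout this is AlarmsManager/Alarm1.py.
def handler_function(event, context):
    # Alarm-specific logic would go here; then reuse the shared helper.
    return send_email(
        subject="Alarm triggered",
        body=str(event),
        recipients=["ops@example.com"],  # placeholder address
    )
```

Because the symlink makes `commons` a local directory of each function package, the same import works both when deployed and when running tests locally.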