I have about a dozen functions in a project, and I've now added a log-shipper function that I want the other functions' CloudWatch logs to be sent to. I can achieve this in the AWS console by choosing the action "Stream to AWS Lambda" for each function, but that seems way too manual and error prone, and I'd like to get this configuration into my serverless.yml.

I see in the docs that there is a cloudwatchLog event you can associate with a function, but I'm struggling with how to configure it so that all functions, other than the log-shipper, are streamed to the log-shipper.
The cloudwatchLog event source monitors log groups and streams them to your Lambda function. So in your case, you'd want to configure your log-shipper function with a cloudwatchLog event for each function whose logs you want streamed to it.
For example:
```yaml
functions:
  log-shipper:
    handler: # your handler
    events: # the events that trigger this function
      - cloudwatchLog:
          logGroup: '/aws/lambda/func1-to-stream'
          filter: ''
      - cloudwatchLog:
          logGroup: '/aws/lambda/func2-to-stream'
          filter: ''
      - cloudwatchLog:
          logGroup: '/aws/lambda/func3-to-stream'
          filter: ''
```
This will create subscription filters on those log groups and send everything to log-shipper. Note that you are limited to ONE subscription filter per log group, so this would be the only function that can monitor them.
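One thing worth knowing for the handler side: CloudWatch Logs delivers subscription records base64-encoded and gzip-compressed under event.awslogs.data, so your function has to decode them before shipping anywhere. Here's a minimal sketch, assuming a Node.js runtime; the forwarding step is just a placeholder for your own shipping logic:

```typescript
// handler.ts -- minimal log-shipper sketch (runtime and shipping target are assumptions)
import { gunzipSync } from "zlib";

// Shape of the decoded CloudWatch Logs subscription payload
interface LogsPayload {
  logGroup: string;
  logStream: string;
  logEvents: { id: string; timestamp: number; message: string }[];
}

export const handler = async (event: { awslogs: { data: string } }) => {
  // Records arrive base64-encoded and gzip-compressed
  const payload: LogsPayload = JSON.parse(
    gunzipSync(Buffer.from(event.awslogs.data, "base64")).toString("utf8")
  );

  for (const logEvent of payload.logEvents) {
    // Replace with your real shipping logic (HTTP POST, Kinesis, etc.)
    console.log(`${payload.logGroup} ${payload.logStream}: ${logEvent.message}`);
  }
};
```

The decode steps are the same in any runtime; only the forwarding code is specific to wherever you're shipping the logs.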
Many thanks! Will give it a try post-coffee.
Hi Jeremy,

If you are using the Serverless Enterprise Framework, which already creates a log group subscription, what would be the best way to ingest all Lambda logs into ELK? In the past I was using the serverless-es-logs plugin, until we moved to Serverless Enterprise; it gives us monitoring in return, but not raw logs to consult. Since it retrieves the logs using an IAM role rather than a Lambda, I can't reuse the Lambda to send logs to both SLS and ELK. How do you deal with this problem?

Best.