Hi all. I’m deploying a Python Lambda function with a native dependency (ffmpeg), and I’m getting this error:
[ERROR] FileNotFoundError: [Errno 2] No such file or directory: 'ffprobe': 'ffprobe'
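A quick sanity check from inside the handler, since `shutil.which` does the same PATH lookup that’s failing here (just a sketch, not my actual handler code):

```python
import shutil

# If the layer were mounted and its directory were on PATH, these would
# print the binaries' locations; shutil.which returns None when the
# name can't be resolved, which matches the FileNotFoundError above.
print("ffprobe:", shutil.which("ffprobe"))
print("ffmpeg:", shutil.which("ffmpeg"))
```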
I have a demo in Node which works fine. This is the relevant snippet of its serverless.yml. Beautifully simple:
functions:
  mkgif:
    handler: handler.mkgif
    events:
      - s3: ${self:custom.bucket}
    layers:
      - {Ref: FfmpegLambdaLayer}

layers:
  ffmpeg:
    path: layer
And this is mine:
layers:
  # Ref name is generated by TitleCasing the layer name & appending LambdaLayer
  - {Ref: AudioLambdaLayer}

layers:
  audio:
    path: dependencies
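For clarity, here’s the full shape I’m aiming for, mirroring the demo. The function name and handler are placeholders, the layer section is real:

```yaml
functions:
  transcode:                      # placeholder function name
    handler: handler.transcode    # placeholder handler
    layers:
      - {Ref: AudioLambdaLayer}   # attaches the layer to the function

layers:
  audio:
    path: dependencies            # packaged as audio.zip under .serverless/
```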
In both cases this produces, inside .serverless/, the zip file you’d expect: ffmpeg.zip in the demo case and audio.zip in mine, and both contain what you’d expect inside.
Layers are set up properly in AWS, they look good.
But in my case, ffmpeg is nowhere to be found, whereas in the demo it turns up at /opt/ffmpeg/ffmpeg. I’d have expected mine to turn up either (a) in one of the directories on the PATH, or (b) at /opt/audio/ffmpeg. But it doesn’t. In fact, /opt appears to be empty: I get its contents via os.popen("ls -Rl /opt").
I’m also using serverless-python-requirements to copy over the other dependencies. I’m hoping that isn’t interfering with the other bits and pieces, but maybe it is. I can already see the antipattern: an external script crashes, taking down the whole pipeline and causing other elements of it to fail.
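In case it matters, the plugin setup is the stock one, roughly this (a sketch; dockerizePip is optional and may not match my exact config):

```yaml
plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    dockerizePip: true   # build native wheels in a Lambda-like container
```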
I’m pretty stuck. The next step will be to port the example to Python, where maybe I’ll find something, or to build the dependencies manually rather than use serverless-python-requirements.
Any guidance most appreciated.