S3 event not parsed correctly when invoking Lambda with sls invoke

I am having some trouble invoking a Lambda function that I have deployed with Serverless. It is supposed to be triggered by s3:ObjectCreated:Copy events on an existing S3 bucket, and it works like a charm when objects are actually created in that bucket. But if I try to invoke it like this:

sls invoke --function config-file-exporter -p s3-event.json -l --region eu-west-1 --stage dev
"OK"
--------------------------------------------------------------------
START RequestId: 765ded8c-80ac-4cb2-bf73-5ac941580b47 Version: $LATEST
2020-08-19 13:41:34.517  INFO 8 --- [           main] n.p.a.f.c.ConfigFileExporterFunction     : Received event: {"records":[]}
2020-08-19 13:41:34.517  INFO 8 --- [           main] n.p.a.f.c.ConfigFileExporterFunction     : How many records?: 0
END RequestId: 765ded8c-80ac-4cb2-bf73-5ac941580b47
REPORT RequestId: 765ded8c-80ac-4cb2-bf73-5ac941580b47  Duration: 15.96 ms      Billed Duration: 100 ms Memory Size: 1024 MB    Max Memory Used: 114 MB

Well, as you can see, I end up with 0 records in the parsed S3Event.
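For completeness, the S3 trigger is configured in serverless.yml along these lines (the handler and bucket name are simplified placeholders here, not my real values):

functions:
  config-file-exporter:
    handler: my.package.ConfigFileExporterHandler   # placeholder for the real handler
    events:
      - s3:
          bucket: my-existing-bucket                # placeholder; the bucket already exists
          event: s3:ObjectCreated:Copy
          existing: true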

And this is the handler code that produces the log output above:

@Override
public String apply(S3Event event) {
    String eventAsJson = null;
    try {
        // Serialize the incoming event back to JSON for logging
        eventAsJson = objectMapper.writeValueAsString(event);
        // Ignore this particular payload and just report success
        if (eventAsJson.contains("ConfigWritabilityCheck")) {
            return "OK";
        }
    } catch (JsonProcessingException e) {
        log.error(e.getMessage());
        return "KO";
    }
    log.info("Received event: {}", eventAsJson);
    log.info("How many records?: {}", event.getRecords().size());
    ...

The problem is that the content of the local event file named s3-event.json is NOT {"records":[]} at all; it is an event that I copied from the logs of this very same function after it was properly invoked by an actual object creation in the bucket.
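To double-check the file itself, I imagine a minimal standalone check along these lines (just a sketch; it assumes Jackson and the aws-lambda-java-events dependency are on the classpath, and depending on the library version it may need extra Jackson modules, e.g. for Joda-Time timestamps) would show whether that JSON maps onto S3Event outside of Lambda:

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;

import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

public class S3EventParseCheck {

    public static void main(String[] args) throws Exception {
        ObjectMapper objectMapper = new ObjectMapper();

        // Read the same file that is passed to sls invoke via -p
        String json = Files.readString(Paths.get("s3-event.json"));

        // What does the payload contain at the top level?
        Map<String, Object> asMap =
                objectMapper.readValue(json, new TypeReference<Map<String, Object>>() {});
        System.out.println("Top-level keys: " + asMap.keySet());

        // Does it map onto S3Event the way the handler expects?
        S3Event event = objectMapper.readValue(json, S3Event.class);
        System.out.println("Parsed records: "
                + (event.getRecords() == null ? "null" : event.getRecords().size()));
    }
}

If something like that parses the records from the file correctly, I would assume the payload gets lost somewhere between sls invoke and my function rather than in the file itself.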

Has anyone seen anything similar?