Lambda upload to S3 works the first time, after that I get Access Denied

Hello,
I was looking for a post about my problem but I haven't found anything yet.

So, I made a simple resize Lambda function triggered by my S3 bucket: when an image is uploaded to the bucket, it gets resized and copied to another folder in the same bucket.
The problem is that this only works for the first 2 images; after that I get an Access Denied error. Take a look at my Serverless config:

service: ${env:BUCKET_NAME}-lambda-images-optimize

frameworkVersion: "3"

plugins:
  - serverless-plugin-typescript
  - serverless-dotenv-plugin

useDotenv: true

provider:
  name: aws
  runtime: nodejs18.x
  memorySize: 256
  iam:
    role:
      name: lambda-s3-permissions
      statements:
        - Effect: Allow
          Resource:
            - "arn:aws:s3:::${env:BUCKET_NAME}/*"
          Action:
            - s3:GetObject
            - s3:PutObject
            - s3:DeleteObject

functions:
  optimize:
    handler: src/optimize.handle
    environment:
      bucket: ${env:BUCKET_NAME}
    events:
      - s3:
          bucket: ${env:BUCKET_NAME}
          event: s3:ObjectCreated:*
          rules:
            - prefix: orderItems/
            - suffix: .jpg 
          existing: true
      - s3:
          bucket: ${env:BUCKET_NAME}
          event: s3:ObjectCreated:*
          rules:
            - prefix: orderItems/
            - suffix: .png 
          existing: true
      - s3:
          bucket: ${env:BUCKET_NAME}
          event: s3:ObjectCreated:*
          rules:
            - prefix: orderItems/
            - suffix: .jpeg 
          existing: true
      - s3:
          bucket: ${env:BUCKET_NAME}
          event: s3:ObjectCreated:*
          rules:
            - prefix: orderItems/
            - suffix: .gif 
          existing: true
  onDelete:
    handler: src/delete.handle
    environment:
      bucket: ${env:BUCKET_NAME}
    events:
      - s3:
          bucket: ${env:BUCKET_NAME}
          event: s3:ObjectRemoved:*
          rules:
            - prefix: compressed/
            - suffix: .jpg 
          existing: true
package:
  individually: true

This is my function:

'use strict';

import { S3Client, GetObjectCommand, PutObjectCommand } from "@aws-sdk/client-s3";
import * as sharp from "sharp";
import { basename, extname } from "path";
import { Readable } from "stream";

const s3Client = new S3Client();

export const handle = async ({Records: records}) => {
  
  try {
    await Promise.all(
      records.map(async record => {
        const { key } = record.s3.object;

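        // Extract the directory portion of the key, e.g. "orderItems/123/"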
        const regex = /^(.*\/).*\//;
        const filePath = key.match(regex);

        const getObjectCommand = new GetObjectCommand({
          Bucket: process.env.bucket,
          Key: key,
        });

        const image = await s3Client.send(getObjectCommand);
        const imageBuffer = await streamToBuffer(image.Body as Readable);
        const optimized = await sharp(imageBuffer)
          .resize(1280, 1280, { fit: "inside", withoutEnlargement: true })
          .toFormat("jpeg", { progressive: true, quality: 70 })
          .toBuffer();

        const command = new PutObjectCommand({
          Body: optimized,
          Bucket: process.env.bucket,
          ContentType: "image/jpeg",
          Key: `compressed/${filePath[0]}${basename(key, extname(key))}.jpg`,
          StorageClass: "INTELLIGENT_TIERING",
        });

        await s3Client.send(command);

      })
    );

    return {
      statusCode: 201,
      body: {ok: true}
    }
  } catch (error) {
    throw error;
  }
};

async function streamToBuffer(stream: Readable): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const chunks: Buffer[] = [];
    stream.on('data', (chunk: Buffer) => chunks.push(chunk));
    stream.on('end', () => resolve(Buffer.concat(chunks)));
    stream.on('error', (error: any) => reject(error));
  });
}

I really don’t know what to do. Can someone help me?

Just making sure: you’re uploading only png, jpg, jpeg, or gif, right? And this occurs only for optimization, not for delete? What happens if you upload the same file again? The config looks good and it worked for a few minutes, so it should keep working unless some other kind of file is uploaded.
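
If another kind of file could be sneaking in, a small guard at the top of the map callback would make that visible. Just a sketch, and the allowed-extension list here is my assumption:

import { extname } from "path";

// Hypothetical guard: skip any record whose key is not an expected image type.
const ALLOWED_EXTENSIONS = new Set([".jpg", ".jpeg", ".png", ".gif"]);

const isSupportedImage = (key: string): boolean =>
  ALLOWED_EXTENSIONS.has(extname(key).toLowerCase());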

I found the error: when an image name contains special characters, the key in the event record doesn’t match the key S3 actually stores (the one GetObject and list operations use), because S3 URL-encodes keys in event notifications. I fixed it by decoding the key before using it.
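
For anyone who hits the same thing: the object key in an S3 event notification is URL-encoded, with spaces arriving as "+", so it has to be decoded before passing it to GetObjectCommand. A minimal sketch of that fix, using the same record shape as the handler above:

// The key in an S3 event record is URL-encoded; replace "+" with spaces
// before decodeURIComponent, then use the decoded key everywhere.
const rawKey = record.s3.object.key;
const key = decodeURIComponent(rawKey.replace(/\+/g, " "));

const getObjectCommand = new GetObjectCommand({
  Bucket: process.env.bucket,
  Key: key,
});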
