How to query DynamoDB, transform results, and upload transformed results to S3?

Hi, I am trying to build a Lambda function that will query a DynamoDB table, transform the results, and upload the transformed results as a CSV file to an S3 bucket. I have gotten most of it to work, but the s3.upload call never runs and never returns an error, from what I can tell.

This project is a modification of the fantastical tutorial at

My query call returns the results I am expecting every time I test this out:

const result = await dynamoDbLib.call("query", dynamoParams);

I am also able to compile the notes into a string that is recognized as a CSV file when saved with a .csv extension:

const corpus = gatherNotes(result.Items);

gatherNotes function definition:

function gatherNotes(notes) {
        let notesArray = ["Content"];
        for (let note in notes) {
            notesArray.push(notes[note].content + ", ");
        }
        let csvString = notesArray.join("\n");
        return csvString;
}
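For illustration, here is a minimal, runnable version of gatherNotes with made-up sample input (the `content` field mirrors the shape of `result.Items` in my query); note that each row keeps the trailing ", " the loop appends:

```javascript
// Runnable sketch of gatherNotes; the sample items are made up to
// mirror the shape of result.Items returned by the DynamoDB query.
function gatherNotes(notes) {
    let notesArray = ["Content"];
    for (let note in notes) {
        notesArray.push(notes[note].content + ", ");
    }
    return notesArray.join("\n");
}

const sampleItems = [{ content: "first note" }, { content: "second note" }];
console.log(gatherNotes(sampleItems));
// Content
// first note,
// second note,
```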

The csvString is then passed to an uploadNotes function:

await uploadNotes(corpus);

that looks like this:

async function uploadNotes(csvBody) {
        let csvBuffer = Buffer.from(csvBody);
        let params = {
            ACL: "public-read",
            Bucket: process.env.notesBucket,
            Key: 'private/' + event.requestContext.identity.cognitoIdentityId + '/notes.csv',
            Body: csvBuffer
        };
        console.log("s3.upload function about to run");
        s3.upload(params, function (err, data) {
            console.log("s3.upload function happening here");
            if (err) console.log(err, err.stack); // an error occurred
            else console.log(data);               // successful response
        });
}

The console.log call for the csvBuffer returns the notes I am expecting.

After defining these functions in this Lambda's main function, I execute them in a try block at the bottom of my function:

try {
        const result = await dynamoDbLib.call("query", dynamoParams);
        const corpus = gatherNotes(result.Items);
        await uploadNotes(corpus);
        return success({ status: true });
    } catch (e) {
        return failure({ status: false });
    }
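For context, the success and failure helpers come from the tutorial's response library; a hypothetical sketch consistent with the API Gateway response shown later in this post (the buildResponse name is my own, not from the tutorial):

```javascript
// Hypothetical sketch of the tutorial's success/failure helpers,
// matching the API Gateway response shape shown later in the post.
function buildResponse(statusCode, body) {
    return {
        statusCode: statusCode,
        headers: {
            "Access-Control-Allow-Origin": "*",
            "Access-Control-Allow-Credentials": true
        },
        body: JSON.stringify(body)
    };
}

function success(body) {
    return buildResponse(200, body);
}

function failure(body) {
    return buildResponse(500, body);
}

console.log(success({ status: true }).body); // {"status":true}
```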

I test this by running the command:

serverless invoke local --function collectNotes --path mocks/collect-event.json

The response will return everything I expect, up to the line:

console.log("s3.upload function about to run");

After that it returns:

{
    "statusCode": 200,
    "headers": {
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Allow-Credentials": true
    },
    "body": "{\"status\":true}"
}

The s3.upload call never runs and I have absolutely no idea why. Can anyone tell me what I am missing? Thanks in advance.

Hi, it seems to me that you need to change your uploadNotes a little bit. s3.upload is asynchronous, but you don't wait for its result.
Try this:

console.log("s3.upload function about to run");
await s3.upload(params).promise()
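To see why the .promise() matters, here is a small self-contained illustration (no AWS involved; fakeUpload and fakeUploadPromise are made-up stand-ins for s3.upload's callback and promise styles):

```javascript
// fakeUpload is a made-up stand-in for a callback-style SDK call like
// s3.upload: it completes on a later tick, after the caller has returned.
function fakeUpload(params, callback) {
    setTimeout(() => callback(null, { Location: "s3://bucket/notes.csv" }), 10);
}

// Wrapping the callback in a Promise (roughly what the SDK's .promise()
// does) lets await actually pause until the operation finishes.
function fakeUploadPromise(params) {
    return new Promise((resolve, reject) => {
        fakeUpload(params, (err, data) => (err ? reject(err) : resolve(data)));
    });
}

async function main() {
    // Without await, main would return before the callback ever fired,
    // which is exactly why the Lambda exited early in the question.
    const data = await fakeUploadPromise({});
    console.log(data.Location);
}

main();
```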

Your uploadNotes function calls the S3 upload function asynchronously, but you exit the function before waiting for the result, which is what is causing your issue.


Thanks! That did the trick. I had tried adding the await keyword in front of s3.upload, but I added async in front of the calling function instead of adding .promise() at the end. I'm still trying to wrap my mind around promises.

I'm still not getting any of the log output that I put in the s3.upload call, but I assume that's because it's still wrapped in the promise somehow. The file notes.csv shows up in the S3 bucket though, which is what I want it to do, so thanks.

Hi, I'm glad you have fixed your code. If you want to see "s3.upload function happening here" when S3 finishes uploading your file, just add it after the await s3.upload line; you don't need a callback anymore.

try {
  const result = await s3.upload(params).promise();
  console.log("s3.upload function happening here");
} catch (e) {
  console.log(e, e.stack);
}