AWS/Netlify Lambda Memoized/Cached API Fetch Function

I have this situation, where in I don’t have the required API for the front end I am trying to implement. So I’m trying to solve this by having an intermediary cloud function that fetches all data from the original backend server and then does a simple filter to generate the data i need in my front end. The only issue with this approach is this, since I am fetching all data from the backend on every call, this becomes heavy on the original backend server in terms of database reads. Also I’m having to wait for the call to finish from the front end.

How can i Memoize my Netlify AWS Lambda Cloud Function, such that it periodically fetches data from the original server, say once every 2 hours and then caches it, so when i call from my front end the cloud function doesn’t have to talk to the original backend server and can just send me the cached data

This is the Code I wrote as a solution

import fetch from 'node-fetch';
const {API_TOKEN,USER_ID} = process.env;
let data = [];
let lastUpdate = 0;

`exports.handler = async (event,context,callback) => {`

const fetch_retry = async (url, options, n) => {
    let error;
    for (let i = 0; i < n; i++) {
        try {
            return await fetch(url, options);
        } catch (err) {
            error = err;
    throw error;

const fetchHotDeals = async () => {

            const activeCategoriesRes = await fetch_retry('',{
            method: 'GET',
            headers: {'APIToken': API_TOKEN}
            },3).catch(error => console.log(error));

            const activeCategoriesData = await activeCategoriesRes.json();
            const allCategories = activeCategoriesData.result;
            const subCategories = allCategories.filter(category => category.ParentCategoryId !== 0);
            const categoriesIdArray = => category.CategoryId);

            const productsNested = await Promise.all( async categoryID => {

                    const categoryProductsRes = await fetch_retry(`${categoryID}&UserID=${USER_ID}`,{
                        method: 'GET',
                        headers: {'APIToken': API_TOKEN }
                    const categoryProductsData = await categoryProductsRes.json();
                    return categoryProductsData.result;

            })).catch(error => console.log(error))

            const productsList =  productsNested.flat();
            const hotDeals = productsList.filter(product => product.DiscountPercentage > 15 && product.SellingPrice > 50);
            return hotDeals;    
        } catch (error) {


    if(data.length === 0){
        const hotDeals = await fetchHotDeals();
        data = [...hotDeals];
        lastUpdate = new Date().getTime();

        statusCode: 200,
        body: JSON.stringify(data)

    const currentTime = new Date().getTime();

    if(currentTime-lastUpdate > 1800000){ /* 3600000 = 1hour */
        const hotDeals = await fetchHotDeals();
        data = [...hotDeals];
        lastUpdate = new Date().getTime();

} catch (error) {
    callback(new Error("Unable to fetch hot deals"));


The issue with this is I think after the function goes to sleep the cached variable gets deleted, another issue is the first API Call takes a long time to finish. I’m also getting some errors while running this code

(node:19101) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'statusCode' of undefined
    at callback (/home/sc/web-projects/ecomm/node_modules/netlify-lambda/lib/serve.js:35:42)
    at /home/sc/web-projects/ecomm/node_modules/netlify-lambda/lib/serve.js:68:7
    at processTicksAndRejections (internal/process/task_queues.js:93:5)

UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 4)

I don’t why that is as i have provided try{} catch{} and then .catch methods.

Can someone guide me to a better solution or critique me on how i can better implement what i am trying to achieve. Thank You.