Process a large file with multiple Lambdas

Hi guys,

We have a use case like this:

On a form, I upload an XLS with many lines (more than 50k) and choose which config I want for processing this file (configs like Lambda 1, Lambda 2, Step Functions 1).

When all the processing is done, we want to generate a new XLS with the new data from our Lambdas.

Today, we plan to insert each line from our XLS file into DynamoDB and trigger Lambda functions whenever a new entry lands in DynamoDB, but I don't know if this is the best way to do it?
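If you do go the DynamoDB route, the usual mechanism is a DynamoDB Streams trigger: each Lambda invocation receives a batch of stream records, and you filter for `INSERT` events. Here is a minimal sketch of such a handler; the attribute names `line_no` and `row` are hypothetical placeholders for whatever your table actually stores.

```python
def handler(event, context):
    """Process only newly inserted items from a DynamoDB Streams batch.

    Stream records encode values with DynamoDB type descriptors,
    e.g. {"N": "1"} for a number or {"S": "a,b"} for a string.
    """
    processed = []
    for record in event.get("Records", []):
        if record.get("eventName") != "INSERT":
            continue  # skip MODIFY / REMOVE events
        new_image = record["dynamodb"]["NewImage"]
        line_no = int(new_image["line_no"]["N"])  # hypothetical attribute
        row = new_image["row"]["S"]               # hypothetical attribute
        processed.append((line_no, row))
    return processed
```

You would do your real per-line work where `processed.append` is, then write the results somewhere (S3, another table) so a final step can assemble the output XLS.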

Thanks for your help :slight_smile:

I would let the front end submit each line to a temp DynamoDB table, then process each line with your different Lambdas.
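One practical note on loading 50k+ lines: `BatchWriteItem` accepts at most 25 items per call, so you either chunk the rows yourself or let boto3's `batch_writer` handle it for you. A sketch, assuming a boto3 `Table` resource and a hypothetical temp table:

```python
def chunk(rows, size=25):
    """Yield successive batches of at most `size` rows.

    Only needed if you call BatchWriteItem directly; boto3's
    batch_writer (below) handles the 25-item limit automatically.
    """
    for i in range(0, len(rows), size):
        yield rows[i:i + size]


def load_rows(table, rows):
    """Write all rows via the table's batch writer.

    `table` would be e.g. boto3.resource("dynamodb").Table("xls-rows-temp")
    (table name is hypothetical); each row is a dict of item attributes.
    """
    with table.batch_writer() as writer:
        for row in rows:
            writer.put_item(Item=row)
```

Doing this from a backend Lambda (rather than the browser) also avoids 50k individual HTTP calls from the front end.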

Actually, I take that back. After thinking it over, I would do this as an ETL job with AWS Glue.

Good luck.