Microservice that updates multiple DynamoDB tables?

Interested in best approaches to this - or should a microservice (even a higher-level one) never do this?

Are other services also updating those same tables? My interpretation of “best practice” is that you don’t want multiple services talking to the same table(s) if possible, not the other way around.

I don’t think there’s anything wrong with a service touching multiple tables if the data lends itself to being split across multiple tables; I definitely wouldn’t want to “force” the data into one table just so my service only had to talk to one table.


Let’s imagine the simplest case: a microservice A that owns and uses two DynamoDB tables. Now let’s say A.update() involves an update to both of those tables - what’s the best way to handle that without having to use the DynamoDB transaction lib (which apparently does 7n+4 operations or something) or a complex saga pattern?
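For context, here’s roughly what this looks like with DynamoDB’s server-side transact_write_items API (the 7n+4 figure presumably refers to the older client-side transaction library; the server-side API consumes roughly double the write capacity of plain writes instead). Table and key names here are hypothetical:

```python
# Sketch only: hypothetical table/key names, boto3 low-level client.
import boto3
from botocore.exceptions import ClientError

dynamodb = boto3.client("dynamodb")


def update_a(item_id: str, new_value: str) -> None:
    """Update the item in both tables atomically, or not at all."""
    try:
        dynamodb.transact_write_items(
            TransactItems=[
                {"Update": {
                    "TableName": table,
                    "Key": {"pk": {"S": item_id}},
                    "UpdateExpression": "SET #v = :v",
                    "ExpressionAttributeNames": {"#v": "value"},
                    "ExpressionAttributeValues": {":v": {"S": new_value}},
                }}
                for table in ("TableA", "TableB")
            ]
        )
    except ClientError as e:
        # A cancelled transaction surfaces as TransactionCanceledException.
        raise RuntimeError("transaction failed") from e
```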

Implementing transactions in a distributed system is going to be complex. Off the top of my head you’d need to lock, copy, update, unlock and remove the copy for each item in each table as a minimum. I’m sure I’ve missed something, which would explain the 7n+4.
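To make that concrete, here’s a rough sketch of the lock/copy/update/unlock dance using conditional writes. Table names, the lock table and the item shape are all hypothetical, and a real implementation would also have to persist the copies so a crashed writer could be rolled back:

```python
# Hand-rolled two-table "transaction" sketch (hypothetical names).
import uuid

import boto3
from botocore.exceptions import ClientError

dynamodb = boto3.client("dynamodb")
TABLES = ["TableA", "TableB"]   # the two tables owned by service A
LOCK_TABLE = "Locks"            # hypothetical table used only for locks


def update_both(item_id: str, new_value: str) -> None:
    writer_id = str(uuid.uuid4())

    # 1. Lock: a conditional put that fails if another writer holds the lock.
    try:
        dynamodb.put_item(
            TableName=LOCK_TABLE,
            Item={"pk": {"S": item_id}, "owner": {"S": writer_id}},
            ConditionExpression="attribute_not_exists(pk)",
        )
    except ClientError as e:
        if e.response["Error"]["Code"] == "ConditionalCheckFailedException":
            raise RuntimeError("item locked by another writer") from e
        raise

    try:
        # 2. Copy: snapshot the current items so we can roll back.
        # (Only in memory here; a real version would persist these copies,
        # otherwise a crash between the writes loses the rollback data.)
        copies = {
            t: dynamodb.get_item(TableName=t, Key={"pk": {"S": item_id}}).get("Item")
            for t in TABLES
        }

        # 3. Update both tables, restoring the copies if a write fails.
        written = []
        try:
            for t in TABLES:
                dynamodb.put_item(
                    TableName=t,
                    Item={"pk": {"S": item_id}, "value": {"S": new_value}},
                )
                written.append(t)
        except ClientError:
            for t in written:  # best-effort rollback from the copies
                if copies[t] is not None:
                    dynamodb.put_item(TableName=t, Item=copies[t])
            raise
    finally:
        # 4. Unlock (and, in a real version, remove the persisted copies).
        dynamodb.delete_item(TableName=LOCK_TABLE, Key={"pk": {"S": item_id}})
```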

Options:

  1. Restructure the data so it fits into a single table.
  2. If you’re storing more than 400 KB per item (DynamoDB’s item size limit) then consider using S3. It’s basically a giant key/value store like DynamoDB, but slower.
  3. If you can cope with temporary inconsistencies and your concern is just that all of the data gets written eventually, then maybe put a Kinesis stream in the middle so failures are retried? (See the sketch after this list.)
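A minimal sketch of option 3, assuming a hypothetical stream name and idempotent writes so retries are safe:

```python
# Sketch of option 3: one record into Kinesis, both table writes applied
# (and retried) by the consumer. Stream/table/field names are hypothetical.
import json
import time

import boto3
from botocore.exceptions import ClientError

kinesis = boto3.client("kinesis")
dynamodb = boto3.client("dynamodb")


def publish_update(item_id: str, new_value: str) -> None:
    # Producer side: A.update() becomes a single write to the stream.
    kinesis.put_record(
        StreamName="service-a-updates",          # hypothetical stream
        PartitionKey=item_id,                    # keeps per-item ordering
        Data=json.dumps({"pk": item_id, "value": new_value}).encode("utf-8"),
    )


def apply_update(record_data: bytes) -> None:
    # Consumer side (e.g. a Lambda on the stream). Puts are idempotent, so
    # reprocessing after a partial failure just rewrites the same values.
    event = json.loads(record_data)
    for table in ("TableA", "TableB"):
        for attempt in range(5):
            try:
                dynamodb.put_item(
                    TableName=table,
                    Item={"pk": {"S": event["pk"]},
                          "value": {"S": event["value"]}},
                )
                break
            except ClientError:
                time.sleep(2 ** attempt)         # crude backoff
        else:
            # Let the stream's checkpointing retry the whole batch.
            raise RuntimeError(f"could not write to {table}")
```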