Getting off the ground with DynamoDB

I’m having real trouble getting off the ground with DynamoDB and the AWS docs are all but useless. Can someone point me to some node.js code examples where we query a DynamoDB table and read the records one by one, accessing the field values in each (not just doing a stupid console.log)? I know this is incredibly trivial but I am actually not able to find such an example in all the AWS docs.

here’s an example of an update call to a table from a project I worked on.

var updateParam = {
    ExpressionAttributeNames: {
        "#ipAddr": "IpAddress",
        "#update": "LastUpdate"
    },
    ExpressionAttributeValues: {
        ":addr": ipAddress,
        ":time": timeStamp
    },
    Key: {
        "MACAddress": macAddress
    },
    ReturnValues: "UPDATED_NEW",
    TableName: "",
    UpdateExpression: "SET #ipAddr = :addr, #update = :time"
};

dynamodb.update(updateParam, function(error, data) {
    if (error) {
        callback("Error in UpdateDevice - UPDATE");
    } else {
        // with ReturnValues: "UPDATED_NEW", data.Attributes holds the
        // attribute values as they look after the update
        callback(null, data);
    }
});
Appreciate that, thanks. Do you have any Query code? I’m just amazed I can’t find any that does more than console.log. I need to actually retrieve field values and use them.

sure… it’s a little harder to read, primarily b/c the query, upon return, will turn around and invoke the code you see above (as well as potentially a PUT if the # of rows returned is 0).

var findParam = {
    TableName: <table name>,
    IndexName: 'idxEmailAddress',
    KeyConditionExpression: "#email = :addr",
    ExpressionAttributeNames: {
        "#email": "EmailAddress"
    },
    ExpressionAttributeValues: {
        ":addr": emailAddress   // the email value you're looking up
    }
};

dynamodb.query(findParam, function(error, data) {
    if (error) {
        callback("Error in UpdateDevice - " + macAddress);
    } else {
        if (data.Count === 0) {
            // < blah blah blah, do a put, otherwise do an update >
        }
    }
});

We’re so close. What I’m looking for is how to access the data object returned by the query.

For instance, how can you access the email address in your query? data.Email.S doesn’t work in my testing and neither does data.items[0].email.S. How can you access that?

that second parameter, “data”, contains an array (Items) of whatever objects you have stored in the Dynamo table. for instance… let’s say your table was defined as:

email string
macAddress string

upon doing a query that didn’t return an error, you’d be given an object whose Items array contains 0 to N elements which satisfied the query. the object has a property “Count” which contains the number of rows returned. when “Count” > 0 you can safely address the data array positions starting at data.Items[0]… something like this would work…

for (var i = 0; i < data.Count; i++) {
    // with the DocumentClient each Item is a plain object, so no .S suffix:
    var email = data.Items[i].EmailAddress;
}
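to make the shape concrete, here’s a minimal sketch with a hard-coded object in the shape the DocumentClient’s query callback hands you as “data” (the field names and values here are made up):

```javascript
// Hypothetical query response, shaped like what the DocumentClient
// returns: Count plus an Items array of plain JS objects
var data = {
    Count: 2,
    Items: [
        { EmailAddress: "alice@example.com", MACAddress: "aa:bb:cc:dd:ee:01" },
        { EmailAddress: "bob@example.com",   MACAddress: "aa:bb:cc:dd:ee:02" }
    ]
};

var emails = [];
for (var i = 0; i < data.Count; i++) {
    // each Item is a plain object; read its fields directly, no .S needed
    emails.push(data.Items[i].EmailAddress);
}
```

(if you were using the low-level `AWS.DynamoDB` client instead of the `DocumentClient`, each attribute would come back as a typed wrapper like `{ S: "alice@example.com" }`, which is where the `.S` suffix you tried comes from.)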

You’re a hero! It worked! I really, really appreciate your help.

honestly? when in doubt? let it error out… just put a lot of console.log() calls in your code so you can see what the objects look like. properties will appear that will only improve your understanding…

and of course you are welcome…


Just one more question if I may. I am using API gateway to invoke my Lambda functions as I’m trying to build an entirely serverless system. If I put console.log in my code that is invoked by browsing to a website, can I see the output anywhere?

if your console.log("=== message ===") is in your Lambda function, it will appear in the CloudWatch logs created for your API Gateway/Lambda function automatically by the Serverless Framework.
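to illustrate, here’s a minimal sketch of a Node.js Lambda handler (the handler name, event shape, and response shape are illustrative, not from the thread) — anything passed to console.log inside it ends up in the function’s CloudWatch log group:

```javascript
// Hypothetical Lambda handler. console.log output inside the handler
// is captured by CloudWatch Logs when it runs on AWS.
var handler = function(event, context, callback) {
    console.log("=== incoming event ===", JSON.stringify(event));
    // API Gateway proxy integrations expect a statusCode/body response
    callback(null, { statusCode: 200, body: "ok" });
};

// invoking it locally just to show the flow
handler({ path: "/" }, {}, function(error, response) {
    console.log("handler returned:", response.statusCode);
});
```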

Perfect. A good project for tomorrow. :slight_smile: Thanks again.

you are welcome…its getting late here too…need to hit the sack…

DynamoDB was so tough to set up and query that my team switched back to RDS/Postgres.


The problem experienced here seems to stem from the misunderstanding that DynamoDB is a drop-in replacement for a relational database. It isn’t. If you design your tables for DynamoDB the way you do for Postgres or MySQL you will end up having issues. For example, there is no need for relations: you can store related data as objects within an item in Dynamo. You also can’t just query willy-nilly. You need to understand the access patterns of your application and then design keys around that access, otherwise you will suffer frustration at the sheer cost of running scans.
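As a sketch of what “design keys around access patterns” means: if the app’s main read is “all orders for a customer, newest first”, you make the customer id the partition key and a timestamp the sort key, so the read is a cheap Query instead of a Scan. Everything below (table name, key names, function name) is invented for illustration; you’d pass the returned params to `documentClient.query(...)`:

```javascript
// Hypothetical table "Orders": partition key CustomerId, sort key OrderTimestamp.
// Building the query params is plain object construction.
function buildRecentOrdersQuery(customerId, limit) {
    return {
        TableName: "Orders",
        KeyConditionExpression: "CustomerId = :c",
        ExpressionAttributeValues: { ":c": customerId },
        ScanIndexForward: false,   // descending sort key order, i.e. newest first
        Limit: limit
    };
}
```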

DynamoDB is also crap at doing things like regular imports or exports of large data sets. You need to bolt ancillary services on in order to accommodate that. For example, I use the DynamoDB stream on a table to send data changes to a Lambda function which persists them in an S3 bucket. When it comes time for the monthly CSV export the client wants, it comes straight out of S3.
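The core of that stream-to-S3 pattern is converting the typed attribute values a DynamoDB stream record carries back into plain objects. Here’s a rough sketch of just that conversion step (it only handles S and N types for brevity; a real handler would loop over `event.Records`, use the SDK’s converter, and write out with `s3.putObject`):

```javascript
// Convert a DynamoDB stream NewImage (typed attribute values like
// { S: "..." } or { N: "3" }) into a plain JS object.
// Only handles S and N types; other types are skipped.
function unmarshalImage(image) {
    var out = {};
    Object.keys(image).forEach(function(key) {
        var attr = image[key];
        if (attr.S !== undefined) out[key] = attr.S;          // string attribute
        else if (attr.N !== undefined) out[key] = Number(attr.N); // numbers arrive as strings
    });
    return out;
}

// example: the shape a stream record's NewImage takes
var row = unmarshalImage({ EmailAddress: { S: "a@b.com" }, Visits: { N: "3" } });
```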

DynamoDB is designed to be a key-value store with single digit millisecond latency on requests over its keys. It is not a general purpose datastore.


These are all good points, @garethmcc. I actually think I understand the purpose of DynamoDB quite well, and I understand that it’s not a traditional RDBMS. When someone like me moves from an RDBMS to a NoSQL store, there will undoubtedly be many (hundreds or even thousands) of cases where he has to re-learn how to do something simple. That’s all that’s happening with me. I will have a year or two in front of me to get as comfortable with NoSQL as I am with SQL, and that year or two will be filled with pain and frustration. In my case it’s exacerbated by having to also learn a new language (Node) which operates quite differently from what I’m used to. I obviously think it’s all worth the pain or I wouldn’t do it.

Hi @RegNatarajan. My comment wasn’t meant as criticism of any single person, just a general FYI to anyone currently reading this thread, or who may come along later, and not be aware of these reasonably fundamental differences.

I am by no means a NoSQL or DynamoDB expert but I’m learning.
