Can't upload to S3 from Python handler

My Serverless user still has admin access, but for some reason my Lambda cannot upload to S3.

import boto3

s3 = boto3.client('s3')

with open('/tmp/file.txt', 'w') as f:
    f.write(data)
s3.upload_file('/tmp/file.txt', bucket_name, 'test.txt')

"errorMessage": "An error occurred (AccessDenied) when calling the PutObject operation: Access Denied"

I haven’t been able to Google anything exactly on topic. What I’ve found suggests adding a policy to my bucket, but all of the policy options I’ve tried have failed.

Hi @AlexMcConnell, your Lambda function gets its own role, IamRoleLambda. You have to define your own S3 access policy and attach that role to the policy.

Try something like this:

    resources:
      Resources:
        IamPolicyLambda:
          Type: AWS::IAM::Policy
          Properties:
            PolicyName: ${self:provider.stage}-${self:service}-lambda
            PolicyDocument:
              Version: "2012-10-17"
              Statement:
                - Effect: Allow
                  Action:
                    - s3:*
                  Resource: arn:aws:s3:::bucket/sub_dir/*
            Roles:
              - Ref: IamRoleLambda
      Outputs:
        IamRoleArnLambda:
          Description: ARN of the lambda IAM role
          Value:
            Fn::GetAtt: [ "IamRoleLambda", "Arn" ]

There is an easier way to do this now. If you create a project from a template, the configuration to achieve this is included, commented out. Try:

$ sls create -t aws-python

and look at serverless.yaml
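As a sketch of what that commented-out section covers (the exact bucket name and actions here are placeholders, not from the template): you grant the function's role S3 access via `iamRoleStatements` under `provider`, which is much shorter than writing the raw CloudFormation policy resource yourself:

```yaml
provider:
  name: aws
  runtime: python2.7
  # Extra permissions merged into the function's generated IAM role.
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:PutObject
      Resource: arn:aws:s3:::your-bucket-name/*
```

After uncommenting and adjusting the bucket ARN, run `sls deploy` again so the role is updated.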

I did this, but it’s very confusing. I had to upgrade to 1.0 from 0.5.5. Are those three files the entire project? Do I replace one of my functions with them?

I understand why you are confused. Firstly, version 1.0 is completely different from 0.5.x. Everyone on this forum is using and discussing v1.0.

My guess is that your problems stem from a lack of groundwork. You will feel much more confident going forward if you spend some time understanding the AWS services you wish to use and the Serverless Framework concepts.

Try creating a new project with the aws-python template, as shown above. Try deploying it. Afterwards, go find the lambda function in your AWS console and hit the Test button. Perhaps try adding an http GET endpoint and sls deploy again.
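For the GET endpoint step, a minimal sketch of the `functions` section (the `hello` function and path names are just the template defaults, adjust to your project):

```yaml
functions:
  hello:
    handler: handler.hello   # handler.py, function hello
    events:
      - http:                # creates an API Gateway GET endpoint
          path: hello
          method: get
```

After `sls deploy`, the framework prints the endpoint URL you can hit in a browser or with curl.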

Check out the docs, and good luck!