AWS Lambda function to write CSV data to an Amazon DynamoDB table

Below is the code to trigger a Lambda function when a CSV file is uploaded to an Amazon S3 bucket and write the data to an Amazon DynamoDB table. I am getting the error "expected str, bytes or os.PathLike object, not dict".

Could you please point out where my mistake is?

import boto3
import csv

s3_client = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')
#table = dynamodb.Table('batch_data')

def csv_write(table_name, rows):
    table = dynamodb.Table(table_name)

    # Write each row to DynamoDB in a batch
    with table.batch_writer() as batch:
        for row in rows:
            batch.put_item(Item=row)
    return True

def read_csv(csv_file, list):
    rows = csv.DictReader(open(csv_file))

    for row in rows:
        list.append(row)

def lambda_handler(event, context):
    try:
        bucket = event['Records'][0]['s3']['bucket']['name']
        csv_file_name = event['Records'][0]['s3']['object']['key']
        response = s3_client.get_object(Bucket=bucket, Key=csv_file_name)

        table_name = 'batch_data'
        items = []

        read_csv(response, items)
        status = csv_write(table_name, items)

        if status:
            print('Data saved')
        else:
            print('Error in saving data...')
    except Exception as err:
        print(err)
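For context on the error message: s3_client.get_object returns a dict, and in the handler above that dict is passed straight into read_csv, where open() expects a file path, which would raise exactly "expected str, bytes or os.PathLike object, not dict". Below is a minimal, self-contained sketch of how csv.DictReader can consume the decoded object body instead of a path; the inline body string here stands in for what response['Body'].read().decode('utf-8') would return in the Lambda (column names are illustrative, not from the original post):

import csv
import io

# In the Lambda this string would come from the S3 response:
#   body = response['Body'].read().decode('utf-8')
body = "name,score\nalice,10\nbob,20\n"

# csv.DictReader accepts any iterable of text lines, so wrapping
# the decoded body in StringIO avoids calling open() on a path.
rows = list(csv.DictReader(io.StringIO(body)))
print(rows)

This prints each CSV row as a dict keyed by the header row, which is the shape batch.put_item(Item=row) expects.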

Content Attribution

This content was originally published by abhishek at Recent Questions - Stack Overflow, and is syndicated here via their RSS feed. You can read the original post over there.
