The S3-Lambda-Dynamo-CSV-Processor project uses AWS services to automatically process and store data from CSV files: S3 for file storage, Lambda for processing, DynamoDB for persisting the parsed records, and API Gateway for accessing the data stored in DynamoDB.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
- An AWS account
- AWS CLI installed and configured on your local machine
- Python 3 and pip3 installed on your local machine
- Clone the repository to your local machine
git clone https://github.com/anubhav-ojha/S3-Lambda-Dynamo-CSV-Processor.git
- Navigate to the project directory
cd S3-Lambda-Dynamo-CSV-Processor
- Install the required Python packages
pip3 install -r requirements.txt
- Deploy the project to your AWS account using the AWS CLI (the --capabilities flag is typically required because templates that create Lambda functions also create IAM roles)
aws cloudformation deploy --template-file template.yml --stack-name S3-Lambda-Dynamo-CSV-Processor --capabilities CAPABILITY_IAM
- Once the deployment is complete, you can test the project by uploading a CSV file to the S3 bucket created by the CloudFormation template. The Lambda function is triggered automatically, processes the file, and stores the data in the DynamoDB table.
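The actual handler lives in this repository; as a rough sketch of the flow described above (the handler name, the table name CsvItems, and the header-row-to-attribute mapping are assumptions, not the project's real code):

```python
import csv
import io

def csv_to_items(csv_text):
    """Parse CSV text into a list of dicts, using the header row as keys."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [dict(row) for row in reader]

def lambda_handler(event, context):
    # boto3 ships with the Lambda runtime; importing it lazily here lets the
    # pure parsing helper above be exercised without any AWS dependencies.
    import boto3

    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("CsvItems")  # table name is an assumption

    # An S3 put-event delivers one or more records naming the bucket and key.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        for item in csv_to_items(body):
            # Assumes the CSV header supplies the table's key attribute(s).
            table.put_item(Item=item)
```

So a file containing `id,name` / `1,alice` would become one DynamoDB item per data row, with the header fields as attribute names.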
- AWS S3
- AWS Lambda
- AWS DynamoDB
- AWS API Gateway
- Python 3 - programming language
- boto3 - AWS SDK for Python
- Anubhav Ojha - anubhav-ojha