- MySQL database
- S3 bucket
- Airflow instance
- Create the schema and table in MySQL
- Create the following two connections:
  a. aws_conn -> AWS connection credentials
  b. rds_mysql -> MySQL RDS credentials
- Place the CSV files in the bucket under a path matching the pattern my_s3_bucket/yyyy/mm/dd/hh/mm/file.
- Update the S3 source location and the MySQL schema and table details in the script.
- Place S3_Mysql_Load_Airflow.py in the Airflow DAGs folder and enable it in the UI.
- Trigger the Airflow DAG manually, or wait for the next scheduled run (the DAG is scheduled to run every hour).
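The hourly-partitioned S3 path described in the steps above can be sketched with the Python standard library. This is a minimal illustration, not code from the DAG itself: the bucket name, timestamp, and file names below are placeholders.

```python
import re
from datetime import datetime, timezone

def hourly_prefix(bucket: str, ts: datetime) -> str:
    """Build the bucket/yyyy/mm/dd/hh/mm key prefix for a given timestamp."""
    return f"{bucket}/{ts.strftime('%Y/%m/%d/%H/%M')}"

# Regex for object keys laid out as yyyy/mm/dd/hh/mm/file
KEY_PATTERN = re.compile(r"^\d{4}/\d{2}/\d{2}/\d{2}/\d{2}/[^/]+$")

def matching_keys(keys):
    """Keep only the keys that follow the expected partition layout."""
    return [k for k in keys if KEY_PATTERN.match(k)]

# Placeholder example: prefix for 2024-01-15 09:30 UTC
ts = datetime(2024, 1, 15, 9, 30, tzinfo=timezone.utc)
print(hourly_prefix("my_s3_bucket", ts))  # my_s3_bucket/2024/01/15/09/30

keys = ["2024/01/15/09/30/orders.csv", "misc/readme.txt"]
print(matching_keys(keys))  # ['2024/01/15/09/30/orders.csv']
```

A layout like this is what lets an hourly-scheduled DAG compute a single prefix per run and pick up only the files dropped in that hour's folder.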