Balagururaja Alagarsamy
Senior AWS Developer
CONTACT
************@*****.***
Current location: Chennai
LANGUAGES
Tamil
English
SKILLS
Programming
Python, SQL, PySpark, Java, Unix
Data
Snowflake, Redshift, Metabase, dbt, Oracle, PostgreSQL
Orchestration
Apache Airflow, Control-M, AWS Step Functions
Cloud
AWS - S3, EC2, EMR, IAM, Lambda, Glue, SCT, DMS
Azure - Azure Functions, Azure Data Lake Storage (Gen2), Data Factory
BI Tools
SSIS, Power BI
INTERESTS
Reading books
Playing Cricket
Summary
Versatile AWS Data Engineer with a background in designing, architecting, and implementing high-performance data processing systems. Demonstrated ability to leverage the full AWS technology stack while applying expertise in big data solutions and complex business intelligence infrastructures. Strengths include advanced knowledge of ETL process design, development, and optimization, along with strong problem-solving skills. Previous work has consistently produced reliable, scalable systems built for optimal functionality and performance.
WORK EXPERIENCE
Senior AWS Developer
HTC Global Services
April 2022 - Present
●Design and implement data pipelines using AWS services and orchestrate them using Step Functions.
●Develop data ingestion frameworks using the AWS tech stack.
●Develop Terraform/CloudFormation templates to automate deployment of AWS services.
●Collaborate with other data architects to implement data models and ensure seamless integration with AWS services.
●Own production support: monitor and fix production issues to meet agreed SLAs (service level agreements).
Senior Data Engineer
Fleet Studio
September 2021 - April 2022
●Designed and developed a Python framework to process data from different client warehouses into S3 and then load it into a PostgreSQL database.
●Automated and orchestrated the process using Apache Airflow.
●Developed dbt models for faster query processing.
Specialist Data Engineer
L&T Infotech
August 2019 – September 2021
●Designed and developed a metadata-driven Python framework to unload data from Redshift to S3 and then load it into downstream databases.
●Automated and orchestrated the process using Control-M.
●Developed Unix scripts to transfer data files from a Windows server to S3 using SFTP.
●Developed an SSIS package to extract data from SQL Server and load it to S3.
Data Engineer
Cognizant Technology Services (CTS)
March 2016 – August 2019
●Designed and developed a Python framework to unload data from Redshift to S3, transform it, and then load it into Snowflake.
●Automated and orchestrated the process using Apache Airflow.
●Reused existing Python scripts with minimal development to perform data transformations.
Senior Software Engineer
HTC Global Services
April 2012 – March 2016
●Designed website pages.
●Developed Java code for the data store.
●Debugged and fixed issues.
●Supported SIT (system integration testing).
●Deployed code from SVN using Jenkins.
EDUCATION HISTORY
Bachelor of Engineering in Computer Science and Engineering
Anna University
2007 – 2011
Percentage - 65%
H.S.C. - 84.5%
S.S.C. - 87%