PROFESSIONAL SUMMARY
Over nine years of experience in cross-platform application development, design, and cloud technologies.
In-depth experience with DevOps and AWS cloud services (EC2, S3, EBS, ELB, CloudWatch, Elastic IP, EKS, RDS, SNS, SQS, Glacier, IAM, VPC, ECR, ECS, CloudFormation, Route 53) and with managing security groups on AWS.
Experience using Docker to containerize web applications and deploy them on EKS.
Experience writing Terraform templates as reusable modules for enterprise-wide use.
Experienced with the monitoring tools Datadog and Splunk.
Experience in cloud cost optimization using FinOps practices.
Experienced DevOps engineer working with technologies such as Git, Jenkins, Terraform, AWS, and Maven.
Experience in infrastructure automation with Terraform and CloudFormation templates; experienced in developing and supporting both production and development environments.
Developed multi-Availability Zone architectures to achieve high availability for applications.
Experienced in deploying and operating AWS services, specifically VPC, EC2, S3, EBS, IAM, ELB, CloudFormation, and CloudWatch, using the AWS Console and the AWS Command Line Interface.
Experienced in DevOps, cloud infrastructure, and automation, including Amazon Web Services (AWS), CloudFormation, CloudBees Jenkins, and Git.
Experienced in AWS cloud administration, covering services such as EC2, S3, EBS, VPC, ELB, AMI, RDS, IAM, SNS, Route 53, Lambda, CloudWatch, CloudTrail, CloudFormation, and security groups.
Experienced in creating customized IAM roles to control and monitor user access.
Experienced in using Splunk to monitor application logs.
Experienced in storing data in S3 buckets and archiving it to Glacier via S3 lifecycle policies to reduce storage costs (see the sketch at the end of this summary).
Encrypted data in S3 buckets using customer-managed KMS (Key Management Service) keys and AES-256 (Advanced Encryption Standard).
Experienced in launching EC2 instances in Red Hat Linux, other Linux, and Windows operating environments.
Worked with EBS (Elastic Block Store) volumes to store data locally and improve application performance.
Experienced in launching RDS (Relational Database Service) instances for multiple database engines such as PostgreSQL and SQL Server.
Experienced in connecting to RDS instances from EC2 instances by installing the PostgreSQL client.
Experienced in encrypting RDS instances using customer-managed KMS keys.
Experienced in backing up RDS databases via automated RDS snapshots.
Working experience with the AWS CLI (Command Line Interface).
Experienced in migrating data from on-premises data sources into AWS RDS databases using AWS Glue and AWS DMS.
Experienced in using CloudTrail to monitor user activity and for auditing purposes.
Experienced in creating AWS API Gateway endpoints for RESTful web services.
Experienced in creating Amazon Machine Images (AMIs) from EC2 instances.
Experienced in creating EBS volume snapshots and encrypting them with KMS keys.
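A minimal boto3 sketch of the S3 lifecycle archiving to Glacier and the customer-managed KMS encryption described above. The bucket name, prefix, key alias, and day counts are hypothetical placeholders, not values from any engagement.

import boto3

s3 = boto3.client("s3")
bucket = "example-app-logs-bucket"  # hypothetical bucket name

# Transition objects under logs/ to Glacier after 90 days and expire them after 365 days.
s3.put_bucket_lifecycle_configuration(
    Bucket=bucket,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-logs",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365},
            }
        ]
    },
)

# Enforce default server-side encryption with a customer-managed KMS key (SSE-KMS).
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-cmk",  # hypothetical key alias
                }
            }
        ]
    },
)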
CERTIFICATIONS
AWS Certified Solutions Architect - Associate
HashiCorp Terraform Certified
Validation Number: YZHP8D61VN1Q18SG
Validate at: https://aws.amazon.com/verification
TECHNICAL SKILLS
Languages : Java, SQL, PL/SQL, XML, C++, C, WSDL, XHTML, HTML, CSS, JavaScript
Development Tools : Eclipse, IntelliJ, Visual Studio
Design and Modeling : Rational Rose, Microsoft Visio
Cloud Computing : AWS (EC2, ELB, S3, VPC, Lambda, RDS, IAM, DMS, Redshift, CloudFormation, CloudWatch, Glacier, AWS Glue, API Gateway, SNS, SQS, Step Functions, KMS, Route 53, Data Pipeline, EKS, ECS)
Monitoring Tools : AWS CloudWatch, Splunk, Datadog
Cloud Security Tools : Prisma Cloud, Evident.io
CI/CD Tools : Jenkins, CloudBees, GitLab
Build Tools : Maven, ANT
Configuration Mgmt Tools : Chef, Ansible
Databases : Oracle 11g/10g/9i/8i, SQL Server, DynamoDB, AWS RDS
App/Web Servers : WebLogic 8.1/11g, JBoss, Apache, Tomcat, WebSphere v8.5
Software Engineering : Agile, Waterfall
Scripting Languages : JavaScript, Bash, Shell, JSON
Version Control : GitHub
Environments : UNIX, Red Hat Linux, Windows 2000, Windows XP
PROFESSIONAL EXPERIENCE
AWS Cloud Infrastructure, DevOps, and Security Engineer Date: May 2021 to Present
Client: Verisk Analytics, Jersey City, NJ
Responsibilities:
Extensive experience working in an Agile development environment.
Modernized applications by deploying them to the AWS Cloud.
Containerized Java- and Python-based applications using Docker and deployed them to AWS EKS.
Developed reusable Terraform modules to automate and provision cloud infrastructure across the organization.
Rehydrated applications with new image versions and kept them up to date.
Used Datadog as a monitoring tool, wrote index queries, and created Datadog dashboards and alerts.
Used Datadog for synthetic monitoring of applications.
Applied FinOps practices and provided cost-optimization solutions.
Reduced AWS cloud service costs by 65% by applying FinOps cost-optimization techniques and provided cost estimates to senior management for budget approvals.
Automated provisioning of custom, least-privilege AWS IAM roles and policies through a Roles-as-a-Service pipeline to secure the cloud infrastructure.
Used Prisma Cloud to scan AWS accounts for security issues and vulnerabilities and auto-remediated them via the Prisma Cloud APIs and custom AWS Config rules.
Migrated data from on-premises sources to AWS S3 buckets using AWS Glue.
Automated ETL pipelines using the AWS Glue Studio pipeline service.
Used AWS Secrets Manager to store secrets and passed them when establishing database connections from the cloud to the on-premises data center.
Wrote Jenkinsfiles to automate the build and deployment process.
Used SonarQube for code coverage and code-quality analysis of the applications.
Developed AWS Lambda functions in Python for serverless applications (see the sketch at the end of this section).
Processed data from S3 buckets into the Redshift data warehouse using AWS Glue jobs.
Deployed applications using a blue/green deployment model for high availability, resiliency, and disaster recovery.
Worked on Amazon Web Services (AWS) for a multitude of applications using services such as EC2, VPC, Glacier, Route 53, S3, RDS, CloudWatch, CloudTrail, WAF, SNS, and IAM, focusing on high availability, fault tolerance, load balancing, and auto scaling in design, deployment, and configuration.
Monitored metrics such as CPU utilization, swap usage, database connections, and read/write IOPS using CloudWatch across services such as EBS, ElastiCache, RDS, and ELB.
Created EC2 instances for web-based applications and attached Elastic Load Balancing (ELB) for high performance across multiple AZs.
Designed a serverless application architecture using S3, Lambda, API Gateway, DynamoDB, Route 53, and SQS.
Provided highly durable and available data using the S3 data store, versioning, and lifecycle policies, and created AMIs of mission-critical production servers for backup.
Performed Jenkins administration: updating plugins, setting up new projects, and debugging build problems.
Installed and configured Nginx as a web server.
Administered and engineered Jenkins to manage the weekly build, test, and deploy chain, using SVN/Git with a Dev/Test/Prod branching model for weekly releases.
Worked on Kubernetes to deploy, load balance, scale, and manage Docker containers across multiple namespaced versions using Helm charts.
Created HashiCorp Terraform scripts to provision AWS cloud infrastructure.
Created Terraform modules to automate and provision the cloud infrastructure.
Worked on core AWS services: setting up new EC2 instances, configuring security groups, and setting up Elastic IPs, auto scaling, and CloudFormation.
Used Simple Storage Service (S3) for snapshots and configured S3 lifecycle rules for application and database logs, including deleting old logs and archiving logs per the retention policies of the applications and databases.
Configured and managed AWS Glacier vaults to move old data into archives based on database and application retention policies.
Experience in IP networking, VPNs, DNS, load balancing, and firewalls.
Developed scripts for build, deployment, maintenance, and related tasks using Jenkins, Docker, Maven, Python, and Bash.
Wrote scripts in Python, Perl, and Java to automate log rotation for web servers; automated and monitored AWS services such as VPC, ELB, RDS, Lambda, AWS OpsWorks, and CloudFront using shell and Python scripts; and used the Jenkins AWS CodeDeploy plugin for deployments.
Developed REST APIs using Python.
Automated and provisioned cloud infrastructure through DevOps pipelines.
Hands-on experience automating infrastructure with Terraform as infrastructure as code.
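A minimal sketch of a Python Lambda handler of the kind described above, assuming an S3 object-created trigger and a DynamoDB table for object metadata; the table and attribute names are hypothetical.

import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-object-metadata")  # hypothetical table name


def lambda_handler(event, context):
    # Record the bucket, key, and size of each newly created S3 object.
    records = event.get("Records", [])
    for record in records:
        s3_info = record["s3"]
        table.put_item(
            Item={
                "object_key": s3_info["object"]["key"],
                "bucket": s3_info["bucket"]["name"],
                "size_bytes": s3_info["object"]["size"],
            }
        )
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}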
Environment: AWS services such as EC2, S3, RDS, Lambda, API Gateway, VPC, Redshift, KMS, IAM, DynamoDB, Glacier, EMR, Amazon Linux AMI, CloudWatch, EBS, ELB, OpsWorks, and CloudFormation templates, along with Jenkins, CloudBees, Git, GitHub, Bitbucket, JIRA, Confluence, SonarQube, Checkmarx, Nagios, Docker, Maven, PowerShell, and Python.
AWS Cloud and DevOps Engineer Date: June 2020 to April 2021
Client: Cigna, Raleigh, NC
Responsibilities:
Extensive experience working in an Agile development environment.
Participated in disaster recovery exercises and was actively involved in creating replicas in other AWS Availability Zones.
Created CloudWatch alarms to monitor server performance, CPU utilization, disk usage, and other metrics.
Provisioned highly available EC2 instances using Terraform and CloudFormation and wrote new plugins to support new functionality in Terraform.
Worked with Terraform as infrastructure as code to manage application infrastructure such as storage and networking.
Used the security and compliance tools Prisma Cloud and Evident.io to secure the AWS cloud.
Prepared cost estimates for AWS service usage and provided a detailed cost-estimation report to management.
Reduced end-customer costs on AWS cloud services by analyzing business requirements and project durations and recommending instance purchase options.
Reduced the AWS cloud services bill by 70% by implementing an AWS instance scheduler.
Scheduled EC2 instances with the instance scheduler to automatically stop and start instances in the Dev and Test environments based on working hours and weekends (see the sketch at the end of this section).
Migrated the entire 70 TB Asia data lake from multiple on-premises data sources to the AWS Redshift data warehouse and S3 buckets using AWS DMS and AWS Transfer Family.
Wrote scripts in Python, Perl, and Java to automate log rotation for web servers; automated and monitored AWS services such as VPC, ELB, RDS, Lambda, AWS OpsWorks, and CloudFront using shell and Python scripts; and used the Jenkins AWS CodeDeploy plugin for deployments.
Troubleshot issues on production servers and implemented failover policies.
Created an AWS RDS Aurora DB cluster and connected to the database through an Aurora DB instance using the Amazon RDS console.
Automated cloud deployments using AWS CloudFormation templates (JSON and YAML).
Used Amazon Macie to analyze and classify sensitive data and identify data security threats.
Prepared AWS infrastructure documentation for the team.
Used DynamoDB to store metrics data and backend reports for the DataStage team.
Created an AWS CodePipeline to automate builds and deployments.
Installed and configured Jenkins and SonarQube for the AWS CodePipeline.
Installed a JMS client on an EC2 instance and configured the JMS connection to the IBM MQ server to send and receive messages.
Migrated the production SQL Server schema to a new AWS RDS Aurora instance.
Worked on internal PowerShell applications for system administration of 400+ servers in the Dev, QA, and Prod environments.
Developed and improved Java applications that perform configuration changes and deployments in different environments via PowerShell.
Managed Git repositories and code for the DevOps team's internal applications and integrated the setup with SourceTree for other users.
Part of the Platform team that architected the deployment procedure for multiple applications; served as release coordinator for bi-weekly microservice deployments.
Developed scripts for build, deployment, maintenance, and related tasks using Jenkins, Docker, Maven, Python, and Bash.
Developed a front-end UI application in AngularJS for Dev teams to submit requests to the Platform team (e.g., new repositories, subsystems, and user access).
Implemented continuous integration and delivery using Bamboo, Jenkins, and GitHub; maintained Bitbucket Server for Git repositories.
Proficient with the Atlassian suite (JIRA, Confluence, Bitbucket, Bamboo, HipChat).
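A minimal sketch of the instance scheduler described above, written as a Python Lambda function assumed to be triggered hourly (for example by an EventBridge rule). The Environment tag, the dev/test values, and the 8:00-18:00 UTC weekday window are hypothetical.

from datetime import datetime, timezone

import boto3

ec2 = boto3.client("ec2")
FILTERS = [
    {"Name": "tag:Environment", "Values": ["dev", "test"]},       # hypothetical tag
    {"Name": "instance-state-name", "Values": ["running", "stopped"]},
]


def lambda_handler(event, context):
    # Start tagged instances during weekday working hours, stop them otherwise.
    now = datetime.now(timezone.utc)
    in_working_hours = now.weekday() < 5 and 8 <= now.hour < 18

    paginator = ec2.get_paginator("describe_instances")
    instance_ids = [
        instance["InstanceId"]
        for page in paginator.paginate(Filters=FILTERS)
        for reservation in page["Reservations"]
        for instance in reservation["Instances"]
    ]
    if not instance_ids:
        return {"action": "none"}

    if in_working_hours:
        ec2.start_instances(InstanceIds=instance_ids)  # no-op for already running instances
        return {"action": "start", "instances": instance_ids}
    ec2.stop_instances(InstanceIds=instance_ids)  # no-op for already stopped instances
    return {"action": "stop", "instances": instance_ids}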
Environment: AWS services such as EC2, S3, RDS, Lambda, IAM, CodePipeline, CodeCommit, DynamoDB, Glacier, EMR, Amazon Linux AMI, CloudWatch, EBS, ELB, OpsWorks, and CloudFormation templates, along with Docker, Python, Jenkins, Git, Ant, Maven, TFS, SVN, JIRA, Confluence, Bitbucket, Bamboo, PowerShell, Bash, C#, Angular, Perl, Ansible, servlets, Nagios, and WebSphere v8.5.
AWS Cloud & DevOps Engineer Date: January 2019 to June 2020
Client: Anthem, Cary, NC
Responsibilities:
Migrated production infrastructure into Amazon Web Services using AWS CloudFormation, CodeDeploy, Terraform, EBS, and OpsWorks.
Managed the AWS cloud and maintained Chef cookbooks to automate system operations.
Wrote Chef cookbooks to bootstrap the Chef client and create VMs in cloud environments with the desired applications on each node.
Supported the automation of builds, deployments, testing, and infrastructure (infrastructure as code) using Chef.
Developed Chef recipes to configure, deploy, and maintain software components of the existing infrastructure.
Automated cloud deployments using Chef, Python (Boto and Fabric), and AWS CloudFormation templates.
Created Python scripts that integrated with the Amazon APIs to control instance operations.
Used the AWS SDK for Python to create EC2 instances dynamically (see the sketch at the end of this section).
Assisted in migrating the existing data center into the AWS environment.
Set up auto scaling for instance groups using AWS command-line tools in the Dev/QA environments.
Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancing, and Auto Scaling groups.
Used IAM to create and manage AWS users and groups and applied permissions to allow or deny access to AWS resources.
Used CloudWatch to monitor resources such as EC2 CPU and memory, Amazon RDS DB services, DynamoDB tables, and EBS volumes; to set alarms for notifications or automated actions; and to monitor logs for a better understanding and operation of the system.
Automated the complete installation and hosting process with AWS CloudFormation and PowerShell scripts.
Wrote shell and Perl scripts for compilation, deployment, and build automation, and used PowerShell for Windows deployment and administration.
Used PowerShell for DevOps in Windows-based systems.
Wrote Python scripts to automate EC2 instance tasks.
Used Python Boto and Fabric to launch and deploy instances in AWS.
Automated application and MySQL container deployments in Docker using Python and monitored these containers with Nagios.
Documented all build- and release-process items and provided level-one support for build and deployment issues encountered during the build process.
Automated deployment of builds to different environments using Jenkins.
Implemented and maintained branching and build/release strategies using Git.
Wrote release notes documenting useful information about each release: software versions, changes implemented, defects fixed, and labels applied.
Strong understanding of infrastructure automation tooling (AWS CloudFormation, EBS).
Deployed applications on the Apache web server, Nginx, and application servers such as Tomcat and Oracle WebLogic Server.
Implemented Puppet modules to install, configure, and maintain web servers such as Apache and Nginx.
Wrote Puppet modules for installing and managing Java versions and managing persistent SSH tunnels.
Performed periodic archiving and storage of the source code for disaster recovery.
Developed Splunk queries to generate reports.
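A minimal sketch of dynamic EC2 instance creation with the AWS SDK for Python (boto3), as referenced above. The AMI ID, instance type, key pair, security group, and tag values are hypothetical placeholders, not values from this engagement.

import boto3

ec2 = boto3.resource("ec2")

# Launch one to two tagged instances from a placeholder AMI.
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",            # hypothetical AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=2,
    KeyName="example-keypair",                  # hypothetical key pair
    SecurityGroupIds=["sg-0123456789abcdef0"],  # hypothetical security group
    TagSpecifications=[
        {
            "ResourceType": "instance",
            "Tags": [{"Key": "Environment", "Value": "qa"}],
        }
    ],
)

for instance in instances:
    instance.wait_until_running()
    instance.reload()
    print(instance.id, instance.state["Name"], instance.private_ip_address)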
Environment: EC2, RDS, S3, IAM, VPC, CloudWatch, CloudTrail, SNS, EBS, Route 53, ELB, Amazon Machine Images, Elastic Beanstalk, Python (Boto), shell scripting, Linux, MySQL, JIRA, Jenkins, Ant, Maven, Puppet, Git, AppDynamics, Splunk, Docker, Rackspace
AWS Cloud & DevOps Engineer Date: November 2016 to December 2018
Client: Vanguard, Malvern, PA
Responsibilities:
Extensive experience working in an Agile development environment.
Reduced costs on AWS cloud services by analyzing business requirements and project durations.
Prepared cost estimates for AWS service usage and delivered the cost-estimation report to management.
Scheduled EC2 instances with the instance scheduler to automatically stop and start instances in the Dev and Test environments based on office hours and weekends.
Created Terraform scripts to automate and provision AWS cloud services through the DevOps pipeline.
Monitored application and server logs using CloudWatch.
Maintained infrastructure resources on the AWS platform using services such as EC2, S3, Route 53, VPC, SQS, RDS, IAM, CloudFormation templates, and CloudWatch, focusing on a highly available and fault-tolerant environment.
Used Route 53 weighted routing policies to route traffic to the secondary region (us-west) on failure in the primary region (us-east).
Used Elastic Kubernetes Service (EKS) to deploy, manage, and scale containerized applications with Kubernetes on AWS.
Used the Prisma Cloud security and compliance tool to secure AWS cloud services.
Created notifications and alarms for EC2 instances (CPU utilization) using CloudWatch and installed the ELK stack for monitoring (see the sketch at the end of this section).
Encrypted data stored in S3 buckets using customer-managed KMS keys and enabled versioning to retain multiple versions of the data.
Created custom S3 bucket policies to secure S3 bucket data.
Created custom IAM policies scoped to specific regions and secured AWS cloud services by assigning custom roles to users.
Provisioned EC2 instances with Linux and Windows operating systems.
Created security groups and allowed traffic through inbound rules.
Created Amazon Machine Images (AMIs) from EC2 instances.
Created Elastic Block Store volumes and encrypted EBS volumes with customer-managed KMS keys.
Created a Systems Manager (SSM) role and attached it to EC2 instances to connect through Session Manager.
Encrypted AWS RDS databases using customer-managed KMS keys.
Created CloudFormation templates in JSON and YAML.
Created SNS topics to send notifications.
Migrated the entire 20 TB US data lake from on-premises data sources to AWS RDS and S3 buckets using AWS DMS.
Worked on Amazon Web Services (AWS) for a multitude of applications using services such as EC2, VPC, Glacier, Route 53, S3, RDS, and CloudWatch, focusing on high availability, load balancing, and auto scaling in design, deployment, and configuration.
Automated AWS cloud infrastructure through a CI/CD pipeline using Terraform and Jenkins.
Administered and engineered Jenkins to manage the weekly build, test, and deploy chain, using Git with a Dev/Test/Prod branching model for weekly releases.
Configured and managed AWS Glacier to move old data into archives based on database retention policies.
Set up and configured AWS EMR clusters and used IAM to grant users permissions on AWS resources.
Developed scripts for build, deployment, maintenance, and related tasks using Jenkins, Docker, Maven, Python, and Bash.
Troubleshot issues on production servers and implemented failover policies.
Created an AWS RDS Aurora DB cluster and connected to the database through an Aurora DB instance using the Amazon RDS console.
Automated cloud deployments using AWS CloudFormation templates (JSON and YAML).
Used AWS CodePipeline for continuous delivery to model, visualize, and automate the steps required to release software.
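A minimal boto3 sketch of the CloudWatch CPU-utilization alarm and SNS notification described above. The topic name, instance ID, and the 80% threshold over two 5-minute periods are hypothetical choices for illustration.

import boto3

sns = boto3.client("sns")
cloudwatch = boto3.client("cloudwatch")

# SNS topic that receives the alarm notification (create_topic is idempotent by name).
topic_arn = sns.create_topic(Name="example-ec2-cpu-alerts")["TopicArn"]

cloudwatch.put_metric_alarm(
    AlarmName="example-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # hypothetical instance
    Statistic="Average",
    Period=300,                  # 5-minute evaluation periods
    EvaluationPeriods=2,         # alarm after two consecutive breaches
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[topic_arn],    # notify the SNS topic when the alarm fires
)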
Environment: AWS services such as EC2, S3, RDS, Lambda, IAM, DynamoDB, Glacier, EMR, Amazon Linux AMI, CloudWatch, EBS, ELB, OpsWorks, and CloudFormation templates, along with Docker, Terraform, Jenkins, Git, JIRA, Confluence, PowerShell, Maven, Python, Bash, and Ant.
Java Developer Date: January 2016 to October 2016
Client: AVCO Consulting Inc, Worcester, MA
Responsibilities:
● Involved in technical software and application design and web development using J2EE frameworks.
● Collaborated and consulted with business analysts; developed a web application using Java, Spring MVC, RESTful web services, Apache Tomcat, and Oracle.
● Enhanced web pages using JavaScript, C#, jQuery, and CSS/HTML.
● Translated prototype designs into HTML and CSS elements.
● Supported projects utilizing skills in Java, EJB, Oracle, XML, JSP, and Ajax.
● Used CSS, HTML, jQuery, jQuery UI, and JavaScript to develop rich user interfaces.
● Performed front-end coding using, among other technologies, Magento themes, HTML, JavaScript, jQuery, and XML.
● Implemented Spring transaction management for database transactions.
● Worked on enhancements to the existing application, which followed the MVC paradigm and was implemented with the Spring Framework.
● Worked extensively on J2EE to develop web and distributed applications using JSP, JSF, Servlets, Struts, Hibernate, the Spring Framework, web services, EJB, and JDBC.
● Used Hibernate as an object-relational mapping (ORM) tool to store persistent data and communicate with the Oracle database.
● Designed the web interface using J2EE, XML, RESTful web services, and JDBC.
● Designed and developed user interfaces using JSP, JavaScript, XML, and HTML.
Environment: Java, J2EE (Servlets, JDBC), Spring 3 (Spring AOP, Spring IoC, Spring theme framework, Spring MVC, Spring annotations), JMS, RESTful Web Services, JUnit, Git, HTML, XML, Apache Tomcat, Maven.
EDUCATION DETAILS
Bachelor's in Computer Science
Jawaharlal Nehru Technological University, Kakinada, AP, India
GPA: 3.6

Master's in Computer Science
King Monroe School, New Rochelle, New York
GPA: 3.8