
AWS Engineer

Ann Arbor, MI
February 19, 2020



Professional Summary:

•Over *+ years of IT experience as a DevOps Engineer in application configuration, code compilation, building, automating, managing, and releasing code from one environment to another and deploying it to servers.

•Experience working in the Agile software development methodology, including product backlog refinement, sprint planning, daily scrum meetings, iterative development, product demos, and retrospective meetings.

•Ansible and Chef automation experience, including writing playbooks, cookbooks, and customized recipes for test-driven development environments and Test Kitchen.

•Excellent hands-on experience with configuration management tools such as Chef, Puppet, and Ansible.

•Hands-on experience using Jenkins for continuous integration and delivery (CI/CD) and pushing code to production.

•Implemented highly available, cost-effective, and fault-tolerant systems using multiple EC2 instances, Auto Scaling, launch configurations, Route 53, SNS, and Elastic Load Balancer.

•Created ECS and EMR clusters to deploy applications using new AMIs and configured security groups for each API.

•Designed AWS Cloud Formation templates to create custom sized VPC, subnets to ensure successful deployment of Web applications and database templates.

•Configured roles and groups for users and resources using AWS Identity and Access Management (IAM).

•Provided highly durable and available data using the S3 data store, versioning, and lifecycle policies, and created AMIs of mission-critical production servers for backup.

•Good experience with Amazon RDS: managing instances, creating snapshots, and automating database backups.

•Utilized CloudWatch metrics and logs to monitor resources such as EC2 CPU and memory and EBS volumes, and set alarms for notifications or automated actions.
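
As a minimal sketch of the kind of CloudWatch alarm described above: the function below builds the parameter set for a high-CPU alarm on one EC2 instance. The instance ID, threshold, and SNS topic ARN are hypothetical placeholders, not values from the original projects.

```python
def build_cpu_alarm(instance_id, threshold_pct, sns_topic_arn):
    """Return kwargs suitable for boto3 CloudWatch put_metric_alarm()."""
    return {
        "AlarmName": f"high-cpu-{instance_id}",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 300,                    # evaluate in 5-minute windows
        "EvaluationPeriods": 2,           # require two consecutive breaches
        "Threshold": threshold_pct,
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [sns_topic_arn],  # notify via SNS when the alarm fires
    }

alarm = build_cpu_alarm(
    "i-0123456789abcdef0", 80.0,
    "arn:aws:sns:us-east-1:123456789012:ops-alerts",
)
# In practice: boto3.client("cloudwatch").put_metric_alarm(**alarm)
```

Keeping the parameters in a plain dict makes the alarm definition easy to review and reuse across instances.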

•Scheduled cron jobs for AWS Lambda so applications run automatically, using Python and Groovy scripting.
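
A scheduled Lambda of the kind described above pairs a cron schedule expression (set on an EventBridge/CloudWatch Events rule) with a handler function. The schedule and job name below are hypothetical examples.

```python
# Schedule expression attached to the EventBridge rule that triggers the
# function; cron(0 6 * * ? *) means 06:00 UTC daily.
CRON_EXPRESSION = "cron(0 6 * * ? *)"

def lambda_handler(event, context):
    """Entry point invoked by the scheduled rule."""
    # The rule can pass a constant JSON payload selecting which job to run;
    # "nightly-refresh" is a hypothetical job name.
    job = event.get("job", "nightly-refresh")
    # ... run the application task here (refresh data, rotate logs, etc.)
    return {"status": "ok", "job": job}

result = lambda_handler({"job": "nightly-refresh"}, None)
```

The handler stays free of scheduling logic, so the same code can also be invoked manually for testing.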

•Created Docker containers and deployed them to EC2 instances. Used the artifact repository to download WAR, JAR, and POM files for running the APIs in IntelliJ.

•Monitor logs, create dashboards, and visualize the data in ELK and Splunk for a better understanding and operation of the applications.

•Proficient in designing and handling tables, views, stored procedures, functions, cursors, and triggers using SQL with Oracle and MySQL databases.

•Good experience using version control systems such as SVN and Git.

•Hands-on experience with Atlassian tools such as JIRA; well versed in the Maven build tool.

•Excellent analytical ability, consultative, communication and management skills.

•Self-motivated, easily adaptable to new environments, and able to work independently as well as in a team.


Programming Languages

Core Java, Groovy, Python, Shell scripting and SQL.

Technologies & Service APIs

JDBC, Web services

Configuration Management Tools

Ansible, Puppet, Chef, Terraform

CI/CD Tools

Jenkins, Docker

Databases

Oracle (8i/9i/10g/11g), SQL Server, MySQL.

Application Servers

Apache Tomcat.

Web Services

REST

Operating Systems

Linux/UNIX, Windows

Apache Tools

Maven and ANT.

Development Tools

IntelliJ, TDD.

Cloud

Amazon Web Services (AWS)

Version Control

SVN, GitHub.

Client: Capital One, Rolling Meadows, IL Nov 2018 - Present

Project Name: Customer Modernization Management (CLIP Transformation)

Job Role: DevOps Engineer

Project Description:

Customer Modernization Management focuses on enhancing the organization's CLIP (Customer Line Increase Program) feature. CLIP is classified into Reactive CLIP (rCLIP) and Early CLIP, which are being modernized further to better serve customers. rCLIP applies a line increase to a customer's account within the first 12 months after account opening for Mainstreet accounts, and within 6 months for UpMarket accounts, based on a set of eligibility criteria.


•Responsible for gathering requirements from the product owner and understanding them to meet customer needs.

•Participate in sprint planning and retrospective meetings, following the Agile methodology.

•Consume REST web services for the credit line increase program to send and retrieve data from the user interface.

•Manage iterative code development using Git for source version control.

•Use the company's in-built workflow components, configured in JSON, to input various data and apply business logic and data transformations when making API calls.

•Use the company's in-built Quantum Framework to ingest and process batch and streaming data.

•Use Lambda functions in Python to invoke scripts for data transformations on large data sets in EMR clusters.

•Update S3 bucket policies so that only authorized users can access the IAM role and cross-region access rule permissions; implement new bucket policies in production and validate the results using CloudWatch Logs.
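
A minimal sketch of the bucket-policy idea above: deny all S3 actions on the bucket unless the caller is acting through the authorized IAM role. The bucket name, account ID, and role name are hypothetical placeholders.

```python
import json

def bucket_policy(bucket, role_arn):
    """Build a deny-by-default bucket policy restricted to one IAM role."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyAllExceptAuthorizedRole",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            # Both the bucket itself and the objects inside it.
            "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
            # Deny applies to any principal NOT matching the authorized role.
            "Condition": {"StringNotLike": {"aws:PrincipalArn": role_arn}},
        }],
    }

policy = bucket_policy("clip-data-prod",
                       "arn:aws:iam::123456789012:role/clip-app-role")
policy_json = json.dumps(policy)
# Applied with: s3.put_bucket_policy(Bucket="clip-data-prod", Policy=policy_json)
```

An explicit Deny with a `StringNotLike` condition is a common pattern because Deny overrides any other Allow granted elsewhere.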

•Create and automate ECS clusters with a new base image, destroying the existing machines (rehydration process) and using the latest AMI for security purposes.

•Use latency-based and weighted records in Amazon Route 53 to route traffic to multiple Amazon EC2 instances in a region.
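
Weighted routing of the kind described above is expressed as a Route 53 change batch: each record set gets a `SetIdentifier` and a relative `Weight`. The domain, IPs, and weights below are hypothetical.

```python
def weighted_change_batch(name, targets):
    """Build a Route 53 ChangeBatch; targets is a list of
    (set_identifier, ip_address, weight) tuples."""
    return {
        "Comment": "weighted routing across EC2 instances",
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": name,
                "Type": "A",
                "SetIdentifier": set_id,   # distinguishes records with the same name
                "Weight": weight,          # relative share of traffic
                "TTL": 60,
                "ResourceRecords": [{"Value": ip}],
            },
        } for set_id, ip, weight in targets],
    }

batch = weighted_change_batch("api.example.com.", [
    ("instance-a", "203.0.113.10", 70),   # ~70% of queries
    ("instance-b", "203.0.113.11", 30),   # ~30% of queries
])
# Applied with: route53.change_resource_record_sets(HostedZoneId=..., ChangeBatch=batch)
```

Weights are relative, not percentages; using values that sum to 100 just makes the split easy to read.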

•Create and write configuration files to build AWS infrastructure and on-board applications to the pipeline.

•Use Postman to test applications via their endpoint URLs and to migrate SSL certificates from one version to another.

•Create master and read-replica RDS databases in the AWS East and West regions; automate database tasks using Lambda, Bogie, and Python scripting.

•Perform security scans and fix code issues using Eratocode.

•Conduct user acceptance testing; code is then deployed to Quality Assurance and on to production.

•Install and configure the Chef server, workstation, client servers, and nodes; write recipes and cookbooks in Chef for automation.

•Design, install, and implement the Ansible configuration management system, writing Ansible playbooks for deploying applications.

•Provision highly available EC2 instances using Terraform and CloudFormation, and write new plugins to support new functionality in Terraform.

•Set up continuous integration with Jenkins (CI/CD) and build the pipeline in Groovy to ensure it releases a JAR version to the JFrog artifact repository.

•Perform TREx activities, in which applications are failed over to the AWS West region, to discover issues with an application's disaster recovery configuration before it is needed in an actual disaster.

•Migrate data-file checksums from SHA-1 to SHA-256 using a Java application to validate the data migration.
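
The checksum migration above was done in a Java application; this is a minimal Python sketch of the same idea: compute both digests for a data file so the new SHA-256 value can be recorded and compared against the legacy SHA-1 during migration.

```python
import hashlib

def file_checksums(data: bytes):
    """Return the legacy SHA-1 and replacement SHA-256 hex digests."""
    return {
        "sha1": hashlib.sha1(data).hexdigest(),      # legacy checksum
        "sha256": hashlib.sha256(data).hexdigest(),  # replacement checksum
    }

sums = file_checksums(b"example record\n")
# During migration, both digests are stored so every file can be verified
# against its old checksum before the SHA-256 value becomes authoritative.
```

For large files the same logic would feed the hashers in chunks (`h.update(block)`) rather than reading the whole file into memory.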

•Developed test plans, test cases and executed test cases associated with the software being developed.

•Use the IntelliJ IDE to develop code; use the ELK stack to search and stash data, create indexes, and build Kibana dashboards for tracking logs.

•Participate in daily stand-ups to track project progress using JIRA and report any issues.

Environment: Core Java, RDS, AWS Lambda, ELK, S3 bucket, Ansible, Chef, EMR, Groovy, Shell, Python, Microservices, Rest, Postgres, Postman, SNS, Route53, Kibana, Terraform, AMI, AWS, Jenkins, Docker, Jira, Kanban, Maven, Apache Tomcat server and Git.

Client: Dignity Health, Phoenix, AZ Mar 2017 – Nov 2018

Project Name: Health Care Systems

Job Role: DevOps Engineer

Project Description:

A health care system committed to patient care, research, and service to the community, locally and globally. The main aim of the project was to develop an open-source patient index system that allows health care organizations to exchange data and uniquely identifies which patient the data belongs to, despite variations in the attributes that describe the patient.


•Gathered requirements and worked with business analysts to resolve design issues.

•Attended daily scrum meetings, sprint planning meetings, and backlog refinements.

•Configured load balancers and VPCs with private subnets and performed troubleshooting for connectivity issues.

•Extensively used Auto Scaling groups and launch configuration templates for launching Amazon EC2 instances while deploying Microservices.

•Integrated Amazon CloudWatch Logs with EC2 instances and ECS clusters to monitor log files, store them, and track metrics.

•Created Docker containers to deploy and run various web applications; was also part of the team analyzing microservices management using Docker.

•Managed AWS cost cutting by writing an Ansible playbook to auto start/stop AWS resources at set times of day, triggered from the Jenkins (CI/CD) pipeline.
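
The cost-cutting job above was an Ansible playbook; this Python sketch shows the equivalent selection logic: once the configured shutdown hour is reached, pick the instances tagged for auto-stop. The tag name, hour, and instance IDs are hypothetical.

```python
STOP_HOUR = 19  # 7 pm local time, when the Jenkins-triggered job runs

def instances_to_stop(instances, current_hour):
    """Return IDs of instances opted in to auto-stop once STOP_HOUR passes.

    instances: list of dicts with 'id' and 'tags' keys.
    """
    if current_hour < STOP_HOUR:
        return []  # too early in the day; stop nothing
    return [i["id"] for i in instances
            if i["tags"].get("auto-stop") == "true"]

fleet = [
    {"id": "i-aaa", "tags": {"auto-stop": "true"}},   # dev box, opted in
    {"id": "i-bbb", "tags": {"auto-stop": "false"}},  # opted out
]
to_stop = instances_to_stop(fleet, 20)
# The resulting IDs would be passed to ec2.stop_instances(InstanceIds=to_stop)
```

Driving the selection off a tag keeps the opt-in/opt-out decision with each instance's owner rather than in the automation code.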

•Worked on enabling API Gateway, using both AWS CloudFormation templates and the API Gateway extensions to Swagger, to handle all API calls and provide features such as processing, monitoring, authorization, access control, and traffic management for various AWS services (Lambda functions).

•Automated the development process for deploying applications and containers and for provisioning public cloud environments with Terraform, Ansible, and Docker.

•Developed custom scripts in Shell and Groovy to automate jobs.

•Built automation and pipelines using configured Jenkins (CI/CD) jobs.

•Developed pom.xml and .yml files for Maven build scripts and automated batch jobs.

•Create native SQL queries to persist data to database.

•Used SonarQube to identify and fix Groovy code bugs and vulnerabilities in workflows.

•Used Splunk to capture, index, and correlate real-time data in dashboards and to monitor logs for application management.

•Set up the Ansible infrastructure to automate applications, and uploaded the playbooks and roles for deployment.

•Deploy builds in various environments like Development, QA and Production.

•Involved in all projects moving to production and worked on data center exits.

•Created change orders for pushing code changes to production and obtained approvals during enterprise meetings.

•Stored data in S3 buckets, restricting bucket policies and permissions to authorized people.

•Used the Policy Generator in IAM to create custom policies granting privileges to users and groups for AWS services, and enabled cross-account access by giving users in the Dev account permission to assume a role in the Prod account.
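
Cross-account access as described above rests on two policy documents: a trust policy on the Prod-account role saying who may assume it, and a permission policy on the Dev users allowing the `sts:AssumeRole` call. The account IDs and role name below are hypothetical placeholders.

```python
DEV_ACCOUNT = "111111111111"
PROD_ROLE = "arn:aws:iam::222222222222:role/prod-operations"

# Trust policy attached to the Prod role: principals from the Dev account
# are allowed to assume it.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{DEV_ACCOUNT}:root"},
        "Action": "sts:AssumeRole",
    }],
}

# Permission policy attached to Dev users/groups: they may switch into
# (assume) the Prod role, and nothing else.
assume_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "sts:AssumeRole",
        "Resource": PROD_ROLE,
    }],
}
```

Both halves are required: the trust policy alone grants nothing to a Dev user until that user also has `sts:AssumeRole` permission on the role's ARN.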

•Experience with ELB, routing traffic to specific data center regions using Route 53 and updating certificates to provide secure authentication for applications.

•Configured Maven, resolved lifecycle dependencies, and generated artifacts for deployment.

•Created feature branches, checked out and merged branches in the Git repository, and used Git Bash commands from the command line.

•Used JIRA tool to keep track of issues and update the status of the same accordingly.

Environment: Agile, Groovy, Shell scripting, Python, ELK, AWS, Webservices, EC2, ECS, SNS, AWS Lambda, Terraform, Ansible, Docker, Jenkins, UNIX, SQL, Git, Apache Tomcat, JSON, Splunk, Eclipse, Maven and Junit.

Company: CenturyLink, Monroe, LA Jan 2016 – Mar 2017

Project Name: Mobility Application Experience

Job Role: DevOps Engineer

Project Description:

MAX is a web app that simplifies technicians' work. They can find their assigned jobs on a daily basis and work on them. The jobs relate to network issues such as service outages and new network connections. Features like status updates, a pole inspection form, and time sheets were included in the app, making technicians' jobs easier. In addition to the web app, we developed a mobile app using Android Studio and implemented all of the same features, making it much easier for technicians to access their jobs and update status upstream.


•Involved in an Agile/Scrum environment: requirement analysis, development, system and integration testing, and the detailed design document for the project.

•Participate in sprint planning meetings and requirement gathering phases of SDLC.

•Developed Chef cookbooks to manage system configuration for Tomcat, MySQL, and Windows applications, and versioned them in Git repositories and on the Chef server.

•Work closely with software developers and DevOps to debug software and system problems.

•Integrated Terraform with Jenkins (CI/CD) to achieve Blue Green Deployments by changing the Route 53 configuration or ELB configuration.

•Secured the Terraform state file remotely in S3 buckets with encryption and versioning; configured Vault to store sensitive data.

•Defined AWS infrastructure as code by making use of various Terraform AWS modules to create VPC, subnets, EC2 instances, and RDS.

•Created AWS S3 buckets, performed folder management in each bucket, and managed CloudTrail logs and the objects within each bucket.

•Wrote Ansible code for automating secure VPC creation as well as the deployment process of standing up secure Jenkins (CI/CD) Server and ELK stack.

•Created scripts in Python that integrated with the Amazon API to control instance operations.

•Set up the Ansible control machine (RHEL 7) and configured the remote host inventories via SSH.

•Created Docker containers leveraging existing Linux containers and AMIs, in addition to creating Docker containers from scratch.

•Maintain and coordinated environment configuration, controls, code integrity, and code conflict resolution.

•Wrote parent POM files to establish the integration of code quality tools.

•Wrote Chef cookbook recipes to automate the installation of middleware infrastructure such as Apache Tomcat and the JDK, and configuration tasks for new environments.

•Analyze and resolve conflicts related to merging of source code for GIT.

•Prepared documentation with all necessary steps and configurations to be covered before code is moved to production.

•Develop test plans and execute test cases according to the acceptance criteria.

•Configure and deployed applications on Apache Tomcat Server 8 and tracked defects using Jira.

•Wrote complex SQL queries using joins to retrieve data and developed logic for calling stored procedures.

Environment: Agile, AWS Cloud, Chef, Ansible, Terraform, JSON, VPC, Subnet, EC2, ECS, UNIX, SNS, Jenkins, Groovy, Shell scripting, Apache Tomcat Server, Junit, RDS, Docker, SQL, ELK, Git and Jira.

Client: Provogue, Mumbai, India March 2014 – Dec 2014

Project Name: Custom Express Project

Job Role: DevOps Engineer

Project Description:

The Custom Express project is a web application used by business stakeholders to update inventory and maintain catalog products. Newly added items or out-of-stock inventory will be updated or created in the database through this application. It manages the admin portal and access permissions, creates new events with start and end dates, and generates reports of stock availability at different store locations.


Implemented and maintained branching and build/release strategies using Subversion and Git.

Created environments in the AWS cloud using the infrastructure provisioning tool Terraform.

Created Jenkins (CI/CD) jobs to build ECS clusters and run the application on them.

Used the Puppet server for configuration management of hosted instances within AWS.

Participated in configuring and monitoring distributed, multi-server environments using Puppet.

Involved in periodic archiving and storage of the source code for disaster recovery.

Worked closely with developers to pinpoint and provide early warnings of common build failures.

Deployed the Puppet dashboard for configuration management of the existing infrastructure.

Used Maven as the build tool on Java projects to produce build artifacts from the source code.

Automated the build and release management process, including monitoring changes between releases.

Executed and maintained tasks including creating users, groups, reports, and queries.

Created Docker images using Dockerfiles to support containerization of web applications.

Document project's software release management procedures with input decisions.

Developed, maintained, and distributed release notes for each scheduled release.

Provided periodic feedback on status and scheduling issues to management.

Used the Jenkins (CI/CD) AWS CodeDeploy plugin to deploy to the AWS cloud.

Automated cloud infrastructure using AWS CloudFormation templates.

Created views and appropriate meta-data, performed merges, and executed builds on a pool of dedicated build machines.

Environment: Subversion, Git, Jenkins, AWS, Maven, Containerization, Puppet, Jira, Linux, XML, Windows Server, WebLogic, MySQL, Shell scripts.

Client: Remedy Hospital, Hyderabad, India May 2013 – Feb 2014

Project Name: Insurance Member Benefits

Role: DevOps Engineer

Project Description:

The project was a web-based application developed to streamline office workflow processes for member benefits in the claims management cycle on the Member Portal. Features included eligibility, billing address verification with address history, and insurance eligibility verification.


Created and maintained users, user profiles, security, rights, disk space, and process monitoring.

Configured user accounts for continuous integration with Jenkins (CI/CD).

Maintained user accounts (IAM) and the RDS, Route 53, Lambda, VPC, RDB, DynamoDB, and SNS services in the AWS cloud.

Deployed applications using the Jenkins (CI/CD) server; troubleshot build and release job failures and worked with developers on resolution.

Managed and documented all post-deployment issues using the Post Deployment Issues Log.

Configured the Puppet master and agents; wrote Puppet manifests for deploying, configuring, and monitoring agents/nodes.

Designed and deployed AWS solutions using EC2, S3, EBS, Elastic Load Balancer (ELB), and Auto Scaling groups.

Created alarms and notifications for EC2 instances using CloudWatch.

Followed up with users on pending tickets, updating pending CRQs and incident tickets on a daily basis.

Edited and modified pom.xml files for a few applications that used Maven.

Wrote shell scripts to automate deployments and automated processes with custom-built shell scripts.

Installed and administered the Git source control tool, ensured the reliability of the application, and designed the branching strategies for Git.

Environment: AWS cloud, Jenkins, Shell Scripting, Docker, Apache Tomcat, Puppet, GIT, MAVEN.
