
DevOps Engineer

Aldie, VA
January 15, 2020



Professional Summary:

More than * years of IT experience as a DevOps Engineer/AWS Cloud Engineer and as a PL/SQL Developer, including designing build processes, software product development, build and deployment automation, release management, packaging quality control, and source code repository & environment management.

Experience working on DevOps/Agile operations processes and tools (code review, unit test automation, build & release automation, service, incident and change management).

Exposed to all aspects of the Software Development Life Cycle (SDLC), including analysis, planning, development, testing, implementation and post-production analysis of projects, using Agile Scrum, Waterfall and other project management methods.

Good experience in automating the release process of applications using tools like Jenkins and Maven.

Worked extensively on the AWS Cloud platform and its features, including EC2, S3, VPC, EBS, DynamoDB (DDB), AWS Lambda, Elastic Beanstalk, AMI, SNS, RDS, CloudWatch, Route 53, SQS, CloudFront (CDN), Auto Scaling, Security Groups, and CloudFormation.

Experience in creating custom bucket policies to grant bucket access only to the required users and to those accounts which have cross-bucket access.

Hands-on experience in managing artifacts using Nexus and JFrog repositories, and virtual machines using virtual/cloud environment technologies (VMware, CloudStack, AWS & RDS).

Experienced as a Build & Release Engineer in release automation of enterprise applications to achieve continuous integration/deployment (CI/CD) on container infrastructure (Docker).

Experience in building enterprise-level CI/CD pipelines to automate the deployment and testing of code in the AWS East and West regions.

Experience in creating GitHub webhooks to trigger the CI process whenever a Pull Request or a Git push to a repository is performed.

Experience in setting up code repositories using Jenkins, Subversion (SVN), Git, GitHub, Bitbucket, Eclipse and Perl on different Windows/Unix environments, including RedHat (RHEL). Managed Jenkins plugins for scheduled automatic checkout of code.

Deployed Java applications/Web-services using CI/CD tools like Jenkins and Chef in standalone and clustered environments.

Good knowledge of container systems like Docker and container orchestration such as EC2 Container Service (ECS) and Kubernetes; worked with Terraform.

Experience in modifying IAM roles to add trust relationships for cross-bucket functionality to read from and post data into other buckets.
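
As an illustration of the kind of trust policy such a role modification involves (role name, account IDs and ARNs below are placeholders, not the actual project's values), a minimal sketch:

```python
import json

def cross_account_trust_policy(trusted_account_ids):
    """Build an IAM trust policy document allowing principals in other
    accounts to assume this role. Account IDs here are placeholders."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {
                "AWS": [f"arn:aws:iam::{acct}:root" for acct in trusted_account_ids]
            },
            "Action": "sts:AssumeRole",
        }],
    }

policy = cross_account_trust_policy(["111111111111", "222222222222"])
print(json.dumps(policy, indent=2))
# The resulting document would then be applied with, e.g.:
#   aws iam update-assume-role-policy --role-name CrossBucketRole \
#       --policy-document file://trust.json
```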

Experience in building complete infrastructure in both the EAST and WEST cloud regions to run batch processes using AWS EMR and Spark, building an analytics engine for executing workflows.

Proficient in creating scripts using Bash, Perl and PowerShell to support infrastructure as code and continuous deployment, and Python scripts integrated with the Amazon API to control instance operations.

Extensively worked on building the AWS stack (Lambda functions, SNS topics, CloudWatch Events) using CloudFormation templates and deployed the stacks in both the EAST and WEST regions using Groovy scripts.

Experience in migrating EC2 instances to reduce costs as well as the computational time of applications.

Expertise in using build tools like Maven and Ant to build deployable artifacts such as WAR, EAR & JAR files from source code.

Experience in and demonstrated understanding of source control management concepts such as branching, merging, labeling and integration.

Good knowledge of managing and integrating code quality tools like SonarQube, including managing Sonar rules and quality gates.

Experience in building and deploying java applications and troubleshooting the build and deploy failures.

Experience in performing AMI Rehydration through Jenkins which automatically updates the existing stacks created in AWS.

Experience in installing, maintaining and troubleshooting application servers such as WebSphere, WebLogic, JBoss and Apache Tomcat, along with Nagios and security software, on Linux/Unix.

Hands-on experience with SQL & PL/SQL, MySQL and PostgreSQL. Performed periodic backups. Deployed and configured database servers on cloud platforms like AWS and Azure to handle big data applications.

Hands-on experience with monitoring tools like Splunk and scheduling tools like cron.

Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.

Effectively made use of Table Functions, Indexes, Table Partitioning, Collections, Analytical functions, Materialized Views, Query Re-Write and Transportable table spaces.

Strong knowledge of data warehouse concepts and ETL.

Educational Qualification

Bachelor’s in Computer Science and Engineering, JNTU, Hyderabad, India in 2007.

Technical Skills

Operating Systems

Ubuntu, Red Hat, CentOS, Windows Server


Scripting Languages

Python, Shell Scripting, SQL, Java, Groovy, HTML

Cloud Computing

SOAP, RESTful, Amazon Web Services (AWS): EC2, VPC, S3, Route 53, CloudWatch, IAM, SNS, RDS, CloudFront, EMR, EC2 CLI, Python boto3 module

Configuration Management

Chef, Puppet, Ansible, ANT, MAVEN

Continuous Integration/Delivery

Jenkins, TeamCity, Bamboo

Version Control Tools

GIT, SVN, Bitbucket

Monitoring Tools

Splunk, CloudWatch


Databases

Oracle 7.x/8i/9i/10g/11g (SQL, PL/SQL, Stored Procedures, Triggers), MS SQL Server 2008

Ticketing/Bug Tracking

Jira, Remedy, ServiceNow


Virtualization Tools

VMware, vCAC, vCenter

Container Technologies

Vagrant, Docker

CI Tools

Jenkins/Hudson, Bamboo, uDeploy, XL Deploy


Methodologies

Agile, Waterfall

Web Service Tools

JBoss, Apache Tomcat 5.x, IntelliJ IDEA, Oracle WebLogic 10.x/11.x, IBM WebSphere 5.x,6.x,7.0, IIS Server

Professional Experience:

Client: Bank of America, Richmond VA

Role: AWS & DevOps Engineer July 2018 – Present


Developed CI/CD pipelines using Jenkins 2, GitHub, JFrog Artifactory, Maven and AWS.

Migrated all the existing Jenkins 1 jobs to Jenkins 2 by writing the Groovy scripts.

Developed and deployed applications on the cloud using EC2, S3, RDS, EBS, Elastic Load Balancer and Auto Scaling groups; optimized volumes and EC2 instances.

Developed complete infrastructure in both the EAST and WEST cloud regions to run batch processes using AWS EMR and Spark, building an analytics engine for executing workflows.

Extensively worked on building the AWS stack (Lambda functions, SNS topics, CloudWatch Events) using CloudFormation templates and deployed the stacks in both the EAST and WEST regions using Groovy scripts.
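
To illustrate the shape of such a stack (a minimal sketch only; resource names and the schedule are placeholders, and a real template would also include the Lambda function and its role), a CloudFormation template can be assembled as a Python dict and emitted as JSON:

```python
import json

def build_stack_template(schedule="rate(1 hour)"):
    """Minimal CloudFormation template with an SNS topic and a scheduled
    CloudWatch Events rule targeting it. Names are illustrative only."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "WorkflowTopic": {
                "Type": "AWS::SNS::Topic",
                "Properties": {"TopicName": "workflow-notifications"},
            },
            "WorkflowSchedule": {
                "Type": "AWS::Events::Rule",
                "Properties": {
                    "ScheduleExpression": schedule,
                    "State": "ENABLED",
                    "Targets": [{
                        # Ref on an SNS topic resolves to its ARN
                        "Arn": {"Ref": "WorkflowTopic"},
                        "Id": "WorkflowTopicTarget",
                    }],
                },
            },
        },
    }

print(json.dumps(build_stack_template(), indent=2))
```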

Developed CI pipelines that are triggered when a PR is raised, to validate all the workflows modified per business needs.

Developed CD pipelines that are triggered when the PR is merged, to deploy the code to Artifactory as well as to the S3 buckets in the WEST and EAST regions.

Created Jenkins CI/CD pipelines for continuous build & deployment and integrated JUnit and SonarQube plugins in Jenkins for automated testing and code quality checks.

Wrote Groovy scripts to create, update and delete the AWS stacks in both the EAST and WEST regions, as well as to create S3 buckets.

Created GitHub-webhooks to trigger the CI process whenever a Pull Request or a Git Push to a repository is performed by the developers.
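
On the receiving side of this flow, the CI endpoint typically validates each delivery against GitHub's documented `X-Hub-Signature-256` header before triggering a build. A minimal sketch (the secret and payload below are made up for illustration):

```python
import hashlib
import hmac

def verify_github_signature(secret: bytes, payload: bytes,
                            signature_header: str) -> bool:
    """Check the X-Hub-Signature-256 header GitHub sends with each webhook
    delivery, so the CI endpoint only reacts to genuine pushes/PRs."""
    expected = "sha256=" + hmac.new(secret, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids timing side channels on the comparison
    return hmac.compare_digest(expected, signature_header)

secret = b"webhook-secret"          # placeholder shared secret
payload = b'{"action": "opened"}'   # placeholder delivery body
sig = "sha256=" + hmac.new(secret, payload, hashlib.sha256).hexdigest()
print(verify_github_signature(secret, payload, sig))
```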

Created a Docker image with Spark & Hadoop to run the workflows as a part of testing and validating in CI Process.

Created SNS notifications and assigned ARN to S3 for object create notifications.

Automated provisioning and repetitive tasks using Terraform, Docker containers and service orchestration.

Modified the IAM roles and added the trust relationships for cross bucket functionality to read and post data into other Buckets.

Created dual AWS Stacks (Lambda Functions, SNS Topics, CloudWatch Events) to test all programs in prod Environment.

Created and deployed a Cluster Bot Lambda (written in Python) that monitors EMR cluster health and sends a status message to Slack, deployed via a CloudFormation template.
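
The formatting half of such a bot can be sketched as follows (cluster names are invented; in the real Lambda the `clusters` list would come from boto3's `emr.list_clusters()` response, whose field shape is mirrored here, and the payload would be POSTed to a Slack incoming-webhook URL):

```python
import json

def build_slack_message(clusters):
    """Format EMR cluster health into a Slack webhook payload.
    Each entry mirrors the shape of emr.list_clusters() items."""
    lines = [f"{c['Name']}: {c['Status']['State']}" for c in clusters]
    return {"text": "EMR cluster status:\n" + "\n".join(lines)}

sample = [
    {"Name": "analytics-east", "Status": {"State": "RUNNING"}},
    {"Name": "analytics-west", "Status": {"State": "TERMINATED_WITH_ERRORS"}},
]
print(json.dumps(build_slack_message(sample), indent=2))
# Inside the Lambda handler this dict would be sent with urllib.request
# to the Slack webhook URL held in an environment variable.
```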

Involved in development of test environment on Docker containers and configuring the Docker containers using Kubernetes.

Created custom bucket policies to grant bucket access only to the required users and to those accounts which have cross-bucket access.
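
A sketch of such a policy, restricting reads to named accounts (bucket name and account IDs are placeholders, not the project's actual values):

```python
import json

def cross_account_bucket_policy(bucket, reader_account_ids):
    """S3 bucket policy granting read-only access to specific accounts."""
    principals = [f"arn:aws:iam::{a}:root" for a in reader_account_ids]
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "CrossAccountRead",
            "Effect": "Allow",
            "Principal": {"AWS": principals},
            "Action": ["s3:GetObject", "s3:ListBucket"],
            # ListBucket applies to the bucket ARN, GetObject to the objects
            "Resource": [f"arn:aws:s3:::{bucket}",
                         f"arn:aws:s3:::{bucket}/*"],
        }],
    }

policy = cross_account_bucket_policy("workflow-output", ["111111111111"])
print(json.dumps(policy, indent=2))
# Applied with: aws s3api put-bucket-policy --bucket workflow-output \
#               --policy file://policy.json
```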

Created bucket events using SNS topic ARNs, which in turn trigger the Lambdas when a file is dropped in a particular location of the bucket.
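
The notification configuration behind a "file drop" trigger looks roughly like this (topic ARN and prefix are illustrative):

```python
import json

def s3_sns_notification(topic_arn, prefix):
    """Bucket notification config that publishes to an SNS topic whenever
    an object is created under `prefix`."""
    return {
        "TopicConfigurations": [{
            "TopicArn": topic_arn,
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"FilterRules": [
                {"Name": "prefix", "Value": prefix},
            ]}},
        }]
    }

config = s3_sns_notification(
    "arn:aws:sns:us-east-1:111111111111:workflow-events", "incoming/")
print(json.dumps(config, indent=2))
# This dict is what boto3's
# s3.put_bucket_notification_configuration(Bucket=...,
#     NotificationConfiguration=config) expects.
```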

Added a functionality in the Jenkins file to perform the AMI Rehydration by automatically updating the existing Stacks created in AWS.

Created and Deployed a stats Lambda (by setting S3 Triggers) which notifies the team about the program stats by reading the workflow outputs that are dropped in the S3 Buckets.

Migrated the EC2 Instances from m4 to m5 to reduce the costs and migrated to Spark 2.4.0 to reduce the computational time of the workflows.

Environment: Jenkins, Maven, GitHub, Spark, AWS (S3, EC2, SNS, EMR, CloudWatch Events, Lambda, SQS, IAM, RDS, CloudFormation templates, CloudWatch, VPC, AMI), JFrog, Groovy, Python, Node.js, Hadoop, Ganglia.

Client: Herbalife, Winston-Salem, NC

Role: AWS & DevOps Engineer Aug 2016- Jun 2018


Managed Linux and Windows virtual servers on AWS EC2 using Open-Source Ansible Server.

Used Ansible for server configuration and Git for source control.

Responsible for specialization areas related to Ansible for Cloud Automation in DevOps Platform.

Collaborated with Development and Support teams to setup a CD (Continuous Delivery) environment with the use of Docker, continuous build and delivery tools.

Developed automation framework for Application Deployments to the cloud environments.

Designed & deployed AWS solutions with EC2, S3, RDS, EBS, Elastic Load Balancer, Auto Scaling & OpsWorks; deployed cloud stacks using AWS OpsWorks.

Optimized volumes and EC2 instances and created multi AZ VPC (Virtual Private Cloud) instances.

Configured and networked Virtual Private Clouds (VPCs). Wrote CloudFormation templates and deployed AWS resources for dev, test, staging and production.

Configured S3 to host static web content, Elastic Load Balancers with EC2 Auto scaling group.

Created playbooks for OpenStack deployments and bug fixes with Ansible. Configured and monitored distributed and multi-platform servers using Ansible. Wrote playbooks for configuring app servers and applying security patches.

User admin setup, account maintenance, and system performance monitoring using Nagios & Tivoli.

Worked on Multiple AWS instances, set the security groups, Elastic Load Balancer and AMIs, auto scaling to design cost effective, fault tolerant and highly available systems.

Implemented Ansible playbooks for deployment of builds on internal data center servers.

Installed, administered and maintained several instances of Jenkins. Supported various development teams, running multiple builds a day with downstream jobs to perform the deployments onto SIT & QA environments.

Installed and configured Sonatype Nexus, Used Nexus as artifact repository to promote builds through different environments.

Worked on building Jenkins pipeline for automating build, deploy, test and release of applications while using Nexus repository for versioning.

Worked on Ansible playbooks to deploy new software and plugins as well as manage deployments to the production Jenkins server. Integrated Build Process through Jenkins to various SCM tools.

Developed custom scripts to monitor repositories and server storage.

Built and deployed J2EE applications on JBoss using Python scripts.

Used version control tools like SVN and Git to manage source code and integrate it with automation tooling.

Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of OpenStack nodes, and tested playbooks on AWS instances using Python.

Automated the release pipeline to achieve zero touch deployments using Jenkins, SVN, and Nexus.

Extensively worked on system performance monitoring (tuning as necessary) using Nagios.

Worked on Administration, maintenance and support of Red Hat Enterprise Linux (RHEL) servers.

Continuously updating documentation for internal knowledge base of support team and IT team.

Provided 24x7 production support.

Created analytical metrics reports and dashboards for release services based on JIRA tickets to achieve continuous delivery, and built clouds with AWS.

Worked with the DB team and third-party services team to rectify issues related to service calls and database collection.

Environment: Red Hat, Ansible, AWS, OpenStack (Icehouse/Havana), Jenkins, Maven, Ant, GIT, Docker, Apache, Nagios, MySQL, Python, JBoss, Nexus.

Client: Siemens, Germany

Role: Junior DevOps Engineer Apr 2015- Jul 2016


Involved in designing and deploying a multitude of applications utilizing almost the entire AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, IAM), focusing on high availability, fault tolerance and auto-scaling, in AWS CloudFormation.

Migrated and automated cloud deployments using APIs, Chef and AWS CloudFormation templates.

Defined a migration strategy for an application by understanding the application architecture in working with the development team.

Implemented AWS IAM for managing the credentials of applications that run on EC2 instances.

Worked on CloudFormation to create CloudWatch metric filters and alarms for monitoring and notifying on the occurrence of CloudTrail events.

Leveraged AWS cloud services such as EC2, auto-scaling and VPC to build secure, highly scalable and flexible systems that handled expected and unexpected load bursts.

Analyzed application portfolios, identifying dependencies & common infrastructure platform components, and assessing migration feasibility.

Created Elastic Load Balancer to distribute incoming application traffic across Amazon EC2 instances.

Involved in network administration in a large datacenter environment: DNS/DHCP, load balancing (F5 Networks, AWS ELB), firewalls (Cisco Systems, Juniper Networks), IDS/IPS and IPsec VPN.

Responsible for design and maintenance of the Subversion/GIT Repositories, views, and the access control strategies.

Designed and implemented Subversion and GIT metadata including elements, labels, attributes, triggers and hyperlinks.

Created and maintained continuous build and continuous integration environments in SCRUM and agile projects.

Automated Linux production server’s setup using Puppet scripts. Used these scripts to replicate production build environments on a local dev boxes using Vagrant and VirtualBox.

Documented releases, builds and source control processes and plans. Wrote Maven and Ant builds for application-layer modules.

Created Bash shell scripts to automate system maintenance, and scheduled cron jobs for job automation.

Used JIRA for issue tracking, code integration, planning and collaboration; used dev tools to host, review, test and deploy the team's Git and Mercurial code.

Followed Spring MVC Framework for the development of the project and used Spring Security to provide authentication, authorization, and access-control features for this application.

Used MAVEN build tool for compiling and packaging the application and compiled the project assemblies using MAVEN and deployed it with ANT script.

Utilized the AWS CLI to automate backups of ephemeral data stores to S3 buckets and EBS, and to create nightly AMIs of mission-critical production servers as backups.

Automated application deployment in the cloud using Docker with the Elastic Container Service (ECS) scheduler.

Environment: Red Hat Linux, AWS, Ruby, Python, Sun Solaris 9/10, Git, Hudson/Jenkins, Maven, Groovy, Solaris Volume Manager, WebLogic 10g/11g, Oracle 11g, RESTful, Ant, Agile, SVN, Docker, Chef, Nexus, Sonar, Checkstyle, Bitbucket, Perl, Bash

Client: Deutsche Bank, Germany

Role: PL/SQL Developer Feb 2013- Mar 2015


Coordinated with the front-end design team to provide them with the necessary stored procedures and packages and the necessary insight into the data.

Worked on SQL*Loader to load data from flat files obtained from various facilities every day.

Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.

Wrote Unix shell scripts to process files on a daily basis: renaming files, extracting dates from file names, unzipping files and removing junk characters before loading them into the base tables.
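
The original work was done in Unix shell; the two recurring steps (pulling the business date out of a file name, scrubbing junk characters) can be sketched in Python for illustration. The date format and sample file names are assumptions, not the actual feed's conventions:

```python
import re

DATE_RE = re.compile(r"(\d{4}-\d{2}-\d{2})")

def extract_file_date(filename):
    """Pull a YYYY-MM-DD business date out of an incoming file name
    (the date format is an assumption about the feed)."""
    m = DATE_RE.search(filename)
    return m.group(1) if m else None

def clean_line(line):
    """Strip non-printable 'junk' characters and surrounding whitespace
    before the record is loaded into a base table."""
    return re.sub(r"[^\x20-\x7e]", "", line).strip()

print(extract_file_date("facility_feed_2013-05-01.txt"))
print(clean_line("ACME\x00\x07 Corp|100\n"))
```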

Involved in the continuous enhancements and fixing of production problems.

Generated server-side PL/SQL scripts for data manipulation and validation and materialized views for remote instances.

Developed advanced PL/SQL packages, procedures, triggers, functions, indexes and collections to implement business logic using SQL Navigator.

Developed PL/SQL triggers and master tables for automatic creation of primary keys.

Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.

Created scripts to create new tables, views, queries for new enhancement in the application using TOAD.

Created indexes on the tables for faster retrieval of the data to enhance database performance.

Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.

Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE.

Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.

Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.

Created PL/SQL scripts to extract the data from the operational database into simple flat text files using UTL_FILE package.

Created database objects like tables, views, materialized views, procedures and packages using Oracle tools like TOAD, PL/SQL Developer and SQL*Plus.

Partitioned the fact tables and materialized views to enhance the performance.

Extensively used bulk collection in PL/SQL objects to improve performance.

Created records, tables, collections (nested tables and arrays) for improving Query performance by reducing context switching.

Used Pragma Autonomous Transaction to avoid mutating problem in database trigger.

Extensively used the advanced features of PL/SQL like Records, Tables, Object types and Dynamic SQL.

Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application.

Environment: Oracle 11g, SQL*Plus, TOAD, SQL*Loader, SQL Developer, Shell Scripts, UNIX, Windows XP

Client: T-Mobile, Germany

Role: PL/SQL Developer Oct 2011- Jan 2013


Built complex queries using SQL and wrote stored procedures using PL/SQL for various APIs like Java and .NET, and databases like Oracle and Access.

Developed and modified several Forms for various modules. Also responsible for following up on bugs reported by various users and suggesting possible patches to be applied.

Wrote Shell Scripts for Data loading and DDL Scripts.

Worked in Production Support Environment as well as QA/TEST environments for projects, work orders, maintenance requests, bug fixes, enhancements, data changes, etc.

Used Oracle JDeveloper to support JAVA, JSP and HTML codes used in modules.

Wrote conversion scripts using SQL, PL/SQL, stored procedures, functions and packages to migrate data from SQL server database to Oracle database.

Performed Database Administration of all database objects including tables, clusters, indexes, views, sequences packages and procedures.

Implemented 11g and upgraded the existing database from Oracle 9i to Oracle 11g.

Involved in Logical & Physical Database Layout Design.

Set-up and Design of Backup and Recovery Strategy for various databases.

Performance tuning of Oracle Databases and User applications.

Used SQL*Loader as an ETL tool to load data into the staging tables.

Used DTS Packages as ETL tool for migrating Data from SQL Server 2000 to Oracle 10g.

Provided user training and production support.

Improved the performance of the application by rewriting the SQL queries.

Wrote packages to fetch complex data from different tables in remote databases using joins, sub queries and database links.

Environment: VB 6, Oracle 9i/10g/11g, SQL, PL/SQL, Forms 9i, Reports 9i, SQL*Loader, SQL Navigator, Crystal Reports, TOAD.
