DevOps Engineer

Location: Canmore, AB, Canada

Posted: January 04, 2021


Prudhvi Vasireddi

adi52w@r.postjobfree.com

587-***-****

Overview:

•DevOps Engineer with 8+ years of IT experience in Continuous Integration/Continuous Deployment, with a strong background in System Administration, Build/Release Management, Change/Incident Management and Cloud Management on AWS and OpenStack.

•Experience working with public cloud platforms such as Amazon Web Services, including EC2, S3, EBS, VPC, ELB, Auto Scaling, Route 53, Security Groups, CloudWatch, CloudFront, CloudFormation, IAM and Glacier.

•Experience in creating user/group accounts and attaching policies to them using the AWS IAM service.

•Extensively worked on creating multiple AWS instances, Elastic Load Balancers and Auto Scaling to design cost-effective, fault-tolerant and highly available systems.

•Defined AWS Security Groups, which acted as virtual firewalls controlling the traffic allowed to reach one or more EC2 instances.

•Set up databases in AWS using RDS, storage using S3 buckets, and configured instance backups to S3.
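
A minimal AWS CLI sketch of that backup pattern (the database identifier, bucket name and paths are hypothetical):

```bash
# Sketch: take an on-demand RDS snapshot and sync application backups to S3.
# The DB identifier, bucket name and backup path are illustrative placeholders.
aws rds create-db-snapshot \
  --db-instance-identifier app-prod-db \
  --db-snapshot-identifier "app-prod-db-$(date +%Y%m%d)"

# Push local instance backups to a dedicated S3 bucket
aws s3 sync /var/backups/app s3://example-app-backups/$(hostname)/
```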

•Experience in configuring Virtual Private Cloud (VPC), subnets, Internet Gateways, S3 buckets and Route 53 in the Amazon cloud environment.

•Experience in branching, tagging and maintaining versions across environments using Software Configuration Management (SCM) tools such as Subversion (SVN) and Git.

•Extensively involved in installing and configuring monitoring tools such as Nagios and Splunk.

•Experience using Maven and Ant as build tools for producing deployable artifacts (jar, war & ear) from source code, writing pom.xml and build.xml respectively.
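
The corresponding build invocations, for reference (the Ant target name is an assumption; projects vary):

```bash
# Maven: reads pom.xml and produces the jar/war/ear under target/
mvn clean package

# Ant: reads build.xml and runs the named target (here assumed to be "dist")
ant -f build.xml dist
```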

•Wrote scripts in Bash, Shell, Ruby and Python for Continuous Deployment.

•Strong hands-on experience in configuring software provisioning tools such as Chef, Ansible and Puppet.

•Worked with Docker and Vagrant for infrastructure setup and testing of code.

•Proficient with container systems such as Docker and container orchestration platforms such as EC2 Container Service (ECS) and Kubernetes; also worked with Terraform.

•Good knowledge of creating and maintaining Docker images and containers and working with Docker Hub.

•Provisioned AWS infrastructure (EC2, S3, VPC, Security Groups, EBS) using Terraform.
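
A typical Terraform workflow for this kind of provisioning, sketched as shell commands (the workspace and plan-file names are illustrative):

```bash
# Initialize providers and modules, preview the changes, then apply the saved plan
terraform init
terraform workspace select prod    # assumes a "prod" workspace already exists
terraform plan -out=tfplan
terraform apply tfplan
```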

•Worked on web servers such as Apache and application servers such as WebLogic, Tomcat and JBoss to deploy code.

•Configured network services such as DNS, NFS, SMTP, NTP, DHCP, LDAP, Postfix, Sendmail, FTP and remote access; strong security management and security troubleshooting skills.

•Monitored and supported hundreds of Zabbix servers running a variety of applications.

•Experience in Agile and Waterfall methodologies.

•Excellent understanding of all phases of the Software Development Life Cycle, from analysis to implementation.

Technical Skills

Languages: JavaScript, Java, Python, Node.js

Scripting Languages: Shell, Ruby, Python

Middleware: Apache Tomcat, WebLogic, NGINX, F5

Databases: Oracle, PostgreSQL, MongoDB, Amazon RDS, DynamoDB, Redshift

Operating Systems: Red Hat, Ubuntu, Linux, Windows, CentOS

Build Automation: Maven, Gradle, CodeDeploy

SCM: SVN, Git

Cloud Platforms: AWS, OpenStack, Lambda

Configuration Management Tools: Chef, Terraform, Puppet, Ansible, Consul, Vagrant

Containers: Docker, Kubernetes, Mesos, ECS

Monitoring/Visualization Tools: Splunk, Zabbix, Nagios, Dynatrace, New Relic, Kinesis

Change Management: JIRA, Confluence

Professional Experience

ATB Financial, Calgary, AB Dec ‘19 to Present

Cloud DevOps Engineer

Roles & Responsibilities:

•Designed scalable, highly available and fault-tolerant infrastructure solutions following best practices for dedicated data centers and public, private or hybrid clouds.

•Launched Amazon EC2 instances (Linux/Ubuntu) on AWS and configured the launched instances for specific applications.

•Configured a build failure analytics dashboard using Elasticsearch and Kibana.

•Performed production patching on Linux (Debian/RHEL/Amazon Linux AMI) and Windows servers.

•Monitored metrics such as CPU utilization, memory and disk space in AWS CloudWatch.

•Worked on optimizing volumes and EC2 instances and created multiple VPCs.

•Implemented MFA (multi-factor authentication) with Google Authenticator on the primary and secondary VPN servers with LDAP credentials.

•Worked frequently with backend and database professionals to deploy and run services in containers (AWS ECS).

•Managed Docker orchestration and containerization using Kubernetes, orchestrating the deployment, scaling and management of Docker containers.
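
A few representative kubectl operations for that workflow (the deployment name, label and manifest file are placeholders):

```bash
# Roll out a containerized service, scale it, and watch the rollout complete
kubectl apply -f deployment.yaml
kubectl scale deployment example-api --replicas=4
kubectl rollout status deployment/example-api
kubectl get pods -l app=example-api
```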

•Developed microservice onboarding tools leveraging Python and Jenkins, allowing for easy creation and maintenance of build jobs and Kubernetes deployments and services.

•Automated infrastructure using Terraform and Ansible.

•Created gold images and employed autoscaling.

•Built AWS Custom AMI by installing and configuring the software using Packer.

•Designed & Implemented Branching Strategy.

•Maintained SVN repositories for the DevOps environment: automation code and configuration.

•Created Jenkins jobs for automated deployment to Dev, QA & Regression environments on a cron schedule.

•Created Jenkins jobs for continuous delivery to Perf and Prod environments.

•Used Chef to manage web applications, config files, databases, commands, users, mount points and packages.

•Set up an auto-merge Jenkins job from the release branch to the dev and master branches.

•Integrated Ansible playbooks with Jenkins for push-button deployments to higher environments, including Integration, Staging and Production.

•Used Ansible playbooks to set up the continuous delivery pipeline and deploy microservices, including provisioning AWS environments.
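
An illustrative playbook invocation from such a pipeline (the inventory, playbook, tag and variable names are assumptions; BUILD_NUMBER is the Jenkins-provided build variable):

```bash
# Deploy a microservice to the staging inventory, limited to the app hosts
ansible-playbook -i inventories/staging site.yml \
  --limit app_servers \
  --tags deploy \
  --extra-vars "app_version=${BUILD_NUMBER}"
```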

•Used Ansible to document all infrastructure as code in version control.

•Configured permissions in Ansible Tower to restrict users at different levels.

•Utilized Splunk queries to triage production issues.

•Created New Relic dashboards for alert creation, application performance and metrics.

•Configured custom Email notifications with Release Notes.

•Migrated the AWS infrastructure from Elastic Beanstalk to Docker with Kubernetes.

•Installed and configured Filebeat to ship logs to Logstash.

•Created a Bash script for automatic backup of artifacts and Jenkins configuration.
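
A minimal sketch of such a backup script (the Jenkins home path, bucket name and retention window are hypothetical):

```bash
#!/usr/bin/env bash
# Archive Jenkins configuration and job definitions, ship to S3, prune old copies.
set -euo pipefail

STAMP=$(date +%Y%m%d)
ARCHIVE="/tmp/jenkins-backup-${STAMP}.tar.gz"

# Exclude workspaces to keep the archive small; keep jobs, plugins and config
tar -czf "${ARCHIVE}" -C /var/lib/jenkins --exclude='workspace' jobs plugins config.xml

# Copy to S3 and remove local archives older than 7 days
aws s3 cp "${ARCHIVE}" s3://example-jenkins-backups/
find /tmp -name 'jenkins-backup-*.tar.gz' -mtime +7 -delete
```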

•Generated code coverage, code quality and test reports using Sonar and JaCoCo.

•Installed and configured Node.js for the transcoder engine.

•Actively worked on SSL (Secure Sockets Layer) certificate installation for external applications.

•Set up log rotation to rotate logs by size and purge old logs.
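
One way to express that with logrotate, written as a shell heredoc (the log path and limits are illustrative):

```bash
# Rotate application logs once they exceed 100 MB and keep 5 compressed copies
sudo tee /etc/logrotate.d/example-app > /dev/null <<'EOF'
/var/log/example-app/*.log {
    size 100M
    rotate 5
    compress
    missingok
    notifempty
}
EOF
```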

•Designed security groups to control inbound and outbound traffic.

•Experienced in branching, labeling, and analyzing and resolving conflicts when merging source code in Git; implemented a continuous integration and delivery pipeline using Docker, Jenkins, Bitbucket and Git, and used the Jenkins pipeline plugin to analyze Maven dependencies and SCM changes.

•Experience working with Docker Hub, creating Docker images and handling multiple images, primarily for middleware installations and domain configurations.

BMI, Nashville, TN Jan ‘15 to Dec ‘19

DevOps/AWS Engineer

Roles & Responsibilities:

•Responsible for completing stories in two-week sprints, including developing and modifying new and existing Puppet modules to support newer versions of the infrastructure stack on Windows and Linux platforms.

•Worked with the packaging team to make appropriate packages available in the central library for Puppet module development.

•Built multi-zone and multi-region architectures.

•Implemented CI/CD pipeline using GitHub, Jenkins and Docker.

•Experience in developing CloudFormation scripts for AWS orchestration.

•Used Jenkins pipelines to drive all microservice builds out to the Docker registry and then deploy to Kubernetes; created and managed pods using Kubernetes.

•Worked extensively on DevOps practices using AWS and Docker with Kubernetes.

•To achieve continuous delivery goals in a highly scalable environment, used Docker coupled with the load-balancing tool Nginx.

•Assisted with migrating services from Rackspace to AWS.

•Responsible for Nagios for Synopsys, which handles VM monitoring and periodic updating of local health checks for VMs.

•Maintained and expanded the Elasticsearch cluster used for log aggregation to a 5-node cluster.

•Understood the existing datacenter tools and Kickstart code in RHEL.

•Automated the creation of host definitions in the central Nagios instance.

•Used Puppet/ Hiera for automatic environment provisioning and deployments.

•REST API scripting in Python and Perl to interact with infrastructure tools.

•Enhanced S3 storage object handling in AWS using Python code.

•Used Amazon EC2 features to provision, monitor, scale and distribute compute infrastructure.

•Monitored instance health using AWS CloudWatch and sent message alerts using SNS.
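
A representative CloudWatch alarm wired to an SNS topic (the instance ID, topic ARN and threshold are placeholders):

```bash
# Alert when average CPU stays above 80% for two consecutive 5-minute periods
aws cloudwatch put-metric-alarm \
  --alarm-name high-cpu-example \
  --namespace AWS/EC2 \
  --metric-name CPUUtilization \
  --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
  --statistic Average \
  --period 300 \
  --evaluation-periods 2 \
  --threshold 80 \
  --comparison-operator GreaterThanThreshold \
  --alarm-actions arn:aws:sns:us-east-1:123456789012:ops-alerts
```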

•Monitored environments, servers and applications for health, performance and security using tools such as Prometheus, Nagios, Graphite and Splunk.

•Used Ansible and Ansible Tower as configuration management tools to automate repetitive tasks, quickly deploy critical applications and proactively manage change.

•Developed Python Modules for Ansible Customizations.

•Extended and enhanced an open-source cloud interface library called libcloud to support most of the Amazon EC2 API.

•Configured the Kubernetes cluster and supported it running on top of CoreOS.

•Used Tomcat and WebLogic as standard application servers to deploy web applications.

•Configured Ansible to manage AWS environments and automate the build process for core AMIs used by all application deployments, including Auto Scaling and CloudFormation scripts.

•Rebuilt software deployment pipeline integrating Docker and Mesos.

•Created the build delivery pipeline in Jenkins and Bash scripts.

Wells Fargo, Des Moines, IA Aug ‘13 to Dec ‘14

DevOps/Cloud Engineer

Roles & Responsibilities:

•Worked on setting up the AWS environment, which included VPC, subnets, S3, EC2, web servers, IAM, Security Groups, Load Balancer & Lambda, to support data warehousing solutions.

•Worked with EBS and the S3 storage service, performing tasks such as bucket creation, folder navigation, property changes and data migration.

•Migrated Non-prod physical machines to Virtual Machines on AWS Cloud.

•Rotated the AWS access and secret keys of generic application users.

•Configured IAM services, creating new IAM users & groups and defining roles, policies and identity providers.

•Configured AWS CloudWatch to monitor instances during performance testing and to monitor application performance in QA environments.

•Set up databases in AWS using RDS, storage using S3 buckets, and configured instance backups to S3.

•Configured Chef Server and Chef Solo, including bootstrapping of Chef client nodes for provisioning.

•Created roles, cookbooks, recipes, and data bags for server configuration, deployment, and app stack build.

•Used Chef, Knife and Ohai to create cookbooks and recipes that install packages and automate tasks on Linux.

•Set up Chef cookbooks to perform builds and deployment management.
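
A typical knife bootstrap of a new node into a role-based run list (the host, user, node and role names are assumptions; exact flag names vary by Chef version):

```bash
# Register a new Linux node with the Chef server and assign its run list
knife bootstrap 10.0.1.25 \
  --ssh-user ec2-user \
  --sudo \
  --node-name web01 \
  --run-list 'role[webserver],recipe[base::hardening]'
```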

•Built a deployment pipeline for deploying tagged versions of applications to AWS Elastic Beanstalk using Jenkins CI.

•Used AWS Elastic Beanstalk for continuous deployment to reduce development timelines and increase productivity.

•Integrated with Splunk, which offers a pre-built knowledge base of critical dashboards and reports.

•Set up scripts for the creation of new snapshots and the deletion of old snapshots stored in S3.
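
A sketch of such a snapshot script using the AWS CLI (the volume ID and retention window are placeholders):

```bash
#!/usr/bin/env bash
# Create a fresh EBS snapshot, then delete this volume's snapshots older than 14 days.
set -euo pipefail

VOLUME_ID="vol-0123456789abcdef0"    # placeholder
CUTOFF=$(date -d '14 days ago' +%s)

aws ec2 create-snapshot \
  --volume-id "${VOLUME_ID}" \
  --description "nightly-$(date +%F)"

# Compare each snapshot's start time against the cutoff and prune the old ones
aws ec2 describe-snapshots --owner-ids self \
  --filters "Name=volume-id,Values=${VOLUME_ID}" \
  --query 'Snapshots[].[SnapshotId,StartTime]' --output text |
while read -r snap_id start_time; do
  if [ "$(date -d "${start_time}" +%s)" -lt "${CUTOFF}" ]; then
    aws ec2 delete-snapshot --snapshot-id "${snap_id}"
  fi
done
```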

•Configured trigger points and alarms in CloudWatch based on predefined thresholds and log monitoring.

•Created launch configurations and auto-scaling groups based on those launch configurations.

•Designed and implemented a VPC infrastructure inside of AWS that would facilitate a migration with minimal downtime.

•Implemented Chef Server and component installations, including certificate imports, increasing the Chef license count, and creating admins and users.

•Wrote several cookbooks in Ruby with recipes to automate middleware installation and configuration tasks.

AIG, Fort Worth, TX July ‘12 to July '13

Linux System Administrator

Roles & Responsibilities:

•Installed and administered RHEL 4.0/5.0, CentOS 5 and Solaris 9/10.

•Configured networking for host names, netmasks, route details, DNS, NFS, NTP, SNMP, etc.

•Performed network troubleshooting using ndd, traceroute, netstat, ifconfig and snoop; monitored server and application performance via stat commands such as vmstat, nfsstat and iostat, and tuned I/O, memory, etc.

•Handled user & security administration, backup, recovery and maintenance activities.

•Built new systems for Production and migrated from Solaris 8 to 9.

•Maintained Volumes and File systems using VxVM / VxFS for Oracle, DB2, MySQL databases.

•Experience with running SQL queries on Oracle and MySQL.

•Worked with the development team to troubleshoot and debug issues with our java application and other 3rd party service providers.

•Modified Kernel parameters to improve the server performance in Linux.

•Managed Red Hat Enterprise Virtualization environments in multiple datacenters.

•Developed SQL queries to extract information for business partners from Oracle and MySQL databases.

Education:

• Bachelor’s in Computer Science from JNTU University, Hyderabad, India. Class of May 2010.

• Master’s in Computer Science from Western Kentucky University, Bowling Green, KY. Class of May 2012.


