
Engineer Cloud

Location: Hyderabad, Telangana, India
Posted: June 07, 2018


Prasad – AWS DevOps Engineer

ac5sg1@r.postjobfree.com

+1-612-***-****

PROFESSIONAL SUMMARY:

IT professional with eight years of experience as an AWS Cloud/DevOps Engineer.

Extensive experience includes Software Configuration Management (SCM), DevOps Build/Release Management, Change/Incident Management and Cloud Management.

Experience coordinating with Development, QA and Production Operations teams.

Experienced in designing architecture diagrams for applications before migrating them to the Amazon cloud, targeting flexible, cost-effective, reliable, scalable, high-performance and secure deployments.

Worked with several teams to transition workflows to a Continuous Integration and Delivery model and implemented best practices for utilizing AWS.

Designed, built and deployed a multitude of applications utilizing a broad range of AWS (Amazon Web Services) offerings, including EC2, S3, Elastic Beanstalk, Elastic Load Balancing (Classic/Application), Auto Scaling, RDS, VPC, Route 53, CloudWatch, snapshots and IAM, focusing on high availability and fault tolerance.

Integrated Jenkins with AWS to automate services.

Well versed in managing source code repositories like SVN, Git, GitHub and Bitbucket.

Proficient in writing Puppet modules and Chef cookbooks and recipes to manage system configuration.

Experience integrating infrastructure automation using Puppet, and creating and configuring Jenkins jobs, build and delivery pipelines.

Experience working with various CI/CD tools like Jenkins/Hudson, Sonar, Subversion, Team Foundation Server, Nexus and Artifactory.

Experience with monitoring and logging tools like AppDynamics, Splunk and Nagios for monitoring network services and host resources.

Experience in various scripting languages like Shell, Ruby and Python, with a focus on DevOps tools, CI/CD and AWS cloud architecture.

Extensive experience using Maven and Ant as build tools to produce deployable artifacts (JAR, WAR) from source code.

Delivered architecture designs and solutions for public, private and hybrid clouds covering the cloud architecture tiers and portfolios of cloud services.

Used AWS CloudFormation and AWS OpsWorks to deploy the infrastructure necessary to create development, test and production environments for a software development project.

Good expertise in implementing PaaS environments (using Elastic Beanstalk) and Infrastructure as Code using CloudFormation.

Have experience with serverless/PaaS technologies (API Gateway, Lambda, DynamoDB, S3, etc.).

Hands-on experience with hypervisor and compute virtualization technologies (VMware ESXi, vSphere, vCenter, vCloud, VMware Horizon/View).

Worked on Docker components like Docker Engine, Docker Hub, Docker Compose, Docker Registry and Docker Swarm.

Expertise in creating DevOps strategy in a mixed environment of Linux (RHEL, Ubuntu and CentOS) and Windows servers along with Amazon Web Services.

Expert in deploying code to web/application servers like WebSphere, WebLogic, Apache Tomcat and JBoss, and built microservices using API Gateway.

TECHNICAL SKILLS

Configuration Management: Puppet, Chef and Ansible

Scripting languages: Bash, Shell, PowerShell, Perl, Python, Ruby, SQL, JSON, YAML

Cloud computing: AWS (CloudFront, CloudWatch, EC2, VPC, ELB, RDS, Route 53, ECS, EBS, Lambda, IAM, Data Pipeline, CodeBuild, CodeDeploy, CodeCommit, AWS Config) and Microsoft Azure

Compiled languages: C, C++, C# and Java

Databases: SQL Server, Oracle, MySQL

Operating systems: Windows, Linux, Unix, Ubuntu and CentOS

Build tools: Ant, Maven, MSBuild, Jenkins, Gradle, Selenium, AWS CodePipeline, AWS CodeBuild, AWS CodeDeploy

Version control tools: SVN, Git, TFS, CVS and AWS CodeCommit

Virtualization products: VMware, vSphere, vCenter Server, VMware Server, Xen, Solaris Zones

Web/Application servers: WebLogic, Apache Tomcat, WebSphere, BladeLogic Server and Nginx

Client: UnitedHealth Group (Optum)    June 2017 – Till date

Role: AWS/DevOps Engineer

Location: Eden Prairie, MN

Responsibilities:

Planning, design, roadmaps and POC implementations for AWS cloud technologies as part of a cloud-first strategy.

Work closely with development teams to integrate their projects into the production environment and ensure their ongoing support.

Implementing a Continuous Delivery framework using Jenkins, Chef, Maven and Nexus in a Linux environment.

Implemented and maintained the monitoring and alerting of production and corporate servers/storage using AWS CloudWatch.
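
For illustration, a minimal sketch of the kind of CloudWatch alarm behind this monitoring, using Python and boto3; the instance ID and SNS topic ARN are placeholders, not values from this project:

    import boto3

    # Placeholders -- substitute real identifiers for an actual environment.
    INSTANCE_ID = "i-0123456789abcdef0"
    ALERT_TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:ops-alerts"

    cloudwatch = boto3.client("cloudwatch")

    # Alarm when average CPU on the instance stays above 80% for two 5-minute periods.
    cloudwatch.put_metric_alarm(
        AlarmName=f"high-cpu-{INSTANCE_ID}",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[ALERT_TOPIC_ARN],  # notify the on-call SNS topic
    )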

Created CloudFormation templates for hosting software on the AWS cloud and automated software installation through PowerShell scripts.

Built and deployed Java and Node.js web applications in an agile continuous integration and continuous deployment environment to automate the whole process.

Installed Chef Workstation, bootstrapped nodes, wrote recipes and cookbooks and uploaded them to the Chef server; managed on-site OS/applications/services/packages using Chef, as well as AWS resources (EC2, S3 and ELB) with Chef cookbooks.

Wrote JSON templates for CloudFormation and Ruby scripts for Chef automation, contributing to our repository on GitHub (version control).

Used AWS Import/Export to accelerate moving large amounts of data into and out of AWS using portable storage devices for transport.

Added project users to the AWS account with multi-factor authentication enabled and least-privilege permissions.
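
For illustration, a sketch of adding a project user with least-privilege permissions using boto3; the user name, bucket and policy are hypothetical, and MFA enrollment is completed separately (associating a virtual MFA device requires two consecutive codes):

    import json
    import boto3

    iam = boto3.client("iam")

    USER_NAME = "project-dev-user"  # hypothetical user name

    # Create the user and scope it down to a single S3 bucket (least privilege).
    iam.create_user(UserName=USER_NAME)
    least_privilege_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::project-artifacts-bucket/*",  # placeholder bucket
        }],
    }
    iam.put_user_policy(
        UserName=USER_NAME,
        PolicyName="project-s3-least-privilege",
        PolicyDocument=json.dumps(least_privilege_policy),
    )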

Created and automated Jenkins pipelines using Groovy pipeline scripts for the applications.

Used Maven as the build tool on Java projects to produce and deploy build artifacts from source code.

Created branching and tagging strategies to maintain the source code in the repository and coordinated with developers on establishing and applying appropriate branching and labeling/naming conventions using Git source control.

Used EC2 Container Service (ECS) to run Docker containers on a managed cluster of Amazon EC2 instances.

Deployed LAMP-based applications in the AWS environment, including provisioning MySQL RDS instances and establishing connectivity between EC2 instances and RDS via security groups.
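
A sketch of the kind of security group rule that lets the EC2 application tier reach MySQL on RDS; the group IDs are placeholders:

    import boto3

    ec2 = boto3.client("ec2")

    APP_SG_ID = "sg-0aaa1111bbbb2222c"  # security group attached to the EC2 instances
    DB_SG_ID = "sg-0ddd3333eeee4444f"   # security group attached to the MySQL RDS instance

    # Allow MySQL (3306) into the database security group only from the app security group.
    ec2.authorize_security_group_ingress(
        GroupId=DB_SG_ID,
        IpPermissions=[{
            "IpProtocol": "tcp",
            "FromPort": 3306,
            "ToPort": 3306,
            "UserIdGroupPairs": [{"GroupId": APP_SG_ID}],
        }],
    )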

Configured Elastic Load Balancers with EC2 Auto Scaling groups.

Implemented migration of the source code repository to AWS CodeCommit.

Deployed code into the required environments using AWS CodeDeploy.

Used AWS CodePipeline to design and implement a continuous integration and delivery pipeline on AWS.

Configured Virtual Private Clouds (VPCs) with subnets containing the application servers.
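
A minimal boto3 sketch of the VPC-and-subnet layout described above; the CIDR ranges and availability zone are illustrative:

    import boto3

    ec2 = boto3.client("ec2")

    # Create the VPC and two subnets (one public, one private) -- example CIDRs only.
    vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
    vpc_id = vpc["Vpc"]["VpcId"]

    public_subnet = ec2.create_subnet(
        VpcId=vpc_id, CidrBlock="10.0.1.0/24", AvailabilityZone="us-east-1a")
    private_subnet = ec2.create_subnet(
        VpcId=vpc_id, CidrBlock="10.0.2.0/24", AvailabilityZone="us-east-1a")

    print(vpc_id, public_subnet["Subnet"]["SubnetId"], private_subnet["Subnet"]["SubnetId"])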

Accessed VPC, subnet, security group and EC2 information via the AWS describe APIs and CloudFormation to create spreadsheets and MySQL and Postgres database entries.

Built S3 buckets, managed bucket policies and used S3 and Glacier for storage and backup on AWS.

Environment: Amazon Web Services, IAM, S3, EBS, AWS SDK, CloudWatch, CloudFormation, SVN, GitHub, Chef, Jenkins, Ansible, Docker, Java, Agile, Apache Tomcat, JSON, Shell, Python.

Client: Chevron Corporation, California, U.S

Role: AWS DevOps Engineer

Location: Atlanta, GA June 2016 – March 2017

Responsibilities:

Managed Git for legacy products and automated weekly deployments with Jenkins.

Maintaining Git repos and Jenkins for builds.

Used Maven and Gradle as build tools on Java projects to develop build artifacts and to test J2EE and J2SE applications on a variety of WebSphere platforms (BPM, MQ, ESB, Portal, etc.), and conducted an SVN to Git migration.

Converted our staging and production environment from a handful of AMIs to a single bare-metal host running Docker.

Introduced and implemented continuous integration principles and practices for enterprise-wide teams using Jenkins, Git, Nexus, Gradle, Ansible, Docker and AWS.

Managed Ubuntu Linux and Windows virtual servers by using Puppet.

Experienced working with Puppet Master and Puppet Agents, defining them to manage and configure nodes.

Experienced in configuration management tools such as Ansible, Chef, and Maven.

Optimized volumes and EC2 instances and created multi-AZ VPC instances.

Used IAM to create new accounts, roles and groups

Configured S3 versioning and lifecycle policies to back up files and archive them in Glacier.
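
A sketch of this versioning and Glacier-archival setup via boto3; the bucket name and transition windows are examples, not project values:

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "example-backup-bucket"  # placeholder bucket name

    # Turn on versioning so older copies of backup files are retained.
    s3.put_bucket_versioning(
        Bucket=BUCKET,
        VersioningConfiguration={"Status": "Enabled"},
    )

    # Move current objects to Glacier after 30 days and expire old versions after a year.
    s3.put_bucket_lifecycle_configuration(
        Bucket=BUCKET,
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-to-glacier",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to the whole bucket
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "NoncurrentVersionExpiration": {"NoncurrentDays": 365},
            }],
        },
    )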

Configured S3 to host static web content

Deployed cloud stacks using AWS OpsWorks.

Configured Elastic Load Balancers with EC2 Auto Scaling groups.
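
A rough boto3 sketch of attaching a Classic ELB to an Auto Scaling group as described above; the group, launch configuration and load balancer names are hypothetical and assumed to already exist:

    import boto3

    autoscaling = boto3.client("autoscaling")

    # Attach a (hypothetical) launch configuration and Classic ELB to a new Auto Scaling group.
    autoscaling.create_auto_scaling_group(
        AutoScalingGroupName="web-asg",
        LaunchConfigurationName="web-launch-config",   # assumed to exist already
        MinSize=2,
        MaxSize=6,
        DesiredCapacity=2,
        AvailabilityZones=["us-east-1a", "us-east-1b"],
        LoadBalancerNames=["web-classic-elb"],         # placeholder Classic ELB name
    )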

Implemented a Continuous Delivery pipeline with Docker, Jenkins, GitHub and AWS AMIs.

Served as technical leader to align processes and tools (branching, source control structure, dependency management, Linux/Windows hybrid build infrastructure, code review and check-in policies) developed and instrumented by DevOps teams across projects globally.

Configured and monitored distributed and multi-platform servers using Nagios

Supported 2000+ AWS cloud instances and am familiar with AWS command-line management.

Experience in developing ETL processes and data models using ETL tools such as SAP Data Services and Informatica.

Implemented a Continuous Delivery pipeline with Docker, microservices, Jenkins, GitHub, Nexus, Maven and AWS AMIs.

Implemented the build stage to build the microservice and push the Docker container image to the private Docker registry.

Worked on user administration setup and account maintenance, and monitored system performance using Nagios and Tivoli.

Encrypted EBS volumes via the KMS service.
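
For illustration, creating a KMS-encrypted EBS volume with boto3; the availability zone and key alias are assumptions:

    import boto3

    ec2 = boto3.client("ec2")

    # Create a 100 GiB gp2 volume encrypted with a customer-managed KMS key.
    volume = ec2.create_volume(
        AvailabilityZone="us-east-1a",          # placeholder AZ
        Size=100,
        VolumeType="gp2",
        Encrypted=True,
        KmsKeyId="alias/ebs-encryption-key",    # placeholder key alias
    )
    print(volume["VolumeId"])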

Worked on end-to-end setup of Artifactory Pro as a Docker container with a secure private Docker registry and local Docker repositories for storing the built Docker images.

Migrated the current code to a CI/CD pipeline, moving from Ant to Gradle, TFS to Git, and Bamboo to Jenkins.

Implemented various testing scenarios (contract, functional, performance and integration tests) as part of the microservice architecture, using Moco server to mock the backend.

Deployed and monitored microservices using Pivotal Cloud Foundry, and managed domains and routes with Cloud Foundry.

Worked on deployment automation of all the microservices to pull images from the private Docker registry and deploy them to the Docker Swarm cluster using Ansible.

Created monitors, alarms and notifications for EC2 hosts using CloudWatch. Set up ElastiCache using Memcached.
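
A minimal boto3 sketch of the kind of Memcached ElastiCache cluster described here; the cluster ID and node type are placeholders:

    import boto3

    elasticache = boto3.client("elasticache")

    # Create a small two-node Memcached cluster -- sizes are illustrative only.
    elasticache.create_cache_cluster(
        CacheClusterId="session-cache",        # placeholder cluster ID
        Engine="memcached",
        CacheNodeType="cache.t2.micro",
        NumCacheNodes=2,
    )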

Migrated applications to the AWS cloud

Experience implementing Cloud based Data Warehousing solutions.

Environment: EC2, Elastic Load Balancer, EBS, IAM, Gradle, AWS OpsWorks, VPC, Amazon CLI, Puppet, Ansible, Rundeck, RDS, S3, Glacier, Nagios, Tivoli, Auto Scaling, CloudWatch.

Client: Bank of America

Role: Sr. AWS/DevOps Engineer

Location: NC Sep 2015 – Mar 2016

Responsibilities:

Administered the Linux systems and middleware for day-to-day operations and administration of RHEL 5/6 and CentOS, including installation, testing, tuning, upgrading, loading patches and troubleshooting server issues.

Interacted with client teams to understand client deployment requests.

Involved in defining, documenting, negotiating and maintaining Product/Application Release Roadmap.

Involved in designing and deploying applications utilizing almost all of the AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS and IAM), focusing on high availability, fault tolerance and auto-scaling in AWS CloudFormation.

Good experience with networking in AWS: VPC, datacenter-to-cloud connectivity, security groups, route tables and ACLs.

Designed AWS CloudFormation templates to create custom-sized VPCs, subnets and NAT to ensure successful deployment of web applications and database templates.

Involved in implementing deployments into AWS EC2 with the help of Terraform.

Involved in working with Amazon EC2 to deploy applications by grouping the container instances, with Docker used as the containerization tool.

Implemented and maintained the monitoring and alerting of production and corporate servers/storage using AWS CloudWatch.

Integrated Jenkins with code quality analysis and review tools like SonarQube and JaCoCo.

Integrated with SonarQube reporting dashboard to run analysis for every project.

Installed and maintained Xen and VMware servers with multiple VMs, running a multi-VLAN physical and VM environment.

Extensive experience using the vCAC enterprise tool to give users access to the organization's cloud, working to keep the environment in line with business requirements.

Worked on Kayako tickets, including general maintenance issues.

Worked with an agile development team to deliver an end-to-end continuous integration and continuous delivery product in an open-source environment using Jenkins and uDeploy.

Experience in creating deployment environments in the uDeploy deployment tool.

Experience with Docker and Vagrant for different infrastructure setup and testing of code.

Implemented a continuous deployment (CD) framework that automates the software delivery process from source code check-in to deployment onto application servers.

Developed and consumed SOAP and RESTful web services.

Developed code in AngularJS and Node.js.

Deployed and scaled web applications and services developed with Java, PHP, Node.js, Python, Ruby and Docker on familiar servers like Apache with the help of AWS Elastic Beanstalk.

Built custom services and utilized existing services like the HTTP service to invoke REST service calls.

Executed modules using scripting languages like Python, shell and Bash in Ansible.

Executed a continuous delivery pipeline with Docker, Jenkins, GitHub and AWS AMIs, resulting in a new Docker container being generated whenever a new GitHub branch is started.

Configured Ubuntu machines and Red Hat machines using Ansible.

Worked on Ansible playbooks to map hosts to a set of roles.

Automated various infrastructure activities like continuous deployment, application server setup and stack monitoring using Ansible playbooks, and integrated Ansible with Jenkins.

Used Ansible playbooks to automate AWS features like EC2, IAM, VPC, EBS, CloudWatch, CloudTrail, CloudFormation, Auto Scaling and S3, and have general knowledge of Kubernetes.

Installed and managed Kubernetes applications using Helm.

Used Kubernetes to create new projects and services for load balancing, adding them to routes for external access; created pods through new applications and controlled, scaled and troubleshot pods through SSH.

Managed the Artifactory repository for the current project; created a new repository and provisioned it.

Worked with a log-structured storage model in Cassandra and performed small, simple tests showing the amount of load placed on disk systems during small operations.

Wrote PowerShell scripts to automatically restart the uDeploy agents on Linux machines.

Performed Automation and Scaling of applications using Kubernetes.
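
A rough sketch of scaling a deployment with the official Kubernetes Python client; the deployment name, namespace and replica count are hypothetical:

    from kubernetes import client, config

    # Load credentials from the local kubeconfig (inside a cluster,
    # config.load_incluster_config() would be used instead).
    config.load_kube_config()
    apps = client.AppsV1Api()

    # Scale the (hypothetical) "orders-service" deployment to 5 replicas.
    apps.patch_namespaced_deployment_scale(
        name="orders-service",
        namespace="default",
        body={"spec": {"replicas": 5}},
    )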

Worked on SaaS and DNS entry features and configured them using Terraform.

For monitoring production health, used tools like Splunk and AppDynamics; for networking issues, used Nagios.

Interpreted Jenkins logs to pinpoint causes of failure.

Wrote shell scripts to automate deployments to JBoss application servers in Unix/Linux environments.

Environment: AWS, Bitbucket, Shell, Docker, RHEL, Linux, RESTful, Ant/Maven, SonarQube, JUnit, Jenkins, uDeploy, WebSphere Application Server Network Deployment, IBM HTTP Server, JBoss, Tomcat, Nagios, shell scripting, XML, Java, J2EE applications.

Client: Extranet Software Solutions July 2012 – Apr 2015

Role: DevOps Engineer

Location: Hyderabad, IN

Responsibilities:

Responsible for configuring and troubleshooting Rational ClearCase administration.

Developing and maintaining quality control documents.

Evaluated and led the conversion to TFS for integrated source control, builds, testing and deployment, and led the upgrade of Team Foundation Server.

Experience with planning, controlling and troubleshooting software releases by .NET teams.

Troubleshot the automation of installing and configuring .NET and C# applications in the test and production environments.

Create and manage associated SharePoint sites.

Create and manage reporting server and reporting solutions.

Experience in functional testing of web applications using Selenium.

Used Maven, Selenium WebDriver and Java-scripted Selenium Grid to create nightly automation scripts.

Train, mentor and coach end users in all functional areas of TFS.

Implemented continuous integration (CI) automated build pipelines using Jenkins.

Maintained the automated build system and implemented new features and scripts for it.

Work with Operations to coordinate production and test releases

Used SonarQube for code coverage and code quality.

Performed Static Code Analysis using Sonarqube.

Run the automated test scripts including build verification test scripts after every build.

Assist in component/production issue diagnosis and resolution.

Setting up Rational Clear Case.

Produced Azure cluster status reports for high-level management.

Installation of Rational Clear Case on clients and servers.

Preparing procedure for administration of Clear Case.

Developing and maintaining standard operating procedures documents.

Expertise in implementing configuration management tools like Chef, Puppet, Ansible and Docker.

Involved in coordinating with the ops team to set up APP Pools for various .NET Apps.

Good knowledge on Selenium suite of Tools (Selenium IDE, Selenium RC, Selenium WebDriver and Selenium Grid).

Familiar with test automation framework implementation.

Ability to create and execute Test cases using Selenium IDE and Selenium Web driver.

Developed and implemented HTML, JavaScript and .NET web pages.

Integrated with SonarQube for code coverage and Selenium for automated tests.

Migrated all projects from various version control tools (VSS, Git, ClearCase, SVN) to RTC.

Worked on IIS 7.0 and IIS 8.5, setting up websites for applications.

Experience in automated builds using TFS Team Build and CruiseControl.NET for .NET applications and Salesforce.

Created an automated application-testing framework for the CD pipeline leveraging Robot Framework integrated with Jenkins & Selenium that increased testing cycles.

Wrote Python scripts to automate deployment tasks and perform pre- and post-deployment tasks in Jenkins.
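
A small sketch of the kind of pre/post-deployment helper script a Jenkins job might call; the service name and health-check URL are hypothetical:

    import subprocess
    import sys
    import urllib.request

    SERVICE = "myapp"                              # hypothetical service name
    HEALTH_URL = "http://localhost:8080/health"    # hypothetical health endpoint

    def pre_deploy():
        # Stop the service before the new build is copied over.
        subprocess.run(["sudo", "systemctl", "stop", SERVICE], check=True)

    def post_deploy():
        # Start the service again and fail the Jenkins build if the health check fails.
        subprocess.run(["sudo", "systemctl", "start", SERVICE], check=True)
        with urllib.request.urlopen(HEALTH_URL, timeout=10) as resp:
            if resp.status != 200:
                sys.exit("post-deploy health check failed")

    if __name__ == "__main__":
        # Jenkins passes "pre" or "post" as the first argument.
        pre_deploy() if sys.argv[1] == "pre" else post_deploy()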

Subscribed to SQL Server Azure cloud services.

Configured and installed Git with TFS as VSTS.

Set up a branching strategy in Git with user-level access.

Environment: TFS 2008/2010 (Team Foundation Server), .NET, ASP, WebSphere, TestDirector, QTP, MS Visio, XML, HTML, IBM DB2, CICS, JCL, Oracle.

Novation Solutions, India

Title: Solaris/Linux System Administrator    May 2010 – April 2012

Roles & Responsibilities:

Extensive use of Veritas Volume Manager for disk management and file system management in the Sun Solaris environment.

Implemented the Jumpstart servers to automate the server builds for multiple profiles.

Installed and configured Veritas NetBackup on Sun servers.

Configured Puppet to provide centralized management.

Involved in the patching of several Solaris servers (both UFS and ZFS) with the latest patch clusters.

Customized user environments for users.

Handled user ID deletions, including removing entries from crontab.

Performed process automation and scheduled processes using cron jobs.

Managed cron jobs, batch processing and job scheduling.

Configured new devices online and offline, partitioned the disks, created new file systems, mounted and maintained them, and changed the /etc/vfstab entries.

Set up mount points on Solaris servers for Oracle database.

Monitored system performance: virtual memory, swap space management, disk utilization and CPU utilization.

Monitored client disk quotas & disk space usage.

System performance monitoring and tuning.

Monitored system logs.

Worked with other IT teams and managers to help build and implement systems and standards.

Responsible for documenting the projects conducted in the company for future reference.

Performed data management using native Solaris utilities for archiving and compression.

Environments: Solaris 8/9/10, CentOS 4/5, SUSE Linux 10.1/10.3, SPARC Enterprise M3000, M5000, M8000, T5220, T5240, Sun Fire V880, V490, V440, V240, V210, Dell (T100, T105, T200, R300), Apache 2.2, JBoss 4.2, Jumpstart, HP C-Blade system, BL460c, BL420c, Tivoli Storage Manager 5.5, MySQL, Veritas Cluster Server, Solaris Volume Management, WebSphere 6.1.


