


Ravali Aleti

Email: ******.*********@*****.***

Contact: 302-***-****

LinkedIn: http://www.linkedin.com/in/ravalialeti

Senior AWS DevOps Engineer

AWS | Docker | Kubernetes | Jenkins | Terraform | Ansible | Ansible Tower | Azure DevOps | CI/CD | Pivotal Cloud Foundry (PCF) | UrbanCode Deploy | Git | Maven | ELK Stack | Splunk | Nagios | New Relic | ServiceNow | Jira | Remedy | VSTS | ClearQuest | RHEL | PXE/Kickstart Automation | Build & Release Management | Agile/Scrum

Professional Summary:

●Senior AWS Cloud & DevOps professional with 9+ years of experience in the IT industry, specializing in Cloud Management, Software Configuration Management, Continuous Integration, Continuous Deployment, Automation, and Build & Release Management.

●Strong experience in AWS provisioning and solid knowledge of AWS services such as EC2, S3, Glacier, ELB, RDS, VPC, Auto Scaling, EBS, and IAM.

●Proficient in managing Docker containers and images for runtime environments, with hands-on experience in containerization tools, Docker transitions, and developing distributed cloud systems using Kubernetes.

●Extensive experience in configuring and integrating servers across environments for automated provisioning and machine creation using configuration management tools like Ansible.

●Expertise in Continuous Integration (CI) and Continuous Deployment (CD) methodologies using Jenkins.

●Experienced in deploying production-grade Kubernetes environments, enabling reliable containerized workloads across private and public clouds. Managed Kubernetes charts with Helm, created reproducible builds, managed manifest files, and handled Helm package releases.

●Skilled in configuring Auto Scaling within customized VPCs, leveraging Elastic Load Balancer (ELB) traffic and health checks to trigger auto-scaling actions. Implemented Auto Scaling policies to adjust EC2 instances based on ELB health checks and set up alarms to drive scaling decisions (a sketch follows at the end of this summary). Knowledgeable in AWS ACM, including SSL certificate installation on various load balancers.

●Strong background in deploying applications on Pivotal Cloud Foundry (PCF) using CF push and UrbanCode Deploy, including PCF backup for all environments and Jenkins Maven build automation with uploads to PCF.

●Proficient in using Terraform for provisioning and managing cloud infrastructure through reusable modules.

●Experienced in configuring and setting up Ansible Tower, writing Ansible playbooks for software package installations, and managing web applications on Virtual Machines and AWS EC2 instances.

●Proficient in leveraging Azure DevOps, ARM templates, and Terraform to build scalable CI/CD pipelines and automate cloud infrastructure provisioning.

●Expertise in Continuous Integration for major releases using Jenkins, integrating Git and Maven plugins, creating pipeline jobs, and configuring global environment variables.

●Experienced with monitoring systems such as Nagios and New Relic, and various bug tracking tools including Remedy, Jira, VSTS, ServiceNow, and ClearQuest. Designed, deployed, and collaboratively enhanced the ELK platform, and have extensive experience with Splunk, including solution design, architecture, deployment, and configuration of Splunk components.

●Skilled in deploying and managing applications on Azure App Services, AKS, and Azure Functions, ensuring secure, reliable, and high-availability environments.

●Proficient in Red Hat Enterprise Linux (RHEL) using Kickstart and PXE on HP DL380 G3, performing OS installations, upgrades, and server patching through PXE & DHCP server configurations with Kickstart & Jumpstart scripts on Red Hat Linux 5.x, 6.x & 7.x.

●Worked as part of Agile Scrum teams, participating in daily scrum meetings.
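
To illustrate the Auto Scaling pattern described earlier in this summary (the bullet on ELB health checks and scaling policies), here is a minimal boto3 sketch; the group name, alarm name, and thresholds are illustrative assumptions rather than values from any specific engagement.

```python
# Hypothetical sketch: scale out an Auto Scaling group when CloudWatch reports high CPU.
# Group name, alarm name, and thresholds are placeholders.
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Simple scaling policy: add one instance each time the alarm fires.
policy = autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",            # assumed Auto Scaling group name
    PolicyName="scale-out-on-high-cpu",
    AdjustmentType="ChangeInCapacity",
    ScalingAdjustment=1,
    Cooldown=300,
)

# CloudWatch alarm that triggers the policy when average CPU exceeds 70% for 10 minutes.
cloudwatch.put_metric_alarm(
    AlarmName="web-asg-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "AutoScalingGroupName", "Value": "web-asg"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=70.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[policy["PolicyARN"]],
)
```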

Technical Skills:

●Programming/Scripting: SQL, PL/SQL, Python, Bash, Korn Shell, Perl, Ruby, YAML, Groovy.

●IaC: Terraform, CloudFormation.

●Databases: Oracle, MySQL, MongoDB.

●AWS Stack: EC2, RDS, VPC, S3, Route 53, SNS, SQS, CloudFront, EBS, ELB, CloudWatch, Elastic Beanstalk, CloudTrail.

●Cloud Platforms: OpenShift, OpenStack.

●Configuration Management: Puppet, Ansible, SaltStack, Chef.

●Version Control: Git, SVN, TFS, ClearCase.

●Build Tools: Maven, Ant, Gradle, Visual Studio.

●CI/CD: Jenkins, Hudson, Bamboo, TeamCity.

●Containers & Orchestration: Docker, Kubernetes, Mesos, Marathon.

●Monitoring & Logging: Nagios, Splunk, Elasticsearch, SonarQube, Selenium, CloudWatch.

●Web/App Servers: WebLogic, WebSphere, JBoss, Apache, Tomcat.

●Operating Systems: RHEL, Linux, Windows, macOS.

Education:

●Master of Science in Computer Science, University of Bridgeport, Bridgeport, CT, 2018

●Bachelor of Science in Information Systems, JNTUH, Hyderabad, India, 2016

Certifications:

●AWS Certified Cloud Practitioner: Learned AWS core services (EC2, S3, RDS, IAM), cloud architecture, security best practices, and cost management fundamentals.

●AWS Certified Developer – Associate: Gained hands-on experience with AWS Lambda, EC2, DynamoDB, SQS, SNS, CloudFormation, and CloudWatch for building and managing scalable applications.

●AWS Certified AI Practitioner (In Progress): Learned core AI and ML concepts and AWS AI services like SageMaker, Rekognition, Comprehend, Polly, Lex, and Transcribe for building intelligent applications.

●Google Cybersecurity Professional Certificate: Completed an intensive, hands-on cybersecurity training program focused on foundational security principles, including network security, risk assessment, threat detection, incident response, and vulnerability management. Gained practical experience using industry-standard tools such as Wireshark for network analysis, Splunk and SIEM platforms for security information and event management, and scripting in Python and Bash to automate security tasks. Developed skills in operating system security (Linux), database security (SQL), and applying best practices to protect IT infrastructures from cyber threats.

●CompTIA Security+ (In Progress)

●AWS Certified Solutions Architect – Associate (In Progress): Acquired expertise in designing secure, scalable, and highly available cloud architectures using AWS. Gained practical experience with services including EC2, S3, RDS, VPC, IAM, ELB, Auto Scaling, CloudFront, Route 53, Lambda, CloudFormation, and CloudWatch. Focused on high availability, fault tolerance, cost optimization, and applying the AWS Well-Architected Framework.

Project Experience:

Client: Travelport, Wilmington, DE July 2024 – Present

Role: Senior AWS DevOps Engineer

Project Description: Working on cloud migration and automation initiatives for Travelport by designing and implementing scalable AWS cloud infrastructure, CI/CD pipelines, and containerized microservices solutions. Focused on optimizing DevOps processes using AWS services, Kubernetes, Ansible, and Jenkins for high availability and faster deployments.

Responsibilities:

●Worked with prominent AWS services such as CloudWatch, CloudTrail, CloudFormation, Kinesis, and CloudFront; built continuous integration and continuous delivery workflows with AWS CodePipeline, CodeBuild, and CodeDeploy; and used ECS and AWS Lambda to build and deploy a microservices architecture.

●Configured AWS Identity and Access Management (IAM) groups and users for improved login authentication, attached policies to groups using the policy generator, and set permissions based on requirements, referencing resources by Amazon Resource Name (ARN).

●Installed and configured Apache HTTP Server, Tomcat application servers, and Jetty.

●Ran Docker images on Amazon EC2, created Dockerfiles, used Nexus Repository as a private Docker registry for Docker images, and clustered the containers using Docker Swarm and Kubernetes.

●Used Jenkins pipelines to drive all microservice builds to the Docker registry and then deploy them to Kubernetes.

●Implemented build frameworks for new projects using Maven and resolved conflicts related to merging source code in Git.

●Used the Python Boto framework and CloudFormation to automate AWS environment creation and deployment, with AWS CLI build scripts and automated solutions written in Shell and Python (a CloudFormation sketch follows this list).

●Experienced in programming Ansible Playbooks with Python for system administration, managing configurations of VMware Nodes and testing Playbooks.

●Created AWS Lambda functions to run Python scripts and assigned roles to them, and created Lambda jobs and configured roles using the AWS CLI.

●Managed Ansible Playbooks with Ansible modules, implemented CD automation using Ansible, managing existing servers and automation of build/configuration of new servers.

●Worked with Chef Enterprise and Chef Open Source, ChefDK, Chef Workstation, Chef Server, and Chef Client. Experienced with the Knife command-line utility, the Berkshelf dependency manager, and Test Kitchen for validating Chef cookbooks.

●Automated infrastructure creation, deployment, and recovery using Ansible, Docker, and Jenkins.

●Designed an ELK system to monitor and search enterprise alerts. Installed, configured, and managed the ELK Stack for log management, running Elasticsearch on EC2 behind an Elastic Load Balancer. Monitored application performance and analyzed log information using ELK (Elasticsearch, Logstash, Kibana) and EFK.

●Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.

●Migrated the existing Linux environment to AWS by creating and executing a migration plan: deployed EC2 instances in a VPC, configured security groups and NACLs, and attached instance profiles and roles using AWS CloudFormation templates and Ansible modules.

●Implemented CI/CD pipeline using Jenkins, Ansible Playbooks and Ansible Tower.

●Managed Docker orchestration and containerization using Kubernetes. Implemented multi-tier application provisioning in the OpenStack cloud, integrating it with Ansible and migrating the application using Maven as the build tool.

●Created proofs of concept to evaluate Docker, Kubernetes, Cassandra, Bamboo, Jenkins, and Splunk.

●Used Kubernetes to orchestrate the deployment, scaling and management of Docker Containers.

●Worked on testing, evaluating, and troubleshooting MongoDB and Cassandra NoSQL database systems and cluster configurations to ensure high availability in various crash scenarios.

●Deployed applications to Oracle WebLogic, JBoss, Apache Tomcat, Nginx, and WebSphere servers, and worked on Logical Volume Manager (LVM), Veritas Volume Manager, Kickstart, bonding, LAMP, and LDAP.

●Built and engineered servers on Ubuntu and RHEL Linux. Provisioned virtual servers on VMware and ESX servers in Cloud.

●Orchestrated blue-green deployments with Kubernetes to roll out servers with the intended changes, using the load balancer to route traffic to the new version. Set up auto scaling with the Horizontal Pod Autoscaler to automatically handle production service load that changes over time (an autoscaling sketch follows this list).

●Used IAM to create new accounts, roles and groups, policies and permissions.

●Used Packer to automate the build process for machine images and utilized Vault’s AWS secrets engine to generate dynamic, on-demand AWS access credentials for Packer AMI builds.
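
A minimal sketch of the Boto-driven CloudFormation automation referenced in this list, assuming a local template file and a placeholder stack name and parameters (none of these names come from the actual project):

```python
# Hypothetical sketch: create a CloudFormation stack from a local template with boto3.
# Stack name, template file, and parameter values are placeholders.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

with open("vpc-environment.yaml") as f:        # assumed template file
    template_body = f.read()

cfn.create_stack(
    StackName="dev-environment",               # assumed stack name
    TemplateBody=template_body,
    Parameters=[
        {"ParameterKey": "InstanceType", "ParameterValue": "t3.medium"},
    ],
    Capabilities=["CAPABILITY_NAMED_IAM"],      # required when the template creates IAM resources
)

# Block until creation finishes, then report the final stack status.
cfn.get_waiter("stack_create_complete").wait(StackName="dev-environment")
print(cfn.describe_stacks(StackName="dev-environment")["Stacks"][0]["StackStatus"])
```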
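
And a minimal sketch of the Horizontal Pod Autoscaler used alongside the blue-green deployments, written with the Kubernetes Python client; the namespace, deployment name, and replica limits are assumptions for illustration:

```python
# Hypothetical sketch: attach a CPU-based Horizontal Pod Autoscaler to the "green" deployment.
# Namespace, deployment name, and limits are placeholders.
from kubernetes import client, config

config.load_kube_config()                      # or load_incluster_config() inside the cluster

hpa = client.V1HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name="web-green-hpa"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="web-green"
        ),
        min_replicas=2,
        max_replicas=10,
        target_cpu_utilization_percentage=70,  # scale out above 70% average CPU
    ),
)

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="production", body=hpa
)
```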

Environment: Docker, Chef, Jenkins, CI/CD, Maven, Git, MongoDB, AWS (EC2, S3, Lambda, Auto Scaling, CloudWatch, CloudFormation, Security Groups, DynamoDB), ELK Stack, Ansible, SonarQube, Nexus, Cassandra, NoSQL, Ubuntu, Linux, VMware servers, Tomcat, Kubernetes, Shell, Groovy, Bash, Python, AppDynamics, Dynatrace.

Client: Commonwealth of Massachusetts, Quincy, MA (Remote) Mar 2022 – July 2024

Role: AWS DevOps Engineer

Project Description: Worked with the Commonwealth of Massachusetts on automating infrastructure provisioning, CI/CD pipelines, and containerized deployments using AWS, Kubernetes, Docker, and Ansible. Focused on enabling cloud-native solutions, serverless architectures, and configuration management to streamline application delivery and improve scalability.

Responsibilities:

●Developed automation scripts in core Python and used Puppet to deploy and manage Java applications across Linux servers.

●Developed a deployment management system for Docker containers on AWS Elastic Container Service (ECS).

●Created Python scripts to automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, and S3 buckets.

●Used Docker and Kubernetes to manage microservices in support of continuous integration and continuous delivery.

●Developed Ansible playbooks from scratch to automate deployments, software and services configuration, server patching and configuring services on cloud.

●Worked on several Docker components, including Docker Engine, Docker Hub, Docker Compose, Docker Registry, and Docker Swarm.

●Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets (a Lambda sketch follows this list).

●Worked on AWS, deploying EC2 instances running platforms such as RHEL, CentOS, and Ubuntu Linux, as well as Windows.

●Set up a Bitbucket code repository for mobile development and integrated it with the new Active Directory.

●Responsible for automated identification of application server and database server using Ansible Scripts.

●Automated various infrastructure activities such as continuous deployment, application server setup, and stack monitoring using Ansible playbooks, and integrated Ansible with Jenkins.

●Involved in provisioning and automating servers on public clouds such as AWS and on Kubernetes.

●Used Maven as the build tool on Java projects to produce build artifacts from the source code.

●Developed Chef recipes in Ruby to configure, deploy, and maintain software components of the existing infrastructure in the cloud, and bootstrapped Chef client nodes.

●Implemented Kubernetes to deploy, scale, load balance, and manage Docker containers with multiple namespaced versions.

●Worked on the Splunk logging driver to send container logs to the HTTP Event Collector in Splunk Enterprise.

●Evaluated Chef recipes using the concept of Test-Driven Development for Infrastructure as Code.

●Developed Puppet modules for automation using a combination of Puppet Master, Git Enterprise, OpenStack (Horizon), Vagrant, and SimpleUI (Jenkins).

●Converted numerous existing Java projects to a single deployment method using ECS Docker Containers.

●Developed build and deployment scripts using Ant as the build tool in Jenkins to move builds from one environment to another.

●Utilized Kubernetes and Docker as the runtime environment for the CI/CD system to build, test, and deploy.

●Developed microservice onboarding tools leveraging Python and Jenkins, allowing for easy creation and maintenance of build jobs and Kubernetes deployments and services (a Jenkins sketch follows this list).

●Developed Puppet modules to automate deployment, configuration, and lifecycle management of key clusters.

●Used Ansible for configuration management and deployed all the services on to the cloud using Ansible.

●Automated application delivery using Chef and the UrbanCode Deploy tool suite.

●Set up continuous integration and formal builds using Bamboo with an Artifactory repository.

●Analyzed AWS-based products for defects and enhanced automated testing to prevent regressions.

●Implemented Docker containers, created the clients' respective Docker images, and leveraged Apache Mesos and Aurora to manage cluster hosts for applications.
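
A minimal sketch of the serverless pattern referenced in this list (API Gateway proxying to Lambda, which writes to DynamoDB); the table name and payload shape are assumptions for illustration:

```python
# Hypothetical sketch: Lambda handler behind an API Gateway proxy integration
# that persists the request body to DynamoDB. Table name and fields are placeholders.
import json
import uuid

import boto3

table = boto3.resource("dynamodb").Table("orders")   # assumed table name


def handler(event, context):
    """Store the JSON request body as a new DynamoDB item and return its id."""
    body = json.loads(event.get("body") or "{}")
    item = {"id": str(uuid.uuid4()), **body}
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"id": item["id"]}),
    }
```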
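
And a rough sketch of the Python-and-Jenkins onboarding tooling mentioned above, using the python-jenkins library; the server URL, credentials, job name, and parameters are placeholders, and real tooling would supply a full job configuration instead of the empty template:

```python
# Hypothetical sketch: create a Jenkins job if it is missing and queue a parameterized build.
# URL, credentials, job name, and parameters are placeholders.
import jenkins

server = jenkins.Jenkins(
    "https://jenkins.example.com",     # assumed Jenkins URL
    username="onboarding-bot",         # assumed service account
    password="api-token",              # assumed API token
)

job_name = "team-service-build"

# Create the job from the library's empty template; real tooling would generate config.xml.
if not server.job_exists(job_name):
    server.create_job(job_name, jenkins.EMPTY_CONFIG_XML)

# Queue a build against a branch parameter and report the next build number.
server.build_job(job_name, {"BRANCH": "main"})
info = server.get_job_info(job_name)
print(info["name"], "next build:", info["nextBuildNumber"])
```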

Environment: Jenkins, Ansible, Maven, Docker, Kubernetes, PCF, Git, LINQ, JSON, Java, LAMP, Splunk, SOAP UI, jQuery, Bootstrap, UDeploy, Windows, AWS (EC2, S3, VPC, CloudWatch, NACL, Route 53, IAM, SQS, SNS, SES), Apache servers, Linux servers.

Client: Union Bank, Los Angeles, CA May 2020 – Feb 2022

Role: DevOps Engineer

Responsibilities:

●Automated AWS components such as EC2 instances, security groups, ELB, RDS, and IAM through AWS CloudFormation templates.

●Experience in designing and deploying AWS Solutions using EC2, S3, and EBS, Elastic Load balancer (ELB), auto-scaling groups and OpsWorks.

●Used a microservices architecture with Spring Boot-based services exposed through REST.

●Experience in creating alarms and notifications for EC2 instances using CloudWatch.

●Created a Lambda function to automate snapshot backups on AWS and set up the scheduled backup (a sketch follows this list).

●Utilized Camel to integrate microservices with other microservices and RabbitMQ messaging exchanges.

●Worked on creating the Docker containers and Docker consoles for managing the application life cycle.

●Experience with Elasticsearch, Logstash & Kibana stacks.

●Used Chef to manage Web Applications, Config Files, Database, Commands, Users, Mount Points, and Packages.

●Implemented a Continuous Delivery pipeline with Docker, Jenkins, GitHub, and AWS AMIs.

●Have experience working with Docker and Docker Hub: pulling images from Docker Hub, running containers based on an image, and creating instances/systems with Chef automation; wrote recipes, tools, shell scripts, and monitoring checks.

●Created fully automated CI build and deployment infrastructure and processes for multiple projects.

●Developed scripts for build, deployment, maintenance, and related tasks using Jenkins, Docker, Maven, Python, and Bash.

●Experience writing and managing Chef scripts and automating Linux deployments using Chef.

●Worked with DynamoDB to support ordering based on the location of mobile users and ordering trends.

●Worked closely with developers and managers to resolve the issues that were risen during the deployments in different environments.
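
A minimal sketch of the scheduled snapshot-backup Lambda referenced in this list; the tag filter and region are assumptions, and the function would be triggered by a CloudWatch Events/EventBridge schedule:

```python
# Hypothetical sketch: Lambda handler that snapshots every EBS volume tagged Backup=true.
# Tag key/value and region are placeholders.
import datetime

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")


def handler(event, context):
    """Create a dated snapshot for each volume carrying the backup tag."""
    volumes = ec2.describe_volumes(
        Filters=[{"Name": "tag:Backup", "Values": ["true"]}]
    )["Volumes"]

    for volume in volumes:
        ec2.create_snapshot(
            VolumeId=volume["VolumeId"],
            Description=f"Automated backup {datetime.date.today().isoformat()}",
        )

    return {"snapshots_created": len(volumes)}
```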

Environment: AWS EC2, S3, CloudFormation, DynamoDB, Kinesis, VPC, IAM, Apache Tomcat, Microservices, CloudWatch, New Relic, Git, Linux, ELK Stack, Jenkins, Maven, Ansible, JVM, etc.

Client: Comcast, Mt Laurel, NJ Jan 2018 – Apr 2020

Role: DevOps/Build and Release Engineer

Responsibilities:

●Designed and implemented infrastructure as code (IaC) with ARM templates, Bicep, and Terraform for consistent and scalable Azure resource provisioning.

●Created multi-branch pipelines and shared pipeline libraries that can be used by other jobs, and created artifacts and fingerprints of the build jobs.

●Worked on a portal for triggering builds and releasing them to stakeholders, shaped by the pain points of developers and QA engineers.

●Integrated Azure Repos, GitHub, and third-party tools into CI/CD workflows to streamline code builds, testing, and deployments.

●Knowledge of Docker Swarm orchestration, networking, security, storage, and volumes.

●Configured and optimized Azure App Services, AKS, and Azure Functions deployments with zero-downtime release strategies (Blue-Green/Canary).

●Created release notes based on the contents of builds and published build artifacts to stakeholders.

●Monitored and improved pipeline performance using Azure Monitor, Application Insights, and Log Analytics, ensuring secure and reliable releases.

●Continuous Integration: implemented and promoted the use of Jenkins within the developer community and validated Jenkins alongside Bamboo.

●Managed end-to-end build and release processes using Azure DevOps (ADO) pipelines, ensuring smooth CI/CD automation across multiple environments (a sketch follows this list).

●Implemented Chef to deploy the builds for Dev, QA and production.

●Expert in Docker with strong experience in multi-stage builds for Dev/Test/Prod using Docker Compose and docker stack deploy.

●Integrated Jenkins with GitHub for continuous integration and deployment of the code: by enabling Git hooks, build jobs are created automatically once the dev team makes changes to the code.
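
A minimal sketch of queuing an Azure DevOps pipeline run programmatically, related to the ADO pipeline work in this list; the organization, project, pipeline id, and personal access token are placeholders, and the call uses the public Pipelines REST API rather than any project-specific tooling:

```python
# Hypothetical sketch: queue an Azure DevOps pipeline run via the REST API.
# Organization, project, pipeline id, branch, and PAT are placeholders.
import requests

ORG = "my-org"
PROJECT = "my-project"
PIPELINE_ID = 42                       # assumed pipeline id
PAT = "personal-access-token"          # assumed PAT with Build (read & execute) scope

url = (
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/pipelines/{PIPELINE_ID}/runs"
    "?api-version=7.0"
)

# The PAT is sent via basic auth with an empty username; the body pins the branch to run.
response = requests.post(
    url,
    auth=("", PAT),
    json={"resources": {"repositories": {"self": {"refName": "refs/heads/main"}}}},
    timeout=30,
)
response.raise_for_status()
print("Queued run id:", response.json()["id"])
```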

Environment: Chef, Oracle, Jenkins, MS Azure, Java, Eclipse, Tomcat, Apache, Python, JIRA, Maven, Git, Windows

Client: Hyundai Auto Ever America, Fountain Valley CA Aug 2016 – Dec 2017

Role: System Admin

Responsibilities:

●Worked on RHEL 6.x and Sun Solaris 10/9.0/8.0 VM builds, handling installation, upgrades, mirroring, and configuration with Kickstart and Jumpstart installations respectively.

●Participated in the development and implementation of network-related procedures and standards, and configured DHCP and FTP servers in Linux.

●Administered and configured volume operation using vxdisk, vxdg, vxassist, vxmake and vxvol in VERITAS Volume Manager/VERITAS File System.

●Worked in setting up LDAP, DNS, DHCP, NFS, NIS Server along with effective group and System Level policies in Red Hat Linux, and Sun Solaris.

●Set up roaming profile features by using Samba and NFS servers.

●Handled network related services like FTP, NFS, Samba, TCP/IP in Red Hat and Sun Solaris environment.

●Involved in documenting Linux and Windows environment and configuration details including documentation of solutions for any issues that have not been discovered previously.

Environment: RHEL 6.x, Sun Solaris 10/9.0/8.0, VERITAS Volume Manager, VxFS file system, VERITAS NetBackup, Samba, Sun SPARC 1000, Perl, Shell Scripting, crontab/at, WebLogic 8.1, Vim editor, networking servers.


