SUSMITHA G

AWS DevOps Engineer

Email: ****************@*****.***

Mobile: 609-***-****

Professional Summary:

Around 8 years of hands-on experience supporting, automating, and optimizing mission-critical deployments in AWS, leveraging configuration management, CI/CD, and DevOps processes.

Experienced with principles and best practices of Software Configuration Management (SCM) in Agile, Scrum, and Waterfall methodologies.

Experience in automation, build/release engineering, and software development on cloud computing platforms such as Amazon Web Services (AWS).

Experience in designing, architecting, and implementing scalable cloud-based web applications using AWS.

Served as Scrum Master for teams, with a focus on guiding the teams toward improving the way they work and facilitating overall sprint planning, including daily stand-ups, grooming, reviews/demos, and retrospectives.

Experienced in the AWS Cloud platform and its features, which include EC2, VPC, EBS, AMI, SNS, RDS, CloudWatch, CloudTrail, CloudFormation, Kinesis, AWS Glue, AWS Config, Elastic Load Balancer, Auto Scaling, CloudFront, IAM, S3, Glacier, and Route 53.

Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT gateways to ensure successful deployment of web applications and database templates; expertise in architecting secure VPC solutions in AWS using network ACLs, security groups, and public and private network configurations.
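
For illustration, a minimal boto3 sketch of launching such a VPC template; the stack name, resource names, and CIDR ranges below are placeholders, not values from these projects:

    # Hypothetical sketch: launch a small VPC stack with CloudFormation via boto3.
    import json
    import boto3

    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AppVpc": {
                "Type": "AWS::EC2::VPC",
                "Properties": {"CidrBlock": "10.0.0.0/16", "EnableDnsSupport": True},
            },
            "PublicSubnet": {
                "Type": "AWS::EC2::Subnet",
                "Properties": {"VpcId": {"Ref": "AppVpc"}, "CidrBlock": "10.0.1.0/24"},
            },
            "PrivateSubnet": {
                "Type": "AWS::EC2::Subnet",
                "Properties": {"VpcId": {"Ref": "AppVpc"}, "CidrBlock": "10.0.2.0/24"},
            },
        },
    }

    cfn = boto3.client("cloudformation", region_name="us-east-1")
    cfn.create_stack(StackName="custom-vpc-demo", TemplateBody=json.dumps(template))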

Experience on AWS focusing on high availability, fault tolerance, and auto scaling using Terraform templates, along with CI/CD using AWS Lambda and AWS CodePipeline.

Utilized CloudWatch to monitor resources such as EC2 instances (CPU and memory), Amazon RDS services, and EBS volumes, to set alarms for notifications or automated actions, and to monitor logs for a better understanding and operation of the system.
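
A minimal sketch of one such alarm, assuming a placeholder instance ID and SNS topic ARN:

    # Hypothetical sketch: CloudWatch alarm on EC2 CPU that notifies an SNS topic.
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")
    cloudwatch.put_metric_alarm(
        AlarmName="ec2-cpu-high",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        Statistic="Average",
        Period=300,                # 5-minute data points
        EvaluationPeriods=2,       # two consecutive breaches before alarming
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:111122223333:ops-alerts"],
    )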

Implemented automation using Configuration Management tools like Ansible, Chef.

Experience writing Ansible playbooks and deploying applications using Ansible.

Knowledge of Akamai CDN and Cloudflare for CDN support and deployment, website caching, server load balancing, and maintenance methods.

Good knowledge and experience using Elasticsearch, CloudWatch, Nagios, Splunk, and Grafana for logging and monitoring.

Exposure to test data management tools like Delphix to create virtual production-like databases.

Designed and configured HashiCorp Vault roles and policies to secure the Kubernetes infrastructure.

Worked on the HashiCorp Vault secrets management tool to provide security for credentials, tokens, and API keys.

Worked on onboarding the applications to Gremlin Chaos engineering tests.

Experience working with container-based deployments using Docker images, Dockerfiles, and Docker Hub to link code repositories and to build and test images, and Docker Compose for defining and running multi-container applications.

Experience in Ansible configuration/deployment and writing Ansible playbooks to manage environment configuration files, packages, and users.

Set up AWS firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used Amazon CloudFront (content delivery network) to deliver content from AWS edge locations.
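
A small sketch of such an "allow" rule on an EC2 security group, assuming a placeholder group ID; security groups deny any traffic that is not explicitly allowed:

    # Hypothetical sketch: allow inbound HTTPS on a security group via boto3.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    ec2.authorize_security_group_ingress(
        GroupId="sg-0123456789abcdef0",
        IpPermissions=[
            {
                "IpProtocol": "tcp",
                "FromPort": 443,
                "ToPort": 443,
                "IpRanges": [{"CidrIp": "0.0.0.0/0", "Description": "public HTTPS"}],
            }
        ],
    )
    # Security groups are deny-by-default, so no explicit deny rule is needed.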

Implemented Chef recipes for build deployments on internal data center servers, and reused and modified the same recipes to deploy directly onto Amazon EC2 instances.

Developed build and deployment scripts using Maven as the build tool and automated the build and deploy processes using Jenkins to move from one environment to another.

Handled work from the initial stages of development: creating branches, having developers follow standards, creating build scripts, labelling, and automating the build and deploy processes using Jenkins plugins.

Experience using Nexus Repository Manager for Maven builds. High exposure to Remedy and JIRA defect tracking tools for tracking defects and changes for change management.

Configuration and maintenance of NFS, Samba, Sendmail, LDAP, DNS, DHCP, and networking with TCP/IP on Linux.

Imported and managed various corporate applications in GitHub repositories; managed Git, GitHub, Bitbucket, and SVN as source control systems, including SVN repository branching, merging, and tagging.

Committed, result-oriented, and hard-working, with a quest and zeal to learn new technologies and undertake challenging tasks.

Technical Skills:

Cloud Computing: Amazon AWS Cloud

DevOps Tools: Jenkins, Git, Maven, Nexus Repository, SonarQube, AppDynamics

Infrastructure as Code: Terraform

Configuration Management Tools: Chef, Ansible

Operating Systems: Linux, Windows

AWS Services: IAM, EC2, S3, RDS, SQS, SNS, CloudTrail, CloudWatch, EBS, VPC

Languages/Scripts: Python, JavaScript, Bash, CSS, HTML, C#

Container and Orchestration Tools: Kubernetes, Docker, AWS EKS

Ticketing and Tracking Tools: ServiceNow, Remedy, JIRA

Databases: MySQL, MS SQL Server 2008 R2, Oracle

SDLC: Agile, Scrum

CERTIFICATIONS

AWS Certified Solutions Architect – Associate

EDUCATION

University of Central Missouri, Warrensburg, MO, USA Dec 2015

Master’s in Computer Science

Jawaharlal Nehru Technological University, Kakinada, India May 2014

Bachelor’s in Information Technology

PROFESSIONAL EXPERIENCE:

Manhattan Strategy Group – Baltimore, MD May 2023 – Present

DevOps/AWS Cloud Engineer

Responsibilities:

Participated in designing architecture diagrams for multiple projects, such as the U.S. Department of Education Assistance for Arts Education (AAEC) and the National Charter School Resource Center (NCSRC).

Worked on configuration and administration of AWS services such as VPC, VPN, ELK, ELB, Auto Scaling, EC2, S3, CloudWatch, CloudTrail, CloudWatch Logs, AWS Inspector, RDS, NAT Gateway, Internet Gateway, bastion hosts, AWS WAF, Route 53, IAM, AWS Key Management Service, AWS Certificate Manager, and MariaDB.

Built S3 buckets and managed policies for S3 bucket and Glacier for storage and backup on AWS.

Created multiple CloudWatch log groups, such as Apache and Apache2 access logs, SQL logs, audit logs, syslogs, and auth logs, by installing the AWS CloudWatch Agent in both staging and production environments for the AAEC, NCSRC, NCELA, and HMRF projects.

Integrated AWS CloudWatch logs to Splunk by creating multiple dashboards and alerts.

Created email alerts and threshold values using Dynatrace for our environments.

Instrumented Dynatrace with AWS EC2 and Lambda functions.

Set up monitoring for applications running in the cloud using Dynatrace.

Experience with Splunk search and reporting modules, knowledge objects, administration, add-ons, dashboards, clustering, and forwarder management.

Installed and configured Linux with Apache and PHP, installed OpenSSL version updates and OpenSSH ciphers, and configured the Drupal application.

Built a VPC and established a site-to-site VPN connection between the data center and AWS.

Configured Direct Connect and VPN with AWS VPC.

Configured site-to-site VPN connections and Direct Connect for high-rate data transfer.

Created CloudWatch alarms for audit log failures, CPU utilization, disk utilization, and memory utilization, and sent SNS notifications to a Slack channel by triggering an AWS Lambda function.
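
A minimal sketch of the Lambda side of this flow, assuming a placeholder Slack incoming-webhook URL and the standard SNS event shape:

    # Hypothetical sketch: Lambda handler subscribed to the alarm SNS topic that
    # forwards the alarm message to a Slack incoming webhook (placeholder URL).
    import json
    import urllib.request

    SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

    def lambda_handler(event, context):
        # SNS delivers the CloudWatch alarm payload as a JSON string.
        alarm = json.loads(event["Records"][0]["Sns"]["Message"])
        text = (f"{alarm.get('AlarmName')} is {alarm.get('NewStateValue')}: "
                f"{alarm.get('NewStateReason')}")
        req = urllib.request.Request(
            SLACK_WEBHOOK_URL,
            data=json.dumps({"text": text}).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)
        return {"statusCode": 200}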

Configured multi-factor authentication in IAM to implement two-step authentication of user access using Microsoft Authenticator and AWS virtual MFA.
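
A small illustrative sketch of attaching a virtual MFA device to an IAM user; the device name, user name, and one-time codes are placeholders (the codes come from the authenticator app):

    # Hypothetical sketch: create and enable a virtual MFA device for an IAM user.
    import boto3

    iam = boto3.client("iam")
    device = iam.create_virtual_mfa_device(VirtualMFADeviceName="example-mfa")
    iam.enable_mfa_device(
        UserName="example-user",
        SerialNumber=device["VirtualMFADevice"]["SerialNumber"],
        AuthenticationCode1="123456",   # first code from the authenticator app
        AuthenticationCode2="654321",   # next consecutive code
    )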

Included security groups, network ACLs, Internet Gateways, and Elastic IPs to ensure a safe area for the organization in the AWS public cloud.

Designed AWS CloudFormation templates to launch VPCs, EC2, MariaDB, subnets, and NAT gateways to ensure successful deployment of web-based applications and database templates.

Responsible for monitoring AWS resources using CloudWatch and application resources using Nagios.

Performed branching, merging, and release activities on the Git version control tool.

Created automated pipelines in AWS CodePipeline to deploy Docker containers to AWS ECS using services such as CloudFormation, CodeBuild, CodeDeploy, S3, and Puppet.
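
For illustration, a sketch of the ECS rollout step such a pipeline might perform; the cluster, service, and image names below are placeholders:

    # Hypothetical sketch: register a task definition revision for a new image
    # and roll the ECS service to it.
    import boto3

    ecs = boto3.client("ecs", region_name="us-east-1")
    task_def = ecs.register_task_definition(
        family="web-app",
        containerDefinitions=[{
            "name": "web-app",
            "image": "111122223333.dkr.ecr.us-east-1.amazonaws.com/web-app:build-42",
            "memory": 512,
            "portMappings": [{"containerPort": 8080}],
        }],
    )
    ecs.update_service(
        cluster="prod-cluster",
        service="web-app-svc",
        taskDefinition=task_def["taskDefinition"]["taskDefinitionArn"],
    )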

Wrote automated scripts for creating resources in the OpenStack cloud using Python and Terraform modules.

Worked with Amazon Web Services (EC2, Elasticsearch, Route 53, Elastic Beanstalk, VPC, IaaS).

Configured Elastic Load Balancers and Auto Scaling groups to distribute traffic and to provide a cost-efficient, fault-tolerant, and highly available environment.

Implemented Kubernetes manifests and Helm charts for deploying microservices into the K8s cluster.

Created IAM users, roles, policies, groups, and permissions, and created access keys and secret keys to provide access to third-party users.

Created Linux users and, based on security scans, hardened the systems by following the AWS security checklist.

Created MariaDB and DynamoDB databases and ensured connections were established from bastion hosts, with security groups restricted to specific IP addresses.

Environment: AWS Cloud, Jenkins, JIRA, Confluence, EC2, IAM, ECS, ELK, S3, RDS, Direct Connect, Splunk, Dynatrace, CodeBuild, Nagios, Python, Terraform, ServiceNow, Git, EKS.

Barclays Bank (Capgemini) – Whippany, NJ Jan 2022 – May 2023

AWS DevOps Engineer

Responsibilities:

Responsible for designing, implementing, and supporting cloud-based infrastructure and its solutions.

Participated in the software development life cycle (SDLC) in Waterfall and Agile Scrum methodologies.

Acted as a Scrum Master for product team and managed Sprint planning meetings, Daily scrum, sprint review, product backlog refinement meetings and sprint retrospective meetings.

Held daily stand-up meetings with the team to track the backlog in the Scaled Agile Framework using Scrum.

Configured Delphix engines, managed datasets on Delphix engines, and created and refreshed virtual databases.

Implemented backup and recovery strategies and supported the Delphix database virtualization platform.

Onboarded applications to Gremlin chaos engineering tests by communicating with application teams to gather detailed information covering architecture, integrations, and the underlying platform and infrastructure.

Developed a Jenkins plugin to create, deploy, and update REST APIs.

Experience in implementing and configuring F5 Big-IP LTM and GTM load balancers.

Analyzed existing test cases, data, and environment details required to conduct resiliency (chaos) tests.

Built and maintained Docker container clusters managed by OpenShift.

Worked on OpenShift creating new projects and services for load balancing, adding them to routes to be accessible from outside, troubleshooting pods through SSH and logs, and modifying build configs, templates, etc.

Managed the OpenShift cluster for the QA04 and QA05 environments, including scaling the app nodes up and down.

Worked on the Chef configuration management tool, updating roles for different applications.

Managed, configured, and upgraded various tools used in the CI/CD process, such as Jenkins, Vault, Artifactory, Nexus, and Bitbucket Server.

Configured Elastic Load Balancers and Auto Scaling groups to distribute the traffic and to have a cost efficient, fault tolerant and highly available environment.

Worked with development/QA teams to troubleshoot issues related to infrastructure and applications.

Configured and administered Jenkins for continuous integration and deployment into Tomcat Application Server and to improve reusability for building pipelines.

Responsible for creating and submitting weekly sprint status reports.

Used JIRA to maintain the product backlog and sprint backlog, to create and track user stories, for sprint planning, tracking, and managing sprints, and to create Scrum and Kanban boards, status reports, and burndown charts.

Environment: Bitbucket, Jenkins, OpenShift Container Platform, HashiCorp Vault, Azure, Gremlin, AWS, Terraform, AppViewX, Maven, JIRA, Nexus Repository, Confluence, Delphix, ServiceNow.

Comcast Cable Corporation (Cognizant) - Philadelphia, PA March 2017 – Jan 2022

AWS Cloud/DevOps Engineer

Responsibilities:

Responsible for designing, implementing, and supporting cloud-based infrastructure and its solutions.

Managed, configured, and upgraded various tools used in the CI/CD process, such as Jenkins, Vault, Artifactory, Nexus, and Bitbucket Server.

Worked on AWS services such as VPC, EC2, S3, ELB, Autoscaling Groups (ASG), EBS, RDS, IAM, CloudFormation, Elastic Beanstalk, Route 53, CloudWatch, CloudFront, CloudTrail, API Gateway, Lambda, SNS & SQS.

Used Amazon RDS Multi-AZ for automatic failover and high availability at the database tier for heavy MySQL workloads.
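
A minimal provisioning sketch with Multi-AZ enabled; the identifier, instance class, and credentials are placeholders, not values from this engagement:

    # Hypothetical sketch: MySQL RDS instance with a synchronous Multi-AZ standby.
    import boto3

    rds = boto3.client("rds", region_name="us-east-1")
    rds.create_db_instance(
        DBInstanceIdentifier="app-mysql-prod",
        Engine="mysql",
        DBInstanceClass="db.m5.large",
        AllocatedStorage=100,
        MasterUsername="admin",
        MasterUserPassword="change-me-please",   # placeholder; store secrets securely
        MultiAZ=True,                            # standby in a second AZ for failover
    )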

Worked on AWS cloud infrastructure to maintain web servers on EC2 instances launched from AMIs behind an Elastic Load Balancer, with Auto Scaling to scale the servers up and down as required.

Used AWS CloudFront (content delivery network) to deliver content from AWS edge locations drastically improving user experience and latency.

Used CloudWatch Logs to move application logs to S3 and created alarms in conjunction with SNS to notify of resource usage and billing events.

Implemented AWS CodePipeline and created CloudFormation JSON templates and Terraform configurations for infrastructure as code.

Created alerts and monitoring dashboards using Grafana for all the microservices deployed in AWS.

Created multiple Terraform modules to manage configurations, applications, and services, and to automate the installation process for web servers and AWS instances.

Worked in all areas of Jenkins: setting up CI for new branches, build automation, plugin management, securing Jenkins, and setting up master/slave configurations.

Created snapshots for production releases that were well tested and had passed through the STAGE environment.

Wrote complex SQL Statements to validate data and ensure system integrity and security in Oracle DB.

Executed issue post-mortems for any hard stops in the offer flow through xfinity.com by verifying that products and services related to cable video, voice, data, and home security were configured properly, and updated the details in JIRA.

Configured offers using extensive analysis, considering multiple permutations and combinations, to ensure there was no financial impact.

Developed Stored Procedures, Functions, PL/SQL Queries, Indexes and Triggers for fetching Transaction details, Customer Details, and Product Configuration data.

Solved complex technical and data-related issues by debugging code, performing non-routine problem analysis, and applying advanced analytical methods as needed. Wrote complex SQL queries using joins and developed custom procedures and functions using PL/SQL to generate various reports to predict business trends.

Monitored production fallouts and triaged errors through testing and troubleshooting to resolve issues.

Worked on production incidents using ServiceNow and closed them within the SLA.

Performed issue identification, data analysis, and security analysis through Splunk.

Configured S3 versioning and lifecycle policies to back up files and archive them to Glacier.
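
A small sketch of this kind of configuration, assuming a placeholder bucket name, prefix, and transition window:

    # Hypothetical sketch: enable versioning and archive objects to Glacier
    # after 90 days with a lifecycle rule.
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_versioning(
        Bucket="example-backup-bucket",
        VersioningConfiguration={"Status": "Enabled"},
    )
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-backup-bucket",
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-old-backups",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            }]
        },
    )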

Set up monitoring for applications running in the cloud using AppDynamics.

Environment: Ansible, Maven, Docker, Kubernetes (EKS), ECS, JIRA, Nexus, JSON, Python, Azure, AWS (EC2, VPC, S3, RDS, Kinesis, CloudFormation, CloudWatch, CloudTrail, Route 53), Kanban, AppDynamics, Datadog, Splunk, Jenkins, Grafana, Bash.

Trizetto Corporation (Cognizant) - Denver, CO Oct 2016 - March 2017

DevOps /AWS Engineer

Responsibilities:

Good knowledge of the foundations of U.S. health care, how it works, and government programs such as Medicare and Medicaid.

Performed analysis across the different stages of the system development life cycle to support development and testing efforts, identify positive and negative trends, and formulate recommendations for process improvements and development standards.

Worked with the Facets Data Model Guides, which provide a complete set of Facets ERwin diagrams offering a high-level, logical view of the data in the Facets database; queried tables used for processing transactions and ran commands to manage batch jobs and services.

Worked on the Facets product foundations lab, creating subscriber/member, provider, NetworX, claims processing, and utilization management configurations; processed claims in pended as well as completed status.

Used SSIS (SQL Server Integration Services) to migrate data from different sources, such as MS Excel, CSV, flat files, and Oracle databases, to SQL Server databases.

Expertise in scripting and programming languages such as shell scripting and Python for automating day-to-day administration tasks on cloud platforms.

Created AWS Launch Configurations based on customized AMIs and used them to configure Auto Scaling groups.

Configured VPCs and secured them using multi-tier protection: security groups (at the instance level) and network access control lists (NACLs, at the subnet level).

Configured Auto Scaling policies to scale EC2 instances up or down based on ELB health checks and resource monitoring, and linked alarms to Auto Scaling events.
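
For illustration, a target-tracking sketch of one way to express such a policy (the original work may have used step or simple scaling instead); the group name and target value are placeholders:

    # Hypothetical sketch: keep the Auto Scaling group's average CPU near 60%.
    import boto3

    autoscaling = boto3.client("autoscaling", region_name="us-east-1")
    autoscaling.put_scaling_policy(
        AutoScalingGroupName="web-asg",
        PolicyName="cpu-target-60",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization"
            },
            "TargetValue": 60.0,
        },
    )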

Developed automation framework to deploy CloudFormation stacks.

Configured an AWS ALB to route traffic from the target group to the right targets on AWS ECS.

Built and deployed Docker images on AWS ECS and automated CI/CD pipelines.

Managed Amazon Web Services (AWS) infrastructure with automation and orchestration tools such as Chef and Ansible.

Worked with AWS Direct Connect and VPN CloudHub to provide secure connections between sites with multiple VPN connections.

Environment: Bash, JSON, EC2, S3, Glacier, RDS, VPC, Direct Connect, Route 53, CloudWatch, OpsWorks, IAM, WAF, SNS, ELB, AWS CloudFront, AWS AMIs

Techno Rocket Systems, Dallas, TX June 2016 - Oct 2016

Cloud DevOps Engineer

Responsibilities:

Built CI/CD pipelines using Git, Ant, Maven, and Jenkins for Java and middleware applications.

Involved in the software development life cycle (SDLC) of applications from the design phase through implementation, testing, deployment, and maintenance.

Migrated Jenkins distributed build system from local machines to AWS.

Created new jobs in Jenkins and managed build-related issues.

Created containerized build and test environments using Docker.

Set up static code analysis and code coverage to ensure code quality.

Worked with CI/CD principles according to organizational standards.

Configured Elastic Load Balancers with EC2 Auto scaling groups.

Involved in Designing and deploying AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancer and Auto scaling groups.

Worked on Docker container snapshots, attaching them to running containers, removing images, managing containers, and setting up environments for development and testing with redirection of ports and volumes.

Created automated Unit test plans and performed Unit testing modules according to the requirements and development standards.

Development and maintenance of structured and well documented code in C# using Visual Studio.

Environment: Jenkins, Shell Scripting, Docker, Maven, Git, JIRA, SVN, Ansible, Unix, Artifactory, and AWS Cloud.


