AWS Developer / DevOps Engineer

Location: Marietta, GA

Pavithra Peddanaggari

+1-470-***-****

advxpk@r.postjobfree.com

linkedin.com/in/pavithra-reddy-53b4b0203/

Professional Summary:

AWS developer with around 8 years of professional experience focused on automation and optimization. Understands and manages the space between operations and development to deliver code to customers quickly. Experienced with cloud platforms as well as DevOps automation development. Seeking a cloud-focused position where I can contribute my technical knowledge.

●Proficient in the AWS Cloud platform and its services, including EC2, VPC, EBS, AMI, RDS, Lambda, CloudWatch, CloudTrail, CloudFormation, Auto Scaling, CloudFront, IAM, S3 and Route 53 (a minimal boto3 provisioning sketch follows this list).

●Worked on deployments with immutable infrastructure built and tested using CI/CD. Implemented Amazon EC2, setting up instances, virtual private clouds (VPCs) and security groups.

●Hands-on experience building AWS solutions with CloudFormation templates and launch configurations to automate repeatable provisioning of AWS resources for applications.

●Used Amazon ECS as a container management service to run microservices on a managed cluster of EC2 instances. Implemented Amazon API Gateway as the entry point for all APIs.

●Knowledge of SaaS, PaaS and IaaS concepts of cloud computing architecture and implementation using AWS, OpenStack, OpenShift, Pivotal Cloud Foundry (PCF) and Azure.

●Expertise in designing Google Cloud architecture that follows financial regulations from a security point of view, and in several GCP services focused on security, Kubernetes and BigQuery.

●Expertise in development and enhancement of OpenStack components such as Nova, Neutron, Designate, Cinder, Swift, Glance (Image), Horizon, Keystone (Identity) and Ceilometer.

●Managed a Google BigQuery instance for the data warehouse, using Google Dataprep and Google Cloud Storage as the staging platform. Deployed the application using Google App Engine.

●Responsible for building scalable distributed data solutions in both batch and streaming mode on Google BigQuery using Kafka, Spark and Core Java.

●Converted existing Terraform modules that had version conflicts to use CloudFormation templates during deployments, worked with Terraform to create stacks in AWS, and updated Terraform scripts regularly as requirements changed.

●Expertise in integrating Terraform with Ansible and Packer to create and version AWS infrastructure, and in designing, automating, implementing and sustaining Amazon Machine Images (AMIs) across the AWS Cloud environment.

●Experience setting up Chef infrastructure, bootstrapping nodes, creating and uploading recipes, and converging nodes in Chef SCM.

●Experience using Chef for server provisioning and infrastructure automation, release automation and deployment automation, and for configuring files, commands and packages.

●Experience working with Ansible playbooks to automate various deployment tasks, and working knowledge of Ansible roles, inventory files and Ansible Galaxy.

●Experienced with Docker Hub, Docker Swarm and Docker Compose, creating Docker images and handling multiple images, primarily for middleware installations and domain configuration.

●Experience running Jenkins in a Docker container with EC2 agents in the AWS cloud environment; also familiar with surrounding technologies such as Mesos (Mesosphere) and Kubernetes.

●Experienced in using Jenkins pipelines to drive all microservice builds out to the Docker registry and deploy them to Kubernetes; created and managed pods with Kubernetes.

●Experience in branching, merging, tagging and maintaining version control and source code management tools such as Git, SVN and Bitbucket on Linux and Windows platforms.

●Expertise in using build automation tools and continuous integration concepts with tools such as Gradle, Ant, Jenkins, TeamCity, QuickBuild, Build Forge and Maven.

●Architected Jenkins build pipelines in environments such as RHEL, CentOS and Windows to build and deploy Java applications.

●Knowledge of Amazon Lex, Cognito and Polly.

●Proficient with Shell, Python, Ruby, Perl, PowerShell and Groovy scripting, as well as JSON and YAML.

●Experienced with OpenStack components such as Keystone, Swift, Nova, Cinder and Glance.

●Experience using web servers such as Apache HTTP Server, Tomcat, Nginx and IIS, and application servers such as IBM WebSphere, Oracle WebLogic and JBoss for deployment.

●Hands-on experience with system health and performance monitoring tools such as Nagios, Splunk, CloudWatch, New Relic and AppDynamics.

●Experience working with SQL databases such as MySQL, Oracle and SQL Server, and NoSQL databases such as MongoDB, DynamoDB and Cassandra.
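
The provisioning bullets above describe automating EC2 and related AWS resources. As an illustration only, the minimal boto3 sketch below launches a single tagged EC2 instance; the region, AMI ID and tag values are placeholders, not details taken from this resume.

```python
# Illustrative sketch (placeholders throughout): launch one tagged EC2 instance
# with boto3, assuming AWS credentials and a default VPC are already configured.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")  # assumed region

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Environment", "Value": "dev"}],
    }],
)
print("Launched:", instances[0].id)
```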

TECHNICAL SKILLS:

Cloud Platforms

AWS, OpenStack, GCP

Configuration Management Tools

Ansible, Chef, Puppet

CI /CD Tools

Jenkins, Bamboo, GitLab

Build Tools

Maven, ANT

Containerization Tools

Docker, Docker Swarm, Kubernetes, Mesos, OpenShift, AWS ECS

Version Control Tools

Git, GitHub, Bitbucket, SVN, GitLab

Logging & Monitoring Tools

Nagios, Splunk, ELK, CloudWatch, Azure Monitor, Prometheus, New Relic

Scripting & Programming Languages

Shell scripting, Ruby, C, C++, XML, JavaScript, PL/SQL, Java/J2EE, HTML, Perl, PowerShell, Python, .NET

Databases

MySQL, MS SQL Server, Oracle, DynamoDB, Cassandra, MongoDB

Application/Web Servers

WebLogic, WebSphere, Apache Tomcat, Nginx, Oracle Application Server

Operating Systems

UNIX, Linux, Windows, Solaris, CentOS, Ubuntu and RHEL

Virtualization Platforms

Oracle VirtualBox, VMware Workstation, Vagrant, VMware vSphere ESXi 5.x/4.x, ESX 3.x, Hyper-V

Bug Tracking Tools

JIRA, Bugzilla, Remedy, HP Quality Center, IBM ClearQuest, Mingle

Repositories

Artifactory, Nexus

Web Technologies

HTML, CSS, JavaScript, jQuery, Bootstrap, XML, JSON, XSD, XSL, XPath

Professional Experience:

Truist Dec’23 – Present

Role: AWS Analyst/AWS Developer

Responsibilities:

Designed and set up an enterprise data lake to support various use cases, including analytics, processing, storing and reporting of voluminous, rapidly changing data.

Ingested data from LXN/IFMX sources using a cloud data-movement framework, enabling fraud analytics teams to access a common view of customer profiles, accounts, activities and events and effectively reduce fraud events.

Designed and developed a security framework to provide fine-grained access to objects in Amazon S3 using AWS Lambda and DynamoDB.

Implemented AWS Step Functions to automate and orchestrate Amazon SageMaker tasks such as publishing data to S3.

Responsible for maintaining quality reference data at the source by performing operations such as cleansing and transformation and by ensuring integrity in a relational environment, working closely with stakeholders and the solution architect.

Designed and executed on-premises-to-AWS cloud migration projects for Truist Financial Corp.

Set up and worked with Kerberos authentication principals to establish secure network communication on the cluster, and tested HDFS, Hive and MapReduce cluster access for new users across AWS services such as Amazon EMR, Redshift and S3.

Used Amazon EMR to transform and move large amounts of data into and out of other AWS data stores and databases such as Amazon Simple Storage Service (S3) and Amazon DynamoDB.

Created Lambda functions with Boto3 to deregister unused AMIs in all application regions to reduce EC2 costs (see the boto3 sketch at the end of this section).

Developed a reusable framework, leveraged for future migrations, that automates ETL from RDBMS systems to the data lake using Spark data sources and Hive data objects.

Conducted data blending and data preparation using Snowflake for Tableau consumption and published data sources to Tableau Server.

Integrated Apache Airflow with AWS to monitor multi-stage ML workflows with tasks running on Amazon SageMaker.

Set up AWS DataSync to securely move data from on-premises systems to the cloud environment on a scheduled basis.

Created and used tools to monitor applications and services in the cloud, including system health indicators, trend identification and anomaly detection.

Developed centralized AWS logging solutions for security teams to consolidate and analyze logs for incident detection, using CloudTrail, VPC Flow Logs, S3 and CloudWatch logs, and EC2 server logs.

Wrote Terraform modules to build out the entire AWS infrastructure.

Expertise in key Terraform features such as infrastructure as code, execution plans and change automation; extensively used Auto Scaling launch configuration templates for launching AWS EC2 instances.

Expertise in configuring log monitoring tools such as the ELK stack and Splunk to monitor application logs in production and testing.

Experience with Informatica to extract data from the EDL, convert it into cloud-compatible files and place it in the target location with the help of ESP.

Environment: DataSync, Digital Edge, Amazon Connect, CloudWatch, CloudTrail, EventBridge, S3, Step Functions, Lambda, Glue, Amazon Macie, Glue Crawler, Lake Formation, Athena, Snowflake, Tableau, EC2, Qualis, TurboT, Splunk, Informatica
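
The AMI clean-up item in this section (Lambda functions with Boto3 that deregister unused AMIs to cut EC2 costs) could look roughly like the sketch below. This is a simplified, hypothetical version: it treats an AMI as unused only if no current instance references it, ignores pagination, and does not delete backing snapshots.

```python
# Hypothetical sketch of the AMI clean-up Lambda described above (simplified).
import boto3

def lambda_handler(event, context):
    ec2 = boto3.client("ec2")

    # All AMIs owned by this account in the current region
    images = ec2.describe_images(Owners=["self"])["Images"]

    # AMIs currently referenced by instances (any state)
    in_use = set()
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            in_use.add(instance["ImageId"])

    deregistered = []
    for image in images:
        if image["ImageId"] not in in_use:
            ec2.deregister_image(ImageId=image["ImageId"])
            deregistered.append(image["ImageId"])

    return {"deregistered": deregistered}
```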

Chamberlain Group, Oak Brook, IL Dec’21 – Nov’22

Role: AWS Developer

Responsibilities:

Performed automated deployments on AWS by creating IAM roles, used the CodePipeline plugin to integrate Jenkins with AWS, and created EC2 instances to provide virtual servers.

Wrote Lambda functions in Python and invoked Python scripts for data transformations and analytics on large data sets in EMR clusters and Amazon Kinesis data streams (see the sketch at the end of this section).

Used Ansible Tower, which provides an easy-to-use dashboard and role-based access control, making it easier to give individual teams access to Ansible for their deployments.

Wrote Python scripts to fully automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, and application configuration; the scripts create stacks and single servers, or join web servers to existing stacks.

Used Jenkins pipelines to drive all microservice builds out to the Docker registry and deploy them to Kubernetes; created and managed pods with Kubernetes.

Built and maintained Docker container clusters managed by Kubernetes using Linux, Bash, Git and Docker. Utilized Kubernetes and Docker as the runtime environment of the CI/CD system to build, test and deploy.

Good experience creating serverless chatbots using Amazon Lex.

Strong understanding of the AWS product and service suite, primarily EC2, S3, VPC, Lambda, Redshift, Redshift Spectrum, Athena, EMR (Hadoop) and related monitoring services, along with their applicable use cases, best practices, implementation and support considerations.

Automated Java builds with Maven and implemented multiple plugins for code analysis, JUnit, code coverage, PMD and SonarQube. Installed and administered an Artifactory repository to deploy the artifacts generated by Maven.

Monitored and tuned MongoDB performance; experience scaling MongoDB across data centers.

Upgraded CI/CD tools such as Jenkins and JIRA following the SDLC process, supported software patch upgrades, and worked with vendors to drive issues to completion.

Good knowledge of MongoDB tools such as Atlas and Cloud Manager, deployment of MongoDB on Docker containers and AWS Elastic Container Service, and building MongoDB applications for microservices.

Implemented client-side validations using jQuery and ASP.NET MVC validations at the controller level.

Developed user interfaces using HTML, Bootstrap, CSS, JavaScript and AngularJS.

Used Spring Framework features such as Spring IoC, Spring AOP and Spring Batch.

Knowledge of Kibana and Elasticsearch for identifying Kafka message failure scenarios.

Implemented Kafka producer and consumer applications on a Kafka cluster set up with the help of ZooKeeper.

Developed a Java API to interact with AWS SQS, used for sending bulk emails.

Developed a user interface using ASP.NET, Bootstrap, JavaScript and CSS3 for a responsive website.

Used Spring Kafka API calls to process messages smoothly on the Kafka cluster.

Installed Ansible Tower and used it to manage systems.

Athena Health EMR implementation and physician onboarding.

Set up a Jenkins server and build jobs to provide continuous automated builds by polling Git SCM during the day, with periodically scheduled builds overnight to support development.

Provided training and instruction to coworkers and peers on PowerShell scripting techniques and practices.

Created and maintained the Python deployment scripts for Tomcat web application servers.

Environment: Linux, Bash, PowerShell, Python, AWS, ELK, Terraform, Ansible, Docker, Jenkins, Git, Jira, JavaScript, Kubernetes, Maven, Nagios, SonarQube, Kafka, DynamoDB, MongoDB, Cassandra, WebSphere 8.x, WebLogic 10.x/12.x, Tomcat 8.5.x, Nginx.
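
As an illustration of the Python Lambda work in this section (transforming records from Kinesis data streams and landing results in S3), the sketch below decodes Kinesis event records, applies a trivial transformation and writes the output to S3. The bucket name and record fields are assumptions for the example, not project details.

```python
# Illustrative only: Lambda handler that transforms Kinesis records and writes
# the result to S3. Bucket and field names are placeholders.
import base64
import json

import boto3

s3 = boto3.client("s3")
TARGET_BUCKET = "example-analytics-bucket"  # hypothetical bucket name

def lambda_handler(event, context):
    transformed = []
    for record in event.get("Records", []):
        # Kinesis record payloads arrive base64-encoded
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # Example transformation: keep only the fields downstream jobs need
        transformed.append({
            "id": payload.get("id"),
            "amount": float(payload.get("amount", 0)),
        })

    key = f"transformed/{context.aws_request_id}.json"
    s3.put_object(Bucket=TARGET_BUCKET, Key=key, Body=json.dumps(transformed))
    return {"records": len(transformed)}
```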

Dell Systems, Austin, TX Apr’17 – Oct’21

Role: AWS Developer & DevOps Engineer

Responsibilities:

Created confidence and certainty in deployments with immutable infrastructure built and tested using CI/CD.

Produced client-side validation using JavaScript for asynchronous communication.

Created an interface to manage the user menu using JavaScript, AngularJS and jQuery.

Responsible for validation of client-side HTML pages using JavaScript and jQuery.

Worked with Terraform to deploy AWS infrastructure such as VPCs, ELBs, security groups, AWS Glue, SQS queues and S3 buckets, and continued migrating the rest of the infrastructure.

Able to harness all that AWS has to offer, spinning up new scalable environments quickly and keeping AWS accounts tidy and efficient.

Used AWS Redshift, S3, Redshift Spectrum and Athena to query large amounts of data stored in S3, creating a virtual data lake without going through an ETL process.

Completed a proof of concept on the AWS Athena service.

Built servers using GCP and AWS, managing volumes, launching EC2 and RDS instances, and creating security groups, auto scaling and load balancers in the defined virtual private cloud.

Hosted static websites in S3 as a secondary site, routing customers to a custom error page if the primary web server is down. Used the S3 Firefox plugin to upload content to S3 and used CloudFront as a content delivery network to speed up site and media delivery.

Designed and oversaw development of an IoT platform ecosystem, including edge sensors, gateways, aggregators and actuators, mesh networks, and IoT cloud services such as data ingestion, device provisioning, analytics and orchestration.

Created S3 buckets and restricted access to buckets and objects to specific IAM users.

Automated backups with shell scripts on Linux and PowerShell scripts on Windows to transfer data to S3 buckets.

Involved in writing a Java API for AWS Lambda to manage some of the AWS services.

Performed automated deployments on AWS by creating IAM roles, used the CodePipeline plugin to integrate Jenkins with AWS, and created EC2 instances to provide virtual servers.

Responsible for the acquisition and management of IoT services at the launch of new solutions.

Served as the lead PowerShell developer in hundreds of Windows-based migration and automation projects.

Experience migrating servers and applications from a local VMware datacenter to AWS.

Migrated AWS infrastructure from Elastic Beanstalk to Docker with Kubernetes.

Implemented Spring Boot microservices to process messages into the Kafka cluster.

Created monitors, alarms and notifications for EC2 hosts using CloudWatch (see the boto3 sketch at the end of this section).

Wrote Lambda functions in Python and invoked Python scripts for data transformations and analytics on large data sets in EMR clusters and Amazon Kinesis data streams.

Environment: AWS, GCP, JavaScript, Kafka, ECS, Docker, Ansible, Terraform, CloudFormation, SQL, GitHub, Bash/Shell scripting.
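
One concrete way to set up the CloudWatch monitors and alarms for EC2 hosts mentioned in this section is sketched below with boto3; the instance ID, threshold and SNS topic ARN are placeholders rather than values from this engagement.

```python
# Sketch with assumed values: CPU-utilization alarm for one EC2 instance,
# notifying an SNS topic when breached.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="ec2-high-cpu-example",          # placeholder alarm name
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                                # 5-minute periods
    EvaluationPeriods=2,
    Threshold=80.0,                            # percent CPU
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # placeholder ARN
)
```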

AlgoBrains Technologies Pvt Ltd, Hyderabad, India Jan’15 – Apr’17

Role: AWS Developer

Responsibilities:

Configured and deployed highly available (HA) and scalable applications on AWS using EC2, RDS, S3, Elastic Load Balancing (ELB) and Auto Scaling, and monitored these applications with CloudWatch and SNS.

Managed system administration tasks of maintaining users, filesystems, networking, package management on local machines and AWS.

Developed custom PowerShell scripts to tie into the filesystem.

Provisioned and configured virtual machines using VirtualBox and Vagrant.

Maintained Git repositories for the DevOps environment: version control and build automation integrating Git into Jenkins.

Worked closely with clients to establish problem specifications and system designs.

Documented project progress and resources using Microsoft Project.

Implemented RESTful web services using Jersey, the Spring Framework, JSON and Java 8.

Implemented all shared-services applications as deployable Spring Boot JARs.

Added security to the application using Spring Security and the SAML authentication mechanism.

Initiated the applications with Spring Boot.

Developed and executed JUnit tests for the REST services using Spring's JUnit support.

Maintained existing applications and designed and delivered new applications.

Debugged network connectivity issues on workstations in computer labs.

Worked with several technologies such as JSP, Servlets, JDBC, HTML, Spring Boot and Hibernate.

Used SSH techniques and bash scripts to install packages on nodes.

Managed GitHub repositories and permissions, including branching and tagging.

Created VPC peering connections between multiple VPCs across various AWS accounts.

Took snapshots of existing volumes and created AMIs from running servers so they could be restored in case of emergency (see the boto3 sketch at the end of this section).

Created Docker images from CentOS Linux per product/service requirements and stored those images in ECR or the Nexus repository.

Environment: AWS (EC2, EBS, S3, VPC, Code Deploy, Route 53 (DNS), ECS, CloudFormation, RDS, DynamoDB, Load Balancers, CloudWatch, SNS, SES, SQS, IAM, RedShift), Maven, Puppet, Ansible, Docker, Kubernetes, Jenkins, GIT.
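
The snapshot/AMI backup item in this section could be automated roughly as below: a boto3 sketch that creates an AMI from a running instance without rebooting it and tags it for later identification. The instance ID and tag values are placeholders, not details from this role.

```python
# Hypothetical backup sketch: create an AMI from a running instance (no reboot)
# and tag it with a timestamp so old backups can be found and pruned later.
from datetime import datetime, timezone

import boto3

ec2 = boto3.client("ec2")
instance_id = "i-0123456789abcdef0"  # placeholder instance ID
timestamp = datetime.now(timezone.utc).strftime("%Y-%m-%d-%H%M")

response = ec2.create_image(
    InstanceId=instance_id,
    Name=f"backup-{instance_id}-{timestamp}",
    NoReboot=True,  # avoid rebooting the running server
)
ec2.create_tags(
    Resources=[response["ImageId"]],
    Tags=[{"Key": "CreatedBy", "Value": "backup-script"}],
)
print("Created AMI:", response["ImageId"])
```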

Educational Details & Certifications:

●AWS Certified DevOps Engineer Professional.

●Bachelor’s in Computer Science and Engineering, 2015 – JNTUH.


