AWS Cloud Engineer

Location:
Boston, MA
Posted:
August 20, 2020

Resume:

Sowjanya (Sr DevOps Engineer)

Contact: 248-***-**** / *************@*****.***

https://www.linkedin.com/in/sowjanya-reddy-906a614b/

Professional Summary:

Around 8 years of experience in the IT industry with a major focus on release management, CI/CD, configuration management, DevOps, and cloud engineering.

Experienced in AWS cloud-specific technologies including EC2, EBS, S3, VPC, RDS, SES, ELB, EMR, ECS, IAM, CloudFront, CloudFormation, CloudWatch, ElastiCache, Redshift, Lambda, SNS, and DynamoDB.

Hands-on experience writing Terraform scripts to automate AWS cloud infrastructure.

Experience building CI/CD pipelines in Jenkins for all builds and deployments, working with Rundeck and the Puppet configuration management tool.

Well experienced in branching, merging, tagging, and maintaining versions across environments using SCM tools such as Git and SVN.

Hands-on experience using Maven and Ant as build tools, writing build.xml and pom.xml files, building deployable artifacts from source code, and configuring and administering the Nexus repository manager for Maven builds.

Strong understanding of using Ansible to orchestrate AWS environments; wrote Ansible playbooks, configured Ansible roles, and integrated them with the pipeline.
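
As a rough illustration of the kind of playbook described above (the host group, role, and package here are hypothetical, not drawn from an actual project):

    # Hypothetical playbook: configure a web tier in an AWS environment.
    - name: Configure web tier
      hosts: webservers          # assumed inventory group
      become: true
      roles:
        - common                 # hypothetical shared role
      tasks:
        - name: Install nginx
          ansible.builtin.yum:
            name: nginx
            state: present
        - name: Ensure nginx is running and enabled at boot
          ansible.builtin.service:
            name: nginx
            state: started
            enabled: true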

Good working experience with scripting languages such as Shell, Python, PowerShell, and Bash; able to develop and execute XML, shell, and Perl scripts.

Good experience with several Docker components, including Docker Hub, Engine, Compose, Swarm, and Docker Registry; created images and containers and pushed them to Docker Hub.
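
For illustration, a minimal docker-compose.yml of the sort Compose is typically used for (the image tags and placeholder password are assumptions):

    # Hypothetical Compose file: a web front end with a backing database.
    version: "3.8"
    services:
      web:
        image: nginx:latest
        ports:
          - "80:80"                              # host:container port mapping
        depends_on:
          - db
      db:
        image: mysql:8.0
        environment:
          MYSQL_ROOT_PASSWORD: example-password  # placeholder secret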

Involved in setting up JIRA as bug tracking system and configured various workflows.

Experienced in Software Development Life Cycles (SDLC) and Agile Methodologies.

Strong working experience with DSS (Decision Support Systems) applications; directly responsible for the Extraction, Transformation, and Load (ETL) of data from legacy systems using Informatica, and for reporting using SAP BI.

Responsible for verifying and validating that products and services meet specified requirements.

Expertise in Analysis, Design, Development and Implementation.

Working Experience on Database clustering in PostgreSQL.

Ensured project success by taking ownership of deliverables and assisting with or performing analysis, testing, and other tasks as necessary.

Quick to adapt to new environments, with the ability to learn new technologies.

Work with a positive attitude, analyzing all kinds of situations diligently.

Ability to interface and communicate effectively with team members and provide guidance in development activity.

Good experience creating transformations and mappings using Informatica Designer, and processing tasks using Workflow Manager to move data from multiple sources into targets.

Involved in creating the technical specifications and high-level Design documents.

Developed Standard Reports, List Reports, Cross-tab reports, Drill through reports and Master Detail reports using Report Studio.

Attended status meetings and daily scrum calls with the client and business users. Performed all phases of testing: unit testing, QA testing, regression testing, and UAT.

Strong ability to troubleshoot any issues generated while building, deploying and in production support.

Familiarity with Kubernetes cluster management and administration, creating pods and managing them by updating resources depending on the requirement.

Used Chef Automate for builds; experienced in using Chef and managing nodes, cookbooks, Chef recipes, Chef attributes, Chef templates, run-lists, and environments.

Worked on scheduling, deploying, and managing container replicas on a node cluster using Kubernetes.
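
A minimal sketch of such a Deployment manifest, with hypothetical names, image, and resource figures:

    # Hypothetical Deployment: three replicas of a containerized app, with
    # resource requests/limits that can be updated as requirements change.
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: demo-app
    spec:
      replicas: 3
      selector:
        matchLabels:
          app: demo-app
      template:
        metadata:
          labels:
            app: demo-app
        spec:
          containers:
            - name: demo-app
              image: registry.example.com/demo-app:1.0   # placeholder image
              resources:
                requests:
                  cpu: 250m
                  memory: 256Mi
                limits:
                  cpu: 500m
                  memory: 512Mi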

Actively involved in deployments for prod/non-prod environments by making sure all services get deployed successfully.

Seeking a responsible and challenging position in a growth-oriented organization that allows me to learn new technologies and skills while applying my experience as a DevOps and cloud engineer, and to build strong business relationships with the company and its clients.

Involved in installing, configuring, updating and troubleshooting Web Servers (Apache Tomcat, Nginx).

Participated in on-call rotation supporting the platform and production application for production issues.

Technical Skills:

Operating Systems

Windows, Linux/Unix, and macOS.

DevOps Tools

Ansible, Puppet, Chef, Jenkins, Docker, Ant, Maven, Nexus, Bamboo, Kubernetes.

Languages

Shell, Bash, Perl, Python, Groovy, Ruby, YAML

Databases

MySQL, MongoDB, MS SQL Server, Oracle PL/SQL.

Web/App Server

Apache, IIS, Tomcat, WebSphere Application Server.

Ticketing Tools

JIRA, RTC

Version Control Tools

GIT, SVN, Bitbucket

ETL / Middleware tools

Informatica PowerCenter

Web Technologies/Programming Languages

Servlets, JDBC, JSP, XML, HTML, JavaScript, Java/J2EE, C, C++, Perl scripting, Python, shell scripting, Ruby

Monitoring Tools

Nagios, Zabbix, Splunk, ELK, CloudWatch

Cloud Platforms

AWS, Microsoft Azure, OpenStack, GCP

Education: Bachelor’s Degree, India

Work Experience

Monsanto, St Louis, MO. July 2018 – Present

DevOps / AWS Cloud Engineer

Responsibilities:

Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancer, and Auto Scaling groups.

Developed Docker-based microservices and deployment modules with Jenkins and Kubernetes; experimented with Docker using Docker Compose.

Implemented a Continuous Delivery pipeline with Docker, GitHub, and AWS.

Migrated an on-premises database to the AWS cloud. Designed, built, and deployed a multitude of applications utilizing the AWS stack (including EC2, R53, S3, RDS, SNS, and IAM), focusing on high availability, fault tolerance, and auto scaling with Opscode Chef cookbooks and recipes. Used the Jenkins 2.0 AWS CodeDeploy plugin to deploy to AWS.

Implemented a new project build framework using Jenkins and Maven.

Implemented a Continuous Delivery framework using Jenkins, Ansible, Maven, and Nexus in a Linux environment.

Wrote Ansible playbooks for deploying, configuring, and managing collectd for metric collection and monitoring.

Built Ansible playbooks and bootstrap scripts that let us bootstrap instances into various roles without having to maintain AMIs. Implemented a distributed messaging queue integrated with Cassandra using Kafka.

Helped teams perform their duties more efficiently by providing ALM training targeted to each team's needs.

Created a private cloud using Kubernetes that supports development, test, and production environments.

Automated deployments, scaling, and operations of application containers across clusters of hosts; provided container-centric infrastructure through Kubernetes.

Created test branches from the master branch of each repository in Git to test the Gradle upgrade to LSR, then assisted DEV teams in doing the same successfully.

Automated configuration management and deployments using Ansible playbooks and YAML.

Migrated configuration management from Puppet to Ansible.

Managed Ansible playbooks to automate system operations and AWS cloud management.

Developed core product features using Node.js, Java, and Scala.

Worked on loading CSV/TXT/DAT files using Scala/Java in the Spark framework, processing the data by creating Spark DataFrames and RDDs, and saving the files in Parquet format in HDFS for loading into the fact table using the ORC reader.

Defined the release process and policy for projects early in the SDLC; responsible for code builds, releases, and configuration.

Performed deployment of releases to various QA and UAT Linux environments.

Optimized volumes and EC2 instances, and used IAM to create new accounts, roles, and groups.

Configured S3 versioning and lifecycle policies to back up files and archive them in Glacier.
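
A sketch of what such a policy can look like in a CloudFormation template (the bucket resource and the 90-day window are assumptions):

    # Hypothetical CloudFormation snippet: versioned bucket whose objects
    # transition to Glacier after 90 days.
    Resources:
      BackupBucket:
        Type: AWS::S3::Bucket
        Properties:
          VersioningConfiguration:
            Status: Enabled
          LifecycleConfiguration:
            Rules:
              - Id: ArchiveToGlacier
                Status: Enabled
                Transitions:
                  - StorageClass: GLACIER
                    TransitionInDays: 90   # assumed archival window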

Configured Elastic Load Balancers (ELB) with EC2 Auto Scaling groups.

Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
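
For illustration, one way to express such an alarm in CloudFormation (the threshold and the referenced instance and SNS topic are hypothetical):

    # Hypothetical CloudFormation snippet: alarm when average CPU on an EC2
    # instance exceeds 80% for two consecutive 5-minute periods.
    Resources:
      HighCpuAlarm:
        Type: AWS::CloudWatch::Alarm
        Properties:
          AlarmDescription: CPU above 80% for 10 minutes
          Namespace: AWS/EC2
          MetricName: CPUUtilization
          Dimensions:
            - Name: InstanceId
              Value: !Ref WebServerInstance   # hypothetical EC2 resource
          Statistic: Average
          Period: 300
          EvaluationPeriods: 2
          Threshold: 80
          ComparisonOperator: GreaterThanThreshold
          AlarmActions:
            - !Ref AlertTopic                 # hypothetical SNS topic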

Performed Splunk deployment, configuration, and maintenance across a variety of UNIX and Windows platforms.

Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of OpenStack nodes, and tested playbooks on AWS instances using Python.

Worked on applying patches and recommending necessary security fixes for web application servers.

Scripted in multiple languages on UNIX, Linux, and Windows: Batch, Python, shell scripts, etc.

Troubleshot build issues during the Jenkins build process.

Resolved system issues and inconsistencies in coordination with quality assurance and engineering teams.

Environment: AWS, OpenStack, Terraform, Chef, Ansible, Docker, Kubernetes, Jenkins, Git, Python, Maven, New Relic, Java.

Charter Communications, St Louis, MO Jan 2017 – June 2018

DevOps Cloud Engineer

Roles and responsibilities:

Developed build and deployment processes for Pre-production environments.

Configured Route 53 using CloudFormation templates, assigned the DNS mappings for the AWS servers, and troubleshot issues with the load balancers, Auto Scaling groups, and Route 53.
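
A minimal sketch of such a Route 53 record in a CloudFormation template, assuming an Application Load Balancer resource named AppLoadBalancer and placeholder domain names:

    # Hypothetical CloudFormation snippet: alias record pointing a DNS name
    # at a load balancer.
    Resources:
      AppDnsRecord:
        Type: AWS::Route53::RecordSet
        Properties:
          HostedZoneName: example.com.    # placeholder zone
          Name: app.example.com.          # placeholder record name
          Type: A
          AliasTarget:
            DNSName: !GetAtt AppLoadBalancer.DNSName
            HostedZoneId: !GetAtt AppLoadBalancer.CanonicalHostedZoneID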

Implemented a Continuous Delivery pipeline with Docker, Jenkins, GitHub, and AWS AMIs.

Worked with Ansible and Packer to build the Jenkins master AMI. This included Groovy to configure plugin configuration files and jobs deployed with the DSL plugin, Ruby and a Vagrantfile to help test that AMI, and a Python script to rotate old versions of the AMI.

Using Ansible playbooks, automated the build of Docker images, utilized Jenkins to auto-push them to Docker Hub, automated the infrastructure, and downloaded and managed Ansible roles from Ansible Galaxy.

Wrote Ansible playbooks to launch AWS instances and secure servers with Ansible; used Ansible to manage web apps, configuration files, mount points, and packages, and worked on developing Ansible Go scripts for automating regular tasks.

Developed Ansible playbooks using YAML scripts for launching different EC2 virtual servers in the cloud using Auto Scaling and Amazon Machine Images (AMIs).
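
A sketch of such a playbook using the amazon.aws collection (module naming from current Ansible releases; the AMI ID, subnet, region, and instance details are placeholders):

    # Hypothetical playbook: launch an EC2 instance from a pre-baked AMI.
    - name: Launch application server
      hosts: localhost
      connection: local
      tasks:
        - name: Start EC2 instance from AMI
          amazon.aws.ec2_instance:
            name: app-server-01                       # placeholder name
            image_id: ami-0123456789abcdef0           # placeholder AMI
            instance_type: t3.medium                  # assumed size
            region: us-east-1                         # assumed region
            vpc_subnet_id: subnet-0123456789abcdef0   # placeholder subnet
            tags:
              Environment: dev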

Automated weekly releases with Maven scripting for compiling Java code, debugging, and placing builds into the Maven repository. Developed automation scripting in Shell using Puppet to deploy and manage Java apps across Linux servers.

Used Puppet to automate configuration management and to manage web applications, config files, databases, commands, users, mount points, and packages.

Experience writing Puppet manifests for Apache installation and configuration as well as for various deployments.

Created Docker images using Dockerfiles, and worked on Docker container snapshots, removing images, and managing Docker volumes. Integrated Docker with various tools such as AWS, Puppet, Vagrant, Jenkins, and VMware containers.

Used Docker coupled with the load-balancing tool Nginx to achieve the Continuous Delivery goal in a highly scalable environment.

Experience designing and deploying AWS solutions using EC2, S3, EBS, Elastic Load Balancer (ELB), and Auto Scaling groups.

Containerized web applications using Docker and Kubernetes, and performed database maintenance.

Involved in writing parent POM files to establish code quality tool integration.

Collaborated with development support teams to set up a continuous delivery environment using Docker.

Involved in installing and managing different automation and monitoring tools on Red Hat Linux, such as Nagios, Splunk, and Ansible.

Used Kubernetes as an open-source platform for automating deployment, scaling, and operations of application containers across clusters of hosts, providing container-centric infrastructure.

Used Kubernetes to deploy applications quickly and predictably.

Developed and implemented software release management strategies for various apps in an agile process.

Experience migrating SVN repositories to Git.

Developed automation scripting in Python (core) using Puppet to deploy and manage Java apps across Linux servers.

Configured and installed the monitoring tools Grafana, Kibana, Logstash, and Elasticsearch on the servers.

Automated cloud deployments using Puppet, Python (boto and Fabric), and AWS CloudFormation templates.

Performed business data analysis using big data tools such as Splunk and ELK.

Configured the SonarQube code quality tool and integrated it with Jenkins. Implemented SonarQube to analyze code quality metrics, verify coding standards, and set up quality gates to pass or fail builds as required.

Created and tracked a release improvement process applied across all IT domains and initiated new projects related to release management. Released code to testing regions or staging areas according to the published schedule.

Used Terraform and OpsWorks to deploy the infrastructure necessary to create development, test, and production environments for a software development project.

Used Kubernetes for container operations in Azure, with Kubernetes clusters providing networking and load balancing; Kubernetes is also well suited to running web applications in a clustered fashion and was used across multiple services by creating images and reducing footprint. Automated NGINX/MySQL setup and monitoring.
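
For the load-balancing use mentioned above, a minimal Service manifest of the LoadBalancer type (names and ports are hypothetical, matching the Deployment sketch earlier):

    # Hypothetical Service: expose the app through a cloud load balancer
    # (e.g., provisioned by AKS in Azure).
    apiVersion: v1
    kind: Service
    metadata:
      name: demo-app-svc
    spec:
      type: LoadBalancer
      selector:
        app: demo-app        # targets pods from the Deployment sketch
      ports:
        - port: 80           # external port
          targetPort: 8080   # assumed container port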

Pleasant experience with Microsoft's Azure cloud computing platform for deploying and managing applications.

Environment: RTC, SVN (Subversion), Anthill Pro, Ant, NAnt, Maven, Puppet, Azure, Jenkins, ClearCase, Unix, Linux, Perl, Python, Ruby, AWS, Bamboo, Hudson, Git, JIRA, Shell Script, WebLogic.

Client: NBC Universal, Los Angeles, CA May 2015 – Nov 2016

Role: DevOps Engineer

Responsibilities:

Developed and implemented software release management strategies for various apps according to the agile process.

Installed, configured, and administered the Hudson/Jenkins continuous integration tool.

Developed build and deployment scripts using Ant and Maven as build tools in Jenkins to move builds from one environment to others. Developed an automation framework for application deployments to the cloud environments.

Worked on Managing the Private Cloud Environment using Chef.

Performed branching, tagging, and release activities on version control tools: SVN, Git.

Developed Perl and shell scripts to automate the build and release process, and developed custom scripts to monitor repositories and server storage.

Automated cloud deployments using Chef, Python (boto and Fabric), and AWS CloudFormation templates.

Used Maven as the build tool on Java projects for developing build artifacts from the source code.

Deployed the Java applications into web application servers like JBoss.

Performed and deployed builds for various environments, such as QA, integration, UAT, and production.

Worked on configuring Jenkins to use MetaCase software to build Java code and to run the whole CI process on the Java code generated by MetaCase.

Troubleshot and resolved build failures caused by infrastructure issues, reducing them by 95% and stabilizing the build process. Set up and executed an effective code review process.

Responsible for defining the branching and merging strategy, check-in policies, improving code quality, automating gated check-ins, and defining backup and archival plans.

Troubleshot build and deployment issues with little downtime.

Organized and coordinated product releases, working closely with product development, QA, and support across global locations to ensure successful releases.

Documented release metrics and the software configuration process. Used Maven scripts to build the source code. Supported and helped create dynamic views and snapshot views for end users.

Environment: DevOps, Java, Ant, J2EE, Maven, Jenkins, Hudson, Chef, Python, Perl, Git, Apache web server, JBoss, Apache JMeter, MetaCase, SVN, Windows.

Cybernatics - India Jan 2014 – Mar 2015

Role: ETL Developer

Project: US Health insurance

Responsibilities:

Converted the data mart from logical design to physical design; defined data types, constraints, and indexes; generated the schema in the database; created automated scripts; and defined storage parameters for the objects in the database.

Analyzed the functional specs provided by the data architect and created technical spec documents for all the mappings.

Developed logical and physical data models that capture current-state and future-state data elements and data flows, using Erwin.

Defined various facts and dimensions in the data mart, including factless facts, aggregate facts, and summary facts.

Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.

Created Mapplets and used them in different mappings.

Performed performance tuning using round-robin, hash auto-key, and key-range partitioning.

Used shell scripts for automating the execution of maps.

Used ETL to extract data from Oracle and flat files and load it into the Oracle DWH.

Involved in Analysis, Documentation and Testing of workflows.

Debugged the Informatica mappings and validated the data in the target tables once it was loaded by the mappings.

Created and monitored workflows using Workflow Manager and Workflow Monitor.

Worked extensively on different types of transformations, such as Update Strategy, Lookup, Filter, and Router transformations.

Developed mappings to load data into Slowly Changing Dimensions.

Created workflows/worklets and scheduled them using Workflow Manager.

Environment: Informatica PowerCenter 9.6/9.5, Oracle 11g/12c, SQL, UNIX

Client: Cybate Infotech, India June 2012 – Dec 2014

Role: ETL Developer

Project: Air India

Responsibilities:

Implemented the ETL solution and identified resource requirements.

Defined various facts and dimensions in the data mart, including factless facts, aggregate facts, and summary facts.

Held regular meetings with teams and management for project updates, schedules, timelines, results, plans, and status.

Provided expert input in design meetings as a functional data SME.

Provided conceptual solutions and enterprise project impact assessments, along with possible solutions and error routes for enterprise projects.

Responsible for mentoring developers and reviewing code for mappings developed by other developers.

Responsible for best practices such as naming conventions, performance tuning, and error handling.

Involved in business analysis and technical design sessions with business and technical staff to develop Entity Relationship/data models, requirements document, and ETL specifications.

Guided Joint Application Design (JAD) sessions for the approval of the Requirement Document with Business Owners and Business SME.

Developed full SDLC project plans.

Attended status meetings and daily scrum calls with the client and business users.

Performed all phases of testing: unit testing, QA testing, regression testing, and UAT.

Involved in the requirements study and in understanding the functionalities.

Ability to interface and communicate effectively with team members and provide guidance in development activity.

Developed Informatica mappings to capture data changes from the operational source systems into the data warehouse and data marts (SCD mappings).

Created workflows/worklets and scheduled them using Workflow Manager.

Developed mappings to load data into Slowly Changing Dimensions.

Ensured proper dependencies and proper running of loads (incremental and complete loads).

Performed tuning of sessions and mappings.

Created various complex transformations, such as Update Strategy, Lookup, Filter, and Router transformations.

Created basic and aggregate-level fact table mappings and optimized them for incremental extraction and aggregation, using parameters and variables to pass values during workflow execution.

Environment: Informatica PowerCenter 9.6/9.5, Oracle 11g/12c, SQL, UNIX


