DevOps Engineer Project Management

Location:
Santa Clara, CA
Salary:
100,000
Posted:
February 05, 2024

Resume:

Saritha Gannapaneni

Santa Clara, CA ***** 669-***-**** ad3d9z@r.postjobfree.com

Summary

* ***** ** ********** ** the field of DevOps engineering: application configuration, code compilation, packaging, building, automation, and managing and releasing code from one environment to another and deploying it to servers.

9+ years of overall experience working in DevOps, Big Data, and data warehouse technologies.

Experienced with Jenkins: installing, configuring, and maintaining it for continuous integration (CI) and end-to-end automation of all builds and deployments, including creating Jenkins CI pipelines.

Hands-on experience with EC2, S3, RDS, VPC, ELB, EBS, and Auto Scaling.

Experienced in branching, merging, and maintaining versions using SCM tools such as Git and GitHub on Windows and Linux platforms.

Experienced with project management and issue-tracking tools such as JIRA.

Experienced in creating Docker containers and using the Docker console to manage the application life cycle.

Created custom Docker images using Dockerfiles for easier replication of DEV and QA environments on local machines.

Performed and deployed builds for various environments, including QA, Integration, UAT, and Production. Developed and deployed Chef and Puppet automation based on cookbooks, recipes, and manifests.

Configured and monitored distributed and multi-platform servers using Nagios.

Strong analytical and problem-solving skills; able to work independently with little or no supervision or as a member of a team.

Good written and verbal communication skills, strong organizational skills, and a hard-working team player, well practiced in fielding phone calls and answering business-team queries.

Experience

DevOps Engineer, 12/2022 - Current

Fannie Mae

Environment: Amazon EC2, S3, RDS, VPC, ELB, EBS, Auto Scaling, UNIX/Linux, Red Hat Linux 6, CentOS, Jenkins, Windows, Apache Tomcat, shell scripts, Docker, Nagios, Puppet.

Used shell scripts to automate day-to-day activities and tasks.
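
For illustration, a minimal sketch of the kind of housekeeping script this covers; the log directory and retention period below are hypothetical.

    #!/usr/bin/env bash
    # Hypothetical housekeeping script: archive old application logs
    # and warn when the log partition fills up.
    set -euo pipefail

    LOG_DIR=/var/log/myapp        # hypothetical log directory
    RETENTION_DAYS=14

    # Compress logs older than the retention window, then delete old archives.
    find "$LOG_DIR" -name '*.log' -mtime +"$RETENTION_DAYS" -exec gzip {} \;
    find "$LOG_DIR" -name '*.log.gz' -mtime +$((RETENTION_DAYS * 2)) -delete

    # Warn if the log partition is more than 80% full.
    usage=$(df --output=pcent "$LOG_DIR" | tail -1 | tr -dc '0-9')
    if [ "$usage" -gt 80 ]; then
        echo "WARNING: ${LOG_DIR} is ${usage}% full" >&2
    fi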

Used Jenkins to automate the build process.

Installed and configured Jenkins master and slave nodes; built CI/CD pipelines and managed infrastructure as code using Chef and Puppet.

Experienced with cloud platforms such as AWS.

Created and implemented Chef cookbooks for deployment, and used Chef recipes to deploy directly to Amazon EC2 instances.
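
A minimal sketch of that workflow, assuming a hypothetical cookbook name, node name, and EC2 address (knife option names vary somewhat between Chef versions):

    # Bootstrap a hypothetical EC2 instance as a Chef client and assign it
    # a run list from a hypothetical 'tomcat_app' cookbook.
    knife bootstrap 10.0.1.25 \
        --ssh-user ec2-user \
        --sudo \
        --node-name app-node-01 \
        --run-list 'recipe[tomcat_app::deploy]'

    # Re-converge the node later to pick up updated cookbook versions.
    ssh ec2-user@10.0.1.25 'sudo chef-client'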

Used Git to manage source code.

Deployed the applications to Tomcat Application Server and static content to Apache web servers.

Automated continuous integration and deployments using Jenkins and Docker.

Installed, configured, and managed monitoring tools such as Nagios for resource and network monitoring.
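
On the monitoring side, a sketch of a custom Nagios check written in shell; Nagios plugins communicate through exit codes (0 OK, 1 WARNING, 2 CRITICAL). The host, port, and service here are hypothetical.

    #!/usr/bin/env bash
    # Hypothetical Nagios plugin: check that Tomcat answers on its HTTP port.
    HOST=${1:-localhost}
    PORT=${2:-8080}

    if curl -fsS --max-time 5 "http://${HOST}:${PORT}/" > /dev/null; then
        echo "TOMCAT OK - ${HOST}:${PORT} responding"
        exit 0
    else
        echo "TOMCAT CRITICAL - ${HOST}:${PORT} not responding"
        exit 2
    fi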

Worked with Docker containers and created Docker images for different environments.
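
A sketch of how such an environment-specific image might be defined and built; the base image, artifact, and tags are hypothetical.

    # Write a minimal Dockerfile, then build one image per environment.
    cat > Dockerfile <<'EOF'
    # Hypothetical base image and artifact; swap in real names.
    FROM tomcat:9-jdk11
    COPY target/myapp.war /usr/local/tomcat/webapps/ROOT.war
    EXPOSE 8080
    EOF

    # Build and tag the image for a specific environment.
    docker build -t myapp:qa .
    docker run -d -p 8080:8080 myapp:qa   # replicate the QA environment locally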

Responsible for compiling the source code using Maven and packaging it in its distributable format, such as a WAR file.
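
In shell terms the packaging step amounts to something like the following; the artifact name and Tomcat path are hypothetical.

    # Compile, run tests, and package the build as a WAR.
    mvn clean package

    # The distributable artifact lands under target/ (e.g. target/myapp.war),
    # ready to be copied into Tomcat's webapps/ directory.
    cp target/myapp.war /opt/tomcat/webapps/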

Implemented process for release management, automated code deployment, configuration management, and monitoring.

Used Docker and Kubernetes to manage microservices in support of continuous integration and continuous delivery.

Used Kubernetes to create Pods, ConfigMaps, and Deployments in the cluster.

Hands-on experience using Kubernetes to automate the deployment, scaling, and operation of application containers across clusters of hosts.
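
For example, the ConfigMaps and Deployments mentioned above can be created declaratively; the names, image, and replica counts below are hypothetical.

    # Apply a hypothetical ConfigMap and Deployment to the cluster.
    kubectl apply -f - <<'EOF'
    apiVersion: v1
    kind: ConfigMap
    metadata:
      name: myapp-config
    data:
      APP_ENV: "qa"
    ---
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: myapp
    spec:
      replicas: 3
      selector:
        matchLabels:
          app: myapp
      template:
        metadata:
          labels:
            app: myapp
        spec:
          containers:
          - name: myapp
            image: myapp:qa
            envFrom:
            - configMapRef:
                name: myapp-config
    EOF

    # Scale the Deployment without editing the manifest.
    kubectl scale deployment myapp --replicas=5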

Data Engineer and Analyst, 03/2021 - 11/2022

Subk-Impact Solutions

Environment: Linux, Eclipse, Java, SQL, AWS, Python, Subversion, Bash

Worked with the systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters, following an agile methodology.

Monitored multiple Hadoop cluster environments using Control-M; tracked workload, job performance, and capacity planning using Cloudera Manager.

Hands-on experience with Hadoop, Java, SQL, and Python.

Participated in functional reviews, test specifications and documentation review.

Wrote MapReduce programs to transform log data into a structured form and derive user location, age group, and time spent.
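
One way such a job can be expressed is with Hadoop Streaming, which lets the mapper and reducer be ordinary shell commands; the HDFS paths and field position here are hypothetical.

    # Hypothetical Hadoop Streaming job: count log records per user location
    # (assumes location is the 3rd tab-separated field).
    hadoop jar "$HADOOP_HOME"/share/hadoop/tools/lib/hadoop-streaming-*.jar \
        -input  /data/weblogs/raw \
        -output /data/weblogs/by_location \
        -mapper  "cut -f3" \
        -reducer "uniq -c"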

Analyzed web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration, and the most-purchased product on the website.
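
A sketch of one such query, run through the Hive CLI; the table and column names are hypothetical.

    # Unique visitors per day from a hypothetical web_logs table.
    hive -e "
      SELECT to_date(event_time)        AS visit_day,
             COUNT(DISTINCT visitor_id) AS unique_visitors
      FROM   web_logs
      GROUP  BY to_date(event_time)
      ORDER  BY visit_day;
    "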

Exported the analyzed data to relational databases using Sqoop for visualization and report generation by business intelligence tools.
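
The export side might look like this; the connection string, table, and HDFS path are hypothetical.

    # Push aggregated results from HDFS into a relational table for BI tools.
    sqoop export \
        --connect jdbc:mysql://reporting-db.example.com/analytics \
        --username report_user -P \
        --table daily_visitors \
        --export-dir /data/weblogs/daily_visitors \
        --input-fields-terminated-by '\t'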

Documented system processes and procedures for future reference; responsible for managing data coming from different sources.

Automated deployments using scripts that execute the automated CI and release management process.

Implemented a Continuous Delivery framework using Jenkins, Maven, and Artifactory in a Linux environment.
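
A sketch of the publish step in such a framework; the repository ID and URL are hypothetical, credentials would normally live in Jenkins rather than on the command line, and the altDeploymentRepository format shown is for the pre-3.0 maven-deploy-plugin.

    # Deploy the built artifact to a hypothetical Artifactory repository.
    mvn -B clean deploy \
        -DaltDeploymentRepository=releases::default::https://artifactory.example.com/artifactory/libs-release-local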

Maintained build-related scripts developed in Maven and shell; modified build configuration files, including Maven's pom.xml.

Automated and integrated all tools required for company-specific products through Chef cookbooks.

Responsible for nightly and weekly builds for different modules.

Integrated Maven/Artifactory, Jenkins, and UrbanCode Deploy with Patterns/Release, Confluence, and Jira.

Used JIRA for ticket tracking, change management, and Agile/Scrum, and managed Artifactory repositories from which artifacts were downloaded during builds.

Data Engineer, 06/2019 - 02/2021

APR-Hub Technologies

Environment: Ab Initio 3.03, Teradata 12, UNIX (Sun Solaris, Korn shell).

Supported requirements gathering and understanding of business functionality documented in TPR (Technical Project Request) documents.

Developed Ab Initio graphs according to the business requirements.

Analyzed source-system file layouts and wrote DML for extracting data from various sources such as flat files, tables, mainframe copybooks, and responder layouts.

Involved in analyzing data transformation logic, mapping implementations, and data loading into the target database through Ab Initio graphs.

Developed UNIX shell scripts for automating processes.

Involved in fixing unit and functional test case/data defects.

Analyzed the existing application and identified improvements.

Systems Engineer, 12/2011 - 04/2014

TCS Pioneer, ITPL

Environment: J2EE, Servlets, EJB, AJAX, HTML, CSS, XML, Ant, REST API, JavaScript, Oracle 10g, Eclipse, AngularJS, jQuery, SOAP, Maven.

Used Agile methodology for application development; deployed various stored procedures and triggers to retrieve data using SQL Server.

Debugged the code using the Java debugger in Eclipse, with exception, conditional, and other breakpoints.

Built the system using the Model-View-Controller (MVC) architecture.

Wrote SQL queries, stored procedures, and modifications to the existing database structure as required for new features, using an Oracle database.

Developed JUnit test cases and performed various phases of testing.

Education

Computer and Information Systems Security: Expected in 08/2024

University of the Cumberlands - Williamsburg, Kentucky

Skills

Source and Version Control - Git, GitHub

CI/Configuration Management - Jenkins, Chef, Puppet

Build Tools - Maven

Ticket Tracking Tool - JIRA

Containerization Tool - Docker

Operating Systems - Windows, Unix

AWS - Amazon EC2, S3, RDS, ELB, EBS, Auto Scaling

Languages - SQL, NoSQL

Scripting Languages - Shell, Python

Web server - Apache Tomcat

Databases - Oracle, SQL Server, MySQL, Teradata, DB2, Netezza

Big Data - Hadoop, HDFS, MapReduce, Flume, Pig, Sqoop, Hive, Oozie, MongoDB, Spark, Cassandra

BI/Reporting - Tableau, Power BI

Monitoring Tool - Splunk


Certifications

Cloudera Certified Hadoop Developer (CCD-410).

Certified MongoDB Developer.


