Resume


Data Engineer

Location:
Sunnyvale, CA
Posted:
January 12, 2020



ARUNA PAMIDI

Phone: *** - *******

E-mail: adbaac@r.postjobfree.com

PROFILE SUMMARY:

•DevOps Engineer with 11+ years of IT experience, including 5 years as a Build and Release Engineer, infrastructure integrator, and system administrator, with strong expertise in best practices of Software Configuration Management (SCM).

•Worked with version control tools such as Git.

•Worked extensively with the continuous integration tool Jenkins for end-to-end automation of all builds and deployments.

•Used Maven as the build tool to produce deployable artifacts (WAR, JAR, and EAR) from source code.

•UNIX/Linux administration and shell scripting, with expertise in Red Hat Linux 5, 6, and 7.

•Automated builds with continuous integration and continuous deployment using Jenkins.

•Efficient in working closely with teams to ensure high-quality and timely delivery of builds and releases.

•Experienced in deploying applications to Tomcat across different environments.

•Experience using Jenkins for deployment, integration, and delivery.

•Knowledge of managing Sonatype Nexus and Artifactory repositories for Maven artifacts and dependencies.

•Knowledge of the SonarQube code quality tool.

•Good experience creating repositories in Sonatype Nexus.

•Experience with configuration management tools such as Ansible.

•Able to manage all aspects of the software configuration management process, including code compilation, packaging, deployment, release methodology, and application configuration.

•Experience with container platforms such as Docker.

•Good experience with the IBM BPM application console tool.

•Experience with load balancing, DNS, SSL, and firewalls.

•Strong knowledge of source control management concepts such as branching, merging, and tagging.

•Knowledge of monitoring tools such as New Relic and CloudWatch.

•Experience using issue tracking systems such as JIRA and Remedy.

•Managed DEV, QA, UAT, and PROD environments for various releases and designed instance strategies.

•Experience with multiple deployments using Ansible on AWS.

•Worked with engineers, QA, and other teams to ensure automated test efforts were tightly integrated with the build system, and fixed errors during deployment and builds (Agile projects).

•Strong experience across all aspects of the software development life cycle (SDLC) in Agile/Scrum, including analysis, planning, development, testing, implementation, and post-production analysis of projects.

•Self-motivated and resourceful team contributor, able to quickly grasp new technologies; adept at tracing complex build problems and release and environment issues in multi-component environments.

•Excellent hands-on trouble-shooting, problem solving and communication skills; analytical leader with ability to work efficiently in both independent and teamwork environments.
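The Jenkins/Maven CI workflow described above can be sketched as a minimal declarative pipeline. This is an illustration only, not a pipeline from any of the listed projects; the repository URL, branch name, and artifact path are hypothetical placeholders.

```groovy
// Hypothetical Jenkinsfile sketch: check out from Git, build with Maven,
// and archive the deployable artifact. All names below are placeholders.
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // Placeholder repository URL and branch
                git url: 'https://example.com/sample-app.git', branch: 'main'
            }
        }
        stage('Build') {
            steps {
                // Maven produces the WAR/JAR artifact from source
                sh 'mvn -B clean package'
            }
        }
        stage('Archive') {
            steps {
                // Keep the built artifact with the Jenkins build record
                archiveArtifacts artifacts: 'target/*.war'
            }
        }
    }
}
```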

TECHNICAL SKILLS:

Operating Systems: Linux (Red Hat)

Version Control Tools: Git

Languages/Scripting: Groovy, Shell/Bash, YAML

Databases: Oracle, SQL Server, DB2, MySQL

Application Servers: Tomcat

Build/Release/DevOps: Maven, Git, CloudWatch, Jenkins (code coverage/quality/continuous integration)

Build Tools: Maven

Continuous Integration Servers: Jenkins

Release Tools: Tomcat

Configuration Management: Ansible

Issue Tracking Management: JIRA (administration), Remedy

Testing Tools: Quick Test Pro, LoadRunner

WORK EXPERIENCE:

Applied Materials – Santa Clara, CA Jan 2015 to Present

•Designed, Installed and Implemented Ansible configuration management system.

•Created and updated Ansible playbooks and modules, files, and packages.

•Automated the cloud deployments using Ansible.

•Implemented rapid-provisioning and lifecycle management for Ubuntu Linux using Amazon EC2, Ansible, and custom Bash scripts.

•Worked on version control setup with Git and integration with the Jenkins CI tool.

•Installed, configured, and administered the Jenkins continuous integration tool.

•Developed automation framework for Application Deployments to the cloud environments.

•Implemented AWS solutions using EC2, S3, EBS, Elastic Load Balancer, and Auto Scaling groups; optimized volumes and EC2 instances.

•Installed and configured IDM engine, UserApp in Tomcat servers.

•Configured LDAP using directory Server for user authentication.

•Requested, generated, and implemented SSL and digital certificates for communication between the web server and the application server.

•Good experience with the IBM BPM application console tool.

•Worked on the installation and configuration of the CloudWatch monitoring tool.

•Handled requests from developers submitted through the PST Portal.

•Implemented Nagios core for monitoring Infrastructure resources.

Environment: Linux, Ansible, AWS, Java, Ant, Maven, Jenkins, Git, SonarQube, Nagios, Shell, VMware, Apache Tomcat, JIRA.
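The Ansible-driven Tomcat deployment work above might be sketched as a short playbook. This is a hedged illustration, not an actual project playbook: the host group, artifact URL, and file paths are hypothetical placeholders (yum is assumed since the resume lists Red Hat Linux).

```yaml
# Hypothetical Ansible playbook sketch: install Tomcat and deploy a WAR
# pulled from a Nexus-style repository. All names/URLs are placeholders.
- hosts: tomcat_servers
  become: true
  tasks:
    - name: Ensure Tomcat is installed
      yum:
        name: tomcat
        state: present

    - name: Deploy the application WAR from the artifact repository
      get_url:
        url: "https://nexus.example.com/repository/releases/app.war"
        dest: /usr/share/tomcat/webapps/app.war
      notify: restart tomcat

  handlers:
    - name: restart tomcat
      service:
        name: tomcat
        state: restarted
```

The handler pattern restarts Tomcat only when the WAR actually changes, which keeps repeated runs idempotent.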

State Street – Wipro – Quincy, MA Mar 2010 to Dec 2014

Description: The purpose of the project is to calculate the real-time market value in eHorizon/MCH by taking the shares/par/notional par a fund holds in a security multiplied by the security's price. This calculation is currently performed by NAVigator. In the future, NAVigator will be retired and the accounting application (eHorizon) will perform market value calculations. The accounting application will calculate the market value and appreciation/depreciation at the fund level, generate the holdings extract for all securities, store the prices, and generate the required data for the reports.

Tasks:

•Understood functional requirements by participating in business meetings with clients and from functional specs.

•Broke down functional requirements into logical units (tasks) and prepared the high-level technical design document.

•Prepared the detailed technical design document.

•Developed the application by writing COBOL and DB2 programs.

•Supported system testing and user acceptance testing.

•Prepared the implementation plan and coordinated with different teams for implementation.

•Handled onsite-offshore coordination.

•Provided technical guidance to the team.

•Performed impact analysis when changing existing modules for reuse, coordinating merges among many versions.

•Provided the required documentation for audits and coordinated the project audit.

Environment: COBOL, PL/I, JCL, DB2, Ellipse, QMF, IBM z/OS, MVS/ESA, Windows NT.

Walmart – American Sol Inc, Bentonville – AR Jan 2007 to Apr 2008

Project # Item file Migration Project

Client: Wal-Mart, Bentonville, AR

Description:

Wal-Mart is migrating its ITEM database from IMS to DB2. As part of the migration, programs are modified to replace IMS calls with DB2 SQL queries, and JCLs and parameters are converted.

•Interacted closely with business users to gather requirements.

•Involved in writing COBOL, IMS/DB, and DB2 programs.

•Analyzed programs and prepared the technical specification document.

•Replaced IMS-related code with DB2 code and made JCL changes compatible with the DB2 database.

•Created parameters and resource members.

•Involved in unit, integration, functional, and performance testing.

•Prepared databases for testing; as part of this, IMS test sets were loaded with production data.

•Loaded test DB2 tables with production table data and ran the sync jobs to make sure both databases contained the same data.

•Tested programs against the IMS and DB2 databases and compared results.

•Coordinated movement of changed code to the development and production environments.

•Prepared the change control list and presented it in change control.

•Monitored jobs for successful execution in production.

Environment: COBOL, JCL, DB2, IMS/DB, Eclipse, QMF, IBM z/OS, MVS/ESA, Windows NT.

ADP – Kanbay - San Dimas, CA Jan 2006 to Dec 2006

Project # Tax Engine Restructure - Re Engineering Project

Client: ADP, San Dimas, California, USA

Description: The ADP Tax System calculates the amount of federal, state, and local agency payroll taxes for each client and each client employee. After client payrolls are received, funds are collected to pay client payroll tax liabilities and, based on the payroll check date and liability amount, the system determines the due date for each tax deposit and each payroll tax return. On the due date, ADP deposits funds with the tax agency and files payroll tax returns.

COBOL programs for the above process were developed using the Datacom database, but those programs are unstructured and contain unused code. To improve maintainability, this TER project cleaned up dead data and dead code, modularized large programs, restructured unstructured programs, and developed new programs for the correction of rejected files.

Tasks:

•Participated in requirements gathering and analysis activities.

•Participated in estimation along with the offshore PM/PL/TL.

•Obtained client sign-offs on project deliverables.

•Managed dependencies with the client.

•Performed specific life-cycle activities as per plan.

•Tracked post-delivery defects and communicated them to the project team.

•Monitored customer satisfaction.

•Escalated unresolved issues within the client and the organization.

•Developed flowcharts using Visio and wrote pseudocode.

•Performed coding, peer code reviews, and unit testing as per test plans.

•Tracked defects from code reviews and testing.

•Provided technical support during implementation.

•Ensured effective usage of project-specific tools during coding and testing.

•Handled day-to-day client management: provided status, discussed queries and issues, and sought resolution.

Apart from regular project work, my special contribution was creating a REXX tool to automate:

oDatabase load and unload

oFile comparison

Environment: COBOL, JCL, Datacom, VSAM, File-AID, CA-7, CHANGEMAN Ver 5.3.3, ASG tools, IBM z/OS, MVS/ESA, Windows NT, Visio.

BCBS – Saahi Systems - Hyderabad Aug 2004 to Dec 2005

Project # WGS20 Group Reporting Maintenance and Enhancement

Client: BCBS / WellPoint, USA

Description:

The WGS 2.0-SAS module deals mainly with reporting programs. The requirements deal with the writing of new reporting programs and the maintenance of the existing reporting programs. As a part of reporting programs, the Adhoc files are used as source files. The Adhoc files are sequential files that consist of the history of all claims.

Some of the types of Adhoc files that can be found within the large group and Small group are Medical, Pharmacy or Drug, Active member file and Uni-care files.

The reports are generated per the users' requirements using these Adhoc files; SAS is used to create the reports.

Responsibilities:

•Analysis, design, coding, testing, and implementation.

•Analyzed work requests and prepared the detailed technical design document.

•Coded COBOL and DB2 programs to extract data from medical and pharmacy history data.

•Coded SAS and COBOL programs to generate daily, weekly, monthly, quarterly, and annual reports according to the request.

•Worked on table base requests such as table base updates.

•Developed various COBOL programs to interact with web services, and used MQ Series to transfer data between heterogeneous applications.

•Tested the code and transferred the test results to the shared drive via FTP.

•Prepared criteria documents.

•Changed the procs according to new requests.

•Created packages and moved jobs to production using ChangeMan.

•Converted Easytrieve programs to COBOL.

Environment: SAS, COBOL, JCL, DB2, MQ Series, VSAM, File-Aid Ver 8.7.0, CHANGEMAN Ver 5.3.3, CA-7, IBM z/OS, MVS/ESA, Windows NT.


