Resume

Mainframe developer

Location:
Troy, Michigan, 48084, United States
Posted:
October 23, 2016

Resume:

Senior Mainframe Developer

Having ** years of experience in host programming using mainframe technologies, involved in all phases of the SDLC.

Major Strengths

Having 10 years of IT experience in different stages of application and product development - analysis, design, and development - in Agile and Waterfall methodologies.

Extensive experience in developing business components based on customer requirements using COBOL, VSAM, JCL, DB2, IMS, and CICS.

Strong working experience in developing mainframe database applications using COBOL with DB2 and IMS databases.

Extensive working knowledge of mainframe tools - SORT, File Manager, File-AID, Xpediter, Changeman, Endevor, IBM Debug Tool, Operation Scheduler, and the CA-7 scheduler.

Good knowledge of Data Warehousing concepts.

Solid understanding of the Hadoop distributed file system.

Extensive working knowledge in Banking and Property and Casualty applications.

Excellent analytical and communication skills required to work effectively in applications development and maintenance.

Comprehensive problem-solving abilities and excellent verbal and written communication skills.

Technical Proficiency:

Software: COBOL, JCL, VSAM, DB2, IMS-DB, CICS, REXX, HDFS

Tools: IBM z/OS, Informatica, Teradata, SORT, File-AID, Abend-AID, File Manager, Xpediter, RDz Debug Tool, QMF, SPUFI, Endevor, Changeman, RMS, DFSORT, Pig, Hive, Control-D, Operation Scheduler

SDLC: Waterfall and Agile

Domain Experience: Banking and Property and Casualty insurance

Operating Systems: Windows 7, Windows XP, Linux, z/OS

Educational Qualification:

Bachelor of Technology from JNTU, India.

PROJECTS:

State Farm Mutual Insurance, Bloomington, IL Jun 2015 – present

Sr. Mainframe Technical Analyst

Billing and Payment center processing

Description

Development of the Billing and Payment Center will be completed in a multi-staged effort, or generations. This effort is the first generation of the Billing and Payment Center. Work consists of consolidating payments from multiple data stores into a single data store and updating the Agent Bank Deposit application to utilize that single data store. This work will provide a foundation for future generations.

Responsibilities:

Participate in technical and functional workshops to understand the prioritized requirements list.

Preparation of a high-level sprint plan in sync with the product owner's priorities.

Sharing of working practices with other stakeholders.

Provide estimations for the business requirements.

Preparing implementation plans and monitoring production deployment of changes.

Updating all relevant system documents, user documents and systems maintenance and development process documents.

Validate the results by running host jobs on Z/OS operating system.

Raise defects and track them using the Defect Trac utility.

Upload the test cases to the TestLink tool and maintain them.

Create documentation with evidence of testing.

Follow up with business analysts to ensure reviews happen within the project timelines.

Environment:

Z/OS-390 Operating System, DB2, COBOL, JCL, VSAM, TSO/ISPF, SORT, File Manager, HP Service Center, TRAC

Nationwide Mutual Insurance Company Aug 2013 – May 2015

Tech Lead

Smartride Experience (SRE)

Description

This is Nationwide Insurance's first project on Hadoop infrastructure, in which we migrated data from the mainframes and the data warehouse to the Hadoop environment. A new website was built for Smartride users, where policyholders can log in and view their driving habits and coaching messages. This website accesses the Hadoop environment to retrieve the analyzed data that comes from the mainframes. The development part of this program includes the following stages: high-level and low-level design, technical specification, unit test cases, coding, and unit testing.

Responsibilities:

Worked as Tech Lead.

Involved in the Data Migration activities.

Developed migration utilities as part of the data pump into the new platform.

Worked on updating the tracking and monitoring module for the new platform.

Involved in the implementation activities for new changes.

Involved in design changes for tracking and monitoring requirements.

Participate in technical and functional workshops to understand the prioritized requirements list.

Preparation of a high-level sprint plan in sync with the product owner's priorities.

Preparation of the risk log.

Development of ETL mapping and scripts.

Support Nationwide in establishing the Infrastructure setup and accesses.

Identify the team and arrange process and technology training sessions for the identified personnel.

Performed impact analysis for DB2 tables that need to be compressed and archived to implement a lower-cost storage mechanism.

Sharing of working practices with other stakeholders.

Establish Management Review process and communication protocol

Environment :

IBM BigInsights, Data Warehouse, Z/OS-390 Operating System, PIG, HIVE, DB2, COBOL, JCL, VSAM, Teradata, TSO/ISPF, Informatica, Changeman, Xpediter.

State Farm Mutual Insurance, Bloomington, IL Aug 2011 – July 2013

Senior Mainframe Developer

Fire Streaming application

Description

Fire Streaming is a legacy batch application that performs edits on input data sourced from agents (GUI applications). It also converts the customer data into legacy values and updates the Interactive Fire IMS database, which is an intermediate database.

This is a maintenance project where we work on enhancements and perform root cause analysis for production problems. Integrated Customer Platform (ICP) is a major enhancement currently under way at State Farm. We are involved in requirement analysis for a few SOWs in ICP.

Responsibilities:

Gather the requirements from the onsite team.

Understand the requirements and communicate with the onsite team for further clarification if needed.

Providing low-level designs and component specifications.

Development using COBOL, DB2.

Preparing environment for unit testing.

Preparing Test plan.

Reviewing peer code and unit test plans.

Production monitoring and Issue/defect resolution, data analysis.

Environment:

Z/OS-390 Operating System, IMS DB, COBOL, DB2, JCL, TSO/ISPF, Panvalet, Struciss.

Hong Kong and Shanghai Banking Services (HSBC bank) May 2010 - July 2011

Module Lead

One HSBC Sales and Services (OHSS)

Description:

One HSBC Sales and Services is a central set of sales, support, and marketing services for the capture and management of customer interactions across channels and for the delivery of common information and relationship decisions across all channels.

Responsibilities:

Worked as module lead.

Performing major enhancement and Development assignments using COBOL, DB2 and CICS technologies.

Handling Client Meetings for project discussions and preparing functional specifications from Business and Technical Requirements.

Provide estimations for the Business Requirements.

Coding, Handling Review activities, preparing Test cases and handle Unit Testing of the Components and documenting Test results.

Preparing Implementation plans and monitor production deployment of changes.

Handling Support Activities for the application.

Provide Long-term fixes for the potential production issues.

Environment:

Z/OS-390 Operating System, DB2, COBOL, JCL, VSAM, TSO/ISPF, Endevor, FILE-AID, DFSORT, SPUFI, and Xpediter.

Merrill Lynch Technology Services, Mumbai, India Oct 2009 - Apr 2010

Mainframe Application Developer

Performance Measurement Convergence (PMC)

Description:

The Performance Measurement Convergence (PMC) system is a reporting/monitoring system in which financial statistics (rates of return) are generated and reported for all the Financial Advisers (FAs) in Merrill Lynch USA. These reports are visible to both clients and the senior managers who make the business decisions.

Responsibilities:

Understanding the functional specifications and converting them into technical specifications.

Preparing test case documents and test plans.

Construction of code based on technical specifications.

Unit testing and System Testing (if required).

Updating all relevant system documents, user documents and task related SMDP (Systems Maintenance and Development Process) documents.

Provide production support for weekly batch cycles.

Analysis and feasibility assessment of requirements and delivery deadlines.

Environment:

Z/OS, DB2, COBOL, JCL, VSAM, TSO/ISPF, Endevor, FILE-AID, DFSORT, SPUFI, and Xpediter.

MetLife USA Apr 2006 – Sep 2009

Application Developer

MetLife CCM MARS Application

Description

This system broadly acts as an Agency Management System. It processes agents' performance across the various lines of business from all the subsidiaries of MetLife US, determines their contributions to policy sales by crediting them with a percentage of the policy count, and then feeds the data to the MARR (Metropolitan Agency Recognition Reporting) system, where the commissions and rewards to the agents for their performance are determined.

Responsibilities:

Analysis of the Change/Service Requests.

Understanding the functional specifications and converting them into technical specifications.

Query resolution with the onsite coordinator via e-mail or conference call.

Preparing test case documents and test plans.

Unit testing and System Testing (if required).

Updating all relevant system documents, user documents and task related SMDP (Systems Maintenance and Development Process) documents.

Provide production support for daily batch cycles.

Environment:

Z/OS-390 Operating System, DB2, COBOL, JCL, VSAM, TSO/ISPF, Endevor, FILE-AID, DFSORT, SPUFI, and Xpediter.
