
Data Manager

Location:
New Castle, DE, 19720
Salary:
90000
Posted:
July 17, 2017


Resume:

Ramji Orathur Ranganathan

**** ********** **** **** ** - 19701

PHONE +1-980-***-**** E-MAIL – ac1cgj@r.postjobfree.com

Summary:

1.10 years of experience in analysis, design, development, testing, and maintenance of web technology applications and client-server environments.

2.Strong exposure to COBOL and JCL, developing job flows, schedules, and batch processing.

3.Expertise in working with DB2 8.0/9.0/10 databases and in writing triggers, COBOL stored procedures and native stored procedures.

4.Working knowledge of the DataStage 9.1 ETL tool and basic knowledge of UNIX shell scripting for job scheduling.

5.Participated in requirements analysis, reviews and working sessions to understand the systems and system design.

6.Used DataStage Designer to create parallel and sequencer jobs, and DataStage Director to debug and monitor job execution.

7.Used different partitioning techniques while designing DataStage jobs.

8.Strong data warehousing ETL experience using Informatica PowerCenter 9.1/8.6.1/8.5/8.1/7.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).

9.Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.

10.Strong experience in dimensional modeling using star and snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.

11.Experience using the SAS ETL tool, Talend, and SAS Enterprise Data Integration Server.

12.Strong experience with the full software development life cycle (SDLC) of mainframe-based applications.

13.Experience writing high-level and low-level design documents.

14.Created and presented estimates, proposals, and gap analysis documents to client leadership.

15.Proficiency in developing "DataStore" applications that receive data from upstream systems, load and maintain it, and send it to downstream applications.

16.Designed and developed various mainframe programs using COBOL and Easytrieve.

17.Designed and developed mainframe job flows using JCL.

18.Extensive experience with sort utilities such as DFSORT, SYNCSORT, and ICETOOL.

19.Strong experience with IBM DB2 utilities for LOAD, UNLOAD, and FASTLOAD.

20.Experience creating new tools in REXX and integrating them with CLISTs so they can be invoked using line commands.

21.Proficiency in scheduling the batch flows using the CA7 scheduler.

22.Working experience with AUTOSYS for scheduling DataStage sequencer jobs.

23.Strong experience in VSAM file processing and Message Queue (MQ) processing.

24.Experience in version control tools like ChangeMan and Endevor.

25.Extensive experience with debugging tools like Xpediter, Mainframe Express, and IBM's Debug Tool.

26.Designed and coded a job monitoring tool that uses the SARBCH utility to pull job details from SAR and generate a report for the LoB.

27.Strong experience with DB2 tools like PLAT and BMC for viewing and editing the properties of DB2 components and for evaluating query performance using EXPLAIN.

28.Experience with MS SQL Server 2005/2008 and with creating DTS/SSIS packages for extracting data from databases on various servers.

29.Experience writing technical specifications for IDMS Logical Record Facility.

30.Worked with IDD, ADSA, ADSO, ADSC, MAPC, DMLO, DDDL, OLQ and OLP.

31.Experience using DB tools like IBM Data Studio V3.1/4.1 and TOAD for developing native stored procedures.

32.Expertise in common mainframe tools like TSO/ISPF, File-Aid, File-Aid for DB2, SPUFI, PDSMAN, and SDSF.

33.Worked on multiple operating systems, including Windows 2000, XP, and 7.

34.Proficiency in application documentation using MS Office tools and in version control using VSS and SharePoint.

35.Extensively used Compass, a commercial cash management tool.

Education:

Bachelor of Engineering (Mechanical) from Adhiparasakthi College of Engineering, Melmaruvathur (Affiliated to Anna University, Chennai).

Technical Skills:

Languages

COBOL, Easytrieve, REXX

Platforms

Windows 9X/2000/XP/7/8, z/OS, MVS

Database

DB2 V8.1/V10, Microsoft SQL Server 2005/2008/2012, RDBMS concepts

Utilities

DFSORT, SYNCSORT, ICETOOL, IBM utilities for DB2, NDM/FTP, VSAM

Tools

TSO/ISPF 5.9, SPUFI, File-Aid 9.1.0, PDSMAN, CA-7, Xpediter, SAR, SDSF, JCLPREP, Endevor, ChangeMan, Mainframe Express, IBM Debug Tool, HP Quality Center, IBM Data Studio

ETL Tools

IBM DataStage 8.5, MS SQL Server SSIS/DTS packages

DB Tools

File-Aid for DB2, PLAT, BMC, QMF

Domain

Consumer and Retail Banking

Professional Experience:

UST Global, Sunnyvale, CA Jan' 2016 – Present

Senior Mainframe Developer

Project: Enterprise Business Analytics

Client: NetApp

Description:

Enterprise Business Analytics collates NetApp's sales, order processing, and inventory data from various NetApp applications into a DB2 database, which serves as the source for generating reports that help the business grow.

Responsibilities:

Worked in an Agile environment and helped the team plan sprints by identifying modules and creating stories.

Created COBOL-DB2 modules wherever required to apply business rules, format data, and insert, update, or delete rows in the database (a minimal sketch appears after this list).

Created JCL to frame the job flows for the various modules within the application.

Coded native stored procedures that let the application UI interact with the DB2 database; translated the requirements into stored procedure calls and designed the application around them.

Used IBM Data Studio extensively to develop, test, and deploy the native stored procedures.

Extensively used BMC tools and IBM Data Studio to run EXPLAIN on stored procedure queries, understand the access paths, and optimize the queries accordingly.

Used File-Aid for DB2, QMF and SPUFI for accessing the DB2 database. Created DCLGENs and used them in COBOL modules.

Used database objects such as views and indexes, and features such as triggers, to handle and maintain the data.

Used the Endevor tool to maintain COBOL modules, JCL, parms, and procs, and to promote them to higher environments.

Used IBM Fault Analyzer to debug the COBOL modules and perform unit testing.

Used NDM and FTP as the file transfer protocols between mainframe LPARs and between the mainframe and midrange servers, respectively.

Used CA-7 as the scheduler tool for all the batch flows inside the application.

Created Easytrieve programs and used sort utilities such as SyncSort and IBM DB2 utilities via JCL, framing the job flows for batch processing of application data.

Used the JOINKEYS feature of the SyncSort utility to compare data sets more efficiently.

Created unit test scripts, executed the tests, and documented the results.

Responsible for production implementation, install verification, and preparation of installation documents.
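
For illustration only, a minimal sketch of the COBOL-DB2 pattern described in this list: a DCLGEN copybook included into the program and an embedded SQL update with SQLCODE checking. The program, copybook, table, and column names (ORDUPDT, DORDTBL, ORDER_TBL, ORD-STATUS) are hypothetical, not the client's actual objects.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. ORDUPDT.
      * Sketch: apply a business rule and update a DB2 table.
      * Table, column, and copybook names are hypothetical.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
           EXEC SQL INCLUDE SQLCA END-EXEC.
      * DCLGEN-generated copybook with host variables for ORDER_TBL.
           EXEC SQL INCLUDE DORDTBL END-EXEC.
       PROCEDURE DIVISION.
       MAIN-PARA.
           MOVE 1001      TO ORD-ID
           MOVE 'SHIPPED' TO ORD-STATUS
           EXEC SQL
               UPDATE ORDER_TBL
                  SET ORD_STATUS = :ORD-STATUS
                WHERE ORD_ID     = :ORD-ID
           END-EXEC
           IF SQLCODE NOT = 0 AND SQLCODE NOT = +100
               DISPLAY 'ORDUPDT: UPDATE FAILED, SQLCODE = ' SQLCODE
               MOVE 12 TO RETURN-CODE
           END-IF
           GOBACK.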

Environment: COBOL, DB2, JCL, Easytrieve, SyncSort, MS SQL Stored Procedures, Endevor, Xpediter, BMC EXPLAIN, CA-7, MS Visio, Quality Center, SharePoint

UST Global, Malvern, PA Sep' 2016 – Dec' 2016

Mainframe Developer

Project: Timely Deposits

Client: Vanguard

Description:

Timely Deposits created a platform for monitoring the files Vanguard receives from clients containing contribution and loan repayment transactions. The system generates a work item in the WMS application if any contribution or loan repayment transaction is missing or late relative to the payroll date.
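
As an illustration of the timeliness rule just described, a minimal COBOL sketch that flags a late file by comparing the received date against the payroll date. The 5-day threshold and all names are assumptions made for the example, not Vanguard's actual business rule.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. TDCHECK.
      * Sketch: flag a WMS work item when a contribution or loan
      * repayment file arrives late relative to the payroll date.
      * The 5-day threshold is assumed for illustration only.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-PAYROLL-DATE   PIC 9(8) VALUE 20160901.
       01  WS-RECEIVED-DATE  PIC 9(8) VALUE 20160909.
       01  WS-DAYS-LATE      PIC S9(4) COMP.
       PROCEDURE DIVISION.
           COMPUTE WS-DAYS-LATE =
                 FUNCTION INTEGER-OF-DATE (WS-RECEIVED-DATE)
               - FUNCTION INTEGER-OF-DATE (WS-PAYROLL-DATE)
           IF WS-DAYS-LATE > 5
               DISPLAY 'TDCHECK: FILE IS ' WS-DAYS-LATE ' DAYS LATE'
               DISPLAY 'TDCHECK: CREATE WMS WORK ITEM'
           END-IF
           GOBACK.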

Responsibilities:

Worked in an Agile environment and helped the team plan sprints by identifying modules and creating stories.

Created COBOL-DB2 modules wherever required to apply business rules, format data, and insert, update, or delete rows in the database.

Created JCL to frame the job flows for the various modules within the application.

Used the Endevor tool to maintain COBOL modules, JCL, parms, and procs, and to promote them to higher environments.

Used Mainframe Express to debug the COBOL modules and perform unit testing.

Used the BMC EXPLAIN tool to analyze queries and optimize them for performance.

Used the InSync DB2 tool for browsing DB objects from TSO sessions.

Used database objects such as views and indexes, and features such as triggers, to handle and maintain the data.

Used the JOINKEYS feature of the SyncSort utility to compare data sets more efficiently.

Created unit test scripts, executed the tests, and documented the results.

Helped the testing team with data setup by creating work items in the WMS application for integration testing.

Responsible for production implementation, install verification, and preparation of installation documents.

Environment: COBOL, DB2, JCL, DFSORT, Endevor, Mainframe Express, BMC EXPLAIN, MS Visio, InSync, Agility

Infosys Limited, Newark, DE June' 2015 – Sep' 2016

IBM Datastage/DB2 Developer

Project: BACARDI (Cards Data Management)

Client: Bank of America

Description:

Bank of America Card Information System (BACARDI) is a card services data mart hosting card and transaction information for consumer, small business, and large commercial accounts.

Responsibilities:

Responsible for preparing the project estimates and high-level design for the identified requirements.

Prepared the low-level design documents by logically designing the job flows to satisfy all requirements.

Served as the application lead programmer and onsite coordinator, working on the ETL DataStage job flows that receive and send cards data to downstream systems and on the native stored procedures that interact with other systems to get customer data.

Used IBM Data Studio to develop and test the stored procedures.

Led the requirements discussions and designed the stored procedure specifications and input parameters.

Created DataStage jobs using stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, and Row Generator.

Worked with Join, Lookup (normal and sparse), Merge, Sequential File, Data Set, File Set, and Lookup File Set stages, and created source table definitions in the DataStage Repository.

Used the DataStage Director and its run-time engine to schedule and run the solution, test and debug its components, and monitor the resulting executables on an ad-hoc or scheduled basis.

Worked with metadata definitions and with the import and export of DataStage jobs using DataStage Manager.

Developed job sequencers with proper job dependencies, job control stages, and triggers.

Scheduled DataStage jobs using the AUTOSYS scheduling tool and used UNIX shell scripts to run DataStage jobs.

Developed several jobs that improved performance, reducing runtime through different partitioning techniques.

Environment: IBM Information Server V9.1 (DataStage, QualityStage, Information Analyzer), UNIX shell scripting, Windows NT/XP, IBM Data Studio, AUTOSYS, VB scripting

Infosys Limited, Jersey City, NJ Apr' 2014 – May' 2015

Senior Mainframe Developer

Project : Client Notes (CNT)

Client : Bank of America

Description:

Client Notes is an enterprise-wide application that stores the notes gathered whenever a client interaction occurs. The application has a front end that helps associates serve customers by displaying the notes gathered during previous interactions.

Responsibilities:

Served as the onsite coordinator for the application, mentoring the 8-member offshore team.

Mentored the offshore team, providing technical guidance and helping them understand the business.

Responsible for requirements gathering, impact analysis, project estimation, scheduling, and tracking deliverables.

Worked as a team to design the database needed for the application; created logical and physical data model documents and set up a normalized database.

Coded native stored procedures that let the application UI interact with the DB2 database; translated the requirements into stored procedure calls and designed the application around them.

Used IBM Data Studio extensively to develop, test, and deploy the native stored procedures.

Extensively used BMC tools and IBM Data Studio to run EXPLAIN on stored procedure queries, understand the access paths, and optimize the queries accordingly.

Used File-Aid for DB2, QMF and SPUFI for accessing the DB2 database. Created DCLGENs and used them in COBOL modules.

Responsible for data conversion from other systems; designed, coded, and tested mainframe job flows to convert data from a SQL Server database to the DB2 database.

Used the Endevor tool to maintain COBOL modules, JCL, parms, and procs, and to promote them to higher environments.

Used NDM and FTP as the file transfer protocols between mainframe LPARs and between the mainframe and midrange servers, respectively.

Used CA-7 as the scheduler tool for all the batch flows inside the application.

Used SharePoint as the version control tool for the project documents.

Environment: COBOL, DB2, JCL, Easytrieve, SyncSort, MS SQL Stored Procedures, Endevor, Xpediter, BMC EXPLAIN, CA-7, MS Visio, Quality Center, SharePoint

Infosys Limited, Charlotte, NC Apr' 2013 – Mar' 2014

Senior Mainframe Developer

Project : Cash Vault Services (CVS)

Client : Bank of America

Description:

Cash Vault Services are extended to Bank of America's high-net-worth customers, covering cash vault processes such as deposit processing, currency verification, change order fulfillment, and other cash transportation services.

The CVS application automates the processes within the cash vault using a commercial software product from G&D that is widely used in the cash management industry.

The data from each cash vault database is consolidated and sent to a mainframe system, where it is stored in a DB2 database and then sent downstream for posting.

Responsibilities:

Served as the onsite coordinator for the application, mentoring the 5-member offshore team.

Responsible for requirements gathering, impact analysis, project estimation, scheduling, and tracking deliverables.

Managed the change request procedures for identified bug fixes and served as the point of contact for all enhancement installations.

Provided 24/7 on-call support for both mainframe and mid-range incidents.

Mentored the offshore team, providing technical guidance and helping them understand the business.

Responsible for root-cause analysis, incident research, and ticket resolution.

Analyzed COBOL-DB2 modules as part of incident research and identified the issues and fixes.

Oversaw the code development and was responsible for all deliverables and interactions with the client.

Executed Easytrieve programs, sort utilities such as SyncSort, and IBM DB2 utilities via JCL, and framed the job flows for batch processing of application data.

Created a job monitoring tool in REXX that uses the SARBCH utility to pull job details from SAR and generate a report for the LoB.

Used the Endevor tool to maintain COBOL modules, JCL, parms, and procs, and to promote them to higher environments.

Used SharePoint as the version control tool for the project documents.

Extensively used Compass, a commercial cash management tool used in our cash vaults.

Created DTS packages using MS SQL Server 2000 for accessing data from various vault servers.

Created SSIS packages using MS SQL server 2008 for extracting data from various vault databases.

Used SQL Server Management Studio from MS SQL Server 2008 to query the databases.

Created stored procedures to access MS SQL server DB and extract/insert/update data.

Used File-Aid for DB2, QMF and SPUFI for accessing the DB2 database. Created DCLGENs and used them in COBOL modules.

Used NDM and FTP as the file transfer protocols between mainframe LPARs and between the mainframe and midrange servers, respectively.

Used CA-7 as the scheduler tool for all the batch flows inside the application.

Environment: COBOL, DB2, JCL, REXX, Easytrieve, SyncSort, DFSORT, Native Stored Procedures, ChangeMan, Xpediter, BMC EXPLAIN, CA-7, Compass, MS SQL Server 2008, DTS and SSIS Packages, MS Visio, Quality Center, SharePoint

Infosys Limited, Chennai Mar’ 2010 – Mar’ 2013

Mainframe Developer

Project : Offers Federated Repository (OFR)

Client : Bank of America

Description:

OFR is a data repository application that maintains customer offer data and serves as the real-time data source for the Offers Management (OM) application. OFR collects customer data from various applications; OM uses that data for a particular customer to generate appropriate offers, which are stored back in the OFR database. Generated offers are pulled in real time to paint the teller screen, so the teller can market them to the customer.

Responsibilities:

Assisted the onshore team with requirements gathering by participating in JAD sessions.

Responsible for preparing the project estimates and high-level design for the identified requirements.

Prepared the low-level design documents by logically designing the job flows to satisfy all requirements.

Served as the application lead programmer, leading a 6-person development team and responsible for all deliverables from the offshore team.

Created COBOL-DB2 modules wherever required to apply business rules, format data, and insert, update, or delete rows in the database (see the cursor-processing sketch after this list).

Executed Easytrieve programs, sort utilities such as SyncSort, and IBM DB2 utilities via JCL, and framed the job flows for batch processing of application data.

Used the ChangeMan tool to maintain COBOL modules, JCL, parms, and procs, and to promote them to higher environments.

Used SharePoint as the version control tool for the project documents.

Mentored the team, peer-reviewed the code modules, and owned the application at offshore.

Created native stored procedures that can be executed by web service APIs to fetch, insert, or update data in the database.

Used the BMC EXPLAIN tool to analyze queries and optimize them for performance.

Used database objects such as views and indexes, and features such as triggers, to handle and maintain the data.

Used a Q Replication (Q-Rep) setup to maintain a copy of the database on a different mainframe LPAR as part of the disaster recovery process.

Used ICETOOL and the JOINKEYS feature of the SyncSort utility to compare data sets more efficiently.

Created a FileCompare tool in REXX to compare up to 5 files on a given key; the tool takes file names and key positions as input.

Created and maintained code review checklists for MF part of the application.

Created integration test scripts, executed the tests, and documented the results using the Quality Center tool.

Supported the performance testing team with data setup, oversaw the performance testing process, and was responsible for sign-off from the development team.

Did a PoC on the available DATACLAS parameters for mainframe data sets, demonstrating large DASD savings.

Created cards for scheduling the jobs on the CA-7 scheduler.

Responsible for production implementation, install verification, and preparation of installation documents.

As the DP (Defect Prevention) Anchor, responsible for defect prevention activities within the team: tracking defects, performing root-cause analysis, and identifying and executing action items to keep the same defects from recurring.
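
For illustration, a minimal sketch of the cursor-driven batch pattern that COBOL-DB2 modules like those above typically follow: declare a cursor, open it, fetch until SQLCODE +100, and close. The table and host-variable names (ACCT_TBL, WS-ACCT-ID) are hypothetical, not the client's.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. ACCTBAT.
      * Sketch of a cursor-driven batch read; names are hypothetical.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
           EXEC SQL INCLUDE SQLCA END-EXEC.
       01  WS-ACCT-ID   PIC S9(9)     COMP.
       01  WS-BALANCE   PIC S9(13)V99 COMP-3.
           EXEC SQL
               DECLARE ACCT-CSR CURSOR FOR
               SELECT ACCT_ID, BALANCE
                 FROM ACCT_TBL
                WHERE ACCT_STATUS = 'OPEN'
           END-EXEC.
       PROCEDURE DIVISION.
       MAIN-PARA.
           EXEC SQL OPEN ACCT-CSR END-EXEC
      *    Fetch until SQLCODE +100 (end of data) or an error.
           PERFORM UNTIL SQLCODE NOT = 0
               EXEC SQL
                   FETCH ACCT-CSR
                    INTO :WS-ACCT-ID, :WS-BALANCE
               END-EXEC
               IF SQLCODE = 0
                   DISPLAY 'ACCT ' WS-ACCT-ID ' BALANCE ' WS-BALANCE
               END-IF
           END-PERFORM
           EXEC SQL CLOSE ACCT-CSR END-EXEC
           GOBACK.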

Environment: COBOL, DB2, JCL, Easytrieve, SyncSort, MS SQL Stored Procedures, Endevor, Xpediter, BMC EXPLAIN, CA-7, MS Visio, Quality Center, SharePoint

Infosys Limited, Chennai Oct’ 2008 – Feb’ 2010

Mainframe Developer

Project : Federated Relationship DataStore (FRE)

Client : Bank of America

Description:

FRE is a data repository application that collects customer relationship data from various bank applications and maintains it in a single database. This data feeds the bank's decision-making process when selling its products to a particular customer.

Responsibilities:

Created high-level and low-level design documents from the requirements documents.

Worked as a team to develop the database for the application: logically grouped the data into tables, defined the columns and constraints, and created and executed DDL statements.

Worked on mainframe LPARs with good exposure to TSO and ISPF.

Created JCL to frame the job flows for the various modules within the application.

Created COBOL-DB2 programs to format data, apply business rules, and insert, update, or delete rows in the database.

Used ChangeMan as the version control tool for the COBOL modules and JCL code.

Prepared unit test scripts and extensively unit tested the COBOL-DB2 modules and mainframe job flows.

Extensively used Xpediter as the debugging tool to test the code modules.

Used the IBM LOAD and UNLOAD utilities for bulk loading and extraction of data; handled very high data volumes in batch processing.

Used File-Aid for DB2, QMF and SPUFI for accessing the DB2 database. Created DCLGENs and used them in COBOL modules.

Used the Visual Explain tool to analyze and optimize the performance of SQL queries.

Created DB2 stored procedures in COBOL for manipulating and fetching data for the web server calls made from the front-end application (sketched after this list).

Created COBOL programs for accessing, reading, and editing VSAM data sets (see the VSAM sketch after this list).

Created Easytrieve programs for file comparison, data manipulation, and formatting activities.

Used both SyncSort and DFSORT utilities for data sorting.

Used NDM and FTP as the file transfer protocols between mainframe LPARs and between the mainframe and UNIX servers.
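
Two illustrative sketches of patterns named in this list; every program, file, table, and field name here is hypothetical. First, the general shape of a DB2 stored procedure written in COBOL: DB2 passes the parameters declared on CREATE PROCEDURE through the LINKAGE SECTION.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. GETNAME.
      * Sketch of a COBOL stored procedure: returns a customer name
      * for web-server calls. Names are hypothetical.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
           EXEC SQL INCLUDE SQLCA END-EXEC.
       LINKAGE SECTION.
       01  LK-CUST-ID    PIC S9(9) COMP.
       01  LK-CUST-NAME  PIC X(30).
       PROCEDURE DIVISION USING LK-CUST-ID, LK-CUST-NAME.
           EXEC SQL
               SELECT CUST_NAME
                 INTO :LK-CUST-NAME
                 FROM CUST_TBL
                WHERE CUST_ID = :LK-CUST-ID
           END-EXEC
           IF SQLCODE = +100
               MOVE SPACES TO LK-CUST-NAME
           END-IF
           GOBACK.

Second, a keyed random read of a VSAM KSDS, the kind of access the VSAM programs above perform:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. VSAMRD.
      * Sketch: random read of a VSAM KSDS by key.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT CUST-FILE ASSIGN TO CUSTKSDS
               ORGANIZATION IS INDEXED
               ACCESS MODE  IS RANDOM
               RECORD KEY   IS CUST-KEY
               FILE STATUS  IS WS-STAT.
       DATA DIVISION.
       FILE SECTION.
       FD  CUST-FILE.
       01  CUST-REC.
           05  CUST-KEY    PIC X(10).
           05  CUST-NAME   PIC X(30).
       WORKING-STORAGE SECTION.
       01  WS-STAT         PIC XX.
       PROCEDURE DIVISION.
           OPEN INPUT CUST-FILE
           MOVE '0000012345' TO CUST-KEY
           READ CUST-FILE
               INVALID KEY
                   DISPLAY 'VSAMRD: NOT FOUND, STATUS=' WS-STAT
               NOT INVALID KEY
                   DISPLAY 'VSAMRD: ' CUST-NAME
           END-READ
           CLOSE CUST-FILE
           GOBACK.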

Environment: COBOL, DB2, JCL, Easytrieve, SyncSort, MS SQL Stored Procedures, Endevor, Xpediter, BMC EXPLAIN, CA-7, MS Visio, Quality Center, SharePoint

Infosys Limited, Chennai Jul’ 2007 – Sep’ 2008

Mainframe Developer

Project : Relationship Manager Platform (RMP)

Client : Royal Bank of Scotland

Description:

RMP is an application that helps the Relationship Manager collate, maintain, and use customer data from various sources inside and outside the bank, supporting the decision-making process of selling the right products to the right customers.

This was a migration project in which the existing PL/SQL application was rebuilt with a Java front end and a COBOL-DB2 backend.

Responsibilities:

Created high-level design documents by analyzing the PL/SQL code.

Created low-level design documents from the HLD for the mainframe back-end system.

Used VSS as the version control tool for maintaining all project documents.

Translated the design into code by writing COBOL-DB2 programs.

Created and executed bind jobs.

Used Endevor to create, stage, and deploy COBOL packages into the test and production environments.

Created mainframe jobs using JCL to execute the code modules and framed the job flows to satisfy the given requirements.

Prepared unit test scripts and extensively unit tested the COBOL-DB2 modules and mainframe job flows.

Extensively used Xpediter as the debugging tool while unit testing the COBOL-DB2 modules.

Prepared system test cases and supported integration testing with data setup requests and result validation.

Extensively used QMF, File-Aid for DB2, IDCAMS Utilities and IBM Load and unload utilities while working on data setup requests.

Used CA-7 for scheduling the job flows in System testing and UAT environment.

Prepared the end-of-schedule report and conducted the minutes-of-meeting with the team (both offshore and onsite) at the end of each schedule.

Created and maintained the defect tracker, tracking every defect and review comment to closure.

Environment: COBOL, DB2, JCL, Endevor, Xpeditor, CA-7, IBM Load and Unload, File-Aid, File-Aid for DB2, IDCAMS, QMF.


