
Data Project

Location:
United States
Posted:
April 27, 2016


Resume:

SUMMARY

Around *.* years of experience in the design and development of Business Intelligence and Data Warehousing solutions.

* ***** ** ********** ** all phases of the Data Warehousing and Data Migration life cycle using Informatica and DataStage.

4.4 years of experience in Informatica; proficient in requirements gathering, analysis, design, development, production support and deployment for Data Warehouse projects.

Incorporated various data sources such as Oracle, DB2, Teradata, XML, Excel and sequential files into the staging area.

Strong Knowledge of Data warehousing concepts, Relational Database Management Systems and Dimensional modeling (Star schema and Snowflake schema).

Extensive experience developing ETL programs for data extraction, transformation and loading using Informatica PowerCenter.

Developed complex mappings using varied transformation logic, including Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Union and Update Strategy transformations.

Extensive Informatica performance-tuning experience, identifying and resolving source-level, target-level and mapping-level bottlenecks.

3.8 years of experience in DataStage (Parallel and Server editions).

Extensive use of the DataStage client components: DataStage Director, DataStage Manager, DataStage Designer and DataStage Administrator.

Experienced with the newer enhancements in IBM WebSphere DataStage: Multiple Job Compile, the Surrogate Key Generator stage, Job Report and message handler options.

8 months of experience in Mainframe technologies such as COBOL, JCL, DB2 and VSAM.

Experience scheduling batches using Autosys, Control-M, CA7 and the DataStage scheduling tool.

Expertise in debugging, troubleshooting and performance tuning.

Broad IT industry exposure across Finance, Insurance, Retail and Health Care.

Ability to solve complex technical problems and assimilate information rapidly.

Excellent Communication and Interpersonal skills, resulting in good rapport with employees and executive management.

Excellent team player with strong problem-solving and troubleshooting capabilities.

PROFESSIONAL EXPERIENCE

Working as a Technical Lead at Capgemini (formerly IGATE / Patni Computers Ltd) from Oct 2010 to date.

Worked as a Software Engineer at Zensar Technologies from July 2007 to Sep 2010.

Worked onsite in the USA as a Senior Developer / Technical Lead for 23 months, holding a valid H-1B visa.

Worked onsite in the UK as a team member for 8 months.

SKILLS

ETL Tools: Informatica v8.x/9.x, DataStage (Parallel and Server editions)

Operating Systems: Windows XP, Windows NT Server, MVS and UNIX

Databases: Oracle 9i/10g, DB2

Languages & Software: SQL, COBOL, JCL, CICS and Core Java

Tools and Schedulers: Autosys, Tidal, CA7, Control-M, ISPF, FILE-AID

Job Title: Informatica Senior Developer

The Bank of Tokyo-Mitsubishi UFJ, Ltd., USA Oct 2015 to date

Project: OFSAA ALM & LRM Implementation

The purpose of this project is to design, develop and implement solutions integrating the existing OFSAA project with the Tahoe and Wishbone projects. The Bank of Tokyo-Mitsubishi UFJ, Ltd. ("BTMU") acquired these two products as part of its merger and acquisition process.

OFSAA is a Financial Planning & Analysis (FP&A) tool used for Federal Reserve 2052a regulatory reporting. The bank's liability and asset data is sourced from the Data Warehouse and fed to the OFSAA engine for regulatory-report processing. Complex ETL logic transforms the existing data so that the OFSAA tool can consume it as-is.

Skills used: Informatica PowerCenter v9.1, SQL Server, Tidal, Lotus Notes, Microsoft Visio.

Responsibility:

Reviewed the physical translations of business rules entered in the metadata manager and held working sessions with the data warehouse team to clarify the entries.

Development and testing of the business rules that are coded in Informatica ETL.

Develop and test report components.

Research and resolve any reconciliation discrepancies based on technology pull.

Responsible for creating Test cases, Test reports, Unit testing and fixing bugs.

Responsible for analysis and preparing the data lineage documents for database objects.

Prepared code walkthrough documents for the assignments worked on.

Used Mapplets and Reusable Transformations to avoid redundant transformation logic and promote modularity.

Tuned Informatica session performance for large data files by increasing the block size, data cache size, sequence buffer length and target-based commit interval.

Involved in Unit and Integration testing of Mappings and sessions.

Interacted with business users and coordinated with the offshore team.

Involved in the UAT process and bug fixing.

Job Title: Informatica Senior Developer

Prudential Retirement, USA Feb 2014 to Sep 2015

Project: GPVAL Rewrite

For nearly 140 years, Prudential Financial, Inc., has helped individual and institutional customers grow and protect their wealth. Today, we are one of the world's largest financial services institutions with operations in the United States, Asia, Europe, and Latin America.

Retirement security is one of the most critical issues Americans face today. As part of Prudential's commitment to helping individuals plan for a comfortable retirement, the GPVAL project extracts data from Mainframe systems and provides versioned ETL data to the Poly system for financial calculations.

Skills used: Informatica PowerCenter v9.1, SQL Server, Autosys, Lotus Notes, Windows 2000/XP/7, Microsoft Visio, DB2.

Responsibility:

Responsible for requirements gathering and IT review. Interacted with business users on the design of technical specification documents.

Involved in creating logical and physical data models using MS Visio based on business requirements.

Prepared ETL mapping specification document.

Developed various Mappings, Mapplets, and Transformations for data marts and Data warehouse.

Developed complex mappings using varied transformation logic, including Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Union and Update Strategy transformations.

Responsible for monitoring all running, scheduled, completed and failed sessions; debugged the mappings of failed sessions.

Used Informatica features to implement Type I, II, and III changes in slowly changing dimension tables.

Used Mapplets and Reusable Transformations to avoid redundant transformation logic and promote modularity.

Tuned Informatica session performance for large data files by increasing the block size, data cache size, sequence buffer length and target-based commit interval.

Involved in Unit and Integration testing of Mappings and sessions.

Interacted with business users and coordinated with the offshore team.

Job Title: Informatica Developer

MetLife Insurance, USA July 2013 to Jan 2014

Project: MetLife Insurance

MetLife is a US-based insurance client. iTracker is the MetLife web portal where anyone directly or indirectly involved in MetLife's business can access their account details. The data pertaining to MetLife's businesses resides in a DB2 data warehouse. Many vendors supply information on a daily and weekly basis; this data is staged in an Oracle database and, after transformation, the final data is loaded into the DB2 data warehouse.

Skills used: Informatica PowerCenter 9.1/8.6.1, Oracle 10g, SQL, SQL Navigator 4.5, UNIX (PuTTY), UltraEdit, Autosys, HP Service Desk, Lotus Notes, Windows 2000/XP/7, Microsoft Visio.

Responsibility:

Responsible for requirements gathering and IT review. Interacted with business users on the design of technical specification documents.

Involved in creating logical and physical data models using MS Visio based on business requirements.

Developed various Mappings, Mapplets and Transformations for data marts and Data warehouse.

Used Mapplets and Reusable Transformations to avoid redundant transformation logic and promote modularity.

Tuned Informatica session performance for large data files by increasing the block size, data cache size, sequence buffer length and target-based commit interval.

Involved in Unit and Integration testing of Mappings and sessions.

Job Title: DataStage Senior Developer

Assurant Employee Benefits, USA May 2012 to June 2013

Project: Assurant Employee Benefits DWH

AEB is part of Assurant Inc., the US operations of Assurant. AEB is a leading group insurance service provider specializing in health, dental, vision, disability and life insurance. To provide services effectively, AEB has multiple applications spread across different technologies and databases; a few examples are Compass, CaTs, Solar, GF and CF. The OA (Online Advantage) application provides dental, vision, policy, benefits, member login, claims and related information according to each user's role.

Skills used: IBM InfoSphere DataStage v8.7 (Parallel and Server), Ascential DataStage v7.5 (Server), Oracle 10g, SQL, SQL Navigator 4.5, UNIX (PuTTY), UltraEdit, Autosys, HP Service Desk, Lotus Notes, Windows 2000/XP/7, Microsoft Visio.

Responsibility:

Used DataStage Designer to develop processes for extracting, cleansing, transforming and loading data into the Oracle data warehouse database.

Prepared technical specifications and Detail Design Documents (DDD) for the development of DataStage Extraction, Transformation and Loading (ETL) mappings to load data into various tables, and defined ETL standards.

Followed a step-by-step upgrade process for the Dev, Stage and Production environments.

Worked on migrating data and jobs from DataStage v7.5 to the latest version, DataStage v8.7.

Participated in requirements gathering process to bridge gap between user needs and tool functionalities.

Used DataStage Designer to import metadata into the repository and to import and export jobs between projects.

Used various DataStage Designer stages, including Sequential/Hashed File, Transformer, Merge, Oracle Enterprise/Connector, Sort, Lookup, Join, Funnel, Filter, Copy and Aggregator.

Developed several jobs to improve performance by reducing runtime using different partitioning techniques.

Populated Type I and Type II slowly changing dimension tables from several operational source files.

Performed manual unit testing of all monitored jobs and checked the loaded data for matches against the source.

Used the migrated Autosys to schedule batch jobs. Monitored job flows: putting jobs on ice, on hold or off hold, force-starting jobs, and changing status from aborted to success or inactive after fixing the jobs.

Held daily calls with the client to give status updates and gather requirements.

Coordinated with offshore team members to perform production deployments, monitor production jobs and fix production aborts within SLA.

Job Title: Informatica Developer

ARVAL, UK Oct 2010 to April 2012

Project: ARVAL Fuel Card DWH

Arval is the leading fleet and fuel management company in the UK, with over 35 years of experience in the industry. Arval manages over 120,000 vehicles and has over 1.2 million drivers using its fuel cards. It provides services to both small businesses and large corporations, including internationally managed contracts. Arval proactively researches and educates both companies and drivers on fleet issues, focusing particularly on 'the complete cost of vehicle ownership', Health & Safety and the fleet's impact on the environment.

The proposed Fuel Card Data Warehouse solution will consist of legacy and new data and changes relating to Customers, Accounts and Merchants. Arval will further use the new Data Warehouse to generate critical reports for Fuel Card Customers, Accounts, Merchants and related entities. The target solution loads data from various source files of the TSYS and TS Prime systems.

Skills used: Informatica PowerCenter v9.1, SQL Server, Tidal, Lotus Notes, Microsoft Visio.

Responsibility:

System and Requirement Analysis.

Involved in the data modeling phases: identified the dimension and fact tables and created the star schema model.

Created application and technical design document.

Development and testing of the business rules that are coded in Informatica ETL.

Develop and test report components.

Research and resolve any reconciliation discrepancies based on technology pull.

Responsible for creating Test cases, Test reports, Unit testing and fixing bugs.

Responsible for analysis and preparing the data lineage documents for database objects.

Prepared code walkthrough documents for the assignments worked on.

Used Mapplets and Reusable Transformations to avoid redundant transformation logic and promote modularity.

Tuned Informatica session performance for large data files by increasing the block size, data cache size, sequence buffer length and target-based commit interval.

Involved in Unit and Integration testing of Mappings and sessions.

Job Title: DataStage Developer

Assurant Health, USA March 2008 to Sep 2010

Project: Assurant Health DWH

Assurant Health’s mission is to be the premier provider of targeted specialized insurance products and related services in North America and selected other markets.

The project deals with ongoing production support, ticket resolution, maintenance and development work for new business changes. The data is extracted from various Mainframe sources (DB2, VSAM, IMS DB) and SQL Server, and passed to IBM AIX as flat files used as the ETL data source. DataStage 7.5 PX performs the ETL (extract, transform and load) and stores the data in the destination Oracle 9i database. Autosys handles the scheduling of DataStage jobs.

Skills used: IBM DataStage v8.1 PX, Oracle 10g, Autosys, COBOL, JCL, DB2, CA7

Responsibility:

Developed Mainframe FTP jobs to extract data from the Mainframe and load it into the DataStage staging area.

Met with the various data owners to understand the source systems.

Understanding the business requirements and designing applications to meet client needs.

Developed DataStage jobs to extract, transform and load the data from Source to Target.

Developed Parallel jobs using stages including Join, Transformer, Sort, Merge and Filter.

Conducted Unit testing and created individual test cases.

Tested the mappings and checked the quality of the deliverables.

Tuned DataStage jobs for better performance.

Resolved the Tickets raised by the client as a part of Change Request.

Monitored production jobs and fixed the production failures based on SLA.

Job Title: Mainframe Developer

Marks & Spencer, UK July 2007 to Feb 2008

Project: Integrated Customer Ordering System (ICOS).

ICOS stands for Integrated Customer Ordering System; it supports the direct-selling model, where customers place orders on the M&S website. It consists of two sub-systems, ICOS1 and ICOS2.

ICOS1 handles customer in-store orders, which are of two types: GM orders and Furniture orders. ICOS1 processes and sends GM and Furniture orders to the MUWS and AXL systems respectively to fulfill the in-store orders.

ICOS2 handles internet and phone orders. It supports GM, wine, food (via ESP), gift voucher, gift hamper and flower orders. ICOS2 also handles corporate ordering via Vertex, passing customer orders to the corresponding fulfillment systems. Amazon (Seattle) is now the front-end system for all customer website orders except Food and Corporate orders; ICOS2 receives all orders from Amazon as XML messages via an XML parser, then processes and passes them to the corresponding fulfillment systems.

Skills used: Mainframes (COBOL, VSAM and JCL) and Object Star.

Responsibility:

Requirement Analysis.

Interaction with Clients for requirements and data discussions.

Developed Object Star login screens.

Developed Mainframe jobs using COBOL, JCL and DB2.

Conducted Unit testing and created individual test cases.

EDUCATION:

B.Tech in Computer Science and Information Technology – JNTU
