Employer Email: ******@*****.***
Employer Phone: 703-***-****
Summary
Around 9 years of extensive experience in the field of Information Technology, providing Business Intelligence solutions in data warehousing and decision support systems using IBM InfoSphere DataStage 7.x, 8.0.1, 8.1, 8.7, and 9.1 and Informatica PowerCenter 6.x/7.x/8.x/9.1.
Worked with data analysts, developers, business areas, and Subject Matter Experts (SMEs), performing development activities across all phases of the project development lifecycle.
Applied data warehousing principles and best practices to development activities.
Developed post-implementation warranty (PIW) support documents and transferred knowledge to the operations team.
Experience in software analysis, testing, design, development, and production support in data warehousing using Oracle 8i, 9i, 10g, and 11g and DB2.
Worked extensively with Informatica Designer tools such as Source Analyzer, Warehouse Designer, Transformation Developer, and the Mapping and Mapplet Designers.
Worked with business systems analysts to understand requirements and prepared source-to-target mapping documents.
Developed extraction, transformation, and loading (ETL) strategies using IBM InfoSphere DataStage and Informatica.
Worked extensively with the OBIEE 11g reporting tool.
Experienced with ETL and Business Intelligence development in an Enterprise Data Warehouse environment, including identifying user requirements.
Experienced with data migration projects from Informatica to IBM InfoSphere DataStage and from one DataStage version to another.
Experience in designing slowly changing dimensions (SCDs).
Experience in complete life-cycle implementation of data warehouses.
Designed and developed Oracle PL/SQL scripts for data import/export.
Experience in performance tuning of sources, targets, jobs, sequences, and transformations.
Sound knowledge of dimensional modeling, star schemas, and snowflake schemas.
Experienced in developing data warehouses and dependent data marts.
Experience in integrating various relational and non-relational sources such as Oracle, DB2, SQL Server, flat files, XML files (queues via MQ Explorer), COBOL files, SAP R/3, and PeopleSoft.
Experience in UNIX shell scripting to support and automate the ETL process.
Extensive experience in database programming and development using SQL Server, Oracle, Access, and Teradata, including database architecture, logical and physical design, user creation, repository backup and restore, and data integration.
Expertise in SQL and PL/SQL programming, including stored procedures, functions, and triggers.
Created builds to promote code from development to test, QA, and production.
Excellent problem-solving skills with a strong technical background and good communication skills.
Ability to work in teams as well as individually; a quick learner.
Ability to meet tight deadlines and work under pressure.
Education
Master of Technology, India
Technical Background
ETL Tools
IBM InfoSphere Information Server (DataStage) 9.1/8.x/7.x, Informatica PowerCenter 9.1/8.x/7.x
RDBMS
Oracle 11g/10g/9i/8i, SQL Server 2000, IBM DB2 V9.2.
Programming Languages
SQL, PL/SQL, C, C++, COBOL, JCL, VB, UNIX Shell Scripting.
Operating Systems
Windows 98/2000/NT/XP, IBM AIX, Sun Solaris, Linux
Scheduling Tools
AutoSys 4.0, Maestro, Control-M
Other Tools
TOAD 7.5, SQL Workbench, Advanced Query Tool (AQT), SQL Assistant, Tectia, PuTTY
IBM Certified Solution Developer (v7.5)
Professional Experience
Capital One Financial (COF), USA    Aug 2015 – Oct 2016
Capgemini Pvt Ltd
Sr. ETL/Data Stage Consultant
Project Description:
The CRS Data Warehouse gives business users the ability to perform analysis over time. The data marts support both ad hoc and standard reporting, helping to answer business questions about behavior analysis, customer profitability over time, product usage and profitability, retention of profitable customers, and marketing promotions.
Raw data from the operational systems (CMS, CDM, FAS, TRAMS, CTA, STATEMENTS, TRIAD, and StrategyWare) is extracted at the first level. That data is then passed through a series of cleansing and transformation processes to produce clean, transformed data for loading into the warehouse. All required elements are kept in the PMD (Point of Maximum Data) to support future maintenance and mart builds, so this dataset can serve as a single source for all kinds of information. Summary details are then summarized from the PMD and/or any load file, depending on the summary requirement. The data warehouse process includes daily, cycle, and month-end extracts and loads.
Responsibilities:
Involved in understanding the scope of the application, the existing schema, and the data model, and in defining relationships within and between groups of data.
Prepared mapping documents for the source to stage extract and the stage to EDW ETL jobs.
Participated in data model design and ETL design.
Closely worked with the solution architecture team to resolve issues in the data model.
Prepared ETL design documents and low level design documents.
Developed DataStage parallel jobs for extracting data from heterogeneous sources.
Conducted performance tests and resolved issues to ensure that the jobs can be completed within the SLA.
Wrote PL/SQL stored procedures, packages, and triggers necessary for the system (a minimal sketch follows this list).
Wrote complex PL/SQL queries using joins to optimize query performance against the Oracle database.
Conducted code reviews for the jobs developed by the onsite and offshore teams.
Closely worked with the testing team for data loads for the system acceptance testing.
Provided assistance to the Business Analyst team regarding questions on the data model and data availability.
Created implementation plans and migration checklists (ETL and DB objects) for production deployment.
Worked with the Production Operations team to address any issues during the deployment process.
Provided production support during the project go-live and the warranty phase.
Analyzed the specifications provided by the client.
Involved in preparing low-level design documents.
Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the warehouse.
Passed values by loading parameters through parameter sets.
Used stages such as Join, Lookup, Sort, Column Import, Data Set, Sequential File, Transformer, and DB2 Enterprise.
Adhered to quality procedures related to the project.
Wrote queries to check whether the data satisfied the business rules.
Involved in unit testing.
Used DataStage Designer for exporting and importing jobs.
Created DataStage jobs, sequences, and reusable components based on requirements.
Performed testing by analyzing existing jobs and their corresponding scripts.
Interacted frequently with the client to discuss business requirements.
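A minimal PL/SQL sketch of the kind of load procedure described above; the object names (stg_account, dw_account) are hypothetical, for illustration only, not the actual CRS tables:

CREATE OR REPLACE PROCEDURE load_dw_account AS
BEGIN
  -- Insert staged rows that are not already present in the warehouse target
  INSERT INTO dw_account (account_id, account_name, open_dt)
  SELECT s.account_id, s.account_name, s.open_dt
  FROM   stg_account s
  WHERE  NOT EXISTS (SELECT 1
                     FROM   dw_account d
                     WHERE  d.account_id = s.account_id);
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;  -- surface the failure to the calling ETL job
END load_dw_account;
/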
Environment: IBM InfoSphere DataStage 8.1, DB2, WinSCP, UNIX shell scripting, and IBM AIX
OHBI BIR, USA    July 2014 – July 2015
Capgemini Pvt Ltd
ETL Developer
Project Description:
OHBI CPT is a project aimed at launching and deploying a product that is universal and generic in its approach, combines multiple sources, and can be implemented across 84 countries. The purpose of One HSBC is to implement a single Group enterprise warehousing strategy. The project covers many domains, such as deposits, customers, credit cards, mortgages, core assets and loans, and products.
Responsibilities:
Involved in requirement gathering and preparation of design documents.
Developed custom coding standards, technical specifications, and test specifications, and designed the data mart schema.
Worked on dimensional modeling and star schema modeling: fact and dimension table design, and physical and logical data modeling.
Created documents such as functional specifications, source-to-target data mapping documents, unit test cases, and data migration documents.
Designed jobs in WebSphere Transformation Extender based on requirements.
Implemented Type I/II slowly changing dimension (SCD) tables (a SQL sketch follows this list).
Imported source/target tables from the respective databases and developed jobs using Join, Lookup, Filter, Transformer, Aggregator, Funnel, and Oracle Enterprise stages.
Applied data warehousing techniques for data cleansing, slowly changing dimensions, and surrogate key assignment.
Involved in designing simple to complex Transformation Extender jobs.
Involved in project release deployment tasks.
Wrote complex PL/SQL queries using joins to optimize query performance against the Oracle database.
Wrote PL/SQL stored procedures, packages, and triggers necessary for the system.
Performed as an individual contributor.
Prepared test plans, test objectives, and test scripts.
Uploaded the test scripts into Quality Center (QC) and executed them during the integrated test cycles.
Created DataStage jobs, sequencers, and reusable components (shared containers) based on requirements.
Involved in designing simple to complex DataStage jobs.
Involved in the design, development, and deployment of DataStage parallel jobs, using stages such as Sort, Aggregator, Transformer, XML Input, XML Output, Pivot, and FTP.
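To illustrate the Type II SCD handling referenced above, a minimal Oracle-style SQL sketch; the names (dim_customer, stg_customer, dim_customer_seq, address) are assumptions for illustration, not the actual project objects. The current dimension row is end-dated when a tracked attribute changes, and a new current version is inserted with a fresh surrogate key:

-- Close the current version of any customer whose tracked attribute changed
UPDATE dim_customer d
SET    d.end_dt     = SYSDATE,
       d.current_fl = 'N'
WHERE  d.current_fl = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address    <> d.address);

-- Insert a new current version for changed and newly arrived customers
INSERT INTO dim_customer
       (customer_sk, customer_id, address, start_dt, end_dt, current_fl)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
       SYSDATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_fl  = 'Y');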
Environment: DataStage 8.1, DB2 9.5, UNIX, PuTTY, Tectia, SQL Workbench, and Control-M
Al Rajhi Bank, Malaysia    Jan 2014 – Jun 2014
Capgemini Pvt Ltd
Senior Software Programmer
Project Description:
Financial Information Trove (FIT) is a banking data warehouse product. The existing FIT product is built on Informatica Enterprise Edition, using a data model and the Informatica PowerCenter ETL tool for development. The project addresses issues with incorrect data mappings, missing data, incorrect data architecture, and ETL performance. The team is entrusted with delivering critical finance and supply/demand reports for worldwide executive decision support, using data from operational and legacy source systems. Data was extracted from flat-file sources, transformed and loaded into the staging area, and from the staging area into the warehouse, both on DB2.
Responsibilities:
Worked extensively with Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapping and Mapplet Designers.
Work mainly involved monitoring workflows, rerunning them on failures, updating data, and fixing bugs.
Interacted with the onsite team, resolved technical issues, and maintained the weekly build.
Updated mappings, sessions, and workflows as business requirements changed.
Investigated the data and verified that the build completed successfully.
Extensively used Informatica transformations such as Union, Joiner, Aggregator, Expression, Sorter, Sequence Generator, Filter, Router, Rank, Lookup, and Update Strategy to model various standardized business processes.
Environment: Informatica PowerCenter, Windows XP, DB2, Control-M, and Quality Center (QC)
Ally Basel II Reference Data, USA    Oct 2012 – Dec 2013
Capgemini Pvt Ltd
Sr.ETL Developer
Project Description:
Ally currently uses an Excel-based model to support Basel I calculations. Given the potential need to compare Basel I capital with Basel II capital for attribution analysis, a shared infrastructure in which common data sources can be leveraged for both calculations is considered appropriate. As part of the Basel II Calculator project, an intermediate layer updates the core Reveleus Calculator with data from the input files received from the various Systems of Record (SOR) or Systems of Access (SOA). This intermediate layer is the Data Acquisition and Standardization layer, or the Basel II Integration Layer (BIL).
The BIL is broadly divided into two parts: the Reference Data flows and the Integration flows. The Reference Data flow is primarily concerned with master data coming from various reference sources and uploading it into the Reference Data tables. In the Integration flow, data is received from the various SOR files and pushed into the Reveleus staging tables after enrichment and pre-processing activities.
Responsibilities:
Business requirement analysis.
Using the Agile Scrum method, partitioned deliverables into sprints and delivered them on time.
Involved in preparing low-level design documents.
Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the warehouse.
Involved in SIT support and fixed defects.
Supported the onshore team in implementing the application.
Used a macro to convert the Excel tabs into individual CSV files.
Held daily calls with the client and onshore team to discuss day-to-day status and issues.
Environment: Informatica PowerCenter 8.x, DB2, Control-M, and Quality Center (QC)
Ameriquest Mortgage, USA    Jul 2011 – Sep 2012
Capgemini Pvt Ltd
Software Engineer
Project Description:
The Enterprise Data Warehouse of Ameriquest Mortgage was built by integrating financial information across various OLTP systems. EDW marts were designed to extract data from EDW staging for end users and managerial ad hoc analysis. The Enterprise Data Warehouse is mainly used for customer status, index rates, loan allocation status, parties involved in loan origination, and total loans funded for each branch.
Responsibilities:
Worked extensively with Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapping and Mapplet Designers.
Used Informatica PowerCenter 7.1.2 to migrate data from various OLTP databases to the data warehouse.
The mappings involved extensive use of transformations such as Aggregator, Filter, Router, Expression, Joiner, and Sequence Generator.
Translated business processes into Informatica mappings for building the data marts.
Configured the mappings to handle updates and preserve existing records using the Update Strategy transformation.
Used the Debugger to test the mappings.
Delivered all deliverables on time.
Involved in unit testing.
Environment: Informatica PowerCenter 8.x, Oracle, UNIX, TOAD, and Quality Center (QC)
Euroclear Apr 2010 – Jun 2011
Capgemini Pvt Ltd
ETL Developer
Project Description:
Euroclear is a Belgium-based financial services company that specializes in the settlement of securities transactions as well as the safekeeping and asset servicing of these securities. It was founded in 1968 as part of J.P. Morgan & Co. to settle trades on the then-developing Eurobond market. Euroclear settles domestic and international securities transactions, covering bonds, equities, derivatives, and investment funds, and provides securities services to financial institutions located in more than 90 countries.
Responsibilities:
Involved in requirement gathering and preparation of design documents.
Developed custom coding standards, technical specifications, and test specifications, and designed the data mart schema.
Prepared the source-to-target mapping between the data warehouse and the target data mart databases.
Designed jobs in WebSphere Transformation Extender based on requirements.
Involved in designing simple to complex Transformation Extender jobs.
Involved in project release deployment tasks.
Prepared test plans, test objectives, and test scripts.
Uploaded the test scripts into QC and executed them during the integrated test cycles.
Created DataStage jobs, sequencers, and reusable components (shared containers) based on requirements.
Involved in designing simple to complex DataStage jobs.
Involved in the design, development, and deployment of DataStage parallel jobs, using stages such as Sort, Aggregator, Transformer, XML Input, XML Output, Pivot, and FTP.
Environment: WebSphere Transformation Extender, DataStage 8.1, DB2, and UNIX
Prime Life Insurance Corporation Ltd.    Sep 2007 – Aug 2009
Dream Tekis Software Pvt Ltd
Software Engineer
Project Description:
SOHAM Life is a web-based, scalable life insurance suite that offers an optimal solution for life insurance. It covers agent management, new business management, underwriting, policy services, claims management, and accounts management. The Agent Management module allows insurance companies to manage all types of agents. New Business Management lets insurance companies simplify the process of receiving proposals, scrutinizing proposal forms, and validating agent authenticity. The Underwriting module simplifies complex processes such as financial and medical underwriting in a cost- and time-effective manner. The Policy Services module takes care of the entire gamut of customer services, from altering policies to handling complexities such as revivals, policy loans, and assignments with ease.
Responsibilities:
Involved in functional requirement gathering.
Involved in functional design and functional testing.
Involved in creating the SRS documents.
Involved in creating the views.
Environment: Product Design Analysis and Functional Testing.