
Sr. Informatica/ETL/Data Analyst/Developer

Location:
Cranbury, NJ, 08512
Salary:
80k
Posted:
March 31, 2009

Contact this candidate

Resume:

Consultant : Basant Dagar

Available: Immediately

Currently: Working as Data Analyst, Sr. Informatica / ETL / DWH / Oracle / Teradata Developer / Architect / Administrator

Relocation: Anywhere

Rate: Negotiable

US Legal Status: H1B

SUMMARY

• An IT professional with 7+ years of experience (5+ years with Fortune 500 US clients) in Data Analysis, Informatica PowerCenter 8.x/7.x/6.x, Oracle (SQL & PL/SQL), Informatica OnDemand, IBM DataStage 7.x, Business Objects 6.x, Teradata V2R6, the SalesForce CRM solution, Data Warehousing, and UNIX.

• Strong working experience with clients across the Telecom, Financial, Life Sciences (Pharma), Insurance, and Web/domain-hosting industries.

• Excellent analytical skills in understanding client systems and organizational structure.

• Designed and architected ETL jobs for the data warehouse (DW) team.

• Provided high-level optimization ideas, including multiprocessing, to help the ETL team improve their code performance.

• Possess excellent documentation skills, prepared best-practices documents, and have experience in data modeling and data modeling tools such as Erwin.

• Knack for converting client requirements into physical and logical data models as part of Data Analysis initiatives.

• Created Erwin data models to map various data sources to the Data Warehouse.

• Worked with Informatica Server and Client tools, experience in the Data Analysis, Design, Development, Implementation, Testing, Production Support of Database/Data warehousing /Legacy applications for various industries using Data Modeling, Data Extraction, Data Transformation and Data Loading.

• Extensive experience working with clients/end-users, requirement gathering, has strong communications, interpersonal skills and has the ability to work under own initiative and respond to peaks in demand.

• Research-oriented; raise issues upfront and address them as soon as they are identified.

• Performed metadata analysis to find gaps and job timings, and to create a central repository during migration phases.

• Strong Working Experience of PowerCenter Administration, Designer, Informatica Repository Administrator console, Repository Manager and Workflow Manager.

• Worked on Administration of Informatica PowerCenter and Ascential DataStage.

• Involved in development of Informatica and DataStage jobs with required transformations such as Aggregator, Filter, Lookup, Sorter, Normalizer, Update Strategy, etc.

• Installation of Informatica patches and upgrades, user access management, deployment activities, on-call support and capacity planning.

• Strong knowledge of Data Warehouse architecture and of designing Star schemas, Snowflake schemas, FACT and dimension tables, and physical and logical data models using Erwin.

• Extensive Experience in writing SQL queries, stored procedures, functions, packages, triggers, exception handlers, Cursors, 3NF, PL/SQL records & tables.

• Involved in query-level performance tuning using Explain Plans, SQL Trace, and the TKPROF utility to pinpoint time-consuming SQL statements, then tuned them by creating indexes and forcing specific execution plans.

• Used Power Exchange to source copybook definitions and then to row-test the data from data files and VSAM files.

• Performance Tuning of Informatica Jobs by finding out the bottlenecks and by using partitioning option, loaders etc.

• Teradata experience using Teradata SQL Assistant, Teradata Administrator and data load/unload utilities like BTEQ, FastLoad, MultiLoad, FastExport, and Tpump.

• Worked on active data Warehouses with database sizes up to 10 Terabytes.

• Used Unix Shell Scripts to automate day-to-day operations.

• Experience in TriZetto’s Facets, Core Java, Security (Sun’s Identity Management) and mainframe applications.

• Working experience with tools such as Toad, PuTTY, SmartFTP, Mercury Quality Center (QC, defect tracking), and TestDirector.

• Followed complete Software Development Life Cycle (SDLC), agile methodologies in various projects.

• Uncover simpler ways of doing things.
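
As an illustration of the UNIX shell automation mentioned above, here is a minimal sketch of the kind of housekeeping script involved; the directory layout and file names are hypothetical placeholders, not code from any client engagement:

```shell
# archive_loads: sweep processed flat files into a dated archive folder.
# Paths, file names, and the .dat extension are illustrative placeholders.
archive_loads() {
  src_dir="$1"                          # where the ETL drops processed files
  arc_dir="$2"                          # archive root
  stamp=$(date +%Y%m%d)
  mkdir -p "$arc_dir/$stamp"
  for f in "$src_dir"/*.dat; do
    [ -e "$f" ] || continue             # nothing matched the glob
    mv "$f" "$arc_dir/$stamp/"
    echo "archived: $(basename "$f")"
  done
}

# Example: archive_loads /data/landing/processed /data/archive
```

A script like this would typically run from cron after the nightly load completes.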

EDUCATIONAL QUALIFICATION

• Bachelors in Technology (B.Tech) in Computer Science, Kurukshetra University, India.

CERTIFICATIONS

• Informatica PowerCenter 8 Mapping Designer (Code-S) Certified.

• Informatica PowerCenter 8 Advanced Mapping Design (Code-U) Certified.

• Oracle Certified (Exam – Introduction to Oracle9iSQL 1Z0-007).

• BrainBench RDBMS, Informatica v6, Data Warehouse concepts Certified.

• 2007 Privacy and Data Security (Gramm-Leach-Bliley Act) Certified.

TECHNICAL SKILLS:

Data warehousing Tools : Informatica PowerCenter 8.x/7.x/6.x, Power Exchange, Data Profiler, Informatica Data Quality, Kalido, Repository Server Administrator Console, IBM DataStage 7.x.

Databases : SQL Server, Oracle 10g/9i/8i, Teradata v2r6, DB2, Sybase 11.0.

CRM Solutions (SaaS) : SalesForce OnDemand CRM solution, Oracle CRM package.

Languages : SQL, PL/SQL, C, C++, HTML, COBOL, Core Java, JCL.

Database Utilities : BTEQ, FastLoad, MultiLoad, FastExport, Tpump, TOAD 8.0/7.1, SQL*Plus, SQL*Loader, SQL Navigator.

Reporting Tools : BO 6x (Business Objects).

Operating System : UNIX (Sun Solaris 8.0, AIX 5.1) and Windows NT/2000.

Scripting Languages : Shell Scripting.

Other Tools : Erwin Data Modeler, SmartFTP, WinScp, Putty, Mercury QA, Marvel ticketing.

Security : Sun's Identity Management (IDM).

PROFESSIONAL EXPERIENCE:

Client : Network Solutions, Herndon, VA

Position: Sr. ETL/Informatica Developer/Architect, Data Analyst

Duration: Aug 2008 – Present

Project: MDM (Marketing Data Mart)

Network Solutions is the largest domain provider in the USA. Joined the existing Enterprise Data System (EDS) team as Data Analyst cum ETL Architect and successfully designed, developed, and enhanced the Marketing Data Mart (MDM). Suggested key enhancements to the existing marketing data mart that resulted in more efficient marketing strategies. Analyzed, identified, and fixed bad data, and imported data from SalesForce CRM, Oracle, Teradata, and flat files on a near-real-time basis, integrating that data in one place to give users the ability to report against a near-real-time/live data warehouse.

Responsibilities:

• Understand the existing subject areas, source systems, target system, operational data, jobs, deployment processes and Production Support activities.

• Design queries for marketing data marts and then figure out how to make these queries yield comparable data.

• Discover the source data that causes the problems downstream.

• Listened intently to managers, translating what they said into data fields.

• Utilized existing Informatica, Teradata, SQL Server, SalesForce, and UNIX knowledge to deliver work and fix production issues on time in a fast-paced environment.

• Worked closely with the Business Objects reporting team to meet users' ad-hoc and other business reporting needs.

• Designed the overall ETL solution including analyzing data, preparation of high level, detailed design documents, test plans and deployment strategy.

• Proactively resolved open defects against MDM and other subject areas.

• Foraged through rows of data to recognize patterns.

• Carried out the testing strategy/validations against MDM subject area by implementing key test cases.

• Extracted/loaded data from/into diverse source/target systems like Teradata, Oracle, SalesForce, COBOL, XML and Flat Files.

• Supported nightly production jobs on a rotation basis.

• Worked closely with the Administrator for creating new users and to manage security.

• Analyzed the data models of the source & target systems to develop comprehensive mapping specifications.

• Raised tickets/issues with the respective teams in the event of production failures and followed up to get the issues fixed as early as possible.

• Monitor and stand by on nightly jobs to make sure every job runs successfully and gets completed within allotted timeframe.

• Worked on Informatica Data Quality to resolve customer address-related issues.

• Worked on Informatica OnDemand to mainly import data from SalesForce.

• Designed and developed ETL processes to load and extract data using MLOAD.

• Used Teradata's OLELoad, BTEQ, and SQL Assistant.

• Used Power Exchange to source copybook definitions and then to row-test the data from data files, etc.

Environment: Informatica Power Center 8.6, SalesForce, SQL Server, Teradata v2r5, Test Director, Oracle 10g, DB2, TOAD 8.6, UNIX AIX 5.1, Windows XP.
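
To illustrate the BTEQ-based extract work described above, here is a hedged sketch of how such a script might be generated from shell; the server, credentials, schema, and table names are all hypothetical, and the generated file would normally be fed to the Teradata bteq utility:

```shell
# make_bteq_script: write a nightly extract script for the bteq utility.
# Logon string, export path, and the mdm.orders table are placeholders.
make_bteq_script() {
  out="$1"
  cat > "$out" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.EXPORT FILE = /data/extracts/mdm_orders.out;
SELECT order_id, customer_id, order_dt
FROM   mdm.orders
WHERE  order_dt >= CURRENT_DATE - 1;
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF
}

# Usage (the actual run would then be: bteq < nightly_extract.bteq)
# make_bteq_script nightly_extract.bteq
```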

Client : AT&T, Atlanta, GA (Contractor with Capgemini)

Position: Sr. Informatica/ETL Admin/Analyst/Developer

Duration: Jul 2007 – Jul 2008

Project: BOBI (Broadband Operation Business Intelligence)

AT&T is the largest telecom company in the USA. Joined the existing onshore BI team as Sr. ETL Analyst and successfully designed and developed business solutions under the Broadband Operation and Business Intelligence (BOBI) project, which aims to fulfill AT&T's reporting needs so the company can better understand market trends, behavior, and future opportunities and improve its decision-making process. Coordinated with the business and P&A teams to understand the system requirements, then analyzed and designed ETL solutions to meet them. Involved in various successful releases addressing AT&T's reporting needs under the order-activation (OA) functional area.

Responsibilities:

• Demonstrated strong analytical, problem-solving, organizational, communication, learning and team skills.

• Analyzed the data models of the source systems to develop comprehensive mapping specifications; created POCs to prepare high-level designs, mapping specifications, low-level designs, and unit-testing plans; and monitored and resolved production issues.

• Foraged through rows of data to recognize patterns.

• Designed a database for the staging area.

• Extracted data from SQL Server, Oracle, and flat files to pull into the staging area.

• Informed the respective business lines in case an SLA was missed and took appropriate action afterwards.

• Wrote PL/SQL procedures and functions to identify certain patterns in the data and then update it per business rules.

• Carried out Data Analysis for mapping of all sources of Data involved.

• Provided Technical Consultancy on performance of a production Database.

• Modified ETL mappings/reports to correct longstanding problems.

• Analyzed loading of over 40 Files into a Data warehouse to ensure Data Integrity.

• Coordinated with the offshore team.

• Involved in designing the architecture of the system by coordinating with business.

• Single-handedly prepared the design and unit test plans (UTPs) and developed Informatica mappings for the dry-loop release while onshore team members had tight deadlines on other releases.

• Worked extensively with Type 2 dimensional tables using Informatica.

• Identified and removed bottlenecks at the source and target levels to improve the performance of mappings, sessions, and workflows, using partitioning.

• Designed database & UNIX Scripts needed to meet the system requirements.

• Optimizing SQL queries for better performance using Hints, Indexes and Explain Plan.

• Designed and developed ETL mappings using Informatica to support the reporting data mart.

• Performed day to day migrations of various Informatica objects using export-import option.

• Performed Informatica administrator functions such as creating repositories, groups, users, and folders, granting folder permissions to users, installing patches and upgrades, capacity planning, etc.

• Extensively used Informatica client tools-Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager. The objective is to extract data stored in different databases such as Oracle and DB2 and to load finally into a single Data Warehouse i.e. Oracle.

• Developed Mapplets and Transformations.

• Involved in writing shell scripts and scheduling them via cron jobs (daily, weekly, monthly).

• Involved in managing Informatica backup and restores.

• Data Modeling using Erwin.

Environment: Informatica Power Center 8.5.1, SQL Server, Sybase, Oracle 10g, DB2, TOAD 8.6, UNIX AIX 5.1, Windows XP.
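
The SLA monitoring described above (informing business lines when an SLA is missed) can be sketched as a small shell check; the job names and cutoff times are hypothetical:

```shell
# sla_check: given a job's finish time and its SLA cutoff (HH:MM, 24-hour),
# print an OK/MISSED line and return non-zero on a miss. Names are illustrative.
sla_check() {
  job="$1"; finished="$2"; cutoff="$3"
  fh=${finished%%:*}; fm=${finished##*:}
  ch=${cutoff%%:*};  cm=${cutoff##*:}
  # Strip one leading zero so 08/09 are not parsed as invalid octal.
  f=$(( ${fh#0} * 60 + ${fm#0} ))       # minutes since midnight
  c=$(( ${ch#0} * 60 + ${cm#0} ))
  if [ "$f" -gt "$c" ]; then
    echo "SLA MISSED: $job finished $finished (cutoff $cutoff)"
    return 1
  fi
  echo "SLA OK: $job finished $finished (cutoff $cutoff)"
}

# Example: sla_check wf_nightly_load "03:10" "04:00"
```

In practice the MISSED branch would also page or mail the responsible business line.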

Client : BlueCross BlueShield (CareFirst), MD

Position : Sr. Informatica Developer / Administrator

Duration: Mar 2007 – Jun 2007

Project: Administration of HSC and EDW projects

BlueCross BlueShield, United States, is one of the largest non-profit health insurance companies in the world. The services provided include health care insurance coverage, prescriptions, education, research, and marketing. The HSC and EDW projects are primarily responsible for BlueCross BlueShield's Facets, dental, provider, and other extracts, whose data is continuously changing; for this reason, Informatica's Power Exchange CDC option is used to deliver near-real-time data to the business.

Responsibilities:

• Performed Informatica administrator functions such as installing and configuring version 8.1.1, enabling/disabling and starting/shutting down the Repository and Integration services, creating relational connections, groups, users, and folders, and granting folder permissions to users.

• Performed day to day migrations of various Informatica objects using deployment groups and copy-wizard option.

• Extracted data from Facets application to pull it into the Staging area.

• Updated data into the Facets application.

• Analyzed facets data like claims, billing to resolve related subject areas issues.

• Used Facets application to open, add generations, enter and save information.

• Extensively worked with pmcmd and infaservice startup and shutdown commands.

• Involved in setting up Data-Profiler setup and creating POC of data-profiler mappings.

• Analyzed the data models of all the extracts such as facets, dental, provider and FRS.

• Performed ETL development tasks such as creating jobs using different stages, debugging them, etc.

• Worked very closely with UNIX and database administrators.

• Raised tickets with Informatica support and interacted with Informatica consultants on best practices and ongoing issues.

• Involved in migrating workflows and other objects from Informatica 6.1.2 to Informatica 8.1.1.

• Involved in managing Informatica backup and restores.

• Mentoring EDW team in their ongoing development issues, best-practices, performance tuning etc.

• Extracted and analyzed source data from various databases like DB2, Oracle.

• Extensively used Informatica client tools-Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager for resolving day to day issues.

• Extensively worked in the performance tuning of the long running ETL mappings and workflows.

• Involved in running of shell scripts and using other UNIX commands to facilitate smooth migration and development process.

• Created mapping/mapplets, reusable transformations using transformations like Lookup, Filter, Expression, Stored Procedure, Aggregator, Update Strategy etc.

Environment: Informatica Power Center 8.1.1 and 6.1.2, Datastage 7x, Data-Profiler, Oracle 10g, DB2, SqlPlus, UNIX AIX , Windows XP professional.
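
The pmcmd work mentioned above can be illustrated with a thin wrapper; the Integration Service, domain, folder, and workflow names are hypothetical, authentication flags are omitted, and a dry-run mode prints the command instead of invoking the pmcmd binary:

```shell
# start_workflow: build (and optionally run) a pmcmd startworkflow command.
# IS_PROD and Domain_ETL are placeholder service/domain names.
start_workflow() {
  folder="$1"; workflow="$2"; dry_run="${3:-yes}"
  cmd="pmcmd startworkflow -sv IS_PROD -d Domain_ETL -f $folder -wait $workflow"
  if [ "$dry_run" = "yes" ]; then
    echo "DRY RUN: $cmd"                # show what would be executed
  else
    $cmd                                # requires pmcmd on PATH
  fi
}

# Example: start_workflow HSC wf_daily_claims
```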

Client : Royal Bank of Scotland, Atlanta

Position : Sr. ETL Developer/Architect/Administrator

Duration: Jan 2005 – Feb 2007

Project: FSA-IRR (Financial Services Authority-Integrated Regulatory Reporting)

Royal Bank of Scotland, UK is the second largest bank in the UK and the seventh largest in the world. The services provided include personal, private, business, and corporate banking, annuities, research, and marketing. The Financial Services Authority (FSA) is an independent non-governmental body, given statutory powers by the Financial Services and Markets Act 2000.

FSA-IRR covers mortgage plans, policies, annuities, risks, etc. in the UK market. In Drop-1 of this project, data was picked up from the various source systems and shaped per FSA compliance rules, while CDC ran in the background to capture real-time changes and deliver near-current data to the business through the MLAR (Mortgage Lending Administration Reports) and ADR (Annuities Administration Reports). This data is used by the FSA to assess risks in the UK mortgage and tax-benefit market and to assist with the prudential supervision of firms.

Responsibilities:

• Analyzed the data models of the source systems to develop comprehensive mapping specifications.

• Designed a database for the loading and staging areas.

• Extracted data from SQL Server, Oracle, and flat files to pull into the staging area.

• Produced specifications for mapping from various source Systems to Data Warehouse.

• Performed Informatica administrator functions such as creating repositories, groups, users, and folders, and granting folder permissions to users.

• Extensively used Informatica client tools-Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager. The objective is to extract data stored in different databases such as Oracle and DB2 and to load finally into a single Data Warehouse.

• Used Teradata's BTEQ, FastLoad, MultiLoad, and SQL Assistant for data testing, analysis, and loading.

• Developed Mapplets and Transformations.

• Involved in migration of Informatica 7.1.3 to Informatica 8.1.1.

• Extensively worked in the performance tuning of the ETL mappings and workflows using partitioning, external loaders, lookup cache etc.

• Involved in writing shell scripts and scheduling them via cron jobs (daily, weekly, monthly).

• Extensively created mapping/mapplets, reusable transformations using transformations like Lookup, Filter, Expression, Stored Procedure, Aggregator, Update Strategy etc.

• Created Perl Scripts.

• Wrote JCL code to schedule various mainframe nightly jobs.

• Worked on Power Exchange for change data capture (CDC).

• Involved in managing Informatica backup and restores.

• Data Modeling using Erwin.

Environment: Informatica Power Center 8.1.1, Power Analyzer 4.1, SQL Server, JCL, Teradata V2.R6, DB2, TOAD, Perl, UNIX AIX 5.1, Windows 2000.
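
As a sketch of the repository backup housekeeping implied above: after a (hypothetical) pmrep backup step produces dated .rep files, a retention script keeps only the newest N copies. The directory layout and extension are assumptions:

```shell
# prune_backups: keep only the newest $keep repository backup files in $dir.
# Assumes one .rep file per backup run; names/paths are illustrative.
prune_backups() {
  dir="$1"; keep="$2"
  # List newest-first, then delete everything after the first $keep entries.
  ls -1t "$dir"/*.rep 2>/dev/null | tail -n +$(( keep + 1 )) | while read -r old; do
    rm -f "$old"
    echo "pruned: $(basename "$old")"
  done
}

# Example: prune_backups /infa/backups 7
```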

Client : Merck Pharmaceuticals, PA (Done from Offshore)

Position: ETL Developer/ Analyst

Duration: Dec 2003 – Dec 2004

Project: Master Data Management

Merck Pharmaceuticals is the second largest life sciences and pharmaceutical company in the world. The services provided include research, prescriptions, and marketing. Master Data Management (MDM) is a project for maintaining historical customer and sales-group data within the Merck network, eliminating the need for legacy systems and applying new technologies to their new data set.

MDM involves updating Customer, Party, Product, Organization, Employee, and Bridge-Alignment data and the Customer Information Management system.

Responsibilities:

• Involved in the administration of Informatica PowerCenter 7.1.2: installing and configuring version 7.1.2, creating relational connections, groups, users, and folders, granting folder permissions to users, etc.

• Creating Informatica Repository in Oracle 9i database to store the metadata required for the Data Mart.

• Used Oracle Data Integrator for some jobs at the initial stage of project.

• Creating folders, user groups and users within them and assigning selective privileges.

• Extensively used Informatica client tools-Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager.

• Wrote PL/SQL packages, procedures, and functions to achieve various business functionalities.

• Developed Mapplets and Transformations.

• Involved in development of Informatica Jobs.

• Extensively worked in the performance tuning of the ETL mappings and workflows.

• Involved in writing shell scripts.

• Extensively created mapping/mapplets, reusable transformations using transformations like Lookup, Filter, Expression, Stored Procedure, Aggregator, Update Strategy etc.

• Extensively used command-line tools such as pmcmd, pmrep, and pmrepagent.

• Supervised Informatica developers in developing jobs, guiding them in implementing logic.

• Involved in query-level performance tuning using Explain Plans, SQL Trace, and the TKPROF utility to pinpoint time-consuming SQL statements, then tuned them by creating indexes and forcing specific execution plans.

Environment: ETL-Informatica Power Center 7.1.2, Oracle Data Integrator, Power Exchange 5.1, Oracle 9i, Erwin 4.14, SQL, PL/SQL, TOAD, Sun Solaris 8.0, Windows NT

Client : Pacific Life Insurance (Done from off-shore)

Position : ETL Consultant / Mainframe Job Scheduler

Duration: Apr 2002 – Nov 2003

Project : Insurance Data Warehouse

Pacific Life Insurance is one of the largest insurance companies in the United States, offering a host of products to meet customers' insurance needs. The Insurance Data Warehouse project implemented centralized data to provide a single source of integrated and historical data. The project also implemented the Claims Data Mart (CDM), which is used to store, manage, and deliver access to claims transactions for reporting and analysis. Additionally, there is a web portal through which business users can run and view reports. The data marts were continuously enhanced to provide additional analytical information and reporting.

Responsibilities:

• Extensively used Informatica client tools-Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager, Facets application.

• Created various jobs in DataStage to load data into warehouse.

• Performance tuning of various DataStage jobs.

• Extracted data from Facets claims application to pull it into the Staging area.

• Used Facets claim application to open, add generations, enter and save information.

• Created Informatica mappings using Source Qualifier, Expression, Lookup (connected and unconnected), Aggregate, Update Strategy, Joiner, Normalizer and Filter transformations.

• Developed Mapplets and Transformations.

• Identifying and Removing Bottlenecks in order to improve the performance of Mappings and Workflows.

• Schedule various mainframe nightly jobs using JCL.

• Set up security management, including creating user groups and assigning privileges.

• Taking the backups using Designer and Workflow Export/Import utility.

• Production support for the monthly loads of data.

• Shell scripting in UNIX.

• BO Universe development of various application schemas.

• Used command-line tools such as pmcmd, pmrep, and pmrepagent.

Environment: IBM DataStage 7.x, Informatica PowerCenter 6.1, Oracle 9i, DB2, BO 6.x, Sybase, Erwin; OS: Sun Solaris 8.0, Windows NT.

PROFESSIONAL TRAININGS:

• Informatica 6.1.2 Developer / Administrator training.

• Informatica Data Quality, Power Exchange (CDC).

• Oracle 9i – PL/SQL.

• Siebel Analytics.

• Business Objects XI, MicroStrategy.

• DataStage 7.x.

• Teradata 12.0.

• Tidal – Job Scheduler

• Marvel – Change Management System.

• SalesForce CRM service.

• Core Java

• Security – Sun’s Identity Management.

