
Manager Data

Location:
8852
Posted:
December 16, 2010


Resume:

SOUMI

Sr. Informatica Developer

. About * years of IT experience in system design, development, testing and maintenance.

. 7 years of Technical and Functional experience in Decision Support

Systems - Data Warehousing, ETL development.

. Extensively used Informatica PowerCenter (8.x/7.x/6.x) for ETL (Extraction, Transformation and Loading) of data from multiple source database systems to Data Warehouses in UNIX/Windows environments.

. Involved in data transformation and loading for Star schemas, Snowflake schemas and dimension-less Fact tables.

. Experienced in performance tuning of sources, mappings, targets and

sessions.

. Solid understanding of Data Warehousing life cycle and strong working

experience on Data Warehousing applications.

. Directly responsible for the Extraction, Transformation and Loading of

data from multiple sources into Data Warehouse.

. Designed complex mappings using Normalizer, Aggregator, Update Strategy, Lookup, Joiner and Stored Procedure transformations.

. Worked in a variety of business domains, including Investment Banking, Retail Life Insurance, Retail and Health Care, with experience in off-shore delivery models.

. Extensive database experience using Oracle 10g/9i/8i/7, SQL, PL/SQL, SQL*Plus and SQL*Loader; used SQL queries to retrieve data from the database.

. Extensively worked with scheduling tools such as AutoSys.

. Extensively used the revision control tools like Clear Case and WinCVS.

. Effective communications skills (verbal and written) for both highly

technical and non-technical audiences.

. Expert in writing UNIX shell scripts and Perl scripts for various needs.

. Team player able to approach work assignments with a fresh perspective; good at debugging, creative and quick to learn new things.

. Experienced with Teradata SQL Assistant: sending queries to any ODBC database, exporting data from the database to a file on a PC, creating reports from any RDBMS that provides an ODBC interface, and keeping historical records of submitted SQL with timing and status information.

. Also experienced with various Teradata utilities such as FastLoad, MultiLoad, TPump and BTEQ.

. Used HP Quality Center for identifying and fixing bugs.

. Highly motivated, energetic individual and a very good team player

willing to work hard to meet intensive schedules and having excellent

communication, presentation and interpersonal skills.
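As an illustration of the Teradata utility experience above, a minimal sketch of a shell wrapper that generates a BTEQ batch script; the host, credentials, and table names here are hypothetical placeholders, not details from any of the projects below.

```shell
#!/bin/sh
# Sketch: write a BTEQ batch script that exports daily load-audit rows.
# tdprod, etl_user, and etl.load_audit are illustrative placeholders.
cat > export_status.btq <<'EOF'
.LOGON tdprod/etl_user,etl_password
.EXPORT FILE = load_status.txt
SELECT load_id, status, end_ts
FROM   etl.load_audit
WHERE  load_date = CURRENT_DATE;
.EXPORT RESET
.LOGOFF
EOF
echo "BTEQ script written: export_status.btq"
```

In a real batch, the wrapper would then invoke `bteq < export_status.btq` and check the return code.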

Skills:

Operating System UNIX (Sun Solaris 2.8), Windows

NT/95/98/2000/ME/DOS/XP

Programming/Scripting FORTRAN, C, C++, Shell Scripts, Java, SQL, PL-SQL

Web Publishing HTML, Macromedia Dreamweaver 4, Flash 5,

Fireworks 4, Microsoft FrontPage

RDBMS Oracle 9.x/8.i/7.x, SQL Server 6.5/7.0, Teradata

V2R4.x, MS Access 7.0

ETL Tools Informatica PowerCenter 5.2/6.5/7.1/8.1/8.6.1 (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager and Informatica Server), Complex Data Exchange, ETL, Data Mining, Data Mart, OLAP, OLTP.

OLAP/DSS Tools BusinessObjects 5i/6i, Supervisor, Designer, Infoview 4.0, BusinessMiner 4.2

Data Modelling ERwin 4.x/3.5.2, Microsoft Visio, Relational

Modelling, Normalized Schema, Dimensional

Modelling, Star/Snow Flake Schema, Physical &

Logical Modelling

Software Tools Microsoft Office 2003/2000, Adobe PhotoShop 6

and Adobe Illustrator 9

GUI VB 5.0/6.0, Oracle Forms 6.0, Oracle Reports 6.0, VC++

BAJC, New Jersey

Sr. ETL Application Developer - Anti-Money Laundering & BASEL II Compliance

Mar '2008 - Present

Role:

. Involved in requirements gathering and analysis to define functional

specifications.

. Created technical specs documents for all the mappings using MS Visio.

. Developed ETL processes to load data from source systems such as Oracle, Teradata and Flat Files into target systems (Oracle, Teradata and Flat Files).

. Used Teradata SQL assistant to send queries to the database and to

view and sort the results by column and save them to a file.

. Extensively worked on Informatica PowerCenter - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations - to import source and target definitions into the repository and to build mappings.

. Used various kinds of transformations to implement simple and complex business logic, and used the Debugger to test mappings and fix bugs.

. Created and configured workflows, worklets & Sessions to transport the

data to target systems using Informatica Workflow Manager.

. Used Mapplets and Reusable Transformations to prevent redundant transformation usage and improve maintainability.

. Maintained metadata and naming standards for future application

development.

. Provided Knowledge Transfer to the end users and created extensive

documentation on the design, development, implementation, daily loads

and process flow of the mappings.

. Responsible for different data mapping activities from source systems to Teradata.

. Developed PL/SQL procedures for processing business logic in the

database.

. Optimized and tuned Informatica mappings and sessions.

. Developed UNIX shell scripts to create parameter files, rename files,

compress files and for scheduling periodic load processes.

. Refined three existing data models to remove duplicate data, query inconsistency and data redundancy.

. Responsible for BASEL II compliance, and designed data models for

capital investment and risk calculation and analysis.

. Responsible for designing Informatica ETL jobs to calculate complex

operational risk and credit risk factors

. Designed various mappings using transformations like Lookup, Router,

Update Strategy, Filter, Sequence Generator, Joiner, Aggregator, and

Expression Transformation.

. Created control files for SQL*Loader (SQLLDR) and optimized the data migration from Sybase to Oracle.

. Used DQC tool for Data profiling as part of data-integration project

analysis and to control the quality of data in transactional and

analytical applications.

Environment: ETL/Informatica PowerCenter 8.6.1, Informatica PowerExchange 8.6, Oracle 11g, PL/SQL, Sybase, Flat files, UNIX, Toad, SQL Developer, VSAM, CSV, AS400, DB2, Shell Scripting, HP Quality Center 10.0, Erwin 7.3, Teradata SQL Assistant.
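The SQL*Loader control files mentioned above might look roughly like this sketch, here generated from a shell script in the spirit of the role's scripting work; the staging table and column names are illustrative assumptions, not details from the project.

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file for a CSV-to-Oracle staging load.
# stg_customer and its columns are hypothetical placeholders.
cat > customer_load.ctl <<'EOF'
LOAD DATA
INFILE 'customer.csv'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(cust_id, cust_name, region, load_dt SYSDATE)
EOF
echo "Control file written: customer_load.ctl"
```

The load itself would then be run with `sqlldr userid=... control=customer_load.ctl`, with the log and bad files checked afterwards.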

CGJC, New Jersey

ETL Developer - Informatica, Dec '2007 - Mar '2008

Equity Trade Fund - EDWH (Data Reconciliation)

Role:

. Involved in requirements gathering and analysis to define functional

specifications.

. Led the offshore team in requirement gathering and the design process.

. Created reconciliation reports.

. Developed ETL processes to load data from source systems such as Oracle and Flat Files into target systems (Oracle and Flat Files).

. Extensively worked on Informatica PowerCenter - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations - to import source and target definitions into the repository and to build mappings.

. Used various kinds of transformations to implement simple and complex business logic, and used the Debugger to test mappings and fix bugs.

. Created and configured workflows, worklets & Sessions to transport the

data to target systems using Informatica Workflow Manager.

. Developed complex mappings using connected and unconnected Lookup, Rank, Sorter, Joiner, Aggregator, Filter and Router transformations to transform the data per the target requirements.

. Extensively used Informatica to load data from Flat Files to Teradata, Teradata to Flat Files and Teradata to Teradata.

. Responsibilities included designing and developing complex Informatica

mappings including Type-II slowly changing dimensions.

. Implemented constraint-based load ordering of target data within the pipeline.

. Implemented aggregation logic required by the business using the Expression transformation.

. Used TOAD for querying Oracle and Teradata SQL Assistant for querying Teradata.

. Created the reusable transformations and Mapplet objects to embed the

business logic for improving the ETL session performance.

. Used the Incremental Aggregation process to capture the changes from

various sources and apply to Targets incrementally.

. Developed shell scripts for running batch jobs and scheduling them.

. Used Workflow Manager tasks like Email, Command, Decision, Event-Raise and Event-Wait.

. Used Mapplets and Reusable Transformations to prevent redundant transformation usage and improve maintainability.

. Maintained metadata and naming standards for future application

development.

. Provided Knowledge Transfer to the end users and created extensive

documentation on the design, development, implementation, daily loads

and process flow of the mappings.

. Developed PL/SQL procedures for processing business logic in the

database.

. Tuned Informatica Mappings and Sessions for optimum performance.

. Developed UNIX shell scripts to create parameter files, rename files,

compress files and for scheduling periodic load processes.

Environment: ETL/Informatica PowerCenter 8.1, Informatica PowerExchange 8.1, Oracle 11g, Teradata, PL/SQL, Flat files, PERL, UNIX, Toad, AS400, DB2, Shell Scripting, HP Quality Center, IDE/IDQ
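The parameter-file shell scripts mentioned in this role could follow a pattern like the sketch below; the folder, workflow, and parameter names are hypothetical assumptions, not names from the project.

```shell
#!/bin/sh
# Sketch: generate a daily Informatica parameter file before a workflow run.
# EDWH_RECON and wf_recon_daily are illustrative folder/workflow names.
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE="wf_recon_daily_${RUN_DATE}.par"
cat > "$PARAM_FILE" <<EOF
[EDWH_RECON.WF:wf_recon_daily]
\$\$RUN_DATE=${RUN_DATE}
\$\$SRC_FILE=trades_${RUN_DATE}.dat
EOF
echo "Parameter file written: $PARAM_FILE"
```

The workflow's session would then reference `$$RUN_DATE` and `$$SRC_FILE` as mapping parameters, with the file path set in the workflow properties.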

WMTLR, Arkansas Sep '2006 - Dec '2007

Retail Data Mart - Customer Data Integration

Informatica Developer

Role:

. Involved in requirements gathering and analysis to define functional

specifications.

. Created technical specs documents for all the mappings using MS Visio.

. Developed ETL processes to load data from source systems such as Oracle and Flat Files into target systems (Oracle and Flat Files).

. Extensively worked on Informatica PowerCenter - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations - to import source and target definitions into the repository and to build mappings.

. Used various kinds of transformations to implement simple and complex business logic, and used the Debugger to test mappings and fix bugs.

. Created and configured workflows, worklets & Sessions to transport the

data to target systems using Informatica Workflow Manager.

. Worked with power center tools like Designer, Workflow Manager, Task

Developer, Workflow Monitor, and Repository Manager

. Extensively used the Source Qualifier transformation and most of its features, such as filter, sorted ports and SQL override.

. Extensively used various Active and Passive transformations, including Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank and Aggregator transformations.

. Extensively worked with Joiner functions like normal join, full outer

join, master outer join and detail outer join in the Joiner

transformation.

. Worked with Index cache and Data cache in transformations like Rank,

Lookup, Joiner and Aggregator Transformations.

. Used Mapplets and Reusable Transformations to prevent redundant transformation usage and improve maintainability.

. Maintained metadata and naming standards for future application

development.

. Provided Knowledge Transfer to the end users and created extensive

documentation on the design, development, implementation, daily loads

and process flow of the mappings.

. Developed PL/SQL procedures for processing business logic in the

database.

. Tuned Informatica Mappings and Sessions for optimum performance.

. Developed UNIX shell scripts to create parameter files, rename files,

compress files and for scheduling periodic load processes.

Environment: ETL/Informatica 7.1, Oracle 11g, Teradata V2R4, PL/SQL, Flat

files, PERL, UNIX, Toad, AS400, DB2, Shell Scripting, Test Director
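The file-handling shell scripts described in this role (renaming and compressing flat files after a load) could be sketched as follows; the demo input file and archive directory are illustrative, not taken from the project.

```shell
#!/bin/sh
# Sketch: archive processed flat files with a timestamp, then compress them.
# orders.dat and ./archive are demo placeholders.
echo "sample record" > orders.dat        # demo input so the loop has work
ARCHIVE_DIR=./archive
mkdir -p "$ARCHIVE_DIR"
STAMP=$(date +%Y%m%d_%H%M%S)
for f in *.dat; do
  [ -e "$f" ] || continue                # no matching files: nothing to do
  base=$(basename "$f" .dat)
  mv "$f" "$ARCHIVE_DIR/${base}_${STAMP}.dat"
  gzip -f "$ARCHIVE_DIR/${base}_${STAMP}.dat"
done
echo "Archived files:"
ls "$ARCHIVE_DIR"
```

A scheduler entry would typically run such a script as a post-load step after the Informatica session completes.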

STB, Inc., Atlanta, Georgia Oct '2005 - Sep '2006

ETL Developer

Global Consumer Data Mart - GCG

Role:

. Involved in requirements gathering and analysis to define functional

specifications.

. Responsible for managing Informatica Services like Repository Service,

Integration Services, Reference Table Manager Services etc.

. Designed and developed error handling mechanism and rekey mechanism in

Informatica

. Monitored daily loads and provided on-call (L2) production support during data loads.

. Developed mappings between multiple sources, such as Flat Files, DB2 and Oracle, and multiple targets.

. Strong experience in using Expressions, Joiners, Routers, Lookups,

Update strategy, Source qualifiers and Aggregators.

. Created Informatica mappings which included data explosion to

incorporate critical business functionality to load data.

. Created Mapplets for reusable business rules.

. Extensively worked on Mapping Variables, Mapping Parameters, Workflow

Variables and Session Parameters.

. Used the Workflow manager to create workflows, worklets and tasks.

. Implemented pipeline partitioning concepts like hash-key, round-robin, key-range and pass-through techniques in mapping transformations.

. Created cron job scripts for automation of ETL processes.

. Created Change Control Requests as required by assigned projects and provided extensive end-to-end support as a Data Analyst, addressing overall data quality, including the design of ETL strategies to source, match and merge data from various disparate systems.

. Developed ETL processes to load data from source systems such as Oracle and Flat Files into target systems (Oracle and Flat Files).

. Used various kinds of transformations to implement simple and complex business logic, and used the Debugger to test mappings and fix bugs.

. Created and configured workflows, worklets & Sessions to transport the

data to target systems using Informatica Workflow Manager.

. Used Mapplets and Reusable Transformations to prevent redundant transformation usage and improve maintainability.

. Maintained metadata and naming standards for future application

development.

. Tuned Informatica Mappings and Sessions for optimum performance.


Environment: ETL/Informatica PowerCenter 7.1.2, Oracle 9i, Business

Objects, ERWIN 4.0, PL/SQL, Toad, Flat files, UNIX, DB2, Shell Scripting,

Rational Clear Quest
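The cron-driven ETL automation mentioned above can be sketched as a wrapper script that cron invokes; the `pmcmd startworkflow` form is the standard PowerCenter command-line client, but the service, domain, folder, and workflow names here are placeholder assumptions, and the command is only logged as a dry run rather than executed.

```shell
#!/bin/sh
# Sketch: wrapper that cron calls to start an Informatica workflow.
# Example crontab entry (daily at 02:00):
#   0 2 * * * /opt/etl/run_gcg_daily.sh >> /var/log/etl/gcg_daily.log 2>&1
# INT_SVC, Domain_ETL, GCG and wf_gcg_daily are hypothetical names.
CMD="pmcmd startworkflow -sv INT_SVC -d Domain_ETL -f GCG -wait wf_gcg_daily"
LOG=gcg_daily.log
{
  echo "$(date '+%Y-%m-%d %H:%M:%S') starting workflow"
  echo "$CMD"    # dry run: a real deployment would execute pmcmd here
} >> "$LOG"
cat "$LOG"
```

Using `-wait` makes pmcmd block until the workflow finishes, so the wrapper's exit status can drive downstream scheduling decisions.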

HP, Bangalore, INDIA Mar '2003 - Sep '2005

ETL Developer

The HP Neoview platform is an enterprise data warehouse that enables businesses to confidently capitalize on their information at a dramatically lower cost. It provides the dependability, scale, functionality and availability required by high-end data warehouse systems, and delivers these capabilities with appliance-like simplicity and speed. HP's Neoview enterprise data warehouse platform is an integrated offering supported by HP, backed by one of the world's largest technology vendors with a broad spectrum of technologies and partners.

Data Migration into Siebel Application: The project is about migrating data from Legacy Systems to Siebel 7.8.2; the data used for migration deals with the Sales and Services aspects of HP products. The objective is to enable a full 360-degree view of all HP touch points for Sales and Services. The application will be used by Sales and Service users.

Role:

. Participated in business requirement discussions and translated them into technical specifications for data loading.

. Prepared documentation, including dataflow charts and design documents, on how data flows from Legacy Systems to Siebel 7.8.

. Involved in gathering requirements from business, identifying gaps,

level of effort and creating technical design.

. Mapped Legacy system to Siebel Data Model. This was done to load data

from Legacy system to Siebel every night.

. Created ETL process for loading data from Siebel OLTP to Siebel OLAP.

. Created Mapplets, Transformations and Workflows using Informatica.

. Created Mappings to load data from the Siebel base tables into

Warehouse Staging tables.

. Created SIL mappings to load data from Warehouse Staging tables into

Warehouse Fact and Dimension tables.

. Created tasks and groups for sessions, and used the DAC Client to schedule the ETL process and to troubleshoot the data loads.


. Mapped Account, Contacts, Assets, Opportunities, Agreements,

Entitlements, Service Requests, Orders, Activities and other entities

to Interface and Base tables taking S_PARTY model into consideration.

. Extensively worked with Unicode, batches, ntext and CLOB data types

. Worked with shell scripts for archiving data files and for scheduling.

. Automated load runs of Informatica sessions through UNIX cron and PL/SQL scripts, implemented pre- and post-session scripts, and automated load-failure notification through email.

. Used SQL Scripts, Materialized views, Stored Procedures and PL/SQL

Scripts to extract data from Database.

. Used Trillium for data cleansing and also worked on job files to

create integrated data quality solutions.

. Provided 24x7 production support for Informatica jobs.

Environment: ETL/Informatica PowerCenter 6.1/7.1.1, SQL Server, Trillium,

PL/SQL, BI, Data warehousing, Flat files, Siebel, Toad, Siebel Analytics,

Business Objects.
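The automated load-failure email notification described in this role might be sketched as a post-session check like the one below; the log line, recipient address, and mailx invocation are illustrative assumptions, with the actual mail step commented out.

```shell
#!/bin/sh
# Sketch: post-session check that flags a failed load and would email on-call.
# session.log content and the recipient address are demo placeholders.
LOG=session.log
echo "Load completed with status: FAILED" > "$LOG"   # demo log entry
if grep -q "FAILED" "$LOG"; then
  echo "load FAILED - notifying on-call" >> "$LOG"
  # mailx -s "ETL load failure" etl-oncall@example.com < "$LOG"
fi
cat "$LOG"
```

In practice such a check runs as an Informatica post-session command or a cron step, keyed off the session log or pmcmd return code rather than a literal string match.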


