
MADHIRE M

Phone: 508-***-****

Email: *****@**************.***

PROFESSIONAL SUMMARY:

• Over 7 years of IT experience in System Analysis, Design, Development, Implementation and Testing of Databases and Data Warehouse Applications on client-server technologies in the Pharma, Banking and Finance domains.

• Experience in Extraction, Transformation and Loading (ETL) of data into Data Warehouses using Informatica Power Center 8.x and 7.x versions.

• Designed Workflows and Mappings, created Sessions, and scheduled them using Informatica Power Center 8.6, 8.1 and 7.1.

• Experience in designing and developing complex mappings using various Transformations such as Source Qualifier, Joiner, Aggregator, Router, Filter, Expression, Lookup, Sequence Generator, Java Transformation, Update Strategy, XML Transformations and Web Services.

• Experience with relational and dimensional data modeling: star schema, snowflake schema and fact constellations.

• Designed logical and physical databases using Erwin and developed data mappings between source systems and target components using a Mapping Design Document.

• Experience in debugging sessions and mappings, performance tuning of sessions and mappings, implementing complex business rules, and optimizing mappings.

• Experience in Data Cleansing and Slowly Changing Dimensions.

• Experience in extraction of data from heterogeneous sources (relational databases, Flat Files, XML, Excel) to load into Data Warehouse/data mart targets.

• Expertise in databases, schema objects, and performance tuning of SQL statements.

• Hands-on experience with Teradata SQL and associated utilities such as BTEQ, FastLoad, FastExport and MultiLoad.

• Implemented data warehousing techniques for Data Cleansing, Slowly Changing Dimensions (SCD) and Change Data Capture (CDC); a minimal SQL sketch of an SCD Type 2 load follows this list.

• Experienced with the Informatica Data Quality (IDQ) tool for Data Cleansing.

• Extensive development, support and maintenance experience working in all phases of the Software Development Life Cycle (SDLC), especially in Data Warehousing.

• Experience with SQL query tuning on SQL Server and Oracle RDBMS.

• Good understanding of and working experience with Logical and Physical data models that capture current/future-state data elements and data flows, using Erwin.

• Prepared and maintained documentation on all aspects of ETL processes, definitions and mappings to support knowledge transfer to other team members.

• Experience in production support for technical and performance issues.
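
The SCD and CDC work above follows a standard Type 2 pattern, sketched below in Oracle SQL. This is a minimal illustration, not code from any project on this resume; the CUSTOMER_DIM and CUSTOMER_STG tables, their columns, and the CUSTOMER_DIM_SEQ sequence are hypothetical names.

    -- Step 1: expire the current version of any customer whose tracked
    -- attributes changed in the latest staging extract.
    UPDATE customer_dim d
       SET d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.phone <> d.phone));

    -- Step 2: insert a new current version for changed and brand-new customers
    -- (after step 1, neither group still has a row flagged 'Y').
    INSERT INTO customer_dim
           (customer_key, customer_id, address, phone,
            eff_start_date, eff_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.phone,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');
    COMMIT;

In Informatica terms the same rule is typically built with a Lookup on the dimension plus an Update Strategy; the SQL form is just the compact way to show it.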

TECHNICAL SKILLS:

Data Warehousing Tools: Informatica Power Center and Power Exchange 8.x, 7.x, 6.x and 5.x; Business Objects XI R2, 6.5, 5.1; Crystal Reports 7.0, 8.0, 9.0, 10.0

Databases/RDBMS/Others: MS SQL Server 2008/2005/2000, Oracle 8i/9i/10g/11g, DB2 UDB, Teradata V2R5.0/V2R6.0, Sybase, XML, Flat Files, CSV Files

Microsoft Suite: Excel, Word, PowerPoint

Data Modeling Tools: CA Erwin, MS Visio

Scripting Languages: UNIX Shell Scripting, Korn Shell Scripting (K-Shell) and Perl

Job Control/Other Tools: Control-M, Autosys, TOAD, PuTTY and CVS

Programming Languages: C, C++, SQL, PL/SQL, BTEQ, T-SQL and PHP

PROFESSIONAL EXPERIENCE:

Client: TiVo, Santa Clara, CA, Oct 2012 – Present

Role: Informatica Developer

Description: TiVo Inc. develops and provides software and technology for advanced television services. The Informatica Power Center tool was used for all ETL operations, and the DataFlux Power Studio tool was used for all data validation (Data Quality) operations.

Responsibilities:

• Developed mappings in Informatica Power Center to load the data from various sources using transformations like Source Qualifier, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Filter, Router, etc.

• Created parameter files in Informatica Power Center and passed them to Power Center Tasks.

• Tuned Informatica Power Center mappings for better Performance.

• Translated business requirements to Informatica Power Center Mappings.

• Responsible for identifying reusable logic to build several Mapplets that were reused across multiple mappings.

• Created mappings to extract and de-normalize (flatten) data from XML files using multiple Joiners with Informatica Power Center.

• Created the transformation routines to transform and load the data.

• Extensively used TOAD to create target tables and access data.

• Data cleansing (DataFlux) scope was limited to Name and Address standardization only; additionally, data validation was done against an appropriate USPS library, and the exact business rules for cleansing were determined during the requirements analysis phase.

• Wrote Oracle PL/SQL procedures for processing business logic in the database and tuned database queries for better performance.

• Built a batch job to identify potential duplicates, mark them as winner or loser based on business rules, and send them to Data Management (DataFlux) for validation; upon confirmation from Data Management, a second batch job merged the duplicates with minimal manual intervention (a hypothetical SQL sketch of the winner/loser step follows this list).

• Worked closely with the end users and Business Analysts to understand the business and develop the transformation logic to be used in Informatica Power Center.

• Wrote UNIX Korn shell scripts along with Control-M for scheduling the sessions and workflows.

• Monitored Data Quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.

• Developed Test Plans and wrote Test Cases to cover overall quality assurance testing.
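
The winner/loser batch above can be illustrated with a short SQL sketch. This is hypothetical, not the actual TiVo job: the CONTACT table, the standardized STD_NAME/STD_ADDRESS columns, and the DEDUP_STATUS flag are assumed names, and the real survivorship rules came from the business.

    -- Rank records within each standardized name+address group; the most
    -- recently updated record survives as the winner, the rest become losers
    -- (records with no duplicates simply rank 1 and stay winners).
    MERGE INTO contact c
    USING (SELECT contact_id,
                  ROW_NUMBER() OVER (PARTITION BY std_name, std_address
                                     ORDER BY last_update_dt DESC) AS rn
             FROM contact) ranked
    ON (c.contact_id = ranked.contact_id)
    WHEN MATCHED THEN
      UPDATE SET c.dedup_status = CASE WHEN ranked.rn = 1 THEN 'WINNER'
                                       ELSE 'LOSER' END;
    COMMIT;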

Environment: Informatica Power Center 9.1/8.6, Informatica IDQ 8.x, Informatica Power Exchange 8.0, TOAD for Oracle, Oracle 10g/11g, SQL Server 2005/2008, Erwin 4.0, DB2, DataFlux 8.x, PL/SQL, PowerBuilder, Sun Solaris 8.0 and Shell Scripting.

Client: Credit Suisse, Morrisville, NC, Nov 2011 – Sep 2012

Role: ETL/Informatica Developer

Description: The project is related to Collateral Management, a core application of Investment Banking. The Collateral Management application, Algo Collateral, is provided by Algorithmics, which upgraded the tool from V4 to V5. Algo Collateral is a collateral management tool that performs the Call Calculation mechanism for OTC and FX data. The new version, Algo 5, is populated using Informatica to perform the ETL process, replacing the existing legacy systems that used SQL*Plus and loader utilities.

Responsibilities:

• Involved in gathering the business requirements from the client and writing and maintaining technical/design documentation.

• Designed and developed end-to-end ETL process for the production systems.

• Analyzed the business requirements for the ETL process and created schema flow diagrams in Visio.

• Performed extensive analysis of metadata to test the integrity, consistency and appropriateness of the data brought into the warehouse from various sources.

• Created and deployed the ETL code across the environments.

• Analyzed and tuned queries on the database that would be executed from the reporting system.

• Leveraged Informatica to extract data from heterogeneous source systems, aggregate the data, and load it into the target warehouse.

• Extracted data from sources like MS SQL Server and Flat Files into the target Oracle database.

• Created various Mapplets in designer using Informatica Power Center Mapplet Designer.

• Designed and developed ETL routines using Informatica Power Center, including usage of Aggregators within the mappings, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.

• Worked on the UNIX AIX 4.3 machine on which the Informatica server was installed to check the session logs.

• Wrote stored procedures for dropping and recreating indexes for efficient data loads (an illustrative PL/SQL sketch follows this list).

• Extensively worked on the tuning of mappings and sessions.

• Used Parallel hints for performance tuning.

• Worked with the ClearCase source control system, which provides a virtual workspace to check in and check out various files and directories.
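
The index-handling procedures above follow a common bulk-load pattern: disable index maintenance during the load, rebuild afterwards. A minimal PL/SQL sketch, with a hypothetical procedure and index name (not the actual Credit Suisse objects):

    -- Called with 'DISABLE' before a bulk load and 'REBUILD' after it;
    -- loading with the index unusable avoids row-by-row index maintenance.
    CREATE OR REPLACE PROCEDURE toggle_load_index (p_action IN VARCHAR2) AS
    BEGIN
      IF UPPER(p_action) = 'REBUILD' THEN
        EXECUTE IMMEDIATE 'ALTER INDEX fact_trade_ix REBUILD NOLOGGING';
      ELSE
        EXECUTE IMMEDIATE 'ALTER INDEX fact_trade_ix UNUSABLE';
      END IF;
    END;
    /

A pre-session task would call toggle_load_index('DISABLE') and a post-session task toggle_load_index('REBUILD').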

Environment: Informatica Power Center 8.6, Oracle 10g/9i, MS SQL Server 2005, PL/SQL Developer 8.0, Control-M, Algo Collateral V4.6/5.1, MS Visio 2003, Windows XP, Business Objects XI, UNIX, AIX 4.3.

Client: GlaxoSmithKline, Philadelphia, PA, Apr 2009 – Oct 2011

Role: Informatica Developer

Description: The project built a Pharmaceutical Data Warehouse to present an integrated, consistent, real-time view of enterprise-wide data. GSK built a decision support system to compare and analyze product prices, quantities, and patient profiles. IMS Health data is combined with data from other sources and is made available for ad hoc reporting. This Data Warehouse enhances Sales reporting for a pharmaceutical research group, delivering reports and information to sales and marketing management.

Responsibilities:

• Performed source system evaluation and standardization of received data formats; understood business/data transformation rules, business structure and hierarchy, and relationships; implemented data transformation through mapping development, validation and testing of mappings.

• Involved in requirements gathering, analysis and designing technical specifications for the data migration according to the business requirements.

• Developed Technical Design Documents.

• Involved in monitoring Informatica jobs/processes using Workflow Monitor.

• Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and target-based commit interval.

• Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.

• Implemented Slowly Changing Dimension and slowly growing target methodologies for modifying and updating account information and accessing it.

• Worked with Type I, Type II and Type III dimensions and data warehousing Change Data Capture (CDC).

• Wrote UNIX scripts to back up the log files in QA and production.

• Created Mapplets in place of mappings for logic that was used repeatedly, such as date formatting or data type conversion.

• Extensively worked with SQL queries; created stored procedures, packages, triggers and views using PL/SQL programming (an illustrative trigger sketch follows this list).

• Involved in optimization and performance tuning of Informatica objects and database objects to achieve better performance.

• Offered production support for daily jobs.

• Performed Unit and Integration testing and wrote test cases.

• Worked extensively on defect remediation and supported the QA testing.

• Experienced in taking repository backups and restoring them, starting and stopping Informatica services, and working with pmcmd commands.
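
As one illustration of the PL/SQL objects mentioned above, here is a minimal audit trigger; the ACCOUNT_DIM table and its audit columns are assumed names, not the actual GSK schema:

    -- Stamp who changed the row and when, on every insert or update.
    CREATE OR REPLACE TRIGGER account_dim_audit
    BEFORE INSERT OR UPDATE ON account_dim
    FOR EACH ROW
    BEGIN
      :NEW.last_updated_by   := USER;
      :NEW.last_updated_date := SYSDATE;
    END;
    /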

Environment: Informatica Power Center 8.1/7.1, Oracle 9i/10g, SQL Server, PL/SQL, Windows XP, Erwin, Business Objects and UNIX.

Client: Federal Reserve Bank of Richmond, VA, Apr 2007 – Mar 2009

Role: Informatica Developer

Description: The Federal Reserve Bank of Richmond is the headquarters of the Fifth District of the Federal Reserve, deploying a full range of corporate and banking services including capital raising, market making and financial advisory services. The aim of the project was to create a Data Warehouse that would integrate source data from different departments, such as Finance and Sales, and provide complete analytical solutions.

Responsibilities:

• Involved in design, development and maintenance of procedures for getting data from all systems into the Data Warehousing system; data was standardized to store various business units in tables.

• Worked with business users to create/modify existing reports using the Viador reporting tool.

• Parsed high-level design specifications into simple ETL coding and mapping standards.

• Used Power Center for Extraction, Transformation and Loading of data from heterogeneous source systems into the target database.

• Created DTS Packages using SQL Server 2000.

• Used stored procedures, views and functions for faster processing of bulk volumes of source data.

• Upgraded existing complex DTS packages to corresponding SSIS packages using Integration Services Control Flow and Data Flow transformation tasks (e.g., File System, For Each Loop, Derived Column, Union All, Merge Join, Conditional Split, Aggregate).

• Developed ETL processes to replicate data from multiple platforms into reporting databases.

• Created and scheduled SSIS packages for transferring running feeds from various departments and multiple servers and resources to the development servers.

• Responsible for unit testing and Integration testing.

• Assisted in mentoring internal staff on Informatica best practices and skills.

• Responsible for performance tuning of Informatica mappings and tuned SQL queries.

• Responsible for multiple projects with cross-functional teams and business processes.

• Responsible for development and support of ETL routines; designed the Informatica loads required for populating the data warehouse; experienced in loading high-volume data and in tuning and troubleshooting mappings; created documentation to support the application.

• Developed ETL process for the integrated data repository from external sources.

• Provided production support by monitoring the processes running daily.

• Created Functional Specifications for the different Problem Logs to get approval for the work.

• Created Remedy tickets for work approval from different users.

Environment: Erwin 4.0, Informatica Power Center 8.6, Viador 7.1, PL/SQL, SQL Server 2000, SSIS, Control-M, HP-UX, AIX 4.3.3, Shell Scripting.

Client: City Bank, India, Mar 2006 – Feb 2007

Role: Data warehouse Developer

Description: The City Bank project helps customer service representatives deal and transact with customers' loans, credit, debit, portfolios, investments, etc. The operational data of different financial departments is loaded into a central Data Warehouse and transformed into different regional data marts. Various corporate metrics/web reports, such as Credit Profile, Product Profile and Funds Transfer, are generated periodically. Informatica Power Center is used to load the base tables in the data warehouse, and the source databases include Oracle.

Responsibilities:

• Involved in the requirement definition and analysis in support of Data Warehousing efforts.

• Created Repository using Repository Manager.

• Worked extensively on Informatica tools: Repository Manager, Designer and Server Manager.

• Involved in Extraction, Transformation and Loading (ETL) Process.

• Created the Source and Target Definitions using Informatica Power Center Designer.

• Imported Flat Files into Designer, made modifications, used them in the Mappings, and exported them into Oracle tables.

• Developed and tested all the backend programs, Informatica mappings and update processes.

• Created and Monitored Batches and Sessions using Informatica Power Center Server.

• Tuned the mappings to increase their efficiency and performance; used Informatica Workflow Manager to create workflows and Workflow Monitor to monitor and run them.

• Involved in production support, which also included troubleshooting of connectivity issues.

Environment: Informatica Power Center 5.1/6.2, Oracle 8i, UNIX, TOAD, Windows NT.

EDUCATION:

• Bachelor's in Electronics & Communication Engineering from JNT University, India


