
Data Manager

Location:
Jersey City, NJ
Posted:
August 08, 2016


Sameer Hussain

201-***-****

************@*****.***

PROFESSIONAL SUMMARY:

Six plus years of experience in analysis, design, development and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, databases, and Client/Server and Web applications on Windows and UNIX platforms across various verticals.

Six plus years of data warehousing and ETL experience using Informatica PowerCenter 9.1/8.6/8.5/7.1/6.2 (Repository Manager, Repository Server Admin Console), OLAP, OLTP.

Six plus years of dimensional data modeling using Star and Snowflake schemas.

Four plus years developing BTEQ scripts and FastLoad, MultiLoad, TPump and FastExport utilities in Teradata based on business rules, and writing extract algorithms to handle high volumes of data from various data sources and flat files into the target data warehouse database.
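
As an illustration of this utility work, a minimal BTEQ sketch; the logon string and all database, table and column names are hypothetical.

```sql
.LOGON tdpid/etl_user,etl_password;

-- Load only rows not already present in the target (all names hypothetical)
INSERT INTO edw.customer_dim (customer_id, customer_name, load_dt)
SELECT s.customer_id, s.customer_name, CURRENT_DATE
FROM stg.customer s
WHERE NOT EXISTS (
  SELECT 1 FROM edw.customer_dim d
  WHERE d.customer_id = s.customer_id
);

-- Abort the batch with a non-zero return code if the insert failed
.IF ERRORCODE <> 0 THEN .QUIT 1;

.LOGOFF;
.QUIT 0;
```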

Expertise in query analysis and performance tuning.

Thorough understanding of Software Development Life Cycle (SDLC) including requirements analysis, system analysis, design, development, documentation, training, implementation and post-implementation review.

Experience in design and development of ETL architecture to load data from different sources like ORACLE, flat files, XML files, DB2 UDB, Teradata, Sybase and SQL Server into Teradata, XML, ORACLE and SQL Server targets.

Experience in designing and developing Mappings, using various transformations like Unconnected and Connected lookups, Source Qualifier, Rank, Sorter, Router, Filter, Expression, Aggregator, Joiner and Update Strategy.

Extensively used SQL and PL/SQL in writing Stored Procedures, Functions, Packages and Triggers.
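
A minimal PL/SQL sketch of such a stored procedure; the table and column names are hypothetical.

```sql
-- Hypothetical example: stage-to-target load wrapped in a procedure
CREATE OR REPLACE PROCEDURE load_customer_dim AS
  v_rows PLS_INTEGER;
BEGIN
  INSERT INTO customer_dim (customer_id, customer_name, load_dt)
  SELECT s.customer_id, s.customer_name, SYSDATE
  FROM   stg_customer s
  WHERE  NOT EXISTS (SELECT 1
                     FROM   customer_dim d
                     WHERE  d.customer_id = s.customer_id);
  v_rows := SQL%ROWCOUNT;          -- rows inserted by the statement above
  COMMIT;
  DBMS_OUTPUT.PUT_LINE('Loaded rows: ' || v_rows);
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;                      -- undo the partial load, then re-raise
    RAISE;
END load_customer_dim;
/
```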

Experience in performance tuning of ETL processes; reduced execution time on high data volumes for a billion-dollar client.

Extensive knowledge of dimensional data modeling, star/snowflake schemas, fact and dimension tables, and process mapping using top-down and bottom-up approaches.
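
As a sketch of the star-schema layout this refers to, with hypothetical retail tables:

```sql
-- Dimension tables hold descriptive attributes; the fact table holds
-- measures keyed by surrogate foreign keys (all names hypothetical)
CREATE TABLE date_dim (
  date_key     INTEGER PRIMARY KEY,
  calendar_dt  DATE,
  fiscal_qtr   CHAR(2)
);

CREATE TABLE store_dim (
  store_key    INTEGER PRIMARY KEY,
  store_name   VARCHAR(60),
  region       VARCHAR(30)
);

CREATE TABLE sales_fact (
  date_key     INTEGER REFERENCES date_dim (date_key),
  store_key    INTEGER REFERENCES store_dim (store_key),
  units_sold   INTEGER,
  sale_amt     DECIMAL(12,2)
);
```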

Extensive experience with the Informatica ETL tool in designing Workflows, Worklets and Mappings, configuring the Informatica Server, and scheduling Workflows and Sessions using Informatica PowerCenter 9.1/8.x.

Vast experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner and Update Strategy.

Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).

Well experienced in Error Handling and Troubleshooting using various log files.

Extensively worked on developing and debugging Informatica mappings, mapplets, sessions and workflows.

Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Extensive experience in tuning and scaling procedures for better performance by running explain plans and using approaches such as hints and bulk loads.
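
A hedged Oracle sketch of the tuning approaches mentioned: inspecting the plan, then using a hint for a direct-path (bulk) load. Table names are hypothetical.

```sql
-- Inspect the optimizer's plan before tuning
EXPLAIN PLAN FOR
SELECT o.order_id, c.customer_name
FROM   orders o JOIN customers c ON c.customer_id = o.customer_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- The APPEND hint requests a direct-path (bulk) insert into a history table
INSERT /*+ APPEND */ INTO orders_hist
SELECT * FROM orders_stage;
COMMIT;
```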

Good exposure to working with various scheduling tools like Autosys, Control-M, Informatica Scheduler and Cronacle.

Experience designing large-scale, highly available, fault-tolerant data management systems in a dynamic environment.

Experience working on client-, customer- and user-facing projects.

Team player and self-starter with good communication and inter-personal skills.

TECHNICAL SKILLS:

Data Warehousing

Informatica PowerCenter 9.1/8.6/8.5/8.1/7.1 (Repository Manager, Designer, Workflow Manager, Repository Server Admin Console), OLAP, OLTP

Reporting tool

Business Objects XI R3/R2/R1, 6.5/6.0/5.1/5.0, ORACLE Discoverer, SSRS

Data Modeling

Erwin 4.1, MS Visio, UML

Scheduling Tools

Autosys, Informatica PowerCenter Scheduler, cron jobs

Programming

SQL, PL/SQL, Transact-SQL, UNIX Shell Scripting

Databases

ORACLE 10g/9i, SQL Server 2005/2000, Teradata

Operating Systems

Windows NT/2000/XP, Sun Solaris 5.10/5.5

EDUCATION:

Bachelor of Engineering in Computer Science from University of Kashmir, India.

PROFESSIONAL EXPERIENCE:

Merck & Co, Rahway, NJ Sep '11 – Present

ETL Developer

Aon plc is the leading global provider of risk management, insurance and reinsurance brokerage, and human resources solutions and outsourcing services. For one reinsurance subdivision, an Enterprise Data Warehouse (EDW) was implemented by integrating data from different feeder systems situated across various locations into a central repository of business information. Teradata was used as the database for this repository. The EDW provides quick access to data to enable a more informed decision-making process.

Responsibilities:

Interacted with Business Analysts for Requirement gathering, understanding the Requirements, Explanation of technical probabilities and Application flow.

Developed ETL mappings and transformations using Informatica PowerCenter 9.1.

Extracted data from flat files (provided by disparate ERP systems) and loaded the data into ORACLE staging using Informatica Power Center.

Analyzed Source Data to resolve post-production issues. Used MS Access to analyze source data from flat files.

Involved in using Teradata FastLoad, MultiLoad and FastExport utilities.

Involved in writing Teradata scripts to extract data from the data warehouse.

Performed query analysis using EXPLAIN plans, checking for unnecessary product joins, confidence factors, join types, and the order in which tables are joined.

Involved in Performance tuning of Teradata code.

Used volatile tables and derived queries to break complex queries into simpler ones.

Extensively used Derived Tables, Volatile Tables and Global Temporary Tables (GTT) in many of the ETL scripts.
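
A minimal Teradata sketch of the volatile-table pattern, with the EXPLAIN check described above; all table names are hypothetical.

```sql
-- Session-local volatile table; dropped automatically at logoff
CREATE VOLATILE TABLE vt_recent_sales AS (
  SELECT store_id, sales_dt, SUM(sale_amt) AS total_amt
  FROM   edw.sales_fact
  WHERE  sales_dt >= CURRENT_DATE - 7
  GROUP  BY store_id, sales_dt
) WITH DATA
PRIMARY INDEX (store_id)
ON COMMIT PRESERVE ROWS;

-- EXPLAIN the downstream join to verify there is no unnecessary product join
EXPLAIN
SELECT d.region, SUM(v.total_amt)
FROM   vt_recent_sales v
JOIN   edw.store_dim d ON d.store_id = v.store_id
GROUP  BY d.region;
```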

Fine-tuned existing Teradata procedures, macros and queries to increase performance.

Designed and created complex source to target mapping using various transformations inclusive of but not limited to Sorter, Aggregator, Joiner, Filter, Source Qualifier, Expression and Router Transformations.

Extensively used Lookup Transformation and Update Strategy Transformation while working with Slowly Changing Dimensions.

Used Mapping Parameters and Mapping Variables based on the business rules provided.

Scheduled workflows on a daily basis for incremental data loading.

Wrote PL/SQL Procedures for data extractions, transformation and loading.

Assisted in Data Modeling and Dimensional Data Modeling.

Involved in Performance Tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.

Designed and developed UNIX shell scripts as part of the ETL process to automate the data load processes to target.

Scheduled jobs using TWS to automate the Informatica sessions.

Used TOAD's FTP functionality to move files to and from source systems.

Performed Unit testing for all the interfaces.

Environment: Informatica PowerCenter 8.6.1, ERWIN 3.5, Shell script, Toad, SQL*Loader, SQL, PL/SQL, MS Visio, ORACLE 10g/9i, Teradata 12, Solaris 10 1/13

CSX, Jacksonville, FL Sep '10 – Aug '11

Informatica Developer

CSX Corporation, based in Jacksonville, owns companies providing rail, intermodal and rail-to-truck transload services that are among the nation's leading transportation companies, connecting more than 70 river, ocean and lake ports, as well as more than 200 short line railroads. The purpose of the project was to create a web application to track customer orders from the time an order is placed to the time it is closed or cancelled. The ETL tool pulls various source sales data into the order-tracking environment and pushes it into a secondary environment.

Responsibilities:

Worked directly with the client to capture requirements and prepare FSDs.

Led the offshore team in development activities in Informatica 8.6.

Designed the mapping technical specifications on the basis of functional requirements

Responsible for the core design, development, testing and production support of the ETL process

Developed the Teradata scripts for history data population.

Dealt with incremental data as well as migration data to load into Teradata.

Involved in using Teradata FastLoad, MultiLoad, FastExport and TPump utilities, with particular attention to PIs, NUPIs, USIs, NUSIs, partitioning columns and join indexes.

Involved in Performance Tuning of Teradata code.

Wrote FSDs, TSDs, Data Mapping Sheets, SIAs and Run Books.

Interacted with other teams and stakeholders on a daily basis.

Responsible for delivering the objects as per the project plan.

Assisted the QA team in creating test plans.

Responsible for Deployment of Informatica Objects and Database objects.

Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, translating business rules and functionality requirements into ETL procedures.

Developed and tested all the backend programs, Informatica mappings and update processes

Populated the staging tables from various sources such as flat files (fixed-width and delimited) and relational tables.

Involved in performance tuning of SQL Queries, Sources, Targets and Informatica Sessions for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.

Generated and executed SQL queries against the repository tables to analyze the results of the sessions.
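
A sketch of such a repository query; the REP_SESS_LOG MX view and its column names are assumptions and should be verified against the repository at hand.

```sql
-- Hypothetical query against the PowerCenter repository MX view REP_SESS_LOG
-- (view and column names are assumptions; verify before use)
SELECT subject_area,
       session_name,
       successful_rows,
       failed_rows,
       actual_start
FROM   rep_sess_log
WHERE  actual_start >= SYSDATE - 1
ORDER  BY actual_start DESC;
```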

Worked on Hierarchical relationship within table to build mappings per business rule.

Created mappings using various transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, and Update Strategy.

Created Autosys JIL files for box and command jobs, file watchers, and table polling, for scheduling across various Autosys versions.

Engaged with the Informatica support team in identifying bugs and raised SRs with Informatica Corp.

Environment: Informatica 8.6, ORACLE 11g, PL/SQL, SQL Server 2008, T-SQL, Toad, SQL Developer, SQL*Loader, UNIX server, Windows XP client.

Comcast, Philadelphia, PA Oct '08 – Aug '10

ETL/Informatica Developer

Comcast is one of the world's leading media, entertainment and communications companies. Comcast is principally involved in the operation of cable systems through Comcast Cable and in the development, production and distribution of entertainment, news, sports and other content for global audiences through NBCUniversal. Comcast Cable is one of the nation's largest video, high-speed Internet and phone providers to residential and business customers. The project delivered large-scale multi-terabyte DW and ODS reporting solutions.

Responsibilities:

Involved in the Informatica server installation and setting up the environment.

Worked on developing UNIX scripts for data cleansing and data archiving.

Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update strategy and Router transformations for populating target table in efficient manner.

Implemented slowly changing dimension Type 1 and Type 2 for Change data capture using Version control.
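
Sketched below in set-based SQL for illustration (in the mappings themselves this was done with Lookup and Update Strategy transformations); all table and column names are hypothetical.

```sql
-- SCD Type 2, step 1: expire the current version when a tracked attribute changed
UPDATE customer_dim d
SET    eff_end_dt   = CURRENT_DATE,
       current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address <> d.address);

-- Step 2: insert changed (or brand-new) rows as the current version
INSERT INTO customer_dim
       (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN customer_dim d
       ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL;
```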

Involved in designing and developing various complex ETL mappings.

Involved in Creation of SQL, Packages, Functions, Procedures, Views, and Database Triggers.

Configured, performance-tuned and installed Informatica, integrated various data sources like ORACLE, MS SQL Server, XML and flat files into the staging area, and designed ETL processes that span multiple projects.

Involved in writing SQL procedures; used SQL Server DTS to improve warehouse loading.

Designed and Developed ODS to Data Mart Mappings/Sessions/Workflows.

Created various ORACLE database objects like Indexes, stored procedures, Materialized views, synonyms and functions for Data Import/Export.
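
For example, a hedged Oracle sketch of two of these objects; the names and refresh policy are hypothetical.

```sql
-- Pre-aggregated daily order totals, refreshed on demand by the batch
CREATE MATERIALIZED VIEW mv_daily_orders
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT order_dt,
       COUNT(*)       AS order_cnt,
       SUM(order_amt) AS order_amt
FROM   orders
GROUP  BY order_dt;

-- Synonym so reporting users need not qualify the owning schema
CREATE SYNONYM daily_orders FOR dw_owner.mv_daily_orders;
```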

Created reusable worklets and workflows.

Used Transformation Language functions in the mappings to produce the desired results.

Used TOAD to run SQL queries and validate the data in warehouse and mart.

Involved in Debugging and Troubleshooting Informatica mappings.

Populated error tables as part of the ETL process to capture the records that failed the migration.
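
A minimal sketch of that error-table pattern, assuming hypothetical table names:

```sql
-- Capture staged rows that fail a referential check, with a reason code,
-- so they can be reviewed and reprocessed (all names hypothetical)
INSERT INTO etl_error_log (src_system, natural_key, err_reason, load_dt)
SELECT 'ORDERS', s.order_id, 'MISSING_CUSTOMER', CURRENT_DATE
FROM   stg_orders s
LEFT JOIN customer_dim d
       ON d.customer_id = s.customer_id
WHERE  d.customer_id IS NULL;
```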

Involved with Scheduling team in creating and scheduling jobs.

Used CDC for moving data from Source to Target.

Designed the ETL processes using Informatica tool to load data from ORACLE, flat files into Netezza (Staging, Warehouse).

Designed ETL process using Informatica Tool to load from Sources to Targets through data Transformations

Wrote stored procedures and DTS packages for maintenance tasks in the Production environment.

Implemented various data transformations using Slowly Changing Dimensions.

Developed test cases for Unit, Integration and System testing.

Involved in Maintaining the Repository Manager for creating Repositories, user groups, folders and migrating code from Dev to Test, Test to Prod environments.

Partitioned the Sessions for better performance.

Designed ETL mappings for CDC (Change Data Capture).

Trained end users in using full-client BO for analysis and reporting.

Wrote SQL Scripts and PL/SQL Scripts to extract data from Databases

Wrote extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Environment: Informatica PowerCenter 8.6.0/8.1.1, DB2, Erwin 4.0, UNIX Shell Scripting, ORACLE 9i/10g/11g, PL/SQL, Business Objects XI R2, Teradata SQL, Teradata Utilities (BTEQ, FastLoad, FastExport, MultiLoad), Netezza Database, Tivoli Workload Scheduler 8.4, TOAD 9.7.2.

ICICI Bank, Hyderabad, India June’ 06 – Sep’ 08

ETL/Informatica Developer

ICICI is a diversified financial services company that provides a broad range of banking, asset management, wealth management, and corporate and investment banking products and services. The data warehouse was primarily designed to provide Marketing Managers with a complete understanding of customer patterns (market-based analysis) and their preferences (trend analysis). This data warehouse was later extended to cater to the needs of Sales, Marketing and Human Resources.

Responsibilities:

Architected the data flow in the data mart, extensively customized the Bank's traditional methodology, designed data flow diagrams, and designed the best-fit solution for the data flow.

Involved in discussions with business analysts for requirement gathering, understanding the requirements, and explaining technical possibilities to business users.

Created Time Estimate Proposal document with estimation of hours required for completion of each ETL task.

Converted business requirements into technical documents (Business Requirement Document) and explained business requirements in terms of technology to the developers.

Worked with data modeler and business users for designing of tables.

Monitored jobs and batches daily, provided upfront solutions for failed jobs, tracked problems, and maintained solutions to frequently occurring problems.

Analyzed the source data with business users and developed critical mappings using Informatica PowerCenter.

Extensively used SCDs (Slowly Changing Dimensions) to handle incremental loading of dimension and fact tables.

Extensively Used Source Qualifier, Normalizer, Filter, Connected Lookup, Unconnected Lookup, Update strategy, Router, Aggregator, Sequence Generator.

Developed mappings to handle exceptions and discarded data.

Used the external loader, SQL*Loader, for bulk loading of ORACLE tables.

Created UNIX scripts for running SQL*Loader and defined positions for each field in the flat files.

Involved in data modeling and design of the data warehouse in a star schema methodology with conformed and granular dimensions and fact tables.

Changed table and column structures without affecting the conceptual model, using entity-relationship modeling.

Worked extensively with Erwin on logical- and physical-level design.

Created logical data models during systems analysis as part of the development of new databases.

Assisted data modelers in database design and successfully brought the database to second normal form.

Worked with DBAs to determine where to partition tables and which columns to index.

Unit tested the data and generated reports for review by business users for special accounts.

Developed UNIX shell scripts and used pmcmd to execute the workflows.

Exported the workflows from Repository Manager and checked them into SVN (version control management tool).

Prepared documents for QA and PRODUCTION migration.

Worked with business users and QA team during testing phases.

Environment: Informatica PowerCenter 8.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), ORACLE 10g/9i, SQL Server 2000, UNIX, COBOL, ERWIN 3.5, Shell script, Rapid-SQL, Toad, SQL*Loader, SQL, PL/SQL, PVCS, Visio, AutoSys.


