
Manager Data

Location:
Worthington, OH
Posted:
June 22, 2015


Navaneethakrishnan N

acqcvm@r.postjobfree.com

*** -***-****

Professional Summary

Enthusiastic, highly skilled consultant with 8+ years of experience in Data Warehousing using the Informatica ETL tool, possessing strong analytical skills and the ability to meet deadlines. Enjoy sharing technical expertise and coordinating projects with team members. Professional skills and competencies include the following:

Knowledge in all phases of the Data Warehouse Development Life Cycle.

Developed strategies for Extraction, Transformation and Loading (ETL) of data from various sources into data marts and data warehouses using Informatica Power Center. ETL experience includes working with Informatica Power Center 7.x/8.x/9.x on development projects on the UNIX platform.

Strong experience in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables and legacy system files.

Experience in documenting Informatica mapping specifications and tuning mappings to enhance performance; proficient in using Informatica Server Manager to create and schedule workflows and sessions.

Experience in integration of various data sources like SQL Server, Oracle, Teradata and DB2.

Familiarity with SQL and PL/SQL, including writing Stored Procedures, Functions, Cursors & Triggers.

Experience in UNIX Shell Scripting.

Have in-depth understanding of dimensional modeling and data mart development, including star and snowflake schemas and slowly changing dimensions (a brief SQL sketch follows this summary).

Good exposure to various Software life cycle models like Waterfall, Agile and Rapid Prototyping.

Experience as a technical lead for Onsite-Offshore team.

Experience in gathering requirements, design and hands on development in all phases of the project.

Data analysis skills - ability to dig in and understand complex models and business processes. Very strong communication and interpersonal skills with people of all levels and roles.
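To illustrate the slowly changing dimension work mentioned above, the following is a minimal Oracle SQL sketch of a Type 2 load; the tables and columns (CUSTOMER_DIM, CUSTOMER_STG, EFF_START_DT, etc.) are hypothetical and shown only as an example of the pattern, not a specific project's code.

-- Step 1: close out current dimension rows whose attributes changed in staging.
UPDATE customer_dim d
   SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM customer_stg s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.status <> d.status));

-- Step 2: insert a new current row for every new or changed customer.
INSERT INTO customer_dim
      (customer_key, customer_id, address, status,
       eff_start_dt, eff_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM customer_stg s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id  = s.customer_id
                      AND d.current_flag = 'Y'
                      AND d.address      = s.address
                      AND d.status       = s.status);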

Education:

Master of Computer Application from Madurai Kamaraj University, TN, India.

Technical Skills:

ETL Tools

Informatica Power Center 9.x/8.x/7.x.

Data Modeling

Dimensional Data Modeling, Star Schema Modeling, Snow-Flake Modeling, Fact and Dimension tables, Physical and Logical Data Modeling.

RDBMS

Oracle 10g/9i/8i/7.3, MS SQL Server 2000, DB2, MySQL, Teradata.

Middleware Tools

SQL*Plus, SQL*Loader, TOAD, WinSCP, WinSQL, FileZilla

Programming Skills

Oracle SQL Developer 1.5.5, SQL, PL/SQL, HTML, DHTML, XML, C, C++, Java.

Operating Systems

Win 95/98/2000, Unix, Sun Solaris 2.6/2.7.

Scheduling Tool

Control-M, Autosys, Informatica Scheduling.

Professional Experience:

JPMORGAN CHASE Apr ’14 – Present

BCBS – Business Banking Data Aggregation

ETL Developer

BCBS (Basel Committee on Banking Supervision) supports central banks and supervisory authorities by formulating supervisory standards and guidelines, recommending best practice and encouraging convergence on common standards and approaches without attempting detailed harmonization. It does not possess formal supranational supervisory authority or legal force.

The program aims to ensure consistent implementation of the Basel framework, which will help strengthen the resilience of the global banking system, maintain market confidence in regulatory ratios and provide a level playing field for banks operating internationally.

As part of this BCBS program, the work involves creating the ETL code from the BRDs, testing the code and implementing it in production, and documenting all report elements, generation code logic and calculation logic used for re-creating the new reporting layer (DWH).

Responsibilities:

Conduct studies, gather and analyze data from various databases and sources

Extensive interaction with clients and business teams to get the report code and generation logic.

Creating ETL Code as per the BRDs using simple to complex transformations

Participated in team meetings and proposed ETL Strategy.

Leading the offshore team by guiding them with the requirements and reviewing all the deliverables.

Preparing the High level Design Specifications for ETL Coding and mapping standards.

Performed unit testing by generating SQL scripts based on the pre-defined test plans

Worked with Flat files as sources and targets

Created mappings using the transformations like Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner etc.

Created Mapplets and reusable transformations to be re-used during the development life cycle

Extensively worked on pushdown optimization (PDO) mappings for handling massive data transformation and loading.

Standardized parameter files to define session parameters such as database connections for sources and targets, last-updated dates for incremental loads, and default values for fact tables (see the illustrative snippet after this list).

Performed tuning of Informatica Mappings for optimum performance.

Implemented Slowly Changing Dimensions to update the dimensional schema.

Optimized the mappings using various optimization techniques and used the Debugger to test and fix existing mappings.

Documenting all business requirements within SLAs.
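As an illustration of the parameter-driven incremental loads noted above, a source qualifier SQL override can filter on a mapping parameter supplied by the parameter file; the table name STG_TRANSACTIONS and the parameter $$LAST_UPDATED_DATE below are hypothetical examples of the pattern, not actual project objects.

-- Hypothetical source qualifier SQL override for an incremental load.
-- $$LAST_UPDATED_DATE is expanded from the session parameter file and
-- advanced after each successful run.
SELECT txn_id,
       account_id,
       txn_amount,
       last_updated_ts
  FROM stg_transactions
 WHERE last_updated_ts > TO_DATE('$$LAST_UPDATED_DATE', 'YYYY-MM-DD HH24:MI:SS')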

Environment: Informatica Power Center 9.1, Oracle 10g/9i, Teradata, Flat files, TOAD, Control M, Unix

JPMORGAN CHASE Nov ’12 – Mar ’14

Digital Marketing

ETL Developer

Chase is building an Enterprise Digital Marketing Ecosystem that includes, but is not limited to, display advertising, search, email campaigns, mobile, and the social media landscape (communities such as Facebook, Twitter and Flickr), utilizing the EDW platform for structured data and Aster Data for a high-volume structured and unstructured data solution. This deliverable was developed based on the Physical Data Integration Design/Model activities.

Responsibilities:

Worked closely with Business Analysts and end users to understand the existing business model and customer requirements, and was involved in preparing functional specifications based on business requirements.

Responsible for development, support and maintenance of the ETL (Extract, Transform and Load) processes.

Experience with high volume datasets from various sources like Text Files and RDBMS.

Developed transformation logic and designed various Complex Mappings and Mapplets.

Experience in finding performance bottlenecks and redesigning the ETL process to improve performance. Fine-tuned ETL processes by addressing mapping and session performance issues.

Defined and worked with mapping parameters and variables.

Migration of code across the environments using Folder to Folder and Deployment Group methods. Troubleshooting of problems in QA and UAT phase.

Involved in Performance tuning for complex ETL codes and Pushdown Optimization for huge volume of data.

Used Power Center Workflow manager for session management, database connection management and scheduling of jobs.

Experienced in using the Informatica partitioning feature to load the data into database.

Familiarity with writing stored procedures (a brief PL/SQL sketch follows this list).

Tested the data and data integrity among various sources and targets, and worked with the production support team on various performance-related issues.

Extensively worked on creating Control M jobs to schedule workflow and file executions.
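The stored procedure work noted above can be sketched as a small PL/SQL procedure; the table and column names (CAMPAIGN_SUMMARY, CAMPAIGN_RESPONSE) are hypothetical and illustrate the general shape rather than a specific deliverable.

-- Hypothetical PL/SQL procedure: aggregate one day of campaign responses.
CREATE OR REPLACE PROCEDURE load_campaign_summary (p_run_date IN DATE) AS
BEGIN
  -- Make the load re-runnable for the same day.
  DELETE FROM campaign_summary WHERE run_date = TRUNC(p_run_date);

  INSERT INTO campaign_summary (run_date, campaign_id, response_cnt)
  SELECT TRUNC(p_run_date), campaign_id, COUNT(*)
    FROM campaign_response
   WHERE response_dt >= TRUNC(p_run_date)
     AND response_dt <  TRUNC(p_run_date) + 1
   GROUP BY campaign_id;

  COMMIT;
END load_campaign_summary;
/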

Environment: Informatica Power Center 9.1, Oracle 10g/9i, Teradata, Flat files, TOAD, Control M, Unix, Quality Center.

CNA, IL Sep ’11 – Oct ’12

Medicare Reporting

ETL Developer

In this project, the Source systems ACT, CASE, CC, DOC USER will be combined and will be fed to DOC. The central component of the reporting process shall be the “Data Orchestrator Controller” (DOC) formerly referred to as the Medicare Reporting Controller or Maintenance Application.

The Medicare Secondary Payer provisions of the Medicare, Medicaid and SCHIP Extension Act of 2007 require liability insurers (including self-insurers, no-fault insurers and workers' compensation insurers) to:

Determine Medicare status for all claimants, and report all claims involving a Medicare beneficiary to the Centers for Medicare and Medicaid Services (CMS), the federal agency responsible for administering Medicare and Medicaid, when financial compensation has been awarded for an injury-related incident.

Responsibilities:

Participated in user meetings, gathered Business requirements & specifications for the Data-warehouse design. Translated the user inputs into ETL design docs.

Defined, and documented the technical architecture of the Data Warehouse, including the physical components and their functionality.

Involved in the creation of Informatica mappings to extract data from Oracle and flat files and load it into the staging area.

Worked on data mapping, data cleansing, program development for loads, and verification of converted data against legacy data.

Involved in error handling, performance tuning of mappings, testing of Stored Procedures and Functions, Testing of Informatica Sessions, and the Target Data.

Possess expertise with relational database concepts, stored procedures, functions, triggers and scalability analysis.

Optimized SQL and PL/SQL queries using different tuning techniques such as optimizer hints, parallel processing, and optimization rules (illustrated in the sketch after this list).

Testing and debugging all ETL objects to evaluate performance and verify that the code meets the business requirements.

Responsible for migrations of the code from Development environment to QA and QA to Production.

Carried out defect analysis and fixed bugs raised by users; involved in support and troubleshooting of production systems as required, optimizing performance, resolving production problems, and providing timely follow-up on problem reports
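As an illustration of the hint-based and parallel tuning mentioned above, a query of this general shape can be steered toward hash joins and parallel full scans with optimizer hints; the CLAIM and CLAIMANT tables below are hypothetical and stand in for the actual project tables.

-- Hypothetical example of hint-based tuning: request a hash join and
-- parallel full scans on two large tables instead of a nested-loop plan.
SELECT /*+ FULL(c) FULL(cl) USE_HASH(c cl) PARALLEL(c 4) PARALLEL(cl 4) */
       c.claim_id,
       cl.claimant_id,
       cl.medicare_status
  FROM claim c
  JOIN claimant cl
    ON cl.claim_id = c.claim_id
 WHERE c.load_date >= TRUNC(SYSDATE) - 7;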

Environment: Informatica Power Center 8.5/8.6, Power Exchange 8.5/8.6, Oracle 10g, PL/SQL, Toad, UNIX, Flat files, Control M, Quality Center.

PNC Bank, OH Oct ’10 – Aug ’11

Data Quality

Informatica Developer

The goal of the project is to provide crucial insights to the business team through proactive information delivery and self-service data analytics. The reporting team publishes statistics and error data to the business insight team for data cleansing. The project is made up of several subject areas such as Referral, Outstanding Balances, Revenue, Retail Sales Performance, Customer Profiling and others.

Responsibilities:

Worked extensively with Business community and database administrators to identify the business requirements and data realities.

Working closely with data modeling team for new requirement and Enhancements in the database.

Translating high-level design specifications into simple ETL coding and mapping standards.

Handling all phases including Requirement Analysis, Design, Coding, Testing and Documentation.

Participated in client meetings and proposed various ETL strategies.

Leading a 30-member offshore team and providing the inputs from the clients to the team

Acting as coordinator between the business users and the technical team by converting business requirements into technical design

Worked on Informatica Power Center 8.6 tools: Designer, Workflow Manager, Workflow Monitor and Repository Manager

Implemented end-to-end mappings involving complex logic using Transformations like Expression, Joiner, Filter, Lookup, and Router.

Using parameter files in mappings and running workflows

Guiding in creating test plans, scripts, and data.

Performing code review, formal testing, functional testing, performance testing, and specification review

Implementing the error handling strategy for all interfaces to avoid errors in the live environment, and migrating objects from Dev to QA to the Production environment (a minimal sketch follows this list).

Implementing performance tuning methods to optimize developed mappings

Worked on Autosys to schedule the Jobs.

Worked with Informatica Admin team to move the Components of Informatica to the QA and Production Environments.

Responding quickly and effectively to production issues and taking responsibility for seeing those issues through resolution
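The error handling strategy described above can be sketched, under hypothetical table names (REFERRAL_FACT, REFERRAL_STG, ETL_ERROR_LOG), as a PL/SQL block that traps a load failure and records it for the reporting team instead of silently aborting the interface; this is an illustrative pattern, not project code.

-- Hypothetical error-handling sketch: trap a failed insert and log it.
BEGIN
  INSERT INTO referral_fact (referral_id, branch_id, referral_amt)
  SELECT referral_id, branch_id, referral_amt
    FROM referral_stg;
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    INSERT INTO etl_error_log (interface_name, error_ts, error_msg)
    VALUES ('REFERRAL_LOAD', SYSTIMESTAMP, SQLERRM);
    COMMIT;
END;
/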

Environment: Informatica Power Center 8.1, Oracle 10g, DB2, PL/SQL, Toad, UNIX, Flat files, Autosys, Windows XP.

Standard Chartered, India Dec ’08 – Sep ’10

Cheque Truncation System (CTS) (Inward & Outward)

Informatica Developer

Cheque Truncation System (CTS) is a data and image-capturing application. The project involves two phases of loading, the ETL phase and Image Capturing. The Image details are captured in flat files for the ETL.

The main objectives of the system include:

Extracting, Transforming and Loading data from flat files and inserting it into appropriate target tables.

Performing Data Cleansing on User’s data

Handling account transactions for cheque processing

Responsibilities:

Translating high-level design specifications into simple ETL coding and mapping standards.

Acting as coordinator between the business users and the technical team by converting business requirements into technical design

Worked on Informatica Power Center 8.6 tools: Designer, Workflow Manager, Workflow Monitor and Repository Manager

Performing code review, performance testing, and specification review

Implementing performance tuning methods to optimize developed mappings

Responding quickly and effectively to production issues and taking responsibility for seeing those issues through resolution

Participated in user meetings, gathered Business requirements & specifications for the Data-warehouse design. Translated the user inputs into ETL design docs.

Designed ETL architecture to process a large number of files and created high-level and low-level design documents.

Responsible for migrations of the code from Development environment to QA and QA to Production.

Carried out defect analysis and fixed bugs raised by users

Involved in support and troubleshooting of production systems as required, optimizing performance, resolving production problems, and providing timely follow-up on problem reports

Environment: Informatica Power Center 8.5/8.6, IDE, Oracle 10g, Toad, UNIX, Control M.

Standard Chartered, India Aug ’06 – Nov ’08

Electronic Clearing Service

Informatica Developer

Electronic Clearing Service (ECS) is the service extended by the RBI for electronic debit to an account. This phase includes functionality to cater to the requirements of centralized processing (at the Item Processing Centre, IPC) of ECS Debit Instructions arising out of the National Clearance Cell Interface of various cities across India.

Responsibilities:

Extensively used Informatica tools: Designer, Workflow Manager and Repository Manager.

Working with IDE - Informatica Data Explorer to profile Customer & Claim information for faster data analysis.

Working with mappings using expressions, filters, lookups, router, update strategy transformations to implement Type 2 dimensions.

Involved in development of mappings for complicated scenarios by implementing the logic of counters and complex conditioning and filtering requirements.

Created sessions, workflows and worklets to run with the logic embedded in the mappings using Power center Designer.

Maintain Development, Test and Production mapping migration Using Repository Manager. Involved in enhancements and maintenance activities of the data warehouse including performance tuning.

Environment: Informatica Power Center 8.1.1, Oracle 9i, Informatica IDE, Flat files, Toad, Control M, Unix.


