Nithya Hariharan
********@*****.***
Summary:
• 3 years of IT experience in software analysis, development, and implementation of business applications for manufacturing and healthcare verticals.
• Proven experience in designing and testing Data Warehouses and Data Marts.
• Directly responsible for the extraction, cleansing, transformation, and loading of data from multiple feeds and sources into the Data Warehouse.
• Performed data analysis at the source level and determined the key attributes for designing Fact and Dimension tables using a star schema, for the successful creation of Data Warehouses and Data Marts.
• 3 years of ETL and data integration experience in developing ETL Mappings using Informatica Power Center 7.1/7.1.3 with client tools like Designer (Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer), Repository Manager, and Workflow Manager & Workflow Monitor.
• Extensively worked on creating mappings using Informatica Designer and processing tasks using Workflow Manager to configure data flows from multiple sources (flat files, XML files, SAP, Oracle, SQL Server) to target persistent data stores.
• Excellent knowledge of and experience in configuring connections to various sources and creating source-to-target mappings (including SCD mappings), edit rules and validations, transformations, and business rules.
• Extensively worked on ETL job scheduling based on the client's ad hoc requirements; worked with lead developers to migrate code to various environments.
• Performed performance tuning on RDBMS sources by implementing complex SQL query overrides in the Source Qualifier transformation, which filtered out unwanted data before processing and thereby increased processing speed.
• Tuned mappings by reducing the number of Lookup transformations used: joined the lookup tables and supplied a query override in a single Lookup transformation.
• Applied other performance tuning techniques, such as reducing the number of transformations used in a mapping, using an unconnected Lookup when the same table has to be looked up at multiple points in a mapping, and performing joins at the source level (homogeneous sources) instead of using a Joiner transformation (a sketch of these override techniques follows this summary).
• Extensive experience using DataStage client tools: Designer, Administrator, Director, and Manager.
• Experience using Oracle 9i/10g, MS SQL Server 2005, SQL, PL/SQL.
• Hands-on experience in writing, testing, and implementing PL/SQL triggers, stored procedures, functions, and packages.
• Worked in an operational environment following an agile methodology with a six-week iterative approach, supporting fast-paced development, testing, and integration of the project.
• Worked closely with the Business Analyst and the Database and Data Warehouse Architects in determining the exact business rules and functionalities required for delivery of the work items.
• Excellent analytical and logical programming skills with a good understanding at the conceptual level, along with strong presentation, interpersonal, and communication skills and a strong desire to achieve specified goals.
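To illustrate the override techniques noted above, here is a minimal SQL sketch; all table and column names (orders, customers, products, product_categories) are hypothetical placeholders, not from any client system.

    -- Source Qualifier SQL override: filter and join at the source so the
    -- session never reads rows the mapping would later discard.
    SELECT o.order_id,
           o.customer_id,
           o.order_amt,
           c.cust_name,                        -- joined here, avoiding a Lookup
           c.cust_region
    FROM   orders o
    JOIN   customers c ON c.customer_id = o.customer_id
    WHERE  o.order_dt >= TRUNC(SYSDATE) - 30   -- only the rows the mapping needs
    ORDER BY o.customer_id;                    -- pre-sorted for downstream stages

    -- Lookup SQL override: one Lookup transformation built over a join of two
    -- reference tables, replacing two separate Lookup transformations.
    SELECT p.product_id,
           p.product_name,
           g.category_name
    FROM   products p
    JOIN   product_categories g ON g.category_id = p.category_id;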
Skills:
ETL Tools : Informatica Power Center 7.1/7.1.3/8.1, DataStage 7.5.1
RDBMS : Oracle 9i/10g, SQL Server 2005
Programming Languages : SQL, PL/SQL, C, C++, UNIX Shell Scripting, Autosys Scripting
Operating System : Windows 2000/XP, UNIX.
Data Modeling Tools : Erwin 4.0, Visio 2003
Front-end Tools : TOAD 7.x, DbVisualizer 5.0.
Certifications:
Oracle 8i SQL and PL/SQL, Cognizant Academy, 2007
Data Warehousing Basics, Cognizant Academy, 2007
Oracle Associate Exam 1Z0-007, Oracle Corp, 2007
Oracle Associate Exam 1Z0-147, Oracle Corp, 2007
Informatica 7 Mapping Design, Informatica Corp, 2007
Informatica 7 Advanced Mapping Design, Informatica Corp, 2008
Informatica 7 Architecture and Administration, Informatica Corp, 2008
Experience:
Wells Fargo, West Des Moines, IA, ETL Developer
Feb’09 – Present
Wells Fargo Home Mortgage (WFHM) supports Wells Fargo representatives in all branches in distributing home mortgages to customers across the United States. The CORE ODS (Operational Data Store) is used to acquire, prepare, and deliver data generated within the CORE online applications. The COD (Consolidated Originations Database) of the OLTP system is the primary data source for the ODS. The ODS is divided into schemas such as Raw, Staging, and Static to ensure proper cleansing and transformation of data for the various downstream systems.
The CORE ODS delivers data primarily to the LIS (Loan Information System), from which data is delivered to various downstream systems, including operational, reporting, and analytical ones. We analyzed and performed ETL (Extraction, Transformation, and Load) operations on the ODS data of various borrower subject areas, such as RESPA Fee, Escrow, Closing, Property, Landlord, Income, Asset, Employment, and Underwriter, into the LIS system.
Responsibilities
• Performed complex defect fixes in environments such as UAT and SIT to ensure proper delivery of the developed DataStage jobs into the production environment.
• Analyzed the various subject areas in the ODS database with the Business Analyst to determine the exact business functionalities required for delivery to the downstream systems, and worked with the Data Architect in remodeling the existing data model.
• Analyzed the functional specifications provided by the Data Architect and created technical specification documents for all the jobs.
• Coded complex logic using Aggregator, Transformer, and Joiner stages, performing pivoting in the Escrow and Employment subject areas in DataStage for Release 2.09 and in PL/SQL for Release 3.09, per the transformation rules (see the PL/SQL pivoting sketch after this list). Thoroughly unit tested the code in both the Development and UAT environments.
• Restructured DataStage jobs to enhance job performance and clarity and to deliver accurate data to the downstream systems.
• Resolved data issues in the jobs during the project release to ensure correct data reached the business users.
• Performed unit testing for all jobs in subject areas such as Escrow, RESPA Fee, Employment, Property, Closing, and Landlord.
• Worked closely with the testers to identify medium- and high-severity defects that could affect the downstream systems before the release of the project, and fixed those defects before moving the jobs into production.
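As a minimal sketch of the PL/SQL pivoting approach referenced above; the table, column, and fee-type names (stg_escrow_fee and so on) are hypothetical, not the actual WFHM schema:

    -- Pivot escrow line items (one row per loan per fee type) into one row
    -- per loan, the same reshaping a DataStage pivot performs.
    INSERT INTO esc_fee_pivot (loan_id, tax_amt, hazard_amt, mip_amt)
    SELECT loan_id,
           MAX(DECODE(fee_type, 'TAX',    fee_amt)) AS tax_amt,
           MAX(DECODE(fee_type, 'HAZARD', fee_amt)) AS hazard_amt,
           MAX(DECODE(fee_type, 'MIP',    fee_amt)) AS mip_amt
    FROM   stg_escrow_fee
    GROUP BY loan_id;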
EDS Corporation, Plano, TX, Informatica Developer
Apr’08 – Oct’08
EDS is a leading global technology services company that has delivered business solutions to its clients for more than 40 years. This development project builds EDS's in-house Data Warehouse and Data Marts to store data from various financial systems. The individual Data Marts are built from the Data Warehouse to hold data about different applications, which is fed into OLAP cubes. Reports are generated from these cubes, as well as from the Data Marts, for end users through a web interface.
We loaded one such mart, ASLAP, with data on employee full-time equivalents in open and closed states, as well as resource-request information in open, closed, and cancelled states; this data is viewed as reports in the web interface along with the related dimensions.
Responsibilities
• Defined facts and dimensions in the data mart, reviewed source systems, and proposed a data acquisition strategy.
• Performed end-to-end ETL development of the Data Mart.
• Developed numerous Informatica mappings for the ASLAP data mart and thoroughly tested them end to end for successful creation of the Data Warehouse and Data Mart; also created Mapplets and reused them across mappings.
• Created complex mappings using Unconnected Lookup, Sorter, Aggregator, and Router transformations for implementing slowly changing (Type II) and conformed dimensions (a SQL sketch of the Type II pattern follows this list).
• Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
• Developed stored procedures which were used across systems and various Informatica mappings for building different marts.
• Participated in system requirements and design specifications, and was involved in the development of the Star Schema.
• Extensively used the Erwin tool for forward and reverse engineering, following corporate standards for naming conventions and using conformed dimensions wherever possible.
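For reference, a minimal SQL sketch of the Type II pattern those mappings implemented; dim_customer, stg_customer, and the column names are hypothetical placeholders:

    -- Expire the current version of any row whose tracked attributes changed...
    UPDATE dim_customer d
    SET    d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND   (s.cust_name <> d.cust_name OR s.region <> d.region));

    -- ...then insert a new current version for changed and brand-new customers.
    INSERT INTO dim_customer
          (cust_key, customer_id, cust_name, region,
           eff_start_dt, eff_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.cust_name, s.region,
           TRUNC(SYSDATE), TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                       WHERE d.customer_id = s.customer_id
                       AND   d.current_flag = 'Y');

In the actual mappings, this split between expiring and inserting rows was handled by the Router and Update Strategy transformations rather than two SQL statements.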
EDS Corporation, Plano, TX, Informatica Developer
Jun’07 – Apr’08
The objective of this project was to build the in-house Enterprise Data Warehouse (EDW) of EDS to store data from various financial systems. Data was sourced from systems such as HR, Admin, and Finance and used to build the EDW following the Inmon methodology. We built cross-reference tables, which acted as intermediate tables holding data from the various sources, and junction tables, which represented the n:n relationships between those sources (a sketch of such a junction table follows the list below). The EDW in turn acted as the source for building Data Marts for departments such as Finance and HR.
Responsibilities
• Designed and developed complex Informatica mappings to extract data from the staging area and build the EDW using Source Qualifier, Expression, Filter, Sequence Generator, Router, Joiner and Lookup transformations.
• Performed unit testing and documented the results in the test documents in both the Development and Integration environments.
• Troubleshot issues in the production environment and helped resolve them; used debugging tools to trace the flow of data and locate the exact points where expected data was not populated correctly.
• Performed impact analysis for various enhancement requests to find out the systems/programs that could be potentially affected by proposed change(s).
• Developed a high level of competency and anticipated issues in order to resolve them proactively.
• Conducted Brown Bag sessions to share domain and technical knowledge with fellow team members.
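A minimal DDL sketch of the junction-table pattern described above; the table and column names (emp_cost_center_jct and so on) are illustrative, not the actual EDW schema:

    -- Junction table resolving an n:n relationship between employees and
    -- cost centers asserted by more than one source system.
    CREATE TABLE emp_cost_center_jct (
        emp_key          NUMBER        NOT NULL,  -- FK to employee cross-reference
        cost_center_key  NUMBER        NOT NULL,  -- FK to cost center cross-reference
        src_system_cd    VARCHAR2(10)  NOT NULL,  -- which source asserted the link
        eff_start_dt     DATE          NOT NULL,
        eff_end_dt       DATE,
        CONSTRAINT emp_cc_jct_pk PRIMARY KEY
            (emp_key, cost_center_key, src_system_cd, eff_start_dt)
    );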
TORO, Minneapolis, MN, Informatica ETL Developer
Aug’06 – May’07
The Toro Company is committed to helping customers beautify and preserve outdoor environments by making landscapes green, healthy, and safe. It produces superior, innovative, environmentally sound products, services, and systems. The primary objective of the project is to build one common warehouse storing all the conformed dimensions, the fact tables, and the dimensions related to each fact table.
TORO has many subject areas, such as Sales Order and Inventory; we worked in the Financial subject area, which comprises the FDR module of the project. SAP, XML, and flat files are the source systems from which data is pulled into staging; from staging, the required cleansing is done, some calculations are performed, and the results are stored in the respective dimension and fact tables in the warehouse.
Responsibilities
• Data Quality Analysis to determine cleansing requirements.
• Worked with Informatica Power Center tools: Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformation Developer.
• Extracted data from SAP system to Staging Area and loaded the data to the target database by ETL process using Informatica Power Center.
• Designed and developed complex Informatica mappings using transformations such as Normalizer, Router, Lookup (connected and unconnected), and Update Strategy for various data loads.
• Designed and developed PL/SQL stored procedures to perform calculations related to fact measures (a sketch follows this list).
• Converted PL/SQL procedures to Informatica mappings, while also creating procedures at the database level for optimum performance of the mappings.
• Tuned Informatica mappings and sessions for better performance.
• Created sessions, workflows, and post-session email tasks, and performed various workflow monitoring and scheduling tasks.
• Performed Unit testing and maintained test logs and test cases for all the mappings.
• Maintained warehouse metadata, naming standards and warehouse standards for future application development.
• Translated high-level design specifications into simple ETL code, following mapping standards.
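A minimal sketch of the kind of fact-measure procedure described above; the procedure, table, and column names (calc_fdr_margin, stg_financial, f_financial) are hypothetical, not the actual TORO schema:

    CREATE OR REPLACE PROCEDURE calc_fdr_margin IS
    BEGIN
      -- Derive a margin measure for each staged financial record and load it
      -- into the fact table alongside its dimension keys.
      INSERT INTO f_financial
            (date_key, account_key, revenue_amt, cost_amt, margin_amt)
      SELECT s.date_key,
             s.account_key,
             s.revenue_amt,
             s.cost_amt,
             s.revenue_amt - s.cost_amt       -- the calculated fact measure
      FROM   stg_financial s;
      COMMIT;
    END calc_fdr_margin;
    /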
Health Net Inc, Woodland Hills, CA, Informatica ETL Developer
Jan’06 – Jul’06
Health Net, Inc. is one of the nation's largest publicly traded managed health care companies. The primary objective of the MMI Lab Results project is to receive a standard enterprise-wide lab results feed from Quest, design tables in the ODW for lab results, and load the data into the ODW.
Currently, lab results data is posted on the Quest web application; the HEDIS team extracts the data and saves it to a Corp Share folder, from which HEDIS and other departments access it. The lab results data currently received from Quest is inconsistent with respect to data format, transmission method, and frequency of delivery.
To overcome these limitations, Health Net planned to standardize the layout of the lab results file, build tables in the ODW to house lab results data, design and develop ETL components to load the lab results tables in the ODW, and design and develop ETL components to extract data from those tables and send it to other Health Net business areas and external vendors.
Responsibilities
• Developed the Extract, Transform, and Load design strategy for various ODW tables.
• Developed mappings per the design documents and performed testing with various sample data.
• Developed mappings with transformations such as Filter, Joiner, Sequence Generator, and Aggregator, and performed query overrides in Lookup transformations as needed to improve mapping performance (see the override sketch after this list).
• Coordinated with team members for effective management of all project requirements, and was responsible for deliverables and receivables during all project phases.
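A minimal sketch of the kind of Lookup query override used here; the member table and its columns are hypothetical, not the actual ODW schema:

    -- Lookup SQL override: cache only current, matchable rows and pre-sort
    -- on the lookup condition column to speed cache builds and matches.
    SELECT m.member_id,
           m.member_nbr,
           m.plan_cd
    FROM   member m
    WHERE  m.active_flag = 'Y'          -- cache only rows the mapping can match
    ORDER BY m.member_id;               -- sorted on the lookup condition column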
Education:
B.E., Electronics and Instrumentation Engineering, Anna University, Chennai.