
Informatica Data Architect

Location:
Springfield Township, NJ
Posted:
March 24, 2017


PRANAV R UPADHYAYA

Professional Summary

•12+ years of IT experience in the analysis, design, development and implementation of business application systems for the Banking, Retail, Healthcare and Insurance industries.

•10+ years of experience implementing Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI and client/server applications.

•10+ years of extensive experience in Data Warehouse applications using Informatica, Oracle, Teradata, MS SQL Server, IBM DB2 and XML on Windows and UNIX platforms.

•4+ years of experience in Informatica Data Quality and Master Data Management.

•Expert in Data Warehousing using Informatica PowerCenter 8.5/9.1/9.6 and Oracle 10g/11g.

•Expert with 5+ years of experience with Informatica and Teradata, using TPT, MultiLoad, FastLoad and BTEQ.

•8+ years of experience in dimensional modeling: Star Schema, Snowflake Schema, and fact and dimension tables.

•Extensive experience in writing SQL and PL/SQL programs, including DDL, DML, DB Links, Views, Materialized Views, Indexes, Analytical Functions, Dynamic SQL, Bulk Loads, Packages, Procedures and Functions.

•Extensive experience in developing strategies for extraction, transformation and loading of data from various sources into Data Warehouses and Data Marts using Informatica, UNIX scripts and Oracle PL/SQL.

•Expert in interacting with clients and Subject Matter Experts to understand functional requirements and prepare high-level and low-level technical design documents.

•Expert in performance tuning of ETL jobs and database queries.

•Expert in Monitoring and Support of Enterprise Data Warehouse applications.

•Expert in multiple databases: Oracle 11g/10g/9i/8i, MS SQL Server, Teradata, DB2 and MS Access.

•Expert in UNIX shell scripting for file validation and manipulation, scheduling, FTP, SFTP and database interactions.

•Adept at writing Data Mapping Documents and Data Transformation Rules, and at maintaining Data Dictionary and Interface Requirements documents.

•Exposure to end-to-end SDLC with Waterfall, Cyclic and Agile methodologies.

•Hands-on experience configuring business metadata using OFSAA's AAI framework, including Datasets, Measures, Business Processes, Business Rules and Data Transformations.

•Experience in building data loads using OFSAA's Table-to-Table (T2T) and File-to-Table (F2T) objects.

•Well versed with various delivered OFSAA processes and their configuration, e.g. Slowly Changing Dimension (SCD) and Exchange Rate Load.

•Expert in Partitioning of database tables and various partition strategies

•As the Informatica Administrator, responsible for the smooth operation of Informatica in the Data Warehouse environment, including installing, upgrading, maintaining and tuning Informatica PowerCenter.

•Knowledgeable in HP QC and Visio, and in creating process models, process diagrams and flow charts.

•Well-organized quick learner with excellent communication skills and a commitment to delivering on schedule.

•Expert in Data Warehousing concepts: E-R modeling (3NF) and dimensional modeling (Star Schema, Snowflake Schema), database architecture for OLTP and OLAP applications, Data Analysis and ETL processes.

•Thorough understanding of enterprise data warehouses built using top-down or bottom-up methodologies.

•Experience with IBM Cognos, OBIEE, Business Objects, Tableau and SSIS.

•Outstanding knowledge of the Microsoft Office suite (MS Word, MS Excel, MS PowerPoint, MS Project, MS Visio).

Skills

ETL Tools

Informatica PowerCenter 6.5 through 9.6

PowerExchange, IDQ (Informatica Developer), Informatica Analyst, Informatica MDM, Informatica Metadata Manager

SSIS

Business Intelligence Tools

MicroStrategy 7.x

Cognos ReportNet 6.x

OBIEE 11.x

Business Objects

Tableau

Data Modeling

Dimensional modeling, Star Schema modeling, Snowflake modeling, fact and dimension tables, logical/physical designs, Data Mart, ODS, ERwin, TOAD

Databases

Oracle 11.x/10g/9i

MS SQL Server 2005, 2010, 2013

Teradata V2/R13

MS Access 2000

Programming

SQL, PL/SQL, BTEQ, UNIX Shell Scripting, TOAD, MultiLoad, TPump, SQL*Loader

Scheduling Tools

Autosys, Informatica scheduler, UNIX scripts, crontab

Professional Experience

Nov 2016 – Current

Employer – Gallant Global

Role: Data Warehousing – Informatica Lead/Architect

Location – Dallas, Texas

Client - CBIzSoft

This project integrated all of the client's (CBIZSOFT) transactions into a data warehouse so that management could review placements and make decisions to improve hiring rates, focus on in-demand technologies, improve staffing turnaround time and increase the per-employee placement record.

Responsibilities:

Analyzed business requirements and worked closely with various application teams and business analysts to develop ETL procedures that are consistent across all applications and systems.

Wrote Informatica ETL design documents, established ETL coding standards and performed Informatica mapping reviews.

Migrated Teradata and Informatica code (parameter files, mappings, sessions and workflows) and scheduler information across the Dev, QA, UAT and PROD servers using deployment scripts (Teradata) and deployment groups (Informatica)

Worked on consolidating various Oracle sources to load data into Teradata

Designed and maintained end-to-end mappings, sessions, workflows and tables in Informatica and Teradata

Worked on Teradata Parallel Transporter (TPT) and pushdown optimization (PDO)

Used most of the transformations, such as Java, SQL transformation, Router, Sequence Generator, Expression and Lookup (connected and unconnected), while transforming the data according to the business logic

Achieved performance improvements by analyzing explain plans, identifying and defining indexes, collecting statistics on columns used in joins, and applying compression on appropriate attributes to save disk space
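
For illustration, a minimal sketch of the kind of statistics collection and compression work described above, run through BTEQ; all database, table and column names are hypothetical.

    #!/bin/ksh
    # Collect statistics on join columns and compress a low-cardinality
    # attribute to save disk space (all object names hypothetical).
    bteq <<EOF
    .LOGON tdprod/etl_user,${TD_PASS};
    COLLECT STATISTICS ON sales_db.fact_sales COLUMN (store_id);
    COLLECT STATISTICS ON sales_db.fact_sales COLUMN (sale_date);
    ALTER TABLE sales_db.fact_sales
      ADD channel_code CHAR(2) COMPRESS ('ST','WB','PH');
    .LOGOFF;
    .QUIT;
    EOF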

Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session Level

Responsible for the Architecture, technical analysis, design, development, testing, deployment and administration of data warehousing using Informatica Power Center

Involved in Informatica issue resolution, folder migration, Administration and served as the first line of support internally for any Informatica issues raised by the project teams

Maintained accountability through professional documentation and weekly status reports

Led a team of 4 developers: task allocation, task tracking, reporting of risks and issues, and review of deliverables

July 2013 – Sep 2016

Employer – Deloitte Consulting

Role: Data Warehousing – Informatica Lead/Architect

Location – Bangalore, India

Client - Wells Fargo

Designed, developed and implemented an ETL solution for Wells Fargo bank (Wholesale division) to report relationship managers' profitability and enable them to increase their margins by cross-selling and up-selling. The ETL solution encompassed developing mappings that load data from various source systems into the corresponding data warehouse.

Designed and developed an architectural ETL road map and dimensional model for a system that enhanced profitability analysis and increased cross-sell and up-sell of products by relationship managers by 20%.

Interacted with various business analysts, both onsite and offshore, and documented high-level and low-level design documents

Designed a conceptual data model based on the requirements; interacted with non-technical end users to understand the business logic.

Executed data modeling using ERwin data modeling tool

Modeled the logical and physical diagrams of the future state; delivered the BRD and the low-level design document.

Discussed the Data model, data flow and data mapping with the application development team

Created PL/SQL procedures, triggers, views, materialized views

Created detailed ETL standards documents for design, development, release management and production support

Analyzed business requirements with the functional and DBA team and created and maintained the dimensional model (star schema)

Architected the ETL data loads coming in from the source system and loading into the data warehouse

Designed an audit, error and alert system to monitor various business, operational and technical metrics.

Created Informatica Mappings using Mapping Designer to load the data from various sources using different transformations like SQL, Normalizer, Aggregator, Expression, Stored Procedure, Filter, Joiner, Lookup, Router, Sequence Generator, Source Qualifier, and Update Strategy transformations.

Used Autosys as Job Scheduling tool to schedule UNIX shell scripts and Informatica jobs.

Involved in Unit Testing, Integration Testing and Performance Testing of ETL.

Responsible for handling large volume data conversions, data cleansing and following data delivery standards

Applied business rules using complex SQL and validation rules to check data consistency.

Performed data validation and massaging to ensure accuracy and quality of data

Created detailed ETL migration processes for the Informatica, Database, Scheduling, O/S and H/W teams

Designed and developed PL/SQL procedures to load data into fact and bridge tables, implementing complex logic that was not possible within the tool
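
A minimal sketch of such a PL/SQL load procedure, using BULK COLLECT/FORALL; the staging and fact table names are hypothetical, run here through SQL*Plus.

    #!/bin/ksh
    # Hypothetical bulk load from a staging table into a fact table.
    sqlplus -s etl_user/${ORA_PASS}@DWPROD <<EOF
    DECLARE
      TYPE t_ids  IS TABLE OF stg_orders.order_id%TYPE;
      TYPE t_keys IS TABLE OF stg_orders.cust_key%TYPE;
      TYPE t_amts IS TABLE OF stg_orders.amount%TYPE;
      v_ids t_ids; v_keys t_keys; v_amts t_amts;
      CURSOR c_src IS SELECT order_id, cust_key, amount FROM stg_orders;
    BEGIN
      OPEN c_src;
      LOOP
        -- Fetch in batches to limit PGA memory, then array-insert.
        FETCH c_src BULK COLLECT INTO v_ids, v_keys, v_amts LIMIT 10000;
        EXIT WHEN v_ids.COUNT = 0;
        FORALL i IN 1 .. v_ids.COUNT
          INSERT INTO fact_orders (order_id, cust_key, amount)
          VALUES (v_ids(i), v_keys(i), v_amts(i));
        COMMIT;
      END LOOP;
      CLOSE c_src;
    END;
    /
    EOF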

Optimized the performance of the Informatica mappings by analyzing the session logs and understanding various bottlenecks (source/target/transformations)

Involved in PL/SQL query optimization to reduce the overall run time of stored procedures

Created UNIX shell scripts to invoke the Informatica workflows & Oracle stored procedures
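
A minimal sketch of such a wrapper script; the service, domain, folder, workflow and procedure names are hypothetical.

    #!/bin/ksh
    # Start an Informatica workflow and, on success, run a post-load procedure.
    pmcmd startworkflow -sv INT_SVC_PROD -d DOM_PROD \
        -u ${INFA_USER} -p ${INFA_PASS} \
        -f DW_LOADS -wait wf_load_fact_orders
    if [ $? -ne 0 ]; then
        echo "wf_load_fact_orders failed" >&2
        exit 1
    fi
    sqlplus -s etl_user/${ORA_PASS}@DWPROD <<EOF
    EXEC post_load_aggregates;
    EOF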

Designed Autosys scheduler jobs to invoke the UNIX shell scripts & Informatica Jobs
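
A sketch of the corresponding Autosys job definition, loaded through the jil utility; the job, machine and script names are assumptions.

    # Hypothetical Autosys command job that runs the wrapper script nightly.
    jil <<EOF
    insert_job: DW_LOAD_FACT_ORDERS
    job_type: c
    command: /apps/etl/scripts/run_wf_load_fact_orders.ksh
    machine: etlprod01
    owner: etladm
    days_of_week: mo,tu,we,th,fr,sa,su
    start_times: "02:00"
    alarm_if_fail: 1
    EOF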

Involved in unit testing of various objects (Informatica workflows/Sybase stored procedures/UNIX scripts)

Supported various testing cycles during the SIT & UAT phases.

Supported the daily/weekly ETL batches in the Production environment

Designed and developed reusable common objects shared across various repositories for audit purposes

Automated, redesigned and tuned several ETL processes for optimal utilization of time and resources, reducing load times by over 40%.

Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created various data quality mappings in the Informatica Data Quality tool and imported them into Informatica PowerCenter as mappings and mapplets

Created data profiles and scorecards using Informatica Data Quality and helped business analysts understand the tool components

Configured the Informatica Analyst tool (IDE) and helped data stewards and business owners profile the source data, create scorecards, apply built-in DQ rules and validate the results

Extensively used the Informatica Developer tool to build rules and mappings

Designed, developed and implemented a detailed ETL testing plan and procedures.

Provided estimates for ETL deliverables and oversaw progress to ensure quality ETL deliverables.

The data was fed to the OFSAA model, from where the analytical engine provided relationship managers with profitability reports.

Created several repository mappings and queries facilitating rapid impact analysis, troubleshooting, code verification and deployment.

Designed and developed metadata reports for the ETL, Business and Support teams for day-to-day needs

Conducted a Tableau POC for the client to showcase sales/revenue activity by geography

Dec 2007 – June 2013

Employer – Cognizant Technology Solutions

Role: Data Warehousing – Informatica Developer

Location – San Antonio, Texas (1.5yrs), Bangalore, India (3.5yrs)

Client - HEB

This project mainly involved enhancement of the client's existing data warehouse. The data warehouse had multiple data sources, including Point of Sale, Supply Chain and Loss Prevention. All the data had to be fed from the point-of-sale, supply chain, store feed and employee systems into the ODS in Oracle and then moved to the data warehouse on Teradata.

Modeled the business rules into a functional star schema database design.

Worked in close coordination with SMEs to reverse engineer old and obsolete database stored procedures and convert the encapsulated business logic into Informatica mappings

Created the data models for enhancing the OLAP and analytical systems.

Engaged in data profiling to integrate data from different sources.

Performed the gap analysis and impact analysis

Involved in all aspects of the project life cycle in a lead role: project planning, requirements gathering, analysis, team building, architecting the solution, mentoring team members, serving as the SME, coordinating tests, managing support, build and delivery

Maintained warehouse metadata, naming standards and warehouse standards for future application development.

Used FastLoad, MultiLoad, Teradata Parallel Transporter (TPT) and FastExport connections to load and export data from Teradata tables.
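
For illustration, a minimal FastLoad sketch of this kind of utility load; the table, file and credential names are hypothetical.

    #!/bin/ksh
    # Hypothetical FastLoad of a pipe-delimited extract into an empty staging table.
    fastload <<EOF
    LOGON tdprod/etl_user,${TD_PASS};
    DATABASE stage_db;
    SET RECORD VARTEXT "|";
    DEFINE item_id  (VARCHAR(18)),
           store_id (VARCHAR(10)),
           qty_sold (VARCHAR(12))
    FILE = /data/in/pos_sales.dat;
    BEGIN LOADING stg_pos_sales
          ERRORFILES stg_pos_sales_e1, stg_pos_sales_e2;
    INSERT INTO stg_pos_sales VALUES (:item_id, :store_id, :qty_sold);
    END LOADING;
    LOGOFF;
    EOF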

Worked extensively on SQL, BTEQ scripts and UNIX shell scripts; scheduled jobs in crontab, the Informatica scheduler and Autosys
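
A sketch of the kind of crontab entries used for such scheduling; the script and log paths are hypothetical.

    # Hypothetical crontab entries: stage loads at 01:30, warehouse loads at 03:00.
    30 1 * * * /apps/etl/scripts/run_stage_loads.ksh >> /apps/etl/logs/stage.log 2>&1
    0  3 * * * /apps/etl/scripts/run_dw_loads.ksh    >> /apps/etl/logs/dw.log    2>&1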

Identified and assessed business needs, anticipated future data requirements and implemented solutions within the data warehouse.

Worked on the odbc.ini file to create ODBC connections in Informatica.
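
For illustration, a hypothetical odbc.ini entry of the kind referenced here; the driver path and DSN values are assumptions.

    #!/bin/ksh
    # Append a Teradata DSN to the odbc.ini referenced by $ODBCINI (paths hypothetical).
    cat >> ${ODBCINI} <<EOF
    [td_prod]
    Driver=/opt/teradata/client/odbc_64/lib/tdata.so
    Description=Teradata production warehouse
    DBCName=tdprod
    DefaultDatabase=stage_db
    EOF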

Wrote FTP and SFTP scripts to get and put files from third-party sites
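
A minimal SFTP sketch of such a file pull; the host, directories and file names are hypothetical.

    #!/bin/ksh
    # Pull the day's vendor file over SFTP using batch commands on stdin.
    sftp -b - etluser@ftp.vendor.example.com <<EOF
    cd /outbound
    get pos_sales_$(date +%Y%m%d).dat /data/in/
    bye
    EOF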

Wrote Teradata stored procedures to move data from the Stage to the Integration layer, including complex procedures that implement business logic to load data into the integration layer.

Implemented various data quality rules in Informatica Developer and created scorecard as per the business requirements using Informatica Analyst (IDQ).

Reengineered ETL jobs and Processes to bring down the load times by more than 40%.

Installed, upgraded, configured and maintained Informatica PowerCenter client/server.

Widely used Database objects like Views, Synonyms and Triggers for security and reducing the load on ETL tools.

Tuned ETL jobs loading terabytes of data for optimum performance, dependencies and batch design via parallel partitioning, indexing, explain plan optimization and pushdown optimization
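
For illustration, a sketch of the kind of range partitioning and local indexing used in such tuning; the table and column names are hypothetical.

    #!/bin/ksh
    # Hypothetical range-partitioned Oracle fact table with a local index.
    sqlplus -s etl_user/${ORA_PASS}@DWPROD <<EOF
    CREATE TABLE fact_sales (
      sale_date DATE,
      store_id  NUMBER,
      amount    NUMBER(12,2)
    )
    PARTITION BY RANGE (sale_date) (
      PARTITION p2012 VALUES LESS THAN (DATE '2013-01-01'),
      PARTITION p2013 VALUES LESS THAN (DATE '2014-01-01'),
      PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );
    CREATE INDEX ix_fact_sales_store ON fact_sales (store_id) LOCAL;
    EOF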

Designed the logical model for the MDM solution, provided input on best practices and influenced how the data model could be better adapted to support existing data integration needs

Using Informatica MDM, configured Landing Tables, Base Objects, Relationships, Staging Tables, Mappings, custom cleanse functions, Match and Merge settings, and Trust and Validation Rules

Worked with Business users in understanding Match & Merge setup in MDM and incorporated their requirements and ideas

Collaborated with users in defining Match Rules and conducted workshops to discuss over and under matching

Created and executed test plans for Unit, Integration, and System test phases.

Interacted closely with DBAs to improve existing long-running jobs in production and to solve production issues

Generated and published metadata reports for the data model, ETL and support teams

Responsible for setting up the Informatica upgrade approach plan: moving ETL code in waves, ensuring code was properly tested and migrated to the new-version environment, and retiring existing code.

As Administrator, responsible for the creation and automation of server health maintenance scripts covering Informatica server startup, CPU and memory utilization checks, repository and log file backups, and Informatica coding standards verification and validation.

As Informatica Administrator, created multiple process-related documents, shared them with the client and trained team members; handled Informatica object creation (folders, users, user groups, connection objects) and privilege assignment.

Oct 2004 – Nov 2007

Employer - Accenture

ETL Developer

Location – Minneapolis, MN (6months) and Bangalore, India (2.5yrs)

Client - UHG

Estimated the effort required for the design and development of ETL components and modules for three different releases and enhancements; provided estimates based on the degree of complexity and the number of transformations used in an ETL.

Designed and created ETL mappings using Informatica Mapping Designer.

Produced the data dictionary and database support documentation for the customer so the system could be turned over to them for future support

Built the physical data model for customer review and approval using Oracle 9i

Created DDL, DML scripts. Converted text data into Oracle data.
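
A minimal SQL*Loader sketch of this kind of text-to-Oracle conversion; the file, table and column names are hypothetical.

    #!/bin/ksh
    # Load a comma-delimited text file into an Oracle table with SQL*Loader.
    cat > /tmp/cust.ctl <<EOF
    LOAD DATA
    INFILE '/data/in/customers.txt'
    APPEND INTO TABLE customers
    FIELDS TERMINATED BY ','
    (cust_id, cust_name, city)
    EOF
    sqlldr etl_user/${ORA_PASS}@DWPROD control=/tmp/cust.ctl log=/tmp/cust.log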

Created PL/SQL procedures and triggers, generated application data, and created users and privileges.

Followed the Iterative Waterfall model for the Software Development Life Cycle (SDLC) process.

Worked extensively on performance tuning by making changes to SQL in Source Qualifier.

Used various Informatica transformations like Expressions, Filters, Joiners, Aggregators, Routers and Lookups to load cleaner and more consistent data into the targets.

Fine-tuned existing mappings for better performance; documented mappings and transformations.

Responsible for Technical documentation of ETL Process.

Interacted with end users and gathered requirements.

Wrote procedures and functions in PL/SQL; performed troubleshooting and performance tuning of PL/SQL scripts.

Created facts and dimensions according to the business requirements.

Monitored and improved the performance of daily jobs on the Oracle database.

Identified integrity constraints for tables.

Developed procedures and functions using PL/SQL and developed critical reports.

Extensively used Normal Join, Full Outer Join, Detail Outer Join and Master Outer Join in the Joiner Transformation

Employed various Lookup Caches like Static, Dynamic, Persistent, and Shared Caches

Built Business Objects reports for data validation and reconciliation

EDUCATION

Master of Computer Applications – 2003, NITK Surathkal

Bachelor of Science, Computers and Electronics – 2000, Mangalore University


