
Data Manager

Location:
Fremont, CA
Posted:
April 06, 2021



PROFESSIONAL SUMMARY

Over *+ years of IT experience in analysis, design, development, and implementation of software solutions in Data Warehousing using Informatica PowerCenter 10.1.1.

Extensive database experience using Oracle 11g/10g/9i, DB2, MS SQL Server 2012, SQL, and PL/SQL.

Strong Data Warehousing/ETL experience using Informatica PowerCenter Client tools - Designer, Repository Manager, Workflow Manager, and Monitor.

Strong experience working with Enterprise Data Warehouse (EDW) frameworks and direct experience with BI/DW projects over the full development lifecycle.

Experience in handling Informatica Power Center for ETL extraction, transformation and loading into target Data warehouse.

Strong in Data warehousing concepts, dimensional Star Schema, Snowflakes Schema methodologies, slowly changing Dimensions (SCD Type1/Type2).
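
The SCD Type 2 pattern mentioned here can be sketched outside Informatica as well. The following is a minimal, hypothetical Python illustration (the record layout and field names are invented for the example, not taken from any project above):

```python
from datetime import date

def scd2_apply(dimension, incoming, today=None):
    """Apply SCD Type 2 logic: expire changed rows and insert new versions.

    dimension: list of dicts with keys id, attrs, eff_from, eff_to, current
    incoming:  list of dicts with keys id, attrs
    """
    today = today or date.today().isoformat()
    by_id = {row["id"]: row for row in dimension if row["current"]}
    for rec in incoming:
        cur = by_id.get(rec["id"])
        if cur is None:
            # brand-new key: insert as the current version
            dimension.append({"id": rec["id"], "attrs": rec["attrs"],
                              "eff_from": today, "eff_to": None, "current": True})
        elif cur["attrs"] != rec["attrs"]:
            # changed: close out the old version, open a new one
            cur["eff_to"] = today
            cur["current"] = False
            dimension.append({"id": rec["id"], "attrs": rec["attrs"],
                              "eff_from": today, "eff_to": None, "current": True})
        # unchanged rows are left alone (SCD Type 1 would overwrite instead)
    return dimension

dim = [{"id": 1, "attrs": {"city": "Fremont"}, "eff_from": "2020-01-01",
        "eff_to": None, "current": True}]
dim = scd2_apply(dim, [{"id": 1, "attrs": {"city": "Cerritos"}},
                       {"id": 2, "attrs": {"city": "Chennai"}}], today="2021-04-06")
```

The key design point is that history is preserved: a changed key yields two rows, with the old one closed out by an effective-to date rather than overwritten.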

Extensive experience in creation of complex ETL mappings, mapplets and workflows using Informatica Power Center to move data from multiple sources into target area.

Expertise in data modeling for Data Warehouse/Data Mart development and in data analysis for Online Transaction Processing (OLTP), Data Warehousing (OLAP), and Business Intelligence (BI) applications.

Proficiency in Salesforce real-time integration using APIs (REST/SOAP/Bulk) and web services.

Adept at Salesforce CRM Configuration, Customization, and Testing of applications.

Participated at varying experience levels in building and supporting ETL processes for multiple systems.

Experienced in estimation, planning, risk management, finalization of technical/functional specifications, communication management, and quality management of the product. Sound knowledge in tuning Informatica mappings, identifying bottlenecks, and resolving issues to improve the performance of data loads and extracts.

Proven experience translating business problems into actionable data quality initiatives.

Experience in data profiling & data quality rules development with business and IT end users.

Experience with Informatica Analyst 9.6.1 to analyze the Source data quality issues by creating the Profiles and applying the data quality rules.

Experience in implementing the Change Data Capture Process using Informatica Power Exchange.
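
As a conceptual illustration only: Informatica PowerExchange captures changes from database logs, but the essence of CDC can be sketched as a snapshot comparison. All names and data below are hypothetical:

```python
def capture_changes(previous, current):
    """Naive change data capture by comparing two snapshots.

    previous/current: dict mapping key -> row contents.
    Returns a list of (operation, key) events. A log-based CDC tool like
    PowerExchange avoids full-table scans; this is only a conceptual sketch.
    """
    events = []
    for key, row in current.items():
        if key not in previous:
            events.append(("INSERT", key))
        elif previous[key] != row:
            events.append(("UPDATE", key))
    for key in previous:
        if key not in current:
            events.append(("DELETE", key))
    return events

events = capture_changes({1: "a", 2: "b"}, {2: "b2", 3: "c"})
```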

Expertise in implementing Customer, Product and Supplier Master Data Management domains.

Experience in working with Informatica DVO tool and creating SQL Views, Lookup Views and Table pairs for the Automation Testing.

Extensive experience in application design and implementation of financial and business data solutions according to business needs and requirements, across the Financial, Banking, and Insurance domains.

Installed and configured users and user groups in Informatica PowerCenter on different operating systems.

Involved in migrating Informatica PowerCenter services from 9.x to 10.x.

Implemented security mechanisms on the servers.

Developed UNIX shell scripts to automate applications and schedule jobs.

Experience in Oracle BI Apps that include OBIEE, ETL Tools and DAC.

Experience in developing stored procedures, functions, triggers, cursors and joins, SQL Queries with T-SQL, PL/SQL and working with TOAD, SQL Developer, SQL*PLUS.

Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).

Designed, Installed, Configured core Informatica MDM components such as Informatica MDM Console, Hub Store, Hub Server.

Work experience in Agile and Scrum methodologies.

Improved the performance of mappings using various optimization techniques.

SKILLS

ETL Tools

Informatica PowerCenter 10.2, 10.1.1, 9.6/9.0.1, Informatica DVO 9.5, MS SSIS 2012

Reporting Tools

COGNOS BI- Framework manager, OBIEE

Programming Languages

SQL, PL/SQL, HTML, Python 2/3, Java, JDBC, UNIX shell scripting

Database

Oracle 10g/11g, DB2 (Mainframe), Teradata, MS SQL Server 2012, Vertica

Operating System

Windows, LINUX / UNIX

Programming Tools

SQL*Plus, SQL*Loader, TOAD 10.0, PuTTY

Scheduling tools

Control M, Crontab, Tidal, DAC

Cloud Technologies

AWS, Azure Cloud

Defect Management

HP Quality Center, HP ALM, JIRA

PROFESSIONAL EXPERIENCE

Caremore

Location: Cerritos, CA

Project: Informatica Upgrade Project

Role: Sr. ETL Informatica Developer Jan 20 – Present

Responsibilities

Responsible for performing end-to-end (E2E) testing of ETL functionality. Involved in the installation, configuration, and upgrade of Informatica from 10.1 to the 10.2 HotFix.

Responsible for creating test scripts and test cases such as Data Validation Testing, Integration Testing, Smoke Testing, and Benchmark Testing.

Designed, developed, and implemented data extraction, staging, transformation, and loading (ETL) procedures to populate an operational data store (ODS) and enterprise data warehouse (EDW) from multiple data sources.

Involved with administration of Informatica repository and user administration.

Well versed in developing and maintaining metadata repositories.

Experience in UNIX Shell Scripting.

Expertise in fine-tuning the mapping and sessions by identifying the bottlenecks to improve the performance.

Proficient in PL/SQL Stored Procedures, Triggers, Packages along with tools like TOAD, Rapid SQL and PL/SQL Developer.

Experience in Systems Analysis, Relational and Dimensional Data Modeling, Design, and Development on Oracle databases.

Experience in Developing Positive and Negative test cases.

Extensive experience in design and development of ETL processes from DB2, Oracle, SQL Server, Flat Files, Mainframe Files, and Excel Spreadsheets.

Experience with data cleansing issues and Metadata Management in the implementation of ETL methodology in Data Extraction, Transformation and Loading using Informatica.

Experience in reviewing Python code to troubleshoot test cases and bug issues.

Proven experience in Data Profiling, Analysis and Data cleansing.

Involved in functional and technical systems analysis and design, systems architectural design, presentations, process/data flow design, system impact analysis, star and snowflake schemas, slowly changing dimensions, gap analysis, and documentation.

Conducted reviews and coordinated with offshore teams.

Team player with good interpersonal and communication skills and the ability to work in a team environment.

Expertise in the concepts of building Data Warehouses/Data Marts: Data Access Layer, Data Staging Layer (ETL), and Data Presentation Layer.

Developed a python script to initiate a web service call that will further extract the operational data in XML form and load it into the SQL tables.
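
A minimal sketch of that kind of script, assuming a hypothetical `orders` payload and using SQLite in place of the actual SQL Server target; the real script would fetch the XML from the web service (e.g. via `urllib.request`) rather than embed it:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Stand-in for the web-service response so the sketch runs offline.
SAMPLE_XML = """
<orders>
  <order id="101"><amount>250.00</amount><status>SHIPPED</status></order>
  <order id="102"><amount>99.50</amount><status>OPEN</status></order>
</orders>
"""

def load_orders(xml_text, conn):
    """Parse the operational XML payload and load rows into a SQL table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(id INTEGER PRIMARY KEY, amount REAL, status TEXT)")
    root = ET.fromstring(xml_text)
    rows = [(int(o.get("id")),
             float(o.findtext("amount")),
             o.findtext("status")) for o in root.iter("order")]
    # Upsert so a re-run of the extract does not duplicate rows
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
n = load_orders(SAMPLE_XML, conn)
```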

Well experienced with Oracle, DB2, and Salesforce application databases for data warehouses.

Developed data marts and integrated data with heterogeneous source systems.

Expertise in the concepts of building Fact Tables, Dimensional Tables, handling Slowly Changing Dimensions and usage of Surrogate Keys. Extensive experience in backup and recovery process of Informatica Repository.

Expertise in Developing Mappings and Mapplets/Transformations between Source and Target using Informatica Designer.

Worked on Microsoft Azure storage to write and read files stored on Azure cloud.

Worked on Azure SQL and Azure SQL Data Warehouse to write data onto Azure SQL platforms.

Experience in scheduling Sessions and Batches in Informatica Server Manager.

Experienced in Data Mining, Data Validation, and documentation.


Experience with coordination between onsite and offshore teams.

Hands-on experience in business analysis and requirements gathering.

Reviewed the Business Requirement Documents and the Functional Specification.

Prepared Test Plan from the Business Requirements and Functional Specification.

Developed Test Cases for Deployment Verification, ETL Data Validation, Cube Testing and Report testing.

Worked on Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.

Tested to verify that all data were synchronized after troubleshooting, and used SQL queries to verify/validate the test cases.

Reviewed Informatica mappings and test cases before delivering to Client.

Responsible for the requirements / ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.

Wrote several UNIX scripts for invoking data reconciliation.
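
Those reconciliation scripts were shell-based; as a hedged illustration, the same count-and-checksum idea can be sketched in Python against throwaway SQLite databases (the table and column names here are invented for the example):

```python
import sqlite3

def reconcile(src, tgt, table):
    """Compare row counts and an amount checksum between source and target."""
    count = lambda c: c.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    total = lambda c: c.execute(
        f"SELECT COALESCE(SUM(amount), 0) FROM {table}").fetchone()[0]
    return {"count_match": count(src) == count(tgt),
            "sum_match": total(src) == total(tgt)}

# Two in-memory databases stand in for the real source and warehouse.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
src.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
tgt.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
result = reconcile(src, tgt, "sales")
```

A mismatch in either figure signals a load that needs investigation before the batch is signed off.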

Experienced in writing complex SQL queries for extracting data from multiple tables.

Performed testing based on Change Requests and Defect Requests.

Prepared System Test Results after test case execution.

Performed Functional, Regression, Data Integrity, System, and Compatibility testing.

Extensively executed T-SQL queries to view successful data transactions and to validate data in the SQL Server database.

Extensively used Informatica power center for extraction, transformation and loading process.

Used TOAD to perform manual tests on a regular basis. Used UNIX and Oracle in this project to write shell scripts and SQL queries.

Wrote SQL queries to validate source data versus data in the data warehouse including identification of duplicate records.
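
A duplicate-identification query of that kind can be illustrated as follows, using SQLite and an invented `dw_customer` table as a stand-in for the warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dw_customer (customer_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO dw_customer VALUES (?, ?)",
                 [(1, "A"), (2, "B"), (2, "B"), (3, "C")])

# Flag natural keys that were loaded more than once into the warehouse table.
dupes = conn.execute("""
    SELECT customer_id, COUNT(*) AS n
    FROM dw_customer
    GROUP BY customer_id
    HAVING COUNT(*) > 1
""").fetchall()
```

The same `GROUP BY ... HAVING COUNT(*) > 1` shape works unchanged in Oracle or SQL Server.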

Experienced in writing test cases, test scripts, and test plans; executed test cases and reported and documented the test results using Mercury Quality Center.

Prepared Test status reports for each stage and logged any unresolved issues into Issues log.

Used T-SQL for Querying the SQL Server database for data validation.

Environment: Informatica 10.2, Oracle 12c, SQL, T-SQL, PL/SQL, Toad, SQL Loader, Salesforce, Python, OBIEE (Reporting Tool), Vertica, JIRA (Ticketing Tool).

XL Catlin

Project: EFRL

Location: Chennai, India

Role: Sr ETL Informatica Developer Apr 18 – Aug 19

Responsibilities

•Involved in gathering and analyzing the requirements and preparing business rules.

•Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, etc., transformations to implement complex logic while coding a mapping.

•Worked with Informatica power center Designer, Workflow Manager, Workflow Monitor and Repository Manager.

•Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, SQL server and Flat files and loaded into Oracle.

•Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager.

•Involved in the data analysis for source and target systems, with a good understanding of Enterprise Data Warehousing (EDW), staging tables, Dimensions, Facts, Star Schema, and Snowflake Schema.

•Experience in Oracle BI Apps that include OBIEE, ETL Tools and DAC

•Developed mappings in Informatica PowerCenter to extract data from OLTP applications such as Oracle EBS, Oracle BRM, Siebel CRM, and other legacy systems, and loaded the data into the Oracle data warehouse for reporting in OBIEE.

•Involved in creating new table structures and modifying existing tables to fit into the existing data model.

•Extracted data from different databases like Oracle and external source systems like flat files using ETL tool.

•Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.

•Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 9.6.

•Created integrated test environments for the ETL applications developed in Go using Docker and Python APIs.

•Performed data analysis and data visualization using Python.

•Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.

•Writing complex SQL queries to analyze source data and communicating data quality issues to the business

•Involved in Performance Tuning of mappings in Informatica.

•Good understanding of source to target data mapping and Business rules associated with the ETL processes.

•Generated reports for the users on a weekly basis.

Design:

•Actively participated in business requirements gathering.

•Key player in generating STTMs (Source-to-Target Mappings).

•Documented Technical Design Documents

Process adherence:

•Maintained SOX evidence documents such as the Rollout document, Backout document, and CRL before requesting migration to higher environment levels.

•Designed Test case templates and presented to business before PROD rollouts.

Overall SDLC (Software Development Life Cycle) experience:

•Attained experience in an end-to-end implementation during the build of the EIM Organization Model and integrated conceptual data model.

•Exposed to various other technologies like Business Objects, MDM.

Environment: Informatica 9.6.1, Oracle 11g, SQL, T-SQL, PL/SQL, Toad, SQL Loader, Python, DB2, DAC(Scheduler), OBIEE (Reporting Tool), JIRA (Ticketing Tool), SVN

Bits India Pvt Ltd

Project: Learning Dashboard

Location: Bangalore/India

Role: ETL Informatica Developer Mar 17 - Apr 18

Responsibilities

•Led the analysis, design, development, and implementation of logical data models, physical database objects, and data conversion, integration, and loading processes.

•Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse (EDW).

•Designed ETL high level workflows and documented technical design documentation (TDD) before the development of ETL components to load DB2 from Flat Files, Oracle, DB2 systems to build Type 2 EDW using Change data capture.

•Created stored procedures, views based on project needs.

•Involved in migration of the maps from IDQ to power center.

•Applied the rules and profiled the source and target table's data using IDQ.

•Developed and coded the real-time and batch-mode loads.

•Developed standard framework to handle restart ability, auditing, notification alerts during the ETL load process.

•Used Informatica as ETL tool to pull data from source systems/ files, cleanse, transform and load data into the Teradata using Teradata Utilities.

•Performed data masking and data subsetting using Informatica TDM.

•Created shortcuts for reusable source/target definitions, Reusable Transformations, mapplets in Shared folder.

•Performed IDQ development around data profiling, cleansing, parsing, standardization, validation, matching, and data quality exception monitoring and handling.

•Involved in performance tuning and optimization of mappings to manage very large volumes of data.

•Prepared technical design/specifications for data Extraction, Transformation and Loading.

•Worked on Informatica Utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.

•Created mapping using various transformations like Joiner, Filter, Aggregator, Lookup, Router, Sorter, Expression, Normalizer, Sequence Generator and Update Strategy, data standardization, address validator etc.

•Develop complex ETL mappings on Informatica 10.x platform as part of the Risk Data integration efforts.

•Implemented SCD Type 1 and SCD Type 2 for loading data into data warehouse dimension tables.

•Implemented error handling for invalid and rejected rows by loading them into error tables.
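
The reject-row routing described here can be sketched generically; the validation rules and row layout below are hypothetical, not the project's actual rules:

```python
def route_rows(rows, validators):
    """Split incoming rows into loadable rows and rejected rows with reasons."""
    good, errors = [], []
    for row in rows:
        reasons = [name for name, check in validators.items() if not check(row)]
        if reasons:
            errors.append({"row": row, "errors": reasons})  # goes to error table
        else:
            good.append(row)  # goes to the target table
    return good, errors

# Hypothetical data quality rules for the example
validators = {
    "amount_positive": lambda r: r.get("amount", 0) > 0,
    "id_present": lambda r: r.get("id") is not None,
}
good, errors = route_rows(
    [{"id": 1, "amount": 10}, {"id": None, "amount": 5}, {"id": 3, "amount": -2}],
    validators)
```

Keeping the rejection reason alongside the row is what makes the error table usable for later reprocessing.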

•Implemented the Change Data Capture process using Informatica PowerExchange.

•Extensively worked on batch framework to run all Informatica job scheduling.

•Analyzing the sources, transforming data, mapping the data and loading the data into targets using Informatica Power Center Designer.

•Developed complex mappings such as Slowly Changing Dimensions Type II-Time stamping in the Mapping Designer.

•Worked extensively on Informatica Partitioning when dealing with huge volumes of data and also partitioned the tables in Teradata for optimal performance.

•Used various transformations like Stored Procedure, Connected and Unconnected lookups, Update Strategy, Filter transformation, Joiner transformations to implement complex business logic.

•Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.

•Experienced in using IDQ tool for profiling, applying rules and develop mappings to move data from source to target systems.

•Used Variables and Parameters in the mappings to pass the values between mappings and sessions.

•Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.

•Implemented restart strategy and error handling techniques to recover failed sessions.

•Used Unix Shell Scripts to automate pre-session and post-session processes.

•Performed tuning to improve data extraction, data processing, and load times.

•Mapped Dashboard field requirements to Siebel OLTP/OLAP Physical/Business Model.

•Worked with data modelers to understand financial data model and provided suggestions to the logical and physical data model.

•Designed presentations based on the test cases and obtained UAT signoffs

•Documented test scenarios as a part of Unit testing before requesting for migration to higher environment levels and handled production deployments

•Recorded defects as a part of Defect tracker during SIT and UAT

•Identified performance bottlenecks and suggested improvements.

•Performed Unit testing for jobs developed, to ensure that it meets the requirements

•Prepared/reviewed the technical design documentation (TDD) before the development of ETL and BOB components

•Collaborated with BI and BO teams to observe how reports are affected by a change to the corporate data model.

•Scheduled the jobs using Tidal.

•Used HP QC to track defects

•Handled major Production GO-LIVE and User acceptance test activities.

•Created architecture diagrams for the project based on industry standards

•Defined escalation process metrics for any aborts and met SLAs for production support tickets.

•Handled Production issues and monitored Informatica workflows in production.

Environment: Informatica 10.1.1, 9.5, Oracle, XML, SQL Server 2008, Web services, DB2 Mainframe, DAC (Scheduler), Cognos, JIRA (Ticketing tool), GitHub, HP QC (Testing tool)

Bits India Pvt Ltd

Project: HR Analytics

Location: Bangalore

Role: ETL Technical Developer Jun 15 – Feb 17

Responsibilities

•Developed ETL programs using Informatica to implement the business requirements.

•Communicated with business customers to discuss the issues and requirements.

•Created shell scripts to fine tune the ETL flow of the Informatica workflows.

•Used Informatica file-watch events to poll the FTP sites for the external mainframe files.

•Provided production support to resolve ongoing issues and troubleshoot problems.

•Performed tuning at the functional level and map level; used relational SQL wherever possible to minimize data transfer over the network.

•Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.

•Involved in the migration of data from PeopleSoft Financials using Informatica PowerCenter; scheduled the loads through the Data Warehouse Administration Console (DAC) for metadata management and scheduling.

•Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.

•Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.

•Effectively worked on Onsite and Offshore work model.

•Used pre- and post-session assignment variables to pass variable values from one session to another.

•Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks; used the Informatica scheduler to schedule jobs.

•Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting.

•Performed unit testing at various levels of the ETL and actively involved in team code reviews.

•Identified problems in existing production data and developed one-time scripts to correct them.

•Involved in performance tuning and optimization of mapping to manage very large volume of data

•Prepared technical design/specifications for data Extraction, Transformation and Loading.

•Worked on Informatica Utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.

•Created mapping using various transformations like Joiner, Filter, Aggregator, Lookup, Router, Sorter, Expression, Normalizer, Sequence Generator and Update Strategy, data standardization, address validator etc.

•Implemented SCD Type 1 and SCD Type 2 for loading data into data warehouse dimension tables.

•Implemented error handling for invalid and rejected rows by loading them into error tables.

•Implemented the Change Data Capture process using Informatica PowerExchange.

•Extensively worked on batch framework to run all Informatica job scheduling.

•Analyzing the sources, transforming data, mapping the data and loading the data into targets using Informatica Power Center Designer.

•Developed complex mappings such as Slowly Changing Dimensions Type II-Time stamping in the Mapping Designer.

•Used various transformations like Stored Procedure, Connected and Unconnected lookups, Update Strategy, Filter transformation, Joiner transformations to implement complex business logic.

•Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.

•Used Variables and Parameters in the mappings to pass the values between mappings and sessions.

•Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.

•Implemented restart strategy and error handling techniques to recover failed sessions.

•Used Unix Shell Scripts to automate pre-session and post-session processes.

•Performed tuning to improve data extraction, data processing, and load times.

•Worked with data modelers to understand financial data model and provided suggestions to the logical and physical data model.

•Designed presentations based on the test cases and obtained UAT signoffs

•Documented test scenarios as a part of Unit testing before requesting for migration to higher environment levels and handled production deployments.

•Recorded defects as a part of Defect tracker during SIT and UAT

•Identified performance bottlenecks and suggested improvements.

Environment: Informatica 10.1.1, 9.6, Oracle, XML, SQL Server 2012, Web services, DB2 Mainframe, DAC (Scheduler), Cognos, JIRA (Ticketing tool), HP QC (Testing tool)

Bits India Pvt Ltd

Project: Star Health Allied

Location: Bangalore

Role: ETL Developer Aug 12 – Jun 15

Responsibilities

•Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.

•Created new mapping designs using various tools in Informatica Designer such as Source Analyzer, Warehouse Designer, Mapplet Designer, and Mapping Designer.

•Developed the mappings using the needed transformations in the Informatica tool according to technical specifications.

•Created complex mappings that involved implementation of Business Logic to load data in to staging area.

•Used Informatica reusability at various levels of development.

•Developed mappings/sessions using Informatica Power Center 8.6 for data loading.

•Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.

•Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.

•Built reports according to user requirements.

•Extracted data from Oracle and SQL Server, and used Teradata for data warehousing.

•Implemented slowly changing dimension methodology for accessing the full history of accounts.

•Wrote shell scripts to run workflows in a UNIX environment.

•Optimizing performance tuning at source, target, mapping and session level.

•Participated in weekly status meetings and conducting internal and external reviews as well as formal walk through among various teams and documenting the proceedings.

•Designed the STTM (Mapping Specification).

•Prepared the Technical Specifications required for ETL development.

•Extensively developed the mappings, sessions, worklets, workflows using Informatica 9.6.

•Consolidated the code and tested it as part of SIT.

•Involved in performance tuning.

•Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.

Environment: Informatica 9.6, XML, Oracle 11g, SQL server 2012, SQL, T-SQL, PL/SQL, Toad 10.6, SQL Loader

EDUCATION

Bachelor of Technology - Computer Science - SKU University, India (2009)

Master’s in Computer Applications - JNTU University, India (2012)


