
Data Service

Location: Hollywood, FL

EXPERIENCE SUMMARY:

Over ** years of Information Technology experience, primarily in Business Intelligence and Data Warehousing. While specializing in extract, transform and load (ETL) development, I also have experience with all aspects of the data warehousing development lifecycle: enterprise architecture, dimensional modeling, ODS, star and snowflake schemas, data analysis, data integration, data migration, data profiling, data conversion, and reporting.

Technologies

ETL Tools

Informatica PowerCenter 9.*, PowerExchange 9.*, Data Quality 9.*, DVO, Data Transformation Studio, B2B Data Transformation, Salesforce, MDM, SSIS

Database

Netezza, Oracle 12c/11g/10g/9.x with PL/SQL, Teradata, IBM DB2, MS SQL Server

GUI & Tools

TOAD, SQL Developer, Aginity for Netezza, IBM Data Studio, WinSQL, Teradata SQL Assistant, MS data tools, Visual Studio

Languages

Unix shell scripting, SQL, PL/SQL, Java

Reporting Tools

OBIEE, MicroStrategy, SSRS, Cognos, Business Objects, Tableau

Modeling Tools

Embarcadero ER Studio, Erwin

Schedulers

Tivoli, Tidal, Control-M, Redwood

QC

HP Quality Center

Loaders

Oracle SQL*Loader/external loader, Netezza bulk loader/reader/writer, Teradata loader utilities and TPT (Teradata Parallel Transporter)

Operating System

UNIX, LINUX, Windows XP/NT/2000/98

Industry Experience:

•Banking/Financial Industry

•Pharmaceutical/Healthcare Industry

•Transportation/Supply Chain/Logistics/Truck Rental

•Cruise/Hospitality Industry

•Telecom/Billing Industry

•Entertainment/Film Industry

•University/Education system

•Consulting Services

Current Experience:

Ryder System Inc Miami, FL Sep 2015 – Current

A leader in transportation, logistics and supply chain management solutions. The project involves the RDW roadmap conversion: data quality, cleansing, standardization and consolidation, plus developing and maintaining multiple data marts and enterprise data warehouse solutions for data integration and reporting, so the business can perform more efficient and cost-effective marketing analytics.

Role: Informatica Netezza Lead

Design, architect, develop and implement enterprise data warehouse (EDW to RDW) solutions for fleet management, repair and maintenance, rental agreements and inspections, invoicing, credit memos, warranty, parts and pricing, servicing, and performance maintenance indicators.

Design and develop an ETL framework/solution via Informatica mappings from mainframe, database and heterogeneous sources (SQL Server, flat files, mainframe, Oracle, Netezza, DB2, Salesforce), performing staging, standardization, transformation, ODS, conformed & junk dimensions, facts and fact-latest/MVs, and aggregation/summarization, and creating views for reporting and downstream data needs.

Create CDC process for daily/incremental loads, historical conversion and ad-hoc loads for UAT testing.
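
For illustration, a minimal SQL sketch of checksum-based change detection feeding such an incremental load (all table and column names here are hypothetical):

    -- Capture new keys and changed rows by comparing a row checksum
    -- against the current warehouse image; only the delta is staged.
    INSERT INTO stg_customer_delta (customer_id, name, addr, row_checksum)
    SELECT s.customer_id, s.name, s.addr, s.row_checksum
    FROM   src_customer s
    LEFT   JOIN dw_customer t
           ON t.customer_id = s.customer_id
    WHERE  t.customer_id IS NULL               -- brand-new record
       OR  t.row_checksum <> s.row_checksum;   -- changed record

The same delta set can then drive the daily, historical-conversion and ad-hoc UAT loads.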

Perform data profiling and data modeling to evaluate the right model, denormalized or dimensional (star and snowflake schemas), wherever applicable for the EDW.

Gather business requirements, evaluate business needs, and prepare functional and requirements documents with business rules.

Create scorecards, data profiling and data mining using the Informatica Analyst and Informatica Developer tools.

Create and maintain users, groups, roles, privileges, folders, database connections and nodes; administer PowerCenter and Data Quality repositories & services, including the PowerCenter Repository and Integration Services and the Model Repository, Data Integration, Analyst and Content Management Services.

Design strategies and rules/mapplets for cleansing, standardization, matching and consolidation of reference data using IDQ.

Perform GAP analysis to identify anomalies and form patterns for data profiling.

Created and maintained Informatica Data Quality PDOs, profiles, reference tables and scorecards.

Involved in and assisted with the data steward process.

Architect, design and develop Customer Registry data flow process for Addresses, phone numbers and State Licenses.

Experienced in using AddressDoctor and the Address Validator in cleansing and standardizing addresses per US Postal Service recommendations.

Create mappings and mapplets utilizing mapplets/rules and reference tables required for cleansing, standardization, matching, merging and consolidation using data quality and data integration transformations. Used data quality transformations such as Case Converter, Standardizer, Parser, Labeler, Merge, Decision, Key Generator, Match, Consolidator and Address Validator/Accelerator along with PowerCenter/Data Integration transformations.

Create IDQ workflows and Human tasks for data stewards to review the data and either approve or reject a match/merge.

Create mappings, sessions and workflows to maintain and load the staging & ODS layers using PowerCenter transformations, involving Type I, Type II and truncate-and-load tables leveraging a CDC checksum number.

Integrated IDQ mapplets with Informatica PowerCenter mappings and scheduled them as part of the enterprise workflow.

Good knowledge of Informatica MDM, match and merge to a golden customer record, and Data Studio.

Maintain database schemas, users, roles, privileges and table spaces. Create and maintain tables, views, synonyms, stored procedures and packages.

Perform data analysis, work estimation for development, testing and deployment phases.

Assign the right distribution key to Netezza tables so that larger dimensions and facts are distributed on the same key; keeping joined rows on the same data slice avoids data skew, redistribution, contention, slower data loads and latency, and yields faster performance.
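
As a sketch of that distribution strategy (hypothetical names), the large dimension and the fact share a distribution key so joined rows land on the same data slice:

    -- Co-locating fact and dimension on vehicle_id lets the join run
    -- slice-local, avoiding redistribution/broadcast and data skew.
    CREATE TABLE dim_vehicle (
        vehicle_id BIGINT NOT NULL,
        vin        VARCHAR(17),
        model      VARCHAR(50)
    ) DISTRIBUTE ON (vehicle_id);

    CREATE TABLE fact_repair_order (
        repair_order_id BIGINT NOT NULL,
        vehicle_id      BIGINT NOT NULL,
        repair_cost     NUMERIC(12,2)
    ) DISTRIBUTE ON (vehicle_id);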

Create data models by reverse engineering using Embarcadero ER Studio.

Convert SSIS packages and SQL Server stored procedures into relational tables and Informatica mappings.

Performance-tune ETL and database processes and long-running jobs for faster throughput, reading and loading via the NZ bulk loader/reader/writer for very fast data loads from and into Netezza sources and targets.

Create mappings to perform upsert operations using IDLookup, Salesforce lookups and External ID fields in Salesforce, using PowerExchange for Salesforce with PowerCenter.

Create Informatica DVO single and multiple table pairs to validate data across multiple databases, leveraging Informatica PowerCenter objects via DVO for testing/validation purposes.

Use FTP connections to read directly from the mainframe, application connections for the Salesforce connector, and relational connections for ODBC and the Netezza bulk reader/writer to process data into Netezza targets.

Perform data profiling and data analysis using Informatica Developer and Analyst.

Process XML sources via the XML Parser and write to/generate single- or multi-view XML targets.

Good knowledge of Informatica Data Transformation Studio and B2B Data Transformation to parse unstructured data and repeating multiple groups in XML.

Good knowledge of Big Data concepts (HDFS clusters, BDE edition, Pig, Hive and MapReduce), Cognos & Tableau reporting tools, and web services.

Experienced in processing message queues in real-time scenarios by making the workflow run infinitely, constantly checking for message queues to be processed.

Develop SCD Type I, II, III and junk dimension mappings with MD5 checksum logic, and facts with Type II implementation when needed. Also develop daily and monthly aggregate/summary facts for monthly reporting/aggregation.
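
A minimal sketch of the Type II pattern with an MD5 checksum (hypothetical names; in practice the logic lives in Informatica mappings):

    -- Step 1: expire the current version of any row whose checksum changed.
    UPDATE dim_customer d
    SET    eff_end_dt = CURRENT_DATE, current_flag = 'N'
    WHERE  current_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id  = d.customer_id
                   AND    s.md5_checksum <> d.md5_checksum);

    -- Step 2: insert new keys and new versions as the current row.
    INSERT INTO dim_customer (customer_id, name, addr, md5_checksum,
                              eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.name, s.addr, s.md5_checksum,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    LEFT   JOIN dim_customer d
           ON  d.customer_id  = s.customer_id
           AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL;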

Experienced in using Java transformation for looping and de-duping rows with Informatica.

Lead, manage and implement project deliverables and mentor team members in developing efficient, reusable ETL and SQL processes.

Develop intelligent ETL processes using dynamic lookups for multiple-change scenarios within the same data load.

Create SSIS packages to read from relational and other sources and populate SQL Server targets using column mappings, Derived Column, Copy Column, Lookup and other transformations, and create SSRS reports based on reporting needs.

Create tables, DML scripts, views, stored procedures and UNIX scripts for automation and archiving.

Analyze reporting queries in MicroStrategy by executing reports in MicroStrategy Developer, extracting the underlying SQL, and analyzing query performance, optimizing ETL wherever applicable. Also create new data marts by profiling the results of existing legacy reports and convert them to new star-schema data marts.

Create users, groups and roles in Informatica Administrator.

Good knowledge of the Web Service Consumer transformation and of processing message queues.

Adjust/modify the DTM buffer size and data block size with respect to source and target precisions to improve data loads and attain better performance.

Extensive use of Informatica mapping variables, mapping parameters, session parameters, workflow variables, parameter files and built-in Informatica functions for different requirements, data cleansing, date modifications and data standardization, and to perform CDC.

Extensive usage of Informatica transformations for different requirements (Expression, Filter, Router, Joiner, Lookup, Union, Aggregator, Java, XML Parser and Generator, Sorter, Rank, Normalizer and a few others) as needed.

Work with business partners to understand requirements and create user stories, work together in performing SIT and UAT, and carry out testing and deployment activities and the implementation plan while following the change management process.

Create mapping specifications, technical design, process flow and job run book documents

Environment: Informatica PowerCenter & PowerExchange 9.6.1, Informatica Developer 9.6, Informatica Data Quality, Informatica PowerExchange for Salesforce, Informatica DVO, Informatica Data Transformation Studio, Oracle, Netezza, NZLoad, Aginity, DB2, Mainframe, WinSQL, SQL Developer, SQL Server, SSIS, SSRS, UNIX, Embarcadero ER Studio, Redwood scheduler, MicroStrategy Developer

PRIOR EXPERIENCE:

American Express Fort Lauderdale, FL May 2014 – Aug 2015

Role: Informatica Senior Engineer

Design, architect, develop and implement Global Credit Bureau Reporting for the US, Canada and international markets.

Develop an ETL framework/solution via Informatica mappings from mainframe, database and heterogeneous sources, performing staging, standardization, transformation, summarization and reporting to the respective bureaus.

Perform data profiling and data modeling to evaluate the right model (normalized (3NF), denormalized or dimensional) applicable to each market.

Apply stringent business rules and regulatory requirements for consumer, corporate and commercial reporting so that bureau processing is performed with the highest precision.

Performance tune ETL and database processes and long running jobs for faster throughput

Build UNIX shell scripts for automation and for workflow/job execution via Control-M.

Process XML sources via the XML Parser and write to/generate single- or multi-view XML targets.

Build efficient ETL processes which can be leveraged by multiple markets

Lead and work independently in an Agile framework.

Implement Informatica hash auto-keys partitioning with sorting for better performance and to avoid processing duplicates across multiple partitions.

Develop intelligent ETL processes which can handle dynamic bureau formatting needs.

Develop Stored procedures for override process

Create and maintain users, groups, roles, privileges, folders, database connections and nodes; administer PowerCenter and Data Quality repositories & services, including the PowerCenter Repository and Integration Services and the Model Repository, Data Integration, Analyst and Content Management Services.

Design strategies and rules/mapplets for cleansing, standardization, matching and consolidation of reference data using IDQ.

Perform GAP analysis to identify anomalies and form patterns for data profiling.

Created and maintained Informatica Data Quality PDOs, profiles, reference tables and scorecards.

Involved in and assisted with the data steward process.

Create mappings and mapplets utilizing mapplets/rules and reference tables required for cleansing, standardization, matching, merging and consolidation using data quality and data integration transformations. Used data quality transformations such as Case Converter, Standardizer, Parser, Labeler, Merge, Decision, Key Generator, Match, Consolidator and Address Validator/Accelerator along with PowerCenter/Data Integration transformations.

Integrated IDQ mapplets with Informatica PowerCenter mappings and scheduled them as part of the enterprise workflow.

Match and merge multiple records and consolidate the best record (golden record) based on mailability score and the most complete information, with different options per client needs.

Create IDQ workflows and Human tasks to analyze whether the data is correct and whether records should be merged.

Extensive use of Informatica mapping variables, mapping parameters, session parameters, workflow variables, parameter files and built-in Informatica functions for different requirements, and to perform CDC.

Extensive usage of Informatica transformations for different requirements (Expression, Filter, Router, Joiner, Lookup, Union, Aggregator, Java, XML Parser and Generator, Sorter, Rank, Normalizer and a few others) as needed.

Create mappings/sources to read COMP-3 fields from VSAM/mainframe sources via PowerCenter.

Perform star schema implementation and denormalized modeling/pivoting as needed per requirement.

Work with business partners in understanding requirements and creating user stories, and work together in performing SIT and UAT.

Create registration and extraction data maps to read from or write to PWX sources or targets via PWX application connections.

Create extensively formatted files (Metro2, TUDF, variable-byte, XML) in Informatica for bureau requirements.

Good usage of analytical functions when needed to improve performance (RANK, ROW_NUMBER, DENSE_RANK, PARTITION BY, aggregate functions, FIRST_VALUE, LAST_VALUE and windowing).
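
For example, de-duplicating to the latest record per key with a window function instead of a self-join (hypothetical names):

    -- ROW_NUMBER ranks each account's records newest-first; keeping
    -- rn = 1 de-dupes in a single pass over the data.
    SELECT *
    FROM  (SELECT t.*,
                  ROW_NUMBER() OVER (PARTITION BY account_id
                                     ORDER BY update_ts DESC) AS rn
           FROM   stg_account t) x
    WHERE  x.rn = 1;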

Good understanding of the Teradata Parallel Transporter (TPT).

Good understanding of Business Objects universe creation and reporting.

Develop FLOAD, MLOAD & BTEQ scripts and use the Informatica FastLoad loader connection to load Teradata targets.

Perform bulk loading and indexing on columns to yield better performance, and gather statistics once the job completes.

Create Data Partitioned Secondary Indexes (DPSIs) on partitioned tables to avoid the contention issues of Non-Partitioned Indexes (NPIs).
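
A hedged Teradata-style sketch of the idea (hypothetical names, illustrative rather than exact DDL): the table is row-partitioned by month, and a secondary index is defined on it so index maintenance can stay partition-local (a DPSI) instead of hitting one non-partitioned index structure.

    CREATE TABLE txn_history (
        account_id INTEGER NOT NULL,
        txn_dt     DATE    NOT NULL,
        txn_amt    DECIMAL(12,2)
    )
    PRIMARY INDEX (account_id)
    PARTITION BY RANGE_N (txn_dt BETWEEN DATE '2014-01-01'
                                 AND     DATE '2015-12-31'
                                 EACH INTERVAL '1' MONTH);

    -- NUSI on the row-partitioned table; maintained per partition
    -- (a DPSI), it avoids NPI contention during loads.
    CREATE INDEX (txn_dt) ON txn_history;

    -- Refresh optimizer statistics once the load completes.
    COLLECT STATISTICS ON txn_history COLUMN (account_id);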

Perform data profiling, data standardization, parsing and address validation to eliminate duplicates.

Perform pushdown optimization in Informatica.

Perform data standardization, data validation and data profiling

Lead and implement deployment activities and the change management process, and create mapping specifications, process flow documents and implementation plans.

Environment: Informatica Power Center & Power Exchange 9.6.1, Informatica Developer 9.6, DB2 11, Teradata, Teradata SQL Assistant, Mainframe, UNIX

Capgemini US LLC.

Client: Verizon Communications Atlanta, GA

Role: Informatica Data Migration Specialist July 2013 – May 2014

Verizon Communications is a leader in the communications sector. The One Verizon Single Biller - Common Billing Platform is a data migration project which involves migrating CMB, Wholesale & Federal Enterprise accounts and products from the existing billers (NPD, NY, NE, MDV, West, National Broadband) into a single biller (Vision) for all services, including FiOS Broadband & LEC Voice.

Responsibilities:

Involved in the migration analysis, design, development, testing and deployment phases of the data migration project.

Manage and lead design, development and deployment tasks for a team of ten resources and client team members, completing deliverables, project milestones, change requests and support items.

Create level-of-effort estimates for development tasks, project plans and deployment plans; monitor and track status on development items, production releases, issues and risks; report consolidated status to senior management; and create change requests.

Design ETL Architecture and data flow process for Vision single biller system.

Design and develop SCD Type I, II and III per business requirements

Design and implement CDC processes for ad-hoc, daily and weekly jobs/loads.

Design and implement code reusability by utilizing the same mapping and session in multiple workflows (for different regions) via parameter files executed through pmcmd.

Create and maintain users, groups, roles, privileges, folders and database connections; perform repository maintenance & administration and Informatica support tasks; and propagate Velocity methodologies and best practices.

Utilize user-defined and built-in session parameters for parameterization, code reusability and metadata information/statistics.

Architect and develop audit processing of data loads for verification, reconciliation and metrics operations.

Implement performance tuning in mappings, sessions, transformations, workflows and SQL queries for faster throughput and increased performance, utilizing indexes, partitioned tables, FULL/USE_HASH hints and related techniques.

Develop and implement mappings utilizing the PowerExchange (PWX) connector to source IBM mainframe/DB2 tables. Install the PWX connector on the PowerCenter node and a Listener for each partitioned DB2 system.

Design and develop partitioned tables, either list or range depending on the values/date data of key attributes, for fast query results, higher ETL throughput and enhanced performance.
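
A minimal Oracle sketch of the range variant (hypothetical names); a list-partitioned table follows the same shape with PARTITION BY LIST on a discrete key:

    -- Date-range partitions let queries filtered on bill_dt prune to a
    -- few partitions, speeding both reports and ETL reads.
    CREATE TABLE fact_billing (
        account_id NUMBER       NOT NULL,
        bill_dt    DATE         NOT NULL,
        bill_amt   NUMBER(12,2)
    )
    PARTITION BY RANGE (bill_dt) (
        PARTITION p2013q3 VALUES LESS THAN (DATE '2013-10-01'),
        PARTITION p2013q4 VALUES LESS THAN (DATE '2014-01-01'),
        PARTITION pmax    VALUES LESS THAN (MAXVALUE)
    );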

Develop UNIX shell scripts to execute wrappers, file lists and the archiving process.

Develop FLOAD, MLOAD & BTEQ scripts and use the Informatica FastLoad loader connection to load data into empty Teradata target tables, MLOAD for upserts and TPump for frequent updates.

Develop and performance-tune long-running jobs/stored procedures via SQL tuning, query optimization, indexes, hints, filtering data wherever applicable and removing unnecessary sorting. Leverage hash joins/nested loops wherever applicable.

Develop Stored Procedures to gather statistics/reports from metadata tables/views for repository objects/maintenance.

Environment: Informatica PowerCenter & PowerExchange 9.5.1, Solaris platform, Oracle 11g, PL/SQL, Toad 11.6, SQL Developer, IBM Mainframe z/OS, DB2, Teradata, Teradata SQL Assistant, HP Mercury QA 10.0, UNIX, Data Studio, Mochasoft, Navigator

Capgemini US LLC.

Client: Warner Chilcott Rockaway, NJ

Role: Lead Informatica Data Quality Designer Feb 2013 – July 2013

Warner Chilcott is a leading specialty pharmaceutical company focused on the women's healthcare, gastroenterology, dermatology and urology segments of the branded pharmaceuticals market, primarily in North America. The business intelligence project is a Siebel MDM implementation for HCPs/prescribers, addresses, phone numbers and state licenses, required to cleanse noise, unwanted characters and patterns; standardize; match and merge duplicates; and consolidate golden prescriber, address, phone number and state license reference data utilizing Informatica Data Quality.

Responsibilities:

Involved in requirement gathering, analysis, architecture, design, development, testing and deployment across the entire lifecycle of the Data Quality and Data Integration environments.

Create and maintain users, groups, roles, privileges, folders, database connections and nodes; administer PowerCenter and Data Quality repositories & services, including the PowerCenter Repository and Integration Services and the Model Repository, Data Integration, Analyst and Content Management Services.

Design strategies and rules/mapplets for cleansing, standardization, matching and consolidation of reference data using IDQ.

Perform GAP analysis to identify anomalies and form patterns for data profiling.

Created and maintained Informatica Data Quality PDOs, profiles, reference tables and scorecards.

Involved in and assisted with the data steward process.

Architect, design and develop Customer Registry data flow process for HCP, Addresses, phone numbers and State Licenses.

Create mappings and mapplets utilizing mapplets/rules and reference tables required for cleansing, standardization, matching, merging and consolidation using data quality and data integration transformations. Used data quality transformations such as Case Converter, Standardizer, Parser, Labeler, Merge, Decision, Key Generator, Match, Consolidator and Address Validator/Accelerator along with PowerCenter/Data Integration transformations.

Create mappings, sessions and workflows to maintain and load the staging & ODS layers using PowerCenter transformations, involving Type I, Type II and truncate-and-load tables leveraging a CDC checksum number.

Integrated IDQ mapplets with Informatica PowerCenter mappings and scheduled them as part of the enterprise workflow.

Good knowledge of Informatica MDM, match and merge to a golden customer record, and Data Studio.

Maintain database schemas, users, roles, privileges and table spaces. Create and maintain tables, views, synonyms, stored procedures and packages.

Tune queries and ETL processes for better performance and faster throughput in both the database and ETL layers.

Create shell scripts to process, automate and archive files and ETL processes.

Manage and lead onsite and offshore team members in providing solutions, following the defect resolution process, holding daily and weekly calls, providing status reports to senior management, ensuring deliverables are completed on time, and reporting any risks/enhancements to senior management ahead of time when needed.

Create level-of-effort estimates for project deliverables, enhancements and support items.

Create high-level and detailed project plans covering the design, development, testing, UAT and deployment phases of project deliverables.

Involve and assist senior management in creating RFPs.

Create mapping specifications, data lineage, data model and technical design document.

Perform unit testing, integration testing and UAT. Create test case, test script and test plan documents covering positive and negative scenarios.

Environment: Informatica PowerCenter & Informatica Data Quality 9.5.1, Windows platform, Oracle 11G, PL/SQL, Toad 10.6, SQL Developer, Siebel CRM & Siebel UCM 8.0, Erwin data modeler r7.3, HP Mercury QA 10.0, Windows 7

Capgemini US LLC.

Client: Boehringer Ingelheim Danbury, CT

Role: Informatica and Data Quality Lead Dec 2012 – Feb 2013

Boehringer Ingelheim is a leading manufacturer and seller of many distinct branded and generic drugs. The business intelligence project involves creating new data marts that read from disparate systems (IMS, Fingertip and other vendors) to build formulary lists, tiers and products for distinct plans belonging to disparate payers (Cigna, etc.) across physician and organizational structures, so the business can perform analytics and improve sales by having physicians/organizations prescribe BI drugs to patients/consumers over competitor drugs.

Responsibilities:

Involved in requirement gathering, analysis, design, development, testing and migration (deployment to production) across the entire lifecycle of the PMDW Enterprise Data Warehouse environments.

Create conformed dimensions which can be leveraged across the EDW.

Understand the PL/SQL procedures which currently populate the EDW and convert them to ETL mappings for formulary, tier, product, plan and payer data for physician and organizational structures.

Perform GAP analysis, participate in business requirements gathering, and prepare ETL specifications and technical design documents.

Create scorecards and data profiles, and find data anomalies & patterns in reference data.

Create IDQ rules/mapplets, reference tables for standardization and cleansing routines

Work with data stewards in integrating standardization, anomalies and patterns of reference data

Create mappings leveraging Case Converter, Standardizer, Parser, Labeler, Merge, Decision, Key Generator, Match, Consolidator and data integration transformations.

Work to ensure the right group of clusters is associated for grouping, matching and consolidation, and fine-tune the performance of the ETL process.

Develop Type 1, Type 2 and Type 3 SCD mappings.

Create an Informatica FastLoad loader connection and load into Teradata targets.

Good experience using the Teradata utilities FastLoad, MultiLoad and TPump, and BTEQ scripts.

Create mapplets and leverage lookup, expressions, aggregator, joiner, SQL, filter, router, external stored procedure and other transformations in mappings wherever needed.

Good understanding of Informatica Data Services.

Manage and lead onshore and offshore teams. Conduct design reviews, defect checks and resolution, and daily status meetings, and send weekly status reports to the PMO team.

Create staging and ODS layers and populate data into external tables (objects) so that information can be processed into the Veeva (Salesforce) cloud using Informatica Salesforce connectors.

Leverage Informatica scheduler to process daily CDCs.

Configure the OBIEE repository and build joins and relationships so that reports and segment trees run successfully.

Environment: Informatica PowerCenter & Data Quality 9.5.1, Informatica Salesforce Connector, PowerExchange, Oracle 11gR2, PL/SQL, SQL Server, Veeva Salesforce, Teradata 12, Teradata SQL Assistant, UNIX platform (AIX), Toad 10.6, SQL Developer, OBIEE 10.1, Siebel CRM & Siebel UCM 8.0, Erwin Data Modeler r7.3, HP Mercury QA 10.0, Windows XP

Capgemini US LLC.

Client: Royal Caribbean Cruises Ltd. Miramar, FL

Role: Data Warehouse Team Lead July 2011 – Nov 2012

Royal Caribbean Cruises Ltd is a leading cruise company providing cruise services to various destinations throughout the world. The business intelligence projects are Customer Experience Management (CEM), Unified Information Warehouse (UIW) and Guest Revenue - Customer Marketing Analytics (CMA), which involve real-time MDM integration and historical conversion of transactional and customer-based legacy systems (Siebel UCM, Marketing, Brochures, Advance Shoreside Revenue, Sailings, Customer, Events, Loyalty, Permissions, Preferences and a few others), loading and processing the information in near real time. The data warehouse was built to provide timely decisions, faster data analysis and business-critical indicators based on key performance aspects of the various products.

Responsibilities:

Involved in requirement analysis, design, development, implementation, testing and migration (deployment to production) across the entire lifecycle of the CEM, CMA, UIW and Guest Revenue Enterprise Data Warehouse environments.

Involved in creating Requests for Proposal (RFPs) for client requirements.

Experienced in Informatica 6/7/8.6/9.1 Development and Administration.

Experienced in administering, creating and maintaining the Siebel Analytics 7.8 repository and running reports, ad-hoc queries, segment trees and list outputs.

Experienced in synchronizing customer data across multiple databases via MDM.

Experienced in Dimensional Modeling, ETL Architecture and reporting/analytics of Data Warehouse lifecycle.

Manage onshore and offshore teams/resources to develop, test, deploy and continue client work operations and provide ETL solutions.

Created business requirement documents, functional specifications, technical design documents, traceability matrix, test scripts and UAT scripts.

Involved with senior management and practiced project planning, progress tracking, project budgets, client rapport and mentoring team members.

Conducted business and technical workshop sessions proposing design and architecture review involving Business, IT and PMO team.

Created space estimates for new requirements and warehouse maintenance, and proposed design solutions. Ran statistics on schema tables for faster performance and regular maintenance, and checked that indexes remain valid and operational for query performance.

Created designs to maintain facts and summaries that capture (insert) and process only changed information, and archive and drop partitions when needed.
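
The archive-and-drop step of that design, sketched in Oracle-style SQL (hypothetical names):

    -- Copy a closed-out month aside, then drop its partition; dropping
    -- a partition is far cheaper than a bulk DELETE of the same rows.
    CREATE TABLE fact_booking_201107_arch AS
    SELECT * FROM fact_booking PARTITION (p2011_07);

    ALTER TABLE fact_booking DROP PARTITION p2011_07 UPDATE GLOBAL INDEXES;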

Partitioned facts and summaries on key attribute columns and created indexes on important attributes used heavily by the marketing and data warehouse teams. This helps with faster data retrieval, quicker report execution and higher throughput during ETL loads.

Performed Informatica migrations using shared objects, shortcuts, mapplets, reusable transformations, mappings, sessions, command tasks, worklets, workflows.

Experienced in real-time ETL processing via IBM WESB message queues, processing the information into dimensions and facts by parsing CLOB-format sources with the XML Parser transformation, adhering to XML XSD structures and a canonical format generated via stored procedures and invoked by the real-time IBM Data Mirror integration service through the Enterprise Service Bus (ESB).

Performed dimensional modeling using Star schema following Ralph Kimball Methodologies and created Data Models per subject area using Erwin Data Modeler.

Good understanding of Big Data concepts.

Designed and deployed slowly changing dimensions as Type I, Type II, Type III and Type IV dimensions, utilizing checksum numbers to detect change and maintain history.

Created staging areas and an operational data store to maintain transactional data from heterogeneous sources while pushing to the warehouse; this process helps achieve faster loads and higher query performance.


