Esha Jha
Role: Data Warehouse Architect / Data Modeler / Data Architect / Data Manager
Email: adrq9a@r.postjobfree.com
LinkedIn: linkedin.com/in/eshajha
Mobile: +1-904-***-****
WhatsApp: +1-971-***-****
Experience Summary
Overall 17 years of experience in the data warehousing and BI field
Key responsibilities included data warehouse architecture, data modeling, data profiling, data governance, Scrum Master duties, ETL design, data migration and reporting, master data management, metadata management, and project management activities such as project costing and resourcing, presales, and POCs/due diligence
Experience implementing models in enterprise data warehouse environments and dimensional modeling for both structured and semi-structured data
Experience in Azure cloud data migration activities
Experience designing and implementing data lakes for terabytes of structured and semi-structured data
Experience with Agile and Waterfall methodologies
Major project highlights: Data Obfuscation, BLU Lake, Customer Data Hub data governance, Retail Data Mart, Footfall Counting System, ERP financial and HRMS data modeling for BI reporting, Sales Data Mart, CRM Data Mart, banking data migration to GL, DBA role, and the American Express World Service enterprise data warehouse
Implemented big data solutions on semi-structured and structured data using DB2 BLU and Azure SQL Database
Theoretical knowledge of big data environments for unstructured data (Hadoop, Hive)
Cloud Migration experience on Azure.
Certified Scrum Master from Scrum Alliance
Azure Data Fundamentals (DP 900) certified
Azure Fundamentals (AZ 900) certified
Technical Skills
On-Prem Databases/MDM: Oracle CDH, Siebel CDH, Oracle, SQL Server, Redbrick, DB2 BLU
Cloud Database: Azure SQL Database
Storage Systems: Azure Blob Storage, databases, Azure Data Lake, Scality
Data Modeling: Erwin 9.7, Visio, ER/Studio
ETL: Informatica, SSIS, DataStage, Azure Data Factory
Data Quality Tool: Informatica IDQ
ERP: Oracle Applications 11i (11.5.9)
Reporting Tools: Cognos, SSRS, SSAS, Business Objects 6.0, Tableau, Power BI
Languages: SQL, PL/SQL, T-SQL
Solution Methodologies: Agile, Waterfall
Agile Tools: TFS, Rally, Jira, VersionOne
Cloud Technology: Azure, Azure Data Factory (ADF)
Authentication: Palo Alto
Repositories: DevOps, GitHub, SharePoint
Education and Certifications
Executive Program in Business Administration, Indian Institute of Management, Lucknow
Master of Computer Application (MCA), Birla Institute of Technology, Mesra
Bachelor of Computer Application (BCA), Birla Institute of Technology, Mesra
Certified Scrum Master, Scrum Alliance
Microsoft Azure DP-900 certified
Microsoft Azure AZ-900 certified
Project Experience-Current Employer
Below are some of my significant project highlights.
Duration: Jan 2022 – till date
Employer: Atos
Work Location: Jacksonville, Florida
Client: Allstate Benefits
Project: Data Obfuscation
Role: Data Architect
Technologies/Skills Used: Erwin, Scality, JSON, Dremio, SSIS
Responsibilities:
Requirement Analysis and logical and physical Data model design
Migrating legacy ETL and data into modern database storage
Conduct reviews with product owner, business, and technical team
Guide ETL team on ETL loading strategy
Create mapping documents and stored procedures
Managing Data Models and Metadata
Maintaining Data Dictionary
Creating logical and physical tables on the cloud data lake
Work with DBA
Identify and resolve technical issues and obstacles
Coordinate with various downstream applications and data obfuscation team to align the process and data
Educate dependent teams
Duration: Nov 2020 – till date
Employer: Atos
Work Location: Remote, India
Client: Allstate Benefits
Project: FD Risk & Compliance Data Migration
Role: Data Architect
Technologies/Skills Used: Azure Cloud, ADF v2, Tableau 2020, JSON, SSIS, Erwin
Responsibilities:
Requirement Analysis and Data model design
Data engineering tasks in alignment with FDC data and governance teams using authorized toolkits for data mapping and migration from SQL on-prem to Azure Data Lake
Creating and publishing reports on Azure Cloud using Tableau
Create and maintain data pipelines between on-premises systems and the cloud data lake and storage
Cost estimation for Azure usage
Creating Mapping Documents (Data Dictionary)
Responsible for managing a growing cloud-based data ecosystem consisting of a metadata driven data lake and databases that support real time analytics, extracts, and reporting
Performance Tuning
Managing Data Models and Metadata
Maintaining Data Dictionary
Creating logical and physical tables on the cloud data lake
Development, unit testing, integration testing of BI use cases as agreed with FDC team
Assist FDC in conducting UAT and production deployment
Duration: May 2019 – Oct 2020
Employer: Atos-Syntel
Work Location: Chicago, IL
Client: Northern Trust Bank
Project: MDM Release 13 and Release 14, CSDR
Role: Data Manager
Technologies/Skills Used: DataStage, Oracle, MDM InfoSphere, SWIFT messaging format, Kafka, Jira, Erwin
Responsibilities:
Work with Product owner in creating the release requirement documents
Validate that the changes requested in each release adhere to the standards and design
Impact analysis of new changes on source and consuming systems
Creating data analysis reports for teams
Assisting source systems with creating and maintaining parties after each release
Assisting data stewards with data merge activities
Analyzing release requirements and assessing impact on consuming systems
Providing solutions to the technical team in each release based upon the release requirements
Conduct and lead discussions with business leads, subject matter experts, and stakeholders to understand project scope and business requirements.
Translate Business requirements into Data Requirements. Understand the data elements and their domains; understand relationships among those elements and inter-dependencies between models.
Use acquired knowledge and develop Logical Data Models, Physical Data Models, and Data Element Dictionaries
Develop maintenance processes for requirements and modeling artifacts
Oversee data mapping for data conversion from legacy data stores and formats to new data structures. Perform architectural alignment with enterprise formats
Coordinate efforts across teams that leverage multiple development methodologies (e.g., Waterfall, Agile)
Determine practicality and technical fit of developed standards and architectures
Support QA and DEV teams in their technical design and strategy and validate test cases; carry out unit and integration test support.
Project Costing and Budgeting
Resource Management
Stakeholder Management
Oversee multiple projects across all phases of development
Develop and Maintain project charter and progress report
Duration: Aug 2018 – Mar 2019
Employer: Syntel
Work Location: Portland, Oregon
Client: Daimler Trucks North America
Project: RVC Vendor Universe Management
Role: Scrum Master
Technologies/Skills Used: DB2 BLU, SQL Server, .NET, SSIS, Informatica, TFS
Responsibilities:
Facilitating Scrum ceremonies and practices
Backlog grooming: prioritizing and maintaining the product backlog in collaboration with the Product Owner
Maintaining relevant metrics such as burndown, velocity, and load that help the team see how they are doing, and coordinating release efforts
Remove any impediments or blockers that would prevent the team from achieving its sprint goals
Managing Conflicts.
Plan and Maintain Project Costs and budget allocated.
Resource Management
Communicating changes to the stakeholders
Agile coaching across the various Agile teams
Duration: April 2016 – July 2018
Employer: Syntel
Work Location: Portland, Oregon
Client: Daimler Trucks North America
Project: BLU Lake Implementation
Role: Data Architect
Technologies/Skills Used: Erwin, Informatica, SQL Server, SSRS, DB2 BLU, flat files, Oracle
Responsibilities:
Created a strategy to load data into the data lake from varied environments
Designed data lake architecture
Working with business to understand their current data need
Created data models for the data lake ingestion and operational layers
Created views/tables for business specific requirement
Data Profiling and meta data management
Created data model template to be used by enterprise-wide data modelers to ensure they are following the correct naming standards.
Maintaining Data dictionary
Created Data mapping documents
Educating Daimler teams about the data lake and how they can use the data for their analyses
Helping the ETL team deal with high-volume data loads
Creating a data purge strategy
Working with DBAs to get the tables created in the PROD environment
Major users of the data lake were the supply chain and ITC teams
Duration: Nov 2014 – Feb 2016
Employer: Syntel Ltd
Work Location: Edinburgh, Scotland
Client: Standard Life Investments
Project: Alloy TA Migration
Role: Data Architect
Technologies/Skills Used: Erwin, Ab Initio, SQL Server, SSRS
Project Description & Responsibilities:
Description: Standard Life Investments is a leading UK investment company. They are on the verge of migrating their transfer agency (mutual funds) operations to IFDS. The SLI in-house team, along with Syntel, is performing the feasibility analysis of this initiative
Responsibility:
Interact with Business Stakeholders, IFDS Team, application users and technical team to obtain their operational reporting requirements
Creating Gap Analysis document
Preparing project estimates and resource planning
Creating logical model for Data warehousing environment.
Enforcing data governance standard
Designing the ETL loading strategy with the ETL developers
Assisting ETL developers with business rules during ETL data loads
Duration: Aug 2014 – Oct 2014
Employer: Syntel
Work Location: Northbrook, IL
Client: Allstate
Project: Drivewise
Role: Data Architect
Technologies/Skills Used: Erwin
Project Description & Responsibilities:
Description: Allstate is a leading insurance company in the United States. They initiated the Drivewise program for their customers, in which customers receive discounts/rewards on their insurance package based on their driving performance. Allstate tracks driving behavior using a mobile app and device.
Responsibility:
Interact with System Analysts to obtain data and functional requirements
Conduct discussions with users and source system stakeholders to understand the operating behavior of the system
Create and document conceptual & logical data models.
Communicate physical database designs to Data Warehouse Administrators.
Review the data models with business and technical teams.
Assist developers, ETL, BI team and end users to understand the data model.
Created mapping document for ETL team
Duration: Sep 2013 – July 2014
Employer: Syntel
Work Location: Pune
Client: Amex
Project: World Service Data Warehouse
Role: Data Architect
Technologies/Skills Used: ETL/data warehousing, Informatica, ER/Studio
Project Description & Responsibilities:
Description: WSDW stands for World Service Data Warehouse, the group within Amex that integrates information from various operational systems into a centralized data repository capable of serving the analytical, reporting, and decision-making needs of the World Service business organization. It enforces an integrated, flexible, standards-based data warehouse and business intelligence framework that securely delivers accurate, timely, high-quality information, enabling the business to make better decisions faster.
WSDW is a system of record for American Express business units, setting the foundation for sharing data across the enterprise in a controlled and managed environment.
Responsibility:
Interact with Business Analysts to obtain data and functional requirements
Interact with business users and find out the reporting needs
Conduct interviews, brain storming discussions with project team to get additional requirements
Gather accurate data by performing data and functional analysis.
Create and document conceptual & logical data models.
Communicate physical database designs to Data Warehouse Administrators.
Review the data models with business and technical teams.
Create SQL code from data model and co-ordinate with DBAs to create database.
Ensure that data models and databases are in sync.
Assist developers, ETL, BI team and end users to understand the data model.
Maintain change log for each data model.
Establish and enforce overall modeling standards, guidelines, best practices, techniques, and approaches.
Assist with the development of project estimates, plans, and schedules.
Communicate effectively with business and IT staff across functional departments
Maintain an in-depth knowledge of current and emerging methodologies, technologies and standards and provide direction to management and users
Participate in data integration, information management programs and projects
Enforce World Service Data warehousing standards while designing the data warehousing models.
Duration: May 2013 – Aug 2013
Employer: Syntel
Work Location: Memphis, Tennessee (initially), then Pune
Client: FedEx
Project: Enterprise Data Warehouse Support
Role: Technical Architect
Technologies/Skills Used: Erwin, Teradata
Project Description & Responsibilities:
Description: FedEx BI Group captures the data from different sources and stores the data in the Teradata platform. They also deal with the various business units reporting requirements.
Responsibility:
End-to-end analysis of FedEx's existing enterprise data warehouse
Identifying data gaps and challenges
Designing a new enterprise data warehouse model to meet FedEx requirements
Custom Data Mart design for business reporting
Data Profiling
Creating conceptual, logical, and physical data model
Guiding ETL team in loading data from varied sources
Creating business rules
Project Experience – Previous Employers
Duration: Dec 2012 – Mar 2013
Employer: VSERV Business Solutions
Work Location: Denver, Colorado
Client: Oversite Data Services LLC
Project: Best X 2.0 Data Conversion
Role: ETL Architect
Technologies/Skills Used: MSBI (SSIS)
Project Description & Responsibilities:
Description: This firm works in conjunction with the Albertelli law firm, which deals with loan-level portfolio analysis and facilitates the liquidation of distressed assets. Their major client is Bank of America.
ALaw had been managing client and borrower portfolios through manual processes such as Access and Excel, and recently launched a new application, Best X 2.0, for their processing.
Responsibility:
Understanding the functionality and end to end process flow of new application.
Created the migration plan and ETL scripts in SSIS
Task Management and Scheduling
Scope, Risk and Resource Management
Customer engagement and Communication
Reporting requirements gathering
Duration: Nov 2009 – Dec 2010
Employer: Emaar Malls Group LLC, Dubai, UAE
Work Location: Dubai
Client: Emaar in-house projects
Project: Customer Data Hub
Role: Assistant Manager, Data Warehouse and BI Solutions
Technologies/Skills Used: Erwin, Informatica PowerCenter, Informatica Data Quality, Siebel UCM
Project Description & Responsibilities:
Description: The objective of this project was to consolidate all the customers’ information across Emaar and produce the 360-degree view of each customer across Emaar.
The project involved understanding business data, data profiling, data cleansing, standardization and merging, a data dedup strategy, and then utilizing the consolidated data for upselling and cross-selling.
Responsibility:
Data profiling, including current-state assessments, data assessments, sizing, and gap analysis
Formation of Data Stewards Team
Understanding all the entities and technology used across Emaar
Prepared Infrastructure Setup requirement guidelines for Development and Production environment
CDH logical data Model Design
Data Profiling of each entity source data
Defining Data Quality rules for cleansing and data standardization, Defining Data Merging and Survivorship rule
Liaising between the Business and the Development team
Defined the steps to reverse integrate the corrected data with the source data.
Educating the existing team on the standards and guidelines to follow.
Data Governance Policies and Procedures.
Project monitoring which includes continuous audit of application
Duration: Jan 2010 – Oct 2011
Employer: Emaar Malls Group LLC, Dubai, UAE
Work Location: Dubai
Client: In-house project
Project: Property Data Mart and HR Data Mart
Role: Assistant Manager, Data Warehousing & BI Solutions
Technologies/Skills Used: Erwin, SSIS 2008, Cognos 8
Project Description & Responsibilities:
Description: This project involved restructuring the existing properties and HR data warehouses to produce meaningful reports.
Responsibility:
Reviewing and redesigning existing data marts to cater to property requirements
Redesigning the Sales, HRMS, and CRM data marts
Designed Contact Center Quality Control Process and Reporting Structure.
Metadata modeling for reporting, and creating reports such as cash flows, service request information by department, and HRMS-related reports
Duration: Nov 2011 – Feb 2012
Employer: Emaar Malls Group LLC, Dubai, UAE
Work Location: Dubai
Client: In-house project
Project: Retail Data Mart
Role: Assistant Manager, Data Warehousing & BI Solutions
Technologies/Skills Used: Erwin, SSIS 2008, Cognos 8
Project Description & Responsibilities:
Description: This project was initiated to get a consolidated view of daily sales across all Emaar retail outlets and their comparison against footfall
Responsibility:
Understanding all the retail systems in Emaar
Dimensional modeling to build the retail data mart per requirements
Developed data quality checks and the loading strategy (real time or batch mode)
Created ETL packages and reports
Duration: Mar 2012 – May 2012
Employer: Emaar Malls Group LLC, Dubai, UAE
Work Location: Dubai
Client: In-house project
Project: Footfall Data Mart
Role: Assistant Manager, Data Warehousing & BI Solutions
Technologies/Skills Used: SSIS 2008, Cognos 8, Visio
Project Description & Responsibilities:
Description: This project was initiated because the existing footfall capturing mechanism did not cover all of Emaar's requirements, and there were many PLC failures.
Responsibility:
Reviewing the existing footfall system
Highlighted the problems we were facing
Helping the actual development team in designing the zones
Understanding how PLCs work and how to extract the data from the devices; communicating the captured information, issues, and suggestions to the business
Create Footfall Datamart and ETL jobs to load the footfall from the device and prepare BI reports
Duration: Mar 2009 – Oct 2009
Employer: Dicetek LLC
Work Location: Dubai
Client: Emirates NBD Bank
Project: Unifi
Role: Data Analyst
Technologies/Skills Used: Informatica, SQL Server 2008, Oracle GL
Project Description & Responsibilities:
Description: Involved in a complete end-to-end Oracle GL implementation for Emirates NBD Bank, including migration from the legacy application Equation to Oracle GL.
Responsibility:
Liaise between finance team and respective development team
Understanding the Asset and Liability accounts in GL, Understanding the Asset and Liability accounts in Equation system
Prepared data Mapping from Equation to Oracle GL
Created Migration steps
Reconciling GL Balance Sheet and P & L report with the Equation report
Devised a strategy to track unsettled accounts so that the respective teams could close them
Duration: May 2008 – Jan 2009
Employer: SafeNet LLC
Work Location: Noida, India
Client: In-house
Project: Sentinel Rights Management System
Role: Document Manager/Technical Writer
Technologies/Skills Used: MadCap Flare, RoboHelp, MKS 2008
Project Description & Responsibilities:
Description: Sentinel RMS is a robust license enforcement and enablement solution providing software and technology vendors with control and visibility into how their applications are deployed and used. Focused on scalable and flexible license management, RMS is ideal for applications deployed in medium to large scale enterprise environments.
Responsibility:
Reviewing SDK and Release documents
Product QA before release.
Versioning the release documents
Managing client issues and Requirements
Duration: Mar 2007 – April 2008
Employer: Birlasoft Ltd
Work Location: Noida, India
Client: GE Comfin
Project: BP Development & Support
Role: ETL Developer
Technologies/Skills Used: DataStage, SQL Server, Oracle, BO 6.0
Project Description & Responsibilities:
Description: Involved in migrating legacy system data from InfoLease to the Oracle and SQL Server platforms
Responsibility:
Involved in requirements gathering and source data analysis
Identifying business rules for data migration.
Creation of New Data warehouse in Oracle and SQL Server Platform
Created source to target mapping documents from staging area to Data Warehouse
Developed server mode ETL jobs
Duration: Feb 2006 – Feb 2007
Employer: Birlasoft Ltd
Work Location: Noida, India
Client: GE Money
Project: World Mortgage Company
Role: Redbrick DBA
Technologies/Skills Used: IBM Redbrick, SQL Server
Project Description & Responsibilities:
Description: WMC used to handle all the mortgages. The mortgage data were stored in the Redbrick database.
Responsibility:
Worked as a DBA for redbrick.
Involved in Users, Views, procedure, and tables creation in Redbrick database.
Taking database backups and restoring database backups
Involved in Data transformation activities using TMU.
Duration: Feb 2005 – Jan 2006
Employer: Birlasoft Ltd
Work Location: Noida, India
Client: GE NBCU
Project: Kintana Data Warehouse
Role: ETL Developer
Technologies/Skills Used: Kintana, SQL Server, DataStage
Project Description & Responsibilities:
Description: Kintana is a project management tool that tracks all the projects currently running in GE NBCU, including project costing and other tracking details. The purpose of this project was to migrate the data from Kintana into a data warehouse platform and provide meaningful reports as intended by the business.
Responsibility:
Understand business requirements and assist in developing project plans
Evaluate and design data collection, data staging, data movement, analytics delivery, data quality and archiving strategies
Lead and guide the ETL developers in data profiling, source-target mappings, ETL development, SQL tuning and optimization, testing, and implementation
Provide reviews of deliverables and identify gaps/areas for improvement
Provide/review detailed technical specifications and ETL documentation required in each phase of the SDLC, complying with internal standards and guidelines