Kanhaiya Lal Khatik
BI Data Architect / Data Modeler
Email: **************@*****.*** Phone Number:
PROFESSIONAL SUMMARY
. An information technology professional with more than 9 years of
extensive experience in all phases of the software development life
cycle, including system analysis, design, data analysis, data
architecture, data modeling, implementation, and support of various
web-based applications across OLTP, data warehousing, and OLAP
environments.
. Data Modeler with strong conceptual and logical data modeling
skills; experienced with JAD sessions for requirements gathering,
creating data mapping documents, and writing functional
specifications and queries.
. Extensive experience in relational and dimensional data modeling,
covering data analysis, business requirements, logical and physical
database design, and ER diagrams with all related entities and the
relationships between them based on the rules provided by business
managers, using ER Studio.
. Worked extensively in forward and reverse engineering processes;
created DDL scripts for implementing data model changes; published
data models to Acrobat PDF files; created ER Studio reports in HTML
and RTF formats as required; published data models to the model mart;
created naming-convention files; and coordinated with DBAs to apply
the data model changes.
. Extensive experience in writing system specifications and
translating user requirements into technical specifications; created,
maintained, and modified database design documents with detailed
descriptions of logical entities and physical tables.
. Experienced in developing database design documents including
conceptual, logical, and physical data models using ER Studio;
general understanding of database management systems and data
warehouses, including their functional and technical architecture and
the design of data flow diagrams.
. Good knowledge of ETL processes such as sources, mappings,
transformations, and staging areas; created various ETL documents
such as ETL mapping documents, data mapping documents, and ETL test
scripts.
. Good knowledge of the Informatica Data Virtualization tool.
. Good knowledge and experience in Data quality, Data Governance and
Master Data Management.
. Excellent knowledge of the Software Development Life Cycle (SDLC);
adhered to standards and guidelines throughout the development life
cycle and during maintenance of the models.
. Possess strong documentation and knowledge-sharing skills; conducted
data modeling sessions for different user groups, facilitated common
data models between different applications, and participated in
requirements sessions to identify logical entities.
. Extensive Experience working with business users/SMEs as well as
senior management.
. Strong understanding of data warehousing principles, including fact
tables, dimension tables, and star and snowflake schema modeling.
. Strong experience with Database performance tuning and
optimization, query optimization, index tuning, caching and buffer
tuning.
. Extensive experience in backend programming, including database
table design, stored procedures/BTEQs/macros, triggers, indexes, and
performance tuning in Oracle 10g/9i/8i and Teradata 12.
. Good knowledge of analysis and reporting using various business
intelligence tools such as Business Objects XI 3, OBIEE, QlikView,
and Spotfire.
. Basic Knowledge of MISMO (Mortgage Industry Standards Maintenance
Organization)
. Basic experience with NoSQL databases.
. Good experience in troubleshooting techniques for drilling down
into complex issues raised by users.
. Brainbench certification in data warehousing and data modeling.
. Good knowledge of extracting data from sales CRM systems such as
Salesforce and Siebel.
. Excellent verbal, presentation, and written communication skills;
able to communicate with technical, non-technical, and senior
management audiences. Have played a significant role in requirements
planning, client interaction, and documentation.
. Excellent problem-solving skills with a strong technical background;
a results-oriented team player with strong communication and
interpersonal skills.
. Knowledge of Oracle Apps Financial Services.
TECHNICAL SKILLS
Data Modeling: Embarcadero ER Studio, Erwin
Databases/Tools: Teradata 12.0, Oracle 8i/9i/10g/11g, MS SQL Server
2000, MySQL, Sybase 11.5/11.2, MS Access, Mainframe Files, DB2 UDB,
XML, SQL Navigator, Teradata SQL, PL/SQL
ETL Tools: Informatica Data Virtualization 9.x, Informatica
PowerCenter 9.x/8.x/7.x/6.1
BI Tools: Business Objects, OBIEE, QlikView, Spotfire
Architectures: OLAP, OLTP
Operating Systems: Windows 95/98/2000/XP, UNIX
Programming Languages: SQL, Shell Scripting, C, C++, XML, HTML
PROFESSIONAL EXPERIENCE
PBC (PepsiCo Beverages Company), Somers, NY            Jan 2014 - Current
Sr. Data Modeler
Project - Enterprise Data Foundation V2.0:
This project builds the enterprise data warehouse layer for PepsiCo
reporting. The first phase includes data modeling for the entities
required for Canada, Mexico, and Saudi Arabia; later phases will add
entities required for other countries. This enterprise data warehouse
layer will enable PepsiCo reporting users to get a holistic view of key
metrics such as net sales across regions by product, customer, etc.
. Combine business and technical knowledge to determine the most
appropriate solutions for business needs.
. Analyze the business requirements for building data marts and the
data warehouse.
. Design, develop, implement, and maintain conceptual, logical, and
physical data models for complex data warehouses.
. Develop snowflake schemas by normalizing the dimension tables (see
the sketch after this section).
. Develop logical specifications and user requirements to design
solutions for the application environment.
. Perform program analysis, design, coding, testing, implementation,
and troubleshooting, and write documentation.
. Create functional specifications and document findings.
. Identify data elements needed to support new requirements.
. Responsible for creating and executing test cases to validate ETL
and report code changes made for break/fixes and new enhancements.
. Responsible for implementing security for the data warehouse and
reporting applications.
. Responsible for reporting development with HTML along with jQuery,
ASP.NET, and C#.
. Other similar professional responsibilities as needed.
Environment: Teradata 14, Oracle 10g, Teradata SQL, PL/SQL for Oracle,
ER Studio, Informatica PowerCenter 9.1, Business Objects
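A minimal sketch of the snowflake normalization referenced above. The
table and column names below are hypothetical, used only to illustrate
the pattern of splitting a denormalized product dimension into a
product -> brand -> category hierarchy; they are not the actual PepsiCo
objects.

    -- Hypothetical example: snowflaking a denormalized PRODUCT dimension
    -- into a PRODUCT -> BRAND -> CATEGORY hierarchy.
    CREATE TABLE dim_category (
        category_key   INTEGER       NOT NULL PRIMARY KEY,
        category_name  VARCHAR(100)  NOT NULL
    );

    CREATE TABLE dim_brand (
        brand_key      INTEGER       NOT NULL PRIMARY KEY,
        brand_name     VARCHAR(100)  NOT NULL,
        category_key   INTEGER       NOT NULL REFERENCES dim_category (category_key)
    );

    CREATE TABLE dim_product (
        product_key    INTEGER       NOT NULL PRIMARY KEY,
        product_name   VARCHAR(200)  NOT NULL,
        brand_key      INTEGER       NOT NULL REFERENCES dim_brand (brand_key)
    );

    -- Fact rows reference only the lowest-level dimension; brand and
    -- category are reached through the snowflaked foreign keys.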
GE Capital Americas, Norwalk, CT                    June 2013 - Dec 2013
Data Modeler/Data Analyst
HRSIC Project:
GE Capital is the financial services unit of the US conglomerate General
Electric. GE Capital provides commercial lending and leasing, as well as a
range of financial services for health care, media, communications,
entertainment, consumers, real estate, and aviation. GE Capital focuses
primarily on loans and leases that it underwrites to hold on its own
balance sheet rather than on generating fees by originating loans and
leases, then selling them to third parties.
This project was built to provide data for the GECA Sales Compensation
process. I was responsible for liaising with business subject matter
experts to analyze business requirements and translate them into
detailed logical models, physical models, and schema development in the
database.
Role & Responsibilities:
. Gathering business requirements from users and performing data
analysis.
. Analyzing and modeling CRM systems such as Salesforce and Siebel in
the data warehouse.
. Analyzing and modeling Oracle Apps Financial Services tables.
. Data Architecture, Conceptual, logical and physical Modeling for OLAP
systems.
. ER Modeling - Developing Entity Relationship Diagrams (ERD).
. Normalizing the tables/relationships to arrive at effective Relational
Schemas.
. Identifying the facts & dimensions; grain of fact, aggregate tables
for Dimensional Models.
. Developing Snowflake Schemas by normalizing the dimension tables as
appropriate.
. Implementation of business rules in the database using constraints
and triggers (see the sketch after this section).
. Dimensional Data Modeling to deliver Multi-Dimensional STAR schemas.
. Requirements & Business Process Analysis; Rapid Audit of the
requirements and existing systems.
. Developed, Implemented & Maintained the Conceptual, Logical & Physical
Data models.
. Design & Implementation of Data Mart; DBA coordination; DDL & DML
Generation & usage.
. Metadata & Data Dictionary Management; Data Profiling; Data Mapping.
. Applied data governance rules (primary qualifiers, class words, and
valid abbreviations in table and column names).
. Involved in capturing data lineage, table and column data
definitions, valid values, and other necessary information in the
data models.
. Documented all application information and saved it for future
reference.
. ETL Design & Implementation - Data Extraction, Transformation &
Loading using Informatica mappings and BTEQs.
. Performance Tuning (Database Tuning, Teradata SQL Tuning,
Application/ETL Tuning)
. Wrote CREATE and ALTER SQL statements before sending database change
requests to the DBA team.
. Maintained and documented all CREATE and ALTER SQL statements for
each release.
. Designing Data Flows & System Interfaces.
. Architecting Work Flows, Activity Hierarchy & Process Flows;
Documenting using Interface Diagrams, Flow Charts & Specification
Documents.
. Coordinating with DBA team to implement physical models & to setup
development, test, staging & production environments for DDL & DML
Generation & usage.
. Extensively involved in various data modeling tasks including forward
engineering, reverse engineering, complete compare, creating DDL
scripts, creating subject areas.
. Defined dependencies of access layer components on all the source
databases or files.
Environment: Teradata 12, Oracle 10g, Teradata SQL, PL/SQL for Oracle,
ER Studio, Erwin, Informatica PowerCenter 9.1
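A minimal sketch of the constraint- and trigger-based business rule
enforcement referenced above. The table, column, and constraint names
are hypothetical and do not reflect the actual GE Capital schema.

    -- Hypothetical example: declarative business rules as constraints.
    ALTER TABLE deal
        ADD CONSTRAINT ck_deal_amount CHECK (deal_amount > 0);

    ALTER TABLE deal
        ADD CONSTRAINT fk_deal_customer
        FOREIGN KEY (customer_id) REFERENCES customer (customer_id);

    -- Hypothetical trigger: stamp audit columns on every insert/update.
    CREATE OR REPLACE TRIGGER trg_deal_audit
    BEFORE INSERT OR UPDATE ON deal
    FOR EACH ROW
    BEGIN
        :NEW.last_updated_by   := USER;
        :NEW.last_updated_date := SYSDATE;
    END;
    /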
GE Capital EF, Norwalk & Danbury, CT                Aug 2009 - June 2013
Data Modeler/Data Analyst
BI Touchless Origination:
The Touchless application is a new origination application for GE
Capital, and GE Capital planned to on-board all of its small-ticket
business onto this application to achieve faster turnaround on deals.
As part of this project, we worked on pulling the data into the staging
layer, the Core layer (3rd normal form), and an access layer built on
top of the 3rd normal form Core. We provided a single-stop reporting
solution to GECA origination users. I was responsible for liaising with
business subject matter experts to analyze business requirements and
translate them into detailed logical models, physical models, and
schema development in the database.
Role & Responsibilities:
. Gathering business requirements from users and performing data
analysis.
. Performing analytics on marketing data.
. Designed a Conceptual Data Model, Logical Data Model, and Physical
Data Model.
. Worked extensively in understanding business reporting requirements
for the various Touchless releases across GECA businesses.
. Designed the Stage environment as per the various sources involved for
this project.
. Extensively used normalization techniques to design logical/physical
data models and the relational database design in the EDW Core (3rd
normal form) based on the FSLDM model.
. Designed the access layer dimensional model for Touchless reporting.
. Conducted data modeling sessions for different user groups,
facilitated common data models between different applications,
participated in requirement sessions to identify logical entities.
. Extensively involved in various data modeling tasks including
forward engineering, reverse engineering, complete compare, creating
DDL scripts, and creating subject areas.
. Extensively worked with Business Objects developer in designing the
universe.
. Extensively worked with developers and the ETL team in enhancing the
models and coordinated with the DBA in implementing those changes in
applications or databases.
. Extensively involved in creating, maintaining, and updating documents
such as the requirements document, database design document, system
design document, naming standard procedures, SOPs, etc.
. Analyzed and optimized the existing business processes using
Conceptual Models and Data Flow Diagram. The process included
performance enhancement requirements assessment and mapping of
existing work flow and data flows.
. Participated in requirements gathering, business analysis, and user
meetings, discussing requirements for new table designs in the
existing database and new columns for existing tables.
. Knowledge of flattening Oracle XMLTYPE objects into flat views and
pulling the same into Teradata (see the sketch after this section).
. Worked on the Data Warehouse team analyzing data files that had to
be compiled from disparate non-production sources and readied for
production. Tasks included comparing data to requirements
documentation and creating data layouts, the data dictionary, and the
pipe-delimited and fixed-width files. In addition, served as team
lead for managing the completion and loading of these files for the
Data Management group.
. Added MISMO-related attributes to the data warehouse.
. Extensively worked on creating, altering, and dropping tables in
different development environments as well as in production.
. Coordinated with the DBA in implementing the database changes and
updated the data models with the changes implemented.
Environment: ER Studio DA 9.1, Erwin, Teradata 12, Informatica,
Teradata SQL, Microsoft Project Plan, Business Objects, Oracle 10g
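A minimal sketch of the XMLTYPE flattening referenced above, using
Oracle's XMLTABLE to expose the XML content as a relational view that
can then be extracted into Teradata. The table, column, and XPath names
are hypothetical.

    -- Hypothetical example: flatten an Oracle XMLTYPE column into a view.
    CREATE OR REPLACE VIEW v_deal_party_flat AS
    SELECT d.deal_id,
           x.party_name,
           x.party_role
    FROM   deal_xml d,
           XMLTABLE('/Deal/Party'
                    PASSING d.deal_doc
                    COLUMNS party_name VARCHAR2(100) PATH 'Name',
                            party_role VARCHAR2(30)  PATH 'Role') x;
    -- The flat view is then extracted through an ETL mapping into Teradata.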
GECA (GE Capital Americas), Mumbai, India            Oct 2006 - Aug 2009
Data Modeler/Analyst
BI Enterprise DW Reporting Project:
This project had 1,000 users across the world from various GECA
sub-businesses. The role involved direct interaction with users and was
very challenging, as it dealt with multiple source applications,
required knowledge of multiple tools, and involved enhancing the
existing environment to cater to users' needs. The project covered all
enhancements and support identified by users and applied to the access
layer, Siebel Analytics, and Business Objects. Enhancements were
decided every month and subsequently released to the production
environment.
Responsibilities:
. Designed a Conceptual Data Model, Logical Data Model, and Physical
Data Model.
. Worked extensively in understanding business reporting requirements.
. Designed the Stage environment as per the various sources involved for
this project.
. Extensively used various techniques to design logical/physical data
models and the relational database design in the Oracle DW.
. Designed the access layer dimensional model for reporting.
. Conducted data modeling sessions for different user groups,
facilitated common data models between different applications,
participated in requirement sessions to identify logical entities.
. Extensively involved in various data modeling tasks including
forward engineering, reverse engineering, complete compare, creating
DDL scripts, and creating subject areas.
. Extensively worked with Business Objects developer in designing the
universe.
. Extensively worked with developers and the ETL team in enhancing the
models and coordinated with the DBA in implementing those changes in
applications or databases.
. Extensively involved in creating, maintaining, and updating documents
such as the requirements document, database design document, system
design document, naming standard procedures, SOPs, etc.
. Analyzed and optimized the existing business processes using
Conceptual Models and Data Flow Diagram. The process included
performance enhancement requirements assessment and mapping of
existing work flow and data flows.
. Participated in requirements gathering, business analysis, and user
meetings, discussing requirements for new table designs in the
existing database and new columns for existing tables.
. Knowledge of flattening Oracle XMLTYPE objects into flat views and
pulling the same into Teradata.
. Worked on the Data Warehouse team analyzing data files that had to
be compiled from disparate non-production sources and readied for
production. Tasks included comparing data to requirements
documentation and creating data layouts, the data dictionary, and the
pipe-delimited and fixed-width files. In addition, served as team
lead for managing the completion and loading of these files for the
Data Management group.
. Extensively worked on creating, altering, and dropping tables in
different development environments as well as in production.
. Coordinated with the DBA in implementing the database changes and
updated the data models with the changes implemented.
Environment: Oracle 9i, Informatica 7, Oracle PL/SQL, Siebel Analytics/
OBIEE, Business Objects, Unix
GE Capital, Corporate Financial Services
Aug 2006 - Sep 2006
Data Modeler/Analyst & ETL Developer
Marketing Data Mart:
This project enhanced the existing Marketing Data Mart by adding more
entities to it. Data model changes were made to add the new entities,
and new mappings were then built to populate them.
Roles and Responsibilities:
. Designed a Conceptual Data Model, Logical Data Model, and Physical
Data Model.
. Participated in Requirement Gathering, Business Analysis, User
meetings, discussing the issues to be resolved and translating user
inputs into ETL design documents.
. Assessed data quality requirements in terms of data completeness,
consistency, conformity, accuracy, referential integrity, and
duplication; evaluated vendor tools (Informatica Data Explorer/Data
Quality); identified and measured data quality; and designed and
implemented data profiling and data quality improvement solutions to
analyze, match, cleanse, and consolidate data before loading into the
data warehouse.
. Developed mappings to load the relational data model for the
operational data store and staging areas; designed dimension and fact
tables for the data marts (see the sketch after this section).
. Developed Informatica ETL mapping documents; created the ETL staging
area framework, data mapping documents, data flow diagrams, ETL test
scripts, etc.
. Analyzed and optimized the existing business processes using
Conceptual Models and Data Flow Diagram. The process included
performance enhancement requirements assessment and mapping of
existing work flow and data flows.
. Extensively worked with Business Objects developer in designing the
universe.
Environment: Oracle 9i, Informatica 7, Oracle PL/SQL, Siebel Analytics/
OBIEE, Business Objects, Unix
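A minimal illustration of the dimension and fact table design mentioned
above. The grain, table names, and columns are hypothetical, not the
actual data mart objects.

    -- Hypothetical example: a fact table at the grain of one row per
    -- campaign response, keyed to its conformed dimensions.
    CREATE TABLE fact_campaign_response (
        date_key        INTEGER      NOT NULL REFERENCES dim_date (date_key),
        campaign_key    INTEGER      NOT NULL REFERENCES dim_campaign (campaign_key),
        customer_key    INTEGER      NOT NULL REFERENCES dim_customer (customer_key),
        response_count  INTEGER      NOT NULL,
        response_amount NUMBER(12,2)
    );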
GE Commercial Finance UK (GLM), Mumbai, India Jan 2006-
Jul 2006
Data Analyst, ETL and Business Objects Developer
This project created operational reports (point-in-time reports) in
Business Objects from the Oracle Applications database. A Business
Objects universe was designed over these tables to cater to the
reporting requirements. Reports were created from this universe and
published to Corporate Documents. I worked as a BO developer to build
the universe and reports in this project.
Roles and Responsibilities:
. Analyzed and profiled the source data to help the DA define the
grain of the data.
. Developed Analytical applications that can analyze large amounts of
online and offline data.
. Extensively used the DataStage ETL tool in designing and
implementing extract, transform, and load processes.
. Performed tuning on database queries, ETL mappings and end-user
queries.
. Developed Reporting Design doc and reports.
. Developed BO reports and Universe
. Used AppWorx to schedule and run the solution, and used Informatica
components for testing, debugging, and monitoring the ETL jobs.
. Wrote shell scripts to automate file manipulation and data loading
procedures.
. Unit tested and deployed the jobs to Production.
. Used Type 2 mappings to update slowly changing dimension tables to
keep full history (see the sketch after this section).
Environment: Informatica PowerCenter 6.1 (Workflow Manager, Workflow
Monitor, Designer), Oracle, PL/SQL, Unix, Business Objects
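A minimal sketch of the Type 2 slowly changing dimension logic
referenced above, expressed in SQL for clarity (in the project it was
implemented as an Informatica mapping). Table, column, and bind
variable names are hypothetical.

    -- Hypothetical Type 2 SCD logic: close out the current row, then
    -- insert the new version with a fresh effective-date range.
    UPDATE dim_customer
    SET    current_flag = 'N',
           end_date     = CURRENT_DATE - 1
    WHERE  customer_id  = :in_customer_id
    AND    current_flag = 'Y'
    AND    (customer_name <> :in_name OR customer_segment <> :in_segment);

    INSERT INTO dim_customer
           (customer_key, customer_id, customer_name, customer_segment,
            start_date, end_date, current_flag)
    VALUES (:in_next_key, :in_customer_id, :in_name, :in_segment,
            CURRENT_DATE, DATE '9999-12-31', 'Y');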
GE Commercial Finance VFS, Mumbai, India
Aug 2005- Dec 2005
ETL & Business Objects Developer
The project involved migration of the existing hierarchical database,
based on FOCUS, to Oracle. I built the extraction process, the BO
universe, and 105 BO reports on top of it.
Roles and Responsibilities:
. Involved in analysis, requirements gathering, functional/technical
specification, development, deploying and testing.
. Used Informatica as an ETL tool to extract data from sources such as
DB2 and flat files and load it into target tables.
. Developed DataStage parallel jobs for the various components and
strings involved in the calculation of quotes and rates.
. Developed a parameter-driven ETL process to map source systems to
target data systems, with complete source-system profiling in
DataStage.
. Developed BO reports and Universe
. Wrote Release notes, Deployment documents and scheduled the jobs.
. Worked across the full project life cycle, from analysis to
production implementation, with emphasis on identifying the sources,
validating source data, developing the required logic and
transformations, and creating jobs to load the data into different
targets.
Environment: Informatica PowerCenter/PowerMart 6.1 (Workflow Manager,
Workflow Monitor, Designer), Oracle, Unix, PowerDesigner, Business
Objects
EDUCATIONAL QUALIFICATION
. Bachelor of Technology from NIT Jamshedpur, India