SUMESH NAIR Data Modeler
SUMMARY
• An Information Technology professional with more than 3 years of experience in all phases of the software development life cycle, including System Analysis, Design, Data Modeling, Implementation, and Support of OLTP, Data Warehousing, and OLAP applications.
• Data Modeler with strong Conceptual and Logical Data Modeling skills; experienced with JAD sessions for requirements gathering, creating data mapping documents, writing functional specifications, and developing queries.
• Extensive experience in Relational and Dimensional Data Modeling, creating Logical and Physical database designs and ER Diagrams using data modeling tools such as ERwin and ER Studio.
• Worked extensively on forward and reverse engineering processes; created DDL scripts to implement data model changes, generated ERwin reports in HTML and RTF formats as required, published data models to the model mart, created naming convention files, and coordinated with DBAs to apply data model changes.
• Extensive experience in writing functional specifications, translating business requirements into technical specifications, and creating, maintaining, and modifying database design documents with detailed descriptions of logical entities and physical tables.
• Excellent knowledge of the waterfall and spiral methodologies of the Software Development Life Cycle (SDLC).
• Strong documentation and knowledge-sharing skills; conducted data modeling review sessions for different user groups and participated in requirements sessions to assess requirement feasibility.
• Extensive experience working with business users/SMEs as well as senior management.
• Strong understanding of the principles of Data warehousing, Fact Tables, Dimension Tables, star and snowflake schema modeling.
• Experience in backend programming including schema and table design, stored procedures, Triggers, Views, and Indexes.
• Excellent analytical, interpersonal, and communication skills with a strong technical background.
TECHNICAL EXPOSURE
Data Modeling
Dimensional Data Modeling, Relational Data Modeling, Star and Snowflake Schema, Fact and Dimension Tables, Conceptual, Logical and Physical Data Modeling, ER Studio 7.1.1, ERwin 4.0/3.5.2/3.x, Power Designer.
Data Warehousing
Informatica PowerCenter 8.1/8.0/7.1/7.0/6.2/6.1/5.2, Informatica PowerMart 4.7, PowerConnect, Power Exchange, Data Profiling, Data cleansing, OLAP, OLTP, SQL*Plus.
Databases
Oracle 10g/9i/8i/8.0/7.0, DB2 8.0/7.0, SQL Server 12.0/11.x, MS SQL 7.0, SQL Server 2000/2005, MS Access 7.0/97/2000, Quest Central for DB2.
Environment
UNIX, LINUX, Windows 95/98/2000/XP.
Programming Languages
Matlab, C, C++
PROFESSIONAL EXPERIENCE
Mannatech, Dallas, TX May 2009 – Present
Data Modeler/ On-site Project Coordinator
Mannatech is a fast-growing company focused on delivering better quality of life through scientifically tested wellness technologies. It is a multinational firm engaged in multi-level marketing, research, development, and distribution of glyconutrients, the company's name for blends of sugars. The goal of the project was to design and implement a Data Warehouse to support Business Intelligence through 12 detailed reports and 20 dashboards.
Responsibilities:
• Participated in requirement-gathering sessions with business users and sponsors to understand and document the business requirements as well as the goals of the project.
• Created and reviewed the conceptual model for the EDW (Enterprise Data Warehouse) with business users.
• Analyzed the source system (JD Edwards) to understand the source data and JDE table structure along with deeper understanding of business rules and data integration checks.
• Identified various facts and dimensions from the source system and business requirements to be used for the data warehouse.
• Created the dimensional logical model with approximately 10 fact tables and 30 dimensions comprising 500 attributes using ER Studio.
• Implemented the slowly changing dimension scheme (Type II) for most of the dimensions (see the T-SQL sketch after this list).
• Implemented standard naming conventions for the fact and dimension entities and attributes of the logical and physical models.
• Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to provide information about the data model and business requirements.
• Created the DDL scripts using ER Studio and source to target mappings (S2T- for ETL) to bring the data from JDE to the warehouse.
• Worked with the DBA to create the physical model and tables. Scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning, and indexing schemes case by case for the facts and dimensions.
• Worked on model-based and data-based volumetric analysis to provide accurate space requirements to the production support team.
• Worked on Mercury Quality Center to track defects logged against the logical and physical model.
• Worked as an onsite project coordinator once the design of the database was finalized in order to implement the data warehouse according to the implementation standards.
• Worked with the client and offshore team to make sure that the reports and dashboards were delivered on time.
• Participated in UAT sessions to educate the business users about the reports, dashboards and the BI System. Worked with the test team to provide insights into the data scenarios and test cases.
• Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
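The following is a minimal T-SQL sketch of the Type II slowly changing dimension pattern noted above, written against SQL Server 2008; the table and column names (DimCustomer, StgCustomer, etc.) are hypothetical stand-ins, not the actual warehouse schema.

    -- Hypothetical Type II customer dimension: surrogate key plus validity window.
    CREATE TABLE dbo.DimCustomer (
        CustomerSK     INT IDENTITY(1,1) PRIMARY KEY,   -- surrogate key
        CustomerNK     VARCHAR(20)  NOT NULL,           -- natural/business key from the source
        CustomerName   VARCHAR(100) NOT NULL,
        Region         VARCHAR(50)  NOT NULL,
        EffectiveDate  DATE         NOT NULL,
        ExpirationDate DATE         NOT NULL DEFAULT '9999-12-31',
        IsCurrent      BIT          NOT NULL DEFAULT 1
    );

    -- Step 1: expire the current row when a tracked attribute changes in staging.
    UPDATE d
    SET    d.ExpirationDate = DATEADD(DAY, -1, CAST(GETDATE() AS DATE)),
           d.IsCurrent      = 0
    FROM   dbo.DimCustomer AS d
    JOIN   dbo.StgCustomer AS s ON s.CustomerNK = d.CustomerNK
    WHERE  d.IsCurrent = 1
      AND (s.CustomerName <> d.CustomerName OR s.Region <> d.Region);

    -- Step 2: insert a new current row for changed or newly arrived customers.
    INSERT INTO dbo.DimCustomer (CustomerNK, CustomerName, Region, EffectiveDate)
    SELECT s.CustomerNK, s.CustomerName, s.Region, CAST(GETDATE() AS DATE)
    FROM   dbo.StgCustomer AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dbo.DimCustomer AS d
                       WHERE  d.CustomerNK = s.CustomerNK
                         AND  d.IsCurrent  = 1);

Each business key keeps its full change history as separate rows, and fact loads join to the row where IsCurrent = 1 (or to the row whose validity window covers the transaction date).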
Environment – ER Studio 8.0.3, Microsoft SQL 2008 Server, Microsoft SQL Management Studio, Microsoft SQL 2008 Integration Services, Microsoft SQL 2008 Reporting Services, Microsoft SQL 2008 Analysis Services, Mercury Quality Center 9.
Thrivent Financial for Lutherans, Minneapolis, MN Nov 2007 – April 2009
Data Modeler/Data Analyst
Thrivent Financial for Lutherans is a Fortune 500 not-for-profit financial services membership organization. The Information Management Project was to develop and implement a conceptual, logical, and physical enterprise-wide data model for the insurance, mutual fund, and annuity products.
Responsibilities:
• Participated in JAD sessions with business users and sponsors to understand and document the business requirements in alignment with the financial goals of the company.
• Created the conceptual model for the data warehouse, with emphasis on insurance (life and health), mutual funds, and annuities, using the Embarcadero ER Studio data modeling tool.
• Reviewed the conceptual EDW (Enterprise Data Warehouse) data model with business users, App Dev, and Information Architects to make sure all the requirements were fully covered.
• Analyzed a large number of COBOL copybooks from 16 mainframe sources to understand existing constraints, relationships, and business rules in the legacy data.
• Worked on rationalizing the requirements across multiple product lines.
• Reviewed and implemented the naming standards for the entities, attributes, alternate keys, and primary keys for the logical model.
• Created the logical model for the EDW with approximately 75 entities and 1000 attributes using ER Studio. The logical model was fully attributed to third normal form and contained both current and history tables (see the DDL sketch after this list); the model was divided into a number of submodels for ease of understanding and comprehension.
• Reviewed the logical model with application developers, ETL Team, DBAs and testing team to provide information about the data model and business requirements.
• Worked with ETL to create source to target mappings (S2T).
• Worked with DBA to create the physical model and tables.
• Worked on Mercury Quality Center to track defects logged against the logical and physical model.
• Held brainstorming sessions with application developers and DBAs to discuss denormalization, partitioning, and indexing schemes for the physical model.
• Worked on Requirements Traceability Matrix to trace the business requirements back to logical model.
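A minimal DB2 DDL sketch of the current-plus-history table pattern referenced above; the entity, column, and key names (POLICY, POLICY_HIST) are hypothetical illustrations, and the real EDW model is far larger.

    -- Hypothetical "current" table: one row per active policy.
    CREATE TABLE POLICY (
        POLICY_ID    INTEGER       NOT NULL,
        PRODUCT_CODE CHAR(4)       NOT NULL,
        FACE_AMOUNT  DECIMAL(15,2) NOT NULL,
        STATUS_CODE  CHAR(2)       NOT NULL,
        LAST_CHG_TS  TIMESTAMP     NOT NULL,
        PRIMARY KEY (POLICY_ID)
    );

    -- Hypothetical history table: prior versions keyed by policy and change timestamp.
    CREATE TABLE POLICY_HIST (
        POLICY_ID    INTEGER       NOT NULL,
        CHG_TS       TIMESTAMP     NOT NULL,
        PRODUCT_CODE CHAR(4)       NOT NULL,
        FACE_AMOUNT  DECIMAL(15,2) NOT NULL,
        STATUS_CODE  CHAR(2)       NOT NULL,
        PRIMARY KEY (POLICY_ID, CHG_TS)
    );

    -- Before updating a current row, the ETL copies the existing version into history:
    INSERT INTO POLICY_HIST (POLICY_ID, CHG_TS, PRODUCT_CODE, FACE_AMOUNT, STATUS_CODE)
    SELECT POLICY_ID, LAST_CHG_TS, PRODUCT_CODE, FACE_AMOUNT, STATUS_CODE
    FROM   POLICY
    WHERE  POLICY_ID = 12345;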
Environment – ER Studio 7.1.1, Quest Central for DB2 v 4.8, COBOL copybooks, Mainframe DB2, Mercury Quality Center 9, Informatica PowerCenter 8.1
Techforce, Shawnee, KS June 2007 – Nov 2007
Data Modeler/Data Analyst
This was an in-house project to study an existing data model and update it to reflect changes in the business.
Responsibilities:
• Analyzed existing logical data model (LDM) and made appropriate changes to make it compatible with business requirements.
• Expanded Physical Data Model (PDM) for the OLTP application using ERwin.
• Applied IDEF1X and IE data modeling methodologies.
• Involved in data model reviews with the internal data architect, business analysts, and business users, explaining the data model to make sure it was in line with business requirements.
• Created rationalized domains to bring consistency in the tables.
• Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.
• Worked with cross-functional teams and prepared detailed design documents for production phase of current customer database application.
Environment: ERwin 4.0/3.5.2, Toad, PL/SQL, Oracle 9i, SQL Server 2000, SQL*Loader, UNIX, Windows 2005
Integrated Business Solutions, Kerala, India (Training) June 2004 – Nov 2004
Data warehouse design
• Received training on SQL Server 2000. Created a number of tables, functions, triggers, stored procedures, and views in SQL Server (see the sketch after this list).
• Participated in business requirement gathering sessions.
• Worked on identifying facts, dimensions and various other concepts of dimensional modeling which are used for data warehousing projects.
• Trained in Inmon and Kimball approaches for data warehouse design.
• Worked on Normalization and Denormalization techniques.
• Trained on building Conceptual, Logical and Physical data model.
• Defined relationships and cardinalities among entities.
• Developed PL/SQL queries and numerous stored procedures to perform validations.
• Created and maintained database objects (tables, views, indexes, partitions, database triggers, etc.).
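A small illustrative sketch, in SQL Server 2000-era T-SQL, of the kind of validation stored procedure and trigger built during this training; the Orders table and all object names are hypothetical.

    -- Hypothetical order table used by the examples below.
    CREATE TABLE dbo.Orders (
        OrderID    INT IDENTITY(1,1) PRIMARY KEY,
        CustomerID INT      NOT NULL,
        OrderDate  DATETIME NOT NULL DEFAULT GETDATE(),
        Amount     MONEY    NOT NULL
    )
    GO

    -- Validation stored procedure: reject non-positive amounts before inserting.
    CREATE PROCEDURE dbo.usp_InsertOrder
        @CustomerID INT,
        @Amount     MONEY
    AS
    BEGIN
        IF @Amount <= 0
        BEGIN
            RAISERROR('Order amount must be positive.', 16, 1)
            RETURN
        END
        INSERT INTO dbo.Orders (CustomerID, Amount)
        VALUES (@CustomerID, @Amount)
    END
    GO

    -- Trigger: block updates that would set an order amount to a negative value.
    CREATE TRIGGER dbo.trg_Orders_CheckAmount
    ON dbo.Orders
    AFTER UPDATE
    AS
    BEGIN
        IF EXISTS (SELECT 1 FROM inserted WHERE Amount < 0)
        BEGIN
            RAISERROR('Negative order amounts are not allowed.', 16, 1)
            ROLLBACK TRANSACTION
        END
    END
    GO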
EDUCATION
M.S. (Electrical and Computer Engineering) May 2007
University of Missouri-Rolla (UMR), USA GPA: 3.5/4.0
Research Experience
• Worked in a team to design the database for image analysis.
• Worked on a SQL-to-Matlab interface to bring image data directly into the Matlab application for analysis.
• Performed data analysis and feature classification techniques for real time airborne imagery.
• Developed a front-end application in Matlab to facilitate analysis of airborne imagery.
• Developed an application to register the aerial images.
• Analyzed the images for minefield detection using RX anomaly detection algorithms.
• Completed error analysis (Mean Error, Variance of the error) on the results from various flight simulation exercises.
Environment: Matlab, C++, MS SQL, DB2.
Projects
• Developed an application in Matlab for evaluating various adaptive filter algorithms for different communication channels with both sparse and dispersive echo paths.
• Developed a feed-forward back-propagation neural network for ZIP code recognition.
• Implemented the Goertzel algorithm for DTMF detection on the 6713 DSP kit.
• Developed a Matlab application for image registration under both rotation and translation using the Fast Fourier Transform.
• Developed a Matlab application for acoustic communication involving transmission of binary data in the presence of noise.
HONORS & ACTIVITIES
• Member of Institute of Electrical and Electronics Engineers (IEEE).
• Member of winning team for Soccer tournament organized by Indian Association.
• Member of winning team for Volleyball tournament organized by Indian Association.
B.E. (Electronic and Communication Engineering) May 2004
Bharathiar University, India GPA: 3.9/4.0
Projects
• Developed an intelligent building system using the 8051 microcontroller, based on a smart card concept.
• Developed a digital frequency meter.
HONORS & ACTIVITIES
• Chief organizer of technical symposium ‘Rhetoric 03’.
• Member of Training and Placement Cell.