Preethi Bommaraboyina
************@*****.***
SUMMARY
Senior Data Modeler/Analyst with 7.5 years of experience in data analysis and in developing conceptual and logical models and physical database designs for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems.
Experienced in evaluating data sources, with a strong understanding of data warehouse/data mart design, ETL, BI, OLAP, and client/server applications.
Extensive experience in relational and dimensional data modeling, creating logical and physical database designs and ER diagrams using data modeling tools such as Erwin and ER/Studio.
Excellent SQL programming skills; developed stored procedures, triggers, functions, and packages using SQL and PL/SQL, and applied performance tuning and query optimization techniques in transactional and data warehouse environments.
Implemented an automated maintenance management system with tracking capability.
Created Tableau Dashboards with interactive views, trends and drill downs along with user level security.
Experience in using SSIS tools like Import and Export Wizard, Package Installation, and SSIS Package Designer.
Developed and extended SSAS cubes, dimensions, data source views, and SSAS data mining models, and deployed and processed SSAS objects.
Conducted Joint Application Development (JAD) sessions for requirements gathering, analysis, and design, and Rapid Application Development (RAD) sessions to converge early on a design acceptable to the customer and feasible for the developers, limiting the project's exposure to the forces of change.
Delivered a series of metrics that improved supplier performance management.
Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
Proficient in identifying facts and dimensions, star schema, snowflake schema, and ODS architecture for modeling a data warehouse using relational, dimensional, and multidimensional modeling.
Experienced in interpreting the HIG proprietary ETL framework metadata to understand the current-state ETL implementation.
Researched Python programming use cases and their efficiency.
Experienced in forward and reverse engineering processes.
Good experience in writing functional specifications and translating business requirements into technical specifications; created, maintained, and modified database design documents with detailed descriptions of logical entities and physical tables.
Strong in source-to-target data mapping, standardization documents, slowly changing dimension mapping creation, star/snowflake schema mapping creation, RDBMS, building data marts, and metadata management.
Developed Tableau visualizations and dashboards using Tableau Desktop, and built Tableau workbooks from multiple data sources using data blending.
TECHNICAL SKILLS
Data Modeling
Erwin r9.5/8.x/7.x, ER/Studio 9.7/9.0/8.0/7.x; star-schema modeling, snowflake-schema modeling, fact and dimension tables, pivot tables
Databases
Oracle 12c/11g/10g/9i, Teradata R12/R13/R14, MS SQL Server 2005/2008/2012, MS Access
ETL/Data warehouse Tools
Informatica, SAP Business Objects XIR3.1/XIR2, Web Intelligence, and Tableau
Testing and defect tracking Tools
HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center), Requisite Pro, Visual SourceSafe
Operating System
Windows, Unix, Sun Solaris
Tools & Software
Teradata SQL Assistant, TOAD, MS Office, BTEQ.
Programming Languages
SQL, PL/SQL, UNIX shell Scripting
EDUCATION
Bachelor of Technology, Electronics & Communication, JNTU, India
PROFESSIONAL EXPERIENCE
Senior Data Modeler/Analyst
Transamerica, Plano, TX Jan 15 - Present
The Transamerica Corporation is an American private holding company for various life insurance companies and investment firms. Transamerica fulfilled its promises to customers, paying over $6.5 billion in benefit claims, including return of premiums paid. The goal of this project was to maintain account information for thousands of global enterprises and manage it efficiently. Helped conduct meetings and interviews with end users to gather requirements and convert them into a requirements document for the development and testing teams.
Responsibilities:
Analyzed the business requirements of the project by studying the Business Requirement Specification document.
Created logical and physical designs in Erwin.
Created FTP connections and database connections for the sources and targets.
Built and published customized interactive reports and dashboards, with report scheduling, using Tableau Server.
Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
Maintained security and data integrity of the database.
Developed, deployed, and monitored SSIS Packages.
Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools including PL/SQL, SQL*Plus, and SQL*Loader, and handled exceptions.
Designed the Logical Model into Dimensional Model using Star Schema and Snowflake Schema.
Designed Physical database in the form of Star Schema for Data Mart Design.
Developed Tableau visualizations and dashboards using Tableau Desktop.
Generated DDL Scripts from Physical Data Model using technique of Forward Engineering in Erwin.
Identified objects and its relationship from existing database, which can be used as a reference for data mart, then transformed those objects into physical model using Reverse Engineering in Erwin.
Involved in database development by creating Oracle PL/SQL Functions, Procedures and Collections.
Expertise in creating perspectives, partitions, design aggregations, and cell-level security in cubes, and dashboard reports to support end users, using SSAS.
Worked extensively with XML schema generation.
Created DFD (Data Functional Design) artifacts incorporating the Visio process flow, the source-to-target mapping document, and all specifications for proper ETL implementation.
Used Python programming to develop working, efficient networking tools for the company.
Participated in Performance Tuning using Explain Plan and Tkprof.
Extensively used Erwin for developing data models using star schema methodologies.
Created Unix Shell Scripts for automating the execution process.
Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
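The forward-engineering step above (turning a physical data model into DDL scripts) can be illustrated with a minimal Python sketch; in practice Erwin's built-in DDL generation was used, and the table and column names here are hypothetical:

```python
# Minimal sketch of forward engineering: rendering CREATE TABLE DDL
# from a physical-model description. The model dict is hypothetical.
model = {
    "CUSTOMER_DIM": [
        ("CUSTOMER_KEY", "NUMBER(10)", "PRIMARY KEY"),
        ("CUSTOMER_NAME", "VARCHAR2(100)", "NOT NULL"),
        ("EFFECTIVE_DATE", "DATE", "NOT NULL"),
    ],
}

def generate_ddl(model):
    """Render a CREATE TABLE statement per table in the model."""
    statements = []
    for table, columns in model.items():
        cols = ",\n".join(
            f"    {name} {dtype} {constraint}".rstrip()
            for name, dtype, constraint in columns
        )
        statements.append(f"CREATE TABLE {table} (\n{cols}\n);")
    return "\n\n".join(statements)

print(generate_ddl(model))
```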
Environment: PL/SQL, SQL, Erwin r7.3, Informatica, Visio, Oracle, MS Word, Tableau, SQL Server, Python, Teradata SQL Assistant 12.0, ODS, TOAD, CA Erwin 7.0
Senior Data Modeler/Analyst
Bravo Health, Baltimore, MD Mar 14 – Dec 14
Bravo is a health care company focused on improving health conditions; it provides occupational medicine, urgent care, physical therapy, and wellness center services. The project involved analyzing the data and creating dashboards and charts using calculated fields, parameters, calculations, and groups.
Responsibilities:
Provided clear and concise documentation regarding requirements management plans, functional requirements, and supplemental requirements.
Worked with the stakeholders to elicit the business and system requirements and imported them into Rational RequisitePro.
Analyzed the scope of the project and identified the major actors involved.
Involved in requirement gathering and database design and implementation of star-schema, snowflake schema/dimensional data warehouse using Power Designer.
Organized and conducted extensive JAD sessions with users to identify, understand and document requirements.
Provided key initiatives in working with users in defining project and system requirements.
Built Tableau workbooks from multiple data sources using data blending.
Performed preliminary analysis on multiple origination and servicing systems and assisted the sourcing team with the creation of Data Requirement Documents.
Created SSIS packages for File Transfer from one location to the other using FTP task.
Worked in Building Tabular Models using SSAS.
Extensively used Power Designer to create Data Models for relational and dimensional modeling.
Developed and co-authored the vision document that defined the primary goals and objectives of the project.
Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
Used SQL assistant tool extensively to profile data and check mapping accuracy.
Performed data mapping between source systems and target systems, performed logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data.
Tested the ETL process for both pre-validation and post-validation of data; tested the messages published by the ETL tool and the data loaded into various databases.
Conducted internal and final DFD reviews with the ETL team (EIS), as well as FSD reviews.
Participated in ETL Functional Specifications Document (FSD) reviews and responded to questions related to the mappings, modeling and processing logic flows. Supported ETL with the development process.
Delivered a series of metrics that improved supplier performance management.
Worked with the Business and the ETL developers in the analysis and resolution of data related problem tickets and other defects.
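The source-to-target mapping work above can be sketched as a small completeness check in Python; the mapping entries, source fields, and rule text are hypothetical stand-ins for a real S-T mapping document:

```python
# Hypothetical sketch of a source-to-target mapping check: verify every
# target column in the mapping document has both a source column and a
# transformation rule before handing the document to the ETL team.
st_mapping = [
    {"source": "src.member_id", "target": "dw.MEMBER_KEY", "rule": "surrogate key lookup"},
    {"source": "src.dob",       "target": "dw.BIRTH_DATE", "rule": "direct move"},
]

def unmapped_targets(mapping):
    """Return target columns whose mapping is missing a source or rule."""
    return [m["target"] for m in mapping if not (m.get("source") and m.get("rule"))]

# An empty result means the mapping document is complete.
print(unmapped_targets(st_mapping))
```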
Environment: Power Designer, MS Office (MS Word, MS Access, MS Excel, MS PowerPoint, MS Visio), .NET, Teradata, Business Objects, Sybase, RUP, UML, SQL, SAS, Tableau, Ab Initio ETL, SWOT analysis, GAP analysis, LoadRunner.
Data Modeler/Analyst
Dean Health, Madison, WI Jan 13 – Feb 14
Dean Health Plan is the insurance services subsidiary of Dean Health System and SSM Health Care. Today, Dean Health Plan is one of the largest and most diversified HMOs in the Midwest, and maintains a position of leadership in insurance services through a physician-led integrated health system that improves the health status of its members and delivers a superior level of service and care.
Responsibilities:
Monitored and improved identified data quality issues using internal data tools.
Created and ran reports, entered data as required, and conducted research inquiries as directed.
Worked with Clients to gather and document business requirements, conduct extensive data exploration, author technical specifications and coordinate the release and acceptance testing of new development.
Worked productively with clinical areas and call center; collaborated with cross-functional teams to identify the best effective methods to gather the required data for client reports and queries.
Collaborated with product development and operational engineers to coordinate the implementation and QA of technical specifications.
Built reporting tools, analyzed large data sets, and developed analytical regression models.
Used complex SQL queries for validation and verification of data structure; generated reports using complex joins, sub-queries, and aggregate functions.
Conducted GAP analysis to assess the variance between system capabilities and business requirements.
Developed multi-dimensional cubes, dimensions using SQL server analysis services (SSAS).
Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
Created SSIS packages for File Transfer from one location to the other using FTP task.
Built Tableau workbooks from multiple data sources using data blending.
Involved in defining the source to target data mappings, business rules, and business and data definitions.
Created Report-Models for ad-hoc reporting and analysis.
Extensively designed data mapping and filtering, consolidation, cleansing, integration, ETL, and customization of data mart.
Spearheaded the design and development of SQL queries to collect and analyze information toward completion of data reporting requests.
Performed small enhancements (data cleansing/data quality).
Implemented automated maintenance management system with ability to track.
Performed data analysis on the existing data warehouse.
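Validation queries with joins, sub-queries, and aggregates, as described above, can be sketched as follows; sqlite3 stands in for the production database here, and the table and column names are hypothetical:

```python
# Illustrative sketch of a validation/reporting query combining a join
# and aggregate functions, of the kind used to verify data structure.
# sqlite3 (Python stdlib) stands in for the production database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claim  (claim_id INTEGER PRIMARY KEY, member_id INTEGER, amount REAL);
    CREATE TABLE member (member_id INTEGER PRIMARY KEY, plan TEXT);
    INSERT INTO member VALUES (1, 'HMO'), (2, 'PPO');
    INSERT INTO claim  VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# Total claim count and amount per plan -- join plus aggregates.
rows = conn.execute("""
    SELECT m.plan, COUNT(*) AS n_claims, SUM(c.amount) AS total
    FROM claim c
    JOIN member m ON m.member_id = c.member_id
    GROUP BY m.plan
    ORDER BY m.plan
""").fetchall()
print(rows)  # [('HMO', 2, 200.0), ('PPO', 1, 50.0)]
```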
Environment: PL/SQL, SQL, Erwin r7.3, Informatica, Visio, Oracle, MS Word, SQL Server, Tableau, Teradata SQL Assistant 12.0, ODS, TOAD, CA Erwin 7.0
Data Modeler/Analyst
Flagstar Bank, Troy, MI Oct 11 – Dec 12
Flagstar Bank, headquartered in Troy, Michigan, is a full-service bank with 100 branches in communities across Michigan. Chartered in 1987 as a federal savings bank, today Flagstar has assets of $11.6 billion. We are the largest banking company headquartered in Michigan, a top-tier mortgage originator in the country, and one of the nation's top 10 largest savings banks.
Responsibilities:
Interacted with End user community to understand the business requirements and in identifying data sources.
Prepared high-level logical data models and BRDs (business requirements documents) containing the essential business elements and descriptions of the relationships between entities.
Played an active role as a member of Project team to provide business data requirements analysis services, producing logical and Physical data models.
Designed the data marts using Ralph Kimball's dimensional data mart modeling methodologies in Erwin.
Designed different types of star schemas in Erwin, with various dimensions such as time, services, and customers, and fact tables.
Worked with end users to collect business data quality rules and worked with the development team to establish technical data quality rules.
Identified the Entities and relationships between the Entities to develop a logical model and later translated the model into physical model.
Coordinated with DBAs and generated SQL code from the data models.
Worked closely with the ETL SQL Server Integration Services (SSIS) Developers to explain the Data Transformation.
Supported UAT (User Acceptance Testing) by writing SQL queries.
Created SQL tables with referential integrity and developed SQL queries using SQL Server and Toad.
Managed day-to-day operational and tactical aspects of maintenance programs and capital improvement projects to ensure profitable and successful manufacturing operations.
Performed extensive data analysis and data validation on Teradata.
Worked with the reporting analyst and reporting development team to understand reporting requirements; involved in generating ad hoc reports using Crystal Reports 9.
Designed and Developed Oracle database Tables, Views, Indexes with proper privileges and Maintained and updated the database by deleting and removing old data.
Created and maintained Logical and Physical models for the data mart, which supports the Credit, Fraud and Risk Retail reporting for credit card portfolio.
Created the conceptual model for the data warehouse with emphasis on insurance (life and health), mutual funds and annuity using EMBARCADERO ER Studio data modeling tool.
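The referential-integrity design mentioned above can be illustrated with a small Python sketch; sqlite3 stands in for SQL Server, and the bank tables are hypothetical:

```python
# Hedged sketch of tables with referential integrity: an account row
# must reference an existing branch. sqlite3 stands in for SQL Server;
# the table names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # sqlite3 enforces FKs only when enabled
conn.executescript("""
    CREATE TABLE branch (branch_id INTEGER PRIMARY KEY, city TEXT NOT NULL);
    CREATE TABLE account (
        account_id INTEGER PRIMARY KEY,
        branch_id  INTEGER NOT NULL REFERENCES branch(branch_id)
    );
    INSERT INTO branch  VALUES (1, 'Troy');
    INSERT INTO account VALUES (100, 1);
""")

# Inserting an account for a nonexistent branch violates the constraint.
violated = False
try:
    conn.execute("INSERT INTO account VALUES (101, 99)")
except sqlite3.IntegrityError:
    violated = True
print("constraint enforced:", violated)
```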
Environment: Microsoft Windows Vista/UNIX, ER/Studio, Erwin 7.x, Oracle 8i, SQL, MS Excel, SSIS, DB2, TOAD from Quest Software.
Data Modeler/Analyst
HDFC Bank, India May 10 – Sep 11
Responsibilities:
Identified goals and objectives of projects.
Created project plan by gathering the requirements from the management.
Used Erwin Data Modeler tool for relational database and dimensional data warehouse designs.
Gathered requirements from users by conducting a series of meetings with the business system users to define reporting needs.
Worked on creating new tables and columns in the Basel data mart.
Extensively used Star Schema methodologies in building and designing the logical data model into Dimensional Models.
Worked with DBA group to create Best-Fit Physical Data Model from the Logical Data Model using Erwin Forward engineering.
Created SSIS packages for File Transfer from one location to the other using FTP task.
Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.
Created data definitions and the metadata repository.
Explored data to identify its model role and measurement level; verified data for missing values and incorrect entries, then cleaned and structured the data into the required format.
Validated the model and ran alternate scenarios/sensitivity analysis.
Implemented the model and communicated it to end-users and stakeholders.
Created stored procedures using PL/SQL and tuned the databases and backend process.
Performance tuning of the database, which includes indexes, and optimizing SQL statements, monitoring the server.
Used CA Erwin Data Modeler for data modeling (data requirements analysis, database design, etc.) of custom-developed information systems, including databases for transactional systems and data marts.
Worked on information systems (FI systems) and maintained security and data integrity of the database.
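The missing-value and format-cleaning step described above can be sketched in Python; the record layout and field names are hypothetical:

```python
# Minimal sketch of the verify/clean step: flag missing or malformed
# dates and normalize parseable ones to ISO format. Field names are
# hypothetical; the accepted input formats are an assumption.
from datetime import datetime

records = [
    {"cust_id": "1001", "open_date": "2010-06-15"},
    {"cust_id": "1002", "open_date": ""},            # missing value
    {"cust_id": "1003", "open_date": "15/07/2010"},  # nonstandard format
]

def clean(records):
    """Split records into cleaned rows and rows needing manual review."""
    good, bad = [], []
    for r in records:
        raw = r.get("open_date", "")
        for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
            try:
                fixed = datetime.strptime(raw, fmt).date().isoformat()
                good.append(dict(r, open_date=fixed))
                break
            except ValueError:
                continue
        else:
            bad.append(r)
    return good, bad

good, bad = clean(records)
print(len(good), "cleaned,", len(bad), "flagged")  # 2 cleaned, 1 flagged
```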
Environment: Erwin, MS Excel, Oracle 9i, PL/SQL, MS Visio, SQL*Loader, UNIX, Windows 7, Oracle 10g, SQL Server, UML, OBIEE, Informatica 8.6.
Data Analyst
Amnipro Inc., India Jun 09 – Apr 10
Responsibilities:
Analyzed the business requirements of the project by studying the Business Requirement Specification document.
Created logical and physical designs in Erwin.
Created FTP connections and database connections for the sources and targets.
Maintained security and data integrity of the database.
Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools including PL/SQL, SQL*Plus, and SQL*Loader, and handled exceptions.
Involved in database development by creating Oracle PL/SQL Functions, Procedures and Collections.
Worked extensively with XML schema generation.
Participated in Performance Tuning using Explain Plan and Tkprof.
Extensively used Erwin for developing data models using star schema methodologies.
Created Unix Shell Scripts for automating the execution process.
Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
Environment: PL/SQL, SQL, Erwin r7.3, Informatica, Visio, Oracle, MS Word, SQL Server, Teradata SQL Assistant 12.0, ODS, TOAD, CA Erwin 7.0