Location: United States
Posted: November 20, 2017

Mounika

Data Modeler

ac3ew4@r.postjobfree.com

Contact: +1-732-***-****

PROFESSIONAL SUMMARY

8+ years of IT experience in Data Analysis, System Application Analysis, Data Architecture, and the development, testing, and deployment of Business Intelligence solutions using Data Warehousing and database business systems in the health, insurance, and finance industries.

Strong Experience in Data Analysis, Data Cleansing, Data Validation, Data Verification, and Identification of Data Mismatch.

Excellent understanding of the Software Development Life Cycle (SDLC), with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.

Experience in designing Data Marts and Star/Snowflake Schemas, and with Data Warehouse concepts such as ODS and MDM architecture.

Good knowledge of developing ETL programs for data extraction, transformation, and loading using Informatica.

Proficient in writing complex SQL queries, stored procedures, normalization, database design, creating indexes, functions, triggers, and sub-queries.
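The kind of complex SQL mentioned above, a correlated sub-query backed by an index, can be sketched as follows; the schema and data are hypothetical, and SQLite (via Python's sqlite3) stands in for the databases named in the resume:

```python
import sqlite3

# Illustrative schema: table and column names are hypothetical, chosen
# only to demonstrate a correlated sub-query plus a supporting index.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL);
CREATE INDEX idx_orders_customer ON orders(customer);
INSERT INTO orders (customer, amount) VALUES
  ('acme', 100.0), ('acme', 300.0), ('globex', 50.0), ('globex', 250.0);
""")

# Correlated sub-query: orders larger than their own customer's average.
rows = conn.execute("""
SELECT o.customer, o.amount
FROM orders o
WHERE o.amount > (SELECT AVG(amount) FROM orders i
                  WHERE i.customer = o.customer)
ORDER BY o.customer
""").fetchall()
print(rows)  # [('acme', 300.0), ('globex', 250.0)]
```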

Experience in troubleshooting test scripts, SQL queries, ETL jobs, data warehouse/data mart/data store models.

Strong Data Modeling experience using ER diagrams and Dimensional Data Modeling; Conceptual/Logical/Physical modeling using Third Normal Form (3NF), Star Schema modeling, and Snowflake modeling with tools such as Erwin and ER-Studio.

Actively participated in gathering requirement documents, analysis, design, and development, extensively using Informatica MDM 9.7 and Informatica 9.6.1.

Extensive knowledge in data modeling in relational and non-relational databases using industry best practice data modeling techniques and tools such as Erwin.

Created reports for the users using Tableau by connecting to multiple data sources like Flat Files, MS Excel, CSV files, SQL Server, and Oracle.

Extensively used VLOOKUPs, Pivot Tables, and Excel functions; also created Pivot charts, reports, and dashboards.
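At its core, the pivot-table work described above is a two-key aggregation; a minimal sketch in plain Python (with hypothetical sales data) shows what each pivot cell computes:

```python
from collections import defaultdict

# Hypothetical sales rows (region, quarter, amount) — illustrative only.
rows = [
    ("East", "Q1", 100), ("East", "Q2", 150),
    ("West", "Q1", 200), ("West", "Q1", 50), ("West", "Q2", 75),
]

# A pivot table is a two-key aggregation: regions down the side,
# quarters across the top, SUM(amount) in each cell.
pivot = defaultdict(lambda: defaultdict(int))
for region, quarter, amount in rows:
    pivot[region][quarter] += amount

print(dict(pivot["West"]))  # {'Q1': 250, 'Q2': 75}
```

Excel's PivotTable (or a SQL GROUP BY over two columns) performs the same aggregation; VLOOKUP is the single-key analogue, fetching one row's value by key.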

Extensively used Erwin for Reverse Engineering, Forward Engineering, Subject Areas, Domains, and Naming Standards Documents.

Configured Power BI dashboards and reports.

Experience in conducting Joint Application Development (JAD) sessions with end-users, Subject Matter Experts (SMEs), Development, and QA teams.

Experience working in teams in Agile/Scrum development environments.

Good command over Business Intelligence (BI) tool Cognos.

Excellent knowledge in creating Databases, Tables, Stored Procedures, DDL/DML Triggers, Views, functions and Indexes using T-SQL.

Proficient with creating Modules and Macros with Excel VBA in different business environments.

Experience with Data Extraction, Transformation, and Loading (ETL) using various tools such as Data Transformation Services (DTS) and SSIS.

Good knowledge of Business Intelligence, OLAP, Dimensional Modeling, Star and Snowflake schemas, and the extraction, transformation, and loading (ETL) process.

Strong working knowledge in Oracle, SQL, PL/SQL, Teradata, DB2, Flat Files, MS Access and Sybase.

Skilled in analysis, requirements gathering, design, development, implementation, and management of full-lifecycle Data Warehouse projects.

Experience in writing and executing unit, system, integration and UAT scripts in data warehouse projects.

Ability to interact with business users and to collect and analyze business/functional requirements.

TECHNICAL SKILLS

Data Modelling

Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Erwin

Databases

Oracle 11g/10g/9i, Teradata R12/R13, MS SQL Server 2005/2008/2012, Netezza, MS Access, Sybase

Reporting

Crystal Reports and SQL Server 2000/2005 Reporting Services

Database Development

T-SQL, PL/SQL

Operating System

Windows Vista/XP/2000/98/95, DOS, UNIX

ETL/data warehouse Tools

Informatica 9.5/9.1/8.6.1/8.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor)

Office Applications

MS Office (Word, Excel, Visio, PowerPoint, Project)

Web Development

HTML, XML, Visual Basic 6.0

PROFESSIONAL EXPERIENCE

TD Ameritrade, Jersey City, NJ Sep 2015 - Present

Data Modeler

The goal of this project (FOCUS) is to build a customer-centric data warehouse called the Commercial Finance Data Warehouse (CFDW) to analyze transactions across finance, marketing, risk, and consumer relations, and to develop a set of standard DW reports giving better analysis to the financial group. With clean customer data successfully implemented, the data warehouse is growing in analytical richness.

The aim of this project was to give users a single version of the truth. Sources were mainframes and flat files. This application replaces the existing legacy system, NFS (National Financial System).

Responsibilities:

Analyzed business requirements, transformed data, and mapped source data from the DB2 source system to the Teradata 15 physical data model using the Teradata Financial Services Logical Data Model in Erwin 9.6.

Created the data schema and architecture of the data warehouse for standardized data storage and access.

Designed ER diagrams and logical models (relationships, cardinality, attributes, and candidate keys) and converted them to physical data models, including capacity planning, object creation and aggregation strategies, partitioning strategies, and purging strategies per business requirements.

Worked on Netezza-to-Oracle 12c shell scripts for loading tables required by DB2 tools from Netezza into Oracle.

Designed and developed SSIS Packages to import and export data from MS Excel, SQL Server 2014 and Flat files.

Worked on data modeling using Dimensional Data Modeling, Star/Snowflake schemas, fact and dimension tables, and physical and logical data modeling.

Modeled new tables and added them to the existing data model using Erwin as part of data modeling.

Involved in complete SSIS life cycle in creating SSIS packages, building, deploying and executing the packages in both the environments (Development and Production).

Used Normalization and De-Normalization of existing tables for faster query retrieval.

Worked in the capacity of ETL Developer (Oracle Data Integrator (ODI) / PL/SQL) to Migrate data from different sources in to target Oracle Data Warehouse.

Worked on Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad, and FastExport.

Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
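A star schema of the kind described above can be sketched in miniature as follows; all table and column names here are hypothetical, chosen only to illustrate one fact table keyed to two dimensions (SQLite stands in for Teradata/Oracle):

```python
import sqlite3

# Minimal star-schema sketch: one fact table referencing two dimensions.
# Table/column names are illustrative, not taken from the project.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, cal_date TEXT);
CREATE TABLE dim_account (account_key INTEGER PRIMARY KEY, account_type TEXT);
CREATE TABLE fact_txn (
    date_key    INTEGER REFERENCES dim_date(date_key),
    account_key INTEGER REFERENCES dim_account(account_key),
    amount      REAL
);
INSERT INTO dim_date VALUES (1, '2015-09-01'), (2, '2015-09-02');
INSERT INTO dim_account VALUES (10, 'margin'), (11, 'cash');
INSERT INTO fact_txn VALUES (1, 10, 500.0), (1, 11, 120.0), (2, 10, 80.0);
""")

# Typical star-join: roll the fact table up by a dimension attribute.
total = conn.execute("""
SELECT a.account_type, SUM(f.amount)
FROM fact_txn f JOIN dim_account a USING (account_key)
GROUP BY a.account_type ORDER BY a.account_type
""").fetchall()
print(total)  # [('cash', 120.0), ('margin', 580.0)]
```

The same data in 3NF would spread transaction attributes across more normalized tables; the star form trades redundancy for simple, fast roll-up joins.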

Involved in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export through UNIX, using multiple ETL tools such as Informatica PowerCenter, and in testing and writing SQL and PL/SQL statements: stored procedures, functions, triggers, and packages.

Involved in integration and configuration of IBM Information Server components (FastTrack, Business Glossary, DataStage, Cognos) with Netezza and Oracle.

Worked with the Informatica MDM tool for master data management: wrote MDM match and merge rules, reviewed and designed the MDM/Activators setup, and performed development, unit testing, and deployment.

Environment: Erwin 9.6, DB2, Oracle 12c, SQL, Star Schema, Snowflake Schema, PL/SQL, Teradata 15, SQL Server 2014, SAS, Logical & Physical Data Models, Informatica PowerCenter 9.7, MDM, UNIX, Netezza.

BCBS, Chattanooga, TN Jan 2015 – Aug 2015

Data Modeler

The subject area includes Claims and Membership. The scope of the project is to load the Claims data from the source into a Conformed Staging Area (CSA). Claims data from January 2006 onward was loaded into the target CSA tables for the history load, and pending claims were loaded for the last 18 months. Data from the source files was loaded as-is into Teradata landing zone (LZ) tables using Informatica ETL mappings. Data from the landing zone tables was then loaded into the CSA after applying the business transformations. The CSA is a transient staging area. Data from the CSA was loaded into the EDW with minimal transformations.

Responsibilities:

Gathered and documented business requirements for existing and future business systems.

Worked as a Data Modeler to generate data models using Erwin and developed relational database systems.

Participated in JAD sessions to define business requirements and translate them into technical specifications.

Developed logical data models and physical database design and generated database schemas using Erwin.

Implemented Dimensional Modeling using Star and Snowflake Schemas, identifying facts and dimensions; performed physical and logical data modeling using Erwin.

Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.

Worked with a design team to map the data model to an XML file format used by the ACORD framework.

Experienced in developing reports in Visual Studio and deploying them to Report Manager.

Followed Agile methodology using Scrum; attended daily and weekly meetings.

Performed data modeling to differentiate between OLTP and Data Warehouse data models.

Reverse Engineered the existing ODS into Erwin.

Developed logical and physical data models that capture current state/future state data elements and data flow using Erwin.

Defined functional test cases, documented test scripts.

Designed DataStage ETL jobs to extract data from heterogeneous source systems, transform it, and finally load it into the Data Marts.

Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

Created UML diagrams including context, Business Rules Flow, and Class Diagrams.

Environment: Erwin 8.6, ER/Studio, SQL Server 2012, Oracle 10g, SQL, MS Office, MS Access, MS Visio, MS Excel, ETL

Wells Fargo, Charlotte, NC Nov 2012 – Dec 2014

Data Analyst

Wells Fargo is a diversified bank that offers a broad array of financial products and services to consumers, small businesses, and commercial clients. A Fortune 500 company, Wells Fargo has one of the most widely recognized brands in America.

Worked with the CMC (Customer Management Center) production team, which manages customer recovery for CMC-generated or impacted risk events. Events reported to the card operations event team are handled through containment, customer, and business process recovery. The main goal of the EMS team is to recover the customers or the impacted line of business and restore them to a pre-event status. Risk events are managed through the Governance, Risk Management, and Control system workflow application.

Responsibilities:

Participated in user meetings and gathered business requirements and specifications for the data warehouse design; translated the user inputs using Informatica Cloud.

Defined and documented the technical architecture of the Data Warehouse, including the physical components and their functionality.

Using SQL, helped design and build a new database for the company's financial data.

Conducted JAD sessions with the SMEs, stakeholders, and other management teams to finalize the User Requirement Documentation.

Designed the ETL architecture to process a large number of files and created high-level and low-level design documents.

Extensively worked on flat files and mainframe files; created UNIX shell scripts for FTP, for generating list files to load multiple files, and for archiving the files after completion of the loads.

Designed star schema with dimensional modeling of the data warehouse/OLAP applications by applying required facts and dimensions.

Involved in the creation of Informatica mappings to extract data from Oracle and flat files and load it into the staging area.

Created layouts for fixed-width and .csv files in Informatica Cloud, as the sources were fixed-width/.csv files.

Responsible for testing business reports developed in Business Objects XI R2.

Responsible for ensuring OLTP feeds to the enterprise-wide data warehouse and other subscribers and for maintaining their quality.

Developed test data while preparing the source-to-target mapping document (Salesforce.com fields).

Worked on data mapping, data cleansing, program development for loads, and data verification of converted data against legacy data.

Designed and Administered Teradata Scripts, Tables, Indices and Database Objects.

Created entity-relationship diagrams, functional decomposition diagrams and data flow diagrams.

Created reports for the users using Tableau by connecting to multiple data sources like Flat Files, MS Excel, CSV files, SQL Server and Oracle.

Defined best practices for Tableau report development.

Used Python scripts to update content in the database and manipulate files.

Experience in using Python for creating graphics, data exchange, and business logic implementation.

Experience with the Extraction, Transformation, and Loading (ETL) process using SSIS.

Worked with the DBA on enhancements to the physical DB schema and coordinated in creating and managing tables, indexes, tablespaces, triggers, and privileges.

Defined and represented entities, attributes, and joins between the entities.

Studied the PL/SQL code developed to relate the source and target mappings for my testing needs.

Developed PL/SQL programs, stored procedures for data loading and data validations.

Implemented PL/SQL scripts in accordance with the necessary Business rules and procedures.

Extensively developed PL/SQL Procedures, Functions, Triggers, and Packages.

Possess expertise in relational database concepts, stored procedures, functions, triggers, and scalability analysis. Optimized PL/SQL queries using tuning techniques such as hints, parallel processing, and optimizer rules.

Responsible for SQL tuning and optimization using Analyze, Explain Plan and optimizer hints.
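The index-driven tuning described in the two bullets above can be illustrated with a minimal sketch; SQLite's EXPLAIN QUERY PLAN stands in for Oracle's Explain Plan here, and the table and index names are hypothetical:

```python
import sqlite3

# Sketch of index-driven query tuning: inspect the plan before and
# after adding an index on the filtered column. SQLite's EXPLAIN
# QUERY PLAN is used as a stand-in for Oracle's Explain Plan.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, acct TEXT, amt REAL)")

def plan(sql):
    # The fourth column of each EXPLAIN QUERY PLAN row is the detail text.
    return " ".join(r[3] for r in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT amt FROM txn WHERE acct = 'A1'"
before = plan(query)                                   # full table scan
conn.execute("CREATE INDEX idx_txn_acct ON txn(acct)")
after = plan(query)                                    # index search

print(before)  # e.g. "SCAN txn"
print(after)   # e.g. "SEARCH txn USING INDEX idx_txn_acct (acct=?)"
```

The same before/after discipline applies with Oracle's `EXPLAIN PLAN FOR ...` and `DBMS_XPLAN.DISPLAY`: confirm the optimizer actually picks up the new index rather than assuming it does.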

Responsible for migrations of the code from Development environment to QA and QA to Production.

Documented the existing mappings as per the design standards followed in the project.

Carried out defect analysis and fixed bugs raised by the users.

Environment: SQL, ETL, Teradata, Python, Oracle, Tableau, PL/SQL, SSIS, SQL Server 2008, Informatica Cloud

Sonata Software Limited, Hyderabad, India May 2010 – Oct 2012

Data Analyst

Created the data warehouse to align the HR strategy to the overall business strategy. It can present an integrated view of the workforce and help in designing the retention scheme, improve productivity and curtail costs. This included extraction of data from various platforms, metadata management, and integration of mapping and loading of target tables.

Responsibilities:

Involved in defining system requirements, designing & prototyping, developing, testing, training and Implementation of the applications.

Wrote T-SQL queries, created dynamic Stored Procedures by using input values.

Wrote standard and complex T-SQL Queries to perform data validation and graph validation to make sure test results matched back to expected results on business requirements.
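Validation queries of the sort described above typically compare source and target row counts and flag missing keys; a minimal sketch, with hypothetical `src`/`tgt` tables and SQLite standing in for SQL Server:

```python
import sqlite3

# Minimal data-validation sketch: compare source vs. target row counts
# and list keys missing from the target. Table names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER PRIMARY KEY, val TEXT);
CREATE TABLE tgt (id INTEGER PRIMARY KEY, val TEXT);
INSERT INTO src VALUES (1,'a'), (2,'b'), (3,'c');
INSERT INTO tgt VALUES (1,'a'), (3,'c');
""")

# Row-count reconciliation: a mismatch signals a load problem.
counts = conn.execute(
    "SELECT (SELECT COUNT(*) FROM src), (SELECT COUNT(*) FROM tgt)"
).fetchone()

# Keys present in the source but absent from the target.
missing = conn.execute(
    "SELECT id FROM src WHERE id NOT IN (SELECT id FROM tgt)"
).fetchall()

print(counts, missing)  # (3, 2) [(2,)]
```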

Worked on generating reports using Crystal Reports, SQL Server Reporting Services and Excel spreadsheet.

Worked with the development team in writing functions in Visual Basic for upload/download functionality, data transfer, and migration.

Wrote Packages, Stored procedures, and functions for accessing, inserting, modifying and deleting data in the database from GUI.

Developed various charts and graphs like bar charts (side-by-side, stacked), line graphs, and pie charts using Microsoft Excel.

Responsible for query optimization using MS SQL tools such as Query Analyzer, SQL Profiler, and the Index Tuning Wizard.

Worked with handling errors and validating user data using VBA.

Used SQL for Querying the database in UNIX environment.

Loaded staging tables on Teradata and further loaded target tables on SQL Server via DTS transformation packages.

Used SQL Server 2005 tools like Management Studio, Query Editor, Business Intelligence Development Studio including SSIS and SSRS.

Wrote data conversion programs using SQL*Loader and PL/SQL.

Developed and maintained many batch scripts in UNIX shell for tasks such as scheduled data loading, database creation, tuning, and backup and recovery.

Environment: T-SQL, Oracle10g, SQL*Loader, XSD, MS Excel, SSIS, SSRS, SQL Server 2005, UNIX, Teradata

Apollo Hospitals, Hyderabad, India July 2008 – April 2010

Data Analyst

This project involved developing online tools and web applications for the financial advisors of the company. The online tools consist mainly of generating online reports to give the advisors a better understanding of the investors' financial position. Web applications were designed to let advisors access and modify the data of the investors, accounts, assets, and transactions from the GUI, and the database was designed to handle these requests from the back end.

Responsibilities:

Interacted with the end-user (client) to gather business requirements.

Converted and loaded data from flat files to temporary tables in Oracle database using SQL*Loader.

Extensively used PL/SQL in writing database packages, stored procedures, functions and triggers in Oracle 9i.

Developed SQL scripts involving complex joins for reporting purposes.

Fine-tuned SQL queries and PL/SQL blocks for maximum efficiency and fast response using Oracle Hints and Explain Plans.

Used Teradata as a source and a target for a few mappings.

Migration of MS Access to SQL Server 2005.

Developed Functional and Regression Testing scenarios based on XML and XSD schema validations.

Loaded data from the MS Access database to SQL Server 2005 using SSIS (creating staging tables and then loading the data).

Developed various SQL scripts and anonymous blocks to load data into SQL Server 2005.

Created procedures, functions, and views in SQL Server 2005.

Developed ad hoc reports using Crystal Reports XI for performance analysis by business users.

Exported reports into various formats like XML, PDF, HTML, and Excel using Crystal Reports XI.

Involved extensively in Unit testing, integration testing, system testing and UAT.

Responsible for report deployment through SSRS on the Report Server; tuned the stored procedures and UDFs for the reports.

Participated in weekly end-user meetings to discuss data quality and performance issues, ways to improve data accuracy, new requirements, etc.

Environment: Agile, Oracle 9i, PL/SQL, TOAD 9.5, DB2, Crystal reports 11, Teradata, SQL Server 2005, SSIS, SSRS


