
Professional Summary

General:-

* **** ***** ** ********** as Sr. Data Analytics Lead/Technical (Data/Big Data) Lead/Sr. BI Business Systems Analyst on Business Intelligence & Data Science projects, with specialties in Requirements Gathering, Requirements Documentation, Scope & Vision Documentation, Testing, and Data Analysis/Mapping/Modeling

Extensively used SDLC methodologies, including Waterfall, Agile Scrum, and the Rational Unified Process (RUP), for managing iterative software development

6+ years of data analysis, project management, and business analysis experience on Enterprise Data Warehousing (EDW/BI) projects

Extensive experience with and understanding of Web Technology, ERP systems, Finance, Investment, Property & Casualty Insurance, E-commerce, and Service-Oriented Architecture (SOA)

Data Analysis:-

Proficient in analyzing Data Quality Issues - Data Redundancy, Data Consistency, Data Completeness, Metadata Analysis, and Data Schema Study

Prepared Data Mapping Documents comprising Source Input Definitions, Target/Output Details, Business Transformation Rules, and Data Quality Requirements

Proficient in executing SQL queries for data analysis and preparing Data Profile Reports (a profiling sketch follows below)
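As a hedged illustration of the profiling queries referred to above, a minimal sketch against a hypothetical staging table (table and column names are illustrative, not from any actual engagement):

    -- Profile one column: volume, cardinality, null rate, and duplicates
    SELECT
        COUNT(*)                                 AS total_rows,
        COUNT(DISTINCT policy_number)            AS distinct_policies,
        SUM(CASE WHEN policy_number IS NULL
                 THEN 1 ELSE 0 END)              AS null_policy_numbers,
        COUNT(*) - COUNT(DISTINCT policy_number) AS potential_duplicates
    FROM stg_policy;

Results of queries like this would feed the completeness and redundancy findings in a Data Profile Report.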

Performed Data Mining to determine Data Mappings between various systems

Ability to interpret/build Data Models - Logical Data Models, Relational (Normalized) Models, Dimensional (Star and Snowflake) Models, Metadata Schemas, and Data Validation

Strong understanding of OLTP, OLAP, ETL, Data Integration, Data Migration, Data Conversion and Distribution, DataFlux, Kimball & Inmon DW Models, and Data Acquisition

Proficient in analyzing data through Data Mining, Statistical Analysis, and Forecasting, and in Quality Control for Manufacturing and Production, applying Statistical Analysis (Control Charts), SPC, Six Sigma, ITIL, Lean Manufacturing, Logistics, Transportation, Simulation, DMAIC, DOE, ISO standards, Business Optimization, Process Flow, FMEA, TQM, DFM, and 5S (a control-limit sketch in SQL follows)
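As a hedged sketch of the control-chart computation mentioned above, 3-sigma control limits can be derived directly in SQL; the measurement table is hypothetical:

    -- X-bar control limits: mean +/- 3 standard deviations
    -- (STDDEV is Oracle/Netezza/Teradata syntax; SQL Server uses STDEV)
    SELECT
        AVG(measured_value)                              AS center_line,
        AVG(measured_value) + 3 * STDDEV(measured_value) AS upper_control_limit,
        AVG(measured_value) - 3 * STDDEV(measured_value) AS lower_control_limit
    FROM production_measurement;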

Testing:-

Broad knowledge of reviewing Test Procedures, creating Test Plans and Test Scenarios, and defining Test Cases for both System & Integration Testing

Conducted User Acceptance Testing (UAT), including test planning, team assignment, execution, documentation, and defect reporting

Worked closely with QA team in planning Unit Test, Regression Test, Performance Test and System Integration Test

Expert in using LoadRunner for load testing, Quality Center/Test Director for test management, and QTP for regression and functional testing

Requirements Documentation:-

Business Requirements Document (BRD), Functional Requirements Document (FRD/FSD)

Data Movement Specification (DMS/DMD), Data Profile Report (DPR)

Gap Analysis Document (GAD), Root Cause Analysis Document (RCA)

Run Book for daily Tivoli/ Informatica Job Run

Test Plan Document (TPD)

Technical Skills:

Methodologies: RUP, Waterfall, Agile (Scrum, XP)

Testing Tools: Mercury Test Director, HP Quality Center, JIRA

Data Modeling Tools: MS Visio, Erwin, ER/Studio

Business Modeling Tools: Power Designer, MS Visio

Project/Content Management: MS Project, Requisite Pro, MS Excel, MS SharePoint, Project Portfolio Management, ServiceNow

Defect Management Tools: Rational ClearQuest, QTP, Mercury Test Director

Databases: Netezza, Oracle, MS Access, MS SQL Server, DB2, Teradata, PL/SQL

Statistical Analysis Tools: Minitab, SPSS, SAS, R

Data Mining (Predictive Analysis): WEKA, RapidMiner, Tracies, iDA, SAS

Risk Management (Mathematical Modeling) Tools: Monte Carlo simulation, Crystal Ball, What-If Analysis, Sensitivity Analysis, OptQuest

File Types: .xls, .xlsx, flat file, XML, X12 (EDI), JSON, JAR

Operating Systems: Windows 2000/XP/Vista, MS-DOS, Mainframe

General: MS Office, MS Project, Lotus Notes, Excel macros, VBA, HTML, PDF, Adobe Photoshop, Excel pivot tables and charts, VLOOKUP, HLOOKUP

Reporting Tools: Oracle Business Intelligence, Tableau, MicroStrategy, SSRS

Referential Data/Revenue Allocation: Referential Data Interface (RDI)/ART rules

Servers: FTP, SFTP, EFT, Windows, Linux

ETL Tools: Informatica PowerCenter, Tivoli, SSIS, Map2Code, ExtractMapper

Big Data Stacks/AWS (Apache Hadoop 2): Flume, Sqoop, HDFS, MapReduce (Java), Pig Latin, HiveQL, HBase, Spark, Scala, Oozie, YARN, Apache Phoenix, CentOS, Shell, Linux, Java API, GitHub; AWS: Redshift, AWS Glue, EC2, DynamoDB, S3, ElastiCache

Professional Experience

First American Insurance (P&C) through MindTree, Santa Ana, CA

Data Analytics Lead/Product Owner/Asst. TDM Feb’2017 – Present

The project focused on legacy data conversion (renewal conversion and legacy data migration) from an AS/400 mainframe to the Duck Creek Data Insights data warehouse, data mart, and tabular model, consumed by Power BI & SSRS reporting models (BI models). The project covered LOBs such as Homeowner, Dwelling Fire, Resident Liability, Condominium, Personal Umbrella Liability, Commercial Fire, Commercial Blanket, Dwelling Property 1, and Personal Auto policies.

Responsibilities:

Performed source-system data analysis, including SIS ERD review, data profiling, and business data requirements

Performed target-system data analysis, including Duck Creek Data Insights schema analysis, referential integrity analysis, reference data analysis, and key analysis

Interacted with business users to gather data requirements such as lists of tables, attributes, and reference tables

Documented the ERD, the source-to-target mapping document (technical specifications), and the business requirements document

Conducted requirements-gathering sessions on Duck Creek Data Insights out-of-the-box (OOB) functionality

Gathered SSRS/Power BI reporting requirements against the Duck Creek Data Insights platform: Data Warehouse, Data Mart, and Tabular Model

Gathered requirements for Duck Creek UI implementation for Policy & Claim Modules

Worked closely with executives to discuss Duck Creek Insights implementation strategies

Worked on system integrations, such as Policy to LexisNexis, Verisk, and CLUE, through SOAP web services APIs

Worked closely with onsite and offshore development teams, the QA team, and business users for UAT

Performed ETL activity by using Map2Code utility in Duck Creek systems

Extensively used Duck Creek UI, Data Insights, SQL Server, Visio, SharePoint, TFS, ERwin, ER/Studio, Map2Code, Extract Mapper, Excel, MS Access, ViewPoint, SSRS, Power BI, DAX, SSIS, Hadoop, Python, R, XML, XSLT, etc.

Extensively used the DQ features of the Map2Code console for data quality

Managed offshore and onsite resources

Served as delivery manager, ensuring all deliverables were completed on schedule

Worked on project budgets and infrastructure

Worked as product owner and conducted daily scrum meetings to gather status updates from the entire team

Mercury Insurance Group, Brea, CA

Sr. BI Systems Analyst/Asst. TDM May’15 – Dec’16

The projects at Mercury Insurance Group related to Guidewire Property & Casualty insurance (LOBs: PPA, BA, HO, CEQ) and covered electronic submissions, internal data integrations, external data feeds, and various BI models, such as MicroStrategy & Tableau reporting for Policy Center and Claim Center. The work centered on e-submits data integrations to the DMV in 13 states, with daily/monthly reporting. The Current Carrier LexisNexis/e-Lienholder data integration was another major project, in which MIG transmitted daily incremental data for all policy transaction types.

In addition, there were projects for other data feeds with Agero (roadside assistance) and EARS (Explore) for driving-violation data transmission, and a reinsurance project with Hartford Steam Boiler.

Responsibilities:

Worked on incident tickets daily to support Mercury's day-to-day business

Interacted with the business director, SMEs, the underwriting team, the claim center team, and CR daily to resolve production issues

Daily Activities:

-Scrum meetings (Agile methodology), product ownership, status updates from team members, incident ticket analysis, troubleshooting issues per SLA, defect/enhancement analysis, and opening defects in Quality Center

-Requirements documentation (BRD, FRD, SMT), production support, data analytics

-Managing development cycles with remote and onsite team members, defining the scope of issues and managing/tracking them, and documenting change requests

-Monitoring: outbound and inbound processes for all integrations and reporting daily, and coordinating with cross-functional departments for process improvements

Ownership of Data Integration - Guidewire Policy Center and Claim Center with external vendors

Policy Center data feeds (transactions: Property & Casualty - personal, commercial, and homeowner LOBs)

Involved in Enhancement & Maintenance

-E-submits data feed (in-force policies as well as other transactions) in 13 states on a daily/monthly basis

-Hartford Steam Boiler (reinsurance) data feed, all states, LOB ‘Home Owner’

-Current Carrier LexisNexis data feed for Personal Auto, all states and LOBs

-ChoicePoint e-Lienholder data feed for BA and HO, all states and LOBs

-Agero (roadside assistance) data feed, all states and LOBs; EARS (Explore) data feed, all states and LOBs

Claim Center data feeds (transactions: Property & Casualty)

Involved in Enhancement & Maintenance

-CLUE/CLUP (LexisNexis) data feed, all states – LOBs Personal Auto, Property, and Commercial Auto

-CLUE/CLUP (APLUS, ISO, and INNOVIS) data feed, all states – LOBs Personal Auto, Property, and Commercial Auto

-Medicare (ISO) monthly and quarterly reports; CONFIRMIT data feed for Business Auto and Homeowners

Ownership of MSTR Reporting:

Policy Center: Involved in Enhancement & Maintenance

-PC01 Activity Tracking Report, PC02 Driver Filing Report, PC03 Inforce Counts Report, PC04 Withdrawn/Not-Taken Policies Report, NY renewal conversion reports, and all other Policy Center reports

Claim Center: Involved in Enhancement & Maintenance

-Catastrophe Report, No Financial Activity Report, PIP Report, Medicare Report, BME 992 Report (productivity counts by group and branch), Vendor Payment Summary Reports, Material Damage Reports, CARS Reports (vendor body shops), Legal Tracking Reports, and many other Claim Center reports

-Extensively worked on KPIs (key performance indicators) and dashboard design for claim transactions and productivity tracking

Policy up-tiering during renewal conversions from both Duck Creek and Guidewire systems

Involved in Informatica data mapping, workflow monitoring, Tivoli job monitoring, run book updates, and XML/XSD/HTML/flat-file mappings

Involved in Guidewire Policy Center to LexisNexis and Claim Center to CLUE integrations using RESTful web services APIs

Involved in defining objects/attributes and mappings for MicroStrategy reporting

Extensively used SQL queries for data validation, data quality, data manipulation, data integration, data governance (data dictionary), data segmentation, data conversion from legacy systems to Guidewire, data profiling for the EDW, ER/Studio, Erwin, data cleaning, and MicroStrategy reporting (a validation sketch follows below)
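As a hedged sketch of the validation queries described above: an orphan-record check between converted staging tables. Table and column names are hypothetical, not the actual Mercury schema:

    -- Find claims whose policy key has no match in the converted policy table
    SELECT c.claim_id, c.policy_number
    FROM stg_claim c
    LEFT JOIN stg_policy p
           ON p.policy_number = c.policy_number
    WHERE p.policy_number IS NULL;

A non-empty result would indicate a referential-integrity gap introduced by the conversion.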

Involved in creating data transformation logic to load data into Staging and the Data Warehouse using integration tools such as Informatica

Worked on DQ through Informatica Developer (IDQ) and data profiling through Informatica Analyst

Involved with the testing team in reviewing the Test Plan, Test Cases, Unit and System Integration test plans, and UAT

Managed and tracked requirements by developing a Requirements Traceability Matrix (RTM) and performed Sanity Testing

Extensively used Guidewire Policy/Claim/Billing Centers, MS SQL Server, Oracle, PL/SQL, Netezza, Teradata, Informatica, ERwin, ER/Studio, MicroStrategy reporting, XML, web services, SharePoint, WinSCP (FTP and SFTP), Linux, Unix, SQL Assistant, Quality Center, JIRA, Clarity PPM (Project Portfolio Management), MS Project, ServiceNow, Kanban (Rally), MS Excel, MS Access, Tableau, Liquid XML, MS Visio, and MS Word

Extensively used Big Data stacks such as Pig Latin, HiveQL, HBase, MapReduce, Apache Phoenix, MySQL, and NoSQL for analyzing data (a HiveQL sketch follows below)
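A minimal HiveQL sketch of the kind of analysis this refers to; the feed path and columns are hypothetical:

    -- Expose a delimited HDFS feed as a Hive table, then aggregate it
    CREATE EXTERNAL TABLE IF NOT EXISTS policy_feed (
        policy_number STRING,
        state_code    STRING,
        premium       DECIMAL(12,2)
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    LOCATION '/data/feeds/policy';

    SELECT state_code, COUNT(*) AS policies, SUM(premium) AS total_premium
    FROM policy_feed
    GROUP BY state_code;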

Good understanding of Spark, Scala, PostgreSQL, and MongoDB

Extensively used Flume, Sqoop, HDFS, YARN, Hadoop clusters, and Oozie for Big Data streaming and storage, and performed CRUD operations in Apache Phoenix on HBase (sketched below)
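A hedged sketch of Phoenix-on-HBase CRUD as mentioned above; Phoenix uses UPSERT for both insert and update, and the table here is illustrative:

    -- Create a Phoenix table backed by HBase
    CREATE TABLE IF NOT EXISTS claim_event (
        claim_id VARCHAR NOT NULL,
        event_ts VARCHAR NOT NULL,
        status   VARCHAR,
        CONSTRAINT pk PRIMARY KEY (claim_id, event_ts)
    );

    -- Insert or update (Phoenix combines both as UPSERT)
    UPSERT INTO claim_event (claim_id, event_ts, status)
    VALUES ('CLM-1001', '2016-06-01T10:15:00', 'OPEN');

    -- Read
    SELECT * FROM claim_event WHERE claim_id = 'CLM-1001';

    -- Delete
    DELETE FROM claim_event WHERE claim_id = 'CLM-1001';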

Performed Big Data analytics using R and RStudio for data visualization

Los Angeles Unified School District (LAUSD), Los Angeles, CA

Sr.Technical Data Analyst Nov’14 – May’15

The second-largest school district in the nation, the Los Angeles Unified School District (LAUSD) enrolls more than 640,000 students in kindergarten through 12th grade at over 900 schools and 187 public charter schools.

The project focused on enhancing the existing MiSiS (My Integrated Student Information System) module (UI) and SSRS (BI) reporting, concentrating mainly on the Master Scheduling module. The work involved creating new UI screens (MiSiS web application), web content management, adding fields to screens, and sourcing data to the SSRS reporting module.

Responsibilities:

Traced the data elements required to meet business requirements (business rules) for the user interface (UI screens) and reports

Gathered and defined business requirements and translated them into logical MiSiS (UI screen) module designs and report designs

Extensively involved in the following MiSiS UI and report designs for the Master Scheduling module.

Screens:

-Course Master, Individual Request/Mass Request Editor, Change Course Assignment, Section Types/Request Generator, Manage Group/School Staff, School Period and Instructional Space etc.

SSRS Reports:

-School Courses/District Courses, Student Request Scheduled/ Student Request not Scheduled

-Course Request Not Scheduled/Teacher Section Assignment, Student Course Request Summary/Class Enrollment Report etc.

Worked with the executive team on project planning, resource allocation, and project scheduling

Extensively used T-SQL queries for data modeling, data validation, data manipulation, data integration, data profiling, data analysis, and data quality (see the sketch below)
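A hedged T-SQL sketch of the validation work described above: flagging duplicate course requests per student. The table and columns are hypothetical, not the actual MiSiS schema:

    -- Flag students with duplicate requests for the same course
    SELECT student_id, course_code, COUNT(*) AS request_count
    FROM dbo.student_course_request
    GROUP BY student_id, course_code
    HAVING COUNT(*) > 1;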

Responsible for creating UML models, use cases, and process flows using MS Visio

Analyzed and modeled the system using data flow diagrams and data models (logical and physical), and created process flow diagrams, integrating them into one end-to-end business model

Involved with the testing team in writing the Test Plan, Test Cases, and Unit and System Integration test plans in Test Director (HP Quality Center)

Managed requirements by developing a Requirements Traceability Matrix (RTM) and performed Sanity Tests to validate changes after development

Extensively used MS SQL Server, .NET, HTML, SSRS, MiSiS modules, SharePoint, Team Foundation Server, Excel, Excel macros, MS Access, MS Project, MS Visio, MS Word, and Quality Center

Mercury Insurance Group, Brea, CA

DW-BI Analyst Aug’14 – Oct’14

The project focused on enhancing EDW/BI MicroStrategy reporting for the PPA (Personal Auto and Commercial Auto) and non-PPA (Homeowners, Business Auto) lines of business across source systems such as NextGen, Guidewire, and PHXE/PHXW. It also created a new dimension (for flat cancels) to capture the latest view of in-force, cancelled, and expired policies based on cancellation effective date, reinstatement date, and renewal, reported through MicroStrategy.

Responsibilities:

Interacted with business users to capture the reporting requirements

Designed dimensions (Policy module) for KPIs and dashboards in the MicroStrategy reporting model

Created reports in MicroStrategy per business requirements and logged defects for existing reports in Quality Center

Executed T-SQL queries to perform data validation, data reconciliation, and data quality checks (a reconciliation sketch follows below)
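A hedged sketch of a reconciliation query of the sort this describes, comparing row counts and premium totals between source and target extracts; the table names are hypothetical:

    -- Compare source and target aggregates side by side
    SELECT 'source' AS side, COUNT(*) AS row_count, SUM(premium) AS total_premium
    FROM src_policy_extract
    UNION ALL
    SELECT 'target', COUNT(*), SUM(premium)
    FROM tgt_policy_extract;

Mismatched totals between the two rows would be the trigger for drill-down validation.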

Involved in data integration from legacy SQL Server applications to Netezza

Involved in modifying the existing data model (schema)

Involved in creating source-to-target mapping (data mapping) documents per design needs

Assisted the testing team in writing test plans and test cases for Unit and System Integration tests in Test Director (Quality Center)

Performed data analysis, data profiling, data mapping, and data quality work for the Data Warehouse/BI reporting model

Managed requirements by developing Requirement Traceability Matrix (RTM) and performed Sanity Test

Performed process and problem management by using incident ticketing system

Performed User Acceptance Testing to verify the end results

Extensively used Informatica, Informatica Developer (IDQ), Informatica Analyst, Erwin, ODI, Netezza, Oracle, DB2, Kanban (Rally), DbVisualizer, MicroStrategy, Quality Center, Guidewire and NextGen applications, and ServiceNow

Extensively worked on Policy, Claims, and Billing Centers as source systems for NextGen, Guidewire, and PhxEast/PhxWest, across external application systems and internal databases

Fidelity Investments, Boston, MA

Sr. Data Designer June’13- June ‘14

Fidelity Investments is a global financial leader. The project improved data set feeds from Pyramis (Fidelity Investments) to FDH (Financial Data Hub) and on to FCAP, building Data Warehouse/Data Integration/BI capabilities consumed through the OBI (Oracle Business Intelligence) integrated model and Tableau for reporting by several business units, including Asset Management (financial markets; capital and money markets), Fixed Income, OTC Derivatives, Trades, Securities, Treasury Services, Wealth Management, and Risk Management (Actimize).

Responsibilities:

Interacted with the project team to define system objectives and scope, feasibility study report and identify constraints and the measures of success for the system being developed

Conducted work sessions with Business Users and Data Standards Lead to get clear understanding about business requirements

Followed Agile (Scrum) methodology under the SDLC to develop systems

Responsible for designing the Data Warehouse and creating the Data Movement Specification per the Business Requirement Document (BRD), and developed Use Cases and UML diagrams

Involved in data querying using MS SQL for data analysis; prepared Data Analysis and Data Profile Reports documenting data quality issues

Performed extensive analysis for Fraud, Risk Management and Transaction Loss

Involved in Data Mapping for Data Integration from the AXIS, FBSI, FPRS, INVESTONE (IOTA), and FA systems

Involved in creating data models (e.g., high-level conceptual, logical, and physical models) and relating data from external feeds to Assets and Flows (fact connections)

Extensively used ERP, IBM Netezza (Aginity Workbench), Oracle SQL Developer, Oracle Data Integrator (ODI), ERwin, ER/Studio, Oracle Business Intelligence, HTML, XML, Informatica PowerCenter, Tableau, MicroStrategy reporting, Crystal Reports, TOAD, and the Rally Kanban board

Extensively used RDI (Referential Data Interface) and ART rules for manual data loads and revenue allocations

T-Mobile (Newport-2), Bellevue, WA

Data Systems Analyst Aug’12- March’13

The project focused on an e-commerce retail web application. It modeled products from the WPC (Web Product Catalog) for all channels in the EPC (Enterprise Product Catalog): postpaid products, prepaid products, devices, accessories, rate plans, and services; supported the product model from Samson/SAP in EPC; modeled product compatibility and customer eligibility in EPC; and investigated an automated EPC-to-WPC interim interface for product updates through WPC retirement (postpaid and prepaid wireless services, devices, accessories, and WPC extract support).

Responsibilities:

Worked as a Data Analyst/Business Systems Analyst and assisted the project manager in defining the project's scope, schedule, business change management, and budget, and developed a Feasibility Study Report for the project's preliminary study

Participated in individual and JAD/RAD work sessions with stakeholders (Offer Management) and technical units (Amdocs) for business requirements gathering, following an Agile Scrum approach

Assisted team members with documentation such as the Business Requirement Document and Use Case Specification Document using DOORS

Assisted in data analysis through data mapping among the Enterprise Product Catalog, Web Product Catalog, Samson (billing system), and SAP (inventory), then worked with the design team on database design, with extensive use of ERDs for data validation, reference data analysis, and data conversion

Created a Decision Support System via dimensional modeling, ensuring appropriate fact tables, dimension tables, attributes, and entities to generate the measures required by business users for Data Warehousing (EDW) and Data Migration (a schema sketch follows below)
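A hedged sketch of the star-schema pattern this describes: a fact table keyed to a dimension table, from which measures are aggregated. All names are illustrative:

    -- Dimension: product
    CREATE TABLE dim_product (
        product_key  INT PRIMARY KEY,
        product_name VARCHAR(100),
        channel      VARCHAR(20)
    );

    -- Fact: daily sales, one row per product per day
    CREATE TABLE fact_sales (
        date_key    INT,
        product_key INT REFERENCES dim_product(product_key),
        units_sold  INT,
        revenue     DECIMAL(12,2)
    );

    -- Measure: revenue by channel
    SELECT d.channel, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.channel;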

Assisted in creating Data Samples, fixed-size extracts from the database reflecting different data field element values; each sample was then compared with the source application systems to test the accuracy of the data field values (a comparison sketch follows below)
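A hedged sketch of such a sample-versus-source comparison using set difference (EXCEPT; Oracle uses MINUS); the tables are hypothetical:

    -- Rows in the sample that do not match the source system exactly
    SELECT product_key, product_name, channel
    FROM sample_product
    EXCEPT
    SELECT product_key, product_name, channel
    FROM source_product;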

Extensively used MS SQL, Teradata, Clarity PPM, .NET, HTML, XML, Java, J2EE, web services, MS Project, MS Visio, MS Office, TOAD, and SQL Assistant

Saleways Retail, Kathmandu, Nepal

Business Data Analyst Dec’07- June’10

Responsibilities:

Conducted user interviews and work sessions (JAD) using Agile (Scrum) methodology to understand the client's business processes

Gathered data requirements for the system by interacting with the users and data administrators

Responsible for creating the Business Requirement Document (BRD) and the Functional Requirement Document (FRD), and developed Use Cases, BPM, and UML diagrams

Involved in data querying using SQL for data analysis and validation; prepared Data Profile Reports documenting data quality issues

Carried out Data Quality Assessments – subjective and objective – to categorize quality issues across various data quality dimensions using quality metrics, and devised rules to resolve data quality issues

Involved in Data modeling (Logical Modeling and Physical Modeling), Data Mapping, Data Conversion, Data Validation, Data Profiling, Data Migration, and Schema Design for Enterprise Data Warehousing

Performed Data Quality Assurance using SQL, Excel VLOOKUP, and Excel macros

Set up definitions and processes for test phases, including Sanity, Integration, System, and User Acceptance Testing (UAT), using ClearQuest, QTP, and HP Quality Center

Educational Background

Master’s in Engineering Management/Operations Research/Information Systems, 2010 – 2012

Arkansas State University, Jonesboro, AR, USA

Bachelor’s in Computer/Civil Engineering, Tribhuwan University, Kathmandu, Nepal, 2003 – 2007

Certifications

Big Data and Hadoop Developer - Certified by Edureka! License# C812PDSD

SMAC – Scrum Master Accredited Certification, certified by International Scrum Institute, Certification ID# 957***********

APRM – Accredited Project Manager Certification, certified by International Organization for Project Management, ID# 432***********

AWS Data Consultant – 2-Hour Big Data in AWS Training Class – Certification# UC-DVFGNTAC

PL/SQL – Beginner to Advanced Course – Certification# UC-S20VMKPT

AWS Certified Developer Associate Course – Certification# UC-O5FEOGEK

Honors

Program of the Year – 2012, Department of Engineering, Arkansas State University; GPA 4.0

Research Assistantship and Graduate Assistantship, 2010 – 2012

Membership

American Society for Quality Control, 2010

Leadership

Former President – American Society for Quality Control Student Chapter, 2011 – 2012


