
Data Insurance

Location:
Cumming, GA
Salary:
140000
Posted:
July 11, 2017


Resume:

Shekhar Mokadam

**.*******@*****.***

*********@*****.***

404-***-****

Profile for Data Architect /Manager, Solutions Architect, Integration Expert

•17 years in the IT industry, with a proven track record of successful, production-implemented data projects.

•Strong experience in Project Management, Program Management, Engagement Management, Stakeholder Management, RFE, RFI, Pre-Sales, Onsite-Offshore Model, Project Estimation and Budgeting, Capacity Planning, and Product Management

•Business Consulting, Process Consulting, Data Management Consulting, Presentations, Defining Reference Solutions and Frameworks, Product Selection, Long/Short-Term Technology Roadmaps

•Exposure to IBM Reference Architecture, Insurance IDM, and the TOGAF, Zachman, Kimball, and Inmon architectural frameworks

•Good experience in Big Data technologies, AI, Data Management, machine data, sensor data, and real-time data streams

•Extensive domain knowledge of the Insurance industry, including P&C, Life, Specialty, and Aerospace

•Strong hands-on experience with different data technologies

•Strong experience as a Data Architect, Big Data Architect, Integration Architect, and SOA Integration Architect, including real-time data integration and data architecture

•Active member of DAMA, ACORD, and various data warehousing and analytics architecture groups

•Multiple cycles of EDW, DM, MDM, Metadata Management, and Reference Data Management models

•Experience with different data exchange standards, including XML, XSD, ACORD, ISO, and Clinical Data

•Specialized in integrating analytics and business intelligence to provide solutions aligned with clients’ organizational policies and IT environments; provide leadership to employees

•Strong communicator: requirement gathering, business presentations, stakeholder interactions, and management of development programs

Summary:

•Data Modeling: Over 10 years of Dimensional Data Modeling experience. Sound knowledge of Dimensional Modeling, OLTP Models, Canonical Integration Models, EAV, Associative Modeling, Ralph Kimball and Bill Inmon methodologies, Star/Snowflake schemas, Data Marts, Facts & Dimensions, Logical & Physical data modeling, MDM Data Modeling, Unstructured Data Modeling, Metadata, and Reference Data Process models. ETL and Reporting Framework modeling; database management in SQL Server, Oracle, and DB2

•Data Management: Over 12 years of Data Warehousing experience using IBM InfoSphere Information Server 11.5 (DataStage, QualityStage, multiple connectors, CDC), Server and Parallel editions of DataStage 7.0 to 8.5, Informatica PowerCenter 6.0 through 9.5 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), PowerConnect, and PowerExchange. 10+ years on the Microsoft BI platform (SSIS, SSRS, SSAS). 7+ years of experience with Ab Initio, Express IT, DM Express, Pentaho, and Talend

•Big Data: TIBCO ActiveSpaces, Data Flux, Hortonworks, Hadoop, Hive, Spark, Scala, Cassandra, Sqoop, NoSQL, Kafka, Pig, Impala, HBase, NiFi, Kibana, Apache Big Data frameworks and standards

•Real-Time Data: TIBCO BE, CEP, BW, IBM MQ, and IBM InfoSphere ISD (Information Services Director) for real-time data integration

•Architect: Over 12 years of architectural experience as Data Integration Architect, ETL Architect, and Data Architect. Integration of complex legacy systems into the warehouse, data migration architectural design, and analytics over data. Defined various data integration patterns and frameworks with an innovative approach. Expert knowledge of integrating new-generation PAS systems with legacy systems. Expert in Accenture Duck Creek integration and shredding.

•Business Intelligence: 8 years of Business Intelligence experience using SSRS, SSAS, Cognos, BO, and Spotfire

•Analytics: 5 years of experience in Multidimensional Data Analytics, including SSAS, TM1, and R

•Databases: 15+ years of experience using Oracle, DB2, MS SQL Server, and Teradata V2R5/V2R6

•Hadoop: MongoDB, Hive, HBase, Sqoop, Ambari, Hortonworks Studio, Cloudera Studio

•Programming Languages: SQL, T-SQL, PL/SQL, Microsoft .NET, Java, VB, Perl, Python, R

•Cloud Integration: Salesforce, ACORD integration, Duck Creek integration, Informatica Cloud

•Data Tools: Trillium, Informatica IDQ, Informatica MDM

Certifications:

Informatica Level 1 Certified

Insurance Domain Certifications (INS 21, 22, 23)

JCP, SQL, and .NET Certifications

Trainings:

IBM MDM Architect

IBM IGC, IBM Frameworks (IFW, IIW, Insurance)

IBM Big Data Platform

Education:

Master of Computer Application – Amravati University, Oct 1998

Bachelor of Science – Nagpur University, May 1995

Visa Status:

Green Card (Permanent Resident) – no sponsorship required; can work for any employer

Contact

Email: **********@*****.***, **.*******@*****.***

Cell: 404-***-****

Professional Summary:

Major Railroad (Norfolk Southern) July ‘16 – Present

Data Modeler / Data Architect

Project Description:

TEP: Railroad Transportation Event Processing; real-time data integration that manages train, equipment, and location transportation state. An ESB- and SOA-based architecture that provides for all of the railroad’s data needs.

TEP DataLake: data hub for real-time sensor data, radio data, waybill data, data streams, and equipment state and event data. Different data patterns implemented on a Big Data platform.

MDM: Master Data Management for the Customer Master and Location Master. Defined conceptual, logical, and physical architecture; defined data sync patterns, access patterns, and a governance framework. Implemented using IBM MDM 11.5.

Responsibilities:

•Evaluate architecture patterns; define best-fit patterns for data usage, data security, and data compliance

•Define conceptual, logical, and physical data models

•Define data integration needs and rules, and the data governance and data quality frameworks for the MDM program

•Define strategy for C-level business users; work with the business to define short-term and long-term roadmaps and blueprints

•Architecture pattern management; core member of the Architecture Review Board

•Process management, project management, and resource management

Technologies:

Enterprise Architect (EA), TIBCO BE, BW, Spotfire, ActiveSpaces, IBM DB2, IBM Data Architect, Erwin

IBM InfoSphere 11.5 (DataStage, QualityStage, IGC, Data Click, Data Analyzer, ISD)

IBM Party Model

Cassandra, Spark, Scala, Data Flux (Big Data platform), Python, Java, Hive, HDFS, TIBCO ActiveSpaces (IMDG)

Insurance Product – Syntel Feb ‘16 – July ‘16

Data Modeler / Data Architect

Project Description:

The objective of this project is to build an Insurance Data Warehouse as an Enterprise Insurance Data View on a new-generation technology platform, providing an integrated view of structured and unstructured insurance industry data. The view is built on the standard ACORD insurance data warehouse model, extended with dimensional data marts and a SuperNova dimension that holds structured and unstructured data. The view was integrated in practice with various COTS products in the insurance industry, including PAS platforms such as Duck Creek and Marine PAS, as well as decision products.

Responsibilities:

•Product business process model design

•Extended the ACORD model and defined the Enterprise Insurance Data View model

•Worked with various project partners and stakeholders to build a conceptual view of the product model

•Managed efficiency and integrity while integrating the unstructured model design

•Evaluated newer technologies such as Big Data, Cassandra, MongoDB, NoSQL, and NewSQL

Technologies:

Erwin Enterprise

Microsoft SQL Server, SSAS, SSIS, Cassandra, Python, Java

Hive, HDFS, HBase, Sqoop, Data Flux, CDC (IBM CDC), IBM IFW/Reference Model

AIG – Atlanta, GA May ‘13 – Jan ‘16

Data Modeler / Data Architect

Project Description:

The objective of this project is to build an Insurance Data Warehouse system. The system satisfies policy management, claims management, billing, agency, and compliance data management requirements for the Aerospace and Marine insurance policies of AIG’s insurance business. The EDW design involves integration with an existing DB2-based legacy data application, the enterprise data warehouse, the agency system, and the business reporting, compliance management, and document generation applications. The integrated warehouse data helps generate decision dashboards, reports, and various data feeds and compliance reports.

Responsibilities:

•Gathered business requirements; consolidated and defined data integration patterns for the warehouse

•Defined, developed, and maintained data models for the EDW, multiple data marts, compliance data management, metadata management, and MDM

•SQL Server database administration

•Defined the architectural framework and patterns for successful integration with legacy systems

•Defined the end-to-end data integration strategy, methodology, and a model-driven ETL framework

•Defined the strategy for publishing data from the warehouse

•Issue resolution with higher management

•Account and project management, and offshore coordination

Technologies:

Erwin Enterprise Data Model, ACORD Insurance Data Model

Microsoft SSAS, SSIS, SSRS, Tableau, Teradata, R, DB2, SQL Server 2012, Power BI

Accenture Duck Creek, Guidewire

XML, Java, JavaScript, .NET C# components, R scripting

Tower Insurance – Chicago, IL Apr ‘11 – Apr ‘13

Data Modeler / Data Architect

Project Description:

The objective of this project is to build a data warehouse for integrated data published from various policy admin systems for the Commercial and Personal lines of business. The data warehouse also integrates data from third-party PAS acquired by the insurance carrier. It is a centralized EDW for upstream reporting and analytical data marts, including Claim, Policy, Customer, Agent, and Quotes marts, plus operational data marts such as Policy Inquiry, Claims Inquiry, Billing Inquiry, and Coverage Clearance. The EDW integrates more than 10 legacy systems; operational systems include Guidewire and Duck Creek. ISO reports for different states are published from the data warehouse, and reinsurance and treaty data feeds are produced from it.

Responsibilities:

•Interacted with end users to gather business requirements and strategize the data warehouse processes

•Defined the data model for the data warehouse and marts, including the legacy-integrated model

•Built the compliance data model for ISO reporting, extending the ACORD Insurance Data Model

•Maintained and managed OLTP models for the Duck Creek PAS shredding environment

•Integrated Guidewire with the warehouse; suggested source modifications

•Defined ETL strategies and best practices for the EDW environment

•Defined different data integration and validation frameworks, including data validation, error-checking processes, lineage, recovery, and reconciliation

•Defined the mechanism to extract and integrate data from COBOL data sets

•Defined data usage via OLAP and ad hoc data analysis

•Defined the SOA integration pattern with the warehouse

•Ensured development followed the defined architecture

Technologies:

Erwin Enterprise Data Model, ACORD Insurance Data Model

Microsoft SSIS, SSRS, BusinessObjects, COBOL copybooks, R, DB2, SQL Server 2012, TFS

Accenture Duck Creek, Guidewire

XML, .NET, COBOL, T-SQL, Stored Procedures, DB2

Marsh MMC – Hoboken, NJ Sep ‘06 – Dec ‘10

Modeler/ETL Architect/Data Architect/Lead

Project Description:

Market Model: The objective of this project is to build the analytical model for commercial, personal, and claims administration data provided by participating carriers and TPAs. It integrates data from the carriers’ PAS, invoicing, investment, and claims systems, providing an end-to-end view for stakeholders to ensure efficient outcomes and safeguard investors’ investments. It also integrates data feeds from third-party research agencies such as Moody’s and S&P.

SalesForce Integration:

The objective of this project is to establish a real-time integration environment between Salesforce (the leading cloud CRM) and the client’s legacy environment.

The integration provides real-time business opportunity sync with downstream applications, resulting in faster responses to opportunities. Integration points were the Client Prospect legacy application, PeopleMaster, and the customer data marts.

Client Checking Process:

The objective of this project is to develop enterprise-wide client checking processes and a supporting enterprise-wide Client Checking System (CCS) in order to identify and resolve potential business and legal conflicts between and within its Operating Companies (OpCos).

The system will also facilitate organization compliance with requirements imposed by trade sanctions and will include monitoring against the Office of Foreign Assets Control (OFAC) sanctions list.

Claims Data Mart and Market Master Research data marts

Responsibilities:

•Defined data models and the data integration meta-model for the framework

•Designed the TPA data integration methodology and TPA data integration models

•Defined SOA integration methodologies and strategies through a SOA canonical data model

•Defined OLTP and SOA data models for the SFDC applications

•Interacted with the business to verify data points and data quality

•Designed ETL specifications

•Involved in ETL design and development

•Developed integration with web services

Technologies:

Erwin Enterprise Data Model, EDW Standard Data Model

IBM InfoSphere DataStage, Informatica PowerCenter, PowerExchange, Web Services Hub, Oracle 9i, Teradata, SQL Server 2005

BusinessObjects, Universe Designer, Cognos, Cognos TM1

XML, Java, Hibernate, Spring, COBOL, PL/SQL, Perl

Salesforce, Apex scripting, Salesforce-Informatica integration

Harleysville Insurance – Harleysville, PA Jan ‘05 – Aug ‘06

Architect/ Lead

Project Description:

The objective of this project was to develop a policyholder data mart for corporate and Stat reporting. The Corporate Stat application is responsible for generating the Policy Holder Report, which presents the total term experience of a policy. These reports are used by underwriters, agents, and the Loss Control department.

These reports are split by branch and further by policy status, viz. in-force and expired policies. The existing system holds reports for the past 5 years from the current date for expired policies. The reports are generated from the Premium Stat file, the Premium Pending file, the Loss Stat file, and the Claims Trans file.

The engagement also covered the data model for the Group Life application, the Agency Portal, Duck Creek integration with the warehouse, AQS integration, and Guidewire integration with the warehouse.

Responsibilities:

•Business process understanding and requirement gathering; drove business meetings

•Designed and developed end-to-end data models for the EDW, multiple data marts, compliance reporting structures, canonical models, integration and process models, and metadata models

•Metadata and Master Data Management (MDM); MDM modeling

•Defined and designed the ETL architecture of the project

•Reverse-engineered Stat application logic from COBOL programs

•Defined the ETL and reporting process framework model

•Developed the mapping methodology, data lineage, and reference data

Technologies:

Erwin Enterprise Data Model, EDW Standard Data Model

IBM DataStage Parallel Edition, Informatica PowerCenter, PowerExchange, Teradata, SQL Server 2005

BusinessObjects, Universe Designer

XML, Java, Spring, COBOL, PL/SQL, Perl

Web Development, Data Quality Product, Knowledge Portals Dec ‘98 – Dec ‘04

Developer/Lead/DBA

Project Description:

This involved development of a Lost and Found portal and a Knowledge Sharing portal, along with product development and configuration for a data quality product in Java and .NET. Involved in core development activities including business sessions, development, unit testing, production support, and other day-to-day activities across the development life cycle.

Responsibilities:

•ETL Developer, DBA, Database Programmer

•Java Developer

•Business Analysis

•Content Developer

Technologies:

Oracle Hyperion (BRIO Reporting)

XML, Java, J2EE, Hibernate, Spring, JSP, COBOL, PL/SQL, Perl, PHP

VB 6.0, .NET, C#, IIS, ASP, SQL Server, COM, DCOM, DLL, C, C++

Informatica 6.0


