
Contact: +1-803-***-****

e-mail: adk77n@r.postjobfree.com

SATISH KUMAR

SUMMARY

16+ years of experience in IT across multiple roles: Enterprise Data Architect, Data Architect/Modeler, Data Warehouse/ETL Lead/Developer, BI Lead/Developer, and Data Analyst.

Enterprise-level knowledge across architecture domains: Information, Integration, Security, and Application.

Expertise in cloud-based database solution design and product modernization using a microservices (Domain-Driven Design) model for Specialty Pharmacy (ScriptMed® Cloud) and Property & Casualty products.

Implemented enterprise data warehouses and data marts using IBM BDW and Fiserv InformEnt®.

Implemented a data pipeline framework design using Informatica Intelligent Cloud Services (IICS).

Expertise in a variety of data modeling disciplines: Domain-Driven Design (DDD), CDM/LDM/PDM, canonical data modeling, reference hierarchies, relationships, metadata, data lineage, and reference data.

Extensively applied performance optimization best practices to keep databases healthy and performing optimally, including table partitioning, buffer sizing, threading, and indexing strategy.
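
For illustration, a minimal T-SQL sketch of the table partitioning and indexing side of this approach; all object, table, and column names are hypothetical, and buffer sizing and threading are engine-level settings not shown here.

-- Hypothetical monthly range partitioning for a large fact table.
CREATE PARTITION FUNCTION pf_LoadMonth (date)
    AS RANGE RIGHT FOR VALUES ('2021-01-01', '2021-02-01', '2021-03-01');

CREATE PARTITION SCHEME ps_LoadMonth
    AS PARTITION pf_LoadMonth ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactClaim (
    ClaimKey   bigint        NOT NULL,
    PolicyKey  int           NOT NULL,
    LoadDate   date          NOT NULL,
    PaidAmount decimal(18,2) NOT NULL
) ON ps_LoadMonth (LoadDate);

-- Partition-aligned index so date-filtered queries prune whole partitions.
CREATE CLUSTERED INDEX IX_FactClaim_LoadDate
    ON dbo.FactClaim (LoadDate, ClaimKey)
    ON ps_LoadMonth (LoadDate);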

Experienced in assisting the design process and managing data linkage between source systems by using known master data elements and/or creating unique identifiers when data elements do not match or do not exist.
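
As a sketch of the unique-identifier technique, assuming hypothetical names and that the hashed attributes come from the linkage design:

-- Derive a deterministic surrogate match key when source systems share
-- no common master data element; standardized attributes are hashed together.
SELECT
    c.SourceSystem,
    c.CustomerId,
    CONVERT(char(64),
        HASHBYTES('SHA2_256',
            CONCAT(UPPER(LTRIM(RTRIM(c.LastName))), '|',
                   UPPER(LTRIM(RTRIM(c.FirstName))), '|',
                   CONVERT(char(10), c.BirthDate, 120))), 2) AS MatchKey
FROM staging.Customer AS c;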

Exposure to project management activities: resource optimization, task estimation, and costing for fixed-price and time-and-material engagements; experienced in SDLC and Agile-Scrum models, including short-term goals and iterative development.

Exposure to INCA (GIS application), Oracle Spatial DB, and MapInfo (GIS software) for generating geographical maps in the Nordic region.

Expertise in supporting the development of data availability for organizations in domains such as Insurance (P&C, Life), Healthcare, Life Science, Pharmacy, Telecom, and BFS.

TECHNICAL SKILLS

Data Modeling: EDM, OLTP, OLAP, Canonical Data Modeling, Star Schema, Snowflake Schema, Dimensional Modeling (Data Marts), Logical Data Model, Physical Data Model, MDM, Document Data Model, Reference Data Model, Domain-Driven Design Data Model, Data Vault Modeling

Data Management: EDM, SSIS, SSAS, SAP BODS, Informatica PowerCenter 9.5, Informatica Cloud Services (ICS) data pipelines, ICS CDC data pipelines

Business Intelligence Reporting: SAP BusinessObjects, COGNOS, MicroStrategy 9.2, SSRS

Visualization: SAP Lumira, Tableau, QlikView/Qlik Sense, Power BI, SiSense

Databases: Oracle 12c, DB2, SQL Server 2016, XML, MongoDB, Oracle Spatial, MarkLogic

Programming Languages: SQL, T-SQL, PL/SQL, JSON, Unix Shell Scripting, Python, R, TCL

Data Tools: Informatica IDQ, Informatica MDM, Information Steward, MapInfo

Version Control Tools: Microsoft SharePoint, GitHub, Tortoise SVN; Servers: Tomcat, IIS

Project Management Tools: JIRA, Microsoft Project, HPSM, Agile Manager, VSTS

ERP: Oracle E-Business Suite R12 (PA)

Integration: SOA architecture, DataPower, APIs, Microservices, XML, MS Visual Studio

Industry Models: IBM ACORD, IAA, IIDM, IBM BDW

TRAINING & CERTIFICATION

PMP Training; CSM – Scrum Alliance; OCA – Oracle Corporation; Big Data Internship Program – Foundation; Data Science A-Z™: Real-Life Data Science Exercises; Data Science 101; DevOps Trained; SQL*LIMS v4.0.16 & 5.0.1 training by Merck Inc.

EDUCATION

Master of Computer Applications (MCA) in 2004 from Kumaun University, Nainital, Uttarakhand, India

Master’s degree in mathematics (M.Sc.) in 2000 from CCS University, Meerut, India

Bachelor of Science (B.Sc.) in 1997 from CCS University, Meerut, India

EXPERIENCE

Client: Inovalon Inc, Canonsburg, PA

Role: Data Architect 09/19 – Present

Inovalon is a leading provider of cloud-based platforms empowering data-driven healthcare. It provides 80% of the nation’s clinical and quality outcomes measurement analytics.

Design the Data Architecture of systems in a way that ensures compliance with certification, regulatory standards, and a technology-based information security program.

Develop and implement procedures covering data management, model development, testing, technical implementation, model execution and monitoring.

Create and manage a data dictionary for all data sources/datasets, aligning with the IT organization and the business owner of the data to identify the single source of truth for each data silo.
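
A minimal sketch of how such a dictionary could be seeded from the database catalog (SQL Server INFORMATION_SCHEMA shown as an assumed source; business definitions would then be added by the data owners):

-- Harvest technical column metadata as the starting point for a data dictionary.
SELECT
    c.TABLE_SCHEMA,
    c.TABLE_NAME,
    c.COLUMN_NAME,
    c.DATA_TYPE,
    c.CHARACTER_MAXIMUM_LENGTH,
    c.IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS AS c
ORDER BY c.TABLE_SCHEMA, c.TABLE_NAME, c.ORDINAL_POSITION;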

Lead/perform data analysis including data mapping, data modeling and data validation.

Assess the quality of the data for model development as well as inputs to the model, providing recommendations to improve the data quality at the source.

Ensure that scenario requirements, solutions, and related documentation are in line with the functional needs of monitoring and risk management and with the technical standards of the program.

Develop strategies to solve complex technical challenges.

Apply a consultative approach to assist with functional planning/design and operational procedure planning/design when implementing or integrating systems and/or product enhancements.

Collaborate with other technologists on creating Cross-domain solutions.

Identify approaches to improve the accuracy and effectiveness of analytics models.

Define the approach in a clear and concise manner to support the model optimization efforts.

Monitor ongoing model performance and work with stakeholders to resolve any identified issues.

Ensure teams follow best practices regarding coding standards, code reviews, and testing.

Collaborates with product/business owners to define and establish data quality rules and standards definitions consistent with department and organization strategies.

Design and develop data models using Toad Data Modeler, applying best practices for optimum model performance.

Developing a Domain-Driven Design model that supports a microservices architecture.

Working on the solution to deploy the microservices on Azure cloud architecture.

Develops data quality standards to identify gaps and ensure compliance across the enterprise.
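
For illustration, one way a data quality rule can be made measurable as a SQL check; the table, column, and rule are hypothetical:

-- Rule: MemberId must be unique; violations are counted so gaps can be tracked.
SELECT
    m.MemberId,
    COUNT(*) AS Occurrences
FROM staging.Member AS m
GROUP BY m.MemberId
HAVING COUNT(*) > 1;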

Environment: Azure cloud architecture, microservices data model, Toad Data Modeler, Oracle 12c, JSON Schema, Informatica Cloud Services, C#, Kafka, ELK

Client: Tower Hill Insurance, Gainesville, FL

Role: Data Modeler/Architect 04/19 – 08/19

Working on a P&C Commercial Package Policy solution to develop a NextGen policy management system.

Collaborates with business owners to define and establish data quality rules and standards definitions consistent with department and organization strategies.

Design and develop data models using Erwin Data Modeler, applying best practices for optimum model performance.

Develop the Conceptual Data Model (CDM), Logical Data Models (LDM), and JSON Schema.

Coordinates and performs the data analysis required to identify required data elements and create source-to-target mappings (including business logic) for data transformations.
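
As a sketch, each row of a source-to-target mapping translates to one column expression in the load; all names and business rules below are hypothetical:

-- Source-to-target mapping rendered as SQL, with business logic inline.
INSERT INTO dw.DimPolicy (PolicyNumber, PolicyStatus, EffectiveDate, MonthlyPremium)
SELECT
    s.pol_no,                                      -- direct move
    CASE s.status_cd WHEN 'A' THEN 'Active'
                     WHEN 'C' THEN 'Cancelled'
                     ELSE 'Unknown' END,           -- decode business rule
    CONVERT(date, s.eff_dt),                       -- type conversion
    s.written_premium / NULLIF(s.term_months, 0)   -- derived measure
FROM staging.policy AS s;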

Developing a Domain-Driven Design model that supports a microservices architecture.

Design the LDM and Canonical Data Model to support integration, APIs, and messaging interfaces.

Working on the solution to deploy the microservices on AWS cloud architecture.

Technology stack: Amazon RDS for PostgreSQL, Amazon ElastiCache, MongoDB on AWS, and Kafka on AWS.

Develops data quality standards to identify gaps and ensure compliance across the enterprise.

Environment: ERwin 9.7, microservices data model, Oracle 12c, NoSQL (MongoDB), XML/XSD, JSON Schema.

Client: Erie Insurance Group, Erie, PA, USA

Role: Data Modeler, Data Architect 07/18 – 04/19

Collaborates with business owners to define and establish data quality rules and definitions consistent with department and organization strategies.

Develops enterprise data management functions, such as data quality management and standards, to identify gaps and ensure compliance across the enterprise.

Design and develop data models using Erwin Data Modeler; enforce naming standards, create metadata, and apply best practices for optimum model performance.

Coordinates and performs the data analysis required to identify required data elements and create source-to-target mappings (including business logic) for data transformations.

Develop the Conceptual Data Model (CDM) and Logical Data Models (LDM).

Work with the application teams to identify Target model mapping design and ensure that it aligns with Integration standards and guidelines.

Design the Canonical Data Model to support integration, APIs, and messaging interfaces, and maintain reference and master data management.

Support the integration development team in developing message flows using IBM WebSphere Message Broker for all interfaces/services.

Design JSON and XML canonical models using a schema definition tool such as Altova.

Designed the Canonical Data Model to support transformation of various message formats (CSV, TDS, XML, and fixed-length) using compute nodes and message sets in IBM WebSphere Message Broker.

Designed and developed XML queries and DDL scripts for data accessibility to support reporting needs.

Identified and referenced IAA & BOM mappings to give teams integration standards, guidelines, best-practice styles, and patterns.

Identified relevant checklists to validate adherence to the standards and guidelines.

Environment: OneShield, Oracle, SQL Server, NoSQL (MarkLogic 9), OSPAS, XML/XSD, Erwin 9.7.

Client: First Tennessee Bank, Memphis, USA

Role: Data Modeler / Enterprise Data Management 09/17 – 06/18

Worked as a Data Modeler / Data Engineer at First Tennessee Bank (FTB).

Created and maintained logical and physical data models in support of enterprise data models, OLTP, operational data structures, and analytical systems.

Extensively worked on reverse engineering, forward engineering, and Complete Compare; created and maintained logical and physical layers of data models.

Generated metadata reports from the logical and physical layers to define business and technical information mappings and constraints.

Responsible for leading the technical design of the tables and views consumed by data warehouse users; created and maintained processes for gathering high-quality metadata.

Experienced in data quality management, metadata management, DWH-BI management, database operations management, and enterprise data dictionaries for data projects.

Gathered and defined business requirements and developed solution design best-practice guidance.

Prepare impact and risk assessments for proposed changes and/or recommended business solutions

Environment: Erwin 9.7, IBM DB2 10.5, Oracle 11g, SQL Server 2016 and earlier.

Client: SGL, USA

Role: BI Developer / Data Analyst 06/17 – 09/17

Worked as a Sr. BI Developer and data modeler on a retail chain project.

Defined the scope of the project and documented business requirements, constraints, assumptions, business impacts, project risks and scope exclusions.

Report development, problem identification, and fixes through major and minor enhancements.

Prepared the ETL design strategy document covering the database structure, change data capture, error handling, and restart and refresh strategies.
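
A minimal sketch of the restart and change-data-capture side of such a strategy, using a watermark table; all table and column names are hypothetical:

-- Restartable incremental load: pick up only rows changed since the last run.
DECLARE @last datetime2 =
    (SELECT LastLoadedAt FROM etl.Watermark WHERE SourceTable = 'sales_order');

BEGIN TRY
    INSERT INTO dw.FactSalesOrder (OrderId, Amount, ModifiedAt)
    SELECT s.order_id, s.amount, s.modified_at
    FROM staging.sales_order AS s
    WHERE s.modified_at > @last;

    -- Advance the watermark only after a successful load.
    UPDATE etl.Watermark
    SET LastLoadedAt = ISNULL((SELECT MAX(modified_at)
                               FROM staging.sales_order), @last)
    WHERE SourceTable = 'sales_order';
END TRY
BEGIN CATCH
    -- Error handling: log and re-raise; the unchanged watermark allows a clean restart.
    INSERT INTO etl.LoadError (SourceTable, ErrorMessage, FailedAt)
    VALUES ('sales_order', ERROR_MESSAGE(), SYSDATETIME());
    THROW;
END CATCH;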

Environment: SQL Server 2016, SSIS, Tableau Desktop 9.3

Company: DXC Technology (Computer Sciences Corporation)

Client: KenyaRe & TOARe Insurance

Role: BI-ETL Developer / Data Modeler 06/16 – 05/17

Worked in multiple roles (Sr. DWH-BI, Data Modeler) on SICS (P&C, Life, and Ceded insurance product) business analytics projects used by 150+ reinsurance customers globally.

Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.

Created and maintained logical data models (LDM) and physical data models (PDM) in support of enterprise data models (star and snowflake schemas), OLTP, operational data structures, and analytical systems.

Performed data profiling of source systems to validate data and to define data standards, sizes, and datatypes in the data warehouse or mart.
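
As an illustration, a typical profiling query for one candidate column (hypothetical names); the results drive datatype and size decisions in the target model:

-- Profile a source column: volume, null rate, cardinality, observed width.
SELECT
    COUNT(*)                                             AS TotalRows,
    SUM(CASE WHEN s.policy_no IS NULL THEN 1 ELSE 0 END) AS NullCount,
    COUNT(DISTINCT s.policy_no)                          AS DistinctValues,
    MAX(LEN(s.policy_no))                                AS MaxObservedLength
FROM source.policy AS s;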

Created ETL mapping documents from source to target systems for ETL developers.

Used advanced data techniques in data modeling, integration, visualization, database design and implementation.

Expertise in BusinessObjects installation with IIS/Tomcat server configuration, upgrade, migration, and implementation for insurance customers.

Worked on MS-SSIS (ETL) solutions to extract, transform, and load data into the DWH target system; expert in writing complex SQL queries, functions, and stored procedures for specific tasks.

Monitored and improved ETL, reporting, and database performance; expertise in defining data types, optimal normalization, referential integrity, triggers, partitioning design, indexing methods, and data security procedures.

Worked in Agile-Scrum, including short-term goals, iterative development, and daily stand-ups.

Environment: DB2, SQL Server 2016, SSIS, Oracle 11g, SAP BO XI 3.1/4.2, SiSense 6.2/6.5, Erwin 9.5, JIRA

Company: DXC Technology (Computer Sciences Corporation), Blythewood, SC, USA

Client: P&C Clients - SwissRe, Farm Bureau, Florida Peninsula & OMAG USA

Role: BI-ETL Developer / Architect / Enterprise Data Modeler (EDM) 07/13 – 06/16

Worked in various roles (Sr. BI Developer, Data Modeler) on POINT IN/J (P&C insurance product) business analytics projects used by 170+ US insurance customers.

Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.

Created both logical and physical database designs and ran reports; used advanced data techniques in data modeling, access, integration, visualization, database design, and implementation.

Extensively worked on dimensional modeling (OLAP data marts with star and snowflake schemas) and managed logical and physical data models for the data warehouse.
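
A minimal star-schema sketch of the kind of structure this involves; all table and column names are illustrative:

-- One fact table keyed to two conformed dimensions.
CREATE TABLE dw.DimDate  (DateKey  int PRIMARY KEY, CalendarDate date NOT NULL);
CREATE TABLE dw.DimAgent (AgentKey int PRIMARY KEY, AgentName varchar(100) NOT NULL);

CREATE TABLE dw.FactPremium (
    DateKey        int           NOT NULL REFERENCES dw.DimDate (DateKey),
    AgentKey       int           NOT NULL REFERENCES dw.DimAgent (AgentKey),
    WrittenPremium decimal(18,2) NOT NULL
);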

Designed data marts for account-level and account-status databases that help various teams analyze user experience, user statistics, etc.

Incorporated industry-standard modeling practices to support a detailed, subject-oriented data mart.

Involved in various data modeling tasks that included forward engineering, reverse engineering, complete compare, creating DDL scripts, creating subject areas etc.

Maintained the data model and synchronized it with the changes to the database.

Expertise in BusinessObjects installation with Tomcat server configuration, upgrade, migration, and implementation for insurance customers.

Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, and geographical maps.

Worked on motion charts, bubble charts, and drill-down analysis using Tableau Desktop; created and modified data sources.

Environment: DB2, SQL Server 2016, SSIS, SSAS, T-SQL, .NET, JSON, CA Erwin, SAP BO XI 3.1/4.2, SAP Dashboard, Tableau Desktop 9.3, Tableau Server 9.3, JIRA

Company: DXC Technology (Computer Sciences Corporation), Bloomington, IL, USA

Client: State Farm Insurance, Bloomington, IL, USA

Role: Technical Architect-BI / Data Modeler 04/12 – 06/13

Worked in a Project Lead cum Technical Architect role on State Farm's Asset Management project.

Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.

Created both logical and physical database designs; worked on dimensional modeling (OLAP data marts with star and snowflake schemas) and managed logical and physical data models for the data warehouse.

Incorporated industry-standard modeling practices to support a detailed, subject-oriented data mart.

Created documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.

Involved in various data modeling tasks that included forward engineering, reverse engineering, complete compare, creating DDL scripts, creating subject areas etc.

Maintained the data model and synchronized it with the changes to the database.

Developed a team of data analytics and visualization engineers and converted data into actionable insights using descriptive and predictive modeling techniques.

Worked on SAP BusinessObjects and COGNOS report design and development.

Worked on Informatica (ETL) solutions to extract data from various sources, transform it, and load it into target systems (data warehouse and data mart).

Worked on T-SQL package solutions to extract, transform, and load data into the DWH/DM system.

Expertise in BO universes: designed and developed new universes, modified and enhanced existing ones in UDT, and resolved loops, traps, and cardinality issues.

SAP BO administrator activities: access, server properties, SIA, scheduling, publishing documents, and managing the BO server, Tomcat server, and BO services; excellent understanding of CMS Audit DB tables.

Interacted with clients and business partners on project issues and queries.

Environment: SQL Server 2016, T-SQL, Oracle PL/SQL, Informatica 9, IDQ, BO XI 3.1 SP5, Dashboard, QlikView, COGNOS 10.2

Company: DXC Technology (Computer Sciences Corporation), Tokyo, Japan

Client: SAFIC – SAISON Auto & Fire Insurance Company, Japan

Role: Data Modeler / DWH-BI Architect 07/10 – 03/12

Played multiple roles (Team Leader – Technology, Data Modeler, BI Architect, Technical Architect) on SAFIC POLISY/J (P&C insurance) business analytics projects.

Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.

Created both logical and physical database designs; worked on dimensional modeling (OLAP data marts with star and snowflake schemas) and managed logical and physical data models for the data warehouse.

Performed data profiling of source systems to validate data and to define data standards, sizes, and datatypes in the data warehouse or mart.

Expertise in designing, implementing, and supporting end-to-end data warehouses, data marts, and ETL development that provide structured and timely access to large datasets.

Worked on SAP BusinessObjects and SAP BODI installation in SAFIC environments.

Worked with the SAP BODI (ETL) tool to extract data from the POLISY/J (Japan) system, transform it, and load it into target systems (data warehouse and data mart).

Expertise in BO universes (designing and developing new universes, modifications, and enhancements) and BO reports (designing and creating new reports, troubleshooting report issues).

Experienced in SAP BO administrator activities: server properties, SIA, scheduling, publishing documents, and managing the BO server, Tomcat server, and BO services; excellent understanding of CMS Audit DB tables.

Interacted with clients and business partners on project issues and queries.

Environment: SQL Server 2008 R2, Oracle 10g, SAP BODI, SAP BO XI 3.1, Dashboard, ERwin, QlikView

Company: DXC Technology (Computer Sciences Corporation), Noida, India

Client: TDC- Tele-Denmark Communications, Denmark

Role: BI-ETL Developer 10/08 – 06/10

Worked in a Sr. BI Developer role on the TDC (Tele Denmark Communications) project.

Worked on ARTEMIS reports, major/minor bug fixes, performance improvements, and production support.

Experienced in data acquisition from Oracle E-Business Suite R12 (PA module) using ETL job scripts.

Expertise in writing complex SQL queries, functions, and stored procedures for specific tasks.

Worked on the INCA application (GIS), Oracle Spatial DB, and MapInfo to generate geographical maps.

Exposure to project registration, project tracking, and development and maintenance of HLD and DLD documents.

Environment: TCL, INCA (GIS application), Oracle 10g, PL/SQL scripts, Oracle Spatial DB, Oracle EBS R12 (PA module), ARTEMIS, MapInfo.

Company: DXC Technology (Computer Sciences Corporation), Chennai, India

Client: Thomson Healthcare (Truven Healthcare)

Role: Senior Software Engineer (G30) 11/07 – 10/08

Worked in an ETL-BI Developer role on Thomson Reuters (Truven Healthcare) business analytics projects.

Experience in creating schema objects (attributes, facts, and hierarchies) and building application objects (filters, prompts, metrics, custom groups, consolidations, drill maps, and templates).

Experienced in designing and setting up new users, roles, privileges, and data and application security.

Extensively worked on data modeling concepts: Type 2 dimensions, factless facts, conformed dimensions and facts, etc.
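
For illustration, a minimal Type 2 slowly changing dimension load expressed in T-SQL; all table, column, and attribute names are hypothetical:

-- Expire the current row when a tracked attribute changes...
UPDATE d
SET d.RowEndDate = SYSDATETIME(),
    d.IsCurrent  = 0
FROM dw.DimProvider AS d
JOIN staging.provider AS s
  ON s.provider_id = d.ProviderId
WHERE d.IsCurrent = 1
  AND s.specialty <> d.Specialty;   -- tracked attribute changed

-- ...then insert the new version as the current row.
INSERT INTO dw.DimProvider (ProviderId, Specialty, RowStartDate, IsCurrent)
SELECT s.provider_id, s.specialty, SYSDATETIME(), 1
FROM staging.provider AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dw.DimProvider AS d
                  WHERE d.ProviderId = s.provider_id
                    AND d.IsCurrent = 1);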

Experienced in the quarterly release development process and live production support.

Worked with Informatica 8.6 as the ETL tool to build the data warehouse and data marts.

Exposure to MapInfo, used to deploy additional ZIP codes into the system so they appear in MSTR reports.

Experienced with report issues, enhancements, the ETL operational environment, and DB optimization.

Expertise in writing complex SQL queries, functions, and stored procedures for specific tasks.

Interacted with clients and business partners on project issues and queries.

Environment: SQL Server 2008 R2, Oracle 9i, UNIX scripting, Oracle SQL Developer Data Modeler, Informatica PowerCenter 8.6, IDQ, ICS, MicroStrategy 9, Onyx, MapInfo

Company: Cognizant Technology Solutions, Pune, India

Client: Merck Inc, NJ, USA

Role: ETL & Business Intelligence Developer 05/06 – 11/07

Performed a Programmer Analyst / BI Developer role at Cognizant Technology Solutions in Pune.

Exposure to a pharma client that used the SQL*LIMS system for NG-LIMS and Stability-LIMS products; captured sample life-cycle phase data to design the DSS system.

Worked with 24 LIMS sites across the globe to capture NG-LIMS and Stability-LIMS sample data and load it into the data warehouse and DSS system by setting up data stages on the UNIX server.

Worked on UNIX shell scripting, crontab job scheduling, MSTR and COGNOS administration, Framework Manager, and user access.

Expertise in creating schema objects (attributes, facts, and hierarchies) and building application objects (filters, prompts, metrics, custom groups, consolidations, drill maps, and templates).

Expertise in MSTR & COGNOS Report Development and Production Support

Monitored processes running on the UNIX server for the Stability DSS and maintained their continuity.

Worked on troubleshooting COGNOS and MSTR report issues and modifying or enhancing existing MSTR reports.

Environment: UNIX shell scripting, Oracle 9i, SQL*LIMS, PRO*C, Oracle PL/SQL, COGNOS 8.2, COGNOS Framework Manager, MicroStrategy 9

Worked at CMC (Active Computers payroll) at the NDPL Data Center, New Delhi, from 12/05 to 05/06.

Worked at System and Software Solutions, New Delhi, India as Software Engineer from 07/04 to 11/05.


