Data Architect SQL Server

Location: Bloomsburg, PA

Posted: November 02, 2023


Resume:

Navin Kumar Vishwakarma
ETL Architect
ad0s78@r.postjobfree.com | +1-469-947-8108

Decisive, strategic and performance-driven professional targeting senior-level assignments in Data Warehousing with an organization of high repute, across multiple technologies and roles.

PROFILE SUMMARY

Results-oriented professional with 17 years of experience across multiple DWH technical platforms and technologies

Proven success in working with databases including Greenplum, Teradata, Oracle, and Vectorwise

Hands-on experience with Informatica PowerCenter 10.5.2/9.5, IICS, Informatica PowerExchange 10.4/9.5, HVR, DataStage, SAP BODS, Talend, Hadoop, Scala, and Spark SQL

Exposure to creating design and project plan documents such as PGP and SGP; understanding of gathering business requirements and of developing and deploying custom data systems/applications

Good experience in designing and developing audit, error-identification, and reconciliation processes to ensure the data quality of the data warehouse.

Skilled in creation & maintenance of database/application architecture and standards following best practices

Mined and analyzed data from multiple sources to drive optimization and improvement, delivering data-driven solutions to business challenges.

Experience with data lake implementations; involved in development using Informatica to load data into Hive and Impala systems.

Extensively worked on Informatica B2B Data Exchange setup, including endpoint creation, scheduler, partner setup, profile setup, event attribute creation, and event status creation.

Excellent knowledge in identifying performance bottlenecks and tuning Informatica loads for better performance and efficiency.

Extensive knowledge in developing Teradata FastExport, FastLoad, MultiLoad, and BTEQ scripts; coded complex scripts and fine-tuned queries to enhance performance (see the BTEQ sketch at the end of this summary).

Handled requirements gathering, design, development, implementation, migration, and testing for various ETL implementations using Informatica; worked extensively on and designed ETL solutions using sources such as Oracle, SQL Server, JD Edwards, Siebel ERP, and AWS S3.

Profound knowledge about the architecture of the Teradata database.

Experience in writing PL/SQL and T-SQL procedures to process business logic in the database, and in tuning SQL queries for better performance (see the stored-procedure sketch at the end of this summary).

Involved in all phases of the data warehouse project life cycle. Designed and developed ETL architecture to load data from various sources such as DB2 UDB, Oracle, flat files, XML files, Sybase, and MS SQL Server into Oracle, Teradata, XML, SQL Server, Hive, Impala, and Azure targets.

Significant multi-dimensional and relational data modeling experience, including data flow diagrams, process models, and ER diagrams with modeling tools such as Erwin and Visio.

Extensive experience in implementing data cleanup procedures, transformation scripts, triggers, and stored procedures, and in executing test plans for loading data successfully into targets.

Extensive experience in maintaining Informatica connections in all environments, including relational, application, and FTP connections.

Excellent at managing design and development, testing, debugging, and troubleshooting, and at facilitating smooth implementation of applications.
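To ground the Teradata bullet above, a minimal BTEQ sketch of the load-and-verify pattern such scripts follow; the logon, schema, and table names are hypothetical, not from any client engagement:

    .LOGON tdprod/etl_user,etl_password;

    -- Clear and reload a staging table (illustrative names only)
    DELETE FROM stg.customer_stg;
    .IF ERRORCODE <> 0 THEN .QUIT 1;

    INSERT INTO stg.customer_stg (customer_id, customer_name, load_dt)
    SELECT customer_id, customer_name, CURRENT_DATE
    FROM src.customer_feed;

    -- Abort with a non-zero return code if nothing was loaded
    .IF ACTIVITYCOUNT = 0 THEN .QUIT 2;

    .LOGOFF;
    .QUIT 0;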
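Likewise, for the PL/SQL and T-SQL bullet, a minimal T-SQL stored-procedure sketch, assuming a hypothetical sales schema:

    -- Summarize one day of sales into a reporting table (names are hypothetical)
    CREATE PROCEDURE dbo.usp_LoadDailySales
        @LoadDate DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Make the load rerunnable: clear any prior rows for this date
        DELETE FROM dbo.DailySalesSummary WHERE SaleDate = @LoadDate;

        INSERT INTO dbo.DailySalesSummary (SaleDate, StoreId, TotalAmount)
        SELECT SaleDate, StoreId, SUM(Amount)
        FROM dbo.SalesDetail
        WHERE SaleDate = @LoadDate
        GROUP BY SaleDate, StoreId;
    END;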

TECHNICAL SKILLS

ETL & Big Data: Informatica PowerCenter 10.5.2/9.5, IICS, Informatica PowerExchange 10.4/9.5, DataStage, SAP BODS, Talend, HVR, HDFS, Hive, Scala, Spark Core/DataFrames and Spark SQL
Performance Tuning: Partitioning, bucketing, map-side joins, and broadcast joins in Spark SQL
RDBMS: Greenplum, Teradata, Oracle, SQL Server, DB2, Vectorwise
Scheduler Tools: Control-M 6.3, Tidal
Operating Systems: UNIX
Data Modeling: Dimensional modeling from conceptual to logical and physical

EDUCATION

2013: MBA, Sikkim Manipal University

2002: B.Com from IGNOU, Dhanbad, Jharkhand

2001: Diploma in Computer Application from NIIT, Dhanbad

CORE COMPETENCIES

Data Analysis and Design

Project Execution & Management

Automation

Client/Stakeholder Management

Performance Tuning

Cross-functional Coordination

Code & Unit Test Cases Review

WORK EXPERIENCE

Lead ETL Developer and Data Architect

Geisinger Health Plan

Since Jul 2022

Projects Undertaken:

Project: ETL Managed Services
Role: Lead ETL Developer and Data Architect
Customer: Geisinger
Period: Jul'22 – Till Date
Environment: Informatica, SQL Server, Tidal
Domain: Health Care

Description:

Geisinger is one of the leading health care providers and health plans in Pennsylvania. Its primary care facility is the Geisinger Medical Center, and it was started in 1915. Geisinger Health System serves over half a million patients across multiple states in the northeastern United States.

Responsibilities:

Worked on the GHP project with multiple PBIs, mostly handling Facets/Medicaid/CHIP data to generate member, medical, dental, and Rx claim and eligibility files.

Built ETL code for Cohere, CMS Interop, Navitus, and Facets sources.

Working on the conversion of 350+ mappings to cloud (IICS); this engagement will conclude at the end of next year.

Automated/Scheduled the cloud (IICS) jobs to run daily with email notifications for any failures.

Working on CMS Interop and big data migration to SQL Server.

CMS Interop, Navitus, and Cohere implementations are in progress.

Helping with the FHIR Accelerator implementation based on the current CMS Interop data set.

24/7 production support of the delivered projects.

Extensively worked with optimization techniques and various active transformations such as Filter, Sorter, Aggregator, Router, SQL, Union, and Joiner for best performance in handling big data sets.

Extensively worked with various passive transformations such as Expression, Lookup, Sequence Generator, Mapplet Input, and Mapplet Output.

Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.

Ensured data consistency is maintained in each data layer.

Coordinated with multiple teams across all environments to make sure no roadblocks impact the project timelines.

Designed data models together with BAs; designed programs for data extraction and loading into the SQL Server database.

Leading a team of 7 members and serving as a primary point of contact for ongoing support.

Techno Manager/Architect

Cox Communication

Jan'19 – Jun'22

Project: Enterprise Data Platform Services
Role: Techno Manager/Architect
Customer: COX Communication
Period: Jan'19 – Jun'22
Environment: Hadoop Scala and Spark, HVR
Domain: Telecom

Description:

COX Communications is the largest private broadband company in America, providing advanced digital video, Internet, telephone, and home security and automation services over its own nationwide IP network. Cox serves more than 6.5 million residences and businesses across 18 states.

Responsibilities:

Managing projects end to end, from initiation through monitoring & control and closure, including planning, estimation & scheduling, updating all stakeholders, integrating change control, controlling baselines, planning risk responses, and contingency planning.

Implementing project plans within preset budgets and deadlines, monitoring and reporting on project progress.

Proven success in leading a team of 37 members and serving as a primary point of contact for ongoing support in areas of responsibility including setting the analytics roadmap and budget for information delivery functions.

Ensuring that the project delivery team structure is adequate and enforces compliance, best practices, approach & direction for the technical aspects of the organization's practice, providing technical leadership to fellow team members.

Involved in migration projects moving data from data warehouses on Oracle/DB2 to Teradata.

Produced health reports for the environment.

Experience working with member, pharmacy, patient, provider, encounter, and claim data.

Experience parsing structured and unstructured files using Informatica PowerCenter.

Implemented Slowly Changing Dimension Type 1 and Type 2 for change data capture (a Type 2 sketch follows this list).

Worked with various lookup caches, such as dynamic cache, static cache, persistent cache, re-cache from database, and shared cache.

Worked on loading data into Hive and Impala system for data lake implementation.

Designed solution to process file handling/movement, file archival, file validation, file processing and file error notification steps using Informatica.

Worked extensively with update strategy transformation for implementing inserts and updates.

Per business requirements, implemented auditing and balancing on the transactional sources so that every record read is either captured in the maintenance tables or written to target tables (a balancing sketch follows this list).

Extensively used the tasks like email task to deliver the generated reports to the mailboxes and command tasks to write post session and pre-session commands.
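The Slowly Changing Dimension item above follows the standard Type 2 pattern: expire the current row, then insert a new version. A minimal T-SQL sketch with hypothetical table and column names (the actual work was done in Informatica mappings):

    -- Step 1: expire current rows whose tracked attributes changed
    UPDATE dim
    SET    dim.effective_end_date = CAST(GETDATE() AS DATE),
           dim.is_current = 'N'
    FROM   dw.customer_dim AS dim
    JOIN   stg.customer_stg AS stg
           ON stg.customer_id = dim.customer_id
    WHERE  dim.is_current = 'Y'
      AND  dim.customer_name <> stg.customer_name;

    -- Step 2: insert a new current version for changed and brand-new keys
    INSERT INTO dw.customer_dim
           (customer_id, customer_name, effective_start_date,
            effective_end_date, is_current)
    SELECT stg.customer_id, stg.customer_name,
           CAST(GETDATE() AS DATE), '9999-12-31', 'Y'
    FROM   stg.customer_stg AS stg
    LEFT JOIN dw.customer_dim AS dim
           ON  dim.customer_id = stg.customer_id
           AND dim.is_current = 'Y'
    WHERE  dim.customer_id IS NULL;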
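And for the auditing and balancing item, a sketch of the kind of reconciliation query such a design supports: every batch must balance, with records read equal to records loaded plus records rejected. Schema and column names here are hypothetical:

    -- Flag any batch where reads do not equal loads plus rejects
    SELECT r.batch_id,
           r.records_read,
           l.records_loaded,
           j.records_rejected,
           r.records_read - (l.records_loaded + j.records_rejected)
               AS out_of_balance
    FROM   etl_audit.batch_reads   AS r
    JOIN   etl_audit.batch_loads   AS l ON l.batch_id = r.batch_id
    JOIN   etl_audit.batch_rejects AS j ON j.batch_id = r.batch_id
    WHERE  r.records_read <> l.records_loaded + j.records_rejected;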

Techno Manager

Baker Hughes General Electric

May'17 – Jan'19

Project: BHGE Application Migration
Role: Techno Manager
Customer: BHGE
Period: May'17 – Jan'19
Environment: Informatica, Greenplum, Talend
Domain: Oil and Gas

Description:

BHGE has a large strategic synergy initiative to move analytics applications from legacy on-premise platforms to the new AWS data lake, hosted in Greenplum. The strategic visualization technologies remain OBIEE and Tableau. The strategic data integration technology is Talend. The purpose of the project is to move from many data lakes to one and to drive a common technology stack that improves operations, reduces costs, and improves uptime for users.

Responsibilities:

Steered development efforts across Informatica, Talend, GPDB, HVR, and reporting.

Created a new label for metadata entry in Talend, checked file formats, and handled code testing and deployment in higher environments.

Managed the entire gamut of operations, from gathering requirements and designing interfaces to reviewing designs and assisting the team with coding, testing, and performance tuning.

Participated in the development and review of business and system requirements to obtain a thorough understanding of business needs in order to deliver accurate solutions

Communicated with internal/external clients to determine specific requirements and expectations; managing client expectations as an indicator of quality.

Using Informatica PowerCenter, created mappings and mapplets to transform the data according to the business rules.

Designed the ETL mappings from sources to operational staging targets, and then to the data warehouse, using PowerCenter Designer.

Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.

Documented Informatica mappings in Excel spreadsheets.

Developed mappings/Reusable Objects/Transformation/Mapplets by using mapping designer, transformation developer and Mapplets designer in Informatica Power Center.

Worked with Informatica power center Designer, Workflow Manager, Workflow Monitor and Repository Manager.

Used Informatica Power Center to migrate the data from different source systems.

Extensively used Autosys for Scheduling and monitoring.

Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 9.1 by using various transformations like Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, Connected and unconnected look up etc.

Extensively used various Performance tuning Techniques to improve the session performance.

Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract data from multiple source systems such as Oracle, SQL Server, and flat files, and loaded it into Oracle.

Involved in creating new table structures and modifying existing tables to fit the existing data model.

Project Lead/Manager

Baker Hughes General Electric

Feb'14 – Mar'16

Project: OFS Billing
Role: Project Lead/Manager
Customer: Baker Hughes
Period: Feb'14 – Mar'16
Environment: Informatica, SAP BODS, Oracle, Vectorwise and Tableau
Domain: Power and Water

Description:

GE was not able to drive operational decisions due to multiple operational definitions, no cross-functional reporting, and the lack of a drill-down facility for its finance users. This implementation of the BI platform NexGen would provide a near-real-time, consistent global view and true analysis of operational and financial data to support business drivers and save manual effort. NexGen provides more accurate reporting based on one single table, as opposed to the earlier Guide, which provided a near-real-time trial balance sheet.

Responsibilities:

Gained understanding of:

o Requirements from the existing ARGO model, and created a lineage document

o The existing business model, and planned for the challenges of the new databases

Communicated with the Code Review/Migration team to identify issues and solutions.

Ensured data consistency is maintained in each data layer.

Assisted Project Managers in establishing plans, risk assessments and milestone deliverables

Designed data models using Oracle Designer; designed programs for data extraction and loading into Oracle database

Developed complex reports using multiple data providers, user defined objects, aggregate aware objects, charts, and synchronized queries

PREVIOUS EXPERIENCE

Track Lead

Target Corporation, Bangalore

Jun’06 – Feb’14

Projects Undertaken:

Project: EGP
Technology Used: Data Stage, DB2, Teradata and Oracle

Project: MBI-Inventory
Technology Used: Data Stage, DB2, Teradata and Oracle

Project: PARS
Technology Used: Data Stage, DB2, Teradata and Oracle

Project: Merchandising Item
Technology Used: Data Stage 7.5, DB2, Oracle and UNIX

Project: DQ SCR Target Corporation
Technology Used: Data Stage 7.5, DB2, Oracle and UNIX

Project: Vendor Compliance Integration
Technology Used: Data Stage 7.5, DB2, Oracle and UNIX

Project: Vendor Report Card
Technology Used: Data Stage 7.5, DB2, Oracle and UNIX

Project: Gift Registry
Technology Used: Data Stage 7.5, DB2, Oracle and UNIX

Junior Engineer

Jazzcon, New Delhi

Mar’03 - Mar’06

PERSONAL DETAILS

Date of Birth: 07/09/1979

Languages Known: English & Hindi

Address: 408 Iron Street, Bloomsburg, Pennsylvania

* Refer to annexure for projects undertaken

ANNEXURE (PROJECTS UNDERTAKEN)

Project: EGP
Environment: Data Stage, DB2, Teradata and Oracle
Role: Track Lead
Period: Jan 2013 – Feb 2014

Description:

This is an IT initiative project whose major focus is to bring all systems onto a uniform platform and database. While maintaining the inventory of Target stores, the EGP program transfers ADW capabilities such as guest scoring, item affinity, and market basket analysis. The current ADW is retained, driving the need for data, application, and report integration into the EDW.

Project: MBI-Inventory
Environment: Data Stage, DB2, Teradata and Oracle
Role: Track Lead
Period: Dec 2011 – Dec 2012

Description:

Inventory data in the legacy UDB environment does not meet all the analytical requirements of both the food and general merchandise businesses. By ensuring inventory data meets these analytical needs, we enable more timely, accurate, and actionable reporting to be delivered back to the business via our parallel partner projects: the 2012 MBI reporting release, Presentation & Space Reporting, and Strategic Planning Funnel reporting. By implementing a more retail-industry-specific model and leveraging our learnings from building the FMA/Unsalable marts, Target will be better positioned to anticipate future market trends and leverage its own inventory data more effectively. Inventory was initially implemented in the legacy UDB environment as part of the Food Merchant Analytics (FMA) project but had a decidedly "food" slant to it. During the course of our re-platforming analysis, the project team identified a number of logical and physical recommendations to improve the architecture and usability of this data for General Merchandise.

Project: PARS (Platform Architecture, Replacement and Support)
Environment: Data Stage, DB2, Teradata and Oracle
Role: Track Lead
Period: Feb 2010 – Dec 2011

Description:

The objective of this project is to migrate the code and warehouse from DB2 to Teradata; it is basically a migration project, as the warehouse is outgrowing DB2. Work involves migrating the code to DataStage 8.5 and converting DB2 SQL into Teradata SQL using Perl programming, as the sketch below illustrates.
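As an illustration of the kind of dialect rewrite such Perl scripts automate (a sketch with hypothetical table and column names, not the project's actual code):

    -- DB2 source pattern:
    --   SELECT item_id, sale_dt FROM sales
    --   ORDER BY sale_dt DESC FETCH FIRST 10 ROWS ONLY;
    -- One equivalent Teradata rewrite:
    SELECT TOP 10 item_id, sale_dt
    FROM   sales
    ORDER BY sale_dt DESC;
    -- Similarly, DB2's "CURRENT DATE" becomes Teradata's CURRENT_DATE.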

Project: Merchandising Item
Environment: Data Stage, Informatica and DB2
Role: Track Lead
Period: Mar 2009 – Jan 2010

Description:

The foundation and dimension layers of Item are created to provide granular information on all Target items at each hierarchy level. SO (Shared Objects) was built with a vision of being the source for all reference information. Shared Objects built conformed reference dimensions, which include item hierarchy, location hierarchy, calendar information, senior buyer, vendor, and team member. SO is a never-ending project and will grow in the number of entities as new reference objects are identified in EBI.

Project: DQ SCR
Environment: Data Stage, Informatica and DB2
Role: Track Lead
Period: Dec 2008 – March 2009

Description:

Handled data quality issues for the implemented project, according to the severity of tickets raised by the business data quality team.

Project: VCI – Vendor Compliance Integration
Environment: Data Stage 7.5, DB2, Oracle and UNIX
Role: Software Engineer
Period: May 2008 – Dec 2008

Description:

The objective of this project is to integrate the source systems Vendor Compliance System (VCS) and Import Claim System (ICL) into the VRC data mart, enabling the ability to see performance calculations and compliance charges in a single view and to do dispute research in one report. The objective is to view the relationships between chargebacks, reversals, and denials, along with a reason code for reversals and denials. The new system needs the ability to view chargebacks that are in progress (not fully paid).

Project: VRC – Vendor Report Card
Environment: Data Stage, Informatica, DB2, Teradata, Oracle and Tandem
Role: Software Engineer
Period: Nov 2006 – Apr 2008

Description:

The objective of this project is to improve data collection, expand measures to align with the needs of business, improve vendor management and improve vendor performance. By fulfilling these objectives Target Corporation wants to increase the overall vendor performance by 1 to 2 percentage points in Fill Rate Original, Fill Rate Revised, On Time Shipment, EDI 856 %Match, Lead-Time Reliability and Automated Receiving Technology.

Project: Gift Registry
Environment: Data Stage 7.5, DB2 and UNIX
Role: Software Engineer
Period: June 2006 – Nov 2006

Description:

The Gift Registry is an application through which a registered user can select his/her gifts for an occasion on the target.com website; the popular registries are baby products, wedding gifts, and holiday gifts. The user adds the gifts he or she wants to receive, and registries are searchable by username.



