
Data Analyst with ETL (Informatica) and Mainframe Experience

Location:
Bayonne, NJ
Posted:
May 09, 2021

Resume:

*** ************ ***, *** ***,

Bayonne, NJ - ****2

Email id: adl90l@r.postjobfree.com

Cell: +1-952-***-****

Sandeep Balar

Profile

12+ years of diverse experience in the IT industry, with emphasis on ETL (Informatica) development, enhancements, data analysis, mainframe (COBOL) development, and production support.

Results-driven data integration specialist with extensive experience developing data-driven applications built on data integration (ETL and ELT) and data warehousing solutions.

Solid experience managing team resources, partnering with business groups, establishing customer relationships, and mentoring junior team members.

Worked for a diverse set of North America-based clients across insurance (P&C), telecom, healthcare, energy and utilities, logistics, and retail.

Proficient in analyzing business requirements and translating them into technical requirements and architecture; well versed in analysis, coding, and unit testing.

Strong database and programming skills, with extensive experience in data analysis, data cleansing, data validation, and data reporting.

Experience in gap analysis, data mapping, writing functional specifications, design, development, implementation, testing, quality adherence, documentation, troubleshooting, and customer support.

Proficient in writing complex SQL queries and stored procedures.

Highly skilled and experienced in Agile development processes for diverse requirements.

Developed and cultivated strong relationships with both new and existing clients through effective communication and interpersonal skills.

Skill Set

Cloud Services: Microsoft Azure

Data Warehousing: Informatica PowerCenter 9.x/10.x, Informatica Intelligent Cloud Services (IICS)

Tools: Microsoft SSMS (SQL Server Management Studio), SQL Developer (Oracle), IBM Data Studio (DB2), WinSCP, PuTTY, WinSQL (DB2), File-AID, SPUFI, QMF, NDM, File Manager

Languages: COBOL, JCL, CICS, Easytrieve, UNIX shell scripting

Operating Systems: Linux, Windows, MVS, z/OS

Databases: Azure SQL DW (Synapse), SQL Server, Oracle, DB2, IMS

Change Management: GitHub, CI/CD, ChangeMan, Panvalet, Librarian

Scheduler: AutoSys, CA-7, Control-M

Debugger: Xpediter, InterTest

Ticketing tools: JIRA, BMC iServices, BMC Remedy

Request Management: ServiceNow

Methodology: Agile, Waterfall

Certifications

Microsoft Certified: Azure Fundamentals

AWS Certified Cloud Practitioner

IBM Design Thinking Practitioner

Python for Data Science – IBM

Hadoop Foundations

Education

Bachelor’s in Electrical Engineering (Honors) - 2008, Jai Narain Vyas University, Jodhpur, Rajasthan, India

Professional Experience

Career History: 09/2008 - To date

IBM India Private Limited, United States of America

Current Role/Designation - Data Engineer

• Currently working as a Data Engineer in the Data & Analytics domain, managing and carrying out ETL activities on CHUBB claims data for a property and casualty insurance client, using Informatica Intelligent Cloud Services (IICS) on Azure and the Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

• Worked as an ETL developer/analyst within an Agile Scrum team, alongside a product owner, business analyst, tester, and enterprise architect, to deliver the integration pipeline merging legacy CHUBB and legacy ACE claims in the CDW (Claims Data Warehouse) application for CHUBB.

• Created and delivered the North America Claim Connect financials flow to extract data from the Claim Connect application (Duck Creek source), ingest it into the stage layer, transform it to the CBD (Common Business Definition) layout, and load it into the CBD conform layer.

• Managed the successful delivery of development projects across the full application lifecycle.

• Troubleshot databases, workflows, mappings, sources, and targets to find bottlenecks and improve performance.

• Gathered business and technical requirements, wrote BRDs, and prepared FSDs (functional specification documents). Prepared technical design documents for each enhancement and maintenance release.

• Delivered innovative, high-quality solutions in response to varying business requirements and improved application performance, cost, and resource utilization.

• Provided immediate fixes for day-to-day production issues, assessed existing online and batch processes across lines of business to identify gaps, and recommended online and batch performance improvements.

• Created project plans aligned with client expectations, prepared high-level solution documents, and reviewed detailed design documents, test plans, and test cases, along with results from unit, system, and regression testing.

• Led the technical team in resolving critical incidents, emergency production defects, and user service requests to keep the application available and running smoothly 24x7, and guided the team through technical problems encountered during the development cycle.

Assignment History

Client: CHUBB Insurance (formerly ACE)

Duration: 23 months (06/2019 - To date)

Project 1: CBD (Common Business Definition)

Project Description: CBD (Common Business Definition) is a strategic analytics and reporting solution for CHUBB North America claims data and a critical part of CHUBB's larger corporate data strategy. The application acts as the single source of truth: all business-critical applications source the data they need for reporting from the CBD layer. The project's scope is to build data on a unified cloud platform from various data sources using a common definition model. It involves the migration, design, build, transformation, and maintenance of high volumes of application data from multiple on-premise claims source systems to a cloud application for CHUBB.

Position: Data Engineer

Contribution:

• Created IICS (Informatica Intelligent Cloud Services) data scrubs using data synchronization tasks to improve CBD data quality, helping modelers run their Python models for more effective ML results.

• Wrote complex queries in Microsoft Azure SQL DW (Synapse Analytics) to retrieve data from the stage layer and populate the conform layer (a sketch of this pattern follows this list).

• Analyzed the data models of claim sources and performed data analysis to efficiently resolve production data quality and data reconciliation issues raised by business users.

• Created PowerCenter workflows, sessions, and mappings for the extraction part of the data flow architecture.

• Created IICS taskflows, mapping tasks, and mappings to deliver the North America Claim Connect financials flow: ingesting extracted data into the stage layer, transforming it to the CBD (Common Business Definition) layout, and loading it into the CBD conform layer.

• Created project plans aligned with client expectations and managed the team's work assignments.

• Provided technical guidance to team members on their day-to-day tasks.

• Experienced in all aspects of data warehousing, from architecture and design to implementing and managing end-to-end release delivery, application development, testing, and maintenance with the project team and the client.

• Applied data engineering skills (Informatica PowerCenter, IICS, Azure SQL DW) to deliver a key milestone and achieve a high NPS score from the client, successfully completing the Performer Flow development and building a highly reusable PowerCenter solution for retrieving detokenized values.

• Tuned existing CBD data flows to eliminate a large number of performance bottlenecks and reduce the overall execution time of the nightly batch.
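
For illustration, the following is a minimal sketch of the stage-to-conform query pattern described above, written for a Synapse dedicated SQL pool. All object and column names (stg.claim_financials, cbd.claim_financials_conform, and so on) are hypothetical placeholders, not the actual CBD schema.

-- Minimal sketch of a stage-to-conform load in Azure Synapse.
-- All table and column names below are hypothetical placeholders.
INSERT INTO cbd.claim_financials_conform
    (claim_id, source_system, loss_amount_usd, transaction_date, load_ts)
SELECT
    s.claim_id,
    s.source_system,
    CAST(s.loss_amount AS DECIMAL(18, 2)) AS loss_amount_usd,  -- normalize amounts to a common precision
    CAST(s.txn_dt AS DATE)                AS transaction_date, -- normalize dates to a common type
    SYSUTCDATETIME()                      AS load_ts           -- audit column for this load
FROM (
    SELECT
        stg.*,
        ROW_NUMBER() OVER (
            PARTITION BY stg.claim_id, stg.source_system
            ORDER BY stg.txn_dt DESC
        ) AS rn
    FROM stg.claim_financials AS stg
) AS s
WHERE s.rn = 1;  -- keep only the most recent staged row per claim and source

Deduplicating during the stage-to-conform step keeps the conform layer consistent when the same claim arrives from a source system more than once.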

Client: CHUBB Insurance (formerly ACE)

Duration: 18 months (01/2018 - 06/2019)

Project 2: CVAC Integration Project

Project Description: CVAC is the bridge that connected legacy CHUBB and legacy ACE claims data. After ACE and CHUBB merged, they decided to integrate the common applications of the two legacy organizations into single applications, and they asked IBM to help integrate CVAC data into CDW (Claims Data Warehouse). Because IBM already supported CDW and the migration relied on various IBM tools, IBM was the natural choice to help CHUBB with this integration.

Position: ETL Lead + Analyst

Contribution:

• Implemented the CVAC integration pipeline to begin the flow of CVAC claims into CDW (Claims Data Warehouse), in full conformance with CHUBB architecture and standards.

• Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.

• Developed ETL extract jobs using Informatica PowerCenter 10.x to extract data from DB2 tables in Claim Vision and load it into SQL Server tables.

• Used transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Union, and SQL Transformation to develop robust mappings in the Informatica Designer.

• Performed data analysis on critical integration issues to efficiently manage and resolve client queries and system support requests.

• Designed and developed interfaces to legacy systems.

• Involved in performance tuning at the source, target, mapping, session, and system levels.

• Developed COBOL-DB2 modules to load the data extracted by the ETL system into the CDW database.

• Designed and set up batch jobs across different LOBs in the claims area.

• Created high-level and detailed-level designs for business requirements, including the high-level technical solution documents used to program them.

Client: AT&T

Duration: 24 months (01/2016 - 12/2017)

Project: ACMS (Access Capacity Management System)

Project Description: IBM was engaged to build and support AT&T's critical ordering and inventory application for access circuits and facilities (ACMS). ACMS Specials handles private or dedicated lines for customers and sends Access Service Requests (ASRs) to the access providers. ASR implementations occurred every six months, and ACMS had Flex releases carried out throughout the year.

Position: ETL + Sr. Application Developer

Contribution:

• Responsible for analysis, coding, and unit testing of any issues occurring in the ACMS (Access Capacity Management System) application. Lead application developer for AT&T mainframe applications; later switched to ETL development for a target system of ACMS.

• Led the technical team in developing end-to-end flows and preparing test case implications, which required thorough knowledge of system functionality.

• Oversaw implementation activities, provided value additions and proposals, proactively identified potential issues, and provided permanent fixes.

• Responsible for production data fixes, projects, enhancements, and production support.

• Responsible for effective communication between the project team and the customer; provided day-to-day direction to the project team and regular project status to the customer.

• Analyzed production defects and change requirements and provided effective solutions.

• Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables (a sketch of the Type 2 pattern follows this list).
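
The SCD mappings above were built in Informatica; for illustration, here is a minimal SQL sketch of the equivalent Type 2 pattern (expire the old row, then insert the new version). The dim_customer and stg_customer tables and their columns are hypothetical placeholders, not the actual ACMS schema.

-- Minimal sketch of a Type 2 SCD load; all object names are hypothetical.
-- Step 1: expire the current dimension row when a tracked attribute has changed.
UPDATE d
SET    d.is_current   = 0,
       d.effective_to = CAST(GETDATE() AS DATE)
FROM   dim_customer AS d
JOIN   stg_customer AS s
       ON s.customer_id = d.customer_id
WHERE  d.is_current = 1
  AND (d.customer_name <> s.customer_name OR d.address <> s.address);

-- Step 2: insert a new current row for new customers and for the rows just expired.
INSERT INTO dim_customer
    (customer_id, customer_name, address, effective_from, effective_to, is_current)
SELECT s.customer_id, s.customer_name, s.address,
       CAST(GETDATE() AS DATE), NULL, 1
FROM   stg_customer AS s
LEFT JOIN dim_customer AS d
       ON d.customer_id = s.customer_id
      AND d.is_current  = 1
WHERE  d.customer_id IS NULL;  -- no current row: the customer is new or was just expired

A Type 1 mapping would instead overwrite the changed attributes in place with a plain UPDATE, keeping no history.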

Client: Express Scripts, USA

Duration: 38 months (11/2012 - 12/2015)

Project: Eligibility and DCRS/FRS

Project Description: IBM was engaged to support many mission-critical applications for the healthcare giant, among them the Eligibility and Drug Coverage Rule Station (DCRS) applications. The Eligibility application determines the patient eligibility needed for claim adjudication, and DCRS provides the coverage rules for formulary and non-formulary drugs.

Position: Senior Application Developer

Contribution:

• Responsible for analyzing and preparing low- and high-level designs and program specifications for enhancements to the Eligibility and DCRS applications at Express Scripts.

• Delivered high-quality solutions to the client in response to varying business needs, using in-depth functional and technical knowledge of IBM mainframes.

• Designed the structural framework of the application for enhancements, as per the business requirements.

• Responsible for debugging, analyzing, and resolving issues raised during system testing and quality analysis testing of the enhancements.

• Prepared test plan specifications and conducted unit testing.

• Responsible for effort estimation for self and team, and for communicating it to the customer.

• Provided permanent fixes to recurring production issues to keep the application running smoothly.

• Conducted KT (knowledge transfer) sessions for other team members.

Client: Oncor Electric, USA

Duration: 40 months (08/2009 - 11/2012)

Project: LCIS (Legacy Customer Information System)

Project Description: Legacy Customer Information System (LCIS) is the most critical backend application for Oncor's business and has served customer requirements from the beginning. It handles business functions that primarily deal with service orders, meter reading, billing, cash, collections, revenue, and the analysis master.

LCIS interfaces with other applications to meet Oncor's business demands.

Position: Senior Application Developer

Contribution:

• Responsible for analysis, coding, and unit testing, and for guiding the team through technical problems encountered in their day-to-day tasks on ONCOR Electric's CIS (Customer Information System) application.

• Responsible for effective communication between the project team and the customer; provided day-to-day direction to the project team and regular project status to the customer.

• Provided 24x7 production support and maintenance for the LCIS (Legacy Customer Information System) application, the most critical and challenging application in ONCOR's business.

• Resolved emergency and normal defects tagged under stipulated releases, creating positive and negative test cases to validate the changes.

• Established quality procedures for the team and continuously monitored and audited them to ensure the team met its quality goals.

• Resolved service requests raised by ONCOR users.

• Conducted KT (knowledge transfer) sessions for other team members.

Client: United Parcel Service

Duration: 3 months (05/2009 - 07/2009)

Project: CAMS (Customer Automation Management System)

Project Description: United Parcel Service (UPS) is the world's largest package delivery company and a leading provider of transportation and logistics services, delivering more than 15 million packages a day to 6.1 million customers in over 200 countries and territories. UPS is a $49.7 billion global company with one of the most recognized and admired brands in the world, headquartered in Sandy Springs, Georgia, USA.

Position: Developer

Contribution:

• Responsible for development work, heavily involved in the coding and unit testing phases.

• Gathered requirements from the client and prepared efficient solutions to customer problems.

• Single point of contact (SPOC) for the CAMS (Customer Automation Management System) application, which stores information about distributed client systems.

• Prepared unit test cases and documented every positive and negative test case.

• Ensured that development was performed as per requirements.

• Provided project status updates to project managers, developers, business analysts, and clients.

• Developed implementation and test plans, coordinating the work with clients.

Client: A.P. MOLLER MAERSK Group

Duration: 6 months (12/2008 - 05/2009)

Project: DSG Project

Project Description: IBM was engaged to handle responsibilities such as defect resolution and support, AMS, and the migration of data from legacy applications to SAP. The project's main goal was to migrate data from the legacy application to the SAP platform.

Position: Developer

Contribution:

• Worked as an application developer focused mainly on development activities (primarily Phase 4 of the SDLC, i.e., coding and unit testing) to migrate data from legacy systems to SAP.

• Responsible for resolving defects tagged under stipulated releases and creating positive and negative test cases to validate the changes.

• SPOC for the EDBADM application, which handles the DSG payment system.

• Responsible for production job abends and their resolution within SLA (service level agreement).


