Email: ac19yr@r.postjobfree.com Mobile: +*206-***-****

PRAKASS BALAJI KATHIRVEL

HADOOP and MAINFRAME - SOFTWARE ENGINEER - 7.6 years of experience

To take up innovative and challenging tasks in the field of design and programming that contribute to company growth. A Software Engineer with a total of 7.6 years of experience in Information Technology: 5.5 years on Mainframe technology and 2.1 years on Big Data (Hadoop). Worked across various phases of the SDLC, such as design, construction, testing and system testing. Core areas of skill are design and development.

TECHNICAL PROFICIENCY – BIGDATA STACK

OPERATING SYSTEMS : Linux, UNIX, Windows

HADOOP ECOSYSTEM : Hadoop (HDFS and MapReduce), Hive, Oozie

PROCESSING FRAMEWORK : MapReduce and Apache Spark

DATA INGESTION : Sqoop, Flume

OTHERS : Apache NiFi

LANGUAGES/SCRIPT : Shell Script, Core Java

TECHNICAL PROFICIENCY – MAINFRAME

OPERATING SYSTEMS : z/OS, MVS

LANGUAGES : PL/I, COBOL, JCL

DATABASE : IMS/DB, DB2, VSAM

OLTP : IMS/DC

TOOLS : TSO/ISPF, ROSCOE, CHANGEMAN, FILE MANAGER

OTHERS : RPF (ROSCOE Programming Facility)

PROFESSIONAL HISTORY

Capgemini India Private Ltd, Chennai, India Apr 2016 – May 2017

Client: Standard Chartered Bank.

Standard Chartered offers banking services in some of the world's most dynamic markets including Asia, Africa and the Middle East.

PROJECT : Scudee – Data Ingestion Framework

About the Project:

Enterprise Data Management Platform (EDMp) is the enterprise-wide information management platform built to perform the sourcing, staging and reporting of data from the various TP systems.

It is the strategic data management platform designed to deliver high-quality, integrated and timely data to meet regulatory requirements and Risk and Finance management reporting and analytics.

The objective for EDM Data Sourcing is to source comprehensive data across all levels in the TP System Data Model – all tables and attributes available in the TPs are to be fed into Hadoop.

Role & Activities:

Worked on the requirement analysis and design phases.

Analyzed the current system and provided performance-improvement solutions.

Created Avro, ORC and HQL Hive DDLs from the schema XML and primary-key files provided by the CDC system, with proper datatype mapping.

Created NiFi workflows and wrote custom processors to perform file-level and data-level validation and to store data into the Hive Avro staging layer.

Applied a deduplication process in the staging layer.

Wrote Java/Spark programs to process data, filter record types (insert, before update, after update, delete) and move data from staging Avro tables to snapshot ORC tables (a sketch follows this list).

Wrote a reconciliation-report program based on row count and checksum for all tables of a TP system; reports are sent on a daily basis.

Used core Java collections and OOP concepts to process the data returned from HDFS/Hive.
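
The Spark step above can be pictured with a minimal Java sketch. This is an illustration only, not the project's code: the table names (staging.policy_cdc_avro, snapshot.policy_orc) and the columns record_type, op_ts and policy_id are assumptions made for the example.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.expressions.Window;
    import org.apache.spark.sql.expressions.WindowSpec;
    import static org.apache.spark.sql.functions.*;

    public class CdcSnapshotBuilder {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("cdc-snapshot")
                    .enableHiveSupport()
                    .getOrCreate();

            // Staging layer: Avro-backed Hive table populated by the ingestion flow.
            Dataset<Row> staging = spark.table("staging.policy_cdc_avro");

            // Drop 'before update' images; keep inserts, 'after update' images and deletes.
            Dataset<Row> changes = staging.filter(col("record_type").isin("I", "UA", "D"));

            // Keep only the most recent change per business key.
            WindowSpec latestFirst = Window.partitionBy("policy_id").orderBy(col("op_ts").desc());
            Dataset<Row> latest = changes
                    .withColumn("rn", row_number().over(latestFirst))
                    .filter(col("rn").equalTo(1))
                    .drop("rn");

            // Deleted keys fall out of the snapshot; survivors are written as ORC.
            latest.filter(col("record_type").notEqual("D"))
                  .write().mode("overwrite").format("orc")
                  .saveAsTable("snapshot.policy_orc");

            spark.stop();
        }
    }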

Tools and Technologies:

Hadoop Distribution : Hortonworks Data Platform – HDP 2.3.4

Bigdata Stack : Spark, MapReduce, Hive, Oozie, HDF(NIFI)

Languages : Java, Scala

IDE : Eclipse

SCM, DevOps : Git, Stash, Maven, Jenkins

Cognizant Technology Solutions, Chennai, India Nov 2014 – Mar 2016

Client: ANTHEM INC.

Anthem Inc. is an American health insurance company, known prior to 2014 as WellPoint, Inc. It is the largest for-profit managed health care company in the Blue Cross and Blue Shield Association. Anthem offers health care plans, specialty plans such as dental and vision, Medicare & Medicaid plans, and so on.

PROJECT : Risk Analytics

About the Project:

Risk Analytics for Solvency 2

Predicting the insurer's personality so that it can be used as a key term for actuarial calculations, using Hadoop (HDFS, Hive).

Batch processing of actuarial data using Hadoop, so that processing time is reduced from one day to two hours.

POC:

The POC comprises data migration, which includes the modules Data Design, Data Ingestion and Data Processing.

Data Design:

This module involves extracting the necessary fields from the source MySQL database, followed by data cleaning, which detects and corrects corrupted or inaccurate records from the source database.

The cleaned data are transformed into a structured dataset.

Data Ingestion:

Loading the structured healthcare insurer dataset and healthcare plan dataset into new MySQL tables.

Loading the newly created tables from MySQL into HBase using Sqoop.

Created both external and managed Hive tables (version 0.11) for the data stored in HBase (an illustrative DDL sketch follows).
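
A rough Java sketch of the Hive-over-HBase table registration is shown below. It is illustrative only: the HiveServer2 URL, the table name insurer_profile and the column mapping are placeholders, not the project's actual definitions.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class HiveOverHbaseExample {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            // HiveServer2 JDBC endpoint; host and port are placeholders.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://hive-host:10000/default", "hive", "");
                 Statement stmt = conn.createStatement()) {

                // External table over an existing HBase table via the HBase storage handler;
                // ':key' maps the HBase row key, 'info:*' maps columns in the 'info' family.
                stmt.execute(
                    "CREATE EXTERNAL TABLE IF NOT EXISTS insurer_profile (" +
                    "  insurer_id STRING, name STRING, plan_type STRING, risk_score DOUBLE) " +
                    "STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' " +
                    "WITH SERDEPROPERTIES (" +
                    "  'hbase.columns.mapping' = ':key,info:name,info:plan_type,info:risk_score') " +
                    "TBLPROPERTIES ('hbase.table.name' = 'insurer_profile')");
            }
        }
    }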

Data Processing:

Hive queries are performed over the data loaded into HBase. Performance tuning is done using bucketing/partitioning, which greatly reduces query processing time and makes random data access much faster than with MySQL queries.
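
A minimal Java/JDBC sketch of that tuning idea follows, with a hypothetical claims table: partitioning prunes whole directories at query time, and bucketing clusters rows on the join/sampling key.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class HiveTuningExample {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://hive-host:10000/default", "hive", "");
                 Statement stmt = conn.createStatement()) {

                // Partition by year so a query on one year scans only that directory;
                // bucket by insurer_id to speed up joins and sampling on that key.
                stmt.execute(
                    "CREATE TABLE IF NOT EXISTS claims_tuned (" +
                    "  insurer_id STRING, claim_id STRING, amount DOUBLE) " +
                    "PARTITIONED BY (plan_year STRING) " +
                    "CLUSTERED BY (insurer_id) INTO 32 BUCKETS " +
                    "STORED AS ORC");

                // Restricting to one partition reads only a fraction of the data.
                stmt.executeQuery(
                    "SELECT insurer_id, SUM(amount) AS total FROM claims_tuned " +
                    "WHERE plan_year = '2015' GROUP BY insurer_id");
            }
        }
    }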

Semi-structured data, such as log datasets in HDFS, are processed using Pig scripts.

Data ingestion and data processing operations are automated and scheduled using the Oozie scheduler.

Role & Activities:

Worked on a live multi-node Hadoop cluster running Apache Hadoop 1.2.1.

Worked with highly structured and semi-structured data of 5 to 6 TB in size (15 to 18 TB with a replication factor of 3).

Extracted the data from MySQL into HDFS using Sqoop.

Created and worked with Sqoop (version 1.4.3) jobs with incremental load to populate Hive external tables.

Extensive experience in writing Pig (version 0.13) scripts to transform raw data from several data sources into baseline data.

Developed Hive (version 0.11) scripts for end user / analyst requirements to perform ad hoc analysis.

Very good understanding of partitioning and bucketing concepts in Hive; designed both managed and external tables in Hive to optimize performance.

Solved performance issues in Hive with an understanding of joins, grouping and aggregation and how they translate to MapReduce jobs (a toy illustration follows this list).

Experience in using SequenceFile and RCFile file formats.

Developed Oozie workflow for scheduling and orchestrating the ETL process.

Good working knowledge of Amazon Web Service components like EC2.

Very good experience with both MapReduce 1 (Job Tracker) and MapReduce 2 (YARN) setups

Very good experience in monitoring and managing the Hadoop cluster.

Used Java collections and OOP concepts to process the data returned from HDFS.
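
To illustrate how such Hive aggregations become MapReduce jobs, here is a toy Java example (not project code, with a made-up "insurer_id,amount" input format): the mapper emits key/value pairs, the shuffle groups them by key, and the reducer aggregates each group, which is essentially what a GROUP BY with SUM compiles to.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.DoubleWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class GroupBySum {

        // Input lines are assumed to be "insurer_id,amount" (hypothetical format).
        public static class AmountMapper
                extends Mapper<LongWritable, Text, Text, DoubleWritable> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",");
                if (fields.length == 2) {
                    context.write(new Text(fields[0]),
                                  new DoubleWritable(Double.parseDouble(fields[1])));
                }
            }
        }

        // The shuffle delivers all amounts for one insurer_id to a single reduce call.
        public static class SumReducer
                extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
            @Override
            protected void reduce(Text key, Iterable<DoubleWritable> values, Context context)
                    throws IOException, InterruptedException {
                double total = 0;
                for (DoubleWritable v : values) {
                    total += v.get();
                }
                context.write(key, new DoubleWritable(total));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "group-by-sum");
            job.setJarByClass(GroupBySum.class);
            job.setMapperClass(AmountMapper.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(DoubleWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }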

Tools and Technologies:

Hadoop Distribution : Hortonworks Data Platform – HDP 2.3.4

Bigdata Stack : MapReduce, Hive, HBase, Sqoop, Pig, Oozie.

Languages : Java

IBM INDIA PVT LIMITED, Chennai, India Feb 2013 – Oct 2014

Client: Allianz ADM Multisourcing, Life Insurance Development, Germany

Allianz Germany provides both life insurance and general insurance. Servicing and maintenance take place across the applications DI, HD, POL, VAR, BIPER and VTAE.

About the Project:

My area of work covers the applications POL, VAR, BIPER and VTAE. POL and VAR are mainly for letter creation to customers, BIPER involves pension policies, and VTAE consists of condition-check routine modules that display an error message to the business department if any condition check fails.

Role & Activities:

Developed new modules and made amendments.

Prepared test data (i.e., test cases).

Performed ATS testing (first-level unit testing).

Performed BTS online testing whenever a code change impacted an online screen.

Used ALF to move code to the next stage for system testing and user testing.

Fixed and restaged any bugs identified during system testing and user testing.

Treated production failures, if any, as high priority and supported them immediately.

ONSHORE – WORKING IN CLIENT’S WORK PLACE: (Sep 2013 to Nov 2013)

Worked at the client's location in Stuttgart, Germany, from September 2013 to November 2013 for knowledge transfer from the client on the VTAE and DI applications, and brought that knowledge back to the offshore team.

STERIA INDIA LIMITED, Chennai, India Sep 2009 – Jan 2013

Client: Zurich Financial Services, Life Insurance Development, UK

About the Project:

Full-cycle insurance product development, servicing and maintenance take place in the Nova & Phase3 (life insurance) systems and the Pulsar (pensions) system.

Role & Activities:

Amended online screens to include the benefits.

Developed new modules to integrate online screen changes with the database.

Developed new modules to update and validate new changes entered by users on the online screens.

Set up test data by loading new policies into the online database, and performed unit testing and system testing.

Provided code and data fixes during user acceptance testing.

ONSHORE – WORKING IN CLIENT’S WORK PLACE: (Jan 2012 to Sept 2012)

Worked at the client's location in Swindon, United Kingdom, supporting ongoing projects through a new team called Technical Co-ordination, which supports system testing, user acceptance testing, integration testing and, finally, implementation support.

The work involves coordinating various ongoing projects across systems (new business, life insurance and pensions), placing the changed loads into a system test library and supporting the system test run.

The critical part of the testing involves supporting UAT, which covers end-to-end testing: I have supported everything from loading a new policy into an IMS database and transferring it into the life systems, through to producing the reports/letters for the customers.

Working in the middle office gave me ample opportunity to interact with clients, deeper exposure to the insurance business, and an understanding of how to judge the criticality of situations and act diplomatically and proactively.

As a developer, I have experience in online screen changes (both front end and back end), designed complex batch PL/I and COBOL programs fetching data from IMS and DB2 databases, and created JCLs to test those programs. As a coordinator, I have experience supporting all forms of technical and business testing, from system test and UAT to integration and implementation support.

EDUCATION

Completed a B.E. in Electronics and Communication Engineering at Mahalingam College of Engineering and Technology, Pollachi, affiliated to Anna University, with an aggregate of 78% (2009).

PERSONAL DETAILS

Resident Address : 400 Boren Ave N, #831, Seattle, WA - 98109

Father’s Name : Mr. I.Kathirvel

Mother’s Name : Mrs. K. Kasthuri

Date of Birth : 29th October 1987

Languages Known : Tamil, English

Gender : Male

Marital Status : Married

Nationality : Indian

I declare that all the details given above are genuine to the best of my knowledge and belief.

Date: Yours Sincerely,

Place:


