
Vinoth Kanthan

Hadoop /Spark Developer

Phone: 732-***-****

Email: ac6dsw@r.postjobfree.com

LinkedIn: https://www.linkedin.com/in/vinoth-kanthan-74365814/

Over 11 years of experience, including 3 years on the Big Data Hadoop ecosystem, with hands-on project experience in the healthcare domain.

Cloudera Certified Spark and Hadoop Developer

Extensive experience with SQL & PL/SQL, Hadoop and Linux shell scripting

Expertise in developing Spark applications using Scala

Exposure to data quality analysis, data migration and data validation.

Migrated code from Hive to Apache Spark and Scala using Spark SQL and RDDs (a representative sketch follows this summary)

Working experience with Spark, Spark SQL, RDDs, DataFrames and Datasets

Hands-on experience importing and exporting data with Sqoop between HDFS and relational database systems.

Good experience with file-handling operations on Unix.

Professionally adaptable, getting on board quickly with new technologies and assignments; able to work independently or in a team setting to meet deadlines while handling multiple assignments.

Extensively involved in requirements analysis, code development and bug fixing.

Expertise in Hive Query Language and debugging Hive issues.

Expertise in Sqoop and Flume

Extremely motivated, with good interpersonal skills and the ability to work under strict deadlines

Accomplished facilitator in understanding

Proficiency in the software development life cycle (SDLC), including Agile and Waterfall models

Close to 10 years of healthcare domain experience, with detailed knowledge of Membership Enrollment, the Claims Adjudication process and provider systems

Coordinating with the offshore team on technical requirements, design, test plans, test results, pre-implementation activities and deployment
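
Illustrative sketch (hypothetical; not code from any client engagement): a minimal Spark 1.6-era Scala example of the Hive-to-Spark SQL migration pattern referenced above. The table and column names (provider_claims, provider_id, paid_amount) are invented for illustration.

// Minimal sketch of moving a HiveQL aggregation onto Spark 1.6 with Scala.
// All table and column names are hypothetical.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.functions.sum

object HiveToSparkSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HiveToSparkSketch"))
    val hc = new HiveContext(sc) // gives Spark SQL access to the Hive metastore

    // 1) The original HiveQL, run unchanged through Spark SQL:
    val totalsViaSql = hc.sql(
      """SELECT provider_id, SUM(paid_amount) AS total_paid
        |FROM provider_claims
        |GROUP BY provider_id""".stripMargin)
    totalsViaSql.show(10)

    // 2) The same aggregation expressed with the DataFrame API:
    val totalsViaDf = hc.table("provider_claims")
      .groupBy("provider_id")
      .agg(sum("paid_amount").as("total_paid"))

    // 3) Dropping to the RDD level when row-by-row logic is needed
    //    (assumes paid_amount is a DOUBLE column):
    val totalsRdd = totalsViaDf.rdd.map(r => (r.getString(0), r.getDouble(1)))

    totalsRdd.take(10).foreach(println)
    sc.stop()
  }
}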

Bachelor of Technology, Industrial Engineering, National Institute of Technology

Jalandhar, Punjab, India June 2002 to June 2006

Big data ecosystem: Hadoop, MapReduce, HDFS, Hive, Flume, Kafka, Sqoop, HBase, ZooKeeper, Oozie, Apache Spark, Spark SQL, Spark Streaming, RDD, DataFrames, JSON, Avro

Programming/scripting languages: Scala, Linux shell scripts, COBOL, JCL, XML

Databases: IBM-DB2, MySQL, IDMS, IMS-DB

Operating systems: Unix, CentOS 6.7, z/OS, MVS, Windows […]

Development tools: Sublime Text 2, Eclipse, avro-tools

Collaboration Tools: JIRA, Rational Team Concert, Rational ClearQuest, SharePoint

Hadoop/Spark Developer - IBM (Client Anthem Inc)

Richmond, VA October 2015 to Present

Project: Enterprise provider data system

The aim of this project is to migrate provider data from all the legacy provider platforms into the Hadoop cluster.

Responsibilities:

Involved in developing the roadmap for migration from the legacy systems to the Hadoop cluster.

Created, validated and maintained scripts to manually load data using Sqoop.

Loaded and transformed large sets of structured, semi-structured and unstructured data coming from various downstream systems.

Migrated data between RDBMS and HDFS/Hive with Sqoop.

Created, validated and maintained scripts to extract and transform data from MySQL into flat files and JSON format.

Created Oozie workflows and coordinators to automate Sqoop jobs on weekly and monthly schedules.

Worked on reading multiple data formats on HDFS using Apache Spark (see the multi-format ingest sketch below).

Wrote Scala scripts for Spark to inspect, clean, load and transform large sets of structured and semi-structured imported data.

Involved in migrating code from Hive to Apache Spark and Scala using Spark SQL and RDDs.

Developed Spark applications with Scala and Spark SQL for testing and processing of data.

Developed, validated and maintained HiveQL queries.

Designed Hive tables to load data to and from external HDFS datasets.

Hands-on experience with partitioning and bucketing concepts in Hive; designed both managed and external tables for optimized performance (see the Hive table sketch below).

Managed and scheduled jobs on the Hadoop cluster.
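
Multi-format ingest sketch (hypothetical; paths, schemas and column names are invented, and the Avro read assumes the Databricks spark-avro package is on the classpath): a minimal Spark 1.6 / Scala example of reading JSON and Avro files landed on HDFS, applying light cleanup, and joining them for downstream consumers.

// Reads provider JSON and Avro extracts from HDFS, cleans them and joins on provider_id.
// Everything here (paths, columns) is illustrative only.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.functions.{col, trim, upper}

object ProviderIngestSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ProviderIngestSketch"))
    val hc = new HiveContext(sc)

    // JSON-lines extracts from a legacy provider platform
    val providers = hc.read.json("hdfs:///data/raw/providers/json/")

    // Avro files written by an upstream ingest job (e.g. a Sqoop/Flume landing zone);
    // requires the spark-avro package on the classpath
    val addresses = hc.read
      .format("com.databricks.spark.avro")
      .load("hdfs:///data/raw/provider_addresses/avro/")

    // Basic data-quality cleanup before joining the two sources
    val cleaned = providers
      .filter(col("provider_id").isNotNull)
      .withColumn("npi", trim(col("npi")))
      .withColumn("state", upper(col("state")))

    val conformed = cleaned.join(addresses, Seq("provider_id"), "left_outer")

    // Persist the conformed dataset for downstream Hive consumers
    conformed.write.mode("overwrite").parquet("hdfs:///data/conformed/providers/")
    sc.stop()
  }
}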
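
Hive table sketch (hypothetical database, table, column names and paths), showing the managed/external and partitioning/bucketing pattern described above, issued through Spark SQL's HiveContext so the example stays in Scala.

// Creates a managed staging table and a partitioned, bucketed external table,
// then loads the external table with a dynamic-partition insert.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object ProviderHiveTablesSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ProviderHiveTablesSketch"))
    val hc = new HiveContext(sc)

    hc.sql("CREATE DATABASE IF NOT EXISTS provider_db")

    // Managed staging table: Hive owns both the metadata and the data files.
    hc.sql(
      """CREATE TABLE IF NOT EXISTS provider_db.providers_staging (
        |  provider_id STRING,
        |  npi         STRING,
        |  name        STRING,
        |  state       STRING
        |)
        |STORED AS PARQUET""".stripMargin)

    // External table: dropping it leaves the underlying HDFS files in place.
    // Partitioned by state for partition pruning, bucketed on provider_id for joins.
    hc.sql(
      """CREATE EXTERNAL TABLE IF NOT EXISTS provider_db.providers_ext (
        |  provider_id STRING,
        |  npi         STRING,
        |  name        STRING
        |)
        |PARTITIONED BY (state STRING)
        |CLUSTERED BY (provider_id) INTO 16 BUCKETS
        |STORED AS PARQUET
        |LOCATION 'hdfs:///data/warehouse/providers_ext/'""".stripMargin)

    // Dynamic-partition load. Note: Spark does not write Hive-compatible buckets,
    // so bucketed loads are normally run through Hive with hive.enforce.bucketing=true;
    // this insert is shown for illustration only.
    hc.sql("SET hive.exec.dynamic.partition=true")
    hc.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
    hc.sql(
      """INSERT OVERWRITE TABLE provider_db.providers_ext PARTITION (state)
        |SELECT provider_id, npi, name, state
        |FROM provider_db.providers_staging""".stripMargin)

    sc.stop()
  }
}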

Environment: Hadoop, HDFS, Apache Spark 1.6, Spark SQL, Unix, Hive, Sqoop, Flume, Scala, Oozie, DB2 SQL, Kafka

Technical Team lead and System Analyst - IBM (Client Anthem Inc)

Albany, NY and Bangalore, India June 2010 to October 2015

Project: CHIPS VA provider system and CS90 NY Provider System Development and enhancement.

Responsibilities:

As Delivery Lead, managed and coordinated multiple systems/teams involved to ensure successful delivery of the efforts as per the Business Requirements

Ensure on-time delivery for the SDLC phases for projects and SSCRs (Small System Change Requests)

Perform weekly work forecast, LOE reviews and related negotiations

Represent the CHIPS VA provider and CS90 NY provider teams in the weekly Client Portfolio and Stake Management calls

Identify and raise risks, issues and dependencies to the stakeholders and come up with mitigation plans in a timely manner

Review the deployment plan and document implementation strategies with Anthem management and SMEs; discuss risks, issues and dependencies in the implementation procedures for deployments across applications

Prepare and review the Level of Estimate (LOE) in hours for each effort executed at Anthem; based on industry standards, the LOE is prepared with a clear breakup for each phase of the project and sometimes at the activity level

Arrange internal JAD sessions and come up with the best possible design approach for the many projects and SSCRs

Interact with SMEs, business and application owners/end users to determine application requirements and user problems, and participate in architecture and design activities

SPOC for the CHIPS VA provider and CS90 NY provider teams for release management activities for the Anthem CHIPS Virginia and CS90 New York systems.

Owner of the Procedures and Process documentation for the Anthem account and its maintenance

Environment: z/OS, MVS, COBOL, JCL/PROC, VSAM, COBOL-XML, DB2 SQL, SPUFI, IMS, REXX, CLIST, VB scripting, ClearQuest, SharePoint

Batch Process Lead & Application developer - Dell Services (Client BCBS - Rhode Island)

Noida, India August 2006 to June 2010

Project: Application development and Maintenance of Healthcare Insurance Claims Processing System.

Responsibilities:

Studying the different systems that exist in healthcare insurance and their functional impact.

Understanding the client requirements and preparing technical documents.

Working and implementing the code in the mainframe environment.

Understanding the interfaces for claims software and PLASM language.

Maintenance, enhancement and development of programs coded in COBOL as per the requirements.

Working on the Incidents primarily concerning claims (maintenance support system).

Supporting the high-priority live batch flow system.

Environment: z/OS, MVS, COBOL, IMS-DB/DC, IDMS, JCL/PROC, REXX, CLIST.

Cloudera Certified Spark and Hadoop Developer (CCA175)

May 2018 – May 2020

SAFe® 4 Practitioner (Scaled Agile Framework)

April 2015 to Present


