
Data Project

Location:
Bengaluru, KA, 560001, India
Posted:
September 06, 2016


Saravanan P

acwhqc@r.postjobfree.com

Phone: +91-988*******

Professional Summary

11+ years of expertise in the Media/Entertainment, Retail and Insurance IT domains.

* **** ** ********** ** leading a Hadoop team in designing architecture for business use cases in Insurance customer segmentation, surveys and lead-generation activities.

Business Intelligence professional with progressive industry experience in designing and implementing critical ETL and data warehouse migration BI projects.

Micro, Small and Medium Enterprises (MSME – Govt. of India) certified Hadoop Developer and Big Data Consultant.

Good hands-on development experience with parallel processing and distributed computing in the Hadoop ecosystem: HDFS, Pig, Hive, HBase, Sqoop, Flume, Oozie, ZooKeeper, etc.

Experienced in loading and transforming large sets of structured, semi-structured and unstructured data using Hadoop ecosystem components.

Extensive experience across all stages of the BI/DWH development/migration life cycle, including business requirement analysis, design, data mapping, development, testing, implementation and post-production support.

Strong understanding of data modeling (relational, dimensional, star and snowflake schemas) and data analysis, with data warehouse implementations on Windows and UNIX.

Extensive work experience in ETL analysis, design, development, testing and implementation in Oracle Data Integrator 11g against Netezza and Oracle Exadata, including performance tuning and ETL process streamlining.

Strong proficiency in programming with SQL, PL/SQL and UNIX shell, including performance tuning, query/code optimization and streamlining.

Experience in software installation of ODI and in the configuration management tools PVCS/CVS.

Experienced in developing custom tools in Java (FitNesse fixtures) for automating the load process and validating target tables in regression tests.

Reputation for strong organizational skills, excellent communication, dedicated teamwork, attention to detail, and the ability to work under pressure to balance competing priorities and meet deadlines.

Extensive experience in coordinating with off-shore Team and onsite projects in a global delivery model.
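The custom Java tooling mentioned above (FitNesse fixtures for validating target tables in regression tests) centers on comparing expected rows against what actually landed in the target. A minimal sketch of that core check follows; the class name and the simplified string-row representation are illustrative assumptions, and a real fixture would fetch the actual rows over JDBC.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the core check a FitNesse-style fixture might run:
// compare expected rows against rows read from a target table and report
// mismatches. Rows are simplified to strings; a real fixture would pull
// the actual rows over JDBC after the load completes.
public class TargetTableCheck {

    // Returns human-readable mismatch messages; an empty list means the table passed.
    public static List<String> diffRows(List<String> expected, List<String> actual) {
        List<String> mismatches = new ArrayList<>();
        int n = Math.max(expected.size(), actual.size());
        for (int i = 0; i < n; i++) {
            String e = i < expected.size() ? expected.get(i) : "<missing>";
            String a = i < actual.size() ? actual.get(i) : "<missing>";
            if (!e.equals(a)) {
                mismatches.add("row " + i + ": expected [" + e + "] but found [" + a + "]");
            }
        }
        return mismatches;
    }
}
```

Returning messages rather than a boolean lets the fixture surface every failing row in one regression run instead of stopping at the first mismatch.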

Skills & Certifications

Programming languages & frameworks

Java, JavaScript, Hadoop, Hive, Spark, Pig, HBase, Flume

Platforms

UNIX (Solaris, Linux), Windows 9x/XP/NT

Database

Oracle 11g, Netezza NPA, Oracle Exadata, HDFS, HBase

Tools

Oracle Data Integrator 11g, OBIEE 11g, ERwin Data Modeler, Informatica PowerCenter 8.0/7.1, DbVisualizer, Control-M, Eclipse, JBoss 3.2.x, Apache Tomcat, Mercury Quality Center 11.0

Certifications

Big Data Analytics and Hadoop Developer – MSME, Govt. of India

Educational Qualification

Course/Degree: B.E. (Telecommunication)

Year of Completion: 2004

College/Institution: M. S. Ramaiah Institute of Technology, Bangalore, India

Percentage: 73%

Project: DESS Hadoop Engagement Services

Client: Confidential (TCS clients)

Period: July 2015 to Present

Role: EDW & Hadoop Consultant

Responsibilities

Involved in various customer PoCs and DESS deliveries

Led a team of 12+ Hadoop engineers on different Hadoop solutions/PoCs for multiple customers and verticals

Lead designer of an API to load dynamic XML, CSV and XLS files into Hive and HDFS, creating the data structures on the fly based on the input data

Trained on MapR, Cloudera, Spark Streaming and Flume

Designed and developed Hive schemas for different DESS clients

Developed the archival process to retain the raw files in Hive tables

Conducted training for new joiners on HDFS, YARN and MapReduce

Engaged with customers, business users and analysts for requirements gathering and proposed solutions on the Hadoop platform.
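The dynamic loader described above creates Hive structures on the fly from the input data. A minimal Java sketch of that idea, for the CSV case, is to derive a CREATE TABLE statement from the header line and one sample row. The crude type rules (BIGINT, DOUBLE, else STRING), the table name and the delimiter are illustrative assumptions, not the project's actual API.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of on-the-fly schema inference for a dynamic CSV loader:
// build a Hive CREATE TABLE statement from a header line and a sample data row.
// Type inference is deliberately crude; a real loader would sample many rows
// and handle quoting, dates and nulls.
public class HiveSchemaInference {

    // Guess a Hive column type from one sample value.
    static String inferType(String value) {
        try { Long.parseLong(value); return "BIGINT"; } catch (NumberFormatException ignored) { }
        try { Double.parseDouble(value); return "DOUBLE"; } catch (NumberFormatException ignored) { }
        return "STRING";
    }

    // Emit a CREATE TABLE DDL string for the given table name, header and sample row.
    public static String createTableDdl(String table, String headerLine, String sampleLine) {
        String[] names = headerLine.split(",");
        String[] sample = sampleLine.split(",");
        List<String> cols = new ArrayList<>();
        for (int i = 0; i < names.length; i++) {
            String type = i < sample.length ? inferType(sample[i].trim()) : "STRING";
            // Normalize the header token into a legal Hive column name.
            cols.add(names[i].trim().toLowerCase().replaceAll("\\W+", "_") + " " + type);
        }
        return "CREATE TABLE " + table + " (" + String.join(", ", cols) + ")"
                + " ROW FORMAT DELIMITED FIELDS TERMINATED BY ','";
    }
}
```

For example, a header `id,amount,name` with sample row `1,9.5,bob` yields columns `id BIGINT, amount DOUBLE, name STRING`.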

Tools

Hadoop (CDH), Hive, Pig, Sqoop, Java, Spark Streaming, MapReduce

Project: EDW – Technology Enabled Business Transformation Program (TEBT)

Client: HDFC Life Insurance (Bangalore, India)

Period: Feb 2013 – Jun 2015

Role: ETL Architect

Responsibilities

Created the source-to-target mapping document, worked on ETL mapping, session and workflow coding and ETL architecture, and provided data quality analysis to determine cleansing requirements.

Prepared ETL standards and naming conventions and wrote ETL flow documentation for the Stage, ODS and Mart layers.

Offloaded customer demographics data from the data warehouse (Exadata) to the Hadoop ecosystem (HDFS) using Sqoop for data mining and predictive analysis.

Managed one-time and incremental data loads to HDFS and Hive tables.

Developed UDFs to implement data validation in Hive, imposing business rules and constraints.

Designed and developed ODI interfaces and packages to extract and load data from DB2, SAP, Oracle and flat files.

Provided integration and post-production support for the ETL and reporting.

Designed, configured and supported insurance claims data migration to Hive for predictive analytics and fraud detection.

Performance-tuned the interfaces and queries through the use of indexes, ranks, Oracle hints, exchange partitions, etc.

Comprehended and converted critical business requirements into technical functionality for the ETL and Reporting teams.

Managed aspects of the project as a techno-functional coordinator: assigning tasks, setting deadlines, tracking issues and raising tickets between the HDFC, TCS and other vendor development and testing teams.

Completed installation and upgrade of the ODI 11g tool, SAP HCM connectivity with ODI and HBase, and Oracle Exadata integration with IBM DB2 during initial project setup.

Designed and implemented data migration jobs to load the past 15 years of insurance data from various data sources such as IBM DB2, flat files and SAP HCM.
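The Hive validation UDFs mentioned above impose row-level business rules during loads. A minimal sketch of such a rule follows, written as plain Java so it runs standalone; in Hive it would be wrapped in a class extending `org.apache.hadoop.hive.ql.exec.UDF` and registered with CREATE TEMPORARY FUNCTION. The specific rule (policy-number format, positive premium) is an illustrative assumption, not the project's actual constraint.

```java
// Hypothetical sketch of the row-level business rule a Hive validation UDF
// might impose on insurance records. In the real UDF, this predicate would
// sit inside an evaluate() method and be applied per row in a HiveQL query.
public class PolicyRowRule {

    // Returns true when the row passes the validation rule:
    // a policy number of two uppercase letters + eight digits, and a positive premium.
    public static boolean isValid(String policyNo, double premium) {
        if (policyNo == null || !policyNo.matches("[A-Z]{2}\\d{8}")) {
            return false; // malformed or missing policy number
        }
        return premium > 0.0; // premiums must be positive
    }
}
```

Rows failing the predicate can be routed to a reject table so the load itself never silently drops data.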

Tools

Oracle PL/SQL Developer, Oracle 11g, Exadata, Hadoop, HBase

Project: Media Content Hub

Client: The Nielsen Company (Oldsmar, Florida)

Period: Feb 2007 to Jan 2013

Role: Onshore ETL Lead

Responsibilities

Worked with business users and business analysts on requirements gathering and requirements mapping.

Involved in data migration activities to load the past 20 years of television ratings data collected across the USA from various data sources such as Sybase IQ and Redbrick.

Designed, developed, tested and tuned ODI jobs and packages; analyzed and modified existing ETL objects to incorporate new changes according to project requirements.

Designed and developed packages in the ODI framework.

Developed ODI packages to extract data from flat files, MS Excel, Netezza, Oracle and Redbrick databases, transformed the data per user requirements using ODI, and loaded it into the target.

Coded SQL/PL-SQL, SQL*Loader and UNIX shell scripts for high-volume data warehouse instances.

Involved in extensive performance tuning, determining bottlenecks at various points (targets, sources, interfaces, sessions), which led to better session performance.

Conducted code reviews and planned the knowledge transfer to the client.

Demonstrated situational leadership in managing transition and allocation issues by working closely with the ODM, Demand Supply, Production Control, Vendor Management and Engineering teams.

Tools

Eclipse, core Java, Oracle Data Integrator 10g, Netezza

Project: TCS Reuse Platform Solution

Client: TATA Consultancy Services

Period: Oct 2004 – Jan 2007

Role: Developer

Responsibilities

Led a team of four members in improving software quality and redesigning modules of a retail application for a US-based Fortune 500 company.

Established a strong client relationship and developed a good understanding of the retail domain and the client's business and management concepts.

Gathered business requirements and translated them into solid documentation such as UML use case and sequence diagrams; guided team members in developing functional and technical specifications.

Performed detailed design and coding using core Java and other J2EE technologies; maintained, enhanced and fixed bugs in applications.

Tested, debugged and troubleshot different applications and components developed by the team and ensured effective resolution.

Developed test cases and tested applications using JUnit; used JMeter/OpenSTA for load and performance testing.

Set high benchmarks in the software quality of the delivered solution by redefining coding and testing standards for the project.

Tools

Eclipse, HP LoadRunner, QTP

Personal Details

DOB – 24/07/1981

Languages – English, Hindi, Tamil

Marital Status - Married


