Post Job Free

Hadoop Administrator

Location:
Atlanta, GA
Posted:
October 03, 2017

Contact this candidate

Resume:


ARAVINDAN THANGAVELU

Ernst & Young Alpharetta, GA 404-***-**** *******.****@*****.***

Professional Experience Summary

Accomplished senior IT professional with strong strategic planning, design, and implementation skills, and 14+ years of extensive IT experience across various technologies.

A highly experienced Senior Technical Architect with expertise in Big Data cluster solutions, administration, security (AD, Kerberos, Knox, Ranger), and migrations, focusing on the Hortonworks, Cloudera, and Apache distributions of Hadoop, as well as Cassandra and Splunk.

Headed the Chennai Big Data Infrastructure / Administration competency team, with responsibilities including recruitment, training, and deploying associates to projects, and delivering customer support for 7 different customers on a shared-services model.

Deft in managing & leading cross-functional teams that collaborate as a focused cohesive unit to achieve aggressive business goals.

Highly proficient in client / stakeholder management with a penchant for 'Expectations Management' & 'Perceptions Management'.

Well-travelled, with extensive experience delivering projects across geographies, including the USA & UK.

Strong problem solving & technical skills coupled with confident decision-making for enabling effective solutions leading to high customer satisfaction and low operational costs.

Significant experience working with / for a variety of domains and clients – financial services, telecommunications, and banking, to name a few.

Proven track record of increasing responsibilities, turning around low-performing teams & enhancing operational processes. Strong analytical & problem-solving skills.

Core Competencies

Big Data: Hadoop/Cassandra (Hortonworks, Cloudera & Apache distributions, Apache Cassandra, security, Splunk, Python)

Strong exposure to Big Data environments, with production experience integrating Hadoop ecosystem components such as Hive, ZooKeeper, Tez, Impala, the YARN architecture, HDFS, NoSQL (Cassandra), Spark, Python, Spotfire, and Talend.

Handled the migration of Hadoop & Cassandra from a customized Acunu distribution to open-source Apache.

Experienced in solution proposals, architectural reviews, consulting services, POCs, estimations, capacity planning, and sizing.

Hadoop & Cassandra solution design in highly demanding environments: cluster architecture design, installation & configuration, application design, deployment, and management, along with node commissioning.

Hadoop log analytics using Splunk, with integration into AppDynamics for monitoring & optimizing application performance.

Cluster performance tuning and optimizing cluster resources effectively.

Business Intelligence: (Ab Initio, Teradata, Linux)

Experienced in the techniques of data profiling, data cleansing, and change data capture.

Identify process improvements over existing processes, propose them to customers, and implement them for efficient ETL processing.

Identify and deploy new process-improvement solutions in projects, resulting in cost savings, performance uplift, and streamlining of existing business processes.

Legacy Systems: (Mainframes – CICS, COBOL, JCL, PL/1, REXX, VSAM, DB2)

Proposed, managed, and implemented a new test environment model to streamline the test-region process, bringing batch processing time down significantly from 4 days to 12 hours.

Created a new real-time application with a GUI to capture the status of each job processed in batch, from which dashboards are generated on an as-needed basis to investigate causes of delay and address them accordingly.

Proposal Engagements

1. Helix – Hadoop Architecture revamp (Big Data)

Re-architecture of the existing process, replacing Talend (an ETL tool used for scheduling the Hive process flow) with Oozie, and decommissioning NAS in favor of Gluster storage.

Introduced HBase as a backend for Hive, enabling Spotfire to eliminate report caching and improving performance by 80%.

Proposed & implemented AppDynamics & Splunk for Hadoop log & application monitoring, improving accuracy and incident resolution time.

2. User Data System – UDS cluster design and implementation. (Big data)

Solution design and implementation of a Big Data environment to process and store high volumes of mobile user data and enhance the system with an analytical platform.

3. User Data System – Subscriber Location & BTS Load Forecast (Big Data)

The location (mobile tower/BTS) of a mobile subscriber & the load on each BTS must be forecast in real time. The subscriber's usage history serves as a reference for the forecast: the real-time feed of the same information identifies the subscriber's current location (BTS in use), and his/her historical travel pattern is used to forecast the next BTS they might use. Forecasting the BTS for thousands of users also helps in forecasting the load on each BTS. The historical and forecast data need to be shown as a visual report, and the forecast report must reflect the most recent forecast data.

4. Migration of Hadoop & Cassandra (Big Data)

Strategic transformation of the Acunu-based platform to an open-source CentOS, Apache Hadoop & Cassandra environment, covering process and end-to-end solution design and implementation, with an effective migration and deployment strategy addressing the transformation challenges.

5. Factory Model Testing

Strategic process plan, vision, and roadmap for integrating all LOB BI testing into a newly defined model called "Test Factory," improving testing efficiency and cost-effectiveness through utilization of testers across projects.


Employment Chronicle:

Senior Technical Architect @ Hexaware Technologies Inc: Ernst & Young. Feb 2016 – Present.

I work as a Senior Hadoop Cluster Architect/Administrator, administering the cluster, building and securing similar environments globally, and identifying gaps in the architecture to enable simple and effective solutions.

Hadoop Solution Architect @ Tata Consultancy Services: Telefonica. January 2015 – Feb 2016.

Head of BigData Infrastructure / Administration competency @ Tata Consultancy Services: ABIM.

I worked as a Hadoop Solution Architect involved in the design and implementation of a new, secured Big Data environment, building an analytics platform for business cases such as prediction of network usage, monitoring of BTS usage, and weblog trend analysis, using the Cloudera Hadoop distribution. In parallel, I headed the Big Data Infrastructure / Administration competency.

Hadoop/Cassandra Technical Consultant @ Tata Consultancy Services: Telefonica. Nov 2013 – Jan 2015.

Head of BigData Infrastructure / Administration competency @ Tata Consultancy Services: ABIM.

I worked as a Big Data Technical Consultant, providing & deploying the strategic plan to migrate the customized Acunu distributions of Hadoop & Cassandra to Apache Hadoop & Cassandra platforms with minimal outage of live cluster services and without impacting business SLAs. In parallel, I headed the Big Data Infrastructure / Administration competency.

Hadoop System Administrator @ Tata Consultancy Services: The Hartford Insurance. Sep 2013 – Nov 2013.

Head of BigData Infrastructure / Administration competency @ Tata Consultancy Services: ABIM.

Technical project coordinator and Hadoop System Administrator for the Proof of Value – location-based risk evaluation; managed the team along with delivery responsibilities. In parallel, I headed the Big Data Infrastructure / Administration competency.

Hadoop System Administrator @ Tata Consultancy Services: ABIM. May 2013 – Sep 2013.

Analytics Big Data Information Management (ABIM) is a horizontal division that provides big data solutions across all lines of business. The group has several sub-groups that work on different proposals and build solutions for different use cases. I led one of the sub-groups, developing use cases and building the supporting infrastructure.

Technical Project Manager @ Tata Consultancy Services: Citi Bank. Dec 2010 – Apr 2013.

Managed a team of 24 onshore/offshore associates responsible for data warehouse ETL design, development, testing & implementation of all business analytical requirements for all projects at Symantec. Reviewed and approved statements of work (SOW). Managed resource forecasts, hiring of contractors & mentoring of team leads.

Project Lead @ Tata Consultancy Services: Citi Bank. May 2007 – Dec 2010.

Responsible for the detailed analysis that brought batch processing time down from 4 days to an 8-hour window in the test environment for the Cards LOB. Automated the manual process of test-bed creation for each cycle. Was involved both technically and in managing team deliverables, providing 24x7 support, test-region setup before each release, automation, etc.

Developer/Module Lead @ Tata Consultancy Services: Citi Bank. Dec 2004 – May 2007.

Developed various critical applications for the NAPS system; responsible for requirements gathering, development, testing, and implementation support.

Academic Profile:

Bachelor of Engineering (Mechanical), Annamalai University, May 2002.

Diploma in Mechanical Engineering, Muthiyah Polytechnic, April 1998.


