
Hadoop Data

Location: Buffalo Grove, IL
Salary: 125000
Posted: December 12, 2018


Resume:

Mohit Bansal

https://www.linkedin.com/in/mohit-bansal-98793174/ ac7ya7@r.postjobfree.com

1-585-***-****

Objective

To be part of a progressive organization where I can deploy my technical and professional skills to the fullest, broaden those skills, and contribute with dedication to a motivated team, significantly supporting the company's growth and profitability.

Profile

Current Role: Hadoop Administrator

Hadoop Experience

A total of 7 years of experience in server and Hadoop administration.

3+ years of experience in Hadoop administration on the Hortonworks distribution.

Maintained Hadoop clusters for Development/Quality/Production.

Experience in monitoring and managing 80+ node Hadoop cluster.

Experience in deploying Hadoop clusters on public and private cloud environments such as Amazon AWS and Microsoft Azure.

Created a complete processing engine based on the Hortonworks distribution.

Provided level 2 and level 3 technical support.

Hands-on experience with production Hadoop operations, including administration, configuration management, debugging, and performance tuning.

Fine-tuned Hadoop clusters to achieve industry-standard benchmark results. This exercise included performance tuning of YARN, MapReduce, Hive, HBase, Zeppelin, Grafana, Atlas, Solr, Kafka, and Spark services.

Implemented Kerberos in the Hadoop ecosystem.

Implemented Knox security in the cluster, enabling secure access to data.

Implemented encryption of data at rest using Ranger KMS to secure sensitive data.
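As a rough illustration of the data-at-rest encryption described above, the commands below sketch how a Ranger KMS-managed key is created and applied to an HDFS encryption zone; the key name and zone path are placeholders, not values from this project.

    # Create an encryption key in the KMS (Ranger KMS backs the Hadoop KMS API).
    hadoop key create hr_data_key -size 256

    # Create the target directory and turn it into an encryption zone tied to that key.
    hdfs dfs -mkdir -p /data/hr/sensitive
    hdfs crypto -createZone -keyName hr_data_key -path /data/hr/sensitive

    # Verify the zone; files written under it are encrypted transparently.
    hdfs crypto -listZones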

Supported MapReduce programs running on the cluster.

Monitored Hadoop clusters using tools such as Nagios, Apache Ambari, and Grafana.

Worked with system engineering team to plan and deploy Hadoop hardware and software environments.

Worked on disaster recovery for the Hadoop cluster.

Built an ingestion framework using Flume for streaming logs and aggregating the data into HDFS.
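A minimal sketch of a Flume agent of the kind used for such log ingestion; the agent name, source log path, and HDFS target are illustrative assumptions, not the production configuration.

    # flume-agent.conf: tail an application log and land events in HDFS.
    cat > flume-agent.conf <<'EOF'
    a1.sources  = r1
    a1.channels = c1
    a1.sinks    = k1

    a1.sources.r1.type = exec
    a1.sources.r1.command = tail -F /var/log/app/app.log

    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 10000

    a1.sinks.k1.type = hdfs
    a1.sinks.k1.hdfs.path = hdfs:///data/raw/app/%Y-%m-%d
    a1.sinks.k1.hdfs.fileType = DataStream
    a1.sinks.k1.hdfs.useLocalTimeStamp = true

    a1.sources.r1.channels = c1
    a1.sinks.k1.channel = c1
    EOF

    # Start the agent.
    flume-ng agent --conf /etc/flume/conf --conf-file flume-agent.conf --name a1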

Built a data transformation framework using MapReduce and Pig.

Worked with business users to extract clear requirements and create business value.

Good knowledge of Hadoop cluster architecture and cluster monitoring.

Experience in using Pig, Hive, Sqoop, and ZooKeeper.

Exposure to traditional and non-traditional databases like RDBMS.

Server Administration Skills

Server support – Support and admin activities in managing Linux, Solaris (9 and 10), and Windows (2003 and 2008) servers.

SSL certificate renewal – Performed certificate installation and renewal for various web servers (Apache, Tomcat, IIS 7.0, iPlanet).
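A typical Apache renewal along these lines looks roughly like the following; the host names, paths, and subject fields are placeholders.

    # Check how long the current certificate is still valid.
    openssl x509 -in /etc/httpd/ssl/www.example.com.crt -noout -enddate

    # Generate a new key and certificate signing request to send to the CA.
    openssl req -new -newkey rsa:2048 -nodes \
      -keyout /etc/httpd/ssl/www.example.com.key \
      -out /etc/httpd/ssl/www.example.com.csr \
      -subj "/C=US/ST=IL/L=Chicago/O=Example Inc/CN=www.example.com"

    # After the signed certificate is returned, install it and reload Apache.
    cp signed.crt /etc/httpd/ssl/www.example.com.crt
    apachectl configtest && apachectl graceful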

Shell scripting – Automating various system administrative tasks using Unix shell and Perl scripting.

Network – Performed support tasks in network administration.

Ticketing tools – Worked on various ticketing tools such as BMC Remedy and MKS Integrity.

Monitoring tools – Installed, upgraded, and implemented various monitoring tools such as Nagios, Big Brother, and JON to monitor servers (Linux, Unix, and Windows).
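Custom check scripts are a common part of such Nagios deployments; the sketch below shows a hypothetical disk-usage plugin following the standard Nagios exit-code convention (0 = OK, 1 = WARNING, 2 = CRITICAL). The script name and thresholds are illustrative.

    #!/bin/bash
    # check_disk_usage.sh <mount> <warn%> <crit%> -- hypothetical custom plugin
    MOUNT=${1:-/}
    WARN=${2:-80}
    CRIT=${3:-90}

    # Percentage of the filesystem currently in use.
    USED=$(df -P "$MOUNT" | awk 'NR==2 {sub(/%/, "", $5); print $5}')

    if [ "$USED" -ge "$CRIT" ]; then
        echo "CRITICAL - ${MOUNT} at ${USED}%"; exit 2
    elif [ "$USED" -ge "$WARN" ]; then
        echo "WARNING - ${MOUNT} at ${USED}%"; exit 1
    else
        echo "OK - ${MOUNT} at ${USED}%"; exit 0
    fi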

Application servers – Support and admin activities in managing application servers such as Red Hat JBoss and WebSphere Application Server (WAS 7).

Integration Technology – Support and Admin activities in managing WMQ.

SQL – Good knowledge of PL/SQL programming.

Technical Skills

Operating System Red Hat Enterprise Linux (RHEL)/Linux, UNIX, Windows

Database MySQL, PostgreSQL

Tools Hadoop, MapReduce, HDFS, Hive, HBase, Sqoop, Kafka, Tez, Zeppelin, Grafana, MySQL, Microsoft Azure, Nagios host monitoring tool, BMC Remedy, SQL Developer

Cluster Monitoring Tools Ambari, Ambari Metrics, Nagios, Ganglia

Scripting Shell and Perl Scripting

Programming Language C++, Java

Domain Experience Production support for live servers; performed SIT and UAT testing.

Cognizant Technologies and Kogentix Experience Summary

Project Analytic Discovery Platform

Customer CVS Health

Description CVS Health (previously CVS Corporation and CVS Caremark Corporation) is an American retail pharmacy and health care company headquartered in Woonsocket, Rhode Island. The company began in 1964 with three partners who grew the venture from a parent company, Mark Steven, Inc., that helped retailers manage their health and beauty aid product lines. The business began as a chain of health and beauty aid stores, but within several years, pharmacies were added. To facilitate growth and expansion, the company joined The Melville Corporation, which managed a string of retail businesses. Following a period of growth in the 1980s and 1990s, CVS Corporation spun off from Melville in 1996, becoming a standalone company trading on the New York Stock Exchange as NYSE: CVS.

Role Hadoop Administrator / Operations Support Lead

Responsibility Installed and configured Hadoop cluster across various environments.

Performed both major and minor upgrades to the existing Hortonworks Hadoop cluster.

Integrated Hadoop with Active Directory and enabled Kerberos for Authentication.
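Once AD-backed Kerberos is enabled, day-to-day verification typically looks like the commands below; the principal, keytab path, and realm are illustrative placeholders.

    # Authenticate as the service/user principal from its keytab.
    kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-cluster@EXAMPLE.COM

    # Confirm the ticket was granted and that secured HDFS access now works.
    klist
    hdfs dfs -ls /

    # Without a valid ticket the same command fails with a GSSException,
    # which is the quickest smoke test that Kerberos is enforced.
    kdestroy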

Installed 50+ node Hadoop clusters using Hortonworks Hadoop.

Applied patches and bug fixes on Hadoop Clusters.

Performance tuned and optimized Hadoop clusters to achieve high performance.

Implemented schedulers on the JobTracker to share cluster resources among the MapReduce jobs submitted by users.
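On a YARN-based Hortonworks cluster, this scheduler work is normally expressed as Capacity Scheduler queue definitions; the queue names, capacities, and example job below are illustrative assumptions, not the project's actual settings.

    # Excerpt of capacity-scheduler.xml style properties (sketch only):
    #   yarn.scheduler.capacity.root.queues               = default,etl,adhoc
    #   yarn.scheduler.capacity.root.etl.capacity         = 50
    #   yarn.scheduler.capacity.root.adhoc.capacity       = 20
    #   yarn.scheduler.capacity.root.default.capacity     = 30
    #   yarn.scheduler.capacity.root.etl.maximum-capacity = 70

    # After editing the queue definitions (via Ambari or the XML), refresh them live:
    yarn rmadmin -refreshQueues

    # Submit a job to a specific queue to confirm the shares are honored.
    yarn jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar \
      pi -Dmapreduce.job.queuename=etl 10 1000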

Monitored Hadoop clusters using Apache Ambari, Ganglia, and Nagios; provided 24x7 on-call support.

Expertise in implementation and designing of disaster recovery plan for Hadoop Cluster.

Extensive hands-on experience with Hadoop file system commands for file handling operations.
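Representative HDFS file-handling commands of the kind referred to above; the paths, user, and replication factor are placeholders.

    hdfs dfs -mkdir -p /data/landing/sales            # create a target directory
    hdfs dfs -put sales_2018.csv /data/landing/sales  # copy a local file into HDFS
    hdfs dfs -du -s -h /data/landing                  # check space used by the dataset
    hdfs dfs -setrep -w 3 /data/landing/sales         # adjust the replication factor
    hdfs dfs -chown -R etl:hadoop /data/landing       # fix ownership for the ETL user
    hdfs fsck /data/landing -files -blocks            # verify block health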

Implemented Yarn ACLs on queues using Ranger.

Provided user support and application support on the Hadoop infrastructure.

Reviewed ETL application use cases before onboarding them to Hadoop.

Created Hive external and managed tables, and designed data models in Hive.

Involved in business requirements gathering and analysis of business use cases.

Prepared System Design document with all functional implementations.

Involved in Data model sessions to develop models for HIVE tables.

Analyzed the existing enterprise data warehouse setup and provided design and architecture suggestions for converting it to Hadoop using MapReduce, Hive, Sqoop, and Pig Latin.
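For an EDW-to-Hadoop conversion like this, a typical Sqoop pull from the warehouse into Hive might look like the following sketch; the JDBC URL, table, and credentials are placeholders.

    sqoop import \
      --connect jdbc:oracle:thin:@edw-host:1521/EDW \
      --username etl_user -P \
      --table SALES_FACT \
      --split-by SALE_ID \
      --num-mappers 8 \
      --hive-import \
      --hive-table staging.sales_fact \
      --target-dir /data/staging/sales_fact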

Worked on sequence files, map-side joins, bucketing, and partitioning for Hive performance and storage improvements.
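A sketch of the kind of Hive DDL behind the external-table, partitioning, and bucketing work above; the database, table, columns, bucket count, and storage format are illustrative assumptions.

    beeline -u "jdbc:hive2://hiveserver:10000/default" -e "
    CREATE EXTERNAL TABLE IF NOT EXISTS sales.orders (
      order_id    BIGINT,
      customer_id BIGINT,
      amount      DECIMAL(12,2)
    )
    PARTITIONED BY (order_date STRING)
    CLUSTERED BY (customer_id) INTO 32 BUCKETS
    STORED AS ORC
    LOCATION '/data/warehouse/sales/orders';
    "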

Implemented Hive access through Knox, integrated with AD.

Implemented data encryption using Ranger KMS and Infoworks for sensitive HR data.

Isilon Administration.

Solution Environment Hive, HBase, Sqoop, Flume, Hue, Java API, Pig, Tez, Knox, Kafka

TCS Experience Summary

Project Enterprise Data Platform

Customer Xerox Corporation

Description This project was to build a Hadoop platform that brings enterprise data into the CDW (Common Data Warehouse). The data was collected from various sources such as databases and social networking sites, and was in RDBMS, XML, and CSV formats. This data was further fed to various data analysis tools for exploratory data analysis.

Role Hadoop Administrator

Responsibility Installed and configured Hadoop cluster across various environments.

Performed both major and minor upgrades to the existing Hortonworks Hadoop cluster.

Integrated Hadoop with Active Directory and enabled Kerberos for Authentication.

Installed 20+ node Hadoop clusters using Hortonworks Hadoop.

Applied patches and bug fixes on Hadoop Clusters.

Performance tuned and optimized Hadoop clusters to achieve high performance.

Implemented schedulers on the JobTracker to share cluster resources among the MapReduce jobs submitted by users.

Monitored Hadoop clusters using Apache Ambari, Ganglia, and Nagios; provided 24x7 on-call support.

Expertise in implementation and designing of disaster recovery plan for Hadoop Cluster.

Extensive hands-on experience with Hadoop file system commands for file handling operations.

Solution Environment YARN, MapReduce, Hive, HBase, Sqoop, Hue, Java API, Pig, Tez

Project Enterprise Nagios Monitoring Platform

Customer Xerox Corporation

Description Xerox Corporation Ltd. is an American multinational document management corporation that produces and sells a range of color and black-and-white printers, multifunction systems, photocopiers, digital production printing presses, and related consulting services and supplies. Xerox is headquartered in Norwalk, Connecticut (moved from Stamford, Connecticut in October 2007). Working as a system administrator, supporting and maintaining the availability of applications and servers, and maintaining 400+ servers (Windows, Linux, Solaris). The client has users across the globe.

Role System Administrator

Responsibility • Perform daily system monitoring, verifying the integrity and availability of all hardware, server resources, systems and key processes, reviewing system and application logs, and verifying completion of scheduled jobs such as backups.

• Identifying problems and then suggesting technical computer-based solutions.

• Ensuring robust network backup and recovery

• Create, change, and delete user accounts per request.

• SSL certificate installation and renewal for various web servers (Apache, Tomcat, IIS 6.0, IIS 7.0).

• Automating administrative tasks using shell scripting and the task scheduler.

• Coordinating with various application-specific teams to resolve issues.

• Applying quarterly server patches and software packages on Unix boxes.

• Creating and maintaining instances on the Amazon Web Services cloud platform (see the sketch below).

• Procuring virtual machines and setting them up for application use.

• Collecting and analyzing system metrics and tuning systems for maximum throughput.
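A sketch of the kind of AWS CLI workflow behind the instance-provisioning bullets above; the AMI, instance type, network identifiers, and tags are placeholders.

    # Launch an instance for an application team (values are placeholders).
    aws ec2 run-instances \
      --image-id ami-0123456789abcdef0 \
      --instance-type t2.medium \
      --count 1 \
      --key-name ops-keypair \
      --security-group-ids sg-0a1b2c3d \
      --subnet-id subnet-0a1b2c3d \
      --tag-specifications 'ResourceType=instance,Tags=[{Key=App,Value=billing}]'

    # Confirm it is running before handing it over.
    aws ec2 describe-instances --filters "Name=tag:App,Values=billing" \
      --query 'Reservations[].Instances[].[InstanceId,State.Name]'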

Nagios and BB Monitoring:

• Installing, upgrading and maintaining monitoring servers

• Installing Nagios and BB agents on client servers

• Coordinating with application teams to resolve network conflicts between client and server.

• Managing user and group access on Nagios server.

• Performing routine health checks.

• System cleanup and failover to DR.

• Automation using shell and Perl script

• Creating custom plugins to meet customer requirements.

• Streamlining the process of customer involvement and grievance resolution.

• Knowledge sharing, documenting, and maintaining a project knowledge base.

Solution Environment Red Hat Linux (5, 6), Solaris (9, 10), Windows (2003, 2008, 2012), Nagios

Project Enterprise Integration Platform Centre

Customer Xerox Corporation

Role Middleware Administrator

Technical Skills Red Hat JBoss, WebSphere MQ, WebSphere Application Server, JBoss Operations Network, Red Hat Linux 6.0, Shell and Perl Scripting, PL/SQL

Responsibility • This project involved developing MQ interfaces, JBoss deployments, Apache web server configuration, and administration of MQ and web services.

• Solely responsible for MQ configuration and installation on Linux and Solaris servers.

• Responsible for JBoss installation, configuration, and support for more than 80 customer applications in the QA, Stage, and Prod environments.

• Responsible for MQ, JBoss, and Apache monitoring using various monitoring tools (JON, Big Brother, Nagios).

• Worked on platform migration from Message Broker and WAS to JBoss 5.1 and Apache.

• Actively involved in JBoss migration from 5.1 to 5.3.

• Developed a tool to create queue managers on various Unix platforms (see the sketch after this list).

• Developed various automation tools to standardize MQ builds, reduce manual effort, and enhance quality.

• Performed biyearly DR drills for the Stage and Prod environments: responsible for failover to the DR site, validating all applications and the network, running test cases, and failing back to the primary site.

• Developed a web-based MQ Explorer to administer queue managers, QM objects, QM logs, and FDC files.

• Wrote various shell scripts to support MQ and JBoss administration.

• Actively worked with application teams on different incidents and provided solutions.

• Involved in raising PMRs and working with the IBM support team on MQ issues.
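A minimal sketch of what a queue-manager creation script along these lines might do, using standard MQ control commands; the queue manager name, port, channel, and queue names are placeholders, not the actual build standard.

    #!/bin/bash
    # create_qmgr.sh <QMGR_NAME> <LISTENER_PORT> -- hypothetical helper script
    QMGR=${1:?queue manager name required}
    PORT=${2:-1414}

    # Create and start the queue manager.
    crtmqm "$QMGR"
    strmqm "$QMGR"

    # Define a listener, a server-connection channel, and an application queue.
    runmqsc "$QMGR" <<EOF
    DEFINE LISTENER(TCP.LISTENER) TRPTYPE(TCP) PORT($PORT) CONTROL(QMGR)
    START LISTENER(TCP.LISTENER)
    DEFINE CHANNEL(APP.SVRCONN) CHLTYPE(SVRCONN) TRPTYPE(TCP)
    DEFINE QLOCAL(APP.REQUEST.QUEUE) MAXDEPTH(50000)
    EOF

    # Quick health check.
    dspmq -m "$QMGR"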

Experience Summary

Current Experience 1 year, 2 months

Previous Experience 5 years, 8 months

Total Experience 6 years, 11 months

Education

Qualification – Subject – Percentage/Grade

Bachelor of Technology – Computer Science & Technology – 64%

Standard XII / H.S.C. – Others – 82.5%

SSC – Others – 78.4%

Interests/ Hobbies/ Volunteering

Interests – Cricket, Photography, Sketching.

Koshish – A volunteering effort to bring people together to make small changes in society that can have a big impact on someone's life. https://www.facebook.com/koshishvolunteers/

Koshish Food Drive – https://yougivegoods.com/koshishfooddrive



