Priyanka Sharma
Email: acz4kx@r.postjobfree.com
Charlotte, NC
https://www.linkedin.com/in/sharmapriyanka2689
Professional Synopsis
5.1 years of IT experience developing and delivering Big Data solutions in the Financial Services and Banking domain.
Cloudera Certified Apache Hadoop Administrator (CCAH)
Proficient in installation, configuration and administration of Hadoop ecosystem components.
Maintaining clusters, including upgrades and patches.
Deep understanding of the Hadoop architecture model (HDFS, YARN, MapReduce).
Skilled in writing Hive queries for loading data.
Basic knowledge of implementing MapReduce programs for extracting useful information from terabytes of data.
Have a functional knowledge of HBase.
Skilled in using the Oozie workflow scheduler to manage Hadoop jobs.
Intermediate skills in Data ingestion to/from Hadoop.
Experienced in client interaction and management.
Experienced in creating technical, functional and requirements documentation.
Quick learner with strong communication and people skills who can work in a team or independently.
Hold an H1B work permit valid until 2019.
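The data-ingestion and Hive skills above are typically exercised with commands like the following. This is a minimal sketch: the JDBC connection string, table names, and HDFS paths are hypothetical placeholders, and each command is skipped when its CLI is not on the PATH.

```shell
# Hypothetical source table and HDFS landing directory.
SRC_TABLE="TRANSACTIONS"
TARGET_DIR="/data/staging/transactions"

# Import a relational table into HDFS with Sqoop (runs only if sqoop is installed).
if command -v sqoop >/dev/null 2>&1; then
  sqoop import \
    --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
    --username etl_user -P \
    --table "$SRC_TABLE" \
    --target-dir "$TARGET_DIR"
fi

# Load the imported files into a Hive table (runs only if hive is installed).
if command -v hive >/dev/null 2>&1; then
  hive -e "LOAD DATA INPATH '$TARGET_DIR' INTO TABLE staging.transactions;"
fi
```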
Technical Skills
Hadoop Tools: Hive, Pig, Sqoop, MapReduce, YARN, HDFS, Oozie, HBase, Flume
Utilities: Toad for Oracle, PuTTY, Nagios, Ganglia, BMC Remedy, FileZilla/WinSCP
Databases: Oracle 11g, SQL Server 2008
Languages: Java (basic), UNIX shell, Python
IDE: Eclipse
Operating Systems: Windows, Linux/Unix, Mac OS
Education & Certification
Bachelor of Technology in Computer Science, 2011, University of Rajasthan, Jaipur (India)
ITIL v3 Foundation certified specialist
CCA-500: Cloudera Certified Administrator for Apache Hadoop
Organizational Experience
Employer - Tata Consultancy Services, India
Aug’12 – July’16
Designation - Hadoop Administrator
Big Data solution for processing and extracting useful information from terabytes of data. Development involved several technologies, including Java, HDFS, Pig, MapReduce, shell scripting, Hive, Sqoop and MySQL.
Key Responsibilities:
Gathering business requirements from the Business Partners and Subject Matter Experts.
Participating in the installation of Hadoop ecosystem components.
Managing and reviewing Hadoop log files.
Responsible for managing data coming from different sources.
Supporting MapReduce programs running on the cluster.
Performing cluster maintenance tasks such as adding and removing nodes.
Participating in HDFS maintenance and in loading structured and unstructured data.
Working with data delivery teams to set up new Hadoop users, including creating Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig and MapReduce access for the new users.
Developing scripts and batch jobs to schedule various Hadoop programs.
Creating Hive tables and querying them with HiveQL.
Responsible for cluster monitoring using Ganglia and Nagios.
Implementing cluster security with Kerberos.
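The node-removal task above usually follows HDFS's decommissioning procedure. A minimal sketch, with hypothetical hostname and exclude-file path (the real location is whatever dfs.hosts.exclude points at in hdfs-site.xml); the cluster commands run only when the hdfs CLI is present.

```shell
# Hypothetical exclude-file path and node to decommission.
EXCLUDE_FILE="/etc/hadoop/conf/dfs.exclude"
NODE="datanode05.example.com"

if command -v hdfs >/dev/null 2>&1; then
  # 1. List the node in the exclude file read by the NameNode.
  echo "$NODE" | sudo tee -a "$EXCLUDE_FILE"

  # 2. Ask the NameNode to re-read its include/exclude lists.
  hdfs dfsadmin -refreshNodes

  # 3. Check the cluster report until the node shows as Decommissioned,
  #    then it is safe to stop the DataNode process on that host.
  hdfs dfsadmin -report
fi
```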
Employer - Tata Consultancy Services
Feb’12 – Jul’12
Designation - Network Operations Engineer
Express Scripts is the largest pharmacy benefit management company in the USA, serving the DoD TRICARE program.
Key Responsibilities:
Monitoring the network, Windows and Unix server infrastructure for a large US-based retail client.
Participating in the on-call roster to address emergency/critical issues.
Disk space management, file system management, user access and password management for Unix servers.
Coordinating and managing critical production changes and releases.
Tools used: SiteScope, HP-OM8, HP-OVOU, BMC Remedy, BMC Event Manager, Wily Introscope, SolarWinds, InterMapper, AS400 Console, CMC (Citrix Management Console), BMC Application Manager