
Data Test Cases

Location: Thiruvananthapuram, KL, India
Posted: January 23, 2017

Resume:

Vignan

213-***-****

Summary of skills:

About * years of experience in the IT industry, including 2+ years in Hadoop and 4+ years in Linux.

Hands-on experience installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Pig, Flume, and Sqoop on Linux.

Hands-on experience productionizing Hadoop applications (administration, configuration management, debugging, and performance tuning).

Worked in multi-cluster environments and set up the Cloudera Hadoop ecosystem.

Helped establish standards and processes for Hadoop-based application design and implementation.

Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with DW reference tables and historical metrics.
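
For illustration, a minimal sketch of the kind of trend-comparison query this describes, run from the shell; the table and column names (daily_sales, dw_product_ref, hist_metrics) are hypothetical placeholders, not taken from the actual project.

# Hypothetical tables: daily_sales (fresh data), dw_product_ref (DW reference),
# hist_metrics (historical 30-day averages). Flags products whose sales jumped.
hive -e "
SELECT f.product_id,
       r.category,
       f.units_sold,
       h.avg_units_30d,
       (f.units_sold - h.avg_units_30d) / h.avg_units_30d AS pct_change
FROM   daily_sales f
JOIN   dw_product_ref r ON f.product_id = r.product_id
JOIN   hist_metrics   h ON f.product_id = h.product_id
WHERE  f.sale_dt = '2016-12-01'
  AND  h.avg_units_30d > 0
ORDER BY pct_change DESC
LIMIT 100;"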

Familiar with data architecture, including data ingestion pipeline design and Hadoop information architecture. Support and use multiple Oracle-based applications and tools, including SQL and PL/SQL, TOAD, Oracle views, stored procedures, triggers, and the Microsoft Office suite.

Design, create or modify, and implement documented solutions agreed upon with all business partners, and integrate computing systems from start to finish.

Excellent communication, interpersonal, and problem-solving skills; a strong team player with a can-do attitude and the ability to communicate effectively with all levels of the organization, from technical management to customers.

Interpersonal, communication, planning, organizational, and computer skills.

Ability to prioritize work and manage daily workload.

Work on multiple projects with efficiency.

Extensive experience in Software Development Life Cycle (SDLC) which includes Requirement Analysis, System Design, Development, Testing, and Implementation.

Created numerous simple-to-complex queries involving self-joins and correlated subqueries for diverse business requirements. Tuned and optimized queries by altering database design, analyzing different query options, and applying indexing strategies.
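
As a rough example of the query patterns mentioned above, run here through sqlplus; the schema (an employees table), credentials, and index name are hypothetical.

# Hypothetical schema; connection details are placeholders.
sqlplus -s scott/tiger@ORCL <<'SQL'
-- Correlated subquery: employees paid above their department's average
SELECT e.emp_id, e.dept_id, e.salary
FROM   employees e
WHERE  e.salary > (SELECT AVG(e2.salary)
                   FROM   employees e2
                   WHERE  e2.dept_id = e.dept_id);

-- Self-join: each employee alongside their manager
SELECT w.emp_name AS employee, m.emp_name AS manager
FROM   employees w
JOIN   employees m ON w.manager_id = m.emp_id;

-- Supporting index for the correlated lookup
CREATE INDEX emp_dept_sal_idx ON employees (dept_id, salary);
SQL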

Hands on experience in Unit Testing & System Testing.

Professional expertise in MS Office, i.e., MS Excel, MS Word, MS PowerPoint, and MS Outlook.

Technical skills:

Hadoop Ecosystem: HDFS, MapReduce, Hive, Pig, Sqoop, Oozie, ZooKeeper

NoSQL Databases: HBase

Programming Languages: C, C++, Java, HTML

Databases: DB2, Oracle, MySQL, SQL Server

Scripting Languages: Shell Scripting, Puppet

IDEs and Tools: NetBeans, Eclipse, Visual Studio, Microsoft SQL Server, MS Office

Operating Systems: Linux (Red Hat, CentOS, Ubuntu), Windows, Mac

Web Servers: Apache Tomcat, JBoss, and Apache HTTP Server

Cluster Management Tools: Cloudera Manager and HDP Ambari

Virtualization Technologies: VMware vSphere, Citrix XenServer

Education and Certifications:

Bachelor's in Computer Science and Engineering.

Cloudera Certified Hadoop Administrator

Work Experience:

T-Mobile, Atlanta Sep 2015-present

Role: Hadoop Administrator

Installing, Upgrading and Managing Hadoop Cluster on Cloudera distribution.

Expertise in recommending hardware configurations for Hadoop clusters.

Troubleshooting cluster-related issues such as DataNodes going down, network failures, and missing data blocks.

Managing and reviewing Hadoop and HBase log files.

Experience with UNIX and Linux OS.

Built and configured Flume to load log data into HDFS.
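
A minimal sketch of the kind of Flume setup this refers to; the agent name, log path, and HDFS directory are assumptions for illustration.

# Tail a local log file and land the events in HDFS (illustrative names/paths)
cat > /etc/flume-ng/conf/weblog.conf <<'EOF'
weblog.sources  = tail1
weblog.channels = mem1
weblog.sinks    = hdfs1

weblog.sources.tail1.type     = exec
weblog.sources.tail1.command  = tail -F /var/log/httpd/access_log
weblog.sources.tail1.channels = mem1

weblog.channels.mem1.type     = memory
weblog.channels.mem1.capacity = 10000

weblog.sinks.hdfs1.type          = hdfs
weblog.sinks.hdfs1.hdfs.path     = /data/raw/weblogs
weblog.sinks.hdfs1.hdfs.fileType = DataStream
weblog.sinks.hdfs1.channel       = mem1
EOF

flume-ng agent --name weblog --conf /etc/flume-ng/conf --conf-file /etc/flume-ng/conf/weblog.conf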

Imported and exported data into and out of HDFS and Hive using Sqoop.
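
For example, the basic shape of such Sqoop jobs; the JDBC URL, credentials, tables, and paths below are placeholders.

# Import an RDBMS table into a Hive table (placeholders throughout)
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --hive-import --hive-table analytics.orders \
  --num-mappers 4

# Export processed results from HDFS back to the RDBMS
sqoop export \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table order_summary \
  --export-dir /data/output/order_summary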

Managed cluster coordination services through ZooKeeper.

Commissioning and decommissioning nodes in the cluster environment.

Provisioning, installing, configuring, monitoring, and maintaining HDFS, YARN, HBase, Flume, Sqoop, Oozie, Pig, and Hive.

Patching and upgrading Cloudera and Hortonworks clusters.

Recovering from node failures and troubleshooting common Hadoop cluster issues.

Hadoop package installation and configuration to support fully-automated deployments.

Supporting Hadoop developers and assisting with optimization of MapReduce jobs, Pig Latin scripts, Hive scripts, and HBase ingestion as required.

Environment: Hadoop HDFS, MapReduce, Hive, Pig, Flume, Oozie, Sqoop, Eclipse, Cloudera, Apache Hadoop.

Hitachi, Santa Clara Jan 2015-Sep 2015

Role: Hadoop Administrator

Responsible for developing and monitoring applications.

Expertise in running Pig and Hive queries.

Expertise in analyzing data with Hive, Pig, and HBase.

Expertise in loading data into HDFS.

Expertise in cluster capacity planning.

Developing MapReduce programs to format data.

Expertise in working with large data warehouses for pulling reports.

Expertise in preparing high-level and low-level designs (HLDs and LLDs) and writing unit test cases based on functionality.

Expertise in managing and reviewing data backups.

Expertise in developing queries using the Hive query language (HiveQL).

Good knowledge of Hadoop installation.

Expertise in installing and managing Hadoop distributions (CDH3, CDH4, Cloudera Manager, MapR, Hortonworks, Rackspace, etc.).

Expertise in disaster recovery processes as required.

Good experience writing SQL-style Hive queries to pull reports.

Expertise in designing, configuring, and managing backup and disaster recovery for Hadoop data.

Expertise in applying Hadoop updates, patches, and version upgrades as required.

Expertise in optimizing and tuning the Hadoop environment to meet performance requirements.

Environment: Hadoop HDFS, MapReduce, Hive, Pig, Flume, Oozie, Sqoop, Eclipse, Hortonworks Ambari.

Premier Inc. NY Nov 2013 - Dec 2014

Role: Hadoop Administrator

Worked with the Hadoop production support team to install operating system and Hadoop updates, patches, and version upgrades as required.

Installed and configured the automation tool Puppet, including installation and configuration of the Puppet master, agent nodes, and an admin control workstation.

Using Puppet, automated the installation and configuration of the Hadoop cluster, Hive, HBase, Flume, Pig, and Sqoop.
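
As a sketch of how such Puppet-driven installs are typically exercised from the shell; the class name hadoop::worker is a hypothetical example, not the actual module used.

puppet agent --test --noop    # dry run: report what the catalog would change
puppet agent --test           # apply the catalog (Hadoop, Hive, HBase, Flume, Pig, Sqoop config)
# masterless alternative on a single node:
puppet apply -e 'include hadoop::worker'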

Involved in moving all log files generated from various sources to HDFS for further processing.

Involved in installation of Hive, Pig, MapReduce and Sqoop.

Performed Hive queries and wrote UDFs in Pig Latin.

Wrote MapReduce jobs to analyze data.

Extracted output files using Sqoop and loaded the extracted log data using Flume.

Installed Apache web server for Ganglia to publish charts and graphs in the web console.

Set up automated monitoring for the Hadoop cluster using Ganglia, which helped track load distribution and memory usage and indicated when more capacity was needed.
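
A rough sketch of that setup, assuming the stock EPEL packages (gmond on every node, gmetad plus the web UI behind Apache on the monitoring host).

# On the monitoring host (package names from EPEL; paths may differ by version)
yum install -y ganglia-gmetad ganglia-gmond ganglia-web httpd
service gmond start && service gmetad start && service httpd start
chkconfig gmond on; chkconfig gmetad on; chkconfig httpd on

# On each Hadoop node, only gmond is needed
# (Hadoop daemons can be pointed at gmond through their metrics configuration.)
yum install -y ganglia-gmond
service gmond start && chkconfig gmond on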

Created user accounts and granted users access to the Hadoop cluster.
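
Typical commands for this task; the username analyst1 is a placeholder.

useradd -m analyst1                               # local account on the gateway/edge node
sudo -u hdfs hdfs dfs -mkdir -p /user/analyst1    # HDFS home directory
sudo -u hdfs hdfs dfs -chown analyst1:analyst1 /user/analyst1
sudo -u hdfs hdfs dfs -chmod 750 /user/analyst1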

Performed HDFS cluster support and maintenance tasks such as adding and removing nodes without any impact on running jobs or data.
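
For example, graceful decommissioning so blocks are re-replicated before a node is removed; the excludes-file path is whatever dfs.hosts.exclude points to, and the one shown is an assumption.

echo "worker42.example.com" >> /etc/hadoop/conf/dfs.exclude
sudo -u hdfs hdfs dfsadmin -refreshNodes
# watch until the node reports "Decommissioned" before stopping its daemons
sudo -u hdfs hdfs dfsadmin -report | grep -A 3 worker42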

Environment: Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Linux, Java, Oozie, HBase.

Knoah Solutions, India Jun 2010 - Oct 2013

Role: Linux System Administrator

Administration, package installation, and configuration of Oracle Enterprise Linux 5.x.

Administration of RHEL, including installation, testing, tuning, upgrading, and patching, as well as troubleshooting both physical and virtual server issues.

Creating and cloning Linux virtual machines.

Installing Red Hat Linux using Kickstart and applying security policies to harden servers according to company policy.

RPM and YUM package installation, patching, and other server management.
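
Representative commands for this kind of work; the package names are examples only.

yum install -y httpd ntp                      # install from the configured repos
yum update -y                                 # apply available updates/patches
rpm -qa | grep -i kernel                      # list installed kernel packages
rpm -Uvh custom-agent-1.2-3.el5.x86_64.rpm    # install/upgrade a local RPM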

Managing routine system backups; scheduling jobs, such as disabling and enabling cron jobs; and enabling system and network logging on servers for maintenance, performance tuning, and testing.
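
For instance, backup and maintenance jobs scheduled through cron; the script paths are hypothetical, and a job is disabled by commenting out its entry or stopping crond.

crontab -l                     # review the current schedule
crontab -e                     # edit it; example entries:
# 30 1 * * *   /opt/scripts/nightly_backup.sh >> /var/log/nightly_backup.log 2>&1
# 0 */4 * * *  /opt/scripts/check_disk_space.sh
service crond stop             # temporarily disable all cron jobs on the host
service crond start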

Tech and non-tech refresh of Linux servers, including new hardware, OS upgrades, application installation, and testing.

Set up user and group login IDs, printing parameters, network configuration, and passwords; resolved permission issues; and managed user and group quotas.

Creating physical volumes, volume groups, and logical volumes.
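
For example, carving a new logical volume out of a fresh disk; the device name, volume names, and sizes are placeholders.

pvcreate /dev/sdb                          # initialize the new disk as a physical volume
vgcreate data_vg /dev/sdb                  # create a volume group on it
lvcreate -L 50G -n app_lv data_vg          # carve out a 50 GB logical volume
mkfs.ext4 /dev/data_vg/app_lv              # build a filesystem on the LV
mkdir -p /app && mount /dev/data_vg/app_lv /app
# add an /etc/fstab entry so the mount persists across reboots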

Gathering requirements from customers and business partners, then designing, implementing, and delivering solutions to build the environment.

Installing, configuring, and supporting Apache on Linux production servers.
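
A minimal sketch of that setup on RHEL-family servers.

yum install -y httpd
# vhost configuration goes under /etc/httpd/conf.d/, main config in /etc/httpd/conf/httpd.conf
apachectl -t                   # syntax-check the configuration before (re)starting
service httpd start
chkconfig httpd on             # start automatically at boot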

QA Engineer, Gameloft May 2008 - Jun 2010

Two years of experience as a QA Engineer.

Worked in an Agile methodology.

Tested the compliance module by running manual test cases from the DISA and CIS benchmarks and writing positive and negative test cases per those benchmarks.

Periodically verified configuration auditing across systems, servers, and devices.

Experience in mobile application testing, performing manual black-box testing on handheld mobile devices.

Worked on stress testing a Facebook application.

Good understanding of the Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC), and test methodologies.

Updated pre-designed test cases and test data, and manually prepared specific checklists for different applications across compatible devices.

Expertise in Functional Testing, Re-Testing, Performance Testing, Compatibility Testing, Regression Testing, Usability Testing, and User Acceptance Testing.

Expertise in Localization Testing using specific localization checklists.

Proficient in posting and tracking bugs using the company's existing bug-tracking tool.

Education:

Bachelor's in Computer Science from TRR Engineering College.


