
DURGA PRASAD B

Bangalore

+91-888*******

addb3v@r.postjobfree.com

EXPERIENCE SUMMARY

Currently working with Tech Mahindra (July 2019 to date).

Previously worked with Oracle India Pvt Ltd from Sep 2012.

PERSONAL PROFILE

6+ years of professional experience in Information Technology, including 3+ years in Hadoop administration.

Hands-on experience in Hadoop cluster installation, configuration, and deployment, integrating with systems hardware.

Hands-on experience in installing, configuring, and managing Hadoop clusters using Apache Hadoop 2.x, Cloudera, and Hortonworks distributions.

Strong knowledge of Hadoop concepts and ecosystem components such as HDFS, MapReduce, YARN, Pig, Sqoop, Hive, HBase, Oozie, and ZooKeeper.

Experience in installing and configuring vanilla Apache Hadoop and its sub-projects.

Troubleshooting, diagnosing, performance tuning, and resolving Hadoop issues.

Exposure to Hadoop upgrade and rollback, and backup and restore.

Commissioning and decommissioning nodes on existing, running Hadoop clusters.
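
As a minimal sketch of the decommissioning flow, assuming the exclude file configured under dfs.hosts.exclude in hdfs-site.xml (the hostname and file path below are illustrative):

    # Add the host to the exclude file named by dfs.hosts.exclude
    echo "datanode05.example.com" >> /etc/hadoop/conf/dfs.exclude
    # Ask the NameNode to re-read its include/exclude lists
    hdfs dfsadmin -refreshNodes
    # Watch progress; the node shows "Decommissioned" once its blocks are re-replicated
    hdfs dfsadmin -report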

Experience in job scheduling and monitoring job status.

Configuring and maintaining space quotas for the file system (HDFS).
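
For illustration, HDFS space quotas are managed with dfsadmin; the path and size here are hypothetical:

    # Cap /user/etl at 500 GB of raw (post-replication) usage
    hdfs dfsadmin -setSpaceQuota 500g /user/etl
    # Verify: quota columns appear in the count output
    hdfs dfs -count -q -h /user/etl
    # Remove the quota when no longer needed
    hdfs dfsadmin -clrSpaceQuota /user/etl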

General Linux administration: installation, user and group creation, permissions, and package management using YUM and RPM.

Working knowledge of the Cloudera distribution.

Strong problem-solving skills and a positive attitude; a self-starter and team player with good interpersonal skills.

Good experience with SSL certificate configuration and LDAP authentication configuration.
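
As a sketch, per-node keystores for SSL/wire encryption are typically generated with keytool; every value below is illustrative:

    # Create a keystore holding the node's private key and certificate
    keytool -genkeypair -alias "$(hostname -f)" -keyalg RSA -keysize 2048 \
        -dname "CN=$(hostname -f),OU=Hadoop,O=Example" \
        -keystore /etc/security/ssl/node.jks \
        -storepass changeit -keypass changeit -validity 365
    # Import the signing CA into the shared truststore
    keytool -importcert -alias exampleCA -file ca.crt \
        -keystore /etc/security/ssl/truststore.jks -storepass changeit -noprompt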

Expertise in installing Hadoop and its related components in a multi-node cluster environment.

Expertise in implementing new Hadoop hardware infrastructure.

Proficient in installing, configuring, and administering Oracle Database 10g and 11g.

Configured a Kafka DR cluster for high availability.

EDUCATION

B.Tech. in Computer Science Engineering from JNTU.

PROFESSIONAL EXPERIENCE

Tech Mahindra

Project: Three UK

Implemented Cloudera from scratch to replace the Hortonworks distribution.

Worked on Azure Cloud, deploying Cloudera Altus Director to provision Cloudera clusters.

End-to-end implementation of HDInsight clusters (Hortonworks distribution) on Azure Cloud and on-premises.

Configured a Kafka DR cluster.
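
Cross-cluster DR replication is commonly driven by Kafka's MirrorMaker; a minimal invocation, with the config file names and topic pattern as assumptions:

    # consumer.properties points at the primary cluster, producer.properties at the DR cluster
    kafka-mirror-maker.sh \
        --consumer.config consumer.properties \
        --producer.config producer.properties \
        --whitelist "orders.*"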

Performed performance tuning for the ecosystem components involved.

Performed end-to-end admin tasks, from cluster creation through performance tuning.

Responsible for maintaining the DEV, SIT, UAT, and Prod clusters with high availability.

Resolved DAST security scan issues.

Applied OS patches on Ubuntu and performed required OS package installations and updates.
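
A typical Ubuntu patch cycle for a worker node, as a sketch (the package name is illustrative):

    sudo apt-get update && sudo apt-get -y upgrade
    # Install any additionally required packages
    sudo apt-get install -y ntp
    # Reboot only if the kernel or core libraries require it
    [ -f /var/run/reboot-required ] && sudo reboot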

Worked with the dev team on day-to-day issues and resolved them.

Equifax Analytics Pvt. Limited

Project: SAMBA (Saudi American Bank)

Responsible for cluster maintenance, commissioning and decommissioning data nodes, cluster monitoring, troubleshooting, managing and reviewing data backups, and managing and reviewing Hadoop log files.

Monitoring cluster health and troubleshooting.

Performing HDFS snapshots, OS patching, and Hadoop patching.
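
Taking an HDFS snapshot before patching looks roughly like this (the directory is illustrative):

    # Enable snapshots on the directory, then take a dated snapshot
    hdfs dfsadmin -allowSnapshot /data/warehouse
    hdfs dfs -createSnapshot /data/warehouse "pre-patch-$(date +%F)"
    # Snapshots are visible under the hidden .snapshot directory
    hdfs dfs -ls /data/warehouse/.snapshot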

Setting up Ranger policies and testing HDFS and Hive access for users.

Rebalancing HDFS.
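
Rebalancing is driven by the HDFS balancer; for example:

    # Move blocks until each DataNode is within 10% of the cluster-average utilization
    hdfs balancer -threshold 10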

Knowledge of importing and exporting data between an RDBMS and Hadoop using Sqoop.
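
A representative Sqoop round trip; the JDBC URL, credentials, and table names are assumptions:

    # Import a table from the RDBMS into HDFS (-P prompts for the password)
    sqoop import --connect jdbc:mysql://db.example.com/sales \
        --username etl -P --table orders \
        --target-dir /data/sales/orders --num-mappers 4
    # Export processed results back to the RDBMS
    sqoop export --connect jdbc:mysql://db.example.com/sales \
        --username etl -P --table order_summary \
        --export-dir /data/sales/order_summary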

Responsible for data backup from the Prod cluster to the backup cluster.

Quota administration, with notification in case of space problems.

Moving or shifting nodes, roles, and services to other nodes.

Processing events from Hadoop logs, taking corrective measures, and involving the relevant teams when necessary.

Diligently teaming with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.

Performed installation of Hadoop.

Maintaining cluster health and HDFS space for better performance.

Moving data efficiently between clusters using distributed copy (DistCp).
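
A sketch of an inter-cluster copy with DistCp (NameNode addresses and paths are illustrative):

    # -update copies only missing or changed files; -p preserves file attributes
    hadoop distcp -update -p \
        hdfs://prod-nn:8020/data/warehouse \
        hdfs://backup-nn:8020/data/warehouse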

Oracle India Pvt. Limited

PROJECTS

Project: AT&T

An in-house system to manage the activities of various AT&T companies located around the world. It is mainly used for order processing, order transmission, vessel booking, production planning, shipping instructions, and inventory management. The result is a single, consistent, and easily accessible data source for sales, ordering, billing, service delivery, customer, and inventory data.

ROLES AND RESPONSIBILITIES:

Responsible for implementation and administration of Hadoop infrastructure.

Working with data delivery teams to set up new Hadoop users; this includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
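
As a sketch, onboarding a user on a Kerberized cluster; the username, realm, and keytab path are assumptions:

    # Create the OS account and Kerberos principal, and export a keytab
    useradd -m analyst1
    kadmin -q "addprinc -randkey analyst1@EXAMPLE.COM"
    kadmin -q "ktadd -k /etc/security/keytabs/analyst1.keytab analyst1@EXAMPLE.COM"
    # Provision and hand over the HDFS home directory
    hdfs dfs -mkdir -p /user/analyst1
    hdfs dfs -chown analyst1:analyst1 /user/analyst1
    # Smoke-test access as the new user
    su - analyst1 -c "kinit -kt /etc/security/keytabs/analyst1.keytab analyst1 && hdfs dfs -ls /user/analyst1"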

Cluster maintenance, as well as creation and removal of nodes.

Job monitoring and performance enhancements.

Knowledge of importing and exporting data between an RDBMS and Hadoop using Sqoop (see the Sqoop example earlier).

Manage and review Hadoop log files.

Analyzing system failures, identifying root causes, and recommending courses of action.

Taking backups of Hadoop metadata using snapshots.

Performed Hadoop version upgrades when required.

Decommissioning and commissioning nodes on a running Hadoop cluster.

Involved in the R&D team migrating data between RDBMS data sources and HDFS, in both directions, for analytics and reporting purposes.

Involved in configuring Hadoop, Hive, MySQL, and Sqoop in the cluster.

Created database objects such as stored procedures.

PROJECT: DISCOVERY AND REMOTE MONITORING PLATFORM

Role and Responsibilities:

Designed the technical architecture based on the products involved in the project.

Installed Oracle Business Intelligence EE (OBIEE) and SOA Suite.

Installed Oracle Internet Directory and Oracle Access Manager.

Configured SSO with E-Business Suite and OBIEE for seamless login across products.

Troubleshot and resolved various issues during the above setup.

Installed OBIEE 11.1.1.7.5 and migrated the RPD and web catalog.

Installed SOA 11.1.1.7 and migrated custom components to SOA.

Installed E-Business Suite R12 in a multi-node environment.

Applied patches on dev and prod applications for bug fixes, upgrades, etc.

Created and scheduled scripts for purging log files on both the application and database tiers.
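
A minimal purge script of the kind described, with the path and retention period as assumptions:

    #!/bin/bash
    # purge_logs.sh - remove application-tier log files older than 30 days
    find /u01/app/oracle/inst/logs -name "*.log" -mtime +30 -delete
    # Scheduled from cron, e.g.: 0 1 * * * /home/oracle/scripts/purge_logs.sh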

Creating, updating, and deleting menus, responsibilities, users, concurrent programs, etc.

Troubleshot and rectified various issues in concurrent managers.

Configured the workflow mailer and resolved various issues related to workflow mails.

Created custom concurrent manager for frequently running requests.

Changed the hostname/IP address and application port number of EBS as per business requirements.

Compiled forms, menus, and packages both manually and using the adadmin utility.

Changed the APPS password and all other product schema passwords using FNDCPASS.
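
For reference, FNDCPASS invocations follow this pattern (placeholders stand in for the real passwords):

    # Change the APPS/APPLSYS password
    FNDCPASS apps/<apps_pwd> 0 Y system/<system_pwd> SYSTEM APPLSYS <new_pwd>
    # Change an individual product schema password, e.g. GL
    FNDCPASS apps/<apps_pwd> 0 Y system/<system_pwd> ORACLE GL <new_pwd>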

Resolved 500 internal server errors on the application home page.

Resolved Apps issues by compiling invalid objects, relinking apps executables and applying patches.

Documentation at the client site: system study, site documentation, incident reports, monthly service reports, and daily monitoring and health-check reports.

PROJECT: DIGITAL SPACES MONETIZATION

Role and Responsibilities:

Developing packages, procedures, and functions using cursors, records, and collections (BULK COLLECT) as per requirements.

Developing SQL queries using joins, subqueries, and scalar and inline queries for manipulating and transferring data.

Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package.
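
A minimal sketch of such an extract, run through sqlplus; the connect string, directory object, table, and columns are all assumptions (the directory object must be created and granted by a DBA):

    sqlplus -s scott/tiger@ORCL <<'EOF'
    DECLARE
      fh UTL_FILE.FILE_TYPE;
    BEGIN
      -- Open a pipe-delimited flat file in the EXTRACT_DIR directory object
      fh := UTL_FILE.FOPEN('EXTRACT_DIR', 'orders.txt', 'w');
      FOR r IN (SELECT order_id, amount FROM orders) LOOP
        UTL_FILE.PUT_LINE(fh, r.order_id || '|' || r.amount);
      END LOOP;
      UTL_FILE.FCLOSE(fh);
    END;
    /
    EOF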

PL/SQL coding and debugging.

Creating database objects across development and testing environments.

Creating triggers for the application's business logic.

Migrated previous deals data stored in Excel sheets onto the application.

Date:

Place:

Signature:


