Manager Management

Location: Longmont, CO
Posted: September 05, 2017


Balaji Deivam

+1-216-***-**** / ac153f@r.postjobfree.com

LinkedIn Profile

SUMMARY

10+ years of professional experience as a systems administrator, including 3+ years of Hadoop, Revolution R/Shiny Server, and Informatica administration and 6+ years of Linux and SAS administration on Linux/AIX/Windows platforms

Experience in researching, designing, building, testing, deploying, and maintaining Hadoop ecosystems, with Hadoop clusters of 160+ production nodes (Cloudera CDH) and components such as HDFS, MapReduce, YARN, Pig, Hive, Oozie, HBase, Sqoop, Flume, ZooKeeper, Kafka, HUE, Impala, and Spark

Experience in monitoring and event management tools such as Cloudera Manager, Ganglia, and Nagios

Proficient in setup, configuration and management of security for Hadoop clusters using Kerberos and Active Directory/LDAP

Performance tuning of Hadoop clusters, HDFS, YARN and Hadoop MapReduce routines

Experience in Unix shell scripting (e.g. Bash, ksh, etc.)

Strong technical, administration, and mentoring knowledge in Linux, Big Data/Hadoop, SAS, and other open-source solutions

Implemented Big Data user-facing GUI solutions such as JupyterHub, RStudio, and Zeppelin, and provided support to end users.

File system management and monitoring, HDFS support and maintenance.

Cluster maintenance, including creation and removal of nodes, using Cloudera Manager as well as manually.

Working with data delivery teams to set up new Hadoop users, which includes setting up Linux users and testing HDFS, Hive, Pig, and MapReduce access for the new users.
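
A minimal sketch of that onboarding flow, assuming a Linux/AD-backed cluster; the user name, quota, and paths below are illustrative only:

    # Create and own the HDFS home directory for the new Linux/AD user
    sudo -u hdfs hdfs dfs -mkdir /user/jdoe
    sudo -u hdfs hdfs dfs -chown jdoe:jdoe /user/jdoe
    # Optionally cap the home directory (500 GB raw space in this example)
    sudo -u hdfs hdfs dfsadmin -setSpaceQuota 500g /user/jdoe
    # Smoke-test HDFS and Hive access as the new user
    sudo -u jdoe hdfs dfs -put /etc/hosts /user/jdoe/
    sudo -u jdoe hive -e 'show databases;'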

Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.

Involved in customer interactions, business user meetings, vendor calls, and technical team discussions to make the right design and implementation choices and to provide best practices to the organization.

Experience in managing cluster resources by implementing the Fair Scheduler and Capacity Scheduler.
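
A hedged example of the queue refresh cycle behind that work, assuming queue definitions live in fair-scheduler.xml (or Cloudera Manager dynamic resource pools):

    # Reload queue/pool definitions without restarting the ResourceManager
    yarn rmadmin -refreshQueues
    # Confirm the queues YARN is now exposing
    mapred queue -list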

Experience in building and configuring Cloudera and Hortonworks Hadoop clusters on AWS

Experience with system integration, capacity planning, performance tuning, system monitoring, system security, operating system hardening and load balancing

Researching new tools and products, including open-source options, and presenting their benefits to the business for implementation

Point of Contact for Vendor escalation

Proficient in HDFS, GPFS/SAN, NAS, and NFS file systems

SAS Analytics Platform environment planning, maintenance and administration

Experience in SAS Admin for SAS 9.1.3, SAS 9.3, SAS 9.4M2 & M4

Independently performed SAS upgrades from 9.1.3 to 9.3, 9.3 to 9.4, and 9.4M2 to 9.4M4

Completed SAS Admin Training for 9.3 from SAS Institute

Highly motivated team player with a positive attitude and a quick learner of new programming techniques and languages

Provided 24x7 support for a high-availability Hadoop platform, resolving incidents quickly to maintain service levels and preparing root cause analysis (RCA) documents

EDUCATION

Bachelor of Engineering (B.E) from Anna University, India

SKILLS

Administrator: Hadoop (Cloudera), AWS, SAS 9.1.3, SAS 9.3, SAS 9.4M2, Informatica 9.5.1 & 9.6.1, MicroStrategy 9.2.1, ColdFusion 10, Slurm, Python, Microsoft R, Linux, Shiny, RStudio, JupyterHub, Zeppelin, MySQL DB

BigData/Hadoop: Cloudera Manager, HDFS, YARN, Hive, Spark, HBase, Oozie, HUE, Impala, Kerberos, LDAP/AD, SSL, Sqoop, MapReduce, Zeppelin, Cloudera Navigator, Sentry, Nagios, Ganglia

SAS Tools: SAS Enterprise Guide, SAS Personal Login Manager, SAS Management Console, SAS Enterprise Miner (Web & Desktop), SAS VA, SAS EBI, SAS Stored Process, SAS WRS, SAS Add-In, WinSCP, WinSQL, PuTTY, EOD, Service Manager, PVCS, TOAD, ESP Workstation

Operating Systems: Linux, AIX, Solaris, Windows, Mainframe z/OS

Domain knowledge: Banking, Life Sciences, Computer Hardware

EXPERIENCE

Hadoop Administrator

Seagate Technology, Longmont, Colorado

June’15 to Current

Installed and configured a fully distributed Hadoop cluster through Cloudera Manager.

Experience in monitoring and event management tools such as Cloudera Manager, Ganglia, and Nagios

Strong technical, administration, and mentoring knowledge in Linux, Big Data/Hadoop, SAS, and other open-source solutions

Implemented Big Data user-facing GUI solutions such as JupyterHub, RStudio, and Zeppelin, and provided support to end users.

Proficient in setup, configuration and management of security for Hadoop clusters using Kerberos and Active Directory/LDAP
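
A typical verification step for a Kerberized cluster; the principal and keytab path are illustrative:

    # Obtain a ticket from a service keytab and confirm it
    kinit -kt /etc/security/keytabs/hdfs.keytab hdfs/namenode01.example.com@EXAMPLE.COM
    klist
    # A Kerberized HDFS call should now succeed
    hdfs dfs -ls /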

Performance tuning of Hadoop clusters, HDFS, YARN and Hadoop MapReduce routines

Experience in Unix shell scripting (e.g. Bash, ksh, etc.)

File system management and monitoring, HDFS support and maintenance.

Cluster maintenance, including creation and removal of nodes, using Cloudera Manager as well as manually.

Performed Hadoop cluster environment administration, including adding, decommissioning, and re-balancing nodes.
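
A minimal sketch of the decommission and re-balance steps, assuming the standard dfs.hosts.exclude mechanism:

    # After adding the host to the exclude file, ask the NameNode and
    # ResourceManager to re-read their host lists
    sudo -u hdfs hdfs dfsadmin -refreshNodes
    yarn rmadmin -refreshNodes
    # Re-balance; 10 = allowed deviation from average disk usage, in percent
    sudo -u hdfs hdfs balancer -threshold 10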

Experience in managing 160+ production nodes & 60+ Pre-production nodes

Experience in configuring NameNode High Availability in the Hadoop cluster
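
A quick health and failover check for an HA pair; the NameNode service IDs below are examples:

    # Which NameNode is active and which is standby?
    hdfs haadmin -getServiceState namenode01
    hdfs haadmin -getServiceState namenode02
    # Planned failover before maintenance on namenode01
    hdfs haadmin -failover namenode01 namenode02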

Involved in optimization of the cluster as per requirements.

Migrating data from one cluster to another using distcp.
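
A representative DistCp invocation; cluster hostnames and paths are placeholders:

    # Incremental copy that preserves file attributes between clusters
    hadoop distcp -update -p hdfs://prod-nn:8020/data/sales hdfs://dr-nn:8020/data/sales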

Upgraded Hadoop both through Cloudera Manager and manually

Responsible for SAS 9.4, Python, and Revolution R system administration on a highly available platform spanning data warehouse, Hadoop, Oracle, storage subsystems, and leading web technology solutions.

Experience in upgrading SAS 9.4M2 to M4, which also helped reduce SAS licensing costs.

Installing various R and Python Packages in Linux environment
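
Typical non-interactive package installs on the Linux nodes; the package names and CRAN mirror are examples:

    # Python packages into the shared environment
    pip install pandas numpy
    # R package installed without an interactive session
    Rscript -e 'install.packages("shiny", repos = "https://cran.r-project.org")'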

Worked with the eSecurity team to obtain sign-off after fixing vulnerabilities before production implementation

Experience in SSL certification installation and testing through developer tools.
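
A hedged sketch of the install-and-verify cycle; the alias, file names, and host are placeholders:

    # Import the signed certificate into the Java keystore used by the service
    keytool -importcert -alias gateway -file gateway.crt -keystore /opt/ssl/gateway.jks
    # Verify the presented chain and expiry dates from a client machine
    openssl s_client -connect gateway.example.com:443 -showcerts </dev/null | openssl x509 -noout -dates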

Ownership for creating and managing the Infrastructure Roadmap.

Linux web server system administration experience with ColdFusion, Apache, and JBoss

High-performance storage system design and administration, including GPFS and HDFS

Technical documentation and facilitation skills to initiate and support the new platform upgrades/releases.

Revolution R, Shiny Pro, RStudio, JupyterHub, Slurm, and Informatica administration

Oracle 11g on IBM AIX

Daily tasks include resolving service requests raised by customers on a priority basis, raising (normal/emergency) change requests as required, and updating changes in the centralized inventory.

Provided 24x7 support for a high-availability Hadoop platform, resolving incidents quickly to maintain service levels and preparing root cause analysis (RCA) documents

Hadoop & SAS Administrator

American Express, Phoenix, Arizona

June’14 to June’15

Installed and configured Hadoop MapReduce, HDFS, Hive, Pig, Sqoop, Flume, and Oozie on the Hadoop cluster.
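
A representative Sqoop import of the kind run on such a cluster; the JDBC URL, credentials, table, and target directory are placeholders:

    sqoop import \
      --connect jdbc:oracle:thin:@dbhost.example.com:1521/ORCL \
      --username etl_user -P \
      --table SALES.TRANSACTIONS \
      --target-dir /data/raw/transactions \
      --num-mappers 4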

SAS to Big Data migration: built the architecture for migrating the existing SAS environment to Big Data.

File system management and monitoring, HDFS support and maintenance

Performance tuning of Hadoop clusters, HDFS, YARN and Hadoop MapReduce routines

Provide information to business clients regarding technology directions and translate technology opportunities into integration solutions

Managing SAS Environment, responsible for any changes in the environment

Involved in upgrading SAS 9.3 to 9.4 and maintaining the SAS metadata repository

Completed Single Sign-On (SSO) migration for SAS applications

Integrated SAS with Java to call SAS jobs from Hadoop environment.

Installed SAS Audit, Performance and Measurement 9.3 in the existing SAS cluster.

Applying Hot fixes and patches for the SAS Environment

Identifying root causes of any server-related issues and fixing them.

Implementing best practices for configuring applications and work with the development teams to implement the standards.

Coordinated with vendor teams during changes/incidents.

SAS Web platform administration, security & vulnerabilities, AIM, user access certification

Capacity Planning, Storage Management

Managing SAS License & Renewal

Experience in GPFS Cluster & Firmware Upgrade

SAS Administrator/Microstrategy Administrator

KeyBank, Cleveland, Ohio

Feb’12 to May’14

a) SAS Administrator

SAS upgrade from start to finish: estimates, getting the quote from SAS, implementation, and migrating user projects

SAS 9.3 installation and upgrade from SAS 9.1.3

Create/modify servers (metadata, workspace, stored process, connect, share, web application)

User/Group/Role management

Migrated SAS front-end software to Citrix Server

Experience in SAS Metadata Management

Applied maintenance releases, upgrades, and hot fixes as required

Managing Metadata Users and Groups

Changing passwords for the service accounts and SAS internal accounts

Maintaining connectivity to all data sources

SAS Environment Management/ monitoring SAS Server logs

Managing workspace servers object spawners and stored process servers, Metadata Server

SAS Services startup and shutdown using script/SAS Management console

SAS server space maintenance and following up with users to free up/compress large files
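
A hedged housekeeping sketch for that task, assuming GNU find/sort on the SAS server; paths and thresholds are examples:

    # Compress SAS datasets over 5 GB that have not been touched in 90 days
    find /sasdata -name '*.sas7bdat' -size +5G -mtime +90 -exec gzip {} \;
    # List the largest user directories so owners can be contacted
    du -sh /sasdata/users/* | sort -rh | head -20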

Maintaining different licenses for the front-end SAS clients and license renewal on the SAS server

Maintaining SAS performance on the server and monitoring the same

Engaging SAS Technical Support for any major issue to resolve it quickly

Production support for the SAS server and user jobs

Implemented best practices

Completed SAS Admin Training for 9.3 from SAS

b) MicroStrategy 9.2.1 Administrator

Also worked as a MicroStrategy administrator

Updating security levels at the project, folder, and user group level

User access control, Creating/Managing User accounts

Created usage reports on tables/projects in MicroStrategy

Applying hotfixes and other performance maintenance, License manager

Creating Attributes, Filters, Facts, Metrics

System Analyst

IMS Health America, Chennai, India

Nov’07 to Feb’12

As an analyst, gathered requirements for analysis and ad-hoc reporting based on business requests and processed them using SAS to generate reports

Understanding of managing huge data volumes for analysis

Worked with different SAS datasets and procedures such as MERGE, SET, DO loops, and ARRAY

Wrote efficient code to reduce repeated steps using SAS macros (SAS/MACRO)

Performed Data validation, Error Checking using SAS

Used PROC IMPORT to read data into SAS datasets

Analyzed the created datasets using SAS procedures, SAS/SQL, and SAS macros

Worked on automation of many SAS programs to improve reporting efficiency

Experienced in using Citrix

Reviewed and rewrote existing code for efficiency

Involved in various analytical and ad hoc reports using SAS/SQL based on business requests

Used SFTP/FTP to transfer data between different servers.
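
A minimal non-interactive SFTP transfer of the kind described; the host and paths are placeholders:

    # A batch file drives the transfer so it can run unattended (e.g. from cron)
    echo 'put /reports/weekly_summary.csv /incoming/' > /tmp/sftp_batch.txt
    sftp -b /tmp/sftp_batch.txt reports@partner.example.com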

Handled multiple projects at the same time.

Worked on all phases of SDLC


