Hadoop Administrator

Company:
Titan Technologies Inc
Location:
Issaquah, Washington, United States
Salary:
DOE (depends on experience)
Posted:
February 23, 2018
Description:

Full-time/Permanent Hire with our Direct End Client

Please review the details of the position below. If you are a good fit and interested in pursuing it, please apply online with the latest copy of your resume and your contact information.

We will get in touch with you as soon as possible. We appreciate your time.

Job Title: Hadoop Administrator

Location: Issaquah, WA (USA)

Compensation: Base Salary per Annum + Excellent Benefits + Relocation Assistance

*** IMPORTANT NOTE: The client is unable to sponsor any work visa at this time; therefore, only candidates who do not require work visa sponsorship may apply.

Required skills, abilities, and certifications:

• Understanding of Hadoop and distributed computing.

• Understanding of Hadoop ecosystem components (e.g., HDFS, YARN, ZooKeeper, Sqoop, Hive, Impala, Hue, Sentry, Spark, Kafka, Flume).

• Minimum of 2 years’ experience supporting products running on AIX, Linux, or other UNIX variants.

• Experience with supporting production JVMs.

• Knowledge of relational databases (DB2, Oracle, SQL Server, DB2 for iSeries, MySQL, PostgreSQL, MariaDB).

• Configuration management experience (Puppet preferred).

• Version control experience (Git preferred).

• Proficient with operating system utilities as well as server monitoring and diagnostic tools to troubleshoot server software and network issues.

• Deep knowledge of associated industry protocol standards such as LDAP, DNS, and TCP/IP.

• Understanding of enterprise-level services such as Active Directory and PKI (Venafi).

• Understanding of Kerberos, cross-realm authentication, and Kerberized services.

• Security experience, including SSL/TLS certificates, system hardening, penetration testing, etc.

• Understanding of message-based architectures.

• Flexibility is non-negotiable.

• Periodic off-hours work required, including weekends and holidays. Must be able to provide 24/7 on-call support as necessary.

• B.S. degree in Computer Science, or equivalent formal training and experience.

Tasks and responsibilities:

• Partners with the Infrastructure team to identify the server hardware, software, and configurations necessary for optimally running big data workloads (e.g., Spark, Hive, Impala).

• Plans and executes major platform software and operating system upgrades and maintenance across physical and virtualized environments.

• Maintains a strong focus on design, build, deployment, system hardening, and securing services.

• Designs and implements tooling that simplifies the provisioning and support of a large cluster environment.

• Proactively manages Hadoop system resources to ensure maximum system performance and adequate spare capacity for peak periods and growth.

• Reviews performance statistics and query execution/explain plans, and recommends changes for tuning Hive/Impala queries.

• Recommends security management best practices, including ongoing promotion of awareness of current threats, auditing of server logs and other security management processes, and adherence to established security standards.

• Provides support to the user community using incident and problem management tools, email, and voicemail.

Best Regards,

Amit

Titan Technologies Inc

Find Talent- Source from the best- Get it from Titan!