
Hadoop Developer

Company:
Career Guidant Inc.
Location:
Tampa, FL, 33646
Posted:
August 22, 2025

Description:

Career Guidant is an internationally acclaimed, trusted, multi-faceted organisation with Information Technology Custom Learning Services for Enterprises, Lateral Staffing Solutions, Information Technology Development & Consulting, Infrastructure & Facility Management Services, and Technical Content Development as its core competencies. Our experienced professionals bring a wealth of industry knowledge to each client and operate in a manner that produces superior quality and outstanding results.

Career Guidant's proven and tested methodologies ensure that client satisfaction remains the primary objective. We are committed to our core values of Client Satisfaction, Professionalism, Teamwork, Respect, and Integrity.

Career Guidant, with its large network of delivery centres, support offices, and partners across India, Asia Pacific, the Middle East, the Far East, Europe, and the USA, is committed to rendering the best service to each client and working closely with them to ensure their operations continue to run smoothly.

Our Mission

"To build customer satisfaction, and strive to provide the complete Information Technology solution you need to stay ahead of your competition."

Job Description

Preferred

At least 2 years of experience in Big Data space.

Strong Hadoop MapReduce/Hive/Pig/Sqoop/Oozie skills - MUST

Candidates should have hands-on experience with Java, APIs, and Spring - MUST

Good exposure to columnar NoSQL DBs like HBase.

End-to-end delivery experience on complex, high-volume, high-velocity projects

Good experience with at least one scripting language, such as Scala or Python

Good exposure to Big Data architectures

Experience building frameworks on Hadoop

Very good understanding of the Big Data ecosystem

Experience sizing and estimating large-scale Big Data projects

Good database knowledge, with SQL tuning experience

Experience with Impala

Good exposure to Splunk

Good exposure to Kafka

Experience with the Apache Parquet data format

Past experience with ETL and data warehouse projects

Experience with Spark, Flume

Cloudera/Hortonworks certified

Experience and desire to work in a Global delivery environment

The job entails sitting and working at a computer for extended periods of time. Candidates should be able to communicate by telephone, email, or face to face. Travel may be required per job requirements.

Qualifications

Basic

Bachelor's degree or foreign equivalent from an accredited institution is required. Three years of progressive experience in the specialty will also be considered in lieu of every year of education.

At least 4 years of experience in Information Technology.
