Big Data Developer

Company:
Radiantze Inc.
Location:
Clinton Township, OH, 43224
Posted:
June 27, 2025

Description:

Big Data Developer - Hadoop

Location: Columbus, OH / Jersey City, NJ

Experience Level: 12+ Years

Business Line: Corporate & Investment Bank / Consumer & Community Banking

To apply, please share your resume.

Position Overview

One of the world's leading financial institutions is seeking an experienced Big Data Developer with extensive Hadoop expertise to join its technology team. As part of the client's commitment to innovation and digital transformation, you will play a critical role in building scalable data solutions that power banking operations, risk management, and customer experience initiatives. The ideal candidate will have deep technical expertise in Java/Python programming and a comprehensive understanding of large-scale financial data processing in enterprise environments.

Key Responsibilities

Data Architecture & Development

Design and develop scalable big data solutions using Hadoop ecosystem components (HDFS, MapReduce, Hive, Spark, Kafka, HBase)

Build and optimize ETL pipelines for processing large volumes of financial data

Implement data ingestion frameworks from various sources, including real-time streaming and batch processing (see the sketch after this list)

Develop and maintain data models for complex financial datasets
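By way of illustration, here is a minimal PySpark sketch of one such ingestion path, reading a hypothetical Kafka topic of transactions and landing it in HDFS as Parquet. The broker, topic, schema, and path names are placeholders, and the Kafka source additionally requires the spark-sql-kafka connector on the classpath:

from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import (StructType, StructField, StringType,
                               DoubleType, TimestampType)

spark = (SparkSession.builder
         .appName("txn-ingest")
         .enableHiveSupport()
         .getOrCreate())

# Hypothetical schema for a transaction event.
schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("txn_ts", TimestampType()),
])

# Real-time path: read the topic as a structured stream and parse JSON payloads.
stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
          .option("subscribe", "transactions")               # placeholder topic
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("t"))
          .select("t.*"))

# Land the stream in HDFS as Parquet; the checkpoint makes the sink restartable.
query = (stream.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/raw/transactions")        # placeholder path
         .option("checkpointLocation", "hdfs:///chk/transactions")
         .start())
query.awaitTermination()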

Programming & Technical Implementation

Write efficient, maintainable code in Java and Python for big data applications

Develop MapReduce jobs, Spark applications, and streaming data processing solutions

Create and optimize SQL queries and stored procedures for data extraction and transformation

Implement data quality checks and validation frameworks, as sketched below
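A minimal sketch of what such checks might look like in PySpark, assuming a transactions DataFrame with txn_id and amount columns (all names hypothetical):

from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

def run_dq_checks(df: DataFrame) -> dict:
    """Return simple data-quality metrics for a transactions DataFrame."""
    total = df.count()
    return {
        # Null business keys break downstream joins; count them explicitly.
        "null_txn_id": df.filter(F.col("txn_id").isNull()).count(),
        # Negative amounts are invalid for this hypothetical feed.
        "negative_amount": df.filter(F.col("amount") < 0).count(),
        # Duplicate keys usually indicate a replayed or double-loaded batch.
        "duplicate_txn_id": total - df.select("txn_id").distinct().count(),
    }

df = spark.read.parquet("hdfs:///data/raw/transactions")  # placeholder path
failed = {k: v for k, v in run_dq_checks(df).items() if v > 0}
if failed:
    # Fail the pipeline loudly rather than propagating bad records.
    raise ValueError(f"Data quality checks failed: {failed}")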

Client-Specific Data Management

Process and analyze massive transaction volumes, trading data, credit card transactions, and regulatory reporting datasets

Support digital banking initiatives and customer analytics platforms

Implement risk management solutions for credit risk, market risk, and operational risk

Ensure compliance with banking regulations including SOX, PCI-DSS, and internal data governance standards

Handle sensitive customer and financial data following the client's security protocols and encryption standards (see the masking sketch after this list)
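As one hedged illustration of column-level protection, the PySpark sketch below masks a hypothetical card_number column, keeping only the last four digits plus a salted SHA-256 digest for joinability. Production PCI-DSS controls (tokenization, KMS-managed encryption) go well beyond this:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-masking").getOrCreate()

df = spark.read.parquet("hdfs:///data/raw/cards")  # placeholder path

masked = (df
    # Retain only the last four digits for reporting and support lookups.
    .withColumn("pan_last4", F.substring(F.col("card_number"), -4, 4))
    # Replace the raw PAN with a salted SHA-256 digest so records can still
    # be joined across datasets without exposing the underlying number.
    .withColumn("pan_hash",
                F.sha2(F.concat(F.lit("per-env-salt"), F.col("card_number")), 256))
    .drop("card_number"))

masked.write.mode("overwrite").parquet("hdfs:///data/secure/cards")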

Performance & Optimization

Monitor and tune Hadoop cluster performance for optimal resource utilization

Optimize data processing jobs for improved performance and cost efficiency

Implement data partitioning and compression strategies (see the sketch after this list)

Troubleshoot and resolve performance bottlenecks in big data pipelines
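One common partitioning-plus-compression pattern, sketched in PySpark with placeholder paths: derive a date column, partition output by it so queries prune whole directories, and write Snappy-compressed Parquet:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("repartition-compact").getOrCreate()

df = spark.read.parquet("hdfs:///data/raw/transactions")  # placeholder path

(df
 # Derive a date column so downstream queries can prune whole partitions.
 .withColumn("txn_date", F.to_date("txn_ts"))
 # Collapse small files within each date before writing.
 .repartition("txn_date")
 .write
 .mode("overwrite")
 .partitionBy("txn_date")
 # Snappy trades some compression ratio for fast decompression on reads.
 .option("compression", "snappy")
 .parquet("hdfs:///data/curated/transactions"))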

Required Qualifications

Experience & Education

Bachelor's degree in Computer Science, Engineering, or related field

12+ years of total IT experience, including 6+ years in big data technologies

5+ years of hands-on experience with Hadoop ecosystem (HDFS, MapReduce, Hive, Spark)

4+ years of experience in the banking, financial services, or fintech industry

Experience working in large enterprise environments with complex data architectures

Technical Skills

Expert-level proficiency in Java and Python programming

Strong experience with Hadoop distributions (Cloudera, Hortonworks, MapR)

Proficiency in Apache Spark (Scala/Python), Kafka, HBase, and Hive

Experience with data serialization formats (Avro, Parquet, ORC); a short comparison sketch follows this list

Knowledge of SQL and NoSQL databases

Familiarity with cloud platforms (AWS, Azure, GCP) and their big data services

Experience with version control systems (Git) and CI/CD pipelines
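For context on the serialization formats above, a small PySpark sketch writing the same DataFrame in each format (paths are placeholders; Avro support comes from the external spark-avro package):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("format-compare").getOrCreate()

df = spark.read.json("hdfs:///data/raw/events")  # placeholder source

# Columnar formats: suited to analytical scans that read a few columns.
df.write.mode("overwrite").parquet("hdfs:///data/fmt/events_parquet")
df.write.mode("overwrite").orc("hdfs:///data/fmt/events_orc")

# Row-oriented Avro: suited to record-at-a-time pipelines and Kafka payloads.
# Requires the org.apache.spark:spark-avro package on the classpath.
df.write.mode("overwrite").format("avro").save("hdfs:///data/fmt/events_avro")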

Financial Domain Knowledge

Deep understanding of banking operations, trading systems, and financial markets

Experience with financial data formats and industry standards (FIX, SWIFT, etc.)

Knowledge of regulatory requirements (Basel III, Dodd-Frank, MiFID II)

Understanding of risk management and compliance frameworks

Additional Skills

Experience with data visualization tools (Tableau, Power BI, Qlik)

Knowledge of machine learning frameworks and algorithms

Understanding of data security and encryption techniques

Experience with Agile/Scrum development methodologies

Preferred Qualifications

Master's degree in Computer Science, Data Science, or related field

Certifications in Hadoop technologies (Cloudera, Hortonworks)

Experience with real-time trading systems and market data processing

Knowledge of containerization technologies (Docker, Kubernetes)

Experience with data governance and metadata management tools

Understanding of DevOps practices and infrastructure automation

Key Competencies

Strong analytical and problem-solving skills

Excellent communication and collaboration abilities

Ability to work in fast-paced, high-pressure financial environments

Attention to detail and commitment to data accuracy

Ability to mentor junior developers and lead technical initiatives

Strong project management and multitasking capabilities

Regards,

Radiantze Inc.
