Big Data Developer - Hadoop

Location: Columbus, OH / Jersey City, NJ
Experience Level: 12+ Years
Business Line: Corporate & Investment Bank / Consumer & Community Banking

Please share resume to

Position Overview

One of the world's leading financial institutions is seeking an experienced Big Data Developer with extensive Hadoop expertise to join our Technology team.
As part of the client's commitment to innovation and digital transformation, you will play a critical role in building scalable data solutions that power our banking operations, risk management, and customer experience initiatives.
The ideal candidate will have deep technical expertise in Java/Python programming and a comprehensive understanding of large-scale financial data processing in enterprise environments.
Key Responsibilities

Data Architecture & Development
- Design and develop scalable big data solutions using Hadoop ecosystem components (HDFS, MapReduce, Hive, Spark, Kafka, HBase)
- Build and optimize ETL pipelines for processing large volumes of financial data (see the first sketch after this section)
- Implement data ingestion frameworks for various sources, including real-time streaming and batch processing (see the second sketch after this section)
- Develop and maintain data models for complex financial datasets

Programming & Technical Implementation
- Write efficient, maintainable code in Java and Python for big data applications
- Develop MapReduce jobs, Spark applications, and streaming data processing solutions
- Create and optimize SQL queries and stored procedures for data extraction and transformation
- Implement data quality checks and validation frameworks

Client-Specific Data Management
- Process and analyze massive volumes of transactions, trading data, credit card activity, and regulatory reporting datasets
- Support digital banking initiatives and customer analytics platforms
- Implement risk management solutions for credit risk, market risk, and operational risk
- Ensure compliance with banking regulations, including SOX, PCI-DSS, and internal data governance standards
- Handle sensitive customer and financial data following the client's security protocols and encryption standards

Performance & Optimization
- Monitor and tune Hadoop cluster performance for optimal resource utilization
- Optimize data processing jobs for improved performance and cost efficiency
- Implement data partitioning and compression strategies
- Troubleshoot and resolve performance bottlenecks in big data pipelines
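To make the batch ETL responsibilities above concrete, here is a minimal PySpark sketch: it ingests raw transaction records, applies simple data-quality checks, and writes date-partitioned, Snappy-compressed Parquet. The paths, column names, and schema are illustrative assumptions, not details from this posting.

# Minimal PySpark batch ETL sketch. All paths and column names
# (txn_id, account_id, amount, txn_ts) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("txn-etl-sketch")
    .getOrCreate()
)

# Ingest one day of raw transaction records from a hypothetical drop zone.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/data/raw/transactions/2024-01-15/")
)

# Basic data-quality gate: drop rows missing keys, reject non-positive amounts,
# and derive a date column to partition on.
clean = (
    raw.dropna(subset=["txn_id", "account_id", "amount"])
       .filter(F.col("amount") > 0)
       .withColumn("txn_date", F.to_date("txn_ts"))
)

# Write date-partitioned, Snappy-compressed Parquet so downstream Hive/Spark
# queries can prune partitions and scan less data.
(
    clean.write
    .mode("overwrite")
    .partitionBy("txn_date")
    .option("compression", "snappy")
    .parquet("/data/curated/transactions/")
)

Partitioning by transaction date plus columnar compression is a common way to cut scan costs in Hive and Spark workloads, which is the intent behind the partitioning and compression bullet above.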
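For the real-time ingestion responsibility, a minimal Spark Structured Streaming sketch that consumes a Kafka topic is shown below; the broker address, topic name, and checkpoint/output paths are likewise assumptions made for illustration.

# Minimal Kafka-to-Parquet streaming ingestion sketch.
# Broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

# Subscribe to a hypothetical "payments" topic.
stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "payments")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers raw bytes; cast key and value to strings before parsing.
events = stream.select(
    F.col("key").cast("string"),
    F.col("value").cast("string"),
    "timestamp",
)

# Land micro-batches as Parquet; the checkpoint gives recovery on restart.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/streaming/payments/")
    .option("checkpointLocation", "/chk/payments/")
    .start()
)
query.awaitTermination()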
Required Qualifications

Experience & Education
- Bachelor's degree in Computer Science, Engineering, or a related field
- 12+ years of total IT experience, including at least 6 years in big data technologies
- 5+ years of hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Spark)
- 4+ years of experience in banking, financial services, or fintech
- Experience working in large enterprise environments with complex data architectures

Technical Skills
- Expert-level proficiency in Java and Python programming
- Strong experience with Hadoop distributions (Cloudera, Hortonworks, MapR)
- Proficiency in Apache Spark (Scala/Python), Kafka, HBase, and Hive
- Experience with data serialization formats (Avro, Parquet, ORC)
- Knowledge of SQL and NoSQL databases
- Familiarity with cloud platforms (AWS, Azure, GCP) and their big data services
- Experience with version control systems (Git) and CI/CD pipelines

Financial Domain Knowledge
- Deep understanding of banking operations, trading systems, and financial markets
- Experience with financial data formats and industry standards (FIX, SWIFT, etc.)
- Knowledge of regulatory requirements (Basel III, Dodd-Frank, MiFID II)
- Understanding of risk management and compliance frameworks

Additional Skills
- Experience with data visualization tools (Tableau, Power BI, Qlik)
- Knowledge of machine learning frameworks and algorithms
- Understanding of data security and encryption techniques
- Experience with Agile/Scrum development methodologies

Preferred Qualifications
- Master's degree in Computer Science, Data Science, or a related field
- Certifications in Hadoop technologies (Cloudera, Hortonworks)
- Experience with real-time trading systems and market data processing
- Knowledge of containerization technologies (Docker, Kubernetes)
- Experience with data governance and metadata management tools
- Understanding of DevOps practices and infrastructure automation

Key Competencies
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities
- Ability to work in fast-paced, high-pressure financial environments
- Attention to detail and commitment to data accuracy
- Ability to mentor junior developers and lead technical initiatives
- Strong project management and multitasking capabilities

Regards,
Radiantze Inc