
Lead Software Engineer - Cloud Data Engineer (Data Bricks)

Company:
JPMorganChase
Location:
Columbus, OH
Pay:
$152,000.00 - $215,000.00
Posted:
April 26, 2025
Apply

Description:

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.

As a Lead Software Engineer at JPMorgan Chase within the Corporate Sector - Employee Data Platform (Palmos) Group, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for developing critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities

Executes creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems

Develops secure high-quality production code, and reviews and debugs code written by others

Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems

Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture

Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies

Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills

Formal training or certification on software engineering concepts and 5+ years of applied experience

Experience building data pipelines using Apache Spark, with proficiency in Scala or Python, on platforms such as AWS EMR or Databricks

Hands-on experience with IaC tools like Terraform

Knowledge of open table formats such as Iceberg and Delta Lake, along with expertise in AWS Glue Data Catalog and fine-grained access control

Advanced proficiency in one or more programming languages (e.g., Python, Java)

Proficiency in automation and continuous delivery methods

Advanced understanding of agile methodologies and practices such as CI/CD, application resiliency, and security

Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)

Preferred qualifications, capabilities, and skills

Extensive hands-on experience with AWS Cloud services including S3, IAM, EMR, Glue, ECS/EKS, and Athena

Certifications in AWS, Databricks, and Terraform are highly desirable

1+ years of experience implementing data pipelines using Databricks, including tools such as Unity Catalog, Databricks Workflows, and Delta Live Tables
