
Software Engineer III - Cloud Data Engineer (Databricks)

Company:
JPMorganChase
Location:
Columbus, OH
Posted:
May 07, 2025

Description

Here is an opportunity to advance your career while pushing the limits of what's possible.

As a Software Engineer III at JPMorgan Chase within the Employee Data Platform (Palmos) Group, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities

Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems

Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems

Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development

Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems

Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture

Contributes to software engineering communities of practice and events that explore new and emerging technologies

Adds to team culture of diversity, equity, inclusion, and respect
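
The data-analysis responsibilities above (surfacing hidden problems and patterns in large, diverse data sets) can be sketched in miniature with standard-library Python; the event records, field names, and the median-based outlier rule below are all hypothetical, not part of the role description.

```python
from collections import defaultdict
from statistics import median

# Hypothetical event records; the schema is illustrative only.
events = [
    {"service": "auth", "latency_ms": 120},
    {"service": "auth", "latency_ms": 115},
    {"service": "auth", "latency_ms": 980},   # a "hidden problem" in the data
    {"service": "billing", "latency_ms": 240},
    {"service": "billing", "latency_ms": 250},
]

def latency_report(records, factor=2.0):
    """Group latencies by service; flag values above factor x the service median."""
    by_service = defaultdict(list)
    for rec in records:
        by_service[rec["service"]].append(rec["latency_ms"])
    return {
        service: {
            "median": median(values),
            "outliers": [v for v in values if v > factor * median(values)],
        }
        for service, values in by_service.items()
    }

report = latency_report(events)   # auth's 980 ms reading is flagged
```

A report like this is the seed of the "visualizations and reporting" deliverable: once anomalies are isolated per group, they can feed dashboards or drive the coding-hygiene improvements the role calls for.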

Required qualifications, capabilities, and skills

Formal training or certification on software engineering concepts, coupled with 3+ years of hands-on experience

Experience in building data pipelines using Apache Spark, with proficiency in Scala or Python, utilizing platforms such as AWS EMR or Databricks

Hands-on experience with infrastructure-as-code (IaC) tools such as Terraform

Knowledge of open table formats such as Iceberg and Delta Lake, along with expertise in AWS Glue Data Catalog and fine-grained access control

Advanced proficiency in one or more programming languages (e.g., Python, Java)

Proficiency in automation and continuous delivery methods

Advanced understanding of agile methodologies and modern software practices such as CI/CD, application resiliency, and security

Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)

Hands-on experience implementing data pipelines on Databricks, including tools such as Unity Catalog, Databricks Workflows, and Delta Live Tables, along with experience in data orchestration tools such as Airflow
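
At production scale the pipeline work listed above runs on Spark, Databricks, and Airflow, but the underlying extract-transform-load shape is the same at any size. A minimal standard-library Python sketch, with entirely hypothetical CSV data and schema:

```python
import csv
import io

# Hypothetical raw input; in a real pipeline this would come from S3, a
# Delta table, or a streaming source.
RAW = """user_id,country,amount
u1,US,10.50
u2,GB,3.20
u1,US,7.80
u3,,5.00
"""

def extract(text):
    """Parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(records):
    """Drop rows missing a country, cast amounts, and aggregate per country."""
    totals = {}
    for rec in records:
        if not rec["country"]:
            continue  # data-quality filter, as a pipeline stage would apply
        totals[rec["country"]] = totals.get(rec["country"], 0.0) + float(rec["amount"])
    return totals

def load(totals):
    """'Load' step: emit one summary row per country, sorted for determinism."""
    return [f"{country},{total:.2f}" for country, total in sorted(totals.items())]

rows = load(transform(extract(RAW)))   # ["GB,3.20", "US,18.30"]
```

With Spark the same stages become DataFrame reads, transformations, and writes distributed across a cluster, and an orchestrator such as Airflow schedules and sequences them; the decomposition into discrete, testable stages is the transferable skill.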

Preferred qualifications, capabilities, and skills

Extensive hands-on experience with AWS Cloud services, including S3, IAM, EMR, Glue, ECS/EKS, and Athena

Certifications in AWS, Databricks, and Terraform are highly desirable

Exposure to cloud technologies
