
Data Engineer (AWS/PySpark/ETL)

Company:
JPMorganChase
Location:
Columbus, OH
Posted:
January 30, 2026

Job Description

Be part of a dynamic team where your distinctive skills will contribute to a winning culture and team.

As a Data Engineer III at JPMorgan Chase within the Consumer and Community Banking Data Technology organization, you serve as a seasoned member of an agile team to design and deliver trusted data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. As a core technical contributor, you are responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

Job responsibilities

Develops secure high-quality production code, and reviews and debugs code written by others

Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems

Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems

Collaborates closely with cross-functional teams to develop efficient data pipelines in support of data-driven initiatives

Implements best practices for data engineering, ensuring data quality, reliability, and performance

Contributes to data modernization efforts by leveraging cloud solutions and optimizing data processing workflows

Performs data extraction and implements complex data transformation logic to meet business requirements (see the PySpark sketch after this list)

Leverages advanced analytical skills to improve data pipelines and ensure consistent data delivery across projects

Monitors and executes data quality checks to proactively identify and address anomalies

Ensures data availability and accuracy for analytical purposes

Communicates technical concepts to both technical and non-technical stakeholders
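Below is a minimal PySpark sketch of the transformation and data quality work described above. The S3 paths, column names, and the negative-amount check are hypothetical placeholders, not details from this posting.

# Minimal PySpark sketch: extract, transform, run a data quality check,
# and publish. All paths and columns below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Hypothetical source: raw transaction records landed in a data lake.
raw = spark.read.parquet("s3://example-bucket/raw/transactions/")

# Transformation logic: normalize types and derive a partition column.
clean = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("txn_date", F.to_date("txn_ts"))
       .filter(F.col("amount").isNotNull())
)

# Data quality check: surface anomalies before publishing downstream.
bad_rows = clean.filter(F.col("amount") < 0).count()
if bad_rows > 0:
    raise ValueError(f"DQ check failed: {bad_rows} negative-amount rows")

# Publish curated, partitioned data for analytical consumers.
clean.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3://example-bucket/curated/transactions/"
)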

Required qualifications, capabilities, and skills

Experience with ETL tools such as Ab Initio, Informatica, or Data Pipeline, and with workflow management tools such as Airflow (see the Airflow sketch after this list)

Strong hands-on coding experience with PySpark, Python, and AWS

Experience working with modern data lake platforms (e.g., Snowflake, Databricks)

Hands-on practical experience delivering system design, application development, testing, and operational stability

Very strong problem-solving skills

Proficiency in automation and continuous delivery methods
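For context on the workflow management tools mentioned above, here is a minimal Airflow sketch of a daily ETL DAG. The DAG id, schedule, and task bodies are hypothetical examples, and the schedule parameter assumes Airflow 2.4 or later.

# Minimal Airflow sketch of a daily extract-transform-load workflow.
# DAG id, schedule, and task logic are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def transform():
    print("apply transformation logic (e.g., a PySpark job)")

def load():
    print("publish curated data for analytics")

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",  # "schedule" requires Airflow 2.4+
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    # Enforce ordering: extract, then transform, then load.
    t1 >> t2 >> t3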

Preferred qualifications, capabilities, and skills

Advanced proficiency in one or more programming languages such as SQL or Java

Proficient in all aspects of the Software Development Life Cycle

Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security

Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning)

In-depth knowledge of the financial services industry and its IT systems

Practical cloud-native experience

Proven leadership and mentoring experience with varying levels of software engineers
