Job Description
Senior Software Engineer – Java, Cloud & Big Data
HealthBridge Financial, Inc. is a first-of-its-kind financial security solution that helps patients bridge the gap between the high cost of healthcare and their financial wellbeing.
Engineers at HealthBridge pursue a mission that helps patients afford their healthcare and alleviates the financial stress that comes with a healthcare event. We are a heavily data-oriented business and are scaling rapidly.
About the Role
We are seeking a highly driven Senior Software Engineer with deep Java expertise and a passion for building scalable, high-performance cloud and big data systems. You’ll play a pivotal role in designing, coding, and optimizing our core services, including large-scale file processing and big data pipelines. You’ll set the technical bar for the team, mentor junior engineers, establish best practices, and leverage AI tools to accelerate development, all in a dynamic, AWS-based environment. This is an opportunity to shape engineering culture and processes at a high-growth, investor-backed startup.
Key Responsibilities
Hands-on Coding & System Design
Architect, develop, and deliver robust Java-based backend systems optimized for scalability, reliability, and high-throughput data processing
Design and implement large-scale file processing workflows and big data pipelines (batch and/or streaming)
Write clean, efficient, and well-documented code; set and enforce coding standards
Drive performance enhancements, profiling, and tuning for mission-critical services
Lead by example in code reviews, debugging, and troubleshooting complex issues, in your own code and others’
Technical Leadership & Process Improvement
Establish and refine development processes, CI/CD pipelines, and code review standards
Mentor and coach junior developers, ensuring they follow best practices and grow their skills
Champion the safe and ethical use of AI-powered development tools to boost productivity
Cloud-Native & Big Data Engineering
Design and implement cloud-native solutions leveraging AWS services (e.g., S3, EMR, Glue, Lambda, EC2, ECS, RDS)
Build and optimize data ingestion, transformation, and storage processes for large-scale datasets and files
Collaborate with DevOps, infrastructure, and data engineering teams to ensure seamless deployments and optimal resource utilization
Work closely with the CISO and security team to ensure code meets security standards
Requirements
5+ years of professional Java development experience (Spring Boot, Hibernate, or similar frameworks)
Demonstrated experience designing and scaling distributed systems on AWS
Proven expertise in building and optimizing big data/file processing pipelines (e.g., Hadoop, Spark, AWS Glue, Apache Beam, or similar)
Track record of performance tuning and debugging in high-traffic, data-intensive environments
Experience establishing and enforcing engineering best practices and mentoring junior engineers
Familiarity with DevOps tools (Docker, Kubernetes, Terraform, Jenkins, etc.)
Strong communication and collaboration skills
Experience working in an Agile environment and a passion for collaborating within and across teams through ceremonies such as refinement, planning, and retrospectives
Nice-to-Have
Experience leveraging AI tools (e.g., Cursor, GitHub Copilot, CodeWhisperer) in software development
Exposure to microservices architecture and event-driven systems
Contributions to open-source projects or technical blogs
Location Preference:
We would like this role to be within commuting distance of Chicago and/or Grand Rapids, MI. There will be significant opportunity to work remotely, but we will want the team member to come into the office at sprint start and end. We would prefer candidates who can come into the Chicago office three days midweek.
Full-time
Hybrid remote