JustinBradley's client, a top-tier source of mortgage financing, is seeking a Python Developer with deep expertise in cloud-native technologies to lead the development of scalable, efficient, and secure data infrastructure. This role merges advanced AWS capabilities with strong software engineering, DevSecOps practices, and architectural leadership to drive enterprise-wide analytics, business intelligence, and machine learning solutions.
Key Responsibilities:
Lead the design and implementation of enterprise-scale data pipelines and data lakes using AWS services (Glue, Lambda, Step Functions, S3, Redshift, DynamoDB).
Architect real-time and batch data solutions with Apache Spark, Kinesis, and Kafka to support diverse analytic workloads.
Ensure efficiency, scalability, and performance tuning across distributed data systems and AWS resources.
Build, optimize, and maintain ETL/ELT pipelines with orchestration tools like AWS Glue, Step Functions, or Apache Airflow.
Integrate structured and unstructured data sources, including APIs, databases, and third-party platforms, into centralized cloud storage.
Implement rigorous data quality checks, transformations, lineage tracking, and cataloging using the AWS Glue Data Catalog and Lake Formation.
Enforce governance, compliance, and security standards leveraging IAM, KMS, and other AWS-native tools.
Drive continuous improvement of data workflows via CI/CD practices and Infrastructure as Code (Terraform, CloudFormation).
Collaborate cross-functionally with data scientists, analysts, and product stakeholders to deliver business-critical solutions.
Required Skills & Experience:
8+ years in software engineering or data engineering with a focus on cloud-native solutions; proven experience in leadership roles.
Deep technical expertise in AWS services: Glue, Redshift, Athena, S3, Lambda, EMR, Kinesis, and Step Functions.
Proficient in Python, SQL, and Spark for scalable data processing.
Knowledge of modern front-end and back-end technologies (React.js, Node.js, Java) and scripting languages (Bash, PowerShell) is a strong plus.
Experience with microservices, serverless computing, and event-driven architectures.
DevSecOps and CI/CD best practices using Jenkins, GitHub Actions, Bitbucket, or AWS CodePipeline.
Working knowledge of data modeling, warehousing (Snowflake, Redshift), and performance optimization.
Familiarity with AI/ML integrations, distributed systems, and orchestration platforms.
Tools: Terraform, Ansible, Kubernetes.
Bachelor's or Master's degree in Computer Science or a related field.
Certifications preferred: AWS Solutions Architect, AWS Data Engineer, Agile Certified Practitioner (ACP).
JustinBradley is an EO employer - Veterans/Disabled and other protected categories.