Backend Software Engineer - AI/ML, Cloud, Data Eng

Location:
Overland Park, KS
Posted:
January 12, 2026

Resume:

Satya Sri Lasya Mundru

913-***-**** | *************@*****.*** | LinkedIn

Professional Summary

Results-oriented Backend Software Engineer with 3.5 years of product development experience designing and scaling distributed backend systems in the banking and healthcare domains. Proficient in Python development, AI/ML model integration, data engineering, and cloud-based solutions, with hands-on skills in Python, SQL, C++, Golang, Java, TensorFlow, PyTorch, Flask, FastAPI, Docker, Kubernetes, and CI/CD pipelines across AWS, Azure, and GCP. Proven track record of delivering enterprise-scale APIs, ETL workflows, AI/ML deployments, and data pipelines for clients such as Birlasoft, Elevance Health, and Fifth Third Bank, with a strong focus on performance optimization, observability, and end-to-end lifecycle ownership from design to production in Agile environments.

Technical Skills

Programming & Scripting: Python, Java, SQL, Shell Scripting, C++, Golang (familiar)

AI/ML Frameworks: TensorFlow, PyTorch, Scikit-learn, Hugging Face, NLTK

Web & API Frameworks: Flask, FastAPI, Django, REST, GraphQL

Cloud Platforms: AWS (EC2, S3, Lambda, SageMaker, Glue, Redshift), Azure (Data Lake, AKS, Data Factory), GCP (familiar)

Systems Design: Microservices Architecture, Distributed Systems, Data-intensive Systems, Event-driven Architecture, Fault Tolerance

Data Engineering: Apache Spark, PySpark, Pandas, NumPy, Airflow, dbt, Kafka

Databases: MySQL, PostgreSQL, MongoDB, SQL Server, Snowflake

DevOps & CI/CD: Jenkins, GitHub Actions, Docker, Kubernetes, Terraform, Ansible

Monitoring & Logging: CloudWatch, Splunk, ELK, Prometheus, Grafana

Visualization & Reporting: Tableau, Power BI, Matplotlib, Seaborn

Professional Experience

Project: Intelligent Fraud Detection Platform (Banking Domain)
Python Developer, Fifth Third Bank, Kentwood, MI

Jan 2025 – Present

Designed and deployed AI/ML pipelines for fraud detection using PyTorch and Scikit-learn, reducing false positives by 20%.

Developed FastAPI microservices for real-time transaction validation integrated across multiple AWS regions.

Automated data ingestion and ETL workflows using Airflow and AWS Redshift, optimizing throughput.

Built Power BI dashboards for portfolio risk monitoring and loan default predictions.

Leveraged AWS Lambda + S3 for event-driven architectures, cutting data latency by 40%.

Led end-to-end feature ownership from architecture to production rollout in a highly autonomous environment.

Implemented schema validation & data governance with Great Expectations.

Enhanced observability through Prometheus, Grafana, and ELK, reducing debugging time by 35%.

Partnered with data science teams for model training & deployment pipelines.

Participated in Agile ceremonies, delivering sprint-based modules.

Collaborated with cross-functional teams across data science, DevOps, and product to align platform goals.

Environment: Python, FastAPI, PyTorch, Scikit-learn, SQL Server, AWS (Redshift, Lambda, S3, EC2), Airflow, Jenkins, Docker, Kubernetes, Tableau, Power BI, ELK, Golang.

Project: AI-Driven Claims Analytics (Healthcare Domain)
AI Python Developer, Elevance Health, Atlanta, GA

Jun 2024 – Dec 2024

Designed NLP pipelines with Hugging Face Transformers and NLTK for claims text classification.

Built Python ETL frameworks to process large-scale claims data into Azure Data Lake.

Deployed predictive models in TensorFlow for patient readmission analysis.

Integrated FastAPI endpoints for real-time claim adjudication, reducing processing time by 30%.

Orchestrated Azure Data Factory pipelines for HIPAA-compliant ETL workflows.

Processed multi-terabyte healthcare datasets with PySpark, reducing query runtimes.

Deployed AI models with Docker on Azure Kubernetes Service (AKS) for scalability.

Implemented Prometheus/Grafana monitoring for pipeline and model observability.

Ensured compliance by validating data with Great Expectations.

Developed Power BI dashboards for claim cost, fraud detection, and provider efficiency.

Worked with compliance officers and data scientists to align AI solutions with regulatory needs.

Environment: Python, TensorFlow, Hugging Face, PySpark, Azure Data Lake, Azure Data Factory, AKS, Docker, Prometheus, Grafana, Power BI.

Project: Digital Banking Automation Suite (Banking Domain)
Python Developer, Birlasoft, Hyderabad, India

Jun 2021 – Jul 2023

Developed Python automation frameworks for loan origination & transaction processing.

Created Flask REST APIs to serve banking microservices.

Optimized SQL queries and stored procedures for financial transactions.

Migrated legacy ETL jobs into PySpark pipelines, improving runtime by 40%.

Automated regulatory compliance reports with Python + Tableau/Power BI.

Built ML-based models using Scikit-learn to predict loan approvals.

Implemented CI/CD pipelines in Jenkins to automate deployments.

Deployed containerized applications on Docker & Kubernetes.

Enhanced fraud monitoring with anomaly detection algorithms.

Designed scalable data ingestion pipelines with Pandas & NumPy.

Participated in Agile sprints, ensuring delivery of high-quality modules.

Environment: Python, Flask, Pandas, NumPy, Scikit-learn, SQL, PySpark, Jenkins, Docker, Kubernetes, Tableau, Power BI.

Education

Master of Science in Computer Science (2023 – 2025), University of Central Missouri, USA

Bachelor of Technology in Computer Science (2019 – 2023), Andhra University, India

Certifications

AWS Certified Developer – Associate

Microsoft Azure Fundamentals (AZ-900)

TensorFlow Developer Certificate

SQL for Data Science – Coursera


