NAVYA SRIJA KUNA
Phone: +1-618-***-**** | Email: ************@*****.***
LinkedIn: linkedin.com/in/kuna-navya-srija-564a2720b | GitHub: github.com/NavyaSrija
Location: Edwardsville, IL | Open to Remote / Hybrid Python Developer Roles

PROFESSIONAL SUMMARY
• Over 5 years of combined professional and academic experience in software engineering, backend architecture, and multi-cloud deployment using Python, FastAPI, Flask, Django REST Framework, AWS, and Google Cloud Platform (GCP).
• Designed and implemented RESTful APIs, microservices, and scalable backend systems supporting high-volume production workloads across AWS and GCP environments.
• Expertise in Python programming, data structures, OOP, asynchronous processing, and Python-based analytical workflows on GCP using BigQuery, Cloud Functions, and GCP SDK.
• Strong knowledge of API design patterns, dependency injection, and service-oriented architecture principles for building modular, maintainable systems.
• Proficient with AWS services (Lambda, EC2, S3, Glue, Athena, Fargate, CloudWatch) and GCP services (BigQuery, IAM, Cloud Storage, Cloud Functions) for secure, scalable deployment.
• Skilled in Docker, Kubernetes, and Terraform for containerization and infrastructure automation across multi-cloud platforms.
• Hands-on experience building CI/CD pipelines with Jenkins, GitLab CI, and Ansible for automated testing, packaging, and deployment.
• Adept in SQL optimization, query design, and database schema management for PostgreSQL and MySQL; optimized BigQuery SQL queries for large datasets.
• Experience developing secure authentication modules with OAuth 2.0, JWT, and RBAC authorization.
• Proficient in test automation using PyTest, UnitTest, and Selenium for backend and API quality assurance.
• Experienced in ETL pipeline development and data processing using Pandas, NumPy, PySpark, and Python data pipelines integrated with BigQuery.
• Created API documentation with Swagger and OpenAPI for cross-team collaboration and onboarding.
• Implemented logging and monitoring solutions with Prometheus, AWS CloudWatch, and GCP logging/monitoring tools for distributed systems.
• Adept at Agile/Scrum development, sprint delivery, and collaborative development with cross-functional teams.
• Strong debugging and root-cause analysis skills for resolving production issues in cloud-native applications.
• Experience integrating third-party APIs and payment systems via REST and GraphQL.
• Recognized for writing clean, modular code and mentoring junior developers on best practices.
• Delivered backend optimization projects that reduced latency and improved throughput by 40%.
• Collaborative communicator with engineers, QA, DevOps, product teams, and non-technical stakeholders.
• Passionate about continuous learning in cloud-native Python development, AI automation, statistical analysis, and MLOps integration.
TECHNICAL SKILLS
Programming Languages: Python (2.7 – 3.12), Java, C, C++, SQL, JavaScript
Frameworks / Libraries: FastAPI, Flask, Django REST Framework, Node.js, React
Cloud & DevOps: Google Cloud Platform (BigQuery, Cloud Functions, IAM, Cloud Storage, GCP SDK), AWS (Lambda, EC2, S3, Glue, CloudFormation, CloudWatch), Docker, Kubernetes, Jenkins, Terraform, GitLab CI/CD, Ansible
Databases: PostgreSQL, MySQL, MongoDB, Redis, DynamoDB, Snowflake, BigQuery
Testing Tools: PyTest, UnitTest, Selenium, Postman, JMeter, Robot Framework
Data Tools: Pandas, NumPy, PySpark, Kafka, Hive, Airflow, ETL Design, Statistical Analysis, Python Data Processing on GCP
Tools & IDEs: Git, GitHub, Bitbucket, Jira, Google Cloud SDK, VS Code, PyCharm, Spyder, Eclipse
Operating Systems: Windows, Linux (Ubuntu / CentOS), macOS
Development Methodologies: Agile, Scrum, DevOps, SDLC, TDD
PROFESSIONAL EXPERIENCE
Client: VSLN International Inc., New Jersey (USA)
Role: Python Software Engineer (Intern)
Duration: Aug 2025 – Present
Environment: Python 3.10, FastAPI, AWS Lambda, PostgreSQL, GitLab CI, PyTest, Google Cloud Platform
Project: Enterprise Resource Automation Platform
Designed and developed FastAPI-based microservices for enterprise workflow automation and digital process orchestration.
Implemented end-to-end CRUD operations with PostgreSQL, integrating stored procedures and ORM optimization via SQLAlchemy.
Built asynchronous endpoints using async/await, improving API throughput by 38%.
Developed modular REST APIs supporting version control, reusable middleware, and dynamic schema validation.
Configured AWS Lambda functions for automated file transformation and notification triggers through SNS.
Authored API documentation in Swagger/OpenAPI and managed endpoint versioning across environments.
Automated deployment using GitLab CI/CD and Docker containers for zero-downtime releases.
Collaborated with frontend teams to align JSON payloads and error responses for consistent UI integration.
Built unit/integration tests (PyTest + Mock), raising test coverage above 90%.
Implemented RBAC authentication with JWT tokens and role-level permission mapping.
Designed logging frameworks with structured JSON logs sent to AWS CloudWatch for diagnostics.
Conducted code reviews to enforce PEP 8 standards and static analysis via pylint.
Tuned PostgreSQL indexes and connection pools to reduce query latency by 45%.
Created DevOps scripts to automate staging/production branch merges.
Collaborated in Agile sprints, contributing to backlog refinement and retrospectives.
Supported data migration tasks and ETL integrations from legacy ERP systems.
Managed API monitoring and alerting through AWS CloudWatch dashboards.
Mentored junior interns on Python testing and API architecture best practices.
Delivered weekly progress demos to management with technical documentation.
Collaborated with customers to understand analytical needs and delivered Python-based statistical reporting solutions on Google Cloud Platform (GCP).
Performed statistical analysis, data preprocessing, and automated report generation using Python integrated with BigQuery.
Developed and tested efficient and reusable Python modules for GCP-driven ETL and reporting workflows.
Debugged and resolved issues in Python scripts and data pipelines deployed within the GCP environment.
Maintained structured documentation for BigQuery datasets, GCP processes, and code workflows to support team onboarding and audits.
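The RBAC pattern described in this role (JWT-derived roles mapped to endpoint-level permissions) can be sketched as a minimal, stdlib-only Python example; the role names and permission sets here are illustrative placeholders, not the production mapping:

```python
import functools

# Hypothetical role-to-permission mapping; in production these came from JWT claims.
ROLE_PERMISSIONS = {
    "admin": {"read", "write", "delete"},
    "analyst": {"read"},
}

def require_permission(permission):
    """Decorator that rejects calls unless the caller's role grants `permission`."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(role, *args, **kwargs):
            if permission not in ROLE_PERMISSIONS.get(role, set()):
                raise PermissionError(f"role {role!r} lacks {permission!r}")
            return func(role, *args, **kwargs)
        return wrapper
    return decorator

@require_permission("write")
def update_record(role, record_id, payload):
    # Stand-in for a database write guarded by the permission check above.
    return {"id": record_id, **payload}
```

In a FastAPI service the same check would typically live in a dependency rather than a decorator, so it composes with the JWT-validation layer.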
Client: Cloud Nova Systems LLC (USA)
Role: Senior Python Developer (Contract)
Duration: May 2024 – June 2025
Environment: Python 3.10, FastAPI, Flask, Docker, Terraform, AWS Glue, RDS, Fargate, BigQuery, GCP SDK
Project: Financial Data Analytics and AI Integration Platform
Architected and delivered high-volume RESTful APIs powering financial analytics dashboards.
Built ETL pipelines using AWS Glue and PySpark for structured data processing (>2 TB daily load).
Developed data models and schemas for RDS and Redshift, supporting multi-tenant architecture.
Designed asynchronous task queues with Celery and RabbitMQ for scheduled batch processing.
Automated deployment using Terraform, Docker, and Jenkins multi-stage pipelines.
Integrated OAuth2/JWT authentication with AWS Cognito for secure client access.
Built custom logging middleware capturing API performance metrics and error stacks.
Optimized API query logic reducing average response times from 350 ms to 150 ms.
Configured auto-scaling and load balancing via AWS Fargate and ECS.
Performed data validation and schema enforcement using Pydantic models.
Implemented caching mechanisms through Redis to reduce database load by 50%.
Enhanced system observability through Prometheus metrics and CloudWatch alerts.
Conducted API load testing with JMeter, ensuring 99.9% uptime under peak traffic.
Refactored monolithic Flask services into modular FastAPI microservices.
Collaborated with data scientists to serve AI inference results via secure REST endpoints.
Automated data backups and disaster-recovery plans through AWS Snapshots.
Documented all deployments and integration flows for knowledge transfer.
Mentored team members on Docker multi-stage builds and Git branch strategy.
Presented quarterly system improvement reports showing 60% faster data delivery.
Designed, developed, enhanced, tested, and deployed scalable Python applications on GCP using Cloud Functions, IAM, and the GCP SDK.
Built and optimized BigQuery data pipelines, improving performance for large-volume datasets and analytical workloads.
Implemented Python-based automation and reporting solutions leveraging BigQuery SQL and GCP services.
Applied clean coding patterns and reusable design principles to all GCP-integrated microservices, improving maintainability across teams.
Utilized SQL, BigQuery, IAM, and GitHub/GCP SDK for deployments, access management, and workflow automation.
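The Redis caching mentioned in this role follows the cache-aside pattern; a simplified sketch, using an in-memory dict with per-key TTL as a stand-in for Redis (key names and TTL values are illustrative):

```python
import time

class TTLCache:
    """Minimal stand-in for a Redis cache with per-key expiry."""
    def __init__(self):
        self._data = {}  # key -> (expires_at, value)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() >= expires_at:
            del self._data[key]  # expired: evict and report a miss
            return None
        return value

    def set(self, key, value, ttl=60):
        self._data[key] = (time.monotonic() + ttl, value)

def get_report(cache, db_fetch, report_id):
    """Cache-aside read: serve from cache, fall back to the database on a miss."""
    key = f"report:{report_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached
    value = db_fetch(report_id)   # the database is hit only on a miss
    cache.set(key, value, ttl=300)
    return value
```

With Redis itself, `get`/`set` map onto `GET` and `SET` with an `EX` expiry, and the database-load reduction comes from repeated reads resolving in the cache.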
Client: Cognizant Technology Solutions, Hyderabad (India)
Role: Python Software Development Engineer
Duration: Sep 2020 – Dec 2023
Environment: Flask, AWS Glue, PostgreSQL, GitLab CI, Docker
Project: Digital Loan Origination and Credit Risk Platform
Designed and implemented Flask REST APIs for digital-loan origination, EMI calculation, and risk profiling.
Engineered ETL pipelines with AWS Glue for ingesting customer KYC and credit data from multiple sources.
Created normalized PostgreSQL schemas with foreign-key relationships and triggers for data integrity.
Optimized SQL joins and indexing, reducing query response time by 45%.
Deployed Flask microservices to AWS Lambda and ECS for cost-efficient scaling.
Developed a config-driven data-validation framework using Python and YAML rules.
Integrated external banking APIs and payment gateways for loan disbursement automation.
Created role-based access control (RBAC) and JWT authentication modules for secure transactions.
Automated testing with PyTest and GitLab CI to enforce zero-defect deployments.
Designed event-driven notifications via AWS SNS and Twilio for status alerts and loan reminders.
Implemented structured logging and exception handlers forwarding logs to AWS CloudWatch.
Worked with cross-functional teams to translate business logic into API contracts and data models.
Conducted peer code reviews and static analysis using Flake8 and Bandit for security compliance.
Participated in Agile sprint ceremonies and backlog prioritization with product owners.
Automated build and deployment scripts using Docker Compose for multi-service integration.
Supported UAT and production incident resolution with 24-hour turnaround.
Authored user manuals and API documentation for integration teams.
Mentored junior developers on API error handling and database migration strategies.
Collaborated with DevOps engineers to enable a blue-green deployment strategy, reducing downtime to zero.

RESEARCH PROJECTS
Brain Tumor Segmentation & Classification (Graduate Project)
Environment: Python, TensorFlow, Keras, OpenCV, SimpleITK, Flask, Docker
Built an automated pipeline for MRI/CT tumor segmentation integrating nnU-Net++ and EfficientNet-B3, achieving 98.9% accuracy.
Developed a FusionNet model using an attention-based mechanism, achieving over 99% accuracy for both modalities.
Implemented preprocessing stages (skull stripping, bias correction, intensity normalization) using OpenCV and SimpleITK.
Enhanced model training with data augmentation (flip, rotation, contrast stretching), yielding a +12% Dice score gain.
Used 3D U-Net for volumetric segmentation and EfficientNet for tumor classification.
Applied Grad-CAM visualization to interpret model decisions and assist radiologists.
Containerized the pipeline with Docker and deployed Flask REST API for real-time inference.
Integrated AWS S3 for data storage and CloudWatch for performance monitoring.
Automated hyperparameter tuning using Keras Tuner and TensorBoard visualization.
Evaluated performance with Dice, IoU, Precision, Recall, and F1 metrics.
Collaborated with clinical partners to validate results on BraTS and TCIA datasets.
Documented dataset preparation and model workflow for IEEE publication.
Delivered a poster presentation at SIUE Graduate Symposium (2025).

Predictive Maintenance for IoT Sensors
Environment: Python, FastAPI, AWS Kinesis, PySpark, PostgreSQL, SNS
Designed a FastAPI-based service for real-time sensor data collection and anomaly detection.
Streamed telemetry through AWS Kinesis and stored aggregated records in PostgreSQL.
Built PySpark MLlib models for early-failure detection using vibration and temperature patterns.
Created ETL scripts to clean and normalize data before model training.
Achieved a 22% reduction in downtime by implementing predictive alerts through AWS SNS.
Developed dashboard APIs to display equipment status and maintenance forecasts.
Containerized the solution with Docker for edge deployment and cloud integration.
Wrote unit tests and CI/CD pipelines for continuous integration of model updates.
Enhanced data security with IAM roles and encrypted S3 storage.
Collaborated with IoT engineers to integrate sensor data into existing ERP systems.
Presented outcomes to project sponsors showing measurable ROI improvement.
Authored technical documentation and system architecture diagrams.

EDUCATION
Master of Science in Computer Science
Southern Illinois University Edwardsville (SIUE), USA | Jan 2024 – May 2025 | GPA: 3.29 / 4.00

Bachelor of Technology in Artificial Intelligence
Anurag Group of Institutions, Hyderabad, India | Aug 2019 – May 2023 | GPA: 8.75 / 10.00

CERTIFICATIONS
AWS Certified Developer – Associate
Azure Data Engineer Associate
Databricks Lakehouse Fundamentals
SnowPro Advanced Data Engineer
Oracle Database SQL Certified Associate
Coursera: AI for Everyone, Machine Learning for All, Big Data & Ethics
Coding Ninjas: Advanced Python
ACHIEVEMENTS & LEADERSHIP
Presented AI-Driven Tumor Segmentation in MRI Imaging at SIUE Graduate Symposium (2025).
Awarded Best Automation Initiative at Cognizant (2022).
Delivered guest sessions on FastAPI and AWS for graduate students.
Mentored developers on DevOps pipelines and backend optimization.
Contributed to open-source Python data-processing libraries.
Completed Accenture Nordic Developer Program via Forage.