
Full Stack Python Developer with Cloud Expertise

Location:
Tempe, AZ
Posted:
February 26, 2026


Resume:

SHILPA PALADUGU

FULL STACK PYTHON DEVELOPER

Email ID: ****************@*****.***

Contact no: 928-***-****

Professional Summary:

●Results-driven Full Stack Developer with 5+ years of experience designing and optimizing complex microservices, ETL pipelines, and scalable RESTful APIs. Proven expertise in backend development, serverless architectures, and seamless integration of diverse data sources.

●Extensive knowledge of cloud platforms including AWS (Lambda, S3, RDS, DynamoDB), OpenShift, and Kubernetes, with proficiency in building and deploying scalable solutions. Adept at crafting serverless architectures, implementing caching mechanisms, and optimizing system workflows for high performance and reliability.

●Skilled in utilizing frameworks like Django, Flask, and Django REST Framework, and developing robust data pipelines with Apache Spark, Airflow, and Snowflake. Proficient in creating visualizations and dashboards using Tableau, Matplotlib, and other tools.

●Experienced in troubleshooting, CI/CD optimization, and implementing Agile methodologies for efficient development cycles. Expertise in version control, containerization, and deployment using Docker, Jenkins, and AWS Elastic Beanstalk.

●Built custom tools to synchronize Kubernetes ConfigMaps/Secrets with AWS Parameter Store and Secrets Manager for centralized configuration.
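
A sync tool of this kind typically maps Parameter Store paths to flat ConfigMap keys. A minimal sketch, assuming an illustrative path convention (the `/app/config` prefix and dot-separated key names are assumptions, not from the resume):

```python
# Sketch: flatten SSM parameters under a prefix into ConfigMap-style data.
# The /app/config prefix and '.'-separated key convention are illustrative.

def params_to_configmap_data(parameters, prefix):
    """Convert SSM parameter paths to flat ConfigMap keys.

    /app/config/db/host -> db.host
    """
    data = {}
    for param in parameters:
        name = param["Name"]
        if not name.startswith(prefix):
            continue
        key = name[len(prefix):].strip("/").replace("/", ".")
        data[key] = param["Value"]
    return data


def fetch_parameters(ssm_client, prefix):
    """Page through get_parameters_by_path for every parameter under prefix."""
    paginator = ssm_client.get_paginator("get_parameters_by_path")
    params = []
    for page in paginator.paginate(Path=prefix, Recursive=True, WithDecryption=True):
        params.extend(page["Parameters"])
    return params
```

The resulting dict can then be written into a ConfigMap body through the Kubernetes API client.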

●Strong collaborator and problem solver with experience mentoring teams, refining design decisions, and improving system reliability.

●Developed and optimized interactive user interfaces using React.js, leveraging Redux for state management and integrating RESTful APIs with a Python backend to enhance performance and scalability.

Professional Experience:

Kemper Insurance USA Feb 2025 – Present

Full Stack Developer

Responsibilities:

●Enhanced Jenkins pipelines for automated API publishing and deployment, ensuring seamless integration across VM-based and cloud-based environments.

●Designed and deployed scalable, microservices-based solutions, enabling modular development and efficient service communication using RESTful APIs and asynchronous messaging.

●Resolved critical functionality bugs, improving system stability and reliability by 15%.

●Designed and implemented performance-optimized solutions, adhering to industry standards and documenting technical designs in detail.

●Created testable and reusable React.js functional and class components using ES6, ensuring scalability and performance optimization.

●Implemented security measures for Amazon EKS environments by configuring IAM roles, enforcing Kubernetes RBAC policies, and encrypting sensitive data using AWS Secrets Manager for Angular environment variables and FastAPI keys.

●Developed and maintained scalable ETL pipelines using Apache Spark for processing structured and unstructured data, increasing data accessibility.

●Designed and deployed scalable microservices architectures using Kubernetes, optimizing resource allocation via YAML manifests and Helm charts.

●Built serverless architectures using AWS Lambda, reducing infrastructure costs by 30%. Automated data workflows involving MongoDB, AWS RDS, S3, and Redshift.

●Improved SEO performance and Progressive Web App (PWA) compliance using Server-Side Rendering (SSR) with Next.js and React.js for optimized initial load and user experience.

●Created Python-based reporting services integrated with Oracle databases, leveraging Control-M for automated task scheduling.

●Integrated CloudFormation with Python custom resources to extend functionality, such as validating IAM policies during stack creation.

●Designed custom NLP pipelines, including Named Entity Recognition (NER) and tokenization, to enhance information retrieval and advanced text analytics.
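
A custom pipeline's tokenization and entity-extraction stages can be illustrated with a small rule-based sketch; the regex patterns and entity labels below are illustrative stand-ins for a trained NER model:

```python
import re

# Minimal tokenization + rule-based entity extraction; a production pipeline
# would typically use a trained model rather than regexes.

TOKEN_RE = re.compile(r"\w+|[^\w\s]")
DATE_RE = re.compile(r"\b\d{4}-\d{2}-\d{2}\b")
MONEY_RE = re.compile(r"\$\d[\d,]*(?:\.\d+)?")

def tokenize(text):
    """Split text into word and punctuation tokens."""
    return TOKEN_RE.findall(text)

def extract_entities(text):
    """Tag ISO dates and dollar amounts as (span, label) pairs."""
    entities = [(m.group(), "DATE") for m in DATE_RE.finditer(text)]
    entities += [(m.group(), "MONEY") for m in MONEY_RE.finditer(text)]
    return entities
```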

●Improved performance of web applications by converting them into Single Page Applications (SPAs) using React Router and React Draggable.

●Developed Python automation scripts with Boto3 to manage AWS resources (EC2, S3, Lambda) and streamline provisioning/deprovisioning workflows.

●Used Terraform to manage cloud infrastructure, ensuring efficient deployment and scaling of resources.

●Integrated Redis Cache for backend caching, reducing data retrieval response times by 25%.
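
Backend caching like this usually follows the cache-aside pattern. A minimal sketch in which a plain dict stands in for the Redis client (with redis-py, the equivalent calls are `get` and `setex`); the TTL value is illustrative:

```python
import json
import time

# Cache-aside sketch. A dict stands in for Redis; values are JSON-encoded
# strings, as they would be when stored in Redis.

class CacheAside:
    def __init__(self, store=None, ttl_seconds=300):
        self.store = store if store is not None else {}
        self.ttl = ttl_seconds

    def get_or_load(self, key, loader):
        """Return the cached value if fresh, else call loader and cache it."""
        entry = self.store.get(key)
        if entry is not None:
            value, expires_at = entry
            if time.time() < expires_at:
                return json.loads(value)      # cache hit
        value = loader()                      # cache miss: load from source
        self.store[key] = (json.dumps(value), time.time() + self.ttl)
        return value
```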

●Implemented and monitored Apache Airflow tasks to ensure data quality, incorporating alerting mechanisms for quick failure resolution.

●Implemented error handling and retry logic in Boto3 scripts to improve resilience during AWS API rate-limiting or transient failures.
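
Retry logic for throttled AWS calls is commonly exponential backoff with jitter. A sketch under stated assumptions: in real Boto3 code the retryable condition would be a `botocore.exceptions.ClientError` whose `response["Error"]["Code"]` names a throttling error; here a stand-in exception marks retryable failures:

```python
import random
import time

class RetryableError(Exception):
    """Stand-in for a throttling ClientError from botocore."""

def with_retries(func, max_attempts=5, base_delay=0.5, sleep=time.sleep):
    """Call func(), retrying retryable failures with exponential backoff + jitter."""
    for attempt in range(max_attempts):
        try:
            return func()
        except RetryableError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the failure
            # 0.5x-1x of the exponential delay, to spread out retry storms
            delay = base_delay * (2 ** attempt) * (0.5 + random.random() / 2)
            sleep(delay)
```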

●Developed real-time data streaming solutions by integrating Kafka producers and consumers into Python applications, using Confluent Kafka libraries for seamless interaction.
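
Around a Kafka producer/consumer, message values are typically (de)serialized as JSON bytes. A small sketch; the topic name and broker address in the comments are illustrative:

```python
import json

# JSON (de)serialization helpers for Kafka message values; the wiring to
# confluent_kafka is shown in comments only.

def serialize(event: dict) -> bytes:
    """Encode an event dict as compact JSON bytes for a message value."""
    return json.dumps(event, separators=(",", ":")).encode("utf-8")

def deserialize(value: bytes) -> dict:
    """Decode a message value back into a dict."""
    return json.loads(value.decode("utf-8"))

# With confluent_kafka (illustrative):
#   producer = Producer({"bootstrap.servers": "localhost:9092"})
#   producer.produce("claims", value=serialize({"id": 1}))
#   producer.flush()
```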

●Deployed scalable data warehousing solutions in Snowflake, optimizing query performance and reducing storage costs by 20%.

●Developed reusable Terraform modules to standardize infrastructure components across projects.

●Automated deployments with AWS Elastic Beanstalk and CloudFormation, streamlining resource provisioning.

●Designed and optimized database operations for AWS DynamoDB (NoSQL) and AWS RDS (MySQL), utilizing DynamoDB Streams for real-time data processing.
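
DynamoDB table design of this kind usually hinges on a composite-key convention, and stream processing on unwrapping DynamoDB's typed attribute format. A sketch with illustrative entity names (`CUSTOMER`/`POLICY` are assumptions, not from the resume):

```python
# Composite-key convention so one query fetches all of a customer's policies.

def policy_key(customer_id, policy_id):
    """Build partition/sort keys for a policy item."""
    return {"PK": f"CUSTOMER#{customer_id}", "SK": f"POLICY#{policy_id}"}

def handle_stream_record(record):
    """Extract new-image values from a DynamoDB Streams record (Lambda event).

    Stream images use DynamoDB's typed format, e.g. {"S": "value"}.
    """
    if record["eventName"] not in ("INSERT", "MODIFY"):
        return None
    image = record["dynamodb"]["NewImage"]
    return {k: list(v.values())[0] for k, v in image.items()}
```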

●Collaborated with business stakeholders to gather requirements and implement API updates, achieving a 98% resolution rate for over 200 support tickets monthly.

●Managed project iterations, bug tracking, and task assignments using Agile Rally and Jira for efficient execution and tracking.

Prudential Financial, Scottsdale, AZ Jul 2024 – Jan 2025

Software Engineer

Responsibilities:

●Optimized application performance and scalability using Django's caching, middleware, and database optimization techniques to ensure high responsiveness for large-scale deployments.

●Developed and deployed lightweight, scalable, and RESTful web applications using Flask, leveraging its modular architecture and robust ecosystem of extensions.

●Enhanced data processing efficiency using Pandas vectorized operations, group-by operations, and indexing techniques, enabling improved handling of data-intensive tasks.

●Developed and deployed microservices and React-based web applications using Jenkins, Rancher, and AWS ECS for scalable cloud infrastructure.

●Automated CI/CD pipelines with Jenkins/GitLab CI, integrating Kubernetes for rolling updates, rollbacks, and zero-downtime deployments.

●Designed and maintained complex ETL workflows with Apache Airflow, ensuring reliable and efficient data pipelines.

●Orchestrated Python-based Lambda functions with Boto3 to trigger Kubernetes Jobs for batch processing of S3 data pipelines.

●Ensured seamless client-server communication by handling API interactions between React.js frontend and Python backend using JSON-based RESTful services.

●Integrated Terraform workflows into CI/CD pipelines using Jenkins and GitLab CI, streamlining infrastructure deployment alongside application releases.

●Migrated legacy data systems to Snowflake, improving data accessibility and enabling advanced analytics for stakeholders.

●Integrated IAM roles with Kubernetes service accounts (IRSA) for secure, granular access to AWS services from pods without hardcoding credentials.

●Created Python libraries utilizing NumPy and Pandas for error detection in daily reports, improving accuracy and automating reporting workflows.

●Built custom image filters and transformations using Pillow, handling diverse image file formats for seamless compatibility across platforms.

●Collaborated with cross-functional teams to define data requirements and integrate Airflow with various data sources and destinations.

●Implemented secured and optimized APIs using Flask-RESTful, incorporating error handling, authentication, and authorization for enhanced user experience and secure access control.
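
The token-verification step behind an authenticated Flask-RESTful resource can be sketched with a pure helper; the secret value and signing scheme here are illustrative stand-ins:

```python
import hashlib
import hmac

# Token-check sketch of the kind an authenticated API resource might call
# before serving a request. The secret is illustrative; in practice it would
# come from configuration, never be hardcoded.

SECRET = b"change-me"

def sign(user_id: str) -> str:
    """Derive an HMAC-SHA256 token for a user id."""
    return hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()

def verify(user_id: str, token: str) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign(user_id), token)
```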

●Developed complex SQL queries and stored procedures in Snowflake for data transformation and analysis, ensuring data consistency and reporting accuracy.

●Collaborated on data visualization strategies using Matplotlib, ensuring visualizations aligned with domain-specific requirements and were dependable across diverse datasets.

●Utilized a range of Python packages, including NumPy, Pandas, SQLAlchemy, Matplotlib, SciPy, Boto3, and psycopg2, to support diverse development and data analysis tasks.

Wipro – Lloyds Bank Aug 2021 – Jul 2023

Full Stack Developer

Responsibilities:

●Developed and maintained frontend and backend modules using Python on the Django Web Framework, ensuring robust and scalable application architecture.

●Designed and implemented RESTful APIs using Django and Flask for seamless interaction between business modules and external systems.

●Utilized AWS S3 extensively for file storage and retrieval, configured backups, and deployed AWS resources using CloudFormation templates.

●Migrated data from on-premises databases (MSSQL) to AWS DynamoDB and AWS RDS using AWS Glue, improving data accessibility and scalability.

●Implemented OpenShift Container Platform for deploying, running, and managing microservices, and built a front-end mobile application using React.js, Redux, and JavaScript.

●Built and deployed serverless solutions with AWS Lambda integrated with API Gateway, enhancing system scalability and reducing infrastructure costs by 20%.

●Developed decoupled and scalable microservices architecture using Amazon SQS, optimizing asynchronous communication and system reliability.

●Designed and optimized DynamoDB tables, achieving a 25% reduction in query response times for critical use cases.

●Enhanced data processing performance by implementing PySpark jobs for loading data from on-prem databases to AWS S3 and Google Cloud Storage.

●Designed user-friendly, intuitive mobile and web interfaces using React-Bootstrap and Parallax, while integrating server-side RESTful APIs developed in Python (FastAPI/Flask/Django).

●Contributed to building a distributed data processing platform using Apache Spark, enabling fault-tolerant and high-availability data workflows.

●Automated data analysis and performance calculations using NumPy, SciPy, and SQLAlchemy.

●Implemented dead-letter queues and message retries for enhanced system reliability in event-driven architectures.

●Containerized applications using Docker, ensuring consistency across development, testing, and production environments.

●Developed dynamic, user-friendly web interfaces using HTML5, CSS3, JavaScript, and frameworks like Angular and Bootstrap.

●Designed a Dynamic Form Builder to enable flexible form creation using JavaScript Object-to-XML for backend integration.

●Utilized Apache Spark for interactive queries and seamless integration with NoSQL databases.

●Conducted troubleshooting, fixed critical Python bugs, and deployed updates to ensure reliable data delivery pipelines.

●Used PyUnit for unit testing and JIRA for bug tracking and Agile project management.

Star Health, India Dec 2019 – Jul 2021

Python Engineer

Responsibilities:

●Designed full-stack web applications using Django, Flask, Angular, and Bootstrap.

●Engineered real-time processing pipelines with PySpark, Kafka, and Hadoop.

●Built predictive analytics models with Scikit-learn and TensorFlow.

●Automated deployments and monitoring using Jenkins, Git, and custom Python scripts.

●Developed REST APIs and integrated Power BI for reporting and dashboarding.

●Advocated best coding practices and mentored junior engineers in Python and data engineering.

Educational Details:

●Bachelor’s in Computer Science & Information Technology from Madanapalle Institute of Technology & Science, 2020

●Master’s in Information Technology from Northern Arizona University, 2024


