
Senior Python Developer with Cloud & ML Expertise

Location: Hyderabad, Telangana, India
Posted: January 22, 2026


Resume:

SWATHI SANKARAN

Email: ******.*****@*****.*** | Ph.: (516) 400-2826

LinkedIn: www.linkedin.com/in/swathi-s-705914324

Professional Summary:

Senior Python Developer with 10 years of experience designing, developing, and deploying scalable, data-driven applications across cloud and enterprise environments. Strong expertise in Python, RESTful APIs, AWS, SQL, and distributed systems, with a proven ability to build high-performance web services, automation frameworks, and machine learning pipelines. Experienced across the full SDLC using Agile methodologies, CI/CD, and modern DevOps practices.

●Proficient in building and integrating microservices using Django, Flask, and Node.js, supported by relational and NoSQL databases including PostgreSQL, MySQL, MongoDB, Cassandra, and Redis. Extensive hands-on experience with data processing, analytics, and machine learning using NumPy, Pandas, SciPy, Scikit-Learn, and NLP frameworks such as NLTK and Stanford NLP.

●Strong background in cloud platforms including AWS (EC2, Lambda, EMR, S3, IAM) and Google Cloud, with experience deploying applications using Docker, Jenkins, and GitLab CI/CD.

●Adept at developing secure, high-availability systems on Linux and UNIX environments, with deep knowledge of shell scripting, automation, and debugging.

●Skilled in building end-to-end solutions, from backend APIs and middleware in Python and C/C++ to responsive web interfaces using HTML5, CSS3, JavaScript, AngularJS, and Bootstrap.

●Experienced in ORM frameworks (Django ORM, SQLAlchemy), data modeling, and enterprise application servers such as Tomcat and WebLogic.

●A results-driven technologist with strong analytical and problem-solving skills, excellent communication, and the ability to work independently or in cross-functional Agile teams to deliver robust, high-quality software solutions.

Technical Skills:

●Programming Languages: Python (2.4, 2.7, 3.x), Java, C++, Perl, Shell Scripting, SQL.

●Python Frameworks & Libraries: Django (1.3, 1.4, 1.5+), Flask, Pyramid, web2py, Pandas, NumPy, SciPy, Matplotlib, Scikit-Learn, StatsModels, NLTK, SOAP.

●Databases: MySQL 5.1, SQL Server 2008, Oracle 10g, PostgreSQL, MongoDB (NoSQL).

●Web Technologies: HTML5, DHTML, XHTML, CSS3, JavaScript, jQuery, AJAX, AngularJS, Node.js, XML, Bootstrap, RESTful Web Services.

●Cloud & DevOps: AWS, Google Cloud, Docker, Kubernetes, Jenkins, GitLab CI/CD.

●Version Control & Collaboration: Git, Subversion (SVN), CVS, Perforce, JIRA.

●Machine Learning & NLP: Scikit-Learn, StatsModels, NLTK, OpenNLP, Stanford NLP (NER, POS Tagging, Tokenization), MATLAB, R.

●Operating Systems: Windows, macOS, UNIX/Linux, HP-UX, Red Hat Linux (4.x, 5.x, 6.x), Ubuntu.

●Python IDEs & Tools: PyCharm, Jupyter Notebook, Sublime Text 3, Eclipse, VIM.

●Servers & Middleware: Apache Tomcat, WebSphere, WebLogic.

●Database & Modeling Tools: SQL Developer, Toad, Erwin, Rapid SQL.

●Reporting Tools: SQR Reports, AXSPoint Reports.

●Networking & Security: Cisco Routing & Switching, TCP/IP, DNS, DHCP, Wireshark, Cisco Packet Tracer.

●Other Tools: PuTTY, Active Directory, Angry IP Scanner.

●Platforms: Microsoft Azure, Windows 7/10, macOS High Sierra (10.13).

●Methodologies: Agile, Scrum, Waterfall.

Professional Experience:

Client: East West Bank, NY Sep 2024 – Present

Sr. Python Full-Stack Developer

Responsibilities:

●Designed and developed scalable web applications using Python, Django, Flask, and Jinja2, building robust RESTful APIs, database models, views, and business logic for enterprise-grade solutions.

●Built full-stack solutions using Python, AngularJS, HTML5, CSS3, JavaScript, Bootstrap, and AJAX, delivering responsive and user-friendly web interfaces.

●Engineered microservices and backend systems using Python, Node.js, SQL, and Perl on Linux and Windows platforms, supporting high-availability enterprise workloads.

●Designed and implemented data processing pipelines using PySpark, Apache Spark, Pandas, and NumPy for handling large-scale structured and unstructured datasets.

●Integrated Kafka with Spark for real-time streaming analytics and tuned Spark applications on AWS EMR to optimize performance and cost efficiency.

●Built and deployed machine learning models using Scikit-learn for classification, regression, and predictive analytics, improving forecasting accuracy by 20%.

●Developed AI-driven solutions using Vector Databases and Retrieval-Augmented Generation (RAG), enabling high-accuracy data retrieval, semantic search, and intelligent responses across enterprise systems.

●Designed RAG pipelines integrated with Python, AWS, Spark, and vector storage, significantly enhancing search quality and system performance.

●Built and deployed serverless architectures using AWS Lambda, DynamoDB, EC2, S3, and IAM, supporting scalable, secure, and cost-efficient cloud solutions.

●Developed backend services and APIs using Python and Flask with seamless integration into AWS cloud services and PostgreSQL, DynamoDB, and MongoDB databases.

●Implemented CI/CD pipelines using Jenkins, Git, GitHub, and Docker, streamlining build, testing, and deployments across development and production environments.

●Containerized applications using Docker, ensuring deployment consistency, scalability, and faster releases.

●Built ETL pipelines using Apache Airflow, Spark, and UNIX shell scripting, automating data ingestion, transformation, and scheduling through Autosys.

●Implemented ELK Stack (Elasticsearch, Logstash, Kibana) for centralized logging, monitoring, and advanced troubleshooting.

●Developed secure authentication and authorization systems using LDAP, enabling user management, access control, and directory integration.

●Integrated RESTful and GraphQL APIs, enabling seamless communication between front-end and back-end systems.

●Created automated test frameworks using pytest and unittest (PyUnit), improving code reliability, test coverage, and release quality.

●Performed unit, integration, regression, and stress testing, ensuring production-grade application stability.

●Used Wireshark, Fiddler, and Live HTTP Headers for network-level debugging, API inspection, and performance analysis.

●Performed bug fixing, performance tuning, and production support for mission-critical Python applications.

●Worked in Agile/Scrum teams, participating in sprint planning, stand-ups, retrospectives, and JIRA-based tracking.

●Conducted code reviews and mentored junior developers, ensuring coding standards, security, and best practices.

●Built and deployed applications in Windows development environments and Linux production servers, integrating Microsoft Azure and AWS cloud platforms.

Environment: Python 3.7, Django, Flask, Jinja2, PySpark, Apache Spark, Pandas, NumPy, Scikit-Learn, Vector Databases, RAG, MongoDB, PostgreSQL, MySQL, DynamoDB, AWS (EC2, Lambda, S3, EMR, IAM), Apache Airflow, Kafka, Docker, Jenkins, Git, GitHub, JIRA, AngularJS, Node.js, HTML5, CSS3, JavaScript, Bootstrap, REST, JSON, Perl, PHP, Scala, IIS, Visual Studio, PyCharm, Spyder, Eclipse, Windows, Linux, Agile.

Client: Honeywell - Albany, NY Jan 2022 – Aug 2024

Sr. Python Full-Stack Developer

Responsibilities:

●Led end-to-end application development using Python, Django, Flask, CherryPy, and MVC architecture, delivering scalable enterprise web applications and RESTful APIs across multiple business domains.

●Actively participated in SDLC phases including requirements gathering, system design, development, testing, deployment, and client interaction.

●Designed and implemented data extraction, transformation, and loading (ETL) pipelines using Python, Pandas, PySpark, Spark-SQL, and Airflow, enabling efficient processing of large structured and unstructured datasets.

●Automated data ingestion into Snowflake, MySQL, PostgreSQL, Hive, and MongoDB, improving data availability and reporting accuracy.

●Built and deployed Generative AI and Machine Learning solutions using RNNs, LSTMs, GANs, StyleGAN, and deepfake-generation techniques alongside Scikit-learn, SciPy, Pandas, and NumPy to create synthetic images, music, and text, improving model training and business insights.

●Fine-tuned pre-trained LLMs and generative models to enhance content generation, automated reporting, and predictive analytics.

●Designed serverless and cloud-native architectures using AWS Lambda, EC2, S3, DynamoDB, RDS, Step Functions, Glue, SQS, SNS, CloudWatch, IAM, and KMS, enabling high-availability, secure, and cost-optimized deployments.

●Developed Python-based AWS Lambda APIs and used AWS CLI to automate backups, monitoring, and resource management.

●Built microservices and API layers using Python (Django, Flask), Node.js, and Spring Boot, integrating REST and GraphQL APIs with Angular 6/7 and React.js frontends.

●Implemented AJAX, JSON, and microservices-based architectures to support high-traffic web applications.

●Developed high-performance data pipelines using PySpark, Spark-SQL, Kafka, and Hive, enabling real-time and batch data processing.

●Integrated Kafka with Spark for streaming analytics and event-driven microservices.

●Containerized and deployed applications using Docker, Kubernetes, AWS ECS, and Jenkins CI/CD pipelines, ensuring reliable builds, automated testing, and fast deployments.

●Automated infrastructure and deployments using Ansible, Rundeck, Jenkins, Maven, and Ant.

●Designed and administered databases including MySQL, PostgreSQL, Oracle, MongoDB, DynamoDB, and SQLite, creating schemas, stored procedures, collections, and optimized queries.

●Built enterprise dashboards and control panels using Django, Oracle, PostgreSQL, and VMware APIs, enabling real-time monitoring and customer administration.

●Created Excel-based analytical reports using XlsxWriter with advanced charts, formatting, and automation.

●Developed web applications using Angular 6/7, React.js, HTML5, CSS3, Bootstrap, JavaScript, jQuery, and integrated them with Python-based APIs.

●Implemented secure internal frameworks using Python, CherryPy, and GnuPG encryption, accelerating service development and data security.

●Wrote Python scripts to parse XML, process time-series data, and load results into MySQL, MongoDB, and Hive using Pandas.

●Followed Agile (SCRUM) methodology with sprint planning, daily stand-ups, retrospectives, Git version control, and Jenkins CI/CD.

●Authored unit and integration tests using PyTest and PyUnit, integrating them into automated build pipelines for quality assurance.

Environment: Python 3.7/3.5, Django 2.x, Flask, CherryPy, PySpark, Spark-SQL, Pandas, NumPy, SciPy, Scikit-learn, GANs, RNNs, LSTMs, StyleGAN, Snowflake, MongoDB, MySQL, PostgreSQL, Oracle, DynamoDB, AWS (EC2, S3, Lambda, RDS, Glue, Step Functions, IAM, CloudWatch, SQS, SNS, KMS), Kafka, Docker, Kubernetes, Jenkins, Ansible, Git, React.js, Angular 6/7, Node.js, HTML5, CSS3, JavaScript, Bootstrap, REST, GraphQL, Linux, Windows.

Client: AT&T - Dallas, TX Mar 2019 – Dec 2021

Sr. Python Developer

Responsibilities:

●Designed, developed, and supported Python-based web applications using Django, Flask, MySQL, and Apache, contributing across all phases of the Software Development Life Cycle (SDLC) including requirements, design, development, testing, deployment, and production support.

●Built secure backend services using Python, Django, RESTful APIs, JSON, and XML, enabling reliable data exchange with external systems.

●Developed data-driven web applications using HTML, CSS, JavaScript, jQuery, AJAX, and JSP, delivering interactive user interfaces and client-side validation.

●Implemented Django forms and PyQt GUI components to manage customer and policy data, supporting CRUD operations and real-time updates.

●Built risk-analysis and insurance systems using Python, Django, and statistical modeling, including insurance premium calculation engines and credit-risk data pipelines.

●Designed and maintained large-scale data stores using MySQL, PostgreSQL, Hive, and HDFS, enabling high-volume historical and transactional analysis.

●Developed machine learning and data science models using Pandas, NumPy, SciPy, Scikit-Learn, XGBoost, SVM, Random Forest, Bayesian HMM, and NLTK to support fraud detection, risk assessment, and predictive analytics.

●Performed data cleaning, data munging, feature engineering, model validation, and visualization using Matplotlib, Bokeh, and Plotly.

●Built big-data pipelines using Apache Spark (Spark SQL, MLlib), Hive, and HDFS on AWS, supporting real-time credit-card fraud detection and large-scale analytics.

●Wrote Hive queries and Spark jobs to compare incoming data against historical records and generate analytical datasets.

●Automated job scheduling, batch processing, and ETL workflows using UNIX shell scripting, Python OS module, process forking, and job control mechanisms.

●Developed Python batch processors to consume and produce multiple data feeds.

●Implemented secure SSH-based authentication and server automation, and used Wireshark for network protocol analysis and debugging.

●Used Valgrind and multithreading to optimize performance, memory usage, and concurrency.

●Built and tested RESTful and SOAP web services, integrating with external APIs such as Twilio, Facebook, and Twitter.

●Wrote unit tests using PyUnit (unittest), improving code reliability and test coverage.

●Created PHP/MySQL backends for Flash-based data entry and developed CSV-based data pipelines for batch processing.

●Used Perl, CGI, and Python scripts to automate content management, file processing, and database updates.

●Managed development, test, certification, and production servers, ensuring environment isolation and system stability.

●Used Subversion (SVN) for version control, supporting team collaboration and release management.

●Worked closely with business users, risk analysts, and domain experts to validate data models, reports, and system behavior.

●Followed Agile/SCRUM methodologies, participating in sprint planning, daily stand-ups, and user acceptance testing.

Environment: Python 2.7, Django 1.3, Flask, Pandas, NumPy, SciPy, Scikit-Learn, XGBoost, Spark (Spark SQL, MLlib), Hive, HDFS, MySQL, PostgreSQL, Oracle, PL/SQL, Apache, REST, SOAP, JSON, XML, HTML, CSS, JavaScript, jQuery, AJAX, PyQt, Eclipse PyDev, SVN, Jenkins, JIRA, Linux, UNIX, Shell Scripting, Perl, C++, Valgrind, AWS.

Client: Finish Line - Indianapolis, IN Jun 2016 – Feb 2019

Python Developer

Responsibilities:

●Designed and developed RESTful microservices using FastAPI, Python, and Flask, enabling secure data exchange between Spotfire and Amazon SageMaker for machine learning model execution.

●Built scalable API layers with dependency injection, modular architecture, and schema-based validation, improving maintainability, reliability, and performance.

●Developed data orchestration pipelines using Apache Airflow (DAGs) for automated data ingestion, transformation, and scheduling, monitored through Airflow UI and CLI.

●Built analytics and reporting workflows using Pandas, NumPy, and Matplotlib for statistical analysis, visualization, and feature engineering supporting ML pipelines.

●Automated AWS infrastructure operations using Python (Boto3) to manage EC2, S3, and Lambda, including instance lifecycle control, object metadata retrieval, and serverless function execution.

●Implemented real-time cloud monitoring and alerting using AWS CloudWatch, StreamAlert, SNS, SQS, and Lambda, enabling event-driven alerts for critical system and security events.

●Configured Amazon Athena to run SQL-based analytics on S3 data lakes, supporting ad-hoc reporting and large-scale data exploration.

●Designed and implemented relational data models and schemas to support application and analytics workloads in MySQL and AWS-based databases.

●Containerized FastAPI and Python applications using Docker and deployed them through Jenkins CI/CD pipelines, ensuring consistent, repeatable, and scalable deployments.

●Built automated testing frameworks using PyUnit, validating API endpoints, data pipelines, and business logic across development and QA environments.

●Actively contributed across all phases of the SDLC including requirements analysis, system design, development, testing, and deployment using Agile methodologies and JIRA.

Environment: Python, FastAPI, Flask, Django, Apache Airflow, Pandas, NumPy, Matplotlib, PyUnit, AWS (EC2, S3, Lambda, Athena, CloudWatch, CloudTrail, Kinesis, SQS, SNS), Spotfire, Amazon SageMaker, Docker, Kubernetes, Jenkins, Git, GitHub, MySQL, JSON, Linux, JIRA, RESTful APIs.

Educational details:

●Bachelor of Technology from JNTU in 2013

●Master’s degree from the University of Hartford in 2016


