KIRANMAYEE
*********************@*****.*** +1-980-***-****
PROFESSIONAL SUMMARY
Over 10 years of experience specializing in Python development, cloud, MLOps and DevOps, web and application development, data engineering, and machine learning (AI/ML), demonstrating strong project leadership and collaborative skills.
Extensive experience in cloud computing, capital markets, higher education, and web application development.
Designed and deployed scalable solutions on AWS (EC2, S3, RDS, Lambda, and Kubernetes), enhancing application performance and reducing downtime.
Developed and deployed AI applications using AWS Bedrock, leveraging its managed services to reduce infrastructure overhead and expedite machine learning model deployment.
Automated workflows using AWS Step Functions, CodeBuild, CodePipeline, and CI/CD pipelines, improving development and deployment efficiency.
Managed cloud infrastructure with AWS CloudFormation, Terraform, and Kubernetes, increasing system reliability and scalability.
Developed robust Python applications leveraging frameworks like Django, FastAPI and Flask, enabling the creation of scalable, high-performance backend services with seamless database integrations through SQLAlchemy and PostgreSQL.
Created scalable, cloud-native data processing pipelines with Azure Databricks and Python, ensuring smooth data transformation and integration with Azure SQL Database and Cosmos DB.
Deep expertise in leveraging Python for AI and natural language processing (NLP) projects, using libraries such as TensorFlow, Keras and NLTK for advanced text processing and machine learning applications.
Skilled in applying Python to machine learning and predictive modeling using frameworks like Spark MLlib, with experience in detailed data analysis and visualization.
Developed and deployed advanced Gen AI models using TensorFlow, PyTorch, and OpenAI GPT APIs to automate text generation, enhancing productivity and decision-making processes for clients across diverse industries.
Developed dynamic and responsive web applications using Python (Django, Flask, FastAPI), JavaScript (Vue.js, React), and front-end technologies (HTML5, CSS3, Bootstrap).
Automated and streamlined code deployment processes by using Azure DevOps, Docker, and Kubernetes, ensuring consistent environments across all stages of development, testing, and production.
Architected scalable, event-driven applications using AWS Lambda, enabling automated workflows that responded to data changes in real-time, reducing latency by over 50%.
Leveraged generative AI tooling such as Hugging Face Transformers and OpenAI Codex to automate code generation for large-scale applications, reducing developer effort and accelerating project timelines.
Utilized Azure Kubernetes Service (AKS) to containerize and orchestrate Python applications, improving scalability and manageability of cloud-native solutions in production.
Implemented CI/CD pipelines for Python applications using Azure DevOps, automating the deployment process and integrating with Azure Kubernetes Service (AKS) for seamless continuous integration.
Built and deployed reinforcement learning models in real-time with AWS SageMaker RL, driving optimized decision-making in dynamic business environments.
Skilled in implementing machine learning models using AWS SageMaker, Amazon Comprehend, Python, R, TensorFlow, PyTorch, Keras, and Scikit-learn, consistently delivering high-performance and accurate predictive analytics.
Implemented complex data structures, algorithms, and database solutions with MySQL, MongoDB, PostgreSQL, and Redis, optimizing data handling and retrieval.
Integrated RESTful APIs, GraphQL, and third-party services to enhance application functionality.
Utilized Python libraries (Pandas, NumPy, Scikit-learn) for data manipulation, analysis, and machine learning, reducing data processing time by 35%.
Implemented real-time data streaming with Kafka and machine learning workflows on Amazon SageMaker, improving data throughput and model training efficiency.
Developed and deployed machine learning models on AWS SageMaker using XGBoost, TensorFlow, and PyTorch, achieving a 25% increase in model accuracy over previous solutions.
Deployed and managed databases on GCP (Cloud SQL, Bigtable, Spanner) and AWS (Redshift, DynamoDB), ensuring high performance and reliability.
Integrated AWS Lambda with Amazon API Gateway and Amazon DynamoDB to build serverless RESTful APIs, improving the speed of data retrieval and response times for applications (a minimal sketch of this pattern appears at the end of this summary).
Designed and implemented Machine Learning solutions with Scikit-learn, Keras, and XGBoost, improving predictive analytics capabilities and reducing time-to-insight by automating key data-driven processes.
Led the training and fine-tuning of custom models using AWS Bedrock's pre-built algorithms for advanced NLP tasks such as sentiment analysis and document classification.
Led development teams in Agile environments, ensuring timely delivery of high-quality software solutions.
Collaborated with cross-functional teams, including traders, analysts, and stakeholders, to design and implement tools for financial markets and higher education.
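A minimal sketch of the serverless Lambda + API Gateway + DynamoDB pattern referenced above, assuming a hypothetical "items" table behind an API Gateway Lambda proxy integration (illustrative only, not the production code):

    import json
    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("items")  # hypothetical table name

    def lambda_handler(event, context):
        # Fetch one item by the id supplied in the API Gateway path.
        item_id = event["pathParameters"]["id"]
        response = table.get_item(Key={"id": item_id})
        item = response.get("Item")
        if item is None:
            return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
        return {"statusCode": 200, "body": json.dumps(item, default=str)}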
TECHNOLOGIES
Python, AWS, Django, Flask, FastAPI, Vue.js, React, Kubernetes, Docker, Jenkins, MySQL, MongoDB, PostgreSQL, Redis, Kafka, SageMaker, CI/CD, Terraform, MLflow, SageMaker Pipelines, Azure ML, AWS Bedrock, Pytest, model versioning, Pandas, NumPy, Scikit-learn, GCP.
TECHNICAL SKILLS
Programming Languages:
Python, JavaScript (ES6+), Shell Scripting, TypeScript
Web Frameworks:
Django, Flask, FastAPI, Express.js
Front-End Technologies:
HTML5, CSS3, Bootstrap, jQuery, AJAX, Vue.js, Angular 17/15/10, React
Database Technologies:
MySQL, MongoDB, PostgreSQL, Cassandra, DynamoDB, Oracle, Cloud SQL, Firestore, Bigtable, Spanner
Cloud:
Azure, AWS
Containerization & Orchestration:
Docker, Kubernetes, AWS ECS
CI/CD and Build Automation:
Jenkins, AWS CodePipeline, Maven, AWS CodeBuild, Ant
API Development and Testing:
Postman, Swagger, GraphQL
Data Processing and Analysis:
Pandas, NumPy, PySpark, Scikit-learn, Power BI, Tableau
Machine Learning & AI:
TensorFlow, PyTorch, Keras, GPT-3, BERT, Transformers
MLOps:
MLflow, SageMaker Pipelines, Azure ML, Jenkins, Pytest, GitHub Actions, model monitoring, experiment tracking, automated testing, CI/CD integration
Logging and Monitoring:
ELK Stack (Elasticsearch, Logstash, Kibana), Prometheus, Grafana, Datadog
Data Engineering:
AWS Glue, PySpark, ETL, Data Pipelines
Version Control:
Git, GitHub, Bitbucket
Security and Authentication:
OAuth, SSL/TLS, JWT (JSON Web Tokens)
Miscellaneous:
XML, XSD, XSLT, SOAP, ActiveMQ, Snowflake, JIRA, Agile (Scrum), NetBeans, IntelliJ IDEA, PySide, Gunicorn
PROFESSIONAL EXPERIENCE
AT&T – Dallas, Texas
Senior Software Engineer (AI/ML, Python Full Stack, AWS & Azure) April 2023 to Present
Led the development of dynamic web applications using Python, significantly improving application performance and user engagement across various domains including banking and mortgage markets.
Developed high-performance RESTful APIs using FastAPI and Python, integrating with Azure Functions and Azure API Management to enable scalable, low-latency backend services for financial applications.
Developed dynamic and responsive web interfaces with HTML5, CSS3, Bootstrap, and JavaScript (ES6+), enhancing user engagement by 85%.
Designed and managed databases using MySQL and MongoDB, implementing efficient data storage and retrieval mechanisms.
Optimized financial data models for portfolio management systems, enhancing the accuracy of asset valuation and risk assessment, leading to more informed investment decisions.
Developed and fine-tuned GPT-based and other AI models for diverse applications using machine learning and OpenAI services.
Utilized transformers and large language models (LLMs) such as GPT-3 and BERT for NLP tasks, improving text classification and sentiment analysis accuracy by 30%.
Integrated Azure Functions with Azure Logic Apps and Event Grid to enable serverless workflows, creating scalable and cost-effective solutions for real-time data processing and task automation, leveraging FastAPI for rapid API development.
Configured and optimized Databricks notebooks for seamless data ingestion, transformation, and reporting.
Developed and executed Python scripts in Databricks for complex data workflows and financial calculations.
Leveraged Databricks' distributed computing environment to handle large-scale data processing and analysis.
Integrated Databricks with multiple APIs (e.g., Tempo, Jira, Jira Align) to fetch and process structured and unstructured data.
Established AI/ML governance frameworks to ensure compliance with industry regulations, including GDPR and HIPAA.
Designed audit-ready documentation for AI/ML pipelines, detailing model performance, data lineage, and decision-making logic.
Implemented JavaScript-based algorithms and dynamic web features with React to enhance user interactions and ensure responsive UIs, using Redux for state management and Webpack for optimized build processes.
Conducted risk assessments for AI/ML models in regulated environments, ensuring adherence to ethical AI standards.
Cleaned, validated, and transformed raw data into actionable insights, ensuring accuracy for financial reporting and analytics.
Implemented scalable ETL pipelines in Databricks for data cleansing, aggregation, and enrichment.
Collaborated with a team of researchers to explore and implement cutting-edge AI technologies, resulting in the development of new AI solutions.
Implemented XML, XSD, and XSLT for data exchange and transformation.
Developed dynamic and responsive user interfaces using Vue.js, enhancing user engagement and improving application performance.
Implemented continuous integration and deployment (CI/CD) pipelines using Azure DevOps and GitLab, enabling seamless and efficient management of code repositories and automated deployment for Gen AI-driven applications.
Authored comprehensive data dictionaries detailing schema, metadata, and variable descriptions for enhanced data governance.
Created detailed model documentation, including algorithm selection rationale, parameter tuning, and performance metrics.
Engineered highly efficient backend systems with Node.js and Express, creating RESTful APIs that processed thousands of concurrent requests while ensuring low latency and high availability in production environments.
Configured and optimized auto-scaling policies for Fargate tasks to handle variable application loads seamlessly.
Implemented secure networking practices, such as VPC configurations and IAM roles, to ensure secure access control and compliance with security policies.
Developed and maintained scalable test automation frameworks with Pytest to support continuous integration and delivery.
Integrated custom plugins and fixtures in Pytest to handle test dependencies efficiently.
Deployed applications using Gunicorn, Docker, and Kubernetes for containerization and orchestration, improving deployment speed by 50%.
Developed custom PySpark functions to extend the capabilities of standard data processing tasks, enabling more complex data transformations and analyses.
Integrated Amazon Comprehend for natural language processing (NLP) tasks, enabling sentiment analysis and keyword extraction from large datasets for actionable business insights.
Utilized Azure Cognitive Services (such as the Text Analytics API, Computer Vision, and Speech Services) to develop Gen AI-driven solutions for language understanding, image recognition, and speech-to-text conversion, significantly enhancing user experience.
Monitored and troubleshot PySpark jobs to ensure reliable and efficient data processing, reducing downtime and improving data pipeline reliability.
Established CI/CD pipelines using Jenkins and AWS CodeBuild for continuous integration and delivery.
Managed the integration of SWAP (Swap One, TS Imagine) with existing systems, improving the efficiency of derivatives processing.
Conducted API testing and documentation with Postman and Swagger to ensure API reliability and usability, achieving a 98% success rate in API tests.
Integrated React components with GraphQL and Apollo Client, allowing for efficient querying and manipulation of data from the backend, resulting in improved client-side performance and user experience.
Integrated SQS with event-driven architectures, ensuring asynchronous communication and decoupling between services.
Developed and optimized backend services for large-scale applications using Java, ensuring seamless integration with front-end interfaces.
Built robust, scalable backend systems with Azure Functions, integrating them with Azure SQL Database and Cosmos DB to handle complex data processing requirements for high-demand Gen AI applications.
Built RESTful APIs in Java Spring Boot, enabling efficient communication between microservices in distributed systems.
Implemented Java-based ETL workflows for real-time data processing, reducing pipeline execution time by 20%.
Designed and maintained enterprise-grade software architectures using Java, focusing on scalability and performance optimization.
Designed and implemented message queuing workflows using Amazon SQS for reliable message storage, retrieval, and processing.
Designed and developed dynamic, responsive web applications using React.js, implementing reusable components to enhance maintainability.
Integrated React frontends with Python-based backends using REST APIs and GraphQL, ensuring seamless data flow and efficient performance.
Designed and developed responsive user interfaces (UIs) using React.js and integrated with RESTful APIs created in Python.
Configured PostgreSQL replication and failover strategies to ensure data availability and minimize downtime.
Utilized SWAP One for derivatives processing, including trade lifecycle management, valuation, and settlement, contributing to streamlined operations and reduced operational risk.
Integrated SWAP One with other financial systems to enhance data flow and consistency across trading platforms, improving the accuracy of financial reporting and compliance.
Optimized Machine Learning workflows with MLflow and Azure ML for model tracking, versioning, and deployment, ensuring streamlined model management and faster iteration cycles for evolving Gen AI solutions.
Implemented MLOps best practices using MLflow and Azure ML for model versioning, experiment tracking, and continuous integration of Gen AI models.
Built end-to-end machine learning pipelines with CI/CD automation, enabling rapid deployment and monitoring of models in production.
Developed scalable test automation frameworks with Pytest and integrated with Jenkins and GitHub Actions for automated model testing.
Utilized MLflow for tracking machine learning experiments, versioning models, and ensuring reproducibility of model training processes across multiple projects (see the sketch at the end of this section).
Leveraged TS Imagine for monitoring and analyzing market risk, providing actionable insights and mitigating potential exposure in volatile market conditions.
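A minimal MLflow experiment-tracking sketch of the workflow described above (the experiment name, parameters, and model are hypothetical placeholders):

    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    mlflow.set_experiment("demo-experiment")  # hypothetical experiment name

    X, y = make_classification(n_samples=500, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    with mlflow.start_run():
        params = {"n_estimators": 100, "max_depth": 5}
        model = RandomForestClassifier(**params).fit(X_train, y_train)
        accuracy = accuracy_score(y_test, model.predict(X_test))
        mlflow.log_params(params)                 # record hyperparameters
        mlflow.log_metric("accuracy", accuracy)   # record the evaluation metric
        mlflow.sklearn.log_model(model, "model")  # version the trained model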
eBay – San Jose, California
Senior Software Engineer - AI/ML & Full Stack January 2021 to March 2023
Developed and maintained trading platforms for equity and fixed-income markets (capital markets), ensuring real-time data processing and transaction accuracy, which led to a 20% increase in trade execution speed.
Implemented scalable solutions on AWS (EC2, S3, RDS, Lambda, API Gateway) to enhance application performance.
Designed and developed high-performance RESTful APIs using FastAPI and Python, enabling efficient communication for trading platforms and integrating with AWS API Gateway for seamless scalability and real-time data processing.
Implemented risk management solutions in a capital markets security application, integrating real-time data feeds to monitor market risks and adjust trading strategies accordingly, reducing potential losses by 15%.
Utilized advanced Python programming to develop and deploy AI models on AWS, optimizing data pipelines for predictive analytics in management, underwriting, and risk assessment.
Implemented cost-effective batch processing jobs using AWS Lambda in conjunction with Amazon SQS, automating tasks like image processing, which reduced manual workload by 80%.
Leveraged DynamoDB and OpenSearch for real-time data storage and retrieval, ensuring optimal performance for AI applications.
Managed and optimized databases with PostgreSQL and implemented caching with Redis for improved performance.
Leveraged NumPy, Pandas, and Scikit-learn for data manipulation, analysis, and machine learning tasks, reducing data processing time by 35%.
Configured security and compliance features within AWS Bedrock to ensure that AI workloads adhered to industry standards, improving data privacy and governance.
Utilized Python and FastAPI to develop microservices, deploying Flask- and FastAPI-based endpoints that enabled real-time decision-making for predictive analytics in trading systems.
Utilized Python for data manipulation and analysis tasks, leveraging libraries like Pandas, NumPy, and Matplotlib, and built predictive models deployed via Flask-based microservices for real-time decision-making.
Employed the ELK stack (Elasticsearch, Logstash, Kibana) for efficient logging, monitoring, and visualization.
Implemented real-time data streaming and processing with Kafka, enhancing data throughput by 30%.
Collaborated with traders and analysts to design and develop tools for market data analysis, enabling the identification of arbitrage opportunities and enhancing trading profitability.
Managed version control and collaborative development with Git.
Configured Nginx for web server management and load balancing.
Implemented complex data queries and transformations in Apache Hive to streamline analytics for multi-terabyte datasets.
Designed HiveQL scripts for batch data processing, improving ETL efficiency by 25% in a distributed environment.
Integrated Hive with Spark and Kafka for real-time data ingestion and analysis pipelines in the Big Data ecosystem.
Implemented automated workflows within AWS Bedrock, leveraging its integration with Amazon Polly and Amazon Translate to enhance multi-lingual chatbot functionalities.
Designed Python scripts for seamless integration with external APIs, enabling automated data retrieval and updates within Databricks.
Collaborated with cross-functional teams to refine Databricks workflows and ensure alignment with business objectives.
Documented best practices for Databricks notebook management, including version control and parameterization.
Automated monthly processes, such as data extraction and accrual report generation, using Python and Databricks notebooks.
Created dynamic and interactive front-end applications using React, implementing features such as lazy loading, code-splitting, and hooks for improved performance, scalability, and maintainability.
Diagnosed and resolved performance bottlenecks in Databricks notebooks and Python scripts.
Implemented robust error-handling mechanisms for debugging API issues and ensuring data pipeline reliability.
Monitored SQS queues using CloudWatch metrics, setting up alerts for queue depth, processing times, and message delay to ensure operational efficiency.
Established CI/CD pipelines with Jenkins, integrating SonarQube for continuous code quality checks, reducing code review time by 20%.
Developed process flows and system architecture diagrams to complement model and application documentation.
Maintained version-controlled documentation for data pipelines and machine learning workflows to ensure consistency across teams.
Automated build and test processes using AWS CodeBuild as part of CI/CD pipelines, leading to a 50% reduction in build times and enhancing the reliability of code deployments.
Integrated AWS CodeDeploy with Jenkins and CodePipeline to streamline continuous deployment workflows, reducing deployment times by 30% and minimizing manual intervention.
Designed and developed seamless integration between React front-end and Node.js backend, implementing real-time features such as chat applications and live data updates using Socket.IO for bidirectional communication.
Designed and developed scalable solutions to handle high-traffic and data-intensive applications.
Conducted unit testing to ensure code quality and reliability, reducing post-deployment bugs by 50%.
Utilized Docker for containerization and Kubernetes for orchestration, enhancing application deployment and management and improving deployment efficiency by 50%.
Configured AWS CloudFront for content delivery and caching to improve application speed and performance, increasing content delivery speed by 30%.
Integrated Pytest test suites with CI/CD pipelines (e.g., Jenkins, GitHub Actions) to ensure automated testing during builds.
Automated deployment validation using Pytest in cloud environments like AWS.
Automated API testing workflows using Pytest to validate RESTful services and ensure robust backend functionality (a minimal sketch appears at the end of this section).
Automated build and deployment processes with Maven, ensuring efficient software delivery.
Used Eclipse IDE for efficient coding, debugging, and project management.
Leveraged Node.js and Express to build and maintain microservice architectures, using tools like Docker for containerization and Kubernetes for orchestration to scale services based on demand and improve system reliability.
Managed project tasks and workflows using JIRA, adhering to Agile (Scrum) methodologies for iterative development and timely delivery.
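A minimal Pytest sketch of the API-testing approach referenced above (the base URL and /health route are hypothetical):

    import pytest
    import requests

    BASE_URL = "http://localhost:8000"  # hypothetical service under test

    @pytest.fixture
    def session():
        # One shared HTTP session so tests reuse a connection pool.
        with requests.Session() as s:
            yield s

    def test_health_endpoint(session):
        response = session.get(f"{BASE_URL}/health", timeout=5)
        assert response.status_code == 200
        assert response.json().get("status") == "ok"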
Cisco Systems – San Jose, California
Senior Software Engineer August 2018 to December 2021
Implemented microservices in Python, deploying them on Azure Kubernetes Service (AKS) with Docker containers for efficient orchestration and automated scaling.
Integrated Azure Cosmos DB with Python applications to provide globally distributed NoSQL databases, ensuring low-latency and high-throughput data storage for web applications.
Managed continuous integration and continuous deployment (CI/CD) workflows for Python applications using Azure DevOps, ensuring automated testing and smooth deployments to Azure App Services.
Responsible for debugging and troubleshooting application issues.
Debugged application issues tracked in JIRA, following Agile methodology.
Developed, tested, and debugged software tools utilized by clients and internal customers.
Built Python backend services that interact with Azure SQL Database, providing data-driven web applications with optimized relational database queries and seamless cloud-based integration.
Coded test programs and evaluated existing engineering processes.
Implemented comprehensive testing strategies with Jest and Mocha for JavaScript and Node.js applications, ensuring high code quality and reducing bugs in production through automated unit and integration tests.
Responsible for running and maintaining business processes on a daily / weekly / monthly basis.
Implemented real-time messaging in Python applications using Azure Service Bus and Azure Queue Storage, enabling asynchronous communication between microservices deployed on Azure Kubernetes Service (AKS); a minimal sketch appears at the end of this section.
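A minimal Azure Service Bus sketch of the asynchronous messaging described above (the connection string and queue name are hypothetical placeholders):

    from azure.servicebus import ServiceBusClient, ServiceBusMessage

    CONN_STR = "<service-bus-connection-string>"  # hypothetical
    QUEUE_NAME = "orders"                         # hypothetical

    def send_event(payload: str) -> None:
        # Publish one message for a downstream microservice.
        with ServiceBusClient.from_connection_string(CONN_STR) as client:
            with client.get_queue_sender(QUEUE_NAME) as sender:
                sender.send_messages(ServiceBusMessage(payload))

    def receive_events() -> None:
        # Drain available messages, completing each so it is not redelivered.
        with ServiceBusClient.from_connection_string(CONN_STR) as client:
            with client.get_queue_receiver(QUEUE_NAME, max_wait_time=5) as receiver:
                for message in receiver:
                    print(str(message))
                    receiver.complete_message(message)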
Novartis – Hyderabad, India
Senior Python Developer October 2016 to June 2018
Developed and maintained web applications using Python and Flask, implementing data structures and algorithms to optimize performance.
Applied Object-Oriented Programming (OOP) principles and design patterns to enhance code modularity and maintainability.
Utilized SQLAlchemy ORM for efficient database interactions and integrated AWS services such as EC2, RDS, S3, and Lambda for scalable cloud solutions (see the sketch at the end of this section).
Automated tasks and deployment processes with Shell and Ant scripts, improving development workflows.
Conducted database design and optimization using Oracle, ensuring high performance and reliability.
Developed and consumed SOAP services with WSDL, ensuring secure and efficient data exchange.
Established CI/CD pipelines with Jenkins for continuous integration and deployment, enhancing software delivery efficiency.
Integrated ActiveMQ for asynchronous messaging, improving system scalability and reliability.
Utilized XML for data representation and WebLogic server for robust application deployment.
Practiced Agile (Scrum) methodologies to ensure iterative development and timely project delivery.
Utilized IntelliJ IDEA for efficient coding, debugging, and testing.
Conducted unit testing with Mockito to ensure code quality and reliability.
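A minimal SQLAlchemy ORM sketch of the database-interaction pattern referenced above (the User model and in-memory SQLite URL are hypothetical stand-ins for the real schema and RDS connection):

    from sqlalchemy import Column, Integer, String, create_engine
    from sqlalchemy.orm import Session, declarative_base

    Base = declarative_base()

    class User(Base):
        __tablename__ = "users"
        id = Column(Integer, primary_key=True)
        name = Column(String, nullable=False)

    engine = create_engine("sqlite:///:memory:")  # swap for the real RDS URL
    Base.metadata.create_all(engine)

    with Session(engine) as session:
        session.add(User(name="alice"))
        session.commit()
        print(session.query(User).filter_by(name="alice").first().id)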
SS&C Eze Software – Bangalore, India
Python Developer January 2015 to September 2016
Automated the execution of batch tests using Shell scripting, integrated with Python, streamlining test processes.
Developed and managed complex data structures and XML parsing routines using Python for efficient data handling.
Utilized Python's pickle module for serializing and deserializing objects, facilitating data sharing across applications (a minimal round-trip sketch appears at the end of this section).
Implemented PyQt for dynamic column filtering, enhancing user experience by allowing effective viewing of transactions and statements.
Applied navigation rules for improved application flow.
Leveraged Django database APIs to interact with database objects, ensuring seamless data access and manipulation.
Refactored Python/Django modules to support various data formats, meeting specific project requirements.
Created Python APIs for debugging, enabling the capture and examination of array structures at failure points.
Wrote Python programs for parsing and validating data from Excel files, handling large volumes of user data efficiently.
Designed and implemented Cassandra databases for server-client data management, ensuring scalability and reliability.
Developed front-end interfaces using HTML, CSS, JavaScript, and jQuery, enhancing user interaction and application functionality.
Dynamically generated property lists for applications using Python, facilitating easier configuration and management.
Implemented high-performance MongoDB instances to support scalable data storage and retrieval.
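A minimal pickle round-trip sketch of the serialization technique referenced above (the record contents are hypothetical):

    import pickle

    record = {"trade_id": 42, "symbol": "ABC", "qty": [100, 250]}

    blob = pickle.dumps(record)    # serialize ("pickle") to bytes
    restored = pickle.loads(blob)  # deserialize ("unpickle")
    assert restored == record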
EDUCATION
Bachelor of Science in Computer Science
Gitam University
Graduated: May 2014
CERTIFICATIONS
AWS Certified Machine Learning - Specialty