
Senior Python & AI Engineer for Enterprise GenAI Solutions

Location:
Milpitas, CA
Posted:
January 05, 2026

Contact this candidate

Resume:

Sai Sashank

***************@*****.*** +1-816-***-**** LinkedIn

Summary:

•Senior Python Developer and AI Engineer with 9 years of experience architecting and delivering scalable, cloud-native, and AI-driven solutions in high-performance environments.

•Designed and implemented production-grade Generative AI applications using Vertex AI, GPT-4, and Hugging Face models to support clinical knowledge access and internal decision-support workflows in a HIPAA-compliant environment.

•Built retrieval-augmented generation (RAG) pipelines using Vertex AI, LangChain, BigQuery, FAISS, and Pinecone, enabling secure conversational access to SOPs, HL7 mappings, and pathology documentation.

•Developed LLM-powered internal assistants and copilots that reduced document search time by ~60% and improved productivity for clinicians, lab technicians, and operations teams.

•Fine-tuned domain-specific LLMs (LLaMA, Mistral) on Vertex AI for clinical summarization, pathology report generation, and medical Q&A, improving accuracy and reducing manual review effort.

•Implemented LLM evaluation and observability using RAGAS and custom scoring metrics to track hallucination rates, answer relevance, and context precision across production AI systems.

•Created agent-based AI workflows using LangChain, AutoGen, Rasa, and BotPress, enabling multi-step reasoning, tool calling, memory persistence, and task automation for healthcare operations.

•Established prompt engineering standards and guardrails across teams, introducing reusable prompt templates, structured outputs, and validation strategies to ensure consistent and reliable LLM behavior.

•Built scalable backend services using Python (FastAPI, Django) to expose GenAI and RAG capabilities through secure APIs consumed by React-based internal applications.

•Deployed and operated cloud-native AI services on GCP (Vertex AI, Cloud Run, BigQuery, Cloud Functions) with hybrid support across AWS and Azure for enterprise workloads.

•Implemented MLOps pipelines using Vertex AI Pipelines, MLflow, and CI/CD, supporting model versioning, automated retraining, and controlled production rollouts.

•Orchestrated containerized microservices using Docker, Kubernetes, and Terraform, ensuring high availability, scalability, and environment consistency across AI platforms.

•Designed data pipelines using BigQuery, Snowflake, PostgreSQL, and MongoDB to support embeddings generation, LLM training, and analytics workloads.

•Set up observability and monitoring using ELK Stack, Prometheus, and Grafana, providing visibility into AI service latency, model performance, and system health.

•Implemented enterprise security controls using OAuth2, JWT, RBAC, and cloud IAM, ensuring secure access to GenAI systems and sensitive healthcare data.

•Mentored engineers on GenAI adoption and best practices, supporting architectural reviews, sprint planning, and the rollout of LLM-powered solutions across teams.

•Built and deployed machine learning models using scikit-learn, TensorFlow, and PyTorch for classification, regression, clustering, and anomaly detection across healthcare and operational datasets.

•Designed deep learning pipelines for NLP and predictive analytics, including feature engineering, model training, hyperparameter tuning, and performance evaluation in production environments.

•Developed and maintained database architectures using PostgreSQL, SQL Server, Snowflake, MongoDB, and Cassandra, supporting high-throughput transactional and analytical workloads.

•Optimized complex SQL and T-SQL queries, stored procedures, and indexing strategies to improve query performance and reduce latency in mission-critical systems.

•Built end-to-end ETL/ELT pipelines using Apache Airflow, Azure Data Factory, and Python, enabling reliable ingestion, transformation, and validation of large-scale structured and unstructured data.

•Created Power BI dashboards with advanced DAX, row-level security, and incremental refresh to deliver real-time insights for leadership and operational teams.

•Developed Tableau dashboards and visual analytics using custom SQL, calculated fields, parameters, and LOD expressions to support KPI tracking and trend analysis.

•Integrated Python-driven analytics with Power BI and Tableau, combining statistical modeling and ML outputs with interactive visualizations.

•Collaborated with stakeholders across data science, engineering, and business teams to translate analytical requirements into scalable data, ML, and BI solutions.
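The retrieval-augmented generation (RAG) pattern referenced throughout the summary can be sketched without any vector database or hosted model. A minimal stdlib illustration, using a toy bag-of-words embedding in place of a real embedding model and invented SOP strings as stand-in documents:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: a bag-of-words term-frequency vector.
    A production pipeline would call a real embedding model instead."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by similarity to the query and keep the top k."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Ground the downstream LLM call in the retrieved context."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Illustrative SOP snippets, not actual clinical content.
sops = [
    "Specimen barcodes must be scanned twice before accessioning.",
    "HL7 ORU messages carry observation results to the LIS.",
    "Slides are stored at room temperature for ten years.",
]
print(build_prompt("How are HL7 observation results delivered?", sops))
```

In the systems described above, the same retrieve-then-prompt shape is handled by LangChain against FAISS or Pinecone; the sketch only shows why retrieval grounding keeps the model's answers tied to source documents.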

Technology Skills:

Generative AI & LLMs: Vertex AI, OpenAI GPT-4, ChatGPT API, Hugging Face Transformers, LangChain, LlamaIndex, AutoGen, RAG, Prompt Engineering, Tool Calling, Vector DBs (Pinecone, FAISS, Weaviate), RAGAS

MLOps & AI Platforms: Vertex AI Pipelines, MLflow, Kubeflow, AWS SageMaker, Azure ML, TFX, Model Monitoring, Drift Detection, CI/CD for ML

Programming Languages: Python, SQL, T-SQL, PL/SQL, JavaScript, TypeScript, Java, Bash, PowerShell

Backend & API Development: FastAPI, Django REST Framework, Flask, Express.js, REST APIs, GraphQL, gRPC, OAuth2, JWT, RBAC

Cloud & DevOps Platforms: GCP (Vertex AI, Cloud Run, BigQuery, GKE), AWS (Lambda, EC2, S3, RDS), Azure (Functions, AKS), Docker, Kubernetes, Terraform, Jenkins

Databases & Storage: PostgreSQL, SQL Server, MySQL, MongoDB, Redis, Cassandra, Snowflake, BigQuery, DynamoDB

Machine Learning & Deep Learning: Scikit-learn, TensorFlow, PyTorch, XGBoost, LightGBM, Feature Engineering, Model Tuning

Data Engineering: Apache Airflow, Azure Data Factory, Kafka, Spark, dbt, ETL/ELT Pipelines, SQLAlchemy

Frontend Technologies: React.js, Next.js, Angular, Redux, Bootstrap, Tailwind CSS, HTML5, CSS3, SASS

Monitoring & Observability: ELK Stack, Prometheus, Grafana, CloudWatch, DataDog, Splunk

Version Control & CI/CD: Git, GitHub, GitLab, GitHub Actions, CircleCI, Azure DevOps

Containerization & Infrastructure: Docker, Kubernetes, Helm, Terraform, CloudFormation, Istio, Nginx

Data Visualization & BI: Power BI, Tableau, Plotly, Dash, Matplotlib, Seaborn

IDEs & Dev Tools: VS Code, PyCharm, Jupyter Notebook, IntelliJ IDEA, Postman, Swagger/OpenAPI

Project & Collaboration Tools: Jira, Confluence, ServiceNow

Operating Systems: Linux (Ubuntu), macOS, Windows Server, UNIX

WORK EXPERIENCE:

Senior AI Engineer / ML Engineer

Kaiser Permanente, Berkeley, California - Onsite April 2023 – Present

Project Description: At Kaiser Permanente, I led the migration of legacy healthcare systems to a cloud-native microservices architecture using Python (FastAPI, Django) and React.js. Managed full SDLC, integrating RESTful APIs, modernized SOAP services, and implemented GraphQL for efficient data querying. Developed middleware with Health Connect ensuring HIPAA compliance and HL7 healthcare data standards. Designed automated CI/CD pipelines with GitLab, Azure DevOps, and SonarQube for quality and security. Integrated cutting-edge Generative AI solutions including NLP-based clinical summarization and conversational AI assistants. Utilized Docker, Kubernetes, Prometheus, Grafana, and ELK stack for scalable deployments and monitoring. Collaborated with data science teams to deploy AI/ML models in production via MLflow and SageMaker for predictive analytics.

Responsibilities:

•Spearheaded the implementation of Leica GT450 digital pathology scanners in one of Northern California’s largest anatomic pathology labs, digitizing workflows and enabling scalable diagnostic imaging.

•Created LLM evaluation dashboards using RAGAS to measure faithfulness, context precision, and answer relevancy across guest-facing AI assistants.

•Developed a fine-tuned LLM for hospitality FAQs, training on historical call-center transcripts and guest feedback data to improve personalization.

•Administered and monitored SAM server configurations, deploying and validating scanner upgrades to ensure system stability and lab compliance.

•Automated AI evaluation pipelines with dashboards that tracked business impact, performance metrics, and adoption KPIs across multiple teams.

•Deployed and tested Path Tracker bulk barcode scanning systems, improving specimen tracking accuracy through seamless integration and QA.

•Integrated RAG capabilities using LangChain and OpenAI APIs to develop an internal assistant for hotel staff that retrieves policy documents and procedures, reducing onboarding time by 35%.

•Automated model training pipelines using Apache Airflow, significantly reducing the manual overhead in data preparation, training, validation, and deployment cycles.

•Designed and delivered multiple user-specific web applications using Python (FastAPI, Django) and React.js, streamlining workflows for pathologists, lab techs, and clinical staff

•Collaborated closely with end-users to gather functional requirements, translating them into responsive, interactive UIs with React.js and secure backend APIs using Python + PostgreSQL

•Migrated legacy healthcare systems to a modern microservices architecture built with .NET, Angular, and Python, handling full SDLC, REST/SOAP integrations, and LINQ-based data access

•Developed HIPAA-compliant middleware services using Health Connect and HL7 messaging, ensuring secure clinical data exchange across distributed systems

•Built and maintained multiple full-stack applications using Python (Django, FastAPI), React, and Tastypie, supporting lab automation and pathology reporting systems

•Created LLM-powered agents using Rasa and BotPress with memory, reasoning, and external tool integrations to automate technician workflows and case routing

•Applied ML algorithms including Random Forest, KNN, Naive Bayes, and clustering using scikit-learn, pandas, NumPy, and Python to analyze device performance and predict memory failures

•Built real-time dashboards with ELK Stack (Elasticsearch, Logstash, Kibana), Graphite, and Redis to give clinical leadership visibility into lab operations and equipment health

•Delivered GenAI solutions like GPT-4-powered documentation assistants and workflow copilots, built with LangChain and integrated into internal web apps

•Deployed serverless and containerized applications using AWS Lambda, EC2, S3, Elastic Beanstalk, and Docker, enabling rapid iteration and scalable performance

•Led release management and DevOps processes including GitHub permission handling, rollback strategies, and coordination across QA, security, and engineering teams

•Mentored internal teams on GenAI best practices, facilitating adoption through training, architectural reviews, and rapid prototyping sessions

•Built analytics dashboards and lab reports using T-SQL, Power BI, and Vantage, enabling leadership to make data-informed decisions

•Designed a GPT-4-powered digital assistant with LangChain to support conversational access to SOPs, equipment manuals, and HL7 mappings

•Developed retrieval-augmented generation (RAG) pipelines using FAISS and Pinecone to support intelligent querying of pathology records and imaging notes

•Fine-tuned LLMs (LLaMA, Mistral) to generate structured pathology reports from unstructured sources like scanned PDFs and audio notes

•Created an NLP pipeline to extract clinical entities from pathology reports and match them with structured patient data, reducing manual reconciliation time by 60%

•Integrated GenAI copilots into React-based lab apps to assist with barcode validation, scan quality checks, and technician troubleshooting

•Managed ML lifecycle with MLflow, implementing model versioning, performance tracking, and automated retraining triggers

•Developed ETL pipelines using Apache Airflow to clean and prepare scanner logs, lab audit trails, and database extracts from SQL Server and MongoDB

•Optimized LLM usage with advanced prompt engineering, tool-calling, and chain-of-thought reasoning for consistent and reliable model outputs

•Established an internal LLM governance framework to track hallucination rates, prompt quality, feedback loops, and model comparison metrics

Environment: Python 3.10+, FastAPI, Django 3.x/4.x, ReactJS, PostgreSQL, Microsoft SQL Server, MongoDB, HL7, Health Connect, RESTful APIs, SOAP, GraphQL, Microservices, Serverless (AWS Lambda), Docker, Kubernetes, Apache Airflow, Git, GitHub, GitLab CI/CD, Jenkins, SonarQube, MLflow, Pandas, NumPy, Scikit-learn, LangChain, OpenAI API (GPT-4), Pinecone, FAISS, Streamlit, Dash, Power BI, T-SQL, Excel VBA, Tableau, Vantage, Elastic Beanstalk, CloudFront, S3, EC2, Azure DevOps, Prometheus, Grafana, ELK Stack, Pytest, Visual Studio Code, Jupyter, Jira, Confluence.
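The hallucination-rate and context-precision tracking described in this role follows a simple metric shape. In RAGAS these scores are computed by an LLM judge; the stdlib sketch below substitutes a word-overlap proxy purely to illustrate what the dashboards measure (all example strings are invented):

```python
def _tokens(text: str) -> set:
    """Normalize a sentence to a set of lowercase terms."""
    return {t.strip(".,?!").lower() for t in text.split()}

def faithfulness(answer: str, contexts: list[str]) -> float:
    """Fraction of answer terms supported by the retrieved context.
    A low score is a cheap hallucination signal; RAGAS uses an LLM
    judge rather than this word-overlap proxy."""
    support = set().union(*(_tokens(c) for c in contexts))
    ans = _tokens(answer)
    return len(ans & support) / len(ans) if ans else 1.0

def context_precision(contexts: list[str], ground_truth: str) -> float:
    """Fraction of retrieved chunks that overlap the reference answer."""
    truth = _tokens(ground_truth)
    relevant = [c for c in contexts if _tokens(c) & truth]
    return len(relevant) / len(contexts) if contexts else 0.0

ctx = ["Slides are stored for ten years.", "Barcodes are scanned twice."]
print(faithfulness("Slides are stored for ten years.", ctx))  # fully grounded -> 1.0
print(faithfulness("Slides are stored on the moon.", ctx))    # partly hallucinated -> 0.5
```

Tracking these two numbers per release is what lets a governance framework compare prompts and models over time, as the bullets above describe.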

Senior Software Engineer (AI/ML Engineer GenAI Full Stack Python)

New York State Department of Labor – Remote Jun 2022 – Mar 2023

Project Description: Developed and deployed cloud-native web and mobile applications for claims and benefits migration using Python (FastAPI, Django) and ReactJS with secure, real-time data integration. Built serverless APIs using AWS Lambda, automated deployments via Jenkins and Git, and implemented ML models using scikit-learn, Pandas, and Snowflake. Collaborated with geospatial analysts, contributed to open-source Python tools, and supported data validation using R and Excel VBA. Ensured compliance with security hardening standards and led testing, documentation, and production rollout.

Responsibilities:

•Engineered and deployed scalable cloud-native full-stack applications using Python (FastAPI, Django) and ReactJS, designed to optimize state-wide labor data processing and compliance tracking workflows

•Built secure and modular RESTful APIs deployed on AWS Lambda with integration across EC2, S3, CloudFront, Cloud Search, and Elastic Beanstalk, reducing manual intervention in case reporting by 70%

•Created and deployed serverless ML inference pipelines using Python, Pandas, and scikit-learn to detect anomalies in workforce trends and improve decision-making latency

•Applied a range of ML algorithms (Linear Regression, Naive Bayes, Random Forest, K-Means) using scikit-learn, NumPy, Seaborn, NLTK, and Matplotlib to uncover patterns in citizen job behavior and program efficacy

•Extracted data from Snowflake and Redshift, transforming and loading it into One Lake Data Warehouse to support analytics dashboards used by state-level stakeholders

•Designed ETL pipelines for high-volume data ingestion using Jenkins, Python scripts, and TWS Scheduler, ensuring consistency and fault tolerance

•Collaborated with cross-functional teams of geospatial analysts, data scientists, and engineers to build scalable data processing tools and integrate spatial insights into labor and healthcare programs

•Contributed to open-source geospatial Python tools, engaging with the geospatial tech community to standardize data processing pipelines and improve performance

•Developed data access and reporting scripts in R, supporting statistical validation and visualization for labor trend analysis

•Applied system hardening guidelines, performed post-configuration functional testing, and aligned deployments with internal policy and security standards

•Developed user-centric frontend interfaces using ReactJS, TypeScript, and Bootstrap, aligned with accessibility standards and responsive UX principles

•Built visual workflows and automated reporting tools using Power BI, Power Automate, and Power Apps, enabling secure, self-service analytics for healthcare and employment data

•Designed complex Tableau dashboards by joining multiple data sources; implemented user role-based security, parameters, filters, and KPIs aligned to healthcare reporting requirements

•Created custom reporting and visualizations in SSRS (tabular, matrix, chart, drill-down) and optimized SQL Server performance through query tuning and indexed views

•Collaborated closely with data analysts, engineers, and product leads to integrate data validation layers and predictive logic into both React-based UIs and serverless backends

•Implemented data governance and compliance standards aligned with HIPAA-like models, applying strict access control, audit logging, and field-level encryption on sensitive labor datasets

•Developed custom ReactJS hooks and Redux logic to manage complex stateful workflows in claim processing portals

•Ensured secure infrastructure by implementing AWS IAM policies, MFA, encrypted S3 buckets, and VPC subnetting strategies across deployments

•Designed and maintained data models for Postgres and SQL Server, handling relational integrity, index tuning, and query optimization for high-throughput access

•Created custom metrics dashboards using Grafana and CloudWatch, enabling live infrastructure monitoring and alerting across services

Environment: Python 3.10+, FastAPI, Django 3.x, ReactJS, TypeScript, PostgreSQL, MS SQL Server (2012/14/16), Snowflake, Redshift, One Lake, Pandas, NumPy, Seaborn, NLTK, Scikit-learn, Matplotlib, Jenkins, Git, GitHub, AWS (Lambda, EC2, S3, CloudFront, Route 53, Elastic Beanstalk), Serverless Framework, TWS Scheduler, Power BI, Tableau, SSRS, Power Automate, Power Apps, Excel VBA, ELK Stack, Bootstrap, HTML5, CSS3, JIRA, Visual Studio, SAP.
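The anomaly detection over workforce trends mentioned above boils down to flagging points that deviate sharply from a baseline. A minimal stdlib sketch using a z-score rule as a stand-in for the scikit-learn models the pipeline actually used (the claim counts are invented):

```python
from statistics import mean, stdev

def zscore_anomalies(series: list[float], threshold: float = 3.0) -> list[int]:
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the series mean — a simple illustrative stand-in for
    the production scikit-learn models."""
    mu, sigma = mean(series), stdev(series)
    if sigma == 0:
        return []  # a flat series has no outliers
    return [i for i, x in enumerate(series) if abs(x - mu) / sigma > threshold]

# Illustrative weekly claim counts with one obvious spike.
weekly_claims = [1020, 990, 1005, 1012, 998, 5400, 1001]
print(zscore_anomalies(weekly_claims, threshold=2.0))  # → [5]
```

The same flag-and-review loop, fed from Snowflake extracts instead of a hard-coded list, is what shortened decision-making latency for the state stakeholders described above.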

Machine Learning Engineer / Python Developer

Cyberminds Inc – Remote Jan 2022 – May 2022

Project Description: End-to-end development of big data pipelines and predictive analytics solutions supporting enterprise ERP/CRM and IoT platforms. Built real-time data workflows using Spark and Kafka, and developed deep learning models for classification and anomaly detection. Integrated Tableau with big data and cloud sources to deliver dynamic visual insights for stakeholders. Delivered production-ready solutions with automated CI/CD and cloud-native deployments on AWS and Azure.

Responsibilities:

•Designed and built scalable cloud-native data pipelines using Apache Spark, AWS Glue, PySpark, and SQL for ingesting and transforming large volumes of structured and unstructured data to support real-time ERP and CRM systems

•Developed and deployed predictive models and deep learning solutions using TensorFlow, Scikit-learn, and Python for classification, anomaly detection, and forecasting on high-dimensional datasets

•Created advanced, interactive dashboards using Tableau, Power BI, and Python visualizations, optimizing performance on massive datasets by tuning extracts, aggregations, and custom SQL logic

•Integrated Tableau with big data systems like HDFS, MongoDB, and Cassandra, and connected to cloud sources including AWS Athena and Google Cloud Storage to power real-time data-driven decisions

•Developed dynamic IoT data visualizations by streaming sensor data through Apache Kafka and Apache Flink, enabling real-time monitoring and alerting for mission-critical infrastructure

•Combined Tableau and Python for statistical modeling and ML-driven analysis, delivering automated insight generation and predictive KPIs to business stakeholders

•Automated dashboard refreshes, access provisioning, and report distribution using Tableau Server, Tableau Online, and Tableau REST API, ensuring always-on visibility for execs and analysts

•Built and maintained reusable UI components in Microsoft Power Apps, enhancing user experience and scalability across internal-facing business applications

•Collaborated with cross-functional teams to transform business requirements into efficient cloud-native architecture and real-time analytics solutions on AWS and Azure platforms

•Wrote robust, modular code in Java, Python, and C# to support backend services, data processing jobs, and integration with third-party systems and APIs

•Utilized Python multiprocessing and async I/O to improve performance of batch ingestion and API orchestration

•Implemented CI/CD pipelines for data and ML applications to automate integration, testing, and cloud deployment, reducing release friction and improving team velocity

•Worked closely with data scientists and product managers to deliver production-grade ML workflows, supporting predictive analytics for marketing, finance, and operations use cases

•Optimized big data processing performance, storage costs, and latency through partitioning, caching, and query tuning in Spark and SQL environments

•Ensured secure data access and auditability by configuring IAM roles, VPCs, security groups, and encryption policies in line with industry compliance standards

Environment: Apache Spark, PySpark, AWS Glue, TensorFlow, Scikit-learn, Pandas, NumPy, Tableau, Power BI, Tableau Server/Online, Tableau REST API, Power Apps, HDFS, MongoDB, Cassandra, AWS Athena, GCS, AWS Lambda, S3, IAM, Kafka, Flink, SQL, PostgreSQL, Java, Python, C#, Jenkins, Azure DevOps, CI/CD, Git, MLflow, Google Cloud Platform
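The real-time IoT monitoring described in this role is, at its core, windowed aggregation over a stream. In production the stream arrived via Kafka and the windowing ran in Flink; this in-memory sketch only shows the alerting logic, with invented sensor readings and thresholds:

```python
from collections import deque

class WindowedAlert:
    """Rolling-average alerting over a sensor stream — an illustrative
    stand-in for the Kafka/Flink pipeline, not its actual code."""

    def __init__(self, size: int, limit: float):
        self.window = deque(maxlen=size)  # keeps only the last `size` readings
        self.limit = limit

    def push(self, value: float) -> bool:
        """Add a reading; True when a full window's average breaches the limit."""
        self.window.append(value)
        avg = sum(self.window) / len(self.window)
        return len(self.window) == self.window.maxlen and avg > self.limit

sensor = WindowedAlert(size=3, limit=80.0)
readings = [70, 75, 78, 85, 90, 95]  # e.g. temperature samples
alerts = [i for i, v in enumerate(readings) if sensor.push(v)]
print(alerts)  # → [4, 5]
```

Averaging over a window rather than alerting on single readings is what keeps mission-critical dashboards from paging on transient sensor noise.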

Research Assistant – AR/VR & AI

University of Central Missouri – Missouri (Onsite) Oct 2021 – Jan 2022

Project Description: Worked as part of a Mixed Reality research studio to build AR/VR applications for education, safety, and fashion domains. Integrated immersive simulation environments using Unity and AWS cloud services. Collaborated with academic and IT teams to research, prototype, and implement AR/VR-based learning systems.

Responsibilities:

•Developed interactive Augmented and Virtual Reality applications using Unity3D, C#, and AWS infrastructure

•Researched and tested AR/VR usability across devices for enhanced safety training and virtual campus navigation

•Built proof-of-concept immersive experiences in education, fashion, and healthcare safety using real-time 3D modeling

•Integrated Unity-based AR modules with cloud-hosted Python APIs to sync real-time spatial data across user sessions

•Collaborated with AI teams to overlay NLP-driven assistants in VR training simulations using voice-controlled interfaces

•Designed internal tooling to visualize real-time telemetry from VR headsets using Python and WebSockets

•Applied computer vision techniques to enable object detection and interaction within AR environments.

Environment: Unity 3D, C#, AWS RDS, AWS EC2, PostGIS, MS SQL Server 2016, Blender, Visual Studio
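The headset telemetry tooling above reduces to summarizing frame timestamps into latency statistics before they reach a dashboard. A minimal sketch with invented timestamps and illustrative field names:

```python
def frame_stats(timestamps_ms: list[float]) -> dict:
    """Summarize headset frame timestamps into the kind of latency stats
    streamed to a telemetry dashboard (field names are illustrative)."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg = sum(deltas) / len(deltas)
    return {
        "frames": len(timestamps_ms),
        "avg_frame_ms": round(avg, 2),
        "fps": round(1000 / avg, 1),          # frames per second from avg frame time
        "worst_frame_ms": max(deltas),        # the single slowest frame
    }

print(frame_stats([0, 11, 22, 34, 45]))
```

Worst-frame time matters more than the average in VR, since a single slow frame is what users perceive as judder.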

IT Consultant – Data Science

Captain Strategy – Remote Nov 2019 – Dec 2020

Project Description: Developed full-stack Python-based data science solutions for clients across various domains. Delivered ML pipelines, dashboards, APIs, and backend systems that supported scalable insights using AWS cloud and BI platforms.

Responsibilities:

•Built and deployed machine learning models (classification, regression, clustering) using Scikit-learn, Pandas, NumPy

•Created NLP pipelines using NLTK for text classification, tokenization, and sentiment analysis

•Developed REST APIs using Python Serverless Framework, deployed on AWS Lambda

•Delivered dashboards using Power BI and Tableau, integrating insights with business SharePoint

•Created Elasticsearch-based custom search services and Python-driven visualizations with Highcharts

•Built Django-based full-stack apps and integrated them with ReactJS, HTML5, and Bootstrap

•Automated ML pipelines with AWS Batch, S3, Redshift, and Snowflake for data movement and scoring

•Replaced Excel VBA tools with Python-based solutions to streamline legacy workflows

•Designed and developed interactive Power BI dashboards integrating Python-based preprocessing and SQL Server/Redshift data sources

•Created custom DAX expressions and Power Query M scripts to implement dynamic KPIs, drill-through reports, and row-level security

•Built data models and calculated tables for advanced time-based metrics such as YoY growth, cohort retention, and customer churn

•Delivered cross-platform Tableau visualizations using data blending, calculated fields, filters, LOD expressions, and trend forecasting

•Integrated Tableau with AWS Athena, Redshift, and Snowflake, leveraging custom SQL queries for performance optimization

•Automated dashboard refreshes using Tableau Server and Power Automate, enabling real-time updates for business teams

•Combined Tableau and Python (via TabPy) for advanced statistical analysis, anomaly detection, and sentiment scoring

•Built reusable dashboard templates and conducted training sessions for stakeholders to drive BI self-service adoption

•Developed Tableau Storyboards to narrate customer journey metrics and guide C-level business decisions.

Environment: Python, Django, AWS (Lambda, S3, Redshift, Batch, IAM), ReactJS, Highcharts, Tableau, Power BI, MySQL, Elasticsearch, TWS Scheduler, Git, Visual Studio.

Management Trainee – Data Science

Hitachi Systems Micro Clinic – New Delhi, India Jan 2019 – Oct 2019

Project Description: Worked on internal analytics and automation systems using Python, Power BI, Tableau, and AWS. Built dashboards, RESTful APIs, and search systems while collaborating with marketing and operations on RFM analytics and business segmentation.

Responsibilities:

•Applied RFM (Recency, Frequency, Monetary) models for customer segmentation using Python

•Built dashboards for audits and performance metrics using Power BI and Tableau

•Developed backend logic using Python/Django, created APIs to integrate with Angular frontend

•Built custom Elasticsearch services and visualized data using Highcharts and Tableau

•Developed predictive models using scikit-learn and automated data workflows with AWS Lambda

•Engineered full-stack apps using ReactJS, Redux, and GraphQL, maintaining responsive UI/UX

•Configured AWS VPC, Route53, and IAM policies for secure cloud deployment

•Delivered scheduled reports and automated alerts through Power Automate and Tableau Online

•Built and deployed microservices with FastAPI and PostgreSQL, securing endpoints with JWT-based auth

•Designed modular Python frameworks to manage ML experimentation tracking, integrating with Airflow

•Conducted A/B testing on model variations and visualized impact using Tableau dashboards and custom KPIs

•Integrated GraphQL APIs into BI dashboards to allow dynamic filtering across embedded analytics

•Created responsive, accessibility-compliant UI components using React, Bootstrap, and Styled Components.

Environment: Python, Django, Tableau, Power BI, AWS (Lambda, EC2, S3, IAM, VPC), ReactJS, Highcharts, Elasticsearch, SQL Server, Oracle Fusion, Microsoft CRM.
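The RFM segmentation mentioned above aggregates each customer's order history into Recency, Frequency, and Monetary values before any scoring happens. A stdlib sketch of that aggregation step, with invented order rows; the scoring thresholds that turn these values into segments are business-specific and not modeled here:

```python
from datetime import date

def rfm(orders: list[tuple[str, date, float]], today: date) -> dict:
    """Compute raw (recency_days, frequency, monetary) per customer from
    (customer_id, order_date, amount) rows — the aggregation that precedes
    segment scoring in an RFM model."""
    out: dict = {}
    for cust, when, amount in orders:
        r, f, m = out.get(cust, (10**9, 0, 0.0))
        # Recency = days since the most recent order; sum counts and amounts.
        out[cust] = (min(r, (today - when).days), f + 1, m + amount)
    return out

# Illustrative order rows, not real customer data.
orders = [
    ("a1", date(2019, 9, 1), 120.0),
    ("a1", date(2019, 9, 20), 80.0),
    ("b2", date(2019, 6, 15), 500.0),
]
print(rfm(orders, today=date(2019, 10, 1)))
```

Once aggregated, each of the three values is typically binned into quantiles (e.g. 1–5) and the concatenated scores drive the marketing segments described above.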

MS SQL Server Developer

ICICI Bank – India June 2016 – Dec 2018

Project Description: Developed SQL-based solutions for investment services automation. Delivered backend utilities, API integrations, data pipelines, and performance reports using SQL Server, Python, and PowerShell.

Responsibilities:

•Developed stored procedures, ETL jobs, and reporting logic using T-SQL and SSIS

•Built automation scripts using Python and PowerShell for access request processing and task orchestration

•Used Django ORM and APIs to extract failure point data from processor logs for debugging

•Created SOAP-based web services for external XML integrations

•Developed parallel scripts using VBA and Python for modeling attributes and auditing logic

•Managed database migrations and performance tuning across Oracle and SQL Server platforms

•Created Python utilities to parse and enrich transaction logs for anomaly detection in high-volume financial data

•Developed backend automation scripts to cleanse and transform credit data, feeding risk scoring engines

•Applied machine learning models for fraud detection, evaluated performance metrics using Python/Matplotlib

•Deployed internal chatbot prototypes using Rasa and Flask, integrated with knowledge base on investment FAQs

Environment: MS SQL Server (2008/2012/2016), SSIS, SSRS, PL/SQL, Python, Django, Shell Scripting, SOAP/XML, Oracle, PowerShell, Git
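The transaction-log parsing and enrichment for anomaly detection described above can be sketched as a small parse-and-flag pass. Field names and the log format here are illustrative, not the bank's actual schema:

```python
import re

# Hypothetical log format: "<timestamp> txn=<id> amt=<amount>"
LOG_LINE = re.compile(r"(?P<ts>\S+) txn=(?P<txn>\w+) amt=(?P<amt>[\d.]+)")

def flag_large(lines: list[str], limit: float) -> list[str]:
    """Parse transaction log lines and return txn ids whose amount exceeds
    `limit` — the kind of enrichment step that feeds a risk-scoring engine.
    Malformed lines are skipped rather than raising."""
    flagged = []
    for line in lines:
        m = LOG_LINE.match(line)
        if m and float(m.group("amt")) > limit:
            flagged.append(m.group("txn"))
    return flagged

logs = [
    "2018-06-01T10:00 txn=T001 amt=250.00",
    "2018-06-01T10:01 txn=T002 amt=98000.00",
    "malformed line",
]
print(flag_large(logs, limit=50000))  # → ['T002']
```

Tolerating malformed lines is deliberate: in high-volume financial logs, a parser that raises on one bad record stalls the whole enrichment batch.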

Education Details:

Master's in Big Data Analytics and Information Technology, University of Central Missouri, 2020–2021

Bachelor's degree, Jawaharlal Nehru Technological University, Hyderabad.


