
Supply Chain Machine Learning

Location:
Orlando, FL
Posted:
February 25, 2025


Resume:

Name: Anjani Thati

Contact: 321-***-****

Email-ID: **********@*****.***

SUMMARY:

Data Analytics professional with 9 years of experience delivering actionable insights across Finance, Healthcare, Biotechnology, and Supply Chain industries, specializing in predictive modeling, business intelligence, and data-driven decision-making.

Designed and automated financial reporting dashboards in Power BI, Tableau, and Looker, integrating real-time transactional data from financial institutions, investment firms, and fintech platforms to enable data-driven investment strategies.

Developed healthcare cost prediction models using machine learning (XGBoost, Random Forest, and Gradient Boosting) to optimize insurance claim approvals, hospital resource allocation, and fraud detection.

Implemented AI-driven demand forecasting models for supply chain optimization, improving inventory management and reducing stock shortages using deep learning techniques.

Built anomaly detection frameworks for fraud prevention in financial transactions and healthcare billing, leveraging unsupervised learning techniques such as Isolation Forests and Autoencoders.

Developed credit risk scoring models using logistic regression, random forests, and ensemble techniques, improving the accuracy of loan default predictions and customer creditworthiness assessment.

Optimized supply chain logistics using real-time analytics and IoT sensor data, improving demand forecasting, route optimization, and supplier performance assessment.

Built financial forecasting models using time-series analysis (ARIMA, SARIMA, Prophet), enabling accurate revenue projections and investment decision support.

Designed and implemented ETL/ELT workflows for financial, healthcare, and biotech data integration, ensuring high data quality, transformation, and compliance with regulatory standards.

Developed revenue cycle management dashboards for healthcare providers, streamlining the billing process and improving revenue collection through automated data insights.

Developed machine learning models for supply chain risk management, predicting potential disruptions using external market data, supplier performance trends, and geopolitical risk indicators.

Integrated electronic health records (EHR) analytics with cloud-based AI platforms (AWS SageMaker, Azure ML, Google Vertex AI) to enhance patient care and clinical decision-making.

Conducted claims analytics in healthcare insurance, identifying fraudulent claims and reducing unnecessary payouts by applying pattern recognition algorithms and NLP-based document analysis.

Enhanced revenue cycle efficiency in hospitals by integrating patient admission rates, treatment costs, and claims processing analytics, reducing billing errors and administrative overhead.

Developed predictive analytics solutions for customer churn analysis, demand forecasting, and marketing attribution modeling, using gradient boosting algorithms and ensemble learning techniques.

Implemented machine learning lifecycle management (MLOps) using Docker, Kubernetes, MLflow, and SageMaker, ensuring scalability, version control, and automation for production ML models.

Designed and optimized row-level security (RLS) and column-level security (CLS) policies in Power BI, Tableau, and Looker, ensuring controlled access to sensitive data based on user roles.

Applied NLP-based text mining to clinical documentation and financial reports, extracting actionable insights from unstructured text for risk management and fraud detection.

Developed logistics simulation models for warehouse operations, optimizing workforce scheduling and inventory storage layouts using Monte Carlo simulations and linear programming.

Led large-scale data migration projects for finance and biotech companies, moving legacy systems to modern cloud data warehouses (Snowflake, AWS Redshift, Azure Synapse) with minimal downtime.

Created Tableau dashboards for biotech firms, visualizing real-time laboratory test results, clinical trial progress, and drug efficacy trends, improving R&D efficiency.

Integrated third-party supply chain data sources into enterprise analytics platforms, enriching reporting capabilities with data from freight carriers, customs agencies, and distribution centers.

Configured real-time transaction monitoring dashboards for financial institutions, integrating fraud detection alerts with machine learning-driven anomaly detection techniques.

Developed cloud-based AI/ML solutions in AWS Redshift, Azure Synapse, and GCP BigQuery, supporting scalable analytics for financial institutions and biotech research facilities.

Mentored cross-functional teams on data analytics best practices, upskilling finance and biotech professionals in SQL, Python, machine learning, and BI tools.

Worked with highly complex data models and more than 100 data sources, performing mainframe development and analysis for the EDW.

Applied strong knowledge of COBOL, VSAM file structures, JCL, and TSO, with experience in data extraction, cleansing, and preparation for data warehouses.

Experienced with healthcare claims, diagnosis codes, procedure codes, HIPAA data elements, and privacy standards.

Performed project planning, requirements analysis, coding, testing, and implementation of data solutions.

Worked with large databases using statistical tools such as SAS to develop and maintain production reports.

5+ years of experience delivering actionable insights across Finance, Healthcare, Biotechnology, and Supply Chain industries, leveraging data-driven decision-making.

Specialized in predictive modeling and business intelligence, utilizing advanced analytics to optimize operations and strategic planning.

Extensive mainframe expertise with hands-on experience in COBOL, VSAM, JCL, and TSO, supporting enterprise data processing and analytics.

Proficient in data warehousing and ETL, extracting, transforming, and loading large-scale datasets to enhance reporting and compliance.

Strong background in healthcare analytics, analyzing claims data, ensuring HIPAA compliance, and improving patient outcome predictions.

Designed and deployed GCP-based cloud infrastructure, ensuring best practices in security, scalability, and cost optimization.

Led large-scale on-premises to GCP migration projects, minimizing downtime and improving system performance.

Implemented Vertex AI for predictive analytics, enhancing decision-making in healthcare and financial applications.

Developed automated CI/CD pipelines using Terraform and Ansible, streamlining cloud deployments.

Designed robust change management processes, ensuring seamless transitions with minimal disruption.

Created AI-driven anomaly detection systems leveraging GCP BigQuery ML and AutoML.

TECHNICAL SKILLS:

Programming & Scripting

Python, R, SQL (T-SQL, PL/SQL, Spark SQL), Java, JavaScript, PHP, HTML, CSS, M (Power Query)

Data Analysis & Statistical Computing

Pandas, NumPy, MATLAB, SAS, SPSS, Excel, Google Sheets, Trifacta, Hypothesis Testing, A/B Testing, Regression Analysis, Time-Series Analysis

Data Visualization & BI

Power BI, Tableau, QlikView, Looker (LookML), Google Data Studio, Matplotlib, Seaborn, Plotly, D3.js, ArcGIS, QGIS, Google Earth Engine, Snowsight

Machine Learning & AI

TensorFlow, PyTorch, Scikit-learn, H2O.ai, LightGBM, XGBoost, spaCy, NLTK, Hugging Face Transformers, OpenAI API, Azure ML, Amazon SageMaker

ETL & Data Integration

SSIS, Informatica, Talend, Apache NiFi, Apache Flink, Google Dataflow, AWS Glue, Alteryx, Paxata, Oracle Data Integrator, Delta Lake, IDQ

Database Management & Big Data

Snowflake, Oracle, MySQL, PostgreSQL, Microsoft SQL Server, MongoDB, Hadoop, Spark, Hive, Neo4j, Amazon Neptune

Cloud Computing & Storage

AWS (S3, Redshift, Glue, Lake Formation, Kinesis), Azure (Data Lake, Synapse, Cognitive Services, Data Catalog), Google Cloud (BigQuery, Pub/Sub, AI Platform)

Data Governance & Compliance

Data Lineage, Metadata Management, Data Masking, Role-Based Access Control (RBAC), GDPR, HIPAA, SOX Compliance, Informatica Axon, Collibra, Apache Atlas

Workflow Automation & DevOps

Docker, Kubernetes, Jenkins, Apache Airflow, CircleCI, Grafana, Prometheus, Splunk

Mainframe Development

COBOL, VSAM, JCL, TSO

Project Management & Agile

Agile, Scrum, Waterfall, JIRA, Confluence, Azure Boards, Microsoft Teams

PROFESSIONAL EXPERIENCE:

Client: Clipboard Health, San Francisco, CA Mar’22-Present

Role: Sr. Data Science Analyst

Project Summary:

I lead data analysis efforts in healthcare, working with large datasets like electronic health records and insurance claims. I build predictive models to identify patient risks and help hospitals predict patient readmissions. I also create real-time dashboards for decision-making and use advanced machine learning tools to enhance patient monitoring, fraud detection, and hospital forecasting. I actively attend Scrum calls to discuss project requirements, address any blockers, and review the progress of models and proofs of concept (POCs) with the team. I also ensure that healthcare data is secure and easily shared across systems.

Responsibilities:

Led end-to-end healthcare data analysis leveraging SQL (T-SQL, PL/SQL, Spark SQL), Python (pandas, NumPy, Scikit-learn), and R, analyzing EHR, claims, and patient outcomes to improve operational efficiency and quality of care.

Developed predictive analytics models using XGBoost, Random Forest, Logistic Regression, and Deep Learning (TensorFlow, PyTorch) to identify patient risk factors, readmission likelihood, and disease progression trends.

Designed and deployed interactive BI dashboards in Power BI and Tableau, integrating Azure Synapse Analytics, SQL Server, and Data Lake Storage to provide real-time clinical and operational insights.

Optimized revenue cycle management (RCM) analytics, identifying inefficiencies in claims processing, reimbursement rates, and patient billing trends, improving cash flow forecasting and reducing revenue leakage.

Built cloud-based healthcare analytics solutions leveraging Azure Data Factory, Azure Synapse, and Azure Machine Learning, ensuring scalability, security, and compliance with healthcare regulations.

Designed real-time patient monitoring systems by integrating IoT data from wearable devices and patient monitors with Azure IoT Hub and Azure Event Hubs, enabling proactive interventions and early anomaly detection.

Developed NLP-driven insights from clinical notes, physician transcriptions, and patient feedback using Azure Cognitive Services, spaCy, and Hugging Face Transformers, enhancing decision support for medical professionals.

Implemented advanced forecasting models (ARIMA, SARIMA, Prophet, and LSTMs) to predict hospital admissions, ICU demand, and emergency department capacity, improving resource planning and operational efficiency.

Standardized healthcare data interoperability using FHIR and HL7 APIs, integrating EHR systems, lab results, and third-party health platforms, ensuring seamless data exchange and compliance.

Developed fraud detection models for insurance claims and billing anomalies, leveraging unsupervised learning techniques (Isolation Forests, Autoencoders, PCA) to flag suspicious transactions and improve regulatory reporting.

Implemented robust data governance frameworks with Collibra, Informatica Data Quality, and Azure Purview, ensuring HIPAA, GDPR, and CMS compliance while improving data security and audit readiness.

Automated ETL pipelines using Azure Data Factory, Informatica, and Talend, streamlining data ingestion from hospitals, insurance providers, and research institutions, reducing processing time by 40%.

Enhanced data privacy and security controls by deploying role-based access control (RBAC), encryption (AES, SSL/TLS), and tokenization techniques in Azure Synapse Analytics, ensuring compliance with regulatory mandates.

Integrated GIS and geospatial analytics using Azure Maps and ArcGIS to assess healthcare accessibility, disease spread patterns, and optimal resource distribution, supporting policy and strategic planning.

Developed clinical trial analytics dashboards, integrating real-time patient recruitment data, adverse event tracking, and compliance metrics to accelerate drug development insights.

Built A/B testing frameworks to assess the effectiveness of new treatment protocols, healthcare interventions, and operational changes, leveraging Bayesian A/B testing and uplift modeling for evidence-based decision-making.

Optimized hospital workflow efficiency by leveraging process mining tools (Celonis, Apromore) to analyze bottlenecks in patient journeys, resource utilization, and administrative workflows.

Configured and deployed Explainable AI (XAI) solutions using SHAP and LIME, ensuring interpretability of machine learning models in clinical decision-making and regulatory compliance.

Implemented real-time alerting and anomaly detection systems for early warning of sepsis, adverse drug reactions, and deterioration risks, integrating Azure Event Hubs for high-speed data processing.

Standardized data quality assessment frameworks using Power BI, Alteryx, and Great Expectations, ensuring high-integrity, validated datasets for predictive modeling and BI reporting.

Automated clinical documentation analysis using OCR (Azure Form Recognizer and Google Vision API) and NLP, extracting key medical information from physician notes, discharge summaries, and pathology reports.

Developed personalized patient engagement models using recommendation algorithms (Collaborative Filtering, Content-Based) to improve treatment adherence and chronic disease management.

Designed governance frameworks for AI-driven analytics, ensuring compliance with FDA, HIPAA, and ethical AI guidelines, mitigating risks of bias, fairness, and transparency in AI models.

Collaborated with executive leadership, clinicians, and data teams to drive data-driven decision-making, translating complex analytics insights into strategic healthcare policies and operational improvements.

Worked with disease surveillance systems to monitor patient trends, outbreaks, and public health risks.

Integrated electronic health records (EHR) and HL7 messaging for real-time data exchange.

Developed predictive models using time-series analysis (ARIMA, LSTMs) to forecast disease outbreaks.

Automated data ingestion pipelines for patient records, laboratory results, and hospital claims.

Developed models to detect fraudulent claims, analyze patient billing trends, and optimize insurance claim approvals.

Used ML techniques (XGBoost, Random Forest) for hospital resource allocation and financial forecasting.

Automated data pipelines, integrating structured/unstructured healthcare data for compliance and operational insights.

Ensured compliance with HIPAA, GDPR, and healthcare data security protocols.

Designed healthcare dashboards using Power BI, Tableau, and SQL to visualize patient outcomes and revenue cycle performance.

Developed predictive analytics models for optimizing claims processing, fraud detection, and hospital resource allocation.

Led the cloud foundation build, ensuring security, scalability, and compliance with best practices.

Managed GCP infrastructure for healthcare analytics, integrating EHR and claims data for improved decision-making.

Migrated legacy on-premises systems to GCP, enhancing system reliability and reducing costs.

Designed and deployed Vertex AI solutions for predictive patient readmission models.

Developed automated ML workflows using Kubeflow, ensuring seamless AI model deployment.

Environment: SQL (T-SQL, PL/SQL, Spark SQL), Python (pandas, Scikit-learn), R, Power BI, Tableau, Azure Synapse, Data Factory, Event Hubs, Azure ML, Cognitive Services, Informatica, Collibra, FHIR, HL7, NLP, Predictive Modeling, Real-Time Analytics, ETL, Data Governance.

Client: Golden Valley Bank, Chico, CA Feb’20-Feb’22

Role: Sr. Data Analyst

Project Summary:

At the bank, I developed systems to manage financial data and built tools to predict risks and trends in the financial market. I created dashboards to help leaders make quick decisions and used machine learning to improve credit risk assessments and detect fraud. I also worked on tools to meet financial regulations and improve the accuracy of financial forecasts. Regularly participating in Scrum calls, I contributed to discussions around project requirements, overcame technical challenges, and provided feedback on the development of financial models and POCs.

Responsibilities:

Developed and standardized financial data pipelines by integrating structured and unstructured financial data from banking systems, regulatory filings, and market feeds, ensuring data accuracy, accessibility, and compliance across analytics workflows.

Developed predictive analytics solutions leveraging XGBoost, Random Forest, H2O.ai, and AutoML, improving credit risk scoring, loan default predictions, and financial fraud detection.

Built and deployed real-time financial reporting dashboards in Power BI (DAX), Tableau, and AWS QuickSight, integrating AWS Redshift, S3, Glue, and DynamoDB for high-performance business intelligence solutions.

Engineered scalable ETL pipelines using AWS Glue, Apache Airflow, and Talend, enabling seamless financial data ingestion, transformation, and integration from transactional systems and external sources.

Implemented financial fraud detection models using graph-based analytics (Neo4j), anomaly detection (Isolation Forest, Autoencoders, DBSCAN), and network analytics, improving compliance with AML and KYC standards.

Designed high-frequency trading and portfolio risk monitoring systems utilizing AWS Kinesis, Apache Kafka, and EventBridge, enabling real-time market trend analysis.

Applied advanced time-series forecasting models (SARIMA, Prophet, LSTMs) to predict market movements, liquidity risk, and cash flow patterns, supporting financial decision-making.

Developed explainable AI (XAI)-driven financial risk models using SHAP, LIME, and interpretability frameworks, ensuring transparency in credit scoring, fraud detection, and regulatory audits.

Implemented stress testing and regulatory compliance models to assess CCAR, DFAST, and Basel III risk factors, ensuring alignment with regulatory requirements and financial resilience planning.

Automated financial data validation and reconciliation using Great Expectations, AWS DataBrew, and Alteryx, improving data quality and audit compliance for financial statements.

Optimized financial decision workflows using Monte Carlo simulations, scenario modeling, and stochastic risk assessments, enhancing strategic investment planning.

Built NLP-driven financial sentiment analysis models using spaCy, Hugging Face Transformers, and AWS Comprehend, extracting insights from market reports, earnings transcripts, and regulatory filings.

Implemented A/B testing frameworks for financial products, customer segmentation, and investment strategies, leveraging Bayesian analysis and uplift modeling.

Developed cloud-based financial data architectures using AWS Redshift, Glue, and S3, optimizing data warehousing, structured storage, and high-speed analytics.

Led governance initiatives by implementing data lineage, metadata management, and regulatory reporting automation using AWS Lake Formation, Collibra, and Informatica Data Governance.

Integrated alternative data sources into financial risk models, leveraging social, behavioral, and transaction patterns to improve creditworthiness assessments and investment profiling.

Designed and managed OLAP cubes, star schema, and dimensional data marts for fast financial reporting, portfolio analytics, and investment risk assessment.

Developed interactive geospatial analytics solutions using Google Earth Engine and QGIS, enabling risk mapping, market expansion analysis, and location-based investment insights.

Built cloud-native MLOps workflows using AWS SageMaker, Kubeflow, and MLflow, ensuring scalable, reproducible, and compliant deployment of financial AI/ML models.

Implemented automated risk monitoring systems using AWS Step Functions, Lambda, and SQL-based anomaly detection, improving credit monitoring and transaction oversight.

Enhanced data security and compliance workflows using RBAC, encryption (AES, TLS), and tokenization techniques, ensuring regulatory compliance with SOX, SEC, and GDPR guidelines.

Led cross-functional collaborations between finance, risk, and analytics teams, ensuring alignment between data-driven insights and strategic financial objectives.

Developed advanced data storytelling frameworks to present financial insights to executives and stakeholders, improving data-driven decision-making and business strategy execution.

Mentored junior analysts and finance teams, conducting training sessions on BI tools, machine learning for finance, and regulatory reporting best practices.

Developed and optimized mainframe-based data extraction and transformation processes using COBOL, JCL, and VSAM, ensuring seamless data integration into enterprise data warehouses.

Designed and implemented ETL workflows to extract, cleanse, and load large-scale healthcare datasets from multiple sources, improving data integrity and processing efficiency.

Created and maintained production reports using SAS for high-volume databases, supporting statistical analysis and regulatory reporting.

Analyzed paid and denied claims, including diagnosis and procedure codes, to identify patterns, reduce fraud, and ensure compliance with HIPAA and PHI regulations.

Led mainframe data migration efforts, utilizing COBOL, JCL, and VSAM to extract and transform financial transaction data into a modernized data warehouse.

Developed pseudo-code, program specifications, and Warnier-Orr diagrams to analyze and optimize existing mainframe-based financial data processing workflows.

Built automated ETL pipelines to integrate structured and unstructured financial data from multiple sources, ensuring seamless transition to front-end data warehouse processing.

Created statistical reports in SAS, analyzing historical transaction data to enhance fraud detection mechanisms and compliance monitoring.

Managed full SDLC implementation, ensuring coding, testing, and deployment of scalable data processing solutions to improve financial reporting and auditing accuracy.

Migrated financial risk models to GCP, improving real-time risk assessment capabilities.

Designed scalable data warehouses using BigQuery, reducing query times by 40%.

Developed automated financial fraud detection models using Vertex AI and BigQuery ML.

Engineered serverless event-driven architectures using Cloud Functions and Eventarc.

Implemented change management strategies to ensure smooth cloud adoption.

Environment: AWS (Redshift, Glue, S3, DynamoDB, QuickSight, Kinesis, SageMaker, Lambda, Step Functions), Apache Kafka, Neo4j, Airflow, H2O.ai, MLflow, Kubeflow, Talend, Tableau, Power BI, Alteryx, Collibra, Python, SQL.

Client: Echo Global Logistics, Chicago, IL Nov’18-Jan’20

Role: Sr. Business Intelligence Engineer

Project Summary:

I worked on optimizing logistics data by building business intelligence tools that helped track orders, inventory, and shipments. Using machine learning, I improved predictions for supply chain operations and even created solutions to predict and prevent logistical delays. I also built systems to ensure everything from fleet management to warehouse operations ran smoothly. I attended Scrum calls to collaborate with stakeholders, address any blockers, and review POCs and models, ensuring the solutions met the logistics team’s needs.

Responsibilities:

Developed scalable BI solutions using Power BI, Tableau, and AWS QuickSight, integrating structured and unstructured supply chain data from ERP, WMS, and TMS systems for dynamic reporting.

Implemented ETL workflows using Apache Airflow, AWS Glue, and SQL-based transformations, ensuring seamless data ingestion, cleansing, and pipeline automation for supply chain analytics.

Designed optimized data models (star schema, OLAP cubes) for inventory, transportation, and supplier performance analytics, enhancing query performance and reporting efficiency.

Applied time-series forecasting techniques (ARIMA, Prophet, LSTM) for demand planning, order volume prediction, and inventory optimization, reducing volatility in supply chain operations.

Engineered real-time data streaming solutions using Kafka, AWS Kinesis, and MQTT for real-time shipment tracking, fleet monitoring, and predictive logistics insights.

Developed geospatial analytics dashboards utilizing PostGIS, QGIS, and Google Earth Engine, enabling route optimization, warehouse placement analysis, and fleet tracking.

Created anomaly detection algorithms using Isolation Forest, DBSCAN, and Autoencoders to flag vendor discrepancies, shipment delays, and fraudulent logistics activities.

Designed automated data validation frameworks leveraging Great Expectations, AWS DataBrew, and Informatica, improving data accuracy, consistency, and compliance in supply chain records.

Implemented predictive maintenance analytics using sensor data, IoT integration, and failure prediction models, reducing downtime of logistics assets like fleet vehicles and warehouse equipment.

Developed optimization algorithms using linear programming, genetic algorithms, and constraint solvers for cost-efficient route planning and supply allocation.

Built API integrations for real-time data exchange between ERP, logistics platforms, and third-party suppliers, ensuring seamless order fulfillment and vendor data synchronization.

Designed role-based access controls (RBAC) and data security protocols using AWS IAM, encryption techniques (AES, TLS), and tokenization, ensuring secure supply chain data management.

Engineered scalable cloud data warehouses using AWS Redshift, Snowflake, and BigQuery, optimizing storage, retrieval, and real-time analytics for supply chain metrics.

Automated transportation cost modeling using Python (Pandas, NumPy, Scikit-learn) and SQL-based cost-benefit analysis, optimizing logistics expenses.

Implemented real-time event-driven architectures with AWS Step Functions, Lambda, and EventBridge, enabling automated shipment tracking and exception handling workflows.

Built scalable MLOps pipelines using AWS SageMaker, MLflow, and Kubeflow, ensuring reliable and reproducible predictive analytics deployment in supply chain environments.

Optimized data retrieval and query performance by implementing materialized views, indexing strategies, and partitioning techniques in large-scale logistics databases.

Developed NLP-based text analytics models using spaCy, Hugging Face Transformers, and AWS Comprehend to extract insights from supply chain contracts, vendor reviews, and logistics reports.

Led data governance initiatives using Collibra, Informatica, and AWS Lake Formation, enforcing data lineage tracking, metadata management, and compliance automation.

Integrated external data sources (economic indicators, fuel prices, weather data) into supply chain analytics platforms, enabling data-driven risk assessments and strategic planning.

Developed and optimized mainframe-based data processing workflows using COBOL, VSAM file structures, JCL, and TSO, ensuring efficient handling of large-scale logistics and financial datasets.

Designed and implemented ETL pipelines to extract, cleanse, and integrate data from multiple sources into a front-end data warehouse, improving data accuracy and accessibility for reporting and analytics.

Led full system development life cycle (SDLC) functions, including project planning, requirements analysis, program specifications, coding, testing, and implementation, ensuring seamless execution of enterprise data solutions.

Utilized SAS for statistical reporting and large-volume database analytics, developing and maintaining production reports to support business intelligence and operational decision-making.

Analyzed and processed paid and denied claims, ensuring compliance with HIPAA, PHI regulations, and industry-standard data elements, improving data quality and fraud detection mechanisms.

Architected cloud solutions for supply chain analytics using BigQuery and Dataflow.

Developed predictive shipment delay models using Vertex AI.

Designed hybrid cloud networking solutions integrating GCP and on-premises systems.

Implemented Terraform-based infrastructure as code (IaC) for automated deployments.

Developed real-time tracking dashboards in Looker, reducing shipment delays.

Designed robust IAM policies for secure access control.

Environment: AWS (Redshift, Glue, Kinesis, QuickSight, SageMaker, Lambda), Snowflake, BigQuery, Apache Airflow, Kafka, Power BI, Tableau, PostgreSQL, Python (Pandas, NumPy, Scikit-learn), SQL, QGIS, Informatica.

Client: Day Zero Diagnostics, Boston, MA May’17-Oct’18

Role: Sr. Data Governance & Compliance Analyst

Project Summary:

I developed and enforced data governance and compliance policies for clinical trial and pharmaceutical manufacturing data, ensuring alignment with FDA 21 CFR Part 11, GxP, HIPAA, GDPR, and ISO 13485. I also implemented data quality frameworks to keep regulated datasets accurate, secure, and audit-ready.

Responsibilities:

Developed and enforced data governance policies ensuring compliance with FDA 21 CFR Part 11, GxP, HIPAA, GDPR, and ISO 13485, securing clinical trial and pharmaceutical manufacturing data.

Implemented data quality frameworks using Informatica Data Quality and Great Expectations.


