Sudheesh Sulam — Data Analyst
201-***-**** NY, USA ***************@*****.*** www.linkedin.com/in/sudheesh-sulam05
SUMMARY
• 4+ years of experience as a Data Analyst specializing in healthcare, financial services, and ERP analytics, leveraging SQL, Python, Tableau, and cloud platforms to deliver actionable insights and optimize business operations.
• Designed KPI dashboards in Tableau and Power BI with advanced calculations, forecasting, and heatmaps, improving decision-making speed by up to 25% and ensuring HIPAA, FDA, and BCBS 239 compliance.
• Built and optimized data pipelines in Azure Data Factory, AWS Glue, Informatica, SSIS, and Apache NiFi, integrating data from claims, EHR, ERP, and trading systems to reduce processing times by 30–40%.
• Developed predictive models in SAS and Python (Pandas, NumPy, SciPy) for healthcare risk adjustment, utilization forecasting, and operational demand prediction, achieving up to 90% forecast accuracy.
• Created and maintained enterprise data models in ERwin, Oracle, Snowflake, AWS Redshift, and BigQuery to support unified reporting, financial risk analytics, and supply chain optimization.
• Engineered and tuned complex SQL, PL/SQL, and PySpark queries using CTEs, window functions, and materialized views, reducing report refresh times and compute costs by up to 20%.
• Led data quality, profiling, and cleansing initiatives for large-scale datasets, ensuring accuracy, regulatory compliance (HIPAA, GDPR), and readiness for analytics and reporting.
• Delivered measurable business impact, including a 15% improvement in healthcare supply chain delivery, a 20% increase in operational efficiency, and enhanced regulatory audit readiness.
EDUCATION
• Master of Science in Computer Science - St. Francis College, New York
• Bachelor of Technology in Computer Science - Jain University, India
TECHNICAL SKILLS
Languages: Python, Scala, R, SQL (T-SQL, PL/SQL)
Packages: NumPy, Pandas, Matplotlib, Seaborn, PySpark, SciPy, Scikit-learn, TensorFlow
Visualization Tools: Tableau, Power BI, Advanced Excel (Pivot Tables, VLOOKUP)
IDEs: Visual Studio Code, PyCharm, Jupyter Notebook
Database Management: MySQL, PostgreSQL, SQL Server, Oracle
Cloud Technologies: AWS, Azure, Google Cloud Platform
Methodologies: SDLC, Agile, Waterfall, Predictive Modeling, Risk Scoring, Dimensional Modeling
Version Control: Git, GitHub, Azure DevOps
Data Manipulation: Data Analysis, Data Mining, Data Preprocessing, Data Mapping, Data Cleaning, Data Visualization, Data Modeling, Data Warehousing, Data Storytelling, Data Wrangling, Data Acquisition, Data Integration, Data Transformation
Other Skills: SAS (Risk Adjustment, Utilization Models), Informatica PowerCenter, Apache NiFi, SSIS (Conditional Split, Lookup, Script Task, Data Conversion), Machine Learning Algorithms, Time-Series Forecasting, Healthcare Analytics (Claims, EHR, HIPAA, FDA Compliance), Financial Data Analytics (BCBS 239, GDPR Compliance), ERwin Data Modeling, OLAP & OLTP, Advanced Analytics, ETL Automation, CI/CD Pipelines
Operating Systems: Windows, Linux, macOS
WORK EXPERIENCE
Data Analyst — United Health Group, MN, USA Aug 2024 - Present
• Partnered with compliance, clinical, and population health teams to design KPI dashboards in Tableau, ensuring HIPAA and FDA compliance while supporting value-based care initiatives, improving audit efficiency and reporting transparency.
• Developed predictive risk adjustment and healthcare utilization models in SAS, improving patient outcome forecasts and population health management strategies for provider networks.
• Utilized SQL to extract and integrate large datasets from claims, EHR, provider systems, and inventory platforms, analyzing 50K+ transactions to identify trends in cost, demand, utilization, and network performance.
• Designed enterprise healthcare data models in ERwin and Oracle, and engineered data marts in Snowflake and BigQuery to enable unified reporting, risk scoring, and high-performance analytics.
• Automated ETL workflows and data quality checks in Azure Data Factory, AWS Glue, and AWS Lambda, reducing processing time by 30% and increasing dashboard reliability for claims and utilization data (see the data-quality sketch after this section).
• Conducted time-series forecasting in Python (Pandas, NumPy) to predict medical supply demand with 90% accuracy, minimizing shortages, avoiding overstock, and improving provider readiness (see the forecasting sketch after this section).
• Created interactive Tableau dashboards with LOD calculations, forecasting, and heatmaps to track provider performance, treatment effectiveness, and operational efficiency, improving decision-making speed by 25%.
• Tuned complex SQL and PL/SQL queries using CTEs and window functions, reducing report refresh times by 20% and improving stakeholder access to insights across clinical and operational teams (see the query sketch after this section).
• Improved healthcare supply chain delivery times by 15%, enabling timely access to critical medical resources, enhancing patient care, and supporting Optum’s mission to improve population health outcomes.
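Illustration of the data-quality checks referenced in the ETL bullet above: a minimal Pandas sketch of the kind of validation gate applied to claims data before loading. All column names (claim_id, member_id, paid_amount, service_date) are hypothetical placeholders; the production checks ran inside Azure Data Factory and AWS Glue rather than as standalone scripts.

```python
import pandas as pd

# Hypothetical required claims columns; not the production schema.
REQUIRED = ["claim_id", "member_id", "paid_amount", "service_date"]

def quality_check(df: pd.DataFrame) -> pd.DataFrame:
    """Reject structurally broken input, then drop rows that fail basic rules."""
    missing = [c for c in REQUIRED if c not in df.columns]
    if missing:
        raise ValueError(f"missing required columns: {missing}")
    df = df.drop_duplicates(subset="claim_id")              # one row per claim
    df = df.dropna(subset=["member_id"])                    # key fields populated
    df = df[df["paid_amount"] >= 0]                         # no negative payments
    df["service_date"] = pd.to_datetime(df["service_date"], errors="coerce")
    return df.dropna(subset=["service_date"])               # drop unparseable dates
```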
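A minimal sketch of the forecasting baseline behind the supply-demand bullet, using only Pandas and NumPy as named there. The input file, its columns, and the smoothing constant are illustrative assumptions, not the tuned production values.

```python
import numpy as np
import pandas as pd

# Hypothetical input: one row per order with a date and a unit count.
df = pd.read_csv("demand.csv", parse_dates=["date"])
monthly = df.set_index("date")["units"].resample("MS").sum()

# Exponentially weighted level and month-over-month trend (alpha is illustrative).
level = monthly.ewm(alpha=0.3).mean().iloc[-1]
trend = monthly.diff().ewm(alpha=0.3).mean().iloc[-1]

# Project the level plus trend over the next three months.
future = pd.date_range(monthly.index[-1] + pd.offsets.MonthBegin(1),
                       periods=3, freq="MS")
print(pd.Series(level + trend * np.arange(1, 4), index=future))
```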
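A self-contained illustration of the CTE-plus-window-function pattern from the query-tuning bullet. SQLite stands in for Oracle purely so the example runs anywhere; the table, columns, and sample rows are hypothetical.

```python
import sqlite3

# Monthly paid totals per provider (CTE), then a running total via a window
# function instead of a correlated subquery, which is the usual tuning win.
QUERY = """
WITH monthly AS (
    SELECT provider_id,
           strftime('%Y-%m', service_date) AS month,
           SUM(paid_amount)                AS paid
    FROM   claims
    GROUP  BY provider_id, month
)
SELECT provider_id, month, paid,
       SUM(paid) OVER (PARTITION BY provider_id ORDER BY month) AS running_paid
FROM   monthly
ORDER  BY provider_id, month;
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (provider_id, service_date, paid_amount)")
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)", [
    ("P1", "2024-01-15", 120.0), ("P1", "2024-02-03", 80.0),
    ("P2", "2024-01-20", 200.0),
])
for row in conn.execute(QUERY):
    print(row)
```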
Data Analyst — Barclays, India May 2021 - Jul 2023
• Collaborated with business analysts and stakeholders to define data entities, relationships, and subject areas for financial data modeling, aligning with Barclays’ data governance and regulatory compliance requirements (BCBS 239, GDPR).
• Conducted data profiling, validation, and cleansing on large-scale transactional and market data, ensuring accuracy and consistency before ETL development to support risk and compliance reporting.
• Designed and maintained scalable Snowflake and AWS Redshift schema architectures using ERwin for logical/physical modeling, implementing forward/reverse engineering for iterative enhancements.
• Created materialized views and applied schema-level tuning to optimize query performance for risk, P&L, and portfolio analytics, reducing compute costs by 20% (see the materialized-view sketch after this section).
• Developed high-performance ETL workflows using Informatica PowerCenter, Python, and Apache NiFi to integrate trading, customer, and market data from multiple legacy and cloud systems.
• Applied advanced SSIS transformations (Conditional Split, Lookup, Script Task, Data Conversion) to enhance ETL efficiency and maintain full audit traceability for compliance audits.
• Engineered complex SQL stored procedures, UDFs, and views across PostgreSQL, SQL Server, and MySQL to support real-time analytics for fraud detection, credit risk, and transaction monitoring.
• Utilized PySpark and SQL for large-scale data processing, enabling faster analytics on structured/semi-structured market and customer datasets (see the PySpark sketch after this section).
• Designed and deployed secure Power BI dashboards with advanced DAX measures and row-level security, automating deployment through CI/CD pipelines in Azure DevOps to ensure real-time stakeholder access.
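The materialized-view bullet above follows the pattern sketched here: precompute an aggregate once so P&L queries read the rollup instead of rescanning raw positions. Object and column names are hypothetical, and the DDL is shown as text because running it requires a live Snowflake or Redshift session.

```python
# Hypothetical Snowflake-style materialized view for daily desk-level P&L.
# In practice this would be submitted through the warehouse client
# (e.g. snowflake-connector-python), not printed.
MV_DDL = """
CREATE MATERIALIZED VIEW mv_daily_pnl AS
SELECT trade_date,
       desk,
       SUM(realized_pnl + unrealized_pnl) AS total_pnl
FROM   positions
GROUP  BY trade_date, desk;
"""
print(MV_DDL)
```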
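A minimal PySpark sketch in the spirit of the large-scale processing bullet: aggregate trade data with the DataFrame API, then expose the result to Spark SQL. The input path and column names (trade_id, desk, notional, trade_date) are assumptions for illustration.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trade-aggregation").getOrCreate()

# Hypothetical trade-level input with trade_id, desk, notional, trade_date.
trades = spark.read.parquet("trades.parquet")

daily = (
    trades
    .withColumn("trade_day", F.to_date("trade_date"))
    .groupBy("trade_day", "desk")
    .agg(F.count("trade_id").alias("n_trades"),
         F.sum("notional").alias("gross_notional"))
)

daily.createOrReplaceTempView("daily_notional")   # make it queryable from SQL
spark.sql("SELECT * FROM daily_notional ORDER BY trade_day").show()
```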
Data Analyst — Wipro, India Mar 2020 - Apr 2021
• Engineered and automated ETL processes using SQL Server Integration Services (SSIS), transforming unstructured ERP data into structured datasets, reducing manual processing time by 40%.
• Streamlined monthly and quarterly reporting by creating automated reports with SQL Server Reporting Services (SSRS), cutting report generation time by 30% and ensuring timely delivery.
• Applied Python (SciPy, Pandas) for advanced data analysis, identifying key trends and actionable insights from ERP data, driving a 20% improvement in operational decision-making (see the trend-analysis sketch after this section).
• Integrated AWS cloud services for seamless data storage and retrieval, improving system performance and scalability of ERP-related data processing by 25%.
• Developed dynamic dashboards in Tableau, enabling real-time monitoring of business metrics and improving data accessibility for stakeholders by 25%.
• Designed and implemented efficient relational data models to integrate data from multiple ERP modules (finance, inventory, supply chain), ensuring seamless data flow and scalability.
• Developed complex Excel reports using PivotTables, VLOOKUP, and macros, automating analysis of key KPIs and reducing manual reporting time by 25% (see the pivot sketch after this section).
• Implemented Agile methodologies in project management, ensuring iterative progress and timely delivery of reporting features, resulting in a 20% faster delivery of ERP reporting enhancements.
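A minimal sketch of the SciPy/Pandas trend analysis described in the bullets above: resample ERP orders to monthly volumes, then fit a linear trend. File and column names are hypothetical placeholders.

```python
import pandas as pd
from scipy import stats

# Hypothetical ERP extract: one row per order with a date and a quantity.
df = pd.read_csv("erp_orders.csv", parse_dates=["order_date"])
monthly = df.set_index("order_date")["order_qty"].resample("MS").sum()

# Ordinary least-squares line over the monthly series.
result = stats.linregress(range(len(monthly)), monthly.to_numpy())
direction = "upward" if result.slope > 0 else "downward"
print(f"{direction} trend: {result.slope:.1f} units/month "
      f"(r^2 = {result.rvalue ** 2:.2f})")
```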
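The Excel reporting bullet maps directly onto Pandas: merge plays the role of VLOOKUP and pivot_table the role of a PivotTable. This is a sketch of that pattern with hypothetical file and column names, not the original workbook logic.

```python
import pandas as pd

sales = pd.read_csv("sales.csv")        # hypothetical: order_id, region, product, revenue
products = pd.read_csv("products.csv")  # hypothetical: product, category

# VLOOKUP equivalent: left-join the lookup table onto the fact table.
enriched = sales.merge(products, on="product", how="left")

# PivotTable equivalent: revenue by region and category, with grand totals.
pivot = enriched.pivot_table(index="region", columns="category",
                             values="revenue", aggfunc="sum",
                             margins=True, margins_name="Total")
print(pivot)
```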