SAI SREE M
Location: TX Phone: 940-***-**** Email: *******************@*****.*** https://www.linkedin.com/in/sai-sri-m-ab3528213/
SUMMARY
Innovative and detail-oriented Data Analyst with around 5 years of experience and a proven track record of transforming complex datasets into strategic business insights.
Armed with a Master's degree in Health Informatics and diverse experience across freelance and corporate environments, offering a powerful combination of technical prowess and business acumen to drive data-informed decision-making.
Proficient in Python, SQL, and R, with extensive experience in database management systems including MySQL, Oracle, and MongoDB.
Possess strong knowledge of database management in writing complex SQL queries, stored procedures, database tuning, query optimization, and resolving key performance issues.
Expert in integrating Epic EHR with data warehouses, conducting detailed data analysis, and developing validated reports to support business intelligence, regulatory compliance, and operational efficiency.
Excellent knowledge in designing and developing dashboards using TIBCO Spotfire by extracting data from multiple sources (Flat files, Excel, Access, SQL Server, Oracle).
Involved in Performance Tuning and Deployment of TIBCO Applications.
Good understanding of Tableau Desktop architecture for designing and developing dashboards.
Extensive Tableau Experience in Enterprise Environment and Tableau Administrator experience including technical support, troubleshooting, report design and monitoring of system usage.
Knowledge of data mapping/migration, Cleansing, Profiling and Data Governance.
Outstanding Data analysis skills including Data mapping from source to target database schemas, Data Cleansing and processing, writing data extract scripts/programming of data conversion and researching complex data problems.
Hands-on experience using Dimensional Data Modeling, Star Schema Modeling, Snow-Flake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling.
Skilled in creating impactful visualizations and interactive dashboards using Tableau and Power BI, effectively communicating complex data patterns to C-level executives and stakeholders.
Adept at employing advanced data mining, cleansing, and wrangling techniques to ensure data integrity and reliability.
Strong background in statistical analysis, predictive modeling, and market trend forecasting.
Experienced in managing projects using Agile methodologies and maintaining code quality through version control with Git and GitHub.
Performed data manipulation and statistical analysis using SAS to generate insights for clinical and operational reporting.
Created SAS scripts for automating data cleaning processes, improving data accuracy, and reducing manual intervention.
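The automated cleaning described above was done in SAS; a minimal pandas sketch of the same idea (trim, normalize, deduplicate, standardize missing values) is shown below. The column names and placeholder values are hypothetical, not taken from any actual project data.

```python
import pandas as pd

def clean_records(df: pd.DataFrame) -> pd.DataFrame:
    """Automated cleaning pass: trim whitespace, normalize case,
    standardize missing-value markers, and drop exact duplicates."""
    out = df.copy()
    for col in out.select_dtypes(include="object"):
        out[col] = out[col].str.strip().str.upper()
        out[col] = out[col].replace({"": None, "N/A": None})
    return out.drop_duplicates().reset_index(drop=True)

# Hypothetical raw extract with a duplicate row and a missing-value marker.
raw = pd.DataFrame({
    "patient_id": ["p1", "p1", "p2"],
    "diagnosis": [" flu ", "FLU", "n/a"],
})
clean = clean_records(raw)
```

Running the routine as a scheduled job is what reduces manual intervention: malformed values are normalized the same way every time rather than being fixed by hand.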
Strong knowledge of SQL using SQL Server and Oracle databases. Worked in Data Extraction, Transformation, and Loading (ETL) using various tools such as SQL Server Integration Services (SSIS) and Pipeline Pilot.
Proficient in analysis, design, and implementation of data warehousing/ BI solutions using Tableau, Spotfire, Alteryx, Pipeline Pilot, Informatica, MicroStrategy, and Database technologies.
Committed to delivering high-quality, data-driven solutions that optimize business processes, uncover market opportunities, and drive strategic growth.
TECHNICAL SKILLS
Programming Languages: Python, SQL, R, HTML, CSS, JavaScript.
Databases: MySQL, Oracle, MS SQL Server, MongoDB.
Data Visualization Skills: Tableau, TIBCO Spotfire, Power BI, MS Excel (VBA, VLOOKUP, Pivot tables), MS SSRS, MS SSAS, Google Analytics.
Data Integration (ETL) Tools: Alteryx, Pipeline Pilot, Informatica, MS SQL Server Integration Services (SSIS).
Clinical Skills: Strong Medical and Drug Terminology, Electronic Health Records Management, Clinical Trials, Healthcare Regulations (HIPAA), Clinical Data knowledge (HL7, CCD), and ICD-10.
Analytical and Technical Skills: Data Collection, Data Entry, Data Management, DMP (Data Management Plan), Data Cleansing, Statistical Analysis, Data Visualization, Electronic Data Capture (EDC), Data Acquisition, Critical Thinking, Problem-solving.
Application Software: MS Office, Eclipse, MATLAB, Visual Studio, advanced MS Excel, MS Access, MS Visio, Toad for Oracle.
Manual Testing: Requirement analysis, Test scenario design, Test case design, Test data management, Test Execution, Database testing, Regression testing, and Defect tracking.
EDUCATION
University of North Texas Denton, TX
Master of Science in Health Informatics
PROFESSIONAL EXPERIENCE
Tenet Health Dec 2023 – Current
Health Data Analyst
Description: Tenet Healthcare Corporation is a diversified healthcare services company headquartered in Dallas, Texas. It operates a network of hospitals, outpatient centers, and healthcare services, offering a comprehensive range of medical services and care. Tenet is committed to delivering high-quality, compassionate care to patients across the United States.
Responsibilities:
Analyzed business requirements and design specification documents to determine the functionality of ETL processes.
Extensively involved in data analysis, data cleansing, requirements gathering, data mapping, and creation of functional and technical design documents and process flow diagrams; created custom Python functions and libraries to support data analysis workflows.
Involved in extensive data validation using SQL queries and back-end testing.
Wrote complex SQL and PL/SQL test scripts for back-end testing of the data warehouse application and to check data flow in different environments.
Built ETL processes to integrate data from various sources into Domo; extensively used joins, subqueries, and functions for validating data. Monitored the Teradata production system using Teradata Manager for skewed queries. Collected requirements and tested several business reports.
Conducted source-system data analysis to understand current state, data quality and availability of existing data.
Expertise in Loan IQ Trader Desktop and Loan IQ configuration, integration, and reporting modules.
Extensively used Teradata utilities such as BTEQ, FastLoad, and MultiLoad, along with DDL and DML commands (SQL). Created various Teradata macros in SQL Assistant to serve the analysts.
Extensively worked on Data cleansing and Data staging of operational sources using ETL processes.
Facilitated seamless data integration between Epic electronic health records (EHR) and data warehouses by designing and implementing ETL processes, ensuring accurate and timely data flow for clinical and operational reporting.
Conducted in-depth data analysis of Epic EHR data, including patient records and clinical workflows, to support business intelligence initiatives and enhance decision-making processes through custom reporting and data visualization.
Developed and validated complex reports within Epic to support regulatory compliance and operational efficiency, utilizing SQL and Epic’s reporting tools to ensure data accuracy and completeness.
Worked with end users to gain an understanding of the information and core data concepts behind their business. Verified the data in DW tables after extracting it from source tables by writing SQL queries using TOAD. Worked extensively in a UNIX environment, monitoring network, Mainframe/Tandem systems, and other applications.
Wrote complex SQL queries to reflect business rules and validated the mappings by running test scripts and checking that the data in the target tables populated correctly.
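The source-to-target checks described above can be sketched with a small self-contained example, using SQLite in place of the actual warehouse and hypothetical table names (`src`, `tgt`): a LEFT JOIN surfaces rows that are missing from the target or were transformed incorrectly.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical source and target tables for a mapping-validation check.
cur.execute("CREATE TABLE src (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.0), (2, 99.0)])

# Validation query: rows missing from the target, or loaded with wrong values.
mismatches = cur.execute("""
    SELECT s.id, s.amount, t.amount
    FROM src s
    LEFT JOIN tgt t ON t.id = s.id
    WHERE t.id IS NULL OR t.amount <> s.amount
""").fetchall()
```

An empty result set means the mapping populated the target correctly; any returned rows point directly at the records to investigate.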
Involved in SQL Optimizations, Performance Analysis, and future growth analysis for OLTP and data warehouse applications.
Involved in creating source to target mapping edit rules and validation, transformations and business rules.
Performed data quality and integrity testing (completeness, conformity, consistency, accuracy, duplicates). Performed functional testing and back-end testing by manually comparing database results. Involved in user training sessions and assisted in UAT (User Acceptance Testing). Developed compelling and informative data visualizations using Power BI.
Utilized Microsoft Power Query to import, clean, and transform data from various sources (Excel, CSV, databases, etc.).
Validated the data flow from the centralized database to different products by executing SQL queries.
Performed data masking validation between different schemas using Toad for Oracle and SQL Server.
Conducted data manipulation and statistical analysis using SAS, including creating data sets, merging data, and performing descriptive statistics.
Environment: Informatica PowerCenter 9.6, Teradata, Oracle 11g, Loan IQ, TOAD, HP Quality Center, Business Objects Enterprise XI R2/R3, Oracle SQL/PL SQL, Windows, UNIX, Excel, Mainframes.
KIMS HOSPITALS, India
Data Analyst Mar 2020 – Nov 2022
Led the development of a prescriptive data model to optimize operations and improve patient outcomes.
Conducted comprehensive data analysis and interpretation of clinical data to support patient care and hospital operations.
Reduced data entry errors by 40% by training hospital staff on best practices and the use of advanced data management tools.
Assisted in the design and implementation of clinical trials, including data collection, analysis, and reporting.
Designed and executed complex SQL queries to extract, merge, and integrate data from various sources into a centralized database, ensuring data completeness and accuracy across all datasets.
Engineered Python scripts for ETL processes, automating data extraction, transformation, and loading tasks, reducing manual effort by 10 hours per week and minimizing errors in market data processing.
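A minimal sketch of such an ETL script is shown below, assuming a CSV source and a SQLite staging table; the column names (`sku`, `price`) are illustrative, not the actual market-data schema. Skipping malformed rows instead of aborting the batch is the part that minimizes errors during unattended runs.

```python
import csv
import io
import sqlite3

def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, transform them (type-cast, filter
    malformed rows), and load them into a staging table.
    Returns the number of rows loaded."""
    conn.execute("CREATE TABLE IF NOT EXISTS staging (sku TEXT, price REAL)")
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        try:
            rows.append((rec["sku"].strip(), float(rec["price"])))
        except (ValueError, KeyError):
            continue  # skip bad rows rather than failing the whole batch
    conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
loaded = run_etl("sku,price\nA1,9.99\nA2,not_a_number\nA3,4.50\n", conn)
```

In production such a script would read from files or an API and load into the warehouse, with the loaded-row count logged for reconciliation.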
Developed visual reports and interactive dashboards in Tableau, enabling clear communication of healthcare data trends to stakeholders.
Leveraged Google Cloud Platform for data processing and storage, optimizing healthcare data analysis with BigQuery and Cloud Storage.
Applied data mining, cleansing, and text mining techniques, improving data quality and consistency for reliable healthcare analytics.
Utilized GitHub for version control, ensuring reliable code management with branching strategies and continuous integration.
Expertly utilized Clinical Trial Management System (CTMS) software to monitor and manage clinical trial data, ensuring accuracy, compliance, and efficiency throughout the trial lifecycle.
Utilized Epic, Cerner, and Medidata Rave to manage patient health data, ensuring real-time accuracy and compliance with healthcare regulations.
Contributed to a 35% improvement in patient care outcomes by analyzing clinical data to identify key trends and patterns.
Circle K, Gurugram, India Jun 2018 – Feb 2020
Data Analyst
Description: Circle K is a global leader in convenience and fuel retail, serving millions with accessible locations and diverse product offerings. Involved in optimizing data models and visualizations to streamline operations and improve decision-making capabilities.
Responsibilities:
Led the development of a prescriptive data model to optimize operations and increase packaging efficiency.
Analyzed requirements, developed, and debugged applications using BI tools.
Interacted with users and business analysts to assess requirements and performed impact analysis.
Used agile software development with Scrum methodology.
Worked in coordination with clients for feasibility study of the project in Spotfire and with DB Teams for designing the database as per project requirement.
Produced databases, tools, queries and reports for summarizing and root causing failure data.
Designed, developed, tested, and maintained Spotfire reports based on user requirements.
Extensively worked on Tableau dashboards optimization for better performance tuning.
Extensively worked with various functionalities like context filters, global filters, drill-down analysis, parameters, background images, maps, and trend lines, and created customized views and conditional formatting for end users.
Assisted report/visualization developers and statistical modelers in filtering, tagging, joining, parsing, and normalizing data sets for use in reports & analytical models.
Extracted, transformed, and loaded data from a variety of data sources into staging datasets and ultimately into visualization tools.
Worked on Data blending, Data preparation using Pipeline pilot for tableau consumption and publishing data sources to Tableau server. Created scripts in Python and R for calling APIs through Insomnia and Postman.
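Scripted API calls of the kind exercised interactively in Insomnia and Postman can be sketched in Python as below. The `data:` URL is a self-contained stand-in for a real endpoint, since the actual APIs and payloads were project-specific.

```python
import json
import urllib.request

def fetch_json(url: str) -> dict:
    """GET a URL and decode its JSON body."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Demonstration with a data: URL standing in for a real REST endpoint
# (urllib handles the data: scheme, so no network access is needed here).
payload = fetch_json('data:application/json,{"rows":[1,2,3]}')
```

The decoded payload can then be flattened into a DataFrame for blending with the other Tableau data sources.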
Developed custom SQL connections and calculated fields within the environment to facilitate automated production of data visualizations.
Provided production support to various tools and created tickets in Jira.
Environment: TIBCO Spotfire 7.x, Tableau 10.x (Desktop/Server), Pipeline Pilot, Alteryx, Oracle, SQL Server, MS SQL, AWS, Python, R- Script, Power BI, SharePoint, APIs, Insomnia, Postman, Jira, MS Office Suite (Word, Excel, PowerPoint)