Sai LAKSHMI
**************@*****.*** 302-***-****
Senior Data Analyst
https://www.linkedin.com/in/prasanna-lakshmi-706b2930a/
PROFESSIONAL SUMMARY:
9+ years of IT industry experience with emphasis on Healthcare Data Analysis, Claims Processing, Pharmacy Data Management, Member Data Integration, Reporting, Predictive Modeling, and Data Architecture.
7+ years of experience in Power BI development with architecture expertise, designing and developing complex clinical and operational dashboards integrated with custom SQL queries.
7+ years of experience developing healthcare data pipelines using Python, Azure Data Factory, SQL Server, Snowflake, Teradata, and Informatica for claims processing systems.
Extensive experience in source-to-target mapping for healthcare data warehouses, ensuring accurate transformation of claims, pharmacy, and member data into actionable analytics platforms.
Expert in data quality validation and cleansing methodologies for healthcare datasets, maintaining compliance with HIPAA and other healthcare regulations.
Proficient in healthcare data visualization, clinical reporting, and outcomes analysis using various chart types with customized filters for different healthcare stakeholders.
Experience in the Guidewire suite (PolicyCenter, ClaimCenter, BillingCenter, and ContactManager) and healthcare systems integrations.
Strong knowledge of healthcare database management with expertise in writing complex SQL queries for membership analysis, claims trend identification, and pharmacy utilization metrics.
Expertise in healthcare analytics including claims processing, pharmacy benefit management, and member enrollment systems.
Experience in designing and analyzing healthcare A/B tests with understanding of confidence intervals and statistical significance in clinical outcomes.
Well-versed in healthcare data models following 3NF and Star Schema (Dimensional) Data Modeling techniques for clinical and claims data warehouses.
Experience in working with databases like MongoDB, MySQL and Cassandra.
Possess strong knowledge of database management in writing complex SQL queries, Joins, query optimization and resolving key performance issues.
Familiar with P&C Insurance Analytics using Guidewire PolicyCenter.
Expertise in tracking, capturing, managing, and communicating requirements using a Requirements Traceability Matrix (RTM), which helped control the numerous artifacts produced by teams across a project's deliverables.
Experience working with MongoDB Ops Manager, Cloud Manager, and Atlas.
Worked in all phases of BW/BI full life cycles including analysis, design, development, testing, deployment, post-production support/maintenance, documentation, and end-user training.
Proficient in developing Entity-Relationship diagrams, Dimensional Modeling, and Star/Snowflake Schema designs; expert in modeling transactional databases and data warehouses.
Experience in designing and analyzing A/B tests with an understanding of confidence intervals and noise reduction techniques.
Expert in insurance technology solutions such as the Guidewire suite and PEGA, with in-depth knowledge of underwriting, policy administration, billing, rating, reinsurance, and the claims process.
Architecture experience in AWS setup for Tableau, including server sizing, capacity planning, and deployment; experience with version upgrades of servers and reports.
Certified Scrum Master with 5+ years of experience in Agile methodologies, applying Scrum (iterative and incremental) to ongoing operations.
Well versed in 3NF and Star Schema (Dimensional) data modeling techniques.
Experience handling large volumes of data moving in and out of Teradata and big data platforms.
Experience in integrating databases such as MongoDB and MySQL with web front ends built in HTML, PHP, and CSS to update, insert, delete, and retrieve data with simple ad-hoc queries.
Worked and extracted data from various database sources like Oracle, SQL Server, DB2, and Teradata.
Hands on experience with modeling using ERWIN in both forward and reverse engineering cases.
Strong exposure to writing simple and complex SQL, PL/SQL queries.
Experienced Data Analyst with hands-on experience in onsite-offshore model projects.
Involved in writing UNIX shell scripts for Teradata ETL tooling and data validation.
Experienced in using various Teradata utilities such as Teradata Parallel Transporter (TPT), MultiLoad, BTEQ, FastExport, and FastLoad.
EDUCATION:
●Bachelor of Computer Science.
TOOLS AND TECHNOLOGIES:
Programming Languages : VB 6.0, SQL, Python, R.
ETL tools : Informatica Power center, SSIS, AB Initio
Data modeling : Sybase Power Designer / IBM Data Architect
BI & Reporting tools : Business Objects, Cognos, SSRS, Crystal Reports
MS-Office Package : Microsoft Office (Word, Excel, PowerPoint, Visio, Project).
Visualization tools : Tableau Desktop, Power BI Desktop, Python (Pandas, NumPy), Datorama
ETL Tools / Tracking tool : Informatica, SSIS, SSAS, SSRS / JIRA.
Database Development : T-SQL and PL/SQL, Microsoft Hyper-V Servers
Databases : Teradata R12/R13/R14.10, MS SQL Server, DB2, Netezza
Operating Systems : Windows, UNIX, Sun Solaris.
PROFESSIONAL EXPERIENCE:
CVS Health, Woonsocket, RI
Nov 2023 - Present
Role: Senior Data Analyst
Responsibilities:
●Developed comprehensive source-to-target mapping documents for healthcare claims data integration projects, ensuring accurate data transformation and load processes.
●Created and optimized SQL data models for pharmacy claims analysis, improving prescription processing efficiency by 35% through targeted optimizations.
●Designed pharmacy benefits data warehouse architecture and implemented ETL processes for seamless integration with legacy systems.
●Conducted detailed claims data validation between source systems and target databases, ensuring 100% accuracy in financial reconciliation reports.
●Engineered complex SQL queries to analyze prescription trends across different member demographics, identifying potential cost-saving opportunities.
●Implemented data quality frameworks for member enrollment data, reducing data entry errors by 27% through automated validation procedures.
●Created pharmacy utilization dashboards in Tableau, providing insights into prescription patterns and identifying high-cost medication trends.
●Developed member segmentation models using Python and R to identify high-risk patients for targeted intervention programs.
●Designed claims processing workflows in Azure Data Factory, improving throughput by 40% and reducing processing time for daily claim batches.
●Implemented data quality checks for pharmacy data, ensuring NDC codes, dosage information, and pricing structures were accurately captured.
●Created healthcare provider network analysis reports using complex SQL joins across multiple data sources to identify coverage gaps.
●Optimized claims adjudication data pipelines between Guidewire ClaimCenter and enterprise data warehouse, ensuring timely daily refreshes.
●Developed member attribution logic using SQL to accurately assign patients to primary care providers for value-based care programs.
●Built healthcare cost prediction models using historical claims data to forecast future spending patterns for budget planning.
●Implemented HIPAA-compliant data masking procedures for PHI in test environments while maintaining referential integrity across systems (a masking sketch follows this list).
●Created healthcare fraud detection queries that identified unusual billing patterns and potential duplicate claims submissions.
●Designed medication adherence tracking systems using SQL and Python to identify members at risk of non-compliance with prescribed therapies.
●Built pharmacy formulary comparison tools to analyze cost implications of formulary changes across different member populations.
●Developed provider performance dashboards using claims data to measure quality metrics and patient outcomes across the network.
●Created member journey analysis using SQL window functions to track the complete patient experience from enrollment through various care touchpoints (a window-function sketch follows this list).
●Implemented longitudinal patient record database structure to enable comprehensive analysis of member health outcomes over time.
●Developed EDW transformation logic specifically for healthcare claims data, ensuring accurate mapping of procedure and diagnosis codes.
●Created SQL stored procedures for automated data reconciliation between claims processing systems and financial reporting platforms.
●Designed data quality scorecards for pharmacy data feeds, establishing KPIs for completeness, accuracy, and timeliness of data delivery.
●Implemented healthcare regulatory reporting data pipelines to ensure compliance with CMS and state reporting requirements.
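A minimal sketch of the deterministic masking approach referenced above, assuming hypothetical table and column names (members_test, ssn, dob); this is illustrative, not the production procedure:

    -- Illustrative T-SQL only: mask PHI deterministically so the same source value
    -- always yields the same masked value, preserving referential integrity on joins.
    UPDATE m
    SET m.ssn        = RIGHT(CONVERT(VARCHAR(64), HASHBYTES('SHA2_256', m.ssn), 2), 9),
        m.first_name = 'TEST_' + CAST(m.member_id AS VARCHAR(20)),
        m.dob        = DATEFROMPARTS(YEAR(m.dob), 1, 1)  -- keep only birth year
    FROM dbo.members_test AS m;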
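A minimal sketch of the member journey analysis described above, using SQL window functions; the table and columns (member_events, event_ts) are assumptions for illustration:

    -- Illustrative T-SQL only: sequence each member's touchpoints and measure
    -- the gap in days between consecutive events.
    SELECT
        member_id,
        event_type,
        event_ts,
        ROW_NUMBER() OVER (PARTITION BY member_id ORDER BY event_ts) AS touchpoint_seq,
        DATEDIFF(day,
                 LAG(event_ts) OVER (PARTITION BY member_id ORDER BY event_ts),
                 event_ts) AS days_since_prior_event
    FROM dbo.member_events
    ORDER BY member_id, event_ts;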
Environment: SQL Server 2019, Azure Data Factory, Azure Databricks, Snowflake, Teradata, Python 3.9, R 4.1, Tableau 2023.1, Power BI, SSIS, SSRS, JIRA, Azure DevOps, Git, Informatica PowerCenter 10.4, AWS S3, AWS Redshift, Visual Studio Code, Docker, Kubernetes, Apache Airflow, Jupyter Notebooks, HIPAA-compliant data environments, Guidewire ClaimCenter, Epic EHR integration tools
TechWave, Hyderabad, India
December 2021 - April 2023
Role: Data Analyst
Responsibilities:
●Designed source-to-target mapping documents for claims data migration projects, ensuring accurate transformation of legacy system data.
●Developed complex SQL queries to analyze healthcare claims patterns, identifying opportunities for process optimization and cost reduction.
●Created data validation frameworks for pharmacy claims data, ensuring accurate processing of NDC codes and medication information.
●Implemented ETL processes for member enrollment data, standardizing disparate data sources into a unified member database.
●Designed healthcare provider database structure with optimized indexing strategy for improved query performance on large datasets.
●Created data quality reports to track completeness and accuracy of claims submission data from various provider networks.
●Developed DAX queries in Power BI for healthcare utilization analysis, creating interactive dashboards for clinical operations teams.
●Implemented referential integrity checks between member, provider, and claims tables to ensure data consistency across systems.
●Created SQL stored procedures for automated claims data processing, reducing manual intervention and improving throughput.
●Designed healthcare data marts specifically optimized for common analytics queries, improving report generation performance.
●Conducted JAD sessions with healthcare stakeholders to gather requirements for clinical and operational reporting solutions.
●Implemented data lineage tracking for sensitive PHI elements, ensuring compliance with healthcare data governance requirements.
●Created SQL optimization strategies for large claims history tables, reducing query execution time by 45% for common reports.
●Developed member benefit verification logic using SQL to accurately determine eligibility for specific services and procedures.
●Implemented claims denial analysis reports to identify common rejection reasons and improve first-pass acceptance rates.
●Created provider credentialing database structure and implemented validation rules to ensure data accuracy and completeness.
●Developed healthcare KPI dashboards in Power BI, establishing baseline metrics for organizational performance monitoring.
●Implemented code mapping tables for ICD-10, CPT, and HCPCS codes, enabling standardized analysis across different coding systems.
●Created SQL-based data cleansing routines for address normalization and duplicate identification in provider and member records (a cleansing sketch follows this list).
●Developed claims aging analysis reports to track processing timelines and identify bottlenecks in the adjudication workflow.
●Created test cases for validating the accuracy of calculated fields in healthcare claims processing systems.
●Implemented ETL monitoring frameworks to ensure timely completion of daily data loads from claims processing systems.
●Developed traceability matrices to ensure complete coverage of business requirements in healthcare data integration projects.
●Designed and implemented pharmacy formulary tables with effective dating to accurately track medication coverage changes over time.
●Created SQL views to simplify complex healthcare data relationships for business analyst consumption and ad-hoc reporting needs.
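A minimal sketch of the duplicate-identification step in the cleansing routines above; table and column names (member_records, address_line1, zip5) are assumptions for illustration:

    -- Illustrative T-SQL only: normalize names and addresses, then flag groups
    -- that collapse to the same key as candidate duplicates for review.
    WITH normalized AS (
        SELECT
            member_id,
            UPPER(LTRIM(RTRIM(last_name)))         AS last_name_norm,
            UPPER(REPLACE(address_line1, '.', '')) AS address_norm,
            zip5
        FROM dbo.member_records
    )
    SELECT last_name_norm, address_norm, zip5, COUNT(*) AS record_count
    FROM normalized
    GROUP BY last_name_norm, address_norm, zip5
    HAVING COUNT(*) > 1;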
Environment: SQL Server 2016/2019, Oracle 12c, SSIS 2019, SSRS, Power BI, DB2, XML, SAS 9.4, SAS Enterprise Guide 8.2, Teradata 16.20, Teradata SQL Assistant, WinSQL, DataFlux, Quality Center 9.2, IBM Rational ClearQuest 7.0, TOAD, Business Objects, Excel VBA, Microsoft Project, Informatica PowerCenter 9.6, Python 3.8, Anaconda, Visual Studio 2019, Agile/Scrum methodologies, HIPAA-compliant test environments
DesIDEA Software Technologies, India
August 2018 - November 2021
Role: Data Analyst II
Responsibilities:
●Development, support, and maintenance of key Business Intelligence and data reporting solutions using digital marketing and demographic data provided by clients, enabling them to make better business decisions.
●Implementation and delivery of business solutions to develop and deploy ETL, analytical, reporting, and scorecard/dashboard applications covering digital, social media, and competitive data from Google Analytics and Adobe.
●Involved in the complete SSIS life cycle: creating SSIS packages and building, deploying, and executing them in both the Development and Production environments.
●Involved in the creation and review of functional requirement specifications and supporting documents for business systems, interacting with Business Analysts, the client team, and the development team.
●Development and implementation of SSIS packages to extract and load data from heterogeneous data sources such as Excel, CSV, Oracle, MS Access, SQL Server, and flat files.
●Data quality analysis and data profiling of the various data sources to determine adherence to standards; performed data scrubbing, cleansing, and standardization to prepare data prior to load.
●Development and deployment of SSIS packages with tasks and transformations such as Execute SQL, Execute Package, Conditional Split, Script Component, Multicast, Merge, and Lookup to load data into the destination.
●Used SQL Server Business Intelligence Development Studio to design SSIS packages that transfer data between servers and deploy the data; scheduled jobs to run these packages periodically.
●Creation of SQL scripts and T-SQL stored procedures, functions, views, and other database objects for populating and validating data; performance tuning of scripts, procedures, and indexes to improve overall speed (a validation-procedure sketch follows this list).
●Connected to AWS Redshift through Tableau to extract live data for real time analysis.
●Researched and developed hosting solutions using MS Azure for service solution.
●Interaction with development teams and business analysts to create and modify the data model and Logical Data Model (LDM) for publishing multiple dashboard reports on web and cloud platforms in accordance with client expectations.
●Development of complex, intuitive dashboard reports for the display of key business data on platforms such as SSAS, QlikView, Crystal Reports, and GoodData.
●Development of various types of complex reports such as drill-down, drill-through, and cross-tab reports; creation of fact tables, dimension tables, and cubes for slicing and dicing data.
●Performance tuning and optimization of ETL packages at the Control Flow and Data Flow levels, along with proper use of transactions and checkpoints.
●Communication with key stakeholders, Client Sales Managers (CSMs), and end users regarding business requirements and specifications for the latest dashboard reports and BI solutions.
●Creation and modifications of the fact and dimension tables in the star schema in accordance with the BI Data analytics and reporting solutions.
●Coordination of design and implementation of key dashboard reporting platforms based on social media, digital media and competitive data.
●Coordination of the onboarding of new clients onto the dashboard reporting platforms and management of their expectations.
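A minimal sketch of the kind of T-SQL validation procedure described above; object and column names (stg_sales, err_sales) are hypothetical:

    -- Illustrative T-SQL only: route invalid staged rows to an error table,
    -- then keep only clean rows for the downstream load.
    CREATE PROCEDURE dbo.usp_ValidateStagedSales
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO dbo.err_sales (sale_id, customer_id, amount, err_reason)
        SELECT sale_id, customer_id, amount,
               CASE WHEN customer_id IS NULL THEN 'Missing customer'
                    ELSE 'Negative amount' END
        FROM dbo.stg_sales
        WHERE customer_id IS NULL OR amount < 0;

        DELETE FROM dbo.stg_sales
        WHERE customer_id IS NULL OR amount < 0;
    END;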
Environment: MS SQL Server 2008 R2/2012, T-SQL, SSIS, SSRS, SSAS, QlikView, Omniscope, DataFlux, SSIS DQS, Oracle 11g, Crystal Reports, MS Access, MS Azure, Power BI, Excel, VBA, .NET, SharePoint, Business Objects, XML, Erwin, TFS, GoodData.
Bizionic Technology, India
April 2016 - July 2018
Role: Data Analyst/Business Analyst
Responsibilities:
●Support and development of key anti-money laundering applications for the sanctions screening and transaction monitoring process, ensuring compliance with the USA PATRIOT Act and the Bank Secrecy Act (BSA).
●Support and maintenance of key Anti-Money Laundering (AML) applications, which manage data related to US sanctions and Specially Designated Nationals (SDNs) from the Office of Foreign Assets Control (OFAC), USA PATRIOT Act Section 314(a), and the transaction screening process.
●Provided Business as Usual (BAU) and production support for the NESS application and performed data analysis; trained users and developers to bring them up to speed on different aspects of AML and compliance.
●Designed test cases and test scripts and coordinated testing activities for various NESS application releases; coordinated regularly with the offshore team to ensure smooth execution on the testing, development, and training fronts.
●Documentation of procedures for creation, analysis and maintenance of various Global, Local and Regional Lists.
●Development, maintenance, and replication of various Global, Local, and Regional Lists of sanctioned people and entities, which help Citi regional offices around the world maintain strict standards in financial transactions.
●Conducted research, assisted, and provided information to support suspicious activity reports (SARs) and made recommendations; worked on various aspects of AML, fraud prevention, KYC, brokerage, and trading compliance.
●Defined review procedures for the disposition of cases; identified and rectified issues with data quality and accuracy.
●Development and tuning of the database optimization script to reduce the rate of false-positive hits generated by the system, streamlining the overall screening and filtering process and improving performance.
●Development and tuning of the database script for performing hit-rate analysis of list entries against transaction files (a hit-rate sketch follows this list).
●Guided the automation of Gap Analysis process and preparation of Lists using MS Excel, Macros, MS Access, SQL and SAS.
●Obtained a wide selection of client documentation from internal and external data sources to extract key information and validate clients' identity and reputation so that they meet KYC checks.
●In-depth data quality analysis and profiling, using DataFlux, of input data from business units and new businesses within Citigroup scheduled to come on board the AML applications, to assess data quality issues.
●Designed the framework for DataFlux Architect and DataFlux Profiler jobs to extract data from various sources: flat files, MS SQL Server, Oracle, Excel, and COBOL copybook files.
●Development and maintenance of a profiling report application in Crystal Reports, in accordance with GAML guidelines, providing details, frequency distributions, and pattern analysis for all data elements across Citi business sectors such as CitiFinancial, Citi Auto, and FDR.
●Discussion of data quality and profiling results with key stakeholders and GAML, and recommendation of measures to rectify, fix, and cleanse the data.
●Development of a reporting solution for data profiling, using DataFlux, MS SQL Server and reporting tools.
●Enhancements to the current applications to add complex functionalities as well as performance tuning and optimization of the application for better speed and matching accuracy, in view of addition of new business units and clients.
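A minimal sketch of the hit-rate analysis referenced above; table and column names (sdn_list, txn_screening_hits, disposition) are assumptions for illustration:

    -- Illustrative T-SQL only: count screening hits per list entry and compute
    -- the share dispositioned as false positives, to target tuning effort.
    SELECT
        l.list_entry_id,
        l.entry_name,
        COUNT(h.hit_id) AS total_hits,
        SUM(CASE WHEN h.disposition = 'FALSE_POSITIVE' THEN 1 ELSE 0 END) AS false_positives,
        CAST(SUM(CASE WHEN h.disposition = 'FALSE_POSITIVE' THEN 1 ELSE 0 END) AS FLOAT)
            / NULLIF(COUNT(h.hit_id), 0) AS false_positive_rate
    FROM sdn_list AS l
    LEFT JOIN txn_screening_hits AS h
           ON h.list_entry_id = l.list_entry_id
    GROUP BY l.list_entry_id, l.entry_name
    ORDER BY total_hits DESC;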
Environment: Solaris 9/10, Red Hat Linux 6.x/7.x/8/9, AIX 6.1, Veritas Volume Manager 3.x/4.x, Veritas NetBackup 4.5, Oracle 9i, WebLogic 8.x, Sun Fire v480/v880/4800/6800, Chef, Jenkins, Puppet, Shell Scripting, Python