
R. Teja 909-***-****

Data Analyst ad0wbz@r.postjobfree.com

https://www.linkedin.com/in/teja-y-659717293/

Summary: -

Over 9 years of experience as a Data Analyst in the healthcare and insurance domains, with expertise in Data Warehousing, Data Integration, and Business Intelligence (BI).

Experienced with Medicare and healthcare insurance, as well as medical standards such as HIPAA and HL7.

Experience in data science and computational prediction methods to solve complex public health problems (e.g., cancer) using big data, unsupervised learning (clustering algorithms), supervised learning, and deep learning (CNN, LSTM) for prediction tasks.

Excellent Knowledge of Electronic Medical Record (EMR) / Electronic Health Records (EHR) modules and process flow.

Experience working with RDBMSs including Oracle, DB2, SQL Server, PostgreSQL, MS Access, and Teradata for faster access to data on HDFS.

Exposure to implementation and operations of data governance, data strategy, data management, and solutions.

Experienced working with clients in the healthcare industry (NPI, Membership & Billing, MetaVance, EDI 837, professional and institutional claim processing, COB, NASCO).

Outstanding knowledge of and working experience with slice-and-dice analysis in Looker, Power BI, and Cognos.

Extensive experience in leading Business Intelligence/ ETL implementations using MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), MS SQL Server Analysis Services (SSAS) and Tableau.

Experience in Data Migration, Data conversion, and Data Integration.

Mentored and assisted staff performing data interpretation, data analysis, and data integration using databases like MySQL and DB2.

Experienced in data integration, metadata management, ETL, data modeling tool sets.

Solid understanding of statistical analysis, predictive analysis, machine learning, data mining, quantitative analytics, multivariate testing, and optimization algorithms.

Strong working knowledge of healthcare claims data from varied insurance payers in both the commercial and Medicare spaces.

Experienced working on large volumes of data using Teradata SQL and Base SAS programming.

Proficient in ICD-9-CM and ICD-10-CM coding and claims processing.

Experience in conducting User Acceptance Testing (UAT) and documenting test cases in both personal and commercial lines of property and casualty insurance.

Proficient in data extraction, cleaning, transformation, analysis of categorical and numerical variables, and creating statistical linear/multilinear regression models based on response variables, including identifying multicollinearity and causation.
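
To illustrate the regression and multicollinearity work described above, here is a minimal Python sketch using pandas and statsmodels; the input file and column names are hypothetical.

# Minimal sketch: multilinear regression with a multicollinearity check (VIF).
# The CSV file and column names below are placeholders, not actual project data.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("claims_sample.csv")                     # hypothetical input file
predictors = ["member_age", "chronic_conditions", "prior_visits"]
X = sm.add_constant(df[predictors])                       # add the intercept term
y = df["total_paid_amount"]                               # hypothetical response variable

model = sm.OLS(y, X).fit()                                # ordinary least squares fit
print(model.summary())                                    # coefficients, R-squared, p-values

# Variance inflation factors: values well above ~5-10 suggest multicollinearity.
vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
)
print(vif)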

Strong knowledge of Unified Modeling Language (UML), and System Development Life Cycle (SDLC) methodologies like Waterfall and Agile/SCRUM.

Experience in data mining, data mapping, and data modeling, and a good understanding of ETL tools like Ab Initio, SSIS, and Informatica PowerCenter.

Technical Skills:

Languages: Python, R, SQL

ETL Processing: Google Cloud Data Fusion, Azure Data Factory, AWS Glue, SSIS, Excel Power Query

Cloud Platforms (Data Warehouse): Snowflake, Teradata, AWS (Redshift, S3), GCP (BigQuery), Microsoft Azure (Azure Synapse Analytics)

Databases: MySQL, SQL Server, Oracle, Hive, NoSQL (MongoDB), Amazon (RDS, DynamoDB), Google Cloud, PostgreSQL

Data Analysis/Statistics/Model-Building Platforms: MS Excel, Dataiku, Alteryx

BI Tools (Reporting, Analysis, and Visualization): Tableau, Power BI, QlikView, Spotfire, Google Analytics, Looker, MS Office (Word/Excel/PowerPoint/Visio), D3

Data Modeling Tools: MS Visio, Erwin, Power BI

Data Processing Frameworks: Hadoop, Spark

Cardinal Health, CA

Sr. Business/Data Analyst Dec 2021 – Present

Summary: - The project focused on various aspects of data management, analysis, and migration within the healthcare domain. A gap analysis was conducted to identify additional functionalities required to meet EHR, EMR, HL7, and EDI requirements. Data quality mappings were created using the Informatica Data Quality (IDQ) tool and imported into Informatica PowerCenter. The project aimed to excel in data extraction, cleaning, loading, analysis, and modeling using R, Python, and ETL tools. It involved collaborating with stakeholders to create data functional design documents, choosing suitable MDM technologies, migrating data warehouses, implementing ETL processes, and ensuring data governance and data quality.

Responsibilities:-

Performed gap analysis between the AS-IS and TO-BE business process models to determine what additional functionality was needed to meet EHR, EMR, HL7, and electronic data interchange (EDI) standards.

Researched the existing client processes and guided the teams in aligning with the HIPAA rules and regulations for the systems for all the EDI transaction sets.

Created various data quality mappings in the Informatica Data Quality (IDQ) tool and imported them into Informatica PowerCenter as mappings.

Worked with reporting tools, software, and other applications, including SQL databases, Looker, Tableau, Salesforce dashboards (report design tools), and the data warehouse.

Created LookML views from data mart tables and developed models for Looks and dashboards.

Created LookML views by writing complex SQL in SQL Runner to create derived tables.

Conducted surveys to gather geodemographic data and analyzed the results with GIS software.

Migrated data from MySQL to MongoDB (NoSQL).
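
A minimal sketch of the MySQL-to-MongoDB migration described above, assuming the mysql-connector-python and pymongo drivers; hostnames, credentials, and table/collection names are placeholders.

# Sketch: copy rows from a MySQL table into a MongoDB collection in batches.
import mysql.connector
from pymongo import MongoClient

mysql_conn = mysql.connector.connect(
    host="mysql.example.com", user="etl_user", password="***", database="members_db"
)
mongo_coll = MongoClient("mongodb://mongo.example.com:27017")["members_db"]["members"]

cursor = mysql_conn.cursor(dictionary=True)   # rows come back as dicts, ready for MongoDB
cursor.execute("SELECT * FROM members")       # hypothetical source table

batch = []
for row in cursor:
    batch.append(row)
    if len(batch) == 1000:                    # insert in batches to limit round trips
        mongo_coll.insert_many(batch)
        batch = []
if batch:
    mongo_coll.insert_many(batch)

cursor.close()
mysql_conn.close()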

Worked in healthcare claims administration and healthcare claims processing (837/835), including facility claims and professional claims.

Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Access database.

Converted HL7 to FHIR Resources for Interoperability goals across systems.

Performed data cleaning, feature scaling, and feature engineering using Python packages such as Pandas, NumPy, Matplotlib, and scikit-learn.
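
As an illustration of the cleaning and feature-scaling work mentioned above, a small sketch with pandas, NumPy, and scikit-learn; the claim-level columns are hypothetical.

# Sketch: basic cleaning, feature engineering, and scaling on a hypothetical claims DataFrame.
import pandas as pd
import numpy as np
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("claims_extract.csv")                    # hypothetical extract
df = df.drop_duplicates()
df["paid_amount"] = df["paid_amount"].fillna(df["paid_amount"].median())

# Simple feature engineering: log-transform a skewed amount and derive length of stay.
df["log_paid_amount"] = np.log1p(df["paid_amount"])
df["length_of_stay"] = (
    pd.to_datetime(df["discharge_date"]) - pd.to_datetime(df["admit_date"])
).dt.days

numeric_cols = ["log_paid_amount", "length_of_stay", "member_age"]
df[numeric_cols] = StandardScaler().fit_transform(df[numeric_cols])   # zero mean, unit variance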

Worked on QEYAS projects, helping create healthcare metrics and measures for Dubai, U.A.E., using PostgreSQL and SQL programming.

Involved in testing SQL, LookML in Looker, and Snowflake DB scripts.

Strong command of data extraction, data cleaning, data loading, statistical data analysis, exploratory data analysis, data wrangling, and predictive modeling using R and Python.

Worked with data modelers and ETL developers to create the data functional design documents.

Working knowledge of OLAP reporting tools like Cognos, MicroStrategy, and Crystal Reports, and Java-based visualization tools like D3.

Checked the data and table structures in the PostgreSQL and Redshift databases and ran queries to generate reports.
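
A sketch of the kind of structure and row-count checks referenced above, using psycopg2 (Redshift speaks the same PostgreSQL wire protocol); connection settings and table names are placeholders.

# Sketch: inspect table structure and pull a small report from PostgreSQL/Redshift.
import psycopg2

conn = psycopg2.connect(
    host="reporting-db.example.com", port=5439, dbname="edw", user="analyst", password="***"
)
cur = conn.cursor()

# Column names and types for a hypothetical claims table.
cur.execute(
    """
    SELECT column_name, data_type
    FROM information_schema.columns
    WHERE table_schema = 'public' AND table_name = 'claims'
    ORDER BY ordinal_position
    """
)
for name, dtype in cur.fetchall():
    print(name, dtype)

# Simple report query: claim counts by month.
cur.execute(
    "SELECT date_trunc('month', service_date) AS month, count(*) "
    "FROM public.claims GROUP BY 1 ORDER BY 1"
)
for month, n in cur.fetchall():
    print(month, n)

cur.close()
conn.close()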

Created React/Node healthcare applications that reward providers based on patient outcomes rather than patient volumes, taking patients' social determinants of health into account.

Completed a highly immersive data science program involving data manipulation and visualization, web scraping, machine learning, Python programming, SQL, Git, Unix commands, NoSQL, MongoDB, and Hadoop.

Analyzed and selected different MDM technologies and products in light of client requirements.

Worked on projects to move data from Oracle/DB2 data warehouses to Teradata data warehouses.

Used GIS linear referencing techniques to produce databases for use by the Superload application.

Participated in the Data Governance working group sessions to create Data Governance Policies.

Collaborated with data engineers and the operations team to implement the ETL process, and wrote and optimized SQL queries to perform data extraction to fit the analytical requirements.

Working with clients to understand their data migration needs and determine any data gaps.

Migrated three critical reporting systems to Business Objects and Web Intelligence on a Teradata platform.

Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.

Implemented Unix/Linux commands and Perl scripts to collect and filter out all production batch jobs, shell scripts, SQL statements, and Oracle control files related to the large Oracle database and loaded all those records into tables for data analysis.

Created automated solutions using Databricks, Spark, Python, Snowflake, and HTML.
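
A minimal PySpark sketch in the spirit of the Databricks/Snowflake automation above, assuming the Spark-Snowflake connector is available on the cluster; the connection options and table names are placeholders.

# Sketch: read a Snowflake table with the Spark-Snowflake connector, aggregate, and write back.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_automation").getOrCreate()

sf_options = {                                   # hypothetical Snowflake connection options
    "sfURL": "myaccount.snowflakecomputing.com",
    "sfUser": "etl_user",
    "sfPassword": "***",
    "sfDatabase": "EDW",
    "sfSchema": "CLAIMS",
    "sfWarehouse": "ANALYTICS_WH",
}

claims = (
    spark.read.format("snowflake")               # short-form connector name on Databricks
    .options(**sf_options)
    .option("dbtable", "RAW_CLAIMS")
    .load()
)

monthly = (
    claims.groupBy(F.date_trunc("month", "SERVICE_DATE").alias("MONTH"))
    .agg(F.sum("PAID_AMOUNT").alias("TOTAL_PAID"))
)

(
    monthly.write.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "MONTHLY_PAID_SUMMARY")
    .mode("overwrite")
    .save()
)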

Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.

Performed slicing and dicing of data using SQL and Excel for data cleaning and data preparation.

Extensively involved in data extraction, transformation, and loading (ETL) from XML to the staging area and from staging to the ODS using Informatica PowerCenter.

Used gap analysis to do the data mapping to determine the source table, source columns, destination table, and destination columns, and worked on business requirement changes to fill those gaps.

Molina Healthcare, CA

Business/Data Analyst Aug 2019 – Oct 2021

Summary: -

The project aimed to enhance Molina's claims reimbursement user interface, focusing on improving the user experience and incorporating necessary changes according to guidelines. The primary goal was to enhance the security and privacy of health information entered by users during claim submission. By implementing these improvements, the project aimed to provide a more efficient and reliable system for handling claims while prioritizing user data protection.

Responsibilities: -

Worked on creating Process Flow diagrams, Use Case Diagrams, Class Diagrams and Interaction Diagrams using Microsoft Visio and Rational Rose.

Analyzed HIPAA 4010 and HIPAA 5010 standards for 837P, 837I, and 835 transactions and prepared a gap analysis document for each transaction.

Created interface stored procedures used in SSIS to load/transform data to PRMS

Developed SQL code in Snowflake and Teradata, along with Python and shell scripting, for file transfer and automation as project needs changed.

Developed dashboard prototypes using the cloud dashboard tools Looker and AWS QuickSight, managing all aspects of the technical development.

Performed data comparisons between SDP (Streaming Data Platform) real-time data, AWS S3 data, and Snowflake data using Databricks, Spark SQL, and Python.
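
A sketch of the row-level comparison approach implied above, using PySpark; the S3 path, table name, and key columns are placeholders, and the Snowflake-sourced data is assumed to have already been landed as a table in the Databricks metastore.

# Sketch: compare an S3 extract with a Snowflake-sourced snapshot on shared columns.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("source_comparison").getOrCreate()

s3_df = spark.read.parquet("s3://example-bucket/claims/2021-06/")   # hypothetical S3 extract
sf_df = spark.table("edw.claims_snapshot")                          # hypothetical landed Snowflake data

cols = ["claim_id", "member_id", "paid_amount"]                     # hypothetical comparison columns

only_in_s3 = s3_df.select(cols).exceptAll(sf_df.select(cols))       # rows in S3 but not Snowflake
only_in_sf = sf_df.select(cols).exceptAll(s3_df.select(cols))       # rows in Snowflake but not S3

print("rows only in S3:", only_in_s3.count())
print("rows only in Snowflake:", only_in_sf.count())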

Worked with Tableau to create dashboards visualizing healthcare claims data and federal and state insurance data.

Deployed, enhanced, and maintained the MyChart patient portal and integrated the Epic EHR with RESTful FHIR APIs for electronic data interchange (EDI).

Wrote various scripts to query networking devices and databases to gather information and store that data in MySQL, MemSQL, and NoSQL databases.

Worked with AgileAssets Pavement Analyst's GIS and linear referencing features.

Created visualization reports using D3 to showcase dealer/customer integration and mindsets.

Presented new dashboards to the business to explain the functionality of Looker.

Involved in specifying the star schema design by applying the data warehouse methodology of dimensional modeling, with development using BO Designer.

Performed exploratory data analysis, including statistical calculations, data cleaning, and data visualization, using NumPy, Pandas, and Matplotlib.

Well versed in data migration, data conversion, and data extraction/transformation/loading (ETL) using DTS and PL/SQL scripts.

Modification and creation of EHR content as prescribed by the project team and clinical users.

Performed data collection, cleaning, wrangling, and analysis, and built machine learning models on data sets in R and Python.
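
A small scikit-learn sketch of the model-building workflow mentioned above; the feature set and target are hypothetical.

# Sketch: train/test split and a baseline classifier on a hypothetical prepared data set.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

df = pd.read_csv("claims_features.csv")                  # hypothetical prepared feature set
X = df[["member_age", "num_prior_claims", "paid_amount"]]
y = df["denied_flag"]                                    # hypothetical binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))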

Extensively used ETL methodology to support data extraction, transformation, and loading processes in a complex EDW using Informatica.

Created a relational model and dimensional model for online services such as Clinical EMR records.

Involved in data lineage and Informatica ETL source to target mapping development, complying with data quality and governance standards.

Worked on SSMS and wrote stored procedures and advanced SQL Queries.

Extensively worked on Data mapping and created Data Dictionaries.

Responsible for ETL through SSIS and loading data into the database from different input sources.

Worked with AWS S3 buckets for storing data and with AWS Redshift for the data warehouse.
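
A sketch of a typical S3-to-Redshift load, assuming boto3 for the upload and a COPY statement issued over psycopg2; the bucket, IAM role, and table names are placeholders.

# Sketch: upload a file to S3 with boto3, then COPY it into a Redshift staging table.
import boto3
import psycopg2

s3 = boto3.client("s3")
s3.upload_file("daily_claims.csv", "example-analytics-bucket", "staging/daily_claims.csv")

conn = psycopg2.connect(
    host="redshift-cluster.example.com", port=5439, dbname="edw", user="loader", password="***"
)
conn.autocommit = True
cur = conn.cursor()
cur.execute(
    """
    COPY staging.daily_claims
    FROM 's3://example-analytics-bucket/staging/daily_claims.csv'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    CSV IGNOREHEADER 1
    """
)
cur.close()
conn.close()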

Senior-level SQL query skills in Oracle and T-SQL for analyzing and validating SSIS ETL and data warehouse processes.

Humana, KY

Data Analyst March 2018 – May 2019

Summary: - The project involved project management activities and serving as a liaison between business users and technical teams. The project aimed to ensure effective communication, track issues, and prioritize resolution using tools like JIRA. Additionally, various tasks were performed, such as creating reports, analyzing data lineage processes, implementing content management systems, and developing SQL Server Integration Services (SSIS) packages. The project team worked collaboratively with different departments and operated independently to deliver successful outcomes.

Responsibilities: -

Provided regular verbal and written status reports to IT management and business community; published meeting minutes and maintained project plans.

Followed all the phases in the project management life cycle.

Worked with different business areas, such as Claims and Enrollment, to document proposed ICD-9 to ICD-10 code changes.

Worked on HIPAA transactions and code sets standards according to the test scenarios such as 837 health care claim transactions.

Involved in full HIPAA EDI transactions such as 835, 837 (P, D, I), 276, 277, and 278.

Created complex reports utilizing SQL Server Reporting Services (SSRS) and Office 365.

Analyzed data lineage processes to identify vulnerable data points, control gaps, data quality issues, and overall lack of data governance.

Defined and documented the vision and scope of the project for Data warehousing and BI Analytics.

Involved in identifying user requirements and analyzing the existing data sources and IMS data to build product platforms into a standardized data warehouse/data mart.

Tested the Informatica ETL mappings and other ETL processes (data warehouse testing).

Reviewed the data model and reporting requirements for Cognos reports with the data warehouse/ETL and reporting team.

Worked on data ingestion from multiple sources into the Azure SQL data warehouse.

Primarily involved in Data Migration using SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell.

Created processes to load data from Azure Storage blob to Azure SQL, to load from web API to Azure SQL and scheduled web jobs for daily loads.
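
A sketch of the web-API-to-Azure-SQL portion of the loads described above, assuming the requests and pyodbc libraries; the API endpoint, connection string, and staging table are placeholders.

# Sketch: pull JSON records from a web API and insert them into an Azure SQL staging table.
import requests
import pyodbc

records = requests.get("https://api.example.com/v1/members", timeout=30).json()

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server.database.windows.net;DATABASE=edw;UID=loader;PWD=***"
)
cur = conn.cursor()
cur.fast_executemany = True                      # speeds up bulk inserts through pyodbc

rows = [(r["member_id"], r["plan_code"], r["effective_date"]) for r in records]
cur.executemany(
    "INSERT INTO dbo.member_staging (member_id, plan_code, effective_date) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
cur.close()
conn.close()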

Used various sources to pull data into Power BI such as SQL Server, SAP BW, Oracle, SQL Azure etc.

Implemented CMS (Content Management System) in automating various aspects of Web and Publication Content Creation, Document Management, and Delivery.

Worked with the OWM Team to write up Requirement Documents (define user and system interfaces - actors with the help of use case diagrams).

Developed and deployed SQL Server Integration Services (SSIS) Packages

Niva Bupa Health Insurance, India

Data Analyst July 2015 – Sep 2017

Summary: - The project involved working with data extraction, manipulation, and reporting using various tools like Tableau and MS Excel. The aim was to forecast key performance indicators (KPIs) and ensure compliance with data standards, including HIPAA, EDI, and transaction syntax. The project also involved evaluating and selecting Master Data Management (MDM) technologies and data mapping, modelling, and design. Dashboards, scorecards, and data analysis were created to provide insights for business decision-making, and guidance was provided for transitioning from Access to SQL Server.

Responsibilities:-

Worked on data extraction using various queries and formulas, and manipulated the data using various analytical tools to create reports.

Worked on various reports using reporting tools like Tableau and MS Excel to forecast the growth of various KPIs.
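
To make the KPI forecasting concrete, a small pandas/NumPy sketch that projects a monthly KPI forward with a simple linear trend; the data file and KPI name are hypothetical, and the production reports were built in Tableau and Excel.

# Sketch: fit a linear trend to a monthly KPI and project the next 6 months.
import pandas as pd
import numpy as np

kpi = pd.read_csv("monthly_kpi.csv", parse_dates=["month"])   # hypothetical columns: month, value
kpi = kpi.sort_values("month").reset_index(drop=True)         # month values assumed to be month starts

t = np.arange(len(kpi))                                       # time index 0, 1, 2, ...
slope, intercept = np.polyfit(t, kpi["value"], 1)             # least-squares linear trend

future_t = np.arange(len(kpi), len(kpi) + 6)
forecast = pd.DataFrame({
    "month": pd.date_range(kpi["month"].iloc[-1], periods=7, freq="MS")[1:],
    "forecast_value": intercept + slope * future_t,
})
print(forecast)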

Work closely with Subject Matter Experts to develop a thorough understanding of how data is maintained in the Enterprise Data Warehouse (EDW) structure and how it interfaces with corporate systems.

Worked with the data compliance and data governance teams to maintain data models, metadata, and data dictionaries, and to define source fields and their definitions.

Incorporated and implemented all the HIPAA standards, Electronic Data Interchange (EDI), and transaction syntax like ANSI X12, ICD-9, and ICD-10 coding.

Worked on Commercial Lines Property and Casualty Insurance including both policy and claim processing and reinsurance.

Evaluation and selection of various MDM technologies and products in view of the client's requirements.

Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.

Participated in various healthcare presales initiatives for EHR applications, HIPAA 4010 to HIPAA 5010, and ICD-9 to ICD-10 conversions.

Involved in defining the source to target ETL data mappings, business rules and data definitions.

Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.

Involved in Providing guidance for transitioning from Access to SQL Server.

Analyze data using Tableau for automation and determine business data trends.

Care Health Insurance, India

Analyst Jan 2014 – Feb 2015

Summary: - The project involved assisting in the documentation of system requirements, including functional and non-functional aspects. Various documents such as SRD, FRD, FDD, and use cases were developed to capture the project's scope. The aim was to gather accurate business requirements, comply with healthcare standards like EDI X12 and HIPAA, and ensure the accuracy and integrity of data in healthcare insurance processes.

Responsibilities:-

Address system concerns in the Facets system by reporting issues and developing documentation.

Ensure billing accuracy for Medicare, Medicaid, and third-party insurance by adhering to EDI X12 standards.

Provide Medicare Operations support and deliver standardized data and reporting to meet business user needs.

Maintain data integrity through SQL queries, collaboration with the development team, and effective communication using UML-based feasibility studies and Use Case Models.
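
A sketch of the kind of SQL-based integrity checks referred to above, run here from Python with pyodbc; the connection string, tables, and columns are hypothetical.

# Sketch: basic data-integrity checks (duplicates, nulls, orphaned claims) via SQL.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=claims-db.example.com;DATABASE=claims;UID=analyst;PWD=***"
)
cur = conn.cursor()

checks = {
    "duplicate_claim_ids":
        "SELECT COUNT(*) FROM (SELECT claim_id FROM dbo.claims "
        "GROUP BY claim_id HAVING COUNT(*) > 1) AS d",
    "null_member_ids":
        "SELECT COUNT(*) FROM dbo.claims WHERE member_id IS NULL",
    "claims_without_member":
        "SELECT COUNT(*) FROM dbo.claims c "
        "LEFT JOIN dbo.members m ON c.member_id = m.member_id WHERE m.member_id IS NULL",
}

for name, sql in checks.items():
    cur.execute(sql)
    print(name, cur.fetchone()[0])   # a non-zero count flags an integrity issue

cur.close()
conn.close()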

Involved in Guidewire corporate training on Guidewire PolicyCenter and worked on commercial lines property and casualty insurance.

Integrate and manage electronic medical records (EMR), electronic health records (EHR), personal health records (PHR), and patient medical records (PMR) using ASP technology and full integration tools.

Education:-

MSc Biotechnology, Dr. V.S. Krishna Government College / Andhra University, May 2008


