
Data Analyst / Software Developer

Location: McKinney, TX
Posted: October 11, 2020


Experience Summary:

Having * years of working experience with various IT systems and applications using open-source technologies, covering analysis, design, coding, testing, implementation, and training.

Excellent skills in state-of-the-art client-server computing, with a good understanding of big data technologies.

Developed search activity dashboards that provide an overview of clinical message activity and counts of message segments per facility, to better understand message flow on the Mirth Connect and Orion Rhapsody interfaces (data exchange interface engines), both in real time and over time.

Highly proficient and experienced in software development and data modeling projects using JavaScript, Python, MSSQL, PostgreSQL, Elasticsearch, Logstash, Kibana, Grafana, Power BI, SAS, HL7, FHIR, EDI, ELR, data exchange interface engines (NextGen Mirth Connect, Orion Rhapsody), and PowerShell on Windows and Unix/Linux machines.

Education:

Ph.D. (Information Technology), University of the Cumberlands, Williamsburg, KY.

Expected August 2021.

Master of Science (Electrical Engineering), Northwestern Polytechnic University, Fremont, CA.

April 2016.

Bachelor of Technology (Electronics and Communication Engineering), JNTUH, India.

June 2012.

Technical and Analytical Skills:

Roles: Software Developer/Data Scientist, SAS Data Analyst, Senior SQL Analyst, Financial Analyst, and Clinical Analyst.

Programming: JavaScript, Python, HL7, SAS, R, XML, PowerShell, REST API, HTML, VBA

Tools / Data Exchange Interface Engines: Mirth Connect, Mirth Results, Mirth Match, Orion Rhapsody; Git.

Database: MSSQL, PostgreSQL, Oracle, MongoDB, MySQL

Big Data: ELK Stack (Elasticsearch, Logstash, Kibana), Grafana, Power BI.

Project Methodologies: Agile Methodology, Waterfall

Work Experience:

NTT Data Services (Tenet Healthcare) - Plano, Texas June 2017 - Current

Integration Developer/Data Scientist

Served as a Software Developer and Data Scientist, responsible for creating new interfaces, resolving issues, and building interactive visualizations that show the flow of clinical messages on the Mirth interface by segment per facility, and performing analytics based on stakeholder needs. Understands department, segment, and organizational strategy and operating objectives.

This project aimed to deliver clinical messages to different facilities per their requirements by developing Mirth Connect and Orion Rhapsody data exchange interfaces using filters, responses, transformers, and code template libraries.

Technical Environment: Mirth (Match 1.5, Result 2.1, Connect 3.3/3.4/3.5), Rhapsody, HL7, FHIR, MSSQL Server 2008/2012, PostgreSQL 9.X, MongoDB, SSRS, Power BI, ELK 6.3, JavaScript, HL7v6, PowerShell, REST API, SOAP, HTML, XML, Tortoise Git, Python, R, ServiceNow, BMC Remedy, Microsoft Office, Excel ODBC, Microsoft Azure, Windows.

Responsibilities:

Served as the technical point of contact with upper management, business analysts, project management, and miscellaneous other groups for the proactive monitoring project.

Developed Mirth Connect and Orion Rhapsody data exchange interfaces independently for new requirements (ADT, ORU, SIU) and fixed production bugs using JavaScript, JSON, XML, database connections/queries, and REST APIs.
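For illustration, a minimal sketch of such a transformer step (the channel, segments, and mappings are hypothetical; in Mirth Connect, JavaScript transformers see the inbound HL7 message as the E4X variable msg):

    // Hypothetical Mirth Connect transformer step.
    var messageType  = msg['MSH']['MSH.9']['MSH.9.1'].toString();  // e.g. ADT
    var triggerEvent = msg['MSH']['MSH.9']['MSH.9.2'].toString();  // e.g. A01

    // Stash the sending facility for later routing and metrics steps.
    channelMap.put('facility', msg['MSH']['MSH.4']['MSH.4.1'].toString());

    // Example mapping: upper-case the patient last name in PID-5.1.
    var lastName = msg['PID']['PID.5']['PID.5.1'].toString();
    msg['PID']['PID.5']['PID.5.1'] = lastName.toUpperCase();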

Expert in working with healthcare messages such as CCD, CDA, ADT, ORU/ORM, MDM, MFN, DFT, SIU, VXU, and ELR.

Developed interfaces that match, yet improve upon, all legacy Rhapsody interfaces for vendors.

Worked very closely with healthcare interoperability and messaging standards such as HL7 2.x, HL7 3.x, FHIR, C-CDA, X12 HIPAA, and EDI across domains including Blood Bank, Microbiology, Transcription, Laboratory, Pathology, Radiology, and Patient Access.

Developed FHIR mappers for Gender Code, Practitioner, and Patient Services.

Created FHIR code template libraries to convert strings to JSON objects and JSON objects to HAPI FHIR resources.
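A hedged sketch of one such code-template function, assuming the HAPI FHIR library is on the Mirth classpath (the DSTU3 context and the Patient resource type are assumptions):

    // Parse a JSON string into a typed HAPI FHIR resource.
    function jsonStringToPatient(jsonString) {
        var obj = JSON.parse(jsonString);  // fail fast on malformed JSON

        var ctx = Packages.ca.uhn.fhir.context.FhirContext.forDstu3();
        var parser = ctx.newJsonParser();
        // parseResource(Class, String) returns the typed resource.
        return parser.parseResource(
            Packages.org.hl7.fhir.dstu3.model.Patient,
            JSON.stringify(obj));
    }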

Designed interfaces for LOINC, SNOMED, and RxNorm codes for various facilities to exchange data in HL7.
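An illustrative transformer fragment for such code mapping (the local codes and LOINC pairs below are invented for the sketch):

    // Translate a local lab code in OBX-3 to LOINC before forwarding.
    var loincMap = { 'GLU': '2345-7', 'HGB': '718-7' };  // assumed local-to-LOINC pairs
    var localCode = msg['OBX']['OBX.3']['OBX.3.1'].toString();
    if (loincMap[localCode]) {
        msg['OBX']['OBX.3']['OBX.3.1'] = loincMap[localCode];
        msg['OBX']['OBX.3']['OBX.3.3'] = 'LN';  // coding system is now LOINC
    }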

Expertise in debugging HL7 messages with the help of debug statements and fixing errors.

Developed Git pre-commit/post-commit hooks to automate the Mirth Connect configuration keys and values.
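One plausible shape for such a pre-commit hook, written for Node (the file layout and key-reference pattern are assumptions; Mirth channels commonly reference configuration-map values as $cfg('key')):

    #!/usr/bin/env node
    // Refuse the commit if a staged channel references a configuration-map
    // key that is missing from the exported properties file.
    const { execSync } = require('child_process');
    const fs = require('fs');

    const staged = execSync('git diff --cached --name-only', { encoding: 'utf8' })
      .split('\n')
      .filter(f => f.endsWith('.xml'));

    const configMap = fs.readFileSync('configurationMap.properties', 'utf8'); // assumed path
    const missing = [];
    for (const file of staged) {
      const xml = fs.readFileSync(file, 'utf8');
      for (const m of xml.matchAll(/\$cfg\('([^']+)'\)/g)) {
        if (!configMap.includes(m[1] + '=')) missing.push(file + ': ' + m[1]);
      }
    }
    if (missing.length) {
      console.error('Missing configuration map keys:\n' + missing.join('\n'));
      process.exit(1); // abort the commit
    }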

Strong in analyzing and troubleshooting various EDI-related errors, such as coding errors, data errors, and validation errors.

Worked on the EDI implementation project with the help of XML, JavaScript, Oracle and MSSQL.

Created and maintained probabilistic matching rules in Mirth Match EMPI and used Match's match-quality reporting to rate matching effectiveness and identify areas for improvement.

Created the tooling needed to perform data-steward activities against Mirth Results' clinical data repository (stored in PostgreSQL), such as detecting missing values from contributing systems and suggesting ways to solve and prevent the issues.
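A minimal sketch of one such check from a Mirth JavaScript context, using Mirth's DatabaseConnectionFactory (the JDBC URL, credentials, and schema are assumptions):

    var conn = DatabaseConnectionFactory.createDatabaseConnection(
        'org.postgresql.Driver',
        'jdbc:postgresql://results-db:5432/mirthresults',  // assumed host/database
        'steward', 'secret');                              // assumed credentials
    try {
        // Flag patients whose contributing system never supplied a birth date.
        var rs = conn.executeCachedQuery(
            'SELECT patient_id FROM patient WHERE birth_date IS NULL');  // assumed schema
        while (rs.next()) {
            logger.info('Missing DOB for patient ' + rs.getString('patient_id'));
        }
    } finally {
        conn.close();
    }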

Analyzed Mirth Results and Mirth Match data to find duplicate patient (demographics) records, identifying duplicates from the Mirth Match threshold score.
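To illustrate the general shape of threshold-based duplicate detection (this is not Mirth Match's actual engine; the weights and threshold are invented):

    // Weighted demographic match score against an accept threshold.
    var weights = { lastName: 4, firstName: 3, dob: 5, sex: 1, zip: 2 };

    function matchScore(a, b) {
        var score = 0;
        for (var field in weights) {
            if (a[field] && a[field] === b[field]) score += weights[field];
        }
        return score;
    }

    var threshold = 10;  // pairs at or above this score are reviewed as duplicates
    function isLikelyDuplicate(a, b) { return matchScore(a, b) >= threshold; }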

Hands-on experience using cloud databases, including AWS (EC2, RDS) and Microsoft Azure.

Used the AWS SCT tool for schema conversion and AWS DMS for automating the migration of schemas, objects, and data.

Migrated MS SQL and MySQL data to Amazon Aurora PostgreSQL on AWS.

Worked in UNIX/Linux environments, including commands, shell scripting, Python scripting, and PowerShell scripting.

Experience using various R and Python packages, such as ggplot2, caret, dplyr, RWeka, gmodels, twitteR, NLP, reshape2, rjson, and plyr in R, and pandas, NumPy, Seaborn, SciPy, Matplotlib, and scikit-learn in Python, along with Jupyter Notebooks and VS Code.

Responsible for Oracle and SQL Server Reporting Services Planning, Architecture, Training, Support, and Administration in Development, Test, and Production Environments.

Created an SSRS consolidated report for audit data, plotted graphs for calculated threshold values, and created reports using SQL Server Reporting Services (SSRS).

Built the ETL process for continuously bulk-importing clinical data from Oracle, SQL Server, and PostgreSQL databases into Elasticsearch and spreadsheets.
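A hedged sketch of the Elasticsearch side of such a pipeline, using the official Node client (the node URL, index name, and document shape are assumptions; in v7 clients the result is nested under resp.body):

    const { Client } = require('@elastic/elasticsearch');
    const client = new Client({ node: 'http://localhost:9200' });

    async function bulkIndex(rows) {
      // The bulk API interleaves action lines and document lines.
      const body = rows.flatMap(doc => [{ index: { _index: 'clinical-messages' } }, doc]);
      const resp = await client.bulk({ refresh: true, body });
      if (resp.errors) console.error('Some documents failed to index');
    }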

Created an ODBC connection so Excel could pull data from PostgreSQL and SQL Server.

Responsible for enabling analysis by producing information products; involved in research and development efforts, with traditional programming (SAS, SQL, R, and PostgreSQL) and business intelligence (e.g., ELK) experience for creating dashboards.

Experience installing and configuring the ELK stack.

Determined which feeds and Mirth Results data were useful for metrics, and developed a Mirth Connect (data exchange interface engine) channel to automate metric-shipping feeds to Elasticsearch.

Used monitoring and logging tools built on Mirth Connect, Elasticsearch, and Kibana.

Indexed and searched/queried a substantial number of documents in Elasticsearch, created a Kibana dashboard for sanity-checking the data, and worked with a Kibana dashboard for overall build status with drill-down features.
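The kind of aggregation backing such a per-facility panel might look like the following, continuing from the client set up above (index and field names are assumptions):

    // Count messages per facility; no hits needed, aggregations only.
    const resp = await client.search({
      index: 'clinical-messages',
      size: 0,
      aggs: {
        per_facility: { terms: { field: 'facility.keyword', size: 50 } },
      },
    });
    // resp.aggregations.per_facility.buckets -> [{ key, doc_count }, ...]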

Created data discovery views, visualizations, and dashboards in Kibana for quick analysis of healthcare data.

Good experience providing production support, maintaining systems and resolving issues while on the on-call rotation.

Experienced in using ServiceNow and BMC Remedy to create tickets such as Request, Incident, and Change Management.

Prime Healthcare - Ontario, CA May 2016 - June 2017

Software Developer / Data Analyst

Worked as a Software Developer and Data Analyst, preparing requested data sets for modeling. Analyzed data sets using elementary statistical techniques and prepared data summary reports. Assisted the lead developers and engineers in modeling error rates correlated by product.

Technical Environment: NextGen Mirth (Match 1.5, Result 2.1, Connect 3.3/3.4/3.5), Orion Rhapsody, HL7, MSSQL Server 2008/2012, Oracle, PostgreSQL 9.X, MongoDB, SSRS, Power BI, ELK 6.3, JavaScript, HL7v6, PowerShell, REST API, SOAP, HTML, XML, Tortoise Git, SAS, VB, MS Excel, MS Access, Microsoft Office, Excel ODBC, Microsoft Azure, Windows.

Responsibilities:

Worked very closely with healthcare interoperability and messaging standards such as HL7, C-CDA, HIPAA, and EDI across domains including Blood Bank, ELR, Microbiology, Transcription, Laboratory, Pathology, Radiology, and Patient Access.

Developed interfaces with HL7 V2.x and V3.x messaging requirements for the submission of electronic laboratory reporting to local and state public health agencies.

Worked with healthcare IT standards such as HIPAA, LOINC, SNOMED, and CPT-4/ICD-9; HL7 message types (ADT, ORU); and ELR and the challenges of healthcare data exchange (HIE).

Expertise in HL7 inbound and outbound interfaces through data exchange interface engines such as Mirth Connect and Orion Rhapsody; managed various interface message types (ADT, SIU, ORU, ORM, etc.).

Manipulated HL7 data in XML transmissions to match vendor specifications using JavaScript.

Designed and configured network architecture for Bridges/Ensemble, Orion Rhapsody, and Mirth Connect servers for new customized HL7 interfaces.

Created data exchange interfaces with the help of JavaScript, XML, HTML, and REST APIs on Mirth Connect to map data between outside systems such as Cerner and Ensemble and exchange HL7 messages.

Responsible for working directly with IT to implement any changes to the interfaces, including external EDI.

Developed and implemented a new EDI interface with the help of HL7 and converted historical data.

Conducted ongoing quality control processes associated with programs such as ELR production data and reconciliation of daily ELR files.

Retrieved electronic messages and transformed them into other formats as required using Orion Rhapsody.

Created and extracted clinical data tables from SQL, text, CSV, and Excel files into SAS using SAS tools such as SAS/ACCESS, INFILE, and the LIBNAME engine.

Scripted SAS programs using Base SAS and SAS/MACRO for ETL purposes, transferring and converting finance data from one Excel file to another for further analysis, and created global and local variables.

Created customized figures for clinical study reports as well as ad-hoc requests using the SAS procedures GPLOT and GCHART.

Performed trend analysis and checked interaction p-values on the separated and combined datasets using the results of regression analysis, and made the programs reusable with SAS MACRO.

Implemented the waterfall Software Development Life Cycle (SDLC) methodology for the design, development, implementation, and testing of various SAS modules.

Worked with financial data, such as analyzing salary sheets and inpatient and outpatient billing.

Analyzed clinical trial data and generated tables, listings, and graphs using MACRO, GRAPH, and SQL.

Generated summary tables, listings, and graphs to support clinical study reports using Base SAS, SAS/STAT, SAS/MACRO, SAS EG, and SAS/GRAPH.

Implemented CDISC SDTM and ADaM standards, and generated tables, figures, and listings to support the statistical analysis of clinical trial data.

Summarized the results of complex data and statistical analyses, and presented reports to the laboratory to support decisions on future operations and manufacturing orientation.

Built macros and created macro variables using %MACRO and %MEND, and used DATA _NULL_ to help generate analysis data sets and create a specified structure.

Verifacts Services Pvt Ltd, India Jan 2013 - Jan 2015

Senior Data Analyst

Participated as an SQL analyst on the team, responsible for processing data from many disparate sources to be used for analysis and various other APIs.

Technical Environment: Oracle, MSSQL, MS Visio, Excel Vlookup, Pivot tables, Excel-VB, FTP (File transfer protocol), MS Office, Windows/Linux

Responsibilities:

Provided forms item analysis and summary statistics reports in Excel, reading data from different sources such as .csv, Excel, and tab-delimited files.

Gathered requirements for, designed, and developed the CVWS internal tool.

Performed independent research and analysis activities and coordinated with business partners.

Used Agile methodology, worked with developers, and was responsible for assigning tasks.

Constructed and deconstructed SQL queries per business requirements; created predefined queries for department performance reports.

Researched new and emerging technologies for data storage and reporting analysis, implementing and promoting best practices.

Proficient in maintaining and configuring Microsoft SQL Server database instances with SQL Server Integration Services (SSIS) and Reporting Services (SSRS).

Extracted data from Oracle using SQL Pass-through facility and generated reports.

Imported and exported reports from/to the CVWS tool for employment verification.

Prepared invoices for different clients based on contracted prices and supported clients with ad-hoc reports.

Updated databases daily for multiple clients and maintained records for incoming/outgoing cases.

Proficient in exporting and importing reports with FTP.

Created advanced formulas, VLOOKUPs, pivot tables, charts, graphs, and custom reports.

Bank of Maharashtra, India Jun 2012 – Jan 2013

Data Engineer

Served as a Financial Analyst for the group, responsible for processing accounts data from many disparate sources to be used for statistical analysis, and created various graphs.

Technical Environment: SAS/Base, SAS/STAT, SAS/GRAPH, SAS/MACROS, SAS/ACCESS, SAS/CONNECT, SAS/ODS, SAS/SQL, MS-Office, Mainframe/UNIX.

Responsibilities:

Understood and updated existing code, and created new code using Base SAS, SAS/MACROS, and SAS/SQL to implement new requirements.

Expertise in statistical modeling and analysis of study data by utilizing appropriate statistical methods.

Extracted data from Oracle using SQL Pass-through facility and generated reports.

Modified existing SAS programs and created new programs using SAS macro variables to improve the ease and speed of modification as well as the consistency of results.

Generated reports in user-required formats using ODS and PROC REPORT.

Validated SAS data sets using SAS procedures such as PROC MEANS, PROC FREQ, and PROC UNIVARIATE.

Analyzed and implemented the code and table changes to improve performance and enhance data quality.

Read data from different sources such as .csv, Excel, and tab-delimited files.

Compared the source data with historical data to produce statistical analyses.

Performed transformations such as Merge, Sort, and Update to get the data into the required format.

Created an understandable Excel template to start data mapping for each line of business.

Extensively utilized SAS/BASE, SAS/SQL, and SAS/MACRO.

Involved in project review meetings with respective business SMEs.


