
Data Quality PL/SQL

Location:
San Ramon, CA, 94582
Posted:
September 19, 2023


San Ramon, CA *****

925-***-****

adztc8@r.postjobfree.com

SUMMARY

Bhaskar Lakshmikanthan is Director of Enterprise Data Architecture at R1 RCM. Bhaskar has over 18 years of IT experience in the healthcare industry (payer, plan and medical groups). Seasoned Data Architect adept at understanding mandates, developing plans and implementing enterprise-wide solutions. Complex problem-solver with an innovative approach. Ready to bring over 18 years of progressive experience to a challenging new role with growth potential.

SKILLS

Databricks, DLT tables, DQ validation, Snowflake, Data Vault modeling, Synapse, Tableau, Collibra Data Quality, Collibra Data Governance

Languages/Tools:

• C, C++, Java, JSP

• PL/SQL, SQL, XML, XSLT

• Erwin, NoSQL

• Cognos series, Unix shell scripts

• Informatica Data Quality 9.6 and above

• Informatica PowerCenter 9.1 and above

Databases/Software Appliances:

• Oracle 10g, 9i, 8i, 7.13

• DB2 UDB 7.1

• SQL Server on Azure

• Hadoop, MapR, Hive, HBase, Cassandra

• IBM InfoSphere, Microsoft Azure, Apache NiFi

TRAINING:

• Big Data (Cloudera) training (Hadoop, MapReduce, Hive, Pig, HBase, Sqoop, YARN, Scala, Python, Spark, ZooKeeper) at Edureka


EXPERIENCE

August 2021 - Current

DIRECTOR, ENTERPRISE DATA ARCHITECTURE, R1 RCM, Murray, USA

• Lead a team of architects responsible for the architecture and design of the data analytics platform

• Lead data strategy roadmap design and data strategy walkthroughs with the leadership team

• Responsible for setting up Data Governance and Data Quality from the ground up using Databricks and Collibra Data Quality

• Responsible for setting up the data obfuscation process using customized Python code
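The obfuscation work above used customized Python code; a minimal sketch of one common approach (deterministic hashing of join keys plus masking of sensitive strings) might look like the following. The field names and salt handling are illustrative assumptions, not the production implementation.

```python
import hashlib

def pseudonymize(value: str, salt: str) -> str:
    """Deterministically hash an identifier so joins still work
    across tables, while the raw value is never exposed."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:16]

def mask(value: str, keep_last: int = 4) -> str:
    """Mask all but the last few characters of a sensitive string."""
    if len(value) <= keep_last:
        return "*" * len(value)
    return "*" * (len(value) - keep_last) + value[-keep_last:]

def obfuscate_record(record: dict, salt: str) -> dict:
    """Apply per-field obfuscation rules to one record.
    Field names here are hypothetical, not the real schema."""
    out = dict(record)
    out["member_id"] = pseudonymize(record["member_id"], salt)
    out["ssn"] = mask(record["ssn"])
    return out
```

Because the hash is salted but deterministic, the same member ID always maps to the same token, so obfuscated tables can still be joined for analytics.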

• Responsible for designing near-real-time data ingestion using Kafka

• Responsible for designing the Snowflake target warehouse for analytics using a Data Vault data model

• Responsible for architecture/design for ML (data science team)

• Responsible for designing the Kafka topic strategy

• Responsible for the Data Vault design for the enterprise

• Developed and presented new ideas and conceptualized new approaches and solutions

• Reviewed internal reports and identified areas of risk or potential cost savings

• Coordinated resources across departments to maximize productivity levels

• Developed policies and procedures to ensure compliance with corporate standards

• Developed and implemented comprehensive strategies to improve operational processes and organizational efficiency

• Identified opportunities for improvement in operational performance metrics

• Created detailed plans outlining timelines, goals, budgets, staffing needs and other requirements for projects

• Met with stakeholders to address issues and implement solutions

• Generated reports to review data and issued corrective actions for improvements

• Analyzed customer feedback data to identify trends in product performance or customer service issues

• Generated concept designs, collaborated with customers to gain feedback and directed the entire design process

• Organized data and modeled information for use in key decision-making

• Implemented ETL processes using Databricks to transform raw data into meaningful information for further analysis
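In Databricks, DQ validation on ETL pipelines of this kind is typically expressed as Delta Live Tables expectations (e.g. `@dlt.expect_or_drop`). The row-level rule logic can be sketched in plain Python; the rule names and thresholds below are assumptions for illustration, not the production rules.

```python
# Plain-Python sketch of the kind of row-level expectations that
# Delta Live Tables enforces on ingested data: each rule is a
# predicate over one record, and failing rows are quarantined.
RULES = {
    "member_id_present": lambda r: bool(r.get("member_id")),
    "paid_amount_non_negative": lambda r: r.get("paid_amount", 0) >= 0,
}

def validate(rows):
    """Split rows into passing rows and failing rows, tagging each
    failure with the names of the rules it violated."""
    good, bad = [], []
    for row in rows:
        failed = [name for name, rule in RULES.items() if not rule(row)]
        if failed:
            bad.append((row, failed))
        else:
            good.append(row)
    return good, bad
```

The quarantined rows, with their violated-rule names, are what a scorecard or steward workflow would then report on.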

• Performed exploratory data analysis on structured and unstructured datasets to identify patterns and trends in the data

• Participated in code reviews of other developers' work to ensure high quality standards were met

November 2017 - July 2021

LEAD DATA ARCHITECT, Kaiser Permanente, Pleasanton, USA

May 2017 - November 2017

LEAD DATA QUALITY ENGINEER/ARCHITECT, Blue Shield of California, San Francisco, USA

• Prepared technical documents describing design approaches, implementation details and testing strategies

• Integrated automated tests into the continuous integration pipeline to ensure reliability of deployed solutions

• Cleaned and manipulated raw data

• Created graphs and charts detailing data analysis results

• Assisted in developing new procedures for processing case data efficiently

• Ensured compliance with relevant laws regarding protection of sensitive information

• Improved quality of data by producing coherent definitions and data-naming standards

• Responsible for configuring and setting up security-related tags for the data tagging process on the Azure cloud using the Waterline tool

• Designed and set up the production process in the data pipeline for auto-tagging using Waterline

• Configured the interface to the Waterline and Collibra tools and ran the production process

• Responsible for setting up Collibra on the Azure cloud

• Set up the Data Governance process on Azure

• Set up the Data Quality process on the Azure cloud using Waterline and Trifacta

• Set up and helped business analysts use and run Hive SQL

• Helped and supported business stewards in resolving data issues

• Responsible for the data architecture model and roadmap for end-to-end data and workflow processes for the enterprise

• Configured the interface between Waterline and Atlas for sharing metadata tag information

• Configured Apache NiFi for data loads from files, databases, etc., through the data pipeline to the target

• Collected, outlined and refined requirements, led design processes and oversaw project progress

• Identified, protected and leveraged existing data

• Developed and managed enterprise-wide data analytics environments

• Created conceptual, logical and physical data models for use in different business areas

• Created and implemented complex business intelligence solutions

• Built test plans and scenarios to validate accuracy of data being processed

• Identified gaps between the current state of the system and the desired future state

• Collaborated with stakeholders on gathering requirements for dashboard reporting needs

• Worked closely with software engineers to design efficient database schemas and queries

• Investigated problems related to the production environment, including performance issues and query tuning

• Analyzed business requirements for data solutions to ensure compliance with enterprise architecture guidelines


October 2016 - February 2017

DATA ARCHITECT, MedAmerica, USA

January 2013 - September 2016

LEAD DATA QUALITY ENGINEER/ARCHITECT, Blue Shield of California, San Francisco, USA

• Responsible for setting up the COE for Data Quality and Data Governance at the enterprise level

• Developed Data Quality workflow routines using the Informatica tool for data steward corrections and governance lead approval

• Responsible for leading the data quality team in designing, developing and implementing the solution on time

• Developed and maintained data pipelines to ingest and store large volumes of data from multiple sources

• Redesigned the existing data warehouse

• Proactively identified opportunities for data quality operational process improvement and worked cross-functionally to resolve issues

• Proactively identified opportunities for data quality process methodology for projects at the enterprise level

• Provided technical leadership to drive successful completion of all Data Quality projects

• Responsible for evaluating and implementing new technologies into the current Data Quality process

• Acted as approver on Data Quality related designs, design documents and scripts

• Reviewed Development, QA and UAT Data Quality testing results to ensure quality

• Gathered requirements for the Data Governance process workflow

• Designed the data pipeline for daily incremental loads from external sources

• Created and implemented complex business intelligence solutions

• Responsible for Data Quality architecture and solution at the enterprise level

• Developed Data Quality workflow routines using the Informatica tool for data steward corrections and governance lead approval

• Responsible for leading the data quality team in designing, developing and implementing the solution on time

• Responsible for coordinating with other managers to schedule team work and manage data quality resources across projects

• Proactively identified opportunities for data quality operational process improvement, and worked cross-functionally to resolve issues

• Proactively identified opportunities for data quality process and methodology for projects at the enterprise level

• Provided technical leadership to drive successful completion of all Data Quality projects

• Responsible for evaluating and implementing new technologies into the current Data Quality process

• Acted as approver on Data Quality related designs, design documents and scripts

• Reviewed Development, QA and UAT Data Quality testing results and ensured quality and adherence to SDLC processes

• Understood functional business processes across the entire organization

• Assisted with development and documentation of Data Quality processes and procedures as necessary

January 2007 - January 2013

SENIOR DATA QUALITY ENGINEER/ARCHITECT, Blue Shield of California, San Francisco, USA

February 2004 - December 2006

DATA ARCHITECT, Brown & Toland Medical Group, San Francisco, USA

• Self-managed Data Quality workload for multiple systems in various phases of development to meet deadlines

• Demonstrated total ownership, accountability and commitment to Data Quality deliverables

• Responsible for leading the Data Quality team on data quality architecture and designing a 'single source of truth' data quality solution for the provider system

• Responsible for designing and developing the Data Quality solution for the Provider Master Data Management (MDM) system

• Responsible for designing and implementing the Data Quality solution for the enterprise data warehouse

• Integrated Medicare and commercial provider data, and designed and developed a dashboard and reports for calculating the cost of healthcare using Hadoop (Cloudera)

• Designed and developed the process of integrating Medicare data from CMS and other sources to depict patterns of disease across region, race, age, sex and other parameters using Hadoop, Sqoop, Hive and MapReduce

• Developed a dashboard for COHC (Cost of HealthCare) comparing PM/PM and peer PM/PM by integrating data across the enterprise, using Hadoop, the MapReduce framework, HBase and Sqoop
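PM/PM (per member per month) in a cost-of-healthcare dashboard is total paid cost divided by member-months of enrollment, and the peer comparison is the variance against a benchmark. A minimal arithmetic sketch (the function and field names are illustrative, not the dashboard's actual code):

```python
def pmpm(total_paid: float, member_months: int) -> float:
    """Per-member-per-month cost: total paid claims divided by the
    number of member-months of enrollment in the period."""
    if member_months <= 0:
        raise ValueError("member_months must be positive")
    return total_paid / member_months

def pmpm_variance(own: float, peer: float) -> float:
    """Percentage difference of own PM/PM against the peer
    benchmark, as plotted in a COHC comparison."""
    return (own - peer) / peer * 100.0
```

For example, $120,000 in paid claims over 400 member-months gives a PM/PM of $300, which is 20% above a peer benchmark of $250.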

Evaluated existing processes and products to identify areas of improvement or non-conformance issues.

Investigated customer complaints related to product quality issues and identified solutions for resolution.

Performed root cause analysis of defects found during inspections or tests to determine corrective action plans.

Collaborated with cross-functional teams to develop solutions that improve overall product quality.

Performed data analysis to identify root causes of quality issues and developed corrective actions.

• Designed, developed and implemented a complex Data Quality solution from the ground up using Informatica Data Quality tool 9.1 for matching Medicare providers against the system of record (CAPS system)

• Designed and developed a complex data ingestion platform using Informatica

• Designed, developed and implemented business Data Quality rules for integrating and validating data from multiple provider systems to PIMS (McKesson system)

• Standardized and validated member and provider name and address data using Informatica Data Quality tool 9.1

• Designed, developed and implemented a Data Quality alert system (via email) for when good data goes bad, using Informatica Data Quality tool 9.1 and PowerCenter 9.1
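The "good data goes bad" alert pattern compares a monitored quality metric against its baseline and notifies stewards when it drifts. The original system was built in Informatica Data Quality and PowerCenter, not Python; the sketch below only illustrates the drift check, with hypothetical metric names and tolerance.

```python
def check_metric(name: str, current: float, baseline: float,
                 tolerance_pct: float = 5.0):
    """Return an alert message if the metric drifted beyond the
    tolerance from its baseline, else None. In a real system the
    message would be handed to an email sender (e.g. smtplib)."""
    drift = abs(current - baseline) / baseline * 100.0
    if drift <= tolerance_pct:
        return None
    return (f"DQ ALERT: {name} moved from {baseline:.2f} to {current:.2f} "
            f"({drift:.1f}% drift, tolerance {tolerance_pct}%)")
```

A scheduled job would run this after each load, emailing only when a rule's pass rate falls outside tolerance.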

• Automated manual processes by creating custom scripts and programs (Java, C++ and Unix shell scripts)

• Designed and developed the MDM solution for the provider system using IBM InfoSphere

August 2000 - February 2004

DEVELOPER, Kaiser Permanente, Walnut Creek, USA

• Administered development and production SQL Server boxes

• Developed stored procedures that validate external data coming in from vendors for data quality issues such as duplicates, typos and erroneous data; this saves the data management team a lot of time and spares data users from spending time on data quality and data integrity checks
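The vendor-data checks above were T-SQL stored procedures; comparable duplicate and format validation logic can be sketched in Python. The key name and regex pattern below are hypothetical stand-ins for the real vendor feed rules.

```python
import re

def find_duplicates(rows, key):
    """Return the values of `key` that occur more than once —
    the duplicate check the stored procedures performed."""
    seen, dupes = set(), set()
    for row in rows:
        k = row[key]
        if k in seen:
            dupes.add(k)
        else:
            seen.add(k)
    return dupes

def find_invalid(rows, field, pattern):
    """Return rows whose `field` fails a regex format check —
    a cheap stand-in for the typo / erroneous-data checks."""
    rx = re.compile(pattern)
    return [r for r in rows if not rx.fullmatch(str(r.get(field, "")))]
```

Running both checks before loading gives the data management team a rejection list instead of a manual review.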

• Enhanced the performance of many stored procedures that load data into different databases through indexing and re-indexing

• Created scripts that create tables, views and tablespaces using Oracle 8i

• Administered the Oracle database for different vendor applications and interfaces

• Migrated an MS Access database into SQL Server 2000

• Designed stored procedures with parameters for counters such as dbfilepath, dbsize, server name and dbname from system tables to estimate current and future database growth

• Estimated future database growth; these values are exported to a Cognos report that graphs the columns against the future estimate, with percentage of growth
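The growth estimate feeding that Cognos graph is a compound-growth projection from measured database sizes; a minimal sketch (the compounding assumption and units are illustrative, not taken from the original procedures):

```python
def projected_size(current_mb: float, monthly_growth_pct: float,
                   months: int) -> float:
    """Project database size forward assuming compound monthly
    growth, as fed into the growth graph."""
    return current_mb * (1 + monthly_growth_pct / 100.0) ** months

def monthly_growth_pct(size_then_mb: float, size_now_mb: float,
                       months_elapsed: int) -> float:
    """Back out the average compound monthly growth rate from two
    size measurements taken `months_elapsed` months apart."""
    return ((size_now_mb / size_then_mb) ** (1 / months_elapsed) - 1) * 100.0
```

For example, a 1,000 MB database growing 10% per month reaches 1,210 MB in two months; the inverse function recovers the 10% rate from those two measurements.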

• Performance-tuned SQL queries, stored procedures and scripts

• Logged different parameters for performance evaluation using SQL Profiler

• Based on the measurements for each parameter, determined whether more memory was needed

• Developed a stored procedure for re-indexing tables on a weekly basis for optimum performance and to avoid fragmentation

• Extensively used T-SQL for queries, DTS packages and stored procedures

• Developed DTS packages to extract data from delimited flat files and transform it into destination database tables

• These transformations are based on whether the load is full or incremental

• Designed and developed multidimensional OLAP cubes for the Claims and Member subject areas and developed PM/PM reports using the Cognos tool

• Dimension models were developed using the Erwin data modeler

• Developed enterprise-wide data flow diagrams and ER diagrams for the entire business of specific subject areas using the Erwin fusion tool

Designed and developed data marts for different business groups based on their requirements

Developed stored procedure for extracting data from flat file and inserting data to different tables

• Debugged stored procedures for performance and created indexes for faster data access, thereby improving performance

Developed database for claims and encounter data for different health plans

• Scheduled jobs for extraction of data from flat file

• Extensively used Cognos for production reports and scheduled reports.

• Worked on the HIPAA project for three years

• Developed XML, XSLT and scripts to load EDI transaction data into the UDB database

• Responsible for building and maintaining Entity Relationship Diagrams (ERD) and Data Dictionaries for each data model's tables (DB2/Oracle/Access)

March 2000 - July 2000

DEVELOPER/CONSULTANT, PE Biosystems, USA

June 1998 - March 2000

CONSULTANT, BELLSOFT INC, Atlanta, USA

• Administered the Oracle database with regular backup and recovery work, and created tables and stored procedures

• Developed XML, XSL, CSS and XSLT that involved parsing the incoming data and loading it into DB2 tables

• Developed Java applications to test the data imported into the database from Washington Publishing Company

• Created different Erwin models for EDI transactions

• Helped the national HIPAA team design and develop the application inventory Oracle database

• Developed stored procedures, functions and packages for the business module

• Developed front-end Oracle web forms and reports for the inventory database

• PL/SQL was extensively used for development of procedures and triggers

• Extensively used SQL for database triggers and performance tuning

• Designed and developed data migration work for the national HIPAA team

• Worked on EDI transactions such as 837, 835, 270/271 and 276/277, transforming them from X12 format through the KPC engine into an application-readable format

• Mapping of EDI transactions to the application format was done using a proprietary application developed with Java and XML

• Designed and developed an artificial DNA database for the order entry group using Oracle

• Worked on HTML scripts for the web interface

• Responsible for building and maintaining the Entity Relationship Diagram (ERD) and Data Dictionary for each data model

• PL/SQL was extensively used for development of procedures and triggers

• Replicated a SQL Server database to Oracle 8i

• Developed stored procedures for automation of judicial council data

• Responsible for data normalization, DDL creation, DB2 object management, and backup and recovery

• Responsible for data load/data conversion, and running DB2 utilities on demand and periodically

• Developed reports using Oracle Reports for the work management and time reporting department of a bank

• Designed and developed receipts settlement and EDI error reports using Oracle Reports and Forms for a catalog company

• Worked on an e-commerce project using BizTalk, which was used to receive invoices from vendors

• Developed reports using SQL; the reports run in the night batch

• Migrated a DB2 database to an Oracle database

• PL/SQL was extensively used for development of procedures and triggers

December 1997 - May 1998

CONSULTANT, INDUSA TECH (Fire Record Management), USA

June 1996 - November 1997

SYSTEM, BAYNET SYSTEM (Bearing), Bangalore, India

December 1992 - November 1996

SOFTWARE ENGINEER, OMC COMPUTERS INC, Bangalore, India

January 1991 - September 1991

CONSULTANT/ANALYST, NOBEL SYSTEM, Bangalore, India

April 1990 - November 1990

MANAGEMENT TRAINEE, HINDUSTAN AERONAUTICS LIMITED, Bangalore, India

• Designed and developed a fire records management system for the Chicago Fire Department

• This system keeps track of information related to fires, including cause of fire, injuries, number of deaths, medical aid, etc.

• Complex reports were generated using Oracle Reports

• PL/SQL was extensively used for development of procedures and triggers

• Extensively used PL/SQL for database triggers and performance tuning

• Responsible for designing and developing a product (bearing) inventory system using an object-oriented language

Developed Relational Database for storing Product and Customer information.

• Extensively used PL/SQL for database triggers

• Designed and developed an online reservation system for a travel agent

• Developed hotel management system

Developed forms for boarding details, customer details and front office details etc.,

Responsible for Data Modeling and analyzed data requirements and mapped them to business rules

• All data structures were normalized to at least third normal form

• Data validation, field edits, cross edits and referential integrity edits were enforced following business rules

• Developed DDLs for database, table, index and view creation

• Developed stored procedures and triggers, and generated reports

• Worked on digitizing City of Chicago Fire Department drawings using AutoCAD

• Worked on 3 dimensional drawings.

Analyzed market trends in order to recommend product positioning strategies and tactics.

Managed all aspects of a project lifecycle from initial conception through final delivery.

• Designed and drafted the aircraft fuselage using CATIA software

• Designed and drafted aircraft wings using CATIA and AutoCAD

EDUCATION AND TRAINING

December 1996

Bachelor of Science, Computer Applications

Multimedia Institute of Information Technology, Bangalore, Karnataka

January 1994

Diploma, Oracle

Tata Unisys, Bangalore, India

January 1993

Diploma, Unix and C

Brilliants Computer, Bangalore, India

IBM InfoSphere (MDM)

ABC

Lake House Architecture

Databricks University, Online

Snowflake Architecture

Snowflake University, Online

• Designed lakehouse architecture

• Designed DQ scripts for validation using DLT in Databricks

ACCOMPLISHMENTS

• Profiled member data using Informatica Data Quality tool 8.6/9.1 for validating uniqueness, natural keys and valid values

• Profiled provider data and claims data for duplicates and data integrity issues using Informatica Data Quality tool 8.6/9.1

• Profiled and validated the data model for the provider, member and claims subject areas using Informatica Data Quality tool 8.6/9.1

• Profiled member and nurse help data from the ALERE external vendor system for duplicates, data integrity and specific business rules

• Developed a Data Quality solution for the data marts and implemented scorecard and trend reports for the marts

• Developed Data Quality name and address validation, business rules validation and a monitoring system using Informatica Data Quality tool 9.1 for the ACO project

• Member data, claims (pharmacy, vision, dental) data and provider (Medicare and commercial) data was integrated to help both members and providers view their respective information in real time

• Developed scorecards and trending reports for business leaders and stakeholders to view the quality of the data over a period of time based on business rules

• Automated the manual Data Quality validation process using Informatica Data Quality 9.1 and PowerCenter 9.1

• Saved the company 350K per annum by automating the manual Data Quality process

• Designed and developed the Data Quality solution for Provider Master Data Management (MDM) using Informatica Data Quality 9.1

• Trained on Big data (Hadoop and MapR).

Led team to design architecture standards and roadmap for the data across enterprise, earning recognition from upper management and financial reward.

• Consistently maintained high customer satisfaction ratings

• Recognized as Employee of the Quarter for outstanding performance and team contributions


