**** ****** ***** ****, ***** Land, TX, 77479
Certified in EMR EPIC Clinical Data Model, Epic Clarity ASAP module, Caboodle Data Model, Caboodle ETL Administration.
Lead Data Architect with strong business knowledge of the Healthcare, Oil & Gas, Manufacturing, and Pharma domains.
Project management experience running upgrade projects from planning through production implementation, including developing and maintaining an enterprise-wide Business Intelligence roadmap.
Extensive working knowledge of Epic Clinical, the Clarity data model, the Caboodle data model, and Slicer Dicer.
Data integration expert with over 13 years of experience in designing dimensional data models and architecting Data warehouse/ETL solutions.
Experience in using Epic Hyperspace and solid understanding of Chronicles data structures.
Vast experience in defining the overall Business Intelligence data architecture and leading ETL initiatives through planning, designing, developing and deploying Data warehouses and data integration solutions.
Complete life cycle experience working in incremental and Agile project methodologies, with proven experience facilitating daily scrums, sprint planning, sprint reviews, and sprint retrospectives.
Certified Informatica Administrator and Developer with experience in design, development and implementation of data integration and data warehouse initiatives
Lead Informatica and Tidal Administrator responsible for software installation, upgrades, configuration, and system administration, including designing DR solutions, systems failover, grid processing, NT authentication security setup, firewall configuration, and SFTP setup.
ETL TOOLS : SAP BO Data Services 4.1/3.x, Business Objects Data Integrator, Informatica PowerCenter, Informatica Data Quality, SSIS
REPORTING/ANALYTICS TOOLS : Tableau, Webi, Cognos, Netuitive, SSRS, SSAS
DATABASES : Oracle, MS SQL Server, MS Access, NoSQL
OPERATING SYSTEMS : UNIX, Windows NT/XP, LINUX
DATABASE PROGRAMMING : SQL, PL/SQL, TOAD, T-SQL, Rapid SQL
SCHEDULING : TIDAL Enterprise Scheduler
LANGUAGES : Python, UNIX Shell Scripting
SOFTWARE (OTHER) : MS Office, Visio, HP Mercury Quality Center, Remedy
ALERT SYSTEMS : Alarm Point, NetIQ
BIG DATA ECOSYSTEMS : Apache Hadoop, MapReduce, HDFS, HBase, ZooKeeper, Hive, Pig, Sqoop, Storm, Kafka
HARRIS HEALTH SYSTEM March 2015 - Current
Lead BI Data Architect
Led the implementation of Epic Caboodle Data Warehouse which provides analytical solutions to Healthy Planet by bringing together information from various sources within and outside of Harris Health System for data management and reporting. As part of Caboodle project, implemented the ED Visits, Hospital Admissions, Orders, and Visits universes based on Epic standards. Responsible for custom Caboodle development to map data from Non-Epic sources to Epic released data models.
Played a lead role in setting up the architecture for the Slicer Dicer infrastructure, a self-service reporting tool that gives providers intuitive, customizable data exploration abilities to sift through large populations of patients. Installed Slicer Dicer and worked with cross-functional teams to roll out the Epic Ambulatory applications - ASAP, ED, Hospital Billing, and Beaker Lab data models in Slicer Dicer.
Responsible for leading the Epic Clarity, Caboodle, Slicer Dicer, and BI upgrade projects by working with cross-functional teams and providing architectural leadership, guidance, and support.
Responsible for all administration activities for Caboodle and Slicer Dicer: overseeing the Clarity and Caboodle nightly ETL processes, troubleshooting and resolving errors, working with Epic and the SQL Server DBA to improve efficiency and performance-tune jobs, managing system configurations, applying upgrades and SUs, and verifying the Slicer Dicer jumpstart processes daily.
Owned Caboodle’s overall health by collaborating with related stakeholders, including Epic, IT Technical Services, Healthy Planet, and Clarity stakeholders.
Mentored staff in developing universes using Epic Caboodle Universe Builder tool and assisted the team with Reporting Workbench, Clarity requirements and Datalink troubleshooting.
Primary architect in the development and support of the Ambulatory Quality of Care data model, which captures all objects related to patient-encounter data points for the Quality Indicators in the Harris Health Quality Monthly Board of Managers Scorecard, based on HEDIS compliance criteria for metrics such as Colorectal Cancer Screening, Breast Cancer Screening, Cervical Cancer Screening, Controlling Blood Pressure, Diabetic Eye, Diabetic Foot, A1C, and Adult and Pediatric Immunizations. This module provided Harris Health with a true self-service analytics environment.
Translated State of Texas regulatory requirements into effective reports for DSRIP (Delivery System Reform Incentive Payment), which rewards Harris Health System for meeting performance milestones on defined care delivery improvements.
Primarily responsible for developing application data and ETL solutions for enhancements to Optime, Inpatient, Willow, Orders and other modules under the Epic EMR umbrella thereby making the reports available to customers.
Executed design sessions to gather technical requirements, review, approve and communicate design artifacts with stakeholders.
Facilitated meetings and led the discussions with functional groups to develop, clarify and interpret specifications for Clinical information and financial systems reports as needed.
Involved in Reporting Workbench Administration and Epic-Crystal Reports Integration tasks.
Supported the implemented BI solutions by monitoring and tuning queries and data loads, addressing user questions concerning data integrity, monitoring performance, troubleshooting database issues, and communicating functional and technical issues.
Developed physical modeling standards and strategies ensuring data integrity, quality, consistency, security, and performance.
Responsible for the documentation of well-defined methods, procedures, programs and configurations in the delivery of application services.
Primary contributor for the Technical Architecture Design documents and technical testing strategy templates for Harris Health.
Experienced working in a fast-paced setting.
BP Dec 2012 – Dec 2014
Lead Product Engineer
Lead Middleware Product Engineer responsible for setting up the infrastructure and architecture for Tidal, an enterprise-wide scheduling tool.
Primary architect responsible for upgrading and porting applications to Informatica 9.5.1 on the Windows platform.
Designed and configured Informatica domain, repositories, and Integration services.
Implemented high availability using Informatica Grid technology and LDAP authentication.
Single point of contact with vendor support in acquiring software, licenses, patches and coordinating software bug fixes.
Created, managed, and maintained Informatica connections including Oracle, SQL Server, ODBC, Web Service Consumer, SAP/R3, SAP/BAPI, SAP BW, FTP and performed Informatica and TIDAL code migrations from DEV->STAGE->QA->PROD environments.
Used Change Management practices to promote code migrations from test to production environments.
Coordinated with UNIX and Windows teams on server patches, including safe server outages, shutdowns, and startups.
Successfully designed and managed the extraction, transformation, and load (ETL) architecture for integrating BP trading systems into an enterprise data warehouse for regulatory reporting.
Led the design and development of major cross-functional projects and initiatives and partnered with solution architects on the implementation of complex data systems.
Designed common staging and data warehouse dimensional models around the Trade, Mark-to-Market, and Exposure subject areas using Ralph Kimball dimensional modeling principles.
Responsible for data analysis and creating source to target mappings.
Created a common ETL framework, including ETL mapping templates, error-handling processes, and pre- and post-load processes such as database stats gathering and index rebuilding.
Conducted database design and code reviews.
Responsible for creating and transitioning L1/L2/L3 support model for ETL; mentoring and coaching production support team on application support.
Responsible for database tuning of 24x7 ETL processes within tight Service Level Agreements.
Developed ETL mappings using Informatica Power Center.
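The Kimball-style dimensional model described above can be sketched as a minimal star schema. This is an illustrative example only; the table names, columns, and data are assumptions, not the actual BP trading model:

```python
import sqlite3

# Minimal star schema sketch around a Trade / Mark-to-Market subject area,
# following Kimball dimensional-modeling principles (hypothetical names).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20140131
    calendar_date TEXT NOT NULL
);
CREATE TABLE dim_counterparty (
    counterparty_key  INTEGER PRIMARY KEY,
    counterparty_name TEXT NOT NULL
);
-- Grain: one row per trade per valuation date.
CREATE TABLE fact_trade_mtm (
    date_key         INTEGER REFERENCES dim_date(date_key),
    counterparty_key INTEGER REFERENCES dim_counterparty(counterparty_key),
    trade_id         TEXT NOT NULL,
    mark_to_market   REAL NOT NULL
);
""")

cur.execute("INSERT INTO dim_date VALUES (20140131, '2014-01-31')")
cur.execute("INSERT INTO dim_counterparty VALUES (1, 'Acme Trading')")
cur.execute("INSERT INTO fact_trade_mtm VALUES (20140131, 1, 'T-1001', 125000.0)")

# Typical regulatory-reporting query: MtM exposure by counterparty and date.
row = cur.execute("""
    SELECT d.calendar_date, c.counterparty_name, SUM(f.mark_to_market)
    FROM fact_trade_mtm f
    JOIN dim_date d         ON d.date_key = f.date_key
    JOIN dim_counterparty c ON c.counterparty_key = f.counterparty_key
    GROUP BY d.calendar_date, c.counterparty_name
""").fetchone()
print(row)  # ('2014-01-31', 'Acme Trading', 125000.0)
```

The key design choice in this pattern is the declared fact-table grain (one row per trade per valuation date), which keeps exposure aggregations additive across the dimensions.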
DEVON ENERGY Dec 2010 – Nov 2012
Lead Integration Consultant
Primarily responsible for the installation and administration of Tidal scheduling platform.
Installed and administered Informatica architecture and Informatica MDM.
Migrated code from Local/Central repositories to repositories in different environments
Used various de-duplication techniques, Data Quality transforms, and other logic
Responsible for designing ETL processes sourcing SAP data into operational data warehouses for various regulatory reporting needs
Responsible for gathering data requirements, performing data analysis, and creating detailed ETL design documents for each target table (fact and dimension tables)
Created high-level and detailed design documents and was involved in creating ETL functional and technical specifications
Created complex Informatica mappings for designing Type 2 dimensions, periodic snapshot fact tables and reusable Informatica mapplets consisting of common transformations
Streamlined Informatica migration processes using deployment groups
Created standardized ETL control tables for capturing data load statistics, errors and exceptions
Created pre and post load procedures such as index rebuilds, stats gather
Implemented automated daily status reports on data loads, exceptions and errors through ETL
Responsible for Unit testing, integration testing and coordinating system and User Acceptance testing.
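The Type 2 dimension loads mentioned above follow a standard pattern: when a tracked attribute changes, the current row is expired and a new version is inserted. A minimal sketch, with hypothetical table and column names rather than the actual Devon model:

```python
import sqlite3

# Sketch of a Type 2 slowly changing dimension load.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
CREATE TABLE dim_well (
    well_sk    INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    well_id    TEXT NOT NULL,                      -- natural key
    operator   TEXT NOT NULL,                      -- tracked attribute
    eff_from   TEXT NOT NULL,
    eff_to     TEXT,                               -- NULL = open-ended
    is_current INTEGER NOT NULL
)""")

def load_well(well_id, operator, load_date):
    """Apply one incoming record using Type 2 logic."""
    cur.execute("SELECT well_sk, operator FROM dim_well "
                "WHERE well_id = ? AND is_current = 1", (well_id,))
    row = cur.fetchone()
    if row and row[1] == operator:
        return  # no change in tracked attribute: nothing to do
    if row:
        # Expire the current version as of the load date.
        cur.execute("UPDATE dim_well SET eff_to = ?, is_current = 0 "
                    "WHERE well_sk = ?", (load_date, row[0]))
    # Insert the new current version.
    cur.execute("INSERT INTO dim_well (well_id, operator, eff_from, eff_to, is_current) "
                "VALUES (?, ?, ?, NULL, 1)", (well_id, operator, load_date))

load_well("W-42", "Operator A", "2011-01-01")
load_well("W-42", "Operator B", "2011-06-01")  # attribute change -> new version

history = cur.execute("SELECT operator, eff_from, eff_to, is_current "
                      "FROM dim_well WHERE well_id = 'W-42' "
                      "ORDER BY well_sk").fetchall()
print(history)
# [('Operator A', '2011-01-01', '2011-06-01', 0),
#  ('Operator B', '2011-06-01', None, 1)]
```

Keeping both effective dates and an `is_current` flag lets fact loads join on the current row cheaply while still supporting point-in-time reporting.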
ACCENTURE/SHELL TRADING Aug 2009 – Dec 2010
Responsible for developing Data Warehouse solution, sourcing the data from Endur and populating information on the crude commodity in the functional areas of Risk, Settlements and Accounting, Operations, and Deal Capture.
Responsible for developing Informatica mappings to load data from flat files to staging, and from staging to the data warehouse, using Informatica Power Center 8.1
Worked with the PM and business owner to identify their detail requirements and prepare Logical design documents.
Worked with data modeler to review and define the LDM and PDM for all data structures.
Created functional and technical design for ETL transformations.
Responsible for tuning database queries, developing best practices for ETL processes, setting up error-handling methodology, and providing production support for all scheduled data movements
Responsible for coding, testing and deploying ETL mappings and drive quality into the resulting deliverables.
Coded PL/SQL scripts for pre session and post session tasks
Created Technical design document, End to End Application Approach document and Production Run book document
Leveraged past standards and environments of Shell Blue Print to create a set of technical standards to guide design and build.
Worked with the infrastructure teams to plan Development, Test, and PROD.
Created and revised specs for the ETL batch load components, suggested and implemented improvements to the existing batch load structures, and coded and deployed batch components.
Worked with the Control-M team to schedule Informatica jobs.
Used HP Quality Center and IBM Clear Quest to create and manage the Defects.
Responsible for Informatica repository management such as creating individual and project folders, adding users to proper groups, granting appropriate privileges to the development team
Responsible for deploying Informatica code components from DEV to INT and for creating deployment package for higher environments
Proficient in SQL, T-SQL, and PL/SQL – procedures, functions, triggers, and SQL performance tuning
HP June 2007 – July 2009
ETL Data Integration Developer
Involved in the development of Operational Data Stores with SAP/BW as the primary source system feeding specific internal reporting projects for Global Material Planning
Developed ETL procedures to ensure conformity, compliance with standards, and lack of redundancy, translating business rules and functionality requirements into ETL procedures.
Responsible for development of Oracle stored procedures for index management, pre/ post load processes and UNIX scripting for workflow execution
Tasks included data quality validation to ensure consistency across all customer facing servers.
Responsible for the collection, transformation and publishing of data in various formats dictated by end-user applications
Created Analysis and Design documents and MTP (Move to Production) packages.
Responsible for scheduling, supporting, and monitoring workflows in TIDAL scheduler.
Tested and implemented software enhancements and hot fixes.
Used Change Management practices to promote software releases from development to acceptance and production environments
Responsible for monitoring and maintaining over 900 Tidal jobs
Experience in full life cycle implementation of various Cognos Business Intelligence Reporting applications
Johnson & Johnson Jun 2006 – Jun 2007
Involved in design and development of Summit Lab, data management software to support large-scale blood testing center operations.
Responsible for the design and development of NBCS, a subsystem to store blood-testing results in a centralized data center using Oracle 126.96.36.199/Oracle 8.0.5 on AIX UNIX 4.3 platforms
Assisted the development team in developing the External Run Controls (ERC) subsystem.
Performed onsite analysis of business processes of the National Blood Collection Service (NBCS).
Designed business model and screen layouts for National Blood Collection Service.
Designed and developed application logic libraries, and interface programs using PL/SQL.
Developed Test cases for Unit Testing, Integration testing and System testing.
Master of Science in Computer Science – University of Houston.
Bachelor of Technology in Electrical & Electronics Engineering – Jawaharlal Nehru Technological University
Caboodle Administration Certification
Caboodle Data Model Certification
Clinical Data Model Certification
Clarity Data Model - ASAP Certification
Certified Informatica Administrator and Developer
Trained in SSIS, SSAS, and SSRS tools.
Trained Informatica Data Quality developer
Trained Hadoop Administrator