
Power BI Data Warehouse

Location:
Crowley, TX
Posted:
March 13, 2024

Contact this candidate

Resume:

JOSEPH NKOH

ad4bqv@r.postjobfree.com / 972-***-****

Summary:

Accomplished developer-analyst (data steward) and administrator with 12 years of experience in the IT industry, with skills in data mining, analysis, edit/validation, testing and troubleshooting, production support, BI, complex HEDIS/BI reporting, MDM, database and data warehouse development, deployment of various software applications and enterprise systems, and ticketing-system upkeep. Includes 8+ years of experience in complex analyses, production support, Excel, O365, BI, Azure, HIPAA/SOX compliance, data governance, Hadoop, SQL, and big data technologies in Agile/Scrum/Jira-QMS environments. Domains: healthcare, oil & gas, retail, insurance, financial.

Bilingual in French and English

Technology Focus/ Tools

POWER-BI

AXIOM SL-Migration

RELTIO-MDM System

GITHUB/ GITLAB

Black-Box, White-Box / Regression Testing

ZEPPELIN

JENKINS

NETVIZ

DATA-WAREHOUSE

IDASHBOARD

SharePoint, CMS-data

Salesforce

JIRA QMS, CONFLUENCE

SSIS / SSAS / SSRS (Custom HEDIS Reports)

IBM/JITBIT-Ticket systems

KPI Data

Office 365 Framework

DevOps

Snowflake

Databricks

AWS -S3 BUCKETS

INFORMATICA

ORACLE SQL DEVELOPER

PostgreSQL

PL/SQL, MySQL, SSIS

MS SQL server, ORACLE DB

AZURE-Synapse / Data Factory ETL

ACCESS

EXCEL- Pivot Tables, VLOOKUP, Macros

HADOOP, SPARK, MapR, SCALA

SQOOP, HIVE-IMPALA, Airflow.

CLOUDERA

SAS - Enterprise Guide etc.

Power Automate, Power Pivot

Visual Basic

Microsoft Visual FOXPRO

ServiceNow - Ticket system- ITIL

EDUCATION

B.S. in Computer Science (in progress) - University of Wisconsin-Milwaukee

Big Data / Power BI training and Scrum Master certification in progress

Clients:

CENTENE Corporation

CUMMINS INC.

PERATON-CMS.GOV

Austin Water (City of Austin, TX)

Express Scripts - Cigna

IQVIA

P.D.A Corporate

OPTUM-UHG

HMS - Healthcare

HCA-Healthcare Syst

CHRONOLOGICAL SUMMARY OF EXPERIENCE

CENTENE Corporation. (Hire-Talent)

DATA ANALYST (Power BI.) June 2023 – Feb 2024

100% Remote USA.

Built custom-visual Power BI reports with complex parameters and ad hoc reports, using business requirements gathered from stakeholders, with data connections from the EDW. Also QA'd the deployment process from DEV to TEST and PROD.

Downloaded and configured SAS Enterprise Guide and Enterprise Miner to run and retrieve SAS code, create macros, generate efficiency reports, and perform quality control using SAS procedures.

Wrote and updated SQL code in Teradata (SQL Assistant) for RFP report building and initiatives in all client states.

Drove an initiative to import disaster data into EDWP: file preparation, conversion, code writing, import, and QA.

Created Jira user stories and tasks/sub-tasks and updated them each two-week sprint to track work progress; attended daily stand-up meetings and weekly touch-base meetings.

Strong understanding of Centene data and terminology, the various teams, Access Central, Centene University, OneDrive, SharePoint, and knowledge-share/training sessions.

Converted and imported Excel patient claim files into SAS claim files and loaded them into the SAS Enterprise work environment for analysis.

Performed user acceptance testing and regression testing as the final production step, along with code reviews, to advise stakeholders and the business team on best validation practices and demonstrate product stability.

Maintained and supported RFP data and initiatives in all client states by running queries within Teradata Studio for reporting and validation.

Performed data, logic, and metadata QC/QI and managed the update approval process (accepting merge requests and checking changes) using Jenkins, Databricks, Snowflake, DAS, Zeppelin, and GitHub.

Moved Power Pivot data and logic notebooks from GitLab to Databricks and removed duplicates within the notebooks.

Maintained, reviewed, and documented the Medicaid data dictionary for Medicaid reference pivot tables, and peer-reviewed data source request documents (MSR) in SharePoint.

CUMMINS INC. (RG-Talent)

Sr. POWER-BI / ANALYST Jan 2022 – June 2023

100% Remote USA.

Created custom-visual Power BI reports with complex parameters from the data lake, using the Data Factory ETL process, AWS S3, SharePoint, SQL servers, and other data sources for statistical analysis and Jira-QMS purposes.

Troubleshot and fixed Power BI report auto-refresh and gateway issues; manually refreshed entire report datasets and republished reports.

Supported healthcare payers and healthcare organizations using tools such as Excel, Reltio MDM, Access, SQL, and the RevUp-CARE care management system.

Used Reltio to onboard new data sources and connect to new consuming systems; designed, developed, tested, reviewed, and optimized MDM solutions.

Created relationships and hierarchies across people (payers and payees), products, and places with a connected graph.

Managed data from external providers to properly classify MDM data components (customer category, sub-category, customer ID).

Performed data quality checks in compliance with ISO, SOX, HIPAA, and company data quality procedures and standards using Jira-QMS.

Created Jira user stories and tasks/sub-tasks and updated them per epics and work-progress refinements, including managing CMS Jira-QMS, UAT/regression process documentation, and GitLab work backlogs.

Appended and merged multiple pivot tables and adjusted and created relationships between tables for proper modeling and reporting with Power Automate.

Created complex DAX measures and formulas to calculate averages, sums, and fiscal-year date-time values, with filters to return various reports based on set parameters.
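
Measures like these follow a common filter-then-aggregate pattern: restrict the rows to one fiscal year, then sum or average a value column. A minimal Python sketch of that logic (the column names, sample rows, and July fiscal-year start are illustrative assumptions, not details from any client engagement):

```python
from datetime import date

# Hypothetical claim rows; in the report these would come from the data model.
rows = [
    {"date": date(2022, 6, 15), "amount": 100.0},
    {"date": date(2022, 7, 1),  "amount": 250.0},
    {"date": date(2023, 3, 31), "amount": 150.0},
]

def fiscal_year(d, start_month=7):
    """Fiscal year label for a date, assuming the year starts in July."""
    return d.year + 1 if d.month >= start_month else d.year

def measure(rows, fy, agg=sum):
    """Mimic a filtered measure: aggregate `amount` over one fiscal year."""
    values = [r["amount"] for r in rows if fiscal_year(r["date"]) == fy]
    return agg(values) if values else 0

print(measure(rows, 2023))  # FY2023 total: the July 2022 and March 2023 rows
```

In DAX the same idea is typically expressed with CALCULATE over a date filter; the sketch only shows the underlying logic.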

Migrated and converted data files daily from AWS S3 buckets into SharePoint, from which BI reports refresh at set times.

Actively participated in Power BI subscriptions and cloud analytics best practices for report migration from on-premises to Azure.

Created, commented on, and updated Jira user stories in DevOps and documented best user practices for Confluence and Jira.

Ran model SQL queries (Scala and SQL) on Snowflake and Databricks for data conversion and UAT/regression testing to ensure data quality.

Integrated data using SSIS into Power BI Desktop for transformation and creation of automated Power BI reports.

SQL Data Engineering - Sr. Adviser (Kforce)

Express Script – Cigna Feb 2021 - Dec 2021

100% Remote USA.

Developed Oracle database objects, including tables, views, and materialized views, using Oracle SQL Developer / DMV and Spark SQL queries to ensure data integrity and correctness against CMS data.

Designed SQL procedures and modified existing code to correct errors, adapt to new environments, and improve performance within the migration pipeline.

Advised on and analyzed project requirements to find bugs and eliminate issues in a timely manner within CMS Enterprise Jira.

Created Jira stories and tasks/sub-tasks and updated them per epics and work-progress refinements.

Managed CMS Jira, Confluence, and GitLab work backlogs and code reviews; advised the engineering team on best migration practices to mitigate failures and bugs.

Performed API automation testing for integration, regression, and security using Lambda and Selenium.

Monitored Sqoop jobs (schedules, run-time errors) and validated the process from Oracle to PostgreSQL through Hive, with Spark as the processing platform and Airflow/NiFi for scheduling and monitoring.

Validated Oracle Developer databases: validated multiple Safire databases (15) against the main mother database to confirm uniformity and compatibility of database tables, fields, and schemas.

Heavily documented validation processes and procedures in Jira Confluence; monitored and troubleshot failed jobs following UAT/regression testing procedures to validate database compatibility, data correctness, and quality under data governance rules.
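
The cross-database validation described above boils down to comparing schemas and row counts between a source and a target. A minimal sketch of that check, using sqlite3 to stand in for both the Oracle source and the PostgreSQL target (the table and columns are hypothetical):

```python
import sqlite3

def table_columns(conn, table):
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk)
    return [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]

def validate(src, dst, table):
    """Check that a migrated table matches the source in schema and row count."""
    issues = []
    if table_columns(src, table) != table_columns(dst, table):
        issues.append("schema mismatch")
    n_src = src.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    n_dst = dst.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if n_src != n_dst:
        issues.append(f"row count {n_src} != {n_dst}")
    return issues

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for conn in (src, dst):
    conn.execute("CREATE TABLE claims (id INTEGER, amount REAL)")
src.executemany("INSERT INTO claims VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
dst.executemany("INSERT INTO claims VALUES (?, ?)", [(1, 10.0)])

print(validate(src, dst, "claims"))  # flags the missing row
```

A production version would also compare checksums or sampled rows, but count-and-schema checks are the usual first pass.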

PERATON-CMS (Apex System)

DATA STEWARD / SQL BI DEVELOPER July 2018 – Feb 2021

100% Remote USA.

Maintained and updated documentation in Jira Confluence for all analytics and data quality management purposes.

Created Jira user stories and tasks/sub-tasks and updated them per epics and work-progress refinements, including managing CMS Jira/Confluence UAT/regression process documentation and GitLab work backlogs.

Performed user acceptance testing and regression testing as the final production step before product (software) launch, along with code reviews, to advise stakeholders and the business team on best validation practices and demonstrate product stability.

Created, maintained, and authenticated security VPNs and IDs: EUA ID, EFI, OKTA-Peraton, JIRA-QMS, Q-QIP, CMS-VPN, AWS-Auth.

Performed Tableau TestRail testing for data convention purposes using DAS and GitLab.

Maintained and supported CMS-FPS data conversions and security with EUA/EFI CMS authentications.

Ran queries within Visual Studio, DAS, Snowflake, and Databricks for comparison purposes and SSIS validation.

Ran metadata from beginning to end in MTE and PROD and attended VRR meetings for modeler approvals; also ran NetViz nodes in the Git/FPS portal UI (YEMO files).

Performed data, logic, and metadata QC/QI and managed the update approval process (accepting merge requests and checking changes) using Jenkins, Databricks, Snowflake, DAS, Zeppelin, and GitHub.

Moved Power Pivot data and logic notebooks from GitLab to Databricks and removed duplicates within the notebooks.

Maintained, reviewed, and documented the Medicaid data dictionary for Medicaid reference pivot tables, and peer-reviewed data source request documents (MSR).

AUSTIN WATER- (City of Austin TX) (KForce)

Power-BI SQL Developer April 2017 - July 2018

100% Remote USA.

Created custom-visual Power BI reports with complex parameters from AWS S3, SharePoint, SQL servers, and other data sources for statistical analysis purposes (utility data).

Appended and merged multiple pivot tables and adjusted and created relationships between tables for proper modeling and reporting with Power Automate (utility data).

Created complex DAX measures and formulas to calculate averages, sums, and fiscal-year date-time values, with filters to return various reports based on set parameters.

Actively participated in Power BI subscriptions and cloud analytics best practices for report migration from on-premises to Azure (utility data).

Created, commented on, and updated Jira user stories in DevOps and documented best user practices for Confluence and Jira-QMS.

Ran model SQL queries (Scala and SQL) on Snowflake and Databricks for data conversion and UAT/regression testing to ensure data quality.

Integrated data using SSIS into Power BI Desktop for transformation and creation of automated Power BI reports.

Data Analyst - Stewardship / Tech Support (PDA Corporate)

P.D.A Corporate Jan 2014 - March 2017

DFW- USA

Built various types of reports and performed basic to complex claim data integrations and analyses (Excel, BI visuals, macros, SSIS packages, etc.) using sales and claim data within Salesforce and SQL Server, the data warehouse, SAS scripts, cloud databases, and other lock-servers.

Used Informatica PowerCenter to extract data from heterogeneous sources, transform the data per business requirements, and load it into a target database or warehouse.
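
The extract-transform-load flow described above can be sketched in plain Python, with sqlite3 standing in for the target warehouse (PowerCenter itself is a graphical tool; the source names, table layout, and rounding rule here are invented for illustration):

```python
import csv, io, sqlite3

# Two heterogeneous "sources": a CSV extract and an API-style list of dicts.
csv_source = io.StringIO("id,amount\n1,100.50\n2,75.25\n")
api_source = [{"id": 3, "amount": "42.00"}]

def extract():
    # Pull records from each source as uniform dicts.
    yield from csv.DictReader(csv_source)
    yield from api_source

def transform(record):
    # Business rule (illustrative): normalize types and round to cents.
    return int(record["id"]), round(float(record["amount"]), 2)

def load(conn, records):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", records)

warehouse = sqlite3.connect(":memory:")
load(warehouse, (transform(r) for r in extract()))
total = warehouse.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)
```

The generator pipeline mirrors how an ETL tool streams rows from source to target without materializing the full dataset.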

Mined, converted, parsed, closed, and recreated sales iDashboard data into Power BI visuals using SQL queries in the SQL/Salesforce database.

Responded to and resolved technical network, software, and Salesforce issues from employees across the company through the JitBit ticketing system.

Wrote and adapted SQL queries and SAS scripts from JitBit ticket summaries to create visual reports, HEDIS reports, and Power BI reports, and performed pipeline analysis, server monitoring, updates, and recovery in line with SOX compliance.

SAS Analyst / Data Steward (Apex System)

Optum- UHG June 2013 - Dec 2013

Remote USA.

Downloaded and configured SAS Enterprise Guide to run SAS code.

Used SQL queries to mine, format (data cleaning), and map claim records from the SQL database.

Created macros, generated efficiency reports, and performed quality control using SAS procedures.

Converted and imported Excel patient claim files into SAS claim files and loaded them into the SAS Enterprise work environment for analysis.

Created ServiceNow (SN) tickets and subtasks to escalate various stages of SAS jobs.

Created basic to complex Excel spreadsheets, Crystal/custom reports, and pivot tables based on business or end-user requirements, using claims data from SQL Server and data warehouses.

Performed unit/code and software testing, pipeline analysis, and editing and validation of migrated data for accuracy and SOX compliance.

Data Mining / Jr. Azure Analyst (US Tech Solutions)

HMS-Healthcare Jan 2013 - April 2013

Irving, Texas

Integrated and imported large volumes of patient claims data from MS Excel into MS Access using SSIS packages, and converted text file formats into Access file formats.

Analyzed existing patient insurance claims data to identify faults such as duplicate records, mismatched patient Medicaid and pharmacy records, and unpopulated fields in patient records.
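
A toy version of that kind of claims audit can be written in a few lines; the record layout, field names, and fault messages below are invented for illustration, not the actual HMS data model:

```python
from collections import Counter

claims = [
    {"claim_id": "C1", "member_id": "M1", "pharmacy_id": "P1"},
    {"claim_id": "C1", "member_id": "M1", "pharmacy_id": "P1"},  # duplicate
    {"claim_id": "C2", "member_id": "M2", "pharmacy_id": ""},    # unpopulated field
]
medicaid_members = {"M1"}  # M2 has no matching Medicaid record

def audit(claims, medicaid_members):
    """Flag duplicate claims, empty fields, and Medicaid mismatches."""
    faults = []
    counts = Counter(c["claim_id"] for c in claims)
    faults += [f"duplicate claim {cid}" for cid, n in counts.items() if n > 1]
    for c in claims:
        if any(v == "" for v in c.values()):
            faults.append(f"unpopulated field in {c['claim_id']}")
        if c["member_id"] not in medicaid_members:
            faults.append(f"no Medicaid match for {c['claim_id']}")
    return faults

print(audit(claims, medicaid_members))
```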

Prepared SAS datasets (with pharmacological/340B/Rx data) to perform complex and sophisticated analysis, statistics, and storage on the Azure platform.

Used MS Access to build a database with basic tables to store data and forms to view, add, and update data in tables within the Azure platform; also produced printable Epic reports with specific layouts.

Used MS Excel to identify and remove duplicate records, reorganize field records, and populate patient pharmacy records to match Medicaid records.

Designed basic and complex SAS code and procedures and modified existing SAS code to correct errors, adapt to new environments, and improve performance.

Downloaded, configured, and used Microsoft Visual FoxPro to systematically map patient pharmacy records to patient Medicaid records, thereby validating that the insurance claims database had no duplicate or missing records.

SQL Admin / Developer / Analyst II

HMS-Healthcare ( US Tech Solutions) May 2012 - Dec 2012

Irving, Texas

Pulled data from multiple data warehouses and sources using Power BI direct connect, ODBC connectors, and Excel VBA to create visual reports and pivot tables.

Coordinated statistical data analysis and designed and developed application components, including Oracle packages, stored procedures, triggers, extensions, views, embedded HTML, and customizations, as part of the development team.

Identified and documented detailed business rules and use cases based on requirements analysis.

Addressed faults in software and customer complaint tickets auto-generated by the Zendesk ticketing system, and updated the system after every ticket.

Recommended data standardization and usage practices to ensure data integrity.

Used BI SSIS, SSAS, and SSRS to extract, store, and retrieve data from the data engine for online analytical processing; also produced interactive data visualization products focused on business requirements.

Developed database objects, including tables, views, and materialized views, using SQL/DMV queries.

Designed SQL procedures and modified existing SQL stored procedures to correct errors, adapt to new environments, and improve performance.

Analyzed project requirements to find bugs and eliminate issues in a timely manner.

Participated in the testing process through test review and analysis, test witnessing, and certification of final output.

Produced a variety of BI/SSRS basic and interactive report types: visual reports, graph reports, tabular reports, and spreadsheets based on business intelligence.

Improved data gathering, analysis, and visualization procedures with strategic optimizations.

Analyzed, edited, and validated data for migration from T-SQL to the Hadoop environment.

Attended daily/weekly meetings and stand-up calls on Agile/Scrum sprint status and delivery status.

Participated in designing, developing, and maintaining business intelligence solutions within the company; also involved in crafting and executing queries upon request for data.

Presented information to production through reports and visualizations using SSRS and Power BI dashboards / Epic reports.

Environment: Data warehouse, SSIS, SOX compliance, Microsoft SQL Server 2017, Epic reports, Microsoft Visual FoxPro, Tableau 10.2, Microsoft Excel, Power BI, AWS, Hadoop platform, ServiceNow, Hive/Impala, Microsoft Access

Jr. Hadoop Adm. / Analyst (Infotech)

HCA - Dallas TX Jan 2012- June 2012

HCA is a leading healthcare management company focused on helping to improve the healthcare system. The projects involved were migration projects, moving a legacy system from the pivotal SQL/Oracle environment to the Hadoop environment.

Responsible for coding MapReduce programs and Hive queries, and for testing and debugging the MapReduce programs.

Used the Sqoop tool to extract data from a relational database / data warehouse into Hadoop.

Worked closely with the data warehouse architect and business intelligence analyst to develop solutions.

Responsible for performing peer code reviews, troubleshooting issues, and maintaining status reports.

Involved in creating Hive tables, loading them with data, and writing Hive queries, which invoke and run MapReduce jobs in the backend.
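
Conceptually, a Hive GROUP BY query compiles down to map, shuffle, and reduce phases. A minimal pure-Python sketch of those three phases (the department names and counts are made-up sample data):

```python
from collections import defaultdict

records = [("cardiology", 1), ("oncology", 1), ("cardiology", 1)]

def map_phase(records):
    # Emit (key, value) pairs, like a MapReduce mapper.
    for dept, count in records:
        yield dept, count

def shuffle(pairs):
    # Group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Equivalent of: SELECT dept, COUNT(*) FROM visits GROUP BY dept
    return {key: sum(values) for key, values in groups.items()}

result = reduce_phase(shuffle(map_phase(records)))
print(result)
```

Hive generates and schedules these phases automatically; the sketch only shows what the backend job computes.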

Installed and configured Hadoop cluster in DEV, QA, and Production environments.

Performed upgrade to the existing Hadoop clusters.

Enabled Kerberos for Hadoop cluster Authentication and integrated with Active Directory for managing users and application groups.

Worked with systems engineering team for planning new Hadoop environment deployments, expansion of existing Hadoop clusters.

Monitored workload, job performance and capacity planning using Cloudera Manager.

Environment: Microsoft SQL Server Data Tools 2017, SSIS, data warehouse, Tableau, Excel (pivot tables, etc.), Epic reports, ServiceNow, Hadoop ecosystem tools (Hive, Scala, Spark, Sqoop, MapR), AWS, Oracle DB, HIPAA/SOX compliance.

POCs –

Executed a POC to implement a business use case that required migrating datasets from SQL Server to Hive and migrating a few SQL Server programs to Spark/Scala applications. As part of the POC, several partitioned Hive tables were built, along with master tables to store the use-case tables. T-SQL programs were migrated to Spark/Scala applications, and performance benchmarking was done against the legacy application.

As part of Hadoop and Spark adoption, participated in several proofs of concept for evaluating and benchmarking the big data technology stack and sample use cases.

Executed a streaming proof of concept in which streaming data was ingested and processed using Kafka and Spark Streaming, and the data was saved into HDFS and Hive tables. As part of the POC, real-time service logs were streamed into a Hive table using the Kafka and Spark Streaming framework.
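
The micro-batch model behind that streaming POC can be illustrated with a pure-Python stand-in for Kafka plus Spark Streaming (the timestamps, log levels, and 10-second window below are invented; a real pipeline would consume from a Kafka topic and append each batch to HDFS/Hive):

```python
from collections import Counter

# Simulated log stream: (timestamp_seconds, log_level); Kafka would supply these.
stream = [(0, "INFO"), (3, "ERROR"), (7, "INFO"), (12, "ERROR"), (14, "ERROR")]

def micro_batches(stream, window=10):
    """Group events into fixed time windows, like Spark Streaming micro-batches."""
    batches = {}
    for ts, level in stream:
        # Events sharing a window index land in the same micro-batch.
        batches.setdefault(ts // window, Counter())[level] += 1
    # In the real pipeline each batch would be written to an HDFS/Hive table.
    return batches

print(micro_batches(stream))
```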


