
Power BI Developer

Location:
Lakeville, MN
Salary:
$50/hour
Posted:
December 03, 2023


Resume:

Kavitha Revoori

ad1nw6@r.postjobfree.com

612-***-****

SUMMARY:

8+ years of strong experience in Business and Data Modeling, Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration, MDM, NoSQL, Metadata Management Services, and Configuration Management.

Working experience as a Data Analyst, BI Analyst, and Python Developer, with a solid understanding of evaluating data sources and data warehouse concepts using SQL, Tableau, Python, SQL Server, Oracle, Snowflake, Databricks, ADF, Snow Python, SAS, OLAP, JIRA, and SDLC.

Hands-on experience with Snowflake utilities (SnowSQL, Snowpipe) and big data modeling techniques using Python.
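
As a minimal illustrative sketch of this loading pattern, a file can be staged and copied into Snowflake from Python with the snowflake-connector-python package; the account, credentials, and table names below are placeholders, not taken from any actual engagement:

import snowflake.connector

# Connect to Snowflake; every identifier here is a placeholder.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="...",            # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
cur = conn.cursor()
# Stage the local file into the table stage, then COPY it in. Snowpipe
# automates this same COPY pattern for continuous loads.
cur.execute("PUT file:///tmp/orders.csv @%ORDERS")
cur.execute("COPY INTO ORDERS FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
cur.execute("SELECT COUNT(*) FROM ORDERS")
print(cur.fetchone()[0])
conn.close()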

Hands-on experience building complex Tableau dashboards that provide insight into the data.

Accomplished various tasks in a big data environment involving Microsoft Azure Data Factory, Azure Data Lake, and SQL Server.

Proficient in the requirements management life cycle, including requirements gathering, elicitation, documentation, validation, and tracking.

Keen on learning the newer technology stack that Google Cloud Platform (GCP) adds.

Able to work in both GCP and Azure clouds in parallel.

Familiar with GSA, which provides a wide variety of data sets that can be freely used, reused, and redistributed by anyone.

Experience creating complex Spotfire dashboards and reports using TIBCO Spotfire Professional.

Created Cosmos scripts that pull data from upstream structured streams and apply business logic and transformations to meet requirements.

Hands-on experience with big data tools such as Hadoop, Apache Spark, Hive, Pig, PySpark, and Spark SQL.

Developed and maintained data pipelines on the Azure analytics platform using Azure Databricks, Python, and PySpark.
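
A minimal sketch of such a pipeline in PySpark, assuming a Databricks-style environment; the lake paths and column names are illustrative placeholders:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_pipeline").getOrCreate()

# Read raw files from the lake (placeholder ADLS path).
raw = spark.read.option("header", True).csv("abfss://raw@mylake.dfs.core.windows.net/claims/")

# Cleanse and aggregate, then write a curated layer back to the lake.
curated = (
    raw.filter(F.col("claim_status").isNotNull())
       .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
       .groupBy("provider_id", "service_date")
       .agg(F.sum("paid_amount").alias("total_paid"))
)
curated.write.mode("overwrite").parquet("abfss://curated@mylake.dfs.core.windows.net/claims_daily/")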

Experience with TriZetto QNXT system implementation; claims and benefits configuration setup and testing; inbound/outbound interfaces and extensions; load and extraction programs involving HIPAA 834 and proprietary-format files; and reports development.

Proficient in analysis, design, and implementation of data warehousing/BI solutions using Tableau, Alteryx, Informatica, MicroStrategy, and database technologies.

Worked on inventory, savings, retail, and usage reports, and designed interactive dashboards with KPIs, sub-reports, graphs, scales, top/bottom usage categories, etc.

Transformed and loaded data into Azure SQL Database and maintained data storage in Azure Data Lake.

Vast knowledge of enterprise data warehousing, including data modeling, data architecture, data integration (ELT), and business intelligence.

Extensive experience developing T-SQL, OLAP, PL/SQL, stored procedures, triggers, functions, and packages, as well as performance tuning and optimization for business logic implementation.

Application development in VB6; well versed in 3NF and star schema (dimensional) data modeling techniques.

Experienced with the payor side of healthcare, where the payor collects revenue through premium payments or tax dollars and processes provider claims for service.

Wrote Kusto (KQL) and Cosmos (SCOPE) queries and published the data to Power BI.
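
For illustration, a KQL query can also be run from Python with the azure-kusto-data package and the result exported for Power BI; the cluster, database, and table below are placeholders:

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

cluster = "https://mycluster.westus.kusto.windows.net"   # placeholder cluster
kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(cluster)
client = KustoClient(kcsb)

query = """
Telemetry
| where Timestamp > ago(7d)
| summarize Events = count() by bin(Timestamp, 1h)
"""
response = client.execute("MyDatabase", query)   # placeholder database
for row in response.primary_results[0]:
    print(row["Timestamp"], row["Events"])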

Hands-on experience with data modeling using ERwin in both forward- and reverse-engineering scenarios.

Strong exposure to writing simple and complex SQL and PL/SQL queries.

Proficient in handling complex processes using SAS/Base, SAS/SQL, SAS/STAT, SAS/Graph, and SAS/ODS, including MERGE, JOIN, and SET statements.

Redesigned queries for implementation in SSIS (Visual Studio 2010).

Managed the departmental reporting system, troubleshooting daily issues and integrating the existing Access database with numerous external data sources, including SQL Server (Visual Studio 2010), Excel, and Access.

Experience developing MapReduce programs on Apache Hadoop to analyze big data as required.
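
As a minimal sketch of the MapReduce pattern, a Hadoop Streaming word count can be written as two small Python scripts (illustrative only; run via hadoop-streaming with -mapper and -reducer):

# mapper.py: emit <word, 1> for every word read from stdin.
import sys
for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")

# reducer.py: input arrives sorted by key, so sum runs of equal words.
import sys
current, total = None, 0
for line in sys.stdin:
    word, count = line.rsplit("\t", 1)
    if word != current:
        if current is not None:
            print(f"{current}\t{total}")
        current, total = word, 0
    total += int(count)
if current is not None:
    print(f"{current}\t{total}")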

Strong experience in migrating other databases to Snowflake.

In-depth knowledge of the software development life cycle (SDLC): Waterfall, iterative and incremental, RUP, evolutionary prototyping, and Agile/Scrum methodologies, as well as data warehouse concepts such as OLTP, OLAP, star and snowflake schemas, data mapping, facts, and dimensions.

Excellent understanding of the Microsoft BI toolset, including Excel, Power BI, SQL Server Analysis Services, Visio, and Access.

Sound knowledge of the SDLC process; involved in all phases of the software development life cycle: analysis, design, development, testing, implementation, and maintenance of applications.

EDUCATION

MBA from Kakatiya University, India

TECHNICAL SKILLS:

Tools: Tableau, Spotfire, Informatica 9.6.0, MicroStrategy, Test Management Tools (QC 9.2, ALM 11), SVN, Shiny with R, Microsoft Visual Studio 2013/2010, Kusto, GCP, Mainframe.

Domains: Retail, IME (Information, Media, and Entertainment accounts), Health care.

Languages: SQL, R, Python, C, Eetlc, UNIX shell scripting, Azure Databricks, Azure Data Lake, ServiceNow.

Databases: SQL, Hive, SQL Server, MySQL, MS Access, MongoDB, Teradata, UltiPro.

Data Science: Predictive Analytics, Fraud Detection, Pattern Mining, Sentiment Analysis, Anomaly Detection, Outlier Detection.

Machine Learning Algorithms: Classification (Random Forest, K-Nearest Neighbors, Naive Bayes, AdaBoost, SVM); Clustering (K-Means, Hierarchical Clustering); Regression (Linear Regression, Logistic Regression).

Analytics: Python (NLTK, NLP, scikit-learn, NumPy, pandas, SciPy, Plotly), R.

Data Engineering: Data Mining (Python, R, SQL).

BI Tools: Power BI, Tableau, Tableau Reader, SAP BusinessObjects, OBIEE, QlikView, SAP Business Intelligence, Amazon Redshift, Azure Data Warehouse.

Operating Systems: Ubuntu, Windows.

WORK EXPERIENCE

Power BI Developer

Molina Healthcare, CA Mar 2020 - Present

Responsibilities:

Working with Snowflake utilities (SnowSQL, Snowpipe) and big data modeling techniques using Python.

Worked on creating filters, parameters, and calculated sets for preparing dashboards and worksheets in Power BI.

Extensively used SAP Data Services Designer and Management Console.

Worked with ServiceNow data modeling to create and modify data models within ServiceNow, including defining tables, fields, and relationships between different types of data.

Worked with institutional and professional (I and P) claims data and layouts: patient information, dates of service, diagnosis codes, and procedure codes.

Worked on data analysis and visualization, using tools such as Excel, Tableau, and Power BI to analyze and visualize data from ServiceNow and other sources.

Used JCL to submit batch jobs that process large datasets, such as sorting, merging, or aggregating data.

Attended weekly status meetings and represented the MDM data platform.

Performed data analysis of meter data in MDM, generated reports, identified data mismatches, and proposed data-fix solutions.

Responsible for general reporting, training, and auditing of data in Ultimate Software UltiPro.

Worked from functional knowledge of MDM Data Manager and MDM Console, as well as MDM main tables and lookup tables.

Developed rule-based applications using Drools and Java.

Analyzed system functionality with customers, identified gaps, and supported the UltiPro Payroll and HR system to meet existing and future business requirements.

Manually executed workflows using Workflow Manager and Task Manager in Informatica PowerCenter 9.1.

Worked with QNXT Configuration Team for HIPAA Claims Validation and Verification Process.

Implemented, configured, data-mapped, and tested UltiPro payroll systems and applications.

Evaluated current QNXT system configuration for Medicare Advantage and Medicaid plans and recommended improvements to increase auto adjudication of claims.

Experience with GCP Dataproc, GCS, Cloud Functions, and BigQuery.

Performed data comparison between SDP (Streaming Data Platform) real-time data, AWS S3 data, and Snowflake data using Databricks, Spark SQL, and Python.
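
A minimal sketch of such a reconciliation in PySpark; the S3 path, Snowflake options, and table are placeholders, and the Snowflake read assumes the Spark-Snowflake connector is installed:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sdp_recon").getOrCreate()

s3_df = spark.read.parquet("s3://sdp-landing/events/")            # placeholder path
sf_df = (spark.read.format("snowflake")
         .options(sfURL="myacct.snowflakecomputing.com",          # placeholders
                  sfDatabase="EVENTS_DB", sfSchema="PUBLIC",
                  sfWarehouse="RECON_WH", sfUser="...", sfPassword="...")
         .option("dbtable", "EVENTS")
         .load())

# Full-row comparison: rows present in one source but missing from the other.
only_in_s3 = s3_df.exceptAll(sf_df.select(s3_df.columns))
only_in_sf = sf_df.select(s3_df.columns).exceptAll(s3_df)
print(only_in_s3.count(), only_in_sf.count())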

Created multiple GSA and Confidential reports; extracted information from multiple sources using Microsoft Access, merging specific information from various documents to develop a new report.

Duties included implementation of the SaaS-based Ultimate UltiPro HRIS, including the Core HR, Payroll, UTM (Time Management), and Recruiting modules.

Coded with Teradata analytical functions and Teradata BTEQ SQL; wrote UNIX scripts to validate, format, and execute the SQL in a UNIX environment.

Analyzed escalated incidents within Azure SQL Database.

Maintained large data sets, combining data from various sources using Excel, SAS Grid, SAS Enterprise Guide, Access, and SQL queries.

Expertise in extracting and loading data from Oracle, DB2, SQL Server, MS Access, Excel, flat files, and XML using Informatica and Talend.

Worked on Azure Data Factory (ADF) integration for the design and development of ETL integration.

Practical understanding of dimensional and relational data modeling concepts, such as star schema modeling, snowflake schema modeling, and fact and dimension tables.

Involved in running MapReduce jobs that process millions of records.

Wrote complex SQL queries using joins and OLAP functions such as CSUM, COUNT, and RANK.

Worked on Apache Spark SQL and DataFrames for faster execution of Hive queries using the Spark SQL context.

Used COBOL code to transform data into a format more suitable for analysis; for example, converting data from a legacy system into a format that can be loaded into a modern database.

Environment: SQL Server, SAS/Base, SAS/SQL, Collibra, SAS/Graph, Macro, Snowflake, MS Office, Informatica, ER Studio, XML, Hive, HDFS, Flume, Sqoop, Talend, R connector, Python, R, Power BI, Spark SQL, SAP, AWS, PySpark, Kusto, GCP (Google Cloud), Alteryx 11.x, Tableau, Mainframe, Payroll, Teradata V12, ASP.NET, Payor, code set data, product owner, claims using QNXT, MDM, Drools, Apache Spark, ServiceNow.

Power BI Developer

Northern Trust Bank, Chicago, IL Apr 2017 - Feb 2020

Responsibilities:

Developed a reporting solution suite with several Power BI reports related to membership data.

As part of the project, interpreted business requirements and translated them into Power BI models, reports, and dashboards; created data pipelines to extract data from the Enterprise Data Warehouse (EDW) using ETL.

Created reports such as marketing campaign and promo code performance, inventory days-of-supply analysis, and online vs. retail sales analysis by zip code, using SAP data.

Extensively used SAS/Base procedures and functions for data manipulation and creating summary data values.

Wrote SQL queries in SQL Server and AWS Redshift environments using SQL Workbench.

Developed Stored Procedures, Views and Complex Queries on Kusto and SQL Server.

Responsible for delivering proofs of concept using AWS and understanding customer business requirements.

Worked on the Talend Big Data Integration Suite for design and development of ETL/big data code and mappings.

Rewrote some Hive queries in Spark SQL to reduce overall batch time.

Utilized Python packages and Spark SQL for initial data visualizations and exploratory data analysis.
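
For illustration, a minimal exploratory-analysis sketch combining Spark SQL with pandas and matplotlib; the table path and column names are placeholders:

from pyspark.sql import SparkSession
import matplotlib.pyplot as plt

spark = SparkSession.builder.appName("eda").getOrCreate()
spark.read.parquet("/data/transactions/").createOrReplaceTempView("txns")

# Aggregate in Spark SQL, then pull the small result down for plotting.
daily = spark.sql("""
    SELECT txn_date, COUNT(*) AS n, SUM(amount) AS total
    FROM txns
    GROUP BY txn_date
    ORDER BY txn_date
""").toPandas()

print(daily.describe())                  # quick distribution summary
daily.plot(x="txn_date", y="total")      # initial trend visualization
plt.savefig("daily_totals.png")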

Used SAS Enterprise Guide for data access and ad hoc reporting.

Performed configuration management and change control in TFS, and created deployment plans to implement SSIS packages and SQL scripts from the development environment to test and production environments.

Managed workflow process in Collibra via Activiti.

Created PySpark DataFrames to bring data from DB2 to Amazon S3.
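
A minimal sketch of that data movement; the JDBC URL, credentials, table, and bucket are placeholders, and the DB2 JDBC driver jar must be on the Spark classpath:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("db2_to_s3").getOrCreate()

# Read the DB2 table over JDBC into a DataFrame (all identifiers are placeholders).
df = (spark.read.format("jdbc")
      .option("url", "jdbc:db2://db2host:50000/PRODDB")
      .option("driver", "com.ibm.db2.jcc.DB2Driver")
      .option("dbtable", "SCHEMA.ORDERS")
      .option("user", "...")
      .option("password", "...")
      .load())

# Land it in S3 as Parquet (placeholder bucket).
df.write.mode("overwrite").parquet("s3a://my-bucket/orders/")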

Validated pre-sales packages created by another SAS programmer.

Designed and developed parameterized reports, drill-down reports, drill-through reports, matrix reports, and sub-reports using SQL Server Reporting Services, according to business requirements.

Designed and developed several Power BI and SSRS reports; for example, built a dashboard using KPIs for Hospital and Professional Billing in the Revenue Cycle, created new metrics for a KPI scorecard for senior-level leadership, and published them to the Power BI App workspace.

Developed an ad hoc data analytics platform using Power BI Desktop.

Created Python scripts for data processing to ingest data into Azure Data Lake and Azure SQL Database.
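
As an illustrative sketch, a processed file can be pushed to Azure Data Lake Storage Gen2 with the azure-storage-file-datalake package; the account, filesystem, and paths are placeholders:

from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mylake.dfs.core.windows.net",   # placeholder account
    credential="<account-key-or-token>",                 # prefer Azure AD in practice
)
fs = service.get_file_system_client("raw")
file_client = fs.get_file_client("membership/2020/03/members.csv")

# Upload the processed file, replacing any previous version.
with open("members.csv", "rb") as data:
    file_client.upload_data(data, overwrite=True)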

Configured Power BI data gateways in the Power BI service for data refresh.

Developed compelling and interactive analysis reports and dashboards using the Data Analysis Expressions (DAX) language.

Developed analysis reports and visualizations using DAX functions, such as table functions, aggregation functions, and iterator functions.

Involved in deploying and configuring reports according to the requirements.

Analyzed data models, data relationships, and dependencies for the greatest optimization.

Created two to three mock-ups for report/dashboard development and enhancements, and integrated custom visuals based on business requirements using Power BI Desktop.

Developed complex SQL queries using stored procedures and common table expressions (CTEs) to support calculated fields in Power BI dashboards.
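
For illustration, a CTE-based query of the kind that can back such a calculated field, executed from Python with pyodbc; the server, database, and table names are placeholders:

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=ReportingDB;Trusted_Connection=yes;"   # placeholders
)
sql = """
WITH monthly AS (
    SELECT member_id,
           DATEFROMPARTS(YEAR(visit_date), MONTH(visit_date), 1) AS month_start,
           COUNT(*) AS visits
    FROM dbo.Visits
    GROUP BY member_id, DATEFROMPARTS(YEAR(visit_date), MONTH(visit_date), 1)
)
SELECT month_start, AVG(CAST(visits AS FLOAT)) AS avg_visits_per_member
FROM monthly
GROUP BY month_start
ORDER BY month_start;
"""
for row in conn.cursor().execute(sql):
    print(row.month_start, row.avg_visits_per_member)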

Worked with VB scripts and UNIX shell scripts for file validations.

Administered users, user groups, and scheduled instances for reports in Power BI.

Analyzed data using statistical features in Power BI to develop trend analyses.

Complete awareness and implementation of the SDLC in projects; accessed, manipulated, analyzed, and visualized data from large databases using a variety of tools and techniques, such as SQL, Python, and Power BI.

Developed advanced SQL queries and stored procedures using SQL Server Management Studio (SSMS) for use in building ETL packages and generating data reports.

Performed performance tuning on SQL queries and stored procedures using SQL Profiler, adapting different fine-tuning methodologies; performed data reconciliation, validation, and error handling after extracting data into SQL Server.

Delivered more than 100 ad hoc reports on time and to users' full satisfaction.

Environment: MS SQL Server 2017/2019, SAS, Microsoft Management Studio, Microsoft Visual Studio, MS SQL Server, Spark SQL, SAP, AWS, PySpark, Kusto, GCP (Google Cloud), GSA.

BI Developer

Axis Bank Limited, Mumbai, Maharashtra Jan 2015 – Nov 2016

Responsibilities:

For the Chat-BI project, the data covered chat-messenger conversations between customers and customer care executives.

Gathered data from various sources, including the web, manual application forms, and APIs, into an intermediate data warehouse.

Developed SSIS packages to extract data from the sources, using various transformations to cleanse the data and make it ready for analysis and visualization. Worked on the Reports module of the project as a developer on MS SQL Server 2008/2008 R2 (using SSRS, T-SQL, scripts, stored procedures, and views).

Built dashboards using SSRS and Power BI to help the business teams make cost-effective decisions.

For Market Budget Planning and Resources (MBPR), pulled data into the data warehouse using Informatica PowerCenter and SSIS packages.

Analyzed the existing SSIS packages on the production server for their run durations; performance-tuned SQL queries and stored procedures using SQL Profiler and the Index Tuning Wizard.

Analyzed the budget for each product and created reports, using SSAS and MS Excel, so the business team could invest in increasing sales per product.

Optimized stored procedures and rewrote them to withstand unknown incoming garbage data in real time.

Visualized the data using MS Excel, SSRS, and Power BI; created sub-reports, bar charts, and matrix reports.

For the Commerce Data Platform, ensured that DW/BI standards and guidelines were used to develop the information models and database designs.

Developed SQL Server Integration Services (SSIS) packages and BI solutions to extract and analyze very large volumes of data.

Consolidated and automated the extracted data and developed analysis cubes using SSAS to analyze the budget for each product and create reports for the business team, so they could invest in increasing sales per product.

Enhanced (or created, for new systems) the system logical data model to meet requirements, applying the data modeling standards and the logical data naming conventions; applied the same enhancements to the physical data model, using the data modeling standards and the physical data naming conventions appropriate for the specific back-end database.

Developed different centralized data warehouse and denormalized structures to enhance productivity, helping applications store and analyze data while improving query performance. Wrote advanced SQL queries to improve data reliability, efficiency, and quality.

Environment: Microsoft SQL Server 2012/2008, SQL Server Integration Services (SSIS) 2012/2008, Power BI, Windows Server 2008, Visual Studio 2012/2008, RDBMS, DTS, SSIS, SSRS, SSAS, T-SQL.


