
Power BI Data Specialist

Location:
Brampton, ON, Canada
Posted:
July 30, 2025

Contact this candidate

Resume:

**********@*****.***

647-***-****

MANDAR DANI

PROFESSIONAL SUMMARY

A highly adaptable and results-oriented IT professional with a strong foundation in understanding business needs and translating them into effective technical solutions. Proven ability to elicit, analyze, and document requirements across the software development lifecycle, with expertise in business process analysis, system analysis, and data management. Extensive experience collaborating with stakeholders to deliver value-driven outcomes. Brings a diverse background across the Insurance, Healthcare, ERP, and Education domains, with a track record of contributing to process improvements, data-driven decision-making, and successful project delivery.

EMPLOYMENT SUMMARY

GBC, Toronto Sep 2016 – Jun 2025

Role: Business System Analyst, Data Analyst

ETL, Data Integration projects & Power BI Reporting:

Created reports and dashboards using Power BI under the Strategic Insights & Data-Driven Decision-Making initiative. Projects: academic quality decision-making dashboard; analytics for online learning improvement.

Completed data mapping and a detailed data lineage document; designed and developed data extraction strategies.

Power Query - Conducted data analysis, including data profiling, cleansing, and validation, to ensure data quality, consistency, and integrity for reporting and decision-making purposes.

Designed and developed data models (star schema) and created interactive dashboards and reports using Power BI to visualize data, identify trends, and provide actionable insights to business users; developed DAX measures and supporting calculations.
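As an illustration of the star-schema measure pattern described above, a registration conversion-rate measure can be sketched in plain Python. The table, column, and program names here are invented for the example, not taken from the actual GBC dashboards:

```python
# Hypothetical star-schema fragment: a fact table keyed to a program dimension.
fact_registrations = [
    {"program_id": 1, "applied": 1, "registered": 1},
    {"program_id": 1, "applied": 1, "registered": 0},
    {"program_id": 2, "applied": 1, "registered": 1},
]
dim_program = {1: "Business", 2: "Nursing"}

def conversion_rate(rows, program_id=None):
    """Rough equivalent of a DAX measure: DIVIDE(SUM(registered), SUM(applied)),
    optionally filtered by the program dimension."""
    rows = [r for r in rows if program_id is None or r["program_id"] == program_id]
    applied = sum(r["applied"] for r in rows)
    registered = sum(r["registered"] for r in rows)
    return registered / applied if applied else 0.0

print(conversion_rate(fact_registrations))     # overall rate: 2/3
print(conversion_rate(fact_registrations, 1))  # "Business" only: 1/2
```

In Power BI the same logic would live in a DAX measure evaluated within the report's filter context; the function parameter here stands in for that slicer-driven filtering.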

Utilized Power BI features such as bookmarks, drill-through reports, and KPIs to enhance report interactivity and storytelling, effectively communicating data-driven insights to diverse audiences.

Produced charts and reports for stories such as demographic information, registration conversion rate, program success ranking, retention rate, satisfaction survey results, and legacy vs. current trends.

Designed and implemented robust data ingestion pipelines within Azure Data Factory, including ETL workflows that extract data from source systems, handle initial transformations such as data type conversion and basic filtering, and efficiently load data into the Bronze (raw) layer of ADLS Gen2.

Used Azure Databricks Secrets to securely store and manage credentials for accessing external data sources.

Used Databricks Notebooks to read the raw files landed by ADF from the raw staging area in ADLS Gen2. Removed or imputed nulls, identified and removed duplicate records, enforced correct data types, and standardized date formats and string cases.
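A minimal, self-contained sketch of the cleansing steps listed above (null-key removal, de-duplication, type enforcement, date and string-case standardization). This uses plain Python rather than PySpark, and the field names and formats are assumed purely for illustration:

```python
from datetime import datetime

# Illustrative raw records as they might land in the raw staging area.
raw = [
    {"id": "1", "name": " alice ", "enrolled": "2024/01/05"},
    {"id": "1", "name": " alice ", "enrolled": "2024/01/05"},  # duplicate
    {"id": "2", "name": "BOB", "enrolled": "05-02-2024"},
    {"id": None, "name": "x", "enrolled": "2024/03/01"},       # null key
]

def standardize_date(value):
    """Try a few known input formats and emit ISO 8601."""
    for fmt in ("%Y/%m/%d", "%d-%m-%Y", "%Y-%m-%d"):
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {value!r}")

def clean(rows):
    seen, out = set(), []
    for r in rows:
        if r["id"] is None:                        # drop records with null keys
            continue
        rec = {
            "id": int(r["id"]),                    # enforce correct data type
            "name": r["name"].strip().title(),     # standardize string case
            "enrolled": standardize_date(r["enrolled"]),
        }
        if rec["id"] in seen:                      # remove duplicate records
            continue
        seen.add(rec["id"])
        out.append(rec)
    return out

print(clean(raw))
```

In the actual Databricks notebooks the same operations would typically be DataFrame transformations (`dropDuplicates`, `na.drop`, `to_date`, and so on) rather than a Python loop.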

Defined Azure Databricks jobs: job chaining, run schedules, and alert events.

Delivered multiple ETL and data integration projects for government and external agencies.

Led the requirements gathering and analysis phase for ETL and data integration projects. Collaborated with SMEs and Product Owners to define project objectives and translate business needs into detailed Business Requirements Documents, Software Requirements Specifications, Data Mapping Documents, change request documents, and impact analyses, stored in Jira and Confluence.

Designed data integration processes for inbound and outbound data; completed outbound file extraction programs and inbound data ingestion, data profiling, cleanup, and mapping. Automated the integration through shell scripts, SQL*Loader, and PL/SQL code (extracting data, uploading and downloading files via FTP, updating the application database, and handling errors), written for both delta and full loads.
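The delta vs. full load distinction above can be sketched as follows. This is a simplified in-memory model with invented key and column names, not the actual SQL*Loader/PL/SQL implementation:

```python
def full_load(target, extract):
    """Full load: replace the target table's contents with the complete extract."""
    target.clear()
    target.update({row["id"]: row for row in extract})

def delta_load(target, extract, since):
    """Delta load: upsert only rows changed after the last successful run."""
    for row in extract:
        if row["updated"] > since:
            target[row["id"]] = row  # insert new key or update existing one

target = {}
day1 = [{"id": 1, "val": "a", "updated": 1}, {"id": 2, "val": "b", "updated": 1}]
full_load(target, day1)

day2 = [{"id": 2, "val": "b2", "updated": 2}, {"id": 3, "val": "c", "updated": 2}]
delta_load(target, day2, since=1)

print(sorted(target))     # keys now present: [1, 2, 3]
print(target[2]["val"])   # row 2 was updated to "b2"
```

The trade-off is the usual one: full loads are simple and self-correcting but expensive; delta loads are cheap per run but require a reliable change marker (here, the assumed `updated` column) and error handling for missed runs.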

Worked with the QA team to understand requirements and changes to existing processes under test; prepared test cases and test data, analyzed test case output, and supported the UAT team through project sign-off. Followed continuous development and integration promotion cycles (Development – Test – Pre-prod – Production).

Maintenance & enhancement of ERP System:

Worked on multiple enhancement & process improvement projects - Registration, Program configuration, Transfer Credit, Authorization workflow, Apply on behalf, Elective configuration etc.

Worked on strategic projects with tight deadlines to implement COVID-19 related changes and later revert them while retaining enhancements and fixes.

Attended requirement gathering meetings with clients, SMEs, and project managers; contributed to BRD, user story, and SRS documentation using Harvest, Jira, and Confluence.

Coordinated with Quality Assurance (QA) and User Acceptance Testing (UAT) teams, ensuring requirements traceability, developing test plans, and facilitating user testing to validate solutions and ensure successful system implementation.

Development: Converted multiple workflows into a single-page application using AJAX, HTML, CSS, vanilla JavaScript, and jQuery.

Quickly learned new technologies and tools and delivered on time: learned a page builder tool for web development and delivered multiple modules (Oracle backend) and the EVISIONS Argos reporting tool. Generated Management Information System (MIS) reports using Argos, providing data-driven insights to support strategic decision-making and business planning.

Resolved production tickets, hot fixes, and data fixes; enhanced the Banner ERP by introducing new application pages (Oracle Forms, PL/SQL procedures and functions).

Enhanced system efficiency by identifying and resolving complex technical issues. Fixed reports, optimized queries, views, recommended improvements, ensuring optimal performance levels were maintained.

Technologies: Git, Jira, Confluence, Power BI Desktop, AJAX, JavaScript, Argos Reports, Oracle, PL/SQL (packages, procedures, functions, triggers, views), Bash shell script, jQuery, PuTTY, FileZilla, Toad, Banner ERP system, Page Builder

Mercer, Toronto Sept 2015 – Sept 2016

Role: BSA / Onsite Project Lead for L&T Infotech Work Location: Toronto, ON

Participated in project planning activities, including defining project scope, identifying resource requirements, and selecting team members (including offshore resources).

Scheduled and facilitated knowledge transfer sessions with Subject Matter Experts (SMEs) to ensure effective communication and knowledge sharing between business and technical teams and to understand application priorities and business needs for bringing multiple applications into managed service mode.

Documented applications information in Knowledge Capture Documents, including application overviews, technical architectures, data dictionaries, process flows, Screen flow, Transaction flow, testing processes, Batch processes, Typical problem areas, Reports, Environments & Release information

Coordinated project execution, assigned work items to team members, tracked progress, and provided regular status reports to management, serving as the primary point of contact between business stakeholders and offshore development teams.

Implemented security fixes, migrated Classic ASP applications from IIS 6 to IIS 7.5, and supported UAT.

Worked on support and maintenance of production applications, resolving defects within predefined SLAs; modified and enhanced database procedures and functions and maintained ASP.NET code.

Value adds: Worked on front-end Classic ASP code fixes and end-to-end application testing.

Achieved a Customer Satisfaction Index of 4.8/5.0.

Technologies: Oracle 11g, PL/SQL, SQL Server 2008, T-SQL, Classic ASP, JavaScript, ASP.NET, C#, PL/SQL Developer, SSMS, TFS, VSS, Visual Studio 2012, SharePoint

L&T Infotech Ltd Aug 2014 – July 2015

Project - COE for Insurance product Duck Creek Work Location: Pune, India

Offshore Project Lead for Business Development support activity

Coordinated the COE launch for the Duck Creek insurance solutions tool and team development.

Designed training modules (domain and technical), data shredding and integration use case development, and team-building strategies. Presented Duck Creek capabilities for a partnership with Accenture.

Played a role in the first implementation (Phase 1) through the Duck Creek COE for North Bridge.

Delivered a POC for shredding data to staging tables, then transforming and loading it into the enterprise integration database.

Tools & Technologies: Duck Creek, SQL Server, SSMS, .NET, SharePoint, Hadoop

Munich Re, Toronto, ON Mar 2012 – Aug 2014

Role: System Analyst, Data Analyst & Administrator Work Location: Toronto, ON

Contribution:

Conducted requirements gathering meetings with business stakeholders to understand business needs and translate them into detailed Business Requirements Documents (BRDs) and functional specifications. Created a Knowledge Capture Document repository on the SharePoint platform, covering application overviews, technical architecture, data dictionaries, screen flows, transaction flows, batch processes, typical problem areas, and reports.

Collaborated with development teams to define and document system changes, including PL/SQL packages, procedures, functions, and views, ensuring solutions met user requirements. Updated holding tables and metadata tables of data integration process models.

Value adds: ASP.NET and C# defect fixing, Crystal Reports enhancements, end-to-end testing.

Worked closely with the testing team, creating test data and supporting user acceptance testing (UAT) to validate deliverables and ensure system quality.

Played a key role in streamlining the release processes for functional enhancements, production defect fixes, and production data fixes, including branching, code integration, and system testing for the primary policy, claims, and billing administration system serving the commercial insurance business.

Improved report query performance by 90% through database and query tuning.

Achieved customer satisfaction ratings above 4.5/5 by precise requirement analysis and timely delivery.

Technologies: Oracle, PL/SQL, ASP.NET, JavaScript, Crystal Reports 2008, TFS, SharePoint

Windsor Health Insurance, Bellingham, WA, USA Feb 2011 – Mar 2012

Project: ICD9 to ICD10 conversion, 5010 upgrades for 837-EDI Work Location: Bellingham, WA, USA

Role: System Analyst, Onsite Project Lead

Completed the project ahead of the deadline mandated by the statutory body CMS, which required all healthcare payers to accept and produce electronic (837-EDI) files in the ANSI X12 5010 format.

This requirement included updates to HIPAA real-time transaction sets such as Eligibility Benefit Inquiry/Response, Claim Status Inquiry/Response, claim acknowledgements, and medical claims with subtypes for Professional and Institutional claims and electronic remittances.

Analyzed the existing system and modified it to meet the new requirements: updated the XML reading process, altered or designed new staging and holding tables, and introduced or altered packages, procedures, functions, and views.

Defined & delivered work items for team in line with business & delivery dates.

Translated requirements into technical specifications; communicated with business, offshore, and QA teams.

Introduced new logic for error logging in batch processing and for data validation during profiling.

While the new changes were being implemented, issues were reported in the existing system. Implemented and managed fixes across both the new and old environments, keeping the current (old) system running.

Technologies: Oracle 10g, PL/SQL, PB9.0, SVN, SQL Developer

Harleysville Insurance Company, Harleysville, PA Aug 2008 – Jan 2011

Name: Group Benefit Platform for Group Insurance Work Location: Pune, India & Harleysville, PA

Role: Project Lead & BSA for L&T Infotech

Coordinated with teams located in 3 geographical locations.

Participated in requirement gathering phase, involved in brainstorming meetings.

Worked with front end team in designing DB objects as per requirements.

Involved in designing the conceptual and logical data models and developing the database; designed procedures, functions, triggers, and views.

Designed and implemented a database-driven workflow.

Implemented a complex integration in which data is exchanged between the legacy and GBP applications.

Created database deployment packages using the DB Pro tool for environments such as development, test, pre-production, and production.

Completed feasibility studies of enhancements and change requests.

Visited the client site for major releases and to resolve issues.

Technologies: SQL Server 2005/2008, InfoPath, Visual Studio 2005, Team Foundation Server, Red Gate, SSMS, SharePoint

Education

Post-Graduate Diploma in Business Management

Post-Graduate Diploma in Advanced Computing

Bachelor of Engineering (Electronics)

Certification

Certificate in Data Science – University of Toronto

Introduction to Cyber Security, Network Basics – Cisco Networking Academy

IKM Oracle 11g PL/SQL Programming
