Post Job Free

Senior SQL/MSBI/Data Analyst & Azure Specialist

Location:
Lake Dallas, TX, 75065
Posted:
May 05, 2026


Resume:

AKSHAY MOHITH KODURU

SR. SQL/MSBI/DATA ANALYST/ENGINEER 408-***-**** ************@*****.***

PROFESSIONAL PROFILE:

•Over 9 years of experience in Microsoft database technologies, including database design and development on SQL Server 2008/2012/2014/2016/2017/2019 for the Healthcare, Public, and Gaming domains.

•Highly skilled Azure Data Platform Specialist with over 4 years of experience managing, implementing, and optimizing data solutions on the Azure cloud platform. Expertise in deploying, securing, and maintaining Azure Synapse Analytics, Azure Data Factory, Power BI, Azure Machine Learning, and SQL Server-based systems, ensuring seamless and high-performance data operations.

•3+ years of experience in Snowflake, including setting up virtual warehouses, creating schemas, performance tuning with clustering keys, and implementing security roles and RBAC policies.

•Extensive experience in Database Design, Data Modeling, T-SQL Development, Developing and Implementing ETL Packages using SSIS and Complex Reports using SSRS, Power BI.

•Experience in writing Efficient T-SQL queries, Sub-queries, Dynamic queries, Stored Procedures, User Defined Functions, Database Triggers and Exception Handlers.
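The stored procedure and exception-handling pattern above can be sketched in T-SQL as follows; the procedure, table, and column names are hypothetical:

```sql
-- Minimal T-SQL sketch: stored procedure with TRY/CATCH exception handling.
-- dbo.usp_GetOrderTotal, dbo.OrderLines, and @OrderId are illustrative names.
CREATE PROCEDURE dbo.usp_GetOrderTotal
    @OrderId INT
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        SELECT SUM(LineAmount) AS OrderTotal
        FROM dbo.OrderLines
        WHERE OrderId = @OrderId;
    END TRY
    BEGIN CATCH
        -- Surface the original error context to the caller.
        DECLARE @Msg NVARCHAR(2048) = ERROR_MESSAGE();
        RAISERROR(@Msg, 16, 1);
    END CATCH
END;
```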

•Expert in configuring and creating SSIS solutions for high performance Data Integration, Data Migration and Business Intelligence Process that includes Merging Data from various Heterogeneous Data Stores, Populating Data Warehouses and Data Marts, Cleaning and Standardizing Data.

•Experience in implementing and managing Event Handlers, Package Configurations, Logging, System and User-defined Variables, Check Points and Expressions for SSIS Packages.

•Experience in managing and automating Control flow, Data flow, Events and Logging programmatically using Microsoft .NET framework for SSIS packages.

•Experience in Administering SSIS packages such as Scheduling package execution, backing up and Restoring packages, Monitoring Integration Services Performance and activity, and implementing Security considerations for SSIS Packages.

•Expertise in OLTP/OLAP system study, good experience building Star schema and Snowflake schema depending on the business data requirements.

•Hands-on experience in all phases of the Software Development Life Cycle (SDLC) for data warehousing, including requirement gathering from business users, analysis, design, creation of mapping documents, specification writing, development, testing, and implementation.

•Hands-on experience using MS Excel and configuring SharePoint sites with team projects in TFS.

•Extensive knowledge of designing, developing, and deploying various kinds of reports using SSRS against relational and multidimensional data.

•Experience in administration tasks such as data loading, batch jobs, data unloading, backup & recovery, user and application table management, upgrades, and creating databases/filegroups/files/transaction logs.

•Extensive knowledge in Tuning SQL queries, T-SQL queries and improving the performance of the Database. Experience in Administering, Managing and Troubleshooting Reporting services.

•Knowledge in developing OLAP cubes using SQL Server Analysis Services (SSAS), including defining data sources, data source views, dimensions, measures, hierarchies, attributes, calculations using Multidimensional Expressions (MDX), perspectives, and roles.

•Proficient in building interactive Power BI dashboards and data models using DAX, Power Query, and M Query.

•Familiar with Agile, Waterfall, and the Bill Inmon and Ralph Kimball methodologies.

•Excellent communication, interpersonal and analytical skills.

EDUCATION: Master of Science, California State University Fullerton.

TECHNICAL SKILLS:

Databases

MS SQL Server 2019/2017/2016/2014/2012/2008, MS Access, DB2, Mainframe, Netezza, Azure SQL.

SQL Server Tools

SSMS, Configuration Manager, Enterprise Manager, Query Analyzer, SQL Profiler, DTS, SSIS, SSAS, SSRS, Database Tuning Advisor, BI tools (Report Studio, Analysis Studio, Query Studio).

ETL-Tools

DTS, SQL Server Integration Services (SSIS), Azure Data Factory.

Reporting Tools

SQL Server Reporting services (SSRS), Crystal Reports, Power BI.

RDBMS

SQL Server 2008R2/2012/2014/2016/2017/2019.

Programming

T-SQL, PL/SQL, Dynamic SQL, MDX.

Operating Systems

Windows 10/Vista/XP/2008/2003, Windows 9x, UNIX.

PROFESSIONAL EXPERIENCE:

WSSC Water

Laurel, MD Sr. SQL/Data Warehouse/Report Developer Apr 2022 – Present

Project Title: Application Support and Migration

The Consolidated Engineering System (CES) is being migrated from the IBM Mainframe to the E-Permitting application. The Planning Division has been using the mainframe system to obtain CES information. With this migration, the Planning Division will be able to perform all functions of the CES application in a user-friendly, web-based platform (E-Permitting).

Job Duties and Responsibilities:

•Design and develop T-SQL Scripts, Procedures, and functions to implement complex business logic as part of data extraction processes.

•Work on Crystal Reports and SSRS (Report Builder) to update and create new reports.

•Coordinate with the IT/application team to implement T-SQL procedures for new projects, maintain awareness of production activities according to required standards, and provide support for existing applications.

•Troubleshot and resolved SQL query failures, data mismatches, and ETL job failures, minimizing downtime and business impact.

•Performed root cause analysis (RCA) for recurring production issues and implemented permanent fixes to improve system stability.

•Optimized complex T-SQL queries by rewriting joins, indexing strategies, and eliminating unnecessary scans, improving performance by up to 30–50%.
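One common shape of the join and indexing rewrites described above, with hypothetical table and column names:

```sql
-- A covering index lets the optimizer seek instead of scanning the base table.
CREATE NONCLUSTERED INDEX IX_Claims_MemberId_ServiceDate
ON dbo.Claims (MemberId, ServiceDate)
INCLUDE (ClaimAmount);

-- Rewriting row-by-row lookups as a single set-based join removes repeated scans.
SELECT c.MemberId, SUM(c.ClaimAmount) AS TotalPaid
FROM dbo.Claims AS c
JOIN dbo.Members AS m ON m.MemberId = c.MemberId
WHERE c.ServiceDate >= '2022-01-01'
GROUP BY c.MemberId;
```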

•Monitor permitting system performance and identify system performance bottlenecks and resolve system issues identified by end users.

•Implemented Azure Data Warehouse to consolidate data from customer databases, and online transactions. Using ADF, ingest data into Azure Data Lake Storage, process it in Azure Synapse (dedicated SQL pool), and visualize trends in Power BI dashboards used by business managers and executives.

•Migrated legacy Crystal Reports to Power BI, improving report performance and data visualization capabilities. Translated complex business logic from Crystal Reports into Power BI DAX and Power Query for enhanced data modeling and reporting.

•Improved data refresh rates and report load times by optimizing ETL processes in Power BI.

•Perform SQL Server performance optimization using tools such as SQL Server Profiler and Database Engine Tuning Advisor.

•Designed and implemented automated data pipelines using SQL and Python, reducing manual reporting effort by 40% and improving data refresh frequency from weekly to daily.

•Built and maintained forecasting models (regression, time series) to predict demand and resource allocation with 95% accuracy, directly supporting strategic planning.

•Designed and developed an enterprise SQL Server data warehouse, integrating data from ERP, CRM, flat files, and APIs.

•Implemented dimensional data models (star/snowflake) using Kimball methodology, improving query efficiency and scalability.

•Built and maintained multiple data marts (Sales, Finance, Customer) aligned with business requirements.

•Developed scalable SSIS ETL pipelines for data extraction, transformation, and loading into staging and warehouse layers.

•Developed VBA macros and Excel automation to consolidate raw data feeds from multiple sources, cutting manual processing time by 30+ hours per month.

•Implemented Change Data Capture (CDC) for near real-time tracking of data changes, enabling incremental data loads for downstream systems.
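Enabling CDC for incremental loads follows the standard SQL Server pattern; dbo.Customers is a hypothetical source table:

```sql
-- Enable CDC at the database level, then on the tracked table.
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Customers',
    @role_name     = NULL;

-- Incremental loads read only the changes captured between two LSNs.
DECLARE @from_lsn BINARY(10) = sys.fn_cdc_get_min_lsn('dbo_Customers'),
        @to_lsn   BINARY(10) = sys.fn_cdc_get_max_lsn();

SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_Customers(@from_lsn, @to_lsn, N'all');
```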

•Developed end-to-end Power BI dashboards by transforming raw data into actionable insights using Power Query, DAX, and advanced data modeling in SQL Server.

•Built and maintained large tabular data models in Azure Analysis Services, ensuring data accuracy and high performance for enterprise reporting needs.

•Developed executive dashboards in Tableau and Power BI that delivered real-time KPIs for supply chain and operations, cutting decision-making time by 25%

•Built and maintained SSIS packages for automating data extraction, transformation, and loading (ETL) processes from multiple source systems into the data warehouse.

•Engaged directly with business stakeholders to gather reporting requirements, demo dashboards, and iterate on designs to ensure alignment with business objectives.

•Utilized Python and R scripts to automate data cleaning, statistical analysis, and predictive modeling, enhancing reporting capabilities.

•Optimized SQL Server queries and stored procedures, reducing dashboard load times and improving ETL efficiency by 30%.

•Participated in cross-team initiatives focused on modernizing reporting architecture, migrating legacy reports to Power BI while ensuring data accuracy and performance.

•Collaborated with stakeholders to convert legacy SSRS reports into modern Power BI paginated reports, aligning with enterprise reporting standards and branding.

•Developed data quality checks and exception handling scripts, ensuring 99.9% data accuracy in executive reports.

•Developed PowerShell scripts to automate administrative tasks in cloud data environments, including resource provisioning and scheduled backups.

•Migrated legacy on-prem SQL Server and Oracle data marts to Snowflake, refactoring ETL logic and redesigning data models to leverage Snowflake's multi-cluster shared data architecture and separation of compute and storage.

•Participated in Agile/Scrum ceremonies (daily stand-ups, sprint planning, retrospectives), contributing to 2-week sprint cycles and achieving consistent delivery targets.

•Document all test procedures and coordinate with business analysts/users to resolve all requirement issues and maintain quality for the same.

Environment: Microsoft SQL Server 2019 Enterprise Edition, SSRS, T-SQL, IBM Mainframe, ADF, Azure Synapse, Azure SQL, Azure Data Lake Gen2, Azure ML, SQL Server Profiler, Kimball, Crystal Reports Enterprise, Power BI Desktop, ServiceNow, HTML, VBScript, ASP.NET, AWS, GCP, C#.

Microsoft

Seattle, WA Sr. SQL/ETL/Azure/Power BI Developer Mar 2021 – Mar 2022

Project Title: Business Application Group – Power BI Reports

The purpose of the project is to build Power BI reports enabling a line of business with 4,000+ employees, across multiple geographic locations, to attend a mandatory in-person training event closest to their location within a short period of time. As the solution grew in popularity across multiple organizations, we saw an opportunity to extend it to support more complex requirements.

Job Duties and Responsibilities:

•Designed SSIS Packages to extract, transfer, load (ETL) existing data into SQL Server from different environments for the SSAS cubes.

•Created SSIS packages in Solution Explorer (extract, transform, and load), writing SQL queries in data flow tasks, defining variables, and building error-handling processes and functions.

•Designed and implemented Azure Data Factory pipelines to orchestrate and automate ETL processes for cloud-based data integration. Monitored and optimized ADF pipeline performance, reducing execution time by using parallelism and activity optimization.

•Deployed packages from Solution Explorer to the SSIS catalog in Management Studio. Effectively loaded data into relational databases using the ETL tool.

•Effectively worked in production support and managed the quality assurance for SSIS packages.

•Led the implementation, maintenance, and enhancement of a robust Azure-based data warehouse and analytics platform, including Azure Synapse Analytics, Data Factory, Azure ML, Power BI, and Data Lake Storage Gen2.

•Oversaw data pipelines (ETL/ELT) developed in Azure Data Factory, integrating structured and unstructured data from investment systems into Synapse and curated zones in ADLS.

•Administered and tuned Azure Synapse dedicated SQL pools, serverless queries, and Power BI semantic models, ensuring high performance, low latency, and cost-optimized compute usage.
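Tuning a dedicated SQL pool typically starts with the table distribution choice; the fact table below is a hypothetical illustration:

```sql
-- Hash-distributing a large fact table on its join key avoids data movement
-- at query time; clustered columnstore keeps scans and compression efficient.
CREATE TABLE dbo.FactTrades
(
    TradeId     BIGINT        NOT NULL,
    AccountKey  INT           NOT NULL,
    TradeAmount DECIMAL(18,2) NOT NULL
)
WITH
(
    DISTRIBUTION = HASH(AccountKey),
    CLUSTERED COLUMNSTORE INDEX
);
```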

•Conducted monitoring and performance analysis using Azure Monitor, Log Analytics, and custom KQL dashboards to track pipeline health, query performance, and system uptime (SLA 99.9%+).

•Managed and supported SSAS Tabular models, SSIS packages, and SQL Server instances (on-prem and Azure-hosted), including patching, role-based access controls (RBAC), and scheduled jobs.

•Defined and implemented KPI frameworks to track fulfillment performance, enabling leadership to identify process gaps and achieve a 15% improvement in SLA compliance.

•Performed platform troubleshooting, root cause analysis, and escalated incident response for BI reporting failures, data refresh delays, and integration errors.

•Designed and maintained end-to-end data pipelines for ingesting and transforming structured and semi-structured data from APIs, flat files, and RDBMS into cloud-based data lakes and marts using best practices for scalability and fault tolerance.

•Led security audit activities, including periodic access reviews, key rotations, compliance validation, and data masking implementation across sensitive financial datasets.

•Collaborated with cross-functional teams to resolve production incidents, providing timely updates and ensuring SLA compliance.

•Developed CI/CD pipelines for infrastructure (Bicep), Synapse scripts, and Power BI datasets using GitHub Actions, Azure DevOps.

•Developed Python scripts to automate data extraction, transformation, and validation routines across multiple systems, improving data pipeline efficiency and reducing manual intervention.

•Created and maintained source-to-target data mapping (STTM) documents in collaboration with business analysts and data modelers, ensuring accurate field-level mapping and transformation logic across systems.

•Participated in after-hours on-call rotation, supporting business-critical trading analytics and compliance dashboards during peak market activity and release weekends.

•Applied Agile delivery practices, including sprint planning, JIRA backlog grooming, and DevOps release coordination, to support a scalable and flexible BI platform.

•Created Power BI reports and published them to an external website used by business users worldwide. Developed and maintained Power BI dashboards tailored for both internal stakeholders and external clients, ensuring data accuracy and performance.

•Managed data security protocols by implementing Row-Level Security (RLS) for secure data access across internal and external user groups. Managed database backup and recovery strategies, ensuring minimal downtime and data loss during system failures.
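At the database layer, row-level security of this kind can be sketched with a predicate function and security policy (names hypothetical; Power BI RLS is defined analogously with roles and DAX filters):

```sql
-- Inline predicate function: a row is visible only when its Region matches
-- the caller's session context.
CREATE FUNCTION dbo.fn_RegionFilter (@Region AS NVARCHAR(50))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS allowed
       WHERE @Region = CAST(SESSION_CONTEXT(N'Region') AS NVARCHAR(50));

-- Bind the predicate to the table for both reads (FILTER) and writes (BLOCK).
CREATE SECURITY POLICY dbo.RegionPolicy
ADD FILTER PREDICATE dbo.fn_RegionFilter(Region) ON dbo.Sales,
ADD BLOCK PREDICATE dbo.fn_RegionFilter(Region) ON dbo.Sales
WITH (STATE = ON);
```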

•Built status-at-a-glance visualization reports integrating SQL, Excel, and BI tools to monitor operational performance across multiple business units.

•Utilized Power BI (Power View) to create analytical dashboards depicting critical KPIs such as legal case matters, billing hours, and case proceedings, with slicers enabling end users to apply filters.

•Configured and managed ODBC connections on user’s systems to ensure seamless connectivity between applications and databases.

•Worked with Team Foundation Server (TFS) and Perforce for version control and project/issue management.

•Participated in code reviews and design discussions, promoting data quality, reusability, and pipeline observability using logging, exception handling, and alerting mechanisms.

Environment: Microsoft SQL Server 2019 Enterprise Edition, MS Access, ETL Packages (SSIS), VB.Net, Visual Studio 2019, Team Foundation Server (TFS), SSRS, T-SQL, Azure Data Factory, Synapse, Azure SQL, Azure Data Lake Gen2, Power BI, Power Apps (model driven and automate), Sybase, Rapid SQL, SSAS.

AIM Specialty Health

Chicago, IL Oracle/SQL Server/ETL/DW Developer June 2020 – Feb 2021

Project Title: Data Warehouse Upgrade & ERR-ODS Migration

The goal of this project is to upgrade all legacy ETL processes (2008) to a newer version (2017), deploy the upgraded packages to the 2016 server, and decommission the legacy processes after a successful upgrade.

Job Duties and Responsibilities:

•Part of a team of developers that successfully completed a data warehouse upgrade project involving the upgrade of 1,500 SSIS packages from the 2008 to the 2017 version.

•Involved in complex projects such as the ERR-ODS migration and Data Mart upgrade, working with end users to migrate IT processes from on-prem to Azure servers.

•Managed end-to-end ERP migration processes, including data extraction, transformation, and loading (ETL) from legacy systems to new ERP platforms, ensuring data integrity and compliance with industry standards.

•Developed, modified, and optimized hundreds of ETL/SSIS packages, converting them from the package deployment model to the project deployment model.

•Used Embedded/Dynamic SQL Queries inside the ETL packages so that they can be dynamically deployed into Dev, Test and Production Environments.

•Built SSIS Packages for integrating data using OLE DB connection from heterogeneous sources (Excel, CSV, Oracle, JSON, XML, flat file, Text Format Data, Sybase, Netezza etc.).

•Extracted and linked data from multiple sources such as Clarity, Membership, and Claims Submission systems, ensuring completeness and accuracy of Medicare and Medicaid claim data.

•Collaborated with business stakeholders to understand KPIs and deliver actionable insights via OBIEE dashboards and Oracle Data Visualizations.

•Migrated reports and dashboards from OBIEE 11g to OAS/OAC, ensuring performance tuning and functional parity.

•Built complex PL/SQL packages, procedures, and functions to support custom ETL logic, aggregations, and source-level transformations for OBIEE reports.

•Utilized Databricks and Synapse Analytics to process large volumes of structured and semi-structured data, streamlining workflows for analytics and reporting purposes.

•Utilized MedInsight analytics platform to aggregate and analyze healthcare data, providing actionable insights for cost reduction and quality improvement initiatives.

•Performed data-driven analysis using MedInsight to identify trends in healthcare utilization, optimize provider performance, and improve patient outcomes.

•Implemented robust data validation and quality control checks to identify anomalies and ensure compliance with regulatory requirements in Medicare and Medicaid claims processing.

•Worked with cross-functional teams to integrate EHR/EMR, clinical, and claims data, ensuring the seamless flow of information for improved patient care and operational efficiency.

•Collaborated with cross-functional teams to support data-driven initiatives, contributing to strategic improvements in claims accuracy, timeliness, and compliance.

•Automated recurring data tasks using Python and SQL scripting, reducing manual effort and increasing the reliability and timeliness of data pipelines.

•Designed, developed, and tested BizTalk applications to automate business workflows, integrate legacy systems, and facilitate communication between multiple enterprise applications.

•Developed and optimized complex SQL and PL/SQL stored procedures, functions, and triggers to process large healthcare datasets.

•Created Jobs Performance report that queries system tables to track the duration of each job and weekly-average duration using SSRS.
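A query of the kind behind that report; msdb encodes run_duration as an HHMMSS integer, so it is converted to seconds before averaging:

```sql
-- Average job duration over the last week, from SQL Agent history tables.
SELECT j.name AS JobName,
       AVG((h.run_duration / 10000) * 3600
         + (h.run_duration / 100 % 100) * 60
         + (h.run_duration % 100)) AS AvgDurationSeconds
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h
  ON h.job_id = j.job_id
WHERE h.step_id = 0   -- step 0 rows record the overall job outcome
  AND h.run_date >= CONVERT(INT, CONVERT(CHAR(8), DATEADD(DAY, -7, GETDATE()), 112))
GROUP BY j.name;
```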

•Served as the point of contact on the client side for monitoring ETL jobs while coordinating with the offshore team.

•Designed ETL pipelines to transform and load large volumes of claims, enrollment, and provider data from QNXT into enterprise data warehouses.

•Created SQL-based reporting solutions for Medicaid, Medicare, and Commercial health plans by extracting data from QNXT core tables.

•Collaborated with clinical and IT stakeholders to translate Epic EHR data into actionable insights for population health and care quality initiatives.

•Created SQL-based validation scripts to ensure data integrity between source EMR records and reporting data in Caboodle.

•Designed Cubes with Star Schema using SQL Server Analysis Services (SSAS). Created several Dashboards and Scorecards with Key Performance Indicators (KPI) in SQL Server Analysis Services.

•Identified the dimension, fact tables and designed the data warehouse using star schema.

•Created visualization reports and dashboards using Power BI desktop and published them to the Power BI Service.

•Performed change impact analysis during Epic upgrades to assess downstream effects on Caboodle reports and reporting layer SQL logic.

•Used TFS for Code control, Creating Areas, Iterations, Project Alerts, Issue reporting, document management, Builds, bug, change request, Issue, Task, requirement, review, Risk, Shared Steps, Test Case and Queries etc. in a DEVOPS Environment.

Environment: Microsoft SQL Server 2017 Enterprise Edition, MS Access, ETL Packages (SSIS), OBIEE 12c, Oracle Analytics Server (OAS), Oracle Analytics Cloud (OAC), VB.Net, Visual Studio, Team Foundation Server (TFS), SSRS, T-SQL, MS Visual Studio 2017, IBM System i Navigator 2009, IBM Client Access, Power BI, Sybase, Rapid SQL, AutoSys 2016, BizTalk Server 2016, SSAS.

Boyd Gaming Corporation

Las Vegas, NV SQL/SSIS/SSRS/SSAS/ETL Developer July 2018 – May 2020

Project Title: Data Intelligence Growth and Governance (DIGG) / Promotions

DIGG is a strategic initiative of Boyd Gaming Corp to centralize all data residing in separate systems and model it into one single database. This is a highly visible project at the client location, requiring someone who can help migrate data and ensure data completeness, data transformation, and data quality.

Job Duties and Responsibilities:

•Part of a team of developers that successfully completed a complex data conversion project designed to migrate records to the Enterprise Data Warehouse.

•Worked on Pinnacle EDW Integration project for manipulating and transforming the data in Marketing and Promotions Sector.

•Built stored procedures and an ETL automation process for loading data from staging tables into the target databases, handling data rejection, substitution, correction, and notification without modifying the data.

•Created complex SQL scripts and stored procedures to implement business logic and to verify the agreed historic (incremental) data brought into the EDW.

•Assisted in writing T-SQL scripts to build database objects such as tables, indexes, and views that complement the intelligence reporting needs of stakeholders.

•Replicate the transformation logic specified in the mapping document using SQL and compare the result with ETL result.

•Executed SQL conversion pre-validation and post-validation scripts to avoid data loss and to ensure that the record count and redemption amount for each source match the data loaded in the EDW.
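A post-validation check of this kind can be sketched as a reconciliation query; the control tables are hypothetical:

```sql
-- Compare per-source record counts and redemption amounts between staging
-- and the EDW; any row returned flags a mismatch to investigate.
SELECT s.SourceSystem,
       s.RecordCount   AS SourceCount,
       t.RecordCount   AS LoadedCount,
       s.RedemptionAmt AS SourceAmount,
       t.RedemptionAmt AS LoadedAmount
FROM staging.LoadControl AS s
JOIN edw.LoadControl AS t
  ON t.SourceSystem = s.SourceSystem
WHERE s.RecordCount <> t.RecordCount
   OR s.RedemptionAmt <> t.RedemptionAmt;
```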

•Responsible for fixing critical defects addressed during data conversion.

•Ensure data completeness, data quality by validating that all records, all fields and the full contents of each field are loaded.

•Utilize data engineering techniques to effectively present data flow to users and help reconcile gaps in various systems.

•Involved in creating and customizing SSAS Cubes using SQL Server Analysis Services for Hotel and Casino DataMart. Use Analysis service processing task to process the analysis cube in SSAS after DW run.

•Automated database synchronization tasks using Red-Gate tools, ensuring consistency across development, testing, and production environments.

•Created visualization reports and dashboards using Power BI desktop and published them to the Power BI Service.

•Define and enforce Azure security best practices for all environments (DEV/TEST/QA/PROD).

•Designed and optimized SQL queries and stored procedures to extract, transform, and analyze large datasets from SQL Server, Snowflake, and Azure Synapse Analytics.

•Conducted statistical analysis and hypothesis testing using Python (pandas, NumPy, SciPy) to drive data-driven decision-making.

•Conducted trend analysis and KPI reporting, identifying opportunities for cost savings and revenue growth.

•Participated in all Agile ceremonies (user story grooming, sprint planning, sprint retrospectives).

Environment: Microsoft SQL Server 2017 Enterprise Edition, MS Access, ETL Packages (SSIS), VB.Net, Visual Studio, Team Foundation Server (TFS), Micro Strategy 9.2.1, T-SQL, MS Visual Studio 2017, IBM System I navigator 2009, IBM Client Access, DB2, Netezza, SSAS.

Prospect Medical Systems

Orange, CA SQL/SSIS/SSRS/SSAS/ETL Developer May 2017 – June 2018

Project Title: Claims/Lab/RX integration

The project is to create an ETL automation process for loading raw data from 834 and supplemental files into the database. The main purpose is to store this data to generate reports based on state requirements.

Job Duties and Responsibilities:

•Part of a team of developers that successfully completed a complex data conversion project designed to migrate millions of records to a new server (SQL migration).

•Worked with providers and Medicare or Medicaid entities to validate EDI transaction sets or Internet portals. This includes HIPAA 4010, 837, 835, 270/271, and others. Provided healthcare provider problem resolution.

•Extracted eligibility files such as 837, 835, 999, and 834 from Facets and migrated the data according to client requirements.

•Updated mapping tables related to assigned functional areas as directed by the Business Analysts.

•Created and modified tables, views and indexes; PL/SQL stored procedures, functions and triggers according to business requirement.

•Executed SQL conversion pre-validation and post-validation scripts to avoid data loss.

•Built Stored Procedures and ETL automation Process for loading the claim files into the target databases. Actively involved in creating SSIS packages to transform data from OLTP database to a data warehouse.

•Implemented Event Handlers and Error Handling in SSIS packages and notified process results to various user communities.

•Created checkpoints, configuration files in SSIS packages and Deployed SSIS packages with minimal changes using XML configuration file.

•Developed dictionaries and data mapping documents required for Claim load process.

•Built SSRS reports and created alerts to notify the department if files from vendors are missing in each period.

•Responsible for maintaining MTR claim reports and modifying them per user requests submitted through HEAT service tickets.

•Coordinating with business users in analyzing business requirements and translating them to technical specifications.

•Developed several sales, marketing, and finance reports using subreports, drill-down and drill-through, and matrix and tabular reports with KPIs.

•Generated Reports using Global Variables, Expressions and Functions for the reports.

•Identified and worked with Parameters for parameterized reports in SSRS 2012.

•Involved in maintaining and supporting operational reports, dashboards, and scorecards using Microsoft Power BI (Power Pivot, T-SQL, Excel pivot tables).

•Responsible for managing daily activities of the department, including compliance initiatives, group practice agreements, managing health care claim strategies, and other duties.

Environment: Microsoft SQL Server 2016/2014 Enterprise Edition, MS Access, ETL Packages (SSIS), QNXT, SSRS, SQL Server Analysis Services (SSAS), VB.Net, Visual Studio, Team Foundation Server (TFS), T-SQL.


