
SQL Server / Power BI

Location: Coppell, TX
Posted: August 11, 2023



Jyothi Modupalli

Email: adyuyf@r.postjobfree.com
Phone: 512-***-****

Professional Summary

Around 11 years of IT experience implementing ETL and ELT processes using Microsoft Business Intelligence technologies (SSIS, SSRS, and SSAS) and Azure cloud technologies.

Highly accomplished, versatile data engineer with career experience contributing to enterprise-scale software development life cycle (SDLC) initiatives; adept at delivering best-practice support across end-to-end application design, development, testing, integration, and post-implementation support.

Extensive experience in Azure cloud services (PaaS and SaaS), Azure Synapse Analytics, Azure SQL, Azure Data Factory, and Azure Data Lake.

Expertise in data warehousing, business intelligence technologies, and databases, with extensive knowledge of data analysis, SQL queries, ETL and ELT processes, Reporting Services (SSRS, Power BI), and Analysis Services using SQL Server 2005/2008/2012/2014/2016, SSIS, SSRS, SSAS, and SQL Server Agent.

Experience in Power Automate: working with services, triggers, actions, conditions, and loops; running flows on a schedule; calling custom business services with approval options; monitoring flows and team flows; and extending Power Apps with flows and flows with Power Apps.

Experience designing cloud-based solutions in Azure by creating Azure SQL databases, setting up elastic pool jobs, and designing tabular models in Azure Analysis Services.

Basic hands-on experience with Log Analytics and Kusto Query Language (KQL).

Experience developing solutions for SharePoint Online, Power Apps, and Power Automate.

Experience developing applications in Power Apps using Dataverse, Power Automate, Excel, and SharePoint Online.

Excellent in high-level design of SSIS packages for integrating data over OLE DB connections from heterogeneous sources (Excel, CSV, Oracle, flat files, text-format data) using the many transformations SSIS provides.

SQL Server Analysis Services: cube creation, knowledge of OLAP and cube design, data population, and writing MDX.

Experience in logical, physical, and dimensional data modeling, data mapping, and data normalization; well versed in Erwin.

Experience in creating indexed views, complex stored procedures, effective triggers, and useful functions to facilitate efficient data manipulation and consistent data storage.
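
As an illustration of the indexed-view pattern mentioned above, a minimal T-SQL sketch; the table, view, and column names are hypothetical:

-- Hypothetical base table for the sketch.
CREATE TABLE dbo.Sales (
    SaleID    INT IDENTITY PRIMARY KEY,
    ProductID INT   NOT NULL,
    Amount    MONEY NOT NULL
);
GO
-- Indexed views must be schema-bound; aggregate views need COUNT_BIG(*).
CREATE VIEW dbo.vSalesByProduct
WITH SCHEMABINDING
AS
SELECT ProductID, SUM(Amount) AS TotalAmount, COUNT_BIG(*) AS RowCnt
FROM dbo.Sales
GROUP BY ProductID;
GO
-- The unique clustered index is what materializes the view.
CREATE UNIQUE CLUSTERED INDEX IX_vSalesByProduct
    ON dbo.vSalesByProduct (ProductID);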

Designed Power BI data visualizations utilizing cross tabs, maps, scatter plots, and pie, bar, and density charts; developed Power BI reports and dashboards from multiple data sources using data blending.

Strong analytical and problem-solving skills; a self-starter with excellent interpersonal skills.

Able to multi-task and work effectively under pressure to meet deadlines, both as an individual contributor and as a committed team player.

Technical Skills

Operating Systems: Windows 7/8/10/11, Windows Server 2003/2008/2012/2016

Programming Languages: ASP.NET, C#.NET, VB.NET, C, C++, Python

Databases: SQL Server 2008/2012/2014/2016/2019, Oracle, PL/SQL

Cloud: Azure SQL, storage accounts, Azure Data Lake, dedicated SQL pools, Azure Synapse, Azure Data Factory, Microsoft Power Automate, IaaS, PaaS, SaaS

Tools: SQL Server, Business Intelligence Development Studio, Bitbucket, SolarWinds, Redgate Toolbelt, Visio, TFS, JIRA, Informatica, MongoDB, Tableau, Power BI, VSTS, Hornbill, Tidal, Octopus, TeamCity, ServiceNow, Data Migration Assistant, Snowflake, ChatGPT

Web Design Tools: HTML, XML, JavaScript, Visual Studio, SharePoint, MS Planner

Network Protocols: TCP/IP, DNS, FTP, SFTP

Academic Qualifications & Certifications:

Master of Computer Applications, Sri Krishnadevaraya University, India.

Microsoft Certified: Azure Data Engineer Associate (DP-203).

Microsoft Certified: Implementing an Azure Data Solution (DP-200).

Microsoft Certified: Implementing a Data Warehouse with Microsoft SQL Server 2012/2014.

Project Details:

Client: AT&T, Dallas Feb’22 – present

Role: Senior Data Engineer

Robotics Process Automation (RPA) Center of Excellence (COE): the COE cuts across organizations and teams to help develop, deploy, manage, measure, and enable automation projects at all enterprise levels. The team supports Automation Anywhere and builds intelligent automations by leveraging Microsoft's Power Platform technologies, comprising Power Automate Desktop and Cloud, Power Apps, Power Virtual Agents, Power BI, and AI Builder. By deploying digital automations, we helped several business units within AT&T increase their cost savings and/or cost avoidance.

Responsibilities:

Created pipelines in Azure Synapse Analytics to load data into Azure Data Lake, created pipeline jobs and scheduled triggers, built mapping data flows in Azure Data Factory, and used Azure Key Vault to store credentials.

Worked on migrating data from on-premises SQL Server to cloud databases (Azure Synapse Analytics (DW) and Azure SQL DB).

Involved in architecture discussions with the Microsoft data architect team.

Worked with canvas and model-driven apps, integrating data sources such as Dataverse, SharePoint, and SQL.

Developed solutions for SharePoint Online, Power Apps, and Power Automate.

Supported multiple teams focused on implementing automation through ADF pipelines and creating SQL stored procedures.

Worked extensively with Azure Blob and Data Lake storage and loaded data into Azure Synapse Analytics (DW).

Created ADF pipelines to load data into the Snowflake data warehouse.

Customized ChatGPT using Azure Databricks and created reports in Power BI.

Created SSIS packages and automated the process by scheduling jobs with SQL Server Agent; monitored job failures.

Automated the Microsoft ticketing system using cloud flows and desktop flows.

Built SSIS packages to extract data and load it into SQL Server, and monitored SQL Server Agent jobs.

Created databases and schema objects including tables, indexes, and constraints; connected various applications to the databases; and wrote functions, stored procedures, and views in T-SQL (a sketch follows).
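
All object names in this sketch are hypothetical:

-- Table with a primary key, unique constraint, and default.
CREATE TABLE dbo.Customer (
    CustomerID INT IDENTITY PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL,
    Email      NVARCHAR(255) NOT NULL UNIQUE,
    CreatedAt  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
-- Supporting index for name lookups.
CREATE NONCLUSTERED INDEX IX_Customer_Name ON dbo.Customer (Name);
GO
-- View and stored procedure consumed by the connected applications.
CREATE VIEW dbo.vCustomer AS
SELECT CustomerID, Name, Email FROM dbo.Customer;
GO
CREATE PROCEDURE dbo.usp_GetCustomerByEmail
    @Email NVARCHAR(255)
AS
BEGIN
    SET NOCOUNT ON;
    SELECT CustomerID, Name, Email
    FROM dbo.Customer
    WHERE Email = @Email;
END;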

Automated the process of moving data from Microsoft Power Automate to Azure Data Lake using Azure Synapse Analytics Link.

Created Power BI visualizations that suit the business requirements; developed pie charts, bar graphs, stacked bar charts, and line charts, and generated computed tables in Power BI using DAX.

Environment: Azure Synapse Analytics, Azure Data Lake & Blob, Microsoft Power Automate, Azure Data Factory, Azure SQL MI, Microsoft SQL Server 2016, BIDS, Power BI, SharePoint, Microsoft Planner, ChatGPT, Databricks, SQL pools, Snowflake.

Client: ACA Group, Dallas Oct’21 – Jan’22

Role: Senior Data Engineer

ACA is the leading governance, risk, and compliance (GRC) advisor in financial services, empowering clients to reimagine GRC and to protect and grow their business. ACA’s innovative approach integrates consulting, managed services, and its Compliance Alpha technology platform with the specialized expertise of former regulators and practitioners and a deep understanding of the global regulatory landscape.

Responsibilities:

Built Power Automate flows to automate business processes; developed solutions for SharePoint Online, Power Apps, and Power Automate.

Created ADF pipelines to load data from Data Lake files (Excel) into Azure Synapse; onboarded REST API data into Azure Synapse SQL.

Read data from Azure Data Lake through external tables using external resources.

Created tables (with different distributions and storage modes) in Azure Synapse and implemented incremental loads for them using ADF; see the sketch after this bullet.
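
A sketch of the external-table and distributed-table patterns from the last two bullets, with hypothetical storage, schema, and object names; depending on the authentication setup, a database-scoped credential may also be required:

-- External data source and file format pointing at the data lake.
CREATE EXTERNAL DATA SOURCE AdlsSource
WITH (TYPE = HADOOP,
      LOCATION = 'abfss://raw@mystorageaccount.dfs.core.windows.net');
GO
CREATE EXTERNAL FILE FORMAT ParquetFormat
WITH (FORMAT_TYPE = PARQUET);
GO
-- External table that reads the lake files in place.
CREATE EXTERNAL TABLE dbo.ExtSales (
    SaleID    INT,
    ProductID INT,
    Amount    MONEY
)
WITH (LOCATION = '/sales/',
      DATA_SOURCE = AdlsSource,
      FILE_FORMAT = ParquetFormat);
GO
-- Dedicated-pool table with an explicit distribution and storage mode,
-- populated with CTAS.
CREATE TABLE dbo.FactSales
WITH (DISTRIBUTION = HASH(ProductID), CLUSTERED COLUMNSTORE INDEX)
AS
SELECT SaleID, ProductID, Amount
FROM dbo.ExtSales;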

Created triggers based on different events (such as schedules or changes in data lake files) and associated them with ADF pipelines.

Created linked services for various source systems, pulling their credentials from Azure Key Vault.

Created Power BI visualizations that suit the business requirements, consuming data from Azure SQL Database.

Published reports to the Power BI service and granted suitable permissions on them; generated computed tables in Power BI using DAX.

Using Power BI, developed pie charts, bar graphs, stacked bar charts, and line charts.

Environment: Azure SQL Server, Azure Data Factory, Azure Data Lake, Bitbucket, Microsoft Power Apps, Microsoft Azure storage accounts, SharePoint, PowerShell.

Client: CareSource, Dayton, OH Oct’20 – Sep’21

Role: SQL/SSIS Developer

Description:

CareSource is a nonprofit that began as a managed health care plan serving Medicaid members in Ohio. It provides public health care programs including Medicaid, Medicare, and Marketplace. CareSource has a diverse offering of insurance plans on the Health Insurance Marketplace, and the company also offers Medicare Advantage plans that help consumers close the gap of coverage as they age.

Responsibilities:

Created packages to transfer data between OLTP and OLAP databases using Lookup, Derived Column, Conditional Split, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc.

Received claims, repriced claims, and eligibility files in various formats such as text, Excel, and CSV; developed packages to handle data cleansing, loading multiple files, and executing tasks multiple times (using For Each Loop containers).

Modified existing SSIS packages and worked on new SSIS packages.

Loaded data into destination tables via both full loads (truncate and reload) and incremental loads that join on multiple columns within stored procedures; a sketch of both patterns follows.
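
Both load styles in compact T-SQL, using hypothetical staging and target tables:

-- Full load: truncate and reload the target.
TRUNCATE TABLE dbo.DimMember;
INSERT INTO dbo.DimMember (MemberID, PlanCode, MemberName)
SELECT MemberID, PlanCode, MemberName
FROM stg.Member;

-- Incremental load: insert only rows not already present,
-- joining on multiple columns to detect matches.
INSERT INTO dbo.DimMember (MemberID, PlanCode, MemberName)
SELECT s.MemberID, s.PlanCode, s.MemberName
FROM stg.Member AS s
LEFT JOIN dbo.DimMember AS d
       ON d.MemberID = s.MemberID
      AND d.PlanCode = s.PlanCode
WHERE d.MemberID IS NULL;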

Involved in performance tuning and optimization for SQL Server and SSIS using Performance Monitor.

Optimized SQL queries for improved performance and availability.

Migrated data from on-premises systems to the Azure cloud.

Used TeamCity for builds and Octopus for deployments.

Worked on production failures.

Created and scheduled jobs in Tidal to automate the ETL process and deliver automated reports on a daily, weekly, and monthly basis.

Environment: MSSQL Server 2017/2019, SVN, Git, SQL Server Integration Services, TFS, TeamCity, Octopus, ServiceNow, Data Migration Assistant.

Client: Verizon Wireless, Dallas Dec’19 – Sep’20

Role: MSBI Developer

Description:

SPA (Sales Performance & Analysis) is a web-based application that provides sales activity reporting and product code information, along with research and update tools. It is the sales reporting system and tracking tool for commissions operations, and the source of commission-eligible booking transactions for the commissions calculation engine in the VBIS/primary provider system, for which payees receive credit toward their commissions.

Responsibilities:

SPA receives feeds from many order entry, provisioning, and other systems such as VESDW (VRD), Nordic, Estates (OrderPro, Coms, F&E), Premises, OVI, and many more. In addition, we send files back to the sources.

SPA receives source files in different formats such as text and CSV.

Worked with T-SQL to create tables, views, functions, and stored procedures, including subqueries, correlated subqueries, complex stored procedures, and ad-hoc T-SQL statements; a correlated-subquery example follows.
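
With hypothetical table and column names:

-- Latest booking per payee: the inner query re-runs for each outer row.
SELECT b.PayeeID, b.BookingID, b.Amount
FROM dbo.Booking AS b
WHERE b.BookingDate = (SELECT MAX(b2.BookingDate)
                       FROM dbo.Booking AS b2
                       WHERE b2.PayeeID = b.PayeeID);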

Created packages using SSIS transformations such as Aggregate, Sort, Lookup, Derived Column, Conditional Split, Execute SQL Task, Data Flow Task, Execute Process Task, etc.

Modified existing SSIS packages and added more audits and alerts in case of package failure. Loaded data into destination tables via both full and incremental loads, truncating and joining on multiple columns within stored procedures.

Migrated the SPA application to OCI cloud; converted and migrated legacy databases using SQL Server Data Tools.

Migrated data from on-premises systems to the Azure cloud using ADF.

Worked on automating database deployments; created jobs with SQL Server Agent, and scheduled and maintained daily and monthly loads of Objectives and Adjustments data through jobs, tasks, and alerts in the Dev and UAT environments.

Implemented various tasks and transformations for data cleansing and tuned the performance of SSIS packages.

Monitored SQL Server Agent jobs, fixed job failures, and monitored SQL Server performance using Redgate tools.

Environment: Windows 10, Windows Server 2012, MSSQL Server 2008/2017, Microsoft Visual Studio 2008/2017, GitLab, ColdFusion.

Client: HCH, Dallas Sep’19 – Nov’19

Role: ETL Developer

Description: Healthcare Highways is a health plan built on a high-performance network that delivers quality care, cost savings, and sustainable value to customers. They are a new competitor in health care, working with employers of large, small, and hourly workforces to get the most value out of their health plans and the best health outcomes for all members. They believe independent healthcare providers should influence where and how patients are treated.

Responsibilities:

Used PilotFish to convert 834, 835, and 837 files into XML standards.

Created packages to transfer data between OLTP and OLAP databases using Lookup, Derived Column, Conditional Split, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc.

Received claims, repriced claims, and eligibility files in various formats such as text, Excel, and CSV; developed packages to handle data cleansing, loading multiple files, and executing tasks multiple times (using For Each Loop containers).

Created jobs using SQL Server Agent to automate the ETL process and database backups and restores; a sketch of such a job step follows.
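
The kind of T-SQL a backup/restore job step would run; the database names, logical file names, and paths are hypothetical:

-- Nightly full backup.
BACKUP DATABASE ClaimsDB
TO DISK = N'D:\Backups\ClaimsDB_Full.bak'
WITH INIT, COMPRESSION, CHECKSUM;

-- Restore to a test copy (logical file names must match the source DB).
RESTORE DATABASE ClaimsDB_Test
FROM DISK = N'D:\Backups\ClaimsDB_Full.bak'
WITH MOVE 'ClaimsDB'     TO N'D:\Data\ClaimsDB_Test.mdf',
     MOVE 'ClaimsDB_log' TO N'D:\Data\ClaimsDB_Test_log.ldf',
     REPLACE;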

Modified existing SSIS packages and added more audits and alerts in case of package failure.

Loaded data into destination tables via both full and incremental loads, truncating and joining on multiple columns in stored procedures.

Created and scheduled jobs using SQL Server Agent to automate the ETL process and deliver automated reports on a daily, weekly, and monthly basis.

Using Power BI, developed pie charts, bar graphs, and table, donut, and matrix reports using DAX functions.

Created row-level security in Power BI and integrated it with the Power BI service portal.

Created complex reports using charts, gauges, tables, and matrices; worked on all kinds of reports, such as yearly, quarterly, monthly, and daily.

Created drill-down reports and drill-through reports by region.

Performed performance tuning to optimize SQL queries using Query Analyzer.

Scheduled automatic refresh in the Power BI service.

Environment: Windows 10, Windows Server 2008, MSSQL Server 2017, SQL Server Integration Services, PilotFish integration tool, Power BI.

Client: CHAS HEALTH, Spokane, WA Dec’18 - Jun’19

Role: Data Engineer

Description: CHAS Health is a non-profit, federally qualified health center (FQHC) providing high-quality medical, dental, pharmacy, behavioral health, education, and pregnancy and pediatrics services to families and individuals. A Virtual Visits service is also available at CHAS: Virtual Visits are an innovative approach to meeting healthcare needs in a way that is convenient and comfortable. The service is currently available for behavioral health and nutrition, with plans to offer it for medical care as well.

Responsibilities:

Worked with T-SQL to create databases, tables, views, functions, stored procedures, subqueries, and ad-hoc T-SQL statements.

Normalized the database tables and pushed them to the production server.

Modified existing tables, views, and stored procedures based on requests.

Set up Redgate SQL Monitor for the production servers.

Created packages to transfer data between OLTP and OLAP databases using Lookup, Derived Column, Conditional Split, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc., to load underlying data into tables from text, Excel, and CSV files.

Created jobs using SQL Server Agent to automate the ETL process and database backups and restores.

Modified existing SSIS packages and added more audits and alerts in case of package failure.

Identified slow-running queries, optimized stored procedures, and tested applications for performance and data integrity using Redgate monitoring tools and SQL Profiler; the plan-cache DMVs offer another route, sketched below.
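
A small query that surfaces the slowest cached statements:

-- Top 10 cached statements by average elapsed time (microseconds).
SELECT TOP (10)
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
       qs.execution_count,
       SUBSTRING(st.text, qs.statement_start_offset / 2 + 1,
                 (CASE qs.statement_end_offset
                       WHEN -1 THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                  END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_elapsed_us DESC;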

Developed Tableau reports and dashboards from multiple data sources using data blending.

Designed Tableau data visualizations utilizing cross tabs, maps, scatter plots, and pie, bar, and density charts.

Scheduled jobs using SQL Server Agent to automate the ETL process.

Environment: Windows 10, Windows Server 2008, MSSQL Server 2014/2017, SQL Server Integration Services, SQL Server Reporting Services, Redgate Toolbelt, SQL Monitor, TFS, Tableau, Erwin, Hornbill ticketing system.

Client: Rohinni LLC, CDA, ID Apr’18 – Oct’18

Role: MSBI Developer

Description: Rohinni LLC manufactures LED lighting products, offering them for electronic, transportation, display, outdoor, and other lighting applications. The company’s innovative, proprietary robotic process supersedes complex LED manufacturing by placing mini and micro LEDs directly on virtually any substrate at unprecedented speeds, in high volumes, and at greatly reduced cost, letting designers create lighting that is brighter, thinner, lighter, lower power, and more dynamic than currently available packaged LED products. Moreover, by removing manufacturing limitations, Rohinni’s process is ideally suited for products ranging from consumer electronics to automotive lighting to outdoor signage; the applications are virtually limitless. We received manufacturing machine data at a central location in different file formats, extracted the data from that location, maintained an integrated data warehouse on SQL Server through ETL, and created dashboards and reports using Power BI.

Responsibilities:

Developed packages to handle data cleansing, loading multiple files, and executing tasks multiple times (using For Each Loop containers).

Modified existing SSIS packages and added more audits and alerts in case of package failure.

Worked with T-SQL to create tables, functions, and stored procedures, including subqueries, correlated subqueries, complex stored procedures, and ad-hoc T-SQL statements. Identified data sources and defined them to build the data source views.

Developed queries for generating drill-down reports in SSRS 2017; designed and implemented stylish report layouts.

Generated reports using global variables, expressions, and functions.

Designed Power BI data visualizations utilizing cross tabs, maps, scatter plots, and pie, bar, and density charts.

Developed Power BI reports and dashboards from multiple data sources using data blending.

Explored data in a variety of ways and across multiple visualizations using Power BI.

Expertise in design of experiments, data collection, analysis, and visualization.

Designed indexes and optimized queries using execution plans for performance tuning of the database; a sketch follows.
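
With hypothetical names: the plan shows a scan on a large table, and a covering nonclustered index turns it into a seek.

-- Covering index for the common filter columns.
CREATE NONCLUSTERED INDEX IX_Measurement_MachineDate
    ON dbo.Measurement (MachineID, MeasureDate)
    INCLUDE (Reading);
GO
-- The query the index covers.
DECLARE @MachineID INT = 42, @FromDate DATE = '2018-01-01';
SELECT MachineID, MeasureDate, Reading
FROM dbo.Measurement
WHERE MachineID = @MachineID
  AND MeasureDate >= @FromDate;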

Environment: MSSQL Server 2017, SQL Server Integration Services, SQL Server Reporting Services, Query Analyzer, Enterprise Manager, SQL Profiler, Windows XP, MS Access, MS Word, TFS.

Client: Engie Insight, Spokane, WA Aug’16 - Dec’17

Role: ETL Developer

Description: ECOVA is an energy management company with several lines of business: energy management and sustainability, monitoring and maintenance, expense management, facility optimization, etc. Its energy and sustainability management platform combines data, technology, and energy expertise, delivering powerful business insights through a single intuitive platform. The EDGE application manages clients, vendors, billing details, etc. One of their requirements was to migrate data from AirTrack (SQL Server) to the data warehouse through ETL.

Responsibilities:

Migrated data from heterogeneous data sources (Excel, flat files, CSV files) to centralized SQL Server databases using SQL Server Integration Services (SSIS) to overcome transformation constraints.

Created packages to transfer data between OLTP and OLAP databases using Lookup, Derived Column, Conditional Split, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc., to load underlying data into tables from text, Excel, and CSV files.

Good understanding of and experience with cloud computing architecture, technical design, and implementation, including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).

Analyzed, designed, and built modern data solutions using Azure PaaS services to support visualization of data; understood the current production state of the application and determined the impact of new implementations on existing business processes.

Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics). Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed it in Azure Databricks.

Created pipelines in ADF using linked services, datasets, and pipelines to extract, transform, and load data from different sources such as Azure SQL, Blob storage, and Azure SQL Data Warehouse, and to write data back.

Used Python to create and update notebooks in the data lake.

Generated reports using global variables, expressions, and functions. Designed and implemented stylish report layouts in Tableau, and developed Tableau reports and dashboards from multiple data sources using data blending.

Designed Tableau data visualizations utilizing cross tabs, maps, scatter plots, and pie, bar, and density charts.

Developed both online and direct-query reports in Tableau from various sources.

Environment: MSSQL Server 2012/2014, SQL Server Integration Services, SQL Server Reporting Services, Query Analyzer, Enterprise Manager (Profiler, Index Tuning Wizard), Windows XP, MS Access, MS Word, SolarWinds, Azure cloud, PowerShell.

Client: SCANTRON, San Diego, CA Aug’05 - Sep’08

Role: Data Analyst

Description: Scantron is best known for its pioneering and exceptional products and services that allow the rapid, accurate, and reliable capture of student performance data. For more than three decades, Scantron has helped education, commercial, and government organizations worldwide measure and improve effectiveness with assessment and survey solutions.

Responsibilities:

Designed high-level ETL architecture for overall data transfer from OLTP to OLAP with the help of SSIS packages.

Extensively involved in designing SSIS packages to load data into the data warehouse; involved in creating SSIS jobs to automate report generation and cube-refresh packages.

Extract, Transform, Load (ETL) development using SQL Server 2008 and SQL Server 2014 Integration Services (SSIS).

Hands-on experience using transformations such as Fuzzy Lookup and Derived Column in SSIS.

Involved in the production support of SSIS packages.

Enhanced and deployed SSIS packages from the development server to the production server; responsible for designing and creating SSAS cubes from the data warehouse.

Good understanding of data marts, data warehousing, operational data stores (ODS), OLAP, star schema modeling, snowflake modeling, and fact and dimension tables using MS Analysis Services.

Created SSAS cubes with factless fact tables and slowly changing dimensions; involved in writing MDX queries and optimizing the performance of the SSAS cubes.

Experience writing complex MDX queries for calculated members and measure groups.

Experience creating tabular reports, matrix reports, list reports, parameterized reports, subreports, and ad-hoc reports through Report Builder using SSRS 2014.

Designed data models using Erwin; developed physical data models and created DDL scripts to build the database schema and database objects.

Worked with T-SQL to create tables, views, triggers, and stored procedures; a trigger sketch follows.
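
A minimal AFTER UPDATE audit trigger with hypothetical tables:

-- Base table and an audit table it feeds.
CREATE TABLE dbo.Score (
    ScoreID   INT IDENTITY PRIMARY KEY,
    StudentID INT NOT NULL,
    Value     DECIMAL(5,2) NOT NULL
);
GO
CREATE TABLE dbo.ScoreAudit (
    AuditID   INT IDENTITY PRIMARY KEY,
    ScoreID   INT NOT NULL,
    ChangedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
-- Writes one audit row per updated score.
CREATE TRIGGER trg_Score_Update
ON dbo.Score
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.ScoreAudit (ScoreID)
    SELECT ScoreID FROM inserted;
END;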

Created users, user groups, and projects through the Tableau Server web interface and assigned roles and permissions for Tableau users.

Environment: MSSQL Server 2012, SQL Server Integration Services, SQL Server Reporting Services, Enterprise Manager, DTS Designer, Tableau 9.6, Tableau Reader, Tableau Server 7/8, Windows XP, MS Access, MS Word.


