Srikanth Gunnam

Email: adhmz2@r.postjobfree.com | Phone: 203-***-****

Summary:

Passionate Computer Science graduate with professional experience in Business Intelligence, Data Analysis, Data Modeling, Reporting, and Analytics Platforms.

Expertise in various reporting tools like Microsoft Power BI, Report Builder, and SSRS.

Good experience with the Microsoft Business Intelligence stack, which includes SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and Azure Data Factory.

Good experience with Power BI Desktop, Power BI Service, and Power BI Mobile.

Good experience with Data Analysis Expressions (DAX), OData queries, and the M query language.

Experience in gathering and analyzing requirements and creating functional and non-functional requirement documents.

Good knowledge of all the phases of Software Development Life Cycle (SDLC).

Experience with tools like JIRA and VSTS (Azure DevOps).

Strong knowledge of using Power BI to import data from various sources such as SQL Server, Azure SQL DB, SQL Server Analysis Services (Tabular Model), MS Excel, Lotus Notes, IBM Netezza, and Salesforce.

Experience in installing and configuring gateways (enterprise gateway and personal gateway) in Power BI to schedule refreshes based on customer requirements.

Good knowledge of scheduled refresh and refresh automation using PowerShell scripts.

Experience in creating Row-Level Security within Power BI Desktop and integrating it with the Power BI service portal.

Well versed in writing SQL queries to import and transform data, and DAX queries for calculated columns and measures in Power BI.

Strong knowledge of dimensional Star Schema and Snowflake Schema modeling.

Hands-on experience in performance tuning and query optimization.

Good knowledge of various programming languages like C, Java, Python, HTML, R, and Structured Query Language (SQL).

Good experience with Extract, Transform & Load (ETL) SSIS packages for integrating data from heterogeneous sources like Excel, CSV, Oracle, flat files, text files, and XML, applying transformations like data conversion, conditional split, bulk insert, merge, pivot, union all, and slowly changing dimension.

Experienced in creating various reports like table, matrix, drill-down, drill-through, parameterized, cascaded, linked, and sub-reports using SSRS and Report Builder.

Good knowledge of SAP Business Objects.

Excellent analytical, communication and troubleshooting skills.

Knowledge of the Big Data ecosystem: Hadoop and Spark. Also familiar with Big Data tools like Sqoop, Flume, Hive, MapReduce, HBase, ZooKeeper, Pig, Kafka, and HDFS.

Certifications:

1. Microsoft Certified “Analyzing and Visualizing Data with Power BI” (e984481a98524b34b6e716b06c3f9d34)

2. IBM Certified “Introduction to Data Science”

3. SQLBI Certified in “DAX” and “Advanced DAX”

4. SQLBI Certified in “Data Modeling for Power BI”

Professional Training:

1. IBM Data Science Professional Certificate by IBM. (Feb 2019 - Current) - Online

2. Data Science Specialization by Johns Hopkins University. (Oct 2018 - Current) - Online

3. Data Scientist Master’s Program by Simplilearn. Courses included in this training are Data Science with R, SAS and Python, Machine Learning, Big Data Hadoop and Spark, Tableau, and a Data Science Capstone. (June 2019 - Current) - Online

4. Deep Learning with TensorFlow and Artificial Intelligence by Simplilearn. (June 2019 - Current) - Online

Educational Background:

Graduate Degree:

University: University of New Haven
Location: West Haven, Connecticut, USA - 06516
Degree: Master of Science
Major: Computer Science
Dates of Attendance: Jan 2016 – Dec 2017
GPA: 3.81 out of 4.00

Undergraduate Degree:

University: Jawaharlal Nehru Technological University Kakinada
Location: Kakinada, Andhra Pradesh, India - 533003
Degree: Bachelor of Technology
Major: Computer Science and Engineering
Dates of Attendance: Aug 2011 – April 2015
GPA: 83.64 out of 100

Work Experience:

Role: Business Intelligence Engineer Duration: Feb 2020 – Present

Company: Zions Bancorporation Location: Salt Lake City, UT

Description:

The project's intent is to leverage the native Power BI Premium server and its reporting, data warehouse, dataset, and data consumption capabilities, along with custom warehousing, to envision and deliver reports.

Design, development, and implementation of data models and reports built on Microsoft Azure DevOps data and other potential correlating data sources. Power BI reports cover the agile process, requirement status, testing and development processes, work traceability, and progress trends, as well as the quality and maturity of projects and teams across the enterprise. Data models are built from SDLC and Product Delivery (DevOps) pipeline raw data and are consumed by product and enterprise dashboards and reports used for oversight and management at all levels.

Roles & Responsibilities:

Design, develop, and implement data models and Power BI reports using Power BI Desktop 2.XX and Azure Analysis Services, built on data sources like Azure DevOps/Azure DevOps Server 2019.0.1 Patch 3 and above, Azure SQL DB, and other potential correlating data sources.

Install, manage, and deploy an optimal and effective architecture and implementation of the Power BI platform, which includes the Power BI Desktop application on Windows 10 64-bit, Data Flows, Power BI Service, and Power BI Report Server.

Configure various data source connectors like Azure DevOps, Azure DevOps Server, Azure SQL Database, Azure SQL Data Warehouse, and others in Power BI Desktop.

Program in various languages like Structured Query Language (SQL), OData queries, Python scripts, and Power Query/M to run against a specified server and extract, transform, and load data into Power BI Desktop, as sketched below.
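
For illustration, a minimal sketch of the kind of Python source script Power BI Desktop can run from its "Python script" connector; Power BI imports any top-level pandas DataFrame as a table. The driver, server, database, credentials, and query below are hypothetical placeholders.

import pandas as pd
import pyodbc  # assumes the Microsoft ODBC Driver for SQL Server is installed

# Hypothetical connection details; replace with the real server and database.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=myserver.database.windows.net;"
    "Database=WorkItems;"
    "UID=reader;PWD=<secret>"
)

# Power BI picks up this top-level DataFrame and loads it as a table.
work_items = pd.read_sql(
    "SELECT WorkItemId, State, AssignedTo FROM dbo.WorkItems", conn
)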

Program complex calculated columns and measures for analytics using the Data Analysis Expressions (DAX) language.

Develop custom visuals as required using the Python, R, and TypeScript programming languages; install and configure Python 3 and RStudio to support custom visuals in Power BI (see the sketch below).
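
As an illustrative sketch of a Power BI Python visual: Power BI hands the fields placed on the visual to the script as a pandas DataFrame named dataset, and the rendered matplotlib figure becomes the visual. The column names here are hypothetical.

# Runs inside a Power BI Python visual, which injects the `dataset` DataFrame.
import matplotlib.pyplot as plt

counts = dataset.groupby("State")["WorkItemId"].count()  # hypothetical columns
counts.plot(kind="bar", title="Work items by state")
plt.tight_layout()
plt.show()  # the figure Power BI renders as the visual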

Develop, maintain, and extend schemas, warehouses, and dataflows/pipelines for the Extract, Transform, and Load (ETL) process.

Program in Python, SQL*Plus, and PowerShell to extract and load data as required, using tools like Azure Data Factory, Anaconda 4.X notebooks with various open-source Python packages, Oracle Database Server, Visual Studio, and PowerShell Studio.

Build reports and dashboards in Power BI and SAP Business Objects Web Intelligence using Power BI Desktop, Report Builder, and the SAP Business Objects suite, communicating with team leadership and stakeholders.

Follow the Software Development Lifecycle (SDLC) to move enterprise-level dashboards to production through development, testing, and deployment phases.

Integrate SAP Business Objects reports with Power BI, and embed Power BI reports into other applications using embedded analytics like the Power BI service (SaaS) or the REST API.

Manage Power BI Premium capacities with on-premises server deployment, users and licenses, and workspace assignment.

Manage access to app workspaces, apps, and content packs; monitor usage and manage capacity.

Implement Row-Level Security in enterprise dashboards by creating roles in Power BI Desktop and assigning users to those roles in the cloud-based analytics service, Power BI Service.

Install and configure the enterprise gateway and personal gateway applications on the on-premises server to pull data from on-premises data sources and refresh dashboards in the Power BI service.

Gather business requirements, develop design methodology and project documentation.

Role: Business Intelligence Developer Duration: Oct 2018 – Feb 2020

Company: Zurich American Insurance Location: Schaumburg, IL

Description:

Enterprise Integration & Data Management (EIDM) is a project in which data stored on-premises (Netezza) is migrated to the cloud (Microsoft Azure SQL Server). Along with that, various reports across the organization are migrated from SAP Business Objects Enterprise to Microsoft Power BI, and analysis, reports, and insights are generated from the data. The project also includes predictive analytics.

Roles & Responsibilities:

Installed and configured Aginity Workbench Pro to connect to and access the Netezza database. Built SAP Business Objects reports by connecting to the Netezza database, and analyzed existing SAP Business Objects reports for migration to Power BI.

Responsible for gathering requirements from stakeholders, financial analysts, underwriters, and underwriting managers to build reports/dashboards.

Designed and built an enterprise-level Power BI platform, working on premium capacity, licensing, workspaces, apps, and user access.

Configured Azure Data Factory Gen1 to do Extract, Transform, and Load activities from the on-premises Netezza database server to Azure Data Lake Storage Gen1, and wrote U-SQL, T-SQL, and Python scripts to transform the data and copy it to Azure SQL Database, as sketched below.
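
A minimal Python sketch of this kind of transform-and-copy step, for illustration only; the actual pipeline ran as Azure Data Factory activities, and the file path, column names, target table, and connection string below are hypothetical placeholders.

import pandas as pd
from sqlalchemy import create_engine

# Read an extract landed in the lake (hypothetical path and columns).
policies = pd.read_csv("/landing/policies.csv")

# Example transformations: normalize codes and derive a year column.
policies["state_code"] = policies["state_code"].str.upper()
policies["policy_year"] = pd.to_datetime(policies["effective_date"]).dt.year

# Copy the transformed rows into Azure SQL Database (placeholder DSN).
engine = create_engine("mssql+pyodbc://loader:<secret>@azure-sql-dsn")
policies.to_sql("Policies", engine, schema="dbo", if_exists="append", index=False)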

Installed, managed, and deployed an optimal and effective architecture and implementation of the Power BI platform, which includes the Power BI Desktop application on Windows 10 64-bit, Data Flows, Power BI Service, and licensing.

Configured various connectors like Azure SQL Database, Salesforce Objects, Excel, SQL Server, and others to connect to data sources such as SQL Server, Excel, Salesforce, and Azure SQL Database and pull data into Power BI.

Programmed in Structured Query Language (SQL) and Python to extract data from data sources into Power BI Desktop, and wrote Power Query/M queries for transformations in Power BI Desktop.

Designed and implemented enterprise level data models in Power BI Desktop by following data warehouse standards to create enterprise level dashboards/ reports.

Automated Power Query refresh using a PowerShell script and Windows Task Scheduler; the sketch below shows the same idea.
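
The same idea can be sketched in Python against the Power BI REST API's dataset-refresh endpoint (the production script used PowerShell; the workspace ID, dataset ID, and access token below are placeholders, with the token normally acquired from Azure AD, e.g. via the msal package):

import requests

GROUP_ID = "<workspace-guid>"   # placeholder
DATASET_ID = "<dataset-guid>"   # placeholder
TOKEN = "<aad-access-token>"    # placeholder; acquire from Azure AD

# Queue a dataset refresh; Windows Task Scheduler can invoke this script.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()  # HTTP 202 Accepted means the refresh was queued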

Installed and configured Enterprise gateway and Personal gateway in Power BI service.

Created workspaces and apps for business users to view the developed reports.

Set up automatic and scheduled refresh in the Power BI service.

Wrote calculated columns and measures in Power BI Desktop to apply good data analysis techniques.

Presented weekly to business users about the reports and changes required.

Worked on all kinds of reports, such as yearly, quarterly, monthly, and daily.

Developed analysis reports and visualizations using DAX functions like table, aggregation, and iteration functions.

Created reports using time intelligence calculations and functions.

Worked on all types of transformations that are available in the Power BI query editor.

Created stored procedures and SQL queries to pull data into the Power Pivot model.

Responsible for creating reports in Report Builder, Power BI Dashboard and SSRS.

Utilized Power BI to create various analytical dashboards that depict critical KPIs.

Developed paginated reports in Report Builder 3.0 and published them to the Power BI Service.

Created dashboards that display key information for decision making purpose using Heat maps, Geo maps, Treemaps, Pie charts, Bar charts, Circle views, Line charts, Area charts, Scatter plots, and Bullet graphs.

Developed custom and ad-hoc reports with an easy user interface using Report Builder, and deployed them to the server using Power BI and SQL Server Reporting Services (SSRS).

Responsible for publishing the created Power BI dashboards/apps to the Power BI service and providing access to end users.

Worked on report performance tuning; also involved in troubleshooting, resolving, and escalating data-related issues and validating data to improve data quality.

Designed Power BI data visualizations utilizing cross tabs, maps, scatter plots, and pie, bar, and density charts.

Worked on development of knowledge transfer documentation.

Role: Business Intelligence Developer Duration: May 2018 – Oct 2018

Company: Adam Information Technologies Location: Madison, Wisconsin

Roles & Responsibilities:

Implemented Extract, Transform, and Load (ETL) packages using SQL Server Integration Services (SSIS).

Generated different types of reports like client-side, tabular, matrix, drill-down, parameterized, ad-hoc, and stylish reports using SQL Server Reporting Services (SSRS) in Business Intelligence Development Studio (BIDS).

Created SSAS packages, OLAP cubes, and multi-dimensional models on terabyte-scale databases, and worked on designing and building dimensions and facts with star and snowflake schemas using SQL Server Analysis Services (SSAS).

Developed advanced BI applications using the MS BI stack including SSIS, SSRS, SSAS, T-SQL/stored procedures as well as Multi-Dimensional Expressions (MDX).

Developed and deployed SSIS packages in SQL Server.

Installed and configured the Power BI platform and converted all operational Excel reports into Power BI reports.

Extensive experience in Extract, Transform, and Load (ETL) using SSIS packages, DTS Import/Export, Bulk Insert, and BCP.

Worked on control flow tasks such as Execute SQL Task, Send Mail Task, File System Task, and Data Flow Task, and used different data sources and destinations with Derived Column and Lookup transformations within the Data Flow Task.

Created databases and database objects like tables, stored procedures, views, triggers, rules, defaults, user-defined data types, and functions.

Debugged and enhanced existing stored procedures for better performance gains.

Created SSIS packages using different types of tasks, with error handling and event logging, and worked on package configurations to run packages on different servers.

Used SSRS, Tableau, and MS Power BI to create customized, on-demand, and ad-hoc reports, and was involved in analyzing multi-dimensional reports in SSRS.

Created ETL packages with different data sources (SQL Server, Flat Files, Excel source files, XML files) and then loaded the data into destination tables by performing different kinds of transformations using SSIS packages.

Used various transformations like Pivot, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, and Aggregate, along with Execute SQL Task, Data Flow Task, and Execute Package Task, to generate underlying data for the reports and to export cleaned data from Excel spreadsheets, text files, MS Access, and CSV files to the data warehouse.

Installed and configured the Enterprise Gateway and Personal Gateway in the Power BI Service.

Published reports and dashboards using Power BI.

Implemented several DAX functions for various fact calculations for efficient data visualization in Power BI.

Client: Go4biz Duration: May 2016 – Dec 2017

Role: BI Developer

Responsible for gathering requirements from end users.

Worked on Sqoop, Flume, Hive, MapReduce, HBase, ZooKeeper, Pig, Kafka, and HDFS.

Responsible for creating advanced calculations in Power BI.

Worked on connecting Power BI to various data sources, modeling the data in Power BI.

Publishing reports and setting up the necessary connection details and scheduling.

Designing, implementing and developing reports and analytics solutions using Microsoft Power BI.

Providing support for the existing dashboards.

Responsible for migrating and enhancing reports from SSRS to Power BI.

Supporting and responding to issues raised by customers in a timely manner.

Writing, maintaining, and enhancing complex stored procedures.

Conducting unit testing and troubleshooting.

Implementing Row-Level Security by defining various constraints for each defined role.

Troubleshooting complex bugs and problems with Power BI reports, data models, and their interactions with underlying data sources.

Responsible for attending Agile meetings (planning sessions and presenting demos).

Role: Information Technology - Data Analyst Duration: Jan 2015 – Nov 2015

Company: Adam Information Technology Location: Hyderabad, India

Roles & Responsibilities:

Responsible for data validation, cleansing, consolidation, and mining for sense, consistency, accuracy, and compliance standards.

Responsible for development and maintenance of data collection and analytics tools, including databases and related systems.

Executed complex analytics projects involving data collection and analysis for a broad array of retention issues across disciplines and functional areas.

Wrote Python scripts for data wrangling and data cleansing, as in the sketch below.
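
A minimal sketch of this kind of wrangling in pandas; the input file and column names are hypothetical.

import pandas as pd

df = pd.read_csv("customers_raw.csv")              # hypothetical input

df = df.drop_duplicates()                          # consolidation
df["email"] = df["email"].str.strip().str.lower()  # cleansing
df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")
df = df.dropna(subset=["customer_id"])             # drop invalid rows
df.to_csv("customers_clean.csv", index=False)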

Developed complex stored procedures, functions, triggers, indexes, views, and SQL join queries.

Developed strategies to improve program performance by translating quantitative findings into practical insights.

Contributed to problem-solving and process design efforts.

Worked on Sqoop, Flume, Hive, MapReduce, HBase, ZooKeeper, Pig, Kafka, and HDFS.

Ensured data integrity and consistency across relevant data systems.

Used multivariate statistical analysis techniques to analyze data and produce preliminary reports.

Academic Projects:

Project: DriveMe, an Android Application (Under the Esteemed Guidance of Dr. Barun Chandra)

Description:

DriveMe is an Android application for online cab booking. It helps customers book rides and drivers offer rides, providing both driver and customer logins in the same application. DriveMe also offers a truck option to deliver luggage from one place to another.

Software and Technologies Used:

Eclipse: An Integrated Development Environment (IDE). The Eclipse for Android package includes the Eclipse Java Development Tools, the Eclipse XML editor and tools, and others.

Android Development Tools (ADT): A plug-in for the Eclipse IDE designed to provide a powerful, integrated environment for building Android applications.

Android Software Development Kit (SDK): A set of development tools used to build applications for the Android platform. It contains the required libraries, a debugger, and an emulator.

The Java SE Development Kit and various APIs are also required. Front End: XML; Back End: Java; Tools: Eclipse IDE, WAMP; Database: MySQL.

Project: Exploring AWS (Under the Esteemed Guidance of Dr. Amir Esmailpour)

Description:

Utilized Amazon Web Services for various tasks: launching a virtual machine, creating and connecting to a MySQL database, creating and querying a NoSQL table, hosting a static website, analyzing big data with Hadoop using Hive, analyzing big data with Hadoop on EMR, comparing the Hive and EMR approaches, and building a mobile application. An example is sketched below.
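
For example, launching a virtual machine programmatically can be sketched in Python with boto3 (the AMI ID and key pair name are placeholders; credentials and region come from the standard AWS configuration):

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",            # placeholder key pair
)
print(resp["Instances"][0]["InstanceId"])  # ID of the newly launched instance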

Tools:

Amazon Web Services, Hadoop, Hive, Sqoop and Spark

Project: Online Alert System (Under the Esteemed Guidance of Dr. Naveen Kumar)

Description:

Online Alert System is an Android application that periodically provides the user's location updates to specified contacts. Users who want to stay safe can connect to the internet and GPS so the application can send location updates to trusted contacts. The application is effective, quick, and accurate in providing location updates and helps users maintain safety measures in an emergency. In cases like kidnapping or other emergencies, it helps locate the victim by sending the victim's exact location to parents or guardians periodically.

Tools:

Software Development Kit: Java SE 7, Android SDK, JDK

IDE: Eclipse

Plugin: ADT (Android Development Tools)


