Advanced SQL, Power BI, Tableau, Azure

Location:
Edmonton, AB, Canada
Posted:
October 02, 2023


Resume:

Lekhasree Narayanagari

Data Engineer | BI Analyst | MSBI Developer | DBA

Email: adz32p@r.postjobfree.com

Phone: 780-***-****

PROFESSIONAL SUMMARY

•Business Intelligence professional with 5+ years of experience and proven expertise in data warehousing and business intelligence tools such as Power BI and MSBI (SSRS, SSAS, and SSIS), as well as Azure data services including Azure Storage Accounts, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure SQL Database, and Azure Analysis Services.

•Known for strong critical thinking and analytical skills, as well as the ability to complete projects accurately, on time, and within budget.

•Experience configuring and maintaining Report Manager and Report Server for SSRS; deployed and scheduled reports in Report Manager.

•Used Azure Blob upload/download tasks to transfer blobs to and from Azure Storage (illustrated in a brief sketch at the end of this summary).

•Generated reports and visualizations by connecting to various data sources (MS SQL Server, Snowflake, MySQL, MS Access, SharePoint, Excel, CSV).

•Expertise in creating Common Data Service entities and using them in Power BI Desktop to build reports.

•Worked on creating complex ETL Packages using SSIS.

•Worked on Performance Tuning & Query Optimization.

•Worked extensively on both OLTP & OLAP Environments.

•Extensive knowledge of data warehouse methodologies.

•Strong ability to analyze user requirements and Business Requirement Specifications.

•Proficiency in database design, coding, unit testing, and implementation.

•Hands-on experience in optimizing queries by creating clustered and non-clustered indexes and indexed views.

•Proficient in creating objects such as stored procedures, views, triggers, user-defined functions, cursors, derived tables, common table expressions (CTEs), and complex queries on Microsoft SQL Server 2019/2016/2012.

•Worked through implementation life cycles of both Agile and Waterfall methodologies.

•Experience transforming and validating data using SSIS transformations such as Conditional Split, Lookup, Merge Join, Sort, and Derived Column.

•In-depth statistical knowledge of p-values, A/B testing, UAT, hypothesis testing, linear regression, and logistic regression.

•Experience in extracting data from data sources and transforming it using SQL and Python (see the sketch following this summary).

•Good knowledge of System Development Life Cycle (SDLC).

•Followed SDLC methodologies in performance of all development activities, including design, development, test, and quality assurance support.

•Team player with strong logical reasoning and coordination skills, also able to complete projects independently.

•Experience designing and building dimensions and cubes with star schemas using SQL Server Analysis Services.

•Experience in report writing using SQL Server Reporting Services (SSRS) and creating various types of reports such as drill-down, drill-through, parameterized, cascading, conditional, table, matrix, chart, and sub-reports.

•Excellent understanding of Relational Database Systems, Normalization, logical and physical data modeling.
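
The blob upload/download work mentioned in this summary could look roughly like the following Python sketch using the azure-storage-blob SDK; the connection string, container, and blob names are placeholders, not details from an actual project.

    # Illustrative sketch only: connection string, container, and blob names are placeholders.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    container = service.get_container_client("staging")

    # Upload a local extract file as a block blob.
    with open("daily_extract.csv", "rb") as data:
        container.upload_blob(name="inbound/daily_extract.csv", data=data, overwrite=True)

    # Download the same blob back to disk.
    blob = container.get_blob_client("inbound/daily_extract.csv")
    with open("downloaded_extract.csv", "wb") as out:
        out.write(blob.download_blob().readall())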
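
Likewise, the SQL-plus-Python extraction and transformation noted above can be sketched as below; the server, database, table, and column names are hypothetical, chosen only for illustration.

    # Illustrative sketch only: connection string, table, and column names are hypothetical.
    import pandas as pd
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=SalesDW;Trusted_Connection=yes;"
    )

    # Extract: pull order rows from the source table.
    query = """
        SELECT OrderDate, Region, SalesAmount
        FROM dbo.FactOrders
        WHERE OrderDate >= ?
    """
    df = pd.read_sql(query, conn, params=["2023-01-01"])

    # Transform: fix types and aggregate to a monthly summary per region.
    df["OrderDate"] = pd.to_datetime(df["OrderDate"])
    monthly = (
        df.set_index("OrderDate")
          .groupby("Region")["SalesAmount"]
          .resample("M")
          .sum()
          .reset_index()
    )
    print(monthly.head())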

TECHNICAL SKILLS

• Database Technologies: SQL Server, Snowflake, MySQL, MS Access, Netezza, NoSQL, Apache Hadoop.

• Scripting: SQL, Python, PowerShell, T-SQL, DAX

• Other Tools: Microsoft Visio

• Change Management Tools: TortoiseSVN, Visual SourceSafe (VSS), Team Foundation Server (TFS), Git

• Operating Systems: Windows Server, MS DOS, UNIX, Linux

• ETL Tools: SSIS, Azure Data Factory, Informatica

• Data Modeling Tools: Visio, Erwin, Redgate

• Tools and Packages: SQL Server Management Studio (SSMS), Visual Studio, Enterprise Manager, SQL Profiler, SQL Server Data Tools, DTS, Report Builder, SQL Enlight.

• Reporting Tools: Power BI, SSRS, Excel.

• Cloud Technologies: Azure Data Factory, Databricks, Synapse Analytics, Azure Data Lake Storage, SQL Database, Azure Analysis Services, Azure Storage Account, Blob Storage.

• Data Visualization: Tableau 10, Tableau Server 10, Power BI, Excel, Power Source.

PROFESSIONAL EXPERIENCE

Client: BMO - ON

Role: Data Engineer Dec 2022 – Present

Responsibilities:

•Communicated with the deployment team to deploy the packages and helped execute them.

•Created various SQL scripts to perform ETL of daily, monthly and yearly data.

•Executed existing SQL scripts and fixed them so the packages ran successfully.

•Communicated with various teams to gather the requirements and developed the packages accordingly.

•Worked with tools such as Power BI, DBMS utilities, SSIS, and SSMS to develop and execute SQL packages.

•Effectively communicated with the incident team to make the changes in the production environment and followed up on the incidents.

•Communicated with the QA team to test the solutions developed.

•Created an overall progress report to communicate with the manager.

•Created packages from scratch and executed them.

•Hands-on experience writing complex SQL queries.

•Worked with Spark to optimize existing algorithms in Hadoop using Spark Context, Spark SQL, PySpark, pair RDDs, and Spark on YARN (see the sketch at the end of this section).

•Worked with NoSQL databases such as HBase, creating HBase tables to load large sets of semi-structured data coming from various sources.

•Created Power BI reports using tabular SSAS models as the data source in Power BI Desktop and published the reports to the Power BI service.

•Developed and published reports and dashboards using Power BI and wrote effective DAX formulas and expressions.

•Involved in pulling ELB and S3 logs from the S3 bucket using Python.

•Developed shell, Perl, and Python scripts to automate and provide control flow for Pig scripts.

•Worked on migrating data from on-premises SQL Server to cloud databases (Azure Synapse Analytics (DW), Azure SQL DB, and Azure HDInsight).

•Designed, developed, and tested packages to extract, transform and load data using SQL Server Integration Services (SSIS).

•Used Spark Streaming to divide streaming data into batches that feed the Spark engine for batch processing (sketched below).
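
As a rough illustration of the Spark SQL / PySpark work referenced in this section, a minimal sketch might look like the following; the file paths, view name, and columns are hypothetical.

    # Illustrative sketch only: paths, view name, and column names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("order-aggregation").getOrCreate()

    # Read raw orders from HDFS and register them for Spark SQL.
    orders = spark.read.parquet("hdfs:///data/raw/orders")
    orders.createOrReplaceTempView("orders")

    # Aggregate with Spark SQL rather than a hand-rolled MapReduce job.
    daily_totals = spark.sql("""
        SELECT order_date, region, SUM(amount) AS total_amount
        FROM orders
        GROUP BY order_date, region
    """)

    daily_totals.write.mode("overwrite").parquet("hdfs:///data/curated/daily_totals")
    spark.stop()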
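
The Spark Streaming micro-batching described in the last bullet could be sketched along these lines; the socket source and the 10-second batch interval are placeholders.

    # Illustrative sketch only: the socket source and batch interval are placeholders.
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext

    sc = SparkContext(appName="stream-batching")
    ssc = StreamingContext(sc, batchDuration=10)  # divide the stream into 10-second batches

    # Each micro-batch of lines is handed to the Spark engine as an RDD.
    lines = ssc.socketTextStream("localhost", 9999)
    counts = (
        lines.flatMap(lambda line: line.split(" "))
             .map(lambda word: (word, 1))
             .reduceByKey(lambda a, b: a + b)
    )
    counts.pprint()

    ssc.start()
    ssc.awaitTermination()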

Client: Loyalty One - ON Jan 2021 – Nov 2022

Role: ETL Developer

Responsibilities:

•Designed Contract Retention database using SQL Server RDBMS technology.

•Designed and developed data models for the data warehouse using Star schema.

•Created stored procedures and scheduled them in the Azure environment.

•Developed data mapping documentation to establish relationships between source and target tables including transformation processes.

•Developed and implemented SSIS ETL processes to load data from source files to target tables.

•Created Data Flow Tasks with Merge Joins to combine data from multiple sources across multiple servers, Lookups to retrieve data from a secondary dataset, and Row Counts to capture statistics on rows inserted and updated; also worked with Script Tasks to run custom code.

•Worked on creating dependencies of activities in Azure Data Factory.

•Worked with package variables, event handlers to react to the appropriate error conditions, SSIS parallelism, and package properties to make packages more efficient; built pipelines in ADF to copy data from source to destination.

•By analyzing sales and inventory data, decreased warehouse costs by 35% through elimination of dead inventory.

•Managed BI strategy, including data sourcing, acquisition, and integration, as well as all activities related to analytics and reporting.

•Created pipelines in ADF using Linked Services and Datasets to extract, transform, and load data between sources such as Blob Storage and Azure SQL Database, including write-back scenarios.

•Developed reports using the data imported in the local Power BI environment for Finance, Sales, Warehouse, Supply chain, Delivery Execution, Production, and Fleet departments.

•Designed and implemented migration strategies for traditional systems moving to Azure (lift-and-shift with Azure Migrate and other third-party tools).

•Deployed and tested (CI/CD) our developed code using Visual Studio Team Services (VSTS).

•Monitored all the packages that were scheduled; involved in debugging packages by utilizing SSIS features such as breakpoints, Data Viewers, and custom logging.

•Developed aggregations, partitions, and calculated members for cubes as per business requirements.

•Involved in analyzing and designing disaster recovery / replication strategies with business managers to meet the business requirements.

•Designed dimensional models using SSAS for end users and created hierarchies within those models.

•Developed ADF pipelines to automate ETL activities and migrate SQL Server data to Azure SQL Database; used Azure Data Factory to create and schedule data-driven workflows (pipelines) that ingest data from disparate data stores.

•Built complex ETL processes that transform data visually with data flows or by using compute services such as Azure HDInsight Hadoop, Azure Databricks, and Azure SQL Database.

•Built SSIS packages and scheduled jobs to migrate data from disparate sources into SQL server and vice versa.

•Created SSIS packages that loaded data from different sources daily to create and maintain a centralized data warehouse; made the packages dynamic so they fit the environment.

•Formulated the strategy, development, and implementation of executive and line of business dashboards through Power BI.

•Worked with Informatica, an ETL-based data integration tool that provides data integration software and services for businesses, industries, and government organizations, including telecommunications.

•Worked with Apache HBase, a database that runs on a Hadoop cluster; accessed HBase data through the native Java API as well as Thrift and REST gateways, which make it accessible from any language.

•Involved in data management activities providing data integration and transformation.

Client: Flexiti Financials - Toronto Sep 2019 – Dec 2020

Role: MSBI Developer

Responsibilities:

•Worked on creating dependencies of activities in Azure Data Factory.

•Created stored procedures and scheduled them in the Azure environment.

•Extracted, transformed, and loaded (ETL) data from Excel and flat files into MS SQL Server using DTS and SSIS.

•Installed and configured SQL Server 2008 and 2008 R2 Enterprise Edition Environment.

•Designed, developed, and tested packages to extract, transform and load data using SQL Server Integration Services (SSIS).

•Developed advanced ETL import packages using SSIS and T-SQL to transform and import raw index data into database tables.

•Developed SSIS packages to migrate data to the staging server and then to the production server.

•Processed data from flat files, Excel files, and .csv files into a proprietary hierarchical data structure system through SSIS packages and custom procedures.

•Built pipelines to copy data from source to destination in Azure Data Factory (ADF V1).

•Evaluated database performance and performed maintenance duties such as Tuning, Backup, Restoration and Disaster Recovery.

•Worked on data-driven subscription tables for quarterly, monthly, and daily reports.

•Identified and added the report parameters and created the reports based on the requirements using SSRS.

•Built and published customized interactive dashboards utilizing Excel and Power BI.

•Maintained existing Power BI reports and dashboards and updated them per new and changing business requirements.

•Deployed SSIS packages to SQL 2012/2014/2016 and scheduled jobs to extract data from sources on daily basis.

•Developed SQL Server Integration Services (SSIS) packages to automate the nightly extract, transform, and load jobs.

•Used Netezza Structured Query Language (NZSQL) commands to create and manage Netezza databases, user access, and permissions, as well as to query and modify database contents.

•Developed cubes by defining data sources and data source views and using SQL Server Analysis Services (SSAS).

•Loaded the processed data into the final centralized DWH for further Power BI reporting.

•Worked on Key Performance Indicators (KPIs) and the design of star and snowflake schemas in Analysis Services (SSAS).

•Worked in cooperation with other teams and business analysts to achieve efficient database access and application processing; this included designing, adding, removing, or maintaining indexes, organizing tables, and recommending SQL scripts to application development teams.

Client: HDFC Bank - India May 2017 – July 2019

Role: Data Analyst

Responsibilities:

•Developed data mapping documentation to establish relationships between source and target tables including transformation processes.

•Created Data Flow Tasks with Merge Joins to combine data from multiple sources across multiple servers, Lookups to retrieve data from a secondary dataset, and Row Counts to capture statistics on rows inserted and updated, and worked with Script Tasks to run custom code.

•Used Python packages for data cleaning, migration, merging, modification, and extraction of user-specific data, with libraries such as Pandas (see the sketch at the end of this section).

•Performed data analysis using Python libraries such as Pandas, NumPy, and Seaborn, and created dashboards using Tableau and Grafana.

•Created the ETL specification for loading data into the data warehouse using Python and performed ETL with Alteryx and Python, ensuring a smooth flow of data from source to destination.

•Worked with package variables, event handlers to react to the appropriate error conditions, SSIS parallelism, and package properties to make packages more efficient.

•Utilized T-SQL to pull data from various databases and created various dashboards, scorecards, and reports to support loan business decision-making.

•Created stored procedures, user-defined functions, and views to handle business logic and functionality such as email alerts and data extracts built in Excel.

•Created drill-through, parameterized, and linked reports using SSRS 2005 and 2008, and integrated them with PerformancePoint Server.

•Created and maintained various report subscriptions in Report Manager.

•Involved in user communication for requirement gathering, and in change control and code control policies and procedures.

•Worked with various upstream and downstream customers in interfacing various systems and processes for Data extractions, ETL, Analytics and reporting needs.

•In AWS, worked with EC2, S3, VPC, Route 53, ELB, Classic Load Balancer, ECR (Elastic Container Registry), ECS (Elastic Container Service), AMI (Amazon Machine Images), CloudWatch, and CloudTrail, and configured Lambda functions to be event-driven or schedule-driven (a brief S3 sketch appears at the end of this section).

•Developed Stored Procedures to support ETLs and Reporting.

•Wrote T-SQL for data retrieval; identified data sources and defined them to build the data source views.

•Queried data, created stored procedures, and wrote complex queries and T-SQL joins to address various reporting operations and ad hoc data requests.

•Created pivot tables and charts using worksheet data and external sources; modified pivot tables, sorted items, grouped data, and refreshed and formatted pivot tables.

•Created a new forecasting model that consolidated different markets, leading to quicker, more dynamic forecasts and greater efficiencies; dealt with time-series data (decomposition and forecasting, sketched below).
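
To illustrate the Pandas-based cleaning and merging described in this section, a minimal sketch follows; the file names and columns are hypothetical and not taken from any actual engagement.

    # Illustrative sketch only: file names and column names are hypothetical.
    import pandas as pd

    loans = pd.read_csv("loans.csv", parse_dates=["disbursed_on"])
    customers = pd.read_csv("customers.csv")

    # Basic cleaning: drop duplicates, normalise text keys, fill missing balances.
    loans = loans.drop_duplicates(subset="loan_id")
    loans["branch"] = loans["branch"].str.strip().str.upper()
    loans["outstanding"] = loans["outstanding"].fillna(0)

    # Merge with customer attributes and extract one segment for reporting.
    merged = loans.merge(customers, on="customer_id", how="left")
    retail = merged[merged["segment"] == "RETAIL"]
    retail.to_csv("retail_loans_clean.csv", index=False)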
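
The S3 side of the AWS work noted above might look something like this boto3 sketch; the bucket name and prefix are placeholders.

    # Illustrative sketch only: bucket name and prefix are placeholders.
    import boto3

    s3 = boto3.client("s3")

    # List objects under a log prefix and download each one locally.
    response = s3.list_objects_v2(Bucket="example-log-bucket", Prefix="elb-logs/2019/")
    for obj in response.get("Contents", []):
        key = obj["Key"]
        s3.download_file("example-log-bucket", key, key.replace("/", "_"))
        print(f"downloaded {key}")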
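
Finally, the time-series decomposition mentioned in the last bullet could be sketched as follows, using synthetic monthly data purely for illustration.

    # Illustrative sketch only: the series here is synthetic demo data.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.seasonal import seasonal_decompose

    # Build a synthetic monthly series with a trend and yearly seasonality.
    idx = pd.date_range("2017-01-01", periods=48, freq="MS")
    values = np.linspace(100, 200, 48) + 15 * np.sin(2 * np.pi * idx.month / 12)
    series = pd.Series(values, index=idx)

    # Decompose into trend, seasonal, and residual components.
    result = seasonal_decompose(series, model="additive", period=12)
    print(result.trend.dropna().head())
    print(result.seasonal.head())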

Educational Qualifications: Aug 2013 – Apr 2017

Bachelor’s in Electronics and Communication Engineering, affiliated with JNTUA, India.


