Venkat
dengineer***gmail.com
Profile Summary:
12+ years of experience across projects developing and supporting Power BI reports and dashboards, handling transitions, production support, client servicing, and team management in the Manufacturing, Oil & Gas, Energy & Utilities, and Banking domains.
Designed optimized data models for applications and managed projects end to end, from requirements gathering to final delivery.
Good knowledge of DAX (Data Analysis Expressions), including creating calculated columns, measures, and calculated tables; experienced with DAX functions, Power Pivot, and enhancing Power BI visualization and formatting.
Skilled in data extraction and Power Query data transformation; created row-level security in Power BI, integrated with the Power BI Service portal and gateway services, and handled user and app security.
Created Power BI dashboards, transformed manual reports into them, and supported Power BI dashboard deployment.
Expert in data modelling and writing complex DAX queries.
Optimized report performance in terms of refresh time and report size.
Designed and implemented database solutions in Azure SQL Data Warehouse and Azure SQL Database.
Implemented ad-hoc analysis solutions using Azure Data Lake Analytics/Store and HDInsight.
Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database, and SQL Data Warehouse environments.
Experienced in developing Spark applications using Spark SQL in Databricks to extract, transform, and aggregate data from multiple file formats, uncovering insights into customer usage patterns.
In-depth understanding of database management systems, online analytical processing (OLAP), and ETL frameworks.
Experienced in interacting with business stakeholders to gather business specifications and translate them into data visualizations for effective decision making.
Hands-on experience with the Hadoop stack: HDFS, Hive, Sqoop, Spark, and Scala.
Able to work with data ingestion tools such as Sqoop and Flume.
Hands-on experience with the JIRA Agile tool for Agile planning and delivery of business transformation.
Worked on integrating a wide range of source ERPs, such as SAP, Baan, and JD Edwards, as well as Salesforce, with Informatica PowerCenter.
Worked extensively on extracting, transforming, and loading data from sources such as Oracle, SQL Server, and flat files.
Strong ability to adapt to and understand new domains quickly.
Technical Skills:
BI/Reporting Applications : Power BI Desktop, Excel Services 2013, Power View.
Hadoop Ecosystem : Hive, Spark, Sqoop, Flume, Pig.
Languages : Scala, MS SQL, SAS 9.3, SAS MACRO, HTML5.
Application : Informatica 9.6, 9.5, 9.1, 8.6, SAS Enterprise Guide, Management Console.
Database : Oracle 9i, 11g, 12c, SQL Developer, SSAS (Tabular), Power Query, Power Pivot.
Cloud : AWS, Azure Data Lake, Azure Data Factory
Work Experience:
Employer: Inbiz Concepts Inc, Norwood, MA
Duration: 4/1/2023 to current
Role: Senior Data Engineer
Client: Cemex
Duration: 01/2022 – 03/2023
Role: Senior Data Engineer
Responsibilities:
Built a high-quality data lake and data warehousing team designed to scale; built cross-functional relationships with data analysts, product owners, and engineers to understand data needs and deliver on them.
Created pipelines in ADF using linked services, datasets, and pipelines to extract, transform, and load data between sources such as Azure SQL Database, Blob Storage, Azure SQL Data Warehouse, and a write-back tool, in both directions.
Deployed and tested code through CI/CD using Visual Studio Team Services (VSTS).
Responsible for ingesting data from Blob Storage into Kusto and maintaining the PPE and PROD pipelines.
Worked in an Agile development environment with two-week sprint cycles, dividing and organizing tasks accordingly.
Designed and developed Power BI graphical and visualization solutions from business requirement documents and plans for interactive dashboards. Developed shell scripts to automate ETL execution in a Unix environment.
Worked on migrating data from on-premises SQL Server to cloud databases (Azure Synapse Analytics (DW) and Azure SQL DB).
Developed Python UDFs for Pig and Hive to preprocess and filter data sets for analysis in Azure Databricks.
Migrated Azure DB to Azure Synapse.
Worked with business partners to understand business objectives, develop value-added reporting, and provide ad-hoc data extracts and analysis using Power BI.
Environment: Azure Data Factory, Azure Databricks, Azure SQL
Client: Findability Sciences Inc
Project: BI - ASM
Duration: 02/2021 – 12/2021
Role: Senior Power BI Developer
Employer: HCL
Responsibilities:
Responsible for extracting data from upstream sources and creating a tabular model.
Performed logical transformation of data using DAX.
Developed Power BI reports in Power BI Desktop by extracting data from the tabular model, and hosted those reports in the Power BI Service.
Gathered requirements from multiple stakeholders and directors; analyzed and documented functional requirements and data specifications for Power BI dashboards and reports.
Worked with business partners to understand business objectives, develop value-added reporting, and provide ad-hoc data extracts and analysis.
Built regular and ad-hoc reports in Power BI using features such as group-by, drill-down, cascading, summarized, and matrix views.
Worked extensively with Power BI features such as Power Pivot and Power View.
Hands-on experience with Power BI publishing options and dashboard creation.
Built and maintained network KPIs; analyzed progress and enablers to advance program targets.
Environment: MS Power BI, SQL Server 2008 R2, 2012, SSAS, SSIS, SSRS
Client: Exxon Mobil
Project: BI - ASM
Duration: 05/2020 – 01/2021
Role: Power BI Developer
Employer: HCL
Responsibilities:
Created dashboards and interactive visual reports using Power BI.
Identified key performance indicators (KPIs) with clear objectives and monitored them consistently.
Analyzed data and presented it through reports that aid decision making.
Converted business requirements into technical specifications and set timelines for delivery.
Created relationships between data sets and developed tabular and other multidimensional data models; created charts and data documentation explaining algorithms, parameters, models, and relations.
Designed, developed, tested, and deployed Power BI scripts and performed detailed analytics.
Performed DAX queries and functions in Power BI.
Redefined and made technical and strategic changes to enhance existing business intelligence systems.
Proactively identified bottlenecks, chronic issues, and improvement areas, and took the initiative to implement innovative solutions that reduced manual effort and led to strategic improvements.
Monitored application health and scheduled data loads.
Environment: MS Power BI, SQL Server 2008 R2, 2012, SSAS, SSIS
Client: Sempra Energy
Project: BI - ASM
Duration: 11/2018 – 05/2020
Role: Power BI Developer
Employer: HCL
Responsibilities:
Gathered Business Requirements, interacted with Business Developers, Sales Managers and Admission team to get a better understanding of the Business Processes.
Developed YoY and MoM performance charts, targets and funnels, and six-month trajectories.
Resolved the performance issues of the Reports.
Worked extensively on DAX calculations and made extensive use of Power Query.
Assisted all stakeholders in strengthening a data-driven culture and supporting data-based decision making.
Actively worked with the Vice President and International Head of Sales and prepared ad-hoc reports.
Environment: MS Power BI
Client: Nationwide Insurance
Project: CAN 360
Duration: 03/2018 – 10/2018
Role: Data Engineer
Employer: HCL
Responsibilities:
Part of a GDPR data ingestion team, responsible for ingesting data from different sources and formats into the Raw Data Layer in Parquet format using Spark.
Performed ETL tasks by creating mappings on JSON files per requirements.
Prepared the design documents and performed the unit testing.
Involved in project maintenance activities and resolved issues in the Acceptance and Production environments.
Worked with the client team and interacted with functional analysts on a daily basis to establish performance goals and suggest improvements.
Client: INSURANCE COMPANY
Project: BI-ASM
Duration: 11/2014 – 02/2018
Role: ETL Developer
Employer: HCL
Responsibilities:
Worked with business customers and source teams for requirement gathering and issue resolution.
Worked as L3 and L4 support to develop minor enhancements and small projects.
Analyzed, designed, developed, and implemented Informatica mappings.
Worked extensively in building automation scripts for simplifying the load process.
Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups in Informatica MDM Multi-Domain.
Environment: Informatica PowerCenter, Informatica Cloud, SSIS
Client: NetGear
Project: Data Analysis Report Generation
Duration: 11/2014 – 02/2018
Role: Application Engineer
Employer: CSS CORP
Responsibilities:
Created various types of reports in SAS and performed tasks such as importing raw data from Excel sheets into SAS data sets.
Validated and cleaned data so that no incorrect data remained, correcting any errors found.
Set variable properties and generated reports to the specified requirements with the data required for analysis.
Presented reports with the company logo embedded, made them browser-compatible so they could be published easily, and made them RTF-compatible so any word processor could open them.
Exported data back into Excel sheets for NetGear senior management to analyze historic transactional data, expand the customer base, and run the business profitably.
Environment: SAS Enterprise Guide, SAS Programming (9.3)
Client: Dell
Project: Promotional Optimization
Duration: 01/2011 – 08/2013
Role: Subject Matter Expert
Employer: Sitel
Responsibilities:
Extracted raw data and created SAS data sets that are required for project analysis.
Cleaned data using the DATA step and Base SAS procedures; validated, sorted, and merged data using Base SAS, including macros and SQL.
Used PROC SORT, PROC FORMAT, PROC CONTENTS, PROC TRANSPOSE, and DATA step programming.
Generated reports from the processed data using PROC REPORT and PROC TABULATE, converted the data into the required format, and sent it to the client.
Generated HTML, listing, Excel, and RTF reports presenting the findings of various statistical procedures using PROC REPORT and PROC PRINT along with SAS ODS.
Environment: SAS Enterprise Guide, SAS Programming (9.3)
Education:
Degree / Course | Institute | Duration
PGPIT, Business Intelligence | NIIT, Chennai | 08/2013 – 05/2014
B.Tech., Computer Science | Anna University, Chennai | 08/2006 – 05/2010