Krishna Mohan
Email: *******.*******@*****.***
Mobile: +133-****-****
PROFESSIONAL SUMMARY
•12+ years of experience in IT, including expertise in Azure cloud services, Big Data technologies, Business Intelligence, and Data Warehouse implementations.
•5 years of experience implementing Big Data projects using Azure Databricks.
•Skilled in building data pipelines using Azure Data Factory and Azure Databricks, and proficient in loading data into Azure Data Lake, Azure SQL Database, and Azure SQL Data Warehouse.
•Proficient in ensuring data quality and integrity through validation, cleansing, and transformation operations.
•Implemented data visualization using Power BI.
•Developed ETL transformations and validation using Spark SQL and Spark DataFrames with Azure Databricks and Azure SQL.
•Used Data Factory to create data pipelines that orchestrate data into SQL databases.
•Worked on Snowflake modeling using data warehousing techniques: data cleansing, Slowly Changing Dimensions, surrogate key assignment, and change data capture.
•Built logical and physical data models for Snowflake as per required changes.
•Defined roles and privileges required to access different database objects.
•Experienced in developing CI/CD frameworks for data pipelines and collaborating with DevOps teams for automated pipeline deployment.
•Proficient in Python scripting and SQL.
•Hands-on experience in developing large-scale data pipelines using Spark and Hive.
•Hands-on experience with version control using GitHub and with deployments via ARM templates.
•Implemented advanced data manipulation and analysis using window functions, resulting in optimized and efficient data processing and analytics pipelines (see the sketch after this list).
•Orchestrated data integration pipelines in ADF using activities such as Get Metadata, Lookup, ForEach, Wait, Execute Pipeline, Set Variable, Filter, and Until.
•Experienced in using Star and Snowflake schemas for modeling, with Fact and Dimension tables and physical and logical data modeling.
•Analyzed and presented data to support key business decisions.
•Helped the team implement key BI systems.
•Increased company efficiency by identifying and adjusting key performance indicators.
•Converted company goals and objectives into technical specifications and project timelines.
•Provided hands-on support for data collection and reporting.
•Used SQL to mine data and provide the company with beneficial information.
•Designed, created, and implemented BI solutions.
•Translated business goals into technical specifications.
•Created interactive visual reports using BI tools.
•Hands-on experience with Power BI Desktop, Service, and Report Server; built Power BI reports and dashboards on SQL Server.
•Experienced in standard Power BI visualizations such as Slicer, Card, Stacked Column Chart, Clustered Column Chart, Line and Stacked Column Chart, Line and Clustered Column Chart, Map, Pie Chart, and Bar Graph, plus drill-downs, drill-throughs, and navigation reports.
•Comfortable working with filters, calculated columns, measures, relationships, and transformations in the Edit Queries section.
•Created SQL views as per Power BI reporting requirements.
•Developed calculated measures using Data Analysis Expressions (DAX).
•Worked with on-premises gateways to refresh data sources and create live connections with SQL Server.
•Worked on DAX expressions such as filter, aggregate, and time-intelligence functions.
•Used DAX functions to create calculated columns and measures.
•Extensive experience and knowledge of charts, graphs, KPIs, marks cards, and calculated fields.
•Thorough knowledge of Data Warehouse concepts such as Star Schema, Snowflake Schema, and Dimension and Fact tables.
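Below is a minimal PySpark sketch of the window-function work described above; the dataset, column names, and derived measures are illustrative assumptions, not project code.

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("window-demo").getOrCreate()

# Hypothetical sales data; the schema is an illustrative assumption.
sales = spark.createDataFrame(
    [("North", "2024-01", 100.0), ("North", "2024-02", 150.0),
     ("South", "2024-01", 80.0), ("South", "2024-02", 120.0)],
    ["region", "month", "revenue"],
)

# Partition by region and order by month to derive a running total
# and a month-over-month delta without self-joins.
w = Window.partitionBy("region").orderBy("month")
result = (
    sales
    .withColumn("running_revenue", F.sum("revenue").over(w))
    .withColumn("prev_revenue", F.lag("revenue").over(w))
    .withColumn("mom_delta", F.col("revenue") - F.col("prev_revenue"))
)
result.show()
```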
EDUCATION
•Bachelor's Degree in Computer Science from SK University
CORPORATE AFFILIATIONS
•Worked as a Big Data Engineer at IBM from Sep 2022 to Jul 2024.
•Worked as an Azure Data Engineer at ITC InfoTech from Aug 2013 to Sep 2022.
•Worked as an OBIEE Associate at Cognizant from Nov 2011 to Aug 2013.
TECHNICAL SKILLS
Cloud Services
Azure Data Factory, Azure Databricks, Azure Data Lake Storage Gen2, Key Vault
Languages
SQL, PL/SQL, Python
IDE & Notebooks
Visual Studio, Jupyter, Databricks
Databases
SQL DB, Oracle SQL
PROFESSIONAL EXPERIENCE
Client: Shell India Pvt Ltd
Duration: Oct 2022 to Jul 2024
Role: Azure Data Engineer
Substance Volume Tracking: This project is part of Shell's lubricants business. Substance Volume Tracking helps comply with the relevant regulations by recording the quantities of tracked substances that are purchased, imported, produced, sold, or exported. The solution connects to various on-premises data sources and ingests the data into Azure Data Lake for various sites using ADF. The data is then transformed for consumption by various endpoints such as Enterprise Data Management and the Data Warehouse, with reporting in Power BI.
RESPONSIBILITIES:
•Designed and implemented scalable data ingestion pipelines using Azure Data Factory to ingest data from SQL databases and CSV files.
•Developed data processing workflows using Azure Databricks, leveraging Spark for distributed data processing and transformation tasks.
•Ensured data quality and integrity by performing validation, cleansing, and transformation operations using Azure Data Factory and Azure Databricks.
•Designed and implemented a cloud-based data warehouse solution on Azure using Snowflake, leveraging its scalability and performance capabilities.
•Created and optimized Snowflake schemas, tables, and views to efficiently store and retrieve data for analytics and reporting purposes.
•Collaborated with data analysts and business stakeholders to understand requirements and implemented appropriate data models and data structures in Snowflake.
•Developed and optimized Spark jobs for data transformations, aggregations, and machine learning tasks on large datasets.
•Maintained documentation for data pipelines, data models, and overall system architecture to facilitate collaboration and knowledge sharing.
•Configured event-based triggers and scheduling mechanisms to automate data pipelines and workflows.
•Implemented data lineage and metadata management solutions for tracking and monitoring data flow and transformations.
•Identified and resolved performance bottlenecks in data processing and storage layers, optimizing query execution and reducing data latency.
•Worked with Microsoft Azure services such as HDInsight Clusters, Blob Storage, Data Factory, and Logic Apps.
•Performed ETL using Azure Databricks and migrated on-premises data to Azure SQL DB.
•Conducted performance tuning and capacity planning exercises to ensure scalability and efficiency of the data infrastructure.
•Collaborated with DevOps engineers to develop automated CI/CD and test-driven development pipelines on Azure DevOps based on client requirements.
•Designed and developed code and scripts in PySpark for the Acquisition and Transformation layers from different sources.
•Moved all metadata files generated from various source systems to ADLS for further processing.
•Imported data from different sources such as ADLS and Blob for computation using Spark.
•Implemented Spark with Python in Databricks, using the DataFrame and Spark SQL APIs for faster data processing (see the sketch after this list).
•Used Avro, JSON, and Parquet data formats to store data in ADLS.
•Created data-driven workflows for data movement and transformation using Data Factory.
•Extracted huge files from the Data Lake using Azure Storage Explorer.
•Interacted with business users to gather business requirements and prepare high-level and low-level design documents.
•Created dashboards and interactive visuals using BI tools.
•Worked with technical teams to ensure the success of BI projects.
•Multi-tasked across projects to maintain an efficient and effective work environment.
•Performed DAX queries using Power BI tools.
•Provided hands-on support to data collection and reporting.
•Supported BI teams to help them implement important BI systems.
•Used SQL skills to mine data and provide the company with beneficial information.
•Helped solve major company issues using data as evidence-based support.
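A minimal sketch of the Databricks acquisition-and-transformation pattern referenced in this list, assuming hypothetical ADLS Gen2 paths, keys, and columns; it is illustrative, not the project's actual code.

```python
from pyspark.sql import functions as F

# On Databricks, `spark` is predefined. The storage account, containers,
# and columns below are illustrative assumptions.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/substances/"
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/substances/"

# Acquisition: read raw CSV extracts landed by ADF.
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv(raw_path))

# Transformation: validation and cleansing before curation.
curated = (raw
           .dropDuplicates(["substance_id", "movement_date"])  # assumed keys
           .filter(F.col("quantity").isNotNull())
           .withColumn("movement_date", F.to_date("movement_date"))
           .withColumn("load_ts", F.current_timestamp()))

# Persist as Parquet for downstream consumers (EDM, warehouse, Power BI).
(curated.write
        .mode("overwrite")
        .partitionBy("movement_date")
        .parquet(curated_path))
```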
Client: British American Tobacco.
Duration: Aug 2020 – Sep 2022
Role: Azure Data Engineer
British American Tobacco (BAT) is a British multinational company that manufactures and sells cigarettes, tobacco, and other nicotine products.
The solution connects to various on-premises data sources and ingests the data into Azure Data Lake for various sites using ADF. The data is transformed for consumption by various endpoints such as Enterprise Data Management, the Data Warehouse, Azure Analysis Services, and Power BI.
RESPONSIBILITIES:
•Designed and developed code and scripts in PySpark for Acquisition and Transformation from different sources.
•Moved all metadata files generated from various source systems to ADLS for further processing.
•Imported data from different sources such as ADLS and Blob for computation using Spark.
•Implemented Spark with Python in Databricks, using the DataFrame and Spark SQL APIs for faster data processing (see the sketch after this list).
•Used Avro, JSON, and Parquet data formats to store data in ADLS.
•Created data-driven workflows for data movement and transformation using Data Factory.
•Extracted huge files from the Data Lake using Azure Storage Explorer.
•Involved in the Acquisition, Transformation, and Model phases of the project.
•Monitored Spark jobs in the cluster environment and debugged failed jobs; interacted closely with business users, providing end-to-end support.
•Interacted with business users to gather business requirements and prepare high-level and low-level design documents.
•Created dashboards and interactive visuals using BI tools.
•Worked with technical teams to ensure the success of BI projects.
•Multi-tasked across projects to maintain an efficient and effective work environment.
•Provided hands-on support to data collection and reporting.
•Supported BI teams to help them implement important BI systems.
•Used SQL skills to mine data and provide the company with beneficial information.
•Helped solve major company issues using data as evidence-based support.
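A short, hypothetical sketch of persisting the same dataset in the Avro, JSON, and Parquet formats mentioned in this list; the paths are illustrative, and the Avro writer assumes the spark-avro package is available on the cluster.

```python
# On Databricks, `spark` is predefined; paths below are illustrative.
df = spark.read.parquet(
    "abfss://raw@examplestorage.dfs.core.windows.net/shipments/")

base = "abfss://curated@examplestorage.dfs.core.windows.net/shipments"

# Parquet: columnar, the usual choice for analytics consumers.
df.write.mode("overwrite").parquet(f"{base}/parquet/")

# JSON: line-delimited text, convenient for interchange.
df.write.mode("overwrite").json(f"{base}/json/")

# Avro: row-oriented with schema-evolution support (needs spark-avro).
df.write.mode("overwrite").format("avro").save(f"{base}/avro/")
```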
Client: Sri Lankan Airlines
Duration: May 2017 – Aug 2020
Role: BI Developer
Sri Lankan Airlines, the national airline of Sri Lanka, is an award-winning carrier with a firm reputation as a global leader in service, comfort, safety, reliability, and punctuality.
Launched in 1979, Sri Lankan is currently expanding and further diversifying its wide range of products and services in order to drive the country's ongoing boom in tourism and economic development.
The SLA EDW implementation involved designing a data warehouse for the reservation system, revenue accounting, and inventory, which gave the business insight into its data. Most of the reports delivered are analytical reports that the business uses in day-to-day operations and in monthly and annual general body meetings.
RESPONSIBILITIES:
•Gathered and analyzed user requirements, and worked closely with business analysts and the client to create technical specifications.
•Designed and developed the OBIEE Metadata Repository using the OBIEE Admin tool by importing the required objects (Dimensions and Facts) with integrity constraints into the Physical Layer.
•Developed multiple Dimensions with multiple drill-down hierarchies in the Business Model Layer and customized Presentation catalogs in the Presentation Layer.
•Developed Time Series objects for computing Year Ago, YTD, MTD, WTD measures.
•Built Interactive Dashboards with drill-down capabilities, use of global prompts and Presentation variables.
•Developed many reports using different Analytics views (Pivot Table, Chart, and Column Selector), and dynamic/interactive dashboards with drill-down capabilities, charts, and tabular views using global and local filters.
•Extensively used OBIEE Delivers and iBots.
•Worked on data level security and object level security.
•Worked on usage tracking and developed reports required by users for usage tracking.
•Debugged report and dashboard visibility with respect to user responsibilities and web groups in an integrated environment.
•Coordinated the Analytics RPD and Web Catalog objects migration from Dev environment to QA environment and to the Production environment.
•Performed Unit, Integration testing to validate report and mapping functionality.
Client: Air Asia Airlines
Duration: Mar 2015 – Apr 2017
Role: BI Developer
Air Asia: Oracle Airline Data Model (Oracle ADM) implementation with AirAsia, the largest low-cost airline. A data warehousing solution helps them bring data from upstream systems and facilitates DSS by providing analytical reports and dashboards. Data from various source systems (Navitaire and flat files) is brought together on a daily/monthly basis to serve the reports.
RESPONSIBILITIES
•Understood and analyzed the business requirements.
•Participated in requirements gathering and review.
•Analyzed gaps and provided the required data model.
•Worked on Oracle OLAP cubes using Analytic Workspace Manager.
•Developed and validated BI technical design documents from business specification documents.
•Identified dimension and fact tables and modeled them into a star schema, developing dimensional hierarchies, level-based measures, and summary tables as necessary.
•Created and maintained cubes using daily and incremental loads.
•Designed and developed the OBIEE Metadata Repository using the OBIEE Admin tool by importing the required objects (Dimensions and Facts) with integrity constraints into the Physical Layer.
•Developed Time Series objects for computing Year Ago, YTD, MTD, WTD measures.
•Experience in developing various dynamic and interactive complex reports/dashboards.
•Worked on data level security and object level security.
•Worked on usage tracking and developed reports required by users for usage tracking.
•Created contexts, logical schemas, and physical schemas, and mapped logical schemas to different physical schemas under Topology.
•Knowledgeable about scenarios and their execution.
•Worked on table-to-table and flat-file-to-database loading.
•Developed SQL queries for data quality checks (see the sketch below).
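A minimal sketch of such data-quality checks. On the project these ran as SQL against the warehouse; here they are expressed through PySpark with a hypothetical bookings extract so the example is self-contained.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical bookings extract; schema and values are illustrative.
spark.createDataFrame(
    [(1, "KUL", "SIN", 199.0),
     (2, "KUL", None, 150.0),
     (2, "KUL", "SIN", 150.0)],
    ["booking_id", "origin", "destination", "fare"],
).createOrReplaceTempView("bookings")

# Duplicate-key check: any booking_id appearing more than once fails.
spark.sql("""
    SELECT booking_id, COUNT(*) AS cnt
    FROM bookings
    GROUP BY booking_id
    HAVING COUNT(*) > 1
""").show()

# Completeness check: mandatory columns must not be NULL.
spark.sql("""
    SELECT COUNT(*) AS null_destinations
    FROM bookings
    WHERE destination IS NULL
""").show()
```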
Client: Sri Lankan Airlines
Duration: Aug 2013 – Feb 2015
Role: BI Developer
Sri Lankan Airlines, the national airline of Sri Lanka, is an award-winning carrier with a firm reputation as a global leader in service, comfort, safety, reliability, and punctuality. Launched in 1979, Sri Lankan is currently expanding and further diversifying its wide range of products and services to drive the country's ongoing boom in tourism and economic development.
The SLA EDW implementation involved designing a data warehouse for the reservation system, revenue accounting, and inventory, which gave the business insight into its data. Most of the reports delivered are analytical reports that the business uses in day-to-day operations and in monthly and annual general body meetings.
RESPONSIBILITIES
•Gathered and analyzed user requirements, and worked closely with business analysts and the client to create technical specifications.
•Designed and developed the OBIEE Metadata Repository using the OBIEE Admin tool by importing the required objects (Dimensions and Facts) with integrity constraints into the Physical Layer.
•Developed multiple Dimensions with multiple drill-down hierarchies in the Business Model Layer and customized Presentation catalogs in the Presentation Layer.
•Developed Time Series objects for computing Year Ago, YTD, MTD, WTD measures.
•Built Interactive Dashboards with drill-down capabilities, use of global prompts and Presentation variables.
•Developed many Reports using different Analytics Views (Pivot Table, Chart, and Column Selector), Dynamic / Interactive Dashboards with drill-down capabilities, charts, tabular using global and local Filters.
•Extensively used OBIEE Delivers and iBots
•Worked on data level security and object level security.
•Worked on usage tracking and developed reports required by users for usage tracking.
•Debugged report and dashboard visibility with respect to user responsibilities and web groups in an integrated environment.
•Coordinated the Analytics RPD and Web Catalog objects migration from Dev environment to QA environment and to the Production environment.
•Performed Unit, Integration testing to validate report and mapping functionality.
Client: Fluke Corporation
Duration: Nov 2011 – Aug 2013
Role: BI Developer
Fluke Corporation, a subsidiary of the Danaher Corporation, is a manufacturer of industrial testing equipment, including electronic test equipment. Fluke is a global corporation with operations worldwide. It designs, develops, manufactures, and sells commercial electronic test and measurement instruments for scientific, service, educational, industrial, and government applications. Fluke Biomedical and Fluke Networks are sister organizations.
This Project aims at providing the production support and maintenance of their BI applications and database systems. The support work includes addressing and consulting the end users on the day-to-day BI application queries. This project also aims at enhancement of the BI applications as per the business needs.
This project involves technical design, build & unit test of the Siebel analytics application and bug fixing during Integration Test, UAT and post go-live support.
RESPONSIBILITIES
•Worked as a BI Developer.
•Provided consultation to end users on their day-to-day BI system queries.
•Monitored the OBIEE servers and environments, including server performance and maintenance.
•Managed user permissions and access control by controlling users' web group access.
•Managed and monitored users' OBIEE reports, dashboards, and iBots.
•Managed and cleaned the OBIEE web catalog by removing inactive users from the system.
•Managed and monitored user license usage to ensure efficient utilization of the BI applications.
•Developed dashboards, reports, and iBots based on business requirements as part of application enhancement.
•Developed and modified Informatica mappings for application development and enhancement, and monitored mapping loads.
•Developed, updated, and maintained PL/SQL table, view, and package scripts in a CVS repository.