Yekanth
Email Id: *********@*****.*** Phone no: +447*********
Career Summary:
• A seasoned, strategic data developer with over 4 years of hands-on experience in cloud computing and DevOps, specializing in Microsoft Azure as well as AWS platforms.
• Expertise in designing, architecting, and implementing scalable cloud and on-premises data platform solutions, data migrations, and automation frameworks.
• Experience using ADLS and Delta Lake as sources, pulling data with ADF, Databricks, and Synapse, using Azure Blob Storage for storage, and performing analytics in Azure Synapse Analytics.
• Extensive experience in data modelling and schema design, specializing in Star Schema and Snowflake Schema techniques for creating efficient and scalable data models.
• Skilled in building and managing CI/CD pipelines using Azure DevOps and GitHub to streamline application code deployments across both cloud and on-premises environments.
• Demonstrated expertise in designing, developing, and implementing data and analytics solutions using tools such as SSRS, Power BI, Tableau, and Looker.
• Skilled in implementing data quality and governance measures, including data lineage, impact analysis, and metadata management, to enhance data discovery and compliance.
• Excellent interpersonal, time management, and problem-solving skills in collaborative teams.
Technologies
• Data Analytics: Tableau, SSRS, Power BI, Qlik
• ETL: SSIS, Power Query, ADF, Synapse, Databricks, Fabric
• DWH: MySQL, Oracle, MongoDB, Snowflake, ADLS, Delta Lake, Azure SQL DB
• Languages: SQL, Python, PySpark, DAX, MDX, Power Query, BigQuery
• SDLC: Jira, CI/CD, DevOps, Visual Studio, Power Automate
Professional Work Experience
Organization: Flash Tech IT [Data & Analytical Engineer] (Apr 2024 to Sep 2025)
• Worked with cross-functional department users to gather and understand requirements related to DWH, ETL/ELT, and BI, and converted them into technical solutions.
• Ingested data in mini-batches and performed RDD transformations on those mini-batches using Spark Streaming for streaming analytics in Databricks.
• Created and provisioned different Databricks clusters needed for batch and continuous streaming data processing and installed the required libraries for the clusters.
• Created Databricks Spark jobs with PySpark to perform several table-to-table operations.
• Utilized the Databricks Delta Lake storage layer to create versioned Apache Parquet (Delta) files with a transaction log and audit history, and to perform ACID transactions.
• Integrated REST APIs with Azure SQL Database and Azure Blob Storage for real-time data access and storage, while using Pydantic models for structured data interactions.
• Successfully executed a proof of concept for Azure implementation, with the wider objective of migrating on-premises servers and data platforms to the Azure cloud.
• Processed structured and semi-structured data into Spark Clusters using Spark SQL and Data Frames API, enabling efficient ETL workflows, data transformations, and aggregations.
• Developed, maintained, and managed advanced reporting, data cleaning, report design, analytics, dashboards, and other BI solutions using Power BI and Tableau.
• Evaluated current scheduling practices to identify business requirements and translated them into technical requirements to enhance and create a new custom Power BI report.
Organization: Mars Info, Client: Varo Bank [Data Engineer] (May 2022 to Aug 2023)
• Was responsible for working with stakeholders to gather requirements, own the solutions, and design end-to-end data platform solutions, including DWH, ETL, and BI.
• Configured and optimized ADF pipelines for high performance and scalability, leveraging features such as parallel processing, data partitioning, and dynamic data flow activities.
• Utilized ADF pipelines and Databricks notebooks for data extraction from various semi-structured sources, including file-based sources such as CSV, Parquet, and JSON files.
• Ingested data into Azure services such as Azure Data Lake, Azure Storage, Azure SQL, and Azure Data Warehouse, and performed data processing tasks in Azure Databricks.
• Worked with Microsoft Azure services including HDInsight Clusters, Blob Storage, Data Factory, and Logic Apps, and conducted proof of concept (POC) projects on Azure Databricks.
• Worked on RDDs and DataFrames (Spark SQL) using PySpark for analyzing and processing data.
• Worked extensively with Spark Context, Spark SQL, RDD transformations, and DataFrames.
• Developed effective ETL packages with SSIS and ADF pipelines for Fact and Dimension tables, incorporating complex transformations and SCD type 1 and 2 changes.
• Executed in-depth data analysis using Power BI, SSRS, and Tableau to uncover trends, patterns, and actionable insights, greatly enhancing strategic decision-making.
• Responsible for resolving report defects in the dashboard application, publishing reports to testing and production environments, and facilitating the testing process.
Organization: Mars Info, Client: Lineage Logistics [Data Engineer] (Sep 2021 to Apr 2022)
• Collaborated with business stakeholders to comprehend reporting needs and requirements, ensuring the alignment of BI, ETL, and data storage solutions with business goals.
• Enhanced and deployed SSIS packages from the development server to the production server and scheduled jobs for executing the stored SSIS packages.
• Configured Database Mail with SQL Server Agent to send automatic email notifications when SSIS packages fail or succeed.
• Engineered interactive Power BI reports with drill-down capabilities and custom visuals, significantly amplifying data exploration and user engagement.
• Architected a logical dimensional model for a new data mart using Erwin and Kimball methodology for DWH design, facilitating enhanced data organization and accessibility.
• Designed and developed fact and dimension tables with varying granularity levels to populate the data mart from the existing data warehouse, optimizing data structure.
• Created Power BI and SSRS reports using report parameters, drop-down parameters, multi-valued parameters, matrix reports, and charts, and debugged parameter issues.
• Created data marts and multi-dimensional models such as Star schema and Snowflake schema.
Organization: Mars Info, Client: T-Mobile [BI Developer] (Feb 2021 – Sep 2021)
• Developed ETL workflows with SSIS to process Fact and Dimension tables with complex transformations and SCD Type 1/Type 2 logic, automating workflows using SQL Agent Jobs.
• Generated on-demand and scheduled reports for business analysis or management decisions using SQL Server Reporting Services, Power BI, and Tableau Desktop.
• Designed different types of KPIs for percentage growth across different fields, compared them over time, and designed aggregations and pre-calculations in SSAS.
• Worked on transforming data into user-friendly visualizations to give business users a complete view of their business using Power BI, SSRS, and Excel.
• Created different types of reports such as Crosstab, Drill-down, Summary, Parameterized, and Sub-reports, and formatted them using SSRS and Power BI.
• Designed Power BI reports based on business requirements and deployed the reports to workspaces and automated report datasets using Power BI Gateways.
• Was responsible for the installation and configuration of gateways to schedule Power BI dataset refreshes.
Educational Qualifications
• Master's Degree in Computer Science.
• Bachelor's degree in Information Technology.
Professional Certifications
• AWS Academy Cloud Foundations.
• Big Data Computing – NPTEL.
• Microsoft Technology Associate – Microsoft.
• Microsoft Certified: Azure Data Engineer Associate (DP-203)