Akanksha T
Data & BI Engineer
Mobile: 0-776-***-**** Email: *************@*****.***
Professional Career Summary
o Over 4 years of experience as a Data and BI Engineer, designing, developing, and maintaining cloud-based ETL pipelines, data lakes, and data warehouse solutions for enterprise clients.
o Skilled in implementing metadata-driven pipelines, PySpark big data workflows, and Python automation scripts, optimizing data processing and improving pipeline efficiency across large datasets.
o Proficient in Power BI, Tableau, SSAS, and SAP BO for creating interactive dashboards and multi-dimensional reports, enabling actionable insights for business stakeholders and decision-makers.
o Experienced in data governance and security frameworks (IAM, RBAC, Purview) and GDPR/PII requirements, ensuring compliance, controlled access, and protection of sensitive enterprise data.
o Adept at collaborating with stakeholders and cross-functional teams, gathering requirements, translating business needs into technical solutions, and delivering high-quality, scalable analytics solutions.
Career Summary
A results-driven data engineer with a proven ability to deliver robust, scalable, and high-quality data solutions across enterprise environments. Skilled at collaborating with stakeholders to translate business requirements into efficient data pipelines and analytical workflows, ensuring accurate, timely, and actionable insights. Adept at implementing end-to-end data processing and reporting frameworks that streamline operations, enhance decision-making, and support organizational goals. Demonstrates a strong commitment to data governance, process standardization, and continuous improvement, consistently delivering solutions that optimize workflows and drive business value.
Technical Skills
Cloud Platforms: Azure, AWS
Big Data & Processing: PySpark, Spark, Delta Lake, Streaming Analytics, Event Hub
Programming Languages: PySpark, Python, SQL, GraphQL, KQL
ETL & Data Integration: SSIS, ADF, Databricks, Fabric, Synapse
Business Intelligence: Power BI, Tableau, SSRS, SAP BO
Data Storage: SSAS, SAP BW, SQL Server, ADLS, Delta Lake, OneLake, Synapse Pools
Data Governance & Security: IAM, RBAC, Azure Purview
Version Control & DevOps: Git, Azure DevOps, CI/CD Pipelines
Methodologies: Agile, Waterfall, Dimensional Modeling
Professional Work Experience
Client: Rhenus, Employer: Sun Tech
Duration: Mar 2024 to Aug 2025
Role: Data Engineer
Responsibilities:
o Partnered with stakeholders and business teams to understand complex enterprise data requirements, ensuring ETL workflows and data models were optimized for scalability, accuracy, and efficiency.
o Implemented new end-to-end pipelines using Databricks, Fabric, Unity Catalog, Synapse, Python, and PySpark, enabling seamless data ingestion, transformation, and storage for cross-department analytics.
o Leveraged PySpark for distributed big data processing, optimizing ETL operations and reducing runtime for large datasets stored in Delta Lake and ADLS, improving workflow reliability.
o Developed Python scripts for automation, including data validation, test execution, and monitoring of ETL processes, minimizing manual intervention and enhancing operational efficiency.
o Built metadata-driven pipelines in ADF and Databricks integrated with Delta Lake and Azure Purview, ensuring consistency, maintainability, and traceability across enterprise data workflows.
o Designed and implemented data lake and data warehouse architectures to consolidate and optimize storage, enabling faster access to analytics-ready datasets for reporting and BI.
o Created Power BI dashboards that visualized KPIs and operational metrics, empowering business users to make timely, accurate, data-driven decisions across multiple functional areas.
o Applied data governance practices using IAM and RBAC, securing access to sensitive datasets and ensuring full compliance with corporate security, privacy, and data policies.
o Migrated legacy CRM data into Dataverse and Azure SQL DB, standardizing formats, validating accuracy, and enabling seamless integration with modern analytics solutions.
o Coordinated with DevOps teams to integrate pipelines into CI/CD workflows, automating deployments and version control across development, test, and production environments.
o Tuned complex SQL queries and transformations to enhance performance, enabling fast, accurate, and reliable data retrieval for reporting and analytics workflows.
o Conducted multiple requirement-gathering workshops to align data pipelines, reporting dashboards, and governance frameworks with enterprise business objectives and operational analytics needs.
Client: CoreSite, Employer: Envision
Duration: Oct 2021 to Dec 2022
Role: Data Engineer
Responsibilities:
o Engaged with business users to gather and document requirements, ensuring that data pipelines captured transactional, operational, and CRM datasets accurately for analytics and reporting purposes.
o Implemented new ETL workflows leveraging CosmosDB, Databricks, Synapse, ADF, and Python, enabling efficient processing and transformation of diverse enterprise datasets.
o Built Airflow workflows to orchestrate complex ETL processes across Azure SQL DB, Databricks, and Synapse, reducing manual scheduling errors and increasing pipeline reliability.
o Integrated AWS S3 and Redshift data into ADLS, creating a hybrid cloud ecosystem that supported analytics, reporting, and downstream BI workflows across platforms.
o Optimized PySpark ETL jobs for large-scale data transformations, improving throughput and enabling timely availability of clean datasets for business intelligence consumption.
o Developed Python automation scripts for data validation, error handling, and ETL testing, ensuring consistent and accurate pipeline execution across multiple environments.
o Designed metadata-driven pipelines in ADF and Databricks with Delta Lake and Azure Purview, standardizing source-to-target processes for reusability and maintainability.
o Delivered Power BI dashboards combining multiple sources, including CosmosDB, Redshift, and Azure SQL DB, providing unified and interactive reporting for business users.
o Applied data governance and security frameworks (IAM, RBAC, Purview), securing sensitive datasets and ensuring full regulatory compliance and enterprise operational security.
o Tuned SQL transformations and queries for performance optimization, supporting faster data extraction, processing, and reporting for analytics workflows.
o Managed CRM data migrations into Azure SQL DB and Delta Lake, improving data consistency, accuracy, and downstream usability for analytics and operational reporting solutions.
o Conducted requirement validation sessions with cross-functional teams to ensure ETL, reporting, and governance solutions met technical and business expectations effectively.
Client: Flosonics Medical, Employer: Envision
Duration: Dec 2020 to Oct 2021
Role: Data Analyst
Responsibilities:
o Collaborated with business stakeholders and SAP users to gather data requirements, ensuring ETL workflows, reporting processes, and analytics pipelines aligned with business priorities.
o Developed and deployed SSIS packages to extract, transform, and load data from multiple SAP BW and ERP systems into Azure SQL DB and ADLS, ensuring high data accuracy and availability.
o Built and maintained SSAS cubes for enterprise OLAP reporting, enabling multi-dimensional analysis and empowering finance, operations, and supply chain teams with actionable insights.
o Designed SAP BW to Azure Data Lake integration pipelines using ADF and Databricks, modernizing legacy workflows while enabling cloud-based analytics and hybrid reporting solutions.
o Automated ETL testing and data validation using Python scripts, reducing manual reconciliation, increasing pipeline reliability, and ensuring data quality across production workflows.
o Created metadata-driven ETL pipelines with ADF, Databricks, and Delta Lake, providing reusable and traceable frameworks for enterprise-wide data integration and transformation.
o Developed SAP BO and Power BI dashboards combining on-premises and cloud data, offering interactive visualizations and consolidated reporting for cross-functional teams.
o Applied data governance and version control using Git and RBAC policies, maintaining secure access to pipeline code, protecting sensitive data assets, and ensuring compliance.
o Migrated critical finance, HR, and operational datasets into Azure Synapse Analytics, ensuring a consolidated reporting environment and streamlined analytics workflows.
o Optimized SQL queries and transformations for large datasets, improving performance, reducing ETL runtime, and enabling faster access to reporting and analytics data.
o Coordinated requirement gathering and validation sessions with multiple departments to ensure ETL design, reporting solutions, and governance practices aligned with business objectives.
o Delivered enterprise-scale data warehouse and data lake architectures integrating Delta Lake with SAP BW schemas, providing a unified analytics platform for both cloud and on-premises data.
Client: BNSF Railway, Employer: Envision
Duration: May 2020 to Nov 2020
Role: BI Analyst
Responsibilities:
o Collaborated with business stakeholders to gather and document requirements, designing ETL pipelines and reporting solutions that improved operational efficiency and analytics accuracy.
o Developed and maintained SSIS packages to extract, transform, and load data from multiple SAP modules and external sources into centralized data warehouses, ensuring clean, consistent, analytics-ready datasets.
o Built and managed SSAS cubes to enable multi-dimensional reporting, allowing finance, operations, and sales teams to perform advanced analytics and generate actionable insights efficiently.
o Designed, developed, and deployed SSRS reports and Tableau dashboards to deliver interactive visualizations and consolidated business reporting for executives and management teams.
o Leveraged Alteryx to automate repetitive data preparation, blending, and cleansing processes, reducing manual effort, improving data quality, and accelerating reporting timelines across business units.
o Built metadata-driven ETL pipelines in SSIS and Alteryx, ensuring reusability, traceability, and efficient execution of large-scale data workflows for enterprise reporting and analytics.
o Optimized SQL queries, stored procedures, and data transformations across large datasets, improving performance and providing faster access to business intelligence outputs.
o Implemented data governance frameworks, including role-based access control and audit trails, ensuring secure access to SAP and reporting systems while maintaining compliance with organizational policies.
o Conducted requirement gathering and validation sessions with cross-functional teams to ensure BI and ETL solutions aligned with business goals and operational objectives.
o Integrated multiple SAP and external data sources into Power BI and Tableau dashboards, providing consolidated, interactive, and actionable insights for finance, operations, and sales teams.
o Automated reporting workflows using SSIS, Alteryx, and scheduled jobs, improving refresh cycles, reducing errors, and ensuring timely availability of reports to stakeholders.
o Delivered scalable data warehouse and reporting architectures combining SSAS, SAP, and cloud analytics tools to create a unified environment supporting enterprise-wide analytics and operational reporting.
Professional Certifications
Certified Azure Data Engineer Associate – DP-203
Certified Power BI Data Analyst – PL-300
Educational Qualifications
Bachelor’s in Computer Science – 2020
Sumathi Reddy Institute of Technology for Women, Warangal
Master’s in Advanced Computer Science with Advanced Practice – 2025
Northumbria University, Newcastle