*********@*****.***
https://www.linkedin.com/in/paulamibandyopadhyay
Professional Summary
Results-driven IT professional with 14 years of experience building data-intensive applications. Proficient in ETL solutions, orchestrating robust workflows with Azure services and Databricks notebooks to reduce data processing times by 40% and improve query performance by 25%. Led the planning and development of a Data Warehouse, optimizing data access, reporting, and analytics for over 50 stakeholders and boosting report efficiency by 30%. Led the migration of an on-premises Data Warehouse to Azure, completing it within a stringent timeline and 15% under budget while delivering operational reports and analytical dashboards that improved decision-making speed by 20%. Translates business requirements into impactful, quantifiable data solutions, improving data integrity and compliance metrics by 35%. Has led cross-functional teams of more than 10 members, meeting project milestones and fostering collaboration.
Skills
Tools & Technologies: Databricks, Azure Services, Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure DevOps, Microsoft Fabric, Azure Synapse, Delta Live Tables, SSIS/SSAS/SSRS, SQL, DBT, CA ERwin, Informatica, Talend, DAX, Power Query, Tableau, Python, Scala, PySpark, GitHub, Power BI, Oracle, VB, C#.Net, IBM DB, IBM Mainframe, ClearCase, JIRA, Rally, ServiceNow, ClearQuest, TFS, Change Management, BI Reporting, Dynatrace
Competencies: Data Engineering, Data Analytics, Data Architecture & Modelling, Data Visualization, Data Migration, Data Warehousing, Data Governance, Cross-functional Technical Leadership, Retrospective Analysis, Agile, Waterfall, Problem Solving
Work Experience
Smiley Technologies Inc, Little Rock, AR, Nov ‘24 – Jan ‘25
Senior Data Engineer
Automation of ADF and Databricks Workflows using Airflow/Prefect
Smiley provides a total core banking software and services solution for community banks and financial institutions. In this project, I enhanced existing Azure Data Factory (ADF) pipelines and developed new Databricks pipelines, using external orchestration to define tasks and manage workflows across multiple tools, platforms, and environments. I implemented a web-based UI for real-time workflow monitoring (Azure Monitor/Databricks audit logs), log analysis, and efficient troubleshooting of failed tasks. I also automated Databricks job creation with Python scripting, which improved disaster recovery processes and reduced development time by 20%, and used DBT for data modeling to ensure seamless integration with Power BI.
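As an illustration of the job-creation automation described above, a minimal Python sketch using the Databricks Jobs 2.1 REST API; the host/token environment variables, notebook path, and cluster spec are assumptions for the example, not the project's actual configuration.

import os
import requests

# Hypothetical workspace host and token, read from the environment.
HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-123.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]

def create_notebook_job(name: str, notebook_path: str) -> int:
    """Create a single-task notebook job via the Jobs 2.1 API; returns the job_id."""
    payload = {
        "name": name,
        "tasks": [{
            "task_key": "main",
            "notebook_task": {"notebook_path": notebook_path},
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 2,
            },
        }],
    }
    resp = requests.post(
        f"{HOST}/api/2.1/jobs/create",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["job_id"]

if __name__ == "__main__":
    # Hypothetical notebook path; real jobs were rebuilt from versioned config for DR.
    print(create_notebook_job("dr-rebuild-demo", "/Repos/etl/ingest_daily"))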
Milliman Inc, Chicago, IL, Dec ‘21 – Oct ‘24
Senior Data Engineer
Building Modern Automated Data Management Platform for Venerable
Following Voya Financial's acquisition by Venerable Holdings, a $5M partnership with Milliman was initiated to clean, transform, and integrate large volumes of diverse data. This effort enabled Power BI reporting, allowing actuaries to focus on modeling, and improved processing timelines by 30%.
• As Lead Data Engineer, collaborated with stakeholders and designed a scalable data pipeline architecture employing a lean DataOps approach, ingesting a large volume of high-frequency data and standardizing and centralizing data tables for quicker integration with Power BI.
• Implemented Auto Loader to detect and process new files automatically, added data quality checks via Databricks dashboards for early discrepancy detection, and integrated industry benchmarks into workflows for enhanced credibility, reducing task completion delays by 15% (see the ingestion sketch after this list).
• Collaborated with data scientists to prepare data models for Power BI, designing interactive dashboards that met stakeholder needs, and used parallelism across Spark partitions to raise throughput by 20%, while maintaining separate development environments.
• In coordination with the DevOps team, ensured compliance with data governance regulations using Unity Catalog.
• Resolved data issues with ceding companies through dynamic validations in Power BI reports, and supported external SOC audit controls, driving $1M in additional revenue.
• Managed a 5-member team, providing constructive feedback and encouraging innovative thinking within the team.
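A minimal sketch of the Auto Loader ingestion pattern referenced in the bullets above, assuming Databricks Runtime; the ADLS container names, file format, and partition count are illustrative assumptions, not the project's actual values.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical landing and target paths on ADLS.
LANDING = "abfss://landing@account.dfs.core.windows.net/policies/"
TARGET = "abfss://curated@account.dfs.core.windows.net/delta/policies"

# Auto Loader ("cloudFiles") discovers and processes new files incrementally.
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", TARGET + "/_schema")
    .option("header", "true")
    .load(LANDING)
)

# Repartitioning spreads the write across executors, the Spark-partition
# parallelism noted above; 16 is an arbitrary illustrative value.
(
    stream.repartition(16)
    .writeStream.format("delta")
    .option("checkpointLocation", TARGET + "/_checkpoint")
    .trigger(availableNow=True)
    .start(TARGET)
)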
Legacy Modernization of Life and Annuity Predictive Analytics (LAPA) Infrastructure
As part of the acquisition of Ruark Consulting, this project migrated over 5 TB of data for health, annuity, and mortality experience studies covering over 16 million members from on-premises SQL Server/SSIS to Azure Data Lake/Databricks using ADF pipelines.
• Led the legacy modernization team, developing a staggered migration plan that ensured zero downtime and supported on-schedule delivery for over 20 clients.
• Managed an Azure DevOps board to track project progress, maintaining reports and artifacts for a $2M project.
• Coordinated with the DevOps & Infrastructure team to set up secure Azure accounts and endpoints, facilitating data transfers between domains with a 100% compliance rate and achieving a 40% improvement in data retrieval times.
• Collaborated with Actuarial Analysts and Data Scientists to design data flow diagrams through reverse engineering, outlining new Azure pipelines that enhanced process efficiency by 25%.
• Transformed SSIS packages into Databricks notebooks (see the translation sketch below), maintaining a GitHub code repository with over 100 versions.
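An illustrative sketch of the SSIS-to-notebook translation pattern mentioned in the last bullet: an SSIS Lookup plus Derived Column data flow rewritten as a PySpark join and column expression. The table and column names are hypothetical, not the project's actual schema.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical staging and dimension tables standing in for SSIS OLE DB sources.
claims = spark.table("staging.claims")
members = spark.table("dim.members")

# SSIS Lookup transform -> left join against the dimension table;
# SSIS Derived Column -> withColumn expression.
enriched = (
    claims.join(members, on="member_id", how="left")
    .withColumn("claim_age_days", F.datediff(F.current_date(), F.col("claim_date")))
)

# SSIS OLE DB destination -> managed Delta table write.
enriched.write.mode("overwrite").saveAsTable("curated.claims_enriched")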
Ruark Consulting LLC (Acquired by Milliman Inc), Simsbury, CT, Dec ‘20 – Nov ‘21
Data Engineer
Experience Studies – Variable Annuity / Fixed Indexed Annuity / Mortality
Ruark Consulting, LLC, an actuarial firm focused on principles-based insurance data analytics and risk management, leveraging over 12 years of data covering $1.1 trillion in annuity account values, wanted to improve maintainability and performance by centralizing data, streamlining business rules, and automating processes.
• Managed and refined data architecture/models for experience studies and predictive analytics, enhancing data efficiency by 30%. Employed industry best practices, achieving a 20% reduction in data processing times.
• Identified enhancements that led to a 15% increase in operational productivity for actuaries, improved data access for research opportunities, and delivered a scalable platform for future growth.
• Designed and implemented data-driven client reporting tools, improving reporting accuracy by 40%.
• Collaborated with external data contributors to enhance data integrity, ensuring secure access for over 60 users.
Infosys Ltd - Sep ‘13 – Nov ‘20
Data Engineer
Client- The Hartford Financials, Hartford, CT
Building a new Group Benefit Data Warehouse
The project focused on building a centralized Data Warehouse for Group Benefits applications. After a comprehensive analysis of policy and claim systems, I designed the pipeline to ingest, clean, reorganize, and load data from various sources into a centralized DW for efficient reporting and analysis, maintaining strict accuracy through ETL processes built in Talend.
• Designed the DW using dimensional modelling to seamlessly retrieve data sourced in diverse formats, enhancing data accessibility by 40%.
• Created data models for the OLAP database, building a semantic layer (Business Universe) and developing BI reports, improving processing efficiency by 30%.
• Used SQL Profiler for optimization and performance tuning, achieving a 25% reduction in load times.
Client- Aetna Health Insurance, Hartford, CT
Maintenance of Financial processing/reporting in Aetna Group Insurance Domain
• Developed a Data Rep sourcing multiple applications, normalizing data and enhancing consistency by 35%, and created conceptual, logical, and physical models for an end-to-end ETL solution feeding a domain-specific Data Mart for financial processes ranging from enrollment to benefit payment, reducing processing times by 30%.
• Developed and scheduled reports in Tableau and used Workflow Monitor to optimize performance and dependency management, improving report efficiency by 25% and supporting financial analysis for over 50 key stakeholders.
Building a new Life Death Master - Social Security Death Index Application
Companies pay billions annually in life insurance claims and must periodically search for unclaimed benefits as required by law. Our project reduced manual effort, enhanced data accuracy, and generated efficient reports through a new ETL/reporting pipeline, improving data load efficiency by 30% and reducing response times by 25%.
Data Migrator Application - Aetna Administrative System
Data Migrator, a repository for member demographic data, interfaces with Claims, Billing, and Group Insurance systems. Developed an optimized ETL process to populate OLAP databases, supporting the generation of multi-dimensional reports and cubes and improving data access efficiency by 30%.
Cognizant Technology Solutions - Jul ‘10 – Aug ‘13
Integration Lead
Client- Wells Fargo, SFO, CA
Mainframe Rehosting Project - Focused on modernizing the Core Banking System from the existing legacy mainframe system.
Client- Manulife Financial, Boston, MA
Life Systems - Agency - Application developer and unit & system tester for batch and online applications; supported ad-hoc enhancements and break-fix service requests, stringently following the SDLC cycle.
Education
• Bachelor of Technology in Computer Science and Engineering, West Bengal University of Technology, Kolkata, India, with a 9.02 DGPA.
Certifications
• Databricks Certified Data Engineer Professional
• AZ-900 Microsoft Certified Azure Fundamentals
• SAFe 5 Agilist from Scaled Agile Inc.
• AHIP Part A - Health Insurance Principles - HIPAA
• LOMA 280 - Principles of Insurance-Life, Health and Annuities