Chandrakant Gajjar
Databricks Certified Data Engineer Professional
*********@*****.*** 302-***-****
PROFESSIONAL SUMMARY:
Dynamic Senior Data Engineer and Databricks Certified Data Engineer Professional with 20+ years of total IT experience, including 10+ years architecting and optimizing scalable, high-performance data solutions. Proven expertise in Databricks, PySpark, SQL and PL/SQL, with a strong track record of improving data integrity and scalability through multi-layered Medallion Architecture implementations. Led the end-to-end development of interactive Power BI dashboards, delivering 99% reporting accuracy and actionable insights. Skilled in managing cross-functional, global teams and driving efficient project delivery in fast-paced environments. Recognized for strong analytical thinking, technical leadership, and the ability to translate complex data challenges into impactful business outcomes.
SKILLS:
Data Strategy & Governance: Master Data Management (MDM), CDC (Spark Structured Streaming, GoldenGate), Data Quality KPIs, Unity Catalog, Data Lineage, Data Glossary, Governance Frameworks, PHI & HIPAA Compliance
Data Engineering & Platforms: Databricks (Delta Live Tables, Autoloader, Medallion Architecture, DABs), Oracle (Exadata, PL/SQL), PostgreSQL, AWS S3, ADLS Gen2, Azure Data Factory, Python, PySpark, SparkSQL, SQL, Shell Scripting, Pandas, Kafka Streaming, RESTful API
Data Architecture & Analytics: Data Lake / Lakehouse / Delta Lake Architecture, Data Warehousing, Data Modeling, Data Marts, ETL/ELT Frameworks, Database Performance Tuning (AWR Reports, SQL Tuning Advisor, Explain Plan, Query Optimization), Healthcare Data Standards & Compliance, Data Analysis
Visualization & Reporting: Power BI, Dashboard Design, KPIs, Executive Reporting, Advanced Analytics & Visualization Techniques
DevOps & Delivery: CI/CD Pipelines, DABs, Git, GitHub Copilot, Agile & SDLC Methodologies, Test Automation, Cloud Migration, System Integration, Issue Management
Leadership & Stakeholder Engagement: Team Management & Mentorship, Project Estimation, Vendor Coordination, Budget Oversight, Conflict Resolution, Cross-functional Collaboration, Stakeholder Communication
Soft Skills: Strategic Thinking, Customer-Centric Mindset, Business Ethics, Communication, Planning & Prioritization, Analytical Problem Solving, Adaptability & Resilience
PROFESSIONAL EXPERIENCE:
AT&T, Middletown, NJ April 2026 – Current
Sr. Data Engineer
Designed and implemented a metadata-driven Medallion Architecture using Databricks Delta Live Tables (DLT) to unify disparate data streams from GoldenGate (CDC/Full Load) and Kafka (POST API), ensuring a single source of truth for enterprise contact data.
Engineered advanced data reconciliation logic within the Bronze and Silver layers, utilizing source-priority weighting and timestamp sequencing to resolve record conflicts and maintain data integrity across high-velocity ingestion pipelines.
Accelerated the development lifecycle by utilizing GitHub Copilot for rapid prototyping and code optimization, while leveraging VS Code and Databricks Asset Bundles (DABs) to deploy version-controlled infrastructure-as-code (IaC).
Alnylam Pharmaceuticals, Boston, MA January 2023 – March 2026
Sr. Data Engineer
Engineer scalable data pipelines and ETL frameworks using Databricks on AWS and Azure, along with Azure Data Factory, leveraging PySpark and SQL within a Medallion Architecture (Bronze, Silver, Gold).
Implement Databricks AI/BI Genie to generate AI-powered analytics reports, enabling business users to ask natural language questions and receive accurate SQL-driven insights from Delta Lake datasets.
Integrate structured and semi-structured data from SAP, Veeva, Snowflake, and Smartsheet, enabling real-time analytics to support global supply chain and workforce strategies.
Implement data governance and security controls via Unity Catalog, ensuring compliance with HIPAA and enterprise data standards across data lake and reporting layers. Enhance data lineage and access management to safeguard sensitive information.
Optimize high-volume Spark and SQL workloads, achieving 40% performance improvements, while automating data validation and quality checks to ensure 99% data accuracy for enterprise reporting systems.
Develop Power BI dashboards delivering real-time insights, contributing to a 50% improvement in business performance.
Implement CI/CD pipelines using Git, ensuring seamless integration and delivery of data projects.
Lead an offshore team of Data Engineers & BI Developers, overseeing deliverables and code reviews and ensuring project execution remains aligned with data strategy and business outcomes.
Estimate project effort and plan resources to ensure end-to-end execution within defined timelines, aligning deliverables with business goals and enabling informed decision-making.
Oracle America Inc., Boston, MA March 2019 – December 2022
Principal Data Engineer
Developed and launched listings and discrepancy modules in Oracle's Clinical Trial Health Science product, streamlining data validation and reporting workflows to reduce manual review time by 60% and improve data accuracy and regulatory compliance across clinical operations.
Engineered API-driven data integration solutions and built efficient ETL pipelines using Databricks and SQL, enhancing interoperability and accelerating data processing by 45% across distributed systems and downstream analytics platforms.
Revamped Oracle databases by analyzing AWR reports and leveraging SQL Tuning Advisor, delivering 30% performance improvements on critical workloads and ensuring reliable data for analytical insights.
Designed test strategies and automation solutions to ensure seamless data migration and system integration.
DTCC Solutions LLC, Boston, MA July 2017 – March 2019
ETL Developer
Designed and launched GICRS (Global Instruments Cross Reference Services), a high-performance, enterprise-grade securities cross-reference platform that improved data accuracy by 35% and enhanced accessibility across multiple financial systems.
Developed and optimized ETL pipelines in Oracle using SQL and PL/SQL, streamlining data ingestion and transformation processes, resulting in a 40% reduction in processing time and supporting high-volume transactional workloads of over 10 million records daily.
Enhanced OLAP and Data Warehouse performance by implementing Star and Snowflake schema models, improving query response times by up to 50% and increasing data accessibility for business intelligence reporting across multiple departments.
Commercial Tax Departments via TCS, Gandhinagar, India August 2014 – June 2017
Database Lead
Managed Oracle database development & performance tuning for high-volume tax applications.
Designed PL/SQL-based ETL workflows & data integration solutions.
Conducted real-time database monitoring & issue resolution.
Led a team of DBAs & Developers to deliver production-grade database solutions, conducting real-time performance monitoring, troubleshooting, and code reviews while ensuring project delivery aligned with business SLAs.
ADDITIONAL PROFESSIONAL EXPERIENCE:
ETL Developer at Paramount Pictures via TCS (Jul 2012 – Jun 2014)
Analyst Programmer at Aegon UK Life Insurance via TCS (Apr 2011 – May 2012)
Database Developer at Uganda Revenue Authority via TCS (Oct 2007 – Mar 2011)
Application Developer at Ashima Limited (Aug 2004 – Oct 2007)
EDUCATION:
Master of Computer Applications (MCA) – Gujarat University, India (2004)
Bachelor of Science (Physics) – Gujarat University, India (2001)
CERTIFICATIONS:
Databricks Certified Data Engineer Professional (Active), Certification ID: 180038057