
Data Engineer - ETL, Airflow, Databricks, Snowflake, DBT

Location: Texas City, TX
Posted: February 09, 2026


Resume:

Abhishek Kochhar

******************@*****.*** 937-***-**** linkedin.com/in/abhishekkochhar101

PROFILE

Data Engineer with 4+ years of experience building scalable ETL and real-time data pipelines across Azure, AWS and GCP ecosystems. Specialized in Airflow, Databricks, Snowflake, and DBT for high-volume enterprise clients in QSR, CPG, Telecom, and Digital Marketing domains. Proven ability to reduce costs, optimize performance, and support business-critical analytics. Seeking full-time data engineering and analytics positions.

EDUCATION

Master of Science in Marketing Analytics,

Wright State University, Dayton Campus

2024 – 04/2026 USA

Bachelor of Engineering,

Dr B.R. Ambedkar National Institute of Technology

2017 – 2021 India

WORK EXPERIENCE

Graduate Data Engineer Intern, Sigmoid Analytics 06/2025 – present USA

•Developed and debugged ETL pipelines in Python and SQL for ingesting and transforming diverse data sources.

•Pre-Sales Initiatives: Collaborated on technical discovery for SAP integration capabilities and built prototypes for a Data Quality Dashboard and a Weather-Analytics App to demonstrate value to potential clients.

Client: Jack in the Box (QSR Domain)

•Built a third-party delivery ingestion framework (UberEats, DoorDash) using Python and Airflow to centralize distributed franchise data.

•Designed the framework to automate data extraction, enabling the business to perform financial reconciliation and gain granular visibility into store-level delivery performance.

Data Engineer, Sigmoid Analytics 05/2024 – 08/2024 India

•Received SPOT Award for high performance and proactive client management.

•Designed and implemented cloud-based ETL architecture, optimizing storage and processing in Databricks and Snowflake.

•Developed a global business data model to unify multi-region datasets, improving scalability and consistency.

•Reduced processing time by 30% by streamlining data pipelines and real-time reporting integrations.

Client: Kenvue (CPG Domain)

•Consolidated and transformed disparate datasets into a centralized platform for global operations.

•Enabled real-time analytics and operational dashboards using the Databricks + Snowflake stack.

Consultant - Data Engineering, Tredence Analytics Pvt Ltd. 01/2023 – 04/2024 India

•Promoted from Analyst to Consultant in 1.5 years for delivering consistent client results.

•Migrated legacy systems to cloud platforms (Azure, GCP), increasing scalability by 200% and cutting annual costs by $500K.

•Led technical training sessions for junior engineers and helped team members obtain Databricks certification.

•Recognized with “Pat on the Back” award for outstanding performance.

Client: SurveyMonkey (Digital Marketing Domain)

•Built data ingestion pipelines using Fivetran and Snowflake, reducing data latency by 40%.

•Implemented DBT and Airflow workflows to support automated marketing dashboards.

Client: Activision (Gaming Domain)

•Migrated legacy workflows (ODI, SAP BODS) to GCP with Airflow and BigQuery.

•Created 30+ Airflow DAGs to automate transformations and support real-time analytics.

Analyst, Tredence Analytics Pvt Ltd. 07/2021 – 12/2022 India

•Supported maintenance and scaling of 10+ data pipelines across large retail data warehouses.

•Independently built data engineering solutions for client use cases in telecom and industrial sectors.

Client: T-Mobile (Telecom Domain)

•Migrated Teradata ETL processes to Snowflake using Azure Data Factory and Databricks.

•Reduced ETL processing time by 50% and minimized downtime to under 1% using API-based pipelines.

Client: Adani Coal (Industrials Domain)

•Built end-to-end data pipelines using ADF, Power Automate, and Logic Apps.

•Streamlined automation workflows for internal data science platforms.

SKILLS

Data Engineering, ETL & Orchestration: Apache Airflow, Azure Data Factory (ADF), Control-M, DBT, ETL framework design, pipeline orchestration.

Cloud Platforms & Data Warehousing: Microsoft Azure, Google Cloud Platform (GCP), Amazon Web Services (AWS), Databricks, Snowflake, BigQuery, data warehousing, data lakes.

Programming & Data Processing: Python, SQL, PySpark, Apache Kafka, PostgreSQL.

Analytics, BI & Visualization: Power BI, Apache Superset, Microsoft Excel, data analysis.

Data Integration & Automation: Fivetran, API-based data pipelines, Power Automate.

DevOps, CI/CD & Infrastructure: Docker, CI/CD pipelines, Git.

Machine Learning & Advanced Analytics: Scikit-learn.

Business, Marketing & Process Expertise: Marketing research, marketing strategy, Lean Six Sigma, FACETS.

CERTIFICATIONS

Databricks Data Engineer Professional

DP-900 Data Fundamentals - Microsoft

Databricks Data Engineer Associate

Google Cloud Certified - Associate Cloud Engineer

AZ-900 Azure Fundamentals - Microsoft

Lean Six Sigma Green Belt
