
Data Engineer - ELT Pipelines & Snowflake Expert

Location:
Manhattan, NY, 10286
Salary:
30
Posted:
February 17, 2026


Professional Summary

Data Engineer with * years of experience building and supporting ELT pipelines using Python, SQL, Snowflake, dbt, and Apache Airflow. Strong in troubleshooting production issues, improving pipeline reliability, and writing maintainable transformation logic. Hands-on with data quality checks, documentation, and medallion-style layering (raw/stage/marts).

Technical Skills

• Languages: Python, SQL

• Data Platforms / Warehousing: Snowflake

• Distributed Processing (Basic): Apache Spark (PySpark basics), Databricks fundamentals

• Transformation: dbt (models, tests, documentation)

• Orchestration: Apache Airflow

• Cloud / Integration: Azure Data Factory

• Dev Tools: Git, Docker (basic)

Experience

Data Engineer – L&T Technology Services, India (Jun 2021 – Jul 2023)

• Built and maintained ELT pipelines using Python, SQL, and Snowflake, supporting enterprise reporting and analytics use cases.

• Developed dbt transformation models (staging, marts) with reusable patterns and tests, improving consistency and maintainability of analytics datasets.

• Orchestrated and monitored workflows in Airflow, implementing retries, task dependencies, and standardized logging to improve pipeline reliability (see the sketch after this role).

• Implemented data validation and reconciliation checks (completeness, duplicates, schema drift, row-count comparisons) to reduce downstream reporting issues.

• Supported Snowflake warehouse organization aligned with medallion architecture (raw/stage/marts) to improve usability for analytics consumers.

• Tuned SQL transformations by optimizing joins, reducing unnecessary scans, and pruning columns, improving runtime for key pipelines by up to 40% in production.

• Investigated failed pipeline runs by analyzing logs, isolating root cause (data vs configuration), and applying fixes to restore successful runs.

• Documented recurring failures, fixes, and operational steps to reduce repeat incidents and improve team handoffs.
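
The orchestration and validation bullets above describe a common pattern. As a minimal, hypothetical sketch (not code from this role), here is an Airflow DAG with default retries, explicit task dependencies, standardized logging, and a reconciliation step; the DAG id, task names, and schedule are illustrative assumptions, and Airflow 2.4+ is assumed.

import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)

def load_raw(**context):
    # Hypothetical ingestion step: land source extracts into the raw layer.
    log.info("loading raw data for %s", context["ds"])

def transform(**context):
    # Hypothetical transformation step: build staging tables from raw.
    log.info("running transformations for %s", context["ds"])

def validate(**context):
    # Hypothetical reconciliation step: in practice this would run
    # row-count and duplicate checks in Snowflake and fail the task
    # (triggering retries and alerts) on a mismatch.
    log.info("running row-count and duplicate checks for %s", context["ds"])

with DAG(
    dag_id="elt_daily_refresh",          # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule="@daily",                   # Airflow 2.4+ style schedule
    catchup=False,
    default_args={
        "retries": 2,                    # automatic retries on failure
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    raw = PythonOperator(task_id="load_raw", python_callable=load_raw)
    staged = PythonOperator(task_id="transform", python_callable=transform)
    checked = PythonOperator(task_id="validate", python_callable=validate)

    raw >> staged >> checked             # explicit task dependencies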

Data Engineer Intern – L&T Technology Services, India (Jan 2021 – Jun 2021)

• Assisted in building ETL jobs using Python and SQL for customer and sales datasets.

• Automated repeatable transformation steps and basic QA checks, reducing manual effort 20–30% for routine refresh tasks (see the sketch after this role).

• Documented pipeline logic and supported debugging of failed runs by tracing source-to-target issues.
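
As an illustration of the kind of automated QA checks described above, here is a minimal pandas sketch; the file name, key column, and required columns are hypothetical assumptions, not details from this internship.

import pandas as pd

def qa_check(df: pd.DataFrame, key: str, required: list[str]) -> list[str]:
    """Return a list of QA failures for a refreshed dataset."""
    problems = []

    # Completeness: required columns must not contain nulls.
    for col in required:
        nulls = int(df[col].isna().sum())
        if nulls:
            problems.append(f"{col}: {nulls} null values")

    # Uniqueness: the business key must not repeat.
    dupes = int(df.duplicated(subset=[key]).sum())
    if dupes:
        problems.append(f"{key}: {dupes} duplicate rows")

    return problems

# Hypothetical usage on a routine refresh file.
orders = pd.read_csv("daily_orders.csv")
failures = qa_check(orders, key="order_id", required=["order_id", "customer_id", "amount"])
if failures:
    raise SystemExit("QA failed: " + "; ".join(failures))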

Data Analyst Intern – Fortis Healthcare, India (Jun 2020 – Dec 2020)

• Built Power BI dashboards and SQL reports for operational and performance tracking.

• Cleaned and validated datasets, supporting day-to-day decision making with accurate KPI reporting.

Academic/Personal Projects – MS Program

End-to-End Data Platform (Snowflake + dbt + Airflow)

• Built a production-style pipeline covering ingestion, transformation, and marts layers using Snowflake, dbt, and Airflow.

• Implemented dbt tests (not null, unique, relationships) and workflow checks to improve dataset reliability and trust (see the sketch after this project).

• Used Git workflows (branching, version control) to simulate team-based delivery and maintain clean change history.

• Delivered analytics-ready marts and standardized models to reduce repeated manual reporting effort.
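
To show how a project like this can be exercised end to end, here is a minimal sketch that drives dbt from Python, assuming dbt-core 1.5 or later (which provides the dbtRunner programmatic entry point); the run-then-test flow mirrors the bullets above, and nothing here is taken from the actual project.

from dbt.cli.main import dbtRunner  # assumes dbt-core 1.5+

runner = dbtRunner()

# Build the staging and mart models defined in the project.
run_result = runner.invoke(["run"])
if not run_result.success:
    raise SystemExit("dbt run failed")

# Execute the schema tests (not_null, unique, relationships)
# declared in the project's YAML files.
test_result = runner.invoke(["test"])
if not test_result.success:
    raise SystemExit("dbt test failed")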

Certification: Snowflake SnowPro Core Certification

Education

MS in Information Technology – St. Francis College, USA (Jan 2024 – May 2025)

B.Tech in Civil Engineering – GRIET, India (Aug 2017 – Jul 2021)

ManiKanta Reddy Y
****************@*****.*** | 551-***-**** | Git | LinkedIn


