Resumes 121 - 130 of 12249
Charleston, IL
Charan Datti Charleston, IL +1-217-***-**** ************@*****.*** Profile Summary • Data Engineer with 4+ years of experience designing scalable CDC and ETL frameworks using Debezium, Apache Spark, and AWS to deliver real-time, analytics-ready ...
- Oct 15
Dallas, TX
... Proven expertise in Change Data Capture (CDC), Debezium, and data lake hydration for enterprise analytics. Hands-on experience with Apache Airflow, AWS Glue, EMR, Step Functions, and Lambda (Python). Adept at designing and optimizing ETL and ELT ...
- Oct 15
Leander, TX
... • Constructed a secure CDC (Change Data Capture) framework using Debezium and PostgreSQL, syncing 200,000+ investor updates per week into Snowflake marketing marts and improving lead conversion tracking accuracy by 32%. • Formulated and deployed dbt ...
- Oct 15
Charlotte, NC
... Skills • Data Engineering & ETL: PySpark, Spark SQL, Delta Lake, Databricks, AWS Glue, EMR, Azure Data Factory, Change Data Capture (CDC), dbt, ETL Frameworks, Data Partitioning, Data Modeling • Cloud Platforms & Services: AWS (S3, Redshift, Kinesis ...
- Oct 15
Plant City, FL
... • Prevented duplicate and missing records from CDC sources using AWS DMS to S3 with watermarking and idempotent upserts in Spark; data defects reduced by 40%. • SQL (Athena/Redshift/Snowflake): Wrote performance-critical queries for sessionization and ...
- Oct 15
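
The Plant City snippet above names a concrete pattern: land AWS DMS CDC files in S3, track a watermark, deduplicate by key, and upsert idempotently in Spark. A minimal PySpark sketch of that pattern under stated assumptions; the Delta Lake target, the order_id/change_ts/op columns, the S3 paths, and the stored watermark value are illustrative and do not come from the resume.

from pyspark.sql import SparkSession, Window
from pyspark.sql.functions import col, row_number
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

RAW_PATH = "s3://example-bucket/dms/orders/"      # DMS CDC output location (assumed)
TARGET_PATH = "s3://example-bucket/lake/orders/"  # Delta target (assumed)
LAST_WATERMARK = "2024-01-01T00:00:00"            # high-watermark persisted by a previous run (assumed)

# 1) Read only changes newer than the stored watermark.
changes = spark.read.parquet(RAW_PATH).where(col("change_ts") > LAST_WATERMARK)

# 2) Keep the latest change per key so replayed or duplicated files cannot double-apply.
latest = (
    changes
    .withColumn("rn", row_number().over(
        Window.partitionBy("order_id").orderBy(col("change_ts").desc())))
    .where(col("rn") == 1)
    .drop("rn")
)

# 3) Keyed merge: deletes, updates, and inserts keyed on order_id.
#    DMS emits an operation column (named "Op" by default); "op" = 'D' is assumed here.
target = DeltaTable.forPath(spark, TARGET_PATH)
(
    target.alias("t")
    .merge(latest.alias("s"), "t.order_id = s.order_id")
    .whenMatchedDelete(condition="s.op = 'D'")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

Because step 3 is a keyed merge rather than an append, re-running the same batch after a failure leaves the target unchanged, which is what makes the load idempotent.
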
Texas City, TX
... Client: Exprad IT Jan 2021 – Jun 2022 Data Analyst, India • Ingested large-scale customer and behavioral data into Snowflake using Azure pipelines, improving data accessibility for analytics teams • Designed warehouse models using SCD, CDC, surrogate ...
- Oct 15
San Antonio, TX, 78240
... Built data ingestion pipelines using Kafka Connect to capture Change Data Capture (CDC) events from SQL Server and Oracle into Synapse Analytics. Implemented Data Lakehouse architecture using Delta Lake in Databricks, enabling ACID transactions, ...
- Oct 15
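
The San Antonio snippet above builds CDC ingestion with Kafka Connect from SQL Server and Oracle. A hedged sketch of registering a Debezium SQL Server source connector through the Kafka Connect REST API; the endpoint, credentials, database, and table list are placeholders, and the property keys follow Debezium 1.x naming (2.x renames several, e.g. topic.prefix and schema.history.internal.kafka.*).

import json
import requests

CONNECT_URL = "http://localhost:8083/connectors"   # Kafka Connect REST endpoint (assumed)

connector = {
    "name": "sqlserver-sales-cdc",
    "config": {
        "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
        "database.hostname": "sqlserver.example.internal",   # placeholder host
        "database.port": "1433",
        "database.user": "cdc_reader",                        # placeholder credentials
        "database.password": "********",
        "database.dbname": "sales",
        "database.server.name": "sales",                      # topic prefix (1.x key)
        "table.include.list": "dbo.orders,dbo.customers",
        "database.history.kafka.bootstrap.servers": "kafka:9092",
        "database.history.kafka.topic": "schema-changes.sales",
    },
}

resp = requests.post(
    CONNECT_URL,
    data=json.dumps(connector),
    headers={"Content-Type": "application/json"},
)
resp.raise_for_status()
print(resp.json())  # Connect echoes back the accepted connector definition

The Oracle side follows the same shape with io.debezium.connector.oracle.OracleConnector; how the resulting topics were landed in Synapse Analytics is not specified in the snippet.
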
Austin, TX, 78727
... Used CDC with Debezium and Kafka plus SCD Type 2 in BigQuery to ensure accurate historical sales tracking. • Ensured data quality using Great Expectations within Airflow, reducing pipeline failures by 45%. Enforced data governance and security best ...
- Oct 15
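
The Austin snippet above pairs Debezium/Kafka CDC with SCD Type 2 history in BigQuery. One common way to express the Type 2 apply step, sketched with the google-cloud-bigquery client; the dataset, tables (dim_product, stg_product_changes), and tracked columns are invented for illustration, not taken from the resume.

from google.cloud import bigquery

client = bigquery.Client()

# Two statements run as one BigQuery scripting job:
# 1) expire the current row for every key whose tracked attributes changed,
# 2) insert a new current version for changed and brand-new keys.
scd2_sql = """
UPDATE analytics.dim_product d
SET is_current = FALSE,
    effective_to = s.change_ts
FROM analytics.stg_product_changes s
WHERE d.product_id = s.product_id
  AND d.is_current = TRUE
  AND (d.price != s.price OR d.category != s.category);

INSERT INTO analytics.dim_product
  (product_id, price, category, effective_from, effective_to, is_current)
SELECT s.product_id, s.price, s.category, s.change_ts, NULL, TRUE
FROM analytics.stg_product_changes s
LEFT JOIN analytics.dim_product d
  ON d.product_id = s.product_id AND d.is_current = TRUE
WHERE d.product_id IS NULL
   OR d.price != s.price
   OR d.category != s.category;
"""

client.query(scd2_sql).result()  # .result() blocks until both statements finish
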
New York City, NY
Reddieswara Naidu Kalla USA +1-845-***-**** **********************@*****.*** Summary Experienced Data Engineer with a proven record in designing CDC pipelines and efficient ETL frameworks using Apache Spark and Snowflake Streams & Tasks. Achieved a ...
- Oct 15
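
The New York summary above credits CDC pipelines built on Snowflake Streams & Tasks. A minimal sketch of that mechanism run through snowflake-connector-python; the connection parameters, warehouse, and table/column names are placeholders, and the actual merge logic behind the resume claim is unknown.

import snowflake.connector

# Placeholder connection parameters; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="********",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)

# A stream records row-level changes on the raw table; a scheduled task merges
# those changes into the reporting table whenever the stream has data.
ddl = """
CREATE OR REPLACE STREAM RAW.ORDERS_STREAM ON TABLE RAW.ORDERS;

CREATE OR REPLACE TASK RAW.APPLY_ORDERS
  WAREHOUSE = TRANSFORM_WH
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
AS
  MERGE INTO ANALYTICS.PUBLIC.ORDERS t
  USING RAW.ORDERS_STREAM s
    ON t.ORDER_ID = s.ORDER_ID
  WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS, t.UPDATED_AT = s.UPDATED_AT
  WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, UPDATED_AT)
    VALUES (s.ORDER_ID, s.STATUS, s.UPDATED_AT);

ALTER TASK RAW.APPLY_ORDERS RESUME;
"""

# execute_string runs the semicolon-separated statements in order.
conn.execute_string(ddl)
conn.close()
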
Karnataka, India
... Hadoop, Structured Streaming • ETL/ELT & Workflow Orchestration: Apache Airflow, dbt, Azure Data Factory, Delta Live Tables, CDC (Change Data Capture) • Data Warehousing & Modeling: Azure Synapse Analytics, Snowflake, Kimball Dimensional Modeling, Data ...
- Oct 15