
Senior Data Engineer

Company:
Jeavio
Location:
Vadodara, Gujarat, India
Posted:
May 03, 2025

Description:

We are seeking an experienced Senior Data Engineer to join our team. The ideal candidate will have a strong background in data engineering and AWS infrastructure, with hands-on experience building and maintaining data pipelines and the infrastructure components they depend on. The role involves using a mix of data engineering tools and AWS services to design, build, and optimize data architecture.

Key Responsibilities:

● Design, develop, and maintain data pipelines using Airflow and AWS services.

● Implement and manage data warehousing solutions with Databricks and PostgreSQL.

● Automate tasks using Git / Jenkins.

● Develop and optimize ETL processes, leveraging AWS services like S3, Lambda, AppFlow, and DMS.

● Create and maintain visual dashboards and reports using Looker.

● Collaborate with cross-functional teams to ensure smooth integration of infrastructure components.

● Ensure the scalability, reliability, and performance of data platforms.

● Work with Jenkins for infrastructure automation.
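The responsibilities above center on the extract-transform-load pattern. As a rough illustration only (not part of the posting), the sketch below shows that pattern in plain Python; the `extract` and `load` functions are hypothetical stand-ins for the S3 reads and Databricks/PostgreSQL writes an Airflow task would perform.

```python
# Minimal ETL sketch. In a real Airflow DAG, each function below would be
# its own task; here the AWS and warehouse calls are stubbed out.

def extract():
    # Stand-in for reading raw records from S3 / DMS.
    return [
        {"ts": "2025-05-01T00:00:00", "value": "42"},
        {"ts": "2025-05-01T01:00:00", "value": "bad"},
    ]

def transform(rows):
    # Keep only rows whose value parses as a number, casting as we go.
    clean = []
    for row in rows:
        try:
            clean.append({"ts": row["ts"], "value": float(row["value"])})
        except ValueError:
            continue  # drop malformed records
    return clean

def load(rows, sink):
    # Stand-in for a warehouse write (Databricks / PostgreSQL).
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

Separating the three stages this way keeps each step independently testable, which is what makes a pipeline like this easy to schedule and retry from Airflow.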

Technical and functional areas of expertise:

● Working as a senior individual contributor on a data-intensive project.

● Strong experience in building high-performance, resilient, and secure data processing pipelines, preferably using a Python-based stack.

● Extensive experience in building data-intensive applications, with a deep understanding of querying and modeling with relational databases, preferably on time-series data.

● Intermediate proficiency in AWS services (S3, Airflow)

● Proficiency in Python and PySpark

● Proficiency with ThoughtSpot or Databricks.

● Intermediate proficiency in database scripting (SQL)

● Basic experience with Jenkins for task automation
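To make the relational/time-series expertise above concrete, here is a small illustrative query (not from the posting); the `readings` table is hypothetical, and `sqlite3` from the standard library stands in for PostgreSQL.

```python
import sqlite3

# Hypothetical time-series table of sensor readings.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, ts TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [
        ("a", "2025-05-01T00:00", 1.0),
        ("a", "2025-05-01T01:00", 3.0),
        ("b", "2025-05-01T00:00", 10.0),
    ],
)

# Per-sensor average -- a typical time-series rollup.
rows = conn.execute(
    "SELECT sensor, AVG(value) FROM readings GROUP BY sensor ORDER BY sensor"
).fetchall()
```

Aggregations like this (grouped rollups over timestamped rows) are the bread and butter of the querying-and-modeling work the bullet describes.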

Nice to Have:

● Intermediate proficiency in data analytics tools (Power BI / Tableau / Looker / ThoughtSpot)

● Experience working with AWS Lambda, Glue, AppFlow, and other AWS transfer services.

● Exposure to PySpark and data automation tools like Jenkins or CircleCI.

● Familiarity with Terraform for infrastructure-as-code.

● Experience in data quality testing to ensure the accuracy and reliability of data pipelines.

● Proven experience working directly with U.S. client stakeholders.

● Ability to work independently and take the lead on tasks.

Education and experience:

● Bachelor's or Master's degree in Computer Science or a related field.

● 5+ years of relevant experience.

Stack/Skills needed:

● Databricks

● PostgreSQL

● Python & PySpark

● AWS Stack

● Power BI / Tableau / Looker / ThoughtSpot

● Familiarity with Git and/or CI/CD tools
