
Sr. Data Architect

Company:
Qode.world
Location:
Kerala, India
Pay:
Negotiable
Posted:
October 25, 2025

Description:

Sr. Data Architect

Experience: 10+ years

Notice period: Immediate

Location: Trivandrum / Kochi

Job Description

A minimum of 10 years of experience in data engineering, encompassing the development and scaling of data warehouse and data lake platforms.

The candidate should possess a strong background in Snowflake, demonstrating leadership in technical design, architecture, and implementation of complex data solutions.

Working hours: 8 hours, with a few hours of overlap with the EST time zone. This overlap is mandatory, as meetings take place during these hours. Working hours will be 12 PM - 9 PM.

Mandatory Skills: Snowflake experience, data architecture experience, ETL process experience, and large data migration solutioning experience.

Responsibilities

· Lead the design and architecture of data solutions leveraging Snowflake, ensuring scalability, performance, and reliability.

· Oversee the implementation of data pipelines, ETL processes, and data governance frameworks within Snowflake environments. Ensure efficient data extraction from SAP BW/ECC systems.

· Collaborate with stakeholders to understand business requirements and translate them into technical specifications and data models.

· Develop and maintain data architecture standards, guidelines, and best practices, including data governance principles and DataOps methodologies.

· Provide technical guidance and mentorship to data engineering teams, fostering skill development and knowledge sharing.

· Conduct performance tuning and optimization of Snowflake databases and queries.

· Stay updated on emerging trends and advancements in Snowflake, cloud data technologies, data governance, and DataOps practices.

Primary Skills:

· Extensive experience in designing and implementing data solutions using Snowflake and DBT.

· Proficiency in data modeling, schema design, and optimization within Snowflake environments.

· Strong understanding of cloud data warehousing concepts and best practices, particularly with Snowflake.

· Expertise in Python/Java/Scala, SQL, ETL processes, and data integration techniques, with a focus on Snowflake.

· Familiarity with other cloud platforms and data technologies (e.g., AWS, Azure, GCP).

· Demonstrated experience in implementing data governance frameworks and DataOps practices.

· Familiarity with real-time streaming technologies and Change Data Capture (CDC) mechanisms.

· Knowledge of data governance principles and DataOps methodologies.

· Proven track record of architecting and delivering complex data solutions on cloud platforms and Snowflake.

Secondary Skills (If Any):

· Understanding of SAP BW/ECC systems, including data extraction, transformation, and loading (ETL/ELT) processes.

· Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.

· Experience in designing, developing, and implementing SAP ABAP programs, reports, interfaces, and enhancements.

· Working experience in SAP environments.

· Knowledge of data security and compliance standards.

· Proficiency in SAP ABAP (Advanced Business Application Programming) development.

· Excellent communication and presentation skills, with the ability to convey complex technical concepts to junior and non-technical stakeholders.

· Strong problem-solving and analytical skills; ability to work effectively in a collaborative team environment and lead cross-functional initiatives.

Certifications Required (If Any)

Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field.

· Certifications related to Snowflake (e.g., SnowPro Core, SnowPro Advanced Architect, SnowPro Advanced Data Engineer) are desirable but not mandatory.
