Job Role - Data Engineering - Subject Matter Expert (SME)
Job Type/Work Mode - Partially Remote
Job Location - Nagpur / Pune (Hybrid)
Job Summary
We are seeking a seasoned Data Engineering SME with strong experience in data platforms, ETL tools, and cloud technologies. The ideal candidate will lead the design and implementation of enterprise-scale data solutions, provide strategic guidance on data architecture, and play a key role in data migration, data quality, and performance tuning initiatives. This role demands a mix of deep technical expertise, project management, and stakeholder communication.
Key Responsibilities
• Lead the design, development, and deployment of robust, scalable ETL pipelines and data solutions.
• Provide technical leadership and SME support for data engineering teams across multiple projects.
• Collaborate with cross-functional teams including Data Analysts, BI Developers, Product Owners, and IT to gather requirements and deliver data products.
• Design and optimize data workflows using tools such as IBM DataStage, Talend, Informatica, and Databricks.
• Implement data integration solutions for structured and unstructured data across on-premises and cloud platforms.
• Conduct performance tuning and optimization of ETL jobs and SQL queries.
• Oversee data quality checks, data governance compliance, and PII data protection strategies.
• Support and mentor team members on data engineering best practices and agile methodologies.
• Analyze and resolve production issues in a timely manner.
• Contribute to enterprise-wide data transformation strategies including legacy-to-digital migration using Spark, Hadoop, and cloud platforms.
• Manage stakeholder communications and provide regular status reports.
Required Skills and Qualifications
• Bachelor's degree in Engineering, Computer Science, or a related field (MTech in Data Science is a plus).
• 10+ years of hands-on experience in ETL development and data engineering.
• Strong proficiency with tools: IBM DataStage, Talend, Informatica, Databricks, Power BI, Tableau.
• Strong SQL, PL/I, Python, and Unix Shell scripting skills.
• Experience with cloud platforms like AWS and modern big data tools like Hadoop, Spark.
• Solid understanding of data warehousing, data modeling, and data migration practices.
• Experience working in Agile/Scrum environments.
• Excellent problem-solving, communication, and team collaboration skills.
• Scrum Master or Product Owner certifications (CSM, CSPO) are a plus.
Level of Expertise
• Power BI - 4 years
• Tableau - 4 years
• IBM DataStage - 3 to 4 years
• Talend Open Studio - 3 years
• Data Modeling - 4 years
• SQL - 4 years
• Data Warehousing - 2 years
• Data Migration - 3 years
• Informatica - 4 years
• Databricks - 2 years
• AWS - 3 years
• Hadoop - 3 years
• Apache Spark - 4 years
• Agile Methodology - 2 years
• Agile Software Development - 2 years