KEY ACCOUNTABILITIES INCLUDE:
Design and develop scalable data pipelines to collect, process, and store large volumes of data.
Collaborate with data scientists and analysts to define scope and requirements and to deliver strong technical designs for data analytics solutions.
Ensure data quality and integrity through cleansing processes, validation, and automated testing.
Develop and maintain requirements, design documentation and test plans.
Implement data integration solutions to combine data from various sources.
Optimize data workflows for performance, reliability, and reduced cloud consumption.
Monitor and troubleshoot data pipeline issues to ensure smooth operation.
Establish release management & CI/CD for data solutions.
Provide direction and coordination for development and support teams including globally located resources.
Participate in the development of a safe and healthy workplace. Comply with instructions given for your own safety and health and that of others, adhering to safe work procedures. Cooperate with management in the fulfilment of its legislative obligations.
Other duties as assigned by management.
REQUIREMENTS:
Must be a self-starter capable of independently meeting objectives and successfully interacting with members of various teams
Strong analytical background and mindset
Demonstrated expertise in SOP development, training strategy, and process improvement
Ability to elicit buy-in and cooperation from a variety of individuals as well as departments
Hands-on, flexible, and responsive in a dynamic, fast-paced work environment
Ability to manage several initiatives concurrently
Strong team player with a continuous improvement mindset
EDUCATION & EXPERIENCE:
Minimum
Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
5 years’ experience in a data engineering role.
3 years’ experience building and supporting data lakehouse architectures using Delta Lake and change data feeds.
3 years’ experience with Spark and Python.
3 years’ experience writing object-oriented (OO) code.
3 years’ experience designing data warehouse table architectures, such as star schemas or the Kimball methodology.
Experience developing and installing Python wheel packages for managing dependencies and distributing code.
Experience creating CI/CD pipelines for analytics solutions.
Preferred
Hands-on experience implementing data solutions using Microsoft Fabric.
Experience with machine learning and data science tools.
Knowledge of data governance and security best practices.
Experience in a large IT environment with over 3,000 users and multiple domains.
SPECIALIST CERTIFICATIONS:
Current industry certifications for Microsoft cloud/data platforms, or equivalent. One or more of the following:
Microsoft Certified: Fabric Data Engineer Associate
Microsoft Certified: Azure Data Scientist Associate
Microsoft Certified: Azure Data Fundamentals
Google Professional Data Engineer
Certified Data Management Professional (CDMP)
IBM Certified Data Architect – Big Data
SEKO Worldwide is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
R-100614