Position: Data Engineer
Reports to: Data Engineering Manager
Description
The Data Engineer plays a critical role in designing and delivering production-grade data solutions that drive automation, analytics, controls, and data-informed decision-making. Operating within a modern CI/CD-enabled data stack, this individual will help build and maintain an enterprise data warehouse and data marts that support both operational and strategic objectives.
Sitting at the intersection of systems and analytics, this role is well-suited for a creative, out-of-the-box thinker with a proven track record in data integration, cloud architecture, and ELT/ETL pipelines. The engineer will develop scalable, event-driven workflows using AWS services such as Lambda, S3, and SQS, along with orchestration tools like Airflow, Dagster, or Prefect. In addition to core data engineering responsibilities, the role requires adaptability and initiative to deliver custom, non-standard data solutions in response to evolving business needs, ranging from ad hoc automation to unique process workflows outside typical pipelines. The successful candidate will be instrumental in shaping Gateway Financial’s data landscape through both innovation and execution.
Key Responsibilities
• Build and support bidirectional integrations between the data warehouse and internal systems using APIs, flat files, webhooks and other available tooling
• Profile, integrate, cleanse and restructure data from disparate sources into logical models to be consumed by BI tools, internal applications and external clients
• Translate business logic into technical requirements; build and maintain analytic data marts and models in the data warehouse
• Write production-quality ELT transformations with an emphasis on best practices, consistency, performance and scalability
• Design and optimize cloud data warehouse architecture
• Design and maintain scalable, event-driven data pipelines using orchestration tools (e.g., Airflow, Dagster, Prefect) and AWS services (e.g., Lambda, S3, SQS), incorporating queue-based workflows beyond traditional ETL/ELT (see the sketch following this list)
• Identify and communicate opportunities for improvements to data quality and performance in source systems, pipelines and models
• Work on concurrent projects of varying size and scope, adhering to timelines and objectives
• Partner with Business Intelligence Analysts and technical stakeholders to drive process, architecture and infrastructure enhancements
• Establish and extend departmental best practices and standards
• Continually improve ongoing data pipelines and simplify self-service support for stakeholders
• Participate in Data Governance activities as needed to document data pipelines and models
• Partner with the larger Technology, Data & Analytics team to deliver self-service visual analytics and solutions that enable quick insights and data-driven decision-making
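The sketch below is purely illustrative of the queue-based pattern referenced above, not a prescribed design: a minimal AWS Lambda handler, written in Python, that relays S3 object-created events onto an SQS queue for downstream processing. The queue URL, bucket contents, and message shape are hypothetical placeholders.

```python
import json
import urllib.parse

import boto3  # AWS SDK for Python

sqs = boto3.client("sqs")

# Hypothetical queue URL; a real deployment would read this from configuration.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/ingest-queue"


def handler(event, context):
    """Lambda entry point: fan S3 object-created events out to SQS."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        # S3 delivers object keys URL-encoded in event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"enqueued": len(records)}
```

Decoupling event detection (S3) from processing (an SQS consumer) lets ingestion scale and retry independently of the transformation steps downstream.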
Requirements
• 3+ years of relevant work experience in a role requiring application of data modeling, SQL and analytic skills
• 2+ years of experience applying another programming language, such as Python
• Experience with ETL/ELT tools and building and maintaining a data warehouse
• Experience designing scalable data schemas using modeling strategies such as 3NF, Data Vault, Dimensional Modeling, and Slowly Changing Dimensions (SCDs)
• Prior experience working in dbt (Data Build Tool)
• Working knowledge of interacting with RESTful API endpoints for data extraction and integration (see the sketch following this list)
• Experience working in a CI/CD environment and using source control tools such as GitHub or GitLab
• Experience working with semi-structured data types such as JSON and XML
• Experience with Cloud Data Warehousing technology such as Snowflake
• Demonstrated ability to think creatively and develop innovative solutions to complex data challenges beyond standard tooling and techniques
• Good written and verbal communication skills
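As a second illustration, tied to the RESTful API and semi-structured data items above, a minimal Python extraction against a paginated JSON endpoint might look like the sketch below. The endpoint URL, auth header, pagination scheme, and field names are all assumptions, not a real API.

```python
import requests

# All of the following are hypothetical: endpoint, auth scheme, and pagination.
BASE_URL = "https://api.example.com/v1/accounts"
HEADERS = {"Authorization": "Bearer <token>"}


def extract_records(page_size: int = 100):
    """Page through a REST endpoint and yield warehouse-ready flat rows."""
    page = 1
    while True:
        resp = requests.get(
            BASE_URL,
            headers=HEADERS,
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()  # surface 4xx/5xx errors immediately
        records = resp.json().get("data", [])
        if not records:
            break  # an empty page signals the end of the result set
        for rec in records:
            # Flatten one level of nested, semi-structured JSON into columns.
            yield {
                "id": rec.get("id"),
                "status": rec.get("status"),
                "owner_name": (rec.get("owner") or {}).get("name"),
            }
        page += 1
```

The generator keeps memory use flat regardless of result-set size, and flattening nested JSON at extraction time produces rows that load cleanly into warehouse tables.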
Preferences
• Experience in building and automating visual analytics in data visualization tools such as Tableau or Sigma
• Experience working within an Agile development process, using Kanban boards in tools such as Azure DevOps or Jira to manage and prioritize work
• Experience with containerization and deployment using Docker, Kubernetes, and CI/CD tools like GitHub Actions
• Experience with AWS products such as Lambda, S3, SQS, CodeBuild, Elastic Container Registry (ECR), and Elastic Container Service (ECS)
• Experience with system-level scripting (e.g., PowerShell, command line, batch files) to support automation and deployment tasks in data workflows
• Auto finance experience