
IT Engineer - Data (Hybrid)

Company: Shuvel Digital
Location: Vienna, VA
Posted: June 05, 2025

Description:

Greetings!

Our client, Navy Federal Credit Union, is the world's largest credit union, with over 10 million members, over $149 billion in assets, and over 23,000 employees.

Our client is seeking an IT Engineer - Data (15507, Hybrid). Your profile and overall background appear to be a strong match for the position.

Please review the below information for clarity on the position description.

Basic Purpose:

Develop strategies for data acquisition, data pipelines, and database implementation. Responsible for designing, building, and integrating data from various sources and for managing big data. Develop and write complex queries that are easily accessible and run smoothly, with the goal of optimizing the performance of Navy Federal's big data ecosystem within CI/CD pipelines. Recognized as an expert with a specialized depth and/or breadth of expertise in the discipline. Solves highly complex problems and takes a broad perspective to identify solutions. Leads functional projects. Works independently.

Responsibilities:

Provide Data Intelligence and Data Warehousing (DW) solutions and support by leveraging project standards and leading data platforms

Build and maintain Azure data pipelines using a DevSecOps process

Define and build data integration processes to be used across the organization

Build conceptual and logical data models for stakeholders and management

Work directly with business leadership to understand data requirements; propose and develop solutions that enable effective decision-making and drive business objectives

Prepare advanced project implementation plans which highlight major milestones and deliverables, leveraging standard methods and work planning tools

Document existing and new processes to develop and maintain technical and non-technical reference materials

Recognize potential issues and risks during the analytics project implementation and suggest mitigation strategies

Coach and mentor project team members in carrying out analytics project implementation activities

Communicate and own the process of manipulating and merging large datasets

Perform other duties as assigned

Qualifications and Education Requirements:

Master's degree in Information Systems, Computer Science, Engineering, or related field, or the equivalent combination of education, training and experience

Expert skill in Azure Data Factory, Databricks

Advanced skill in Azure SQL, Azure Data Lake, Azure App Service, Python, and T-SQL

Experienced in sourcing, maintaining, and updating data in on-premises and cloud environments

Knowledge of and the ability to perform basic statistical analysis

Thorough understanding of SQL

Experienced in the use of ETL tools and techniques

Experience designing and building data pipelines using API ingestion and streaming ingestion methods

Ability to understand the business problem and determine what aspects of it require optimization; articulate those aspects in a clear and concise manner

Ability to understand other projects or functional areas in order to consolidate analytical needs and processes

Demonstrates change management and/or excellent communication skills

Understands data warehousing, data cleaning, data pipelines and other analytical techniques required for data usage

Demonstrates deep understanding of multiple data related concepts

Understands the concepts and application of data mapping and building requirements

Understands data models, large datasets, business/technical requirements, BI tools, data warehousing, statistical programming languages and libraries

Experience using Git and source control

Skilled in managing the process between updating and maintaining data source systems and implementing data-related requirements
