Job Description
KDA Consulting Inc. is seeking a highly skilled Data Engineer to design, build, and maintain scalable data pipelines and architectures supporting mission-critical programs within the Intelligence Community (IC). This role focuses on enabling efficient data processing, analytics, and decision-making by delivering reliable, high-quality data solutions in cloud environments.
The ideal candidate will have strong experience with AWS cloud services, data engineering best practices, and big data technologies, with the ability to operate in a fast-paced, collaborative environment supporting complex mission systems.
Responsibilities
Data Pipeline Development & Engineering
Design, develop, and maintain scalable ETL/ELT pipelines to ingest, transform, and load data from multiple sources
Build and optimize data workflows supporting both batch and real-time processing
Ensure high availability and reliability of data pipelines across enterprise systems
AWS Cloud & Data Services
Leverage AWS services such as S3, Glue, Lambda, Redshift, EMR, and Kinesis to support data engineering solutions
Design and manage cloud-based data architectures that are secure, scalable, and cost-efficient
Implement best practices for data storage, partitioning, and lifecycle management
Data Optimization & Performance
Optimize data storage and retrieval for performance, scalability, and cost efficiency
Implement indexing, partitioning, and query optimization strategies
Monitor and troubleshoot data pipeline performance issues
Data Quality & Governance
Ensure data accuracy, consistency, and integrity across all data pipelines and systems
Implement validation, monitoring, and error-handling mechanisms
Support data governance practices and compliance with security and IC standards
Collaboration & Integration
Work closely with data scientists, software engineers, and mission stakeholders to support analytics and operational needs
Integrate data solutions with enterprise applications, APIs, and downstream analytics platforms
Participate in Agile development processes and contribute to continuous improvement efforts
Requirements
Active TS/SCI with Polygraph required
Bachelor’s degree in Computer Science, Engineering, or a related technical field (or equivalent experience)
Demonstrated experience in data engineering and pipeline development
Strong experience with AWS data services (e.g., S3, Glue, Redshift, Lambda, EMR)
Proficiency in SQL and Python for data processing and transformation
Experience working with large-scale data sets and big data technologies
Strong analytical and problem-solving skills
Job Type: Full-time