
Ab Initio Developer

Company:
Mindware INC
Location:
Reston, VA
Posted:
May 04, 2025

Description:

Title: Ab Initio Developer

Location: Hybrid/Reston, VA

Job Type: Contract W2 to Perm

Contract Length: 12 month T2P

Hours Per Week: 40

VISA Type: USC, GC, H4 EAD, L2 VISA only. NO H1Bs

Candidates Local to Washington DC, Maryland, Virginia only

Job Description:

Our client, a premier Federal Healthcare organization, has an urgent need for an Ab Initio Developer to provide hands-on work designing and implementing data integration solutions for data warehouse and business intelligence systems.

The Ab Initio Developer is responsible for orchestrating, deploying, maintaining, and scaling cloud or on-premises infrastructure targeting big data and platform data management (relational and NoSQL, distributed and converged), with an emphasis on reliability, automation, and performance.

This role will focus on leading the development of solutions and helping transform the company's platforms to deliver data-driven, meaningful insights and value to the company.

Responsibilities include:

20% Designs, configures, implements, monitors, and manages all aspects of the Data Integration Framework. Defines and develops Data Integration best practices for a data management environment of optimal performance and reliability.

20% Develops and maintains infrastructure systems (e.g., data warehouses, data lakes), including data access APIs. Prepares and manipulates data using Hadoop or an equivalent MapReduce platform.

15% Provides detailed guidance and performs work related to modeling Data Warehouse solutions in the cloud or on premises. Understands Dimensional Modeling, De-normalized Data Structures, OLAP, and Data Warehousing concepts.

15% Oversees the delivery of engineering data initiatives and projects. Supports long-term data initiatives as well as ad-hoc analysis and ELT/ETL activities. Creates data collection frameworks for structured and unstructured data. Applies data extraction, transformation, and loading techniques to connect large data sets from a variety of sources.

15% Enforces the implementation of best practices for data auditing, scalability, reliability, and application performance. Develops and applies data extraction, transformation, and loading techniques to connect large data sets from a variety of sources.

10% Interprets data, analyzes results using statistical techniques, and provides ongoing reports. Executes quantitative analyses that translate data into actionable insights. Provides analytical and data-driven decision-making support for key projects. Designs, manages, and conducts quality control procedures for data sets using data from multiple systems.

5% Improves data delivery engineering job knowledge by attending educational workshops; reviewing professional publications; establishing personal networks; benchmarking state-of-the-art practices; participating in professional societies.

Location: We are seeking candidates who reside in MD, VA, or Washington, DC, and are eligible for conversion to FTE status.

Our client also requires that this hire come onsite once per week to their Reston, VA office.

Conversion to perm: Yes. Salary: $140K

Lead experience not required

Required Skills

8 years of experience in data engineering and working with cross-functional teams to implement scalable, fine-tuned ETL/ELT solutions for optimal performance. Experience developing and updating ETL/ELT scripts. Hands-on experience with application development, relational database layout and development, and data modeling.

Must have at least 5 years of ETL Design and Development experience using Ab Initio

Must have AWS experience (data services preferred, such as Kafka, Redshift, Glue, etc.)

2+ years of Data Integration project experience on Cloudera Platform

Working knowledge of HDFS, Hive, Impala and other related Hadoop technologies

Sound understanding of SQL and the ability to write well-performing SQL queries

1-2 years of Data Integration project experience on Hadoop Platform, preferably Cloudera

Ab Initio CDC (Change Data Capture) experience in a Data Integration/ETL project setting is a plus

Good knowledge of OLTP and OLAP data models and other data warehouse fundamentals

Rigor in high code quality, automated testing, and other engineering best practices; ability to write reusable code components

Ability to unit test the code thoroughly and to troubleshoot issues in production environments

Must have some working experience with Unix/Linux shell scripting

Must be able to work independently and support other junior developers as needed

Some Java development experience is nice to have

Knowledge of Agile Development practices is required

Bachelor's degree required.
