Post Job Free

Sr. Ab Initio Developer

Company:
NR Consulting
Location:
United States
Posted:
February 03, 2026

Description:

Title: Ab Initio ETL Developer

Work Location: Dallas, TX

Position Type: Contract

Duration: Long Term

Job Description:

• Design, develop, and deploy ETL processes using Ab Initio GDE.

• Build high-performance data integration and transformation pipelines.

• Work with Ab Initio Co-Operating System, EME (Enterprise Meta Environment), and metadata-driven development.

• Develop and optimize graphs for batch and real-time processing.

• Integrate with RDBMS (Oracle, SQL Server, Teradata, DB2, etc.) and external data sources.

• Implement continuous flows, web services, and message-based integration with Ab Initio.

o Continuous Flows (Co-Op & GDE)
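Ab Initio graphs are built visually in the proprietary GDE rather than written as open-source code, but the batch extract-transform-load pattern the bullets above describe can be sketched generically in Python. This is illustrative only: `sqlite3` stands in for the RDBMS, and all table and column names are hypothetical.

```python
import sqlite3

def run_batch_etl(conn: sqlite3.Connection) -> int:
    """Extract from raw_orders, transform, and load into clean_orders.

    Returns the number of rows loaded. Table/column names are hypothetical.
    """
    cur = conn.cursor()
    # Extract: pull the full source table (a batch graph would read a file or DB).
    rows = cur.execute(
        "SELECT order_id, customer, amount FROM raw_orders"
    ).fetchall()
    # Transform: normalize customer names, drop non-positive amounts.
    cleaned = [
        (order_id, customer.strip().upper(), amount)
        for order_id, customer, amount in rows
        if amount > 0
    ]
    # Load: write the transformed records to the target table.
    cur.executemany(
        "INSERT INTO clean_orders (order_id, customer, amount) VALUES (?, ?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL);
        CREATE TABLE clean_orders (order_id INTEGER, customer TEXT, amount REAL);
        INSERT INTO raw_orders VALUES (1, '  alice ', 10.0), (2, 'bob', -5.0);
        """
    )
    print(run_batch_etl(conn))  # 1 — only the positive-amount row is loaded
```

In a real Ab Initio graph the same three stages map to input, transform, and output components, with parallelism handled by the Co-Operating System rather than application code.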

Nice to have skills:

• Exposure to AWS, Azure, or GCP for cloud-based data solutions.

• Experience with big data ecosystems (Hadoop, Spark, Hive, Kafka) is a strong plus.

• Containerization (Docker, Kubernetes) knowledge desirable.

Monitoring & Security:

• Job monitoring and scheduling experience (Control-M, Autosys, or similar).

• Familiarity with security standards, encryption, and access management.
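Several items above (Kafka, and the continuous/message-based flows in the description) share the same shape: a long-running, record-at-a-time consumer loop. A minimal generic sketch, with `queue.Queue` standing in for real broker topics and all field names hypothetical:

```python
import queue
import threading

SENTINEL = None  # end-of-stream marker

def continuous_flow(inbound: queue.Queue, outbound: queue.Queue) -> None:
    """Consume records one at a time, transform each, emit downstream.

    A generic analogue of an Ab Initio continuous graph or a Kafka
    consumer loop; the queues stand in for broker topics.
    """
    while True:
        record = inbound.get()
        if record is SENTINEL:
            outbound.put(SENTINEL)  # propagate end-of-stream
            break
        # Transform: add a simple derived field to each record.
        outbound.put({**record, "amount_cents": round(record["amount"] * 100)})

if __name__ == "__main__":
    inbound, outbound = queue.Queue(), queue.Queue()
    worker = threading.Thread(target=continuous_flow, args=(inbound, outbound))
    worker.start()
    inbound.put({"id": 1, "amount": 9.99})
    inbound.put(SENTINEL)
    worker.join()
    print(outbound.get())  # {'id': 1, 'amount': 9.99, 'amount_cents': 999}
```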

Skills:


• Continuous Flows (Co-Op & GDE)

• Plans and Psets

• Conduct-It for job scheduling and orchestration

• Graphs and Parameter Sets
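Conduct-It plans, psets, and graph orchestration are Ab Initio-specific, but the underlying idea — run parameterized jobs in dependency order — can be sketched generically. Task names and the parameter set below are hypothetical.

```python
from graphlib import TopologicalSorter

def run_plan(tasks, dependencies, pset):
    """Execute tasks in dependency order, passing the parameter set to each.

    tasks: name -> callable(pset); dependencies: name -> set of prerequisites
    (the generic analogue of a Conduct-It plan). Returns the execution order.
    """
    order = list(TopologicalSorter(dependencies).static_order())
    for name in order:
        tasks[name](pset)
    return order

if __name__ == "__main__":
    log = []
    tasks = {
        "extract": lambda p: log.append(f"extract from {p['src']}"),
        "transform": lambda p: log.append("transform"),
        "load": lambda p: log.append(f"load into {p['dst']}"),
    }
    deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
    pset = {"src": "raw_orders", "dst": "clean_orders"}  # the "pset" analogue
    print(run_plan(tasks, deps, pset))  # ['extract', 'transform', 'load']
```

Like a pset, the `pset` dict lets the same plan run against different sources and targets without changing the tasks themselves.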


Remote
