Ab Initio ETL Developer

Location: Dallas, TX (Onsite)
Duration: 12 Months
Job Type: W2 Contract

Required skills:

Please submit only candidates who are authorized to work in the United States.

Only applicants who are currently local to Dallas, Texas, or are willing to relocate will be considered.

Design, develop, and deploy ETL processes using Ab Initio GDE (see the illustrative transform sketch after this list).

Build high-performance data integration and transformation pipelines.

Work with Ab Initio Co-Operating System, EME (Enterprise Meta Environment), and metadata-driven development.

Develop and optimize graphs for batch and real-time processing.

Integrate with RDBMS (Oracle, SQL Server, Teradata, DB2, etc.) and external data sources.

Implement continuous flows, web services, and message-based integration with Ab Initio, including:

o Continuous Flows (Co-Op & GDE)

o Plans and Psets

o Conduct-It for job scheduling and orchestration

o Graphs and Parameter Sets
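
For context on the transform work described above, here is a minimal sketch of a Reformat transform in Ab Initio DML. The field names (customer_id, first_name, last_name, full_name, load_ts) are hypothetical illustrations, not part of this role's actual data model:

    /* Hypothetical field derivations for a Reformat component (illustration only). */
    out :: reformat(in) =
    begin
      /* Pass the key field through unchanged. */
      out.customer_id :: in.customer_id;
      /* Derive a display name from two source fields; string_concat is a DML built-in. */
      out.full_name :: string_concat(in.first_name, " ", in.last_name);
      /* Stamp each record with the load time; assumes load_ts is a datetime field. */
      out.load_ts :: now();
    end;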

Nice-to-have skills:

Exposure to AWS, Azure, or Google Cloud Platform for cloud-based data solutions.

Experience with big data ecosystems (Hadoop, Spark, Hive, Kafka) is a strong plus.

Containerization (Docker, Kubernetes) knowledge is desirable.

Monitoring & Security:

Job monitoring and scheduling experience (Control-M, Autosys, or similar).

Familiarity with security standards, encryption, and access management.
