ETL/ELT Developer Key Responsibilities
Design, develop, test, and implement ETL/ELT solutions using FiveTran, DBT, and Snowflake to support regulatory reporting and historical data integrations.
Develop and maintain code for data mappings and transformations in both on-premise (DB2, AIX) and cloud-based (AWS, Snowflake) environments.
Collaborate with business analysts to review requirements, assist in system specifications, and create technical specifications for data integrations.
Build and optimize data pipelines using FiveTran for data ingestion and DBT for transformation, ensuring compliance with Informatica development standards.
Perform in-depth unit testing, participate in system and user acceptance testing, and support production implementation with comprehensive documentation.
Work with DBAs to implement database changes and optimize Snowflake schemas, tables, and queries for performance.
Analyze and resolve data integration issues, propose solutions, and enhance existing processes for efficiency.
Provide accurate effort and duration estimates for development tasks and participate in code reviews to ensure quality and compliance.
Required Skills and Qualifications
ETL/ELT Development (50%)
3-7 years of experience in ETL/ELT design and development, with proficiency in FiveTran for data ingestion and DBT for data transformation.
Strong expertise in Snowflake for schema creation, table setup, query optimization, and performance tuning.
Experience with Informatica Power Center and Informatica Cloud for data mappings and transformations.
Familiarity with data warehousing principles and API-based data integrations.
Database and Scripting (50%)
Proficient in database development using DB2, SQL Server, and Snowflake, with strong SQL skills for writing and optimizing complex queries.
Experience with Python scripting for automation, data processing, and workflow orchestration.
Knowledge of software development life cycle (SDLC) methodologies and of supporting tools for version control and issue tracking (e.g., Git, Jira).
Understanding of data modeling principles to support efficient database design.
Additional Considerations
Knowledge of the Property & Casualty (P&C) insurance industry is a plus.
Experience with stored procedures, Unix/Perl scripting, or balancing and control mechanisms is advantageous.
Familiarity with AWS components (e.g., S3, Glue, RDS) and cloud migration strategies is highly valued.
Ability to develop metrics and visualizations for data insights is a plus.
Candidates must be local to Central New Jersey to facilitate collaboration and occasional on-site work.
Why Join Us?
This role offers an opportunity to join a dynamic Data Services team responsible for critical regulatory and reporting integrations. You'll work with modern tools like FiveTran, DBT, and Snowflake, contribute to the modernization of our data infrastructure, and collaborate with a forward-thinking team dedicated to delivering high-quality data solutions that drive business success.
What is a Pipeline Job?
These roles represent future opportunities we've uncovered through our client discussions. We have stripped away the rigid must-haves, mandatories, and required criteria to focus on finding the right fit for our clients' needs. When you apply for one of these future roles, we complete our human-centered process to see whether you are a fit, and we add your profile to our database so you can be considered for additional openings. Rest assured a human will thoroughly review your resume and respond to you personally. We take pride in finding the right match for each job, valuing your unique talents and potential over just what's on your resume.