Job Description
Do you love a career where you can experience, grow, and contribute at the same time, while earning at least 10% above market rate? If so, we are excited to have found you.
Learn how we are redefining the meaning of work, and be part of a team raved about by clients, job seekers, and employees.
If you are an ETL Developer looking for excitement, challenge, and stability in your work, you will be glad you found this page.
We are an IT Solutions Integrator/Consulting Firm helping our client hire the right professional for an exciting long-term project. Here are a few details.
See if you are ready to maximize your earning and growth potential by leveraging our Disruptive Talent Solution.
Role: ETL Developer
Location: McLean, VA
Experience: 8 years
Requirements
We are looking for a skilled and versatile ETL Developer with strong expertise in ETL/ELT pipeline development, data design, cloud technologies (especially AWS), and automation practices (CI/CD). The ideal candidate will have experience across all stages of the data lifecycle, from ingestion and transformation to orchestration and deployment, using tools such as IICS (Informatica Intelligent Cloud Services), Python/PySpark, and shell scripting.
Key Responsibilities:
Design, build, and maintain robust ETL/ELT data pipelines using IICS, Python, or PySpark to support large-scale data processing and analytics.
Collaborate with data architects and analysts to design scalable data models and processing solutions.
Develop and maintain shell scripts for task automation, job orchestration, and system monitoring.
Work closely with DevOps teams to implement CI/CD pipelines for data solutions, ensuring fast and reliable deployments.
Deploy and manage data workflows and infrastructure on AWS cloud services (e.g., S3, Lambda, Glue, EMR, Redshift, Athena).
Ensure data quality, integrity, and compliance through testing, validation, and monitoring frameworks.
Participate in performance tuning and optimization of ETL jobs and data processing applications.
Troubleshoot data pipeline failures and perform root cause analysis and resolution.
Required Skills:
ETL/ELT Tools: Hands-on experience with Informatica IICS or similar platforms.
Programming: Strong proficiency in Python and/or PySpark for data transformation and processing.
Scripting: Advanced knowledge of Shell scripting in Unix/Linux environments.
Cloud: Experience working with AWS services like S3, EC2, Glue, Redshift, Lambda, etc.
CI/CD: Familiarity with tools like Jenkins, GitLab CI, or AWS CodePipeline.
Data Modeling & Design: Ability to translate business requirements into scalable and efficient data architecture.
Strong problem-solving and communication skills, with an ability to collaborate across technical and business teams.
Nice to Have:
Exposure to data governance, metadata management, or data cataloging tools.
Knowledge of SQL tuning and performance optimization techniques.
Experience with monitoring tools (e.g., CloudWatch, DataDog).
Understanding of Agile/Scrum methodologies.
Educational Qualifications:
Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or related field.
Benefits
Alignity Solutions is an Equal Opportunity Employer, M/F/V/D.
Full-time