Job Title: Data Engineer
Location: India - Remote
Job ID: 23649
Company Overview
At Wimmer Solutions, we’ve built a reputation for results-driven, innovative business and technology solutions that help companies execute on their strategic initiatives. We tailor our services to fit our clients’ needs, from consulting and technology process outsourcing to interim technical talent resourcing.
We’re all about people and community. Since 2002, we have offered technology staffing and managed services for the greater Seattle area. We focus on getting to know our clients and candidates to create lasting partnerships and ensure success. Not only do we provide a wide variety of benefits such as medical and dental coverage, but we are also committed to inspiring and creating positive change in the world through our community outreach programs. You will join a large team of professionals who are passionate about their careers as well as their community.
Position Summary:
We are seeking an experienced and motivated Data Engineer to join our team. In this role, you will design, develop, and maintain scalable data solutions that support critical business needs. You’ll work across distributed data platforms, cloud infrastructure, and modern orchestration tools to enable efficient and reliable data processing, storage, and analytics.
This role requires strong technical expertise and includes participation in an on-call rotation to ensure the stability and availability of production systems and pipelines.
NAP is a real-time, event-streaming analytical platform that delivers high-quality, pre-stitched 360-degree views of customers, products, inventory, customer service, fulfillment, logistics, and credit. The Insights Delivery Team provides data and insights to enable data analysts, data scientists, leadership, store personnel, and other business stakeholders to drive the Nordstrom customer experience with near real-time access to critical insights.
WHAT YOU GET TO DO:
Data Platform Development: Design and build scalable data pipelines using platforms such as BigQuery, Hadoop/EMR/DataProc, or Teradata.
Cloud Integration: Develop and optimize cloud-based solutions on AWS or GCP to process large-scale datasets.
Workflow Orchestration: Manage data workflows using Apache Airflow to ensure system reliability and performance.
Containerization: Deploy and manage containerized applications using Kubernetes for scalability and efficient resource management.
Event Streaming: Implement event-streaming systems using Kafka for real-time data processing.
Programming & Automation: Write efficient, maintainable code in Python and SQL for data processing and analytics.
Database Management: Design and optimize both relational and non-relational databases for high-performance querying.
Collaboration: Partner with cross-functional teams including data scientists, analysts, and product managers to define and deliver data solutions.
Quality Assurance: Participate in code reviews and technical discussions to ensure delivery of high-quality, scalable data systems.
System Monitoring: Participate in the on-call rotation to monitor, troubleshoot, and resolve issues in the production environment.
On-Call Responsibilities
Monitor system health and respond to alerts promptly
Troubleshoot and resolve incidents to minimize service disruption
Escalate unresolved issues and document incident resolutions
WHAT YOU BRING
Primary Technologies: BigQuery, Hadoop/EMR/DataProc, Snowflake, Teradata, AWS, GCP, Kubernetes, Kafka, Apache Airflow, Python, SQL
Required Skills & Professional Experience:
Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent practical experience)
3+ years of hands-on experience in data engineering or related roles
Experience with distributed data platforms (BigQuery, Hadoop, Snowflake, Teradata, etc.)
Proficiency in Apache Airflow for orchestration
Strong programming skills in Python and SQL
Experience building cloud-native solutions on AWS and/or GCP
Familiarity with Kubernetes for container orchestration
Knowledge of Kafka for real-time data pipelines
Solid problem-solving skills and ability to troubleshoot complex systems
Strong communication and team collaboration skills
Preferred Qualifications (not required):
Familiarity with CI/CD pipelines and automated deployments
Knowledge of data governance, security, and compliance best practices
Exposure to DevOps tools and processes
MORE ABOUT WIMMER SOLUTIONS
Wimmer Solutions is proud to be an equal-opportunity employer. All applicants will be considered for employment regardless of race, color, religion or belief, age, gender identity, sexual orientation, national origin, parental status, veteran status, or disability. Wimmer Solutions is committed to achieving a diverse employee network through all aspects of the hiring process, and we welcome all applicants.
If you are passionate about what you do and want to join a team dedicated to diversity, equity, and inclusion in the workplace, we would love to hear from you. Get the job you have always wanted. For more career opportunities or to refer a friend, please visit our site and talk to a recruiter today.
Wimmer Solutions is committed to building a diverse team and encourages applications from people of all backgrounds.