Job Description
We are looking for an experienced AWS Engineer to join our team. This long-term contract position offers an exciting opportunity to work on innovative AI initiatives using cutting-edge tools such as Amazon Bedrock and agentic AI frameworks. The ideal candidate will play a critical role in assessing current systems, defining goals, and implementing solutions that streamline processes and improve efficiency.
Responsibilities:
• Collaborate with stakeholders to evaluate existing systems and define AI development goals.
• Design and implement scalable solutions using Amazon Bedrock and Agentic AI frameworks.
• Develop efficient workflows using multiple AI agents to perform small, sequential tasks with smooth handoffs.
• Build and optimize ETL pipelines for seamless data extraction, transformation, and loading processes.
• Utilize Apache Spark and Hadoop to manage and analyze large datasets effectively.
• Integrate Apache Kafka for real-time data processing and event-driven architectures.
• Leverage AWS services, including AWS Data Pipeline, to create robust cloud-based data solutions.
• Troubleshoot and refine AI models and systems to ensure performance and reliability.
• Provide technical expertise and support for ongoing AI initiatives and day-to-day operations.
• Document processes and workflows to facilitate knowledge sharing and system maintenance.
Requirements:
• Proficiency in Python programming for data manipulation and system integration.
• Hands-on experience with Apache Spark, Hadoop, and Kafka for big data processing.
• Strong knowledge of ETL processes and tools to handle complex data workflows.
• Expertise in Amazon Web Services (AWS), including AWS Data Pipeline.
• Familiarity with Amazon Bedrock and Agentic AI frameworks for AI development.
• Ability to design and implement scalable solutions using cloud-based architectures.
• Excellent problem-solving skills and ability to troubleshoot complex systems.
• Strong communication skills to collaborate with cross-functional teams and stakeholders.
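To illustrate the multi-agent workflow mentioned in the responsibilities (multiple AI agents performing small, sequential tasks with smooth handoffs), here is a minimal Python sketch. The agent names, the `Handoff` structure, and the stub task logic are illustrative assumptions, not a specific Amazon Bedrock or framework API; in practice each agent function would call a model (for example via the Bedrock runtime) instead of the placeholder string operations used here.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Handoff:
    """Carries the running result and an audit trail from agent to agent."""
    payload: str
    history: List[str] = field(default_factory=list)

# Each "agent" is a small function that performs one task, records what it
# did, and hands its result off to the next agent. These stubs stand in for
# real model calls.
def extract_agent(h: Handoff) -> Handoff:
    h.payload = h.payload.strip()        # placeholder "extraction" step
    h.history.append("extract")
    return h

def transform_agent(h: Handoff) -> Handoff:
    h.payload = h.payload.upper()        # placeholder "transformation" step
    h.history.append("transform")
    return h

def summarize_agent(h: Handoff) -> Handoff:
    h.payload = f"summary({h.payload})"  # placeholder "summarization" step
    h.history.append("summarize")
    return h

def run_pipeline(agents: List[Callable[[Handoff], Handoff]], text: str) -> Handoff:
    """Run the agents sequentially; each receives the previous handoff."""
    h = Handoff(payload=text)
    for agent in agents:
        h = agent(h)
    return h

result = run_pipeline([extract_agent, transform_agent, summarize_agent], "  raw input  ")
print(result.payload)   # summary(RAW INPUT)
print(result.history)   # ['extract', 'transform', 'summarize']
```

The explicit `Handoff` object keeps each step small and auditable: any agent can be swapped out or re-run independently, which is the property the "smooth handoffs" requirement is after.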