Job Summary
We are seeking an experienced and innovative AI Engineer with a strong background in Generative AI, large language models (LLMs), and advanced analytics.
This role involves designing and developing AI-driven solutions using state-of-the-art tools and frameworks to solve complex business problems.
The ideal candidate will have hands-on experience with NLP, NLG, image processing, and AI agent development, and will thrive in a fast-paced, collaborative environment.
Key Responsibilities
Design and implement AI solutions using Generative AI, LLMs, and deep learning frameworks.
Develop and fine-tune models using tools such as TensorFlow, PyTorch, and AWS SageMaker.
Build and deploy NLP/NLG models for tasks such as sentiment analysis, topic modeling, and named entity recognition.
Work with image models such as CLIP and ResNet, and benchmark datasets like COCO.
Develop AI agents using agentic frameworks such as LangChain, along with techniques like RAG and PEFT.
Integrate and manage data pipelines using SQL, Spark, and AWS services.
Collaborate with cross-functional teams to translate business needs into AI solutions.
Ensure compliance with data privacy and security standards.
Utilize tools such as Git, Jenkins, Tableau, and Kibana for version control, CI/CD, and visualization.
Work with structured and unstructured data, performing data cleansing and normalization.
Stay current with advancements in AI, LLMs, and related technologies.
Required Qualifications
Master’s degree in Computer Science, Statistics, Math, Engineering, or a related field (PhD preferred).
3+ years of experience in machine learning or deep learning model development.
1+ year of experience with deep learning (CNN, RNN, LSTM) and NLP/NLG tools.
Proficiency in Python (SciPy, NumPy, PySpark), SQL, and shell scripting.
Experience with AWS SageMaker, Jupyter Notebook, and cloud-based ML environments.
Hands-on experience with LLMs (OpenAI, LLaMA, Claude, Cohere), LangChain, LoRA, RAG, and Knowledge Graphs.
Familiarity with NLP libraries such as NLTK and spaCy, and transformer models like BERT and RoBERTa.
Experience with data visualization tools such as Tableau, Kibana, or AWS QuickSight.
Strong understanding of data governance, privacy, and security best practices.
Experience with containerization and orchestration tools like Docker and Kubernetes.
Preferred Qualifications
Experience with search architectures (Solr, Elasticsearch, AWS OpenSearch).
Familiarity with ontology and semantic web technologies (Zeno, OWL, RDF, SPARQL).
Knowledge of microservices, service mesh, and API development.
Experience with Domino Datalab or similar platforms.
PhD in a relevant field is strongly preferred.
Education: Bachelor's degree