
AI/ML Developer

Location: Manchester, NH, 03102
Posted: July 08, 2025


Resume:

PAGIDI NAGA SRINU

AI/ML Consultant | AI Developer | Machine Learning Specialist

Email: ****************@*****.*** | LinkedIn

Contact: +1-603-***-****

PROFILE:

Results-driven AI/ML Consultant with over 3 years of experience designing, implementing, and optimizing AI/ML models to solve complex business problems. Proficient in Python, cloud services (AWS, Azure), and advanced ML frameworks. Adept in model evaluation, bias mitigation, and performance tuning. Experienced in integrating AI/ML solutions across cloud architectures and driving actionable insights through data analysis. Passionate about staying at the forefront of AI advancements and leveraging them to create scalable, real-world solutions.

EDUCATION:

• Master's Degree in Computer Information Systems, New England College, Henniker, New Hampshire.

EXPERTISE:

• Built and deployed end-to-end AI/ML pipelines using PyTorch and TensorFlow for real-world business use cases.

• Developed AI chatbot solutions using AWS Bedrock, Claude, and Step Functions, enabling intelligent automation and customer interaction.

• Deployed advanced NLP models on Azure Machine Learning and Azure OpenAI, supporting secure, scalable language-based applications.

• Collaborated with cross-functional teams to design and implement scalable, cloud-native machine learning systems.

• Designed SQL-based data workflows for effective feature engineering and automated data processing.

• Applied machine learning to solve business problems with data-driven decision systems and predictive modeling.

• Demonstrated a strong analytical mindset, translating complex problems into practical AI/ML solutions.

• Continuously stay up to date with the latest trends in AI, ML, and LLMs through research, forums, and industry publications.

• Transformed business requirements into analytical models, designed ML algorithms, and deployed solutions in production environments.

• Practiced MLOps, including model versioning, CI/CD pipelines, and containerized deployments using Docker, Kubernetes, Flask, and FastAPI.
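
To illustrate the containerized serving pattern, below is a minimal FastAPI sketch; the model file name, feature fields, and port are hypothetical placeholders, and a service like this would normally be packaged with Docker and released through the CI/CD pipeline.

# Minimal FastAPI service wrapping a pickled scikit-learn model.
# "model.joblib" and the feature names are illustrative assumptions.
import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="churn-model-service")
model = joblib.load("model.joblib")  # assumed to be baked into the container image

class Features(BaseModel):
    tenure_months: float
    monthly_charges: float
    support_tickets: int

@app.post("/predict")
def predict(features: Features) -> dict:
    # Arrange inputs in the order the model was trained on.
    x = np.array([[features.tenure_months,
                   features.monthly_charges,
                   features.support_tickets]])
    proba = float(model.predict_proba(x)[0, 1])
    return {"churn_probability": proba}

# Run locally with:  uvicorn app:app --host 0.0.0.0 --port 8000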

• Proficient in machine learning techniques such as Decision Trees, Random Forest, Naïve Bayes, Logistic Regression, Linear/Multiple Regression, K-Means, KNN, SVM, and deep learning models like CNNs, RNNs, LSTMs, and Autoencoders.

• Worked with Apache Spark (Streaming & SQL), Hadoop, and Hive for large-scale distributed data processing.

• Developed Kafka-based event-driven architectures, including custom producers and consumers for real-time analytics.
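
A minimal sketch of that producer/consumer pattern, assuming the kafka-python client and a broker at localhost:9092; the topic name and payload are illustrative only.

# Event-driven sketch with kafka-python (assumed); broker and topic are placeholders.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("clickstream", {"user_id": 42, "event": "add_to_cart"})
producer.flush()

consumer = KafkaConsumer(
    "clickstream",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=10_000,  # stop iterating if no new events arrive
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # downstream: feed real-time analytics/aggregations
    break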

• Delivered and maintained AI/ML model lifecycles, covering development, training, testing, and deployment for both real-time and batch inference.

• Created innovative AI applications, including a generative AI gift suggestion tool and an automated NLP-powered incident reporting system.

• Designed industry-specific AI/ML use cases using PyTorch, TensorFlow, and Keras, addressing key business challenges.

• Integrated open-source LLMs such as Mistral, LLaMA-2, and Gemma with AWS services, leveraging AWS Bedrock for scalable AI deployment.

• Automated machine learning workflows and reporting using Python and SQL, reducing manual effort and improving reliability.

• Gained hands-on experience across AWS, Azure, and GCP, working on AI/ML solutions using serverless computing, data lakes, and cloud-native services.

WORK EXPERIENCE:

Kanerika (Hyderabad, Telangana) (January 2021 to August 2023)

Role: AI Developer

• Collaborated with data scientists and domain experts to solve key retail challenges such as demand forecasting, customer churn prediction, and fraud detection using scalable ML solutions on Google Cloud.

• Designed and deployed personalized recommendation systems and dynamic pricing models based on historical customer behavior data.

• Built generative AI solutions for automating product descriptions and summarizing customer reviews using PaLM, LLaMA 3.x, and Mixtral on Vertex AI.

• Fine-tuned domain-specific LLM pipelines using Mistral and Pixtral, improving chatbot accuracy and automating customer support processes.

• Developed Retrieval-Augmented Generation (RAG) assistants to support product catalog management and customer segmentation strategies.
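
A bare-bones sketch of the retrieval step behind such an assistant, assuming the sentence-transformers package for embeddings; the catalog snippets and model name are illustrative, and the final LLM generation call is omitted.

# Embed catalog snippets, retrieve the closest ones for a query, and build a grounded prompt.
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "SKU 1042: waterproof hiking boots, sizes 6-13, returns within 30 days.",
    "SKU 2201: insulated winter jacket, available in navy and olive.",
    "SKU 3310: trail running shoes, lightweight mesh upper.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
doc_vecs = encoder.encode(docs, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    q = encoder.encode([query], normalize_embeddings=True)
    scores = doc_vecs @ q[0]              # cosine similarity (vectors are normalized)
    top = np.argsort(scores)[::-1][:k]
    return [docs[i] for i in top]

question = "Which boots are waterproof and what is the return window?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # pass `prompt` to whichever LLM backs the assistant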

• Created real-time data pipelines with BigQuery, Cloud Composer (Airflow), and Snowflake to support inventory tracking and sales analytics.

• Built NLP pipelines for intent classification, keyword extraction, and sentiment analysis to enhance personalized marketing campaigns.

• Developed predictive models for return likelihood, customer lifetime value (LTV), and sales forecasting using PyTorch, TensorFlow, and Scikit-learn.
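
For example, a sales-forecasting-style regressor in scikit-learn might be set up as below; the synthetic features stand in for real warehouse data.

# Illustrative forecasting-style regressor on synthetic data; real features
# (promotions, seasonality, store id, etc.) would come from the data warehouse.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1_000
X = np.column_stack([
    rng.integers(1, 53, n),        # week of year
    rng.integers(0, 2, n),         # promotion flag
    rng.normal(100, 20, n),        # prior-week sales
])
y = 50 + 0.8 * X[:, 2] + 15 * X[:, 1] + rng.normal(0, 5, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))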

• Applied regression, clustering, and decision trees to uncover seasonal trends and optimize ad spending strategies.

• Implemented scalable, asynchronous APIs using FastAPI to serve real-time search and recommendation features.

• Built interactive Tableau and Kibana dashboards to visualize KPIs, conversion rates, and model insights for business stakeholders.

• Set up CI/CD workflows with GitHub Actions and Jenkins to automate ML model testing, versioning, and deployment on GCP.

• Used Python libraries like pandas, NumPy, Seaborn, Matplotlib, NLTK, and SciPy for data cleaning, feature engineering, and exploratory analysis.
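
A representative slice of that preprocessing work in pandas; the orders table and column names are hypothetical.

# Typical cleaning / feature-engineering steps on a toy orders table.
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "order_date": ["2020-01-05", "2020-01-06", None, "2020-02-01"],
    "amount": [120.0, None, 85.5, 40.0],
    "channel": ["web", "store", "web", "app"],
})

orders["order_date"] = pd.to_datetime(orders["order_date"])
orders["amount"] = orders["amount"].fillna(orders["amount"].median())  # impute
orders["order_month"] = orders["order_date"].dt.month                  # calendar feature
orders = pd.get_dummies(orders, columns=["channel"])                   # one-hot encode
print(orders.describe(include="all"))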

• Designed and trained deep learning models (CNNs, RNNs) for product image tagging and modeling customer behavior sequences.
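
A compact PyTorch sketch of the kind of CNN used for product-image tagging; the input resolution, channel sizes, and number of tags are illustrative assumptions.

# Small convolutional tagger; assumes 64x64 RGB inputs and 10 candidate tags.
import torch
from torch import nn

class ProductTagger(nn.Module):
    def __init__(self, num_tags: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_tags)  # 64x64 input -> 16x16 maps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

logits = ProductTagger()(torch.randn(8, 3, 64, 64))  # batch of 8 dummy images
print(logits.shape)  # torch.Size([8, 10])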

• Worked closely with product and engineering teams to integrate AI/ML features into production systems, ensuring scalability and business alignment.

Tech Vedika (Hyderabad, Telangana) (September 2019 to December 2020)

Role: ML Developer

• Collaborated with cross-functional teams to support the design, development, and deployment of AI/ML systems on Amazon Web Services (AWS), contributing to real-world projects across retail and media domains.

• Assisted in delivering production-ready ML and generative AI solutions focused on improving user engagement, automating repetitive tasks, and enhancing system performance.

• Worked alongside data scientists and engineers to apply best practices across the ML lifecycle, from initial data exploration and feature engineering to model deployment and basic monitoring.

• Contributed to the development of deep learning models using TensorFlow and PyTorch for emotion and demographic recognition in marketing ads, helping improve prediction accuracy by 20%.

• Deployed NLP and DL models using Amazon SageMaker, experimenting with built-in algorithms and learning to configure container-based training environments.

• Wrote Python code leveraging Scikit-learn, spaCy, Transformers, and TensorFlow, supporting various data preprocessing, model training, and evaluation workflows.

• Evaluated multiple ML models, ultimately contributing to selecting Decision Trees based on comparative testing against SVMs, Random Forests, and Gradient Boosted Trees.
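
The comparison can be reproduced in outline with scikit-learn cross-validation, as sketched below; the synthetic dataset and scores are illustrative only.

# Head-to-head comparison of the candidate models via 5-fold cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2_000, n_features=20, random_state=0)
models = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "svm": SVC(),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="f1")
    print(f"{name:18s} F1 = {scores.mean():.3f} +/- {scores.std():.3f}")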

• Supported the implementation of AWS SageMaker Pipelines, gaining hands-on exposure to automating model training, batch inference, and deployment processes.

• Assisted with deploying containerized ML services using Docker, ECR, ECS, and EKS, learning how to manage scalable model lifecycles in production.

• Used MLflow for basic experiment tracking and version control, integrated with S3 and Databricks during team workflows.
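
A minimal MLflow tracking sketch of the kind used for that experiment logging; the experiment name and parameters are assumptions, and the artifact store defaults to local (team runs would point at S3/Databricks).

# Log parameters, a metric, and the trained model as a versioned MLflow artifact.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
mlflow.set_experiment("demo-experiment")

with mlflow.start_run():
    params = {"n_estimators": 200, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=0).fit(X, y)
    mlflow.log_params(params)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")  # stored and versioned with the run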

• Helped build and test deep learning architectures (CNNs, RNNs) for image and speech recognition, contributing to performance optimization with attention mechanisms.

• Integrated Hugging Face models with LangChain to build early-stage LLM applications; deployed chat functionality via API Gateway + Lambda.
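
A stripped-down stand-in for the LLM piece of that chat feature, using the Hugging Face transformers pipeline with a small open model; the LangChain orchestration and the API Gateway + Lambda deployment are omitted, and the model name is only a placeholder.

# Generate a short reply with a small open model via the transformers pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

def answer(prompt: str) -> str:
    out = generator(prompt, max_new_tokens=40, do_sample=False)
    return out[0]["generated_text"]

print(answer("Customer asks: where is my order? Draft a short reply:"))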

• Developed a basic churn prediction pipeline using Random Forest and Gradient Boosting, supporting a measurable churn rate reduction of 15%.
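
A sketch of such a churn pipeline with scikit-learn; the column names and toy data are hypothetical, and a Gradient Boosting estimator can be swapped in for the Random Forest.

# Preprocessing (scaling + one-hot encoding) chained to a Random Forest classifier.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "tenure_months": [3, 24, 48, 6, 60, 12],
    "monthly_spend": [70.0, 30.5, 95.0, 20.0, 55.0, 80.0],
    "plan": ["basic", "pro", "pro", "basic", "basic", "pro"],
    "churned": [1, 0, 0, 1, 0, 1],
})
X, y = df.drop(columns="churned"), df["churned"]

pre = ColumnTransformer([
    ("num", StandardScaler(), ["tenure_months", "monthly_spend"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["plan"]),
])
clf = Pipeline([("pre", pre), ("rf", RandomForestClassifier(random_state=0))])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=0, stratify=y
)
print("holdout accuracy:", clf.fit(X_train, y_train).score(X_test, y_test))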

• Participated in building generative AI prototypes for document extraction tasks, refining prompt techniques to reduce hallucinations and boost relevance.

• Automated demographic data collection processes using RPA tools, improving data accuracy and reducing manual data entry efforts.

• Helped design and manage ETL pipelines using AWS Glue, Lambda, Step Functions, and Python, contributing to a 40% gain in data pipeline efficiency.

• Built simple, scalable backend services using FastAPI and Docker, deployed via ECS Fargate for seamless model serving.

• Explored and helped deploy reusable MLOps components as part of an internal ML platform-as-a-service, supporting faster development cycles.

• Wrote and maintained code using tools like PyCharm, Eclipse, Pyscript, and Sublime Text, following clean coding practices and collaborative version control.


