Sridhar Madala
Email: ************@*****.***
Phone: 607-***-****
Accomplished Generative AI Engineer with around 8 years of experience designing and implementing advanced data solutions.
Proficient in leveraging Generative AI technologies and Azure Data Engineering platforms to drive innovation and efficiency. Demonstrated leadership across Data Engineering, Data Analysis, and Data Science roles.
Skilled in Data Migration, Predictive Modeling, and Web Development. Passionate about applying cutting-edge technologies to solve complex data challenges and deliver impactful results.
Proficient in a wide array of data technologies including SQL, NoSQL, Hadoop, Spark, and cloud platforms such as AWS, Azure, and GCP.
EDUCATION, TRAINING, CERTIFICATION
Master’s in Computer Science, State University of New York at Binghamton
Bachelor’s in Computer Science, Prasad V. Potluri Siddhartha Institute of Technology, Andhra Pradesh, India
Certifications:
Certification on Generative AI with Large Language Models (Offered by AWS & DeepLearning.AI)
https://www.coursera.org/account/accomplishments/verify/HU9ZYS48R95H
Google Cloud certification on Generative AI
https://www.coursera.org/account/accomplishments/verify/FWC3AKGZQE3S
Databricks Accredited “Generative AI Fundamentals”
https://credentials.databricks.com/f48f5c0e-a9be-4876-b8bc-ca97de9108f0#gs.9ditx7
Google certification for “Crash Course on Python”
https://www.coursera.org/account/accomplishments/verify/68W3LYGS9ADB
Microsoft's certification for “Azure Fundamentals”
https://learn.microsoft.com/en-us/users/sridharmadala-3971/credentials/16aef20bbf696f94
TECHNICAL ENVIRONMENT
Cloud: Azure DevOps, Azure Blob Storage, Azure Data Lake, Azure Data Factory, Azure Databricks, Azure Synapse, Azure SQL
AI/ML Techniques: Fine-tuning with custom data, vector embedding, NLP neural network optimization, MLOps, Docker, Kubernetes
Tech Stack: Python, PySpark, SQL, Java, Streamlit, HTML5, CSS3, Cognos, JavaScript, Tableau, Power BI, SSIS, SAS
Frameworks & Tools: TensorFlow, PyTorch, LangChain, LlamaIndex, Streamlit, Flask, Vertex AI Workbench, DSPy
Tools: Jupyter Notebook, PyCharm, Visual Studio, Eclipse, Git, Docker, Excel, Jira, ServiceNow, Kanban
Generative AI Technologies: Open-source and commercial LLMs (Llama 2, OpenAI, Google Gemini Pro)
Vector Databases: ChromaDB, Pinecone
Deployment Platforms: AWS Bedrock, AWS (EC2, Lambda), Azure Functions, Azure Machine Learning, Hugging Face
Domain: Healthcare, Insurance, Retail
MAJOR ASSIGNMENTS
Sphere, New York, USA Jul 2024 - Present
GenAI Engineer
Responsibilities:
Developed and deployed NLP models for intent detection, named entity recognition, and text summarization, leveraging Transformer architectures (e.g., BERT) to enhance chatbot accuracy
Implemented and fine-tuned both open-source and paid LLM models, customizing solutions to meet specific project requirements and performance goals
Managed and optimized vector databases such as ChromaDB and Pinecone, enhancing AI applications' data retrieval capabilities
Designed and implemented retrieval-augmented generation workflows that integrate vector databases (e.g., ChromaDB, Pinecone) with LangChain for dynamic content retrieval and summarization
Designed and adapted AI models and LLMs using Docker and cloud (GCP/Azure) expertise throughout the SDLC
Developed scalable AI solutions using LangChain and LlamaIndex frameworks, demonstrating expertise in generative AI technologies
Configured Azure Functions for serverless scaling of generative AI workloads, ensuring seamless dynamic resource allocation
Utilized Azure Blob Storage for efficient management and versioning of large training datasets and model artifacts
Engineered optimized prompts to improve model accuracy and responsiveness, leveraging best practices in prompt engineering for multiple generative AI projects
Developed reusable AI agents to automate processes and support development teams
Leveraged transfer learning to enhance LLM performance in domain-specific applications
Successfully developed and deployed cutting-edge AI applications, demonstrating a strong impact in enhancing user engagement and operational efficiency
Leveraged Azure ML for deploying AI models, utilizing cloud services to ensure scalability and reliability of AI applications
Integrated GPT-4 and similar LLMs using Azure OpenAI, implementing scalable and secure solutions to support advanced NLP workflows in production environments.
Fine-tuned transformer-based LLMs with advanced prompt engineering and in-context learning to optimize text generation
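The retrieval-augmented generation workflows described above can be illustrated with a minimal, self-contained sketch. This toy version uses a bag-of-words similarity in place of a real embedding model, and an in-memory list in place of ChromaDB or Pinecone; the document strings and function names are illustrative, not from any production system.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real pipeline would call an
    # embedding model and store the vectors in ChromaDB or Pinecone.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank stored documents by similarity to the query embedding.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    # Ground the LLM call by prepending retrieved context to the question.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Azure Functions scale generative AI workloads on demand.",
    "Blob Storage versions large training datasets.",
    "Prompt engineering improves model accuracy.",
]
print(build_prompt("How do AI workloads scale?", docs))
```

In a production RAG stack, `embed`, `retrieve`, and `build_prompt` are replaced by an embedding model, a vector database query, and a LangChain prompt template respectively, but the data flow is the same.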
C&S Wholesale Grocers, Texas, USA Aug 2023 – Jul 2024
GenAI Engineer
Responsibilities:
Led the development of user-friendly Streamlit web applications, integrating Python to improve user experience.
Seamlessly integrated APIs, managing data for front-to-back communication in Generative AI apps.
Managed Streamlit-to-Generative-AI communication, enhancing data flow and user interactions with LangChain and LlamaIndex.
Managed and developed generative AI tools with a focus on retrieval-augmented generation (RAG).
Designed, developed, and deployed chatbots and virtual assistants using RAG, leveraging advanced AI capabilities to enhance customer engagement and streamline business processes.
Enhanced NLP solutions across sentiment analysis and NER, improving text classification and topic modeling.
Applied TensorFlow, PyTorch, scikit-learn techniques to NLP models, ensuring scalability and performance.
Expedited NLP development and improved model accuracy using NLTK, spaCy, Gensim, Transformers.
Deployed responsive visual elements in Streamlit apps, enriching user engagement with AI content via Python.
Optimized chatbots with LangChain and LLMs for superior performance, integrating diverse query engines.
Enhanced productivity by deploying RAG for summarization and chatbots, leveraging LangChain and retrieval methods.
Engineered robust RAG-based models with diverse Vector DBs for scalable data handling.
Directed Streamlit and Generative AI integration projects, ensuring timely milestone achievement.
Leveraged LLMs and RAG in application development, enhancing user experience via LangChain and OpenAI.
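A preprocessing step that typically sits behind the RAG summarization and chatbot bullets above is splitting source documents into overlapping chunks before embedding them into a vector database. A minimal sketch (the size and overlap values are illustrative defaults, not taken from any actual project):

```python
def chunk_text(text, size=200, overlap=50):
    """Split text into overlapping character chunks for embedding.

    The overlap preserves context that would otherwise be cut off
    at chunk boundaries before vectors are stored in a vector DB.
    """
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

doc = "word " * 100  # 500-character stand-in for a source document
pieces = chunk_text(doc, size=200, overlap=50)
print(len(pieces), "chunks")
```

LangChain ships text splitters that do this (plus token- and sentence-aware variants), but the character-level logic is exactly this loop.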
MCG Health, Seattle, United States May 2023 – Aug 2023
Azure Data Engineer
Responsibilities:
Designed and implemented a service in the .NET Framework that triggers the creation of a Jira issue when a case is registered in Salesforce.
Leveraged Azure Logic Apps to streamline the integration process, ensuring real-time synchronization between Salesforce and Jira.
Achieved a 30% improvement in response times by automating team notifications through the On-Call Database and PagerDuty.
Integrated the Slack API to automatically create dedicated channels for new issues, optimizing team collaboration.
Programmed the service to automatically add relevant team members to Slack channels based on issue criteria, facilitating swift problem resolution.
Tested and refined the service, leading to efficient case handling and improved interdepartmental communication.
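The Slack automation described above can be sketched as follows. `conversations.create` is Slack's real Web API method for creating channels, but the issue key, token, and helper names here are hypothetical; the sketch only builds the request and does not send it.

```python
import json
import re
from urllib import request

def channel_name_for(issue_key):
    # Slack channel names must be lowercase and use hyphens rather
    # than spaces or other punctuation, capped at 80 characters.
    name = re.sub(r"[^a-z0-9-]", "-", issue_key.lower())
    return f"issue-{name}"[:80]

def build_create_request(issue_key, token):
    # Build a request for Slack's conversations.create Web API method.
    payload = json.dumps({"name": channel_name_for(issue_key)}).encode()
    return request.Request(
        "https://slack.com/api/conversations.create",
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_create_request("MCG-1042", token="xoxb-dummy")
print(req.full_url, json.loads(req.data)["name"])
```

Sending the request with `request.urlopen(req)` and then inviting members via `conversations.invite` would complete the flow sketched in the bullets.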
Tata Consultancy Services, Chennai, India Sep 2020 – Jul 2022
Azure Data Engineer
Responsibilities:
Achieved efficient data processing by spearheading the design of cloud-native ETL processes, ensuring seamless integration.
Increased data integrity by employing Python APIs and Azure Data Factory for data migration to Azure Blob storage.
Optimized data pipeline scalability within Azure, enhancing performance by utilizing Data Factory and Databricks.
Seamlessly integrated Python and PySpark using diverse ETL tools, facilitating intricate data transformations.
Improved structured data output by leveraging Python and SQL for ETL from various sources to CSV files.
Streamlined development workflows by implementing Azure DevOps best practices for continuous integration and delivery.
Designed and implemented tailored NoSQL data models, utilizing Cosmos DB's flexible schema and multi-model capabilities for optimized performance and scalability.
Implemented robust Azure Monitoring Service solutions, meticulously tracking the execution and performance of daily scheduled jobs to ensure optimal operational efficiency.
Demonstrated proficiency across a spectrum of Azure services, including Synapse, Azure SQL, Azure Data Lake Gen2, Azure Blob Storage, Azure Container Services, Azure Functions, SQL Data Warehouse, and Service Bus, harnessing their capabilities to meet diverse business needs.
Applied advanced encryption methodologies, leveraging AES encryption, Databricks, PySpark, Python, Key Vault, and service principles to ensure data security and compliance.
Leveraged API calls and HTTP requests to seamlessly pull data from disparate sources into Azure ADLS, ensuring data integrity and availability.
Demonstrated adeptness in scheduling and monitoring pipelines, employing a nuanced understanding of various activities and triggers, including time-based and event-based triggers.
Extensively utilized Databricks notebooks for in-depth interactive analysis, leveraging Spark APIs to derive actionable insights and drive data-informed decision-making processes.
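The extract-transform-load pattern behind the pipeline bullets above can be sketched with the standard library alone. SQLite stands in for an Azure SQL source and an in-memory stream for the Blob Storage CSV sink; the table, columns, and cleaning rules are illustrative, not from the actual pipelines.

```python
import csv
import io
import sqlite3

def extract(conn):
    # Pull rows from the source system (SQLite stands in for Azure SQL).
    return conn.execute(
        "SELECT patient_id, visit_date, charge FROM visits"
    ).fetchall()

def transform(rows):
    # Normalize date separators and drop records with non-positive charges.
    return [
        (pid, date.replace("/", "-"), round(charge, 2))
        for pid, date, charge in rows
        if charge > 0
    ]

def load_csv(rows, stream):
    # Land the cleaned rows as CSV, mirroring a Blob Storage sink.
    writer = csv.writer(stream)
    writer.writerow(["patient_id", "visit_date", "charge"])
    writer.writerows(rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE visits (patient_id, visit_date, charge)")
conn.executemany(
    "INSERT INTO visits VALUES (?, ?, ?)",
    [(1, "2021/03/14", 120.5), (2, "2021/03/15", -5.0), (3, "2021/04/01", 80.0)],
)
out = io.StringIO()
load_csv(transform(extract(conn)), out)
print(out.getvalue())
```

In Azure Data Factory or Databricks the same three stages map to a source dataset, a transformation activity or PySpark notebook, and a sink dataset.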
Tata Consultancy Services, Chennai, India Jun 2017 – Aug 2020
Data Engineer
Responsibilities:
Developed reports and dashboards by pulling data from different sources using SAS, SSMS and representing data in tableau and Excel file as per business requirement.
Identified and automated manually run reports using SSIS (ETL tool), reducing manual effort and improving cost savings.
Worked on SAS and Cognos to create self-service tools, building user-friendly detailed reports and providing root-cause fixes for several issues encountered.
Created stored procedures, complex queries and views using SQL, PL/SQL to perform data wrangling, processing, exploratory data analysis and to implement analytics algorithms that support the UI development.
Developed ETL and data pipelines using SSIS and Informatica to extract meaningful data from different data sources resulting in a 30% improvement in data accuracy, completeness, and availability.
Designed and optimized data models for Data Warehouse and operational databases, collaborating with architects to ensure efficient data storage and retrieval.
Conducted training sessions for end-users on utilizing reporting tools such as Tableau and SAS, empowering them to interpret and leverage data for decision-making purposes.
Designed and published Tableau dashboards to help clients visualize and interpret the data to implement data-driven solutions for a Healthcare Client on a recurring basis.
Built, enhanced, and scheduled ETL pipelines to ingest and integrate data from data warehouses and databases.
Migrated and optimized databases on cloud servers to ensure efficient data storage and retrieval, leading to a 50% improvement in overall efficiency and usage.
Performed data manipulation and validation in SQL to support enhancements to existing stored procedures and functions in SQL servers based on logic changes and performance tuning.
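One common data-validation pattern behind the stored-procedure and performance-tuning work above is keeping only the latest record per key with a window function. A sketch using Python's bundled SQLite (window functions require SQLite 3.25 or later; the table and values are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, placed TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        ("acme", "2020-01-05", 100.0),
        ("acme", "2020-02-10", 150.0),
        ("globex", "2020-01-20", 90.0),
    ],
)
# ROW_NUMBER() numbers each customer's orders newest-first, so
# keeping rn = 1 deduplicates down to the latest order per customer.
latest = conn.execute(
    """
    SELECT customer, placed, total FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY customer ORDER BY placed DESC
               ) AS rn
        FROM orders
    ) WHERE rn = 1
    ORDER BY customer
    """
).fetchall()
print(latest)
```

The same query shape works in SQL Server and PL/SQL, where it is a standard alternative to correlated-subquery deduplication in stored procedures.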
Additional Experience:
Synergy Technologies, Hyderabad, India Jul 2016 – Jun 2017
Research Analyst Intern
Responsibilities:
Developed a digital learning web application incorporating Python, Django, and React, featuring an admin panel and user dashboards for student performance and metrics analytics, thereby enhancing educational experiences.
Developed custom extensions using Python, PyQt5, and JavaScript to automate content management and streamline manual processes, improving study productivity, cutting operational expenditures by 15%, and elevating the user experience for a large learner community.
Collaborated with cross-functional teams to compile research reports and presentations for stakeholders, ensuring effective communication of findings and insights.
Passions:
Personal Voice Assistant Chatbot: Generative AI
Real-Time Speech Recognition and Voice-Enabled AI Chatbot Integration using Bing and OpenAI.
Generative AI: Image Exploration Web App
Generate image captions, Object detection, QA images.
Mastering Streamlit: Essential Commands for Interactive Apps
Essential for creating quick interactive shareable web apps.