JARNAIL SINGH
New York, New York *****
*****************@*****.***
PROFESSIONAL SUMMARY
17+ years of experience as a Data, GenAI & Cloud Architect.
Design end-to-end Generative AI solutions aligned with business goals.
Architect AI models and pipelines for text, image, code, and multimodal generation tasks.
Evaluate and select pre-trained models (e.g., OpenAI GPT, LLaMA, Claude, Stable Diffusion).
Coordinate with development teams to fix build-related issues; update configurations using Jenkins AWS plugins.
Manage AWS SageMaker to deploy models through endpoints; work on Amazon Bedrock to manage LLMs.
Knowledge of Snowflake internals and integration with other data processing technologies.
Cloud computing experience with AWS, Microsoft Azure, and Google Cloud.
Define data acquisition, curation, and governance strategies for training and inference.
Configure retention policies on all S3 buckets to IA and Glacier standards.
Configure GCP Cloud environments, e.g., Cloud Run, Cloud Functions, BigQuery, Bigtable, CI/CD, etc.
Configure API Gateway, SNS, and SQS services according to configuration requests.
Designed, configured, and deployed Amazon Web Services (AWS) for a multitude of applications using the AWS stack (EC2, S3, RDS, CloudWatch, SQS, IAM), focusing on high availability, fault tolerance, and auto-scaling.
Configure build & release/deployment pipelines in Azure DevOps.
Worked on structured data ingestion for analytic consumption from a wide range of sources with Azure Data Factory, e.g., Databricks, Collibra, SageMaker.
Extensive experience in Terraform IaC.
Worked on Snowflake, Azure Data Lake Gen2, GCP BigQuery, Bigtable and GCP datasets, AWS Redshift, and Kinesis.
Wrote recipes to perform middleware binary installation and configuration tasks involving JDK and Tomcat binaries.
Structured and maintained assigned virtual servers; monitored server traffic through Nagios.
Administered web servers on Apache and Nginx; installed and configured the Nagios monitoring tool.
TECHNICAL SKILLS
Cloud Migration
Git
Kinesis
Jenkins
AWS CodePipeline
Azure CI/CD
AWS Elastic Beanstalk
Landing zone
Ansible
Terraform
Docker
Kubernetes
Nexus
Grafana
Shell Scripting
YAML
Windows
Unix/Linux
AWS Bedrock
Azure OpenAI
DALL-E
Redshift
Azure Databricks
BigQuery
Databricks
LakeHouse
AWS
Azure
PROFESSIONAL EXPERIENCE
CLOUD ENTERPRISE ARCHITECT (DATA/GENAI ARCHITECT), 02/2016 - Current
HPE, Dallas, TX
Overseeing service planning and architecture for clients from Retail, Pharma & Banking.
Architect AI models and pipelines for text, image, code, and multimodal generation tasks.
Define technical blueprints integrating GenAI with existing enterprise platforms.
Meet with clients for discovery and design of AWS infrastructure with IaC, followed by application deployment through Kubernetes.
Design Data DevOps pipeline workflow and architecture with AWS EKS.
Evaluate and select pre-trained models (e.g., OpenAI GPT, LLaMA, Claude, Stable Diffusion).
Leading the GCP migration effort for a German insurance client.
Build GenAI applications using AWS Bedrock and Azure ML Studio.
MLOps & Deployment: Implement best practices for GenAI model deployment, monitoring, and lifecycle management within GxP-compliant environments.
Performed GenAI application deployments with foundation models in Bedrock, e.g., a chatbot using Amazon Bedrock LLMs (LangChain & LangGraph), MCP, and Azure OpenAI.
Fine-tune and customize LLMs or Diffusion models on domain-specific datasets.
Experience in Azure Databricks, Azure Synapse, and Azure Stream Analytics.
Manage data engineering and ML workloads in SageMaker (AWS), ML Studio (Azure), Vertex AI (GCP), Dataproc, and Dataflow.
Working on application deployment on Kubernetes (microservices) to adhere to decoupled architecture standards.
Managing real-time data ingestion with Confluent Kafka, shifting the culture from batch processing to real-time ingestion.
Manage data engineering with Databricks (AWS & Azure) with data governance (Collibra).
Integrated LLM-powered agents with enterprise data systems using LangChain, LangGraph, and OpenAI function calling APIs to achieve autonomous task completion workflows.
Developed end-to-end GenAI and Agentic AI pipelines combining Retrieval-Augmented Generation (RAG), tool use, and self-reflection agents to enhance accuracy and adaptability.
Integrated multi-agent workflows for drug discovery knowledge graph analysis, combining reasoning agents with data-query agents to identify compound–target interactions.
Enabled autonomous reasoning and goal-driven task execution using memory, feedback loops, and real-time API integration across systems like Databricks, ServiceNow, and Jira.
Worked on Terraform (Infrastructure as Code) to build and maintain cloud (GCP/Azure/AWS) infrastructure, along with Airflow and Autosys.
Lead the deployment of GenAI solutions on cloud platforms like AWS, Azure, GCP.
Integrate GenAI APIs with applications via microservices, APIs, and MLOps pipelines.
Working on GenAI in the Azure environment with Azure OpenAI and AWS Bedrock (FMs such as AI21 Labs' Jurassic, Anthropic's Claude, and Cohere), Azure Synapse, etc.
Use Amazon Bedrock to generate code and summarize application requirements from long-form text.
Architected and deployed multi-agent AI systems integrating reasoning, planning, and execution agents to autonomously perform analytical and decision-making tasks.
Built domain-specific AI agents using LangGraph, LangChain, and OpenAI GPT models integrated with Databricks, APIs, and enterprise databases.
Designed and deployed containerized microservices on Google Cloud Run, achieving scalable, serverless infrastructure with zero-maintenance overhead.
Built CI/CD pipelines using Cloud Build and GitHub Actions for automated deployment of Cloud Run services across dev, staging, and production environments.
Containerized Python/Node.js/Go applications with Docker and deployed to Cloud Run from Artifact Registry for rapid rollout and rollback management.
Implemented blue-green and canary deployments on Cloud Run using traffic splitting for safe, incremental feature rollouts.
Modeled partitioned Parquet/Delta datasets in S3 for cost-efficient querying via Athena.
Defined logical and physical data models in AWS Glue Data Catalog, aligning schema definitions with Databricks and Power BI.
Worked on Redis cache clusters to enhance application performance.
Experience in GenAI for DevOps and AM automation; integrated on-prem machine learning with cloud-based solutions such as SageMaker (AWS), ML Studio (Azure), Vertex AI (GCP), Dataflow, and Dataproc.
Manage streaming data ingestion with Kinesis and Amazon MSK, using Amazon MQ to integrate with other legacy applications.
Continuously monitor and optimize model inference, latency, and cost.
Apply quantization, distillation, and other optimization techniques for production readiness.
Working on Azure ML Studio (microservices) to set up an MLOps pipeline that surfaces deep insights from data, with deployment through endpoints.
Experience in the discovery, assessment, planning, design, execution, and optimization phases of migration.
Designing and implementing real-time data analytics solutions so clients can make informed decisions.
Working in the Azure DevOps ecosystem, which includes CI/CD pipeline deployment, Git branch management via GitFlow and trunk-based development, migration of applications to Kubernetes, infrastructure creation with Terraform, code security with SonarQube, DevOps engineering, and static and composition analysis.
CLOUD ENTERPRISE ARCHITECT, 07/2013 - 02/2016
State Street, Boston, MA
Worked with the Twenty-seven Global Director of Cloud Operations to define and implement cloud/ops practices across clients.
Led the cloud operations team: daily standups, delegation of work, management of the offshore team, validation of completed work against scope, and oversight of deliverables.
Design, build, and manage cloud infrastructure using automated processes (infrastructure as code) for AWS and Azure with Microservices.
Implement access and security controls, including security groups, profiles, permissions, and key management strategy.
Install and configure monitoring and logging tools and set up client dashboards.
Design infrastructural solutions for clients to create a better security, monitoring, resiliency, and cost-effective AWS Environment.
Worked on Azure data integration services, i.e., ADF, Azure Databricks, data engineering, data warehousing, Service Bus, Kafka, Spark, Rancher, Kubernetes (microservices) & Docker, and Kinesis.
Worked on Streaming data ingestion with Kafka and Amazon MSK for fraud detection.
Experience with Terraform in the AWS and Azure clouds to set up and manage infrastructure using Terraform features such as modules, remote state, DRY configurations, and conditional logic.
Built AWS DevOps pipelines (AWS CodeBuild, CodeDeploy, etc.).
Worked on the GCP data and AI/ML stack, i.e., Vertex AI, Dataflow, and Dataproc.
Part of Cloud migration from on-prem VMware environment to public cloud (Azure & AWS) environment with Microservices(Kubernetes).
Managed CloudFormation & IaC (Terraform) to build the cloud infrastructure.
Experience in Blue-Green Deployment in Azure/AWS.
Performed a POC on Azure migration for the dev environment.
Prepared capacity and architecture plan to create the Azure Cloud environment to host migrated IaaS VMs and PaaS role instances for refactored applications and databases.
Worked on DevOps components such as Jenkins CI/CD design, including Git, Jenkins, Docker, Kubernetes, Ansible, Selenium, and Terraform.
Built the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL and AWS 'big data' technologies like Kinesis and Amazon MSK.
Built analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Worked with stakeholders, including the executive, product, data, and design teams, to assist with data-related technical issues and support their data infrastructure needs.
Design for high availability and business continuity using self-healing-based architectures, fail-over routing policies, multi-AZ deployment of EC2 instances, ELB health checks, Auto Scaling, and other disaster recovery models.
Designed and implemented for elasticity and scalability using ElastiCache, CloudFront edge locations, RDS (read replicas, instance sizes), etc.
Built and deployed cloud infrastructure on Azure and AWS through IaC (Infrastructure as Code).
Configured VPC peering via DMS with other accounts, allowing services and users of separate accounts to access, route, and communicate.
Successfully migrated a SAS application and database from VMware to Azure using an Azure migration strategy, because some application components were not cloud-ready.
Prepared DevOps best practices, training, and knowledge-base documents for team members.
Worked closely with US customers and our engineering team to translate business requirements into technical specifications, and learned tools like ML and SAS.
CLOUD ENGINEER, 02/2009 - 12/2012
Rackspace, Austin, TX
Automated Ansible to deploy and provision test environments for code ready to be tested with Selenium.
Created CloudFormation stacks across multiple AWS accounts, including DynamoDB and RDS configurations; configured GitHub Enterprise with Jenkins CI, AWS CodeBuild, and CodeDeploy.
Used a version control system to commit and push code to GitHub.
Built developer code using Git plugins, Maven, and Ant.
Tested the build features before deployment in Docker containers.
Performed data migration using AWS Data Sync from on-premises legacy systems to AWS Cloud Infrastructure.
Configured EC2 instances, security groups, subnets, NACLs, route tables, and IGWs within AWS VPC.
Worked with CI/CD pipelines using AWS CodePipeline, CodeCommit, CodeBuild, and CodeDeploy.
Performed API testing using Postman.
Implemented Unit, Smoke, and Stress testing for application reliability.
Design, build and manage cloud infrastructure using automated processes (Infrastructure as code).
Successfully migrated a SAS application and database from VMware to AWS.
Migrated applications to the cloud using the lift-and-shift technique.
EDUCATION
Delhi University, 01/2006
Bachelor of Computer Applications
CERTIFICATIONS
AWS Certified Solutions Architect - Professional
Azure Certified Data Engineer
Azure Certified DevOps Engineer
Google Certified Cloud Architect