Data Engineering Azure

Location:
Manhattan, NY, 10019
Salary:
140000
Posted:
May 18, 2024

Resume:

Jarnail Singh

[ad5sd5@r.postjobfree.com]

Education: Bachelor of Computer Applications, Delhi University (2003–2006)

14+ years of experience as a DevOps & Cloud Architect.

Managed source code repository, build, and release configurations, processes, and tools to support daily development, test, and production build and software deployment operations.

Maintained the Maven environment by setting up local, remote, and central repositories with the specified configuration in Maven configuration files.

Created new jobs in Jenkins and managed build-related issues.

Coordinated with the development team to fix build-related issues.

Updated configurations using Jenkins AWS plugins.

Managed AWS SageMaker to deploy models through endpoints.
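
A minimal sketch of invoking a model deployed behind a SageMaker endpoint with boto3; the endpoint name and payload schema are placeholders, not an actual production configuration:

```python
import json
import boto3

# Placeholder endpoint name and payload; the real request format depends on the model.
runtime = boto3.client("sagemaker-runtime")

response = runtime.invoke_endpoint(
    EndpointName="demo-endpoint",
    ContentType="application/json",
    Body=json.dumps({"instances": [[0.5, 1.2, 3.4]]}),
)
prediction = json.loads(response["Body"].read())
print(prediction)
```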

Worked on generative AI with Amazon Bedrock to manage LLMs.

Knowledge of Snowflake internals and integration with other data processing technologies.

Cloud computing experience with AWS, Microsoft Azure, and Google Cloud.

Configured various resource logs such as ALB, NLB, VPC Flow Logs, and CloudTrail logs.

Configured retention (lifecycle) policies on all S3 buckets to transition objects to Standard-IA and Glacier.
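
A hedged sketch of how such a retention policy could be expressed as an S3 lifecycle configuration with boto3; the bucket name and day thresholds are illustrative assumptions:

```python
import boto3

s3 = boto3.client("s3")

# Placeholder bucket and thresholds; adjust to the actual retention policy.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-logs-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-and-archive",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to all objects in the bucket
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```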

Configured the GCP cloud environment, including Cloud Run, Cloud Functions, BigQuery, Bigtable, and CI/CD.

Configured API Gateway services, SNS, and SQS according to configuration requests.
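
A minimal sketch of wiring SNS to SQS with boto3, assuming hypothetical topic and queue names; a real setup would also attach an SQS queue policy allowing the topic to deliver messages:

```python
import boto3

sns = boto3.client("sns")
sqs = boto3.client("sqs")

# Hypothetical names for illustration only.
topic_arn = sns.create_topic(Name="order-events")["TopicArn"]
queue_url = sqs.create_queue(QueueName="order-events-queue")["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Fan out topic messages to the queue, then publish a test event.
sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)
sns.publish(TopicArn=topic_arn, Message='{"order_id": 123}')
```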

Designed, configured, and deployed Amazon Web Services (AWS) for a multitude of applications using the AWS stack (EC2, S3, RDS, CloudWatch, SQS, IAM), focusing on high availability, fault tolerance, and auto-scaling.

Configured build and release/deployment pipelines in Azure DevOps.

Worked on structured data ingestion for analytics consumption from a wide range of sources with Azure Data Factory, e.g., Databricks, Collibra, Snowflake, Alteryx, and SageMaker.

Extensive experience in Terraform infrastructure as code (IaC).

Worked on Snowflake, Azure Data Lake Storage Gen2, GCP BigQuery, Bigtable, and GCP datasets, AWS Redshift, and Kinesis.

Wrote recipes to perform middleware binary installation and configuration tasks involving JDK and Tomcat binaries installation.

Structured and maintained assigned virtual servers. Monitored server traffic through Nagios. Administered web servers on Apache and Nginx.

Installed and configured the Nagios monitoring tool and used it for monitoring.

Professional Experience:

Role: Solution Architect (DevOps & Data)

Client: Trilead (an HPE company), Feb 2016 – Present

Roles & Responsibilities:

Oversee service planning and architecture for the client.

Report directly to the Director of Technology regarding the AWS production workbook, project plans, architectural designs, and client portfolios.

Implement the AWS Well-Architected Framework and AWS best practices for all application designs.

Meet with clients for discovery and the design of an AWS infrastructure for application build needs.

Design data/DevOps pipeline workflows and architecture with AWS EKS.

Leading the GCP migration effort for a German insurance client.

Build the GCP cloud DevOps environment, including Cloud Functions, Cloud Run, VPC, access control, and CI/CD.

Build data lakes (GCP and Azure Data Lake Storage Gen2) to ensure all reporting work is performed on the data lake.

Manage Amazon MSK and Amazon MQ in a streaming environment.

Performed a POC on creating a chatbot using Amazon Bedrock LLMs.
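
A minimal single-turn sketch of such a chatbot call via the Bedrock Converse API in boto3, assuming a recent boto3 release and that access to the chosen model is enabled; the model ID and prompt are placeholders:

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Placeholder model ID and prompt; model access must be enabled in the account.
response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": [{"text": "Summarize our refund policy."}]}],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)
print(response["output"]["message"]["content"][0]["text"])
```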

Experience in Git workflows, i.e., Git branching and merging in complex .NET/Java development environments.

Extensive experience in Azure Databricks.

Manage data engineering and ML studio workloads in SageMaker (AWS), ML Studio (Azure), and Vertex AI (GCP).

Working on application deployment on Kubernetes to adhere to decoupled architecture standards.

Managing real-time data ingestion with Confluent Kafka and shifting the culture from batch processing to real-time data ingestion.
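
A small sketch of a real-time ingestion producer with the confluent-kafka Python client; the broker address, topic, and message shape are assumptions for illustration:

```python
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker-1:9092"})  # placeholder broker

def on_delivery(err, msg):
    # Surface delivery failures instead of dropping records silently.
    if err is not None:
        print(f"delivery failed: {err}")

producer.produce(
    "trades.raw",                            # placeholder topic
    key="account-42",
    value='{"symbol": "HPE", "qty": 100}',
    callback=on_delivery,
)
producer.flush()  # block until outstanding messages are delivered
```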

Manage MLOps pipelines with AWS SageMaker Canvas to define pipelines that ultimately streamline model deployment.

Manage Azure DevOps with Gitflow as the branching strategy, and SonarQube & Veracode for code security scanning.

Create authorization frameworks for better access control in Snowflake.

Ensure data quality and integrity by implementing data validation, cleansing, and auditing processes on the AWS platform, along with AI-based cloud data analytics tools like Collibra and Alteryx.

Optimize data infrastructure and systems for performance, scalability, and cost efficiency on the Azure platform with Nerdio, i.e., worked on Nerdio to deploy and automate AVD and reduce storage costs with advanced autoscaling.

Worked on Terraform (infrastructure as code, IaC) to build and maintain cloud (GCP/Azure/AWS) infrastructure, along with Airflow and Autosys.
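
As a hedged illustration of pairing Terraform with Airflow, a minimal DAG (assuming a recent Airflow 2.x) that plans and applies a Terraform workspace on a schedule; the DAG ID, paths, and schedule are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical nightly run of a Terraform workspace checked out at /opt/infra.
with DAG(
    dag_id="terraform_apply_nightly",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",
    catchup=False,
) as dag:
    plan = BashOperator(
        task_id="plan",
        bash_command="cd /opt/infra && terraform plan -out=tfplan",
    )
    apply = BashOperator(
        task_id="apply",
        bash_command="cd /opt/infra && terraform apply -auto-approve tfplan",
    )
    plan >> apply
```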

Working on GCP DevOps services like Cloud Run, Cloud Functions, Anthos, site reliability, and CI/CD.

Use Amazon Bedrock to generate code and summarize application requirements from long-form text.

Worked on application refactoring and database modernization to make them cloud-ready with AWS EKS and GCP Kubernetes.

Worked on Azure Event Hubs and Confluent Kafka to manage real-time streaming events from open-source databases like PostgreSQL.

Worked on Redis cache clusters to enhance application performance (a cache-aside sketch follows below).

Experience in GenAI for DevOps and AM automation, and SageMaker integration in machine learning, integrating on-prem machine learning with cloud-based ML solutions like SageMaker (AWS), ML Studio (Azure), or Vertex AI (Google Cloud).
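
The cache-aside sketch referenced above, using the redis-py client; the host, key naming, TTL, and the stand-in database call are illustrative assumptions:

```python
import json
import redis

cache = redis.Redis(host="redis-cluster.example.internal", port=6379, db=0)  # placeholder host

def load_profile_from_db(user_id: int) -> dict:
    # Stand-in for the real database lookup.
    return {"user_id": user_id, "tier": "standard"}

def get_profile(user_id: int) -> dict:
    key = f"profile:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                 # cache hit
    profile = load_profile_from_db(user_id)       # cache miss: fall back to the DB
    cache.setex(key, 300, json.dumps(profile))    # cache for 5 minutes
    return profile
```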

Manage hotfix and patch merges in production.

Manage streaming data ingestion with Kinesis and Amazon MSK, using Amazon MQ to integrate with other legacy applications.

Worked on Terraform IaC, which includes managing implicit dependencies, creating variables, conditional logic, and testing code with Terratest.

Working on Azure ML Studio to set up an MLOps pipeline to use the data, find deep insights, and deploy through endpoints.

Experience in the discovery, assessment, planning, design, execution, and optimization phases of migration.

Designing and implementing real-time data analytics solutions so clients can make informed decisions.

Working on the Azure DevOps ecosystem, which includes CI/CD pipeline deployment, Git branch management through Gitflow and trunk-based development, migration of applications to Kubernetes, creating infrastructure with Terraform, code security with SonarQube, DevOps engineering, and static and composition analysis.

Role: Cloud Data/DevOps Engineer

Client: State Street (Boston, MA), July 2013 – Feb 2016

Roles & Responsibilities:

•Working with Twenty-seven Global's Director of Cloud Operations to define and implement Cloud/Ops practices across clients.

•Cloud operations team lead: ran daily standups, delegated work, managed the offshore team, and validated scope of work against work completed; oversaw deliverables.

•Design, build, and manage cloud infrastructure using automated processes (infrastructure as code) for AWS and Azure.

•Implement access and security controls, including security groups, profiles, permissions, and key management strategy

•Install and configure monitoring and logging tools and set up client dashboards

•Design infrastructure solutions for clients to create a more secure, monitored, resilient, and cost-effective AWS environment.

•Worked on data integration services from Azure, i.e., ADF, Azure Databricks, Service Bus, Kafka, Spark, Rancher, Kubernetes & Docker, and Kinesis.

•Worked on streaming data ingestion with Kafka and Amazon MSK for fraud detection.
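
A hedged consumer-side sketch for that kind of streaming feed with the confluent-kafka client; the broker, group, topic, and flagging rule are placeholders (a real MSK cluster would also need TLS/IAM authentication settings):

```python
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "msk-broker-1:9092",   # placeholder broker
    "group.id": "fraud-detection",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["transactions"])            # placeholder topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        txn = json.loads(msg.value())
        # Toy rule standing in for a real fraud model: flag large transactions.
        if txn.get("amount", 0) > 10_000:
            print(f"flagging transaction {txn.get('id')}")
finally:
    consumer.close()
```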

•Experience in Terraform on AWS & Azure cloud to set up and manage infrastructure with different Terraform features like modules, remote state, dry state, conditional logic, etc.

•Part of Cloud migration from on-prem VMware environment to public cloud (Azure & AWS) environment.

•Performed Git merges of hotfixes and new features into the main branch.

•Performed a POC on Azure migration for the dev environment.

•Prepared a capacity and architecture plan to create the Azure cloud environment to host migrated IaaS VMs and PaaS role instances for refactored applications and databases.

•Worked on DevOps components like Jenkins CI/CD design, which includes Git, Jenkins, Docker, Kubernetes, Ansible, Selenium, and Terraform.

•Built the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies like Kinesis & AWS MSK.

•Built analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

•Worked with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.

•Designed for high availability and business continuity using self-healing architectures, failover routing policies, multi-AZ deployment of EC2 instances, ELB health checks, Auto Scaling, and other disaster recovery models.

•Designed and implemented for elasticity and scalability using ElastiCache, CloudFront edge locations, RDS (read replicas, instance sizes), etc.

•Through IaC (infrastructure as code), build and deploy cloud infrastructure on Azure & AWS.

•VPC peering via DMS with other accounts, allowing access and routing so that services and users of separate accounts can communicate.

Highlights/Achievements:

•Successfully migrated the SAS application & database from VMware to Azure using an Azure migration strategy, because some application components were not cloud-ready.

•Prepared DevOps best practices, training, and knowledge base documents for team members.

•Worked closely with US customers and our engineering team to convey business requirements into technical points, and learned tools like ML and SAS.

Role: Cloud Engineer (AWS/Azure)

Client: Rackspace (Dallas, Texas)

Feb 2009 – Dec 2012

Roles & Responsibilities:

•Automated Ansible to deploy and provision test environments for code that is ready to be tested by Selenium.

•Successfully created CloudFormation stacks across multiple AWS accounts, including DynamoDB and RDS configurations; configured GitHub Enterprise with Jenkins CI, AWS CodeBuild, and CodeDeploy.
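
A minimal sketch of creating one such stack with boto3; the template file, stack name, and parameters are placeholders, and a multi-account rollout would typically create a session per account via an assumed role:

```python
import boto3

cf = boto3.client("cloudformation")

# Hypothetical template containing the DynamoDB table and RDS instance definitions.
with open("data_tier.yaml") as f:
    template_body = f.read()

cf.create_stack(
    StackName="data-tier",
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],
    Parameters=[{"ParameterKey": "Environment", "ParameterValue": "test"}],
)
cf.get_waiter("stack_create_complete").wait(StackName="data-tier")
```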

•Used a version control system to commit and push code to GitHub.

•Built developers' code using Git plugins, Maven, and Ant.

•Tested the build features before deployment in Docker containers.

•Performed data migration using AWS Data Sync from on-premises legacy systems to AWS Cloud Infrastructure.

•Configured EC2 instances, security groups, subnets, NACLs, route tables, and IGWs within AWS VPCs.
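
A short sketch of one such security-group configuration with boto3; the VPC ID, group name, and CIDR range are hypothetical:

```python
import boto3

ec2 = boto3.client("ec2")

# Placeholder VPC and CIDR; allows HTTPS in from a single corporate range.
sg = ec2.create_security_group(
    GroupName="web-tier-sg",
    Description="Allow HTTPS from corporate range",
    VpcId="vpc-0123456789abcdef0",
)
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "203.0.113.0/24"}],
    }],
)
```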

•Worked with CI/CD pipelines using AWS CodePipeline, AWS CodeCommit, AWS CodeBuild, and AWS CodeDeploy.

•Performed API testing using Postman.

•Implemented Unit, Smoke, and Stress testing for application reliability.

•Design, build, and manage cloud infrastructure using automated processes (infrastructure as code).

Highlights/Achievements:

•Successfully migrated the SAS application & database from VMware to AWS.

•Migrated applications to the cloud using the lift-and-shift technique.

Technical Skills:

Build Management Tools:

Maven

Version Control Tool:

Git

Code Analysis and Review Tools:

SonarQube

Continuous Integration Tools:

Jenkins, AWS CodePipeline, Azure CICD

Ticketing Tools:

Jira, ServiceNow

Configuration Management Tools:

Ansible, Terraform

Container Orchestration:

Docker, Kubernetes

Repository Manager Tools:

Nexus and JFrog

Application Server Monitoring:

Grafana, Signal

Scripting Languages:

Shell Scripting, YAML

Operating Systems:

Windows, Unix/Linux.

Data Analytics:

Redshift/Azure Databricks, BigQuery

Cloud:

AWS (VPC, IAM, EC2, S3, EMR, CloudFormation, ECS, ECR, EKS), Azure, GCP


